I want to know whether my current backup strategy is solid and if I need to improve anything.
Situation: 7 computers and 4 smartphones need their data backed up.
Solution: 1 computer is my main storage server, while also running a couple of other services.
The "server" (thats how I'll call it here) automatically scans each pc every hour or the moment they come online to see if files have changed. If it detect a filechange (add,move,remove,...) it will mirror this over to itself in the designated folder. This is done for all folders that contain important data (not program files, appdata, windows, ... to prevent useless writes). The same goes for phones but those are only backed up when charging (programmed this myself as a check for when to scan and not use battery).
The server houses all data on one 10 TB HDD. It then makes an exact copy (not RAID 1) to another 10 TB HDD, and if a file is removed, moved, changed or whatever, it keeps a copy of the old version on a separate 6 TB drive.
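As a sketch of that mirror-plus-versioning step, the same behaviour can be had with rsync's --backup-dir option, which parks replaced or deleted files on the versions drive instead of discarding them. This is an assumption (the post doesn't say which tool is used) and the mount points are hypothetical:

```python
import subprocess
from datetime import date

PRIMARY = "/mnt/primary10tb/"                    # hypothetical mount points
MIRROR = "/mnt/mirror10tb/"
VERSIONS = f"/mnt/versions6tb/{date.today()}/"   # one folder of old versions per run

# -a: preserve metadata, --delete: make MIRROR an exact copy of PRIMARY,
# --backup/--backup-dir: move any file that would be overwritten or deleted
# into VERSIONS instead of losing it.
subprocess.run(
    ["rsync", "-a", "--delete", "--backup", f"--backup-dir={VERSIONS}", PRIMARY, MIRROR],
    check=True,
)
```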
Then the second 10 TB drive gets uploaded to an online cloud service with 1 year of file history, and the 6 TB drive as well, but that one has a lower priority.
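For completeness, a sketch of what that upload step could look like with rclone and a remote named "cloud" (both assumptions; the post names neither the tool nor the provider, and the 1-year file history would be handled on the provider's side):

```python
import subprocess

# Push the mirror drive first (higher priority), then the versions drive.
for local, remote in [("/mnt/mirror10tb", "cloud:mirror"),
                      ("/mnt/versions6tb", "cloud:versions")]:
    subprocess.run(["rclone", "sync", local, remote], check=True)
```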
I also make a disk image of my computer every 2 weeks so that if it has an issue I can roll back to it. This does not get uploaded to the cloud and is done manually by me onto an external SSD.
Would this prevent data loss? (Of course, data created that same day will probably be lost, but slightly older data should be fine.)