Question: Best way to do backups

ZugTheDragon
After several near-irrecoverable data losses, I have finally decided to do something about the situation and make backups.
However, I have realized I am not sure what the optimal way is to make a backup that protects against both hardware and software failure.
I do have some external hard drives, which are very good protection against software failure: I copy my data to them, unplug the drive, and it sits there safe from viruses or mistakes.
However, this does not protect against corruption and data rot; it could occur on the drive and wouldn't be noticed until I tried to access the very file, by which time it might be too late.
I thought this could be avoided using RAID, but I recently found out most modern RAID systems don't protect against this either, so I'd have to use ZFS.
I have plenty of drives to do this, but that doesn't offer any protection from software errors: if I accidentally delete something, or malware does, the deletion occurs on all the drives.
What I have are five 500GB drives, four 250GB drives, and several PCs I could use to house them separately, and I will also be getting a used QNAP TS-431P NAS with two 3TB hard drives. My first thought was to have a NAS PC that is open to the internet and my home network so I can access it from all my devices; set up the 500GB drives with redundancy, and the 250GB drives in a separate machine without redundancy (for stuff like downloads that wouldn't be catastrophic to lose but could benefit from speed); and have the machine with the 500GB drives somehow backed up onto the NAS with the 3TB drives, in a way that wouldn't allow any virus or attack to reach the NAS yet could still back up the files when needed (problem one)

And by that point, there's extra redundancy from the fact that both the "working copy" and the "backup" are redundantly copied within themselves (assuming that specific NAS will let me set up ZFS or something like it; problems two and three)

As someone who has never had backups before, and just stuffed all my files wherever, I am very lost now that I really want to do it properly. What would be the most efficient way to make a backup safe from both software and hardware errors, while, if possible, still being accessible over the internet?

USAFRet

Titan
Moderator
First off, kudos for being proactive with this.
Far too many people aren't.

As you note, RAID is not a backup.
A RAID 1 or 5 provides physical drive redundancy, to support continued uptime. It does little or nothing for actual data security.

The basic concept is 3-2-1:
3 copies, on at least 2 different media, at least 1 offsite or otherwise inaccessible.

The first post here is mine, somewhat changed since I wrote this:
 
ZugTheDragon
Hello, thank you for the response!
While this gives me a decent rule of thumb, it still leaves me with a lot of questions.
How would the backups be made in a way that automatically corrects any errors on them, while also remaining untouchable (and without using more redundancy than necessary)? This part worries me, since I'd like them to run automatically and not have to rely on myself to remember to do everything correctly.
 

USAFRet

Titan
Moderator
Not sure what you mean by ...automatically correct for errors...

My systems do a nightly (or weekly) incremental to the NAS.
Keep for XX days or weeks.
Given that, I can go back in time to a previous Image, and recover.

This is (mostly) all automated. The nightly/weekly backups happen completely hands-off.
Sometime between midnight and 5AM.

The weekly 'next copy' in the NAS is also hands off.

The 'offsite' is currently a couple of drives in a desk drawer at work.
Updated quarterly or whenever.
Vacuum-sealed bags, with the date written on them.
This is "life critical" data.
(which reminds me...next week for the update)



And yes, I have had to use one of those Macrium images after a physically dead drive.
960GB SanDisk SSD died suddenly.
Slot in a new drive, click click in Macrium...all 605GB of data recovered exactly as it was at 4AM that morning when it ran its nightly Incremental.
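(Side note for DIY setups: Macrium Reflect manages the "keep for XX days" retention itself, but the same idea is easy to script. A minimal bash sketch, assuming each backup run lands in a dated folder under /mnt/nas/backups -- the path and naming scheme are made up for illustration:)

    #!/usr/bin/env bash
    # Prune dated backup folders older than 30 days.
    # Assumes runs create folders like /mnt/nas/backups/2023-04-23_0400
    # -- both the path and the naming are examples, not Macrium's format.
    BACKUP_ROOT="/mnt/nas/backups"
    KEEP_DAYS=30

    # -mindepth/-maxdepth 1: look only at the dated folders themselves;
    # -mtime +N: last modified more than KEEP_DAYS days ago.
    find "$BACKUP_ROOT" -mindepth 1 -maxdepth 1 -type d \
        -mtime "+$KEEP_DAYS" -exec rm -rf {} +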
 

DSzymborski

Curmudgeon Pursuivant
Moderator
There's no "best way" to do backups. There are benefits and drawbacks that vary and those will depend on the nature of the data, how you wish to back it up, and what hardware you have available.

I usually advocate for the simplest plan that gets the job done. Are you sure you *need* a NAS? It sounds like you have another system.

For my needs, my computers that are actively used get their data backed up in three places. The backup server has a simple Windows install and a bunch of hard drives; all important files and general data files (things like media files) are automatically backed up to it every morning at 6 AM using Bvckup2, a simple program that allows delta copying (so it checks the status of all the files and then only updates things that are changed or need to be backed up). A clone of the OS drive is sent to the backup server once a week.

My file server, which feeds music and data all around the house and remotely (Plex to other houses and Plexamp feeds my car's infotainment system through Android Auto) is another system with Unraid. It's there for speed rather than backups like the backup server, but also serves as a backup. That server is updated every morning at 6:30 AM.

My backup server sends all the data I need backed up or updated (about 6 TB, though it obviously only sends a very small fraction of that amount at any time) to my cloud backup (I use CrashPlan) at 7 AM every morning.

My most crucial data, which would be a very big deal if I lost -- some of it pretty crucial professionally, and some that would take me at least a year to rewrite -- is also kept on physical media in my home safe and on encrypted flash drives stored over at my mom's house and in my safe deposit box. I update these a couple of times a year.

What works for *you* may differ. I'd focus first on what you *need* to back up your data and *then* focus on the how.
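(A side note on the delta copying mentioned above, for readers without Bvckup2: rsync does the same quick check by default, comparing file size and modification time and copying only what differs. A minimal sketch, with placeholder paths:)

    # Preview what would be copied (dry run, itemized changes only):
    rsync -ain /data/important/ /mnt/backup/important/

    # Run it for real: only new or changed files are transferred.
    # Add --delete only if you want deletions mirrored too -- with the
    # trade-off that an accidental delete then propagates to the backup.
    rsync -a /data/important/ /mnt/backup/important/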
 
ZugTheDragon
By automatically checking for errors, I mean that if an error occurs on the main media, in a file that was already backed up, whatever program or script backs it up (still not sure how to do that either) won't just overwrite the good copy on the backup.
That'd make the backup rather useless.
Also, I sadly do not have enough hard drives to keep separate daily and weekly copies.
 
ZugTheDragon
There's no "best way" to do backups. There are benefits and drawbacks that vary and those will depend on the nature of the data, how you wish to back it up, and what hardware you have available.

I usually advocate for the simplest plan that gets the job done. Are you sure you *need* a NAS? It sounds like you have another system.

For my needs, my computers that are actively used get their data backed up in three places. The backup server has a simple Windows install and a bunch of hard drives; all important files and general data files (things like media files) are automatically backed up to it every morning at 6 AM using Bvckup2, a simple program that allows delta copying (so it checks the status of all the files and then only updates things that are changed or need to be backed up). A clone of the OS drive is sent to the backup server once a week.

My file server, which feeds music and data all around the house and remotely (Plex to other houses and Plexamp feeds my car's infotainment system through Android Auto) is another system with Unraid. It's there for speed rather than backups like the backup server, but also serves as a backup. That server is updated every morning at 6:30 AM.

My backup server sends all the data I need to be backed up or updated about (6 TB, though it obviously only sends a very small fraction of that amount at any time) to my cloud backup (I use crashplan) at 7 AM every morning.

My most crucial data, that would be a very big deal if I lost -- some of it pretty crucial professionally and would take me at least a year to rewrite -- is also kept on physical media in my home safe and encrypted flash drives stored over at my mom's house and in my safe deposit box. I update these a couple times a year.

What works for *you* may differ. I'd focus first on what you *need* to back up your data and *then* focus on the how.
A server would be very useful, considering I often need to access the same files from my laptop, desktop and phone, and being able to use them away from home would be very useful as well. I am taking the NAS offer, as it seems to be a perfect puzzle piece for this job, and because the hard drives in it alone are a good deal at that price.
The software you mentioned seems like what I need, but it raises another question for me: how does it decide what needs to be copied? If it does this by checking whether or not the files are the same, and updates the backup if they're not, then if the main copy got corrupted it would just blindly overwrite the backup with the corrupted copy.
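(This worry is legitimate: to a plain sync tool, a corrupted source file just looks like a changed file. A minimal bash sketch of one defence -- a checksum manifest verified before each sync. All paths are examples, and the trade-off is that the manifest must be refreshed after every intentional edit; this is exactly the bookkeeping that ZFS does for you automatically:)

    # Step 1 -- after every legitimate change: record checksums.
    ( cd /data/photos && find . -type f -print0 | xargs -0 sha256sum ) \
        > /data/photos.sha256

    # Step 2 -- before each backup run: verify, and only sync if clean.
    # A bit-rotted file shows up as FAILED, the sync is skipped, and the
    # known-good copy on the backup survives.
    if ( cd /data/photos && sha256sum --quiet -c /data/photos.sha256 ); then
        rsync -a /data/photos/ /mnt/backup/photos/
    else
        echo "Checksum mismatch -- NOT syncing; investigate first." >&2
    fi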
 

USAFRet

Titan
Moderator
Embrace the beauty of Incremental.

My current main C drive has 277GB consumed.
A Full image and 30 days worth of Incrementals consumes 348GB.
For instance, last night's Incremental was 5.2GB.

I can recover that drive to any state in the last 30 days.
 
ZugTheDragon
I see, so there could be, for example, a weekly backup plus an "incremental" backup that only notes down the daily changes since the weekly one?
That sounds perfect, but I am still not sure what to use to make those, or how I can be sure that the backup server could not be breached.
 

Tac 25

Estimable
I have a habit of saving important files onto 64GB USB flash drives. Why not just put them all on one large external hard drive? I prefer multiple USB drives, because if the external drive meets with an accident, then all the backups are gone with it (I have a friend whose external died, and he lost pretty much all his data). If one USB drive dies, the damage is minimal. -- I back up video editor installers, game installers, pet videos, party videos with family and friends, and pretty much anything hard to replace or irreplaceable this way.
 
ZugTheDragon
I don't think I'd trust a USB flash drive for reliability; I think most are designed cheaply, just to carry files quickly from one PC to another.
And it doesn't address many of the concerns I've pointed out.
 

USAFRet

Titan
Moderator
Incrementals record only those changes since the last Full or Incremental.

At recovery time, all the intervening Incrementals get rolled up with the relevant Full image.
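(Macrium implements this with image files. As a DIY equivalent on the command line, rsync's --link-dest gives the same effect in directory form: each dated snapshot looks like a full copy, but unchanged files are hard links that cost almost no extra space, and there is no roll-up step at recovery -- you just copy the snapshot you want. A minimal sketch; the paths are examples:)

    #!/usr/bin/env bash
    # Snapshot-style incrementals with rsync hard links.
    SRC="/data/"
    DEST="/mnt/backup/snapshots"
    NEW="$DEST/$(date +%Y-%m-%d_%H%M)"

    # Files unchanged since the last snapshot are hard-linked from it;
    # only changed files are actually copied. (The very first run has
    # no 'latest' yet, so it simply makes a full copy.)
    rsync -a --link-dest="$DEST/latest" "$SRC" "$NEW"

    # Re-point 'latest' at the snapshot we just made.
    ln -sfn "$NEW" "$DEST/latest"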


"could not be breached" == offsite or offline.
 

Tac 25

Estimable
Well, I'm not suggesting you do it. It's simply my preference. The data I stored on USBs in 2012 and 2015 still works. Actually, I would like to add that the more important files are backed up twice or thrice, meaning two or three USBs contain the same thing (lol). This is the method I have found reliable through the years. The video of my friend's cat that passed away still lives on.

Oh, and I would like to say... in addition to the USB method, my files are also backed up onto two other PCs. I have three PCs here at home with almost identical files on them. The only difference is that the weakest one mostly has installers and only a few games installed, while the two better PCs have many games actively installed (in addition to having the installer files).
 
ZugTheDragon
Is there any way to make something "offline" that would still be able to do the job automatically?
I'm not sure if a networked PC that doesn't have any ports open to the internet would count as "offline",
or perhaps one on its own network, shared with only the main machine it backs up? I can also do that if it would be necessary but effective.
 

punkncat

Polypheme
Ambassador
For a long time, I used a recurring task with Robocopy which would copy the intended drive to another drive on a separate PC as well as to a NAS. I would also keep a copy updated by hand from time to time on an external enclosure that I kept powered off and sitting on a closet shelf most of the time. I still use the latter method, but now have one BIG drive in each of my (2) main-use PCs which contains all the files I wish to keep. I also keep that backed up to a NAS unit as well as to the external.

I created physical media (CD/DVD) of all of the music, pictures, and other important files that were too significant to lose, no matter what (outside of acts of God and such). I recently put a small set of things that were important to me on cloud storage, more as a test-and-see than out of real need.

To be fair, carrying around a huge bunch of data is something to consider. How much are you really using it? How important is it, really?

I would say that in my own use case, the only thing I deem TRULY important to me in digital media is all my pictures. Second behind that is the digital copy of what used to be my physical music collection. I have been giving a lot of thought to that importance I mentioned, in that streaming services have made most video storage redundant, and music can easily be listened to on demand from various pay services, often at better quality than long-term storage (with the mentioned bit rot) offers.
If I am going to continue keeping a full NAS backup of what I currently have as well as planning for the future, I am going to need to purchase some pretty big drives soon, as the 'critical' storage on those units is right at full from the (now) larger in-PC drives.
 

USAFRet

Titan
Moderator
offline means not accessible by anything.
A drive in a desk drawer.
A system powered down and unplugged from the wall.


A networked PC in your house LAN is still accessible from the other systems.
If you get some nasty virus, it can and will easily access that system and its drives.
 
Both at home and at work, I practice backup routines that are both versioning (similar to incremental backup) and partially offline. The versioning has already saved me a couple of times from data loss caused by software/user errors that destroyed or deleted files.

In my own case, a scheduled backup task is out of the question, because it requires that the backup location either be accessible at all times or have some sort of timed service that opens access at certain times.
The reason for this is to minimize loss of backups in case of a crypto virus.

My solution at work - Windows
I use the software called FreeFileSync, because it supports file versioning. Backup is mainly done once a day and takes a couple of minutes from connecting the USB drive to disconnecting it.

Before FreeFileSync, I used robocopy - but I soon discovered the weakness that whenever a file got changed in a way I later regretted, there was no way of getting the original file back once a backup had been performed.


My solution at home - Linux
I have two backup locations. The first backup is just a USB HDD. The second backup is another computer in the basement that runs an SSH server, so I can use rsync to perform the backup. I also use rsync to back up to the USB HDD.
Versioning is accomplished with a backup parameter in rsync, combined with the bash shell's ability to use the date and time like any other variable.


Why and what is versioning
In a backup scheme there is a source and a destination directory for your backup; in addition, there will be a third folder for versioning purposes.
I personally tend to name this folder so that it implies which backup it belongs to and that it is for versioning, so for example "versioning_photos" would be pretty typical.

Each time a backup is performed and a file in the backup directory is to be replaced, updated or deleted, the original file in the backup dir - instead of being removed - is moved to the versioning folder, usually into a subfolder with an incremental number or date+time. I prefer the date+time scheme.

This way, if I have a project in progress and do something stupid to a document with irreversible changes, I can easily go back and retrieve an older version of that file, or group of files.
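(A minimal bash sketch of exactly this rsync versioning scheme -- the directory names are examples:)

    #!/usr/bin/env bash
    # Versioned backup: files about to be replaced or deleted in the
    # backup are moved into a dated versioning folder instead of being lost.
    SRC="/home/me/photos/"
    DEST="/mnt/backup/photos/"
    VERSIONS="/mnt/backup/versioning_photos/$(date +%Y-%m-%d_%H%M)"

    # --backup + --backup-dir: anything rsync would overwrite or delete
    # in DEST is moved into the dated folder under versioning_photos.
    rsync -a --delete --backup --backup-dir="$VERSIONS" "$SRC" "$DEST"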


[edit]
Forgot to mention - for both the backup at work and at home, I also have off-site backups in case of fire or other all-destroying accidents.
 
There is no one-size-fits-all method.

Some folks use an image...others a clone...others the cloud or maybe copy a few files to a flash stick.

It's what fits your needs.
 

The Electro Machine

Commendable
Is there any way to make something "offline" that would be able to do the job automatically?
[...]
You could use some sort of event-detection software, maybe even the built-in Windows Task Scheduler - which would automatically copy the locations you specify to a drive whenever that drive is detected in the system. And then also keep on your desk a cable with male and female power connectors for that drive, so that whenever you physically join them, a backup is executed.
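(On Windows the clean way to do this is a Task Scheduler task with an event trigger on device arrival. As a rough illustration of the same idea in script form, here is a minimal bash polling sketch -- the mount point and paths are hypothetical:)

    #!/usr/bin/env bash
    # Back up automatically whenever the backup drive appears.
    MOUNT="/mnt/backup_drive"

    while true; do
        if mountpoint -q "$MOUNT"; then
            rsync -a /data/ "$MOUNT/data/"
            sync    # flush pending writes before the drive is unplugged
            echo "Backup done -- safe to unplug $MOUNT"
            # Wait for the drive to be removed before watching again.
            while mountpoint -q "$MOUNT"; do sleep 5; done
        fi
        sleep 10
    done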

I myself:
● export settings manually whenever I change them
● copy settings manually [if possible] with BackUp Maker on a daily basis - to an online archive drive
● save the same project under a new consecutive number after making profound inputs/changes to it
● automatically create backups of new data with BackUp Maker on a minute basis - to the same online archive drive
● automatically create backups of all data from chosen locations with BackUp Maker on an hourly basis - to the same online archive drive
● manually create backups of all data from chosen locations - to 1 of 2 offline archive drives *
● manually create backups of all data from chosen locations - to 25 GB Blu-ray discs
● manually create partial/temporary backups on different online drives and offline pendrives

* A relevant poll: https://forums.tomshardware.com/thr...th-both-of-them-having-the-same-ones.3805246/
 

Misgar

Notable
My "offline" archival backups consist of a couple of Hewlett Packard RAID-Z2 TrueNAS Core servers and a third RAID-Z2 system which are powered down most of the time and never connected to the internet. Of course these servers could be infected by Ransomware when switched on and connected to another computer during file copying.

In the past I used to archive important digital camera images to 25GB Bluray WORM discs but I needed a large number of discs per annum. After the initial copying process the discs were closed for further writing, making them read-only and less susceptible to virus attack. Nowadays, I archive photos to 800GB LTO4 tapes and slide the write-protect tab to the read-only position, so files can't easily be overwritten or deleted. A single 800GB tape can hold a 4-week vacation set of photos. Video files go on another tape.

None of the above constitute "normal" daily or weekly backups, but are my insurance against virus attack. Tapes and Blurays are easily transportable off site. I did once consider an LTO automated library system, where cartridges are fed into one or more tape drives for incremental nightly backups, but I abandoned the concept because it was too expensive at the time.

Everyone has their own favourite backup regime (or none at all). You just have to choose one.
 
ZugTheDragon
The problem with that is that it cannot be automated; it'd rely on me manually plugging in the external drive and backing everything up to it. And if I'll be using an external drive for that, how can I be sure that nothing on it could get corrupted?
And what is the point of all the other suggestions then? Like, what is the point of incremental backups if they can be removed just as easily as the main copy?

Also, I meant that one of the deeper backup servers would be on a LAN separate from my home LAN (I have plenty of old routers), and would use the fact that the NAS has two LAN ports to perhaps safely back up the NAS onto itself without interacting with anything else.
 
Hello Zug... first let me say I am very impressed by the amount of gear you have and how you use it.
I have been a PC user for 20 years, and by some people's standards I guess my backup methods are a bit primitive.

I use flash drives to back up photos, docs, music, videos and saved-game data. I also have another flash drive that holds copies of the individual drives in case one of them fails - and yes, it's happened. I take screenshots of the different folder layouts so I can remember how I had them.

I do a full clean install about once every 18 months, as I think this is the only way to get rid of leftovers you probably don't know you still have hiding away somewhere.

"I do have some external hard drives, which are very good protection against software failure. I copy my data to them, unplug the drive, and it sits there safe from viruses or mistakes."

I hope less experienced users see that bit, because some people think it's OK to keep external devices connected just in case they need them...... WRONG
 

USAFRet

Titan
Moderator
What can't be automated?

Except for my End of the World offsite backup drives, my routine IS all automated.
The PCs run a nightly or weekly Full or Incremental, as needs dictate.
All hands off, takes zero seconds out of my day.
 

Misgar

Notable
Hi Zug,
Each time you plug your external drive into a computer, the data could become encrypted by ransomware if the computer has become infected. I have a separate fibre-optic 10GbE SFP+ network linking my three servers, with no connection to the internet, but each time I copy files to the servers, I risk corrupting the data. It's very difficult to fully isolate a network whilst continuing to save data to it.

I worked for a multi-national company that runs two separate networks, one connected to the internet, the second completely isolated. We went through a very complicated security process with encrypted USB drives, physical key tokens and passwords to transfer data on to the isolated network. Security protocols were rigorously enforced by IT and disciplinary action taken for any transgressions.

As others have already stated, the only data that's safe from virus attack is data on re-writeable media that is left in a drawer and never reconnected to a computer.

My solution is to archive data to Blu-ray and close the optical disc to further writing. Similarly, I archive data to 800GB LTO4 tapes and slide the red tab on the cartridge over to the write-protect (read-only) position. If a computer cannot write data back to the archive, it's difficult for a virus to delete or encrypt it.

I don't think you are going to find a totally automatic home backup solution that protects you entirely against virus attack. Instead, I recommend archiving vital data to WORM (Write Once Read Many) media.
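(Misgar's exact tooling isn't stated; as one concrete way to produce such a closed, read-only disc on Linux, here is a minimal growisofs sketch -- the device name and source path are examples:)

    # Burn a one-shot data disc and close it so no further sessions can
    # be added. -dvd-compat closes the disc, -Z starts a fresh one, and
    # -R -J add Unix/Windows-friendly file naming.
    growisofs -dvd-compat -Z /dev/sr0 -R -J /data/photos_2023/

    # Verify afterwards that the disc reads back identically:
    mount /dev/sr0 /mnt/cdrom && diff -r /data/photos_2023 /mnt/cdrom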
 