Question: Suggestions for programs to monitor file storage for corruption / "bit-rot" on a periodic (automatic) basis?

sirhawkeye64

Are there any applications you can use to monitor drives or folders for bit-rot/data corruption on a regular (automated) basis? I'm not talking about things like SMART that monitor the drive's health, but rather a program that monitors the file system or a folder. For example, you might run your files through the program once so it generates a hash for each, and it then periodically checks the files against those hashes to detect any corruption.

Context: I have a computer that syncs to cloud storage. I run it like a server/NAS, so I want to know if data corruption occurs, in the hope that I can prevent a corrupted file from being uploaded to the cloud should something like bit-rot happen. I used to use a script to generate MD5 checksums, but that was a manual process with no real way of automatically checking files for changes against those checksums, so I'd like to find something that does that for me, perhaps using a database to catalog the files and then verify them.
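For what it's worth, the catalog-and-verify idea described above is simple enough to sketch yourself. Here is a minimal Python sketch, assuming a watched directory and database path (both hypothetical names, adjust to your setup) and that you schedule it with cron or Task Scheduler to get the "automatic" part. The heuristic is the one most bit-rot checkers use: a file whose hash changed while its modification time did not is suspect.

```python
import hashlib
import sqlite3
from pathlib import Path

WATCH_DIR = Path("/data/synced")            # hypothetical: folder to monitor
DB_PATH = Path("/var/lib/bitrot/catalog.db")  # hypothetical: hash catalog

def sha256_of(path: Path) -> str:
    """Hash a file in chunks so large files don't exhaust memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(conn: sqlite3.Connection) -> None:
    conn.execute(
        "CREATE TABLE IF NOT EXISTS files (path TEXT PRIMARY KEY, mtime REAL, sha256 TEXT)"
    )
    for path in WATCH_DIR.rglob("*"):
        if not path.is_file():
            continue
        digest = sha256_of(path)
        mtime = path.stat().st_mtime
        row = conn.execute(
            "SELECT mtime, sha256 FROM files WHERE path = ?", (str(path),)
        ).fetchone()
        if row is None:
            # New file: record its baseline hash.
            conn.execute("INSERT INTO files VALUES (?, ?, ?)", (str(path), mtime, digest))
        elif row[1] != digest and row[0] == mtime:
            # Contents changed but the timestamp did not: likely silent corruption.
            print(f"POSSIBLE BIT-ROT: {path}")
        else:
            # A newer mtime means a legitimate edit; record the new baseline.
            conn.execute(
                "UPDATE files SET mtime = ?, sha256 = ? WHERE path = ?",
                (mtime, digest, str(path)),
            )
    conn.commit()

if __name__ == "__main__":
    DB_PATH.parent.mkdir(parents=True, exist_ok=True)
    with sqlite3.connect(DB_PATH) as conn:
        verify(conn)
```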
 
Not sure about "applications", much less "bit-rot"...

My suggestion is to have a regular and proven backup routine.

I.e., multiple backups to multiple locations with verification that the backups are recoverable and readable.

Just my thoughts on the matter.
 
Compute a 64-bit CRC when the file is known good ("golden"), save that CRC, then recalculate and compare before reading. This is what we used at work with a multi-PB tape archive.
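A minimal sketch of that golden-CRC workflow, for the curious. The CRC-64/ECMA-182 polynomial and the sidecar ".crc64" file naming are my assumptions, not anything specified above; any stable 64-bit CRC would serve the same purpose.

```python
from pathlib import Path

CRC64_POLY = 0x42F0E1EBA9EA3693  # CRC-64/ECMA-182 polynomial (an assumption)
MASK = 0xFFFFFFFFFFFFFFFF

def crc64(data: bytes, crc: int = 0) -> int:
    """Bitwise MSB-first CRC-64; slow but dependency-free, and chunk-composable."""
    for byte in data:
        crc ^= byte << 56
        for _ in range(8):
            crc = ((crc << 1) ^ CRC64_POLY) & MASK if crc & (1 << 63) else (crc << 1) & MASK
    return crc

def file_crc64(path: Path) -> int:
    crc = 0
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            crc = crc64(chunk, crc)  # pass the running CRC between chunks
    return crc

def seal(path: Path) -> None:
    """Record the CRC while the file is known good ('golden')."""
    path.with_suffix(path.suffix + ".crc64").write_text(f"{file_crc64(path):016x}\n")

def check(path: Path) -> bool:
    """Recalculate and compare before trusting the file's contents."""
    expected = path.with_suffix(path.suffix + ".crc64").read_text().strip()
    return f"{file_crc64(path):016x}" == expected
```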
 
There are many ways to achieve this; let me mention a few:
  • The BTRFS file system has a built-in CRC check (per block) that runs on every read. If a CRC failure happens on a hard drive that still operates, the affected files cannot be read or altered, and the user gets an error message. This is the default file system for Fedora Linux. The check only happens when a file (a block, actually) is being read, but there is a built-in tool to check the whole file system at once (see the first sketch after this list).
  • If backup is done by rsync and launched from a script, you may run a reverse check using the --dry-run and --checksum arguments to compare every single file to your backup (see the second sketch after this list). This checks all files, but it is time-consuming and puts wear and tear on the storage devices.
  • FreeFileSync also has an option to compare file contents as well as file size. That is also a time-consuming task, depending on the amount of data.
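To keep the examples in one language, here is a minimal sketch of driving that whole-filesystem BTRFS check from Python. The mount point /mnt/data is hypothetical, and the script assumes root privileges; `btrfs scrub start -B` re-verifies every block against its stored checksum and stays in the foreground until the scrub completes.

```python
import subprocess

MOUNTPOINT = "/mnt/data"  # assumption: wherever the BTRFS volume is mounted

# "btrfs scrub start -B <path>" checks every block's checksum and
# blocks (-B) until the scrub finishes, so the result can be inspected here.
result = subprocess.run(
    ["btrfs", "scrub", "start", "-B", MOUNTPOINT],
    capture_output=True,
    text=True,
)
print(result.stdout)
if result.returncode != 0:
    print("Scrub reported problems:")
    print(result.stderr)
```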
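And a minimal sketch of the rsync reverse check from the second bullet. The SOURCE and BACKUP paths are assumptions; under --dry-run with --checksum, every line rsync itemizes is a path whose content (or, for directories, metadata) no longer matches the backup, so an empty report means the two trees agree byte for byte.

```python
import subprocess

SOURCE = "/data/synced/"    # assumption: note the trailing slashes, rsync cares
BACKUP = "/backup/synced/"  # assumption: path to the existing backup copy

# --dry-run:          report differences without copying anything
# --checksum:         compare file contents by checksum, not size/mtime
# --itemize-changes:  print one line per item that differs
result = subprocess.run(
    ["rsync", "--dry-run", "--checksum", "--recursive",
     "--itemize-changes", SOURCE, BACKUP],
    capture_output=True,
    text=True,
    check=True,
)
diffs = [line for line in result.stdout.splitlines() if line.strip()]
if diffs:
    print("Items that differ from the backup:")
    print("\n".join(diffs))
```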
 
That FreeFileSync content comparison is what I use when copying RAW, JPG, and MOV files from digital media (CF, SD, uSD) to my laptop drives over USB.

In FreeFileSync's Compare settings, you need to select "File content" to perform a bit-by-bit comparison. It doubles the time taken to back up files each evening, but you know if corruption has occurred during transfer before it's too late.