Question Data fade - does copying and opening data from an HDD / SSD make that data more secure?


The Electro Machine
Jan 28, 2021
In regard to repelling data fade, I finally got time to try out a proper piece of software - plus I have a piece of hardware that I do not care about, so I can run various tests on it.


The first part of the above equation is freeware [at least for home use] called Disk Fresh [by Puran Software] - while the second is my old, slow 8 TB HDD that I will be selling soon. The help file of Disk Fresh says that HDDs should be scanned and rewritten - while SSDs should only be scanned, so that they do not wear out unnecessarily. And that is when it hit me:

if rewriting the data first requires a scanning process - then does the simple act of copying that data refresh it in the same way, thus making such software unnecessary?

As I said, I will be selling that HDD but want to retain the data it currently holds - so I have already copied all of that data to another drive. If I were, after all, to keep that HDD instead of selling it, would scanning it now with software like Disk Fresh be totally pointless, because I already "scanned" it during the copying process? If yes - then is simply using all of the data [i.e. all files] on some drive equivalent to such professional anti-data-fade scanning?


Let's say I have an SSD that holds all of my music, which I listen to every day, both in randomly generated playlists and in some organized / systematic way. Would such reading of each and every file at some point in time [at least once every couple of months, when it finally gets sent to the audio player either at random or by hand] do the trick for that tiny part of the drive holding those audio files? And to be very precise: depending on the audio format / player / plugin, such a file might have to be listened to from its very beginning to its end, and also have all of its tag fields [i.e. its whole metadata] read - so that every single byte of it is accessed and it constitutes a proxy anti-data-fade scan?
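
Just to make "every single byte accessed" concrete, this is roughly what I mean - a hypothetical pure read pass over a folder (Python sketch, path made up), with no writing involved:

Code:
# Hypothetical sketch: read every byte of every file under a folder.
# This is only an illustration of a "read pass" - the path is made up.
import os

MUSIC_DIR = r"D:\Music"  # made-up folder path

def read_everything(root: str) -> None:
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "rb") as f:
                    # read in 1 MiB chunks so large files do not fill RAM
                    while f.read(1024 * 1024):
                        pass
            except OSError as err:
                print(f"Could not read {path}: {err}")

read_everything(MUSIC_DIR)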


And one last logical question: do NVMe drives also succumb to data fade - and should they be treated like SSDs in this regard?

And also: I am eager to start using Disk Fresh - but at the same time I am afraid that I could somehow destroy terabytes of my data without realizing it months or even years later. There is no way I will test all of the files rewritten by such software in some other way, right? Or am I wrong, because I can just save a checksum of the whole drive or [to make it easier] of one folder at a time - and then compare that pre-test checksum with the checksum generated after Disk Fresh has finished rewriting? But then again, I would need a second drive to host a copy of the data that is about to be rewritten - which would not only make the whole scanning and rewriting totally unnecessary, but would also require double the archive storage space.
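
For the record, the per-folder checksum I have in mind would be something like this hypothetical sketch (Python, paths made up) - run it once before Disk Fresh and once after, then compare the two output files with any diff tool:

Code:
# Hypothetical sketch: write a SHA-256 checksum for every file in a folder
# to a text file, so a "before" and an "after" run can be compared.
import hashlib
import os
import sys

def sha256_of(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            h.update(chunk)
    return h.hexdigest()

def hash_tree(root: str, out_file: str) -> None:
    with open(out_file, "w", encoding="utf-8") as out:
        for dirpath, _dirs, files in os.walk(root):
            for name in sorted(files):
                path = os.path.join(dirpath, name)
                out.write(f"{sha256_of(path)}  {os.path.relpath(path, root)}\n")

# e.g.  python hash_tree.py D:\Archive before.txt   (paths are made up)
hash_tree(sys.argv[1], sys.argv[2])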
 
What you should be looking to do is a true backup.
A second/third copy of your data.

Anything that lives on a single storage device may be considered to not exist at all. No matter what magic dust you sprinkle on it.

 
{The Electro Machine sighs}

I have an 18 TB and a 16 TB HDD. One of them is a few meters away from the machine, so that if the neighbor upstairs decides to have a swimming pool in a room while I am away, or there is a small fire, I might be able to save one of them. The second HDD is near the PC but is not connected to power or the motherboard unless I am making a proper backup - so that it is not affected by a power surge. The ordinary backups are done automatically to an SSD which is connected all the time - so that a copy is made no later than a few minutes after new data is created or acquired by me. I also make backups on Blu-ray discs - some of those optical disc backups are in the physical possession of my family and friends. And I am thinking of starting a private Raspberry Pi server with an SSD, located at the house of a family member about 2 kilometers away; and if that works, I could relocate it to a friend who lives in another country across the sea.

All of that still does not and will not protect me from data fade
 
"data fade". Also known by the more common term, bit rot.

For a solid state storage device, you do not need to "read" every bit of data on it.
You just need to power it up once in a while.

Reading all the data on it is not a thing, because the drive firmware shuffles data around on the cells, for wear leveling.
It does this all by itself.

le sigh
 
My guess for an HDD:
I can see how data on an HDD might fade out over time....magnetic fields.
I doubt that reading that data would do anything to restore the magnetic field.
Rewriting that data would apply a new magnetic field.

SSDs.....they have their own internal magic, but they need power applied for the magic to work.

Is any of that a guarantee the data will be secure.....no.
That's where you get into having multiple copies.
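
If you want to picture the read vs. rewrite difference, a file-level version of a "refresh" would look something like this (hypothetical Python sketch - not what Disk Fresh itself does, which presumably works at the sector level, below the file system):

Code:
# Purely illustrative (made-up path): read each chunk of a file and write the
# same bytes back in place, so the drive lays down a fresh recording of them.
import os

def rewrite_in_place(path: str, chunk_size: int = 1024 * 1024) -> None:
    with open(path, "r+b") as f:        # open existing file for read + write
        pos = 0
        while True:
            f.seek(pos)
            chunk = f.read(chunk_size)  # the "scan" part
            if not chunk:
                break
            f.seek(pos)
            f.write(chunk)              # the "rewrite" part: same bytes again
            pos += len(chunk)
        f.flush()
        os.fsync(f.fileno())            # ask the OS to push the writes to the drive

rewrite_in_place(r"D:\Archive\some_file.bin")  # hypothetical file

A sector-level tool presumably does the same kind of thing underneath the file system, so it also covers metadata and areas that a file-level loop like this would never touch.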
 
Does anyone have anything to add about countermeasures to this issue?
If someone did, they probably would have responded sometime in the last 18 months.
Please don't resurrect ancient threads, even if they are your own.

But, this:
 
Status
Not open for further replies.