I don't normally comment on articles, but this one annoyed me quite a bit.
The Backblaze report is hardly "questionable" as the author of the article claims. It's well sourced, with a huge sample size (40K drives), and the initial report has been followed up several times with further information - one of them even with the raw data so you can do your own number crunching. Subsequent reports have dispelled the concerns that chassis revisions, drive sourcing (drives pulled from USB enclosures vs. purchased bare), and other anomalies skewed the results.
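If you want to do that number crunching yourself, here's a minimal sketch of how it could look. It assumes the raw data is in the daily-snapshot CSV format Backblaze publishes (one row per drive per day), and the column names and file layout here are from memory - check them against the files you actually download.

# Sketch: annualized failure rate (AFR) per model from Backblaze's raw daily CSVs.
# Assumes each row has at least a 'model' column and a 'failure' column (0, or 1
# on the day a drive fails). File glob pattern is hypothetical.
import csv
import glob
from collections import defaultdict

drive_days = defaultdict(int)   # model -> total drive-days observed
failures = defaultdict(int)     # model -> total failures recorded

for path in glob.glob("data_*/*.csv"):
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            model = row["model"]
            drive_days[model] += 1                  # each row = one drive observed for one day
            failures[model] += int(row["failure"])  # count the failure events

# AFR = failures per drive-year of operation, expressed as a percentage.
for model in sorted(drive_days, key=lambda m: -drive_days[m]):
    afr = failures[model] / drive_days[model] * 365 * 100
    print(f"{model:<25} drive-days={drive_days[model]:>10}  AFR={afr:.2f}%")

Keep in mind that models with few drive-days will give you a noisy AFR, which Backblaze themselves point out in their reports.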
Additionally, I see zero mention on Backblaze's site that they've switched to NAS or Enterprise drives, simply that they've tested them. The author of this Tom's Hardware article may have badly misinterpreted their follow-up posts.
Where a lot of the consternation about the Backblaze reports comes from is the assumption that the testing is somehow flawed because the drives are run in Backblaze's environment. That's anything but true. The flaw is in how people interpret the results.
I work in automotive part reliability and testing; previously I worked for Raytheon doing testing for clients such as NASA.
Backblaze's methodologies are not too far out of whack with what gets done in automotive testing. Parts are subjected to worst-case scenarios, under 24/7 load, totally outside their design parameters. Think taking a Toyota Corolla and driving it at maximum speed around a street circuit for 96 hours straight. Way outside of typical usage, and way outside of what it was designed for.
The failure rates are calculated. *Then* a decision is made about which part to keep using - and often it's not the part that fails the least, because all factors are weighed: cost, manufacturing time, materials, etc. Reliability is just one factor. But when it all comes down to it, if you have two identically priced parts and one is more reliable than the other, even completely outside their design parameters, you're going to go with the more reliable one.
Backblaze has done much the same. The drives in their data center are all subjected to the same workload. They fail at different rates. Why not make decisions based on that? It would be stupid for Backblaze not to, and they're showing their methodology so the rest of us get a glimpse of what they do. If end users make decisions based on that as well, that's their own issue.
Finally, Backblaze's data is the only data set of its kind out there. No other company (including Google) has had the balls to state outright which drive brands and models have been more or less reliable for them. They're too busy avoiding upsetting the drive manufacturers to give the rest of us data. So while the Backblaze data may not translate directly to the hard drive I stick in the next PC I build, it's better than the completely blind or anecdotal evidence that was all we had before Backblaze released their numbers.
It's the same reason Goodyear advertises in NASCAR: if people think a tire is good enough for NASCAR races at 200 mph, it's good enough for their Chrysler around town.
If anything, the article should be titled "Questionable Class-Action Lawsuit Against Seagate Built On Backblaze Reliability Report" rather than the way it's currently framed. I'm in no way qualified as a lawyer, but I am qualified in reliability testing. Keep up the quality writing, Toms - I think some editorial review is in order on this article.