Hi,
I've been lurking in lockdown until now, but I'm coming out.....:-}
Thanks for your testing, and for the Forum.
It's a very good source of information.
Can anyone please tell me some more details on how the Sustained Sequential Write Speed Test is performed?
I've read "How we do the Tests", and it's a bit sparse on detail.
Also searched the Forum.
Yeah, that page needs to be updated; as far as I know, that describes how the previous reviewer tested a few years ago.
Specifically:
(1) How do you prepare the drive before?
Do you write over with fresh writes?
Leave it blank?
For testing the sustained sequential write speed, I prep the drive by doing the following:
- Secure erase
- Boot from another drive into Windows
- Format drive as GPT and NTFS
- Delete the partition
- Convert the drive to MBR and leave without a partition
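If anyone wants to automate the Windows-side steps, here's a rough Python sketch driving diskpart. The disk number is a placeholder for your DUT, and the secure erase step is vendor-tool specific so it's not shown:

```python
import subprocess
import tempfile

# Placeholder: the DUT's disk number -- confirm with `diskpart> list disk` first!
DISK_NUMBER = 1

# Mirrors the prep steps above: format as GPT + NTFS, then wipe the
# partition table and convert back to MBR, leaving the disk raw.
# (The second `clean` also removes the MSR partition that diskpart
# creates automatically when converting a disk to GPT.)
DISKPART_SCRIPT = f"""select disk {DISK_NUMBER}
clean
convert gpt
create partition primary
format fs=ntfs quick
clean
convert mbr
"""

with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
    f.write(DISKPART_SCRIPT)
    script_path = f.name

# diskpart needs an elevated (Administrator) prompt.
subprocess.run(["diskpart", "/s", script_path], check=True)
```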
(2) What packet size do you use?
A lot of reviewers use 128KB, but not all.
(3) What Queue Depth?
(4) Do you just do one test, or do you repeat it to check?
Run an Iometer script that entails:
- 10 seconds idle
- 10 seconds sequential write preconditioning at QD8 - 1MB block size
- 10 seconds idle
- 15 minutes 30 seconds sequential write at QD32 - 1MB block size
- 30 seconds idle
- 5 minutes sequential write at QD32 - 1MB block size
- 1 minute idle
- 5 minutes sequential write at QD32 - 1MB block size
- 5 minutes idle
- 5 minutes sequential write at QD32 - 1MB block size
- 30 minutes idle
- 5 minutes sequential write at QD32 - 1MB block size
Sometimes I run this twice (with another secure erase beforehand) if the results seem strange for the hardware. Usually once gives me enough data and detail to use. I used to use 128KB, but some drives underperformed; 1MB is closer to real-world behavior anyway. I also test QD1-128 4KB random read/write and QD1-32 128KB sequential read/write with Iometer, which is where I get my peak sequential and random performance data. I may script a replacement test for ATTO one day soon.
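Since the Iometer config itself isn't posted here, a rough stand-in for that cadence, scripted in Python around fio, would look something like this (the device path and phase list are placeholders, not my actual test file):

```python
import subprocess
import time

# Placeholder target -- on Linux this would be the raw DUT, e.g. /dev/nvme1n1.
TARGET = "/dev/nvme1n1"

# (idle_seconds, write_seconds, queue_depth) phases matching the list above:
# short QD8 precondition, the long 15:30 burst, then 5-minute bursts
# separated by growing idle windows to watch cache recovery.
PHASES = [
    (10, 10, 8),
    (10, 930, 32),
    (30, 300, 32),
    (60, 300, 32),
    (300, 300, 32),
    (1800, 300, 32),
]

for i, (idle_s, write_s, qd) in enumerate(PHASES):
    time.sleep(idle_s)
    subprocess.run([
        "fio", f"--name=seqwrite{i}",
        f"--filename={TARGET}", "--rw=write",
        "--bs=1M", "--ioengine=libaio", "--direct=1",
        f"--iodepth={qd}", f"--runtime={write_s}", "--time_based",
        "--log_avg_msec=1000",          # 1-second averages, like the Iometer log
        f"--write_bw_log=seqwrite{i}",  # per-phase bandwidth log
    ], check=True)
```

One caveat with this sketch: each fio invocation restarts writing at offset 0, whereas a single long Iometer run keeps going where it left off, so it only approximates the cadence.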
(5) What frequency do you record the data at?
Your plots are solid lines, but they are actually composed of data points at each reading.
Do you do it time-based (say, once or twice every second)?
Or is it data-based (say, every 1 GB)?
Data capture is logged every second from Iometer. The first two detailed charts have time (seconds) on the x-axis and MBps on the y-axis; the 3rd-5th charts have GB on the y-axis.
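For anyone curious how the GB charts fall out of that: since there's one sample per second, summing the MB/s values gives the running total of data written. A quick sketch (the CSV column names here are made up; Iometer's real results file is laid out differently, so adjust the parsing):

```python
import csv

# Hypothetical per-second log with columns: elapsed_s, mbps.
def cumulative_gb(path):
    rows = []
    total_mb = 0.0
    with open(path, newline="") as f:
        for rec in csv.DictReader(f):
            mbps = float(rec["mbps"])
            total_mb += mbps          # 1 sample/second, so MB/s * 1 s = MB
            rows.append((int(rec["elapsed_s"]), mbps, total_mb / 1000.0))
    return rows  # (second, MB/s, cumulative GB written)

for sec, mbps, gb in cumulative_gb("seqwrite_log.csv")[:5]:
    print(f"t={sec:>4}s  {mbps:7.1f} MB/s  {gb:8.2f} GB total")
```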
(6) Have you always used Iometer to do it?
Yep
(7) Is it always for 15 minutes?
Yep, but I didn't have it run the idle rounds to check cache recovery until maybe half a year ago.
(8) What Over Provisioning do you use?
Is it the standard amount, as formatted?
I'm assuming it is, because you don't say anything about that.
I don't adjust user capacity settings. The sequential write saturation test is the only test where I test the drive empty and as a secondary device. All the other tests are done with the drive running the OS and 50% full, for 3-4 rounds at varying idle times, to represent the closest real-world use case. By now, however, most drives I test end up getting tested at least 5-6 more times after that when doing other articles and testing new workflow patterns.
I may transition to running the tests with the DUT as a secondary device in the future though, except for SPECworkstation3 since it needs to be run on the OS drive.
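For reference, padding a drive to the 50% fill level mentioned above can be scripted too. A rough sketch (the fill directory is a placeholder; random data is used so controller compression can't cheat):

```python
import os
import shutil

# Placeholder fill location on the DUT's volume.
TARGET_DIR = "D:\\fill"
FILE_SIZE = 1 << 30    # 1 GiB per pad file
CHUNK = 64 << 20       # write in 64 MiB chunks

os.makedirs(TARGET_DIR, exist_ok=True)
total, used, _free = shutil.disk_usage(TARGET_DIR)
i = 0
while used < total // 2:
    with open(os.path.join(TARGET_DIR, f"pad_{i:04d}.bin"), "wb") as f:
        for _ in range(FILE_SIZE // CHUNK):
            f.write(os.urandom(CHUNK))  # incompressible data
    i += 1
    total, used, _free = shutil.disk_usage(TARGET_DIR)
```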
I like the way that you report the output, in particular the graph that shows Gigabytes written as a function of time.
I think that's a better metric than the usual MB/s versus data written in GB.
Thanks very much.
Thank you, if you have anything else you would like to see or adjustments, I am always open to feedback. Just hard to balance out how much time to spend on testing and graphing for these at the end of the day.
In addition, would it be possible to archive your output somewhere on your site?
A lot of the other reviewers do that, and it's very useful.
Again, thanks for your efforts and helpful comments.
Best,
Alan Jarvis
Sheffield
I'll see what I can do.
Manufacturers would love to have influence on how we test products; that way they can steer reviews to even more favorable results. See where this is going? BTW, they don't have any influence here, though I've seen them contacted about a glaring bug that should be fixed ASAP.
As for testing methodology: would you give yours, which took hundreds of hours if not years to perfect, away to a competitor? And for free, so they can undercut you and take your job? I think not.
And then there is the whole "Who actually owns the process?" question: is it the property of the reviewer, or is it the website owner's intellectual property? Give that away and you could see a hefty fine or even jail time.
And to be clear, I am not a product reviewer, but those are some of the more obvious issues I can see.
My testing is an open book. I do what I can to cover most aspects of performance while exploiting and highlighting the weaknesses and strengths between different architectures. Manufacturers can be quite sneaky and sometimes implement specific benchmark focused modifications in the firmware that don't carry over into other tests. I've seen it happen over the years and adjust my workflow as needed.
Anyone can go and try doing what I do and follow my workflow somewhat, but not many can take a confident professional stance on these products week after week, year after year as I have. And power testing? Ha, hardly anyone even tries. After toying with and reporting on SSDs for the past decade, I haven't seen many try to keep up with the workflow. Those who have usually don't hang around reviewing storage too long, or are owners of a tech review site themselves. Well, I mean there are those who simply review storage based on one run of CrystalDiskMark and ATTO and don't even mention the underlying hardware... but then again, is that really a review? Do those actually count?
I've been there and done that to a point, but man, you gotta be deep into storage to sense things. It can take a lot of time to get the data, conclusions, and comparisons right each week with a new product every time, especially when you have little data to go off of. I am the point of contact to get any juicy hardware detail out of the product managers on SSD storage... and they still don't give me much of anything. So I always gotta do some extra digging to get some special detail from random white papers, or download software I probably shouldn't have in the first place to get the answers I need.
I am a freelance contributor to Tom's Hardware. My reviews, the test procedure, sample acquisition, basically everything, is all my own workflow, and I have no influence from any company besides Tom's requesting me to keep up with an SSD content output schedule. I'm nearly fully independent; all opinions are my own and based on my own findings. I can change anything I want with my testing/reviews as long as it is for reader benefit.