1M QD32.
Anybody ever use that in real life at home? Try QD=1.
You're missing the point, which is that it's sequential write. Even QD=1 should show similar results.
You raised the issue. I was merely supplying the evidence to support (most of) your point. Why are you mad at me? ...maybe you're a Samsung fanboy?
That only matters if you're using the device to store data coming from a faster drive.
Not true. Try unpacking a big archive, for instance. Zstd decompression is very fast, and on some data the compression ratio is good enough for the output to saturate the write bandwidth of these SSDs.
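To put a rough number on that, here's a quick Python sketch. It uses the stdlib zlib as a stand-in for Zstd (which is faster still), and all-zeros input as an extreme, best-case compressible payload, so the printed rate is purely illustrative:

```python
import time
import zlib

# 128 MiB of highly compressible data -- a deliberately extreme case.
raw = b"\x00" * (128 * 1024 * 1024)
packed = zlib.compress(raw, 1)

t0 = time.perf_counter()
out = zlib.decompress(packed)          # decompress entirely in memory
elapsed = time.perf_counter() - t0

# Output rate of the decompressor, i.e. how fast it can feed a drive.
rate_gbps = len(out) / elapsed / 1e9
print(f"decompressed {len(out) / 1e6:.0f} MB at {rate_gbps:.1f} GB/s")
```

If the decompressor's output rate exceeds the drive's sequential write speed, the drive is the bottleneck when extracting to disk.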
Another example: I've been write-limited on an NVMe drive when saving lossless edits to large video files.
However, it's not a major issue for most consumers, which is why consumer SSDs get away with it.
Also did you notice that Tom's stopped showing Optane in SSD reviews?
Because they were discontinued, I presume. Even if they were still included, they'd be at a disadvantage with their PCIe 3.0 interfaces.
But back to your response to my post: is there a cache in these drives or not? How many PCIe generations will pass before NAND can get to first-gen Optane SSD 4K Q1T1 latency? Ten?
Huh? Why do you think NAND will
ever reach the latency of Optane? As far as I understand, NAND isn't on track to achieve this, and nobody is saying otherwise.
More importantly, it doesn't
need to, for most purposes. If microsecond latency had been
that important for SSDs, Optane wouldn't have gotten canceled.
I get that Optane loses on practical price/perf. But it only loses to DRAM on heavy use, consistent performance, gaming, light use, etc.
Huh? It's an order of magnitude slower than DRAM, has vastly lower endurance, and isn't a whole lot cheaper. It fails the test as a DRAM substitute, which was a big part of Intel's strategy.
It is a shame that it won't be further developed
You presume it could be. We don't know that. For all we know, even an unlimited budget and unlimited development time wouldn't be enough to make it commercially viable. It's a fundamentally different technology that you can't simply presume will follow the same cost & density curve as DRAM or NAND flash.
and have its production scaled up so costs go down and everyone gets better performance.
If you look at the density of the 2nd gen Optane dies, all the volume in the world wouldn't make it competitive.
If you gave a P5800X a similar-sized cache of DRAM to what the others have of DRAM + SLC, what would the following chart look like? That chart is Optane at its weakest.
It's not clear why you'd want a DRAM cache, since it can already sustain more than 75% of PCIe 4.0 bandwidth and the amount of DRAM you could fit in there would only last a few seconds. In a typical use case, the OS buffers writes using system memory, which has an order of magnitude more bandwidth and far greater capacity than what DRAM the drive could hold. Plus, using system memory to buffer writes is more flexible.
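That write-buffering behavior is easy to see for yourself. In this sketch (filename and sizes are arbitrary), a plain write() lands in the OS page cache at roughly memory speed, and only fsync() forces the data down to the drive itself:

```python
import os
import time

# 64 MiB test payload; small enough to fit comfortably in the page cache.
data = b"\x00" * (64 * 1024 * 1024)
path = "writeback_demo.bin"

with open(path, "wb") as f:
    t0 = time.perf_counter()
    f.write(data)            # absorbed by the OS page cache (RAM speed)
    buffered = time.perf_counter() - t0

    t0 = time.perf_counter()
    f.flush()
    os.fsync(f.fileno())     # forces the data out to the actual device
    synced = time.perf_counter() - t0

print(f"buffered write: {buffered:.4f}s, fsync to device: {synced:.4f}s")
os.remove(path)
```

On most systems the buffered write finishes far faster than the fsync, which is exactly the gap that makes an on-drive DRAM cache redundant for this workload.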
BTW, the 800 GB P5800X cost over $2k, which is almost 10 times as much as the rest of the drives in that comparison. It's also in a U.2 enclosure, which gives it more PCB space and the benefit of better cooling. Not a very fair comparison.
Why are you so enthusiastically championing mediocrity?
I'm not. I wish Optane would get cheaper and faster, as much as anyone here. However, I think Intel
usually makes sound business decisions, and the fact that they saw no way forward and couldn't sell it or spin it off means the technology simply didn't have enough gas left in the tank. I also trust Gelsinger to make data-driven decisions and listen to what his technical folks tell him.
You can see the jump in performance from PCIe Gen 3 to PCIe Gen 4 in Optane. Where do you think PCIe Gen 5 would land? My guess is the slowest point wouldn't even be slow enough to be on the chart you posted. But we'll never know.
* sigh *
Everything about the drives changed between the P4800X and the P5800X, not only the PCIe generation. The Optane memory is Gen 2 (with 4 layers instead of 2), the controller is new, and the PCIe speed doubled. You can't just chalk this up to PCIe speed, as the limiting factor is most likely the Optane memory itself. And we really can't extrapolate these results to whatever Gen 3 Optane might've been, without actually knowing anything about it.
Truth be told, I already bought a P5800X after they announced Optane would be discontinued. I'm just not sure if I should hold onto it as an investment to resell in a couple years, or if I should use it as an OS drive. I know I won't notice much difference vs. a conventional SSD, but there's something alluring about having exotic hardware that you know is just a little better than anything else currently available.
It's similar to the reason I bought a Radeon VII, for having the combination of HBM and fp64 that would never again grace a consumer GPU. Since I didn't find time to do anything with it, I ebay'd it for a $1.5k profit, back in 2021.