Intel Optane SSD 905P Bulks Up to 1.5TB

dudmont

Reputable
Feb 23, 2015
1,404
0
5,660


Because its random performance is an order of magnitude better...
That being said, I don't know why anyone who didn't need all their files on the fastest possible drive would need 1.5TB. An extraordinary boot drive, but not much more, is all that's needed out of Optane, in my opinion.
Ole' Chris Ramseyer (who seems to have exited Tom's, sadly) tried very hard to convey the qualities of Optane. Sadly, I think only a handful of us ever got the point he was trying, rather hard, to make.
 

Brian_R170

Honorable
Jun 24, 2014
288
2
10,785
Somebody wake me up when Optane is available in a 500GB-class (or larger) M.2 2280 form-factor. Previous articles say that the upcoming Optane M.2 will be the 22110 form-factor, which is practically non-existent in consumer desktops.
 

jonathan1683

Distinguished
Jul 15, 2009
445
33
18,840


Thanks, I knew I was missing something. It seems too bulky, probably expensive, and slower at read/write while taking up space, and I'd imagine there would be issues booting to the OS from a card plugged into a PCIe slot.
 


These drives are great for servers, as they excel in consistent performance at all queue depths and have low, consistent latency. I'm especially happy to see the capacity go up.

Intel sells Optane to enterprises as the P4800X. However, it is very expensive, and capacity is still limited (you can't really get it above 750GB right now). These new capacities will make it into the enterprise model line.

I will agree that the appeal to the consumer is a bit more limited; however, that doesn't mean they can't benefit, even if the cost might outweigh the perceived gains for them.
 

USAFRet

Titan
Moderator


Because it is significantly faster, but at 5x the price per GB.
1TB 860 EVO = $168
1TB 970 EVO = $300
1TB 905P Optane = $1,400
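
To put that in per-GB terms, here's a quick back-of-envelope sketch (using the rough street prices quoted above, nothing official):

# Rough $/GB from the street prices listed above (assumed, not MSRPs).
prices = {
    "1TB 860 EVO": 168,
    "1TB 970 EVO": 300,
    "1TB 905P Optane": 1400,
}

for name, dollars in prices.items():
    print(f"{name}: ${dollars / 1000:.2f}/GB")

# 1400 / 300 ~= 4.7, which is where the "roughly 5x per GB" figure comes from.
print(f"905P vs. 970 EVO: {1400 / 300:.1f}x per GB")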

You are not the target market for this. Yet.


This is where regular SSDs were 10 years ago.
 

rbarone69

Distinguished
Aug 16, 2006
241
0
18,690
If you've ever worked on very large Visual Studio projects, or other systems that require fast sequential access to many, many files, it would be a huge benefit as well.
 

bit_user

Polypheme
Ambassador

What? Sounds like you're still missing something. Why do you think it has slower read/write?

Most modern motherboards shouldn't have an issue booting off it, either. You might not know this, but M.2 NVMe drives are also on PCIe.
 

bit_user

Polypheme
Ambassador

NAND-based NVMe drives can do fast sequential access for a lot less $$$.

Databases are the killer app for Optane.
 


He might have meant fast random access. When you're compiling, files are pulled in from everywhere, and not necessarily sequentially. Thus, Optane could shine there as well. At least, that's what I think he meant. :)
 

bit_user

Polypheme
Ambassador

Edit: Fixed, after my misreading of the original post was cleared up. Regarding building of Visual Studio projects, see my next post.

As for what I thought we were talking about:

Most access to A/V containers is sequential, and the I/O sizes involved mean that even random access will still mostly follow the sequential-access performance curve.

If you buy this for A/V editing, you're wasting your money. I'd seriously look for benchmarks of a fast NAND-based NVMe drive before making such an expenditure.
 

merlinq

Distinguished
Aug 7, 2012
19
2
18,515



Sorry, but you are the one misunderstanding.
Visual Studio is an integrated development environment for programming. It has nothing to do with A/V, and everything to do with compiling and random I/O.
 

bit_user

Polypheme
Ambassador

Oops, my bad. Sorry about that. Thanks for pointing that out.

I don't know how I thought @rbarone69 was talking about video editing. My comments should be taken in that context.

As for compiling... I'm skeptical. The reason is that most I/O is of frequently-referenced files, and those only need to be read once. After that, they're in cache. And compared to the typical read latency of a fast SSD, the CPU time needed to compile a source file is far greater. Writes to the output files are all buffered by the kernel. This explains why building on a hard disk (which I've spent many years doing) isn't actually that painful.

The exception would be if your system is under extreme memory pressure. For instance, some poorly-coded files can consume gigs of memory to compile. If you're doing a parallel build with multiple such files, it can result in your cache getting purged and even cause swapping. In that case, I'd imagine you'd actually see some quantifiable benefit from Optane. Otherwise, just get a decent SSD or an abundance of RAM. A high-core-count system could even justify NVMe, but IMO a good NAND-based model would still be sufficient.
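
To put rough numbers on the latency-vs-CPU-time point above, here's an illustrative back-of-envelope (the per-read latency, per-file compile time, and file counts are all assumptions, not measurements):

# Illustrative only: assumed order-of-magnitude figures, not benchmark results.
files_read_per_build = 10_000   # source + header reads on a cold cache (assumed)
read_latency_s       = 100e-6   # ~100 us per small read on a decent NAND NVMe drive (assumed)
translation_units    = 2_000    # files actually compiled (assumed)
compile_cpu_s        = 1.0      # ~1 s of CPU per translation unit (assumed)

io_time  = files_read_per_build * read_latency_s   # ~1 s total, and only when the cache is cold
cpu_time = translation_units * compile_cpu_s       # ~2,000 s of CPU, spread across cores

print(f"cold-cache I/O: ~{io_time:.0f} s, compile CPU: ~{cpu_time:.0f} s")
print(f"I/O share of the build: ~{100 * io_time / (io_time + cpu_time):.2f}%")

Even if Optane cut that read latency by 10x, total build time would barely move, which is why I'd only expect a quantifiable win under the memory-pressure scenario above.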
 

emv

Distinguished
Nov 7, 2013
30
2
18,535
We have been through this before. The specs look great for random performance and latency. When you actually use it in a real-world application, it is typically faster by 2-5%. Unless you get your application to behave like IOMeter, you will not see the same benefits. If a 2-5% performance improvement is worth paying 4-5x more for the SSD... go for it.
 

dudmont

Reputable
Feb 23, 2015
1,404
0
5,660


Question: have you ever used a computer with an Optane drive in it? The Tom's storage editor, Chris Ramseyer (the guy who did the Optane reviews), stated that Optane drives were noticeably faster in use. If you've used one, then swapped it out for a fast M.2 PCIe x4 drive in the same machine with the same programs, and you still don't notice the difference, then by all means speak up.
 

bit_user

Polypheme
Ambassador

The responsiveness argument definitely carries some weight, for me. I aspire to use Optane storage for at least the boot/apps drive of my next PC, depending on pricing and products available at the time.

When I open apps or files, I often notice some delay, and I sometimes wonder exactly how much of that is I/O vs. CPU. For all I know, the bulk of it could even be on-access virus scanning.
 

rbarone69

Distinguished
Aug 16, 2006
241
0
18,690
The problem with Visual Studio is that there are a ton of small files for R/W. They are almost always random reads, not sequential. On one project I work on, there are 14,257 files in 3,512 folders, for 1.4GB of data. Much of that, of course, is copies of files for distribution, which happens on every project rebuild.

If you look at these videos, you'll see that where it excels is at small file copies.
https://www.youtube.com/watch?v=E1c9GOv1_hM
https://www.youtube.com/watch?v=MhhR2_M1OG8

Let's add some numbers to this.

- Let's say a developer rebuild takes 5 minutes.
- They do this about 6 times a day (so that's 30 minutes of waiting per day).
- Let's say the person is paid $100/hr (that's $50 per day in wait time).
- We install Optane and get a 25% decrease in file copy/rebuild times.
- We see a savings of $12.50 per day of increased efficiency.
- At that rate, that's a savings of $62.50/week.
- That's $3,250 per year.

I know it's oversimplified and not very accurate, but you get my point; the quick sketch below just runs those same numbers.
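
For anyone who wants to tweak the assumptions, here's that back-of-envelope as a short script (every input is just the assumed figure from the list above):

# rbarone69's back-of-envelope, using the assumed inputs from the list above.
rebuild_minutes     = 5
rebuilds_per_day    = 6
hourly_rate         = 100.0   # $/hr (assumed)
rebuild_speedup     = 0.25    # assumed 25% reduction in copy/rebuild time with Optane
work_days_per_week  = 5
work_weeks_per_year = 52

wait_cost_per_day = rebuild_minutes * rebuilds_per_day / 60 * hourly_rate  # $50/day of waiting
savings_per_day   = wait_cost_per_day * rebuild_speedup                    # $12.50/day
savings_per_week  = savings_per_day * work_days_per_week                   # $62.50/week
savings_per_year  = savings_per_week * work_weeks_per_year                 # $3,250/year

print(f"per day: ${savings_per_day:.2f}, per week: ${savings_per_week:.2f}, "
      f"per year: ${savings_per_year:,.0f}")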