Question i9-9900K now or wait?


RobCrezz

Titan
Herald
I don't even understand why the good ol' AS5 got such a bad rep... A single application has kept my 2700K at 4.6GHz since 2012 using a TT Frio. I've even used it for some GPUs with amazing results. The thermal transfer properties are superb for the price. While I won't switch in the short term, because the paste bundled with the Noctua had excellent reviews, it still wears out quicker than the AS5 would. Even the MX-4, which is everyone's fave, wears out faster than the AS5. The curing period is longer though, and I won't deny I got scared the first few times I used it, thinking the temps would remain higher than I expected.

But hey, I'm always looking for more information, so if you have a couple of links to benchies that show the AS5 is showing its age, I always welcome those :p

Cheers!
In all seriousness, it's not that bad; there are just much better products on the market that don't need curing like AS5 does. I would class it more as out of date.

Either way, NT-H1 is much better.

What do you mean about wear out? I haven't noticed wear out at all, unless it's a crap paste that dries quickly. Are you sure it's not just becoming less tight over time and needing more pressure?
 
Reactions: boju
Thanks, Yuka.

I'm having the system built for me by Scan. I'm fairly confident I could do it myself, but not quite there yet... ;)


Memory

I think it's between the quad (as above):

64GB (4x16GB) Corsair DDR4 Vengeance LPX Black, PC4-24000 (3000), Non-ECC Unbuffered, CAS 16-20-20-38, XMP 2.0, 1.35V
(https://www.scan.co.uk/products/64gb-4x16gb-corsair-ddr4-vengeance-lpx-black-pc4-24000-3000-non-ecc-unbuffered-cas-16-20-20-38-xmp-2)

Or this dual kit (of which I'd need 2x):

32GB (2x16GB) Corsair DDR4 Vengeance LPX Black, PC4-24000 (3000), Non-ECC Unbuffered, CAS 16-20-20-38, XMP 2.0, 1.35V
(https://www.scan.co.uk/products/32gb-2x16gb-corsair-ddr4-vengeance-lpx-black-pc4-24000-3000-non-ecc-unbuffered-cas-16-20-20-38-xmp-2)

Only £18 between them. I'm afraid I don't understand the timings etc. but they look identical to me, so presumably it's just the quad vs dual difference?


Storage

I'll be adding a lot more storage, don't worry (I already have 3 HDDs in my current rig which I'll transfer over, plus will be adding some SSDs shortly); the 500 GB NVMe will only be used for the OS and primary progs.
Details of the two NVMe drives I'm considering below (WD is actually slightly more expensive, but not much in it):

500GB WD Black SN750 M.2 (2280) PCIe 3.0 (x4) NVMe SSD, 3D NAND, 3430 MB/s Read, 2600 MB/s Write, 420k IOPS
500GB Samsung 970 EVO Plus M.2 (2280) PCIe 3.0 (x4) NVMe SSD, MLC V-NAND, 3500 MB/s Read, 3200 MB/s Write, 480k/550k IOPS

The overall specs of the Samsung appear superior to the WD as you suggested; WD's warranty is 5 years vs 3, and its MTBF is 1,750K hrs vs 1,500K hrs, but Samsung's write speed is much better at 3200 MB/s vs 2600 MB/s. I was going to go with WD as I've had a very good experience with their other products, but that write speed has me leaning towards the Samsung (though as an OS/progs drive, read seems much more important than write). Oddly the "EVO Plus" looks faster than the "PRO" and is significantly cheaper... am I missing something?


Thanks
Those 2 kits are, from what I can read in the specs alone, the same. They're just bundled differently. Buy whatever is cheaper or more convenient out of the two.
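For what it's worth, the packaging really is the only difference on paper: peak memory bandwidth depends on the transfer rate and the number of populated channels, not on how the sticks were boxed. A rough back-of-the-envelope sketch (assuming a dual-channel board like the Z390 here and a 64-bit, i.e. 8-byte, bus per channel):

```python
# Back-of-the-envelope DDR4 bandwidth: rate (MT/s) x 8 bytes per transfer x channels.
# Assumed figures: DDR4-3000, 64-bit (8-byte) bus per channel, dual-channel platform.
def ddr4_bandwidth_gbs(transfer_rate_mts: int, channels: int = 2) -> float:
    """Theoretical peak bandwidth in GB/s (decimal)."""
    return transfer_rate_mts * 8 * channels / 1000

# Four sticks from one 4x16GB kit vs. two 2x16GB kits: same channel count,
# same rate, so the peak bandwidth is identical either way.
print(ddr4_bandwidth_gbs(3000))  # 48.0 GB/s on a dual-channel platform
```

So the only question between the two kits is compatibility testing, which is exactly the point made below about buying a single kit.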

As for the NVMe's... The "PRO" Samsung drives should have better endurance, from what I remember, and better sustained and random writes than the EVO counterparts. I haven't read about the "EVO Plus" though. Benchies are everything, so you may want to google for more information on the differences between the 970 PRO and the EVO Plus, or at least what the "Plus" actually means.
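One way to put the listed IOPS figures in perspective: 4K random IOPS converts to throughput by multiplying by the block size. A quick sketch using the read IOPS from the listings above (assuming 4 KiB blocks and decimal megabytes), which is the metric that tends to matter most for an OS/programs drive:

```python
# Convert rated 4K random IOPS into throughput: IOPS x 4 KiB block size.
def iops_to_mbs(iops: int, block_kib: int = 4) -> float:
    """Throughput in MB/s (decimal) for a given IOPS rating and block size."""
    return iops * block_kib * 1024 / 1_000_000

# Read IOPS figures from the two listings above:
print(iops_to_mbs(420_000))  # SN750: ~1720 MB/s at 4K random read
print(iops_to_mbs(480_000))  # 970 EVO Plus: ~1966 MB/s
```

Real-world OS workloads run at low queue depths, so neither drive will hit these numbers; the sketch just shows the two drives are closer in random performance than the headline sequential figures suggest.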

Well I've no idea about such things (see above!), but I did spot that Noctua have a newer paste than the one bundled with the D15: https://noctua.at/en/products/thermal-grease/nt-h2-3-5g
I'm just curious, that's all. I know good-quality CPU grease lasts ~3 years, but AS5 feels like it's eternal, haha. Not quite, but still.

Cheers!
 

boju

Champion
I have my first Noctua paste application still in use on two builds, a 920 @ 4GHz w/ D12 and a 2600K @ 4.5 w/ D14, since those CPUs' release. Don't remember if it was NT-H1 back then, but still, their stuff has been good.
 

Darkbreeze

Titan
Moderator
You do NOT want to EVER use 2 separate memory kits. When you do that, you are intentionally introducing the potential for one or more sticks from either kit not wanting to play nice with the rest. In reality, there ARE NO "dual" or "quad" kits. There are only a number of sticks, all of which, when included IN a kit together, have been factory tested to be compatible, to "play nice", with all the rest of the sticks in the kit.

If you need four sticks to get the capacity you require, then buy a kit that COMES with four sticks in it, and ignore whether the marketing claims it is "dual", "triple", "quad", or any other type of "channel". There are no memory modules designed to better support a specific TYPE of channel architecture. There are only memory modules and they can be compatible or not, with each other, in any given memory architecture channel configuration.


As for the thermal paste, if you are intending to use what the CPU cooler comes with, which will be either NT-H1 or NT-H2 depending on how long that cooler has been sitting on the shelf, then that is fine. If you are intending to BUY thermal paste, then while the AS5 is fine (I've never had ANY problems using that or the MX4 over the years), I'd highly recommend the Thermal Grizzly Kryonaut paste, as most reviews show it having a slight degree or two advantage over the Noctua NT-H1 paste, though Kryonaut and NT-H2 seem to be about the same. In any case, there is only about a four degree difference from worst to best among any of the common, well-known thermal pastes.

Go with the Samsung 970 EVO Plus. There should not even be a question here in that regard.
 
Reactions: R_G_S

R_G_S

Honorable
Dec 17, 2012
129
9
10,585
0
Hey guys, back again!

I received an email from ASUS regarding bandwidth limitations on the Maximus Extreme; unfortunately it wasn't very helpful and didn't answer my question. From all the documentation I've read, it seems that if both M.2s are installed as PCIe, then nothing is lost, and if one is SATA, you lose a single SATA port. The bottom PCIe slot runs at x2 by default, but can be made to run at x4 with 2 SATA ports disabled. I don't think any info's missing; that's just the way they have the board set up, i.e. 2 SATAs linked to the lower PCIe slot as opposed to the second M.2 - which I like, as it gives me 2 full-speed NVMe drives and 6 SATAs should I need them.

As mentioned, using the DIMM.2 expansion card (2 M.2s at x4) will drop the GPU slot to x8, which with a 2080 Ti results in ~2% performance loss (at 4K), but presumably this is likely to increase as cards get faster. I was wondering whether it would still be possible to make use of the DIMM.2 card via a PCIe adaptor, though obviously running at a reduced speed; e.g. the aforementioned lower PCIe slot with 2 SATAs disabled (x4) could in theory run the DIMM.2 (with 2 M.2 drives) at x2, correct? No big deal (at all!), and not something I'd be doing any time soon, just wondering theoretically ;).
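For the theoretical side of that question: PCIe 3.0 signals at 8 GT/s per lane with 128b/130b encoding, which works out to roughly 985 MB/s of usable bandwidth per lane. A quick sketch of what x4 vs x2 vs x1 links would cap a fast NVMe drive at (the 3500 MB/s figure is the Samsung's rated sequential read from earlier in the thread):

```python
# Rough PCIe 3.0 throughput math: 8 GT/s per lane with 128b/130b encoding
# gives roughly 985 MB/s of usable bandwidth per lane.
PCIE3_MBPS_PER_LANE = 8000 * 128 / 130 / 8  # ~984.6 MB/s

def link_cap_mbs(lanes: int) -> float:
    """Approximate usable bandwidth of a PCIe 3.0 link with this many lanes."""
    return lanes * PCIE3_MBPS_PER_LANE

# Drive rated at 3500 MB/s sequential read (e.g. the 970 EVO Plus):
for lanes in (4, 2, 1):
    cap = link_cap_mbs(lanes)
    print(f"x{lanes}: link cap ~{cap:.0f} MB/s, effective ceiling {min(cap, 3500):.0f} MB/s")
```

So an x2 link would roughly halve the sequential ceiling of a drive like this, and two drives sharing that link would split it further; random 4K performance, which dominates OS use, would be far less affected.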

On a different note, latest GPU 'Super' rumours: https://wccftech.com/exclusive-nvidias-super-gpus-unleashing-monsters/
 
Well, you've waited long enough that I'd now say: wait for the Ry3K benchmarks. PCIe 4 will make a difference for you, I'd say, and the CPU power you can get from it will be on par with the 9900K.

And I don't think the "super" announcement from nVidia will shake things up that much; not for the 2080 Ti, at least. In any case, you're always welcome to wait a bit more for both the Ry3K benchmarks and whatever the "super" stuff nVidia teased is all about.

Cheers!
 

R_G_S

Honorable
Dec 17, 2012
129
9
10,585
0
Yeah, I had thought the 2080 Ti would be pretty safe (as per previous rumours) + expected the 'Super' series to be nothing more than a minor refresh.

Though I'm aware some question the reliability of WCCF Tech (no idea myself), here's what they say:


New chip: the NVIDIA RTX 2080 Ti SUPER exists and will be an absolute beast of a card
Turns out, there is an RTX 2080 Ti SUPER variant and it's a brand new chip. It's not an old part and it's not a repurposed Quadro part (trust me, I asked). Oh, and it's completely unlocked, meaning the AIBs can have 300 watts TDP on this thing if they wanted to – and knowing the insane custom designs that come out of AIBs, I wouldn't be surprised to see this. I don't have the exact specifications of the chip or the exact pricing yet but I am fairly confident this will have a higher core count as well as faster GDDR6 memory.

This is not the PN-A for OC version chip or the non-OC chip, it's a completely new chip as I mentioned earlier and will have a new PN. Since it's unlocked and will have higher TDP as well as higher core count we expect this to be much faster than the existing RTX 2080 Ti card. Everything we know so far points to this slotting in at roughly the same price point as the existing RTX 2080 Ti or slightly above. In other words, you are going to get a card that is much more powerful but for roughly the same price or slightly more. Unfortunately, however, you are going to have to wait a bit for this card as per my info, it will have a delayed launch.



But who knows. It does look likely that a 2080 Ti refresh is coming (other sources too); whether it's much more powerful is anyone's guess, though obviously WCCF think it will be, which is a concern if buying now. To me a small speed bump is an annoyance but no big deal; however, if it's Titan-level performance for the same price, well...

I waited for Computex, then E3, and now WCCF say June 21, but elsewhere Gamescom is suggested, so August... Additionally, WCCF expect the 2080 Ti Super to have a delayed launch. If the -60/70/80 Super cards were to launch in July we could see the -80 Ti Super launch Aug/Sept, but if the former were only announced in Aug, with a release in Sept, we might not see the 2080 Ti Super until Christmas. Plus we're still guessing on pricing, which could even be a bit above current, so £1,200 or more...

I also want the EVGA XC Ultra version (for the warranty + quiet/cool operation) which is currently priced at £1,100, but the price will be back to £1,220 from next week.
 

TCA_ChinChin

Reputable
Feb 15, 2015
202
36
4,740
7
This thread's been up for a while now, so I assume you already have that i9-9900K? Also, the Super announcement stuff does sound exciting, because if it's anywhere near true, then the current RTX cards are due for a price drop and the incoming Navi graphics cards are basically DOA at their current pricing.
 

R_G_S

Honorable
Dec 17, 2012
129
9
10,585
0
OK, I think I'm going to go for it (system as detailed previously). I stated right at the start of this thread, I need a new machine now and can't wait months. I have put it off and off and off already... though I did learn a lot and avoided some pretty serious mistakes, thanks to the advice given here, so it's not as though the delay was without worth!

If nVidia bring out a 2080 Ti Super this summer and Intel's 10-core comes in Oct, too bad - I'm just going to have to suck it up ;). At least there's a discount on the card I'm after right now so that should cushion the blow (only very slightly!) + though expensive, prices on the other parts are also good/discounted, so I've been fortunate in that regard at least. I could easily hold out a couple more weeks in hope of GPU/CPU news, only to end up getting the same rig as specced now and pay a few hundred more.

I'm sure Ry3K will review very well, and I can totally see why it'd make sense to go with that, but I'm not personally willing to take the risk with AMD; again, reasons given previously.

Will post again once the deed is done.
 

R_G_S

Honorable
Dec 17, 2012
129
9
10,585
0
I'm not sure whether it was hardware or software or a combo of both, but the system I had in the past was really buggy and frequently crashed (progs and games). As for the games, maybe I was just unlucky with a bad GPU, but I do recall that I wasn't alone, as plenty of others were reporting the same thing (I can't say 100%, but I'm pretty sure I replaced the card and the replacement had the same instability issues). At the time, all the reviews stated that the AMD card was superior to the nVidia offering (I forget which right now, as it was quite a while back), and it probably would have been, were it stable. Eventually I gave up and went back to Intel/nVidia and everything just worked (whilst I do remember the nVidia card ran hot and wasn't wonderful, at least it was dependable and offered solid, if not fantastic, performance).

Sharing this experience with others in the games industry at the time in an effort to fix the issues, the consensus was that AMD was not really a good platform for development, and I must admit that every company I've worked for has used the Intel/nVidia combo (either Core/GeForce or Xeon/Quadro). Maybe I was just unlucky with my particular setup, and I strongly suspect that the GPU was more to blame than the CPU, but it really put me off AMD.

On top of that, I was watching a 'Tech Deals' video where the guy was saying how in his experience, whilst AMD are fantastic value, Intel is just a nicer, smoother platform to work with. Whether correct or not, it did chime with my experience as aside from the serious issues I'd had, everything just felt a bit 'budget.'

There was an interview on YouTube during Computex (might have been GN, can't remember) where they were discussing Intel's problems, and one of the guys, whilst in full agreement that Intel were in serious trouble, also said not to underestimate the power they wield through their connections/infrastructure/financial heft, which enables them to ensure software is often better optimised for their platform and also better supported (drivers etc.). Again, right or wrong, I don't know, but it echoed my experience, which is why, tempting as AMD is right now (and I do get that!), I still feel uneasy about them. I know Intel have had their fair share of issues, but generally speaking I feel they're the safer bet, in my specific case at least.
 

R_G_S

Honorable
Dec 17, 2012
129
9
10,585
0
Update - I have placed the order!

By pure luck a number of parts I wanted (the exact GPU and PSU for e.g.) were both discounted which was a nice bonus.

There's a delay however of 10-15 working days (normally 5), so should there be a shock GPU/CPU announcement before the machine is actually built I can still make changes.

Cheers!
 

Darkbreeze

Titan
Moderator
The AMD Bulldozer/Piledriver AM3+ platform was a catastrophe; you won't get much argument there, although considering the price differences at the time, I will say that builds with those parts worked out well enough for budget builders, but for serious users it was a pile of crap. Prior to the AM3+ platforms, though, AMD had some solid, competitive architectures and some compelling reasons to look at them, although aside from a brief moment in time when Intel stumbled with the Pentium 4, they've never really had parity with Intel in terms of overall performance. They've had small victories here and there, but have mostly lain prone with Intel's boot on their neck, due to the fabrication and licensing restrictions related to the x86 platform that Intel leveraged against them for years and years.

Now it feels like the roles are about to be reversed, and it's mostly because Intel ignored the fact that AMD was diligently working towards a new platform with real advancements in design, while Intel was coasting on their momentum from years of domination and not actually investing much in real innovation or improvements. Then, too late, they realized that the sleepy efforts they had put into 10nm were not going to pan out in production due to a number of problems, and THEN they really dropped a brick in their shorts when all these vulnerabilities started coming to light.

As to their graphics card architectures, I'd agree that for a period of time AMD had real trouble with power consumption and driver development. Most of that was sorted out a few years back when they canned the Catalyst control panel and hired new driver developers to straighten that mess out. I don't think they are likely to catch Nvidia in terms of performance anytime soon, but you never know, because Nvidia has been guilty of resting on their laurels much like Intel, although only publicly. In private, they've been ready to release what they've actually been working on, in the event it ever looked like somebody was getting close enough to parity to steal away significant market share.
 
Reactions: rigg42

R_G_S

Honorable
Dec 17, 2012
129
9
10,585
0
Final build is:

Case: SilverStone Temjin TJ11
Motherboard: ASUS Maximus XI Extreme
CPU: Intel i9 9900K
CPU Cooler: Noctua NH-D15
Memory: 64 GB (4x16GB) Corsair Vengeance LPX 3000 MHz
Graphics Card: EVGA GeForce RTX2080 Ti XC Ultra
PSU: Seasonic PRIME Ultra 850 W 80+ Titanium
Operating System: Windows 10 Pro High End 64-Bit

Still working on my storage options, but the OS will be on a 512 GB NVMe drive. As mentioned, I was lucky to get good discounts on both the GPU and the PSU.

Also as mentioned, the system builder is running a bit behind, so should there be an RTX Super announcement on the 21st (as rumoured), I would be able to change the GPU if needed. I think it's pretty unlikely I'd do that though, as going by the leaked specs the -60/70/80 Supers don't seem to be all that big of a jump over their standard brethren (despite some suggesting everything's going up a notch, e.g. the 2070 Super will almost equal the current 2080, and so on; I'm not sure I see that from the figures). Plus it seems that if there is a 2080 Ti Super, it won't be launching with the others (again, according to the rumours...).

Thanks for the interesting AMD info there, Darkbreeze. I am tempted by the 16-core Ryzen processor, but for a number of reasons feel it's better to play it safe right now.

One other thing - I can now confirm (from ASUS) that fitting 2 NVMe drives (PCIe) to the Maximus XI Extreme MB will not lose you any functionality. In fact the board is really great for storage options.
 

Darkbreeze

Titan
Moderator
And just when exactly can I expect to see this system arrive by FedEx on my front porch? LOL.

Seriously though, that's a serious system. Should be very nice indeed. I just read yesterday though how Nvidia JUST notified the board partners about the plans for the SUPER models, so I seriously doubt there is even anything in the works AT the moment. Probably going to be late 2019 at the earliest before we see any of these cards in any quantity at all.
 

R_G_S

Honorable
Dec 17, 2012
129
9
10,585
0
Thanks, Darkbreeze - and thanks so much for all the help - really has been excellent!

I wouldn't be surprised to see the 2060/70/80 Super variants arrive this summer (using existing cooling designs) with the 2080 Ti Super later in the year. Right now it's not really needed, either in existing games or to combat anything from AMD. I don't mind if the card gets a bit of a price drop as I got £120 off RRP anyway, but would be bummed to see a new model in July/August at roughly the same price.

In an ideal world, Intel would have just launched the 9900KS and nVidia, the 2080 Ti Super, but we know the former's not coming till 'holiday 2019' and I expect the latter will quite likely arrive at the same time. As I see it, we're probably looking at relatively minor speed bumps here (in both cases) and thus I couldn't justify delaying any further. If it had been Intel's next-gen 10-core processors and Nvidia's 3000 series, that would've been a different matter, but I couldn't keep waiting and guessing!
 
Even if they announce they'll be faster, your system is not going to be a slouch anyway. Plus, you got some nice discounts on the build, didn't you?

I just hope the 9900K doesn't have throttling issues and the assembly fellas do an excellent job attaching the NH-D15 and an even better job with the airflow.

Cheers!
 
Jul 20, 2017
738
2
985
0
I've been buying parts for literally over 2 years, constantly going back and forth on whether to build Intel or AMD, lol. I started out planning to use a 7700K for sure. Then the 8700 and 8700K came out and I was surely going to use those. Then I finally bought a 2700X, only to sell it and plan to get a 3700X. Then I decided on a 3800X. Now I am deciding between the 9900, 9900K, and 3800X, lol. And this whole process started due to my Windows install screwing up and my PC making noises, and somehow I have gone 2 more years with it not falling apart.
 
How many NVMe drives are being used in the build? If any are used in the DIMM.2 adapter (which uses CPU PCIe lanes), your GPU will drop to x8 instead of x16 bandwidth. Only if you use one of the lower two M.2 slots can it be run in x4 full-bandwidth mode, and that's only if some SATA ports are disabled. Just something to be aware of in case you planned multiple NVMe drives. However, I still don't think many games are affected by the GPU dropping from x16 to x8 if you go that route.
 
Reactions: DerCribben
