Question: i9-9900K now or wait?

Tom's Hardware forum, page 9

Karadjgne

Titan
Herald
All I can say is you are doing this right in weighing all these options before making this kind of purchase. If half our users spent half as much time educating themselves as to the facts regarding their hardware and potential purchase, we'd have half as many threads to deal with because half of them are simply down to making stupid decisions and then later on going "what? really?".
Damn thing only allows me to hit the Like button once!
 

Darkbreeze

Titan
Moderator
I agree. Short of motherboards, there are no EVGA products that I would hesitate to buy. I've seen too many weird issues on their boards to consider one, but for graphics cards and power supplies, they are top notch in terms of quality (IF it's a quality model) and especially customer support. That being said, it's still hard to deny the value of that kind of extended warranty, for that price, and I've generally had mostly good experiences dealing with Gigabyte on warranty claims so I wouldn't have much reservation in that regard.
 

rigg42

Prominent
Oct 17, 2018
Warranty


Personal experience obviously makes quite a difference - I've used warranties quite a bit. I've also had a fair few PC parts fail (GPUs, RAM and PSU - all of which were top-tier at time of purchase); most under warranty and replaced, but not all.

The extended warranty at £20/40 is pretty insignificant on a card which starts at £1,000, with plenty costing £1,200+, but each to their own. As said, Gigabyte's warranty is the best I've come across as standard at 4 years and I'm considering that too.

Agree that checking the small print and weighing up shipping costs is important; I'm doing that right now as regards warranty options on the system. Standard is 3 years parts/labour, with the 1st year onsite (they come to you and fix the issue) and years 2-3 return to base (the system goes back to them); Extended adds onsite cover for the second and third years, but for £225... Shipping the system is around £40, so I'll probably stick with the standard warranty in this case: a failed GPU or RAM stick wouldn't require shipping the entire system, and £200-odd extra is a fair chunk of cash. Then again, packing up a beast like this, plus the CPU cooler issues with transport, is a bit of a pain (but probably not worth £225!).
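For what it's worth, the break-even on that extended cover is easy to sketch. This is a back-of-envelope using only the figures above; it ignores labour, downtime, and the point that a single failed part can often be RMA'd without shipping the whole system:

```python
# Rough break-even for the £225 extended onsite warranty vs paying
# ~£40 shipping per return-to-base claim in years 2-3 (figures from
# the post; real costs will vary).
extended_cost = 225       # onsite cover for years 2 and 3, GBP
shipping_per_claim = 40   # approx. round-trip shipping per RMA, GBP

break_even_claims = extended_cost / shipping_per_claim
print(break_even_claims)  # 5.625 - you'd need ~6 claims to come out ahead
```

On those numbers the extended cover only pays off after roughly six whole-system returns in two years, which supports sticking with the standard warranty.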


Motherboard

Have been trying hard to find a board less RGB-tastic/gamer-themed than the AORUS Master that still shares a similar build quality and features and it's proving near impossible! I was looking at ASUS's WS Pro, but that disables x2 SATA ports for each (!) M.2 drive used. The Master OTOH, has 3x M.2 slots, one of which is free (no restrictions), another reduces a PCIe slot speed (no big deal) and only the third cuts 2 SATA connections - so, in theory, I could run 2x NVMe + 6x SATA drives, should I wish. On top of that it has fantastic VRM cooling, a back plate heat sink, 2x 8-pin connectors and built-in WiFi.
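To keep the Master's trade-offs straight, the rules described above can be jotted down as a tiny sketch. The slot names come from Gigabyte's manual, but the per-slot costs here are just my reading of the description above, so verify against the manual's table before buying:

```python
# SATA ports lost when each M.2 slot on the Z390 AORUS Master is
# populated, per the description above. M2P also drops PCIEX4 to x2,
# which costs no SATA connectivity.
M2_SATA_COST = {"M2A": 0, "M2P": 0, "M2M": 2}

def remaining_sata(populated, total=6):
    """SATA ports still usable for a given set of populated M.2 slots."""
    return total - sum(M2_SATA_COST[s] for s in populated)

print(remaining_sata(["M2A", "M2P"]))  # 6: 2x NVMe with all six SATA ports intact
print(remaining_sata(["M2P", "M2M"]))  # 4: avoids the GPU-adjacent M2A slot
```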

Am I missing something here, or is this simply the best Z390 board there is (unless you want extreme overclocking/water cooling features)?

The only other option that appeals to me is the ASUS Maximus XI Extreme (similar price to the WS Pro, but probably better suited to me, i.e. workstation/gaming combo), which I actually like the look of (shock horror!) - but info as to which M.2 drives disable what (or not) seems very limited. It offers an additional 2 (4 total), but it seems that they might then limit the GPU to x8... Maybe, or not; no one seems to know...

Not sure if it's just me, but finding out this info is a real pain, particularly as forum posts seem to contradict one another and the manuals (aside from the Gigabyte one) are rather lacking. I am beginning to wonder whether there's something I'm missing here - how can Gigabyte have such great connectivity options, with all other manufacturers disabling SATA ports left, right and centre, even on their super high-end boards?

Whilst the TJ11 doesn't officially support E-ATX boards, it states in the manual that "Although TJ11 was not designed for Extended-ATX motherboard, the internal space can still allow installation for motherboards with width of up to 11 inches. In addition, the motherboard tray has mounting standoffs for supporting SSI-CEB dual CPU motherboards. Enthusiast motherboards such as ASUS’s Rampage III Extreme and EVGA’s X58 Classified 4-Way SLI are 10.6 and 10.375 inches wide respectively. These are wider than standard ATX motherboard specification of 9.6 inches, but will fit inside TJ11 without any problems. Even if there are SATA connectors mounted on the edge of the motherboard facing to the side, the TJ11 will have room to accommodate them as well." - so presumably the Extreme should fit (at 12 x 10.9 in) should I go that route, agree?

Thoughts on the AORUS Master vs the Maximus Extreme, plus any clarification as to the M.2/SATA/PCIe issues, much appreciated.

Manuals for ref.:

AORUS Master: http://download.gigabyte.us/FileList/Manual/mb_manual_z390-aorus-master_1001_e.pdf
Maximus XI Extreme: https://dlcdnets.asus.com/pub/ASUS/mb/LGA1151/ROG_MAXIMUS_XI_EXTREME/E14681_ROG_MAXIMUS_XI_EXTREME_UM_V2_WEB.pdf?_ga=2.88325403.1839548784.1559893512-1999083181.1556226774

Cheers.
Have you looked into this? https://www.evga.com/products/product.aspx?pn=123-CS-E397-KR

It covers the aesthetic at least. I quickly breezed through the manual and didn't see anything about how the SATA ports are affected when two M.2 SSDs are installed. EVGA support is excellent and they might clear up any questions you have if you reach out to them. This mobo seems like a good fit for your system in a lot of regards.

EDIT: If you need to leverage the IGPU for anything this board is a no go. There isn't even a VRM section for it.
 

R_G_S

Honorable
Dec 17, 2012
Darkbreeze - thanks for the Designare suggestion. I had looked at it briefly previously, but from what I recall found the AORUS Master a better fit feature-wise, though on paper it should be more suited to me (Workstation/Gaming). I'll check again.

rigg42 - yes, I spotted that EVGA motherboard earlier today and it does fit the aesthetic I like. Unfortunately however my builder doesn't stock it (going with Scan now, as opposed to PCSpecialist) so not really an option; good thought though, thanks! I like the look of their Z390 'Dark' board too, but the 32 GB RAM limit precludes it + also unavailable.

This whole M.2/SATA/PCIe thing is driving me nuts - I can't believe it's so hard to nail down the specifics, and from my research so far, I'm not the only one having this issue. I shall go over the AORUS Master and Maximus Extreme manuals again and see if I can learn any more. I think I must be missing something, or else Gigabyte would be making more of this feature, and presumably ASUS wouldn't have opted for an additional 2 M.2 slots (4 total) if they were unusable without crippling other parts of the system? I believe utilising these extra slots limits the GPU to x8 (from x16), which I initially thought was outrageous, but which appears to result in only a ~2% performance decrease (though this seems to vary). More research required, some info here though if interested: https://www.pugetsystems.com/labs/articles/Titan-X-Performance-PCI-E-3-0-x8-vs-x16-851/ and https://www.techpowerup.com/reviews/NVIDIA/GeForce_RTX_2080_Ti_PCI-Express_Scaling/6.html.
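For a feel of why x8 costs so little in practice: PCIe 3.0 runs 8 GT/s per lane with 128b/130b encoding, so usable bandwidth scales linearly with link width. A quick sanity-check of the theoretical numbers, not a benchmark:

```python
# Theoretical PCIe 3.0 throughput per link width: 8 GT/s per lane,
# 128b/130b encoding, 8 bits per byte.
def pcie3_gbps(lanes):
    return lanes * 8 * (128 / 130) / 8  # GB/s

for width in (4, 8, 16):
    print(f"x{width}: {pcie3_gbps(width):.1f} GB/s")
# x8 still offers ~7.9 GB/s of bus bandwidth - far more than games
# typically stream across the link, consistent with the ~2% hit the
# linked scaling tests measure.
```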

Again, any help, particularly on the Master/Extreme differences, would be much appreciated.

Thanks.
 

knickle

Distinguished
Jan 25, 2008
NO hardware warranties on graphics cards are EVER transferable to a third party. You will ALWAYS need to provide the buyer with the original purchase documentation in order for the warranty to be honored.
That's not what I meant. Some warranties (not specifically electronics) state that you MUST be the original buyer for the warranty to be valid. A receipt is not always enough. It also depends on your local laws and the country the product is shipped from. If you are from the wrong country, the warranty can automatically be null and void. And remember, this is an extended warranty.

My point is, cover your ass. Do your due diligence and don't ever assume anything.
 

Darkbreeze

Titan
Moderator
Ten years ago we were a lot further back down the path of Moore's law. Five years ago, that path got a lot rockier with a plethora of thorny vines growing across it in many places. To think that we will have progressed as far in ten years as what we did in the last ten is just being intentionally obtuse IMO.

They're having a pretty hard time just scratching out small gains for both CPUs and graphics cards and the way forward doesn't currently seem as though there are any obvious "ah ha" moments coming anytime soon. You never know, but then again, you don't "know" either.
 

R_G_S

Honorable
Dec 17, 2012
10 years from now is 2029... Ten years ago, the best GPU on the market (2009) was a GTX 295 with 1.8GB VRAM.

I'll stick to my comment about the 10 year warranty. I will die on that hill, haha.
Whilst I get where you're coming from, as said previously - it's not really 10 years, but anything over 5...

For me personally 3 years is a bit touch and go as I don't upgrade GPU every generation (or even every other!) and whilst I've always bought the best (Titan excluded), £1,000+ is in a whole other league. Realising that I wanted more reassurance than 3 years is what led me to Gigabyte and EVGA, and the cost of EVGA's extended warranty is peanuts when compared to the cost of the product, and insignificant when considering that of the total build.

As far as upgrades go, I think there are a couple of basic methodologies: Some will take a mid-range setup and upgrade every year (allowing for 'high' settings at all times in games), whilst others prefer to go high-end and upgrade less often (translating to 'ultra' through to 'medium' in terms of component lifespan). Both are totally valid ways of doing things and I think generally work out about the same price-wise; personally I go the latter route which is why warranties are important to me (I am also a heavy user so things are more likely to fail). Of course some are fortunate enough to go high-end and upgrade every year/other year!
 
Ten years ago we were a lot further back down the path of Moore's law. [...] You never know, but then again, you don't "know" either.
If you think there's not going to be anything better that can deliver more than 5x or even 10x the performance (the gap from 2009 to now is roughly 10x), you're also being obtuse on purpose. At the very least, pessimistic.

While I don't disagree that the linear "shrink -> more transistors -> profit" methods of today are not sustainable, the big fellas are working around that problem with a lot of different approaches to how transistors are manufactured and arranged. I'm pretty damn sure we'll get or see a quantum GPU (so to speak) by 2029, or more graphene stuff that can make clock speeds boom.

Cheers!
 

boju

Champion
I'm optimistic, but with the world stuck in the 2nd industrial era, significant advances are taking longer than they need to. There could be much more than what today has to offer, and all of that is down to money: R&D, overheads and the cost of business, and you can put the word greed in there too.

3rd industrial revolution is where stuff will happen, if the human race ever gets there.

Very interesting video; I recommend anyone with a passion for this to watch it.
 

R_G_S

Honorable
Dec 17, 2012
Continuing to research the bandwidth situation:


Gigabyte Z390 AORUS Master

Here's some info from the AORUS Master manual (as referred to previously + MB layout showing location of the drives):

[manual excerpt and board-layout diagram omitted]
Whilst this looks pretty good, I've read reports of the 'P' slot running at x1 mode even if the PCIEX4 slot is empty, which is rather odd (posts by 7Stark):

View: https://www.reddit.com/r/gigabyte/comments/a1u57t/z390_auros_ultra_m2_question/


M2A looks perfect as it doesn't affect any connections (why is it only Gigabyte can do this?), but is situated right below the GPU...
The 'M' slot is standard fare in that it disables 2x SATA ports.

If the M2P x1 issue was a fault, then things appear pretty decent, allowing for a couple of NVMe drives + a full set of 6 SATAs, with the only 'cost' being PCIEX4 running in x2 mode which is no big deal for me as it'd be empty anyway. Even if one should wish to use M2M to avoid the unfortunately situated 'A' slot (GPU proximity), that's 2 NVMe drives at the cost of only 2 SATAs (+ the aforementioned, but in my case unused, PCIE4).


ASUS Maximus XI Extreme

Here's the board diagram showing 2x M.2 slots, lower right (22), and 2 further slots added via the DIMM.2 card (7), upper right:

[board diagram omitted]
Unfortunately, ASUS do not include a table showing what disables what, and info in that regard is not easy to find. A reviewer states the following (translated):

"In addition to the mentioned ROG DIMM.2 card with a pair of M.2 Socket 3, the possibilities of organizing the disk subsystem are represented by two interfaces M.2 Socket 3 and six SATA 6 Gb / s ports. There are several limitations. One of the slots M.2 Socket 3 (“M.2_1”) shares the bandwidth with one of the SATA 6 Gb / s ports (“SATA6G_2”) when installing a SATA M.2 drive. In turn, the two SATA 6 Gb / s ports (“SATA6G_5” and “SATA6G_6”) share the bandwidth with the lower PCI Express 3.0 x16 (x4) slot, which by default runs in x2 mode. And the DIMM.2 slot itself is disabled by default. Its activation in BIOS will put the PCI Express x16 slot (“PCIex16_1”) into x8 mode." (https://ru.gecid.com/prtart.php?id=54289)

Now, that's all well and good but...

"One of the slots M.2 Socket 3 (“M.2_1”) shares the bandwidth with one of the SATA 6 Gb / s ports (“SATA6G_2”) when installing a SATA M.2 drive." - So that's a single SATA drive disabled if installing a SATA M.2 drive in M.2_1, but what if it's a PCIe drive (as it would be)? I wonder if this is the same, functionality-wise, as Gigabyte's M2A slot, i.e. it'd be fine/no conflicts if a using a PCIe M.2 drive. As for the other M.2 slot (M.2_2), the manual states that whilst M.2_1 supports PCIe 3.0 x4 and SATA mode; M.2_2 is PCIe 3.0 x4 only. I'm not sure what would be disabled here, at a guess 2x SATA ports (equivalent to Gigabyte's 'M' slot)?

It seems pretty clear that making use of the DIMM.2 card downgrades the GPU (PCIEX16_1) slot to x8 mode, but I'm unsure as to whether that's the only thing it negatively impacts or whether any SATA ports are also sacrificed. Also unsure as to the effect in terms of running at x8 as opposed to x16; 2% reduction in GPU performance is OK with me, but results seem to vary and not sure whether this'll hold true for the future. I'm guessing that it's only the PCIe downgrade which would make it the equivalent to Gigabyte's 'P' connection, though irritatingly hitting the primary GPU slot. A shame it can't be configured to take out PCIEX16_2 and PCIEX16_3 instead and/or have the DIMM.2 run at x2 (which would still far exceed SSD speeds).

One final thing: in the ASUS BIOS you can select x2 or x4 for the PCIEX16_3 slot; running at x4 disables SATA ports 5/6 (not a problem for me as I will not be using PCIEX16_3). The CPU PCIe lanes can be configured as follows: [PCIEX16_1 + PCIEX16_2] [DIMM.2_1 + PCIEX16_2] [DIMM.2_1 + DIMM.2_2], though honestly I'm not quite sure what that means as I'm still getting to grips with all this ;).

In summary (my best guess!):

M.2_1: No conflict if PCIe; disables 1x SATA port if SATA
M.2_2: Disables 2x SATA ports (EDIT: I actually think there's a chance that the M.2_2 slot is clear as I can't find anything that states otherwise, whereas all the other limitations are specifically listed, but that's probably wishful thinking)

DIMM.2_1 & DIMM.2_2: PCIEX16_1 reduced to x8 mode (irrespective of whether one or both drives are installed)

PCIEX16_3: If run in x4 mode disables 2x SATA ports (no conflict at x2)
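The best-guess summary above can be encoded as a quick lookup so different drive configurations can be compared. The mappings carry the same caveats as the list itself, including the unconfirmed M.2_2 guess, so treat them as assumptions, not facts from the manual:

```python
# Side effects of each storage/slot choice on the Maximus XI Extreme,
# per my reading of the manual and the translated review - unverified.
EXTREME_IMPACTS = {
    "M.2_1 (SATA)": ["SATA6G_2 disabled"],
    "M.2_1 (PCIe)": [],
    "M.2_2": ["2x SATA ports disabled (unconfirmed)"],
    "DIMM.2": ["PCIEX16_1 drops to x8"],
    "PCIEX16_3 @ x4": ["SATA6G_5/6 disabled"],
}

def impacts(config):
    """Collect the side effects of a chosen set of devices/modes."""
    return sorted({hit for item in config for hit in EXTREME_IMPACTS[item]})

print(impacts(["M.2_1 (PCIe)", "DIMM.2"]))  # ['PCIEX16_1 drops to x8']
```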


Conclusion

Well, that's where I am so far - sorry for the wall of text!

I'm leaning towards the Extreme board as I prefer the look and have the option of running 4x NVMe drives should I wish in future. That said, the Master allows for 2x NVMe drives + 6x SATAs without really impacting anything else (PCIEX4 only), which is nice. I prefer the locations of the Extreme's M.2 slots over those on the AORUS board, though Gigabyte's are probably easier to remove and also allow for 3rd-party heat sinks (such as some WD drives have); plus I do worry that the DIMM.2 card might block the (otherwise excellent) airflow to the RAM sticks.

If my assumptions are correct, as regards the Extreme, I could put the OS in the M.2_1 slot and run 6x SATA drives for now, with the option of either replacing 2 of them with another (single) NVMe, and/or utilising the DIMM.2 card for a total of 4x SATA and 4x NVMe or 3x NVMe and 6x SATA, depending on the cost of running the GPU at x8.

Anyone still here... :sleep:

Thoughts appreciated as always.
 

R_G_S

Honorable
Dec 17, 2012
One final thing, and (thankfully!) unrelated to the bandwidth issues...

Ideally I'd be putting an i9 9900KS in this board; as I don't overclock, it should be perfect for me and I've ruled out waiting for 10-core Comet Lake which for all we know might not arrive until 2020. Now, in honesty, I don't expect there to be much of a performance gap between the K and the KS (stock), but being the freak I am I like to have the best I can get (+ will be using the rig for many years).

Waiting for the KS seems somewhat silly as it's slated for Q4 (or 'holiday' as mentioned), so likely 5-6 months away; thus going for the K right now makes sense. I was, however, considering the possibility of having my system builder install the KS upon release (say December), which is a service they provide for free (CPU upgrade - sans shipping and parts obviously!), and then selling the K to recoup a good chunk of the cost. But I don't tend to do the eBay thing (yet) and am unsure as to how much it'd go for. From what I can tell, high-end Intel chips seem to retain value fairly well, and the i9 9900K fits a good number of motherboards, which would make it a popular upgrade; nevertheless it's a risk and a hassle. Another thought which came to mind was going with a cheap low-end chip in the system right now that could tide me over until the launch of the KS, but I've no idea how that'd run in the interim.

I've seen some crazy videos on YouTube (below) but unsure how such a CPU would fare with 3ds Max and Photoshop?

View: https://www.youtube.com/watch?v=n0z3R2exRPQ


View: https://www.youtube.com/watch?v=EMmLBc2nqHo


Gut instinct tells me I'm being silly, but just thought I'd ask for a reality check ;).

(quick reminder of system specs, aside from CPU: Z390 MB; 64 GB RAM; 2080 Ti, 850 W PSU, Win 10 Pro)
 

Darkbreeze

Titan
Moderator
The reality check is that the difference in performance would be along the lines of maybe 2-4%, and you'd be paying a premium for it even if you were able to sell that K unit for most of its original value, which, used, you probably would not. Not worth it. Either get the K SKU and stick with it, or wait and get the KS SKU.

Honestly, at this point, I'd really advise waiting to see what performance reviews for the Ryzen systems show. It might be a facepalm moment if you don't, and you then realize you could have had a much better performing system for less money. I certainly understand wanting Intel since it's traditionally been the better performer, but as has been discussed throughout this thread, that might not be the case for much longer.

As for the motherboard: is there a reason you need so many x4 M.2 devices? Unless you are doing sequential transfers from one of them to another, you're never going to see that kind of performance anyhow. The random performance of most SATA SSDs at low queue depths is not dramatically different from what you see on M.2 PCIe drives. There is a small bump in performance on NVMe, or a big one at very high queue depths, but few people are ever going to be working at QD32 anyhow, so it's mostly pointless.

If it is simply for aesthetics, then that's a different story although the SATA drives are stashed out of sight on that case so I'm not sure that's a big factor either.
 

R_G_S

Honorable
Dec 17, 2012
Yeah, I know the KS isn't going to bring much of a performance increase and I don't think it's worth holding off for what might be 6 months. As I say, it's mainly just the freak in me that likes to have that top-tier part and, I don't know, there's something I like about '5 GHz across all 8 cores' (given I don't want to OC for longevity/stability reasons). I'm also aware that it'd be priced at a premium and I'd lose money on the existing K chip (naturally), thus it makes very little sense from a value perspective. The thinking behind the low-tier CPU interim solution was that it might still deliver a capable (though far from ideal) machine that I could comfortably work with for now and easily replace in Q4 at a fixed loss (around £60 or so for a low-end Pentium, £120 for an i3 or £180 for an i5), as opposed to gambling on what I might get back by selling the 9900K. But yeah, probably not the best plan (and certainly not the most sensible!), plus I have no idea whatsoever how these super-budget parts perform as I've never considered them previously.

As for AMD, my issue with them isn't that Intel has always been historically ahead, it's that my personal experience was that of an unstable system and glitchy performance in my main progs (Max + PS). Actually, there were also issues with games, now that I think of it (driver related, if memory serves). A lot of dev software (at the time at least) was just made to work better with Intel; as soon as I switched back to Intel/nVidia, everything just worked and I said to myself, "I won't be repeating that experiment again any time soon..." That was a good few years back, mind you, so things may very well have changed; I'm just not ready to take the risk right now when I know that Intel's platform will deliver without question. I suppose it doesn't help that at the time I chose AMD all the forums were ablaze with the same "Intel is dead; AMD is both cheaper and faster" talk we have now (though prob a little less than now to be fair), which was also backed up by glowing reviews. Out of interest, however, when do you expect the reviews of Ryzen to hit, shortly before or after release?

Motherboard-wise, nope, I don't need loads of M.2 drives, but I would like to be able to use a couple at least without losing any serious functionality elsewhere. Looking at boards at a glance they seem very feature-rich, until you begin to learn about all the limitations - I just want to be sure I know what they all are: no nasty surprises. Take the Extreme, for example: had I not done my research I'd have seen 2x GPU slots, 4x M.2 drives and 6x SATA connections and naively just expected them all to work well together (this platform's limited bandwidth is new to me, so I'm just checking I know exactly what I'm getting myself into, before rather than after the purchase!). Not worried about the look of SSDs/HDDs; as you say, I have bays for them elsewhere and I like that setup. In terms of board aesthetics, I much prefer the Maximus XI Extreme to the AORUS Master, which is what made me look at it in the first place, but aside from that, the layout and extra functionality also appeal (including the DIMM.2 card). I mailed ASUS about the bandwidth conflicts and will report back soon, but think I'll likely go with this one.

Cheers.
 

Darkbreeze

Titan
Moderator
You do realize that it's the same part, and they've just given it a "factory" overclock by picking a highly binned part and hard-coding that overclock themselves, right? Overclocking a K SKU and getting a factory-overclocked KS is really the same in terms of longevity. Both are going to require similar configurations and probably the same or near-same voltage.

I think you're letting yourself get roped into thinking that that extra 100-200 MHz is really going to make a lot of difference overall. It's not. Some, but not that much.

This platform has more available lanes than any prior consumer non-HEDT platform. They all have the same available lanes; it's just how they are implemented and allocated that can vary. ALL of these boards can handle a graphics card plus two M.2 devices AND likely four additional SATA drives with no problem. I'm really not sure where the shortcoming is.

All of them lose something somewhere if you use something else. M.2 has to steal lanes from somewhere. Whether that is eSATA, PCI slots or whatever, there are finite resources no matter what board it is.
 

R_G_S

Honorable
Dec 17, 2012
I don't expect to see much of a real-world performance difference, which is exactly why I admit it's not sensible to swap a 9900K for a KS, or for that matter wait 6 months. I'd put it in the same category as paying an extra £100 for an OC'd GPU which in almost all tests only delivers very minor improvements (a few frames at best, with both already over 60 FPS...); in other words it's not really logical, but sometimes for whatever reason you fancy it. Maybe it's the Special Edition thing, or maybe it's because I'd been waiting for something faster than the 9900K, so even though it's only faster by a fraction it somehow appeals, I don't know. Presumably, being binned, the KS chips are guaranteed to be 'better' than the K, in general terms at least, but yeah, 'factory overclock' is how I described it myself ;). I know it's odd; I'll prob come to my senses soon!

Yes, I was comparing mainstream vs HEDT. As I say, there were certain things I wasn't aware of as regards bandwidth sharing so just making sure I've got my bases covered, particularly as the boards have slightly different ways of doing things albeit within the same platform restrictions.

Thanks again for taking the time to respond BTW.
 
Déjà vu!

Whatever brings you peace of mind at this point, @R_G_S. @Darkbreeze made the best points any of us could make against the KS, but if you still will feel better by just waiting for it, there's little we can do other than warn you about it.

At this point, if you feel like you'll be short on PCIe lanes for a bazillion SSDs, PCIe cards and whatever else, then the Z platform from Intel won't cut it and we'll be back to square one :p

Cheers!
 

R_G_S

Honorable
Dec 17, 2012
Honestly, guys, I think I'm good now.

I actually agree with you as regards the KS (see my post above) - it doesn't make sense, logically. My rather strange hankering for it isn't practical or logical, as I freely admit ;).
Also should be fine with the bandwidth, as I say I just wanted to make sure I wasn't going to hobble myself in the future, expansion-wise, but I think it's perfectly doable. Additionally, I 100% get the AMD argument; just my bad personal experience sets me against going with them, this time round at least.

Just waiting on nVidia's 'Super' announcement. If there's nothing new for the 2080 Ti I'll be going with the EVGA XC Ultra Gaming. It's £80 more than the Black Edition, but has better cooling and quieter operation, including the 'fan-stop-in-idle' feature (fans look the same as the Black, so presumably they just run at a slower RPM due to the larger heat sink). I'll also be adding that divisive 10 year warranty ;). MB will either be the ASUS Maximus XI Extreme or possibly Code.

Thanks again. I will post when I have another update - hopefully the final one!
 

Darkbreeze

Titan
Moderator
The EVGA XC Ultra Gaming is ALSO a BINNED card, so it's likely going to run cooler, have more overclocking headroom and deliver better overall performance than most other cards that are probably not binned or of the same quality. It's a very good choice and a very good company that has always had superior customer service.

If I was going to buy one right now, that would likely be the 2080 TI model I'd be most interested in.
 

R_G_S

Honorable
Dec 17, 2012
Final system:

Case: SilverStone Temjin TJ11
Motherboard: ASUS Maximus XI Extreme
CPU: Intel i9 9900K
CPU Cooler: Noctua NH-D15
Memory: 64GB (4x16GB) Corsair Vengeance LPX 3000MHz
Graphics Card: EVGA GeForce RTX2080 Ti XC Ultra
PSU: Seasonic PRIME Ultra 850 W 80+ Titanium
Storage: WD Black SN750 500GB M.2 PCIe NVMe
Operating System: Windows 10 Pro High End 64-Bit

I note that the memory is sold as a quad-channel kit, though the CPU only supports dual channel - is this a problem? Assuming not, presumably the only advantage would be that I could use it in an X299 system if I were to swap platform at a later date?

Also, do you think this is the best 64 GB kit for under £300? Other options available to me here: https://www.scan.co.uk/shop/computer-hardware/memory-ram/1955/2056/2571 (currently selected: https://www.scan.co.uk/products/64gb-4x16gb-corsair-ddr4-vengeance-lpx-black-pc4-24000-3000-non-ecc-unbuffered-cas-16-20-20-38-xmp-2).
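On the dual vs quad point: peak theoretical bandwidth is set by how many memory channels the CPU drives, not by how many DIMMs the kit ships with, so a 4x16GB kit on a dual-channel i9-9900K simply runs two DIMMs per channel. Quick arithmetic, assuming DDR4's standard 64-bit (8-byte) bus per channel:

```python
# Theoretical peak bandwidth: transfers/s x 8 bytes per channel x channels.
def ddr4_bandwidth_gbs(mt_per_s, channels):
    return mt_per_s * 8 * channels / 1000  # GB/s

print(ddr4_bandwidth_gbs(3000, 2))  # 48.0 - Z390 / i9-9900K (dual channel)
print(ddr4_bandwidth_gbs(3000, 4))  # 96.0 - X299 (quad channel), same DIMMs
```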

I may go with Samsung for the NVMe drive as their 500 GB model has faster write speeds compared to WD.
I may go with the Maximus XI Code over the Extreme MB-wise, due to price.

Thanks.
 
No, Quad to Dual is fine, as it will just pair the "duals" instead of creating a single "quad", if that makes sense...

Also, you're going with a single hard drive? Those 500GB will be used up really quickly. Do you have an external NAS, or are you thinking about using external hard drives (like WD's Passport thingies)? And I usually suggest Samsung's 970 PRO NVMe drive, as it's really good. If you want more capacity, the WD is fine, but speed favours Samsung most of the time. Especially their PRO line. More comparable to the WD you selected would be the Samsung EVO line, but I think it would still be a tad more expensive.

Like I said before though, we're at "nitpicking" point now as there's nothing that stands out as "bad" from your build.

Oh, and before I forget to ask, are you buying paste or you'll be using the Noctua bundled one? I'm using the bundled one and I have no complaints, even though I'll be re-pasting my 2700X with my good ol' AS5.

Cheers!
 

RobCrezz

Titan
Herald
[...] even though I'll be re-pasting my 2700X with my good ol' AS5.
Don't replace Noctua NT-H1 with AS5! You might as well use toothpaste :D
 
Don't replace Noctua NT-H1 with AS5! You might as well use toothpaste :D
I don't even understand why the good ol' AS5 got such a bad rep... A single application has kept my 2700K at 4.6GHz since 2012, using a TT Frio. I've even used it for some GPUs with amazing results. The thermal transfer properties are superb for the price. While I won't do it in the short term, because the paste bundled with the Noctua had excellent reviews, it's still wearing out quicker than the AS5 would. Even the MX-4, which is everyone's fav, wears out faster than the AS5. The curing period is longer, though, and I won't deny I got scared the first few times I used it, thinking the temps would remain higher than I expected.

But hey, I'm always looking for more information, so if you have a couple links to share some benchies for the AS5 that show it's showing its age, I always welcome those :p

Cheers!
 

R_G_S

Honorable
Dec 17, 2012
Thanks, Yuka.

I'm having the system built for me by Scan. I'm fairly confident I could do it myself, but not quite there yet... ;)


Memory

I think it's between the quad (as above):

64GB (4x16GB) Corsair DDR4 Vengeance LPX Black, PC4-24000 (3000), Non-ECC Unbuffered, CAS 16-20-20-38, XMP 2.0, 1.35V
(https://www.scan.co.uk/products/64gb-4x16gb-corsair-ddr4-vengeance-lpx-black-pc4-24000-3000-non-ecc-unbuffered-cas-16-20-20-38-xmp-2)

Or this dual kit (of which I'd need 2x):

32GB (2x16GB) Corsair DDR4 Vengeance LPX Black, PC4-24000 (3000), Non-ECC Unbuffered, CAS 16-20-20-38, XMP 2.0, 1.35V
(https://www.scan.co.uk/products/32gb-2x16gb-corsair-ddr4-vengeance-lpx-black-pc4-24000-3000-non-ecc-unbuffered-cas-16-20-20-38-xmp-2)

Only £18 between them. I'm afraid I don't understand the timings etc. but they look identical to me, so presumably it's just the quad vs dual difference?


Storage

I'll be adding a lot more storage, don't worry (I already have 3 HDDs in my current rig which I'll transfer over, plus will be adding some SSDs shortly); the 500 GB NVMe will only be used for the OS and primary progs.
Details of the two NVMe drives I'm considering below (WD is actually slightly more expensive, but not much in it):

500GB WD Black SN750 M.2 (2280) PCIe 3.0 (x4) NVMe SSD, 3D NAND,3430MB/s Read,2600MB/s Write, 420k IOPS
500GB Samsung 970 EVO Plus, M.2 (2280) PCIe 3.0 (x4), NVMe SSD, MLC V-NAND, 3500MB/s Read,3200MB/s Write, 480k/550k IOPS

The overall specs of the Samsung appear superior to the WD, as you suggested; WD's warranty is 5 years vs 3 and its MTBF 1,750K hrs vs 1,500K hrs, but Samsung's write speed is much better at 3200MB/s vs 2600MB/s. I was going to go with WD as I've had a very good experience with their other products, plus all my other drives are/will be WD so I can monitor them together via the WD SSD Dashboard, but that write speed has me leaning towards the Samsung (though as an OS/progs drive, read seems much more important than write). Oddly the 'EVO Plus' looks faster than the 'PRO' and is significantly cheaper... or am I missing something?
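Putting numbers on the gap between those two spec sheets (sequential figures straight from the listings above; random I/O, which dominates an OS/apps drive, differs far less):

```python
# Sequential speeds from the two listings, in MB/s.
wd_sn750 = {"read": 3430, "write": 2600}
samsung_970ep = {"read": 3500, "write": 3200}

def pct_faster(a, b):
    """How much faster a is than b, as a percentage."""
    return (a - b) / b * 100

for key in ("read", "write"):
    gain = pct_faster(samsung_970ep[key], wd_sn750[key])
    print(f"{key}: Samsung ahead by {gain:.1f}%")
# read:  ~2%  - negligible for an OS drive
# write: ~23% - the headline difference
```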


Thanks
 