Question New Build and Future Proofing

Most future proof gaming CPU?



elijahsearle

Distinguished
Jul 6, 2012
Hi there everybody! I’ll try to keep this short and sweet, so I’ll ask my question and then explain everything after that.

My question is, which CPU would you buy for a 90% gaming usage computer that I don’t plan on updating (CPU/MOBO wise) for 5-6 years?

I’m stuck between the i9 9900k and the 3900x. (Or maybe even waiting, if I can be persuaded)

So... with that out of the way, let me explain myself.

While I recognize that the 9900k beats the 3900x per clock, overclocks better, and generally has better gaming performance, there are a few things about the 3900x that interest me.

The 3900x has 4 more cores and 8 more threads (than the 9900k), a bigger L2 and L3 cache, and also PCI-E 4.0. The problem is, the 3900x doesn’t overclock as well, and most games don’t take advantage of high thread/core counts anyway.
-So is there any real advantage to gaming with a larger L2/L3 cache?
-While PCI-E 3.0 x16 isn’t fully capped out even with a 2080ti, is it worth going to 4.0 because eventually a graphics card will be?
-Also, realistically games will get better at using more cores and threads eventually right?
Realistically, the only 4.0 advantage at this point in time is with M.2 NVMe drives, and even then that’s not hugely important, right?
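For a rough sense of the bandwidth gap being discussed, here is a quick back-of-the-envelope sketch (the 8/16 GT/s per-lane rates and 128b/130b encoding are the standard figures for PCIe 3.0/4.0; treat this as illustrative math, not vendor specs):

```python
# Approximate usable PCIe bandwidth from per-lane transfer rate and
# 128b/130b line-encoding overhead (PCIe 3.0 and 4.0 both use 128b/130b).
def pcie_bandwidth_gbps(gen: int, lanes: int = 16) -> float:
    """Approximate usable bandwidth in GB/s for a PCIe 3.0 or 4.0 link."""
    raw_gt_per_lane = {3: 8.0, 4: 16.0}[gen]      # giga-transfers/s per lane
    usable = raw_gt_per_lane * (128 / 130) / 8    # GB/s per lane after encoding
    return usable * lanes

print(f"PCIe 3.0 x16: {pcie_bandwidth_gbps(3):.1f} GB/s")        # ~15.8 GB/s
print(f"PCIe 4.0 x16: {pcie_bandwidth_gbps(4):.1f} GB/s")        # ~31.5 GB/s
print(f"PCIe 4.0 x4 (NVMe): {pcie_bandwidth_gbps(4, 4):.1f} GB/s")  # ~7.9 GB/s
```

So 4.0 doubles the ceiling, but since current GPUs don't saturate the 3.0 x16 ceiling, the practical beneficiary today is the x4 NVMe slot, as noted above.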

That pretty much sums it up, sorry for the wall of text. I’m nervous about spending so much money on a computer, especially when new stuff comes out right around the corner.

thank you so much for your help!
 
Last edited:

Eximo

Titan
Herald
You can basically ignore clock speed; it's the actual output that matters. 5GHz may look impressive, but if it only gets you there 5% faster, is it worth it?

That cache is really there to feed the cores. Some tasks will benefit from having extra, but given AMD's design, less so, particularly if that data needs to span across chiplets.

Can't really predict the future of gaming, but generally this has been the case. Some quad cores are starting to struggle with the latest games. But remember, developers are also targeting the largest audience possible, so in a few years' time, with all these recent 2600 and 3600 Ryzen purchases, you will see a lot of 6-core/12-thread optimization, I would guess. Consoles are looking to be APUs again with a pretty beefy mid-range GPU, so they will also have a decently high core count. But not 12/16/20 cores; it will be a while before that becomes standard.

Also they will start hitting diminishing returns eventually for consumer applications.

PCIe is another issue entirely. Are you one to buy GPUs that hefty? It's going to be the top-tier graphics cards that need PCIe 4 to operate well, if any do at all. Right now it is just so much marketing buzz.

Could be in a few years time they focus on core clock speed for consumers and efficiency/core count for servers. AMD is in a good position to deliver on both with the chiplets. Intel appears to be going down a stacking route, so they will also end up with a modular design I believe.

Intel's LGA1151 socket is certainly already EOL. LGA1200 slated to replace it.

AMD's AM4 will still be supported, maybe one more refresh of chips, a Zen2+. I would guess Zen3 will maybe have a new socket.

Also DDR5 to look forward to.
 

Newtonius

Notable
Sep 25, 2019
What Eximo said ^. You also say you'll spend 10% of your time on productive work, but once you see the powerhouse that is the 3900X you might start delving into new hobbies and skills. Ever since I got my 3900X I've really gotten into video editing. The performance for both rendering and compressing makes clipping and sequencing videos so much easier (the 32" monitor helped out a little too ;) ).

The 9900K to me is purely a gamer's choice; excluding a few Adobe applications, that's all it's really good for compared to AMD.

And when the time comes to upgrade even the 3900X, I could chuck it into my server case and have a 24 thread game, web, and NAS server.
 

elijahsearle

Distinguished
Jul 6, 2012
Thanks for both of your quick replies.

So, okay, I’m feeling more confident with the 3900x now but between what you both brought up, I in turn have some more questions.

To answer your question, I do plan on buying a single 2080ti, or waiting for its successor, as I’d like to play at 4K 60Hz. I know not even a 2080ti can max out 3.0 x16 now, but it’s something I might upgrade to in 2-5 years.

So with AMD’s Infinity Fabric, I read that RAM speeds above 3600 drop the fabric from a 1:1 to a 2:1 ratio, partly negating the performance gains from the higher speeds, though with high enough speeds and tight enough timings that penalty can be cancelled out.
This isn’t true of Intel, right? (I’ll google after this.)

Also, do you think it’s likely DDR5 will come out this year with Intel’s new CPUs? Is it worth waiting? My understanding is that RAM doesn’t play a huge role in gaming, beyond having enough of it at a fast enough speed.

For example, my RAM will be 2x8GB Corsair Vengeance at 3600 with 14-16-16-36 timings. That should be competent for a while, eventually upgrading to 32GB, right?
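For what it's worth, a common rule-of-thumb comparison of kits like that one is first-word latency, which you can estimate from the transfer rate and CAS latency (a rough sketch; real access latency depends on more than CL):

```python
# First-word latency estimate for DDR memory (rule of thumb):
#   latency_ns = 2000 * CL / (transfer rate in MT/s)
# The factor of 2000 comes from DDR doing two transfers per clock,
# converted to nanoseconds.
def first_word_latency_ns(mt_per_s: int, cas_latency: int) -> float:
    return 2000 * cas_latency / mt_per_s

print(first_word_latency_ns(3600, 14))  # ~7.78 ns (a 3600 CL14 kit)
print(first_word_latency_ns(3200, 16))  # 10.0 ns (common budget kit)
print(first_word_latency_ns(3600, 18))  # 10.0 ns (looser-timing 3600 kit)
```

By that metric, 3600 CL14 is on the tight end, which is why it's often recommended for Ryzen.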
 

Eximo

Titan
Herald
Yes, the Infinity Fabric between the CPU chiplets and the I/O die runs at the speed of the memory clock. And if you exceed DDR4-3600, the IF speed drops to half that. But they can be independently controlled. From everything I have seen, it doesn't make a lot of sense to go above 3600.
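To put numbers on that relationship, here is a simplified model of the default behavior described above (assumes the board falls back to the 2:1 divider past DDR4-3600; with manual tuning the ratios can be set independently):

```python
# Simplified Zen 2 clock model: FCLK (Infinity Fabric) matches the real
# memory clock 1:1 up to DDR4-3600, then boards typically drop to 2:1.
def zen2_fclk_mhz(ddr_mt_per_s: int) -> float:
    memclk = ddr_mt_per_s / 2            # DDR: two transfers per clock cycle
    return memclk if ddr_mt_per_s <= 3600 else memclk / 2

print(zen2_fclk_mhz(3200))  # 1600.0 -> 1:1
print(zen2_fclk_mhz(3600))  # 1800.0 -> 1:1, the usual sweet spot
print(zen2_fclk_mhz(4000))  # 1000.0 -> 2:1, fabric slower despite faster DRAM
```

That cliff at the divider switch is why DDR4-3600 keeps coming up as the sweet spot for these chips.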

Intel has no such fabric or setting, or anything close, which is why their CPUs aren't as sensitive to memory speed as AMD's.

If your goal is 4K 60Hz, then memory speed is the least of your problems; you need as much GPU as possible. That even negates the difference between Intel and AMD for the most part.

16GB is plenty for gaming, at least for now. A few games use around 11GB, so unless you really like to leave a lot of things open while gaming, it's safe to stick with 16GB long term. It's always replaceable.

I don't think we will see desktop DDR5 this year, no. Maybe not for a good while, actually. It will probably make its way into high-performance applications first; there is even talk of LPDDR5 landing in mobile devices before desktops. So until DDR4 production is subsumed by DDR5 production, it's probably going to be cost prohibitive for a while.

Might see it on high end desktop though, i9/Xeon and Threadripper/Epyc 4.
 

Karadjgne

Titan
Herald
Still trying to figure out how the OP came to the conclusion that the i9 9900k overclocks better than the 3900x. With the 9900k you are lucky to do much more than lock the cores at boost without pushing into the high 80s (°C), even with the largest coolers, and a simple OC to 5.1GHz is often too much.

It'll be next gen before Intel uses PCIe 4.0 (go figure, Intel and Nvidia playing catch-up), but that can be important for some people who need the drive bandwidth or are planning for future GPU releases.
 
