Question Do I need a 16 or 24-core CPU for gaming?

fordongreeman

Honorable
Do I need a 16 or 24-core CPU for gaming?

The new flagship processors from both AMD and Intel have this many cores, but for gaming, I was told that I only need 6 or 8 cores.

If I was to get an RTX 4090, do I need a flagship CPU as well?
 
No, you don't. Practically any last-gen or current-gen Ryzen 5, 7 or 9, or any last-gen or current-gen Intel Core i5, i7 or i9, is perfectly fine for any existing title. In fact, depending on your expectations, many users do quite well with even lower-tier CPUs. For an RTX 4090, though, I'd stick with either a current-gen Ryzen or a current-gen Intel part, and I'd go with the best you can reasonably afford without having to sacrifice too much on the rest of the build, if this is a full build.

A 5000 series or 7000 series Ryzen 5 or 7 is plenty good. A 12th or 13th Gen Intel i5 or i7 is plenty good.

Of course, you must also factor in how much multitasking you plan to do. If you ONLY run the games themselves, without much else running simultaneously, then pretty much any modern CPU with 6-12 cores is perfectly fine. If you plan to do a lot at once, like gaming along with streaming, recording, a bunch of browser tabs, lots of mods, many overlays, or whatever, that is when you might want to fudge upwards a little and go for a few more cores, simply to offset the additional processes you expect to run at the same time. A quick way to check whether you actually run out of cores is sketched below.
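Something like this little Python sketch (psutil is a third-party package; the 80% "busy" threshold is just a number picked for illustration, not a rule) will show whether your cores are actually saturated while the game and everything else are running:

import psutil

# Report the core/thread count, then sample per-core load once a second for ~30 s
print("physical cores:", psutil.cpu_count(logical=False),
      "| logical threads:", psutil.cpu_count(logical=True))

for _ in range(30):
    per_core = psutil.cpu_percent(interval=1, percpu=True)  # % load per logical core
    busy = sum(1 for load in per_core if load > 80)
    print(f"cores over 80% load: {busy}/{len(per_core)}")

If most samples show nearly every core pegged, a few extra cores would help; if not, you already have headroom.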
 

Karadjgne

Titan
Ambassador
Neither. 8 cores/16 threads will get you everything you need. The only advantages of the higher-core-count flagship CPUs are the larger L3 cache, higher stock clocks, and the better binning that comes with them.

Cpu = fps. Gpu = eye candy. If you use a 4090 on a 4K monitor, anything over around 100 fps is gravy anyway, so even a decent 10th Gen Intel or Ryzen 3000 series chip is sufficient for that.

The fps issue arises when trying to max out fps while keeping the detail settings up. That's why most people pair flagship CPUs with a 4090: because they must get 500 fps in CS:GO, even though they have a 1440p/144Hz monitor.
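To put illustrative numbers on that (these are made-up figures, not benchmarks), the monitor can only ever show its refresh rate's worth of frames, so anything rendered beyond that is never seen:

# Displayed frames per second are capped by the refresh rate.
def displayed_fps(rendered_fps, refresh_hz):
    return min(rendered_fps, refresh_hz)

print(displayed_fps(500, 144))  # 144 - the other ~356 frames per second are wasted
print(displayed_fps(100, 144))  # 100 - here the hardware, not the monitor, is the limit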
 
Honestly, that is very much dependent on the resolution and settings you plan to run, which game titles you plan to play, and what else you plan to run alongside the game engine. Mostly I'm in agreement, as I said before, but I can envision some scenarios where a few more cores might actually be helpful.

And, like we've been saying for years now, each year the new game releases get optimized more and more for multithreaded performance. This year the most demanding games might use a maximum of X cores; next year that may increase to Y cores. It's impossible to predict the future, but so far pretty much every generation has increased core counts and the NEED for more cores in most software and games. Slowly, but surely.

But there are sure as hell no games that even remotely require a 24-core CPU. 12-16 would be the maximum right now that I'd say could possibly be needed by any gaming machine, and then only if you are really busy running a lot of crap at the same time.
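One way to see why extra cores taper off so quickly: if only part of a game's per-frame work can be spread across threads, the arithmetic caps the speedup no matter how many cores you add. The 70% parallel share below is purely an assumption for illustration, not a measured figure:

# Amdahl's-law style arithmetic with an assumed 70% parallel share of per-frame work
def speedup(cores, parallel_fraction=0.70):
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

for n in (6, 8, 12, 16, 24):
    print(f"{n} cores -> {speedup(n):.2f}x")
# 6 -> 2.40x, 8 -> 2.58x, 12 -> 2.79x, 16 -> 2.91x, 24 -> 3.04x

Going from 8 to 24 cores buys well under 20% extra under that assumption, which is why the advice above tops out at 12-16.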
 
 

fordongreeman

Honorable
What about an RTX 4090 bottleneck?
 

Karadjgne

Titan
Ambassador
Wondering how a cpu could ever 'slow down' the performance of a gpu...

Cpu is what it is; it creates the fps. Gpu just has to deal with whatever fps it gets. In some games that's more fps than it can handle at the resolution and detail level, in some games it isn't.
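Put differently, the frame rate you actually get is roughly whichever stage is slower. A toy model with made-up numbers:

# Delivered fps is limited by the slower of CPU frame preparation and GPU rendering.
def delivered_fps(cpu_fps, gpu_fps):
    return min(cpu_fps, gpu_fps)

print(delivered_fps(cpu_fps=200, gpu_fps=110))  # 110 -> GPU-limited (think 4K, max detail)
print(delivered_fps(cpu_fps=140, gpu_fps=400))  # 140 -> CPU-limited (think low-res esports title)

So a "bottleneck" just means one of the two numbers is lower than the other; there is always one.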
 

If you would share what your current CPU and RAM are, it will narrow down the answers.

If gaming is all you care about, and you are going to use the RTX 4090 at 1440p or 4K resolution, then in most modern titles (99% of them) any new 6- or 8-core CPU from either Intel or AMD should do the trick just fine.

If you are using an AM4 platform, you can always upgrade to a Ryzen 7 5800X3D for some extra performance in a few game titles.

But really, depending on your current system and gaming resolution, you may not need to upgrade at all.

On the other hand, I would be really careful with the PSU selection for such a powerful GPU.
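As a rough sizing sketch (the 450 W figure is the 4090's rated board power; the CPU draw, the "rest of system" number and the transient headroom factor are assumptions picked for illustration, not vendor specs):

# Back-of-the-envelope PSU sizing with assumed component draws.
def suggested_psu_watts(gpu_w=450, cpu_w=150, rest_w=100, headroom=1.25):
    return (gpu_w + cpu_w + rest_w) * headroom

print(suggested_psu_watts())  # 875.0 -> roughly in line with the 850 W-and-up units usually suggested for a 4090

Quality of the unit and its transient handling matter at least as much as the wattage on the label.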
 
If you look at how many systems are melting the power connectors on those graphics cards, I'd say avoiding them for now, until the issues get worked out, would be a really good idea (even IF you DO get the CableMod 90° adapter, which isn't time-tested either). It seems like every day we get one or more new accounts of this happening, and I'm quite sure not all of them are making it to our awareness.

https://youtu.be/kRkjUtH4nIE
 

jnjnilson6

Distinguished
What is your definition of the term "bottleneck"?

What do you think that might do?
On a dreary tangent from the topic, I remember way back when I used to play video games in about 2012-2014 (I don't really play games anymore for I have found a grander and loftier medium for the distillation of beauty and perspective keen to one's ethereal contemplations), I had a Sapphire HD 6770 and Celeron G530. Well, Crysis servers hosted on my machine ran well until more than 5-7 people played within (the maximum was 32) and then the frame rate would drop significantly. After I upgraded to an Intel Core i7-3770K the frames never did drop with the increase in players and maintained a high and smooth level on the line of dauntless performance and overbearingly stimulating experience (I would get a stable 80 FPS and over where beforehand that would be about 30). It was like finally taking in a deeper breath and standing upon solider a ground; the beauty of palm trees and glittering sun-strung waves creating a colossal and mesmerizing experience, the significance and portrayal of the languorous days of gaming, inept and highly strung, cherished by the senses and immemorially outlined in softer tints and hues.
 
way back when I used to play video games in about 2012-2014
2014 is "way back"? LOL.

"Way back" is like 1986, which is when I first started REALLY playing PC video games. But you are right regarding the fact that a lot of games will dramatically increase demand for CPU resources when there are big jumps in the number of players, mass battles or like when view ranges are set to very long and substantially more tangibles have to be processed. Even so, the difference going from a dual core to a 4/8 isn't quite the same thing as going from something with 8-16 thread capability to something with more than that, when 8-16 are already plenty.
 

jnjnilson6

Distinguished
2014 is only 1 PC ago.

(BigBrother, listed in my sig's specs.)
Sounds cool though, 'Big Brother!' Gives an impression of power and toughness and thoroughness; like a machine that would never cease to excite the temperament and wouldn't be brought down by time. Like those toughened characters you read about with a flair for perseverance in boxing matches; only it crushes hardcore games with computing power instead of faces with fists. (y)
 

USAFRet

Titan
Moderator
At the time, it was.
Also, the connotation of 'Big Brother' holding information on everything. And it still does...;)
 