Question: Gaming and CPU cores overkill

Aug 30, 2022
Hello,
I've been working on servers for the longest time, over 20 years, and since they handle large amounts of data, the more cores the better for computing. I've always been a console person, plug-and-play, but I intend to buy a PC for gaming and for video editing for my lectures.
My question is about gaming: I want to know how many cores are considered "useless" for games. Many people tell me that 4 cores (all running at high speed) is as many as a game can use (due to optimization, I think?), but so far nobody has pointed me to a testing experiment with data on how many cores are needed to run games, so I can avoid spending $1000 on a top-of-the-line CPU that won't be used, while also not dropping below the threshold where the CPU bottlenecks the GPU.
I base every decision I make on science, experiments and data, so I hope to find someone here who can point me to independent lab tests or papers on the matter (YouTube is full of sponsored content and I'm immune to marketing). I've tried everywhere, but I can't find any such data. I also understand that games need to be optimized to the utmost limit to use the hardware to its full potential, as I've seen an RTX 3090 with a 12900K struggle with some poorly optimized games.

I will be very grateful, thank you so much in advance.
 
Every now and then you get an article on how the devs made a game, and that's basically as good as it gets for real info.
In the Destiny presentation, page 204 covers the main issue we have right now.
They use multiple threads doing the same thing at different times (sending work to the GPU), so the game can then pick the freshest completed frame from the GPU whenever it is ready to display a frame.
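The "freshest frame" idea described above can be sketched in a few lines of Python. This is only an illustration of the pattern, not the Destiny renderer's actual code: several worker threads submit frames, and the display side just keeps whichever frame is newest.

```python
# Sketch (illustrative, not the actual engine code): worker threads
# submit frames concurrently; the display loop always keeps the newest.
import threading

latest = {"id": -1}
lock = threading.Lock()

def submit(frame_id):
    # Keep this frame only if it is fresher than what we already have.
    with lock:
        if frame_id > latest["id"]:
            latest["id"] = frame_id

threads = [threading.Thread(target=submit, args=(i,)) for i in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(latest["id"])  # 7 -- the display ends up with the freshest frame
```

Whatever order the threads run in, the display side ends up with the newest frame that finished, which is the whole point of the scheme.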

If the thread count is hardcoded, it will use 4 cores most of the time for anything made on or for the PS4, because the PS4 runs the OS on one of its two quad-core clusters, so any game thread there would fight the OS for resources; latency between the two clusters is also high, so devs mostly stick to the 4 free cores.
This is why many people say that anything more than 4 cores is wasted.

If it's not hardcoded, it will run one thread per available hardware thread, but the benefit over just 4 cores is not great. You can look up gaming benchmarks, and in many cases it's obvious which games scale up like that, but the extra FPS compared to 4 cores is noticeable rather than game-changing. It's up to you to decide whether that's worth it.
(Video editing will use more cores, so they won't be wasted anyway.)
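The "noticeable but not game-changing" scaling above is what Amdahl's law predicts whenever part of a frame's work stays serial. A quick Python sketch; the 40% parallel fraction is purely illustrative, not measured from any game:

```python
# Amdahl's law: overall speedup is limited by the serial fraction.
# The 0.4 parallel fraction below is an illustrative assumption only.
def amdahl_speedup(parallel_fraction, cores):
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

for cores in (2, 4, 8, 16):
    print(cores, round(amdahl_speedup(0.4, cores), 2))
# 2 cores: 1.25x, 4 cores: 1.43x, 8 cores: 1.54x, 16 cores: 1.6x --
# each doubling of cores buys less than the one before.
```

With 60% of the work serial, even infinite cores can never exceed a 1.67x speedup, which is why going from 4 to 16 cores moves FPS far less than going from 1 to 4 did.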

The PS5's effect on games will take many, many years to show up.

http://advances.realtimerendering.com/destiny/gdc_2015/Tatarchuk_GDC_2015__Destiny_Renderer_web.pdf
 
Once you get out of the mainstream space, which tops out at a 12900K or 5950X, you are in total overkill mode. HEDT parts like Threadripper are a waste if all you intend to do is game. Some games are just poorly optimized; an old analogy I've heard for WoW is that it will play on just about anything, but play well on nothing.
 
Well, WoW has DX12 now, but yeah, it started on DX9. There are plenty of games that run fine on potato hardware while running the same or worse on much better hardware.
Which reminds me of when 3D gaming started (mostly software rendered): I was always wondering which CPU you'd need to play at a decent resolution.
 
A very good question, and good answers are hard to come by.
The key question is how many threads can be USEFULLY used in a particular game.
It will vary by game.

In 2014 there was a reasonable test done, but I have been unable to find a similar test of more modern games.

Many games can make use of many threads.
One category that does well with many threads is multiplayer with many participants.
Unfortunately, a controlled test is impractical there; there are too many variables.

But, the question is how many threads can be usefully used.
My guess is that the answer tops out at something like 6 to 12.
It used to be that Windows would spread the activity of a single CPU thread over all available threads, the reasoning being to balance the heat load.
In that case, one might think a single-thread-limited game had no issue just because Task Manager showed only 20% usage.
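The arithmetic behind that trap is simple: one fully loaded thread shows up as only 100/N percent overall on a CPU with N logical threads. A tiny illustrative calculation (the core counts below are just examples):

```python
# Why a low overall CPU % can hide a single-thread bottleneck:
# one pegged thread reads as 100/N % on an N-logical-thread CPU.
def overall_usage_pct(busy_threads, logical_threads):
    return 100.0 * busy_threads / logical_threads

print(overall_usage_pct(1, 12))  # one pegged thread on a 6C/12T part
print(overall_usage_pct(1, 24))  # the same game on a 12C/24T part
```

So the bigger the CPU, the more innocent a hard single-thread bottleneck looks in the overall Task Manager number; you have to look at per-core graphs instead.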

If you have a game now, look at Task Manager while it runs. Newer versions of Windows seem to keep a single thread's activity on a single core, because performance is better that way.

When I look at Task Manager while running games on a 12900K, I can clearly see perhaps two P cores near 100%; sometimes as many as 4-6 are loaded.
I can also see minimal activity on all 8 E cores.

I think the key to game performance, once you get past 6-8 threads, will be single-thread CPU performance.
Run the CPU-Z bench on your current processor and look at the single-thread rating.
Here is a list which includes a 12900K as well as some other competitors:
https://valid.x86.fr/bench/mx5qrf/1

If you are in the market and can wait, Intel and AMD will launch newer-gen products soon.
I would expect around a 15% boost in price/performance.
 
In 2014 there was a reasonable test done:
For this article, we used a stock i7 4930K (6 cores).
So, let’s see how a high-end CPU performs on a variety of games. For obvious reasons, we tested the following games with and without Hyper Threading enabled. Hyper Threading is basically ‘cutting a CPU core in half’, meaning that two threads can run ideally via this method. Hyper Threading is ideal for games that take advantage of more than four-six CPU cores or for games that have more than eight threads enabled by default

Well, nice article; at least they used a 6-core CPU to compare 4-6 cores, but 2014 was pretty much the 4-core era. Back then I had a Bulldozer, and thanks to the garbage-tier PS3/PS4 CPUs, it could keep up. Around 2016, though, the CPU wasn't keeping up anymore in more and more games, so in 2017 I switched to Ryzen.
 
The ability to scale with core count varies widely depending on the game, and you tend to get diminishing returns as the core count increases. Not all cores are created equal either, so a fast 6-core can outperform a slower 8-core, for example. More cores won't make much difference to your average frame rate, but higher-core-count CPUs tend to have smoother frame times; essentially, they have less variance in their frame rate, so motion in games appears smoother. How perceivable that is depends on which CPUs you're comparing and in which game.

To give you an example of a game that can use lots of cores, this is me playing Cyberpunk 2077 on a 10 core/20 thread CPU:
View: https://imgur.com/i0OKdTB


All 20 threads are loaded and the CPU is at 76%. Because of Hyper-Threading, a load of 50% would already mean all 10 physical cores are fully utilised.
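A back-of-envelope check on those screenshot numbers shows why 76% on a 10C/20T chip means every physical core is saturated:

```python
# Rough arithmetic on the 10-core/20-thread screenshot above:
logical_threads = 20
physical_cores = 10
load = 0.76

# 76% of 20 logical threads is ~15 threads' worth of runnable work.
busy_thread_equivalents = logical_threads * load

# That exceeds the 10 physical cores, so every core is busy and
# SMT (Hyper-Threading) is absorbing the remainder.
print(busy_thread_equivalents)                    # 15.2
print(busy_thread_equivalents > physical_cores)   # True
```

The same rule of thumb generalises: on any SMT CPU, once overall load passes (physical cores / logical threads) x 100%, the physical cores cannot all be idle-sharing anymore.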

In terms of the CPUs you should buy, a 6-core/12-thread CPU should be your baseline. If you don't want something super expensive, I would recommend the upcoming i5-13500 and i5-13600K. They are 10- and 14-core parts: 6 P-cores supported by 4 E-cores on the 13500 and by 8 E-cores on the 13600K. They would be a good balance of performance for gaming and video editing.
 
Thank you.

I've been looking into that particular number, and it seems to be more relevant than the core count. For example, I was looking at the i7-9700K (8C/8T) and the i9-9900K (8C/16T), which are 4 years old with the same base frequency but a slightly higher boost on the i9. Nonetheless, they seem to perform just fine in modern games, as long as those games aren't CPU-heavy, which is mostly early-access titles that lack the optimisation necessary to make full use of the GPU.

My other observation is pricing: with high demand and low supply, prices are inflated, which is why I'm trying to get a machine that will do the job without spending the excess on nothing productive; it's just wasted power. The new generation's approach of having two sets of cores, efficiency and performance, is a really good idea, but to buy new-generation hardware in my country I'd pay over 220% of MSRP (tax and other things). That's why I was trying to find the middle-ground CPU to pair with an RTX 3070 (current mid-range, but it rivals the high-end 2080 Ti in performance, which is fine by me).
 
Core count for the most part is irrelevant. What matters more in CPU performance is how strong each of those cores is. "A 4-core CPU" doesn't mean anything by itself: that could be a Core 2 Quad, which is in an entirely different league of performance from, say, a Core i3-12100. If anything, people are only suggesting "4 cores minimum" under the assumption you're getting something recent.

However, even then, there's no hard requirement. If an application was designed for 4 cores, that doesn't mean it can only run on 4 cores; it can run on a 2-core or even a single-core processor. Work gets scheduled based on how many threads are ready to run. An application "designed for 4 cores" likely means that, on average, 4 of its threads are ready to run, so it will run better on a 4-core processor, but that doesn't mean it won't run on anything less.
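That scheduling point is easy to demonstrate: the same four tasks complete on a 2-worker pool just as they do on a 4-worker pool, only with less parallelism. A small Python sketch (thread pools stand in for OS scheduling here, purely as an illustration):

```python
# Illustration: an app with 4 runnable threads still finishes on
# fewer "cores" (here, pool workers); it just runs with less overlap.
from concurrent.futures import ThreadPoolExecutor

def work(n):
    # Some deterministic busywork standing in for a game/app thread.
    return sum(i * i for i in range(n))

tasks = [100_000] * 4  # four threads' worth of ready-to-run work

with ThreadPoolExecutor(max_workers=2) as pool:
    results_2_workers = list(pool.map(work, tasks))

with ThreadPoolExecutor(max_workers=4) as pool:
    results_4_workers = list(pool.map(work, tasks))

# Identical results either way; only the elapsed time would differ.
print(results_2_workers == results_4_workers)  # True
```

The OS does the same thing with real threads: fewer cores means more time-slicing, not a refusal to run.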

So ignore how many cores you need. The only way you can tell if you're getting the best bang for your money is to look at how they actually perform through benchmarks and whatnot.
 