Question: Need some clarification about CPU bottlenecking

WiseElf

Sep 16, 2020
Hello!

I'd like some clarification on the practical implications of CPU bottlenecking. I did some research to better understand the phenomenon, and from what I understand, CPU bottlenecking occurs when a graphics card is so fast that it can render frames faster than the CPU can prepare them, so the GPU is held back and cannot produce its maximum FPS.

It was with this understanding that in 2019 I bought an RTX 2060, which was a good match for a Full HD Ultra 60 FPS experience. Back then, buying an RTX 2080 for that resolution was considered overkill by a lot of people because a CPU bottlenecking problem would occur.

Now in 2023, for around 90-95% of my games, my RTX 2060 still holds up pretty well and I can still play with optimal performance. However, some newer games are starting to be too much, like A Plague Tale: Requiem. Even when I lower the settings to medium, I get roughly 40-45 FPS, and looking at the recommended settings on Steam, an RTX 3070 is advised for a Full HD Ultra experience. This makes me realize that in 2019, getting an RTX 2080 would have been overkill for some years, but it would have been the more future-proof choice, since I could have played A Plague Tale with that card today.

This experience makes me wonder: what are the actual practical impacts when CPU bottlenecking happens? For people who disable V-Sync, the concern makes sense to me, since they want the maximum FPS their card can produce, and getting an 80-class card for running Full HD games will be an issue (right now in 2023, at least). In my case, though, V-Sync caps my FPS at 60 frames all the time, and my goal is also to get cards that could ideally last 5-6 years before I have to upgrade again, so I don't really get what the bottlenecking issue is all about. Now, I understand that everyone has different goals and gaming contexts, which is why I'm asking for your input, so I can broaden my vision and better understand the issue.
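A rough way to picture the V-Sync cap I'm describing (my own simplification, with made-up numbers):

```python
def displayed_fps(render_fps: float, refresh_hz: float = 60.0, vsync: bool = True) -> float:
    """With V-Sync on, the display refresh rate caps the frame rate.
    Simplified: classic double-buffered V-Sync actually snaps to
    divisors of the refresh rate (60, 30, 20, ...)."""
    return min(render_fps, refresh_hz) if vsync else render_fps

# A GPU with lots of headroom still shows 60 FPS under V-Sync...
print(displayed_fps(110.0))  # 60.0
# ...but a GPU that can't reach the cap shows its real rate.
print(displayed_fps(45.0))   # 45.0
```

So with V-Sync on, extra GPU speed is invisible today; it only matters as headroom for heavier games later.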

Thanks!
 
Now in 2023, for around 90-95% of my games, my RTX 2060 still holds up pretty well and I can still play with optimal performance. However, some newer games are starting to be too much, like A Plague Tale: Requiem. Even when I lower the settings to medium, I get roughly 40-45 FPS, and looking at the recommended settings on Steam, an RTX 3070 is advised for a Full HD Ultra experience. This makes me realize that in 2019, getting an RTX 2080 would have been overkill for some years, but it would have been the more future-proof choice, since I could have played A Plague Tale with that card today.
Reducing resolution and/or quality settings without getting an increase in FPS means the CPU can't create more FPS. That's what you explained in your first paragraph, so I'm confused about how you now seem to think that a better GPU would get you better FPS...
 
I'd like some clarification on the practical implications of CPU bottlenecking. I did some research to better understand the phenomenon, and from what I understand, CPU bottlenecking occurs when a graphics card is so fast that it can render frames faster than the CPU can prepare them, so the GPU is held back and cannot produce its maximum FPS.
Or, the CPU is theoretically capable of providing more frames per second than the GPU can handle.

'bottleneck' is one of the most misunderstood terms around.
Remove it from your lexicon, and move ahead with your system.


CPU provides the framerate, GPU applies the eyecandy to those frames.
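A toy model of that division of labor (my own sketch, with made-up frame timings): each frame needs CPU work and GPU work, and the slower of the two stages sets the frame rate.

```python
def effective_fps(cpu_ms: float, gpu_ms: float) -> float:
    """With the CPU and GPU pipelined on successive frames,
    throughput is limited by the slower stage."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# GPU-bound: the GPU needs 16 ms per frame, the CPU only 8 ms.
print(effective_fps(8.0, 16.0))  # 62.5
# Swap in a much faster GPU (5 ms) and the CPU becomes the limit:
print(effective_fps(8.0, 5.0))   # 125.0
```

Whichever stage you speed up past the other stops paying off; that's all a "bottleneck" is.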
 
Reducing resolution and/or quality settings without getting an increase in FPS means the CPU can't create more FPS. That's what you explained in your first paragraph, so I'm confused about how you now seem to think that a better GPU would get you better FPS...
I might not have been clear, sorry about that. Trying to play this game at Ultra settings gave me a worse experience (around 25-30 FPS), and lowering the settings to Medium improved the situation to 40-45 FPS, which wasn't enough for me, since I want a steady 60 FPS experience.
 
CPU bottlenecking is a symptom of a problem, but not the problem itself.

The problem is your computer is not performing to the level you want. To get it to the level you want, you identify what parts of the computer need upgrading. To check if the CPU needs upgrading, run the games you want to play at the lowest possible graphics settings. This will give you the maximum frame rate possible for the system, assuming the GPU is reasonably up to snuff. If that performance is not what you want, then it's time to upgrade the CPU.
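That test works because lowering settings mostly shrinks the GPU's share of each frame, while the CPU's share stays roughly constant. A sketch with hypothetical numbers:

```python
def effective_fps(cpu_ms: float, gpu_ms: float) -> float:
    # The slower of the two stages per frame sets the frame rate.
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 12.0  # roughly constant across graphics presets
for preset, gpu_ms in [("ultra", 25.0), ("medium", 15.0), ("low", 6.0)]:
    print(preset, round(effective_fps(cpu_ms, gpu_ms), 1))
# ultra 40.0 / medium 66.7 / low 83.3 -- FPS stops scaling once the GPU
# gets faster than the CPU, so ~83 FPS is this (made-up) CPU's ceiling.
```

If dropping to low settings doesn't move the number, you've found the CPU ceiling and know what to upgrade.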
 
CPU bottlenecking is a symptom of a problem, but not the problem itself.

The problem is your computer is not performing to the level you want. To get it to the level you want, you identify what parts of the computer need upgrading. To check if the CPU needs upgrading, run the games you want to play at the lowest possible graphics settings. This will give you the maximum frame rate possible for the system, assuming the GPU is reasonably up to snuff. If that performance is not what you want, then it's time to upgrade the CPU.
ok thanks!

This is good for an upgrade scenario, but for a full build situation (like what happened to me in 2019), the decision seems tougher.

It might be considered today that an RTX 4060 would be a good match for Full HD Ultra instead of getting a 4070 or 4080 card, but there's always the chance that the performance of those two cards will be considered the sweet spot at that resolution in 3-4 years.

So if the objective of buying a new card isn't limited to being good enough today, but it should still be good for the next 5 years, getting a higher-tier card seems like a good idea? I mean, based on my current situation, the idea of getting a 2080 back in 2019 was considered overkill by many, and yet in 2023 this card now meets what is considered the recommended spec for a Full HD Plague Tale experience, so it no longer feels overkill. And as far as the CPU goes, a 3600X is recommended and I use a 3700X, so my CPU should be good enough.
 
So if the objective of buying a new card isn't limited to being good enough today, but it should still be good for the next 5 years, getting a higher-tier card seems like a good idea? I mean, based on my current situation, the idea of getting a 2080 back in 2019 was considered overkill by many, and yet in 2023 this card now meets what is considered the recommended spec for a Full HD Plague Tale experience, so it no longer feels overkill. And as far as the CPU goes, a 3600X is recommended and I use a 3700X, so my CPU should be good enough.
Exactly! Your understanding of the concept of "bottleneck" is perfectly fine.

A bottleneck is whichever part SIGNIFICANTLY caps your performance even when the rest of the hardware is capable of much more. In different scenarios, a different part of the PC becomes the bottleneck. Even a perfectly balanced system will only stay balanced in SOME (preferably most) of the use cases.

Anyway, since most games are GPU-bound (the GPU does the heavy lifting), thinking long term I'd rather have a CPU bottleneck in SOME gaming scenarios than a GPU bottleneck in MOST gaming scenarios. Not to mention that a CPU upgrade (and even a platform upgrade) is cheaper than upgrading GPUs (well, in my region, at least).

But if I were upgrading my GPU every 2-3 years, I would want to take advantage of its maximum potential RIGHT NOW. And that's why, I guess, people are paranoid over it (it's mostly people who are lazy and/or unable to understand the not-so-nuanced meaning behind the concept, as far as I can tell).
 
Exactly! Your understanding of the concept of "bottleneck" is perfectly fine.

A bottleneck is whichever part SIGNIFICANTLY caps your performance even when the rest of the hardware is capable of much more. In different scenarios, a different part of the PC becomes the bottleneck. Even a perfectly balanced system will only stay balanced in SOME (preferably most) of the use cases.

Anyway, since most games are GPU-bound (the GPU does the heavy lifting), thinking long term I'd rather have a CPU bottleneck in SOME gaming scenarios than a GPU bottleneck in MOST gaming scenarios. Not to mention that a CPU upgrade (and even a platform upgrade) is cheaper than upgrading GPUs (well, in my region, at least).

But if I were upgrading my GPU every 2-3 years, I would want to take advantage of its maximum potential RIGHT NOW. And that's why, I guess, people are paranoid over it (it's mostly people who are lazy and/or unable to understand the not-so-nuanced meaning behind the concept, as far as I can tell).
ok thanks! 👍
 