Question: Please Help Me with This Bottlenecking Myth

IDProG

Reputable
Jul 6, 2016
So, I've always believed the claim that you can remove a CPU bottleneck on your GPU by increasing the resolution you play at.

For example: a Ryzen 5 1400 paired with a GTX 1080 Ti is rated BLUE (semi-bottlenecked in the most demanding games).
Let's say it bottlenecks at 1080p. If you increase the resolution to 1440p or 4K, you can reduce or even remove the bottleneck entirely.
This is because as the resolution increases, the GPU's work gets much harder, while the CPU's work stays almost the same.

Anybody ever tried this?
Is the myth true?
 
It is true. The CPU can still only feed so many frames to the GPU for rendering. So all you are changing is the utilization of the GPU, and getting better visuals. You aren't going to improve CPU performance by doing this, but it will balance out usage and at least you'll have a better looking game.

I don't think it is good justification for pairing an overpowered GPU with an underpowered CPU, but it can make you feel better about your choice.

All that said, mileage may vary. You may find that cranking the settings will hurt CPU performance because of additional physics calculations or the addition of rendered items in the game.

So... the myth is more accurately, partially true.
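
The "partially true" answer above can be sketched as a toy model: the frame rate you see is capped by whichever side is slower, and only the GPU's cap falls as resolution rises. All the numbers below are made-up illustrations, not benchmarks of any real CPU or GPU.

```python
# Toy model: effective FPS is capped by whichever stage is slower.
# The caps below are invented for illustration only.

def effective_fps(cpu_fps_cap, gpu_fps_cap):
    """The slower of the two stages sets the frame rate."""
    return min(cpu_fps_cap, gpu_fps_cap)

# Hypothetical caps: the CPU can prepare ~90 frames/s regardless of
# resolution; the GPU's cap falls as the pixel count rises.
cpu_cap = 90
gpu_caps = {"1080p": 160, "1440p": 95, "4K": 55}

for res, gpu_cap in gpu_caps.items():
    fps = effective_fps(cpu_cap, gpu_cap)
    limiter = "CPU" if cpu_cap < gpu_cap else "GPU"
    print(f"{res}: {fps} FPS ({limiter}-limited)")
```

Note what the model shows: raising the resolution never raises your FPS past the CPU's cap; it just moves the limiter from the CPU to the GPU while the frame rate stays the same or drops.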
 

ScrewySqrl

Champion
Moderator
Sorta. As GPUs work harder, you see less difference between CPUs, but it's not zero.

But that isn't really a BOTTLENECK. A CPU is bottlenecked if, and only if, you see no difference between two different GPUs. So if you are getting, say, 25 frames per second with a 1070 and the same 25 FPS with a 1080 Ti, the CPU is bottlenecking the GPU. If the 1080 Ti gets more than the 1070, even if it only jumps to 30 FPS, then the CPU isn't a bottleneck.

What you are describing is being GPU constrained. CPU/GPU constraints are normal (in that you get fewer FPS from the same GPU with an R5 1400 than with an R5 2600, or fewer FPS from a 1060 than from a 2060 on the same CPU). Those aren't bottlenecks, though. As long as a faster GPU nets more frames, you are looking at a constraint, not a bottleneck.
 
Bottleneck apps are junk science.
Do not pay much attention to them.

There is no such thing as "bottlenecking,"
if by that you mean that upgrading a CPU or graphics card can
somehow lower your performance or FPS.
A better term might be limiting factor:
the point where adding more CPU or GPU becomes increasingly
less effective.

Games are usually limited in performance by CPU core speed or graphics card performance.
Sometimes, games are limited by the number of available threads.

As a general rule:
Multiplayer games with many participants need many threads.
Fast action games need fast graphics cards.
CPU-limited games such as sims, MMOs, and strategy games need fast single-thread performance.

That is one reason a simple bottleneck app is worthless without knowing the games you play.

There is always a limiting factor.

One can do some experiments to find the most effective upgrade.
For example, if you use power management to reduce your max CPU performance from 100% to 80%,
you will see how important core speed is (or is not).
If you see little difference, your limiter is not the CPU.

Similarly, if you remove one processing thread, you can get some insight into how important many threads are to the games you play.
If a game plays just as well with one fewer thread, you are not thread limited.

Or, if you lower the resolution or settings and get better FPS, you can assume that the graphics card is your main limiting factor.
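
The "one fewer thread" experiment above can also be done from software. Here is a rough, Linux-only sketch using Python's standard library; which logical CPU you drop is arbitrary, and on Windows you'd use Task Manager's affinity setting or `start /affinity` instead:

```python
# Linux-only sketch of the "remove one thread" experiment:
# restrict the current process (and anything it launches)
# to all logical CPUs except one, then compare FPS.
import os

all_cpus = os.sched_getaffinity(0)    # CPUs we may run on now
reduced = set(sorted(all_cpus)[:-1])  # drop the highest-numbered one
if reduced:                           # keep at least one CPU
    os.sched_setaffinity(0, reduced)
print(f"running on {len(os.sched_getaffinity(0))} of {len(all_cpus)} CPUs")
```

Launch the game from that process and compare frame rates against an unrestricted run; repeating with fewer and fewer CPUs shows where thread count stops mattering.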
 
The CPU tells the GPU what to draw. The GPU draws that in as much detail as the game settings dictate. If the detail is comparatively easy and the GPU finishes rendering a frame before the CPU can deliver the next draw order, the GPU has to wait. If the detail is comparatively complex, the opposite is [basically] true.
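
That waiting behavior can be sketched as a two-stage pipeline: with the CPU preparing the next frame while the GPU renders the current one, the steady-state frame time is set by the slower stage, and the faster one idles for the difference. The millisecond figures below are invented for illustration.

```python
# Toy two-stage pipeline: CPU prepares a draw list, GPU renders it.
# Per-frame times are invented; the point is who waits for whom.

def frame_time(cpu_ms, gpu_ms):
    """With the stages overlapped, the slower one paces the pipeline."""
    return max(cpu_ms, gpu_ms)

# Easy scene at low resolution: GPU finishes first and waits on the CPU.
print(frame_time(cpu_ms=12, gpu_ms=7))   # 12 ms/frame, CPU-paced
# Heavy scene at 4K: CPU finishes first and waits on the GPU.
print(frame_time(cpu_ms=12, gpu_ms=21))  # 21 ms/frame, GPU-paced
```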
 
