Question Bottleneck calculator is legit?

Status
Not open for further replies.

USAFRet

Titan
Moderator
If any of your people still maintain it is 'legit', have them explain this.

Same system, same parts, same OS, 9 months apart
[attached screenshot: WSz617j.jpg]
 

Karadjgne

Titan
Ambassador
It's a joke. Bottlenecks aren't created by a CPU. Period. Bottlenecks are created by software, and given life and affirmation by imagination and misconception.
A dual-core CPU will run Minecraft all day long at 300+ fps. Pair it with an RTX 2080 Ti and suddenly people see a CPU bottleneck because the GPU isn't being used 100%. Hate to break it to people, but CPUs and GPUs always run at 100% speed; it just might not take 100% of their resources to deliver that speed. Take that same dual core / 2080 Ti combo and run it at 4K, and there's no longer a "bottleneck."
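The point above can be sketched as a toy model: the frame rate you actually see is set by whichever stage is slower, and raising the resolution mostly loads the GPU side. All numbers here are invented for illustration, not benchmarks.

```python
# Toy model of delivered frame rate: every frame must pass through both the
# CPU and GPU stages, so the slower stage sets the pace. Figures are made up.

def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Frames per second actually shown on screen."""
    return min(cpu_fps, gpu_fps)

# Hypothetical dual core + RTX 2080 Ti in a light game:
print(delivered_fps(cpu_fps=300, gpu_fps=900))  # 1080p: CPU side limits -> 300
print(delivered_fps(cpu_fps=300, gpu_fps=250))  # 4K loads the GPU -> 250
```

Neither stage is "broken" in either case; the limit simply moves depending on the workload.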

Or

Pair an i7 with a GTX 750 Ti; that'll handle Minecraft easily too at 1080p, but don't even consider 4K.

Same PCs, change the game to Battlefield V, and it's a whole different story. For both.

So how an online calculator can produce a definitive result, when there are thousands of variables that change at any given moment or depend on outside influences like resolution, is beyond me.

It's seriously a joke, paid for by advertising, not grounded in reality.
 
Karadjgne said:
It's a joke. Bottlenecks aren't created by a cpu. Period. [quoted in full above]

Wouldn't your 2080 Ti / Minecraft scenario actually be a "bottleneck"? If your GPU is capable of processing more frames but the CPU cannot feed it fast enough, wouldn't that be considered a "bottleneck"?
 

USAFRet

Titan
Moderator
Wouldn't your 2080 Ti / Minecraft scenario actually be a "bottleneck"? If your GPU is capable of processing more frames but the CPU cannot feed it fast enough, wouldn't that be considered a "bottleneck"?
Technically, yes.
But if the low-grade CPU can deliver 300 fps, it doesn't matter what the GPU can do.
Pair that 2080 Ti with an uber CPU, and do we get 700 fps? No.
At some point the game itself becomes the 'bottleneck', no matter what the hardware is.
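USAFRet's point, that the game itself eventually caps things, fits the same kind of toy model: the engine's own limit is just one more term in the minimum. All figures here are hypothetical.

```python
# Toy model: the delivered frame rate is the minimum of every limit in the
# chain -- CPU, GPU, and whatever the game engine itself can sustain.
# All figures are illustrative, not measurements.

def effective_fps(cpu_fps: float, gpu_fps: float, engine_cap: float) -> float:
    """The slowest link in the whole chain sets the result."""
    return min(cpu_fps, gpu_fps, engine_cap)

# An "uber" CPU doesn't help once the game itself is the limit:
print(effective_fps(cpu_fps=300, gpu_fps=900, engine_cap=300))   # 300
print(effective_fps(cpu_fps=1000, gpu_fps=900, engine_cap=300))  # still 300
```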
 

Karadjgne

Titan
Ambassador
No, it's not realistically a bottleneck. A bottleneck is a component that slows down the flow of information. The CPU will pre-render X amount of frames and that's it, no more. It'll send every last one of those frames to the GPU to be processed and put up on screen. If the GPU cannot keep up, the GPU is a bottleneck: it's slowing down the flow. If the GPU can keep up, good. Just because a CPU puts out fewer frames than the GPU can process doesn't make it a bottleneck; the flow from the CPU is already at 100%. It's the software that dictates the number of frames a CPU can pre-render. Obviously a stronger, more capable CPU will pre-render more frames, and just as obviously other factors play a role: RAM, RAM speed, storage performance, etc.
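The pre-render handoff described above can be simulated as a toy two-stage pipeline: the CPU hands off each prepared frame, and the GPU draws it as soon as it is free. The timings below are invented for illustration only.

```python
# Toy tick simulation of the CPU -> GPU handoff described above.
# Timings are invented; the point is which stage sets the pace.

def simulate_ms(frames: int, cpu_prep_ms: float, gpu_render_ms: float) -> float:
    """Total milliseconds to display `frames` frames through both stages."""
    cpu_done = 0.0  # when the CPU finishes preparing the current frame
    gpu_done = 0.0  # when the GPU finishes drawing the previous frame
    for _ in range(frames):
        cpu_done += cpu_prep_ms  # CPU preps the next frame
        # GPU starts once it is free AND the frame has been handed off:
        gpu_done = max(gpu_done, cpu_done) + gpu_render_ms
    return gpu_done

# CPU needs 3.3 ms per frame (~300 fps); swapping a 1.0 ms GPU for a 0.5 ms
# one barely changes the total -- the flow from the CPU was already the limit.
print(simulate_ms(1000, 3.3, 1.0))
print(simulate_ms(1000, 3.3, 0.5))
```

A faster GPU just spends more time waiting on the handoff; total time is dominated by the CPU's prep rate either way.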

By itself, there's no such thing as a bottleneck as commonly perceived. You'll only know a component is slowing down the flow if there's a comparison, and comparing your PC against YouTube videos, online bottleneck calculators, etc. is a fool's game.

I get 300 fps in CS:GO with a 3770K and a GTX 970. With an RTX 2060 I'll get 300 fps. With a 2080 Ti I'll get 300 fps. Why would bumping up the GPU's power suddenly make my CPU a bottleneck? It isn't. It puts out what it puts out. You only perceive it as a bottleneck because you expect a stronger GPU to deliver higher fps. In reality, the CPU is capable of 300 fps in that game, at those details, at that resolution; it just happens to be lower than what those GPUs, mine included, can output. Change games to my heavily modded Skyrim, which uses 170 scripted mods, and fps drops to 60. At that output even a 750 Ti wouldn't have any problems. CPU bottleneck? No. Software flood.
 
'Bottleneck' is always a trigger-word! :)

If you don't have the fastest CPU paired with the fastest GPU...so what?

One of those two will generally be limiting your performance somewhat. I've seen people advocate too-weak CPUs just because the intended GPU is mid-range, meaning that if/when the GPU is upgraded 18 months later, the CPU is then not enough. :)

Folks choosing a 9900K don't have to get a 2080 Ti to enjoy it, and, conversely, folks getting a 2060 don't have to settle for a 9400F...

Someone who got an 8700K and a 1060 about 18 months ago could replace the GPU with a 2060 now (if needed), and possibly again with a 3060 in a year's time, and have a grand old time despite never having THE fastest CPU or GPU...

That being said, would I get a 9700K or 9900K and pair it with a GT 1030 or GTX 1050 for gaming? Absolutely not.

Would I get a 2070 or 2080 Ti and pair it with an i3-9100 or even a 9400F? Again... absolutely not.

Just think: somewhere, someone is ordering a 9980XE/X299 combo and a 2080 Ti (or two!) with a spinning drive to play Minecraft/solitaire as we speak! :)

Certain pairings make little sense. Most combos fall in between those extremes; it's up to the user to do 5-10 minutes of research and comparisons in a variety of applications before choosing.
 