Question: I can't find my bottleneck


0Ware
RTX 3060 not reaching full usage
Low GPU usage for no apparent reason

I have an RTX 3060 for my GPU

i5-11400F for my CPU

16 GB (2x8 GB) 3200 MHz CL22 RAM, XMP enabled

500 W PSU (I do not come anywhere near its limit)

N50-620 motherboard

Most games at the highest settings show 99% GPU usage, but when I lower settings in any game my GPU usage goes down and I don't gain much FPS. I only gain about 5 FPS going from Ultra to Medium in most games. How do I fix this so lowering settings actually gives me more FPS?
 

Karadjgne
The CPU should be a bottleneck to some extent, but not as bad as mine. I've seen people in benchmark videos with a 3060 and 11400F actually gain a reasonable amount of FPS when they lower settings in different games.
Game settings are not all graphical in nature. More than a few can be highly CPU dependent, like shadows and pre-processed lighting, clouds, field of view, fog of war, etc. The default, preprogrammed generic presets change most settings to differing degrees, so dropping from Ultra to Very High can change the number of instructions the CPU has to deal with, so FPS goes up.

With a 3060 and an 11400F, in some games the GPU is underused; in other games the CPU is underused. If the GPU is underused and the CPU heavily used, dropping the generic preset can have a sizable impact on FPS. If the CPU is underused and the GPU heavily used, it generally has less of an impact.

Custom settings have the most impact when the CPU is heavily used: drop specific CPU-intensive settings to low or medium and you often won't see much of a change visually, but the CPU has a lot of stress removed and is free to push many more frames.
 

0Ware
You really do not want to see 100% usage of either the CPU or the GPU.
If you did, it would represent a factor that limits higher performance.
From your tests, I suspect that you are playing games that are CPU limited.
It may be that some of your background tasks are higher priority, which will limit the capability of the CPU.

CPU-Z will identify the make and model of the motherboard used in your Acer N50-620 PC.
From that you can work out what CPU options are available to you.
Probably the strongest available upgrade would be a $240 i7-11700K:
https://www.newegg.com/intel-core-i...=11700k-_-19-118-233-_-Product&quicklink=true
A better cooler would probably be in order as well.

My goal is to get 100% GPU usage so I can actually use my 3060 to gain FPS.
 

Karadjgne
My goal is to get 100% GPU usage so I can actually use my 3060 to gain FPS.
Just the opposite will happen at 100% or close to it. Imagine the GPU can process 10,000 instructions per second to render 100 fps. You are at 100%. Now imagine your toon walks past a wall and a tank bursts through it. All those debris chunks, lighting and shadows, vectors, smoke, etc. are going to add a bunch of new instructions.

So instead of 10,000 instructions, there are now 20,000 instructions per second that need to be rendered. But at 100% already, it's going to take the GPU twice as long per frame, so instead of staying anywhere near 100 fps, you'll be at 50 fps.
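To put those illustrative numbers into something you can poke at, here is a minimal sketch (the throughput figure is purely the toy number from the example above, not a real 3060 spec):

```python
# Toy model: a GPU with a fixed instruction budget per second.
GPU_THROUGHPUT = 10_000  # instructions per second (illustrative, from the example above)

def fps(instructions_per_frame):
    """Frames per second if every frame costs this many instructions."""
    return GPU_THROUGHPUT / instructions_per_frame

print(fps(100))  # calm scene, 100 instructions per frame -> 100.0 fps
print(fps(200))  # tank bursts through the wall, work doubles -> 50.0 fps
```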

Usage is Not how much of the gpu is Used, but how much it Uses. Massive difference. 100% usage literally means the gpu has no headroom for anything more, and if there Is anything more, fps Will go Down.

The GPU is eye candy. The CPU is FPS. If the CPU is at 100% usage, exactly the same thing happens. For smoother FPS at maximum output, you want to be in the 50-70% usage range at most. You need CPU and GPU headroom for changing conditions, or FPS will go in the toilet.
 
I have to nitpick on something: GPU utilization is not how much of the GPU is actually being used. It isn't something like "3,000 out of 4,000 shader units are in use." It's only "how often did the GPU have something to do?" Because of this, you can see high utilization in older or undemanding games simply because the GPU is running at a lower clock speed, so a larger share of its clock cycles have something to do.

I will also argue that 100% GPU utilization is the best one to have. But at the end of the day, utilization is a metric that should be ignored unless performance requirements are not being met.
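As a rough sketch of that definition (invented numbers, not tied to any specific GPU or monitoring tool):

```python
# "GPU utilization" as typically reported: the fraction of a sampling window
# during which the GPU had work queued, NOT how many shader units were busy.
def utilization_pct(busy_ms, window_ms=1000):
    return 100 * busy_ms / window_ms

# A light game on a down-clocked GPU can still keep it "busy" most of the window:
print(utilization_pct(busy_ms=950))  # 95.0% even though the chip is loafing at low clocks
# A demanding game that stalls half the time waiting on the CPU:
print(utilization_pct(busy_ms=500))  # 50.0%
```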
 

0Ware
I have to nitpick on something: GPU utilization is not how much of the GPU is actually being used. It isn't something like "3,000 out of 4,000 shader units are in use." It's only "how often did the GPU have something to do?" Because of this, you can see high utilization in older or undemanding games simply because the GPU is running at a lower clock speed, so a larger share of its clock cycles have something to do.

I will also argue that 100% GPU utilization is the best one to have. But at the end of the day, utilization is a metric that should be ignored unless performance requirements are not being met.
I end up with much lower FPS in my games compared to other similar builds. In any game I play, when I lower the settings my FPS doesn't improve much. In those games I never hit anywhere near 99% GPU usage. I know very well my GPU is capable of running them at higher than 50 fps on medium to low settings.
 

USAFRet
I end up with much lower FPS in my games compared to other similar builds. In any game I play, when I lower the settings my FPS doesn't improve much. In those games I never hit anywhere near 99% GPU usage. I know very well my GPU is capable of running them at higher than 50 fps on medium to low settings.
As was stated early on... your CPU generates the FPS.
The GPU just applies the eye candy.

If your CPU can't feed more than 50FPS, your GPU will just be loafing, waiting for something to do.
 

0Ware
As was stated early on... your CPU generates the FPS.
The GPU just applies the eye candy.

If your CPU can't feed more than 50FPS, your GPU will just be loafing, waiting for something to do.
My CPU is supposed to feed way more, and benchmarks show it's underperforming compared to what's standard. I don't know what to do about it.
 
Let's try this again.
You do not seem to understand how PC gaming works.
For every frame you see, the CPU must calculate every item on the screen: size, position, volume, speed, trajectory, shadow, etc.
Every tree, blade of grass, building, vehicle, cloud, NPC, other player, and so on must be calculated by the CPU.

This information is then sent to the GPU and a wireframe is made.
Now the GPU takes over and adds skins, textures, certain lighting effects, bump mapping, tessellation, and so on.
All of the "eye candy" is added to the wireframe the CPU creates.

Games have one main thread that runs on one CPU core.
Other threads run on other cores, but very few games use those cores to 100%.
Take a 4-core processor as an example, to keep the math simple:
Core 1, main thread: 100% usage.
Core 2, second thread: 25% usage.
Core 3, third thread: 50% usage.
Core 4, fourth thread: 25% usage.
Windows adds these up and reports 50% CPU usage, even though the game's main thread is using 100% of its core (see the quick sketch below).
So the only way to increase frame rates is to give core 1 (the main thread) less to do.
The other cores are mostly waiting on information from core 1 (the main thread) to do their work, and their resources cannot be used to help core 1.
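As a quick check of that arithmetic, here is the hypothetical 4-core breakdown above in code form (the numbers are the example's, not a real measurement):

```python
# Hypothetical 4-core game load: the main thread has maxed out core 1,
# but the overall figure Windows reports hides it by averaging.
core_usage = {"core 1 (main thread)": 100, "core 2": 25, "core 3": 50, "core 4": 25}

overall = sum(core_usage.values()) / len(core_usage)
print(f"Reported overall CPU usage: {overall:.0f}%")  # 50%
print(f"Busiest core: {max(core_usage.values())}%")   # 100% -> the real limit on frame rate
```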

Most of the high benchmark scores you are chasing will not be possible with your current computer.
Things you can do:
Go to Task Manager > Startup and disable everything except mouse, sound, and antivirus.
Everything else is just running in the background, using CPU cycles for nothing.

Reinstall Windows fresh: complete wipe and format.
Install only Windows and drivers.
Install the one game you want to benchmark.
Anything else you install will lower benchmark scores.
Run your benchmarks.
This is the best your computer will benchmark.
As soon as you start installing other games, programs, software, etc., your benchmark scores will slowly get lower.
 

0Ware
Let's try this again.
You do not seem to understand how PC gaming works. [...]
already tried those
 
Deleted member 2947362
Just out of interest, you say running 3200 MHz CL22 RAM equates to 13 or whatever latency; how do you work that out?
And how much better is CL16 for latency in games?
Sorry, I'm out of my depth here lol
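For what it's worth, the usual back-of-envelope conversion from CAS latency to nanoseconds is CL divided by half the transfer rate; a quick sketch (this is the standard rule of thumb, not something stated in the thread):

```python
# First-word latency in nanoseconds: latency_ns = 2000 * CAS_latency / transfer_rate_MT_per_s
def first_word_latency_ns(cas_latency, mt_per_s):
    return 2000 * cas_latency / mt_per_s

print(first_word_latency_ns(22, 3200))  # 13.75 ns -> the "13 or whatever" figure
print(first_word_latency_ns(16, 3200))  # 10.0 ns  -> CL16 at the same 3200 MT/s
```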
 

Karadjgne
I will also argue that 100% GPU utilization is the best one to have. But at the end of the day, utilization is a metric that should be ignored unless performance requirements are not being met
Everything takes time, and time is a resource too. At 100% usage there's no extra time for any added instructions, no cushion, which means that time has to be added to the frame render time: FPS goes down, or you lower the eye candy to compensate.

A GPU closer to 70% usage is still productive but has a resource cushion, so any additional instructions can be accommodated without affecting the total time to render a frame in any meaningful way.
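A minimal frame-time sketch of that cushion idea (all numbers invented for illustration; a 100 fps target means a 10 ms budget per frame):

```python
FRAME_BUDGET_MS = 10.0  # 100 fps target

def frame_time_ms(base_load_ms, surprise_ms):
    """Time to render a frame when unexpected extra work shows up."""
    return base_load_ms + surprise_ms

# ~70% loaded: a 2 ms surprise still fits inside the 10 ms budget, fps holds.
print(frame_time_ms(7.0, 2.0))   # 9.0 ms  -> still ~100 fps
# Already at 100% of the budget: the same surprise blows past it.
print(frame_time_ms(10.0, 2.0))  # 12.0 ms -> roughly 83 fps
```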
 
Deleted member 2947362
Everything takes time, and time is a resource too. At 100% usage there's no extra time for any added instructions, no cushion, which means that time has to be added to the frame render time: FPS goes down, or you lower the eye candy to compensate.

A GPU closer to 70% usage is still productive but has a resource cushion, so any additional instructions can be accommodated without affecting the total time to render a frame in any meaningful way.
So in layman's terms, what you're saying is: the CPU processes the data to pass on to the GPU, and the CPU only becomes a bottleneck when it can't process that data fast enough for the GPU.

The CPU can become a bottleneck if the data is highly complex, or if the resolution is low enough that the CPU reaches the limit of its ability with the data it's processing, which could also be related to how well the game engine's code is optimised for the CPU's architecture.

The GPU can become a bottleneck if the screen resolution saturates the GPU's processing ability and/or its VRAM bandwidth or VRAM amount, or a combination of all of these.

Thus, if your system is running a game and has free resources, your system does not have a bottleneck in the game you are playing.

It could be down to many things why it's not performing the way you expect: driver revisions, the type of storage drives, overclocks, how well the game is optimised for your CPU and GPU; loads of variables can come into play.
 
Thus, if your system is running a game and has free resources, your system does not have a bottleneck in the game you are playing.
A lot of game engines also have limiters in place, because otherwise they have the potential to run the hardware into the ground (which was the case with Amazon's MMO blowing up GPUs) or to prevent the rest of the system from running in a useful manner.

For example, the pinball game Windows used to come with had a problem in Windows XP: it'd drive the CPU up to 100% utilization. Not because it was that intense of a game, but because there was no frame rate limiter, so it'd run at, according to the author, "one million frames per second."
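A minimal sketch of the kind of limiter being described, capping a render loop at a target frame rate (illustrative only; real engines use higher-precision timing):

```python
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds per frame

def run_frames(n):
    for _ in range(n):
        start = time.perf_counter()
        # ... update and render the frame here ...
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)  # idle instead of spinning flat out

run_frames(120)  # roughly two seconds at a capped 60 fps
```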
 
Deleted member 2947362
A lot of game engines also have limiters in place, because otherwise they have the potential to run the hardware into the ground (which was the case with Amazon's MMO blowing up GPUs) or to prevent the rest of the system from running in a useful manner.

For example, the pinball game Windows used to come with had a problem in Windows XP: it'd drive the CPU up to 100% utilization. Not because it was that intense of a game, but because there was no frame rate limiter, so it'd run at, according to the author, "one million frames per second."
I knew some games had to be frame-rate limited to make them playable / not run stupidly fast.

But blowing up hardware due to pushing too many frames? Never heard of that.

These devices are designed to run code as fast as they can, within their thermal limits, at their given frequency, for extended periods of time.

Although I know mining crypto can put excessive stress on VRMs and GPUs.

But those cards are left running 24/7 for months or even longer, and a lot of them are overclocked and/or running a modded BIOS.

I would have thought that with gaming it isn't an issue, because you're not gaming 24/7 for months at a time before you finish playing your game and go to bed lol

Surely the hardware would downclock itself to prevent damage from the thermals that would ensue if such an event were to occur?

Well, that is unless you overvolt and mod your card past its official rated specs; then fair enough, those are the risks with mods and overvolting: you've tuned your card past its officially supported specs, so parts are going to wear a lot quicker.

A bit like tuning a standard car engine past its manufacturer's specs: it's going to wear quicker.

PC GPUs blowing up because a game engine didn't limit the frames... I find that hard to swallow if I'm honest.
 
Surely the hardware would downclock itself to prevent damage from the thermals that would ensue if such an event were to occur?
It depends on what those sensors act upon and how fast the firmware in the GPU reacts. You can have an event that lasts for milliseconds cause serious damage to hardware, yet it isn't caught because the firmware either used an average or polled once a second.

Semi-related: I had to troubleshoot an issue for a company I worked for where the system aborted a step, but none of the flags that describe why the abort happened were set. It turned out the event happened much faster than the part that sets those flags, so it never saw the event in the first place.
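A toy illustration of how once-a-second (or averaged) telemetry can miss a millisecond-scale event entirely (numbers invented for the example):

```python
# One second of power draw sampled every 1 ms, with a 20 ms spike in the middle.
power_w = [200] * 1000
power_w[500:520] = [400] * 20

polled_once_per_second = power_w[0]              # firmware sampling at t=0 sees 200 W
one_second_average = sum(power_w) / len(power_w)

print(polled_once_per_second)  # 200 -> the spike is never observed
print(one_second_average)      # 204.0 -> the spike is nearly invisible in the average
```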
 
Deleted member 2947362
It depends on what those sensors act upon and how fast the firmware in the GPU reacts. You can have an event that lasts for milliseconds cause serious damage to hardware, yet it isn't caught because the firmware either used an average or polled once a second.

Semi-related: I had to troubleshoot an issue for a company I worked for where the system aborted a step, but none of the flags that describe why the abort happened were set. It turned out the event happened much faster than the part that sets those flags, so it never saw the event in the first place.
That makes sense.
Or could the software have exposed an unknown flaw in the design of the hardware?

I can understand how malicious code could destroy hardware.
But game engines destroying hardware by pushing too many FPS?
That's new to me, and I can't say I have ever had that happen or know of anyone whose graphics card was destroyed because it was pushing too many frames lol (watch out, RTX 4090 owners!)
I'm not trying to undermine what you're saying, I just find it hard to... well, believe, if I'm totally honest.

To be fair, I just benchmarked my RX 5600 XT using Microsoft Solitaire:
Max 60 FPS
Min 57 FPS
So I guess I don't have to worry about my graphics card blowing up.

I'm more shocked it dropped 3 FPS in Solitaire lol WTF?
 
But game engines destroying hardware by pushing too many FPS?
That's new to me, and I can't say I have ever had that happen or know of anyone whose graphics card was destroyed because it was pushing too many frames lol (watch out, RTX 4090 owners!)
I'm not trying to undermine what you're saying, I just find it hard to... well, believe, if I'm totally honest.
I'm too lazy to look it up, but check out RTX 3090 failures on Amazon's MMO.

But a lot of the problem was related to the more powerful GeForce 30-series cards' power delivery circuitry allowing spikes of up to 200% TBP for ~20 ms.
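The same arithmetic shows why a ~20 ms excursion barely moves an averaged reading even though the power stages still have to survive the peak (the 350 W figure is the 3090's rated TBP; the 200% and ~20 ms come from the post above, the rest is illustration):

```python
tbp_w, spike_w, spike_ms, window_ms = 350, 700, 20, 1000  # 700 W = 200% of TBP

average_w = (spike_w * spike_ms + tbp_w * (window_ms - spike_ms)) / window_ms
print(average_w)  # 357.0 W -> a one-second average barely notices the 700 W peak
```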
 
Deleted member 2947362
I'm too lazy to look it up, but check out RTX 3090 failures on Amazon's MMO.

But a lot of the problem was related to the more powerful GeForce 30-series cards' power delivery circuitry allowing spikes of up to 200% TBP for ~20 ms.
Well, there ya go. That sounds more like a hardware issue to me than too many FPS.