Question: Low GPU Usage In-Game on RTX 3060 Ti After Upgrading GPU Only


ldefinis

Prominent
May 12, 2020
I've run out of options, so I'm coming here as a last resort. I know there are lots of posts about this, but none of them have worked for me and none of them really match my situation. I upgraded my GPU from an Asus 1070 Ti Turbo to the 3060 Ti; everything else in my build stayed the same. My 1070 Ti was getting higher frame rates by 30-40 FPS or more. Some games I can't even run because the frames drop below 10. I can see my GPU usage is stuck at 30% in Task Manager, and it shows 0% load in NZXT CAM.

All power settings are set to performance mode.

The strange part is that when I run a benchmark, GPU usage is as expected, 90% or more. I was using the Unigine Heaven benchmark.

Games I was playing:
Destiny 2: low settings across the board, getting 70-80 FPS
Cyberpunk 2077: unplayable on the RTX 3060 Ti, but I was getting above 60 FPS on my 1070 Ti

Build:
i7-8700
MSI Z390 gaming edge
600W Be Quiet power supply
Zotac RTX3060 Ti
32GB G.SKILL RAM
 

ldefinis

Prominent
May 12, 2020
Change that to High. If the GPU is what's at fault, your fps will tank hard with the additional workload. If fps really doesn't change much, then you are CPU-bound for some reason. If that's the case, you'll need to think in a totally different direction.

But first get the BIOS and drivers straight.

Frames sit at around 90 average on high settings.
 
Mar 16, 2022
Frames sit at around 90 average on high settings.

I am home now, friend; give me a few minutes and I shall see how things go in PvP.
I would like to add that the 5800X is ahead of even Intel's 10th-gen CPUs; it is massively faster, and the performance adds up.

 
Mar 16, 2022
If I'm upgrading, I will go for a 12th-gen i9. No real budget.

The top i9 is hard to cool, so you are going to need some serious cooling. I have added one of the very best AIOs out there.

PCPartPicker Part List

CPU: Intel Core i9-12900K 3.2 GHz 16-Core Processor ($610.99 @ Walmart)
CPU Cooler: EK EK-AIO Elite 360 D-RGB 66.04 CFM Liquid CPU Cooler ($210.99 @ Amazon)
Motherboard: MSI MAG Z690 TOMAHAWK WIFI ATX LGA1700 Motherboard ($292.09 @ B&H)
Memory: G.Skill Trident Z5 RGB 32 GB (2 x 16 GB) DDR5-6400 CL32 Memory ($529.99 @ Newegg)
Power Supply: Corsair RMx (2021) 850 W 80+ Gold Certified Fully Modular ATX Power Supply ($127.99 @ Amazon)
Total: $1772.05
Prices include shipping, taxes, and discounts when available
Generated by PCPartPicker 2022-03-17 22:17 EDT-0400


Make sure the case you already have can actually house this 360 mm radiator; if not, we can go through some cases that can and see which you like.
Also, I would advise a PSU upgrade, as 600 W is cutting it close with a power-hungry chip like this. I have added a great unit for the price already.


EDIT: forgot the motherboard.

MSI have been putting out excellent boards for both AMD and Intel for a long time; I definitely recommend them.
 

ldefinis

Prominent
May 12, 2020
Looks like I have an excuse to build a new computer lol
Thanks a ton man, I appreciate all the help.
 

Karadjgne

Titan
Ambassador
There's no point to the i9. It's a total waste for gaming; its only value is for those who do production work, content creation, etc., where all the E-cores are a major benefit. Otherwise, the 12600K/12700K are right on its heels in gaming and are 100% GPU-bound at 1080p. At that resolution, the 3060 Ti will definitely be a limiting factor.

Why don't you fix the CPU you have? Or move to a 1440p super-widescreen resolution and really tax the GPU to its limits.

Games do not take any real advantage of the i9. They aren't designed for it. They are designed around the i5s in order to reach the biggest market share and still be playable on older generations. A 6-core/12-thread CPU gets you 90%-100% of what the i9 can do in games.
 
Mar 16, 2022
Why don't you fix the CPU you have? Or move to a 1440p super-widescreen resolution and really tax the GPU to its limits.

On the part about moving to a higher resolution:

This person's frame rates will not improve by moving to a higher resolution; they would still be CPU-bound in CPU-limited titles.

At 2560x1440 in Destiny 2, my RTX 3070, which is roughly 25% faster on paper, is actually around 60% faster, all because I run a 5800X versus this person's i7-8700.

So if this person moves to 2560x1440, GPU usage rises a bit, but the FPS does not improve and may even get worse.

Some of us run high-refresh-rate monitors; the reason I use a fast CPU is that my Gigabyte M27Q is 170 Hz.
In the test videos, I showcased the performance: my aiming and reaction times are very fluid even without trying, thanks to the fast response time (0.5 ms) and high frame rates.
Yes, you heard right: I was not even trying, playing Destiny 2 for the first time.

I am a Counter-Strike veteran, so I can rely on muscle memory quite well.

My aim looks similar to an aimbot in some of the videos.


Also, the CPU this person has is not enough for high FPS in Destiny 2. It's a very lightly threaded game that relies on IPC, which is why I made the point that the 8700 is relatively similar to AMD's 2600X.

The Nvidia RTX series all have rather high driver overhead, further pushing the numbers down.

These processors are not high-end; they are mid-to-low-end due to mediocre IPC.

The Ryzen 1000 series is even slower than this, as is anything below Intel's 8th-gen CPUs.

You can't even run a lot of modern games at 60 FPS with 4 cores, so how on Earth would a 2600X, or even an 8700, be classed as high-end?

And this is not limited to Nvidia's higher-end cards.


Denial =/= truth.
 

Karadjgne

Titan
Ambassador
This person's frame rates will not improve by moving to a higher resolution; they would still be CPU-bound in CPU-limited titles.
Nobody's fps goes UP when raising resolution. If running a 4K monitor, you'd be happy as a pig in slop to get 110 fps in anything, being absolutely GPU-bound in everything. Even an R5 1600X does absolutely fine at 4K. Saying that the OP needs a faster CPU isn't true, especially an overpowered production CPU like a 12900K. He'd do just fine, especially at 1080p with a 12400.

So it's a trade-off. Anything beyond the monitor refresh is wasted, meaningless. With a 170 Hz monitor, it'd make no difference whether you got 300 fps or 500 fps; you get 170, period. The only thing those extra fps get you is the possibility of reduced latency, and that's a marginal gain that can easily be indistinguishable depending on the game.
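A back-of-the-envelope sketch of the frame-time arithmetic behind that claim (a toy illustration; the only figures taken from this thread are the 170 Hz refresh and the 300/500 fps numbers):

```python
# Frame-time arithmetic: a 170 Hz panel displays a new frame every
# ~5.88 ms regardless of how many frames per second the GPU produces.

def frame_time_ms(fps: float) -> float:
    """Time between consecutive frames at a given frame rate."""
    return 1000.0 / fps

REFRESH_HZ = 170
print(f"panel refresh interval: {frame_time_ms(REFRESH_HZ):.2f} ms")
for fps in (170, 300, 500):
    print(f"{fps:>3} fps -> a new frame every {frame_time_ms(fps):.2f} ms")

# 300 fps vs 500 fps differs by ~1.3 ms of potential input latency --
# small enough to be indistinguishable in many games.
```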

Moving to a 1080p/1440p superwide (basically two monitors in one screen) would get the OP an amazing field of view and actually help gameplay a lot more than a few extra fps.

You aren't the only CS:GO vet around. I've been playing for years, both competitive and casual, and as any vet knows, it's not the aim so much as the timing and knowing the mechanics of the game. That's how the scout snapshot works. And that timing is specific to the relationship of GPU to monitor to ping, which changes drastically depending on whether you are on Dust II or Office, whether you have an HDD or SSD, your internet connection, hub traffic, distance to nodes...
View: https://youtu.be/OX31kZbAXsA
 
Mar 16, 2022
Nobody's fps goes UP when raising resolution.


Well, explain how you gain no FPS from a resolution increase if you are CPU-limited by 30% or more. If you add a faster CPU and turn up the resolution, then in this case the OP will gain FPS even at 1440p.

You have linked a 240 Hz screen, which is out of context. 60 Hz to 144 Hz is a huge jump in visual quality; 144 Hz to 240 Hz is not. Not everyone needs it, though.
 

Karadjgne

Titan
Ambassador
Well, explain how you gain no FPS from a resolution increase if you are CPU-limited by 30% or more. If you add a faster CPU and turn up the resolution, then in this case the OP will gain FPS even at 1440p.
The CPU has a job. It takes all the info and data necessary to create a single frame: every object, every lighting effect (that isn't post-processing), every dimension, vector, and field-of-view item; it calculates motion from the last frame, deals with AI, everything required, and organizes all of that into a packet which gets sent to the GPU. The number of times a CPU can do that in one second (frames per second) is your fps limit. Resolution plays no part in that at all, nor do GPU-bound details.

Resolution and graphics-bound details are the sole responsibility of the GPU. A GPU is always a limiting factor, a potential bottleneck. At no time can a GPU increase fps above what it receives; it gets what it gets, no matter whether it's an RX 470 or an RTX 3090. If all the CPU can send is 100 fps, then that's what the GPU has to work with.
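A toy model of that two-stage pipeline (not how any real engine is written; all the frame-rate numbers below are invented for illustration):

```python
# Toy model of the CPU -> GPU frame pipeline described above.
# The slower stage caps the pipeline: the GPU can never output more
# frames than the CPU hands it, and vice versa.

def effective_fps(cpu_fps_limit: float, gpu_fps_limit: float) -> float:
    return min(cpu_fps_limit, gpu_fps_limit)

CPU_LIMIT = 100.0        # frames/sec the CPU can prepare (resolution-independent)
GPU_LIMIT_1080P = 160.0  # frames/sec the GPU can render at 1920x1080
PIXEL_RATIO = (2560 * 1440) / (1920 * 1080)      # ~1.78x the pixels at 1440p
GPU_LIMIT_1440P = GPU_LIMIT_1080P / PIXEL_RATIO  # crude: cost scales with pixels

print(effective_fps(CPU_LIMIT, GPU_LIMIT_1080P))  # 100.0 -> CPU-bound at 1080p
print(effective_fps(CPU_LIMIT, GPU_LIMIT_1440P))  # 90.0  -> GPU becomes the limit
```

Raising resolution only lowers the GPU's limit; it can never raise the CPU's, which is the whole point of the min().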

When the GPU receives the packet, it first renders a wireframe, placing every object in its space according to size, dimension, XYZ axis, etc. Then it adds colors, textures, shadows, and lighting, and finish-renders that frame at the final resolution.

1440p is a little over 1.7x more brutal on a GPU than 1080p, due to pixel count.
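For reference, the pixel arithmetic: 2560 × 1440 = 3,686,400 pixels versus 1920 × 1080 = 2,073,600 pixels, a ratio of roughly 1.78x.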

The OP changed from low to high and lost 20 fps. That's normal because of CPU-bound detail adjustments; basically, the CPU is capped where it is while the GPU has room for more. If the OP moved to 1440p in simpler graphics games like CS:GO, he'd most likely not lose a single frame compared with 1080p. The 3060 Ti has headroom; it's a decently capable 1440p card.

In D2, it's a toss-up. He might not lose any fps, or might lose a few, but not many, and details at that resolution can be turned down with no ill effects and still look 100% better than 1080p.

The OP needs to check HWiNFO64 when gaming and look at usage per core, not total usage. D2 and Win11 have both had issues, to the point where both AMD and Intel have had to have vendors release BIOS fixes.
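If you'd rather script that check than eyeball HWiNFO64, here's a minimal sketch using the third-party psutil package (my own suggestion, not anything from the thread; the 90% threshold is arbitrary):

```python
# Per-core CPU usage sample. Total usage can read 40% while one core is
# pinned at 100% -- exactly the CPU-bound case described above.
import psutil  # pip install psutil

per_core = psutil.cpu_percent(interval=1.0, percpu=True)  # one 1-second sample
for core, pct in enumerate(per_core):
    flag = "  <-- possible bottleneck" if pct > 90 else ""
    print(f"core {core}: {pct:5.1f}%{flag}")
```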

The 8700 is still a respectable CPU and does well in many areas, so I'd not write it off based on a sample of one game with questionable optimization, running on an OS that's not exactly trouble-free yet.
View: https://youtu.be/IzitAf2skTE
 
Mar 16, 2022
The number of times a CPU can do that in one second (frames per second) is your fps limit.

I removed the useless theory from the quote, as it is not true. On the bolded part: yes, and when a CPU is not fast enough, it limits the GPU. You have a very wrong idea of what a CPU bottleneck is.

I lost around 50+ FPS going from low to high; rewatch the videos.

All the information is proven.

Good day, sir.
 

Karadjgne

Titan
Ambassador
when a CPU is not fast enough it limits the GPU
No. It most certainly does not limit the GPU. The GPU remains what it is, capable of 100% output; it's just not fully utilized because there's no need. A big difference.

Picture this: you put a nail into the wall to hang a picture. You will use every single muscle in your hand, arm, and shoulder just to hold that hammer and swing it accurately. That does not mean you will use all of the strength in your hand, arm, or shoulder to do so. By your logic, you'd have used a much larger hammer and swung it with all your strength instead. That's not needed to drive the nail in. Unless you find a stud in the wall, in which case you have strength in reserve to hit the nail harder.

A GPU rendering frames in a game is identical. It will use everything it needs to get the frames on screen, every core, every bit of VRAM, as much power as necessary, but that doesn't require using the full power of the card. And that changes per game.

Use a 750 Ti in CS:GO at 200 fps and it'll be close to 100% utilized; use a 3090 with the same CPU at 200 fps and you'd be lucky to see it 15% utilized. But CS:GO is so simple you'd only see around 30% usage out of either to get that 200 fps. Change the resolution to 1440p, and getting 200 fps out of the 750 Ti will require 100% usage, but the 3090 would only move to 45%, because the workload demands that much more.
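A sketch of that utilization idea, treating utilization as demanded work over card capacity (every throughput number here is invented; it won't reproduce the exact percentages above, only the shape of the effect):

```python
# Utilization as (frames demanded x cost per frame) / card capacity.
# All capacities and costs are invented work units, purely illustrative.

def utilization(fps_demanded: float, cost_per_frame: float, capacity: float) -> float:
    return min(1.0, fps_demanded * cost_per_frame / capacity)

FPS = 200.0
COST_1080P = 1.0                # arbitrary work units per frame at 1080p
COST_1440P = COST_1080P * 1.78  # ~78% more pixels to shade

CARDS = {"750 Ti": 220.0, "3090": 1400.0}  # invented capacity, work units/sec
for name, cap in CARDS.items():
    print(f"{name:>6}: 1080p {utilization(FPS, COST_1080P, cap):4.0%}, "
          f"1440p {utilization(FPS, COST_1440P, cap):4.0%}")
# 750 Ti: 91% -> 100% (maxed out); 3090: 14% -> 25% (plenty of headroom)
```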

I lost around 50+ FPS going from low to high; rewatch the videos.
Of course, because by using presets and going from low to high, you also changed CPU-bound settings, not just graphical settings. You added details, objects, shading, a bunch of CPU-required stuff, which adds load to the CPU and slows fps output as the cores in use reach 100%. Viewing distance is a CPU setting. If low = 5,000 ft, every single object at 5,001 ft or beyond disappears, no matter how big or small it is. The CPU does not place it, calculate its dimensions based on distance, its interactive behavior or requirements, its vectors, nothing; it's not even there. Move to high and the viewing distance rises to, say, 10,000 ft. Now every object, every bird, blade of grass, mountain, or insect in that 5,000-10,000 ft range gets added in, visible. That adds a pile of work for the CPU to place those objects, especially things like grass.
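A toy version of that view-distance culling (object counts and distances are invented; real engines cull far more cleverly):

```python
# Raising the view-distance preset widens the cull radius, so the CPU
# must place and update more objects every frame.
import random

random.seed(1)
# 10,000 invented objects scattered between 0 and 10,000 ft away
objects = [random.uniform(0, 10_000) for _ in range(10_000)]

def cpu_workload(view_distance_ft: float) -> int:
    """Objects the CPU must process: everything inside the cull radius."""
    return sum(1 for dist in objects if dist <= view_distance_ft)

print("low  (5,000 ft): ", cpu_workload(5_000))   # roughly half the objects
print("high (10,000 ft):", cpu_workload(10_000))  # all of them -> more CPU work
```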

The OP lost a little over 22% performance by moving from low to high. So did you; you only see a bigger frame loss because 22% of your frame rate is a larger number. 20% of 100 is 20; 20% of 200 is 40. Your loss of frames is proportional, give or take variance.
 
Mar 16, 2022
Oh god, please just stop responding. Your mental gymnastics and attempts to appear smarter than you really are are taking the piss.

Reported for misinformation and for presenting it as fact.