New GPU, low usage!

Sep 4, 2022
Hello everyone.
So recently I upgraded my GPU, going from a 1050 Ti to a 3060 Ti. Everything seemed fine at first: games like Rocket League, CoD Warzone and CS:GO gave me really good FPS. But in games like Valorant, League of Legends and Fortnite, performance is the same or even worse with the new GPU. I also noticed that my GPU usage in those three games never goes above 20%-40%, which might be costing me performance. From what I've researched, people with the same build as mine can get 500 FPS in Fortnite, while I barely get 200-250.

I've tried changing power plans, updating chipset and GPU drivers, and almost everything else, but the issue keeps happening in the games I named.

GPU: RTX 3060 Ti
CPU: i7-8700
RAM: 16GB

Please help if you know what's wrong here, I would really appreciate it!
 

ötzi

Apr 2, 2022
What resolution do you play at? The benchmarks I'm looking at seem in line with what you're describing: 200-250 FPS at 1080p. Fortnite, Valorant and League of Legends are CPU-heavy games; if you want more performance in them, you may want to upgrade the CPU.
 
Sep 4, 2022
I usually play at 1920x1080 in those games. But that still doesn't answer my main problem, which is the low GPU usage. I think I'm losing performance here with a much better GPU. With my old 1050 Ti I would get the same FPS in those games as I get now with the new card, which makes no sense to me...
 
FPS can be limited by the CPU or the GPU.
Light games like CS:GO are CPU limited at 1080p.
Other games are even more CPU dependent; to get more FPS you need a faster CPU.
Running a 3060 Ti at low settings at 1080p is ridiculous.
I run most games at max settings and get between 75-175 FPS (Rust, War Thunder, FS, DCS World, etc.) with a tweaked R5 5600X.
A faster CPU would get higher FPS.
Turn up the graphics settings (except draw distance) and your FPS should stay about the same in most games while giving higher GPU usage.
The CPU draws the frame's wireframe; the video card makes it pretty.
 

ötzi

I usually play at 1920x1080 in those games. But that still doesn't answer my main problem, which is the low GPU usage. [...]

Actually it makes a lot of sense: the CPU is the limiting factor here, which is why you're getting the same performance as with the 1050 Ti in those games. The CPU can't keep up with the GPU.
 
The performance of the CPU, along with whatever else it has to run on top of the game, determines the maximum frame rate you can achieve, not the GPU. The GPU only determines whether you actually reach that frame rate.

If you're happy with the frame rate but unhappy seeing low GPU usage, crank up the quality settings and the rendering resolution.
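As a rough illustration of that point (my own toy model, not anything measured in this thread): the pipeline's frame rate is capped by the slower of the two stages, and GPU usage is roughly the fraction of the GPU's capacity that the CPU can actually feed. The numbers below are invented.

```python
# Toy bottleneck model: each frame is prepared by the CPU and then
# rendered by the GPU, so throughput is capped by the slower stage.

def effective_fps(cpu_fps_cap: float, gpu_fps_cap: float) -> float:
    """Frames per second the whole pipeline can sustain."""
    return min(cpu_fps_cap, gpu_fps_cap)

def gpu_usage(cpu_fps_cap: float, gpu_fps_cap: float) -> float:
    """Approximate GPU utilization as a fraction of its capacity."""
    return min(cpu_fps_cap, gpu_fps_cap) / gpu_fps_cap

# A CPU feeding ~250 frames/s in a light e-sports title, with a GPU
# able to render ~1000 frames/s at low settings (made-up numbers):
print(effective_fps(250, 1000))  # 250 -> same FPS as a much weaker GPU
print(gpu_usage(250, 1000))      # 0.25 -> the reported 20%-40% usage
```

With those assumed caps, a much faster GPU changes nothing until the CPU cap moves, which is exactly what the OP is seeing.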
 
Exactly. If you are running on low settings you are effectively limiting yourself to the CPU, because the 3060 Ti is basically sitting there bored: it finishes frames so quickly that your CPU can't keep up. When you turn up the details, the graphics card has to work harder, and your CPU should then keep up with it better.

As someone else said, the i7 8700 is getting older now. You should be OK for the moment, but in the next couple of years you should consider upgrading that part. The 8700 is an 8th gen chip and Intel is about to release 13th gen. It's wild that technology has progressed that far already, but that's the way it goes.
 
Sep 4, 2022
FPS can be limited by the CPU or the GPU. [...] Turn up the graphics settings (except draw distance) and your FPS should stay about the same in most games while giving higher GPU usage.
Thanks for answering.
The thing is, I don't want to run max settings. For example, in Valorant I like to have everything on low to get the most performance, but I shouldn't be getting only 200-250 FPS, since that's what I would get with a 1050 Ti. In CS:GO I do the same, everything on low, and I run it at 500-600 FPS where I used to get around 200, so there was a huge upgrade there. But in Valorant I don't see any difference, which is so weird...
 
Sep 4, 2022
The performance of the CPU, along with whatever else it has to run on top of the game, determines the maximum frame rate you can achieve, not the GPU. [...]
Thanks for answering.
That does make a lot of sense. But one thing still doesn't add up for me: I've seen people with the exact same build as mine, or even worse, getting more FPS in Fortnite, for example. So if I have a better build, why am I not getting way more FPS than people with the same or lower builds?
Maybe it's because I'm not getting the full performance out of my GPU...
 
The thing is, I don't want to run max settings. For example, in Valorant I like to have everything on low to get the most performance, but I shouldn't be getting only 200-250 FPS. [...]

I've never understood people who want 200+ FPS, since past 120 FPS it doesn't get that much smoother than it already was. I have a 144Hz G-Sync monitor.

Also, say you have a 60Hz monitor: 140 out of your 200 FPS go out the door unused, because the monitor can only ever show its refresh rate, 60 frames per second. This can actually cause stuttering, since you're splicing frames out of the film reel, so to speak.

A CPU can only prepare so many frames at a time, and a better CPU can do this faster. When uncapped, the GPU tries to get to 99% usage, so it keeps asking the CPU for the data it needs to fulfil its role, but if the CPU can't keep up, the GPU has to pull back. That's why you might be getting, say, 40 FPS while the GPU sits at 60% usage. The telltale sign is seeing CPU cores at around 90%.

So if you make the job easier for the GPU, you make the problem worse; that's why 1080p shows a bottleneck more than, say, 1440p or 4K. It's in your best interest to challenge the GPU as much as you can: use DSR or increase the graphics settings to use that extra horsepower.
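The display-side half of that argument can be sketched in a few lines (my own illustration with made-up numbers; this models only what the monitor shows, not the input-latency effects debated later in the thread):

```python
# A monitor can only display as many frames per second as its refresh
# rate, so anything rendered above that is never shown.

def displayed_fps(rendered_fps: float, refresh_hz: float) -> float:
    """Frames per second the screen can actually show."""
    return min(rendered_fps, refresh_hz)

def wasted_frames(rendered_fps: float, refresh_hz: float) -> float:
    """Frames per second rendered but never displayed."""
    return max(0.0, rendered_fps - refresh_hz)

print(displayed_fps(200, 60))  # 60
print(wasted_frames(200, 60))  # 140 -> the "140 out of your 200 FPS" above
```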
 
The thing is, I don't want to run max settings. For example, in Valorant I like to have everything on low to get the most performance, but I shouldn't be getting only 200-250 FPS. [...]
A very simplified answer:
In all games the processor determines the minimum/maximum FPS. It must calculate the position, size, placement, etc. of all objects within a frame; this is referred to as a wireframe. Then it sends it to the video card.
The video card adds surfaces, textures, skins, tessellation, bump maps and so on to the wireframe to make a single frame, or picture, to be displayed.
So running low quality settings gives the video card very little to do, hence low video card usage.
Turning up image quality gives the video card more work to do, increasing usage.
Turning up image quality settings does not affect FPS until you reach a certain point: the point where the video card (or certain parts of it) hits 100% usage. Then FPS starts dropping because the GPU has become the limit.
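The "FPS stays flat until the GPU saturates" behaviour described above can be sketched with a tiny frame-time model (my own illustration; the per-frame millisecond costs are invented):

```python
# Each frame costs the CPU some time (wireframe) and the GPU some time
# (textures etc.); the pipeline runs at the pace of its slower stage.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Sustained frame rate given per-frame CPU and GPU costs in ms."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 4.0  # CPU needs 4 ms per wireframe -> a 250 FPS ceiling
for gpu_ms in (1.0, 2.0, 4.0, 8.0):  # raising quality makes the GPU work longer
    print(gpu_ms, round(fps(cpu_ms, gpu_ms)))
# 1.0 250, 2.0 250, 4.0 250, 8.0 125 -> FPS only drops once gpu_ms > cpu_ms
```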

Most YouTube videos are faked or rigged with highly overclocked and tweaked system settings.
If you click on the CPU-Z link in my sig you can see my 5600X outperforms a 10700K and a 5800X in multithreaded workloads with fewer cores.
 
That makes a lot of sense indeed. But one thing still isn't making sense in my mind: I saw people with the exact same build, or even worse than mine, getting more FPS in Fortnite, for example. [...]
If your PC runs on stock settings, that could explain a lot; it's like tuning a car engine.
Reduce RAM latencies and system latencies, overclock the CPU a bit, overclock PCIe a bit (which reduces PCIe latency), overclock the uncore and cache frequency, and so on.
Being CPU bottlenecked means anything CPU/memory sensitive will translate into minimum-FPS gains at your bottleneck; the CPU can't do anything without RAM, you know?
 
Sep 4, 2022
I've never understood people who want 200+ FPS, since past 120 FPS it doesn't get that much smoother than it already was. [...] So it is in your best interest to challenge the GPU as much as you can.
Thanks for answering.
I'm not saying I badly need those FPS; currently I can play any game I used to play at 144+ FPS. It's just weird how people with the same build or worse get more FPS than me. I thought that with this new GPU I would get way more FPS, but I'm not seeing any difference in those games I mentioned.
I didn't think it was normal to have such low GPU usage in some games; I thought that was the reason my FPS wasn't higher.
 
Sep 4, 2022
A very simplified answer: in all games the processor determines the minimum/maximum FPS. [...] Most YouTube videos are faked or rigged with highly overclocked and tweaked system settings.
Thanks for answering.
I don't know if the videos on YouTube are fake. I just thought that I could get more FPS in some games with a way better GPU, but sadly it's not happening. I never thought I'd struggle to play Fortnite on an RTX 3060 Ti and i7 8700; it's crazy, because in some situations I get under 144 FPS, which for my build doesn't seem to make any sort of sense.
 

There could be lots of reasons:
-Drivers
-Game engine
-Internet speed
-A bottleneck in your system, for example having an HDD instead of an SSD
-Windows updates

For example, before COVID-19 War Thunder ran perfectly, but after numerous patches and updates it is just a stuttering mess for me now, while I see others play fine.
 

jeremy0118

Feb 29, 2016
I've never understood people who want 200+ FPS, since past 120 FPS it doesn't get that much smoother than it already was. [...]

This is such a load of rubbish, lol. I use 240Hz and play CS:GO at a fairly high level, and I can instantly feel the difference between locking my FPS to 240, 300 or 400 in terms of smoothness. If you run 120Hz there is still a big difference in your mouse input and smoothness if you compare 120 FPS to double or even triple that frame rate. Anyone who claims that FPS above your monitor's refresh rate has no significant impact on the experience is clueless or an ultra-casual gamer.
 
There could be lots of reasons. [...] For example, before COVID-19 War Thunder ran perfectly, but after numerous patches and updates it is just a stuttering mess for me now, while I see others play fine.

War Thunder got high-quality texture packs, so it could be low system memory or low video card memory. If either gets full and the game needs a texture not available in video card memory, it must evict something from GPU memory and then transfer the texture/skin from system memory to complete the frame. This causes quick microstutters.
If the needed texture or skin is available in neither GPU memory nor system memory, it must be retrieved from HDD/SSD/M.2 storage. This usually causes a longer hard skip, as the data has to be written to system memory first, then transferred to GPU memory.
Direct transfer from SSD/NVMe to the video card is new to PC gaming and only supported by the newest games and systems.
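That fallback chain (VRAM hit, then system RAM, then storage) can be sketched as a lookup with increasing miss penalties. This is my own illustration; the texture names and latency numbers are invented, chosen only to show why each miss level stutters more than the last:

```python
# Sketch of the texture-fetch fallback chain: the further from the GPU
# the data lives, the longer the hitch when the frame needs it.

FETCH_COST_MS = {
    "gpu_memory": 0.0,   # already resident: no stutter
    "system_ram": 2.0,   # copy RAM -> VRAM: brief microstutter
    "disk": 50.0,        # read disk -> RAM -> VRAM: visible hard skip
}

def texture_fetch_cost(texture: str, vram: set, ram: set) -> float:
    """Extra frame time (ms) to make `texture` available to the GPU."""
    if texture in vram:
        return FETCH_COST_MS["gpu_memory"]
    if texture in ram:
        return FETCH_COST_MS["system_ram"]
    return FETCH_COST_MS["disk"]

vram = {"grass", "sky"}
ram = {"grass", "sky", "tank_skin"}
print(texture_fetch_cost("sky", vram, ram))        # 0.0
print(texture_fetch_cost("tank_skin", vram, ram))  # 2.0  -> microstutter
print(texture_fetch_cost("new_map", vram, ram))    # 50.0 -> hard skip
```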
 
This is such a load of rubbish, lol. [...] Anyone who claims that FPS above your monitor's refresh rate has no significant impact on the experience is clueless or an ultra-casual gamer.

I think you misunderstood me. I also don't appreciate the rudeness of your post in response to mine.

What I mean is that people who uncap their frame rate and get many hundreds of frames way past their monitor's refresh rate are wasting power and GPU usage.

I have a 144Hz monitor and I'd rather have 120-143 FPS than, say, 50-60, as of course that's way smoother, but it's a lot harder to see the difference in smoothness once you go above 144. I would never run uncapped: if I was getting 300 FPS in CS:GO (Source), I would still only be seeing 143, and the other 157 frames would be thrown out. I have V-Sync enabled in the Nvidia Control Panel and off in the game, to keep G-Sync working and not go over the refresh rate.
 

jeremy0118


My apologies, sir.
 

jeremy0118

I think you misunderstood me. [...] What I mean is that people who uncap their frame rate way past their monitor's refresh rate are wasting power and GPU usage. [...]

I was slightly agitated, because I read "you can't see past your monitor's refresh rate" as if it were a fact. There are plenty of people, including myself, who can tell a major difference between 1x, 2x or 3x the FPS of the monitor's refresh rate. That's why professional CS:GO players are always on the latest and greatest processor: the more FPS you push, the higher your advantage. Of course it becomes far less noticeable once you get past, say, 400 FPS. Linus actually did a video collab on this with Shroud a while ago; have a look if that sounds interesting to you. It's not specifically on that topic, but they discuss it, if I remember correctly.
 
So recently I upgraded my GPU, going from a 1050 Ti to a 3060 Ti. [...] GPU: RTX 3060 Ti / CPU: i7-8700 / 16GB RAM
Is this a pre-built or a custom machine?
Do you know what clock speed your CPU runs at under a sustained load?
Have you checked GPU-Z to confirm the PCIe link is running at x16?
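On Nvidia cards, the same PCIe link info GPU-Z shows can also be queried from the command line with nvidia-smi. The parsing helper below is my own sketch: the query fields are real nvidia-smi options, but the sample output strings are invented for illustration.

```python
# Hypothetical helper: check whether an Nvidia GPU's PCIe link trained at
# the full x16 width (a card in the wrong slot often trains at x8 or x4).
# The raw data comes from a real nvidia-smi query, e.g.:
#   nvidia-smi --query-gpu=pcie.link.gen.current,pcie.link.width.current \
#              --format=csv,noheader

def parse_link(csv_line: str) -> tuple[int, int]:
    """Parse one 'gen, width' output line, e.g. '3, 16' -> (3, 16)."""
    gen, width = (field.strip() for field in csv_line.split(","))
    return int(gen), int(width)

def link_ok(csv_line: str, want_width: int = 16) -> bool:
    """True if the link width is at least the expected width."""
    return parse_link(csv_line)[1] >= want_width

print(parse_link("3, 16"))  # (3, 16)
print(link_ok("3, 16"))     # True
print(link_ok("3, 8"))      # False -> worth reseating/checking the slot
```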
 
I was slightly agitated, because I read "you can't see past your monitor's refresh rate" as if it were a fact. There are plenty of people, including myself, who can tell a major difference between 1x, 2x or 3x the FPS of the monitor's refresh rate. [...]

Well, to me it's all in people's heads; take that with a pinch of salt. Even if you have a 250Hz monitor and you're getting 400 FPS, you are still only seeing 250. The monitor cannot do more than its refresh rate.
 
Back to the original post though: it's a CPU bottleneck.

The OP's CPU is only marginally better than my own i7 6800K, even with its 4.0GHz overclock. My CPU would bottleneck an RTX 2080, so of course it will heavily bottleneck the next generation up.

It's a CPU issue.
 

jeremy0118

Well, to me it's all in people's heads; take that with a pinch of salt. Even if you have a 250Hz monitor and you're getting 400 FPS, you are still only seeing 250. [...]

Lol, you are really still arguing that I'm wrong on this. So we all just imagine it, you're saying? Yeah, let me stop arguing with you; it's pointless.
 

jeremy0118

Back to the original post though: it's a CPU bottleneck. [...]

Since you refuse to take anything I'm saying at face value:

View: https://youtube.com/watch?v=OX31kZbAXsA&t=139s

Put it on 2:10 and listen to what he says; that immediately throws your argument out of the window. This goes for any refresh rate: 120, 144, 240. If you still don't believe me, watch the whole video.
 