Question RTX 3080 Mysterious Hindered Performance (Low GPU Usage) (Low FPS)

Oct 16, 2022
I am at a complete loss here. I recently decided it was time to upgrade my graphics card from a 1050 Ti I purchased almost 5 years ago to an RTX 3080. At first I was impressed with the performance increase, but I soon realized my RTX 3080 was not coming close to full utilization. I chalked this up to the power supply (750 W) I had recently installed producing insufficient power to fully utilize this GPU. Today I replaced the 750 W power supply with a 1000 W unit, and my issue of low GPU utilization has persisted. Please note that my Nvidia drivers are up to date (v. 522.25).
Here is a link to screenshots of my Task Manager and GPU-Z readings during games (most noticeable in Overwatch 2) and while rendering in Blender (where the problem is non-existent):
- View: https://imgur.com/a/nb7LwzP

I have read online that I might be bottlenecked by my PCIe Gen 3 motherboard, and that switching to a motherboard with PCIe 4.0 could help performance. Could this be the issue? I have read conflicting statements on this.

Here is a list of my specs -
CPU: AMD Ryzen 5 3600 3.6 GHz 6-Core Processor
Motherboard: Gigabyte B550M DS3H Micro ATX AM4 Motherboard
Memory: Corsair Vengeance LPX 16 GB (2 x 8 GB)
Storage: Samsung 970 Evo Plus 2 TB M.2-2280 PCIe 3.0 X4 NVME Solid State Drive
Video Card: Gigabyte GV-N3080VISION OC-10GD GeForce RTX 3080 10 GB
Case: NZXT H7 ATX Mid Tower Case
Power Supply: Corsair HX1000 Platinum 1000 W
 

Karadjgne

Titan
Ambassador
Monitor? That's kinda important. A 1080p resolution isn't going to challenge a 3080 on a good day, not without certain games like Far Cry with the HD texture pack, which is basically a 4k workload. Want higher usage? Turn ray tracing up to maximum. Put the gpu to work, because at 1080p it's taking a nap.

Think of it as 2 guys carrying groceries. The first guy is scrawny, thin as a rail, looks like he couldn't punch his way out of a wet paper bag. 100 lbs of groceries is going to take everything he has just to get out the door. The second guy is the circus strongman, more muscles than 3 ordinary men. He's going to have that 100 lbs tucked under 1 arm and be considering carrying the scrawny guy with the other. That's the difference between the 1050 Ti and the 3080: one struggles, the other isn't even winded. To make the strongman worried you'd need to load him down with 400 lbs of groceries. That's 1080p vs 4k.

Usage is tricky. And it lies flat out. Usage in a gpu is basically the number of cores in use. Nothing more. What it doesn't say is how hard those cores are being used. Cores have a bandwidth: how much data each can hold/process in any given period. For instance, if your gpu had 1000 cores and it was showing 40% usage, that's 400 cores in use. What it isn't saying is whether each of those cores is using 1% or 99% of its bandwidth, just that it's in use. So 40% usage could mean a massive amount of data processed, or next to nothing. The only number that matters is 100. Get close to that and you'll be at the breaking point where the card can do no more, and anything more will force the fps to tank hard.
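The 1000-core example above can be put in numbers. This is a toy model (not how GPU telemetry is actually gathered), just to show how the same reported usage can hide very different amounts of real work:

```python
# Toy model: "usage" counts how many cores are busy, not how hard each
# one is working, so two very different workloads can report identically.

def reported_usage(cores_busy, total_cores):
    """What a utilization counter shows: percent of cores that are busy."""
    return 100 * cores_busy / total_cores

def actual_throughput(cores_busy, per_core_load, per_core_bandwidth):
    """Data actually processed: busy cores times how loaded each one is."""
    return cores_busy * per_core_load * per_core_bandwidth

TOTAL_CORES = 1000
BANDWIDTH = 1.0  # arbitrary units of data per core

# Scenario A: 400 cores busy, each barely loaded (1% of its bandwidth).
# Scenario B: 400 cores busy, each nearly maxed out (99%).
usage_a = reported_usage(400, TOTAL_CORES)   # 40.0
usage_b = reported_usage(400, TOTAL_CORES)   # 40.0 -- identical reading
work_a = actual_throughput(400, 0.01, BANDWIDTH)  # ~4 units of work
work_b = actual_throughput(400, 0.99, BANDWIDTH)  # ~396 units of work

print(usage_a, usage_b, work_a, work_b)
```

Both scenarios report 40% usage, but one is doing roughly a hundred times the work of the other, which is the sense in which the number "lies".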
 
Oct 16, 2022
Thank you for your insight! I am using an Asus 144 Hz 1080p monitor. Your last paragraph really helped change my view of GPU usage. But realistically, could I get more out of the card with only an upgrade to my CPU?
 

Karadjgne

Titan
Ambassador
The cpu is totally responsible for fps. It takes all the game code, all your inputs, objects, dimensions, all the AI and movement, collates and compiles everything into a frame packet, and the number of times it can do that in 1 second is max fps. That's what gets shipped to the gpu.

The gpu takes that incoming packet, follows the instructions to place all the objects into a wireframe model, adds color and lighting and shadows etc, and renders that at your resolution. The number of times that can be done is the fps you see.

So. If your current cpu can only ship 100 packets, that's all the 3080 has to deal with, rendering at 1080p. Upgrade the cpu and it'll have 150 or 200 packets to deal with. Usage goes up. Normally. It depends on how taxing the packets are. It could be that the 3080 is strong enough at 1080p to handle 1000 packets for that particular game, so 100 going up to 150 really isn't much of a stretch; usage might go from 40 to 42% etc. Or those extra 50 packets could move usage from 40 to 60%, if the gpu would top out at 250, etc.
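The frame-packet pipeline described above boils down to a min() of two rates: the fps you see is capped by whichever of the CPU or GPU is slower. A small sketch, with made-up illustrative numbers:

```python
# Sketch of the frame pipeline: the CPU prepares "frame packets", the GPU
# renders them, and visible fps is limited by the slower of the two stages.

def visible_fps(cpu_packets_per_sec, gpu_frames_per_sec):
    """You never see more frames than the slower stage can produce."""
    return min(cpu_packets_per_sec, gpu_frames_per_sec)

def gpu_usage_pct(cpu_packets_per_sec, gpu_frames_per_sec):
    """If the GPU could render more than the CPU supplies, it idles."""
    return min(100.0, 100.0 * cpu_packets_per_sec / gpu_frames_per_sec)

# Suppose the gpu could render 1000 frames/s at 1080p in some game:
print(visible_fps(100, 1000), gpu_usage_pct(100, 1000))  # 100 fps, 10% usage
print(visible_fps(150, 1000), gpu_usage_pct(150, 1000))  # 150 fps, 15% usage

# Same cpu, but at 4k the same gpu tops out at 120 frames/s:
print(visible_fps(150, 120), gpu_usage_pct(150, 120))    # 120 fps, 100% usage
```

In the 1080p rows, upgrading the CPU raises fps but barely moves usage; in the 4k row the GPU becomes the limiting stage and pins at 100%, which matches the point about resolution below.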

Either way, a 3080 is realistically a 4k card; 1080p is pathetically easy for it in almost every game. Maybe 0.1% of all games can challenge it in any meaningful way.

And you are kinda looking at it backwards. The gpu will only use what it needs to use, in order to get the packets dealt with. You can't get more out of the card without increasing just what it needs in order to deal with that.

It's like picking up a coffee cup. You only use a fraction of grip on the cup, because that's all that's required to pick it up. You don't 'death grip' the cup, there's no need. You can if you wish to, but that doesn't change anything, end result is that you still picked up the cup.

So there is no 'get more' out of the gpu, because the games and resolution and detail settings simply don't require it. Get a 4k monitor and that'll drastically change those results: detail settings, fps, usage etc. 4k has 4x the number of pixels 1080p has, and the gpu has to populate every single one with a color in every single frame.
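The "4x the pixels" claim is straight arithmetic on the resolutions:

```python
# Pixel counts per frame at common resolutions: every one of these pixels
# needs a color computed for every single frame the gpu renders.

resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4k":    (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, count in pixels.items():
    print(f"{name}: {count:,} pixels ({count / pixels['1080p']:.2f}x 1080p)")
# 1080p: 2,073,600 pixels (1.00x 1080p)
# 1440p: 3,686,400 pixels (1.78x 1080p)
# 4k:    8,294,400 pixels (4.00x 1080p)
```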
 

warriorlax1234

Distinguished
Nov 1, 2009
Try turning on supersampling. Set the render resolution to 4k or 8k in game and see if utilization kicks up for the GPU. I game on a 1080p monitor but run games with the render resolution set to 4k or 8k. It still taxes the GPU.
 
Oct 16, 2022
UPDATE JAN/21/2023: BAD MOTHERBOARD
In the time between updates I upgraded my CPU (which provided a slight increase in performance) and more recently upgraded my motherboard, which completely solved the issue. I am now getting much better performance. The Ryzen 3600 absolutely bottlenecks the RTX 3080, but not to the degree I was seeing. After researching online I came to the conclusion that the Gigabyte motherboard must have been worn out (it was around 4 years old) and/or had weak VRMs (voltage regulator modules). Though I am not familiar with these issues, so I could be wrong. Anyway, thank you to everyone who left advice on how to fix this. For anyone else having a similar issue, I recommend ensuring there is no CPU bottleneck first, but never underestimate the damage an old motherboard can cause.
 

Karadjgne

Titan
Ambassador
A cpu can't bottleneck a gpu. The cpu is the source of fps; it is whatever it is. Going overpowered on a gpu will not increase the amount of fps a cpu can supply; that number doesn't change regardless of the gpu used. An underpowered gpu won't render all that the cpu can give, an overpowered gpu will and have room to spare, and a balanced gpu will be the same as both, depending on the game itself.

A 3600 doesn't bottleneck a 3080. If it's got low utilization, it simply means the gpu isn't being graphically challenged. At 4k on a 3080, there's a very small margin of difference between a 3600 and a 5600, simply because the gpu is the limiting factor, not the cpu. At 1080p, that's reversed.

Changing mobos can have an effect, depending on what the old mobo was and the new. It doesn't necessarily mean the old mobo was broken or failing or slowing down; it could simply be the difference between an A320 running the gpu at PCIe 2.0 and the new X570 running the gpu at PCIe 4.0, or an outdated BIOS not allowing Resizable BAR or limiting power to the gpu, etc.