[SOLVED] 3080ti 75-85% gpu usage but framerate stable at 4k 60 Ultra-High?

Aug 21, 2021
So playing God of War last night I noticed my GPU was only hitting about 85% usage at most, with dips into the 50-60s. I'm playing at 4K native, no DLSS, settings maxed. The framerate stayed at a stable 60, but I noticed texture pop-in on a cabinet (no idea if that's related or just a game issue).

I decided to try Far Cry 6 to see what usage looked like there. Playing at 4K again with everything basically maxed except ray-traced reflections off, I was getting a stable 60 frames with a stutter every once in a while, but I've always had stutter in that game regardless of settings and resolution. That game is wonky sometimes.

Core Clocks are correct
Memory Clock is correct
Temps in the 60s to 70ish
Using MSI Afterburner for a custom fan curve and monitoring. No overclock. The core voltage slider is all the way down? Afterburner set it that way, not me.

R7 3800xt
B450 MSI Gaming Pro Carbon
G Skill Flare X running at 3600 cl16
EVGA Supernova 1000w Gold with 3 separate pcie cables independent to gpu.
MSI Gaming X trio 3080ti

Nvidia Drivers up to date.

I'm using Vsync because I play on a TV without VRR and I keep my framerate locked at 60 in game because that's all the framerate I care about achieving.

I recently upgraded the CPU from an R5 1600 AF, the PSU from a 650W EVGA White, and the GPU from an RTX 2060, all within the span of about a month.

The R7 3800XT usually averages 30-40% usage in demanding titles at 4K.

Am I concerned for nothing? Or should gpu usage always be 100% regardless of vsync and framerate cap?
 

tennis2

Judicious
I guess it depends on how often you're at that ~80% GPU usage. Is that your average, or do you only periodically see that?

What FPS do you get if you turn VSync off?
If your test with VSync off still shows a fair number of frames below 60FPS, I'd recommend lowering settings a bit and/or enabling DLSS to ensure even minimum frame rates mostly stay above 60FPS.

Simplified way a PC plays a game:
  1. CPU figures out what needs to be in a given frame (imagine a rough sketch) based on user and game world input. Issues draw call to GPU to tell it what to render.
  2. GPU receives draw call and makes a pretty picture. Sends to monitor when complete.
  3. The GPU can't do any work until the CPU tells it what to draw. Raising graphics settings and/or resolution increases the complexity of the GPU's job, making each frame take longer to render. Lowering settings decreases the complexity of the GPU's job, making each frame take less time to render.
  4. If the GPU finishes rendering a frame before the CPU has finished figuring out what the next frame should contain, the GPU has to wait (<100% GPU usage).
  5. Based on #3 & #4, you should be able to optimize for 90% or greater GPU usage (depending on a game's CPU stress and the CPU/GPU balance of a system)
  6. CPU usage is usually reported as active time across all available threads of a CPU. Most games don't leverage more than 6-7 threads, so monitoring overall CPU usage isn't really useful.
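The wait in steps 3 and 4 can be put into rough numbers with a toy sketch (the millisecond timings below are made-up examples, not measurements from any real game):

```python
# Toy model of the CPU->GPU frame pipeline described above.
# Timings are hypothetical examples for illustration only.

def gpu_usage(cpu_ms_per_frame: float, gpu_ms_per_frame: float) -> float:
    """Fraction of time the GPU spends rendering rather than waiting.

    If the CPU takes longer to prepare a frame than the GPU takes to
    render one, the GPU idles between frames (step 4 above).
    """
    frame_time = max(cpu_ms_per_frame, gpu_ms_per_frame)
    return gpu_ms_per_frame / frame_time

# GPU-bound at 4K: CPU preps in 8 ms, GPU renders in 14 ms -> GPU fully busy
print(f"{gpu_usage(8.0, 14.0):.0%}")   # 100%

# CPU-bound: CPU takes 16 ms, GPU only needs 12 ms -> GPU waits a quarter of the time
print(f"{gpu_usage(16.0, 12.0):.0%}")  # 75%
```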
Far Cry has habitually been used in CPU reviews because it's a CPU-bound game. Not sure about God of War, but reviews suggest the 3080Ti is capable of 75fps at 4K/ultra (using a 5800X).

Since you don't have a VRR display, it's possible there could be times when frame rates drop below your 60fps Vsync. In these situations, yes, you'll get hitches/stutters/etc because there isn't a frame ready for the next 60Hz refresh interval. I get a little fuzzy on VSync and triple buffering, but if a frame juuust misses 60Hz refresh #1 and another one doesn't get delivered in that next interval, the "older" frame stored is still the most recent, so you could get an [effective] 30fps refresh on that "loop". etc etc.
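That "missed refresh" case can be sketched with a toy calculation (assuming a fixed 60Hz refresh and ignoring triple buffering):

```python
import math

REFRESH_MS = 1000 / 60  # one 60Hz refresh interval, ~16.67 ms

def intervals_until_shown(render_ms: float) -> int:
    """How many 60Hz refresh intervals pass before a frame can be displayed.

    Without VRR, a frame that isn't ready at a refresh tick has to wait
    for the next tick, and the previous frame is shown again meanwhile.
    """
    return math.ceil(render_ms / REFRESH_MS)

print(intervals_until_shown(15.0))  # 1 -> frame makes its tick, smooth 60fps
print(intervals_until_shown(17.0))  # 2 -> juuust missed, previous frame repeats:
                                    #      an effective 30fps on that "loop"
```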

Frame rate caps can result in lower GPU usage, since a cap tells the GPU it only has to work hard enough to deliver 60FPS even though it may be capable of, say, 90FPS. VSync shouldn't act as a frame rate cap as far as GPU usage is concerned (IIRC). But it's not hard to test.
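As a back-of-envelope check (taking the review figure of ~75fps at 4K/ultra as a rough assumption):

```python
def expected_usage_under_cap(capable_fps: float, cap_fps: float) -> float:
    """Rough GPU usage when a frame cap keeps the GPU from running flat out.

    The GPU only needs to be busy for the fraction of each interval it
    takes to hit the cap; beyond that it idles.
    """
    return min(cap_fps / capable_fps, 1.0)

# A card capable of ~75fps at 4K/ultra, capped to 60fps -> ~80% usage,
# which lines up with the numbers reported in this thread.
print(f"{expected_usage_under_cap(75.0, 60.0):.0%}")  # 80%
```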
 
Reactions: CameronCant

Aug 21, 2021
80% I would say is my average. 85% sometimes.

I don't have stutters or anything performance-related outside of Far Cry 6, which has frame drops on every platform and hardware configuration out there. For me it's minimal.

I just noticed that usage wasn't going above 80-85 percent and wondered if I should be seeing max usage no matter what.

I will test with VSync and the framerate cap off when I get off work and see what happens.

Thank you for your help
 

tennis2

Judicious
You're certainly at a resolution where you'd expect NOT to encounter CPU limitations, but stranger things can happen, especially in CPU-bound games. I wouldn't be too bothered by >80% GPU usage, TBH.

It'll be interesting to see whether your GPU usage is different with VSync off. Like I said, it SHOULDN'T be, but......
 
Aug 21, 2021
You're certainly at a resolution where you'd expect NOT to encounter CPU limitations, but stranger things can happen, especially in CPU-bound games. I wouldn't be too bothered by >80% GPU usage, TBH.

It'll be interesting to see whether your GPU usage is different with VSync off. Like I said, it SHOULDN'T be, but......
Ok, so I fired up God of War, left it at 4K, maxed everything, turned off VSync, unlocked the framerate cap, and the card immediately went to 99%.

I had no idea that would throttle GPU usage like that, but I guess it's a good thing? I assume it keeps the card from working too hard and getting overly hot? I don't need MAX graphics, so if I lose a little performance with it running around 80-85% I can compensate for that. I'm just more worried about 4K 60 than anything else.

I greatly appreciate your guidance and Thank You for taking the time to respond. I enjoyed reading about how frames are rendered. This is a learning experience and maybe someday someone else will have this come up and find this thread.

Thanks again!
 

tennis2

Judicious
I had no idea that would throttle gpu usage like that but I guess it's a good thing? I assume it will keep card from working too hard and overly getting hot?
I would encourage you to play around with undervolting and power limits as well. You can also set frame rate limits in-game or in the Nvidia control panel (say, 90FPS if you're planning on VSync-ing to 60). Not every game will fully challenge a 3080Ti at 4K/60, so this will reduce power draw, temps, and noise in situations where the full performance isn't needed.

  • Undervolting - In MSI Afterburner, set "Core Clock (MHz)" to +130 (should be stable; greater values depend on the chip lottery, so test for stability). This results in the core clock being 130MHz higher for any given voltage along the frequency-voltage curve, which means less power draw at any given frequency.
  • Power limits - This will limit the maximum power draw of the card. You may notice that even with a power limit, your GPU will still hit the stock boost clocks.
    • I run my 3060Ti @ 70% power limit most of the time.
  • A frame rate cap is just that. If your GPU is capable of outputting 400FPS but your display only shows 60FPS, you can cap the output to a lower value so the GPU doesn't have to work so hard generating frames that won't get displayed. Some games also expose a frame rate cap in the graphics settings. In my experience, you want to use one or the other (in-game or Nvidia control panel), as they don't always work well together.
    • There are some benefits to generating frames faster than your display can show, since VSync should be pulling the most recently generated frame for each 60Hz interval. This mostly helps twitch-shooters/fast-action PvP games, where your reaction time starts from the output of each frame on the screen.
    • Probably more relevant to your usage: a cap above 60FPS reduces stuttering from runt frames that come in slower than 60FPS. If you set your cap to exactly 60FPS (especially since that's controlled by software and by the GPU's ability to ramp frequencies up and down), you could well end up with a lot of hitches. A 90FPS cap reduces those runt frames to a degree. It depends on the frame-time consistency of a given game.
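The headroom argument for a 90FPS cap over a 60FPS cap can be put in numbers (toy arithmetic, assuming a fixed 60Hz display):

```python
REFRESH_MS = 1000 / 60  # one 60Hz refresh interval, ~16.67 ms

def headroom_ms(cap_fps: float) -> float:
    """Slack between the capped frame time and the 60Hz refresh window.

    This is how much a frame can run long before it misses its
    refresh tick and causes a hitch.
    """
    return REFRESH_MS - 1000 / cap_fps

# A 60fps cap leaves zero slack: any slow frame misses its refresh tick.
print(f"{headroom_ms(60):.1f} ms")  # 0.0 ms

# A 90fps cap leaves ~5.6 ms for frame-time spikes before a hitch.
print(f"{headroom_ms(90):.1f} ms")  # 5.6 ms
```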
 
Last edited:
