Question 5060ti slacking on the job, what could be the culprit?

Ottomic

Mar 20, 2017
Hi, this is my computer:

Mobo: Aorus X470 Ultra Gaming
CPU: Ryzen 5 5600x
GPU: RTX 5060 Ti
PSU: Corsair RM750x
RAM: 32GB (4x8GB) DDR4

This has been happening consistently in at least a couple games: Stalker 2 and Monster Hunter Wilds. (1440p, medium-ish graphical settings)

The game will run under 60 FPS for extended periods of time; however, the GPU and CPU stay well below maximum utilization (around 60-80%). I checked whether the problem was individual GPU threads not hitting 100%, but no, they all seem to hover around the same 60-80% utilization. Sometimes it comes up to 60 FPS, but it seems more a case of the game having to catch up to the components than the other way around.

Going over all the different metrics in Open Hardware Monitor, the only thing I can see consistently maxing out is the GPU bus utilization, which is jumping like crazy between 0 and 100%.

Now, the first time I noticed this was while testing a dual-GPU setup (an RTX 3070 upscaling the output of the 5060 Ti). The x470 shares bandwidth between the x16 and x8 PCIe slots (PCIe 3.0, to be specific), so I figured maybe two GPUs was too much to ask of the old girl, but testing with only the 5060 Ti in the main PCIe slot still shows the bus at pretty much full use all the time.

Since the readings are pretty much identical with 1 or 2 GPUs, the argument for the PCIE slot being the culprit is quite suspect, but quite honestly, I am pretty stumped as to what could otherwise be causing this. The motherboard is definitely the oldest component in the computer, but it would seem strange to me that would be the problem.
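In case it helps anyone diagnose along with me, this is how I've been double-checking what link the card actually negotiated. The nvidia-smi query in the comment is the real command; the sample output string below is made up for illustration:

```python
# Query the current vs. maximum PCIe link the GPU negotiated. Requires the
# NVIDIA driver's nvidia-smi tool; the actual command to run is:
#   nvidia-smi --query-gpu=name,pcie.link.gen.current,pcie.link.width.current,pcie.link.gen.max,pcie.link.width.max --format=csv,noheader

def parse_pcie_link(csv_line: str) -> dict:
    """Parse one CSV line of the query above into a dict."""
    name, gen_cur, width_cur, gen_max, width_max = [f.strip() for f in csv_line.split(",")]
    return {
        "name": name,
        "current": f"PCIe {gen_cur}.0 x{width_cur}",
        "max": f"PCIe {gen_max}.0 x{width_max}",
    }

# Hypothetical sample output for a 5060 Ti in an x470 board's top slot:
sample = "NVIDIA GeForce RTX 5060 Ti, 3, 8, 5, 8"
link = parse_pcie_link(sample)
print(link["current"])  # PCIe 3.0 x8
print(link["max"])      # PCIe 5.0 x8
```

One caveat: the current link gen drops at idle for power saving, so check the reading while a game is actually running.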

Thank you in advance for any advice on the matter!
 
the argument for the PCIE slot being the culprit is quite suspect, but quite honestly, I am pretty stumped as to what could otherwise be causing this. The motherboard is definitely the oldest component in the computer, but it would seem strange to me that would be the problem.
Well, it is the MoBo issue.

Your MoBo has PCI-E 3.0 x16 slot.
Specs: https://www.gigabyte.com/Motherboard/X470-AORUS-ULTRA-GAMING-rev-10/sp#sp

While the GPU you have is PCI-E 5.0 x8.
Specs: https://www.techpowerup.com/gpu-specs/geforce-rtx-5060-ti-8-gb.c4246

PCI-E is backwards compatible, but your GPU is currently running in PCI-E 3.0 x8 mode.
That gives ~13% reduced performance on 2K (1440p).
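For reference, here are the raw bandwidth numbers behind that. This is a rough sketch using the per-lane transfer rates and 128b/130b encoding that PCI-E gens 3 through 5 use, ignoring any protocol overhead beyond the encoding:

```python
def pcie_bandwidth_gbps(gen: int, lanes: int) -> float:
    """Theoretical one-way bandwidth in GB/s for PCIe gens 3-5.
    Raw rates: gen3 = 8 GT/s, gen4 = 16 GT/s, gen5 = 32 GT/s per lane,
    all with 128b/130b encoding."""
    raw_gt_per_s = {3: 8, 4: 16, 5: 32}[gen]
    effective_gb_per_s = raw_gt_per_s * (128 / 130)  # effective Gb/s per lane
    return effective_gb_per_s / 8 * lanes            # convert to GB/s, all lanes

print(f"PCIe 3.0 x8:  {pcie_bandwidth_gbps(3, 8):.1f} GB/s")   # ~7.9 GB/s (what the card gets now)
print(f"PCIe 3.0 x16: {pcie_bandwidth_gbps(3, 16):.1f} GB/s")  # ~15.8 GB/s
print(f"PCIe 5.0 x8:  {pcie_bandwidth_gbps(5, 8):.1f} GB/s")   # ~31.5 GB/s (what the card is built for)
```

So in that x16 slot the card has roughly a quarter of the bandwidth it was designed around.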

[Chart: minimum FPS at 2560x1440, relative performance across PCIe configurations]


Source: https://www.techpowerup.com/review/nvidia-geforce-rtx-5090-pci-express-scaling/32.html

If your MoBo had a PCI-E 5.0 x16 slot, the GPU could use its full bandwidth and wouldn't be bandwidth bound.
 
I thought that could be the problem, but just 16% reduced performance? I mean, looking at the numbers it doesn't feel like I'm just short by that margin.

Also, I'm assuming PCIe slots are designed to outstrip anything the GPU is planned to utilize, so are there any resources on what throughput a game expects to have? I'm not considering upgrading the socket (and CPU) yet, but I could relatively easily upgrade to an x16 4.0 mobo if necessary; I just don't want to be throwing money blind in the hopes of fixing this problem.
 
I thought that could be the problem, but just 16% reduced performance? I mean, looking at the numbers it doesn't feel like I'm just short by that margin.

Also, I'm assuming PCIe slots are designed to outstrip anything the GPU is planned to utilize, so are there any resources on what throughput a game expects to have? I'm not considering upgrading the socket (and CPU) yet, but I could relatively easily upgrade to an x16 4.0 mobo if necessary; I just don't want to be throwing money blind in the hopes of fixing this problem.
It's just the game.
 
so are there any resources on what throughput a game expects to have?
Well, there are, but not in the sense you think of.

E.g. Stalker 2's minimum system requirement for the GPU is a GTX 1060 6GB. That's a PCI-E 3.0 x16 GPU. So the bare minimum is what a GTX 1060 6GB can deliver.
For Monster Hunter Wilds, the minimum is a GTX 1660 6GB. Also a PCI-E 3.0 x16 GPU.

The issue could be both: the game itself and the PCI-E slot revision. Most likely both combined, since in both games your GPU operates at PCI-E 3.0 x8, while the game expects to see a PCI-E 3.0 x16 GPU or better.

Now, I can't tell whether those games can make sense of GPUs that operate in x8 mode, or whether they're coded to work best with x16 GPUs. After all, x8 GPUs like the RTX 4060 are much newer than the GTX 1060/1660.

but I could relatively easily upgrade to a 16x 4.0 mobo if necessary
Best would be a MoBo with a PCI-E 5.0 x16 slot, to remove any and all bandwidth issues. But when part of the issue is the game itself (poor optimization), a new MoBo won't fix your issue fully. At least then you'd know it isn't due to hardware, but instead due to software.
 
Is it the 5060ti 8 or 16gb version out of interest?
16gb.

Best would be a MoBo with a PCI-E 5.0 x16 slot, to remove any and all bandwidth issues. But when part of the issue is the game itself (poor optimization), a new MoBo won't fix your issue fully. At least then you'd know it isn't due to hardware, but instead due to software.
Indeed, but for now I just want to hold out with what I have for at least a couple more years without any extravagant spending. Next time I'm getting the whole kit and caboodle, possibly even an AM5 prebuilt if I can get a good price on one, but for now I'm more concerned with getting the most out of the components I have.

Running on an x8 3.0 bus in Monster Hunter Wilds does not impact performance to a noticeable degree.
I'm not particularly concerned about what benchmark my GPU can reach in these games (which I absolutely expect to be pretty horrid); what really bugs me is how both the GPU and CPU don't seem to feel like working, just hovering at a comfy 60% or so, when I would expect at least one of them to be sweating its figurative ass off.

Anyway, I've managed to get a Gigabyte B550 GAMING X for about 80 bucks. Seems like a reasonable price, and it has a main x16 4.0 PCIe slot. Honestly, I've had worse, and it might be worth a shot. Here's hoping.
 
I'm not particularly concerned about what benchmark my GPU can reach in these games (which I absolutely expect to be pretty horrid); what really bugs me is how both the GPU and CPU don't seem to feel like working, just hovering at a comfy 60% or so, when I would expect at least one of them to be sweating its figurative ass off.

Anyway, I've managed to get a Gigabyte B550 GAMING X for about 80 bucks. Seems like a reasonable price, and it has a main x16 4.0 PCIe slot. Honestly, I've had worse, and it might be worth a shot. Here's hoping.

The point was your 3.0 bus is not holding you back, not the actual result of the benchmark.
 
The point was your 3.0 bus is not holding you back, not the actual result of the benchmark.

Right, I'm just saying I'm not surprised at those numbers. However, I'm struggling to find what else could be the cause of both the CPU and GPU refusing to put in the work when the game is below target FPS. I will freely admit I've picked two of the worst-optimized games of the current year, but at least one of the CPU or GPU should be maxing out, or damn near its limit, yet they seem to be just chilling at about 60%. If you have any ideas about what this could be, or if there's a particular metric you believe could be the culprit, by all means, please let me know.
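For what it's worth, this is roughly the decision tree I've been using when staring at the monitoring numbers. The thresholds are my own guesses, nothing authoritative; the interesting case is the last branch, which is where I seem to be stuck:

```python
def guess_bottleneck(gpu_util: float, max_cpu_core_util: float,
                     fps: float, target_fps: float = 60) -> str:
    """Crude heuristic for reading utilization numbers.
    gpu_util and max_cpu_core_util are percentages; note it's the busiest
    single CPU core that matters, not the whole-CPU average."""
    if fps >= target_fps:
        return "hitting target (possibly capped by vsync/frame limiter)"
    if gpu_util > 90:
        return "GPU-bound"
    if max_cpu_core_util > 90:
        return "CPU-bound (single-thread)"
    # Below target FPS with neither chip maxed: frame time is going somewhere
    # else -- bus transfers, VRAM swapping, shader compilation, engine stalls.
    return "neither maxed: suspect transfers/stalls (PCIe, VRAM, engine)"

print(guess_bottleneck(gpu_util=65, max_cpu_core_util=60, fps=48))
# neither maxed: suspect transfers/stalls (PCIe, VRAM, engine)
```

One thing I learned while doing this: overall CPU utilization averages across all cores, so a "comfy 60%" overall can still hide one core pegged at 100%, which is why the heuristic looks at the busiest core.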