Question: Is my CPU bottlenecking my GPU?

xnikx

i9-10900K, 3080 Ti, 32GB 3600MHz RAM, Z490 Dark, 850W PSU. I have two monitors, a 2560x1440 240Hz and a 3440x1440 120Hz, and I prefer playing on my 240Hz monitor.

So the issue I'm having is that my GPU is severely underperforming when playing at 2560x1440, though not in all games. The example I'll use here is BFV: max/ultra settings, future frame rendering on, DX11. While playing at 2560x1440 my GPU usage is extremely low, averaging between 50% and 60%, with CPU usage about the same. This results in my FPS being around 120. While gaming at 3440x1440 my GPU is pretty much maxed out at 99%, with the CPU around 60% to 70%, which results in FPS over 200.

I have also noticed that while playing at 2560x1440, some single-core usages will jump as high as 80% to 100%. Why am I getting better performance at a higher resolution? Shouldn't my card still be pushing to provide 240 FPS at 2560x1440, since that's my refresh rate?

There are other games where I'm having the same issue, but some games run perfectly fine. Am I experiencing a CPU bottleneck? I've watched YouTube benchmarks of BFV with the exact same specs as mine where the person gets 99% GPU usage at 1440p.

A few notes:

G-Sync is on, max power is set in the NVIDIA Control Panel and in Windows, XMP is enabled, no overclocks, the latest drivers and Windows updates are installed, and I've tried multiple clean installs.

I recently tried capping my FPS to 237 in the NVIDIA Control Panel, which seemed to have helped a little bit.
 
Yes, your CPU is definitely bottlenecking your GPU. The i9-10900K can even bottleneck an RX 6600. Now, I know that sounds nuts; when I first heard it said in a YouTube video from Tech Deals, I thought "That guy's crazy!" but then he blew my mind by demonstrating it!

I've cued that video up to the part that I'm talking about so you don't have to search; just press play:
If the i9-10900K can even slightly bottleneck an RX 6600, just imagine what it's doing to your RTX 3080 Ti. I was so shocked by this that I never forgot it.
 
So the issue I'm having is that my GPU is severely underperforming when playing at 2560x1440, though not in all games. The example I'll use here is BFV: max/ultra settings, future frame rendering on, DX11. While playing at 2560x1440 my GPU usage is extremely low, averaging between 50% and 60%, with CPU usage about the same. This results in my FPS being around 120. While gaming at 3440x1440 my GPU is pretty much maxed out at 99%, with the CPU around 60% to 70%, which results in FPS over 200.

I have also noticed that while playing at 2560x1440, some single-core usages will jump as high as 80% to 100%. Why am I getting better performance at a higher resolution? Shouldn't my card still be pushing to provide 240 FPS at 2560x1440, since that's my refresh rate?
I find this behavior a little weird, because the CPU is responsible for generating frames and the GPU is responsible for drawing them. But if you really want to know how much the system can push, drop all of the graphical quality settings to low.
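If you want hard numbers instead of eyeballing Task Manager, here's a minimal logging sketch in Python (my own quick hack, nothing official; it assumes you have the psutil package installed and, for the GPU side, an NVIDIA card with nvidia-smi on the PATH). Run it in a console while you play: one core pinned near 100% while the GPU sits well under 99% points at a CPU limit.

# Log per-core CPU usage and overall GPU utilization once per second.
# Assumes "pip install psutil" and nvidia-smi available on the PATH.
# Stop with Ctrl+C.
import subprocess
import psutil

while True:
    # Percent busy for each logical core, sampled over one second.
    cores = psutil.cpu_percent(interval=1.0, percpu=True)
    # Ask the NVIDIA driver for the GPU's utilization percentage.
    gpu = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True,
    ).stdout.strip()
    print(f"GPU {gpu}% | busiest core {max(cores):.0f}%")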

There are other games where I'm having the same issue, but some games run perfectly fine. Am I experiencing a CPU bottleneck? I've watched YouTube benchmarks of BFV with the exact same specs as mine where the person gets 99% GPU usage at 1440p.
If the "exact same specs" have different performance characteristics, then it could be a multitude of other factors, including the software installed and the configuration of the drivers.

Also "bottleneck" isn't really applicable unless there's a performance requirement problem. Not a performance problem in general, but "I want this computer to do X FPS at Y settings and it can't do that no matter what I do"
 
Yes, your CPU is definitely bottlenecking your GPU. The i9-10900K can even bottleneck an RX 6600. Now, I know that sounds nuts; when I first heard it said in a YouTube video from Tech Deals, I thought "That guy's crazy!" but then he blew my mind by demonstrating it!

I've cued that video up to the part that I'm talking about so you don't have to search; just press play:
If the i9-10900K can even slightly bottleneck an RX 6600, just imagine what it's doing to your RTX 3080 Ti. I was so shocked by this that I never forgot it.
That's what I'm struggling with, as the 10900K and 3080 came out around the same time. But it's the only thing that makes sense. I'm just worried about dropping the money for a mobo, CPU, and RAM upgrade and not having it pay off.

Correct me if I'm wrong here, but isn't a CPU bottleneck usually indicated when the CPU usage is above the GPU's?
 
I find this behavior a little weird, because the CPU is responsible for generating frames and the GPU is responsible for drawing them. But if you really want to know how much the system can push, drop all of the graphical quality settings to low.


If the "exact same specs" have different performance characteristics, then it could be a multitude of other factors, including the software installed and the configuration of the drivers.

Also "bottleneck" isn't really applicable unless there's a performance requirement problem. Not a performance problem in general, but "I want this computer to do X FPS at Y settings and it can't do that no matter what I do"
The problem is that if my GPU isn't being fully utilized, then I'm not getting the performance that I invested in. Unless I'm playing at 3440x1440, my performance is basically on par with my old 2080 Ti.
 
The problem is that if my GPU isn't being fully utilized, then I'm not getting the performance that I invested in. Unless I'm playing at 3440x1440, my performance is basically on par with my old 2080 Ti.
However, you say you have watched videos on YouTube of people with basically the same specs running at 2560x1440 (I'm presuming, since you just said 1440p). This means there's nothing wrong with your hardware per se. There could be something wrong with your software configuration.

Also, again, the best way to figure out the actual, absolute highest performance your system can get is to run everything at the lowest graphical quality, including the resolution. If you run BFV at, say, 1024x768 and the frame rate still doesn't improve beyond what you get at 3440x1440 while the CPU utilization is more or less the same, I would say that's a software configuration problem rather than a hardware problem.
 
The example I'll use here is BFV: max/ultra settings, future frame rendering on, DX11.
Do not use DirectX 11 with Battlefield V; it runs like crap and cannot properly take advantage of your hardware. Switch to DirectX 12 and try again. I would be amazed if you noticed no improvement. For reference, I use an i9-10850K and a 3080 and play at 1440p 144Hz.

I get a lot more than 120 FPS and my GPU is slower than yours.
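One caveat: the in-game toggle needs a game restart to apply. If you'd rather flip it outside the game, the switch also lives in BFV's profile file; on my install that's Documents\Battlefield V\settings\PROFSAVE_profile (path and key name from memory, so double-check on your copy before editing):

GstRender.Dx12Enabled 1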
 
i9-10900K, 3080 Ti, 32GB 3600MHz RAM, Z490 Dark, 850W PSU. I have two monitors, a 2560x1440 240Hz and a 3440x1440 120Hz, and I prefer playing on my 240Hz monitor.

So the issue I'm having is that my GPU is severely underperforming when playing at 2560x1440, though not in all games. The example I'll use here is BFV: max/ultra settings, future frame rendering on, DX11. While playing at 2560x1440 my GPU usage is extremely low, averaging between 50% and 60%, with CPU usage about the same. This results in my FPS being around 120. While gaming at 3440x1440 my GPU is pretty much maxed out at 99%, with the CPU around 60% to 70%, which results in FPS over 200.

I have also noticed that while playing at 2560x1440, some single-core usages will jump as high as 80% to 100%. Why am I getting better performance at a higher resolution? Shouldn't my card still be pushing to provide 240 FPS at 2560x1440, since that's my refresh rate?

There are other games where I'm having the same issue, but some games run perfectly fine. Am I experiencing a CPU bottleneck? I've watched YouTube benchmarks of BFV with the exact same specs as mine where the person gets 99% GPU usage at 1440p.

A few notes:

G-Sync is on, max power is set in the NVIDIA Control Panel and in Windows, XMP is enabled, no overclocks, the latest drivers and Windows updates are installed, and I've tried multiple clean installs.

I recently tried capping my FPS to 237 in the NVIDIA Control Panel, which seemed to have helped a little bit.
Yes, the CPU is too weak for the GPU if you're doing graphics-intensive tasks: almost a 23% bottleneck, if I quote PC-Builds' bottleneck calculator. You should be under 5% to be fine. And it's even worse with CPU-intensive tasks, at almost a 40% bottleneck by the same calculator. Definitely not the best choice with a 3080 Ti.
 
And that is a useless number, from an even more useless tool.

"bottleneck" cannot be calculated like that.
You're right, but that doesn't change the point that this CPU-and-GPU combo isn't the best choice. Still, you were right to point it out.
 
You're right, but that doesn't change the point that this CPU-and-GPU combo isn't the best choice. Still, you were right to point it out.
Then why bring it up?

Every system has a "bottleneck".

Might be CPU, might be GPU... RAM, mouse, user... whatever.

All it means is that some part is not letting some other part reach its full potential.
 
Then why bring it up?

Every system has a "bottleneck".

Might be CPU, might be GPU... RAM, mouse, user... whatever.

All it means is that some part is not letting some other part reach its full potential.
Because I thought it would be useful. But I stand corrected, and the information you brought was necessary. That's all.
 
i9-10900K, 3080 Ti, 32GB 3600MHz RAM, Z490 Dark, 850W PSU. I have two monitors, a 2560x1440 240Hz and a 3440x1440 120Hz, and I prefer playing on my 240Hz monitor.

So the issue I'm having is that my GPU is severely underperforming when playing at 2560x1440, though not in all games. The example I'll use here is BFV: max/ultra settings, future frame rendering on, DX11. While playing at 2560x1440 my GPU usage is extremely low, averaging between 50% and 60%, with CPU usage about the same. This results in my FPS being around 120. While gaming at 3440x1440 my GPU is pretty much maxed out at 99%, with the CPU around 60% to 70%, which results in FPS over 200.

I have also noticed that while playing at 2560x1440, some single-core usages will jump as high as 80% to 100%. Why am I getting better performance at a higher resolution? Shouldn't my card still be pushing to provide 240 FPS at 2560x1440, since that's my refresh rate?

There are other games where I'm having the same issue, but some games run perfectly fine. Am I experiencing a CPU bottleneck? I've watched YouTube benchmarks of BFV with the exact same specs as mine where the person gets 99% GPU usage at 1440p.

A few notes:

G-Sync is on, max power is set in the NVIDIA Control Panel and in Windows, XMP is enabled, no overclocks, the latest drivers and Windows updates are installed, and I've tried multiple clean installs.

I recently tried capping my FPS to 237 in the NVIDIA Control Panel, which seemed to have helped a little bit.
A CPU bottleneck would never result in higher frame rates at a higher resolution. If it were a CPU bottleneck, you'd get the same performance at both.

There's something else going on with your system, and honestly it sounds like a compatibility issue between your two screens and FreeSync/G-Sync. Try rebooting your system with just the 16:9 display attached and see what happens.
 
Do not use DirectX 11 with Battlefield V; it runs like crap and cannot properly take advantage of your hardware. Switch to DirectX 12 and try again. I would be amazed if you noticed no improvement. For reference, I use an i9-10850K and a 3080 and play at 1440p 144Hz.

I get a lot more than 120 FPS and my GPU is slower than yours.
DX12 actually runs far worse than DX11 for me.
 
A CPU bottleneck would never result in higher frame rates at a higher resolution. If it were a CPU bottleneck, you'd get the same performance at both.

There's something else going on with your system, and honestly it sounds like a compatibility issue between your two screens and FreeSync/G-Sync. Try rebooting your system with just the 16:9 display attached and see what happens.
The monitors aren't plugged in at the same time; sorry, I probably should have specified that. But isn't your first statement untrue? The higher the resolution, the less CPU-dependent the system becomes?
 
DX12 actually runs far worse than DX11 for me.
That surprises me; that hasn't been my experience.

Are you using two monitors at the same time while they are running at different refresh rates?

For example, displaying things on your ultrawide at 120Hz while you're playing on your 1440p? The reason I ask is that I found it caused problems when the displays were not running at the same refresh rate.

I switch to only my primary display when gaming at high refresh rates to avoid this issue.
 
The monitors aren't plugged in at the same time; sorry, I probably should have specified that. But isn't your first statement untrue? The higher the resolution, the less CPU-dependent the system becomes?
In most cases the gap between different CPUs diminishes substantially at higher resolutions, because the greater demands on the GPU limit the frame rate it can produce. I have found, however, that some games significantly increase the load on the CPU at the same frame rate at higher resolutions.
 
The monitors aren't plugged in at the same time; sorry, I probably should have specified that.
Hmm, it still sounds like some weird sort of configuration problem, but it shouldn't happen if you've only got one of them hooked up. I'm assuming you're using DisplayPort?
But isn't your first statement untrue? The higher the resolution, the less CPU-dependent the system becomes?
No, because that's not how it works. As resolution increases you lean more heavily on the video card, but that doesn't suddenly circumvent your CPU performance. Your CPU is going to limit the maximum FPS you're capable of getting, and it certainly impacts the 1%/0.1% lows.
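A crude way to picture it (my own back-of-envelope model with made-up numbers, not measured data): every frame has to be prepared by the CPU and then rendered by the GPU, so whichever stage is slower per frame sets the ceiling. Dropping the resolution shrinks the GPU's share of the work but leaves the CPU's share alone.

# Whichever stage takes longer per frame caps the frame rate.
def max_fps(cpu_ms_per_frame, gpu_ms_per_frame):
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# Illustrative numbers only: a CPU needing 8 ms per frame caps you
# at 125 FPS no matter how easy you make the GPU's job.
print(max_fps(8.0, 4.0))   # 125.0 -> CPU-bound (think low resolution)
print(max_fps(8.0, 12.5))  # 80.0  -> GPU-bound (think high resolution)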

This is why TPU had to switch from their standardized 5800X GPU test system to a 13900K after the RTX 4090 launch. You can go through the games list and compare like for like; even at 4K, for a lot of them you're seeing big gains due to the CPU:
Here's after the switch: https://www.techpowerup.com/review/msi-geforce-rtx-4090-gaming-x-trio/4.html
Here's before the switch: https://www.techpowerup.com/review/nvidia-geforce-rtx-4090-founders-edition/5.html
 
Hmm, it still sounds like some weird sort of configuration problem, but it shouldn't happen if you've only got one of them hooked up. I'm assuming you're using DisplayPort?

No, because that's not how it works. As resolution increases you lean more heavily on the video card, but that doesn't suddenly circumvent your CPU performance. Your CPU is going to limit the maximum FPS you're capable of getting, and it certainly impacts the 1%/0.1% lows.

This is why TPU had to switch from their standardized 5800X GPU test system to a 13900K after the RTX 4090 launch. You can go through the games list and compare like for like; even at 4K, for a lot of them you're seeing big gains due to the CPU:
Here's after the switch: https://www.techpowerup.com/review/msi-geforce-rtx-4090-gaming-x-trio/4.html
Here's before the switch: https://www.techpowerup.com/review/nvidia-geforce-rtx-4090-founders-edition/5.html
Yes, I am using DisplayPort. This is very frustrating. I feel like I paid a lot of money and am not getting the performance I paid for.

My system is water-cooled, which makes it a bit difficult to swap out parts, but I have swapped out the power supply, RAM, and graphics card and still have the same issue, which leads me to believe it has to be the CPU or motherboard.

I'm just worried about spending over $1,000 to upgrade to the AM5 platform only to have the same issue I'm having now.
 
Yes, I am using DisplayPort. This is very frustrating. I feel like I paid a lot of money and am not getting the performance I paid for.

My system is water-cooled, which makes it a bit difficult to swap out parts, but I have swapped out the power supply, RAM, and graphics card and still have the same issue, which leads me to believe it has to be the CPU or motherboard.

I'm just worried about spending over $1,000 to upgrade to the AM5 platform only to have the same issue I'm having now.
If you're getting higher performance at 3440x1440 than at 2560x1440, it's a configuration problem of some sort. It sounds like your performance at the ultrawide resolution is pretty much as expected. I just can't think of what settings would be changing between the two monitors.

I was thinking maybe not having a DP 1.4+ cable could be it, but I believe you'd have visual output problems, not frame rate problems, if that were the case.
 
If you're getting higher performance at 3440x1440 than at 2560x1440, it's a configuration problem of some sort. It sounds like your performance at the ultrawide resolution is pretty much as expected. I just can't think of what settings would be changing between the two monitors.

I was thinking maybe not having a DP 1.4+ cable could be it, but I believe you'd have visual output problems, not frame rate problems, if that were the case.
The two monitors are an AW2723DF and an Acer Predator X34P, if that helps.
 
That's what I'm struggling with, as the 10900K and 3080 came out around the same time. But it's the only thing that makes sense. I'm just worried about dropping the money for a mobo, CPU, and RAM upgrade and not having it pay off.
Well, of course you'd be worried about that, who wouldn't be?
Correct me if I'm wrong here, but isn't a CPU bottleneck usually indicated when the CPU usage is above the GPU's?
Usually yes, but I've seen situations where that hasn't been the case. A CPU bottleneck can be due to a lack of CPU speed, but it can also be caused by the system not properly using the CPU. What you really need to look for is GPU usage that's well below 99%. All I know for sure is that if the i9-10900K can bottleneck an RX 6600, even if only slightly, then it will bottleneck the hell out of a powerful card like an RTX 3080. I know the performance difference first-hand because I have an RX 6800 XT (the RTX 3080's equal rival) and an RX 6600. The performance delta between the 6600 and the 6800 XT is gigantic, so there's no question: you're not gaming at the speed of an RTX 3080, you're gaming at the speed of an i9-10900K.

Then there's also the question of storage media. If you're gaming off a spinning-platter HDD, that can (and often will) make things even worse. SSDs are dirt cheap right now, and it makes little difference for gaming whether you use a low-end SATA SSD or a top-end NVMe SSD.

Techspot did a test on this a while back. Just check out what a difference the storage media makes, even to a game that came out five years ago (Jesus, where does the time go?), Assassin's Creed Odyssey:
[chart: Assassin's Creed Odyssey load times by storage type]

You'll see that the performance delta across ALL types of SSD was less than three seconds. The spinning hard drive was... well, you can see it at the bottom. Click HERE to read the entire test.

It doesn't have to be expensive to upgrade your platform because 10th-gen Intel is pretty old at this point and there are some fantastic CPU options for gamers right now, depending on what you want to spend. Since you already have DDR4 RAM, if you don't want to spend much, you could do a massive platform upgrade for as little as $220 that would solve your problem completely. Check this out:
CPU: AMD Ryzen 5 5600 (includes a free CPU cooler) - $145
MOBO: Gigabyte B550M K - $75
Use your existing DDR4 RAM - Priceless!
Total: $220

Another path is to make the jump to AM5. This will be much more expensive because not only are AM5 parts pricier, but you'll also need to buy RAM. It can still be had for under $500, however:
CPU: AMD Ryzen 5 7600 (includes a free CPU cooler) - $227
MOBO: Gigabyte B650M K - $120
RAM: G.SKILL Flare X5 Series AMD EXPO 32GB (2 x 16GB) DDR5-6000 CL36 - $93
Total: $440

Given the choice, I'd pay the $440 for the platform that I know will last a lot longer with a CPU and RAM that are a lot faster to begin with.
 
Well, of course you'd be worried about that, who wouldn't be?

Usually yes, but I've seen situations where that hasn't been the case. A CPU bottleneck can be due to a lack of CPU speed, but it can also be caused by the system not properly using the CPU. What you really need to look for is GPU usage that's well below 99%. All I know for sure is that if the i9-10900K can bottleneck an RX 6600, even if only slightly, then it will bottleneck the hell out of a powerful card like an RTX 3080. I know the performance difference first-hand because I have an RX 6800 XT (the RTX 3080's equal rival) and an RX 6600. The performance delta between the 6600 and the 6800 XT is gigantic, so there's no question: you're not gaming at the speed of an RTX 3080, you're gaming at the speed of an i9-10900K.

Then there's also the question of storage media. If you're gaming off a spinning-platter HDD, that can (and often will) make things even worse. SSDs are dirt cheap right now, and it makes little difference for gaming whether you use a low-end SATA SSD or a top-end NVMe SSD.

Techspot did a test on this a while back. Just check out what a difference the storage media makes, even to a game that came out five years ago (Jesus, where does the time go?), Assassin's Creed Odyssey:
[chart: Assassin's Creed Odyssey load times by storage type]

You'll see that the performance delta across ALL types of SSD was less than three seconds. The spinning hard drive was... well, you can see it at the bottom. Click HERE to read the entire test.

It doesn't have to be expensive to upgrade your platform because 10th-gen Intel is pretty old at this point and there are some fantastic CPU options for gamers right now, depending on what you want to spend. Since you already have DDR4 RAM, if you don't want to spend much, you could do a massive platform upgrade for as little as $220 that would solve your problem completely. Check this out:
CPU: AMD Ryzen 5 5600 (includes a free CPU cooler) - $145
MOBO: Gigabyte B550M K - $75
Use your existing DDR4 RAM - Priceless!
Total: $220

Another path is to make the jump to AM5. This will be much more expensive because not only are AM5 parts pricier, but you'll also need to buy RAM. It can still be had for under $500, however:
CPU: AMD Ryzen 5 7600 (includes a free CPU cooler) - $227
MOBO: Gigabyte B650M K - $120
RAM: G.SKILL Flare X5 Series AMD EXPO 32GB (2 x 16GB) DDR5-6000 CL36 - $93
Total: $440

Given the choice, I'd pay the $440 for the platform that I know will last a lot longer with a CPU and RAM that are a lot faster to begin with.
Well, I think I'm going to bite the bullet and pick up an AM5 board, a 7800X3D, RAM, and a new SSD tomorrow. My system's kind of outdated now anyway, I suppose. Wish me luck; I'll report back if things are fixed with the upgrade.
 
Well, I think I'm going to bite the bullet and pick up an AM5 board, a 7800X3D, RAM, and a new SSD tomorrow. My system's kind of outdated now anyway, I suppose. Wish me luck; I'll report back if things are fixed with the upgrade.
I can't imagine that they won't be. However, I think that the 7800X3D would be a mistake because you'll be swinging the pendulum too far in the other direction. I'll explain:

In Shadow of the Tomb Raider, a game known to be CPU-intensive, the R7-7800X3D does 476 FPS:
[chart: Shadow of the Tomb Raider CPU benchmark]

While the RTX 3080 only does 164 FPS:
[chart: Shadow of the Tomb Raider GPU benchmark]

So your system will be heavily bottlenecked by your GPU. Now, this isn't necessarily a bad thing, because it means you'll be able to do at least two GPU upgrades without upgrading your CPU. If that's what you have in mind, go for it. If you're trying to be the least bit economical, I would recommend getting something slower. If you still want the R7-7800X3D, keep in mind that you don't need the fastest RAM around, because the 3D V-Cache more than makes up for slower RAM, as you can see here:
[chart: R7-7800X3D gaming performance vs. RAM speed]

I would say that from a price/performance standpoint, DDR5-5200 CL36 is the sweet spot, because there's no point in paying the premium for DDR5-6000 to get only 1 FPS more on average. I'm just trying to make sure you don't spend needlessly. Cheers! 😉👍
 
I can't imagine that they won't be. However, I think that the 7800X3D would be a mistake because you'll be swinging the pendulum too far in the other direction. I'll explain:

In Shadow of the Tomb Raider, a game known to be CPU-intensive, the R7-7800X3D does 476 FPS:
[chart: Shadow of the Tomb Raider CPU benchmark]

While the RTX 3080 only does 164 FPS:
[chart: Shadow of the Tomb Raider GPU benchmark]

So your system will be heavily bottlenecked by your GPU. Now, this isn't necessarily a bad thing, because it means you'll be able to do at least two GPU upgrades without upgrading your CPU. If that's what you have in mind, go for it. If you're trying to be the least bit economical, I would recommend getting something slower. If you still want the R7-7800X3D, keep in mind that you don't need the fastest RAM around, because the 3D V-Cache more than makes up for slower RAM, as you can see here:
[chart: R7-7800X3D gaming performance vs. RAM speed]

I would say that from a price/performance standpoint, DDR5-5200 CL36 is the sweet spot, because there's no point in paying the premium for DDR5-6000 to get only 1 FPS more on average. I'm just trying to make sure you don't spend needlessly. Cheers! 😉👍
The only other option would be a Z790 with a 13700K, and I'd rather buy the AM5 board, since LGA1700 will be dead after the 14th-gen refresh. I'm also going to be getting 32GB of DDR5-6000 CL30 for like $120. I do plan on eventually upgrading to either a 4080 or the 50 series, but that would be months down the road.

My main goal right now is just getting my 3080 Ti to be utilized properly, so as long as that happens I will be happy.