GPU/CPU bottlenecking - how to know when one or the other is causing an issue?

ptrthgr8

Distinguished
Oct 17, 2014
53
0
18,530
Hi, everyone!

I need some help understanding GPU/CPU bottlenecking and how to tell when one or the other might be causing issues. I just purchased an EVGA RTX 2080 XC (08G-P4-2182-KR) to replace my Gigabyte GTX 1080 G1 Gaming (GV-N1080G1 GAMING-8GD). I can see an improvement in the visuals (particularly anything to do with lighting effects), but I'm a little surprised I'm still getting under 60 FPS at higher settings in some scenarios. For starters, here are my system specs as of today:

Case: CoolerMaster HAF 932 Advanced Full Tower Gaming Case
Motherboard: ASRock ATX DDR4 FATAL1TY Z170 GAMING K4
CPU: Intel Core i7-6700 (LGA 1151, 3.40 GHz, 8 MB cache, BX80662I76700)
GPU: EVGA RTX 2080 XC (08G-P4-2182-KR)
Monitor: ASUS Swift PG278Q Gaming (1440p)
RAM: G.SKILL 32GB (2 x 16GB) Ripjaws V Series DDR4 PC4-25600 3200MHz for Intel Z170 Platform Model F4-3200C16D-32GVK
PSU: EVGA SuperNOVA 850 G2 80+ GOLD, 850W ECO Mode Fully Modular 220-G2-0850-XR

I’ve enabled G-Sync, using Nvidia driver 416.34, and I’m using GeForce Experience to optimize the games. Also, I have two separate power cables connected to the RTX 2080 (rather than using a single cable that has both the 6+2 and 6 pin connector). I’ve never done any sort of overclocking or anything like that, but I wanted to find something to monitor the various sensors, etc. So, I downloaded MSI Afterburner (v4.6.0 Beta 9). So far I have to say it’s one of the handiest tools I’ve ever used.

This afternoon I did some testing. I enabled the Afterburner On Screen Display for the following settings: GPU Temp, GPU Usage, GPU Fan1, GPU Fan2, CPU Temp, CPU Usage, and FPS. I didn’t have a way to show averages over time (or at least I didn’t know where to look to find that option), so I just randomly took screen shots (that included the OSD figures at the time) while gaming, getting at least 10 examples for each of the games I was playing. And here are the results:

Game: Assassin's Creed Odyssey
GPU Temp: 68.3 °C
GPU Usage: 96.4 %
GPU Fan 1: 50.2 %
GPU Fan 2: 50.2 %
CPU Temp: 64.0 °C
CPU Usage: 92.1 %
D3D11 FPS: 78.5

Game: War Thunder
GPU Temp: 71.6 °C
GPU Usage: 97.9 %
GPU Fan 1: 54.1 %
GPU Fan 2: 54.1 %
CPU Temp: 46.6 °C
CPU Usage: 22.7 %
D3D11 FPS: 65.4

Game: Far Cry 5
GPU Temp: 71.4 °C
GPU Usage: 99.0 %
GPU Fan 1: 51.9 %
GPU Fan 2: 51.9 %
CPU Temp: 57.9 °C
CPU Usage: 51.1 %
D3D11 FPS: 49.1

I need some help interpreting these results. All three games are pushing the GPU about the same amount, which is to say it's working close to its limit. But only ACO seems to be using the CPU heavily, and I'm not sure whether that's a good thing or not. War Thunder and FC5 are only using about a quarter and a half of the CPU, respectively, but again, I don't know whether that's good or bad.
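Side note: since I couldn't find an averaging option in the OSD, here's a rough sketch of how the samples could be averaged from a CSV log instead of eyeballing screenshots. The file name and column names below are just placeholders (they'd have to match whatever the logging tool actually writes), and the 95% cutoff is only a rule of thumb, not anything official:

```python
# Rough sketch: average GPU/CPU usage and FPS from a CSV log and make a
# crude bottleneck guess. File name and column names are placeholders;
# adjust them to whatever the logging tool actually writes out.
import csv

def summarize(path):
    gpu, cpu, fps = [], [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            gpu.append(float(row["gpu_usage"]))
            cpu.append(float(row["cpu_usage"]))
            fps.append(float(row["fps"]))

    avg = lambda xs: sum(xs) / len(xs)
    print(f"GPU usage avg: {avg(gpu):.1f}%")
    print(f"CPU usage avg: {avg(cpu):.1f}%")
    print(f"FPS avg:       {avg(fps):.1f}")

    # Crude rule of thumb: a GPU pinned in the high 90s is the limiter;
    # otherwise the CPU (or the game engine itself) is likely holding things back.
    if avg(gpu) >= 95:
        print("Looks GPU-bound at these settings.")
    else:
        print("GPU has headroom - likely CPU/engine-limited.")

summarize("afterburner_log.csv")
```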

Since I was upgrading things anyhow, I was considering getting an i7-7700 (it's the top CPU on ASRock's compatibility list for my motherboard), but before I do that I wanted to see whether the CPU is even the issue. If War Thunder and Far Cry 5 aren't really pushing my i7-6700 that hard, would a faster CPU make any difference at all? Then again, of the three games, ACO has the best FPS results. Is that because the CPU is helping out more than in the other two games? And would the i7-7700 make the results even better for ACO?

Thanks in advance for any help/guidance you can provide.

Cheers!
 
Solution
Well, your GPU can only work up to 100%, so getting a better CPU won't help you.
You can lower the resolution or the quality settings to get more FPS and see at what point the CPU stops letting performance increase any further.
At the moment, with your current settings, you don't have any bottleneck, since your GPU is already being used to its fullest.
Different games use different amounts of CPU; this is normal and nothing to be worried about as long as the FPS is high enough for you.
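To make that concrete, the pattern to look for is FPS that stops climbing as you reduce the render load. The numbers below are completely made up, purely to show the idea, not measurements from your system:

```python
# Completely made-up numbers, just to show the pattern:
# big FPS gains from dropping resolution = you were GPU-bound;
# once the gains flatten out, that plateau is roughly the CPU/engine ceiling.
runs = [
    ("2560x1440", 49),  # hypothetical readings, not real benchmarks
    ("1920x1080", 70),
    ("1280x720", 72),
]
for (res, fps), (_, next_fps) in zip(runs, runs[1:]):
    gain = (next_fps - fps) / fps * 100
    print(f"{res}: {fps} fps -> dropping one step gains {gain:.0f}%")
# Here the 49 -> 70 jump says the GPU was limiting at 1440p, while the
# 70 -> 72 step says ~72 fps is about where the CPU caps this (made-up) game.
```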
 

boju

Titan
Ambassador
Also check GE more thoroughly to see how it optimises your games. Often it sets DSR/resolution scaling and antialiasing too high, which'll degrade FPS quite a bit. There's a quality slider in GE you can adjust to favour more quality or performance, and it'll set game settings according to that. I think it's the spanner icon, from memory.

Personally I don't like using GE; I feel I get more performance setting games up myself, and I also use no antialiasing at all on a 1440p screen. Imo you don't really need it at this resolution.
 
Solution
What you did was at best a side-grade, so you shouldn't expect more than a few FPS of improvement.

Given the slower speed of your CPU compared to this test system, your numbers may not be far off:
https://www.techpowerup.com/reviews/Performance_Analysis/Far_Cry_5/4.html

But... I still feel like something is off with your results. Have you tried a DDU driver uninstall and a clean re-install? I would NOT use GE, and see how it goes.

Would also make sure you have the latest BIOS and ALL the latest drivers for your motherboard.
 

ptrthgr8

Distinguished
Oct 17, 2014
53
0
18,530


I'm pretty sure this was the issue. In War Thunder, I left all the settings maxed out, but dropped SSAA from 4X to none (the only other option) and the FPS jumped up to ~180 on the ground and ~250 in the air. Far Cry 5 didn't have the exact same thing, but there was a setting for "resolution scale" that was set to 1.8 (out of 2, I think), so I dropped that down to 1.5 and the FPS jumped up to ~85 (with all other settings being left as-is). In both cases, I couldn't notice any difference in quality - things looked the same to my eyes anyhow. I didn't make any changes to ACO since I was getting perfectly fine FPS there. And after adjusting those settings, all the other figures were pretty consistent with the usage and temp figures I posted initially.
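In case it helps anyone else, here's the rough pixel math on why those two settings cost so much. I'm assuming 4x SSAA shades roughly 4x the samples and that the resolution scale multiplies each axis (I haven't confirmed exactly how Far Cry 5 applies its slider), so treat this as a back-of-the-envelope sketch:

```python
# Back-of-the-envelope sample counts at 2560x1440, assuming the resolution
# scale multiplies each axis and 4x SSAA shades 4 samples per pixel.
native_w, native_h = 2560, 1440
native = native_w * native_h

print(f"Native 1440p:          {native:>12,} pixels")
print(f"4x SSAA:               {native * 4:>12,} samples (4.00x native)")
for scale in (1.8, 1.5):
    scaled = int(native_w * scale) * int(native_h * scale)
    print(f"Resolution scale {scale}:  {scaled:>12,} pixels  ({scaled / native:.2f}x native)")
```

Under those assumptions, scale 1.8 is rendering over 3x the native pixel count, while 1.5 is about 2.25x, which lines up with the FPS jump I saw.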

So, lesson learned... GeForce Experience is lame. Duly noted! :)

Thank you for the assist!

 

ptrthgr8

Distinguished
Oct 17, 2014
53
0
18,530


I thought the GTX 1080Ti and RTX 2080 were pretty much on par with each other. Shouldn't the RTX 2080 have been at least the same upgrade as going to a GTX 1080Ti? I had been looking at various 1080Ti models, but all of them were in the $730-$780 range, plus another ~$60 in sales tax (even when ordering from the larger internet sellers). I was able to take advantage of EVGA's partner pricing program and ended up getting the RTX 2080 XC for about $50 or so less compared to those 1080Ti models. Plus I'll also have the ray tracing and DLSS tech... whenever that starts getting developed in games. So, probably about the same time as the next gen of Nvidia cards come out. :)