Question Ryzen 7 7800x3d low CPU utilization at 1440p or 1080p

ilukey77

Reputable
Jan 30, 2021
778
325
5,290
Hi,
Maybe I'm being paranoid, but I'm finding that at 1440p and 1080p my 7800X3D shows really low utilization, like in the 3 to 10% range.
Is this normal??

In Cinebench R23 it's at 100% utilization, obviously (17k-plus score), and I obviously expect low CPU usage at 4K, but for some reason in games at 1080p and 1440p the GPU is still doing most of the work, not the CPU.

I never play at 1080p, and 1440p is not as important now since my 7900 XTX's playable frame rates at 4K are quite good in most of the games I play. I'm just finding it super strange that even at 1080p and 1440p (while FPS goes up hugely) I'm seeing little to no CPU usage.

The latest BIOS update was done a week or so before I changed from the 7600X to the 7800X3D, and I'm running Windows 11.
 

ilukey77
Super strange to me: in some games, when I drop down to 1080p, GPU utilization falls from 100% to 80-90% while the CPU is still at 3%, and FPS goes up, obviously. Same at 1440p: it uses more GPU than CPU, the CPU is still at 3%, and frames are still higher than at 4K. At 4K it's at 100% GPU utilization, which is to be expected.
Makes no sense to me!!
Stupid as it seems, unless I use Cinebench R23 and force only the CPU to work, no game will use the CPU much at all, at any resolution!!

If it wasn't hitting full utilization at any point I could say there was a huge issue, but I'm currently running Cinebench R23 at 100%, 4.8 GHz, 75°C.
Even CPUID HWMonitor is showing full usage, with all the same readings as the Adrenalin software overlay!!
 
Are you monitoring total CPU usage instead of each core/thread? Total CPU usage is fairly meaningless, as many games won't use all available threads, and of the threads that are used, some will often be doing more work than others. You can be CPU-limited with one thread at 100% while the others do little, if the game is poor at using multiple threads.
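The averaging effect described above is easy to see with a bit of arithmetic. A minimal sketch (the per-thread numbers are invented for illustration, not measured from the OP's system):

```python
# Sketch: how one fully loaded thread averages out to a tiny "total CPU
# usage" figure on a 16-thread CPU like the 7800X3D.
threads = 16
# One thread pegged at 100%, the rest near idle -- a classic
# CPU-limited scenario in a poorly threaded game.
per_thread = [100.0] + [2.0] * (threads - 1)

# "Total usage" as reported by most overlays is just the mean.
total = sum(per_thread) / threads
print(f"total usage: {total:.1f}%")  # 8.1% -- looks idle, but isn't
```

This is why the 3-6% figures in the thread can coexist with a hard single-thread bottleneck: only a per-core view reveals it.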
 

ilukey77
Are you monitoring total CPU usage instead of each core/thread? Total CPU usage is fairly meaningless, as many games won't use all available threads, and of the threads that are used, some will often be doing more work than others. You can be CPU-limited with one thread at 100% while the others do little, if the game is poor at using multiple threads.
Total, yes, but shouldn't it revert to using more of the CPU at lower resolutions? Cyberpunk 2077 at 1080p with RT Ultra was running the benchmark at 60 fps, but still only 3-6% CPU, and the GPU was at full 100% usage.

CPUID HWMonitor is showing some cores working, but nothing at 100%, for 55 to 60 fps in the benchmark run at 1080p RT Ultra!!
 
Total, yes, but shouldn't it revert to using more of the CPU at lower resolutions? Cyberpunk 2077 at 1080p with RT Ultra was running the benchmark at 60 fps, but still only 3-6% CPU, and the GPU was at full 100% usage.

CPUID HWMonitor is showing some cores working, but nothing at 100%, for 55 to 60 fps in the benchmark run at 1080p RT Ultra!!
If you are looking at the total usage, then you are looking at a calculated average. The ONLY way is to look at per-core usage. What you describe sounds normal. If your performance is adequate and compares to similar systems, then forget about it and just use the system. Sometimes peeping at the details causes us to see problems that aren't there.

FWIW, 100% component utilization is NOT the endgame. You don't want that, ever (unless you are some competitive gamer with millions in corporate sponsor deals). There is a common misconception that you just drop vsync, let it fly, and everything will be the bestest ever (thanks, random YouTubers). Pick a target resolution and refresh rate (i.e., your monitor's native resolution and refresh rate) and hit that with components that are not overly stressed by running at 100% all the time; that way you have some overhead as future titles come out. With a CPU that is stressed as little as modern ones are in most titles, that nets you maybe two future GPU upgrades before the CPU becomes a limiter. Now, I'm not saying you don't have a problem here, as I'm not familiar with 2077, but since you stated the GPU is at 100% usage in all scenarios, it sounds to me like business as usual.
 
total yes but should it not revert back to using more of the cpu at lower resolutions cyberpunk 2077 1080p with RT ultra was running the benchmark 60fps but still only 3-6% and the gpu was full usage 100%

cpuid hw is showing some cores working but nothing at 100% for 55 to 60 fps in the benchmark run at 1080p RT ultra!!
The way CPU load is described as "moving to the CPU" at lower resolutions is not really correct. 60 fps at 1080p is about the same CPU workload as, or only slightly more than, 60 fps at 4K; it's the GPU load that is higher at 4K. All that changes is which becomes the limiting factor first, CPU or GPU. At lower resolutions with a strong GPU, it's far more likely the CPU becomes the limiting factor.

At the settings you are running, the GPU is at 100% and is the limiting factor.
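A simple way to picture the "limiting factor" idea: each frame costs the CPU and the GPU a certain amount of work, and FPS is capped by whichever side takes longer per frame. A rough sketch (the frame times below are invented for illustration, not measured):

```python
# Model: FPS is capped by the slower of the CPU and GPU per-frame times.
def fps(cpu_frame_ms: float, gpu_frame_ms: float) -> float:
    # The larger (slower) frame time sets the ceiling.
    return 1000.0 / max(cpu_frame_ms, gpu_frame_ms)

cpu_ms = 5.0  # CPU work per frame barely changes with resolution
for res, gpu_ms in [("1080p", 8.0), ("1440p", 12.0), ("4K", 25.0)]:
    limiter = "GPU" if gpu_ms > cpu_ms else "CPU"
    print(f"{res}: {fps(cpu_ms, gpu_ms):.0f} fps ({limiter}-limited)")
```

In this model the CPU cost stays roughly constant across resolutions, so with a strong enough GPU (here, at every listed resolution) the GPU remains the limiter and CPU utilization stays low, matching what the OP is seeing.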
 

ilukey77
If you are looking at the total usage, then you are looking at a calculated average. The ONLY way is to look at per-core usage. What you describe sounds normal. If your performance is adequate and compares to similar systems, then forget about it and just use the system. Sometimes peeping at the details causes us to see problems that aren't there.

FWIW, 100% component utilization is NOT the endgame. You don't want that, ever (unless you are some competitive gamer with millions in corporate sponsor deals). There is a common misconception that you just drop vsync, let it fly, and everything will be the bestest ever (thanks, random YouTubers). Pick a target resolution and refresh rate (i.e., your monitor's native resolution and refresh rate) and hit that with components that are not overly stressed by running at 100% all the time; that way you have some overhead as future titles come out. With a CPU that is stressed as little as modern ones are in most titles, that nets you maybe two future GPU upgrades before the CPU becomes a limiter. Now, I'm not saying you don't have a problem here, as I'm not familiar with 2077, but since you stated the GPU is at 100% usage in all scenarios, it sounds to me like business as usual.
It just looks strange to me, as my 5800X3D would hit more of the CPU at 1440p in the same games. I may even put the 7600X back in and see if its load changes with different resolutions!!

For all intents and purposes I'm not overly fussed, as the CPU is obviously working, otherwise it wouldn't even POST, so it's not a world-ending issue; it just seems to be performing strangely.

It might even be a smart move to go back to the 7600X, with the 7800X3D and ASUS board issue at the moment!!
 
It just looks strange to me, as my 5800X3D would hit more of the CPU at 1440p in the same games. I may even put the 7600X back in and see if its load changes with different resolutions!!

For all intents and purposes I'm not overly fussed, as the CPU is obviously working, otherwise it wouldn't even POST, so it's not a world-ending issue; it just seems to be performing strangely.

It might even be a smart move to go back to the 7600X, with the 7800X3D and ASUS board issue at the moment!!
Ya, I get it, lol. Unfortunately, it seems it might not be limited to X3D variants. In any case, be sure to have everything at defaults, on the latest BIOS, and perhaps avoid EXPO or any other "performance" settings. FWIW, X3D CPUs are not as memory-speed sensitive as non-X3D ones, so you should not see any performance regression, assuming you had it enabled.

As for swapping back the 7600X: FOR SCIENCE! I'd be interested in the results. Modern game engines don't seem to respond the way we're used to with regard to resolution scaling; perhaps it's the RT wildcard?
 

ilukey77
Ya, I get it, lol. Unfortunately, it seems it might not be limited to X3D variants. In any case, be sure to have everything at defaults, on the latest BIOS, and perhaps avoid EXPO or any other "performance" settings. FWIW, X3D CPUs are not as memory-speed sensitive as non-X3D ones, so you should not see any performance regression, assuming you had it enabled.

As for swapping back the 7600X: FOR SCIENCE! I'd be interested in the results. Modern game engines don't seem to respond the way we're used to with regard to resolution scaling; perhaps it's the RT wildcard?
I think for <Mod Edit> and giggles, in the next few days I might swap back to the 7600X, although I did just update to the latest ASUS BIOS, and I've left EXPO off for the time being.
Just to be safe, I mean; if it <Mod Edit> itself I'll send them back, but it's the messing around that would annoy me more!!
 
I think for <Mod Edit> and giggles, in the next few days I might swap back to the 7600X, although I did just update to the latest ASUS BIOS, and I've left EXPO off for the time being.
Just to be safe, I mean; if it <Mod Edit> itself I'll send them back, but it's the messing around that would annoy me more!!
That won't help you: CPU usage in MSI Afterburner and also in Radeon Software is bugged; the GPU driver (Radeon) misreports it.
[Attached screenshots]
 