[SOLVED] CPU low utilisation during gaming


Feb 19, 2018
Hi, whenever I play a game I notice that my GPU is at 99% utilisation while my CPU sits at 30-35% (e.g. Far Cry 6, Dying Light 2). Does this mean my GPU is the bottleneck, or is that normal and the games just aren't CPU intensive? Should I worry that the CPU isn't fully utilised at 100%?

The laptop comes with a Control Center for changing the wattages supplied to the CPU and GPU, but when I change the CPU wattage there is absolutely no difference in game performance (not a single fps change anywhere between 20-80W); it only makes a difference if I drop it below 20W. Conversely, even a slight change to the GPU wattage affects gaming performance significantly, as expected.

I play on Ultra settings at 1440p and get 50 fps average in FC6, and 30 fps in Dying Light 2 (without DLSS) or 50 fps with DLSS. Cooling is sufficient, as no component goes above 88C even after 6 hours of constant load at max wattage. However, I can't help but wonder if my performance is suboptimal. What do you guys think? I see benchmarks on YouTube with lower specs (half the RAM, half the VRAM, and a lower GPU wattage limit like 115W) that get 0-10% higher FPS than me in the same games, at the same resolution and settings.

-AMD Ryzen 9 5900HX (3.3GHz, 4.6GHz Turbo)
-NVIDIA GeForce RTX 3080 - 16.0GB (165W)
-32GB Corsair 3200MHz SODIMM DDR4 (2 x 16GB)
-TongFang GM7ZG8M chassis
-230W charger

Note: My laptop is always plugged in and I use dGPU mode. I have all the latest drivers; however, the AMD Radeon Software doesn't work and says it is incompatible with my graphics drivers (likely because dGPU mode uses the NVIDIA card). Lastly, I have checked all NVIDIA settings and optimised everything for performance. I have turned off all battery boosters, WhisperMode, and FPS limiters, and I close many background processes and applications while playing.
Does this mean my GPU is bottlenecking, or is it normal and the games just aren't CPU intensive?
No, 99% GPU usage is good because it means the GPU is being used as effectively as possible (unless your temps are abnormally high). Your CPU usage also looks fine to me.
Thanks. By the same logic, shouldn't the CPU also be utilised as much as possible? Or is that not desirable?
Simplified way a PC plays a game:
  1. CPU figures out what needs to be in a given frame (imagine a rough sketch) based on user and game world input. Issues draw call to GPU to tell it what to render.
  2. GPU receives draw call and makes a pretty picture. Sends to monitor when complete.
  3. The GPU can't do any work until the CPU tells it what to draw. Raising graphics settings and/or resolution increases the complexity of the GPU's job, making each frame take longer to render. Lowering settings decreases the complexity of the GPU's job, making each frame take less time to render.
  4. If the GPU finishes rendering a frame before the CPU has finished figuring out what the next frame should contain, the GPU has to wait (<100% GPU usage).
  5. Based on #3 & #4, you should be able to optimize for 90% or greater GPU usage (depending on a game's CPU stress and the CPU/GPU balance of a system)
  6. CPU usage is usually reported as active time averaged across all available threads of a CPU. Most games don't leverage more than 6-7 threads, so monitoring overall CPU usage isn't very useful; it can be misleading, especially on today's high core-count CPUs.
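The steps above can be sketched with some made-up numbers. Everything here is illustrative (the 8 ms / 20 ms timings and 7 busy game threads are assumptions, not measurements): the frame can't finish faster than the slower of the two stages, and the reported CPU percentage is diluted by all the idle threads on a 16-thread chip like the 5900HX.

```python
# Illustrative sketch of a GPU-bound frame: the GPU needs 20 ms to render
# (50 fps) while the CPU needs only 8 ms to prepare the draw calls.
def frame_stats(cpu_ms, gpu_ms, game_threads, total_threads):
    # Step 3/4: the frame can't finish before the slower stage does.
    frame_ms = max(cpu_ms, gpu_ms)
    fps = 1000 / frame_ms
    # GPU busy fraction of each frame (100% here: the GPU never waits).
    gpu_usage = gpu_ms / frame_ms * 100
    # Step 6: monitoring tools average over ALL hardware threads, so even
    # hard-working game threads read low on a 16-thread CPU.
    cpu_usage = (cpu_ms / frame_ms) * (game_threads / total_threads) * 100
    return fps, cpu_usage, gpu_usage

fps, cpu, gpu = frame_stats(cpu_ms=8, gpu_ms=20, game_threads=7, total_threads=16)
print(f"{fps:.0f} fps, CPU {cpu:.0f}%, GPU {gpu:.0f}%")  # → 50 fps, CPU 18%, GPU 100%
```

With these toy numbers the CPU could nearly double its work per frame before it started holding the GPU back, which is why changing the CPU power limit between 20W and 80W changes nothing: the CPU already finishes its part of each frame early.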