Hi, anytime I play a game I notice that my GPU is at 99% utilisation while the CPU is at 30-35% (e.g. Far Cry 6, Dying Light 2). Does this mean my GPU is the bottleneck, or is it normal and the games just aren't CPU intensive? Should I worry that the CPU isn't also at 100% utilisation?
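In case it matters: since an overall CPU percentage can hide one maxed-out thread, I've been sanity-checking per-core load with a rough Python sketch like the one below (just my own quick script, not anything official; it assumes Python 3 with the psutil package installed):

```python
# Rough sketch: logs overall and per-core CPU load every second, so a single
# pegged core (a real CPU bottleneck) isn't hidden behind a low overall figure.
# Assumes `pip install psutil` has been run.
import psutil

try:
    while True:
        # Percent load of each logical core, sampled over a 1-second window.
        per_core = psutil.cpu_percent(interval=1.0, percpu=True)
        overall = sum(per_core) / len(per_core)
        busiest = max(per_core)
        print(f"overall {overall:5.1f}% | busiest core {busiest:5.1f}% | "
              + " ".join(f"{c:4.0f}" for c in per_core))
except KeyboardInterrupt:
    pass
```

On my 5900HX (8 cores / 16 threads), 30-35% overall would still leave room for a few threads to be fully saturated, which is what I was trying to rule out.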
The laptop comes with a Control Center for changing the wattages supplied to the CPU and GPU, but when I change the CPU wattage there is absolutely no difference in game performance (not a single fps change anywhere between 20W and 80W); there is only a difference if I drop it below 20W. Conversely, changing the GPU wattage even slightly affects gaming performance significantly, as expected.
I play on Ultra settings at 1440p and get 50fps average in FC6, and 30fps in Dying Light 2 without DLSS (50fps with DLSS). Cooling is sufficient, as no component goes above 88C even after 6 hours of constant pushing at max wattage. However, I can't help but wonder if my performance is not optimal. What do you guys think? I see benchmarks on YouTube with lower specs (half the RAM, half the VRAM, and a lower GPU wattage limit like 115W) that get 0-10% higher FPS than me in the same games, at the same resolution and settings.
Specs:
-AMD Ryzen 9 5900HX (3.3GHz, 4.6GHz Turbo)
-NVIDIA GeForce RTX 3080 - 16.0GB (165W)
-32GB Corsair 3200MHz SODIMM DDR4 (2 x 16GB)
-TongFang GM7ZG8M chassis
-230W charger
Note: My laptop is always plugged in and I use dGPU mode. I have all the latest drivers; however, the AMD Radeon Software doesn't work and says it is incompatible with my graphics drivers (likely because dGPU mode uses the NVIDIA card). Lastly, I have checked all the NVIDIA settings and optimised everything for performance. I have turned off all battery boosters, WhisperMode, and FPS limiters, and I close many background processes and applications while playing.
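If it helps, this is the rough sketch I use to log GPU power draw and clocks in-game, to check that the card actually sustains its 165W limit when I compare against those lower-wattage benchmark laptops (my own script, nothing official; it assumes nvidia-smi is on the PATH, which it should be since it ships with the NVIDIA driver):

```python
# Rough sketch: polls nvidia-smi once a second and prints power draw, core
# clock, GPU load and temperature, so I can see whether the RTX 3080 holds
# its 165W limit during a gaming session.
import subprocess
import time

QUERY = "power.draw,clocks.gr,utilization.gpu,temperature.gpu"

while True:
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    power, clock, util, temp = [v.strip() for v in out.split(",")]
    print(f"{power} W | {clock} MHz | {util}% util | {temp} C")
    time.sleep(1)
```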