Discussion: The CPU can matter a lot in gaming, even when the GPU is the bottleneck.

mafi

I recently upgraded from a Core i7-4790K (4C/8T) @ 4.4 GHz with 16 GB of DDR3-2400, to a Ryzen 7 7700 (8C/16T) with 32 GB of DDR5-6000 CL30.
I still have the old graphics card, an RTX 2070, which I plan to upgrade soon.
In Cyberpunk 2077, with the 4790K, I was getting around 45 FPS on average at 2560x1440 with Ultra settings (ray tracing off) and DLSS Quality. But the 0.1% lows (monitored with MSI Afterburner) were sometimes dipping below 20 FPS.
After upgrading to the Ryzen 7 7700, the average FPS only increased from 45 to around 50, but the 0.1% lows are now around 40 FPS. The gameplay feels a lot smoother, and the frametime graph looks almost like a flat line.
Average FPS can be very misleading.
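For anyone curious how those two numbers relate, here's a minimal Python sketch of how a tool like MSI Afterburner might derive average FPS and "0.1% low" FPS from per-frame render times. The exact formula varies by tool (this uses the worst-frame-time-within-0.1% approach), and all the frame times below are invented for illustration:

```python
# Sketch: average FPS vs "0.1% low" FPS from a list of per-frame times (ms).

def average_fps(frame_times_ms):
    # Average FPS = total frames / total seconds,
    # NOT the mean of per-frame instantaneous FPS.
    total_s = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_s

def low_fps(frame_times_ms, percent=0.1):
    # "0.1% low": FPS implied by the frame time that only
    # the slowest 0.1% of frames reach or exceed.
    worst_first = sorted(frame_times_ms, reverse=True)
    count = max(1, int(len(worst_first) * percent / 100))
    threshold_ms = worst_first[count - 1]
    return 1000.0 / threshold_ms

# 9,990 smooth frames at ~22 ms (~45 FPS) plus ten 55 ms stutters:
frames = [22.0] * 9990 + [55.0] * 10
print(f"average: {average_fps(frames):.1f} FPS")   # ~45.4 FPS: barely moves
print(f"0.1% low: {low_fps(frames):.1f} FPS")      # ~18.2 FPS: matches the stutter
```

Ten bad frames out of ten thousand shift the average by well under 1 FPS, yet they are exactly what the 0.1% low reports, and what you feel as stutter.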
The game simulation runs on the CPU and in RAM; AI and all the physics are handled by the CPU. The GPU is only there to render a picture from the data the CPU supplies. In a "heavy" game with a lot happening at once, a slower CPU will struggle to handle everything in real time, and that's where those 1% lows show up. Average FPS is misleading because it drops only slightly compared to the lows.
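To put rough numbers on that, here's a toy Python model (my simplification, not how a real engine pipelines work) where each frame costs max(CPU time, GPU time). A handful of CPU spikes barely dent the average of a GPU-bound game, but they set the worst frame times:

```python
# Toy model: per-frame time ~ max(CPU time, GPU time). Numbers are invented.

frames_ms = []
for i in range(10_000):
    gpu_ms = 20.0                 # GPU-bound baseline: ~50 FPS
    cpu_ms = 12.0                 # CPU normally finishes well ahead of the GPU
    if i % 1000 == 0:             # occasional heavy AI/physics tick
        cpu_ms = 50.0             # on a slow CPU this blows past the GPU time
    frames_ms.append(max(cpu_ms, gpu_ms))

avg_fps = len(frames_ms) / (sum(frames_ms) / 1000.0)
slowest_ms = max(frames_ms)
print(f"average: {avg_fps:.1f} FPS")                # ~49.9 FPS: spikes barely register
print(f"worst frame: {1000.0 / slowest_ms:.1f} FPS") # 20.0 FPS: the stutter you feel
```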