
Why is CPU usage higher at low resolutions?

darkstar845

Reviews always use low resolutions in games when testing CPU performance. Is there a simple explanation for this?
My assumption is that the CPU must work harder at higher resolutions in order to help the GPU.
Lowering the resolution of a game shifts the load onto the CPU. As the resolution decreases, there are fewer pixels to render, so less strain is placed on the graphics card, and the bottleneck moves to the CPU: at a low enough resolution, the frame rate is limited by the CPU's speed rather than the GPU's.
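To make that concrete, here's a toy model (a sketch with made-up numbers, not measurements from any real game or hardware): the CPU and GPU pipeline frames in parallel, so the frame rate is set by whichever stage is slower, and shrinking the resolution shrinks only the GPU's share of the work.

```python
# Toy model of the CPU/GPU bottleneck (made-up numbers, not
# measurements from any real game or hardware).

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """CPU and GPU pipeline frames in parallel, so the frame
    rate is set by whichever stage is slower per frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

CPU_MS = 5.0  # hypothetical per-frame CPU cost (game logic, draw calls)

for name, pixels in [("720p", 1280 * 720), ("4K", 3840 * 2160)]:
    gpu_ms = pixels / 250_000.0  # assumed GPU speed: 250k pixels per ms
    bound = "CPU" if CPU_MS >= gpu_ms else "GPU"
    print(f"{name}: {fps(CPU_MS, gpu_ms):.0f} FPS ({bound}-bound)")
```

In this model, at 720p the 5 ms of CPU work is the ceiling (200 FPS, CPU-bound), so the result reflects the CPU; at 4K the GPU's ~33 ms per frame dominates (about 30 FPS) and the choice of CPU barely matters.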

The CPU sets the frame up, handles all the AI and resource allocation, and then passes the parameters to the GPU, which draws the frame. So the CPU does its thing and sends the frame along to the GPU. The larger the frame and the more processing required, the longer it takes the GPU to finish. At low resolutions, frames are drawn much faster, so the CPU has to do a lot more work setting up frames. Of course, using V-Sync limits you to 60 FPS, so the CPU load wouldn't change with resolution, provided your GPU can handle 60 FPS at the larger resolution.
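A minimal sketch of that loop, under the simplified split described in this post; run_game_logic() and submit_draw_calls() are hypothetical stand-ins with invented timings:

```python
import time

VSYNC = True
REFRESH = 1.0 / 60  # 60 Hz display

def run_game_logic():
    """Hypothetical stand-in for AI, physics, resource allocation."""
    time.sleep(0.002)  # pretend this costs the CPU 2 ms per frame

def submit_draw_calls():
    """Hypothetical stand-in for handing frame parameters to the GPU."""
    time.sleep(0.001)  # pretend issuing draw calls costs 1 ms of CPU time

for _ in range(10):  # a few iterations of the frame loop
    start = time.perf_counter()
    run_game_logic()
    submit_draw_calls()
    if VSYNC:
        # With V-Sync on, the CPU idles until the next refresh, so its
        # per-second workload is the same at any resolution, as long as
        # the GPU keeps up at 60 FPS. With V-Sync off, the loop spins as
        # fast as the CPU allows, which is what low-res benchmarks measure.
        time.sleep(max(0.0, REFRESH - (time.perf_counter() - start)))
```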
A fairly simple explanation is that at low resolutions, the GPU can render more frames per second. So the result comes down to how quickly the CPU can prepare those frames for the GPU.

When the CPU is the limiting factor, it becomes much easier to gauge its performance.