I keep seeing people who don't understand why the CPU becomes less important as resolution increases, so I want to explain why.
So: at anything above 1080p, you become much more GPU-dependent than CPU-dependent. 1080p has FAR fewer pixels than 1440p, and the gap to 4K is even more drastic, so once you leave 1080p the GPU takes on much more of the per-frame load. The result is that the CPU ends up waiting to hand work to the GPU, instead of the GPU waiting on work from the CPU.
Why?
1080p is just over 2 million pixels per frame (1920x1080 = 2,073,600). 1440p adds about 1.6 million more (2560x1440 = 3,686,400), and 4K adds about 6.2 million more (3840x2160 = 8,294,400). The GPU has to render every one of those pixels, every frame: at 1440p it draws roughly 1.78x as many pixels as at 1080p, and at 4K it draws exactly 4x as many, which shifts more and more of each frame's workload onto the GPU and away from the CPU being the limit.

At 1080p, most modern GPUs run just about every game fast enough that the GPU is not working all that hard compared to 1440p, let alone 4K. So the GPU relies on the CPU to feed it work while the CPU handles game logic, AI, physics calculations, and other tasks. Because the GPU gets through its part of the frame quickly at 1080p, it ends up waiting on the CPU for the next chunk of work to process, which makes the CPU the limiting factor.
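If you want to sanity-check the pixel math yourself, here's a quick sketch in Python (nothing game-specific, just the standard 1920x1080 / 2560x1440 / 3840x2160 resolutions) that prints the per-frame pixel counts and the ratio versus 1080p:

```python
# Quick pixel-count math for the common gaming resolutions.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = 1920 * 1080  # pixels per frame at 1080p

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels per frame, "
          f"{pixels / base:.2f}x the pixels of 1080p")

# Output:
# 1080p: 2,073,600 pixels per frame, 1.00x the pixels of 1080p
# 1440p: 3,686,400 pixels per frame, 1.78x the pixels of 1080p
# 4K: 8,294,400 pixels per frame, 4.00x the pixels of 1080p
```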
At 1440p and especially 4K, the GPU becomes the limiting factor because of how many more pixels it has to render each frame. The CPU finishes its per-frame work well before the GPU finishes rendering, so now the CPU is the one sitting and waiting, and a faster CPU buys you far less because it was never the bottleneck. That is why 1440p is usually considered the sweet spot, and why 1080p (paired with a powerful GPU) is what gets used to test CPU performance.
The CPU's per-frame work barely changes with resolution; it is the GPU's work that scales with the pixel count, so the GPU's frame times get longer and longer. At higher resolutions the GPU is the one grinding through the heavy processing, and the CPU spends more of its time waiting for the GPU to finish the current frame before it hands over the work for the next one.
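To make the "who waits on whom" idea concrete, here's a toy frame-time model in Python. The CPU and GPU millisecond numbers are made up purely for illustration (not measurements from any real game); the point is just that frame time is set by whichever side takes longer, and only the GPU side scales with pixel count:

```python
# Toy bottleneck model: per-frame CPU work is roughly constant,
# per-frame GPU work scales with how many pixels it has to render.
cpu_ms = 6.0        # hypothetical CPU time per frame (game logic, AI, physics, draw calls)
gpu_ms_1080p = 4.0  # hypothetical GPU time per frame at 1080p

pixel_ratio = {"1080p": 1.00, "1440p": 1.78, "4K": 4.00}

for res, ratio in pixel_ratio.items():
    gpu_ms = gpu_ms_1080p * ratio    # GPU work grows with resolution
    frame_ms = max(cpu_ms, gpu_ms)   # the slower side sets the frame time
    limiter = "CPU-bound" if cpu_ms >= gpu_ms else "GPU-bound"
    print(f"{res}: ~{1000 / frame_ms:.0f} fps ({limiter})")

# 1080p: ~167 fps (CPU-bound)  -> GPU finishes in 4 ms, then waits on the CPU
# 1440p: ~140 fps (GPU-bound)  -> GPU needs ~7.1 ms, now the CPU waits
# 4K:    ~62 fps  (GPU-bound)  -> GPU needs 16 ms, the CPU sits idle most of the frame
```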