I am a little confused about how the CPU and the GPU work together.
My understanding is that the CPU does its work on a frame, then passes its results to the GPU. Immediately thereafter, the CPU begins the next frame while the GPU is rendering the previous one. In other words, the CPU and GPU can work on two different frames simultaneously. Am I right so far?
(Arbitrary numbers incoming.)
Let's say the CPU takes 15ms to work on a frame. Let's also say we have a game on low resolution and detail, and on this setting it takes 10ms for the GPU to render the frame. Does this mean the CPU is effectively a bottleneck, since the frame will require at least 15ms (~67 FPS) to render regardless of how fast the GPU does its work?
Now let's say we have the game on high resolution and detail, and it takes the GPU 20ms to render a frame. Does this now mean that the GPU is a bottleneck, since changes in the on-screen image can only occur every 20ms (50 FPS)?
If so, this would mean there is always some form of bottleneck unless the CPU and GPU take the exact same amount of time to work on the frames.
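To make sure I'm describing my mental model clearly, here's a toy calculation of it (the function name and the timing numbers are just mine, taken from the arbitrary examples above):

```python
# Toy model of my mental picture: the CPU prepares frame N while the
# GPU renders frame N-1, so in a perfectly pipelined setup the frame
# interval is set by whichever stage is slower.

def frame_interval_ms(cpu_ms: float, gpu_ms: float) -> float:
    # The slower stage sets the pace; the faster one just waits.
    return max(cpu_ms, gpu_ms)

for cpu_ms, gpu_ms in [(15, 10), (15, 20)]:
    interval = frame_interval_ms(cpu_ms, gpu_ms)
    fps = 1000 / interval
    print(f"CPU {cpu_ms}ms, GPU {gpu_ms}ms -> {interval}ms/frame (~{fps:.0f} FPS)")
# -> CPU 15ms, GPU 10ms -> 15ms/frame (~67 FPS)
# -> CPU 15ms, GPU 20ms -> 20ms/frame (~50 FPS)
```

Under this model, one side is always the bottleneck unless the two times are exactly equal, which is what confuses me below.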
However, in practice it seems like changing CPU-dependent settings and changing GPU-dependent settings both affect the frame rate at any given time. There doesn't seem to be a clear cut-off point where only one of them matters, as if neither is bottlenecking the other.
So I'm a bit confused. Can you explain to me how this works? Thanks in advance.