I've been meaning to address this for a while, but I keep putting it off. I think it's time to set the record straight, because the claim you keep posting, that the "CPU sets the frame rate," isn't just partially wrong. In some cases it is significantly wrong, even though in other cases it may be largely correct. A lot depends on the game, the CPU, the graphics card, the memory, and how well the developer optimized things in the first place.
But the point is, simply saying "the CPU sets the frame rate," as if that were the deciding factor, is wrong. Plain wrong.
It is MUCH more complicated than that.
In the MOST basic terms, the CPU sends information and/or instructions from the game to the graphics card. The graphics card then processes those instructions, renders the image and sends it to the monitor for display. For gaming, the CPU and GPU are HIGHLY interdependent. You could have a 10900K or a 3950X, and you are still not going to run high frame rates if it's paired with a GTX 1650, even with all of the quality settings turned down to low. The graphics card simply doesn't have enough performance to process and render the information fast enough to keep up with the CPU, so in THIS type of scenario the GPU is absolutely limiting the frame rate, even though the CPU is more than capable of delivering just about any frame rate imaginable on modern hardware.
Therefore, in THAT scenario, or variations on it, the CPU does not "set" the frame rate and is not the limiting factor or cause of a low frame rate. The GPU is.
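To put some rough numbers on that, here's a deliberately over-simplified toy model (my own made-up figures, not measurements from any real game or engine) that just treats the frame rate as being limited by whichever side, CPU or GPU, takes longer per frame:

```python
# Toy model: each frame, the CPU spends some milliseconds preparing work and
# the GPU spends some milliseconds rendering it. With the two sides pipelined,
# the frame rate ends up limited by whichever side is slower.

def estimated_fps(cpu_ms_per_frame, gpu_ms_per_frame):
    bottleneck_ms = max(cpu_ms_per_frame, gpu_ms_per_frame)
    return 1000.0 / bottleneck_ms

# Strong CPU (3 ms/frame) paired with a weak GPU (12 ms/frame):
print(estimated_fps(3.0, 12.0))   # ~83 FPS, GPU-limited

# Same CPU with a much faster GPU (4 ms/frame):
print(estimated_fps(3.0, 4.0))    # 250 FPS, still GPU-limited

# Weak CPU (10 ms/frame) paired with that fast GPU:
print(estimated_fps(10.0, 4.0))   # 100 FPS, now CPU-limited
```

In the first case, swapping in an even faster CPU changes nothing at all, because the GPU is the part that's out of time.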
Additionally, there are a LOT of games that are very much NOT CPU dependent, or "CPU bound" as we like to call it. For those kinds of games you can still achieve a very high frame rate with only a mediocre CPU, so long as you have a moderately (or highly) capable graphics card.
And that's all without even getting into scenarios where 3D environments, game physics, moving objects, calculations, draw distance (obviously a heavily CPU-relevant setting in many games) or DirectX shader model requirements all play a role. For the most part those are primarily relevant to GPU performance, and they absolutely have an effect on frames per second. Obviously, many of them are configurable settings or are affected by specific settings, but they are still non-CPU factors that have a direct impact on how many frames can be rendered per second.
MSAA (Multi-Sample Anti-Aliasing) is something that relies almost entirely on the GPU and can have a tremendous effect on performance and frame rates.
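Extending the same toy model: the overhead numbers below are invented for illustration (real MSAA cost depends heavily on the GPU and the scene, and it does not scale linearly with sample count), but they show why a GPU-side setting can tank the frame rate while the CPU load doesn't move at all:

```python
# Rough illustration only: turning up MSAA adds milliseconds to the GPU's
# share of the frame, so a GPU-limited game gets slower even though the CPU
# workload is unchanged.

def gpu_limited_fps(base_gpu_ms, msaa_overhead_ms, cpu_ms=3.0):
    gpu_ms = base_gpu_ms + msaa_overhead_ms
    return 1000.0 / max(cpu_ms, gpu_ms)

print(gpu_limited_fps(8.0, 0.0))   # no MSAA:                     125 FPS
print(gpu_limited_fps(8.0, 3.0))   # MSAA adding ~3 ms of GPU work: ~91 FPS
print(gpu_limited_fps(8.0, 7.0))   # MSAA adding ~7 ms of GPU work: ~67 FPS
# The CPU cost never changed, yet the frame rate dropped by roughly a third.
```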
Further, some games are optimized so that specific work which in other games might rely on the CPU is offloaded to the GPU to lighten the load, or vice versa, depending on whether the game has been designed to lean more heavily on one or the other, as some are. Some games run primarily on the CPU side of the fence and mainly use the GPU only for rendering and shading. Other games don't use the CPU all that much and run primarily on the GPU, so that yes, the CPU DOES send the data to the GPU, but the calculations, physics computations, and other complex processing aren't done by the CPU and are instead performed ON the GPU side of the fence.
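A hedged sketch of why that design choice matters for frame rate (the split and the numbers are hypothetical, just to show the idea of moving the same work across the fence):

```python
# Suppose a game has 4 ms of physics work per frame that the developer can run
# either on the CPU or on the GPU (e.g. via compute shaders). Moving that work
# shifts which side becomes the bottleneck, and therefore the frame rate.

def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

base_cpu_ms, base_gpu_ms, physics_ms = 8.0, 6.0, 4.0

# Physics done on the CPU: the CPU is the bottleneck.
print(fps(base_cpu_ms + physics_ms, base_gpu_ms))   # 1000 / 12 ms ≈ 83 FPS

# Physics offloaded to the GPU: the GPU becomes the (milder) bottleneck.
print(fps(base_cpu_ms, base_gpu_ms + physics_ms))   # 1000 / 10 ms = 100 FPS
```

Same total work, same hardware, different frame rate, which is exactly why "the CPU sets the frame rate" can't be treated as a universal rule.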
Clearly, even this is probably a gross over-simplification of how things work, and I'm definitely no developer or expert in this area, but I can't help but cringe, and I'm sorry to have to put it this way, every time I see you say that the CPU "sets" the frame rate as if that were the end of the story and there was nothing more to say.
It just isn't that simple from one side to the other. It is a symbiotic relationship between the CPU, the GPU, in-game settings, resolution and optimization, and an imbalance in ANY of those areas can have a detrimental, possibly significantly negative, effect on performance.