A higher resolution has no real bearing on how well a CPU can "game". If anything, moving to a higher resolution with the same graphics card lightens the relative load on the CPU, because each frame takes the GPU longer to render. The same goes for settings: playing at Ultra puts less pressure on the CPU than playing at low settings. At low settings or low resolutions the GPU finishes its work quickly, so the CPU has to prepare frames just as fast to keep up, and it becomes the limiting factor.
Conversely, at high resolutions or higher settings the GPU takes longer per frame, so the CPU has more "free" time.
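To make that concrete, here's a rough back-of-the-envelope sketch (Python, with made-up frame times, since real numbers vary by game and hardware): the frame rate is set by whichever side takes longer per frame, so piling more work onto the GPU moves the bottleneck away from the CPU.

```python
# Rough model: each frame needs CPU work and GPU work done for it;
# the slower of the two sets the pace.
def fps(cpu_ms, gpu_ms):
    """Effective frames per second, limited by whichever side takes longer."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical numbers: same CPU (8 ms of game logic / draw calls per frame)
# paired with the same GPU at two different resolutions.
cpu_ms = 8.0                     # roughly fixed: resolution barely changes CPU work
print(fps(cpu_ms, gpu_ms=5.0))   # 1080p low: GPU is quick, CPU limits -> 125 FPS
print(fps(cpu_ms, gpu_ms=16.0))  # 4K ultra: GPU limits -> 62.5 FPS, CPU sits idle part of the time
```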
It is "relevant" for as long as it does what you need it to do. If it games fine at the resolution and settings you prefer to game at OR at some combination that you are willing to settle for, then it is doing it's job. When it begins to not be able to do that, that's when it loses relevance or becomes a choke point for the rest of the system and you need to start considering options.
There are people still running 3rd-gen i7s like the i7-3770K with very high-end graphics cards like the RTX 2080 Ti, because the CPU can still provide enough FPS for their needs. If they move up to a 144 Hz display, that might change, and then they'd need to look at other options. It all depends on YOU and what YOU require the hardware to do. If it does that, there's no point in even having the conversation. If it doesn't, then it's a problem.
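To put a number on the 144 Hz point: at 144 Hz each frame has a budget of 1000 / 144 ≈ 6.94 ms. A quick sketch with an assumed (purely hypothetical) CPU cost per frame shows how an older CPU can cap FPS below the refresh rate no matter which GPU you pair it with:

```python
# Why a 144 Hz monitor can expose an older CPU: every frame must be ready in
# 1000 / 144 ms. If the CPU alone needs ~8 ms per frame (an assumed figure
# for an aging quad-core), it tops out near 125 FPS regardless of the GPU.
target_ms = 1000.0 / 144         # ~6.94 ms frame budget at 144 Hz
cpu_ms = 8.0                     # assumed CPU cost per frame
print(cpu_ms <= target_ms)       # False: this CPU can't sustain 144 FPS
print(1000.0 / cpu_ms)           # ~125 FPS ceiling, set by the CPU
```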