Resolution has essentially nothing to do with the CPU. The CPU handles the game code; the GPU handles resolution and detail levels.
If a CPU can prepare 100 frames per second, that's what it can feed the GPU whether you're at 480p or 5K. The difference is whether the GPU can fill in all the pixels for each of those frames and push them to the screen in time.
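As a rough illustration of that idea (a simplified model, not how a benchmark actually measures anything), the frame rate you see is capped by whichever stage is slower: how fast the CPU can prepare frames, or how fast the GPU can render them at the chosen resolution. All the numbers below are made up purely for the example.

```python
# Simplified bottleneck model: the displayed frame rate is limited by
# whichever component is slower. All figures are illustrative only.

def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """The slower of the two stages caps the final frame rate."""
    return min(cpu_fps, gpu_fps)

# Hypothetical CPU that can prepare 100 frames per second at any resolution
cpu_fps = 100

# Hypothetical GPU throughput at different resolutions (made-up numbers)
gpu_fps_by_resolution = {"1080p": 160, "1440p": 95, "4K": 45}

for res, gpu_fps in gpu_fps_by_resolution.items():
    fps = effective_fps(cpu_fps, gpu_fps)
    limited_by = "CPU" if cpu_fps < gpu_fps else "GPU"
    print(f"{res}: {fps} fps ({limited_by}-limited)")
```

In that toy example the CPU is the limit at 1080p, so differences between CPUs would actually show up there; at 1440p and 4K the GPU caps the result and every CPU would score roughly the same.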
A 2080 Ti is used for the GPU because it eliminates potential GPU limits at lower resolutions. By the time you get to 1440p and up, the bottleneck shifts from the CPU to the GPU, so differences in maximum fps between CPUs all but disappear because the GPU becomes the limit.
1080p is used as the baseline mostly because the vast majority of people still game on 1080p monitors, so the results are directly relevant to them; 1440p is also tested because of its rise in popularity...
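For a sense of why the bottleneck shifts at higher resolutions, a quick back-of-the-envelope pixel count (raw pixels per frame only, ignoring shading cost, memory bandwidth, and everything else a GPU does):

```python
# Pixels per frame at common resolutions, compared to 1080p.
# This is only meant to show how fast the raw per-frame pixel
# workload grows; actual GPU cost doesn't scale this cleanly.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels per frame ({pixels / base:.2f}x 1080p)")
```

1440p pushes roughly 1.8x the pixels of 1080p and 4K about 4x, while the CPU's work per frame stays essentially the same, which is why CPU reviews drop the resolution rather than raise it.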