As the owner and current user of an i7-3770K, I'm going to say this: there's so much BS in this thread, far too much misinformation and supposition.
The CPU sets the fps limit according to the game code's instructions. With caps or limitations on thread usage this can be lower than expected, or with simple code it can be far higher than can possibly be used, but it varies from game to game. The CPU takes the code and pre-renders it into a frame, giving objects places, addresses, dimensions, statistics, movement, shader info, etc. The speed at which that happens is a product of IPC and clocks: the faster frames get pre-rendered, the more frames can be processed per second, and the higher the fps. It has nothing to do with resolution. After the CPU pre-renders the frames, it ships them to the GPU.
It's there that the GPU finishes rendering the frame according to detail settings and resolution. It'll do one of only two things: either reach the fps limit set by the CPU, or fail. If it surpasses the fps limit, then changing from lower settings to higher doesn't change fps; if it fails, then lowering settings increases fps output, up to the CPU-set limit.
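The pipeline described above boils down to a simple rule: the slower of the two stages sets the final frame rate. A minimal sketch (all numbers are hypothetical, just to illustrate the model):

```python
# Model from the post: the CPU pre-renders frames at some rate (its "limit"),
# the GPU finishes them at another rate set by settings/resolution.
# Effective fps = whichever stage is slower.

def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """The slower pipeline stage determines the delivered frame rate."""
    return min(cpu_fps, gpu_fps)

cpu_limit = 180.0  # hypothetical: CPU can prepare 180 frames/s in this game

# GPU fails to keep up -> lowering settings helps (GPU-bound):
print(effective_fps(cpu_limit, 120.0))  # 120.0

# GPU exceeds the CPU limit -> raising settings changes nothing (CPU-bound):
print(effective_fps(cpu_limit, 240.0))  # 180.0
```

This is why lowering detail settings only raises fps up to the CPU-set limit, and never past it.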
Still according to the game.
In CSGO I get 300fps, so fps over 144 is most definitely attainable. In Skyrim (vanilla) it's over 180fps. Add in the 170 scripted mods I use, which is extremely heavy CPU usage, pushing the game from 2 threads to 6, and fps tanks to 60.
The problem here is that the 3770K is getting long in the tooth. Its IPC isn't comparable to newer platforms, even if clocks are. So regardless of the GPU used, its maximum attainable fps is going to be somewhat lower than a new platform's, all depending on which game is played. The 1080ti is strong enough that it'll actually reach CPU limits in most games.
That's not a bottleneck; that's just the inability of the CPU to fully utilize the GPU. Bump up the resolution, raising GPU demand, and that changes. At 4k a 1080ti is perfectly suited to the old 3rd gen: you'll only need to aim for 60fps, and the GPU will struggle even with that at times.
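The resolution point can be sketched with a deliberately naive scaling assumption (GPU frame time growing roughly linearly with pixel count — real scaling varies by game, this is just to show how the bound moves). All rates here are made-up illustration numbers:

```python
# Naive assumption: GPU fps scales inversely with pixel count.
def gpu_fps_at(base_fps: float, base_pixels: int, target_pixels: int) -> float:
    return base_fps * base_pixels / target_pixels

p_1080 = 1920 * 1080
p_4k = 3840 * 2160          # exactly 4x the pixels of 1080p

cpu_limit = 180.0           # hypothetical CPU-side frame rate for this game
gpu_at_1080 = 240.0         # hypothetical GPU rate at 1080p

# At 1080p the GPU outruns the CPU -> CPU-bound, fps sits at the CPU limit:
print(min(cpu_limit, gpu_at_1080))                          # 180.0

# At 4K the GPU rate drops to a quarter -> GPU-bound, CPU has headroom:
print(min(cpu_limit, gpu_fps_at(gpu_at_1080, p_1080, p_4k)))  # 60.0
```

Same CPU, same GPU: at 1080p the pairing looks "bottlenecked", at 4K the GPU becomes the limiting stage and the old CPU keeps up fine.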