It seems most people don't have a clear understanding of how CPU and GPU bottlenecks work. In simple terms:
In a given scene, there will be a certain amount of CPU power needed for a given framerate. I'll make up some numbers - your CPU may be able to deliver 90fps at stock clocks in a particular scene in a game. If you drop in a GTX 1080, at low settings, it may be able to deliver 200fps, so your CPU will be bottlenecking you at 90fps. However, if you crank up the graphical settings, the CPU will still be able to deliver 90fps, but the GPU may only deliver 50fps, so you're now "GPU bottlenecked". One is always bottlenecking the other, or you'd have infinite frames per second.
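To make that concrete, here's a minimal sketch using the made-up numbers above (90fps CPU limit, 200fps GPU limit at low settings, 50fps at high) - the framerate you actually see is just whichever limit is lower:

```python
def effective_fps(cpu_fps_limit, gpu_fps_limit):
    # Whichever side can deliver fewer frames per second is the bottleneck.
    return min(cpu_fps_limit, gpu_fps_limit)

cpu_fps = 90  # hypothetical CPU limit for this scene; graphics settings don't change it

for settings, gpu_fps in [("low", 200), ("high", 50)]:
    fps = effective_fps(cpu_fps, gpu_fps)
    limiter = "CPU" if cpu_fps < gpu_fps else "GPU"
    print(f"{settings} settings: {fps} fps ({limiter} bottlenecked)")
```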
The amount of CPU and GPU power needed is constantly varying. The goal is to have enough of both to hit the framerate you want, at the graphical settings you want. You can always reduce GPU load by lowering graphics settings, but if your CPU is limiting you to a framerate lower than you want, there's nothing you can do about it other than overclock or replace the chip.
Any CPU will bottleneck any GPU sometimes, but it probably won't matter, because a 2600K will still deliver an excellent experience in the vast majority of games.