Don't even bother with the Bottleneck Calculator; it's complete garbage with zero basis in reality. It's a gimmick vendors use to trick the unwary into thinking they need upgrades.
While you're at it, throw out any preconceived notions of "bottleneck"; most of that is BS too.
The CPU is what it is. It pre-renders frames according to the game code's design, and it will only pre-render so many frames a second. Those frames get sent to the GPU, which finish-renders them according to resolution, detail settings and post-processing effects. At no time does one affect the other; each has a separate job.
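If it helps, here's a toy sketch of that idea in Python. The function name and the relationship are mine, made up purely for illustration; it's just the min() of two independent rates, not how any real driver or engine actually works:

```python
# Toy model of the pipeline above: the CPU can only prepare so many
# frames per second, the GPU can only finish so many, and the on-screen
# fps is capped by whichever rate is lower. Purely illustrative.

def on_screen_fps(cpu_fps: float, gpu_fps: float) -> float:
    """On-screen fps is simply the lower of the two independent rates."""
    return min(cpu_fps, gpu_fps)
```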
So let's say the CPU can pre-render 100fps. That gets sent to the GPU. The GPU takes those frames and finishes them at 1080p ultra, and even though it has power enough to do more, you get 100fps on screen because that's the limit of what the CPU can send. Most people call that the CPU bottlenecking the GPU, but it's not; in that one game the GPU is simply under-utilized. Changing settings does little to nothing in this case, because you can't put more on screen than the CPU can give.
Or take a different game where the CPU can do 200fps, but the GPU at 1080p ultra can only put 150fps on screen. Lowering settings raises the on-screen fps, but it still will not exceed the 200fps cap from the CPU. Most say this is the GPU bottlenecking the CPU. It's not; it's the detail settings exceeding the GPU's ability to reproduce frames as fast as the CPU can send them.
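Plugging both scenarios into that toy model (the GPU figure in the first game is made up, since all we know is it could do more than 100):

```python
# Game 1: CPU pre-renders 100fps, GPU could finish more than that.
print(on_screen_fps(cpu_fps=100, gpu_fps=160))  # 100 - GPU under-utilized

# Game 2: CPU pre-renders 200fps, GPU at 1080p ultra finishes 150.
print(on_screen_fps(cpu_fps=200, gpu_fps=150))  # 150 - lowering settings
# raises gpu_fps, but on-screen fps still can't exceed the CPU's 200fps.
```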
That was both at 1080p. Change that to 1440p, which has roughly 1.78x as many pixels to populate as 1080p, and the results change because the GPU now has to work considerably harder. 4K is worse still: it's 4x the pixels of 1080p, so the CPU's fps ceiling matters less and far more GPU power is needed to reproduce each frame.
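The pixel math is easy to check yourself:

```python
# Pixels per frame at common resolutions. More pixels means more GPU
# work per frame, while the CPU's per-frame work barely changes.
res_1080p = 1920 * 1080   # 2,073,600 pixels
res_1440p = 2560 * 1440   # 3,686,400 pixels
res_4k    = 3840 * 2160   # 8,294,400 pixels

print(res_1440p / res_1080p)  # ~1.78x the pixels of 1080p
print(res_4k / res_1080p)     # 4.0x the pixels of 1080p
```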
The CPU doesn't slow the GPU down, nor does the GPU slow the CPU down. They both work at 100% of their ability; the only difference is how much of their resources the game code actually demands. Not the same thing.
So a stronger GPU can increase fps, but it's just playing 'catch-up' to what the CPU can output. A stronger CPU can increase the original output, raising the fps ceiling the GPU has to work with.
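Same toy model, same made-up numbers:

```python
# A stronger GPU only helps until it catches up to the CPU's ceiling...
print(on_screen_fps(cpu_fps=100, gpu_fps=150))  # 100
print(on_screen_fps(cpu_fps=100, gpu_fps=250))  # still 100

# ...while a stronger CPU raises the ceiling the GPU has to work with.
print(on_screen_fps(cpu_fps=180, gpu_fps=150))  # 150 - now GPU-limited
```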
Moving to a 5700 XT might get you higher fps in some games, but only in those where the 6600K can already give the GPU more frames than your current card can finish. Or moving to a 5700 XT can do nothing at all, if the 6600K is capped at a lower rate by the game code and your current GPU isn't fully utilized.
The Bottleneck Calculator takes user-submitted gaming statistics from multiple games. You supply a GPU, and it takes that series of results. You give it a CPU, and it tells you that you're bottlenecking because there are stronger CPUs around with higher levels of pre-rendered frames. It's a bunch of horse manure. Fps changes from game to game, and with resolution, detail levels, and post-processing like HairWorks or ray tracing. The Bottleneck Calculator is an average across multiple games at its settings, not yours.
I'd really like to know how an i5-6600K, capable of over 300fps in CS:GO, could be considered a bottleneck. Does that mean the GPU is capable of far more than 300fps? And you have what monitor that'll handle that kind of output? So what if you only get 80fps in BF1; that's more on the GPU than the CPU. Oh wait, a multiplayer boss fight will change that due to AI demands, and fps drops under the massive CPU load. Does that mean the CPU is now bottlenecking the GPU, when two minutes ago it was the other way around?