So wait, you're apparently some well-educated programmer, and you don't even know how a game engine works? What I posted was not spin, and the chart you linked is laughable at best. But I wouldn't make a statement and expect you to blindly listen to me, as you have tried with your chart. So I'll give another example:
Let's say I'm playing a game like Borderlands, and let's say I have a 144Hz monitor (which I do, so this example speaks to me). The reason I bring up Borderlands is that it's called a CPU-bound game in all corners of the internet. Now, I love this game, it benefits greatly from frame rates above 100 FPS, and I have a monitor capable of displaying them. Here's the thing: it's bottlenecked by my 4670K. I have a 980 Ti, and my 4670K is bottlenecking the performance in this game for me. So I have two choices: 1) upgrade the CPU (I'll use a heavy overclock to simulate an upgrade), or 2) turn down the graphics settings. Since the CPU is the bottleneck, naturally the CPU OC is the preferred upgrade. So let's look at what I actually did to balance my frame rate...
My target was a minimum of 120 FPS, and I was getting 90.
So first things first, I did what I needed to do and overclocked the processor to 4.6GHz. And by golly, there it was: an average of about 120 FPS. But I noticed I was dipping closer to 110 FPS, which made me sad. The CPU was at its OC limits, and the game was still CPU bound by definition (i.e. ~90% CPU utilization), so according to your chart, turning down the graphics settings should have gained me little, because the game was CPU bottlenecked.
But unlike you, I know how computers work and understand that there are two parts to each frame: CPU-bound tasks and GPU-bound tasks. So I dropped anisotropic filtering to 4x, turned bullet decals down, and turned off AA (I was never a fan of it). And what do you know, the frame rate shot up to a 140 FPS average, though it still had its dips to around 115 at the minimum.
The best part is, GPU utilization actually goes down in this case, because the graphics card isn't even working as hard as it could be... that's right: 90% CPU utilization, and actually dropping the utilization of the GPU netted me a gain in frame rate. By turning those settings down, the GPU was completing its part of each frame faster, which meant it spent more time waiting for CPU information. But even with that added downtime, the overall frame rate went up.
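To see why lowering GPU settings can still raise FPS in a "CPU bound" game, here's a toy sketch of per-frame timing. This is not code from any real engine, and the millisecond numbers and the `overlap` fraction are made-up assumptions purely for illustration; the only point is that when CPU and GPU work don't fully overlap, shrinking the GPU's share of the frame still shortens the whole frame.

```python
# Toy model of per-frame timing (illustrative numbers, not from any real engine).
# Assumption: each frame has a CPU portion and a GPU portion, and only a
# fraction of the GPU work overlaps with the CPU work of the next frame --
# the rest sits on the critical path of the frame.

def fps(cpu_ms, gpu_ms, overlap=0.5):
    """Frame rate when a fraction `overlap` of the GPU work runs
    concurrently with CPU work, and the rest is serialized."""
    serial_gpu = gpu_ms * (1 - overlap)  # GPU time the CPU has to wait out
    frame_ms = cpu_ms + serial_gpu
    return 1000.0 / frame_ms

# CPU-heavy game: CPU costs 7 ms per frame, GPU costs 8 ms at high settings.
print(round(fps(7.0, 8.0)))  # high settings
print(round(fps(7.0, 4.0)))  # lowered GPU settings: FPS still improves
```

Even though the CPU cost per frame never changes here, cutting the GPU's portion in half still lifts the frame rate, because less GPU time ends up serialized behind the CPU's work.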
My point is, bottlenecking in video games is something many people don't understand. When you're explaining this to a community of people without a deeper understanding, it makes sense to call a game with 90% CPU utilization and 75% GPU utilization a "CPU bottleneck." But the reality is that if the next processor up gains, let's say, 7% (like most generational changes actually deliver in games), you'll only see that 7% scaled by the 90% utilization (so about a 6% increase).
So let's say the old GPU runs at 75% utilization, and the GPU upgrade has a 30% theoretical performance increase. Then the overall performance increase is, theoretically, the utilization times the performance of the new card, minus the utilization times the performance of the old card:
(75% x 130%) - (75% x 100%) = 22.5% increase
But as we both know, utilization of the new GPU will be lower as well, so let's account for that by knocking 5% off its utilization. The reason is that we'll be increasing the GPU's downtime: we are not increasing the amount of information the CPU can deliver, because the CPU is still the same.
(70% x 130%) - (75% x 100%) = 16% increase.
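The arithmetic above can be checked in a few lines. This is just the same back-of-envelope estimate restated as code; `estimated_gain` is a made-up helper name, and the utilization and performance numbers are the illustrative ones from the example, not measurements.

```python
# Back-of-envelope from the example above:
# estimated gain = (new utilization x new-card performance)
#                - (old utilization x old-card performance)

def estimated_gain(old_util, new_util, new_perf, old_perf=1.0):
    return new_util * new_perf - old_util * old_perf

# Same 75% utilization on both cards, new GPU 30% faster in theory:
print(round(estimated_gain(0.75, 0.75, 1.30), 3))  # 0.225 -> 22.5% increase

# Knock 5% off the new card's utilization for the extra idle time:
print(round(estimated_gain(0.75, 0.70, 1.30), 3))  # 0.16 -> 16% increase
```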
So basically, that chart claiming a 4670K will bottleneck a 1080 in some CPU-intensive games is definitely true, but I'll still gain more performance by upgrading my 980 Ti to a 1080 Ti next year than I would by upgrading my 4670K to a 7700K. I have a friend happily running a 2500K with a 1070 and playing games at 4K better than I can with a 980 Ti, because bottlenecks are game dependent and CPU performance means so little in the majority of games.
And don't take this the wrong way: those charts are a great guide for people who are building a new PC and don't have time to learn, but in reality we can make much better use of our time and money by gaining a better understanding of the products we use.