If you set an 8700 non-K up to run at a constant 4.3 GHz all-core under load, it would probably come close to that number in a perfect-storm gaming scenario. You need a good cooler and, in some cases, BIOS tweaks to achieve this. The gap would shrink significantly if you overclocked the 2600 to 4.2 GHz with a good cooler.
Here in the real world all of this is dependent on the game, GPU, resolution, and settings. You can throw out numbers until the cows come home but unless you have enough GPU, or use reduced settings, a faster CPU won't necessarily give you significantly higher frame rates. In most scenarios, once you actually have the surrounding hardware and settings combination to see the gaming benefit of a 25% faster CPU, the frame rate will be so high you probably won't be able to notice the benefit anyway. I doubt 99% of people could tell the difference between 120 and 165 fps in a double blind test if using adaptive sync and getting similarly smooth frame times.
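To put the 120 vs 165 fps point in perspective, here's a quick sketch (plain Python; the numbers are just the frame-rate arithmetic, not benchmark data) showing how small the per-frame time difference actually is at high refresh rates:

```python
def frame_time_ms(fps: float) -> float:
    """Average time per frame in milliseconds at a given frame rate."""
    return 1000.0 / fps

# Frame time shrinks non-linearly with fps, so gains at the high end
# are tiny in absolute terms.
for fps in (60, 120, 165):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):.2f} ms per frame")

# Going from 60 to 120 fps saves ~8.33 ms per frame; going from
# 120 to 165 fps saves only ~2.27 ms per frame.
```

That ~2 ms gap per frame is a big part of why, with adaptive sync and similarly smooth frame times, the difference is so hard to perceive.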
All of this gets blown way out of proportion by enthusiasts and the tech press, IMO. Gaming CPU benchmarks would be pretty boring if they used a sensible GPU, resolution, and graphics quality settings for the CPUs being compared. This is why CPUs are almost always tested with flagship GPUs; the significant gains only show up at lower resolutions in most cases. More focus should be put on frame times when comparing gaming CPUs, IMO.