joeblowsmynose
Isn't it 30-40% now? ... Intel for the highest fps in games, but more expensive and with fewer cores (except the 9900K, and only until next year). AMD for 10-20% less fps in games, but at a lower price point with higher core/thread counts, making production work faster.
There's a lot more to consider than simplifying it to that equation (which had the wrong variables anyway).
"Testing with Assassin's Creed: Odyssey, 3rd-gen Ryzen is ~10% faster than the 2700X which is good, but not good enough to beat the 9900K, at least when looking at the average frame rate. Despite similar frame time performance, the 9900K was 4% faster on average at 1080p with an RTX 2080 Ti. " - testing R7 3700x --- this testing was done with very high or ultra game quality settings and a mix of 1080p and 1440p - which the OP said he would be likely using both. Unless he's the kind of guy to turn all the game settings to low, it is this 4% difference he should be considering, along with other points
Source: https://www.techspot.com/review/1869-amd-ryzen-3900x-ryzen-3700x/
Steve Burke, testing at medium game settings and strictly 1080p, noted a 6-8% difference on average, as I mentioned earlier in the thread.
It's not 10-20% in real life. Also note the comment about frame times being roughly equal: smooth gameplay is superior to faster fps with worse frame times. Average FPS does NOT tell the story of how smoothly a game plays.
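To put rough numbers on that last point, here's a quick sketch (Python, with made-up frame-time traces, not real benchmark data) showing how two CPUs can post nearly the same average FPS while one of them stutters badly; the "1% low" figure is what exposes it:

```python
import statistics

def summarize(frame_times_ms):
    """Return average FPS and '1% low' FPS from a list of frame times (ms)."""
    avg_fps = 1000 / statistics.mean(frame_times_ms)
    slowest_first = sorted(frame_times_ms, reverse=True)       # worst frames first
    worst_1pct = slowest_first[: max(1, len(slowest_first) // 100)]
    low_fps = 1000 / statistics.mean(worst_1pct)
    return avg_fps, low_fps

# Invented traces: CPU A holds a steady 10 ms per frame; CPU B is slightly
# faster most of the time but throws periodic 30 ms hitches.
cpu_a = [10.0] * 1000
cpu_b = [9.3] * 970 + [30.0] * 30

for name, trace in (("steady", cpu_a), ("stuttery", cpu_b)):
    avg, low = summarize(trace)
    print(f"{name}: avg {avg:.0f} fps, 1% low {low:.0f} fps")
```

Both traces average ~100 fps, but the stuttery one drops to the low 30s in its worst 1% of frames, and that's the kind of thing you actually feel while playing.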