That's actually incorrect. More and more people are Twitch streaming now. Almost everyone I know who games has a Twitch channel and attempts to stream. Guess what their limiting factor is going to be? =p. CPU power for gaming is relevant now more than ever because of this. Ever try gaming and streaming on a 9600k with a 2080ti (or any GPU for that matter)? Yeah, it doesn't work out too well. Enter a beefy, speedy CPU with multi-threading. So while your point about 4k 60hz gaming not bottlenecking a CPU by itself is technically true, the rise of streamers is demanding more CPU power day by day regardless, which makes your "almost impossible" claim not just misguided - the scenario is actually very common. Even more so for folks who stream non-AAA / less GPU-demanding titles.
Edit - And don't get me wrong, I understand what you're saying about how you wouldn't be pushing those frames in a real-life scenario, but it is still an indicator of the general gaming performance of one CPU vs another.
Well no, it's not incorrect, because what you're describing is an entirely different use case from my point altogether. What you describe highlights the need for a CPU with lots of cores and threads (which AMD highlighted in their E3 streaming demo vs the 9900k - if the 9900k really "games" better, why does the 3900x win at that task? Actual use-case issues aside - no one needs to stream on "slow"). That represents how well your CPU can multitask, not how many max frames it can hit in an artificial situation.
None of that really relates to the "need" to bottleneck your CPU with a $1200 video card and quality/resolution settings on low. That's the need that doesn't exist - it only shows up in gaming benchmarks and virtually never (99.999% of the time) in regular gaming.
"And don't get me wrong I understand what you're saying about you wouldn't be pushing those frames in a real life scenario...indicator of the general gaming performance"
- Do note that those two points somewhat contradict each other. A deliberately bottlenecked CPU isn't really an indicator of gaming performance at all - it's an indicator of something, but hardly of real-world gaming. That contradiction is exactly where the issue lies.
And I'm not asking to outlaw low-quality, low-res, $1200 GPU benchmarks. I'm asking that reviewers make a distinction between benchmark numbers from typical real-use-case scenarios and wholly artificially induced ones. The onslaught of people and fanboys who don't understand this dichotomy between real life and artificial is getting a bit silly - even relatively intelligent people are losing sight of this reality. What would be wrong with showing the CPU difference on other cards besides a 2080ti? The GTX 1060 is still the most popular card by a massive margin, so why not show the difference on the card that almost everyone has - a benchmark they can actually relate to? Why not a 2070 and a 1070, so we can see the difference between them in real-world values? Why hide the "real" world values? Why only show values that 99.999% of people will never see? Are there third-party incentives at play here? It's disingenuous and a disservice to people who read the reviews and come away believing the wrong things about the results.
And don't get me wrong, I was actually prepared to make that same speech to AMD fanboys if Zen 2 had ended up wiping the floor with Intel in gaming ... It obviously goes both ways - this issue has nothing to do with "brand".