AgentLozen :
It is also, at the same time, an unrealistic benchmark, forcing a GTX 1080 Ti at 1080p with a top-of-the-line CPU... totally unrealistic. These results are not at all on par with a realistic build of a Ryzen 5 1600X with an RX 580 or GTX 1060. At 1080p, with anything below a GTX 1070, you see no real difference.
I'm afraid I don't completely understand what you're getting at. I'm interpreting your post as "Why did this article feature a GeForce GTX 1080 FE in the test setup when no one would use that card for 1080p gaming? It's mismatched." I hope I got that right.
I think my last post summarizes the reason well enough, but I'll try to clarify further. The purpose of the benchmarks in this article is to test only the performance of the Ryzen 2700X. The author (tester?) has to make sure that other variables don't taint the results. For example, you wouldn't test software using only 512 MB of RAM, because the RAM would bottleneck the system and produce inaccurate results. However, if you used 128 GB of RAM, it wouldn't make a difference, because the Ryzen 2700X is the bottleneck at that point.
The same logic applies to graphics cards. You shouldn't use a GeForce GTX 650 Ti, because the results would be bottlenecked by the graphics card. But if you used an overclocked GeForce GTX 1080 Ti, the Ryzen 2700X would be the bottleneck, so the card wouldn't matter.
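The bottleneck reasoning above can be sketched as a toy model: the frame rate you observe is capped by whichever component is slower. All numbers below are made up for illustration, not real benchmark data.

```python
def effective_fps(cpu_fps_limit: float, gpu_fps_limit: float) -> float:
    """The slower of the two components caps the observed frame rate."""
    return min(cpu_fps_limit, gpu_fps_limit)

# GPU-bound case: with a slow GPU, two different CPUs look identical.
assert effective_fps(cpu_fps_limit=140, gpu_fps_limit=60) == 60
assert effective_fps(cpu_fps_limit=160, gpu_fps_limit=60) == 60

# CPU-bound case: with a very fast GPU, the CPU difference shows up.
assert effective_fps(cpu_fps_limit=140, gpu_fps_limit=200) == 140
assert effective_fps(cpu_fps_limit=160, gpu_fps_limit=200) == 160
```

This is why a CPU review pairs the chip with the fastest GPU available: it pushes the test into the CPU-bound case, where the CPUs' limits are actually visible.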
I hope I understood your comment and that this example explains why the GeForce GTX 1080 FE used in this test is appropriate.
It is unrealistic because it doesn't reflect what the customer will actually experience, while creating a false sense of superiority.
Basically, you are telling Joe Blow that Intel is better at gaming, but Joe Blow is going to buy an 8400 and a GTX 1060. If you take a GTX 1060 at 1080p with either Ryzen (first gen) or Coffee Lake, it doesn't matter: they behave basically the same way, because the card is the bottleneck.
What you introduce is scope creep, and a behavior only present at 1080p with a GPU above $500. At 1440p or 2160p this is a non-issue, and multi-threaded performance is going to matter instead.
So basically, this benchmark is true in what... 3% of situations, in the best case, with new builds? While the contrary is not even mentioned.
So the comment about why you should STILL include 1440p and 2160p benchmarks is still valid: it's because you want to give people the whole picture.
What you should always mention is that this is only true when the card is not the bottleneck, which only happens with a GTX 1070 or above. At that point, I hope you mentioned it only affects people at 1080p @ 144 Hz and above, because at 60 Hz, again, it is a non-issue.
So, do I have a problem with the way this benchmark is done? You bet I do. It is totally misleading information that doesn't show the real picture.
If you test a CPU at 1080p for gaming, then you should provide a budget, a high-end, and an enthusiast perspective. If not, you are just propagating FUD.