InvalidError:
Slatteew:
Who the hell buys these parts and games at low res (we're talking 800x600 for the tests) and low settings? NO ONE!
A *CPU* benchmark would be worthless as a *CPU* benchmark if the results end up practically identical across the board due to being almost entirely GPU-bound.
A component-centric benchmark (such as a *CPU* benchmark that focuses exclusively on *CPU* performance) has to minimize the influence of other factors as much as possible, and the easiest way to reduce the GPU's influence as an unknown and potential bottleneck is to lower the resolution and disable GPU-intensive options. This is a basic principle of scientific testing: eliminate as many variables and unknowns as possible.
If you want benchmarks under more typical gaming conditions, look at GPU reviews. If you really want a sense of how much influence the CPU has on GPU performance at more typical resolutions and detail settings, hunt down CPU-GPU scaling benchmarks.
The benchmark is not invalid: it does represent the theoretical frame rate the CPU might be able to achieve if it had infinite GPU processing power with vsync off. You are simply looking at the wrong benchmark for the wrong reasons.
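To make the bottleneck logic concrete, here is a toy model with made-up numbers (nothing below is measured data): the frame rate you observe is roughly the lower of what the CPU can feed and what the GPU can render, and dropping the resolution lifts the GPU ceiling until the CPU ceiling is the thing you are actually measuring.

```python
# Toy bottleneck model: observed fps ~= min(CPU ceiling, GPU ceiling).
# All numbers are invented for illustration, not real benchmark results.

CPU_FPS_CEILING = 90  # frames/s the CPU can prepare; roughly resolution-independent

# Hypothetical GPU ceilings: rendering cost grows with pixel count.
GPU_FPS_CEILING = {
    "800x600": 300,
    "1920x1080": 85,
    "2560x1440": 55,
}

for res, gpu_fps in GPU_FPS_CEILING.items():
    observed = min(CPU_FPS_CEILING, gpu_fps)
    bottleneck = "CPU" if CPU_FPS_CEILING < gpu_fps else "GPU"
    print(f"{res:>9}: {observed:3d} fps ({bottleneck}-bound)")

# Only the 800x600 run actually exposes the CPU's 90 fps ceiling;
# at 1080p and above you are mostly benchmarking the GPU.
```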
I will grant you the GPU scaling point, but if you compare across the board with identical game settings and only the CPU changing, you still get performance differences outside standard error (~3-5%). Some processors have also been shown to handle higher resolutions better than others. The question that matters is: does my CPU have any effect on performance, with this setup, at the resolution MOST people actually play at (usually 1080p, according to Steam's numbers)? In a Tek Syndicate review with a dual-GPU setup, the AMD 8350 had very little dropoff going from 1080p to 1440p compared to the 3770K, which might point to some CPU bottlenecking. The theoretical performance with unlimited GPU power is still moot, because we can never have unlimited GPU performance. Games lean on the CPU or the GPU to different degrees (like most games do), and they only used three games. Two of those (SC2 and Skyrim) are very CPU-bound, which automatically favors Intel. Popular games, sure, but the results still cannot be called useful when they have no basis in the real world.
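A rough sketch of what I mean, with invented numbers rather than anything from the Tek Syndicate review: when two CPUs have different frame-rate ceilings, the gap between them shrinks as the GPU ceiling drops below both, which is why the dropoff from 1080p to 1440p can look so different from one CPU to the next.

```python
# Hypothetical example of CPU differences being masked at higher resolutions.
# Numbers are made up to illustrate the effect, not taken from any review.

cpu_ceilings = {"CPU A": 110, "CPU B": 80}          # fps each CPU can feed
gpu_ceilings = {"1920x1080": 95, "2560x1440": 60}   # fps the GPU can render

for res, gpu_fps in gpu_ceilings.items():
    a = min(cpu_ceilings["CPU A"], gpu_fps)
    b = min(cpu_ceilings["CPU B"], gpu_fps)
    print(f"{res}: CPU A = {a} fps, CPU B = {b} fps, gap = {a - b} fps")

# 1080p: A = 95, B = 80 -> 15 fps gap (CPU B is the bottleneck).
# 1440p: A = 60, B = 60 -> the gap vanishes; both are GPU-bound, and the
# CPU with the lower ceiling shows the smaller dropoff between resolutions.
```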
Many videos from Linus and Logan show the 3570K and 3770K sometimes beating and sometimes being beaten by the AMD 8350 in gaming; there is a difference when you keep the settings the same and change ONLY the CPU. The difference becomes lopsided in AMD's favor once you add streaming, and the 8350 costs less than both of those chips; I got mine for $150 in a Black Friday sale (what a deal, right?). Synthetic and low-res benchmarks have little to no basis in the real world and mislead consumers into buying products they do not actually need. Buy a 4960X, get 10 more FPS in Crysis 3 than the FX 8350, and try to justify the 5x higher cost.
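Putting the price/performance point into back-of-envelope numbers, using the rough figures from this thread (~$200 list for the FX 8350, roughly 5x that for the 4960X, and a ~10 FPS Crysis 3 advantage):

```python
# Back-of-envelope cost per extra frame. Prices and the fps delta are the
# rough figures quoted in this thread; illustrative arithmetic only.

fx8350_price = 200
i7_4960x_price = 5 * fx8350_price   # ~5x the cost, per the thread
extra_fps = 10                      # Crysis 3 advantage cited above

extra_cost = i7_4960x_price - fx8350_price
print(f"Extra cost: ${extra_cost} for {extra_fps} more fps")
print(f"Marginal cost: ${extra_cost / extra_fps:.0f} per additional fps")
# -> roughly $800 for 10 fps, i.e. about $80 per extra frame per second.
```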
What good is the experimentation if it is not applicable to the real world? Not much. I work in the scientific research community, and I see a lot wrong with both the design and the interpretation of these results.