I am wondering about the practice of benchmarking gaming hardware (CPUs and GPUs) by running game engines at their "ultra," "crazy," or "very high" settings.
The thought experiment goes like this: we want to see which hardware does the best on the most taxing settings. But what if the most taxing settings are poorly optimized, so the benchmark ends up measuring sloppy code rather than the hardware?
Many AAA games are built first and foremost for the PlayStation 4 or Xbox One, which run a lot of games at what are, by PC gaming standards, "medium" settings. The same engines are then ported to the PC version and a lot of bells and whistles are added. So what if the developer put most of their effort into the console version, optimized the heck out of the lower-end settings, and didn't put much time into the "ultra" settings?
I believe this matters most for mid-range hardware. With no disrespect intended to the reviewers, a common comment I read here on Tom's Hardware is that <insert affordable component here> would be OK if you were willing to "sacrifice detail settings." But we don't see many tests of how the game engine scales. What if the engine, when dialed down, is disproportionately more forgiving on less expensive gear?
Has anyone got some first-hand experience trying this out?
I would love to see a benchmark series where you took, say, three popular games and ran them across a matrix of hardware and settings along these lines (a rough sketch of how the combinations multiply follows the list):
Intel CPU high, medium, low
AMD CPU high, medium, low
Nvidia GPU high, medium, low
AMD GPU high, medium, low (although 'high' may not be released yet)
and then Game A, B and C, tested at Ultra, High, and Medium modes.
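To put a number on "a lot," here is a quick back-of-the-envelope sketch in Python. The vendors, tiers, games, and presets are just the placeholders from the list above, not a real test plan; the point is simply how fast the matrix grows once every CPU is paired with every GPU.

```python
from itertools import product

# Hypothetical test matrix, mirroring the placeholder list above.
cpu_vendors = ["Intel", "AMD"]
gpu_vendors = ["Nvidia", "AMD"]
tiers = ["high", "medium", "low"]        # hardware tier within each vendor's lineup
games = ["Game A", "Game B", "Game C"]
presets = ["Ultra", "High", "Medium"]    # in-game quality presets

# Every CPU paired with every GPU, run through every game at every preset.
runs = list(product(
    product(cpu_vendors, tiers),  # 2 vendors x 3 tiers = 6 CPUs
    product(gpu_vendors, tiers),  # 2 vendors x 3 tiers = 6 GPUs
    games,
    presets,
))

print(len(runs))  # 6 * 6 * 3 * 3 = 324 benchmark runs
```

And that is before adding different resolutions or repeating runs for consistency.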
That's a lot of combinations. I get the feeling that this is why things like 3DMark exist, only (again, no disrespect intended to the reviewers) the usual comment is that synthetic benchmarks don't really tell you much about real-world performance.