Are game engines actually optimized for "Ultra" mode?

Retrogame

I am wondering about the practice of benchmarking hardware (CPUs and GPUs) for gaming using various game engines set to "ultra" or "crazy" or "very high."

The thought experiment goes like this: we want to see which hardware does best at the most taxing settings. But what if the most taxing settings are poorly optimized, so the results say more about the code than about the hardware?

Many AAA games are built first and foremost for the PlayStation 4 or Xbox One, which run a lot of games at what are, by PC gaming standards, "medium" settings. The same engine is then used for the PC version, with a lot of bells and whistles added on top. So what if the developer put most of their effort into the console version, optimized the heck out of the lower-end settings, and didn't put much time into the "ultra" settings?

I believe this matters most for mid-range hardware. With no disrespect intended to the reviewers, a common comment I read here on Tom's Hardware is that <insert affordable component here> would be OK if you were willing to "sacrifice detail settings." But then, we don't see any tests of how the game engine scales. What if the engine, when dialed down, is disproportionately more forgiving on less expensive gear? A rough way to quantify that is sketched below.
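For example, here is a minimal Python sketch of how a reviewer could put a number on engine scaling. All the hardware names and fps figures are invented purely for illustration:

```python
# Hypothetical benchmark results (fps) -- every number here is made up
# to illustrate the idea, not measured from any real hardware.
results = {
    "high-end GPU": {"Ultra": 90.0, "Medium": 120.0},
    "budget GPU":   {"Ultra": 35.0, "Medium": 70.0},
}

for gpu, fps in results.items():
    scaling = fps["Medium"] / fps["Ultra"]
    print(f"{gpu}: {scaling:.2f}x faster at Medium than at Ultra")

# If the budget card gains 2.00x while the high-end card gains only 1.33x,
# the engine really is disproportionately more forgiving on cheaper gear.
```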

Has anyone got some first-hand experience trying this out?

I would love to see a benchmark series with, say, three popular games and a matrix of hardware and settings along these lines (a quick way to count the combinations is sketched after the list):

Intel CPU high, medium, low
AMD CPU high, medium, low

Nvidia GPU high, medium, low
AMD GPU high, medium, low (although 'high' may not be released yet)

and then Games A, B, and C, each tested at Ultra, High, and Medium modes.
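Counting those out with a throwaway script (the component tiers are just placeholders for whatever parts a reviewer picks):

```python
from itertools import product

# Hypothetical test matrix, taken straight from the list above.
cpus    = list(product(["Intel", "AMD"], ["high", "medium", "low"]))
gpus    = list(product(["Nvidia", "AMD"], ["high", "medium", "low"]))
games   = ["Game A", "Game B", "Game C"]
presets = ["Ultra", "High", "Medium"]

runs = list(product(cpus, gpus, games, presets))
print(len(runs))  # 6 CPU configs * 6 GPU configs * 3 games * 3 presets = 324
```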

That's a lot of combinations. I get the feeling this is why things like 3DMark exist, only (again, no disrespect intended to the reviewers) the usual comment is that synthetic benchmarks don't really tell you much about real-world performance.

You bring up a good point that many reviewers tend to ignore, pretend isn't there, or are simply unaware of. It could also be laziness or lack of time on the reviewer's part: the amount of work involved goes up significantly as you add more variables, and trying to benchmark image quality gets into subjective territory that a lot of reviewers would likely wish to avoid. They already take enough flak from readers.

Here is a simple YouTube video comparing just a single graphics setting: the choice between HBAO, HDAO, and SSAO ambient occlusion. It shows that the visual difference is minimal while the FPS impact is significant, with a 17-25% performance hit for the more expensive options. Most gamers would be hard-pressed to tell you which mode was in use without an on-screen label.
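To put that range in concrete terms (simple arithmetic, assuming the hit applies uniformly to the frame rate):

```python
# Rough arithmetic: what a 17-25% performance hit does to a 60 fps baseline.
baseline_fps = 60.0
for hit in (0.17, 0.25):
    print(f"{hit:.0%} hit: {baseline_fps * (1 - hit):.1f} fps")
# 17% hit: 49.8 fps; 25% hit: 45.0 fps -- below a 60 fps target either way,
# for a visual difference most players couldn't pick out in a blind test.
```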

Unfortunately, everybody seems to want to set their games to ultra, and they fail to realize how much just a single tweak can improve their gaming experience, usually at the cost of imperceptible quality differences.

Also, a good comparison I saw a while back: newer titles look far better at medium settings than older titles used to, so turning down the graphics degrades overall image quality less than it once did, which is nice.

Here's an even better example from Rise of the Tomb Raider (ROTR) comparing the same graphics setting.

Yes, and things like Ambient Occlusion modes get more subtle as they get more complex and difficult to render.

One of the things that made me think about this was the way the Nvidia GeForce Experience game-optimization system will pick a crazy set of "optimal" settings with weird combinations, like (for example) shadows set to high, shadow distance set to medium, grass draw distance set to minimum, lighting quality set to ultra, anti-aliasing set to bonkers... a bunch of low, medium, high, and "experimental" toggles all selected at once.
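As a purely hypothetical illustration of that kind of profile (the setting names and values below are invented, not what GeForce Experience actually writes):

```python
# Invented example of a mixed-tier "optimal" profile; none of these keys
# correspond to a real game's config file.
optimized_profile = {
    "shadow_quality":      "high",
    "shadow_distance":     "medium",
    "grass_draw_distance": "minimum",
    "lighting_quality":    "ultra",
    "anti_aliasing":       "TXAA x4",  # the "bonkers" option
}
```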

Also, I agree: today's "medium" visual settings are yesterday's "ultra" for some game engines.

I think it's less prevalent than you might think. For instance, Nvidia helped Bethesda put some GameWorks tech into Fallout 4, but they still had to make sure the game could run on the XBone's and PS4's AMD APU cores.

There are a few examples of vendor-specific features. In reviews, I think they try to turn those specific features off because they're not supported by the "other team." By default, though, "ultra" mode is usually where the extra bells and whistles appear, and those might not be well optimized.