Discussion MAX game settings (and a note on texture resolution - e.g. 4K textures, etc.)

I've seen a few posts recently bashing the idea of running games at MAX settings so I thought I would give my take on the subject as well as get some feedback from you all.

Main idea - 'It's not worth it from a cost or performance-hit perspective since you usually can't tell the difference.'
Most of the time this is in reference to going from very high to ultra/max with the preset quality settings in AAA games. It can also include those who set the game to ultra/max and then go into the individual sliders and turn down a couple/few settings to claw back 5, 10, or 20 FPS. Now, I 100% agree with the idea that you usually can't tell the difference. In most of these games we are looking at a constantly moving image. Trying to pick out the lighting/shadow effects (or lack thereof) on a tiny palm tree off in the distance, one that's only visible for three seconds, all while running around dodging bullets, is a fool's errand.

However, most (all?) of these discussions never touch on the main reason I go for a top end video card - LONGEVITY. One example of this longevity is the GTX 1080 Ti. If you bought a GTX 1080 Ti video card 5 years ago, you still have a very capable GPU today, even with the latest crop of AAA games at near max settings. I can't say the same for someone who bought the GTX 1060 6GB. For me, being able to play games at max settings now is just a nice side effect of buying a top end card that will probably still perform well 5+ years from now.

A note on texture resolutions.
I usually see the below video posted as anecdotal evidence for why 'gaming at max settings is dumb'.
View: https://www.youtube.com/watch?v=PjDgKXe8gxs

This video does make some good points that I agree with, but at 6:23 it states that 'higher resolution textures do look a lot better if you have the right screen for them'. This implies that a higher resolution screen is needed to see the difference with higher resolution textures.
I actually had to listen to it a few times to make sure I was hearing it right, because texture size and screen resolution have NOTHING to do with one another. Someone with a 1080p monitor can benefit from a 4K texture just as much as someone with a 4K monitor. For example, a 4K texture on the small mesh of a pencil sitting on a desk would be a waste at any resolution, but the difference between a 1K and a 4K texture on a big patch of ground right in front of you would be immediately noticeable to everyone - regardless of whether you are on a 1080p or 4K monitor.
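Here's a quick back-of-the-envelope way to see it (my own toy numbers, nothing from the video): what matters is how many texels end up behind each screen pixel once a surface fills the frame.

```python
# Rough sketch: texels available per screen pixel when a textured
# surface spans the full width of the screen. All numbers here are
# illustrative assumptions.

def texels_per_pixel(texture_width, pixels_covered):
    """Texels mapped onto each screen pixel across the surface."""
    return texture_width / pixels_covered

for screen_width in (1920, 3840):        # 1080p vs. 4K monitor
    for texture_width in (1024, 4096):   # '1K' vs. '4K' texture
        ratio = texels_per_pixel(texture_width, screen_width)
        print(f"{screen_width}px screen, {texture_width}px texture: "
              f"{ratio:.2f} texels/pixel")

# 1920px screen, 1024px texture: 0.53 texels/pixel  <- visibly soft
# 1920px screen, 4096px texture: 2.13 texels/pixel  <- crisp
# 3840px screen, 1024px texture: 0.27 texels/pixel
# 3840px screen, 4096px texture: 1.07 texels/pixel
```

Even at 1080p, a 1K texture stretched across a screen-filling surface is under-sampled (less than one texel per pixel), so the jump to a 4K texture is plainly visible.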
 
don't know where you're seeing this "bashing" but i've never encountered any groups or individuals against maxing out all possible settings in any game.
some idiot's YouTube video doesn't mean much except that they're desperate for views and coming up with crap that will draw you in.

as long as someone is getting an fps they're comfortable with, i don't see why they wouldn't turn everything up as high as they can for that experience.
 
The problem with max settings
... in particular is that there's a lot of extra work the GPU has to do and not a whole lot of extra quality to show for it. I remember playing around with tessellation demos back in the day and I noticed something: the amount of apparent quality you gain by increasing the tessellation factor drops off really fast. In one of the demos I recall, it stopped adding an appreciable amount of detail past 0.25 out of 1.00, yet jacking it up higher certainly ate into performance. Another example would be something like Unigine Heaven, though their implementation is exaggerated.
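A toy way to show that drop-off (my own stand-in for a tessellation demo, with made-up numbers): approximate a circle of on-screen radius 500 px with more and more segments, and measure the worst-case gap between the polygon and the true curve.

```python
import math

# Toy diminishing-returns demo: approximate a circle of radius 500
# "pixels" with an N-sided polygon and measure the worst-case gap
# (sagitta) between a polygon edge and the true curve. The radius and
# segment counts are assumptions for illustration.

radius_px = 500.0

for n in (8, 16, 32, 64, 128, 256):
    # Max deviation of a chord from its arc: r * (1 - cos(pi / n))
    error_px = radius_px * (1 - math.cos(math.pi / n))
    print(f"{n:4d} segments: worst-case error {error_px:6.2f} px")

#    8 segments: worst-case error  38.06 px
#   16 segments: worst-case error   9.61 px
#   32 segments: worst-case error   2.41 px
#   64 segments: worst-case error   0.60 px
#  128 segments: worst-case error   0.15 px  <- already sub-pixel
#  256 segments: worst-case error   0.04 px  <- invisible, double the work
```

Past the sub-pixel point, every doubling of geometry buys you literally nothing you can see, which matches what those demos showed.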

Similarly, in a lot of side-by-side comparisons between max and high settings, I can't really see anything appreciable that would actually bother me if I wasn't scrutinizing, but they do impact performance in a non-trivial way. For example, in NVIDIA's GTA V settings guide (https://www.nvidia.com/en-us/geforce/news/grand-theft-auto-v-pc-graphics-and-performance-guide/):
  • Ultra vs. High quality grass: outside of the grass gaining self-shadowing, I can't really tell a difference, yet Ultra eats up about 15 FPS.
  • Ultra vs. High quality Post FX: the samples they used may not be indicative of how this feature is really exercised, but for the samples they've chosen, Ultra doesn't look that much better despite causing a 10 FPS hit.
  • Ultra vs. Very High quality reflections: Ultra takes another 15 FPS hit, and while you can point out the lower resolution in Very High, I don't think it's enough to detract from anything.
It's the same reason why people dunk on ray tracing: they don't see a massive quality improvement for the massive performance hit ray tracing incurs.
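Those FPS deltas land harder than they sound once you convert them to frame time, which is what actually stacks. A quick sketch using the guide's numbers above (the 60 FPS baseline is my assumption):

```python
# Convert the per-setting FPS hits quoted above into frame-time costs.
# The 60 FPS baseline is an assumed starting point; frame-time deltas
# add up roughly linearly if the costs are independent.

baseline_fps = 60.0
baseline_ms = 1000 / baseline_fps  # ~16.7 ms per frame

settings = {
    "Grass: High -> Ultra":          15,  # FPS lost, per the guide
    "Post FX: High -> Ultra":        10,
    "Reflections: V. High -> Ultra": 15,
}

total_ms = baseline_ms
for name, fps_lost in settings.items():
    ms_cost = 1000 / (baseline_fps - fps_lost) - baseline_ms
    total_ms += ms_cost
    print(f"{name:32s} -{fps_lost} FPS -> +{ms_cost:.1f} ms/frame")

print(f"all three stacked: ~{total_ms:.0f} ms/frame (~{1000 / total_ms:.0f} FPS)")
# Three 'barely visible' upgrades take a 60 FPS frame budget to ~32 FPS.
```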

A counterpoint to getting a top-end GPU
Getting a top-end GPU might matter for longevity if you're someone who demands maxing out every slider and option and still wants to maintain 60 FPS or higher. But a lot of people can't afford these cards, stick with midrange GPUs, and can live with turning down the image quality or dealing with a lower frame rate.

Not to mention, longevity doesn't really matter if the GPU doesn't support the latest and greatest AAA features. While the 1080 Ti can possibly max out most AAA games, if that means turning on ray tracing, it's going to perform worse than a 2060. In one case, Metro Exodus Enhanced Edition requires a GPU with hardware ray tracing support; ray tracing is a core component of the rendering engine and you can't turn it off. So the 1080 Ti is out of the picture for that game, and who's to say other developers won't follow suit?

And this was a major sticking point with multi-GPU setups: you got next-gen performance with current-gen features. Having, say, two GTX 280s didn't matter once DirectX 11 kicked off, because the cards couldn't use it.

About texture resolution and screen resolution
I agree with the points, but at the same time, two extremes were picked. Sure, a 4K texture would massively benefit a large, expansive mesh over a 1K texture, and putting a 4K texture on a tiny object is a waste, but what about the stuff in between? A 4K texture on a wall will look nice on a 1080p screen for sure, but it'll look nicer on a 4K screen, since the finer details aren't blotted out by resampling.
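You can see that resampling loss with a few lines of Pillow, using a synthetic one-pixel stripe pattern as a stand-in for fine texture detail (the pattern and sizes are my own assumptions):

```python
from PIL import Image

# Synthetic 'fine detail': a 4096px-wide texture of alternating 1px
# black/white stripes. At 4K output width the stripes survive; squeezed
# down to a 1920px (1080p) width, ~2 source pixels average into each
# output pixel and the stripes smear toward grey.

w, h = 4096, 64
tex = Image.new("L", (w, h))
tex.putdata([255 if x % 2 == 0 else 0 for _ in range(h) for x in range(w)])

small = tex.resize((1920, h), Image.LANCZOS)  # 1080p-width resample

print(tex.getpixel((0, 0)), tex.getpixel((1, 0)))      # 255 0 -> crisp stripes
print(small.getpixel((0, 0)), small.getpixel((1, 0)))  # values near mid-grey
```

The same detail exists in both source textures; the 1080p output just has nowhere to put the highest-frequency parts of it.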

There was an article floating around, I forget from whom but I want to say PC Gamer, showing off Rise of the Tomb Raider's menu screen in 4K and 1080p, and it noted how much more detail you could see in the 4K version, even though the 1080p one looked fine.
 
I would not call it bashing, more pointing out how little there is to gain for what can be a significant cost in FPS. That video isn't the only one; here's another from Hardware Unboxed, who are an excellent review channel:
View: https://youtu.be/f1n1sIQM5wc
 
@hotaru.hino @sizzling. Yup, you both bring up good points.

For me, running at ultra settings is just a side effect of the longevity factor I wanted when I bought my 6900 XT. In 5 years I'll definitely need to turn down some settings in the latest games, but I'd hate to see how an RTX 3060 12GB performs with the AAA titles of 2027.
As far as RT is concerned, the future may actually be brighter for AMD's 6000 cards than what we see currently, especially if Unreal Engine 5 and its Lumen-based RT really catch on.