I've seen a few posts recently bashing the idea of running games at MAX settings so I thought I would give my take on the subject as well as get some feedback from you all.
Main idea - 'It's not worth it from a cost or performance-hit perspective since you usually can't tell the difference.'
Most of the time this is in reference to going from very high to ultra/max with the preset quality settings in AAA games. This can also include those who set the game to ultra/max and then go into the individual slider section and turn down a few settings to claw back 5, 10, or 20 FPS. Now I 100% agree with the idea that you usually can't tell the difference. In most of these games we are looking at a constantly moving image. Being able to pick out the lighting/shadow effects (or lack thereof) on a tiny palm tree off in the distance, that's only visible for three seconds, all while running around dodging bullets, is a fool's errand.
However, most (all?) of these discussions never touch on the main reason I go for a top end video card - LONGEVITY. One example of this longevity is the GTX 1080 Ti. If you bought a GTX 1080 Ti video card 5 years ago, you still have a very capable GPU today, even with the latest crop of AAA games at near max settings. I can't say the same for someone who bought the GTX 1060 6GB. For me, being able to play games at max settings now is just a nice side effect of buying a top end card that will probably still perform well 5+ years from now.
A note on texture resolutions.
I usually see the below video posted as anecdotal evidence for why 'gaming at max settings is dumb'.
View: https://www.youtube.com/watch?v=PjDgKXe8gxs
This video does make some good points that I agree with, but at 6:23 it states that 'higher resolution textures do look a lot better if you have the right screen for them'. This implies that a higher resolution screen is needed to see the difference in higher resolution textures.
I actually had to listen to it a few times to make sure I was hearing it right, because texture size and screen resolution have NOTHING to do with one another. What matters is how many screen pixels the textured surface covers, not how many pixels the screen has in total. Someone with a 1080p monitor can benefit from a 4K texture just as much as someone with a 4K monitor. For example, a 4K texture on the small mesh of a pencil sitting on a desk would be a waste at any resolution, but the difference between a 1K and 4K texture on a big patch of land right in front of you would be immediately noticeable to everyone - regardless of whether you are on a 1080p or 4K monitor.
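To put some rough numbers on that (my own toy example, not from the video): what decides whether a texture looks sharp is texels per screen pixel on the surface you're looking at. If a big patch of ground fills the width of a 1080p screen, it covers 1920 pixels no matter what, and a 1K texture simply doesn't have enough texels to feed them:

```python
# Toy sketch (assumed numbers, not from any specific game): compare how many
# texture texels are available per screen pixel for a surface that fills the
# screen width. Below ~1 texel per pixel, the texture is the bottleneck and a
# higher-resolution texture would visibly help - even on a 1080p monitor.

def texels_per_pixel(texture_width: int, screen_pixels_covered: int) -> float:
    """Texels available per screen pixel across the covered surface."""
    return texture_width / screen_pixels_covered

screen_1080p = 1920  # a big patch of land filling the screen width at 1080p

print(texels_per_pixel(1024, screen_1080p))  # 1K texture: ~0.53 -> blurry
print(texels_per_pixel(4096, screen_1080p))  # 4K texture: ~2.13 -> sharp
```

Flip it around and the pencil example falls out too: if the pencil only covers 50 screen pixels, even a 1K texture is already massive overkill.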