[SOLVED] Changing graphics settings - performance vs quality tradeoffs

King_V

Illustrious
Ambassador
So, I know that a lot of the time, a video card's performance is measured at a specific resolution, at some "high" or "max" or "ultra" settings, to give you an idea of what the card can do.

But, what if you're trying to strike a balance?

Let's say you start with maximum settings on everything at your given resolution, but performance is less than you'd like.

Where would you start on turning down settings? Which settings tend to put a lot of load on the GPU without giving you a whole lot of visual benefit?

I wonder if there should be a sticky for this?

(also, I wasn't sure if this should go under PC Gaming, or Graphics Cards)

 

xxxlun4icexxx

Honorable
A lot of times it depends on the game. In some games, for instance, "shadow quality" or "effects quality" may put a very large burden on your GPU, while in other games the same setting is better optimized and doesn't take as much of a toll.

If I had to guess, I'd say Resolution Scale --> Ambient Occlusion --> Texture Quality --> AA are the biggest performance hitters universally, and they also make the biggest visual difference. That's not to say other settings don't have a big effect in specific titles, though, so perhaps you could try tweaking settings outside of these first and see if that has beneficial results.
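To make that trial-and-error a bit more systematic, here's a rough sketch of the idea in Python. It's hypothetical scaffolding, not any real tool: apply_and_benchmark stands in for however you actually apply a setting and measure average FPS (in-game benchmark, frame counter, whatever the title gives you).

LEVELS = ["ultra", "high", "medium", "low"]

# Rough universal priority order from above, biggest hitters first.
PRIORITY = ["resolution_scale", "ambient_occlusion",
            "texture_quality", "anti_aliasing"]

def tune(settings, target_fps, apply_and_benchmark):
    # Step each heavy hitter down one notch at a time until the
    # benchmark reports the target FPS, then stop touching things.
    for name in PRIORITY:
        while apply_and_benchmark(settings) < target_fps:
            idx = LEVELS.index(settings[name])
            if idx == len(LEVELS) - 1:
                break  # already at "low"; move on to the next setting
            settings[name] = LEVELS[idx + 1]
    return settings

The ordering just means you take the big frame-time wins first and only move down the list once a setting bottoms out at "low".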

And don't forget the settings that don't affect quality but attempt to increase performance (looking at you, triple buffering).
 

WildCard999

Titan
Moderator
I always figure the "balance" is roughly matching the FPS to your monitor's refresh rate, whether you can run everything on Ultra or have to dial it back to Low. Personally, I've always preferred smoother FPS over better graphics, but you can always meet in the middle. For example, my system runs Destiny 2 on Highest settings at 40-60 FPS, but my monitor is 75 Hz, so I lower it to High to keep the FPS between 60 and 75. FreeSync helps with any stuttering, but if the system struggled on High, I'd lower it to Medium.

Also, I feel it depends on the game. Even Destiny 2 looks good on Low, and the action can be so quick that you really don't have time to look around much anyway. Obviously, for slower-paced single-player games, turn up the graphics; a casual game would be fine to play at around 30-45 FPS.
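To put a number on "matching the refresh rate": the monitor gives you a fixed frame-time budget, and the FPS ranges above are just about keeping each frame inside it. A quick back-of-the-envelope in Python:

# Frame-time budget: to match a monitor, each frame has to finish
# within 1000 / refresh_rate milliseconds.
for hz in (60, 75, 144):
    print(f"{hz} Hz -> {1000 / hz:.1f} ms per frame")

# Output:
# 60 Hz -> 16.7 ms per frame
# 75 Hz -> 13.3 ms per frame
# 144 Hz -> 6.9 ms per frame

So a 75 Hz panel leaves roughly 13 ms per frame; whenever a settings preset can't consistently deliver that, adaptive sync (FreeSync here) smooths things over, or you drop a preset.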
 
Solution
