Questions about G-Sync and setting the refresh rate

photon123

Distinguished
Nov 12, 2010
I just bought a G-Sync capable monitor, an ASUS XG27UQ. I am trying to understand what I should set my refresh rate to. With my graphics card, a GTX 1080, the monitor supports up to 98Hz with HDR or 144Hz with SDR.
What happens if I set the refresh rate to 98Hz and watch a 30fps or 60fps video on YouTube or Netflix? Will it:
  1. Display the video at its original frame rate?
  2. Display the video at 98Hz with uneven time each frame is shown? (bad)
  3. Interpolate the video to 98Hz?
  4. Display the video at 98Hz with tearing? (very bad)
What should I set my refresh rate to? In case 2 or 4, it would probably be best to leave it at 60Hz. Otherwise, 98Hz should be fine, right?

Also, with G-Sync enabled, what does enabling/disabling V-Sync do in games?

Eximo

Titan
Ambassador
Video is not being 'rendered' by the GPU, just displayed. Just like when you have a 60hz monitor playing a 24 or 30 FPS video it shows the same frame multiple times. Tearing shouldn't be evident.
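To make that frame-repeat behavior concrete, here is a rough sketch (my own illustration, not from the thread, assuming the display simply holds the most recent frame with no interpolation) of how many refreshes each video frame occupies:

```python
# Sketch: how many consecutive refreshes each video frame occupies on a
# fixed-refresh display that just repeats the latest frame (no interpolation).
def repeat_pattern(fps, hz, frames=10):
    counts = []
    for i in range(frames):
        start = -(-i * hz // fps)        # ceil(i*hz/fps): first refresh showing frame i
        end = -(-(i + 1) * hz // fps)    # first refresh showing frame i+1
        counts.append(end - start)
    return counts

print(repeat_pattern(24, 60))  # [3, 2, 3, 2, ...]: the classic 3:2 pulldown cadence
print(repeat_pattern(30, 60))  # [2, 2, 2, ...]: every frame shown exactly twice
```

When the refresh rate is an exact multiple of the frame rate, every frame repeats the same number of times; otherwise the repeat counts alternate.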

Your refresh rate should be set to whatever you want it to be.

With G-Sync enabled, setting V-Sync on will cap your FPS at the monitor's refresh rate. With it off, when the FPS exceeds the refresh rate, syncing simply turns off (so tearing can return); when the FPS drops back below the refresh rate, G-Sync re-engages.

photon123

Video is not being 'rendered' by the GPU, just displayed. Just like when you have a 60hz monitor playing a 24 or 30 FPS video it shows the same frame multiple times.
That would be bad and should add some judder to the video. Are you sure this is what is happening? Better solutions would be either to adapt the display refresh rate to the video's rate (or a multiple of it), or possibly to use some form of interpolation.
If this is the case, I wonder if it wouldn't be better to set the refresh rate to 60Hz, and set it higher only when playing games.
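For what it's worth, a quick back-of-the-envelope sketch (my own, assuming plain frame repetition at a fixed refresh, no rate adaptation) shows what the uneven hold times look like for 30fps content at 98Hz versus 60Hz:

```python
# Sketch: on-screen time (ms) of each video frame when a fixed-refresh
# display simply repeats frames (assumed: no rate adaptation, no interpolation).
def hold_times_ms(fps, hz, frames=6):
    times = []
    for i in range(frames):
        start = -(-i * hz // fps)        # ceil: first refresh showing frame i
        end = -(-(i + 1) * hz // fps)    # first refresh showing frame i+1
        times.append(round((end - start) * 1000 / hz, 1))
    return times

print(hold_times_ms(30, 98))  # alternating ~40.8 ms and ~30.6 ms holds -> judder
print(hold_times_ms(30, 60))  # a uniform 33.3 ms per frame
```

At 98Hz the nominal 33.3 ms frame time is replaced by a 4-3-3 refresh cadence, which is the judder being discussed.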

Eximo

Not sure what to tell you. This has always been a thing.

24 fps films were shown on 60Hz North American televisions, and American TV was regularly shot at 30 FPS (i.e., the "soap opera" look).
25 fps TV and films properly shown on 50hz European Televisions.

144Hz became a standard partly because it is a multiple of 24. 120 is also a multiple of 24, which is why a lot of TVs are '120Hz' or '240Hz'.

Interpolation is one method they use, very common on TVs (I usually turn it off).

If you are watching 30/60 FPS content, then 60 or 120Hz is the way to go; there shouldn't be any uneven frame doubling or anything odd there. 98Hz would be odd for pretty much everything but 25 FPS TV (assuming that figure is just a rounding artifact of roughly 100Hz).
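That rule of thumb is easy to check: a refresh rate gives perfectly even frame repetition only when it's an integer multiple of the content's frame rate. A quick sketch of my own:

```python
# Sketch: which common content frame rates divide evenly into each refresh rate.
content_fps = [24, 25, 30, 60]
for hz in (60, 98, 120, 144):
    even = [fps for fps in content_fps if hz % fps == 0]
    print(f"{hz} Hz evenly fits: {even}")
```

This prints that 60Hz evenly fits 30/60 FPS, 120Hz fits 24/30/60 FPS, 144Hz fits only 24 FPS, and 98Hz fits none of them, which is why 98Hz is an awkward rate for video.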

In truth, the 60Hz American TVs are actually 59.94Hz or so. There was a technical reason for that which escapes me. It's why you commonly see 59Hz as an option in display settings.

photon123

Distinguished
Nov 12, 2010
25
1
18,535
0
Right now my options for HDR are 60Hz, 80Hz and 98Hz. If I want to go higher, I need to either disable HDR or buy a new graphics card. Considering that I already have a GTX 1080, and given the crazy prices of graphics cards right now, buying a new one seems like a waste of a lot of money and is not really an option.
So I wonder if I should just set it at 60Hz for normal use, since 30 and 60 FPS videos are very common, and change it to 98Hz only for games, which I barely play lately, if at all.