[SOLVED] V-Sync G-Sync and Free-Sync questions?

Apr 7, 2021
Hello,
Sorry if this has been asked before... A lot of video game tweak guides recommend using V-Sync with triple buffering. I understand this prevents screen tearing, but it also locks the frame rate at 60 FPS (on a 60 Hz monitor) and creates input lag/stutters.
What does FreeSync do? Does it both prevent screen tearing and minimize input lag/stutters by letting the monitor match the GPU's frame rate (by varying the refresh frequency)?
Is FreeSync the same as G-Sync? The guide at the top of the forum says they are different, but it was written in 2017. Some research online suggests they are now compatible...?

I am trying to assess whether there would be an advantage in getting a monitor with FreeSync (or G-Sync), in particular to get rid of some stutters... I have an Nvidia GPU (RTX 3070) that I am trying to use to its full potential...
Thank you for enlightening me.
 

Eximo

Titan
Ambassador
Yes, all types of sync incur some input lag penalty, but the end result is no tearing, which can be very distracting when the FPS and refresh rate line up just wrong and put the tear right in the middle of the screen.

What you probably want to hear:
G-Sync Compatible
Nvidia finally caved and allows a FreeSync monitor connected over DisplayPort to enable Variable Refresh Rate with Nvidia graphics cards. A FreeSync monitor with only HDMI will not work.

General facts:
G-Sync uses Nvidia hardware in the monitor's scaler. This basically puts the monitor in charge and reduces input lag as much as possible, which is why G-Sync monitors are more expensive.
FreeSync is an extension of the Adaptive-Sync protocol found in the DisplayPort 1.3 and 1.4 specifications. It is software based. AMD also enabled VRR over HDMI using similar methods. Cheaper FreeSync monitors often come with HDMI only.

G-Sync: original G-Sync module, DisplayPort only, 60-144 Hz (below 40 FPS the system drops into Adaptive V-Sync, which starts doubling up frames; on a 60 Hz panel it is basically just V-Sync at that point)
G-Sync version 2: still DisplayPort only for G-Sync, but an HDMI port was added. 40-165 Hz, later 240 Hz at 1080p
FreeSync generally has a narrower VRR range than G-Sync monitors, though it depends on the model, so tearing may resume once the frame rate drops below the range. It does not fall back to Adaptive V-Sync, if I understand correctly.

FreeSync Premium / G-Sync Ultimate
These are newer standards that support VRR at very high resolutions like 4K 144 Hz and offer support for HDR and/or motion blur reduction. At the moment they command quite a premium price for larger displays.
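The frame-doubling behavior described above (the module repeating frames once the game drops below the panel's minimum VRR rate) can be sketched roughly like this. This is an illustration only, not a real driver API, and the range numbers are hypothetical:

```python
# Sketch of low-framerate handling on a VRR display (illustrative only).
# Below the panel's minimum VRR rate, the module can repeat each frame
# (doubling, tripling, ...) so the effective refresh stays inside the range.

def effective_refresh(fps: float, vrr_min: float, vrr_max: float) -> float:
    """Return the refresh rate the panel would actually run at (simplified)."""
    if fps > vrr_max:
        return vrr_max          # capped at the top of the range
    refresh = fps
    while refresh < vrr_min:    # frame doubling below the range
        refresh *= 2
    return refresh

print(effective_refresh(30, 40, 144))   # 30 FPS -> panel runs at 60 Hz, each frame shown twice
print(effective_refresh(90, 40, 144))   # inside the range -> 90 Hz
```

A FreeSync panel without this kind of compensation would instead simply exit VRR below the range, which is where tearing can resume.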
 
Solution

Eximo

Titan
Ambassador
Stutters would be more a result of some GPU or system performance issue. Running a 60 Hz monitor above or below 60 FPS will simply result in a visible line where one frame ends and the next begins: basically the monitor receiving whatever is in the frame buffer at the time it sets a pixel.

V-Sync forces the GPU to draw an entire frame before sending it, and waits for the beginning of the refresh cycle.
Triple buffering stores up multiple frames so that, essentially, there is no interruption if the GPU has a problem.
The G-Sync module and an Nvidia GPU work together to somewhat predict the GPU's capability and upcoming needs. If the GPU's performance starts to drop, the module reacts by reducing the refresh rate and waits for the GPU to deliver the next completed frame.
FreeSync operates much the same way, except that it is executed in software, which is slower, but in a practical sense most people can't tell the difference, particularly at high refresh rates. If the GPU doesn't have a frame ready, the monitor slows down.
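The triple-buffering idea above can be sketched as a small frame queue: the GPU keeps finishing frames into spare buffers while the display always scans out the newest completed one. A toy sketch (real swap chains live in the driver; the queue depth and names here are hypothetical):

```python
from collections import deque

# Toy triple-buffering sketch: a 3-deep queue of completed frames.
# The GPU appends finished frames; if it runs ahead, the oldest is dropped.
buffers = deque(maxlen=3)

def gpu_finish_frame(frame_id):
    buffers.append(frame_id)    # oldest frame silently discarded when full

def display_scanout():
    # The display always shows the most recently completed frame,
    # so a slow GPU frame never leaves it with nothing to present.
    return buffers[-1] if buffers else None

for f in range(5):
    gpu_finish_frame(f)
print(display_scanout())        # newest frame wins; frames 0 and 1 were dropped
```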

Since all of that happens in milliseconds, and a typical 60 Hz monitor has 16.7 ms per frame, there is nothing really to lose.
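That 16.7 ms figure is just the refresh period, 1000 ms divided by the refresh rate; a quick sketch of the budget at a few common rates:

```python
# Frame-time budget: a display at H hertz gives the GPU 1000/H milliseconds per frame.
def frame_budget_ms(hz: float) -> float:
    return 1000 / hz

for hz in (60, 144, 240):
    print(f"{hz} Hz -> {frame_budget_ms(hz):.1f} ms per frame")
# 60 Hz -> 16.7 ms, 144 Hz -> 6.9 ms, 240 Hz -> 4.2 ms
```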

Pro gamers will still run unsynced and push as many FPS as possible, since this updates the information on screen that much faster.
 
As you noted, V-Sync locks your game output to the monitor's refresh rate in order to prevent screen tearing. If you can't keep a steady 60 FPS output, the result is effectively a halving of FPS.
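The halving happens because a double-buffered V-Sync frame that misses the refresh deadline has to wait a whole extra refresh interval, so output snaps to 60/n FPS. A minimal sketch of that arithmetic (simplified; it ignores triple buffering):

```python
import math

# Double-buffered V-Sync: each frame occupies a whole number of refresh
# intervals, so effective output is refresh_hz / n for some integer n.
def vsync_fps(render_ms: float, refresh_hz: int = 60) -> float:
    period_ms = 1000 / refresh_hz
    intervals = math.ceil(render_ms / period_ms)   # refreshes the frame occupies
    return refresh_hz / intervals

print(vsync_fps(15.0))   # fits inside 16.7 ms -> 60.0 FPS
print(vsync_fps(18.0))   # just misses the deadline -> 30.0 FPS
```

This is exactly the cliff VRR removes: at 18 ms per frame a VRR display would simply run at ~55 Hz instead of dropping to 30.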

Gsync was NVIDIA's attempt to address this. Rather than having a fixed refresh rate, Gsync displays instead refresh whenever they receive a new frame (within defined limits). Basically, this eliminates all refresh-related lag, since the display processes and displays the frame as soon as it arrives. Note that NVIDIA accomplished this via a HW module, which added some additional cost to capable displays.

Freesync was AMD's alternative. It accomplishes basically the same thing as Gsync entirely in software, using an optional part of the DisplayPort standard. There have been a couple of iterative updates to Freesync, but note that it does still remain optional. Freesync over HDMI implements the same mechanism so it works over an HDMI connection.

Gsync-Compatible displays are displays that do not support native Gsync (i.e. they lack the HW module) but support a mode of VRR that NVIDIA considers to be just as good as Gsync (i.e. the display's VRR capabilities match those of displays that include the Gsync HW module).

HDMI-VRR is the HDMI Forum's attempt to get in on the VRR game. It is an optional part of the HDMI 2.1 specification, but most of the displays released thus far either support it or plan to add it via a firmware update. Note that NVIDIA considers many of these "Gsync Compatible" (such as LG's TVs going back to 2019).

What really matters is less the implementation (aside from the note that Gsync is only supported by NVIDIA, while all the other modes are supported by both NVIDIA and AMD) than the refresh window. Fall outside the window and you lose the benefits of VRR. Some displays have wide windows (10-120Hz), others much smaller ones (I've seen a few in the 40-60Hz range).
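The window check itself is trivial, which makes it easy to reason about whether a given game will stay in VRR on a given display. A small sketch with hypothetical window values:

```python
# VRR only helps while the frame rate sits inside the display's window.
def vrr_active(fps: float, window: tuple) -> bool:
    lo, hi = window
    return lo <= fps <= hi

print(vrr_active(55, (40, 60)))    # True: inside a narrow 40-60 Hz window
print(vrr_active(35, (40, 60)))    # False: below the window, tearing/stutter returns
print(vrr_active(35, (10, 120)))   # True: a wide window keeps VRR engaged
```

In other words, a display with a 40-60 Hz window gives you VRR only in a fairly small band, while a 10-120 Hz window covers almost any realistic frame rate.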
 