G-Sync with V-Sync on or off - input lag


Jonathan Cave

Honorable
Oct 17, 2013
I watched this video, which has left me slightly confused...

[video="https://www.youtube.com/watch?v=MzHxhjcE0eQ"][/video]

Should I run my games with V-Sync on or off to reduce input lag?

According to this research, if I run a 4K game like The Witcher 3 on ultra at ~45 FPS, I would benefit from reduced input lag with V-Sync off.

But isn't the point of having G-Sync that you can turn V-Sync off and let G-Sync do the work?

Games like CS:GO become unplayable with G-Sync on and V-Sync off due to the input lag. I've enabled V-Sync via the control panel and it improves things a lot.

edit: Turning G-Sync on and V-Sync on and setting fps_max to 60 results in buttery smooth gameplay with little to no input lag.
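For anyone wanting to replicate this, the cap can be set via the in-game console or a config file. This is just a sketch of the setup described above; whether you put it in an autoexec or type it in the console is up to you, and 60 was simply the value that worked for me:

```
// CS:GO console / autoexec.cfg - cap the frame rate below the refresh rate
fps_max 60
```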

All help appreciated!
 
Solution
If you turn on Gsync, the only thing the Vsync option in NVCP does is determine whether Vsync gets switched on or off when the game renders at a framerate above your monitor's maximum refresh rate. Below that point Gsync is on and Vsync doesn't even enter the equation.

This means that if you have Gsync enabled and you are not hitting max refresh rate, there should be zero difference with the Vsync setting on or off. Now if you are hitting the max refresh rate, turning the Vsync option on would increase your input lag, not decrease it, since that is one of the drawbacks of Vsync.

To minimize input lag, you should have either Gsync-on/Vsync-off or Gsync-off/Vsync-off.
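The interaction described above can be sketched as a tiny decision function. This is purely illustrative, not Nvidia's actual driver logic; the assumption (from the explanation above) is that with G-Sync on, the NVCP V-Sync toggle only matters once the frame rate reaches the monitor's max refresh rate:

```python
def effective_sync(fps, refresh_hz, gsync_on, vsync_on):
    """Illustrative sketch: which mechanism governs frame delivery.

    Assumes the behavior described above: below max refresh, G-Sync
    paces frames and the V-Sync toggle is irrelevant; at or above max
    refresh, the V-Sync toggle decides.
    """
    if gsync_on and fps < refresh_hz:
        return "gsync"      # G-Sync paces frames; V-Sync setting doesn't matter
    if vsync_on:
        return "vsync"      # locked to refresh rate, added input lag
    return "unsynced"       # tearing possible, lowest input lag


# 45 FPS on a 144 Hz monitor: G-Sync governs either way
print(effective_sync(45, 144, True, False))   # gsync
print(effective_sync(45, 144, True, True))    # gsync
# at max refresh, the V-Sync toggle takes over
print(effective_sync(150, 144, True, True))   # vsync
print(effective_sync(150, 144, True, False))  # unsynced
```

By this model, the 45 FPS charts in the thread should show no latency difference between the two V-Sync settings, which is exactly what makes the posted results puzzling.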


That actually depends on the situation. If the game does not use triple buffering and you do not have multiple GPUs, that is true, but there are games which use triple buffering, either by option or by default. You might also use a multi-GPU setup, which behaves like triple buffering (each GPU has a separate buffer along with the front buffer).
 
The above tests were completed on a single-GPU system. I understand the only setting they changed was enabling/disabling the V-Sync option in the NVCP (global settings), and triple buffering is disabled by default.

The ONLY difference between the charts below is the FPS and whether V-Sync is on or off.

IT MAKES NO SENSE THAT VSYNC OFF HAS LOWER LATENCY THAN WITH IT ON SPECIFICALLY AT 45 FPS ON THE SAME 144HZ MONITOR.

[Attached latency charts: screenshot_22.png, screenshot_21.png, screenshot_17.png, screenshot_18.png]
 


The higher-FPS charts showed the reverse effect of the lower-FPS ones.

There are other factors at play, namely the engine and the drivers. Nvidia does use pre-rendered frames by default, and the engine may be doing some odd things. Perhaps even the monitor plays a role in it.

Anyways, those charts are definitely inconsistent.
 
After a second look, I do have a theory. It is the very high FPS charts which do the reverse of what you'd expect, and that may be due to how Nvidia disables V-Sync at your refresh rate or higher FPS. Originally, Nvidia did not offer that as an option, and perhaps their workaround to add it (after AMD offered it) required some coding that just isn't very efficient.