MrN1ce9uy :
^What about Nvidia Fast Sync?
MEH.
First of all, the game must be outputting (at minimum) just over 2x the FPS compared to the Hz of the monitor. For example, just over 120FPS if the monitor is 60Hz.
What happens is the GPU renders uncapped, then at each refresh the last FULLY COMPLETE frame is displayed and any others are discarded (at 180FPS on that 60Hz monitor you'd drop two full frames every refresh cycle).
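The frame-selection behavior above can be sketched in a few lines. This is a rough model of the idea, not Nvidia's actual driver logic; the function name and the "2x rule of thumb" framing are mine:

```python
# Rough sketch of Fast Sync-style frame selection (NOT Nvidia's implementation):
# the GPU renders uncapped, the newest complete frame is shown each refresh,
# and everything else rendered during that refresh interval is discarded.

def fastsync_discards_per_refresh(render_fps: float, monitor_hz: float) -> float:
    """Average frames thrown away per refresh when render_fps exceeds monitor_hz."""
    frames_per_refresh = render_fps / monitor_hz
    # One frame gets displayed each refresh; the remainder are discarded.
    return max(frames_per_refresh - 1.0, 0.0)

print(fastsync_discards_per_refresh(180, 60))  # 180FPS on 60Hz -> 2.0 frames dropped
print(fastsync_discards_per_refresh(120, 60))  # just over 2x -> ~1 frame dropped
```

That's why the latency win depends on FPS being well above the refresh rate: the more "extra" frames per refresh, the newer the one that actually gets shown.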
This reduces some of the LAG (latency). It's better than VSYNC ON, but not as good as GSYNC, nor is it as good as 120FPS on a 120Hz monitor.
I have a GTX1080 and experimented with several games but found I simply didn't notice the benefit. I tried COD MW2 (old game), toggling between 120FPS and 180FPS, but it really didn't feel much different from a locked 60FPS experience. Then again, I'm no "twitch shooter" player, so maybe that's just me.
Heck, I tried VSYNC OFF and was jumping over 200FPS at times, and that seemed about the same. I expected to see a lot of screen tearing but saw NONE (it's there, it just wasn't obvious)... weird, because I thought high FPS on a low-refresh monitor made tearing obvious, and high refresh with low FPS was good for hiding tears.
SUMMARY:
To be clear, the monitor itself still updates like normal (it always does unless GSYNC/Freesync is active), such as 60x per second, and Fast Sync still always uses VSYNC.
So the only difference is in choosing the frame output from the GPU to ensure it's the newest one possible. So no screen tearing, and slightly reduced lag (though I felt no obvious difference).
OTHER:
I laughed my ass off when AMD was explaining how it "switched" from Freesync to their version of Fast Sync (part of Enhanced Sync, or something). Why? Well, they explained it on a 144Hz monitor, so what scenario would you expect that to work in?
I mean, you'd have to be hitting over 288FPS for that portion to be active. You don't run an average of, say, 120FPS in a shooter and then "switch" over to that option. Not saying there's no point, but people need to understand how the technology works to use it correctly.
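Where that 288 comes from, using the same "just over 2x" rule of thumb from earlier (the function name and the 2x multiplier framing are mine, not AMD's spec):

```python
# Hypothetical sketch: FPS needed before a Fast Sync-style mode pays off,
# using the "just over 2x the refresh rate" rule of thumb from above.

def fps_needed_for_benefit(monitor_hz: float, multiplier: float = 2.0) -> float:
    """Rough minimum FPS before the fast-sync-style path is worth engaging."""
    return monitor_hz * multiplier

print(fps_needed_for_benefit(60))   # 60Hz monitor -> 120.0 FPS
print(fps_needed_for_benefit(144))  # 144Hz monitor -> 288.0 FPS
```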