Thank you for such a beautiful answer!
Back in the day, when you bought a new monitor, you just plugged it in with a VGA or HDMI cable. Now you have to buy extra stuff for extra options. This is new to me, if I understand correctly.
So the G-Sync technology is a bit older, and I just didn't notice it because of a missing cable?
Yes, the monitor has HDMI 2.0. It can also be set to 1.4, but I think that is worse.
Typical AMD marketing, making cheap stuff, like they did with the Athlon and Phenom.
So if I understand correctly, FreeSync is weaker than G-Sync?
What do you mean by "no latency, but keeps tearing away"?
No, your monitor doesn't have G-Sync, so it's not because you're missing a cable. VRR is a built-in feature of DisplayPort, so a G-Sync Compatible card can make use of it. FreeSync over DisplayPort is just corporate branding of an existing feature. I want to say G-Sync came out around 2013 in the form of an upgrade kit for a certain ASUS monitor. The first G-Sync monitor, the ASUS PG278Q (1440p 144 Hz), came out in July 2014. The first FreeSync monitor came out in December 2014, but it was 4K 60 Hz, so they didn't directly compete. 4K was pretty tough to run in 2014, so that made sense.
AMD's implementation over older HDMI is nearly a pure software solution, so there are processing penalties that lead to latency.
Total latency is a measure of how much time passes between the command to generate a frame and that frame actually being displayed. It's similar to input lag: how long an action you make takes to show up on the screen.
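To put a rough number on that, here's a toy breakdown (the stage names and values are made-up assumptions, just to show that total latency is a sum of small delays along the whole chain):

```python
# Rough sketch of "total latency" as a sum of pipeline stages.
# All numbers are illustrative assumptions, not measurements.

stages_ms = {
    "input sampling": 2.0,            # time until the game reads your input
    "simulation + render": 10.0,      # GPU generating the frame (~100 FPS here)
    "wait for refresh": 4.0,          # frame sits in the buffer until the monitor is ready
    "scanout + pixel response": 7.0,  # monitor actually drawing the frame
}

total = sum(stages_ms.values())
print(f"Total latency: {total:.1f} ms")  # -> Total latency: 23.0 ms
```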
I believe it was mentioned before, but true G-Sync monitors have an Nvidia-designed chip that runs the monitor, and a G-Sync GPU can control it directly. Since it's Nvidia all the way from the GPU through the monitor, they did everything they could to reduce the time it takes for the GPU to change the monitor's refresh rate and actually display an image. There's a lot of prediction involved in how many frames the GPU thinks it can make, with the refresh rate changed to match.
Fully completed frames are delivered to the monitor, and since the GPU commands the refresh rate, it only draws the frame at the start of a refresh cycle. This means a single frame per cycle, i.e. no tearing.
Without V-Sync, G-Sync, or FreeSync, the monitor simply draws whatever is in the frame buffer, whether that's one, two, or a dozen renders. You can see the line between different frames when this happens, which is known as tearing.
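If it helps, here's a toy simulation of that (the panel resolution and frame times are arbitrary assumptions): the monitor scans out line by line, but the GPU keeps swapping the buffer underneath it, so one refresh ends up containing pieces of several renders with a visible seam at each swap:

```python
# Toy simulation of tearing: the monitor scans out line by line from
# whatever frame is currently in the buffer, while the GPU swaps in
# new frames at its own pace. All timings are assumed example values.

SCREEN_LINES = 1080
LINE_TIME_MS = 16.7 / SCREEN_LINES  # 60 Hz panel: ~16.7 ms per refresh
GPU_FRAME_TIME_MS = 7.0             # GPU renders ~143 FPS, unsynced

frame_on_screen = []
for line in range(SCREEN_LINES):
    t = line * LINE_TIME_MS
    current_frame = int(t // GPU_FRAME_TIME_MS)  # buffer swapped mid-scan
    frame_on_screen.append(current_frame)

tears = sum(1 for a, b in zip(frame_on_screen, frame_on_screen[1:]) if a != b)
print(f"{tears} tear lines in a single refresh")  # one visible seam per swap
```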
Below 40 FPS, G-Sync keeps preventing tearing by showing the same frame multiple times (frame doubling). This effectively doubles the latency, since no change is made to the scene in the repeated refresh. FreeSync Premium and Premium Pro also have this capability, which AMD calls Low Framerate Compensation.
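A minimal sketch of that frame-repeating logic, assuming a 40-144 Hz VRR window (the exact range and heuristic differ per monitor and driver):

```python
# Sketch of the frame-doubling idea (Low Framerate Compensation).
# The 40-144 Hz VRR window is an assumed example range.

VRR_MIN_HZ, VRR_MAX_HZ = 40, 144

def refresh_for(fps: float) -> tuple[float, int]:
    """Pick a refresh rate inside the VRR window by repeating frames."""
    multiplier = 1
    while fps * multiplier < VRR_MIN_HZ:
        multiplier += 1  # show each frame one more time
    return fps * multiplier, multiplier

print(refresh_for(30))  # -> (60.0, 2): each frame shown twice
print(refresh_for(25))  # -> (50.0, 2)
print(refresh_for(12))  # -> (48.0, 4): each frame shown four times
```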
FreeSync is simpler: it uses the monitor's normal scaler. Over DisplayPort, the GPU can inform the monitor what refresh rate to use as it predicts how many frames it can deliver. FreeSync over HDMI is still hardware dependent; an additional chip is wired into the monitor's scaler to control the vertical blanking signal, but that extra software-to-hardware layer adds a little processing time.
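As a very rough sketch of what "the GPU informs the monitor what refresh rate to use" could look like (the averaging heuristic and the 40-144 Hz window are my own assumptions, not how any particular driver does it):

```python
# Sketch of the basic VRR idea on the GPU side: predict the next frame
# time from recent frames and set the monitor's refresh rate to match,
# clamped to the panel's supported window.

VRR_MIN_HZ, VRR_MAX_HZ = 40, 144

def next_refresh_hz(recent_frame_times_ms: list[float]) -> float:
    """Predict FPS from recent frame times and clamp to the VRR window."""
    avg_ms = sum(recent_frame_times_ms) / len(recent_frame_times_ms)
    predicted_fps = 1000.0 / avg_ms
    return max(VRR_MIN_HZ, min(VRR_MAX_HZ, predicted_fps))

print(next_refresh_hz([11.0, 12.5, 12.0]))  # ~84.5 Hz: monitor follows the GPU
print(next_refresh_hz([4.0, 4.2, 4.1]))     # clamped to 144 Hz
```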
Blind tests have shown that people can't really tell the difference, but the differences can be measured.