FreeSync vs G-Sync

Carl Patrick

Nov 5, 2016
Are there any performance differences between FreeSync and G-Sync, or are they the same except that you can only use the respective manufacturer's card?
 
Solution
Found it!

Here you can see a video of the ghosting effect of motion blur. The phenomenon occurs because frames contain images of "transitions," which blur the picture. By strobing the monitor's backlight, this can be virtually eliminated.

https://youtu.be/hD5gjAs1A2s?t=38

Without the MBR technology enabled, you will see the alternating black and white images on screen at the same time. Turn on the MBR tech and it's virtually gone.

Here's a comparison of what you see when using motion blur reduction in a more typical viewing scenario:

https://frames-per-second.appspot.com/

Obviously the top image is far superior ... the bottom image will give ya a headache.

Whether it's worth the cost increase is a question that can only be answered by each individual ... kind of like choosing between a 6600K and a 6700K ... no doubt it's better, but not everyone will notice it and not everyone will care.
Whether or not there is a "very significant" performance difference is partly a semantic issue. However, if anyone tells you that they are "the same", they are misinformed.

But yes, FreeSync is cheaper because G-Sync offers you options and technology which FreeSync does not. nVidia requires a hardware module to be installed in every G-Sync compatible monitor. FreeSync is cheaper because it omits this module, and therefore you don't get to use Motion Blur Reduction technology on FreeSync monitors.

Yes, G-Sync and FreeSync are competing technologies; the main difference is that one is for one card manufacturer and the other is for the other.

But the more significant point is this:

G-Sync comes with a huge extra: there is a hardware module in every G-Sync compatible monitor which provides ULMB. This accounts for the lion's share of the cost difference between G-Sync and FreeSync. Both syncs are great technology, and I wouldn't build a system today without one or the other. Both do the job up to about 60 fps and work well up to about 75 fps ... at that point you may choose the nVidia option, which lets you turn OFF G-Sync and use ULMB, a motion blur reduction (MBR) technology. Given the choice, if the fps is there, I like ULMB.
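The reason strobing helps comes down to display persistence: on a normal sample-and-hold panel each frame stays lit for the whole refresh interval, so an eye tracking a moving object smears it across that interval. A rough back-of-envelope sketch of the standard approximation (perceived blur ≈ pixel speed × time the frame stays lit); the speed and pulse-width numbers here are illustrative assumptions, not measurements:

```python
# Rough sketch of perceived motion blur on a sample-and-hold display
# vs. a strobed (ULMB-style) backlight. Approximation: perceived blur
# in pixels ~= pixel speed (px/s) * time each frame stays lit (s).

def perceived_blur_px(speed_px_per_s: float, persistence_s: float) -> float:
    """Approximate eye-tracked blur width in pixels."""
    return speed_px_per_s * persistence_s

speed = 960.0  # assumed: object moving 960 px/s (half a 1920-wide screen per second)

# Sample-and-hold at 120 Hz: the frame is lit for the full 1/120 s.
sample_and_hold = perceived_blur_px(speed, 1 / 120)

# Strobed backlight: frame lit only ~1 ms per refresh (assumed pulse width).
strobed = perceived_blur_px(speed, 0.001)

print(f"sample-and-hold: {sample_and_hold:.2f} px of blur")  # -> 8.00 px
print(f"strobed (1 ms):  {strobed:.2f} px of blur")          # -> 0.96 px
```

With these assumed numbers the strobed backlight cuts the smear by roughly 8x, which matches the "virtually gone" impression described above.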

FreeSync has no competing technology, hence the difference in cost between the two. That's not to say that you can't buy a FreeSync monitor with MBR technology. When provided, the necessary hardware module is designed and supplied by the monitor manufacturer. As such, everyone does it differently, some better than others. Of course, like the nVidia option, this adds to the price of the monitor. When the Asus VG248QE came out, it was module compatible, but the module was not yet available. When it was, you could buy it ... but at $200, which included installation and shipping both ways, it wasn't a popular option. Obviously, economies of scale, mature production lines, and eliminating two-way shipping and the need to disassemble and reassemble the monitor cut the cost down substantially.

There's a site that shows how well MBR works ... I'll see if I can find it and, if I do, I'll edit the post. Read this in the meantime :)

http://www.tftcentral.co.uk/articles/variable_refresh.htm

 
FreeSync is an AMD-only option (at least at this point). G-Sync is an Nvidia-only option. They operate in basically the same way: they allow the monitor's refresh rate to be adjusted according to the frame rates the GPU is able to sustain. At this point the major issue is that if you want to take advantage of the technology, you must first be locked into one brand of GPU, either Nvidia or AMD. This is because G-Sync will not work with an AMD card and FreeSync will not work with Nvidia cards.

The performance difference between them is very small, and performance is also tied to the performance of the card. The higher the frame rate, the smoother the game should feel.
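The "refresh rate adjusted to the frame rate" idea above can be sketched in a few lines. This is a toy timing model, not either vendor's implementation: the frame render times are invented, and real VRR ranges and vsync behavior are more complicated. With a fixed 60 Hz refresh, a frame that misses a refresh tick waits for the next one, producing an uneven cadence; with adaptive sync, display timing simply tracks render timing:

```python
# Minimal sketch of why adaptive sync feels smoother than fixed refresh.
# With vsync on a fixed 60 Hz monitor, a finished frame waits for the
# next refresh tick; with adaptive sync (G-Sync / FreeSync) the monitor
# refreshes when the frame is ready. Render times below are invented.

FIXED_REFRESH = 1 / 60  # 60 Hz refresh interval, ~16.7 ms

frame_times = [0.014, 0.018, 0.016, 0.021, 0.015]  # GPU render times (s)

def fixed_refresh_display_times(times, refresh=FIXED_REFRESH):
    """Each frame appears at the first refresh tick after it finishes."""
    shown, t = [], 0.0
    for ft in times:
        t += ft
        ticks = -(-t // refresh)  # ceiling division: wait for next tick
        shown.append(ticks * refresh)
    return shown

def adaptive_sync_display_times(times):
    """The monitor refreshes as soon as each frame is done (within its
    supported range), so display time tracks render time exactly."""
    shown, t = [], 0.0
    for ft in times:
        t += ft
        shown.append(t)
    return shown

def intervals_ms(shown):
    """Frame-to-frame display intervals in milliseconds."""
    return [round((b - a) * 1000, 1) for a, b in zip([0.0] + shown, shown)]

print("fixed 60 Hz:   ", intervals_ms(fixed_refresh_display_times(frame_times)))
print("adaptive sync: ", intervals_ms(adaptive_sync_display_times(frame_times)))
# fixed 60 Hz:    [16.7, 16.7, 16.7, 33.3, 16.7]  <- one visible 33 ms hitch
# adaptive sync:  [14.0, 18.0, 16.0, 21.0, 15.0]  <- tracks render times
```

The 21 ms frame misses its 16.7 ms deadline on the fixed-refresh monitor and is held a full extra refresh (the 33.3 ms hitch), while adaptive sync just shows it 21 ms after the previous frame.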
 