FreeSync: AMD's Approach To Variable Refresh Rates

Status
Not open for further replies.

InvalidError

Titan
Moderator

Adaptive Sync does not need to be certified by either AMD or Nvidia; it is a VESA standard, and display manufacturers can certify compliance themselves. As long as a display follows the VESA DP 1.2a spec, it makes no difference who the GPU manufacturer is, what the details of their AdaptiveSync implementation on the GPU/driver side are, or what brand they market it under. All the display and the GPU output care about is that both sides follow the DP 1.2a/1.3 AdaptiveSync spec, and as long as they do, any AdaptiveSync display can be used with any AdaptiveSync GPU regardless of marketing brands.

AMD's additional FreeSync "certification" is entirely for marketing purposes. The simple fact that you thought AMD is involved in AdaptiveSync certification shows that you got caught up in the confusion as well.

If Nvidia decided to adopt AdaptiveSync, they could very well choose to market their GPU/driver implementation under the "G-Sync 2.0" brand and those hypothetical G-Sync 2.0 GPUs and Nvidia-approved displays would still work with FreeSync GPUs/displays or generic AdaptiveSync displays that carry neither marketing brands.
 

hannibal

Distinguished
However, it also has two major disadvantages:
1) Quality. There is no required level of quality for FreeSync other than that it can do some variable refresh: no min/max range, no anti-ghosting, no guarantees of behaviour outside the variable refresh range. It's very much buyer beware - most FreeSync displays have problems. This is very different from G-Sync, which requires a high level of quality - you can buy any G-Sync display and know it will work well.

2) Market share. There are far fewer FreeSync-enabled machines out there than G-Sync ones. Not only does Nvidia have most of the market, but most Nvidia graphics cards support G-Sync, while only a few of the newest Radeon cards support FreeSync, and sales of those cards have been weak. In addition, the high end, where people are most likely to spend extra on fancy monitors, is dominated by Nvidia, as is the whole gaming laptop market. Basically, there are too few potential sales for FreeSync for it to really take off unless Nvidia, or perhaps Intel, decides to support it.
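To make point 1 concrete, here is a minimal, hypothetical sketch of low-framerate compensation, the frame-repeating trick a driver can use to keep a variable-refresh panel inside its supported range when the game's frame rate drops below the panel's minimum. This is not any vendor's actual algorithm, and the 40-144 Hz range is an invented example monitor; it only illustrates why a guaranteed min/max range matters.

```python
# Illustrative sketch only (not AMD's or Nvidia's actual algorithm):
# keeping a variable-refresh panel inside its supported range by
# repeating frames. The 40-144 Hz range is a hypothetical example.

def effective_refresh(fps, vrr_min=40.0, vrr_max=144.0):
    """Return (panel_refresh_hz, times_each_frame_is_shown) for a game fps."""
    if fps > vrr_max:
        # Above the range: the panel simply caps at its maximum refresh.
        return vrr_max, 1
    repeats = 1
    refresh = fps
    # Below the range: show each frame 2x, 3x, ... until the resulting
    # refresh rate lands back inside [vrr_min, vrr_max].
    while refresh < vrr_min:
        repeats += 1
        refresh = fps * repeats
    return min(refresh, vrr_max), repeats

print(effective_refresh(100))  # (100, 1) - inside the range, shown 1:1
print(effective_refresh(25))   # (50, 2)  - each frame shown twice
print(effective_refresh(12))   # (48, 4)  - each frame shown four times
```

A panel without a published min/max range cannot promise this kind of graceful behaviour, which is the quoted poster's complaint.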

True, to some extent.
1) Not having strict requirements means we will get FreeSync monitors in many different price segments. It also means there are quite big differences in quality, so we need help from reliable tech sites that test FreeSync monitors: how much do I have to pay to get a decent monitor?

2) Market share. Because Intel has said it will also support Adaptive Sync, Intel and AMD have a huge market-share advantage over Nvidia.
Yes, Intel is no good as a gaming GPU maker, but it is still the biggest GPU maker in the world... Intel products also benefit more from cheap Adaptive Sync monitors than AMD or Nvidia do from G-Sync, simply because Intel's graphics are so slow that Adaptive Sync is the only way to get a usable image even in light games.
We gamers quite often forget that Intel leads the GPU segment worldwide, by a big margin...
 

mlee 2500

Honorable
Oct 20, 2014
I've found I can tolerate lower FPS with G-SYNC than without.

In other words, in a game where my threshold is around 40-50 fps without G-SYNC, I can go as low as 20-30 fps WITH G-SYNC with equal or greater satisfaction.
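This experience matches simple frame-time arithmetic. A rough illustration (the 60 Hz fixed refresh and steady 25 fps are just example numbers, not measurements from that setup): on a fixed-refresh display with vsync, a finished frame must wait for the next scanout tick, so a steady 25 fps game is presented with uneven intervals (judder), while a variable-refresh display shows each frame the moment it is ready, keeping intervals even.

```python
# Illustrative sketch only: why steady low fps can look smoother with
# variable refresh. On a fixed 60 Hz display with vsync, a finished
# frame waits for the next ~16.7 ms scanout tick; a VRR display
# refreshes as soon as the frame is ready.
import math

TICK_MS = 1000.0 / 60.0  # fixed 60 Hz scanout interval

def vsync_intervals(frame_time_ms, frames=6):
    """Presentation intervals (ms) when each frame waits for the next tick."""
    shown = [math.ceil(i * frame_time_ms / TICK_MS) * TICK_MS
             for i in range(1, frames + 1)]
    return [round(b - a, 1) for a, b in zip([0.0] + shown[:-1], shown)]

frame_time = 1000.0 / 25.0  # steady 25 fps -> 40 ms per frame
print(vsync_intervals(frame_time))   # uneven mix of 33.3 and 50.0 ms
print([round(frame_time, 1)] * 6)    # VRR: every interval a steady 40.0 ms
```

The alternating 33.3/50 ms cadence is what reads as stutter at low frame rates on a fixed-refresh panel, and it disappears once the panel follows the GPU.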

(GTX980 with Acer XB280HK monitor)
 

InvalidError

Titan
Moderator

Adaptive Sync is a unified standard: it is part of the DP spec. AMD just messed it up by loudly promoting their FreeSync marketing name for it.

Everyone so strongly associates AdaptiveSync with AMD, even though it is a VESA spec and AMD deserves no credit beyond being one of its earliest supporters and the first to market with GPU/driver-side support. That association makes it that much more difficult for Nvidia to change its stance on not supporting it.
 


From what I have read, Intel is going to support AdaptiveSync, so that should really help it gain a large market share.
 