FreeSync: AMD's Approach To Variable Refresh Rates

Status
Not open for further replies.

DbD2

Reputable
Apr 14, 2015
IMO FreeSync has two advantages over G-Sync:
1) Price. Requiring no additional hardware makes it relatively cheap, while G-Sync costs substantially more.

2) Ease of implementation. It is very easy for a monitor maker to do the basics and slap a FreeSync sticker on a monitor. G-Sync is obviously harder to add.

However, it also has two major disadvantages:
1) Quality. There is no required level of quality for FreeSync other than being able to do some variable refresh. No min/max range, no anti-ghosting, no guarantees of behaviour outside the variable refresh range. It's very much buyer beware - most FreeSync displays have problems. This is very different from G-Sync, which requires a high level of quality - you can buy any G-Sync display and know it will work well.

2) Market share. There are far fewer FreeSync-enabled machines out there than G-Sync ones. Not only does Nvidia have most of the market, but most Nvidia graphics cards support G-Sync. Only a few of the newest Radeon cards support FreeSync, and sales of those cards have been weak. In addition, the high end - where people are most likely to spend extra on fancy monitors - is dominated by Nvidia, as is the whole gaming laptop market. Basically, there are too few potential sales for FreeSync for it to really take off, unless Nvidia or perhaps Intel decides to support it.
 

InvalidError

Titan
Moderator
It sounds hilarious to me how some companies and representatives refuse to disclose certain details "for competitive reasons" when said details are either part of a standard that anyone interested in for whatever reason can get a copy of if they are willing to pay the ticket price, or can easily be determined by simply popping the cover on the physical product.
 

xenol

Distinguished
Jun 18, 2008
I still think the concept of V-Sync must die, because there's no real reason for it to exist anymore. There are no longer displays that require the precise timing V-Sync was created to serve. The only timing that should exist is the display's own limit on how quickly a pixel can transition.

It sounds hilarious to me how some companies and representatives refuse to disclose certain details "for competitive reasons" when said details are either part of a standard that anyone interested in for whatever reason can get a copy of if they are willing to pay the ticket price, or can easily be determined by simply popping the cover on the physical product.
Especially if it's supposedly an "open" standard.
 

nukemaster

Titan
Moderator
It sounds hilarious to me how some companies and representatives refuse to disclose certain details "for competitive reasons" when said details are either part of a standard that anyone interested in for whatever reason can get a copy of if they are willing to pay the ticket price, or can easily be determined by simply popping the cover on the physical product.
It was kind of the highlight of the article.

(Ed.: Next time, I'll make a mental note to open up the display and look before sending it back. Unfortunately, the display had been shipped back at the time we received this answer)
It is a shame that AMD is not pushing for more standardization on these FreeSync-enabled displays. A competitor to ULMB would also be nice to see for games that already run at steady frame rates.
 

InvalidError

Titan
Moderator

As stated in the article, modern LCDs still require some timing guarantees to drive pixels since the panel parameters to switch pixels from one brightness to another change depending on the time between refreshes. If you refresh the display at completely random intervals, you get random color errors due to fade, over-drive, under-drive, etc.

While LCDs may not technically require vsync in the traditional CRT sense where it was directly related to an internal electromagnetic process, they still have operational limits on how quickly, slowly or regularly they need to be refreshed to produce predictable results.

It would have been more technically correct to call those limits min/max frame times instead of variable refresh rates but at the end of the day, the relationship between the two is simply f=1/t, which makes them effectively interchangeable. Explaining things in terms of refresh rates is simply more intuitive for gamers since it is almost directly comparable to frames per second.
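The f = 1/t relationship is easy to see in code. A quick sketch (the 40-144 Hz window below is just an illustrative range, not any particular panel's spec) converting between refresh rates and the min/max frame times described above:

```python
# Refresh rate and frame time are two views of the same limit: f = 1/t.
def frame_time_ms(refresh_hz):
    """Frame time in milliseconds for a given refresh rate."""
    return 1000.0 / refresh_hz

def refresh_hz(frame_time):
    """Refresh rate implied by a frame time in milliseconds."""
    return 1000.0 / frame_time

# A hypothetical 40-144 Hz variable refresh window as min/max frame times:
print(frame_time_ms(144))  # minimum frame time, ~6.94 ms
print(frame_time_ms(40))   # maximum frame time, 25.0 ms
```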
 
Guest

Guest
Freesync will clearly win, as a $200 price difference isn't trivial for most of us.

Even if my card was nVidia, I'd get a freesync monitor. I'd rather have the money and not have the variable refresh rate technology.
 
Very interesting read. I never knew that variable refresh rates had effects on light strobing.

I wonder how adaptive sync and G sync will work when the new OLED monitors start hitting the market?

 

InvalidError

Titan
Moderator

Mostly unchanged, except for negligible (CRT-like) persistence and backlight strobing becoming history.
 

Chris Droste

Honorable
May 29, 2013
Freesync will clearly win, as a $200 price difference isn't trivial for most of us.

Even if my card was nVidia, I'd get a freesync monitor. I'd rather have the money and not have the variable refresh rate technology.
This. When the consumer is driving adoption of a new standard, price is king. Quality is mostly similar, and with modern gaming at 1200p, 1440p, 2160p, etc., there's no way 90% of gamers have enough computer to punch through the refresh ceiling on most current games and produce visible tearing. So as long as the standard keeps improving via AMD's QC revisions and the OEMs respond to observed or consumer-reported issues, even if all that extra work starts adding to the price, most people will still opt to pay $25-75 versus $200; that's just all there is to it. Unless nVidia gets in bed with the panel manufacturers (LG? Samsung? BenQ? AU Optronics?), this will be an uphill fight.

 

nukemaster

Titan
Moderator
The real issue with backlight strobing and VRR is that you cannot simply use the same on-time, because as the frame rate drops the picture would get darker.

Think of 1 ms on, 7 ms off at 120 Hz, then 1 ms on and 15 ms off at 60 Hz. That is a much longer time spent in the dark.

Since OLED has no backlight, they would have to adjust the pixel brightness (something that may or may not be hard to do while keeping the brightness some users want) to compensate for different frequencies.

On the plus side, backlight strobing makes everything look so nice as long as the motion is smooth, since on non-strobed, sample-and-hold backlights skips are partially hidden by our own perception.

Also, backlight strobing needs tweaking to make the most of it. BenQ's default implementation is not as good as LightBoost, but has better color. This may be because LightBoost expects 3D glasses and is designed to compensate for them.
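The brightness problem above comes down to duty cycle: the fraction of each refresh period the backlight is actually lit. A minimal sketch (an illustrative model only, not any monitor's firmware):

```python
def duty_cycle(pulse_ms, refresh_hz):
    """Fraction of each refresh period the backlight is on."""
    period_ms = 1000.0 / refresh_hz
    return pulse_ms / period_ms

# The same 1 ms strobe gives half the light at 60 Hz as at 120 Hz:
print(duty_cycle(1.0, 120))  # ~0.12
print(duty_cycle(1.0, 60))   # ~0.06, so the image dims as frame rate falls

# To hold perceived brightness constant, the pulse must scale with the period:
def compensated_pulse_ms(target_duty, refresh_hz):
    """Pulse width needed to keep a fixed duty cycle at any refresh rate."""
    return target_duty * (1000.0 / refresh_hz)
```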
 

AnimeMania

Honorable
Dec 8, 2014
Since supporting higher resolutions requires more expensive LCD scalers, FreeSync allows monitor manufacturers to choose whichever LCD scaler works for their monitor. G-Sync's one or two scalers for all monitors might be a disadvantage, since Nvidia produces its own scalers that have to support a wide range of resolutions from 1080p to 4K, along with those weird ultra-wide resolutions.
"Anything you can throw at it" solutions tend to be much more expensive than "just good enough to get the job done" solutions.
 
Guest

Guest


This is what I was thinking. I wonder why the article didn't mention this fix. +1 for this post.

I agree with the conclusion that G-Sync has the advantage now, but FreeSync will be in an advantageous position in a year or so.
 

SamSerious

Reputable
Dec 12, 2014
Another aspect (besides the solution of simply capping frame rates at 144 Hz) is missing from the article.

It is true that backlight strobing heavily reduces ghosting. I own an Asus VG278HE display. Although I use it with an AMD card (R9 380 4G) and not an nVidia one, I can manually activate strobing via the strobe utility (freeware, available on the Blur Busters blog). At the highest strobe setting, 10%, you get the impression of a static, CRT-like picture that is absolutely amazing. But you also get (at least I did, as did the friends who tried it) a headache quite fast. And if it's late and the display is the main light source in the room, everything flickers, of course. Just moving your hand in front of the screen is very irritating.
Some companies, like BenQ with their Flicker-Free advertising, build monitors with a non-PWM backlight (dimming the LEDs via current control), which is a lot friendlier to the eye, especially for office use. However, it may affect color neutrality when dimming the screen to very low brightness, and therefore it won't be able to go as dim as a PWM-controlled backlight. In the enthusiast flashlight world, PWM-controlled dimming of LEDs is considered a cheap and awful solution.

What I want to say is that strobing works great and is absolutely stunning, but if you tend to get headaches from things like that, better try it out before buying your GPU and screen just for that feature. For office tasks you will love a non-flickering backlight, that's for sure.

Apart from that, this is just another great article on tomshw that's definitely worth reading.
 

TNT27

Reputable
Oct 12, 2014
I played around with G-Sync for a while; I think all this crap is overrated and not worth a penny. Same thing with mechanical keyboards - hell, old membrane keyboards had faster response times. Neither seems to make any difference in gameplay for me.
 

alextheblue

Distinguished
Apr 3, 2001
I played around with G-Sync for a while; I think all this crap is overrated and not worth a penny. Same thing with mechanical keyboards - hell, old membrane keyboards had faster response times. Neither seems to make any difference in gameplay for me.
Go home TNT, you're drunk.
 

Verrin

Distinguished
Jan 11, 2009
I have an Acer XG270HU with the most up-to-date firmware. I got an exchange with Acer when I learned that the OD settings did not function (I was getting ghosting galore). But with everything working as it should, I can confirm that I get absolutely no ghosting with OD settings on normal and extreme. Also, in every game I've tested, if you enable vsync the game will ignore the fact that it's enabled unless you go over the maximum refresh rate of the panel. So FreeSync works fine but will cap it at 144Hz.
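The observation above, that vsync only kicks in at the top of the range, matches how the VRR window generally behaves. A rough sketch (the 40 Hz lower bound is a hypothetical value for illustration; actual windows vary by panel):

```python
def display_refresh_hz(gpu_fps, vrr_min=40, vrr_max=144):
    """Simplified model of refresh behavior with FreeSync plus vsync enabled."""
    if gpu_fps >= vrr_max:
        return vrr_max   # vsync caps output at the panel's maximum
    if gpu_fps < vrr_min:
        return vrr_min   # below the window the panel keeps its minimum rate (frames repeat)
    return gpu_fps       # inside the window, refresh simply tracks the frame rate

print(display_refresh_hz(200))  # 144: capped by vsync
print(display_refresh_hz(90))   # 90: VRR tracks the GPU
```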
 

Compuser10165

Honorable
Sep 12, 2015
Maybe in the not-so-distant future neither of these technologies will be supported, and a new open standard (something between the two) with no proprietary owner will be implemented instead, though I don't know how likely that is. In my opinion that would be the best outcome.
 

InvalidError

Titan
Moderator

AdaptiveSync is already an open standard: an official VESA extension of the DP 1.2 spec.

FreeSync is just an AMD branding initiative to give themselves control over marketing: only displays that meet AMD's requirements get to market themselves as FreeSync, when technically any display that implements the underlying spec will also work without the manufacturer having to pay AMD's trademark toll.

Nvidia could just as easily implement their own AdaptiveSync support, either as generic AdaptiveSync support as AMD should have done or trademarked under their own house-brand to further confuddle the market like AMD did.

IMO, the FreeSync brand made things more confusing than they should have been. As far as AdaptiveSync displays are concerned, it means nothing more than "AMD-Approved."
 

Compuser10165

Honorable
Sep 12, 2015


I know that; I just think it is possible for a new standard to emerge (maybe on a different technology) that solves all the problems with adaptive sync, certified by neither AMD nor Nvidia (maybe by another company), or certified by both at no extra cost to the manufacturers.
 