News: VESA's Adaptive-Sync Certification Could Kill FreeSync, G-Sync Branding

If the testing results in implementations of VRR that are as good as or better than either of those two, then sure. But considering how long G-Sync has survived and is still being included on higher-end monitors, I don't think it'll go away any time soon.

Though honestly VESA should've done this from the start, or at least the branding of it.
 
  • Like
Reactions: renz496 and King_V

King_V

Illustrious
Ambassador
Agreed overall, but I do think this is, ultimately, going to push G-Sync aside.

Why spend the extra money on the production side, or the extra money on the consumer's price side, for something that doesn't give any extra benefit?
 
Ever since Nvidia quit with the nonsense GSync modules (admittedly necessary to achieve the functionality for the ~1 year before significant market adoption of VRR, hence FreeSync), they've been using VESA adaptive sync, right? The whole "GSync Compatible" charade was just Nvidia's way of directing revenue to their most loyal monitor manufacturers, when in reality, the vast majority of VESA adaptive sync [FreeSync] monitors could work on Nvidia 10xx and higher GPUs over DP from the start (Jan 2019).

FreeSync was always VESA adaptive sync AFAIK, but with the additional [proprietary] legwork by AMD of including VRR over HDMI (which is now a non-issue for monitors that have HDMI 2.1).

Honestly, it seems long overdue for everything to just fall under VESA adaptive sync.
 
Ever since Nvidia quit with the nonsense GSync modules (admittedly necessary to achieve the functionality for the ~1 year before significant market adoption of VRR, hence FreeSync), they've been using VESA adaptive sync, right? The whole "GSync Compatible" charade was just Nvidia's way of directing revenue to their most loyal monitor manufacturers, when in reality, the vast majority of VESA adaptive sync [FreeSync] monitors could work on Nvidia 10xx and higher GPUs over DP from the start (Jan 2019).
G-Sync offers a few features that FreeSync doesn't provide, and BlurBusters have noticed that G-Sync tends to provide a better experience. Also, NVIDIA likely has a stringent qualification process because G-Sync is positioned as a premium feature. However, on monitors themselves, I believe the G-Sync module is responsible for driving the display with regards to timings, pixel overdrive, and whatnot, and is not just something there to facilitate commands from an NVIDIA GPU.

Though on laptops it's different. It's actually using VESA standards, but NVIDIA is probably still making laptop makers go through a more stringent qualification before they can slap on the G-Sync badge.

The main reason we got the Adaptive Sync rebrand nonsense is AMD choosing to house-brand Adaptive Sync for marketing purposes against Nvidia's proprietary Gsync instead of promoting the generic name that Nvidia could pick up at any time.
I subscribe to the "conspiracy" that AMD's marketers were hoping to bank on people's ignorance, riding on another group's work so people think AMD invented it or something and released it royalty free for PR points.
 

parkerthon

Distinguished
Jan 3, 2011
69
67
18,610
Agreed overall, but I do think this is, ultimately, going to push G-Sync aside.

Why spend the extra money on the production side, or the extra money on the consumer's price side, for something that doesn't give any extra benefit?

I think there's actually a big difference between G-Sync certified monitors and "compatible" monitors that are otherwise AMD certified, which is a fairly flexible and easy certification to get.

I don't know if it's a GSync thing, but the adaptive refresh on G-Sync has been a superior experience for me vs monitors with the GSync Compatible version. In my experience, if a game has microstutter or any other kind of hitching issue, which is not an uncommon problem, having GSync enabled on a "compatible" monitor is a poor experience. I found that buying a GSync premium monitor made the issue go away, even if that is cost-prohibitive for many. Apparently it has to do with the floor refresh rate being too high, as microstutter (per my recollection researching the issue at the time) causes your fps to drop very quickly before rebounding.

That's my only concern about this certification: that now we get an inferior certification (e.g. I feel a 48Hz minimum is too high; it should be a minimum of 24Hz) which will quickly become a base standard and nothing more (similar to how HDR has evolved over the years into competing "premium" standards). I do see the potential for better validation of claimed performance specs by different display vendors, but that doesn't help me much, as I never buy anything without seeing someone knowledgeable and geared up measure hard specs on a random sample unit first. So all in all, the standard is probably only an improvement for consumers shopping at the lower end of the gaming monitor segment, which is cool all the same. I just wish they offered a premium set of specs as well, and that they refresh the standard frequently.

Meanwhile, application performance is the other side of this puzzle. I'd imagine if this standard were, well, actually a widely adopted standard, developers would test more thoroughly for issues that impact performance on compatible displays. I could see far fewer sync-related issues in games, especially with indie games and new releases. So that's a pretty good upside as well.
 
  • Like
Reactions: Why_Me and renz496
The main reason we got the Adaptive Sync rebrand nonsense is AMD choosing to house-brand Adaptive Sync for marketing purposes against Nvidia's proprietary Gsync instead of promoting the generic name that Nvidia could pick up at any time.
This is an interesting hot take on the events.

Then why did nVidia create the "tiers" of GSync instead of just saying "VRR compatible" or something? AMD created FreeSync to market their own "standard measurement" of monitor quality around VRR. You can make any judgement on its quality, but nVidia did the same exact thing and went a step further with the module so monitors could pass all their criteria for their VRR implementation.

Could AMD have pushed VESA to get this out faster? Maybe. Could nVidia have done it as well? Absolutely. Was it in either's best business interest? LOL, no, and you know it.

Don't go giving that hot take without understanding how both AMD and nVidia didn't want VESA to get this out too fast/soon, since they'd have had to either push for a better certification (so it actually means something), or just phase their branding out after investing in marketing for it.

TL;DR: you pinning this on AMD is disingenuous, to say the least.

Regards.
 

Sergei Tachenov

Commendable
Jan 22, 2021
64
64
1,610
What about variable overdrive?

It’s no secret that the best overdrive settings differ for different refresh rates. This is especially true for high refresh monitors.

Native G-Sync monitors adjust overdrive automatically, so I don’t care whether my game runs at 60 FPS, 90 FPS or 200 FPS. With a G-Sync Compatible monitor I’ll either have to pick a compromise setting or change the setting manually. There are some G-Sync Compatible monitors that try to imitate variable OD, but they do it by simply switching automatically between the OD presets, which leads to a very poor experience if a game’s frame rate happens to constantly fluctuate from one preset range to another.
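To make that difference concrete, here is a small illustrative Python sketch, not any vendor's actual algorithm: the preset thresholds and overdrive strength values are invented for the example. It contrasts discrete preset switching with a continuously interpolated overdrive strength.

```python
# Toy sketch only: contrast discrete overdrive presets (as described for some
# "G-Sync Compatible" monitors above) with continuously interpolated overdrive.
# All thresholds and strength values are invented for illustration.

PRESETS = [          # (refresh threshold in Hz, overdrive strength)
    (0,   20),       # low-refresh tuning
    (90,  60),       # "Normal" preset
    (144, 90),       # "Extreme" preset
]

def preset_overdrive(refresh_hz: float) -> int:
    """Pick the strongest preset whose threshold the current refresh rate meets."""
    strength = PRESETS[0][1]
    for threshold, value in PRESETS:
        if refresh_hz >= threshold:
            strength = value
    return strength

def variable_overdrive(refresh_hz: float) -> float:
    """Linearly interpolate overdrive strength between a 48 Hz floor and a 240 Hz ceiling."""
    lo_hz, hi_hz, lo_od, hi_od = 48.0, 240.0, 10.0, 95.0
    t = (min(max(refresh_hz, lo_hz), hi_hz) - lo_hz) / (hi_hz - lo_hz)
    return lo_od + t * (hi_od - lo_od)

# A frame rate oscillating around a preset boundary (85-95 FPS here) makes the
# preset scheme jump between 20 and 60, while the interpolated value barely moves.
for fps in (85, 92, 88, 95, 86):
    print(fps, preset_overdrive(fps), round(variable_overdrive(fps), 1))
```

A frame rate that hovers around a preset boundary flips the discrete scheme between two very different tunings every few frames, which is the fluctuation problem described above.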

That being said, however, on high refresh monitors screen tearing is next to invisible anyway, so maybe we should just ditch VRR completely once mainstream monitors move into the 200+ Hz territory?
 
That being said, however, on high refresh monitors screen tearing is next to invisible anyway, so maybe we should just ditch VRR completely once mainstream monitors move into the 200+ Hz territory?
VRR isn't solely to prevent screen tearing. It can also make the system more efficient. Sure it might not matter for a desktop, but for a laptop, tablet, or phone, every watt counts.
 

InvalidError

Titan
Moderator
TL;DR: you pinning this on AMD is disingenuous, to say the least.
The only thing I'm pinning on AMD is the completely unnecessary FreeSync branding as the basic tier requirements add absolutely nothing to baseline Adaptive Sync besides AMD's marketing blessing.

Nvidia's Gsync came to market over a year before Adaptive Sync became a standard and requires proprietary scaler hardware in the monitor that adds ~$200 to the cost of monitors that will only ever work as intended with Nvidia GPUs.
 
https://www.tomshardware.com/features/gsync-vs-freesync-nvidia-amd-monitor

I don't know if it's a GSync thing, but the adaptive refresh on G-Sync has been a superior experience for me vs monitors with the GSync Compatible version. Apparently it has to do with the floor refresh rate being too high, as microstutter (per my recollection researching the issue at the time) causes your fps to drop very quickly before rebounding. That's my only concern about this certification: that now we get an inferior certification (e.g. I feel a 48Hz minimum is too high; it should be a minimum of 24Hz)
Low Framerate Compensation mitigates the 48Hz floor issue. That's what the GSync module does also (although I think that kicks in under 30FPS?). The major reason for the 48Hz floor is backlight flicker induced when the refresh rate goes too low, but also because that's probably a more "common" floor target for acceptable image retention on the majority of panels compared to the premium panels the GSync module monitors used.
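For anyone unfamiliar with how LFC works in principle, here's a minimal sketch. The thresholds are illustrative assumptions, not AMD's or Nvidia's actual numbers: when the frame rate falls below the panel's VRR floor, each frame is repeated an integer number of times so the physical refresh rate stays inside the supported range.

```python
# Minimal sketch of the idea behind Low Framerate Compensation (LFC):
# when the game's frame rate drops below the panel's VRR floor, repeat each
# frame an integer number of times so the actual refresh rate stays inside
# the supported range. Range values here are illustrative, not any vendor's.

def lfc_refresh(frame_rate: float, vrr_min: float = 48.0, vrr_max: float = 144.0) -> float:
    """Return a panel refresh rate inside [vrr_min, vrr_max] via frame multiplication."""
    if frame_rate >= vrr_min:
        return min(frame_rate, vrr_max)        # in range: refresh simply tracks the frame rate
    multiplier = 2
    while frame_rate * multiplier < vrr_min:   # e.g. 15 FPS -> x4 = 60 Hz
        multiplier += 1
    return min(frame_rate * multiplier, vrr_max)

for fps in (120, 60, 40, 24, 15):
    print(f"{fps} FPS -> panel refresh {lfc_refresh(fps):.0f} Hz")
```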
Then why did nVidia create the "tiers" of GSync instead of just saying "VRR compatible" or something?
AMD also has FreeSync tiers. Similarly (as stated in the article) VESA will have tiers.
nVidia did the same exact thing and went a step further with the module so monitors could pass all their criteria for their VRR implementation.
Nvidia started development on the GSync module to implement VRR before it could be done through VESA standards, NOT to pass their criteria. The main reason for GSync [module] monitors being so premium is to absorb the $200 cost of the module. When the GSync Ultimate/HDR module came out, it was so prohibitively expensive (purportedly ~$500 just for the module) that it basically never saw the light of day.

Tragically, the VESA implementation followed VERY soon after Nvidia launched their module solution. They understandably stood on their hill as long as possible to recoup R&D costs for themselves and to move stock for their partners, but ultimately had to cave to the VESA implementation in Jan 2019.
Could AMD have pushed VESA to get this out faster? Maybe. Could have nVidia done it as well? Absolutely. Was it in either's business best interest? LOL, no and you know it.
No doubt both companies are/were playing for their own interests and not the consumer.
 
  • Like
Reactions: khaakon

khaakon

Distinguished
Jun 22, 2011
4
1
18,515
What about variable overdrive?
&
Native G-Sync

* My thought is they could be valuable traits on their own merits. Premium G-Sync monitors are much [ever so slightly] better than the rest of the rabble, for example. And the fact that this standard doesn't include everything may actually make it easier for it to stand, with manufacturers' own standards on top of that.

*Edit: This standard also requires 'Out of the box settings' - which is a nice way of telling manufacturers it can be used as intended 24/7/365.

That being said, however, on high refresh monitors screen tearing is next to invisible anyway, so maybe we should just ditch VRR completely once mainstream monitors move into the 200+ Hz territory?

*When we get there, some years from now, we'll stop caring about it, I guess. But there will be future revisions, surely, and they might bring in new elements.
 
Last edited:

InvalidError

Titan
Moderator
That being said, however, on high refresh monitors screen tearing is next to invisible anyway, so maybe we should just ditch VRR completely once mainstream monitors move into the 200+ Hz territory?
I doubt mainstream monitors will hit 200Hz any time soon, if ever, as most non-gaming monitors today are still 60Hz. There are energy costs associated with spamming the same frame over the display cable and through the monitor's electronics multiple times in a row. Even if higher refresh rates become standard, variable refresh will stick around for energy-saving purposes.
 
  • Like
Reactions: renz496
Apr 1, 2020
1,447
1,103
7,060
To obtain the Adaptive-Sync Display logo, the absolute minimum Adaptive-Sync refresh range is 60 Hz

Anyone else take issue with this? As far as gaming goes, adaptive sync was primarily intended to eliminate the sub 60fps visual penalty by adjusting the refresh rate lower to compensate so the stream remains smooth. With this requirement they no longer have to go lower than 60Hz. Yes, we now have image-reduction techniques like DLSS to lower detail levels and resolution to enhance performance, but dropping under 60fps is still quite a possibility.

Seems to me this is a meaningless certification and badge as it is. Personally, I would say 48Hz needs to be the required minimum.
 

InvalidError

Titan
Moderator
As far as gaming goes, adaptive sync was primarily intended to eliminate the sub 60fps visual penalty by adjusting the refresh rate lower to compensate so the stream remains smooth.
Visual penalty? What is that supposed to mean?

The main goal of variable sync is to reduce perceivable stutter by loosening vsync timing so the GPU can "hold vsync" for a short while instead of having to start refreshing the monitor on a strict schedule. This way, the GPU has some slack to complete a frame instead of duplicating old frames when the newest one fails to complete in less than 16.7ms from the last vsync.

With normal vsync, if a frame misses timing, you end up with a duplicate frame for an extra 16.7ms, then the 16.7ms late frame gets displayed and the game engine has to skip animation 16.7ms forward to catch up. It is the combination of duplicate frames (freeze) followed by a catch-up skip (lurch) that make stutters so jarring.
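As a toy illustration of that freeze-and-lurch pattern, here is a simplified Python timeline with made-up frame times; it is not how any real driver schedules frames, just a sketch of when frames reach the screen under fixed 60Hz vsync versus VRR.

```python
# Toy timeline, illustrative only: when do frames reach the screen with fixed
# 60 Hz vsync versus VRR, given the same GPU frame times? The third frame
# misses the ~16.7 ms budget, producing the freeze-then-lurch described above.

import math

FRAME_TIMES_MS = [15.0, 15.0, 22.0, 15.0, 15.0]   # render time of each frame
VSYNC_MS = 1000.0 / 60.0                           # 16.7 ms refresh interval

def present_fixed_vsync(times):
    """Each frame waits for the next vsync boundary after it finishes rendering."""
    done, last, out = 0.0, 0.0, []
    for t in times:
        done += t
        slot = math.ceil(done / VSYNC_MS) * VSYNC_MS   # snap to the next boundary
        slot = max(slot, last + VSYNC_MS)              # one frame per refresh at most
        out.append(slot)
        last = slot
    return out

def present_vrr(times):
    """The display refreshes as soon as a frame is ready (capped at 60 Hz here)."""
    done, out = 0.0, []
    for t in times:
        done += max(t, VSYNC_MS)
        out.append(done)
    return out

for label, schedule in (("fixed vsync", present_fixed_vsync(FRAME_TIMES_MS)),
                        ("vrr        ", present_vrr(FRAME_TIMES_MS))):
    gaps = [round(b - a, 1) for a, b in zip([0.0] + schedule, schedule)]
    print(label, gaps)   # fixed vsync shows a 33.3 ms gap (the freeze) around the slow frame
```

With fixed vsync the slow frame turns into a full 33.3ms hole followed by a catch-up skip, while VRR just stretches one refresh to 22ms.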
 
The only thing I'm pinning on AMD is the completely unnecessary FreeSync branding as the basic tier requirements add absolutely nothing to baseline Adaptive Sync besides AMD's marketing blessing.

Nvidia's Gsync came to market over a year before Adaptive Sync became a standard and requires proprietary scaler hardware in the monitor that adds ~$200 to cost to monitors that will only ever work as intended with Nvidia GPUs.
I don't disagree with the overall statement that FreeSync added very little on top of VRR's technical implementation as a certification.

My point is simple: nVidia, while they went a step further, didn't even bother with VRR until FreeSync took it to make it known. So nVidia jumped on the exact same AMD bandwagon when they created "GSync Compatible" or whatever it was called. Even media outlets called it "a shadow FreeSync" instead of just saying nVidia wanted to add a certification of their own for basic VRR. So...

AMD also has FreeSync tiers. Similarly (as stated in the article) VESA will have tiers.
Yes. They realized the normal FreeSync was bad, so they upped the requirements. Remember when there was only the "80+" certification and then we got "Bronze, Silver, Gold and Platinum" for money reasons? Well, this is the exact same thing. I'm not making excuses for AMD or nVidia; it just is.

Nvidia started development on the GSync module to implement VRR before it could be done through VESA standards, NOT to pass their criteria. The main reason for GSync [module] monitors being so premium is to absorb the $200 cost of the module. When the GSync Ultimate/HDR module came out, it was so prohibitively expensive (purportedly ~$500 just for the module) that it basically never saw the light of day.

Tragically, the VESA implementation followed VERY soon after Nvidia launched their module solution. They understandably stood on their hill as long as possible to recoup R&D costs for themselves and to move stock for their partners, but ultimately had to cave to the VESA implementation in Jan 2019.
Correct me if I'm wrong, but... VRR has been in the DisplayPort spec since it was introduced with DP1.2, no? As a "technology", VRR has existed for a very long time, but it was only implemented on DP and HDMI "recently". So not quite right.

EDIT: https://www.vesa.org/wp-content/uploads/2014/07/VESA-Adaptive-Sync-Whitepaper-140620.pdf
That is from 2014, and looky look at the names in that paper.

I will agree that nVidia took the approach a step further, but that's beside the point I wanted to make. VRR was there before nVidia took it as a concept and implemented it as a proprietary thing. Keep in mind that their module was needed so the pacing was up to nVidia's "standard" and to support HDMI (IIRC). I know it's confusing, maybe, but the takeaway is simple: nVidia could have pushed VRR as part of DP and HDMI, but decided not to for business reasons.

Their solution is technically better, but proprietary and way costlier. It is better (absolutely), because their certification baseline is way higher. So it's a simple thing: nVidia wanted to achieve something and the core spec didn't cut it, so they developed the additional tooling to get it done (good). They made it proprietary and exclusive (arguably bad?). They could have made those improvements available for VESA to include in the core spec (or this certification), but they didn't, because money. Good or bad? I don't care; it is what it is. As I said, pinning this on AMD was just disingenuous.

No doubt both companies are/were playing for their own interests and not the consumer.
Clearly, no doubts there.

Regards.
 
Last edited:
Yes. They realized the normal FreeSync was bad, so they upped the requirements.
FreeSync isn't/wasn't bad, it just wasn't very descriptive for the consumer to know what they're getting. Hence why AMD has/had to keep a running list of monitor specs because monitor manufacturers and resellers don't bother to advertise the proper info. GSync was...easier for the consumer because it carried a specific [and premium] set of specs (basically FreeSync Premium tier).

I think what a lot of people don't understand is that the performance of a monitor is directly controlled by the panel being used. That has nothing to do with FreeSync/GSync. As I mentioned earlier, [module] GSync needed to be applied to premium grade panels in order to "hide" the cost of the module in the sale price. That alone doesn't make GSync better, but it certainly fooled a lot of people into adopting that mindset.
 
  • Like
Reactions: King_V

InvalidError

Titan
Moderator
My point is simple: nVidia, while they went a step further, didn't even bother with VRR until FreeSync took it to make it known.
Gsync was Nvidia's proprietary approach to VRR and launched in 2013 with a minimum GPU requirement of GTX650. Adaptive Sync didn't become an official standard until 2014.

Nvidia only added Adaptive Sync support under its Gsync umbrella after broad availability of relatively cheap Adaptive Sync monitors, along with more competitive GPUs from AMD, started hurting Nvidia's Gsync-only GPU sales.
 
FreeSync isn't/wasn't bad, it just wasn't very descriptive for the consumer to know what they're getting. Hence why AMD has/had to keep a running list of monitor specs because monitor manufacturers and resellers don't bother to advertise the proper info. GSync was...easier for the consumer because it carried a specific [and premium] set of specs (basically FreeSync Premium tier).

I think what a lot of people don't understand is that the performance of a monitor is directly controlled by the panel being used. That has nothing to do with FreeSync/GSync. As I mentioned earlier, [module] GSync needed to be applied to premium grade panels in order to "hide" the cost of the module in the sale price. That alone doesn't make GSync better, but it certainly fooled a lot of people into adopting that mindset.
For me it's bad when a certification adds little to nothing on top of a technical spec. VRR implementations give you, well, VRR. What does FreeSync add on top of that initial checkbox? Terrible VRR ranges, terrible picture quality and dubious response times? It's ok if you don't agree with me saying "it was bad", but the proof is in the pudding: AMD created FreeSync Premium and FreeSync 2, which took those points, realized they were bad, and made the certification better XD

Gsync was Nvidia's proprietary approach to VRR and launched in 2013 with a minimum GPU requirement of GTX650. Adaptive Sync didn't become an official standard until 2014.

Nvidia only added Adaptive Sync support under its Gsync umbrella after broad availability of relatively cheap Adaptive Sync monitors, along with more competitive GPUs from AMD, started hurting Nvidia's Gsync-only GPU sales.
I'm pretty sure VRR existed before 2013 and it was only implemented in 2014. But point taken. Maybe nVidia and VESA were working on it in parallel and nVidia just worked faster. I have no way to prove otherwise, but just keep an open mind about how long it takes for standards to go from conception to release. Maybe this was discussed at some point at SIGGRAPH? Maybe?

Regards.
 
Again, to my point. None of these things has anything to do with FreeSync or GSync. They're all characteristics of the panel.
Hm... No?

Think about it like the 80+ certification. If you have all your components but, when you build the PSU, it doesn't meet the 80+ certification criteria, can you claim "80+"? No, right?

If the OEM/AIB chooses a component (be it panel, PCB, resistors; whatever) that makes it fail some criteria for FreeSync or GSync compatibility, they wouldn't be able to claim certification and slap the logo on the product. Sure, they can have VRR, but not be "certified" by either AMD or nVidia under their umbrellas. Could they advertise VRR? Absolutely. Do people care or know about VRR as a standalone feature? I don't think so. Well, maybe now they will, since it feels like this should push for a more common understanding.

Even more laughable would be if they used a panel that can actually pass all the criteria, but screwed it up with bogus firmware (seen it before!) so the panel performs worse. You see a lot of this on TVs, which is kind of stupid.

There's one caveat here: I'm suspecting the AMD and nVidia certifications are on a per-product basis and not a per-panel basis. If they certify the manufacturer's panel and OEMs can then just grab it (certification included) and slap on the logo even when the final product doesn't meet it, I'd be darn surprised if it had such an idiotic loophole. I hope I'm wrong XD

Regards.
 

InvalidError

Titan
Moderator
I think what a lot of people don't understand is that the performance of a monitor is directly controlled by the panel being used.
The LCD driver behind the panel plays a huge role too. The best panel in the world is still hot garbage if the driver isn't tuned correctly for it. A lot of the issues with VRR are due to panel drivers having to compensate for the uneven frame pacing, which is why there are monitor reviews where two models sharing the same panel can have drastically different performance.

I'm pretty sure VRR existed before 2013 and it was only implemented in 2014.
Different industries had various flavors of VRR prior to 2010. There have even been standards for partial screen refreshes for applications requiring extreme power savings.

Nvidia announced Gsync in October 2013, had demo units in reviewers' hands in December to drum up interest in Gsync monitor launches at CES 2014 the following month. AMD did its first FreeSync prototype demo at Computex 2014 and the earliest FreeSync monitors I could find were only announced (not launched) in December 2014.
 
  • Like
Reactions: -Fran- and tennis2

TJ Hooker

Titan
Ambassador
Anyone else take issue with this? As far as gaming goes, adaptive sync was primarily intended to eliminate the sub 60fps visual penalty by adjusting the refresh rate lower to compensate so the stream remains smooth. With this requirement they no longer have to go lower than 60Hz. Yes, we now have image-reduction techniques like DLSS to lower detail levels and resolution to enhance performance, but dropping under 60fps is still quite a possibility.

Seems to me this is a meaningless certification and badge as it is. Personally, I would say 48Hz needs to be the required minimum.
Just to be clear, to meet the Adaptive Sync Display spec, 60 Hz is the highest value permitted for the bottom end of the VRR range. There's nothing stopping monitors from having a lower VRR limit that is less than 60 Hz.

As far as I can tell, neither FreeSync nor GSync had any requirements for what the lower VRR limit should be, only that the ratio between the upper and lower VRR limits needs to be greater than some value (in order for low framerate compensation to work).

So it doesn't seem like Adaptive Sync is losing anything compared to Free-/G-sync in this regard.
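As a rough sketch of that ratio requirement, here's a tiny check using the commonly cited 2:1 rule of thumb as an assumption, not an official figure from the VESA, AMD, or Nvidia specs:

```python
# Quick sketch of the ratio rule mentioned above: low framerate compensation can
# only kick in if the VRR ceiling is far enough above the floor that doubled
# frames still fit inside the range. The 2x ratio is a common rule of thumb,
# used here as an assumption rather than a figure from any official spec.

def supports_lfc(vrr_min_hz: float, vrr_max_hz: float, ratio: float = 2.0) -> bool:
    """True if frame doubling at the floor stays within the supported range."""
    return vrr_max_hz >= vrr_min_hz * ratio

for rng in ((48, 144), (48, 75), (60, 240), (40, 60)):
    print(rng, "LFC possible:", supports_lfc(*rng))
```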
 

d0x360

Distinguished
Dec 15, 2016
115
47
18,620
The main reason we got the Adaptive Sync rebrand nonsense is AMD choosing to house-brand Adaptive Sync for marketing purposes against Nvidia's proprietary Gsync instead of promoting the generic name that Nvidia could pick up at any time.

That's not exactly true. AMD did it with some improvements over regular VRR, and again with FreeSync Premium Pro with further enhancements, but they did so by using the standard as a fallback.

The only reason they did it was to have a "feature" to match one Nvidia had, which people held up as just another thing that made Nvidia superior, when in fact FreeSync was arguably the better of the two.

AMD will be just fine if FreeSync and G-Sync die. It shouldn't be a GPU-specific thing anyway; it should be, and is, a display standard.
 
