The only thing I'm pinning on AMD is the completely unnecessary FreeSync branding, as the basic tier's requirements add absolutely nothing to baseline Adaptive Sync besides AMD's marketing blessing.
Nvidia's G-Sync came to market over a year before Adaptive Sync became a standard, and it requires proprietary scaler hardware in the monitor that adds ~$200 to the cost of monitors that will only ever work as intended with Nvidia GPUs.
I don't disagree with the overall statement that FreeSync added very little
on top of VRR's technical implementation as a certification.
My point is simple: nVidia, while they went a step further, didn't even bother with open VRR until FreeSync made it widely known. So nVidia jumped on the exact same AMD bandwagon when they created "GSync Compatible" or whatever it was called. Even media outlets called it "a shadow FreeSync" instead of just saying nVidia wanted a certification of their own for basic VRR. So...
AMD also has FreeSync tiers. Similarly (as stated in the article) VESA will have tiers.
Yes. They realized the normal FreeSync tier was weak, so they upped the requirements. Remember when there was only the "80 Plus" certification and then we got Bronze, Silver, Gold and Platinum for money reasons? Well, this is the exact same thing. I'm not making excuses for AMD or nVidia; it just is what it is.
Nvidia started development on the GSync module to implement VRR before it could be done through VESA, NOT to pass their own criteria. The main reason GSync [module] monitors are so premium is to absorb the ~$200 cost of the module. When the GSync Ultimate/HDR module came out, it was so prohibitively expensive (purportedly ~$500 for the module alone) that it basically never saw the light of day.
Tragically, the VESA implementation followed VERY soon after Nvidia launched their module solution. They understandably stood on that hill as long as possible to recoup R&D costs and let their partners clear stock, but ultimately had to cave and support the VESA implementation in Jan 2019.
Correct me if I'm wrong, but... VRR has been in the DisplayPort spec since Adaptive-Sync was added in DP 1.2a, no? As a "technology", VRR has existed for a very long time; it was only implemented on DP and HDMI "recently". So not quite right.
EDIT:
https://www.vesa.org/wp-content/uploads/2014/07/VESA-Adaptive-Sync-Whitepaper-140620.pdf
That is from 2014, and looky look at the names in that paper.
I will agree that nVidia took the approach a step further, but that's beside the point I wanted to make. VRR was there before nVidia took the concept and implemented it as a proprietary thing. Keep in mind that their module was needed so that frame pacing met nVidia's own "standard" and to support HDMI (IIRC). I know it may be confusing, but the takeaway is simple: nVidia could have pushed VRR as part of DP and HDMI, but decided not to for business reasons.
Their solution is technically better, but proprietary and way costlier. It is better (absolutely), because their certification baseline is way higher. So it's a simple thing: nVidia wanted to achieve something, the core spec didn't cut it, so they developed the additional tooling to get it done (good). They made it proprietary and exclusive (arguably bad?). They could have made it available for VESA to include in the core spec (or this certification), but they didn't, because money. Good or bad? I don't care; it is what it is. As I said, pinning this on AMD was just disingenuous.
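Since we're all arguing over what VRR actually buys you, the frame-pacing difference can be sketched with a toy model (my own illustration; the function names and numbers are made up, not from any VESA or Nvidia spec):

```python
# Toy model of why VRR matters for frame pacing. Purely illustrative.

def display_times_fixed(frame_ready, refresh_hz=60.0):
    """Fixed refresh: a finished frame waits for the next vblank (vsync on)."""
    period = 1000.0 / refresh_hz            # vblank interval in ms
    shown = []
    next_vblank = 0.0
    for t in frame_ready:
        while next_vblank < t:              # frame missed this vblank...
            next_vblank += period           # ...so it waits for the next one
        shown.append(next_vblank)
        next_vblank += period               # that vblank is now consumed
    return shown

def display_times_vrr(frame_ready, max_hz=144.0):
    """VRR: the panel refreshes as soon as the frame is ready, limited only
    by the panel's maximum refresh rate. (Real panels also have a minimum
    rate, handled by tricks like low framerate compensation; omitted here.)"""
    min_gap = 1000.0 / max_hz               # can't refresh faster than max_hz
    shown = []
    last = float("-inf")
    for t in frame_ready:
        show = max(t, last + min_gap)       # honor the minimum refresh interval
        shown.append(show)
        last = show
    return shown

# A game rendering at an uneven ~50 fps (frame completion times in ms):
ready = [0.0, 18.0, 40.0, 58.0, 80.0]
print("fixed 60 Hz:", display_times_fixed(ready))  # one gap doubles -> judder
print("VRR        :", display_times_vrr(ready))    # gaps track render times
```

With fixed 60 Hz, the 18 ms frame misses its vblank and slips a full interval (0, 33.3, 50, 66.7, 83.3 ms), so one on-screen gap doubles; with VRR every frame is shown the moment it's done. That's the whole pitch, whatever badge is stamped on the box.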
No doubt both companies are/were playing for their own interests and not the consumer.
Clearly, no doubts there.
Regards.