Meanwhile, VESA's Adaptive-Sync and AMD's FreeSync (based on Adaptive-Sync) did the same thing and didn't really cost anything extra,
'Meanwhile' meaning 'a year later, and then another year to solve the ghosting issues caused by static pixel overdrive, which came from reusing stock eDP Adaptive-Sync as-is (pixel overdrive needs to vary as refresh rate varies to avoid undershoot and overshoot; G-Sync modules used per-refresh-rate LUTs, while FreeSync used a single overdrive value for all pixel refresh intervals)'.
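To make the LUT-vs-fixed-value distinction concrete, here's a minimal sketch. All names and the gain numbers are hypothetical, and real overdrive logic lives in the scaler/TCON with per-transition tables, not a single gain per rate; this only illustrates why a push amount tuned for one frame duration misbehaves at others.

```python
# Hypothetical sketch: overdrive drives a pixel past its target so it settles
# within one frame. The right amount of "push" depends on frame duration,
# which changes constantly under variable refresh rate.

def overdrive_per_rate(current, target, refresh_hz, lut):
    """Per-refresh-rate LUT (the G-Sync module approach): pick the gain
    characterized for the nearest refresh rate."""
    rate = min(lut, key=lambda r: abs(r - refresh_hz))
    return target + lut[rate] * (target - current)

def overdrive_fixed(current, target, gain=0.3):
    """Single gain for all refresh intervals (early FreeSync panels):
    right at one rate, overshoots (inverse ghosting) or undershoots
    (smearing) at the others."""
    return target + gain * (target - current)

# Hypothetical per-rate gains: longer frames need less push.
lut = {48: 0.15, 60: 0.20, 90: 0.28, 144: 0.40}

# Transition from level 0 toward level 128:
print(overdrive_per_rate(0, 128, 48, lut))   # gentle push for a long frame
print(overdrive_per_rate(0, 128, 144, lut))  # harder push for a short frame
print(overdrive_fixed(0, 128))               # same push no matter the rate
```

The point of the per-rate table is the last two lines: as the frame interval swings between 48 Hz and 144 Hz, the fixed-gain version keeps applying the same push, which is exactly the static-overdrive ghosting being described.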
Then you have 'FreeSync Premium Pro', AKA 'you get to use HDR and FreeSync at the same time', which requires a proprietary implementation in games or does not work at all. It too arrived some time after G-Sync's HDR implementation, which, ironically, does not require games to implement a proprietary API.
Look, we get that sometimes a company wants to move the market forward with new technologies, but these should be the same technologies that the game developers and graphics professionals want, not just arbitrary vendor lock-in solutions.
LOL. DLSS, RT, and G-Sync stick around to this day because they are 'technologies that the game developers and graphics professionals want' (and that consumers want). Try to push a proprietary feature that is not desired, and it will flop, and flop hard (e.g. Mantle).
It's pretty much the Graphics Technology Development Cycle at this point:
1) Nvidia introduces some new technique
2) It's decried as useless and proprietary and nobody actually wants it anyway and it'll just fail and Nvidia smells
3) New technique is implemented in games
4) New technique is widely popular with developers and consumers (but is somehow still simultaneously unwanted and worthless)
5) Other vendors implement new technique once it has widespread adoption
6) "Of course [New Technique] is the future, everyone always wanted it!"
Variable refresh rates, GPGPU, GPU-accelerated ray tracing, AI-accelerated upscaling, frame generation, etc. The cycle has played out time after time, and will continue to do so.