Tanquen:
Ok. VESA Adaptive-Sync (around since 2009) does not fail horribly. The Adaptive-Sync protocol is part of the DisplayPort standard, and Nvidia has said they will not support it. It would be easy for them to do, but you can guess why they won't. There is already a 30-144Hz VESA Adaptive-Sync display, and the spec supports refresh rates down to something like 9Hz. Yes, if your game drops below 30 FPS on this new display… wait, why are you playing games at 20-ish FPS? Cleaning up some horizontal tearing is not going to make that a fun experience. G-Sync tries to help here by repeating the same frame, like V-Sync already does. With VESA Adaptive-Sync you can use it with or without V-Sync turned on, so if you drop below 30 FPS, V-Sync kicks in and your 10-20 FPS slide show has no horizontal tearing. At the high end, when you go over the display's refresh rate, you can again choose whether to use V-Sync, and most new cards have frame limiting so they aren't generating 300 FPS on your 60, 120 or 144Hz display and wasting electricity. Besides, horizontal tearing is not normally an issue if you're consistently above your display's refresh rate.
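Frame limiting here just means capping how fast the render loop is allowed to produce frames. A rough sketch of the idea in Python (the render_frame callable, the cap and the timings are placeholders for illustration, not any real driver or engine API):

```python
import time

def run_with_frame_cap(render_frame, cap_hz=144, duration_s=2.0):
    """Render in a loop, sleeping so we never exceed cap_hz frames per second."""
    frame_budget = 1.0 / cap_hz               # ~6.94 ms per frame at 144 Hz
    start = time.perf_counter()
    frames = 0
    while time.perf_counter() - start < duration_s:
        frame_start = time.perf_counter()
        render_frame()                         # stand-in for drawing one frame
        frames += 1
        # If the GPU finished early (it might otherwise run at 300+ FPS),
        # wait out the rest of the budget instead of rendering frames the
        # display can never show.
        elapsed = time.perf_counter() - frame_start
        if elapsed < frame_budget:
            time.sleep(frame_budget - elapsed)
    return frames / (time.perf_counter() - start)   # effective FPS

if __name__ == "__main__":
    # A trivial "frame" that would otherwise render far faster than 144 FPS.
    print(f"effective FPS: {run_with_frame_cap(lambda: None):.1f}")
```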
So again, thanks to Nvidia, you have to pick the video card you want and then worry about whether the display you want will support adaptive sync with it.
I think you have a lack of understanding when it comes to Vsync and adaptive sync technologies.
First off, Vsync works at a SINGLE frequency. So in your example of a 30-144Hz FreeSync panel, FreeSync would 'fail' or be turned off for anything below 30 or anything above 144. If you have Vsync on, your monitor attempts to refresh at a SINGLE frequency just like old monitors... it still can't handle variable frequencies. If it could, we wouldn't need Adaptive Sync technology at all. So yes, with FreeSync you have a fail condition for any FPS that doesn't fall inside the FreeSync panel's refresh range. Furthermore, anyone who games seriously NEVER uses Vsync, because part of Vsync is delaying/buffering frames, which leads to display lag and missed frames, the prime evil for gaming panels.
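To make that fail condition concrete, here's a toy model of which behaviour you end up with at a given frame rate. The 30-144 window, the function name and the vsync_on flag are all just illustrative, not how any driver actually exposes this:

```python
def sync_mode(fps, vrr_min=30, vrr_max=144, vsync_on=False):
    """Which refresh behaviour a hypothetical 30-144 Hz FreeSync panel falls into."""
    if vrr_min <= fps <= vrr_max:
        return f"adaptive sync: panel refreshes at {fps} Hz, in step with the GPU"
    if vsync_on:
        return "outside VRR range: fixed-rate V-Sync takes over (buffered frames, added lag)"
    return "outside VRR range: no sync at all (tearing)"

for fps in (20, 60, 200):
    print(fps, "->", sync_mode(fps, vsync_on=False))
    print(fps, "->", sync_mode(fps, vsync_on=True))
```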
So if you're gaming seriously, FreeSync is a hard fail outside the panel's range. If you're a casual gamer, FreeSync is a soft fail and the monitor just turns into a standard fixed-refresh panel outside the FreeSync range.
You have a lack of knowledge about tearing as well. Tearing will ALWAYS happen if you're not running Vsync and your FPS is higher than the refresh rate of the panel. It's easy to understand: at 100 FPS each frame takes 10ms while a 60Hz panel refreshes every ~16.7ms, so by the time the panel refreshes you've drawn one full frame and are about 67% of the way through drawing the second (aka 'tearing' the screen, because you didn't finish a draw). You won't see this on static screens because the image isn't changing, but it's still tearing. You will notice it in any kind of fast-paced game. The whole point of Vsync is essentially to buffer frames so that only complete frames are drawn, and only at the panel's refresh rate. This mostly works fine, except that the buffering obviously introduces delay, which means lag for a gamer.
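That 67% figure falls straight out of the frame times; a quick arithmetic check (nothing here is a real GPU API, it's just the division spelled out):

```python
def scanout_position_at_refresh(fps, refresh_hz):
    """How far through a frame the GPU is when the panel's next refresh fires."""
    frame_time = 1.0 / fps                 # 10 ms per frame at 100 FPS
    refresh_interval = 1.0 / refresh_hz    # ~16.7 ms per refresh at 60 Hz
    frames_completed = int(refresh_interval // frame_time)
    leftover = refresh_interval - frames_completed * frame_time
    return frames_completed, leftover / frame_time

done, frac = scanout_position_at_refresh(fps=100, refresh_hz=60)
print(f"{done} full frame(s) done, {frac:.0%} through the next at refresh time")
# -> 1 full frame(s) done, 67% through the next at refresh time
```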
FreeSync and G-Sync essentially operate the same inside the panel's 'hardware band'. However, a FreeSync panel turns into a 'dumb panel' outside that range. G-Sync, on the other hand, can perform additional refresh-rate tricks like driving the panel at an integer multiple of the GPU's frame rate, much like TV refresh rates are designed as even multiples of the refresh rates of the devices connected to them (which is why they don't stutter/tear). The other shortfall is that FreeSync will always be chained to the VESA standard, which means that even if AMD has the technological capability, they're locked into the DisplayPort 1.2a refresh-rate spec until VESA decides to make a change, which can quite literally take years. If you don't believe me, just ask HDMI and DP fans how long it takes a standard to go from drafting to official approval. With G-Sync, nVidia can iterate the hardware as fast as they want.
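The integer-multiple trick amounts to picking the smallest multiplier that pulls the frame rate back inside the panel's window and scanning each frame out that many times. A sketch of that selection, assuming a 30-144Hz window (the numbers and the function are illustrative, not nVidia's actual module logic):

```python
def refresh_multiplier(fps, vrr_min=30, vrr_max=144):
    """Smallest integer multiple of the frame rate that fits the panel's VRR window."""
    if vrr_min <= fps <= vrr_max:
        return 1, fps                      # already inside the window, no repeats
    for n in range(2, 20):
        if vrr_min <= fps * n <= vrr_max:
            return n, fps * n              # show each frame n times at fps*n Hz
    return None, None                      # too low to map into the window

for fps in (25, 20, 9):
    n, hz = refresh_multiplier(fps)
    print(f"{fps} FPS -> repeat each frame {n}x, panel runs at {hz} Hz")
```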
So yes, it's a format war. People who want the maximum GPU power already use nVidia hardware even though AMD can be more cost-effective. Do you really think those people are going to balk at an extra $50-$100 for a G-Sync panel that will last a decade when they're spending that premium annually on one or more GPUs?