@Ninjawithagun
"...although given that the Adaptive-Sync standard isn't expected to cost all that much..."
NOT! Where did the author get his degree in journalism? Captain Crunch University?? Adaptive Sync is definitely going to cost as much as G-Sync, if not more. "Why?" you might ask? Simple. Adaptive Sync requires very similar technology to what G-Sync uses (and Nvidia knows this!). The scaler in an Adaptive Sync monitor is not ordinary: it needs special hardware that allows full duplex communication with the graphics card, and the display itself has to be able to redraw the frame when the graphics card tells it to. Needless to say, AMD is full of it if they think they can keep lying to the tech community about their vaporware FreeSync. Not buying it, sorry! All out of interest in buying swampland in Florida too ;-) Give Nvidia credit where credit is due. They invented a dynamic refresh rate technology and it works!! Stop hating, people. Stop thinking like a caveman just because you bought an AMD card and are now pissed off that AMD lied about FreeSync and left you with no dynamic refresh rate technology. Try refocusing your hate on AMD, not only for lying to you, but for not being innovative enough to compete with a REAL dynamic refresh rate technology of their own. NUFF SAID!
OK really, shut up, you're being ignorant. nVidia didn't invent anything. Forms of adaptive refresh rate have been in our hands for years in mobile equipment. Through eDP, which is an extension of the DisplayPort standard/protocol, adaptive sync was already doable. It was originally added as a power-saving feature, letting the display switch refresh rates seamlessly. The first tech demos of FreeSync used exactly THAT.
http://en.wikipedia.org/wiki/DisplayPort#eDP
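If you want a feel for what that power-saving feature amounts to, here's a rough sketch in C. It's a toy model only; the function names and thresholds are made up for illustration, not taken from the eDP spec. The idea: when the frame buffer sits unchanged for a few frames, the source drops the panel to a lower refresh rate, and snaps back to full rate the moment content changes.

#include <stdio.h>
#include <stdbool.h>

/* Toy illustration of eDP-style seamless refresh switching as a
 * power-saving policy: drop to a low refresh rate when the frame
 * buffer is static, return to full rate when content changes.
 * Hypothetical values and names, not a real driver API. */

#define ACTIVE_HZ 60   /* normal desktop refresh          */
#define IDLE_HZ   30   /* reduced rate for static content */

static int pick_refresh_hz(bool framebuffer_changed, int idle_frames)
{
    /* After a few unchanged frames, assume the screen is static. */
    if (!framebuffer_changed && idle_frames >= 3)
        return IDLE_HZ;
    return ACTIVE_HZ;
}

int main(void)
{
    /* 1 = frame buffer updated this interval, 0 = static */
    int activity[] = {1, 1, 0, 0, 0, 0, 0, 1, 1, 0};
    int idle = 0;

    for (int i = 0; i < 10; i++) {
        bool changed = activity[i];
        idle = changed ? 0 : idle + 1;
        printf("frame %2d: %s -> panel runs at %d Hz\n",
               i, changed ? "active" : "static",
               pick_refresh_hz(changed, idle));
    }
    return 0;
}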
DisplayPort already has bidirectional half-duplex communication (over its AUX channel). Full duplex would be useless here, and a waste of energy and hardware.
nVidia's G-Sync is more expensive because it is add-on hardware bolted onto existing designs. They use an FPGA, not even a custom ASIC, to implement their tech, along with some memory, a board, and all the supporting components. That's why it's sold to you either as an upgrade kit for select models, or as a hefty premium on other models with it built in. And guess how this smorgasbord drives your panel? Through eDP.
AMD's approach was to develop a standard similar to eDP and submit it for adoption by VESA, which was done. The DisplayPort standard is currently 1.2, but VESA has officially released 1.2a with Adaptive-Sync, and 1.3 is coming with it too. Evolving the standard this way means adaptive sync simply slips into the specs that manufacturers have to implement when they design their next scaler unit with, say, DP 1.3 support. It does require some extra transistors, driving up unit cost a bit, and many manufacturers have already announced new scalers with support for it. What makes FreeSync cost much less than G-Sync is the simple fact that there is NO additional hardware; just support built into the scaler unit that drives your display.
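And for the gaming case, the whole trick boils down to the GPU stretching the vertical blanking interval until the next frame is ready, clamped to whatever refresh window the scaler advertises. Here's a toy model in C (all numbers and names are illustrative, not from the DP 1.2a spec):

#include <stdio.h>

/* Toy model of adaptive sync: the source holds the panel in vertical
 * blanking until the next frame is rendered, instead of scanning out
 * on a fixed clock. The wait is clamped to the refresh window the
 * scaler advertises (illustrative numbers, not from any real spec). */

#define MIN_FRAME_MS 6.9   /* ~144 Hz: can't refresh faster than this  */
#define MAX_FRAME_MS 33.3  /* ~30 Hz: panel must refresh before fading */

static double next_scanout_ms(double render_ms)
{
    if (render_ms < MIN_FRAME_MS)
        return MIN_FRAME_MS;   /* frame done early: wait for the panel */
    if (render_ms > MAX_FRAME_MS)
        return MAX_FRAME_MS;   /* too slow: refresh with the old frame */
    return render_ms;          /* otherwise: scan out the moment the
                                  frame is ready -> no tear, no stutter */
}

int main(void)
{
    double render_times[] = {5.0, 12.5, 16.7, 28.0, 40.0};

    for (int i = 0; i < 5; i++) {
        double t = next_scanout_ms(render_times[i]);
        printf("render %5.1f ms -> refresh after %5.1f ms (%.0f Hz)\n",
               render_times[i], t, 1000.0 / t);
    }
    return 0;
}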
nVidia chose to quickly milk their fans, as usual, and AMD chose to take the long way home, pushing for wider adoption of already-existing tech. The only good thing about G-Sync is that it allows enthusiasts who have already invested significant money in their monitors to upgrade them with this (truly welcome) advance.
By the way, OF COURSE nVidia can support FreeSync; they effectively already DO by supporting eDP. If they say they don't, then they've really just locked it down. That makes THEM the ones lying to YOU!
I'll stay with my AMD cards, despite the impressive Maxwell wave.