ubercake :
Since the focus is on Project Freesync, can you tell us any advantage of using Freesync tech over that of the competition?
Also, what makes/models of Freesync monitors are going to be released in 2015?
G-Sync and FreeSync are conceptually similar: let the GPU control the monitor’s refresh, and perform a refresh only when a new frame is ready. All the stuttering/tearing/lag associated with running with or without vsync is eliminated, leaving silky smooth gameplay in its wake.
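To make the latency difference concrete, here’s a toy model of the idea in Python. The numbers and function names are purely illustrative, not AMD’s or NVIDIA’s actual implementation: with fixed-rate vsync a finished frame waits for the next scheduled refresh tick, while with adaptive refresh the display updates the moment the frame is done.

```python
# Toy latency model: hypothetical function names and numbers, purely
# illustrative -- not AMD's or NVIDIA's actual implementation.

def wait_fixed_refresh(render_time_s, refresh_interval_s=1 / 60):
    """Classic fixed-rate vsync: a finished frame sits in the buffer
    until the next scheduled refresh tick, adding latency."""
    ticks = -(-render_time_s // refresh_interval_s)  # ceiling division
    return ticks * refresh_interval_s - render_time_s

def wait_adaptive_refresh(render_time_s):
    """Adaptive-Sync: the GPU triggers the refresh the moment the frame
    is ready, so (within the panel's supported range) there is no wait."""
    return 0.0
```

For example, a frame that takes 20 ms to render on a 60 Hz panel waits roughly another 13 ms for the next tick under fixed vsync, but 0 ms under adaptive refresh.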
Here’s where things break down:
NVIDIA has chosen to deliver this technology in the form of an expensive proprietary module that replaces the standard kit of electronics that comes in a monitor (the scaler). They did this because scalers don’t presently support any mechanism to cede control of the refresh rate to the GPU.
That module costs $100-150 USD, and must be purchased by the monitor vendor for integration into their design. That cost is passed on to the consumer, too. It’s also obtained through licensing, which is a complicated and burdensome legal process—not to mention validating an entirely new monitor with a proprietary scaler replacement. If you’re a monitor maker, you can’t just adopt G-Sync because you want to: you have to ask, and there's a big cost associated with it.
AMD acknowledged that some effort with VESA, the authors of the DisplayPort standard, could yield an amendment that would also grant GPUs control of monitor refresh rates. Extending DisplayPort in this manner would be broadly compatible, not to mention free to adopt. We proposed the amendment and it was adopted; it’s now known as DisplayPort Adaptive-Sync.
Indeed, many of the scaler vendors are telling us that they’ve only needed to design and validate new firmware for many of their scalers to outfit them with Adaptive-Sync support. Three companies basically own the scaler market: Realtek, MStar and Novatek. All three of them have committed to supporting Adaptive-Sync, with some combination of existing scalers upgraded to new firmware, and new-generation scalers built from the ground up with Adaptive-Sync in mind.
In contrast: FreeSync is a bundle of secret sauce in our driver that leans on Adaptive-Sync support in a monitor to update the display whenever a Radeon needs to. How you use the control is as important as having the control, so FreeSync is the “how” and Adaptive-Sync is the “have.”
To tidy up loose ends: the Adaptive-Sync specification supports a wider range of refresh rates than G-Sync, requires no license or licensing fee, and AMD charges nothing to work with partners to bring a design online. Plus, using a standard scaler gives you monitor quality-of-life features like an on-screen menu, audio support (e.g. in-monitor speakers), or even HDMI/DVI inputs (which allow you to at least use the display, even if you don’t have a FreeSync-enabled graphics card).
In every conceivable way, FreeSync is a less materially expensive and more robust solution. Yes, it takes longer to reach market when you work through an industry standard to get there, but the benefits are very clear.
I can confirm that five UltraHD monitors, 24-32", will be entering the market in March. Those are from Samsung. But several other vendors are also preparing displays for Q1, and these will be seen at CES. I'm under confidentiality agreements until then.