renz496 :
I said definitely because current nvidia hardware does not have the hardware needed to make adaptive sync work without the aid of the module. The eDP used in laptops is a bit different from the DP used on desktop monitors. When AMD first talked about the FreeSync concept, they said their GPUs had already had the necessary tech inside for three generations. But then why, when they finalized FreeSync, was only GCN 1.1 (the latest at the time) able to support FreeSync in games, contradicting their earlier claim of having had the required tech for three generations? Most likely because it is not as simple as shipping driver updates and firmware updates to the monitor, the way they first imagined it. If a driver hack were all it took for nvidia cards to support FreeSync, people would have done it long ago, but so far I still haven't seen anyone do it. People point to mobile G-sync not needing the module as proof that current nvidia hardware can work with FreeSync and that nvidia has simply put a block in their driver, but if that were really the case I'm sure it would have been hacked quickly. Remember nvidia blocking dedicated PhysX cards when an AMD GPU was present? There was a hack to make hybrid PhysX setups work again. Then SLI only working on motherboards with SLI certification? There is a hack so SLI also works on boards that lack the certification. Mobile G-sync has existed since last year, so why is there still no hack to make nvidia desktop GPUs work with FreeSync? So far we have only heard speculation; as it stands, we don't even really know what changes nvidia made to their mobile GPUs to make adaptive sync work without the module.
That's the thing, we really don't know. I tried to find a copy of the DisplayPort spec, but if it's on the internet, it's not easy to find. I'll check torrent sites and such, but if it's this hard to find, there might only be a few individuals working on it, and they could *easily* be hampered by the fact that the nvidia firmware is closed and nvidia doesn't give direct help to developers (even Nouveau) except by giving them keys to sign their drivers with. Nvidia has a much easier task than the Nouveau project: Nouveau has to reverse engineer everything, whereas nvidia engineers have the specs right in front of them. Even with access to the DisplayPort spec that I can't find anywhere, random hackers on the internet would still have to deal with the closed nature of nvidia hardware, and hardware reverse engineering is not easy.
And a company like nvidia often doesn't give all the information to the public. Even AMD, actually, did not give the public the details of how they handle FreeSync on their side, and keeps saying that it is 'secret sauce'.
This is just the specific implementation (FreeSync). Adaptive sync is the specification. I wouldn't expect AMD to give up trade secrets that give them an edge. The only question is whether the spec itself can be implemented in firmware, and so far I don't see any reason to believe it can't. We're talking about a really small amount of information that needs to be shared between the graphics card and the monitor to make them work at the same rate. Yes, implementation details like how to know how many frames the graphics card will be pushing out 50 ms from now are hard to figure out, and even harder to implement when a hacker can only reverse engineer the card from whatever inputs and outputs he knows about, without actually knowing how this really complex piece of hardware works. And even if he gets a working firmware blob, he has no way of signing it if it needs to be signed (and it probably does).
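To make that 'small amount of information' concrete, here's a minimal sketch of the core idea as I understand it from public descriptions of the adaptive sync spec: the monitor advertises a supported refresh range (via EDID/DisplayID), and the source just stretches or clamps the vertical blanking interval so each frame is scanned out when it's ready. All the names and numbers below are hypothetical, not taken from any real driver:

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical: the refresh range a VRR-capable monitor reports. */
typedef struct {
    uint32_t min_refresh_hz;   /* slowest the panel can refresh, e.g. 40  */
    uint32_t max_refresh_hz;   /* fastest the panel can refresh, e.g. 144 */
} vrr_range_t;

/* Given how long (in microseconds) the GPU took to produce the frame,
 * return the frame interval the display should actually be driven at. */
static uint32_t clamp_frame_interval_us(const vrr_range_t *r,
                                        uint32_t render_time_us)
{
    uint32_t min_interval = 1000000u / r->max_refresh_hz; /* fastest allowed */
    uint32_t max_interval = 1000000u / r->min_refresh_hz; /* slowest allowed */

    if (render_time_us < min_interval)
        return min_interval;   /* frame ready early: hold until min interval */
    if (render_time_us > max_interval)
        return max_interval;   /* too slow: panel must refresh anyway */
    return render_time_us;     /* inside the window: scan out immediately */
}

int main(void)
{
    vrr_range_t panel = { 40, 144 };           /* a typical FreeSync range */
    /* A 60 fps frame (16667 us) sits inside the window, so it's shown as-is. */
    printf("interval: %u us\n", clamp_frame_interval_us(&panel, 16667));
    return 0;
}
```

If something this simple really is the whole story at the protocol level, then the hard part is everything around it: frame pacing prediction, the firmware plumbing, and signing.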
And when they came up with a solution for image smoothness when frame rates fall below the VRR window, they also refused to explain how they actually make it possible, again emphasizing that it's a 'secret'. Though the guys at pcper suspect it might not be that different from how nvidia handles it (and nvidia did not go secretive on this subject, unlike AMD). So if nvidia's current hardware really did support adaptive sync, I think people would have cracked how to make it work long ago.
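For context, what pcper suspected (for both vendors) is low-framerate compensation: when the game drops below the monitor's minimum refresh, the source re-sends the last frame some whole number of times so the effective refresh stays inside the VRR window. A hypothetical sketch of just that repeat-count logic, with illustrative names (and assuming a sane, nonzero maximum interval):

```c
#include <stdint.h>
#include <stdio.h>

/* Pick the smallest repeat count n so that (frame_interval / n) fits
 * within the panel's slowest allowed refresh interval.
 * Assumes max_interval_us > 0. */
static uint32_t lfc_repeat_count(uint32_t frame_interval_us,
                                 uint32_t max_interval_us)
{
    uint32_t n = 1;
    while (frame_interval_us / n > max_interval_us)
        n++;
    return n;
}

int main(void)
{
    /* 25 fps (40000 us/frame) on a panel whose slowest refresh is 40 Hz
     * (25000 us): each frame is sent twice, for an effective 50 Hz rate. */
    printf("repeats: %u\n", lfc_repeat_count(40000, 25000));
    return 0;
}
```

If that's roughly what both vendors do, the 'secret' is presumably in the heuristics around it (when to start doubling, how to avoid visible rate flicker), not in the basic trick.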
Well, nvidia had to give monitor makers the information needed to produce G-sync modules, or at least to interface with them. But if they're so open with this, why is it that after nvidia stopped selling G-sync upgrade kits, no one is producing any 3rd party kits? If they're that open about this, is there a design document somewhere that will show hobbyists how to make their own upgrade kits?
As for that certified adaptive sync monitor (if nvidia does support adaptive sync on their future products), that is pure speculation on my part, but looking at how nvidia has handled their tech for years, you can say that's how they roll. Counterproductive? Maybe, but it's working for them. The prime example is those SLI certified motherboards. Nvidia actually charges royalties from motherboard makers so their boards can work with SLI (and at this point it is just a simple check by the nvidia driver against the motherboard BIOS for the certification, rather than requiring a specific nvidia chipset like in the past), and yet do people really complain? And the same goes for those certified 3D Vision monitors. Yes, nvidia 3D Vision will not work with just any 3D monitor.
As I said, if it's nvidia's own tech, they will require certifications. If it's an 'open' standard like adaptive sync, they will defer to the standards body.
For nvidia's part, despite the lock-in, they do support their tech properly. Their 3D solution, for example: they are the ones responsible for providing the drivers so games can work in 3D. For AMD, if a game does not support their HD3D tech natively, they will not provide drivers to make it work; instead they want monitor makers to compete among themselves to provide the drivers, and those monitor makers actually charge extra money for these drivers. And in the case of G-sync vs FreeSync, one of the advantages of G-sync is nvidia's tight control. When a problem happens, all they need to do is fix it in the drivers. That's not the case with FreeSync: in some cases the problem is not driver related, and the monitor needs to be sent back to its maker for a firmware update. For nvidia, their module interacts with nvidia's drivers to correct problems.
Personally I was hoping nvidia would support adaptive sync as well. But having watched them for years, I also understand how nvidia operates.
Well, they have money invested in G-sync, and for now they are marketing it as a competing and superior technology to FreeSync, because they want hardware snobs to buy G-sync this and G-sync that and thereby get G-sync as widely adopted as possible. Once G-sync has hit its maximum market penetration and it's clear they won't be able to go any farther with it, I think economic realities will force them to support FreeSync as well, and this will give them a huge market advantage if they do it right. Doing it wrong, I believe, would entail, for example, failing to support FreeSync on 900 series cards, thereby forcing a lot of their customers to unload their used cards on the market as they switch to AMD in order to affordably gain access to all the features of their already-purchased or planned-purchase monitors. For this reason I believe the 900 series cards already have the hardware capability to support FreeSync, and that nvidia is already planning to update the firmware for these cards when the time is right. But as I said, if I'm wrong, I'm wrong. There's always the fact that graphics cards retain their value, and I can sell at any time, which, barring a drastic price reduction in G-sync capable monitors, I would vastly prefer over purchasing a hugely expensive monitor just to get everything perfect.

And really, the whole G-sync thing will look like a massive failure if they don't support FreeSync as well. It's a much better selling point to say 'our cards can do adaptive sync our way or their way' than to say 'well, for $100-$200 more (per monitor) our cards can do adaptive sync slightly better than our competitors do it, on average'. Nobody cares about a difference that only the biggest hardware snobs in the world will notice (mostly professional gamers and the uber rich, I suspect). And to alienate customers who can afford nvidia tech or a hugely expensive monitor, but not both, would be colossally stupid in my opinion.