AMD or Nvidia for Linux Considering Future Events?

VitrasSlade

Aug 23, 2015
Currently I have an i5 4690k with two GTX 970s gaming under Windows 10 and everyday stuff in Linux Mint. What I plan to do later this year is have two PCs: one for regular stuff under Linux and the other strictly for gaming in Windows. With the development of Vulkan, I am thinking about what I will do with my Windows gaming PC if it is successful. So, considering future possibilities, would an AMD or Nvidia card be better?

I'd like something on par with what I have, as I was originally considering EVGA's FTW GTX 980 Ti. The main reason I'm reconsidering is how much cheaper FreeSync monitors are compared to G-Sync models. If AMD drivers are still too much of a headache, I may just spend the extra money on Nvidia's stuff. Thanks!
 
For Linux gaming, I would normally go for an R9 Nano (GeForce GTX 980 comparable). The NVIDIA drivers are much more optimized than the open source version of the AMD drivers, but at least you can opt for either the official closed source ones or the open source ones on AMD cards.

Using an open source operating system with closed source drivers just seems bad to me. But that is just me :)

As far as G-Sync and FreeSync go, G-Sync has better adoption and I have seen better implementations using G-Sync so far. I hope FreeSync will catch up, since I don't think the problem is with the standard - just issues with first-release products.
 
The white Nano by Asus looks very tempting, though the case I'd be putting it in would choke it. The PSU will be quite close underneath the GPU, so a full sized card would be optimal.

This is the monitor I am looking at eventually getting:
http://www.amazon.com/Acer-Curved-34-inch-UltraWide-Display/dp/B0111MRT90/ref=sr_1_1?ie=UTF8&qid=1459412266&sr=8-1&keywords=acer+xr34
Originally I was looking at the Acer X34 (the G-Sync version), but that's an extra $200. And as picky as I am, the red around the Predator logo would bother me, so the one I linked would fit better in my setup. The page doesn't list it, but the refresh rate is 80 Hz.
 


Well, if you want a full-length card you can get an R9 Fury in basic black. The Sapphire model is the closest to an actual double-slot card (many of them are actually 2.5 slots wide):

http://pcpartpicker.com/part/sapphire-video-card-100379ntocsr

I am curious which case you plan on using. For smaller systems, most people prefer something like the Corsair 250D, which is ideal for the R9 Nano.
 
The case I plan to use is the Fractal Design Define Nano S, as I want a smaller space taken up.

The R9 Fury would work pretty well. Though it's tempting to put in some extra cash and get the XFX Fury, which would probably be the best option since I wouldn't have to worry about choking the card (it's watercooled).
http://www.newegg.com/Product/Product.aspx?Item=N82E16814150756&cm_re=r9_fury_x-_-14-150-756-_-Product

For best performance, I'd be better off with the Fury X. I don't really want to though, since they all have that gray/silver and red on the side. The color scheme I'm going for is black and gold with a little white in there. Is there a way to mod that maybe?
 


Well, the video card will be face down in that case, so only the red logo on the top can be seen. You can black it out (or white it out) using automotive pinstripe paint for about $20 for a brush and paint. It will last forever and should not cause problems if you ever have to send the card in for warranty service.

If you don't want to go that far, I know a few people that have used metallic sticker paper instead. It doesn't look quite as professional, but it is by far the easiest. You just cut it to size, remove the backing and slap it on. You can get it at Amazon for about $10:

http://www.amazon.com/Silhouette-MEDIA-SVR-ADH-Printable-Silver-Foil/dp/B008RX1AYW/

Keep in mind that you should cut the sticker paper with a utility knife and a ruler to guide it - do not use scissors :)

You will usually need to use a plastic putty knife to make it look completely flat.

Good luck!
 
Hm. Definitely want to try that. Maybe black it out and use gold sticker paper for the silver/grey stripe on the side.

I'll do some deeper digging on AMD drivers in Linux, but this helped a lot as I don't have any experience with team red's cards. Thanks for the help.
 
I can't recall a time when AMD/ATI stacked up well against NVIDIA on Linux, though there is evidence AMD may well be catching up:

http://www.phoronix.com/scan.php?page=article&item=amd-nv-glvk&num=1

However, keep in mind that this result is for performance on the one game that now supports Vulkan on Linux, and AMD has almost certainly optimized just for this specific game. On the last page (page 3) you can see that they've caught up on the OpenGL side in *benchmarks* only, but on the first page of benchmarks (page 2) you can see that this parity does not extend to real-world apps. It seems to me that they're not going to stack up directly in real-world performance on Linux for another year at least, and I'd say 2 years might be a more realistic figure. They don't want Nvidia to hog all the Steam Machine glory, so I'm sure they'll be allocating a lot of programmer time to this task, but on the other hand they have some serious catching up to do.

As far as open source drivers are concerned, there *are* open source drivers for nvidia, provided by the nouveau project. Nvidia does work with them, although not always in a timely manner - it was quite a while before they provided signed firmware images for the 900 series to the nouveau project:

http://www.phoronix.com/scan.php?page=news_item&px=GTX900-Xonotic-More

So yes, if you're a *stickler* for open source drivers, you'll be better off with AMD. However, for *gaming* purposes I wouldn't recommend being an open source snob. You shouldn't have to fiddle around too much to get the binary drivers to work, as there are Ubuntu packages for them (and likely Mint as well):

http://packages.ubuntu.com/search?keywords=nvidia&searchon=names&suite=wily&section=all

One little closed source binary driver shouldn't hurt anything, especially considering you will likely have to use closed source firmware images to activate all sorts of doodads on your system, from WiFi cards to Bluetooth, possibly even things as mundane as your ethernet chip. Unless you decide to go open *hardware* on your next computer, you likely won't be going fully open source, sorry to say. This is not to say that I don't fully support going entirely open source, or as close to it as possible. I just want to let you know that when it comes to gaming, I make a pragmatic exception for myself and I reap the benefits.
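By the way, once you've installed the packaged driver, a quick way to confirm which kernel driver the card actually ended up bound to (nvidia vs nouveau) is just to wrap lspci. A rough sketch of my own, nothing official - it assumes Python 3 and pciutils are installed:

```python
# Rough sketch: show which kernel driver is bound to the GPU(s).
# Assumes Python 3 and the lspci utility (pciutils) are available.
import subprocess

out = subprocess.run(["lspci", "-k"], capture_output=True, text=True, check=True).stdout
lines = out.splitlines()
for i, line in enumerate(lines):
    if "VGA compatible controller" in line or "3D controller" in line:
        print(line.strip())
        # the indented lines that follow list "Kernel driver in use" and "Kernel modules"
        for detail in lines[i + 1:i + 4]:
            if detail.startswith(("\t", " ")):
                print("   ", detail.strip())
```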
 
Oh, I wasn't trying to stick strictly to open source; it was mainly the sync technology that had me debating. Either card works for me, as long as it performs within the frame rate range the feature needs (30-75 Hz for the specific monitor I'm looking at). And I didn't even think of it until now - does FreeSync/G-Sync work in Linux? Specifically in Ubuntu/Linux Mint. I didn't see any specific answers in my search.

I probably won't switch to gaming in Linux until (unless) Vulkan really takes off. Would you expect the drivers to be better optimized and polished by that time?
 
It should work:

https://www.reddit.com/r/linux_gaming/comments/304aaf/anyone_has_experience_with_gsync_on_linux/
https://teksyndicate.com/comment/1810031

As far as the drivers being optimized and polished by the time Vulkan takes off, I admit I'm a little biased, since I'm hoping Vulkan will take off like a rocket. More realistically, I'd say there will probably be a few AAA titles using Vulkan 1 year from now - probably not dozens.

As far as AMD catching up by then, well, their Vulkan routines don't require OpenGL, and they can fix their issues as new titles come out that reveal the bugs in their driver. However, I don't expect Vulkan to be backported to a lot of old titles, so OpenGL is going to remain relevant on Linux for quite a while, and I don't know how much investment AMD is willing to put into fixing their OpenGL implementation in the next couple of years. They may end up so focused on getting Vulkan working properly on every new game that comes out that they leave OpenGL on the back burner. One thing I can say is that 2 years from now, AMD *may* have Linux drivers that compete well with nvidia, there *may* be a couple dozen popular titles using Vulkan available, and WINE *may* have a working Vulkan backend for Windows DirectX games. It's all really just speculation at this point.
 
A lot of bias here too 😛 I might end up having my gaming rig dual-boot with Windows for older games... I haven't tried playing games through wine - how does that typically go?

And just curious, would you personally spend the extra money for Nvidia?
 
I did spend the extra money and bought a 970. Of course, I got it refurbished from EVGA and saved around $20 vs the Amazon price, which at the time was already low. You can often find good deals online if you look.

As far as dual-booting goes, on a 64-bit machine with virtualization extensions it might not even be necessary. You can run Windows concurrently under Xen or KVM. I believe this may be the solution eventually planned to be standard on Steam Machines for users who wish to install Windows - although I'm not sure how close they are to making this easy for non-power-users:


http://steamcommunity.com/groups/steamuniverse/discussions/1/558749825189649359/


I've never personally tried running Windows as a guest on a hypervisor before, but I think it should be fast, since that's what the virtualization extensions are for. If you run a lightweight DE like LXQt, you might *never* need more than 1 CPU core unless you're playing a game or running a guest operating system such as Windows. I would give this a shot if you think you can handle configuring something like Xen (I found it to be fairly easy - KVM may be even easier). The impression I got from Xen when I tried it (with Linux guests) was that it's absolutely fluid and nothing like running a guest in VirtualBox. I would expect performance with a Windows guest to be similar, unless you're doing something weird like virtualizing a webserver:

https://www.vmware.com/pdf/hypervisor_performance.pdf

And I would also expect performance with KVM to be similar with supported combinations of CPU and Windows version (KVM added support for Windows more recently, I believe).
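Just to give you an idea of how little glue is involved once a guest exists: here's a minimal sketch using the libvirt Python bindings to start a Windows guest that has already been defined (say, with virt-manager). The domain name 'win10-gaming' is a made-up example, not something from any guide:

```python
# Minimal sketch: start an already-defined Windows guest via libvirt.
# Assumes the libvirt Python bindings are installed and a domain named
# "win10-gaming" (hypothetical name) has been defined under qemu:///system.
import libvirt

conn = libvirt.open("qemu:///system")   # connect to the local KVM/QEMU hypervisor
try:
    dom = conn.lookupByName("win10-gaming")   # look up the existing guest
    if dom.isActive():
        print("guest is already running")
    else:
        dom.create()   # boots the defined guest
        print("guest started")
finally:
    conn.close()
```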

As far as wine goes, I think it will work well with a large selection of older games. I use it to run Doom 3 BFG Edition with no slowdown, for example. However, my overall strategy for buying games is to buy at least 80% games that will work on Linux without wine (I have over 100 games that run on Linux, almost all bought on sale at *very* reasonable prices). I only buy for wine if it's something very cheap that I would be remiss not to have (a great example being Borderlands: Game of the Year Edition - I want all three, but only the most recent two run natively on Linux). Honestly, gaming on Linux has never been better, and it's going to improve steadily over the next few years. Try your current games on wine if they're not native yet, and when you find ones that don't work you can help out by filing bug reports over at bugs.winehq.org. While you're waiting for your old games to become playable under wine, buy some games that are native on Linux and play them. Or buy a copy of Windows to run in dual boot or on a hypervisor - it doesn't matter. And if you already have a copy of Windows with a key that isn't tied to a specific piece of hardware, it might be a nice opportunity to play with a cool hypervisor, depending on how adventurous you want to get.
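If you do give wine a try this weekend, one habit that saves headaches is giving each game its own prefix. A rough sketch - the prefix path and 'setup.exe' are just placeholders for whatever you're actually installing:

```python
# Rough sketch: run a Windows installer/game under its own wine prefix.
# "~/prefixes/somegame" and "setup.exe" are placeholder paths, not real ones.
import os
import subprocess

env = dict(os.environ)
# give the game its own prefix so it can't mess up other installs
env["WINEPREFIX"] = os.path.expanduser("~/prefixes/somegame")

# wine creates the prefix on first run, then runs the Windows binary inside it
subprocess.run(["wine", "setup.exe"], env=env, check=False)
```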
 
That's pretty cool. Not sure if I'll be trying it soon though, as I'm having trouble even setting up Ubuntu Server at the moment.

And I'll try using wine with my setup this weekend to see how it runs. As far as AMD/Nvidia, I'll keep my eye on the driver support until I actually build it. Thanks for all of the help. : )
 


Sure :)
 



What a fascinating thread. I saw on Tek Syndicate that Linux now boasts the second-largest number of games on Steam of any platform, and it seems that even without Vulkan there has been a big push - maybe everyone finally got sick of Microsoft's crap.

I did find the Tomb Raider DX12 benchmarks interesting: at 1440p even the 6 GB of the 980 Ti was maxed out, and of the tested cards only the 8 GB R9 390X had enough VRAM, topping out at a jaw-dropping 7.1 GB. I'm guessing the developers are having issues with VRAM allocation under the new technology, and if it continues I can see a big push for Vulkan.

Fingers crossed - it would be good for everyone.

Just another side note: while AMD do have history with Vulkan, it being based off Mantle, it's being led by a guy from Nvidia now, and Nvidia are the ones pushing it hard.

To the OP: I'm just in the process of swapping my SLI 970s for XFX R9 390Xs. Once that's done I'll be picking up a 1440p ultrawide FreeSync panel, as like you mentioned they are considerably cheaper - some 30-40% cheaper here in the UK. It will take a few weeks for all the pieces to get to me, but I'm hopeful it will be worth it.
 


It'd be much appreciated if you shared the results. Hope all goes well for you too.
 
If you're seriously considering gaming on Linux right now, Nvidia is the only logical choice - probably for the future as well. I've seen people saying that 'AMD will catch up', and looking at these past few years I'm still not seeing much change. Some people might not agree with me, but to me it seems AMD did not put much effort into their official Linux driver because they were hoping the open source community would do the job for them instead. Last year, instead of adding more people to their Linux division, they cut people. Right now they might be catching up because part of Vulkan is based on Mantle, but if they don't give it the priority it needs, it is not impossible that Nvidia will pull ahead again later.
 


Just guessing here, but this sounds like it has something to do with TressFX and HairWorks. It's a big bone of contention between the two companies, and Tomb Raider is pretty much the poster child of TressFX. Nvidia does support TressFX, but I think Eidos and AMD are teaming up to give themselves bragging rights.


To the OP: I'm just in the process of swapping my SLI 970s for XFX R9 390Xs. Once that's done I'll be picking up a 1440p ultrawide FreeSync panel, as like you mentioned they are considerably cheaper - some 30-40% cheaper here in the UK. It will take a few weeks for all the pieces to get to me, but I'm hopeful it will be worth it.

I'm pretty sure nvidia supports freesync, only they're calling it 'adaptive-vsync':

http://www.geforce.com/hardware/technology/adaptive-vsync/videos

Just a heads up. I kind of doubt freesync has any features that adaptive v-sync does not, because freesync is an open standard that anyone can implement. IOW, I'm pretty sure if you keep the nvidia card you get both. If you know someone who has a freesync panel, you might want to ask to borrow it for testing, or else look for a youtube video of someone who's already tested adaptive sync on a 970.

 
VESA's variable refresh rate tech is called Adaptive-Sync.

FreeSync is AMD's implementation of VESA Adaptive-Sync (exclusive to AMD cards), so a game can have a variable refresh rate that follows its frame rate.

Nvidia's Adaptive VSync is a different thing: it's Nvidia tech that turns vsync on and off depending on frame rate. VSync turns on when FPS exceeds 60 to prevent tearing and turns off when FPS drops below 60 to reduce input lag. G-Sync is Nvidia's solution to both problems at the same time; with Adaptive VSync, lower frame rates still have a noticeable impact on image smoothness.

Nvidia does not support VESA Adaptive-Sync.

As it is, Adaptive-Sync is still optional in the latest VESA DisplayPort spec, so technically Nvidia can support the latest DisplayPort spec without supporting Adaptive-Sync.

Will Nvidia support Adaptive-Sync? On current products? Definitely no. AMD once mentioned that Nvidia's hardware lacks what is needed to make variable refresh rates work, hence the need for the G-Sync module. And when the spec for DP 1.2a was proposed to VESA, only AMD was present, and their implementation is based on how their own hardware works. So why doesn't the laptop version of G-Sync need a module? The problem could lie in how adaptive sync works on mobile itself. Initially it was a feature meant to save power, not a variable refresh rate that changes the screen refresh rate in real time to follow the game's frame rate. They said the protocol only works over short distances, not long cables. So Nvidia created the G-Sync module to make it work over long cables and further modified the tech to suit what they envision with G-Sync. If a simple driver were all it took to make adaptive sync work, AMD would not have proposed an additional spec on top of the existing DisplayPort spec, and there are hardware changes needed in the scaler for that. That's why you don't see current DP 1.2 monitors being updated to be compatible with the DP 1.2a spec. So will Nvidia's future GPUs support VESA Adaptive-Sync? Unknown. But they can if they want to, since Adaptive-Sync is open. But I imagine even if they are able to work with Adaptive-Sync monitors, they probably won't work with just any Adaptive-Sync monitor. For one, Nvidia is very strict about quality where their product image is involved. So the most likely scenario is that Nvidia cards will not work with monitors that are not certified by Nvidia.
 


Well, that's a bummer that I'm not going to be able to have adaptive sync on my next monitor, but I honestly don't know if screen tearing has ever happened on any of my hardware. It's possible that I simply don't notice it, although I notice it just fine when it's demonstrated in youtube videos:

https://www.youtube.com/watch?v=jVAFuUAKPMc

My current monitor is a 3-year-old Hanns-G which I run without V-sync. I have a big blotch of dead pixels in the lower left corner of the screen, so I plan to get one of the ROG monitors - 24-inch, non-G-Sync, 144 Hz, like this one:

http://www.amazon.com/Asus-VG248QE-24-inch-Ergonomic-Back-lit/dp/B00B2HH7G0

Once I get it, I'll see if I have a high-framerate camera somewhere so I can gather objective evidence of any tearing or stuttering I might be missing, and see if I can find fixes for it. Really though, I don't notice it on YouTube videos, in games, or when I play videos using mpv. It's not that I don't believe it's there, but if it is then it's going to take some training for me to see it. I assume it will be a lot more noticeable at 144 Hz (and perhaps also at 4K), so I'm not sure there's much point in worrying about it until I get a better monitor (although my current one was very nice in its time).
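Side note: when the new panel arrives, before hunting for tearing with a camera I'll probably just confirm X is actually driving it at 144 Hz - panels often default to 60. Something like this rough sketch (assuming an X session with the xrandr utility available) is enough:

```python
# Rough sketch: print the mode X is currently using for each connected output.
# Assumes an X11 session and the xrandr command-line tool.
import subprocess

out = subprocess.run(["xrandr"], capture_output=True, text=True, check=True).stdout
for line in out.splitlines():
    # xrandr marks the mode currently in use with a '*', e.g. "2560x1440 144.00*+ 60.00"
    if "*" in line:
        print(line.strip())
```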

Nvidia does not support VESA Adaptive-Sync.

As it is, Adaptive-Sync is still optional in the latest VESA DisplayPort spec, so technically Nvidia can support the latest DisplayPort spec without supporting Adaptive-Sync.

Will Nvidia support Adaptive-Sync? On current products? Definitely no.

I don't know where you're coming up with 'definitely'. The whole industry is starting to standardize on freesync now, and as far as I know nvidia could support freesync just by updating their firmware:

https://linustechtips.com/main/topic/511277-freesync-on-nvidia-hack/

If I have to sell my card and buy a new one to get adaptive sync working 2 years down the road, they're going to lose me as a customer. I'm sure they'll lose a lot of other customers this way too. So I have a feeling they're going to support FreeSync (or what Intel is calling 'VESA adaptive sync') on at least their 900 series cards. I think they surely realize that the 900 series is going to stay relevant for at least the next 3 years, and if 2 years from now the market gets flooded with cheap 900s being sold by people who want adaptive sync without paying $100-$200 extra, Nvidia is going to have a problem on its hands. Nvidia doesn't want to hemorrhage cash, so I'm pretty sure they're going to support this on the GTX 900 series and up. In fact, I would bet that this possibility of providing support later with a driver update is the reason they did not decide to support it out of the box.

However, if I'm wrong I'm wrong, and if I have to switch to AMD I will.

AMD once mentioned that Nvidia's hardware lacks what is needed to make variable refresh rates work, hence the need for the G-Sync module. And when the spec for DP 1.2a was proposed to VESA, only AMD was present, and their implementation is based on how their own hardware works. So why doesn't the laptop version of G-Sync need a module? The problem could lie in how adaptive sync works on mobile itself. Initially it was a feature meant to save power, not a variable refresh rate that changes the screen refresh rate in real time to follow the game's frame rate. They said the protocol only works over short distances, not long cables. So Nvidia created the G-Sync module to make it work over long cables and further modified the tech to suit what they envision with G-Sync. If a simple driver were all it took to make adaptive sync work, AMD would not have proposed an additional spec on top of the existing DisplayPort spec.

I don't see how this follows. The spec only describes how a monitor interfaces with a graphics card. It doesn't care about the specific implementation details beyond what's described in the specification.

And there are hardware changes needed in the scaler for that. That's why you don't see current DP 1.2 monitors being updated to be compatible with the DP 1.2a spec. So will Nvidia's future GPUs support VESA Adaptive-Sync? Unknown. But they can if they want to, since Adaptive-Sync is open. But I imagine even if they are able to work with Adaptive-Sync monitors, they probably won't work with just any Adaptive-Sync monitor. For one, Nvidia is very strict about quality where their product image is involved. So the most likely scenario is that Nvidia cards will not work with monitors that are not certified by Nvidia.

I think it's more likely that nvidia will continue to market G-sync as a higher end solution to the same problem. They will not abandon G-sync because although the performance difference is small, it is noticeable:

https://www.youtube.com/watch?v=MzHxhjcE0eQ

The advantage nvidia will get from supporting freesync is simple: just being able to support adaptive sync on the same monitors AMD does. Requiring nvidia certification would be counterproductive towards that end. They will reserve nvidia certification for G-sync, but they will support the full VESA adaptive sync standard while also being able to say 'if you want it perfect, buy an nvidia card and a G-sync monitor'

 
I said definitely because current Nvidia hardware does not have what is needed to make Adaptive-Sync work without the aid of the module. The eDP used in laptops is a bit different from the DP used on desktop monitors. When AMD first talked about the FreeSync concept, they said their GPUs had already had the necessary tech inside for three generations. But then, when FreeSync was finalized, why were only GCN 1.1 cards (the latest at the time) able to support FreeSync in games, contradicting that earlier claim? Most likely because it is not as simple as a driver update plus a firmware update for the monitor, the way they first imagined it. If a driver hack were all it took for an Nvidia card to support FreeSync, people would have done it long ago, but so far I have not seen anyone do it. People point to mobile G-Sync not needing a module as proof that current Nvidia cards can work with FreeSync and that Nvidia simply put a block in their driver, but if that were really the case I'm sure it would have been hacked quickly. Remember Nvidia blocking dedicated PhysX cards when an AMD GPU was present? There was a hack to make hybrid PhysX setups work again. SLI only working on boards with SLI certification? There is a hack so SLI also works on boards without it. Mobile G-Sync has existed since last year, so why is there still no hack to make Nvidia desktop GPUs work with FreeSync? So far we have only heard speculation; we don't even really know what changes Nvidia made to their mobile GPUs to make adaptive sync work without the module. And a company like Nvidia often does not give all the information to the public. Even AMD did not give the public details on how they handle FreeSync and kept calling it 'secret sauce'. And when they had a solution for image smoothness when frame rates fall below the VRR window, they also refused to explain how they made it possible and kept emphasizing the 'secret'. Though the guys at PCPer suspect it might not be that different from how Nvidia handles it (and Nvidia did not go secretive on this subject, unlike AMD). So if Nvidia's current hardware really did support Adaptive-Sync, I think people would have cracked how to make it work long ago.

As for certified Adaptive-Sync monitors (if Nvidia does support Adaptive-Sync on a future product), that is pure speculation on my part, but looking at how Nvidia has handled their tech for years, you can say that's how they roll. Counterproductive? It's working for them. The prime example is SLI-certified motherboards. Nvidia actually charges royalties from motherboard makers so their boards can run SLI (and at this point it is just a simple check by the Nvidia driver against the motherboard BIOS for the certification, instead of needing a specific Nvidia chipset like in the past), and yet do people really complain? The same with certified 3D Vision monitors: Nvidia 3D Vision will not work with just any 3D monitor.

For Nvidia's part, despite the lock-in, they do support their tech properly. Their 3D solution, for example: they are the ones responsible for providing the drivers so games work in 3D. With AMD, if a game does not support their HD3D tech natively, they will not provide drivers to make it work; instead they want monitor makers to compete among themselves to provide the drivers, and those monitor makers actually charge extra money for them. And in the case of G-Sync vs FreeSync, one of the advantages of G-Sync is Nvidia's tight control, so when a problem happens all they need to do is fix it in the drivers. That's not always the case with FreeSync; in some cases the problem is not driver related and the monitor needs to be sent back to its maker for a firmware update. With Nvidia, their module interacts with the Nvidia drivers to correct problems.

Personally, I was hoping Nvidia would support Adaptive-Sync as well. But I also understand how Nvidia operates, having watched them for years.
 


That's the thing, we really don't know. I tried to find a copy of the DisplayPort spec, but if it's on the internet they're not making it easy to find. I'll check torrent sites and such, but if it's this hard to find there might only be a few individuals working on it, and they could *easily* be hampered by the fact that the nvidia firmware is closed and they don't give direct help to developers (even Nouveau) except by giving them keys to sign their drivers with. Nvidia has a much easier task than the Nouveau project, as Nouveau has to reverse engineer everything, whereas nvidia engineers have the specs right in front of them. Even having access to the DisplayPort spec that I can't find anywhere, random hackers on the internet still have to deal with the closed nature of nvidia hardware - and hardware reverse engineering is not easy.


And a company like Nvidia often does not give all the information to the public. Even AMD did not give the public details on how they handle FreeSync and kept calling it 'secret sauce'.

This is just the specific implementation (Freesync). Adaptive sync is the specification. I wouldn't expect AMD to give up trade secrets that give them an edge. The only question is whether the spec itself can be implemented in firmware, and so far I don't see any reason to believe it can't. We're talking about a really small amount of information that needs to be shared between the graphics card and the monitor to make them work at the same rate. Yes, implementation details like how to know how many frames the graphics card will be pushing out 50 ms from now are hard to figure out, and even harder to implement when a hacker only has the option of reverse engineering the card based on whatever inputs and outputs he knows about, but doesn't actually know how this really complex piece of hardware works, and even if he gets a working firmware blob, has no way of signing it if it needs to be signed (and it probably does).



And when they had a solution for image smoothness when frame rates fall below the VRR window, they also refused to explain how they made it possible and kept emphasizing the 'secret'. Though the guys at PCPer suspect it might not be that different from how Nvidia handles it (and Nvidia did not go secretive on this subject, unlike AMD). So if Nvidia's current hardware really did support Adaptive-Sync, I think people would have cracked how to make it work long ago.

Well, nvidia had to give monitor makers the information needed to produce G-sync modules, or at least to interface with them. But if they're so open with this, why is it that after nvidia stopped selling G-sync upgrade kits, no one is producing any 3rd-party kits? If they're that open about it, is there a design document somewhere that will show hobbyists how to make their own upgrade kits?

As for certified Adaptive-Sync monitors (if Nvidia does support Adaptive-Sync on a future product), that is pure speculation on my part, but looking at how Nvidia has handled their tech for years, you can say that's how they roll. Counterproductive? It's working for them. The prime example is SLI-certified motherboards. Nvidia actually charges royalties from motherboard makers so their boards can run SLI (and at this point it is just a simple check by the Nvidia driver against the motherboard BIOS for the certification, instead of needing a specific Nvidia chipset like in the past), and yet do people really complain? The same with certified 3D Vision monitors: Nvidia 3D Vision will not work with just any 3D monitor.

As I said, if it's nvidia's own tech, they will require certifications. If it's an 'open' standard like adaptive sync, they will defer to the standards body.

For Nvidia's part, despite the lock-in, they do support their tech properly. Their 3D solution, for example: they are the ones responsible for providing the drivers so games work in 3D. With AMD, if a game does not support their HD3D tech natively, they will not provide drivers to make it work; instead they want monitor makers to compete among themselves to provide the drivers, and those monitor makers actually charge extra money for them. And in the case of G-Sync vs FreeSync, one of the advantages of G-Sync is Nvidia's tight control, so when a problem happens all they need to do is fix it in the drivers. That's not always the case with FreeSync; in some cases the problem is not driver related and the monitor needs to be sent back to its maker for a firmware update. With Nvidia, their module interacts with the Nvidia drivers to correct problems.

Personally, I was hoping Nvidia would support Adaptive-Sync as well. But I also understand how Nvidia operates, having watched them for years.

Well, they have money invested in G-Sync, and for now they are marketing it as a competing and superior technology to FreeSync, because they want hardware snobs to buy G-Sync this and G-Sync that and thereby get G-Sync as widely adopted as possible. Once G-Sync has hit its maximum market penetration and it's clear they won't be able to go any farther with it, I think economic realities will force them to support FreeSync as well, and this will give them a huge market advantage if they do it right.

Doing it wrong, I believe, would entail, for example, failing to support FreeSync on 900 series cards, and thereby forcing a lot of their customers to unload their used cards on the market as they switch to AMD so they can affordably gain access to all the features of their already-purchased or planned monitors. For this reason I believe the 900 series cards already have the hardware capability to support FreeSync, and that Nvidia is already planning to update the firmware for these cards when the time is right. But as I said, if I'm wrong I'm wrong. There's always the fact that graphics cards retain their value, and I can sell at any time - which, barring a drastic price reduction in G-Sync capable monitors, I would vastly prefer to do rather than purchase a hugely expensive monitor just to get everything perfect.

And really, the whole G-Sync thing will look like a massive failure if they don't support FreeSync as well. It's a much better selling point to be able to say 'our cards can do adaptive sync our way or their way' than to say 'well, for $100-$200 more (per monitor) our cards can do adaptive sync slightly better than our competitors do it on average'. Nobody cares about a difference that only the biggest hardware snobs in the world will notice (mostly professional gamers and the uber rich, I suspect). And to alienate those customers who can afford Nvidia tech or a hugely expensive monitor, but not both, would be colossally stupid in my opinion.
 
If you really followed FreeSync from the beginning, you will be aware that things changed before it hit the final product we have right now. Many believe Nvidia only needs to give driver updates to make it possible, but even in AMD's case that wasn't true. Why use the word 'Free' in the first place? Because at first they thought all you needed were driver updates: no need for a new card, since the required tech had already been inside Radeon GPUs for three generations, whereas for G-Sync you needed Nvidia's latest architecture at the time (Kepler); and no need to buy a new monitor, just a firmware upgrade for existing ones.

So how did that end up? In the end, FreeSync doesn't even work on the 7970 in games, while the GTX 680 is compatible with G-Sync - and those are cards from the same generation. And the way AMD talked before, even the 5000 and 6000 series would probably work with FreeSync, and you would only need to upgrade your monitor's firmware. If that was all it took, why did three scaler makers end up building specific scalers for Adaptive-Sync? Why not just use existing scalers, as AMD claimed could be done? And ultimately, we know only AMD proposed the spec for DP 1.2a. It is well known that when a hardware maker like AMD proposes a spec, they propose an implementation that works best with their own hardware; the same is true when IHVs propose their implementations for open APIs like OpenGL and OpenCL. Only this time no one disputed what AMD proposed to VESA, because they were the only ones there. Intel has already said that they will support Adaptive-Sync - but why not support it on their current products, if driver updates are all it takes?

And it seems you misunderstood the certification of Adaptive-Sync monitors. Yes, Adaptive-Sync is an open standard, but Nvidia would still be the one choosing which products work with their cards. Even with AMD FreeSync there is a certification for the FreeSync branding; not just any monitor maker can say their monitor supports FreeSync (and carry the FreeSync logo). To get the certification, the lower bound of the VRR range must be no greater than 42 Hz. Knowing how Nvidia has handled things in the past, they would go one step further: monitors without their certification would not work at all (though I believe that would be easily hacked). It sounds silly, but in Nvidia's reasoning they are not going to compromise the user experience, hence the tight control. When FreeSync finally launched, we heard that some FreeSync panels had ghosting issues while the same panel with G-Sync did not, and AMD explained that since they don't have tight control over monitor makers (they let them do their own thing), stuff like this happens. One FreeSync monitor from Asus limits the VRR upper bound to 90 Hz when FreeSync is active, even though it goes up to 120 Hz (or is it 144 Hz, I don't really remember) in normal mode - this despite AMD's claim that the FreeSync range can go as high as 240 Hz. So in the end, if some Nvidia users go with AMD's solution because the current 900 series can't work with VESA Adaptive-Sync, there is nothing Nvidia can do about it. Though one thing you should not underestimate about Nvidia is their brand image. Nvidia sponsors a lot of game tournaments, and even on non-triple-A games you see the Nvidia logo; for people who don't really follow the GPU world, all they know is Nvidia because they see it everywhere, and they only see the AMD logo in triple-A games from time to time.

 


I'm not sure what you mean by 'scaler'. Do you mean GPU? If so, I'm pretty sure Intel's 'VESA adaptive sync' GPUs will be compatible with Freesync monitors. As far as why Intel would have to come up with their own implementation, it's just because AMD is not sharing their IP with anyone else. It's not like they would publish the design of their own hardware in the DP1.2a spec - at most a reference design of a tiny little portion of a GPU, and probably not even that. More likely just a textual description of the interface and what data will go across it, with some black boxes representing GPU functions.

Why not just use existing scalers, as AMD claimed could be done? And ultimately, we know only AMD proposed the spec for DP 1.2a. It is well known that when a hardware maker like AMD proposes a spec, they propose an implementation that works best with their own hardware; the same is true when IHVs propose their implementations for open APIs like OpenGL and OpenCL. Only this time no one disputed what AMD proposed to VESA, because they were the only ones there. Intel has already said that they will support Adaptive-Sync - but why not support it on their current products, if driver updates are all it takes?

I wasn't suggesting that just any graphics card would be able to do this. The card has to be able to predict its own behavior a few (like maybe 50?) ms in the future. Also, I don't think intel even competes with AMD and nvidia yet.

And it seems you misunderstood the certification of Adaptive-Sync monitors. Yes, Adaptive-Sync is an open standard, but Nvidia would still be the one choosing which products work with their cards.

Ok, so this is up to the designer then. I believe that.

Even with AMD FreeSync there is a certification for the FreeSync branding; not just any monitor maker can say their monitor supports FreeSync (and carry the FreeSync logo).

That's because AMD created the spec and have a vested interest in the branding. Freesync is just their implementation of it, with maybe some extensions that will make it work better than the Intel version, yet fully compatible with any 'VESA adaptive sync' product. Nvidia now has a vested interest in the G-sync branding as their superior solution. I understand what you're getting at though, and you might be right about the certifications.

To get the certification, the lower bound of the VRR range must be no greater than 42 Hz. Knowing how Nvidia has handled things in the past, they would go one step further: monitors without their certification would not work at all (though I believe that would be easily hacked). It sounds silly, but in Nvidia's reasoning they are not going to compromise the user experience, hence the tight control. When FreeSync finally launched, we heard that some FreeSync panels had ghosting issues while the same panel with G-Sync did not, and AMD explained that since they don't have tight control over monitor makers (they let them do their own thing), stuff like this happens. One FreeSync monitor from Asus limits the VRR upper bound to 90 Hz when FreeSync is active, even though it goes up to 120 Hz (or is it 144 Hz, I don't really remember) in normal mode - this despite AMD's claim that the FreeSync range can go as high as 240 Hz. So in the end, if some Nvidia users go with AMD's solution because the current 900 series can't work with VESA Adaptive-Sync, there is nothing Nvidia can do about it. Though one thing you should not underestimate about Nvidia is their brand image. Nvidia sponsors a lot of game tournaments, and even on non-triple-A games you see the Nvidia logo; for people who don't really follow the GPU world, all they know is Nvidia because they see it everywhere, and they only see the AMD logo in triple-A games from time to time.

If they want the certification just for the branding, then fine, I get that. It just means that virtually every single monitor that supports FreeSync will get the Nvidia certification as well. They're not going to create a separate, incompatible version of FreeSync; all the adaptive-sync-capable cards will be able to interoperate with all the adaptive-sync-capable monitors. That's the whole point of creating a standard.

 
The scaler is the chip inside the monitor that handles how the monitor works. With G-Sync, Nvidia replaces that chip with the G-Sync module. The early concept of FreeSync was that this chip would not need to be replaced: of course not every chip could be upgraded, but some of them would just need firmware updates, according to AMD. In mid 2014 AMD showed a real monitor handling FreeSync after a firmware update, but they did not disclose the brand of the monitor, nor did they run real games on it. AMD did not show FreeSync running real games until true FreeSync monitors were shown at CES 2015, using the specific scalers that had been made for Adaptive-Sync.

The advantage of Adaptive-Sync is that it's cheaper, but at the same time you're going to need to do much more detailed research before buying one. You still need to look at the VRR window, because not every monitor will support the smoothness compensation; there are certain conditions that have to be met for it to work, according to AMD.

And G-Sync doesn't just bring smoothness and a tear-free experience; it also has a successor to LightBoost, a tech that AMD to date still has no answer to.