News: Nvidia Takes a Shot at AMD's 'Sub-Par Beta Drivers' for GPUs

I mean... if they barely put any actual functionality into the drivers and split off some of the genuinely interesting things into that garbage that GFE is, then it's not surprising.

With AMD, at least now, you don't really need third-party applications for monitoring, tweaking, and configuring the way you do with nVidia, so this is a very disingenuous thing to state and kind of funny, since it points out their own lack of spine.

As a general thing, I see as many issues with nVidia's drivers as I do with AMD's, even though they're way less "feature rich", so...

Regards.
 
Meh, AMD's drivers are light years better than they used to be. Maybe not quite at Nvidia's level, but they've made big progress.

AMD's main problem is having to optimize DX11 and earlier games individually, because they've never been able to match Nvidia's superb global DX11 optimization.
 
Honestly, having beta drivers to begin with is just asking for trouble from people who don't know any better, especially when people are conditioned to think that the latest driver version is always the bestest version.

AMD's main problem is having to optimize DX11 and earlier games individually, because they've never been able to match Nvidia's superb global DX11 optimization.
They could do that by implementing driver-wide support for DX11 deferred contexts. But for some reason they never did.
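For anyone curious what that actually involves on the application side, here is a rough sketch of how DX11 deferred contexts are used; the API calls are real D3D11, but the function and the surrounding setup are just illustrative:

```cpp
// Rough sketch (illustrative): a worker thread records draw work on a DX11
// deferred context, then the main thread replays it on the immediate context.
// Device/resource creation and error handling are omitted.
#include <d3d11.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void RecordAndSubmit(ID3D11Device* device, ID3D11DeviceContext* immediateCtx)
{
    // Each worker thread gets its own deferred context to record into.
    ComPtr<ID3D11DeviceContext> deferredCtx;
    device->CreateDeferredContext(0, &deferredCtx);

    // State changes and draws are issued exactly as on the immediate context.
    deferredCtx->Draw(3, 0);

    // The recorded work is baked into a command list...
    ComPtr<ID3D11CommandList> commandList;
    deferredCtx->FinishCommandList(FALSE, &commandList);

    // ...and replayed on the immediate context from the main thread.
    // Whether the driver actually builds command lists in parallel, or the
    // runtime just emulates them, is the "driver-wide support" in question.
    immediateCtx->ExecuteCommandList(commandList.Get(), FALSE);
}
```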
 
Honestly, having beta drivers to begin with is just asking for trouble from people who don't know any better.
If you think that "Game Ready", "WHQL", or similar drivers are better in general than "beta" ones, you are mistaken. Sure, they likely test them more with games that I'm never going to play, but that's completely useless (to me).
 
If you think that "Game Ready", "WHQL", or similar drivers are better in general than "beta" ones, you are mistaken. Sure, they likely test them more with games that I'm never going to play, but that's completely useless (to me).
WHQL is a formal qualification test from Microsoft that every driver must go through, regardless of what it's for.

I think that makes them objectively better than beta drivers.
 
I have to say that in my 15 years with AMD I did have numerous driver issues over the years that required rolling back to the previous month's release, and sometimes I couldn't use newer drivers for months on end because they would introduce some game-breaking or system-breaking issue. I have not experienced this yet in my two years of using an nVidia card.
 
In 25 years, I haven't had problems with ATI/AMD drivers. The vast majority of my cards have been ATI/AMD.

I've had only a few Nvidia cards, and I have had issues with their drivers in the past, but not a lot. One was unsolvable, though: one particular EVGA Nvidia card did NOT want to work in a system that had a motherboard with an Nvidia chipset. Ironically, an ATI card worked in it just fine.

Still, my thought is that maybe Nvidia felt the need to try to reinvigorate that dying group that insists "Oh, AMD drivers are unreliable" because of some issue from years ago that they may never have personally experienced, but only read about. Gotta get them to start making those claims again online whenever someone asks, "Should I choose this Nvidia card, or that AMD card?"
 
Still, my thought is that maybe Nvidia felt the need to try to reinvigorate that dying group that insists "Oh, AMD drivers are unreliable" because of some issue from years ago that they may never have personally experienced, but only read about. Gotta get them to start making those claims again online whenever someone asks, "Should I choose this Nvidia card, or that AMD card?"
I mean, AMD RTG's marketing group doesn't have clean hands either.

But I don't condone NVIDIA stooping to their level.
 
Meh, AMD's drivers are light years better than they used to be. Maybe not quite at Nvidia's level, but they've made big progress.

AMD's main problem is having to optimize DX11 and earlier games individually, because they've never been able to match Nvidia's superb global DX11 optimization.
It's why I'm returning my 6700XT and keeping the 3060Ti. Although the 6700XT should be faster than the 3060Ti, it's significantly slower in the DX11 game I play, FFXIV. The 3060Ti can do 4K at 60 fps easily on max settings, while on the 6700XT I have to significantly lower settings to stay above 60 fps.
 
They could do that by implementing driver-wide support for DX11 deferred contexts. But for some reason they never did.

I thought they decided that their GPU architecture would not benefit from that? I remember the whole reason they spent money on Mantle was that the way they designed their then-current GPUs made it very tough to multithread pre-DX12 APIs universally, or something along those lines.
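For what it's worth, the runtime does expose what a driver claims here; a minimal sketch of the query (real D3D11 API, helper function name is just for illustration):

```cpp
// Sketch: ask the D3D11 runtime whether the installed driver executes
// command lists natively, or whether the runtime emulates them in software.
#include <d3d11.h>

bool DriverRunsCommandListsNatively(ID3D11Device* device)
{
    D3D11_FEATURE_DATA_THREADING threading = {};
    device->CheckFeatureSupport(D3D11_FEATURE_THREADING,
                                &threading, sizeof(threading));
    // FALSE here is the emulated path being discussed for AMD's DX11 driver.
    return threading.DriverCommandLists == TRUE;
}
```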
 
"It’s why I’m returning my 6700xt and keeping 3060ti. Although the 6700xt should be faster than the 3060ti, it’s significant slower in the dx11 game that I play, FFXIV. The 3060Ti able to do 4K easily at 60 fps on max settings, I have to significantly lower settings in order to keep above 60 fps. "

My 6900XT is often slower than my 3090 in pre-DX12 titles, but when the drivers are well optimized (and no RT is involved) it's actually faster than the 3090. I think the last Hitman showed this in benchmarks.

It's just taking us forever to ditch pre-DX12/Vulkan/Mantle games.

Currently, Elex 2 is an absolute mess no matter which GPU you use, but since it's DX11 it gets a not-insignificant boost on the 3090 versus the 6900XT. The performance is pretty pathetic either way, though, given the level of graphics.

Also, I haven't had any stability issues on either card since I got them. And AMD's driver interface is far slicker than Nvidia's.
 
Nvidia is joking, right????
My old 1080 Ti ran like a Ferrari until Nvidia started messing with their new ShadowPlay features. After that, every driver got worse and worse. Then it all went to hell.
 
It's why I'm returning my 6700XT and keeping the 3060Ti. Although the 6700XT should be faster than the 3060Ti, it's significantly slower in the DX11 game I play, FFXIV. The 3060Ti can do 4K at 60 fps easily on max settings, while on the 6700XT I have to significantly lower settings to stay above 60 fps.

I am calling straight BS on this comment, or you have way more problems with your rig than you know. I am able to hit 120 FPS with a 6700XT at 1440p on Ultra settings easily with the latest drivers. Maybe turn off vsync 😅
 
The situation is reversed on Linux.

Nvidia's drivers are a bovine excrement show, while AMD's (and also Intel's) drivers are the real deal. There's usually a big overlap (picture a Venn diagram) between 1) new Linux users having huge problems and 2) users who (of course!) have an Nvidia card.

If you want a perfect experience with an Nvidia card on Linux, your best bet is to use some two-year-old distro and lag behind everybody; then you'll have a great experience, because Nvidia is chronically late to supporting new features.

Both SDL and Ubuntu are issuing feature rollbacks right now specifically because of Nvidia's ultra-slow driver work.
 
If I were nVidia, I would not say this: their *nix support is pathetic and filled with endless caveats, and on Windows the software stack included with the drivers has been an endless vulnerability vector for many, many exploits.

Just because they have more software devs and computer engineers on the payroll, and more releases, does not mean they're better. Phew.
 
I am calling straight BS on this comment, or you have way more problems with your rig than you know. I am able to hit 120 FPS with a 6700XT at 1440p on Ultra settings easily with the latest drivers. Maybe turn off vsync 😅
I never said anything about 1440p performance; it's at 4K where the differences mount.
Hook up a 4K monitor or TV to your 6700XT, put on max settings, run around Limsa, and get back to me.
Even Tom's Hardware agrees with me:

[Tom's Hardware FFXIV 4K benchmark chart]


The 99th percentile may not seem like much of a difference, but the microstutter on the 6700XT is bad at 4K, while on the 3060Ti it's much less frequent. The 6700XT was originally going to be in my wife's 10850K rig, which had a new SSD put in, so it was a clean install. The 3060Ti went in without me even running DDU, and it was already smoother and faster.
I tried it on my 12900K, same thing; I tried it in my son's 5900X, same thing at 4K. Drop it to 1440p and it runs great.

I'm sure it's a combo of AMD not putting much effort into optimizing DX11 and Squeenix not optimizing their game for AMD either.
 
I never said anything about 1440p performance; it's at 4K where the differences mount.
Hook up a 4K monitor or TV to your 6700XT, put on max settings, run around Limsa, and get back to me.
Even Tom's Hardware agrees with me:

[Tom's Hardware FFXIV 4K benchmark chart]

The 99th percentile may not seem like much of a difference, but the microstutter on the 6700XT is bad at 4K, while on the 3060Ti it's much less frequent. The 6700XT was originally going to be in my wife's 10850K rig, which had a new SSD put in, so it was a clean install. The 3060Ti went in without me even running DDU, and it was already smoother and faster.
I tried it on my 12900K, same thing; I tried it in my son's 5900X, same thing at 4K. Drop it to 1440p and it runs great.

I'm sure it's a combo of AMD not putting much effort into optimizing DX11 and Squeenix not optimizing their game for AMD either.
That is where the 6700XT's performance lands at 4K most of the time. It's not a "driver issue". Look at the 6800 and where it stands: it has a bigger cache than the 6700XT and it shows. At higher resolutions, the effective bandwidth difference between the cards makes a big difference.

Don't go blaming the tires because the clutch was burned and wasted.

Also, use DX12 or Vulkan wrappers for DX9/10/11 games if you can find them. They improve everything for both nVidia and AMD, by a lot. I use one for Guild Wars 2 and it's like magic.

Regards.
 
That is not fair; that was only an EVGA card, so I would blame EVGA, not Nvidia, especially when it happened during a pandemic when no one had enough parts.
This would hold water if they didn't have a history of stupid failures. Remember the RTX 3000 launch and how the cards were blacking out because the power delivery wasn't enough for the aggressive "review" clocks at the beginning? Or how the infamous GTX 590 would literally burn under operation? Or how they've bricked/burned cards due to incorrect fan settings?

There's plenty that people usually forget.

And yes, AMD also has similar histories, hence why this statement reads as a bit laughable. At least AMD hasn't burned cards? Not that I remember... The 5700 series had rough initial days, but those cards are fine now (like Ampere!).

Regards.
 
I never said anything about 1440p performance; it's at 4K where the differences mount.
Hook up a 4K monitor or TV to your 6700XT, put on max settings, run around Limsa, and get back to me.
Even Tom's Hardware agrees with me:

[Tom's Hardware FFXIV 4K benchmark chart]

The 99th percentile may not seem like much of a difference, but the microstutter on the 6700XT is bad at 4K, while on the 3060Ti it's much less frequent. The 6700XT was originally going to be in my wife's 10850K rig, which had a new SSD put in, so it was a clean install. The 3060Ti went in without me even running DDU, and it was already smoother and faster.
I tried it on my 12900K, same thing; I tried it in my son's 5900X, same thing at 4K. Drop it to 1440p and it runs great.

I'm sure it's a combo of AMD not putting much effort into optimizing DX11 and Squeenix not optimizing their game for AMD either.

Probably more the fact that a 6700XT isn't really a 4K card and should really be used at 1440p. In AMD's lineup, if you want 4K, that's the 6800XT and 6900XT. And if you want the best 4K performance, you really should be on NV.
 
Probably more the fact that a 6700XT isn't really a 4K card and should really be used at 1440p. In AMD's lineup, if you want 4K, that's the 6800XT and 6900XT. And if you want the best 4K performance, you really should be on NV.
That's the problem here: the 3060Ti, which is supposed to perform worse than the 6700XT, runs better than it because FFXIV is an unoptimized DX11 game.
I really wish Squeenix would get their act together and upgrade FFXIV to DX12 or Vulkan, but it's not happening anytime soon. There's also no DX12 wrapper for FFXIV. Ironically, there's a Metal wrapper for FFXIV on my MacBook Pro M1 Max that runs great...
I've been trying to get a 6800 or 6800XT for my wife's computer for a decent price, but at this point 3060Tis are much easier to come by and perform well enough at 4K for her FFXIV gaming.
 
That's the problem here: the 3060Ti, which is supposed to perform worse than the 6700XT, runs better than it because FFXIV is an unoptimized DX11 game.
I really wish Squeenix would get their act together and upgrade FFXIV to DX12 or Vulkan, but it's not happening anytime soon. There's also no DX12 wrapper for FFXIV. Ironically, there's a Metal wrapper for FFXIV on my MacBook Pro M1 Max that runs great...
I've been trying to get a 6800 or 6800XT for my wife's computer for a decent price, but at this point 3060Tis are much easier to come by and perform well enough at 4K for her FFXIV gaming.
But a game running like a dog on one brand doesn't mean that brand is bad; it's just the game developers not optimizing for it. This has been the case since the TWIMTBP days or earlier, when nVidia would invest a ton of money in "partnerships" with developers and basically leave AMD no option but to try to tweak things via drivers well after a game's release. This happens in the realm of CPU optimizations as well, so it's nothing new.

Performance differences aside, this is something other than "driver stability". AMD hasn't burned cards playing New World, yes? :)

Regards.
 