AMD vs Nvidia: Whose Driver Updates Improve Performance More?


Rogue Leader

It's a trap!
Moderator


I think you have your information wrong somewhere here. While I know some APUs were dropped sooner than they should have been, the M290X has support from the latest Adrenalin driver.
 

TheOtherOne

Distinguished
Oct 19, 2013
And all this has nothing to do with the fact that most of these games are better optimized for nVidia hardware because of their [strike]under the table [/strike] promotions and bundled deals? :ouch:
 

logainofhades

Titan
Moderator
What I play favors Nvidia, so that's what I use. I didn't think it was that big of a difference until I went from an HD 7970 to a GTX 770 a few years back, and the GTX walked all over the HD. Also, I get good deals from a friend on used GPUs. Maybe one day Blizzard will fix their bad AMD optimization. Until that day comes, Nvidia it is.
 


BF3, BF4, and BF1 are not GameWorks titles. In fact, they were AMD titles.

That has nothing to do with two years of driver performance updates though.
 
Jun 20, 2018
When you are comparing Nvidia and AMD cards, why not use two systems: one with an AMD CPU and another, comparable system with an Intel CPU? That comparison would match strengths and weaknesses equivalently. Relying on a SINGLE GPU/motherboard setup does not do justice to the inherent differences between these companies.
GPUs (like CPUs) are optimized for a vendor's CPU/motherboard design parameters and are not necessarily optimal for the other vendor's CPU/motherboard design.
Of course, the same can, AND MUST, be said about CPUs and motherboards. "What's best for the goose may NOT be for the gander."
 
Jun 20, 2018
When comparing different manufacturers' products, why not test them on systems designed FOR each product, when possible? That is, Nvidia and AMD GPUs should have their performance measured on similar systems built around each one's respective CPU/motherboard design. While an AMD GPU will work on an Intel (or other manufacturer's) system, it is DESIGNED to work at its best on an AMD-authorized system design. The same holds true for Nvidia. Yes, this adds some complexity and cost, but it provides a clearer picture of the strengths and weaknesses of each tested unit.
 
Jun 20, 2018
I agree that getting the "latest and greatest" (whether that be a new car, motherboard, CPU, GPU, etc.) is little more than a cash sink. I'm running an RX 470 on an M5A97 R2.0 motherboard and have only now begun to think about an "upgrade". But why spend for minimal improvements now, when I can expect a MUCH larger difference within the next 1-2 years?
 

Rogue Leader

It's a trap!
Moderator


Yeah, that's not true. First, Nvidia has no relation to Intel, so Intel systems and Nvidia cards are not "designed" to work with each other better. Nor, in the same respect, is there anything special about an AMD CPU or motherboard that works any better with an AMD GPU. Any evidence to the contrary is straight-up anecdotal. There has never been a single instrumented test that could prove your point. Heck, AMD supplies Intel with Vega graphics for its NUC CPUs.

In fact, the best way to compare GPUs is on an identical system, so that the system is taken out of the performance equation and any differences you measure are purely down to the GPU.
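
To illustrate with a minimal Python sketch (the FPS numbers below are hypothetical, purely for illustration): once both cards have been run on the same fixed test bench, the only remaining variable is the GPU and its driver, so the per-game deltas can be compared directly.

[code]
from statistics import mean

# Hypothetical average-FPS numbers gathered on ONE fixed test bench,
# with only the graphics card (and its matching driver) swapped between runs.
results = {
    "RX 480 8GB":   {"Battlefield 1": 72.0, "The Witcher 3": 58.0, "GTA V": 81.0},
    "GTX 1060 6GB": {"Battlefield 1": 70.0, "The Witcher 3": 62.0, "GTA V": 85.0},
}

baseline = "GTX 1060 6GB"
for gpu, fps_by_game in results.items():
    if gpu == baseline:
        continue
    # The rest of the system never changed, so any per-game delta is down to the GPU/driver.
    deltas = [fps_by_game[g] / results[baseline][g] - 1.0 for g in fps_by_game]
    print(f"{gpu} vs {baseline}: {mean(deltas):+.1%} average FPS difference")
[/code]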
 

mihen

Honorable
Oct 11, 2017
I think this was a good comparison. These cards were released around the same time two years ago, and it took into account a wide spectrum of games. The main thing I didn't like about the article is that it wasn't specified whether it was a 4GB or 8GB RX 480. I expect the 8GB, given the link in the article and that it's at roughly the same performance level as the GTX 1060 6GB.

I think the main difference between today and the HD 7970 era is that AMD has been on top of drivers, releasing them every month and alongside every major game release. With a more unified approach to graphics APIs due to DX12, there is also not much variance between games.

What I think is interesting is that the R9 Fury X typically performs better than the GTX 1070.
 

s997863

Distinguished
Aug 29, 2007
I'd like to see an article on whose driver updates break basic functionality and backwards compatibility more, from old games to their own graphics applets. I've owned a GeForce 8800 and a Radeon 7850 over the last 10 years, on WinXP and Win7, and I've noticed the driver applets have gone to crap over time. In the old days all the settings fit on single pages: 3D settings on one page at an 800x600 desktop, and all the desktop color settings on another, with clear sliders for brightness, gamma, and contrast for games, overlay, or desktop. Compare that to the recent AMD "Crimson" drivers, whose Win10-style applet is too wide for a 4:3 monitor even at 1024x768, despite having big touch-screen-like icons, a lot of wasted empty space, and no scaling for different desktop resolutions. It has problems simply adjusting brightness/contrast or saving different profiles for it, and AMD tells users to adjust their monitor settings instead. WTF?
 

Loadedaxe

Distinguished
Jul 30, 2016
I don't know where those prices come from. You can get a GTX 1060 6GB for $279 and an RX 580 8GB for $249 as of today on Amazon and Newegg.

OT: I went the AMD/FreeSync route. When I got my RX 480 it was $239, and my Dell FreeSync 27" SE2717H was $229. I game at 75 fps easily for around $500; the same setup with a GTX 1060 and G-Sync would be roughly $900-$1,000. No thanks, Nvidia.
 

Krazie_Ivan

Honorable
Aug 22, 2012


i really, really need to stop voting on comments while still half-asleep... *thumb down*
 

norune

Distinguished
Nov 5, 2010


While you are right about comparing these cards based on their release, the results don't tell us about a few things that are actually interesting:

1: AMD's power-draw issue over the PCIe connector on RX 480 cards, which led to problems for a few users. The fix was delivered by driver and seems to have lowered performance somewhat; some games tested on the old driver vs. the newer one seem to show that, and after a while it seems to stabilize, with the new driver giving roughly the same FPS as the old one.
2: AMD seems to have fixed this in the newer RX 580 cards and also improved performance. Do driver improvements benefit those cards more?


 
UNDOUBLING,
You are misinformed if you think testing an AMD CPU + AMD GPU vs an Intel CPU + NVidia GPU makes sense.

Your main premise that AMD graphics cards work best with AMD CPUs is completely flawed from the start.

Even if there WAS an advantage to pairing an AMD GPU with an AMD CPU (there's not, to be clear), you're now adding more VARIABLES to the test system.

You can easily find BENCHMARKS that show how different AMD and Intel CPUs do with the same graphics card, so WHICH AMD card would you even choose if there were any truth to this?

Not to be rude, but you may want to go back and study computers a bit more. I'm not sure where you got the idea; perhaps you're getting it from APUs, which have attempted to optimize some code around a single CPU+GPU chip, but I'm not sure that has been very successful yet, nor does it even apply to discrete graphics cards.

The CPU in a desktop gaming system is basically a separate component from the graphics card and can easily be tested as such. Its gaming functionality is primarily a function of its ability to drive a certain number of DRAW CALLS, etc., to the graphics card.
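
As a rough back-of-the-envelope sketch (the rates below are assumed, not measured), a CPU-side draw-call submission limit caps the framerate in CPU-bound scenes no matter which graphics card is installed:

[code]
# Assumed (not measured) numbers, purely to illustrate the CPU-side draw-call ceiling.
cpu_draw_calls_per_second = 1_500_000   # assumed submission rate for a given CPU + API
draw_calls_per_frame = 10_000           # assumed scene complexity

cpu_bound_fps_ceiling = cpu_draw_calls_per_second / draw_calls_per_frame
print(f"CPU-bound FPS ceiling: {cpu_bound_fps_ceiling:.0f} fps")  # 150 fps with these numbers
[/code]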

If there were some "secret sauce" in an AMD CPU, that would be pretty big news. I've never heard of anything like that beyond the DX12 draw-call hardware that went into the Xbox One X.

(Theoretically that hardware could be put into a future AMD CPU, though you'd need a compatible future graphics card; it may or may not make sense at the desktop level.)
 
Update:
I meant "which AMD CPU" (not GPU) as in how would you pair an AMD GPU with an AMD CPU to compare to NVidia GPU + Intel CPU when even AMD has so much variance between CPU's... just the best AMD CPU even if it's not as good as the best Intel CPU?
 

norune

Distinguished
Nov 5, 2010


While there are advantages today to a test setup that focuses on the best IPC per core, there are a few holes which might in fact give different results depending on how you pair a system:

*Even though the latest Intel CPUs seem to have slightly better per-core performance than AMD, the drawback seems to be higher power consumption and fewer cores in total. More and more games are released with better support for multiple cores, and many gamers today use the computer for several tasks at once, like downloading, recording what they are playing, or even streaming it.
*Due to differences in how an AMD CPU handles certain tasks, there can also be both positive and negative results in some games; depending on whether the game developers have added optimizations for AMD's (or Intel's) instruction set, there might be certain benefits there too.
*Lastly, an AMD CPU + AMD GPU combination can give us better insight into whether AMD's driver performance also benefits from having an AMD CPU.

So yeah, having an Intel and an AMD setup where you can see the benefits of each one in different situations with the latest drivers is quite interesting, at least for me. AMD + Nvidia, AMD + AMD, Intel + Nvidia, and Intel + AMD are a few examples :)

 

froggx

Distinguished
Sep 6, 2017


I'm glad I'm not the only one who read the post that was quoted in the post I quoted and said "I cannot abide by this ridiculousness." I can't tell if redgarl is for or against DX12 (or just likes to state information generally considered to be fact). I couldn't tell whether (s)he skipped reading the article or merely thinks 20 days is a bloody long time in terms of graphics hardware development lifecycles (or, once again, simply has a thing for stating facts, regardless of relevance).

I'd like to add that another one of the factors in determining which games to use for benchmarking (and keeping them in rotation for as long as they provide useful data) is ease of comparing results over time. Some of those benchmarks have been in use for over 3 years, and keeping them in play makes it easier to pull up old articles and make an informed estimate of how the most recently reviewed hardware performs against older hardware (I say "informed estimate" simply because the test benches, drivers, etc., evolve over time).

I like this article. It helps in making that estimate by giving a glimpse into the effects that the evolution of the graphics drivers has had over time. While it would be nice if more than 2 cards had been tested, it's nicer to have the information from testing 2 cards than if it had been decided this kind of thing wasn't worth looking into at all.

Keep these games in your "stupid library!" XD
I've been reading this site for 15+ years because of the scientific precision and overall reliability of the data gathered as well as articles like this one that serve to further add perspective.
 


Sorry, but your points are meaningless in the context of trying to compare an AMD GPU to an Nvidia GPU... I tried to explain that creating an AMD CPU build (i.e. Ryzen) for an AMD GPU, to compare against an Intel build with an Nvidia GPU, introduces far too many variables.

Higher frequency and more cores?
Sure, so you're actually pointing out the very PROBLEM I described. If you have a particular CPU that works better in a given game, then your results are already skewed, so comparing the GRAPHICS CARDS between systems is pointless, which again was the POINT I was addressing.

You are talking about comparing AMD CPU vs Intel CPU which is not at all what I'm talking about.

Let me REPEAT: even if there were some way the AMD CPU architecture benefited an AMD graphics card relatively more than an Nvidia card on the same AMD CPU setup, it's still too many variables, so what exactly are you comparing?

*Also, LINK ME to an article that demonstrates that some AMD CPU architecture benefits an AMD GPU more than an NVidia GPU.

Now, there are differences in GPU driver efficiency, especially under DX11, that benefit Nvidia cards, but that's not what I'm talking about. In a CPU-constrained situation you'd get the same DX11 losses with an Intel CPU of identical performance.

So what's this "secret sauce" in AMD CPU's specifically that gives an advantage to AMD Graphics Cards? I've never heard of it.
 

norune

Distinguished
Nov 5, 2010


1: Well, for me, being able to see what driver updates can do on AMD + Nvidia / AMD + AMD / Intel + AMD or Intel + Nvidia combinations is in fact interesting.
Add the price of each system and you might get more interesting numbers out of it.
There have been several fixes made by game developers that address specific problems with Intel/AMD CPUs; just to point out a few:
https://steamcommunity.com/games/HL2Update/announcements/detail/177106646962802992
https://www.bungie.net/en/Forums/Post/237390845
https://segmentnext.com/2017/11/03/many-still-not-able-play-destiny-since-launch-intel-nvidia-user-base-facing-issues/

And my point was not focusing on the actual CPU, but on which combination would work best in different scenarios for each type of GPU. My own experience has been quite good with the AMD + AMD builds I've had, with almost no game-stopping bugs due to the CPU and GPU combination (there are a few Intel examples out there if you actually check the above links).

2: What are we comparing?
How much you get for your money with one build vs. another.
Stability of different combinations.
Benefits of certain builds vs. others.

A good example is that on AMD motherboards you often get PCIe x16, while on Intel builds it's x8; factor in other devices using the same lanes and you can get performance problems in certain scenarios.

https://www.youtube.com/watch?v=i8iE_sQBFXk
https://www.gamersnexus.net/guides/3176-dual-titan-v-bandwidth-limit-test-x8-vs-x16

Now again, not every user will experience such a problem (in fact very few will), but it's a useful thing to know, along with how it "might" affect you if you have several other devices using bandwidth there (SSDs, certain add-in cards, etc.).

Here is an older article about the 1800X and the benefits some combinations of AMD + AMD vs. AMD + Nvidia and Intel variants show in DX12 games. Not a lot to go on, but something to think about: https://www.techspot.com/article/1374-amd-ryzen-with-amd-gpu/page6.html

 


1. There is no real difference there. Most CPUs will perform nearly the same in games beyond 1080p maxed out. Very few games take full advantage of a quad core, let alone more threads. The more hardcore AMD people have been pounding the "it will play better when more cores are used" or "when DX12 hits" lines for years. Games will slowly move to more cores, but by the time 8 cores become common, the majority of games will probably still only be efficient with 4 cores. Yes, there is a very small minority that can utilize more cores; the majority do not, hence Intel still currently games better than AMD.

2. What? What Intel build only has x8 PCIe? All mainstream Intel CPUs have 16 PCIe 3.0 lanes going to one or two PCIe x16 slots. The only time it would drop to x8 is if you used both slots for GPUs. The chipset has its own set of PCIe lanes used for other devices such as M.2 drives, PCIe x1/x4 slots, USB 3.1, etc.

And as for performance, no consumer gaming GPU saturates a PCIe 2.0 x8 bus, let alone a PCIe 3.0 x8 bus. There is far more bandwidth available than current GPUs need.
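
For reference, here's a quick calculation of the theoretical per-direction bandwidth of the links being argued about, using the standard per-lane signaling rates (not measurements from this thread):

[code]
# Theoretical per-direction PCIe link bandwidth for the configurations discussed above.
# PCIe 2.0: 5 GT/s per lane with 8b/10b encoding    -> 0.5 GB/s usable per lane.
# PCIe 3.0: 8 GT/s per lane with 128b/130b encoding -> ~0.985 GB/s usable per lane.
links = {
    "PCIe 2.0 x8":  (0.5, 8),
    "PCIe 2.0 x16": (0.5, 16),
    "PCIe 3.0 x8":  (8 * 128 / 130 / 8, 8),
    "PCIe 3.0 x16": (8 * 128 / 130 / 8, 16),
}
for name, (gb_per_lane, lanes) in links.items():
    print(f"{name}: ~{gb_per_lane * lanes:.1f} GB/s per direction")
[/code]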

BTW, your last link found nothing substantial enough to prove that it makes any real difference.
 

Rogue Leader

It's a trap!
Moderator


Accidentally downvoted this but meant to upvote; this is correct.
 
Guys, you can stop responding to my REPLY POST because it's gotten way off track. My initial response was to the idea of demonstrating that AMD cards work better on systems with an AMD CPU vs. Nvidia cards on an Intel rig.

What I discussed isn't about value or anything like that, and I thought I was clear.

Norune,
THIS statement is simply, factually incorrect:
"A good example is that on Amd motherboards you often get Pcie x16 while on Intel builds x8"
In fact, up until not too long ago most AMD boards had PCIe 2.0 only, so the "x16" slot actually had half the bandwidth. You are likely confusing CHIPSET lanes with CPU lanes. Intel has had PCIe 3.0 x16 bandwidth to the main graphics card slot on every board for a long time.

*Also, your final assertion and the link suggesting AMD Ryzen CPUs impart some benefit to AMD graphics cards is DISPROVEN by the very link you provided at the end of your post just above... did you even read the article and watch the video that you linked?

"So, are Nvidia GPUs limiting Ryzen's gaming performance? Well, we didn't find any evidence of that."

Put simply, if Nvidia isn't performing any worse with a Ryzen CPU vs. Intel, then by extension the AMD graphics cards aren't gaining an advantage from Ryzen either. I'm not going to explain the reasons for the differences further, as the video and article explain it well, and if you still don't grasp what I'm saying, I'm not sure what to tell you.

I think the quoted sentence above says it very clearly. The title asks "Does Ryzen Perform Better With AMD GPUs?" and the answer is NO.

You can compare CPU to CPU and GPU to GPU, but in each case you must isolate a single VARIABLE; so for graphics cards, use the EXACT same system and change only the graphics card.

 