Do New Drivers Really Boost Performance?

The reason some games run better on nVidia hardware or ATI/AMD hardware is that the games are optimized for that hardware. Look at Half-Life for ATI and MS Flight Simulator X for nVidia.

Look at nZone for a list of games; it even tells you "The Hottest Games Developed and Optimized on NVIDIA"... so they will have higher FPS than the ATI cards.
 
Nice article Tino and TH folks!

It's nice to feel/see that you guys care about feedback.

Would it be too much trouble to ask if you could do reviews of upcoming drivers too? New features, improvements, goodies and all the rest that can be in them.

Hope you guys test today's cards with overclocked processors as well as stock processors, since you'll notice they are held back at almost all resolutions below 1920x1200. I'm only asking this so we can actually see noticeable differences among cards when we are at lower resolutions and detail settings.

Esop!
 
Interesting article, although the newest Forceware drivers (178.xx) are far better than the previous ones and make a marked difference on most of the newer NVIDIA cards.
 
I think it's a little early to be doing a "do drivers help" article. Doing tests so close together doesn't really show much, especially for driver sets so close to launch. I would've preferred waiting for the ones that actually purport to be big performance boosters. The initial drivers after launch are primarily bug fixes; performance drivers usually take a while to be added and to get WHQL certification. To see the length of the development cycle for those, just look at ATi's Doom driver betas, Chuck Patch betas and others, where incorporation into the main WHQL driver package took about 3 releases.

I think this subject should be revisited for both companies towards the end of the year, when there's been ample time for bug fixes and speed boosts. Also, 3DMark should be removed from the testing, since basically every company optimizes their pre-launch hardware for the best out-of-the-gate performance in Bungholiomarks, so it's the worst test for improvements. A better test would be to take a tough new title, or lesser-benchmarked but still popular (and thus worthy of driver-team attention) titles that would not be the focus of early optimizations but could potentially improve with successive drivers. If this were done again around Xmas or the new year, I would suggest comparing the 8.7 Catalyst and launch Forceware to the drivers shipping then, in games like Warhead, Far Cry 2, Fallout 3, Call of Duty: WaW, etc., to see what the updated drivers do for games that weren't around long before the launch of the hardware. Assassin's Creed is the only newish title in this review, and even then the biggest interest there is a non-driver issue, namely the SP1/DX10.1 issue.

As for the launch drivers, people shouldn't confuse the drivers that launch with the cards, which have built-in tweaks specific to that hardware, with the generic version of the same driver for other hardware; they aren't the same.

Anyone who thinks drivers don't play a role obviously hasn't had much experience with the cards or with the history of these changes. The most important point: rarely do driver changes make a difference so close to launch. As with the G80 and R300 series, major driver improvements didn't come until many releases later, in the 4-6 month range. Overall, drivers will rarely have a major impact on games that are already performing okay without a noticeable issue. Where driver improvements matter most is where something isn't quite acting the way it should or was expected to; that's where you will notice big jumps, because they either fix a glitch or change the way they do things to better address that case. All the major IHVs have examples of this, but they are the rare exception, not the rule where globally everything sees massive changes.

IMO, it's too early for this review and it seems to have come as a result of the tests of the Catalyst 8.9 drivers on the German website yesterday.
 
[citation][nom]wh3resmycar[/nom]so this article is basically a slap in the face to the ati fan boys who cried foul when the 8.8 drivers werent used on the GPU charts. i've been using videocards for like 7 years already and i'm not expecting new drivers to give me performance boosts, they provide more "stability" instead (like fixing corrupt textures).so for the apes who thought they were cheated, and thought theres a conspiracy between nvidia and tom's... [/citation]

I sure hope that wasn't directed at me, or else you totally missed my criticism of the previous review (which arrived a month after it was published on the German site).

As for the performance boost, while I agree that the companies should focus primarily on stability and bug fixes (and that's what most people say/ask for in polls on the major sites), to ignore the reality that there are driver boosts (for both IHVs) over time is ignorant. One of the easiest examples to see would be the increase in performance of the GF8800GTS-640, which barely beat, and often lost to, the X1950XTX (with AA) at launch. However, over time nVidia improved performance, especially with regard to the texture-memory issue, and voilà: a boost of about 30% over time in situations with heavy memory usage.

The main thing people have problems with is when tests are not equal, be it latest drivers, patches, etc., or equal footing for platforms. The biggest issue with the previous article was that it was so old it was no longer relevant, especially when it displayed issues that were later resolved, like in ET:QW.
 
[citation][nom]enewmen[/nom]I am thinking the 4870 X2 should out perform the 4870 CF. Since the X2 has an addtional 5 GB/s side-port interface.[/citation]

AMD has said that the side-port is not currently being utilized--it'd require driver optimization to take advantage of, and that bandwidth is currently not seen as necessary. Just an FYI from a conversation I've had with them!
 
Hmmm, if there is no real performance gain with newer Catalyst drivers but there is with new GeForce ones, perhaps this means the ATI drivers were already fully optimized, whereas the GeForce drivers left something to be desired when first released. What I'm trying to say is that you could look at it another way and see the lack of any performance increase from new ATI drivers as a good thing. If there is no performance gain, maybe it means the original drivers are actually excellent and need no tweaking as regards performance, only as regards compatibility etc. What you've got to ask is why the Nvidia drivers weren't working at maximum capability in the first place. But I'm no tech expert, so I'm probably wrong.

Oh, and don't forget that Nvidia drivers accounted for 30% of all crashes in Vista when it was first released, so that tells you something about their drivers too. And no, I'm not an AMD fanboi; I currently own an 8800GT.
 
Seems like there were plenty of 10-20% increases throughout the benchmarks, while the drops were usually less than 10%. Some people will see it either way, but there were some rather large leaps in certain areas, and a major list of bug fixes, including all the 8.7 updates, that the article doesn't mention.

I just hope THG learned a lesson and will at least try to evaluate the current set of drivers before running their test suite in the future.
 
[citation][nom]sandmanwn[/nom]Seems like there were plenty of 10-20% increases throughout the benchmarks. The drops were usually less than 10%. Some people will see it either way but there were some rather large leaps in certain areas and a major list of bug updates including all the 8.7 updates that the article doesnt mention.I just hope THG learned a lesson and will at least try to evaluate the current set of drivers before running their test suite in the future.[/citation]

That'd certainly be my aim, even if they're guaranteed to be outdated again a week or two later! Thanks for weighing in on it though--definitely appreciated.
 
Overall a good article, as it shows a lot of info about what we want to see, which is good. Most likely the 8.9s hadn't been released yet when you ran these tests, but it would be nice to see a small update with those drivers and the 8.10 betas, as the 8.9s and 8.10 betas really show an increase in Crysis. Also, you said the 4870 ran out of steam at 1920; could it be the 512MB frame buffer by chance, as it would be pretty much used up at that resolution? I did notice that both Nvidia cards had 1GB frame buffers. An update with the 4870 1GB version may prove that the little ATI chip has more steam at those higher resolutions than it showed in this test. Just a thought...
 
@ skalagon : another explanation for the lack of ATI driver improvement could be that they concentrate solely on bug fixes, whereas Nvidia may care more about performance (or simply not have that many bugs to fix). Also, in your Vista consideration, account for the lack of ATI cards in use at the time: between the X850 and the HD4850, ATI hadn't released any reasonably good card, so quite a few more Nvidia cards must have been around when Vista launched. And in general, the driver crashes could (not saying they should) be attributed to Aero crashing, or to a game being poorly adapted, or not adapted at all, for Vista.

@ ProDigit80 : your eye theory has a tiny little flaw: synchronisation! What you want is 25fps with vsync enabled, or enough fps without it to make sure you get more than 2 frames per 0.1s. A 25fps average isn't good enough; a 25fps minimum is.
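To put rough numbers on the average-vs-minimum point: here's a tiny sketch with a made-up frame-time log (all values hypothetical) showing how a run can average well above 25fps while the slowest frame falls far below it:

```python
# Hypothetical frame-time log (milliseconds) for a one-second run;
# the values are invented purely for illustration.
frame_times_ms = [30] * 28 + [160]  # mostly smooth, plus one big stutter

def avg_fps(times_ms):
    """Average fps over the whole run: frames rendered / total time."""
    return 1000.0 * len(times_ms) / sum(times_ms)

def min_fps(times_ms):
    """Instantaneous fps of the slowest single frame."""
    return 1000.0 / max(times_ms)

print(round(avg_fps(frame_times_ms), 1))  # 29.0 -- looks fine on paper
print(round(min_fps(frame_times_ms), 1))  # 6.2  -- the stutter you actually feel
```

The average (29fps) clears the 25fps bar, but the single 160ms frame means the motion still hitches, which is why a minimum-fps figure tells you more than an average.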

@ GAZZOO : using an AMD chipset instead of an Intel one for a CF test should only subtract a tiny bit of overall performance, but the performance increase should be the same. The problem is they can't test the potential performance increase from new drivers if they're using a slow CPU, and only Intel has a fast one (given that games aren't optimized for Phenom).

@ jdoe : not really. It means that general driver performance improvements in the ATI camp shouldn't be expected to be high between one or two versions. Imo Nvidia drivers are better, because I'm having huge issues with Catalyst and my TV, but that doesn't mean the ATI drivers are bad. I could simply be a stupid monkey that doesn't know where to click to make it work properly.

 
This is epic fail. Let's put this in perspective here. When did the G200 series get released? When did the 4870 get released? Who's had more time to optimize their drivers? Who's also released brand-new cards that have to have attention from the driver team since their release? If the writer knew enough about this round of ATI cards, it would be good to know that most of the driver optimizations are only now being implemented, as they've had trouble with fan speeds (thus the new CCC fan control seen in the 8.10 beta drivers) and other things that needed to be worked out. If we stopped here and drew our conclusion, we'd all go away thinking ATI doesn't show much progress with their drivers going forward, but that just isn't so. If what's been seen on the newer nVidia cards counts as increases, there will be increases on ATI as well.
 
The assumption that 25fps is enough is not the full story for games.

Yes, a minimum of 25fps is enough for fluid movement if you're watching something like a film, or if you're just standing there watching the scenery. But as soon as you move that mouse quickly to look in a different direction, THEN you'll notice the difference between 25 and 60fps, as the number of frames the game manages to draw in between increases dramatically and the movement seems that much smoother.

It sometimes amazes me how few people seem to realise this even from first hand experience.
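The fast-turn case is easy to put into rough numbers. This is just an illustrative sketch with assumed values (a 180-degree flick taking a quarter of a second), showing how much bigger the angular jump between frames is at 25fps than at 60fps:

```python
def frames_during_turn(turn_seconds, fps):
    """How many whole frames the game can draw while the camera sweeps."""
    return int(turn_seconds * fps)

def degrees_per_frame(turn_degrees, turn_seconds, fps):
    """Angular jump between consecutive frames during the turn."""
    return turn_degrees / max(frames_during_turn(turn_seconds, fps), 1)

# A quick 180-degree flick taking a quarter of a second:
print(degrees_per_frame(180, 0.25, 25))  # 30.0 degrees between frames
print(degrees_per_frame(180, 0.25, 60))  # 12.0 degrees between frames
```

A 30-degree jump per frame reads as a choppy smear, while 12 degrees per frame looks far smoother, even though both rates are "enough" when the camera is standing still.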
 
Now that's more like it. With the quality of some of the articles I'd seen lately, I was starting to wonder about THG, especially after that series of articles on CPU heatsinks. My biggest concern with the drivers was what a couple of others stated: the 8.6 drivers came out before the 4800-series cards and only worked on those cards with a hotfix.
 
Many driver optimizations reduce the software overhead the driver itself adds. To show this, shouldn't you be running an average-performance CPU instead of an ultra-high-end $1200 chip?
 
Good job, bro: you selected two driver versions where one doesn't even mention performance increases, and for the other you leave all but one of the listed games untested.

Also, out of all the games listed as gaining performance in 8.7, you only tested Call of Duty 4, and that game is only listed as gaining "up to 4% on specific maps," which you do see in your results. From the 8.8 release notes, it looks like a bug-fix release that doesn't even mention any performance increases.

Why did you even write this article?
 
Why did you even write this article?

I am wondering this too. The tests show that driver updates may or may not increase the performance of specific games at specific resolutions. So, if someone claims that the latest drivers should have been used for a certain test, they could be right.

Do New Drivers Really Boost Performance?

Maybe!
 
Thanks as always for taking the time to do all of this testing.

It seems that driver updates are rather unpredictable, which is what I would have guessed. I've had problems playing older games with newer drivers, which means that while the newer drivers fixed bugs in some games, they created new problems in others. Because this applies to performance tweaks as well, it's probably best to do what I do and skim the release notes of the new drivers to see if they imply any real benefit to me. If they mention fixing bugs in games I have or want to play, or something else I'd like, then I'll get them; if not, there's no reason to "fix what ain't broke."
 
[citation][nom]minsky[/nom]Many driver optimizations improve software overhead from drivers. To show this, shouldn't you be running on an average performance CPU instead of an ultra-highend $1200 chip?[/citation]

The Core 2 Extreme X6800 @ 2.93 GHz is now equivalent to an average chip; actually, a cheap chip. You can pick up an E8400 @ 3.0 GHz right now for 170 bucks, a 45nm part with 6MB of cache that is actually faster and overclocks better than what you are calling an ultra-high-end chip. The silly thing is that Tom's pretends the old X6800 is high end and puts some pathetic OC on it, pretty much defeating the stated purpose of the overclock. To make the best attempt at removing the graphics-card bottleneck, they should have used a quad core OC'd to 4 GHz.

'All tests also were carried out with an overclocked CPU. This is the only way to determine how much potential could be seen in the fast graphics chips.' - Actually, you can go out and buy a 45nm quad @ 3.2 GHz that is most likely faster at stock than the 65nm X6800 @ 3.47 GHz.
Either say you are testing with a mid-range chip, or actually go out and get a fast chip and put a decent OC on it. In every single article Tom's makes some bonehead error.
 
@ minsky : if they use a slower CPU, they risk handicapping the graphics card. The point of the article was to prove, or disprove, the performance increase, and this can only be done with a CPU that doesn't limit performance. Picking a mid-range CPU would only prove that the graphics cards are too fast for it.

@ poro : I assume they picked these drivers because they didn't want to compare really old ones with really new ones. They picked the latest version and one from a few months back, which is quite reasonable. And it doesn't really matter whether the release notes claim an increase in performance, because the urban legend states there will be up to a 15% increase. That's what the article is about. It's not about testing the release notes to see if they are true; that's just a byproduct.

 
The differences in speed are statistically TOO SMALL to determine whether there is any real change. The main reason to update drivers? Bug fixes. Or in the case of nVidia lately, to add more bugs 🙁
 