Nvidia Fermi GF100 Benchmarks (GTX470 & GTX480)

http://www.tomshardware.com/reviews/geforce-gtx-480,2585.html


"Crysis is perhaps the closest thing to a synthetic in our real-world suite. After all, it’s two and a half years old. Nevertheless, it’s still one of the most demanding titles we can bring to bear against a modern graphics subsystem. Optimized for DirectX 10, older cards like ATI’s Radeon HD 4870 X2 are still capable of putting up a fight in Crysis.

It should come as no shock that the Radeon HD 5970 clinches a first-place finish in all three resolutions. All three of the Radeon HD 5000-series boards we’re testing demonstrate modest performance hits with anti-aliasing applied, with the exception of the dual-GPU 5970 at 2560x1600, which falls off rapidly.

Nvidia’s new GeForce GTX 480 starts off strong, roughly matching the performance of the company’s GeForce GTX 295, but is slowly passed by the previous-gen flagship. Throughout testing, the GTX 480 does maintain better anti-aliased performance, though. Meanwhile, Nvidia’s GeForce GTX 470 is generally outperformed by the Radeon HD 5850, winning only at 2560x1600 with AA applied (though it’s an unplayable configuration, anyway)"




"We’ve long considered Call of Duty to be a processor-bound title, since its graphics aren’t terribly demanding (similar to Left 4 Dead in that way). However, with a Core i7-980X under the hood, there’s ample room for these cards to breathe a bit.

Nvidia’s GeForce GTX 480 takes an early lead, but drops a position with each successive resolution increase, eventually landing in third place at 2560x1600 behind ATI’s Radeon HD 5970 and its own GeForce GTX 295. Still, that’s an impressive showing in light of the previous metric that might have suggested otherwise. Right out of the gate, GTX 480 looks like more of a contender for AMD's Radeon HD 5970 than the single-GPU 5870.

Perhaps the most compelling performer is the GeForce GTX 470, though, which goes heads-up against the Radeon HD 5870, losing out only at 2560x1600 with and without anti-aliasing turned on.

And while you can’t buy them anymore, it’s interesting to note that anyone running a Radeon HD 4870 X2 is still in very solid shape; the card holds up incredibly well in Call of Duty, right up to 2560x1600."



It becomes evident that the GTX470 performs maybe 10% better than the 5850 on average, at best, and the GTX480 likewise maybe 10% better than the 5870. Yet the GTX470 draws more power than a 5870, and the GTX480 consumes about as much power as a 5970.

The Fermi generation is an improvement on the GTX200 architecture, but compared to the ATI HD 5x00 series, it seems like a boat load of fail... =/



------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Original Topic:

http://www.hexus.net/content/item.php?item=21996


The benchmark is FarCry2, which is an Nvidia-favoring game.


==Nvidia GTX285==

...What you see here is that the built-in DX10 benchmark was run at 1,920x1,200 at the ultra-high-quality preset and with 4x AA. The GeForce GTX 285 returns an average frame-rate of 50.32fps with a maximum of 73.13fps and minimum of 38.4fps. In short, it provides playable settings with lots of eye candy.

==Nvidia Fermi GF100==
Take a closer look at the picture and you will be able to confirm that the settings are the same as the GTX 285's. Here, though, the Fermi card returns an average frame-rate of 84.05fps with a maximum of 126.20fps and a minimum of 64.6fps. The minimum frame-rate is higher than the GTX 285's average, and the 67 per cent increase in average frame-rate is significant...

==Lightly Overclocked ATI 5870==
The results show that, with the same settings, the card scores an average frame-rate of 65.84fps with a maximum of 136.47fps (we can kind of ignore this as it's the first frame) and a minimum of 40.40fps - rising to 48.40fps on the highest of three runs.

==5970==
Average frame-rate increases to 99.79fps with the dual-GPU card, beating out Fermi handily. Maximum frame-rate is 133.52fps and minimum is 76.42fps. It's hard to beat the sheer grunt of AMD's finest, clearly.

Even after taking into account FarCry2's Nvidia bias, the results are not too shabby... in line with what we've been expecting: that the GF100 is faster than the 5870. It will most likely be faster in other games too, but by a smaller margin... Now the only question is how much it will cost...
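The percentages quoted in the excerpts above can be checked directly from the average fps figures (a quick sanity check, using only numbers from the article):

```python
# Average fps from the Hexus FarCry2 DX10 benchmark quoted above.
avg = {
    "GTX 285": 50.32,
    "GF100": 84.05,
    "HD 5870 (light OC)": 65.84,
    "HD 5970": 99.79,
}

def speedup(a, b):
    """Percent by which card a outruns card b on average fps."""
    return (avg[a] / avg[b] - 1) * 100

print(f"GF100 vs GTX 285: +{speedup('GF100', 'GTX 285'):.0f}%")          # ~67%
print(f"GF100 vs HD 5870: +{speedup('GF100', 'HD 5870 (light OC)'):.0f}%")  # ~28%
print(f"HD 5970 vs GF100: +{speedup('HD 5970', 'GF100'):.0f}%")          # ~19%
```

The 67% figure matches the article's claim, and the last line is the same ~19% gap discussed further down the thread.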
 
Even dual-DVI cards usually have an S-Video output; that's the most common layout. You would need Quad-SLI for that, but single-GPU GeForce cards only support Tri-SLI, and on dual-GPU cards where Quad-SLI is possible, the card usually only exposes a single GPU's outputs.
 
strangest, yes, Nvidia cards currently only allow for two outputs per card. They may have more connectors, but the GPU itself can only drive two displays at one time, which means triple-display will be very difficult under Fermi. You'd need an SLI setup, which (according to the rumours) will not be possible in a conventional setup without specialist cooling.

Like I said though, as Eyefinity gains popularity (and it will), I'm sure Nvidia will offer something similar. Though knowing them, they will most likely create special cards that allow for multi-screen display, and charge extra for the privilege. Oh, and it will disable itself if its sensors detect an ATI card within 300 feet.
 
From the link: FarCry2 testing

Nvidia Fermi GF100
Min = 64.60 fps
Avg = 84.05 fps
Max = 126.20 fps

AMD Radeon HD 5970
Min = 76.42 fps
Avg = 99.79 fps
Max = 133.52 fps

Based on average fps scores, the 5970 is about 19% faster than Nvidia's GF100. So the GF100 doesn't beat the 5970.

...As for prices, we'll wait and see. Obviously ATI is the one setting the prices this round. Nvidia has a hard sell to make when their chips are 40% larger (and therefore at least 40% more expensive to manufacture).
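The "at least 40%" hedge is worth unpacking: a larger die doesn't just cost proportionally more silicon, it also fits fewer dies per wafer and catches more defects, so the cost per good die grows faster than the area. A minimal sketch, with the die areas being rough approximations and the defect density an illustrative assumption, not a vendor figure:

```python
from math import pi, sqrt, exp

WAFER_D = 300.0   # mm, standard 300 mm wafer
DEFECTS = 0.004   # defects per mm^2 (assumed, for illustration only)

def dies_per_wafer(area_mm2):
    """Classic approximation for usable dies on a round wafer."""
    return pi * (WAFER_D / 2) ** 2 / area_mm2 - pi * WAFER_D / sqrt(2 * area_mm2)

def good_dies(area_mm2):
    """Poisson yield model: bigger dies are more likely to catch a defect."""
    return dies_per_wafer(area_mm2) * exp(-area_mm2 * DEFECTS)

small, large = 334.0, 529.0   # ~Cypress vs ~GF100 die areas in mm^2 (approximate)
ratio = good_dies(small) / good_dies(large)
print(f"good dies per wafer, small/large ratio: {ratio:.2f}")
```

Under these assumed numbers the smaller die yields several times more good chips per wafer, even though it is only ~1.6x smaller, which is why "at least 40% more expensive" is, if anything, conservative.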

Yeah, the ATI 5970 runs games very well at fast frame rates. Well, mine did, until it broke into an artifact fit 8 days after I bought it, just before Christmas. I had to wait until the end of the holidays (3 weeks) to be told it was a faulty card, and got my money refunded, all £600 UK. I'm no fanboy of either Nvidia or ATI; I'm a fan of things that work. My experience with the 5970 was mixed: I thought I would not miss PhysX, but I did, and my games did not feel the same without it. Now that I have re-installed my 3x EVGA GTX260 cards, I appreciate the 10-year warranty I got from EVGA on them. The reason I did not get a replacement was that the wait would have been far too long, and the 8 days of fun before the breakdown were too short. It got me thinking about the 2-year warranty; now I know why the warranty is so short. Yeah, it's 3 years with an Asus 5970.
 

All cards break down. The GTX295 and 9800 GX2 had some of the highest failure rates of any card ever. Dual-GPU solutions always run hot and have a higher failure rate than single-GPUs.

As for PhysX, you won't notice any difference unless you're playing a PhysX-intensive game such as Mirror's Edge or Batman. I went from a GTX260 to a 4870 and I didn't notice any difference in my FarCry2, Crysis, and GTA4 games.
 
And with PhysX, really, it's simply better to use a dedicated GPU for that task anyway. So if he had one of those 3 GTX260s sitting around doing nothing, I'm surprised he didn't just slap one in there for PhysX, if it's something he uses.

As for a 10-year warranty, great, so I can ask EVGA to give me a GeForce2 Pro replacement? What would that be nowadays, a crippled integrated GPU?

Anything more than 3 years seems almost pointless nowadays, because most people buying high-end won't keep a card that long. Still, as long as they cover the shipping, it's nice, since you're only out the time required to do the replacement; but a warranty that doesn't include shipping wouldn't be worth it for anything mid-range or lower.

 


But the good thing is, companies will give you a recent GPU of the same value if your old one fails under warranty. So if you spent $300 on a Sapphire ATI HD2400 or something, and it fails today while still under warranty, Sapphire will probably give you a $300 ATI 5850 as part of the warranty.
 


Actually, usually that's not the case; it's usually equal performance, not equal value.
This deters people from making their cards 'fail' in order to upgrade them.
 
People, stop flaming about warranties; it gets old fast. As for dual-GPU cards, failure comes with the territory, unless you are a collector/retro gamer still using a Voodoo 5 5500, since those are the only dual-GPU cards that didn't have a short functional lifespan, thanks to their low wattage and single PCB. As for quad and octo GPU cards, they are rare and mostly limited to a very small market segment, with only one exception. Such cards with modern high-end GPUs simply aren't possible for performance 3D.


This is one rare example of a card with 8 GPUs (note these ran FSAA with no fps hit but otherwise performed the same as a V5 6k).

q3d_aalchemy.jpg


The most I have ever seen of these in use at once was two such cards (note the VSA-100 can SLI up to 32 GPUs), so 16 GPUs in one box.

One rare example of a quad-ATI dual-planar board.

13-14-57-23-61496176h8p.jpg


47_q3d_mer1.jpg
 


Not that rare.
E&S made a lot of those SIMFusion 6000 series, which could be purchased as the 6500 model with dual boards, which could also be daisy-chained;

picsimfusion6000.jpg



There was even a quad 7000-series model, and those too could be added together, totalling 64 GPUs in their professional combat-sim rigs;
e8055c82398424b573b3b95893b9926e.jpg


It was those that used the checkerboard Super-Tiling method of frame rendering.
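For anyone unfamiliar with the super-tiling idea mentioned above: the frame is split into small tiles and the tiles are dealt out to the GPUs in an alternating checkerboard pattern, so each chip gets a roughly even share of the scene. A minimal sketch of the assignment (the tile size and grid here are illustrative, not from any specific product):

```python
def tile_owner(tx, ty, n_gpus):
    """Which GPU renders tile (tx, ty) in a checkerboard-style split."""
    return (tx + ty) % n_gpus

# 4-GPU example over an 8x4 grid of tiles: neighbouring tiles always
# belong to different GPUs, spreading expensive screen regions around.
grid = [[tile_owner(x, y, 4) for x in range(8)] for y in range(4)]
for row in grid:
    print(row)
```

With 2 GPUs this reduces to the literal checkerboard: GPU 0 gets the "black" squares and GPU 1 the "white" ones.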

BTW, this is what the Quantum3D AAlchemy's connectors looked like, since you already have the back-to-front view;

57505091ze7.jpg

 
So the Fermi supercard is not going to be the supercard we'd hoped for. It won't beat the ATI dual-GPU card, which means there is no reason for me to get this card; I just want the fastest one. If more reviews confirm this speed difference, then I will be buying my first ATI card in a long time.



 


Which is the same conclusion most of us have come to.
If the GTX480 were half the cost, it would be a good buy. But the HD5970 is a vastly superior card that's also cheaper.
 


Yeah, I read that somewhere. However, they never said if it's a new generation or a refresh... regardless, it's not good news for Nvidia!
 
It will most likely be another die shrink, as with the 4890 partway through the last lifecycle.

If ATI can bring out a 5890 for 400 USD, and it can perform similarly to Nvidia's GTX480 (which looks possible), then they will surely have put the last nail in Fermi's coffin.
 


Or that crazy 3-slot 5970 with 2x 8-pin PCIe connectors, running at native or overclocked 5870 speeds, dubbed the 5980 or 5990.
 
Again, you can't compare a single-GPU 480 to a dual-GPU 5970. Compare SLI'd 480s and see which of the two is faster. Or better yet, get the dual-Fermi card ASUS is working on, and then compare the two.

I'm getting sick of the "single-GPU Fermi is faster than everything but a dual-GPU ATI, so ATI wins performance!" argument, especially since it was the ATI guys who not too long ago were using the same argument against the 7950GX2 and the 9800GX2.
 