Nvidia Fermi GF100 Benchmarks (GTX470 & GTX480)

http://www.tomshardware.com/reviews/geforce-gtx-480,2585.html


"Crysis is perhaps the closest thing to a synthetic in our real-world suite. After all, it’s two and a half years old. Nevertheless, it’s still one of the most demanding titles we can bring to bear against a modern graphics subsystem. Optimized for DirectX 10, older cards like ATI’s Radeon HD 4870 X2 are still capable of putting up a fight in Crysis.

It should come as no shock that the Radeon HD 5970 clinches a first-place finish in all three resolutions. All three of the Radeon HD 5000-series boards we’re testing demonstrate modest performance hits with anti-aliasing applied, with the exception of the dual-GPU 5970 at 2560x1600, which falls off rapidly.

Nvidia’s new GeForce GTX 480 starts off strong, roughly matching the performance of the company’s GeForce GTX 295, but is slowly passed by the previous-gen flagship. Throughout testing, the GTX 480 does maintain better anti-aliased performance, though. Meanwhile, Nvidia’s GeForce GTX 470 is generally outperformed by the Radeon HD 5850, winning only at 2560x1600 with AA applied (though it’s an unplayable configuration, anyway)"




"We’ve long considered Call of Duty to be a processor-bound title, since its graphics aren’t terribly demanding (similar to Left 4 Dead in that way). However, with a Core i7-980X under the hood, there’s ample room for these cards to breathe a bit.

Nvidia’s GeForce GTX 480 takes an early lead, but drops a position with each successive resolution increase, eventually landing in third place at 2560x1600 behind ATI’s Radeon HD 5970 and its own GeForce GTX 295. Still, that’s an impressive showing in light of the previous metric that might have suggested otherwise. Right out of the gate, GTX 480 looks like more of a contender for AMD's Radeon HD 5970 than the single-GPU 5870.

Perhaps the most compelling performer is the GeForce GTX 470, though, which goes heads-up against the Radeon HD 5870, losing out only at 2560x1600 with and without anti-aliasing turned on.

And while you can’t buy them anymore, it’s interesting to note that anyone running a Radeon HD 4870 X2 is still in very solid shape; the card holds up incredibly well in Call of Duty, right up to 2560x1600."



It becomes evident that the GTX470 outperforms the 5850 by roughly 10% or less on average, and that the GTX480 outperforms the 5870 by roughly 10% or less on average. Yet the GTX470 draws more power than a 5870, and the GTX480 consumes about as much power as a 5970.

The Fermi generation is an improvement on the GTX200 architecture, but compared to the ATI HD 5x00 series, it seems like a boatload of fail... =/



------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Original Topic:

http://www.hexus.net/content/item.php?item=21996


The benchmark is FarCry2, which is an Nvidia-favoring game.


==Nvidia GTX285==

...What you see here is that the built-in DX10 benchmark was run at 1,920x1,200 at the ultra-high-quality preset and with 4x AA. The GeForce GTX 285 returns an average frame-rate of 50.32fps with a maximum of 73.13fps and minimum of 38.4fps. In short, it provides playable settings with lots of eye candy.

==Nvidia Fermi GF100==
Take a closer look at the picture and you will be able to confirm that the settings are the same as the GTX 285's. Here, though, the Fermi card returns an average frame-rate of 84.05fps with a maximum of 126.20fps and a minimum of 64.6fps. The minimum frame-rate is higher than the GTX 285's average, and the 67 per cent increase in average frame-rate is significant...

==Lightly Overclocked ATI 5870==
The results show that, with the same settings, the card scores an average frame-rate of 65.84fps with a maximum of 136.47fps (we can kind of ignore this as it's the first frame) and a minimum of 40.40fps - rising to 48.40fps on the highest of three runs.

==5970==
Average frame-rate increases to 99.79fps with the dual-GPU card, beating out Fermi handily. Maximum frame-rate is 133.52fps and minimum is 76.42fps. It's hard to beat the sheer grunt of AMD's finest, clearly.
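
As a quick sanity check of the quoted averages, here is a minimal Python sketch; the frame-rate figures are simply the ones quoted above, with the GTX 285 used as the baseline, so the comparison is illustrative only:

# Relative performance from the quoted Far Cry 2 average frame-rates (fps).
# Figures are taken from the excerpts above; the GTX 285 is the baseline.
averages = {
    "GTX 285": 50.32,
    "GF100": 84.05,
    "HD 5870 (lightly OC'd)": 65.84,
    "HD 5970": 99.79,
}

baseline = averages["GTX 285"]
for card, fps in averages.items():
    gain = (fps / baseline - 1) * 100  # percent increase over the GTX 285
    print(f"{card}: {fps:.2f} fps ({gain:+.1f}% vs GTX 285)")

That works out to roughly +67% for the GF100 over the GTX 285 (matching the quote), about +28% over the lightly overclocked 5870, while the 5970 still leads the GF100 by roughly 19%.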

Even after taking into account the Nvidia-favoring nature of FarCry2, the results are not too shabby... in line with what we've been expecting: that the GF100 is faster than the 5870. It will most likely be faster in other games too, but by a smaller margin... Now the only question is how much it will cost...
 
[Image: GF100-Crysis-1920.png]

Tbh, this is all I care about.
 

wh3resmycar

Distinguished
Before I read the benchies, I was under the impression that a die-shrink would save Fermi, the same way the 55nm GTX200s made that generation "likable". After reading, though, I don't think it's gonna happen. A "tock" won't chip away 100W from the 480, which makes a 480x2 seem unlikely.

Nvidia had the Xbox excuse when they failed with the FX; I don't know what reasonable alibi they can come up with now.
 

welshmousepk

Distinguished
The 470 seems to be a very respectable card, but the 480 is a complete failure IMO.

It usually beats a 5870, though not by a huge margin. It's 100 dollars more expensive (probably more by the time ATI drops prices) and consumes way too much power.
the big kicker for me though is: 'Turns out that, during normal game play (running Crysis, not something like FurMark), the exposed metal exceeds 71 degrees C (or about 160 degrees F). This will have some ramifications for running two cards in SLI, but we’ll get into that shortly.'

For the surface of the card to get that hot means I absolutely would not want it in my system. The top panel would act like a massive radiator in my case, and I really cannot imagine the adverse effect it would have on ambient case temps. As it stands, the GTX480 is an awful waste of cash.

Luckily, it seems the 470 is much more attractive bang/buck-wise, so perhaps it's not a total loss for Nv.
 

IzzyCraft

Distinguished
I wouldn't go as far as "great". The Radeon 4000s were great, the GeForce 8800s were great; everything else, IMO, is just good or worse.

Personally, seeing that they released even the 480 with disabled cores, Fermi is pretty much a fail; they took on too new and too complex a task or something. Can't wait for the refresh when they have the yields and everything else nailed down; that chip could be quite nice.
 

rescawen

Distinguished
Fermi is a pass for me. I am hoping the ATI HD6xxx will bring something as decent as the HD5xxx, not another power disaster like Fermi!

I want a notebook with an HD6870 Mobility in it (if such a thing exists), and its performance had better be good enough to run SC2 at maximum settings at full HD.

Now, has anyone thought about how Fermi is letting ATI take the mobility market?

5850>470
 

welshmousepk

Distinguished


Well, that's not really relevant, since Nv will just rebadge their older chips (again) and sell them as new tech. People buying laptops are generally the most spec-illiterate of PC buyers, so they won't have a clue. You can bet that GT3xxM's will sell well.
 