Nvidia Fermi GF100 Benchmarks (GTX470 & GTX480)

Status
Not open for further replies.
http://www.tomshardware.com/reviews/geforce-gtx-480,2585.html


"Crysis is perhaps the closest thing to a synthetic in our real-world suite. After all, it’s two and a half years old. Nevertheless, it’s still one of the most demanding titles we can bring to bear against a modern graphics subsystem. Optimized for DirectX 10, older cards like ATI’s Radeon HD 4870 X2 are still capable of putting up a fight in Crysis.

It should come as no shock that the Radeon HD 5970 clinches a first-place finish in all three resolutions. All three of the Radeon HD 5000-series boards we’re testing demonstrate modest performance hits with anti-aliasing applied, with the exception of the dual-GPU 5970 at 2560x1600, which falls off rapidly.

Nvidia’s new GeForce GTX 480 starts off strong, roughly matching the performance of the company’s GeForce GTX 295, but is slowly passed by the previous-gen flagship. Throughout testing, the GTX 480 does maintain better anti-aliased performance, though. Meanwhile, Nvidia’s GeForce GTX 470 is generally outperformed by the Radeon HD 5850, winning only at 2560x1600 with AA applied (though it’s an unplayable configuration, anyway)"




"We’ve long considered Call of Duty to be a processor-bound title, since its graphics aren’t terribly demanding (similar to Left 4 Dead in that way). However, with a Core i7-980X under the hood, there’s ample room for these cards to breathe a bit.

Nvidia’s GeForce GTX 480 takes an early lead, but drops a position with each successive resolution increase, eventually landing in third place at 2560x1600 behind ATI’s Radeon HD 5970 and its own GeForce GTX 295. Still, that’s an impressive showing in light of the previous metric that might have suggested otherwise. Right out of the gate, GTX 480 looks like more of a contender for AMD's Radeon HD 5970 than the single-GPU 5870.

Perhaps the most compelling performer is the GeForce GTX 470, though, which goes heads-up against the Radeon HD 5870, losing out only at 2560x1600 with and without anti-aliasing turned on.

And while you can’t buy them anymore, it’s interesting to note that anyone running a Radeon HD 4870 X2 is still in very solid shape; the card holds up incredibly well in Call of Duty, right up to 2560x1600."



It becomes evident that the GTX 470 performs maybe 10% better than the 5850 on average (often less), and the GTX 480 maybe 10% better than the 5870 on average (again, often less). Yet the GTX 470 draws more power than a 5870, and the GTX 480 consumes about as much power as a 5970.
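The efficiency complaint above boils down to a simple performance-per-watt ratio. A minimal sketch of that comparison (the frame rates and wattages below are illustrative placeholders, not measured values from the reviews):

```python
# Rough performance-per-watt comparison. The fps and wattage figures below
# are illustrative placeholders, NOT measured values from the review.
def perf_per_watt(avg_fps, board_power_w):
    """Average frames per second delivered per watt of board power."""
    return avg_fps / board_power_w

# Hypothetical numbers: card A is ~10% faster but draws ~30% more power.
a = perf_per_watt(avg_fps=55.0, board_power_w=260.0)  # "GTX 470"-like
b = perf_per_watt(avg_fps=50.0, board_power_w=200.0)  # "HD 5850"-like

print(round(a, 3), round(b, 3))  # the slower, cooler card wins on efficiency
```

With numbers shaped like these, a ~10% frame-rate lead paired with a ~30% power increase leaves the faster card behind on efficiency, which is exactly the argument being made.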

The Fermi generation is an improvement on the GTX 200 architecture, but compared to the ATI HD 5x00 series, it seems like a boatload of fail... =/



------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Original Topic:

http://www.hexus.net/content/item.php?item=21996


The benchmark is Far Cry 2, a game that tends to favor Nvidia hardware.


==Nvidia GTX285==

...What you see here is that the built-in DX10 benchmark was run at 1,920x1,200 at the ultra-high-quality preset and with 4x AA. The GeForce GTX 285 returns an average frame-rate of 50.32fps with a maximum of 73.13fps and minimum of 38.4fps. In short, it provides playable settings with lots of eye candy.

==Nvidia Fermi GF100==
Take a closer look at the picture and you will be able to confirm that the settings are the same as the GTX 285's. Here, though, the Fermi card returns an average frame-rate of 84.05fps with a maximum of 126.20fps and a minimum of 64.6fps. The minimum frame-rate is higher than the GTX 285's average, and the 67 per cent increase in average frame-rate is significant...

==Lightly Overclocked ATI 5870==
The results show that, with the same settings, the card scores an average frame-rate of 65.84fps with a maximum of 136.47fps (we can kind of ignore this as it's the first frame) and a minimum of 40.40fps - rising to 48.40fps on the highest of three runs.

==5970==
Average frame-rate increases to 99.79fps with the dual-GPU card, beating out Fermi handily. Maximum frame-rate is 133.52fps and minimum is 76.42fps. It's hard to beat the sheer grunt of AMD's finest, clearly.

Even after taking Far Cry 2's Nvidia bias into account, the results are not too shabby, and in line with what we've been expecting: that the GF100 is faster than the 5870. It will most likely be faster in other games too, but by a smaller margin... Now the only question is how much it will cost...
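The quoted deltas are easy to sanity-check against the Hexus averages above (a quick arithmetic check using only the numbers from the article, not new data):

```python
# Sanity-checking the quoted gains from the Hexus Far Cry 2 numbers above.
def pct_increase(old, new):
    """Percentage increase going from old to new."""
    return (new - old) / old * 100

gtx285_avg, gf100_avg, hd5870_avg = 50.32, 84.05, 65.84

print(round(pct_increase(gtx285_avg, gf100_avg)))  # 67: matches the quoted "67 per cent"
print(round(pct_increase(hd5870_avg, gf100_avg)))  # 28: GF100's lead over the (lightly OC'd) 5870
```

So the "67 per cent" figure checks out, and the same arithmetic puts the GF100 roughly 28% ahead of the lightly overclocked 5870 in this one Nvidia-friendly benchmark.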
 
It could be that they did start mass production and just haven't found the maximum stable clock to set them at. Perhaps they are very close, but are finding they can push the cores harder with the more refined components now coming out of production.
 

I know that. But most of the BIOSs come from NVIDIA and are simply tweaked a little by the board partner. Custom PCBs require a custom BIOS though, and any change to the GPU also requires a new BIOS (for example, a new process node).

Maybe I misunderstood what you were getting at in your previous post. :??:
 
Forget the clocks; they said on the first day, during the Unigine demo, that they haven't finalized the cooling solution or the PCB yet either.
Clocks can be sent to AIBs just before the cards are boxed and shipped to resellers, but the PCB and HSF need to be finalized before 'mass production' of cards starts.

That may have changed, but it's a far different timeline than just finalizing speeds.
 
Haven't we already had an official quote from one of the case manufacturers at CES saying specifically that SLI with a Fermi card will be impossible without a specially cooled case?

I see a couple of people rigidly defending Nvidia here, despite the facts. I'm not saying Fermi will be awful, but one of the more obvious facts is that it will consume a lot of power and generate a lot of heat (at least compared to a 5870, maybe not a 5970).

I see Fermi outperforming the 5870, but when both are compared at maximum overclocks it may be very close. And given that Fermi will surely be much more expensive, it just doesn't seem like Nvidia will be able to compete at the high end.

The mid-range is another matter, though, and Nvidia may well dominate that sector. But what I've seen of Fermi doesn't excite me, except the prospect of it causing an ATI 58xx series price drop...
 


I thought Fermi wasn't available for the lower end.
 

????????
And I see a bunch of people blindly trying to knock Fermi based on hyperbole.
Oh my, it's going to use a lot of power!
If it beats the 5870, it does not have to worry about competing; IT WON.
You see people lining up and buying out the $440 5870s and $700 5970s after the ATI markups; there are plenty of them out there ready to buy the BEST.
By the way, even after the die shrink, the 5-series die comes in bigger than the 4-series. Another aspect of Fermi that's twisted into making noobs think it's a negative.
 


Firstly, you don't think its power use is a justified criticism? It uses a 6-pin and an 8-pin PCIe connector, so we KNOW it will use more than a 5870. That means more heat, and most likely less overclocking headroom.
And yes, it has to compete. If it outperforms the 5870 but costs 200 dollars more, doesn't overclock, and can't SLI, it will have won nothing but last place.
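For what it's worth, the connector argument maps onto hard ceilings in the PCI Express spec: the slot supplies up to 75 W, a 6-pin auxiliary plug up to 75 W, and an 8-pin plug up to 150 W. A quick sketch (the per-connector ceilings are spec facts; the card layouts are just the configurations mentioned in this thread):

```python
# Upper bound on board power implied by the PCI Express power-delivery spec:
# slot = 75 W, 6-pin plug = 75 W, 8-pin plug = 150 W. Actual draw can sit
# well below this cap; the connectors only set a ceiling.
PCIE_POWER_W = {"slot": 75, "6pin": 75, "8pin": 150}

def max_board_power(aux_connectors):
    """Sum the spec ceilings for the slot plus the given auxiliary plugs."""
    return PCIE_POWER_W["slot"] + sum(PCIE_POWER_W[c] for c in aux_connectors)

print(max_board_power(["6pin", "8pin"]))  # 300 -> Fermi-style 6+8-pin layout
print(max_board_power(["6pin", "6pin"]))  # 225 -> HD 5870-style dual 6-pin
```

So a 6+8-pin card is provisioned for up to 300 W versus 225 W for a dual 6-pin card, which is why the connector layout alone signals higher power, even before any measurements exist.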

Making judgements based on available evidence is not 'hyperbole'. However, looking at said evidence and denouncing it, claiming 'Nvidia still roxorz lolololl', is purely irrational. Most of us are coming to fairly logical conclusions, but of course not making claims as if we know the outcome.

You, however, seem to love making up facts and disregarding the real ones: '$440 5870's and 700 dollar 5970 after the ati markups'. ATI didn't mark up anything; it's called supply and demand. Supply is short because the cards are selling so well, so retailers raise prices to increase their margins. Basic business.

Like I said, the only thing I see Fermi winning is a 'max FPS for a single GPU in most games' award. On price to performance, I don't see it competing.
If you have any actual evidence you would like to bring to the discussion, please go ahead. If I have missed something, I would love to see it. But simply saying everyone else is wrong without providing any evidence is a waste of everyone's time.
 
@ welshmousepk
The point is that very little of what's going around is actual cast-iron fact. Sure, some of it is based on educated guesswork from people who should know what they're talking about, but even the actual specs released by Nvidia are being taken with a pinch of salt, because no one believes what they're saying.
Things we know are this:
The chip is big.
The chip will use a lot of power.
The chip will cost more to make than a Cypress chip.

Things we are guessing:
How much power exactly it will use. You can't just base it on what connectors are on the card; a 4770 has a PCIe connector but doesn't use 150 watts. So it's possible that just adding up the connectors and saying 'ouch' may be wide of the mark.
The performance, based on emulating this and comparing to that, based on these figures being close to those results? Come on, too many variables, and we don't even know final clocks yet; what's been tested were engineering samples anyway.

People have claimed all sorts based on guesswork and this rumour or that, so is 'hyperbole' a fair description? Yes, I think it is.
You do make me laugh when you say "if you have any actual evidence you would like to bring to the discussion".
Haven't seen any yet.

Mactronix
 
Agreed, mac, and from this foundation we can discuss the possible directions and features Nvidia has pursued with Fermi.
It goes both ways: when people say it'll do this because, or do that because, or can't do this because, it needs to be reined in somewhat, as we just don't know.
Speculated questions and verified facts are two different things, and conflating them creates more argument than knowledge.
 


So you're saying the more we ask and try to find out more information about Fermi, the more we learn that we know even less and in fact don't really know anything? Lol...
 