Nvidia Fermi GF100 Benchmarks (GTX470 & GTX480)

http://www.tomshardware.com/reviews/geforce-gtx-480,2585.html


"Crysis is perhaps the closest thing to a synthetic in our real-world suite. After all, it’s two and a half years old. Nevertheless, it’s still one of the most demanding titles we can bring to bear against a modern graphics subsystem. Optimized for DirectX 10, older cards like ATI’s Radeon HD 4870 X2 are still capable of putting up a fight in Crysis.

It should come as no shock that the Radeon HD 5970 clinches a first-place finish in all three resolutions. All three of the Radeon HD 5000-series boards we’re testing demonstrate modest performance hits with anti-aliasing applied, with the exception of the dual-GPU 5970 at 2560x1600, which falls off rapidly.

Nvidia’s new GeForce GTX 480 starts off strong, roughly matching the performance of the company’s GeForce GTX 295, but is slowly passed by the previous-gen flagship. Throughout testing, the GTX 480 does maintain better anti-aliased performance, though. Meanwhile, Nvidia’s GeForce GTX 470 is generally outperformed by the Radeon HD 5850, winning only at 2560x1600 with AA applied (though it’s an unplayable configuration, anyway)"




"We’ve long considered Call of Duty to be a processor-bound title, since its graphics aren’t terribly demanding (similar to Left 4 Dead in that way). However, with a Core i7-980X under the hood, there’s ample room for these cards to breathe a bit.

Nvidia’s GeForce GTX 480 takes an early lead, but drops a position with each successive resolution increase, eventually landing in third place at 2560x1600 behind ATI’s Radeon HD 5970 and its own GeForce GTX 295. Still, that’s an impressive showing in light of the previous metric that might have suggested otherwise. Right out of the gate, GTX 480 looks like more of a contender for AMD's Radeon HD 5970 than the single-GPU 5870.

Perhaps the most compelling performer is the GeForce GTX 470, though, which goes heads-up against the Radeon HD 5870, losing out only at 2560x1600 with and without anti-aliasing turned on.

And while you can’t buy them anymore, it’s interesting to note that anyone running a Radeon HD 4870 X2 is still in very solid shape; the card holds up incredibly well in Call of Duty, right up to 2560x1600."



It becomes evident that the GTX 470 performs maybe 10% (or less) better than the 5850 on average, and the GTX 480 maybe 10% (or less) better than the 5870. Yet the GTX 470 draws more power than a 5870, and the GTX 480 consumes as much power as a 5970.
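To put rough numbers on that (a quick Python sketch; the ~10% deltas come from the benchmarks above, but the wattages are the cards' official TDP ratings, assumed here rather than taken from the review's own power measurements):

```python
# Rough perf-per-watt sanity check. Performance deltas (~10%) come from
# the benchmark summary above; the wattages are official TDP ratings,
# assumed here rather than taken from the review's measurements.
pairs = [
    # (Nvidia card, perf vs rival, TDP W, ATI rival, rival TDP W)
    ("GTX 470", 1.10, 215, "HD 5850", 151),
    ("GTX 480", 1.10, 250, "HD 5870", 188),
]
for card, rel_perf, watts, rival, rival_watts in pairs:
    print(f"{card}: ~{(rel_perf - 1) * 100:.0f}% faster than the {rival} "
          f"for ~{(watts / rival_watts - 1) * 100:.0f}% more power")
```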

The Fermi generation is an improvement on the GTX 200 architecture, but compared to the ATI HD 5x00 series, it seems like a boatload of fail... =/



------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Original Topic:

http://www.hexus.net/content/item.php?item=21996


The benchmark is FarCry 2, which is a game that favors Nvidia hardware.


==Nvidia GTX285==

...What you see here is that the built-in DX10 benchmark was run at 1,920x1,200 at the ultra-high-quality preset and with 4x AA. The GeForce GTX 285 returns an average frame-rate of 50.32fps with a maximum of 73.13fps and minimum of 38.4fps. In short, it provides playable settings with lots of eye candy.

==Nvidia Fermi GF100==
Take a closer look at the picture and you will be able to confirm that the settings are the same as the GTX 285's. Here, though, the Fermi card returns an average frame-rate of 84.05fps with a maximum of 126.20fps and a minimum of 64.6fps. The minimum frame-rate is higher than the GTX 285's average, and the 67 per cent increase in average frame-rate is significant...

==Lightly Overclocked ATI 5870==
The results show that, with the same settings, the card scores an average frame-rate of 65.84fps with a maximum of 136.47fps (we can kind of ignore this as it's the first frame) and a minimum of 40.40fps - rising to 48.40fps on the highest of three runs.

==5970==
Average frame-rate increases to 99.79fps with the dual-GPU card, beating out Fermi handily. Maximum frame-rate is 133.52fps and minimum is 76.42fps. It's hard to beat the sheer grunt of AMD's finest, clearly.

Even after taking FarCry 2's Nvidia bias into account, the results are not too shabby... in line with what we've been expecting: that the GF100 is faster than the 5870. It will most likely be faster in other games too, but by a smaller margin... Now the only question is how much it will cost...
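Quick sanity check on those ratios, straight from the quoted averages (throwaway Python; the card labels are mine):

```python
# Average frame rates quoted in the Hexus article above.
avg = {"GTX 285": 50.32, "GF100": 84.05, "HD 5870 OC": 65.84, "HD 5970": 99.79}

def gain(a, b):
    """Percentage by which card a's average outpaces card b's."""
    return (avg[a] / avg[b] - 1) * 100

print(f"GF100 over GTX 285:  +{gain('GF100', 'GTX 285'):.0f}%")    # ~67%, matching Hexus
print(f"GF100 over HD 5870:  +{gain('GF100', 'HD 5870 OC'):.0f}%") # ~28%
print(f"HD 5970 over GF100:  +{gain('HD 5970', 'GF100'):.0f}%")    # ~19%
```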
 
They already do. Some cases can't fit modern cards, even though they're designed to hold ATX motherboards :lol:

EDIT: Also, ATX would be concerned with how much power is being drawn from the power connectors, and how many power connectors are being used, I would think. After all, the power connectors are part of the ATX12V v2.x spec. PCI-SIG would be concerned with the total power consumption.
 
HD5970 is what? 13''? Not trying to bash anyone here, but if Fermi is as big as rumoured, the GTX495 is going to be more than that.

Incidentally, how come ATI/NV won't stop building length-wise and go wider instead? Wouldn't increasing the width leave more room for cooling?
 


Bitch on dude!

Does CSAA smooth shader aliasing? The black shader borders in Borderlands?
No, you need SSAA, which truly renders the frame at a higher resolution and downsizes it.
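For what it's worth, the downsize step is just a box filter over each block of sub-samples; because every sub-sample was fully shaded, shader aliasing gets smoothed too. A minimal sketch of that resolve, assuming 4x ordered-grid SSAA (twice the target resolution per axis) and made-up resolutions:

```python
import numpy as np

def ssaa_resolve(frame: np.ndarray, factor: int = 2) -> np.ndarray:
    """Box-filter a frame rendered at factor x the target resolution per axis."""
    h, w, c = frame.shape
    # Group the supersamples into factor x factor blocks and average each block.
    blocks = frame.reshape(h // factor, factor, w // factor, factor, c)
    return blocks.mean(axis=(1, 3))

# Hypothetical 4x SSAA at 1920x1200: render at 3840x2400, then resolve.
hi_res = np.random.rand(2400, 3840, 3)   # stand-in for the oversized frame
final = ssaa_resolve(hi_res, factor=2)   # 1200x1920x3 output
```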
 


It isn't 13", it is 11" or something.
 
Legit Reviews says:
At 12.125" in length, the Radeon HD 5970 is very long and isn't going to fit in all the cases on the market today.
[Image: the Radeon HD 5000 series cards lined up (ati_5000_cards.jpg)]

For a size comparison here are all the Radeon HD 5000 series cards lined up. From top to bottom you have the Radeon HD 5970, 5870, 5850, 5770 and the 5750. You can clearly see how large the Radeon HD 5970 is compared to the other cards. It has a good inch on the Radeon HD 5870!
http://www.legitreviews.com/article/1141/2/
 


Actually, I have used 4x SSAA in a couple of games, and it worked quite well. Dragon Age: Origins was one of them.
 
PCI-SIG is basically the set of standards a manufacturer must follow to produce a product that will work with all other products following that standard. If they choose to overbuild, they can still be certified as long as it's within the criteria.

Power supplies have a minimum; they can easily go over that.

Video cards have a MAXIMUM. They CANNOT GET certified if they go over the spec. That doesn't mean they can't be made, but be wary if you buy something that's above the spec: your power supply may be built only to the minimum requirements, and going above that can be deadly to it. If you buy a card that runs hotter than the PCI-SIG limit, you'd better know what power supply you MUST BUY.

AMD/ATI is not making the 5990/Ares cards; Sapphire and ASUS are. That's why it's not an official product launch: it can't be certified under today's standards.
 


Yeah, but there are many ways to implement that trick, and while none of them are new, they lead to different levels of IQ.
ATI's rotated-grid SSAA is better than nVidia's ordered-grid SSAA, and more efficient at it too.

Wanna know the diff? Ironically, check out 3dfx's whitepaper extolling the virtues:
http://www.x86-secret.com/articles/divers/v5-6000/datasheets/FSAA.pdf
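The gist, if you don't want to read the whole paper: with the same sample count, a rotated grid spreads its samples over more distinct x and y positions than an ordered grid, so near-vertical and near-horizontal edges get more intermediate coverage levels. A toy illustration (the sample offsets are made up for clarity, not the actual hardware patterns):

```python
# Four samples per pixel, offsets in pixel units (illustrative only).
ordered_grid = [(0.25, 0.25), (0.75, 0.25),
                (0.25, 0.75), (0.75, 0.75)]      # only 2 distinct x positions
rotated_grid = [(0.125, 0.375), (0.375, 0.875),
                (0.625, 0.125), (0.875, 0.625)]  # 4 distinct x positions

def coverage(samples, edge_x):
    """Fraction of samples on the covered side of a vertical edge at x = edge_x."""
    return sum(sx < edge_x for sx, _ in samples) / len(samples)

# Sweep a vertical edge across the pixel: the ordered grid only ever
# produces 0, 1/2, or 1 coverage, while the rotated grid also hits 1/4
# and 3/4, hence smoother gradients on near-vertical edges.
for x in (0.2, 0.5, 0.8):
    print(f"edge at {x}: ordered={coverage(ordered_grid, x)}, "
          f"rotated={coverage(rotated_grid, x)}")
```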
 


That's not too bad, and falls within predictions for a high-end card that performs between a 5870 and a 5970.

If it consumes 275 watts, it will use one PCIe 6-pin and one PCIe 8-pin connector... perfectly acceptable.
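For reference, the arithmetic behind that connector combo, using the standard PCI-SIG budgets (75 W from the slot, 75 W per 6-pin, 150 W per 8-pin):

```python
# Standard PCIe power budgets in watts (slot + auxiliary connectors).
SLOT, SIX_PIN, EIGHT_PIN = 75, 75, 150

board_power = 275                    # the rumoured draw above
budget = SLOT + SIX_PIN + EIGHT_PIN  # 300 W with one 6-pin + one 8-pin
print(f"{board_power} W within {budget} W budget: {board_power <= budget}")
```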
 