Nvidia Fermi GF100 Benchmarks (GTX470 & GTX480)

http://www.tomshardware.com/reviews/geforce-gtx-480,2585.html


"Crysis is perhaps the closest thing to a synthetic in our real-world suite. After all, it’s two and a half years old. Nevertheless, it’s still one of the most demanding titles we can bring to bear against a modern graphics subsystem. Optimized for DirectX 10, older cards like ATI’s Radeon HD 4870 X2 are still capable of putting up a fight in Crysis.

It should come as no shock that the Radeon HD 5970 clinches a first-place finish in all three resolutions. All three of the Radeon HD 5000-series boards we’re testing demonstrate modest performance hits with anti-aliasing applied, with the exception of the dual-GPU 5970 at 2560x1600, which falls off rapidly.

Nvidia’s new GeForce GTX 480 starts off strong, roughly matching the performance of the company’s GeForce GTX 295, but is slowly passed by the previous-gen flagship. Throughout testing, the GTX 480 does maintain better anti-aliased performance, though. Meanwhile, Nvidia’s GeForce GTX 470 is generally outperformed by the Radeon HD 5850, winning only at 2560x1600 with AA applied (though it’s an unplayable configuration, anyway)"




"We’ve long considered Call of Duty to be a processor-bound title, since its graphics aren’t terribly demanding (similar to Left 4 Dead in that way). However, with a Core i7-980X under the hood, there’s ample room for these cards to breathe a bit.

Nvidia’s GeForce GTX 480 takes an early lead, but drops a position with each successive resolution increase, eventually landing in third place at 2560x1600 behind ATI’s Radeon HD 5970 and its own GeForce GTX 295. Still, that’s an impressive showing in light of the previous metric that might have suggested otherwise. Right out of the gate, GTX 480 looks like more of a contender for AMD's Radeon HD 5970 than the single-GPU 5870.

Perhaps the most compelling performer is the GeForce GTX 470, though, which goes heads-up against the Radeon HD 5870, losing out only at 2560x1600 with and without anti-aliasing turned on.

And while you can’t buy them anymore, it’s interesting to note that anyone running a Radeon HD 4870 X2 is still in very solid shape; the card holds up incredibly well in Call of Duty, right up to 2560x1600."



It becomes evident that the GTX 470 performs maybe 10% better than the 5850 on average at best, and the GTX 480 maybe 10% better than the 5870 on average at best. Yet the power consumption of a GTX 470 is higher than a 5870's, and the GTX 480 consumes as much power as a 5970.
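For a rough sense of what those deltas imply for efficiency, here is a minimal back-of-the-envelope sketch in Python. The ~10% performance advantages are the ones claimed above; the board power figures are assumed, TDP-class placeholder values, not the measured draws being discussed.

# Back-of-the-envelope perf-per-watt comparison for the claim above.
# The ~10% performance advantages come from the post; the wattages are
# assumed, TDP-class placeholder figures, not measured power draws.

def perf_per_watt(relative_perf, board_power_w):
    """Relative performance per 100 W of assumed board power."""
    return 100.0 * relative_perf / board_power_w

pairs = [
    # (Nvidia card, perf vs. rival, assumed W), (ATI rival, baseline perf, assumed W)
    (("GTX 470", 1.10, 215), ("HD 5850", 1.00, 151)),
    (("GTX 480", 1.10, 250), ("HD 5870", 1.00, 188)),
]

for (n_name, n_perf, n_w), (a_name, a_perf, a_w) in pairs:
    n_ppw = perf_per_watt(n_perf, n_w)
    a_ppw = perf_per_watt(a_perf, a_w)
    print(f"{n_name}: {n_ppw:.2f} vs {a_name}: {a_ppw:.2f} (ratio {n_ppw / a_ppw:.2f})")

Even granting the Nvidia cards the full 10%, the perf-per-watt ratio still lands clearly in ATI's favour with these assumed wattages.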

The Fermi generation is an improvement on the GTX 200 architecture, but compared to the ATI HD 5x00 series, it seems like a boatload of fail... =/



------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Original Topic:

http://www.hexus.net/content/item.php?item=21996


The benchmark is in Far Cry 2, which is an Nvidia-favoring game.


==Nvidia GTX285==

...What you see here is that the built-in DX10 benchmark was run at 1,920x1,200 at the ultra-high-quality preset and with 4x AA. The GeForce GTX 285 returns an average frame-rate of 50.32fps with a maximum of 73.13fps and minimum of 38.4fps. In short, it provides playable settings with lots of eye candy.

==Nvidia Fermi GF100==
Take a closer look at the picture and you will be able to confirm that the settings are the same as the GTX 285's. Here, though, the Fermi card returns an average frame-rate of 84.05fps with a maximum of 126.20fps and a minimum of 64.6fps. The minimum frame-rate is higher than the GTX 285's average, and the 67 per cent increase in average frame-rate is significant...

==Lightly Overclocked ATI 5870==
The results show that, with the same settings, the card scores an average frame-rate of 65.84fps with a maximum of 136.47fps (we can kind of ignore this as it's the first frame) and a minimum of 40.40fps - rising to 48.40fps on the highest of three runs.

==5970==
Average frame-rate increases to 99.79fps with the dual-GPU card, beating out Fermi handily. Maximum frame-rate is 133.52fps and minimum is 76.42fps. It's hard to beat the sheer grunt of AMD's finest, clearly.

Even after taking into account Far Cry 2's Nvidia bias, the results are not too shabby... in line with what we've been expecting: that the GF100 is faster than the 5870. It will most likely be faster in other games too, but by a smaller margin... Now the only question is how much it will cost...
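For what it's worth, here is a quick arithmetic check of the averages quoted above; the only inputs are the fps figures from the Hexus excerpts.

# Quick check of the Far Cry 2 average frame-rates quoted above.
# The fps values are copied from the excerpts; nothing else is assumed.

averages = {
    "GTX 285":       50.32,
    "GF100 (Fermi)": 84.05,
    "HD 5870 (OC)":  65.84,
    "HD 5970":       99.79,
}

baseline = averages["GTX 285"]
for card, fps in averages.items():
    gain = (fps / baseline - 1.0) * 100.0
    print(f"{card}: {fps:.2f} fps ({gain:+.0f}% vs GTX 285)")

gf100_vs_5870 = (averages["GF100 (Fermi)"] / averages["HD 5870 (OC)"] - 1.0) * 100.0
print(f"GF100 vs HD 5870: {gf100_vs_5870:.0f}% faster on average")

That lines up with the quoted 67 per cent gain over the GTX 285 and puts the GF100 roughly 28 per cent ahead of the lightly overclocked 5870 in this run.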
 


Actually, if they do that, there is a 99% chance all the world's nuclear missiles will be launched. Do you want to take that chance?
 


Nice Ghostbusters reference.

But what will happen is they won't get to say they have the ATX certification for it, which means essentially nothing, but people will be blindly afraid of an "uncertified" product and they'll sell fewer of them. It's the same thing with our products at my job: if they aren't UL certified, companies will refuse to buy them flat out. Is it because our products are unsafe? No, but if they aren't UL certified, people assume it could be so. All of our stuff IS UL certified, though, for that very reason.
 
The 300W limit is meaningless for 99% of retailers... The issue is that any OEM will either refuse to include it in a pre-built system or charge much more for warranty support. Since 99% of the cards in such systems will not be the halo part, no one will really lose much sleep over it.

I doubt that those buying a box from Alienware would give a rat's ass if the highest-end cards cost a flat $200 more to get more than a 6-month warranty.

My head is spinning from all the repeated "dual GPU does not count" nonsense... The fact that it is being validated as logical because moron ATI fans did the same thing before makes my face rapidly fall towards my palm... Really? The whole world is full of uneasy peon children? Sigh... I don't care if one hates dual-GPU cards (I know I currently do), but that just means it is something we should talk about, and even more reason not to apply a cut-and-dried rule to the comparison...

I want the best pet for me, which is a cat. Does that mean that dogs don't count as pets? That I can't consider a dog when I choose what I want in a companion? To claim one product is better than another based on an arbitrary feature that one individual dislikes, yet another may like, is just as ridiculous as giving out a title for best pet knowing full well that half of the folks around are diametrically opposed to half of the options.
 
It's amazing: we have this PCI-e standard for a reason, yet it's now OK to break it just so nVidia can get the fastest card back?

What's stopping ATI from breaking it too? If it's not under 300 watts, it doesn't count, and nVidia must meet the engineering challenges just like ATI did. You can't just change the rules because you lose with the old ones.
 


A simple workaround is not to buy a pre-built system if you're looking at Fermi... problem solved.

And if you're in the market for a pre-built gaming PC, stick to a console.
 


There's nothing stopping ATI from breaking it... if they wanted to, they could. As already mentioned by Daedalus, it's not like they'll lose a ton of sales on the dual-GPU halo card anyway, just some.

And yes, you can change the rules all you want if they are arbitrary rules.

I'll give you an example: gravity isn't an arbitrary rule. It isn't like you can engineer a bridge, ignore gravity, and forget the piers; the bridge will obviously not stand up. That's kind of like heat with graphics cards: you HAVE to engineer a way to get rid of it or your card will die, case closed. That is an engineering challenge, stuff you HAVE to design around.

Saying that you can only build the bridge so high is an arbitrary rule. Who gives a damn how tall it is, as long as it is designed to work at its best possible capacity? That is the 300W limit argument: it is a pointless limit that was only put there for "safety" and warranty purposes, aka more money that can be charged if you go above the limit, as Daedalus already explained as well.

But there is absolutely nothing stopping nVidia from making a 600W dual-GPU card if they truly wanted to... which would be pretty freaking cool if you ask me.
 
They can't meet the thermal or wattage limits, so screw the standards that are set up? All manufacturers adhere to these so all products can be interchanged and work together. This is getting crazy. Let me build a 100-gigabyte drive array and screw the standard for a hard drive bay; it will be 3 feet square and you'll need to take a miter saw to your case, but it will work.
 

It's not the manufacturers that adhere to it, it's the developers. AMD, Nvidia, or Intel can't get a card certified if it draws more than 300 watts. That doesn't stop end-user manufacturers from breaking those rules, but if something fails, it's up to them to offer support, not AMD, Nvidia, or Intel.

And FYI, it's already in the pipeline. http://www.pcgameshardware.com/aid,705735/Asus-Ares-HD-5970-done-right-first-benchmarks/News/
http://www.brightsideofnews.com/news/2010/3/3/sapphire-ready-to-launch-radeon-hd-5990-4gb.aspx

Due to their sizes, these two monsters require special environments: the Asus is oversized in height and width, and the Sapphire takes three slots. With either of them, you have to pay special attention to how you hook them up to your power supply, since hooking them up wrong will overload the PSU or your motherboard.
 


If a company offered an HDD or SSD that could provide me more speed and storage than what I have now, with the caveat that it takes up more space, or a strange assortment of it, I'd take it in a heartbeat. You can be damned sure the OEMs wouldn't, though.


This is the same thing. If it is produced out of spec, no big company will build with it, but as an enthusiast part no OEM will really care anyway. I care about performance per watt, but I don't care in the slightest whether I get that ratio in one or two slots.

It would make no difference in the grand scheme if they broke the spec. All it would result in is ATI breaking it for a different product. They would still have their niche, just even smaller than current halo products.

There is no need to be so melodramatic about it. It is not like anyone is making a card that uses a different slot type... just ones that use more power. The overclocked 5970s are not to spec either. If the statement "Warning: this card may consume more power/release more heat than ideal for a single-slot card" turns some off, that is all well and good, but most enthusiasts who want to spend $1,000 on a single card don't give a rat's ass if it means more numbers in 3DMark.
 


You obviously have no understanding of the graphics industry...

- Enthusiast parts are meaningless to the finances of a company beyond the mindshare/advertising boon they give to the money-making parts.
- The numbers released show the cards to be much closer than the 4000/200-series matchup, but since we have nothing solid to go by, who knows.
- Nvidia will release the best card they can; there is no law stating they 'have' to release a faster card.


The rest of your points are illogical fan drivel. The horrendous logic displayed in jumping from "PhysX can require a lot of power" to "therefore Fermi has to be able to deliver it" is absolutely mind-boggling... All it means is that they would want it to be able to deliver it. There is no causal relationship between the two; they are independent, and at best it would just provide a target (though being able to use PhysX with no hit to performance would be a childish target).
 


Didn't you just post something about not getting "melodramatic"? What the heck do "illogical fan drivel", "horrendous", and "childish target" sound like? It sounds like the typical ATI anger post over someone being optimistic about Fermi and Nvidia's little tidbits and promises of UNRIVALED PERFORMANCE AND QUALITY, built for the ultimate gaming experience.
http://www.nvidia.com/object/gf100.html
 
Does anyone recall the JEDEC standard for DDR2 and the issues that caused? IIRC the standard called for the RAM to run at 1.8V, but a lot of early adopters were getting BSODs and freezing at startup, which only went away if they upped the RAM voltage to 2.0V. As the process matured, though, RAM modules became available that run below the JEDEC standard.
 
Anyone can say a card is great. You know nVidia gives the 9400GT "Lifelike Gameplay".

"Gorgeous Graphics: Modern games and 3D applications demand more graphics performance than ever before and Intel integrated graphics simply aren't good enough. With 16 processing cores, the GeForce 210 delivers over 10x the performance of Intel integrated solutions! If you want to play popular mainstream games like World of Warcraft, Spore or Sims 3, the GeForce 210 is an essential addition to your PC."

It's amazing how much a company will back their own piles of junk.
Lols.

"80 Stream Processing Units provide enough power to tackle demanding games and applications."
"Enhanced Anti-Aliasing (AA) & Anisotropic Filtering (AF) - High-performance anisotropic filtering and anti-aliasing (4x AA) smooth jagged edges and create true-to-life graphics, for everything from grass to facial features."
 


/face palm

I have to assume you are joking... Why the flying fck does my post make you call it an ATI anger post? What in creation's name are you smoking?

To assume that something has to be good because the company has an assumed superiority in the eyes of a fan is drivel, it is illogical, and it is asinine. I can't in good conscience let someone post the logical equivalent of claiming that the new low-mileage Honda will use no gasoline at all given that gas use is "taxing"... I expect at least the semblance of rationality in the arguments posted by others.

If you want to defend such nonsense with what amounts to name-calling less enlightened than a six-year-old's, by throwing around brand affiliation, that is your prerogative...

By the way, while I may have used a touch too much hyperbole, there is a night-and-day difference between ridiculing what is nonsense and using melodrama, like maximiza did, to make a well-defined situation carry the connotation of something worse.

 