Nvidia Fermi GF100 Benchmarks (GTX470 & GTX480)

http://www.tomshardware.com/reviews/geforce-gtx-480,2585.html


"Crysis is perhaps the closest thing to a synthetic in our real-world suite. After all, it’s two and a half years old. Nevertheless, it’s still one of the most demanding titles we can bring to bear against a modern graphics subsystem. Optimized for DirectX 10, older cards like ATI’s Radeon HD 4870 X2 are still capable of putting up a fight in Crysis.

It should come as no shock that the Radeon HD 5970 clinches a first-place finish in all three resolutions. All three of the Radeon HD 5000-series boards we’re testing demonstrate modest performance hits with anti-aliasing applied, with the exception of the dual-GPU 5970 at 2560x1600, which falls off rapidly.

Nvidia’s new GeForce GTX 480 starts off strong, roughly matching the performance of the company’s GeForce GTX 295, but is slowly passed by the previous-gen flagship. Throughout testing, the GTX 480 does maintain better anti-aliased performance, though. Meanwhile, Nvidia’s GeForce GTX 470 is generally outperformed by the Radeon HD 5850, winning only at 2560x1600 with AA applied (though it’s an unplayable configuration, anyway)"




"We’ve long considered Call of Duty to be a processor-bound title, since its graphics aren’t terribly demanding (similar to Left 4 Dead in that way). However, with a Core i7-980X under the hood, there’s ample room for these cards to breathe a bit.

Nvidia’s GeForce GTX 480 takes an early lead, but drops a position with each successive resolution increase, eventually landing in third place at 2560x1600 behind ATI’s Radeon HD 5970 and its own GeForce GTX 295. Still, that’s an impressive showing in light of the previous metric that might have suggested otherwise. Right out of the gate, GTX 480 looks like more of a contender for AMD's Radeon HD 5970 than the single-GPU 5870.

Perhaps the most compelling performer is the GeForce GTX 470, though, which goes heads-up against the Radeon HD 5870, losing out only at 2560x1600 with and without anti-aliasing turned on.

And while you can’t buy them anymore, it’s interesting to note that anyone running a Radeon HD 4870 X2 is still in very solid shape; the card holds up incredibly well in Call of Duty, right up to 2560x1600."



It becomes evident that the GTX 470 averages maybe 10% or less ahead of the 5850, and the GTX 480 maybe 10% or less ahead of the 5870. Yet the GTX 470 draws more power than a 5870, and the GTX 480 consumes about as much power as a 5970.
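To put that trade-off in concrete terms, here's a minimal sketch of the arithmetic behind it; the frame rates and wattages below are illustrative placeholders rather than figures from the review, so substitute measured values from your preferred benchmark.

```python
# Rough sketch of the "a bit faster, but at what power cost?" comparison.
# The numbers below are illustrative placeholders, NOT measured results;
# plug in average fps and measured power draw from an actual review.

def pct_faster(fps_a, fps_b):
    """Percentage by which card A outruns card B."""
    return (fps_a / fps_b - 1.0) * 100.0

def perf_per_watt(fps, watts):
    """Frames per second delivered per watt drawn."""
    return fps / watts

# Hypothetical GTX 470-class vs HD 5850-class numbers: ~10% faster, ~35% more power.
fps_a, watts_a = 55.0, 215.0
fps_b, watts_b = 50.0, 160.0

print(f"A is {pct_faster(fps_a, fps_b):.1f}% faster")
print(f"A: {perf_per_watt(fps_a, watts_a):.3f} fps/W vs B: {perf_per_watt(fps_b, watts_b):.3f} fps/W")
```

With numbers in that ballpark, the slower card actually delivers more frames per watt, which is the whole complaint.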

The Fermi generation is an improvement over the GTX 200 architecture, but compared to the ATI HD 5x00 series, it seems like a boatload of fail... =/



------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Original Topic:

http://www.hexus.net/content/item.php?item=21996


The benchmark is Far Cry 2's built-in tool, and Far Cry 2 is a game that favors Nvidia.


==Nvidia GTX285==

...What you see here is that the built-in DX10 benchmark was run at 1,920x1,200 at the ultra-high-quality preset and with 4x AA. The GeForce GTX 285 returns an average frame-rate of 50.32fps with a maximum of 73.13fps and minimum of 38.4fps. In short, it provides playable settings with lots of eye candy.

==Nvidia Fermi GF100==
Take a closer look at the picture and you will be able to confirm that the settings are the same as the GTX 285's. Here, though, the Fermi card returns an average frame-rate of 84.05fps with a maximum of 126.20fps and a minimum of 64.6fps. The minimum frame-rate is higher than the GTX 285's average, and the 67 per cent increase in average frame-rate is significant...

==Lightly Overclocked ATI 5870==
The results show that, with the same settings, the card scores an average frame-rate of 65.84fps with a maximum of 136.47fps (we can kind of ignore this as it's the first frame) and a minimum of 40.40fps - rising to 48.40fps on the highest of three runs.

==5970==
Average frame-rate increases to 99.79fps with the dual-GPU card, beating out Fermi handily. Maximum frame-rate is 133.52fps and minimum is 76.42fps. It's hard to beat the sheer grunt of AMD's finest, clearly.

Even after taking Far Cry 2's Nvidia bias into account, the results are not too shabby... in line with what we've been expecting: that the GF100 is faster than the 5870. It will most likely be faster in other games too, but by a smaller margin... Now the only question is how much it will cost...
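For what it's worth, here's a quick sanity check on the gaps implied by the averages quoted above, using only the numbers Hexus published:

```python
# Average frame rates quoted in the Hexus Far Cry 2 runs above
# (1920x1200, ultra-high-quality preset, 4x AA).
avg = {
    "GTX 285": 50.32,
    "GF100":   84.05,
    "HD 5870 (lightly OC)": 65.84,
    "HD 5970": 99.79,
}

def pct_faster(a, b):
    """How much faster card a is than card b, in percent."""
    return (avg[a] / avg[b] - 1.0) * 100.0

print("GF100 vs GTX 285: +%.0f%%" % pct_faster("GF100", "GTX 285"))               # ~67%, matches the article
print("GF100 vs HD 5870: +%.0f%%" % pct_faster("GF100", "HD 5870 (lightly OC)"))  # ~28%
print("HD 5970 vs GF100: +%.0f%%" % pct_faster("HD 5970", "GF100"))               # ~19%
```

So in this Nvidia-friendly test, GF100 sits roughly a quarter ahead of a lightly overclocked 5870, while the dual-GPU 5970 still clears it by almost 20%.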
 


Yep seen it, it's a Rogers commercial here.

And yeah, I've been talking about that for years, since we first discussed the idea of OLED printing making it 'cheap' to make big panels (where's my DAMN OLED WALL? It's 10+ years since Mitsubishi's 10"!). My ideal would be to print a big sheet of 'OLED WALLPAPER' and make a whole screen wall that, when not watching TV or a movie, could turn into:

- A Cafe view of the Eiffel Tower
- A Picture Window in a chalet atop a Mountain (like Eagle Eye @ Kicking Horse) watching the sunset
- A sunrise on the beach
- A Rain forest
- etc.

Biggest problem is connecting the immense lattice of wires required to address whatever ungodly-res x ungodly-res panel that would end up being.
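For a sense of scale, here's a rough back-of-the-envelope sketch. It assumes a conventional matrix-addressed panel, where subpixels share row and column drive lines instead of each getting its own wire; the resolutions are just examples.

```python
# Back-of-the-envelope: how many drive lines a wall-sized panel needs,
# assuming conventional matrix addressing (shared row/column lines,
# ~3 data columns per pixel for RGB) versus naively wiring every subpixel.

def wire_counts(width_px, height_px, subpixels_per_px=3):
    matrix_lines = width_px * subpixels_per_px + height_px   # column data lines + row scan lines
    naive_wires = width_px * height_px * subpixels_per_px    # one wire per subpixel (hopeless)
    return matrix_lines, naive_wires

for name, (w, h) in {"1080p": (1920, 1080),
                     "4K":    (3840, 2160),
                     "8K":    (7680, 4320)}.items():
    matrix, naive = wire_counts(w, h)
    print(f"{name:>5}: ~{matrix:,} shared drive lines vs {naive:,} individual subpixel wires")
```

Even an 8K wall works out to tens of thousands of shared lines rather than tens of millions of individual wires; the hard part is routing all of them along the edges of something room-sized, not the raw count.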

Yeah, I'm already there wanting it, but with 150" plasmas costing a bazillion dollars, I'm waiting for OLED.

BTW I already have a 1280x1024/1080i projector and can light up half my living room wall (which is ~15' high), but it's not the same of course.
 


Heh, yeah, I love my projector. It was fun at university because it was like a cheap movie theatre. We watched the whole season of Band of Brothers on it... loved that.

Runco makes some nice LED projectors now, for $15,000 or something. Runco always makes nice theatre projectors, though; I plan on buying one when I finally get a theatre built in my future house.

They also make these... http://www.runco.com/vw-100hd.html or this http://www.runco.com/cw-95dhd.html

It would only cost you $40 000 for one of those, not too much 😛
 
Yeah, I like Runco a lot, I just can't justify the price. I was looking at a 3-chip DLP of theirs for a while, the international model (supports 16p, 24p, 30p, 50i, 50p, 60i, 60p, 75p, 100i, 100p [unfortunately no 120p]), which seemed like the best option for me, but they discontinued it (and it was about $12K), so I couldn't do it.

The problem with those above is that they're still 'low' resolution at 'just' 1080p, which is OK for video and gaming but not for desktop use; I would be annoyed to have something that big and not at least 2K.
 



Yep. A single larger screen simply would never give the feeling of peripheral vision.

The bezels are only an issue in very few games; if you play FPS or driving games etc. they aren't even noticeable. I was playing some god-game thing and they were an annoyance, though I think that was because of the side-screen stretching, which I think might be changeable as a setting.

Dragon Age: Origins looks absolutely amazing btw.
 


I disagree, they just need to start producing more wrap-around screens like Alienware's.

http://www.wired.com/gadgetlab/2008/01/ces-2008-alienw/

THAT I would invest in.... no bezels ftw!
 
Bezels are not that bad with bezel compensation software. But I must admit, having seen bezel compensation in action, you can't really go back to not having it. It's similar to me upgrading to a widescreen monitor: you just can't look at a square one the same way anymore.
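For anyone curious what bezel compensation actually does to the numbers: the driver pretends the gap between panels is covered by extra, hidden pixels, so geometry stays continuous across the bezels instead of jumping at the seams. A minimal sketch below; the pixel pitch and bezel gap are assumed example values, not the specs of any particular monitor.

```python
# Minimal sketch of bezel compensation for a 3x1 landscape multi-monitor group.
# The driver hides "ghost" pixels behind each bezel-to-bezel gap so straight
# lines stay continuous across the physical gaps.
# All physical measurements below are assumed example values.

def compensated_width(panel_w_px, num_panels, bezel_gap_mm, pixel_pitch_mm):
    ghost_px = round(bezel_gap_mm / pixel_pitch_mm)  # pixels hidden behind each gap
    gaps = num_panels - 1
    return panel_w_px * num_panels + ghost_px * gaps, ghost_px

# Example: three 1920x1080 panels, ~0.25 mm pixel pitch, ~30 mm of combined
# bezel between adjacent screens (assumed values).
total_w, ghost = compensated_width(1920, 3, bezel_gap_mm=30.0, pixel_pitch_mm=0.25)
print(f"Plain 3x1 group:    {1920 * 3} x 1080")
print(f"Bezel-compensated:  {total_w} x 1080  ({ghost} hidden px per gap)")
```

The game renders at the larger compensated resolution, and whatever falls behind the bezels simply never gets shown, which is why objects no longer look offset or broken at the seams.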
 


Probably costs 5x as much as three 22" LCDs and only has half the resolution.

It's not worth it and never will be. And if it was, you could just eyefinity 2 or 3 of them together. 😛
 

The only time I would want an immersive setup is if I were playing an RPG or a driving game (and so help me, I don't even see the appeal there); otherwise I'd rather have the competitive advantage of seeing more of the game in tunnel vision than rely on my peripheral vision. Isn't that teaching you the wrong real-life reaction anyway? If I see someone attacking me from the side, I just flick my wrist to the right to react, all without turning my body. Using my peripheral vision without actually fully reacting to it just makes me feel like I'm sitting at a desk being a fat bastard playing a game, because instead of my actual reaction, which would be to at the bare minimum turn my head, I'm teaching myself only to move my fingers.
For things like FPS, RTS, action-adventure, puzzle, etc. I don't see much merit, especially as most of those (excluding FPS) are not first-person-view games. And do you really want to feel like you're killing those _generic bad guys_? I don't; I'd rather feel like this is all a game.

At best, I've played around with it for an hour or so; I thought it was interesting, but I didn't find it so great that it's worth setting up my monitors and computer around constantly using it.

Not sure if my angry rant was coherent or not.
 


ZOMGOD

... Eyefinity wrap-around monitors, 360-degree view

THAT would be worth it....
 
I was testing Far Cry 2 in Eyefinity tonight and it struck me, very obviously, that this is the future of gaming. My best friend claimed she wasn't interested until she saw it; then she was basically mesmerised watching it.

All it needs is a new, cheap method of displaying pixels on a cling-film type surface. We might be 5-10 years away from that, but it's coming :)
 
I don't mean this in an offensive manner, but hearing about multiple female gamers is always refreshing; cheers to you and your friend Jenny! It's nice to see the societal boundaries between the sexes melt away, to think many people would say that gaming should only be for males! As a guy who has known many women who were better than him at gaming, I reject that notion.

Back on the topic at hand: I like the Eyefinity idea, and I know it won't be long before we see minimum-bezel monitors hit the market just for this purpose, but I will wait a bit until the hardware needed to play at such high resolutions becomes a little more mainstream, maybe with the 6xxx release.

I also hope that nVidia gets on board with this; the more I hear about it, the more my opinion changes in its favor. We need nVidia to compete in this space, otherwise we are looking at a proprietary solution, albeit one much more useful and promising than PhysX. It is also nice that ATI did not fully lock down their technology the way nVidia loves to do.

All in all, I see it as a legitimate option now, and especially in the future. Now I'm wondering if the next-gen consoles will be influenced by it.
 

If you could afford a six-monitor 360-degree Eyefinity setup, you could probably afford some sort of pulley system to lower the monitors into place when you are ready.
 


:lol: That is a funny image.

But I think a simple mini-desk on wheels would suffice.
 



Haha, no worries 😀 I'm more of a gamer than she is; both of us play WoW and LotRO though. We also have another friend from Sweden who plays LotRO as well.
 
Female gamers are mostly very cool. Unlike males, they don't play so serious that the slightest chance of not winning causes them to go into a raging fit. And they don't usually spam their mic at the age of 10. Of course, there are exceptions. Those that scream "DON'T KILL ME I'M A GIRL!!!!" will get booted quickly :)
 


Female gamers are like the rare pearls in the algae infested, pollution choked waterways that is the world of online gaming.
 


Huh? You're such a topic hijacker!
What about she-male gamers, what do they yell out? Don't kill me, I'm a trans-gender!?
 


is that what you say? 😀

Thread derailment, but +1 to female gamers. I would marry any chick that genuinely likes video games (playing Wii doesn't count), but I have actually yet to meet a girl who does 🙁

Back on topic (kinda): I'm pretty sure Nvidia will offer cards with 3-monitor output (they may even go to 4 to try and one-up ATI), as it's not really anything difficult to implement; it's just something no one has cared about until now.
I agree that Eyefinity (and whatever Nvidia comes up with) is the future, and I think eventually most hardcore gamers will have a 3-display setup. LCDs are so cheap nowadays that it's a viable solution even for budget-conscious gamers; I can pick up two very nice 1080p 22-inch monitors for half of what I paid for my 5870, and after using it, I've decided this will be my next upgrade.
 


You don't know what you're talking about, on both counts. Do you even know how nVidia implements monitor-output support on large chips? Easy, pff!! [:thegreatgrapeape:5]
In order to 'one-up ATi', nvidia has to add 7-monitor support, not 4. :hello:

 