Nvidia Fermi GF100 Benchmarks (GTX470 & GTX480)

http://www.tomshardware.com/reviews/geforce-gtx-480,2585.html


"Crysis is perhaps the closest thing to a synthetic in our real-world suite. After all, it’s two and a half years old. Nevertheless, it’s still one of the most demanding titles we can bring to bear against a modern graphics subsystem. Optimized for DirectX 10, older cards like ATI’s Radeon HD 4870 X2 are still capable of putting up a fight in Crysis.

It should come as no shock that the Radeon HD 5970 clinches a first-place finish in all three resolutions. All three of the Radeon HD 5000-series boards we’re testing demonstrate modest performance hits with anti-aliasing applied, with the exception of the dual-GPU 5970 at 2560x1600, which falls off rapidly.

Nvidia’s new GeForce GTX 480 starts off strong, roughly matching the performance of the company’s GeForce GTX 295, but is slowly passed by the previous-gen flagship. Throughout testing, the GTX 480 does maintain better anti-aliased performance, though. Meanwhile, Nvidia’s GeForce GTX 470 is generally outperformed by the Radeon HD 5850, winning only at 2560x1600 with AA applied (though it’s an unplayable configuration, anyway)"




"We’ve long considered Call of Duty to be a processor-bound title, since its graphics aren’t terribly demanding (similar to Left 4 Dead in that way). However, with a Core i7-980X under the hood, there’s ample room for these cards to breathe a bit.

Nvidia’s GeForce GTX 480 takes an early lead, but drops a position with each successive resolution increase, eventually landing in third place at 2560x1600 behind ATI’s Radeon HD 5970 and its own GeForce GTX 295. Still, that’s an impressive showing in light of the previous metric that might have suggested otherwise. Right out of the gate, GTX 480 looks like more of a contender for AMD's Radeon HD 5970 than the single-GPU 5870.

Perhaps the most compelling performer is the GeForce GTX 470, though, which goes heads-up against the Radeon HD 5870, losing out only at 2560x1600 with and without anti-aliasing turned on.

And while you can’t buy them anymore, it’s interesting to note that anyone running a Radeon HD 4870 X2 is still in very solid shape; the card holds up incredibly well in Call of Duty, right up to 2560x1600."



From these numbers, the GTX 470 ends up maybe 10% faster than the 5850 on average, often less, and the GTX 480 maybe 10% faster than the 5870 on average. Yet the GTX 470 draws more power than a 5870, and the GTX 480 consumes as much power as a 5970.

The Fermi generation is an improvement on the GTX200 architecture, but compared to the ATI HD 5x00 series, it seems like a boat load of fail... =/



------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Original Topic:

http://www.hexus.net/content/item.php?item=21996


The benchmark is Far Cry 2, which is an Nvidia-favoring game.


==Nvidia GTX285==

...What you see here is that the built-in DX10 benchmark was run at 1,920x1,200 at the ultra-high-quality preset and with 4x AA. The GeForce GTX 285 returns an average frame-rate of 50.32fps with a maximum of 73.13fps and minimum of 38.4fps. In short, it provides playable settings with lots of eye candy.

==Nvidia Fermi GF100==
Take a closer look at the picture and you will be able to confirm that the settings are the same as the GTX 285's. Here, though, the Fermi card returns an average frame-rate of 84.05fps with a maximum of 126.20fps and a minimum of 64.6fps. The minimum frame-rate is higher than the GTX 285's average, and the 67 per cent increase in average frame-rate is significant...

==Lightly Overclocked ATI 5870==
The results show that, with the same settings, the card scores an average frame-rate of 65.84fps with a maximum of 136.47fps (we can kind of ignore this as it's the first frame) and a minimum of 40.40fps - rising to 48.40fps on the highest of three runs.

==5970==
Average frame-rate increases to 99.79fps with the dual-GPU card, beating out Fermi handily. Maximum frame-rate is 133.52fps and minimum is 76.42fps. It's hard to beat the sheer grunt of AMD's finest, clearly.

Even after taking Far Cry 2's Nvidia bias into account, the results are not too shabby... in line with what we've been expecting: that the GF100 is faster than the 5870. It will most likely be faster in other games too, but by a smaller margin... Now the only question is how much it will cost...
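
For anyone who wants to sanity-check those gaps, here's a quick back-of-the-envelope sketch in Python using only the average frame-rates Hexus quotes above; the percentages are just arithmetic on those figures, not new measurements:

# Far Cry 2 built-in benchmark, 1920x1200, ultra-high preset, 4x AA (averages quoted above)
avg_fps = {
    "GTX 285": 50.32,
    "GF100": 84.05,
    "HD 5870 (light OC)": 65.84,
    "HD 5970": 99.79,
}

# Express each card relative to the GTX 285 baseline
for card, fps in avg_fps.items():
    gain = 100 * (fps / avg_fps["GTX 285"] - 1)
    print(f"{card}: {fps:.2f} fps ({gain:+.0f}% vs GTX 285)")

# Direct GF100 vs 5870 comparison
print(f"GF100 vs HD 5870: {100 * (avg_fps['GF100'] / avg_fps['HD 5870 (light OC)'] - 1):+.0f}%")

That works out to roughly +67% for the GF100 over the GTX 285 (matching the figure Hexus quotes) and about +28% over the lightly overclocked 5870 in this one Nvidia-friendly title.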
 
The extreme restrictions they've placed on Tesla this generation mean only the top chips will do. Last generation they could use the lesser chips for the slower Teslas, but not this time; both versions have to use the top silicon, with the slower one coming first, of course, while they stockpile only the best parts for the top one coming later.
 


Yeah, you're right, but that was a while ago; in fact, I wasn't even involved in PC gaming/building at that time, so I don't even remember that release.

What I mean is that for the last 3-4 years I can't remember them releasing a real loser of a card, even though there has always been bad press about their cards before launch.
 


You mean like they did for the 5870 and 5970? 😛

People waste money, such is life.
 


Actually, the prices of the 5870 and 5850 are about right and would be perfect with a $50 decrease. The 5970 is a dual-GPU card; of course it's priced horribly.
 


?? I'm not hating on ATI for that; I'm just saying people who love either company will always rush to buy their products no matter how crappy they are. People on here make it sound like ATI doesn't have those faithful servants as well, lol.

I haven't bought ATI in a year or two, but I'm considering it in light of the events of the past half a year. Thing is, I didn't just rush out and buy the 5970 or 5870 as soon as they came out, because I realized there's pretty much nothing I play that needs them right now 😛, but I'm sure there are people who did, much like people will for Nvidia 😛
 
If you also bought another two screens you'd realise how much you *do* need that 5970 though.

That was the whole point of this ATI arch. Eyefinity was brought to us and made playable, even though it won't run everything perfectly yet. Think about a year from now: chances are that even Crysis will be running on max settings across 3x 30" screens.

Eyefinity is the real technology from this series, and even the people who thought it was a gimmick at first are starting to realise that it is a lot more.
 


Ugh it looks cool but I was so impatient and had to go and buy a GTX 295... a few weeks before the 5870s came out :heink:
 
I am no fan of Eyefinity for some uses, but it would be AMAZING for a hardcore RTS or flight sim. Also, with projectors it would be amazing for everything!

That said, Eyefinity is best for productivity, where it is extremely helpful.
 
But you don't really need 'Eyefinity' for productivity; in that case it's just an updated version of Hydra (similar to Matrox Desktop, IMO). That they added 6-display support on a single card is the impressive thing IMO, and theoretically 4 outputs from a single 'standard' card (although that requires DVI+VGA+HDMI+DP to work without overlapping support).

I use 3 monitors at work from my laptop on a Dell docking station (with a Matrox G550 inside), and a colleague I set up has 4 off a G450, and I LOVE Matrox's display tools.

But really, you could do as much basic desktop multi-monitor work with just multiple cards; it's just that now AMD is getting serious, so you can do the whole 4x6-display thing, which even with multiple cards is pretty hard to do with other solutions (although still not impossible).

Anywhoo, I just want better multi-monitor support on a laptop natively, and with the Mobile HD5870 (really just a mobilized 5770) supporting 6 display outputs + DP mirroring, I would love to get a laptop with a single HDMI and 2 DP or miniDP connectors on the back to give me triple-monitor support wherever I go. I've gotten too used to it at work. :sol:
 
TGGA you are right, but now that we have standard GPUs able to do this without add-ons it makes a good deal of sense for those who need it.

As for one huge-ass monitor, well, the angled effect of two of the monitors forming a "cockpit"-type experience would be ideal for the game genres I listed.

Overall I would buy a 32" HDTV first, but that's just me.
 


Too expensive.

You can get a 4K monitor, but have you seen the prices on those things?

For work though, I'd personally prefer 3 IBM monitors (what is it with IBM and awesome monitors like my P260?), anywhooo, 3 IBM T221 QUXGA (3840x2400 each) monitors over one big 1080p monitor. I need the pixels, not the size, personally.

Too bad they still cost a frickin' fortune. I first put in a requisition for one in 2001, and got an "Are you CRAZY/Kidding?!" reply back for the $8,999 price tag (the PIP model was $9,999), but they had no problem with me using two $3,499 P260s. :ange:

Now we have about 8 of those P260s sitting in a corner just for me (no one else uses CRTs anymore), and I bet had we bought those T221s they'd still be used and fiercely guarded the way I guard my P260s.
 

Yup, as resolutions get bigger and monitors get bigger, I find I need fewer and fewer monitors. Shame the monitor industry moved to 1920x1080, killing off 1920x1200 for the most part 🙁 (at least at Newegg and Tiger Direct etc. I see that was the trend). It's a computer; most things on a computer need more vertical space than horizontal. Since when does Office scroll to the side, and since when are webpages built horizontally?
 

A 30" is $2k here :lol: I wouldn't mind multi-monitor for general use, but not for games.


16:9 may be cheaper to manufacture than 16:10, same as 16:10 is cheaper than 4:3. It seems that only the cheapo TN panels are going 16:9 anyway, which sucks because I only buy TN. 🙁
 


Have you seen that commercial where two guys walk into a room and there's a hot gal in the pool outside, and one guy goes, "So, where's the TV?", and the other guy goes, "Dude, it's right there" and changes the channel to football, and the entire wall that you thought was a glass window with a girl outside turns out to be a TV?

Now tell me again... is there any amount of money you WOULDN'T pay to be able to do that?? 😛
 
As someone who also thought ATI Eyefinity was a waste of time, I have something to share.

A friend of mine set up Eyefinity. I told him how stupid I thought it was, so he made me (literally 😀) go to his house to try it out.

I'm now sold on the idea, and want to get 2 more monitors of my own. The bezels are completely a non-issue. The extra screens are in your peripheral vision; you do not look at them directly. You keep your view centered on the middle screen, but the other two give you a wider viewing angle that makes a massive difference in FPSs. It really does have to be seen to be understood. It's very hard to explain, but I assure you it really is a very cool technology.

And to mac from earlier: the whole point of this thread is to examine the available evidence, like the released benchmark. It's that or nothing. I know we can only make educated guesses, but that's exactly what most of us are doing. Disregarding that evidence makes even being in this conversation a waste of time.
 


If TGGA's "4K monitor" meant price ($4,000), then why would he mention "but have you seen the prices on those things?"
I think he probably means resolution; 4000x3000 (4:3) could be a possibility.
 


That's what I was trying to convey: it's indescribable. The extra monitors completely fill your vision in a way that no single monitor (other than maybe the curved Alienware one) can.
 


After playing Morrowind and Oblivion in surround, I like it, but it used to be a pain; I'm hoping the laptop described above will fix that.

BTW, when I say 4K monitor, I don't mean the typical 30" 2560x1600; I mean the 4K resolution range: 4096x2160 - 4096x3072 (2160p being the new 'standard', like on RED cameras).

The only ones I've seen are above $12,000, many above $30,000.

If you know of one for $2K that's sweet.

There are some projector tricks for it, like 'wobulation' for DLPs, but that would blur pixels, almost like edge bleeding/blending, and likely drive me crazy for text/desktop, but it would likely look awesome for games/movies.
 