GeForce GTX 480 And 470: From Fermi And GF100 To Actual Cards!


brisingamen

I'm also betting there's a reason no site I've found that's reviewed Fermi has even attempted to overclock it: it's already about to cook itself...

The wattages are bad enough, but the temps are just mind-bogglingly unacceptable to me.

The one thing I can say about many of the reviews I've read so far on these chips is that drivers and architecture play bigger roles than ever. A year ago there was a lot of parity; now, the games you play and the engines they run on really determine which card "wins".

In a lot of these benchmarks, the memory allocation is the only thing really giving Nvidia an advantage at high resolutions, not the chip itself, by the looks of it...

The 2GB variants of the 58xx series are going to be hella sweet. I really wonder how long it will take to see a 512-core GTX part...

Four grand for a Tesla part is just laughable.

It will take at least two more revisions, IMO, to make this chip a "good" buy: a fully functional part with decent overclocking headroom.
 
This comes as no surprise to me. When I saw the frame rates in the Heaven benchmark compared to the ones some kid got with his HD 5970, I knew Fermi was toast. I also knew that nVidia would be dumb and greedy and basically price its way out of the competition. The release of the Radeon HD 4870 signaled the turning point in this war, as it beat the GTX 260 while maintaining a $60-lower price point. nVidia still hasn't learned and probably never will.

I'll admit the performance figures are slightly higher than I expected, but these cards run so hot that they'll melt themselves (or damage mobos). It seems nVidia did not do its homework (AGAIN) and did not employ the kind of cooler this card needs to survive. I'm not impressed, and neither is anyone else of note that I've seen so far.

nVidia has a bigger problem, though. Its partners have been jumping ship or at least diversifying (PowerColor and XFX come to mind), and this trend will most likely continue if the GTX 4xx series doesn't sell, simply because the partners need to turn a profit for once. ATi is the clear winner here.
 
One other interesting point to consider here: the Washington Post indicated that at least one of nVidia's wins was due to the fact that it had 1.5x the RAM of the HD 5870. Since they're still out there, how about a comparo with the 2GB HD 5870?
 
Guest
You also have to take into consideration the noise this card generates. You'll be practically deaf after using it for actual gameplay. It is very, very noisy!
 
[citation][nom]pei-chen[/nom]Wow, it seems Nvidia actually went ahead and designed a DX11 card and found out how difficult it is to design. ATI/AMD just slapped a DX11 sticker on their DX10 card and sell it as DX11. In half a year, the HD 5000 will be so outdated that all it can play is DX10 games.[/citation]

No, that's nonsense. ATI's 5x00 cards are every bit as much DX11 cards as Nvidia's GTX 4x0 cards.
 

mooseslayer

Yes, there's no "wow" feeling with these cards, BUT up in the north you get a warm house, and you can see when your machine is running... the lights are blinking ;)

I think it's only a matter of taste... I go green :)
 
Guest
Yes, quite disappointing. My last three graphics cards were Nvidia (7600GT, 8600GT, GTS 250), but I have recently switched to ATI (HD 5850). Nvidia have gone downhill: my 8600GT was faulty (constant stuttering on the desktop and in games), and they replaced it with an even worse-performing card. Combined with the recent driver issues (melting cards), I'm completely turned off Nvidia. Shame boys, shame.
 
Guest
The charts are deceiving and should have been ordered by AA performance. Who buys a high-end card and runs without AA? That would flip the results quite a bit in NVIDIA's favor. I wish the performance were a bit better, but these are early days for NVIDIA with drivers, silicon, and DX11, and I expect driver development to improve performance further. In the end, we have to see what the street prices on these cards will be.

I think the emphasis so many people put on temps is an over-reaction: as long as the card is not dumping all that heat into your case and the cards are not failing, who cares what the heat output is? Unless you live somewhere with no AC in the summer and your room runs above average temps, it really makes no difference whatsoever.
 

Sihastru

You have to take into consideration the state of nVidia's BETA driver versus the much-hyped (30% this, 20% that), very mature ATi 10.3 driver. 10.3 brought big gains in a lot of areas, and it launched just before Fermi did. I think ATi squeezed every last drop of performance out of their cards with that driver.

People forget history quickly: there was a time when the GTX 2x0 looked poor compared to the HD 4xx0, and that changed after one or two driver revisions.

What will happen is that the GTX 470 will match the HD 5870's performance at a LOWER price, while the GTX 480 will get within 15% of the HD 5970 at a MUCH LOWER price.
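If prices land that way, a quick price/performance comparison will settle it. Here's a minimal sketch in Python, using the widely reported launch MSRPs; the relative performance indices are placeholder assumptions, not benchmark results, so substitute numbers from whatever test suite you trust:

[code]
# Price/performance sketch. MSRPs are the widely reported launch
# prices; the performance indices are PLACEHOLDERS, not data.
cards = {
    # name       (launch MSRP in USD, relative performance index)
    "GTX 480": (499, 100),   # baseline
    "GTX 470": (349, 85),    # placeholder
    "HD 5870": (379, 95),    # placeholder
    "HD 5970": (599, 130),   # placeholder
}

for name, (price, perf) in cards.items():
    print("%s: %.1f performance per $100" % (name, perf / price * 100))
[/code]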

What is strange is that other review sites already place Fermi there, so why are these results so different? They are all over the place... Maybe there were some mix-ups with the drivers?

Or has TH been overrun by ATi fanbois, so its articles are now written to please them more than any other category of readers? It is now impossible to post a comment that puts AMD/ATi in a bad light on this site; you'll instantly get scored as a "Useless message". Yeah, I know, results are results, but when results differ this much, a question mark arises.

In any case, I would put the party on hold for a moment, since even these results are open to interpretation. We all know what the fanbois are saying. ATi fanbois: HOT, EXPENSIVE, FAIL. nVidia fanbois: HOT, COMPETITIVELY PRICED, WIN.

What everyone needs to do is wait for nVidia to come up with a more mature driver. ATi had ~5 months to come up with their perfect storm. Launch drivers always give bad results; that's true for ATi and it's true for nVidia. Don't be surprised when the next two nVidia driver iterations give these crippled, hot Fermi cards 20% or even 30% performance boosts. 10.3 did that for ATi.
 

Tamz_msc

Some reasons why Nvidia should already be working on a GTX 485/475:
1. Overshooting Cypress's transistor count by over a billion has cost them dearly in the power and temperature departments.
2. Clock speeds must be increased; 600 MHz for a high-end card is simply not good enough. After all, the number of transistors alone does not determine FPS in games to that extent (see the throughput sketch just below).
3. Promising 512 cores and delivering 480 will hardly help customers' trust in Nvidia.
With these measures and new drivers, Nvidia will eventually increase the cards' performance, but the time it will take, plus the added costs and resources spent on a redesign, make me see a clear winner this generation: ATi. They simply won on a "first come, first served" basis.
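To put numbers on point 2, here is a minimal sketch using the published launch shader specs (480 CUDA cores at a 1401 MHz shader clock for the GTX 480; 1600 stream processors at 850 MHz for the HD 5870), counting two FLOPs per ALU per clock for the multiply-add:

[code]
# Peak single-precision throughput from published launch specs:
# ALUs * shader clock (MHz) * 2 FLOPs per clock (multiply-add).
def peak_gflops(alus, shader_clock_mhz):
    return alus * shader_clock_mhz * 2 / 1000.0

print("GTX 480: %.0f GFLOPS" % peak_gflops(480, 1401))   # ~1345
print("HD 5870: %.0f GFLOPS" % peak_gflops(1600, 850))   # ~2720
[/code]

On paper Cypress has roughly twice Fermi's peak throughput, yet the game benchmarks are close. Raw unit counts and clocks only set the ceiling; architecture and drivers decide how much of it games actually reach.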
 

Soulmachiklamo

Seriously, I'm not going to pick Nvidia, even though I'm actually a fan of theirs. Anyway, like everyone else says: it's too late, too power-hungry, and gives about the same test results as the ATI GPUs.
Some cores are even disabled because they aren't capable of being used XD. After six months, I must say Nvidia didn't fulfill their promises.
I'm going for an MSI 5850 and OCing it, goddammit :)
 
Guest
For the $100 premium, I'd expect it to come bundled with a hooker, a porno, a pack of condoms, and smokes. Otherwise, I don't see what's at all appealing about it; the box, maybe?
 

azgard

[citation][nom]edilee[/nom]To the author of this article: great in-depth review! Also, the retail Dirt 2 DOES in fact have a built-in benchmark; it is in the "options" section outside the trailer, in the graphics section at the bottom. In fact, I ran the benchy a couple of nights ago before and after I did a CPU swap. I do not recall the benchmark being in the demo; it seems to me it was greyed out, if I remember correctly, but I could be mistaken. It is in the retail version for sure. Interestingly enough, the Nvidia cards hit right in the margin of performance vs. ATI's cards I thought they would, except for the marks where the 480 was extremely close to the 5970. This should trouble ATI, and I say this because the Fermi cards are not fully functional yet with these first two debut cards. And for all the LOL comments from the ATI fanboys, I have one question: in a few months, when all cores are active on the Nvidia cards, what name will be topping all the benchmark charts? Nvidia, Nvidia, Nvidia. Remember, all the really good cards come towards the end of a series... and where do you think a Fermi X2 card is going to place on the charts? The top, with much ease. Aside from that, this Nvidia card series release will hopefully drive card prices down for all ATI and Nvidia fans alike, once manufacturing yields improve for both companies.[/citation]

Fermi X2? If you're referring to a dual-GPU card, the only way that's going to happen is the way the GTX 295 did it: a die shrink. Have fun waiting two years for your card (hey, that actually fits in perfectly with Nvidia's new marketing cycle!). What I'm really shocked by is how no one has touched on the fact that releasing these stripped-down cards means yields truly are as terrible as has been speculated for months. Can't wait to see what supply looks like once they finally hit the market.
 

hundredislandsboy

Yup. I want one not because I play games, but because it's all about the Heaven benchmark, which I want to run over and over and over; afterwards I'll tweak my GPU, overclock it by 1 MHz, go back to running the benchmark, watch it over and over and over, then repeat the cycle again.

Lol, my next gaming rig is CrossFire dual 5850s, which should be $225 each in about a month (sometimes open-box for $200-ish at Newegg).
 

Tomtompiper

Welcome to disappointment day; hope it was worth the hype, sorry, the wait. The next time Nvidia starts showing us boxes months before launch, we will know what to expect.
 

Quinid

Well..... that sucked..... If nothing improves by this fall, I'm switching to ATI. I'm beginning to think it will be worth dealing with crap drivers to get more bang for the buck.
 

lmpi

I want to know what makes a card "single-GPU" :D The GTX 470 and 480 have the power characteristics of dual-GPU cards like the GTX 295. To my mind, a single-GPU card is one you can build a dual-GPU card out of :D, and that won't happen with the GTX 470/480 because of their power consumption.
 

scrumworks

Funny how Tom's is trying to sugarcoat a seemingly big failure in the conclusion. Now all of a sudden DX11 is all-important (it wasn't when Radeons had it first), and 3D Vision now plays a huge role in decision making. 3D Vision is just a poor gimmick that requires an expensive 120 Hz panel and glasses, and it halves the gameplay frame rate. No mention of ATI's out-of-the-box Eyefinity, a much cooler feature available across the whole lineup, which Nvidia does not have.
 

Lewis57

Hahaha. This is exactly what I personally predicted: higher power consumption (more than the 5970) and less output (quite a lot less than the 5970).

Consider me an ex-nVidia loyalist.
 

pinkfloydminnesota

Pretty good, I mean, how much improvement do you expect? What is it supposed to do, exactly, double the previous frame rates?

I'd be interested to know the impact the extra power consumption would have on my wallet. That should be included in the review; it's part of the price, after all. And idle power consumption matters a lot when a computer sits idle more than 18 hours a day. What does that mean in dollars? (Besides "turn it off"?)
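A minimal sketch of that math, assuming roughly 20 W of extra idle draw and 100 W of extra load draw versus an HD 5870, at an assumed $0.12/kWh rate (all three figures are assumptions; substitute your own):

[code]
# Rough yearly cost of the extra power draw. The wattage deltas
# and the electricity rate are ASSUMPTIONS, not measurements.
RATE_PER_KWH = 0.12              # USD per kWh (assumed)
IDLE_DELTA_W = 20                # extra idle draw in watts (assumed)
LOAD_DELTA_W = 100               # extra load draw in watts (assumed)
IDLE_HOURS, LOAD_HOURS = 18, 2   # hours per day, per the post above

extra_kwh_per_day = (IDLE_DELTA_W * IDLE_HOURS + LOAD_DELTA_W * LOAD_HOURS) / 1000.0
print("~$%.2f extra per year" % (extra_kwh_per_day * 365 * RATE_PER_KWH))
[/code]

With those numbers it works out to roughly $25 a year, so the power delta is real but small next to the purchase price.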
 