GTX480 / GTX470 Reviews and Discussion

Looks like 93C really is where nV wants this card. I'd really have a hard time swallowing a 70C-ish idle temp, but I guess they're trying to balance fan throttle and noise against acceptable heat.
All in all, I'm waiting for a distributor rebuild; something that could idle around 50C and just bounce around the high 80s. I think that's probably as far as air cooling and a huge oversized heatsink will go for these things.
 


That last bit should be 'BETTER Thermals', not 'GOOD', and guess what, if you put it under a waterblock it will be even better.
But it's the same for ALL cards, and the question isn't just 'can we get below 100C', it's 'what's the difference?'. Next article: Fermi is cooler on LN2? :sarcastic:
If the standard practice is for reviewers to test in an open case or in whatever case they usually use, then as long as they keep that consistent, the numbers are consistent. We knew Fermi had issues with cases when they started the whole program of certification and requiring specialized cooling. The end result is the same: HEAT IS AN ISSUE FOR FERMI, which turns it into an issue for users, which they will have to figure out how best to deal with.

So nV finds one review that shows the numbers their PR wants to use, and that's supposed to prove it's suddenly become cool and quiet.

This is still quite hot, and hotter than the Radeon HD 5870 in any case, but it proves the point that a good case, or any case, will make things look much better than testing the card on a table testbed.

Yes, and the same goes for the other cards as well (did nV just figure out the concept of case airflow when working with case makers to build the 'Fermi Approved' cases?) :heink:
So the point of the article is that for all other cards you're fine to simply use the case you've been using, but if you want to buy a Fermi you may also have to replace your good case (the Centurion is a great case, very open, with mesh grilles and lots of fans) with something 'Fermi Approved', which will cost more for that certification as well. Great, that makes people's concerns about heat 'dissipate' so much more easily. [:thegreatgrapeape:5]

The card isn't even at retail yet, and already reviewers are being forced to address the 'heat and sound issue' with their hand-picked review cards; that doesn't bode well for people who are going to be buying these online randomly at e-tail and won't get the pick of the litter.
Then add to that even a slight layer of dust on the HSF assembly and then what happens? China Syndrome? :hot:

The FX5800 and HD2900 weren't hot, they were just ahead of their time with no good cases fit for their modern design. :na:
And as mentioned earlier (in my reference to Xbit), it's also the pitch of the sound that matters, not just the decibels, as a low-decibel, high-pitch sound may be more noticeable. The GTX480 is not quiet, but the dull drone may not be as annoying as a 'quieter' higher pitch like that of the FX5800 and current small-fan cards; it's still louder and more noticeable than its direct competition.

Call me skeptical, but wasn't this one of the locations that 'leaked' some early Fermi info (and surprise surprise, they had a pre-launch card)? So I have a feeling there was a lot of hand-holding going on there for a site that has little/no reputation to lose. :pfff:

A Hummer is still fuel-inefficient even if GM came out with a single test showing that in very specific conditions (downhill with a tailwind) it would get 25 MPG.

So in the end nVidia's argument is that the number shock is too detrimental to their PR: reviewers must adjust the scale so it's still X<Y, but at least people don't see numbers like 105C, which might make them pause to think about it more than 97C would, because one is above boiling and that might scare people.
Hmmmm..... [:grahamlv:3] I wonder if they would've cared so much if it didn't brush up against that boiling-point mark, or if it were in Fahrenheit, which would cloud many buyers' perception as much as nV's renaming policies do.
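
Just to put rough numbers on the 'scale' point (a quick sketch using the 93C/97C/105C figures tossed around in this thread, nothing official):

```python
# Quick back-of-the-envelope on the scale argument, using the temps quoted
# in this thread (reported load temps, not official figures).
def c_to_f(celsius):
    """Convert degrees Celsius to degrees Fahrenheit."""
    return celsius * 9 / 5 + 32

for temp_c in (93, 97, 100, 105):
    print(f"{temp_c}C = {c_to_f(temp_c):.1f}F")
# 93C = 199.4F
# 97C = 206.6F
# 100C = 212.0F  (boiling point of water at sea level)
# 105C = 221.0F
```

The 8C gap between 97C and 105C is the same whichever scale you use; only the psychological 'above boiling' line moves.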

Looks like they've been talking to the reviewers to get them to publish something about this (including the lame attempt to fry an egg on the other side of an HSF :sarcastic: ); otherwise, for the first time in a while, reviewers have suddenly come to the conclusion that they were 'too harsh' and needed to revisit the issue. I guess if they didn't revisit the issue they might not find themselves with samples of Fermi2 to test. 😗

They didn't seem to have the same concerns when people were talking about the HD2900; in fact they did quite the opposite, even though there was the same type of 'it's not that hot' testing, and the result was actually better on the R600 than the G80 thanks to the HSF design.

The test nV is pointing to doesn't dispel the hot & loud & power-hungry idea, it just changes the bullet-point temp # that people will toss at each other. Fermi still draws more power (and thus produces more heat), runs hotter and is louder than the HD5970 while performing slower, which is very similar to the HD2900 and FX5800: both came out much later than the competition, were slower than the top card, consumed more power and were louder than their counterparts. That doesn't really change the results, it just moves the scale to the left, but for many that's usually enough to make them think there's been an improvement in the design, not just a change in the testing methods. :pfff:
 
lmao They try to justify Fermi's heat with cheap cases. Since when do reviewers show this? Maybe they should test the Fermi under, let's say, 10 cases in the review? lol Sure, get a better case and any card you install in it will run cooler, but it still produces the same amount of heat / same power draw (43W more than a dual-GPU 5970); that can never be changed. Now Fermi's certified cases make sense :lol:
 



Yes, he has implied that before; it makes up some of his arguments, that and comments like this:
Call me skeptical but wasn't this one of the locations that 'leaked' some early Fermi info (and surprise surprise they had a pre-launch card), so I have a feeling there was alot of hand-holding going on there for a site that has little/no reputation to lose.

I can do the same thing with HOCP and his failed sound testing. His revamped videos are better, though; why are they better? They seem closer, and he seems to be playing with the final edit volume less. I could point out that he has and promotes his own Eyefinity setup, and that it was probably gifted to him. That, and his anti-Nvidia bias with his editorial out of nowhere worrying about Nvidia's multi-monitor support.
These web sites are vying for hits for their own revenue and Nvidia and ATI are both in battle for your dollar as well.
edit: My stance is, whatever you can imagine NV doing, ATI is doing as well.
 


Yep, but not at the chip level (too many things could happen after that to ruin a 'great chip'); at the card level, they've been shown to do that in the past (as has ATi/AMD), so why would it change for their most important launch since the G80?

These parts are not random samples that reviewers were allowed to just pick off the shelf at Best Buy; they are hand-picked and then couriered (usually last minute) to 2-3 dozen people for testing. On top of that, these are select parts from select AIBs, tested prior to delivery and given better product support than a platinum/diamond club member of any board partner would get. If there are issues, ATi and nV get back to the reviewers with suggestions ASAFP to ensure the review is as positive as possible, and in this case they follow up to make sure there are the appropriate caveats their PR forum trolls can point to when people post negatives about their products.


And BTW builder, I'm not justifying either product (nor the FX5900, which IMO would better fit Fermi if not for the launch schedule); they were both nice ideas of how to do things better than was being done, but they didn't result in stellar performance except in rare situations, in directions few games/devs went, and they didn't translate into product successes (although other successful products evolved from them much later); however, all three were underwhelming products.
 


Kyle's been pretty consistent about his quickness to pounce on power and noise. Guess you missed Kyle tearing into the R600's power consumption when it was still just a rumour (before he had samples and before anyone had test results), simply because he knew it would come with 6+8 power connectors. Even in his HD2900XT review he still focused on that and called it hot, despite it being cooler than the competition, especially when ejecting the hot air out the rear of the case. From the R600 era you would say he had an anti-ATi bias, especially with his beyond-the-pale rants about the then months-from-release R600.

My argument was the same then as it has remained for Fermi: if the performance justifies it, consumers will buy it and won't care that they need to get a new PSU or case, if it means enough of an increase in performance/utility. The problem for Fermi is that its performance is below the HD5970, which consumes less power, produces less heat and makes less noise. What scale you decide to place in front of that X>Y is simply a matter of choice. It would be like saying it mattered whether the HD5970 were $649 or $679 when comparing it to a Fermi at $479 or $499; the ratios would remain pretty much the same regardless of the numbers you use. It's just how people deal with the situation they find themselves in, and not everyone gets the ideal situation for price or case/airflow/cooling, etc.
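
A quick back-of-the-envelope on that price example (the dollar figures are just the illustrative ones above, not actual quotes):

```python
# Rough sketch of the "ratios barely move" point, using the example price
# pairs from the post above (illustrative numbers, not official MSRPs).
pairs = [(649, 479), (679, 499)]
for hd5970_price, fermi_price in pairs:
    print(f"${hd5970_price} vs ${fermi_price}: ratio = {hd5970_price / fermi_price:.3f}")
# $649 vs $479: ratio = 1.355
# $679 vs $499: ratio = 1.361
```

Either way the HD5970 costs roughly 35% more, so shifting both price tags a little doesn't change the comparison, just like shifting the temperature scale doesn't.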

And just like prices are different in other countries, not all gamers will have the 19C ambient room temperature for their gaming that the reviewers have for their testing, so even then the numbers don't translate directly.
 
I'm not sure what was wrong with [H]'s videos. They showed the relative heat and noise of two gfx cards, did they not? The variables were the same, just the cards different.

Also, I believe Kyle's setup was bought by himself, for himself. I would not, however, say he is impartial. If he likes a product he lets it be known, and that probably does show in his reviews, in what he finds playable or which features are preferable.

I remember thinking back when the 8800GTX was released that he seemed to have some odd settings in his reviews. Now, many would claim his use of the 5 series plus Eyefinity may cloud his judgement, and that could be true.

ATI appears to be communicating well with him and offering him the products he wants; Nvidia, not so much. If their multi-monitor support is not up to scratch, I'm sure you won't see a good review for Nvidia for a while.
 
From how I've always read things from Kyle, he's biased towards performance, much like a lot of people here (enthusiasts) are. Back when the 8800GTX was king of the hill, there were few people supporting ATI; now that ATI is on top, few people have good things to say about Nvidia. And Kyle has been the same way: for R600 he was an Nvidia fanboy, now he's an ATI fanboy (both more conjecture than fact).

And I don't like Nvidia's Eyefinity copy-cat solution. Not that it's a copy-cat, but the NEED for 2 cards means they haven't done anything outside of software to support it. If they narrow it to one card next gen, I'll be happy, though.
 
I don't want to debate Kyle and HOCP; I know his motto. I have read his responses when asked about his testing methods. He doesn't do synthetics or timedemos: "you don't play a timedemo, do ya?" But wait, you don't play FurMark either, do ya? Ah, don't bring that up, lol. Different testing methods are a good read. It's always easy to second-guess methods, I know. I like reading them all and making up my mind on which method is appropriate.
 


Why the hell did you copy randomizer's avatar?

and quotes?

and then try to show an attitude to everyone?

edit: sorry for the off-topic comment, but that was the last thing that was bothering me, lol, as almost everything about nVidia and ATI has been discussed. 😀
 
While it would be helpful and benefit everyone from gamers to businesses if nV could get its delayed multi-monitor tech out, I personally would not hook 3 monitors up to 1 video card for gaming (let alone 6) anyhow. I like the idea of Eyefinity x6, but I sure as hell would not want to see the immense frame loss that comes with 6x1920x1200. CF for certain.
If I worked in a power plant and fiddled with graphs and porno all day, ok.
:)
 


Eh, one of the biggest reasons I still refuse to do 3x or 6x monitors, other than not wanting to buy two more monitors, is because, let's be honest, you won't get playable frame rates. One 5870 will never give you non-crappy frame rates at a really high resolution like that, a 5970 has a hard enough time, and with 6 monitors you might be lucky to get 5 fps 😛 I'm assuming, of course, your detail level in games is above "bottom-feeder".

Everyone hating on the 400 series for requiring SLI for multi-monitor doesn't make any sense considering you need to X-Fire 5870s to get playable frame rates anyway; nVidia just cut out the middle man. To do multi-monitor it SHOULD be a physical requirement to have a dual-GPU card or two single-GPU cards 😛

That being said, anyone who can afford 3 monitors can surely afford the 2 cards to play with 😛
 


Not necessarily. For example, in my case I own 2 ASUS 23" monitors that I bought for $360. I currently have a 4890, and while I can only game on one monitor, it's really appealing to me to get a new card and one more monitor for $179. I bought my 4890 not too long ago, though, and I still won't replace it yet, so basically I only need $399 for a 5870 and $179 for my monitor; I don't find that to be so "expensive".

Now, if you think you can't play a game with one 5870 and 3 monitors, where have you been?

Look:
http://www.pcper.com/article.php?aid=793&type=expert
Performance is scaling pretty impressively on Batman: Arkham Asylum - even though we are tripling our pixel count, the game is able to stay right at about 30 FPS on the minimum and nearly at 40 FPS on the average. This type of game is very playable at those speeds with just a single HD 5870.

Performance on H.A.W.X was pretty good on our triple 30-in monitor configuration, dropping to about half that of the single monitor configuration. Considering we are tripling pixels, getting half the performance sounds pretty good. Just keep in mind when looking at the screenshots below that most of the time the side displays don't have nearly as much action on them than the center panel where the gamers' focus is kept.

The only game that isn't playable might be Far Cry 2:
Far Cry 2 was easily the most demanding game in our suite and really brought the system to its knees using the same IQ settings with three monitors as we used with one. At 15 FPS it definitely isn't playable; to find reasonable performance we dropped the IQ settings from "Very High" to "Medium" but left AA enabled. The results were obviously a little bit LESS impressive in terms of looks but the Eyefinity effect was still very, very cool. This is one of those titles that could really benefit from multi-GPU rendering!

But mind you, he tested 3x30" monitors for a massive resolution, each running at 2560x1600, for a total resolution of 7680x1600.
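
Just to put numbers on how much extra work that setup asks of one card (a quick sketch of the pixel counts he describes):

```python
# Pixel-count math behind the "tripling pixels" comments in the quoted
# PC Perspective test: three 30" panels at 2560x1600 each, side by side.
single_panel = 2560 * 1600          # 4,096,000 pixels on one 30" panel
triple_panel = (3 * 2560) * 1600    # 12,288,000 pixels at 7680x1600
print(f"{single_panel:,} -> {triple_panel:,} pixels "
      f"({triple_panel / single_panel:.0f}x the work per frame)")
# 4,096,000 -> 12,288,000 pixels (3x the work per frame)
```

So "half the performance for triple the pixels" in H.A.W.X is actually better-than-linear scaling, which is why the single 5870 holds up in some of those tests.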
 
Very optimistic, but you can imagine what the minimum frame rates are going to be. It might give you a competitive viewing advantage, but some games are going to beat 1 card with 3 monitors down quite badly. It's a great advancement and would really rock with older, less GPU-intensive games, but it will need to be xfired quickly.
Where I could see this tech thriving is in the business sector, or with games a year or two old, and definitely strategy or top-downers. Anything else just needs more 'umph' imo. 😛
 
RealityRush, the biggest problem with your argument is that oftentimes a single HD 5870 CAN run things at these very high resolutions and run them well (just look at snakej's post), and having a relatively artificial limit on my 3-way display setup is never a good thing. I, personally, like to go back to a game I haven't played for a couple of years after getting a new card and really max it out, and I'm sure a lot of people do that too. A 3-monitor display setup would really be an awesome way to re-experience an old favourite (this is more in the future tense than right now; I know most games from 2+ years ago will have issues with Eyefinity).

Like I said, needing SLI for this first generation or two is ok, but in 2-3 years time, I sure hope they have it down to a single card.
 
Like I said, needing SLI for this first generation or two is ok, but in 2-3 years time, I sure hope they have it down to a single card.

Wish that was viable, but if this Moore's law thingy keeps going the way it has been since the mid-90s with graphics vs. games (and computers in general), we're always going to have to CF or SLI with multi-monitors if we want to play the next Metro or Crysis. Maybe there's a chance multi-monitor will become so mainstream that nV and ATI push the performance to meet the requirement it comes with. :ouch:
 
ATI gets the 3rd monitor via DisplayPort, and there's lots of confusion about getting that going: which dongle works with which monitor, and then troubleshooting gets tricky because of dongle choice, etc. I think an official ATI-endorsed active DisplayPort dongle for, say, $45 that would work with all monitors/resolutions should be released, to help the end user.
Nvidia is avoiding that right now. Maybe next generation Nvidia will implement a hardware solution for the 3rd monitor. This generation they just needed to get Fermi out the door without any more complications, it seems.