Nvidia: GeForce GTX 480 Was Designed to Run Hot

One more reason I'm an inch away from trying an ATI card soon...

I have used Nvidia cards for over 10 years. It may be time to jump ship.
 
[citation][nom]dreamphantom_1977[/nom]Ummm.... [/citation]

You DO know that the 480 and 470 need TWO (as in, not one) video cards to do more than 2 monitors, right? I think YOU need to check your facts. Hurp Durp.
 
Have a little faith. Designing a huge chip architecture like this isn't something that's thrown together at the last minute from random pet projects. If they say it can handle the extra heat, you can bet your ass they've run thousands of simulations and thousands of hours' worth of "extreme" conditions to make sure it stays stable.

The point is start making fun of them *after* their card explodes, not before.

And to some idiot before me, more heat doesn't automatically mean wasted power. Your car essentially takes a shitload of bonded chemical energy and turns it into heated gas, and you get a lot of work out of that. In fact, by the second law of thermodynamics, the less heat you waste, the more efficient you are... but you'll always get out less work than the energy you put in.
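To put a number on that: heat-engine efficiency is just work out over energy in. Here's a rough sketch with the usual textbook symbols (nothing here is GPU-specific):

[code]
% Heat-engine efficiency, standard textbook form:
%   Q_in  = chemical energy released by the fuel
%   Q_out = waste heat rejected to the surroundings
%   W     = useful work extracted
\eta = \frac{W}{Q_\mathrm{in}}
     = \frac{Q_\mathrm{in} - Q_\mathrm{out}}{Q_\mathrm{in}}
     = 1 - \frac{Q_\mathrm{out}}{Q_\mathrm{in}}
[/code]

The second law forces Q_out > 0, so efficiency is always below 100%: less waste heat means a more efficient engine, but never a free lunch.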
 
If it's meant to run hot, then remove the fan, Nvidia :)
Or maybe Nvidia wants everything to be high: high benchmark scores, high fps, high temps, high power consumption. You're on a roll.

Seriously, it's the FX era all over again. Good for ATI. Maybe in a few months ATI will release an update to their lineup?

I'm not a fanboy; I'm using a GTX 260. I just buy whichever brand I think is a good value.
 
[citation][nom]VioMeTriX[/nom]ok so they're hot. Tech rule #1 is to never buy a first-gen product. Let it get out there, do its thing, wait for the 2nd gen and revisions and improvements. That's what created the GTX 285; the GTX 280 sucked and had problems, the GTX 285 rocks.[/citation]

Buddy, I have two 280s in SLI and I haven't had any problems. They're also overclocked :) Did I mention I've had them for at least 1 1/2 years? And yes, they've been overclocked the whole time, 24/7.
 
Everyone keeps complaining that Nvidia messed up, that it's too hot, and all that jazz. The 480 is a powerful card for a single-chip card; from the reviews I've read, ATI can't compete with Nvidia when it comes to single-chip cards.
 
[citation][nom]Crashman[/nom]Sure you jest, 5870's are designed to run around 70-80C, just run Furmark and find out. It's actually in the fan map. If I try hard enough I can top one out at around 90C.[/citation]

I like how the ATI fanboys just skimmed right over this factoid.

And people, don't bother with softOCP reviews for Nvidia vs. ATI; they're ticked at Nvidia and will do anything they can to slam them.
 
First off, the 480 isn't as hot as it was forecast to be.

But this comment is inexcusable. There is no future for a chip this hot! At least Nvidia could say they're working on a 28nm version... oh, but wait, that would discourage sales, huh?
 
Heat is the enemy of silicon, so if the GF100 is designed for heat, is the GF100 using a new generation of silicon? Gamers seem to be interested only in performance/heat, which is reasonable, but the GF100 was designed to do more than just play games. Comparing the GF100 to ATI's offerings is like comparing apples to oranges.
 
I don't want a card for gaming; I want a card for video conversion and stuff like that. Sure, someday CUDA won't make a difference, but until then I'll want CUDA. I don't think the GTX 480 is worth jumping on, since maybe they'll make cooler ones later. My GTX 285 will do until then...
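For anyone wondering what that workload looks like in practice, here's a minimal sketch of handing a conversion off to the GPU. It assumes an ffmpeg build with Nvidia's h264_nvenc hardware encoder compiled in, and the file names are made up:

[code]
# Minimal sketch: offload video transcoding to the GPU by shelling out to
# ffmpeg. Assumes an ffmpeg build with Nvidia's h264_nvenc encoder enabled;
# the input/output paths are placeholders.
import subprocess

def convert_on_gpu(src: str, dst: str) -> None:
    """Transcode src to H.264 on the GPU encoder, copying audio untouched."""
    subprocess.run(
        ["ffmpeg", "-i", src, "-c:v", "h264_nvenc", "-c:a", "copy", dst],
        check=True,  # raise CalledProcessError if ffmpeg fails
    )

convert_on_gpu("raw_capture.avi", "converted.mp4")
[/code]

The point being: the encode runs on the card, not the CPU, which is the whole reason I care about the GPU vendor at all.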
 
Great, so I can add more heat to my case. Lovely. I therefore will probably need to back off the voltages and subsequent clock speeds of the surrounding hardware.
...Or, I could stick with my HD5870 and be happy with roughly equal performance and far less heat + power consumption for $100 less.
Tough decision there.
 
[citation][nom]Crashman[/nom]Sure you jest, 5870's are designed to run around 70-80C, just run Furmark and find out. It's actually in the fan map. If I try hard enough I can top one out at around 90C.[/citation]

I'm not much of a jester.

[attached screenshot: temperature readout]


That's 39C idle. During typical gameplay it sits at 50C. That's at 3150x1680. That's stock HIS cooling with the fan at 40%, which is still very quiet.
 
I just built a machine for a customer with a single HD5870. I set the manual fan control at 33% and maxed out both clocks (900/1300). I ran FurMark at 1920x1080 with 8xAA in stability mode for 30 minutes. It never broke 76 deg. C. With the stock (reference) cooler, no less.
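If anyone wants to repeat that kind of check, here's roughly the logging loop I'd use; a sketch only, where read_gpu_temp() is a hypothetical stand-in you'd wire to whatever your vendor tool actually reports (GPU-Z log, aticonfig, etc.):

[code]
# Rough sketch: poll the GPU temperature once a second during a stress run
# and track the peak. read_gpu_temp() is a placeholder -- hook it up to
# whatever temperature query your vendor utility actually provides.
import time

def read_gpu_temp() -> float:
    raise NotImplementedError("wire this to your vendor's temperature query")

def log_burn(minutes: int = 30) -> float:
    """Log temps for the given duration and return the peak seen."""
    peak = 0.0
    deadline = time.time() + minutes * 60
    while time.time() < deadline:
        temp = read_gpu_temp()
        peak = max(peak, temp)
        print(f"{time.strftime('%H:%M:%S')}  {temp:.0f}C  (peak {peak:.0f}C)")
        time.sleep(1)
    return peak

if __name__ == "__main__":
    print(f"Peak over the run: {log_burn(30):.0f}C")
[/code]

Run FurMark alongside it and you've got a timestamped record instead of a number from memory.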
 
If you think it's too hot, just buy the fish tank, drop your mobo into it with some oil, and call it a day.

Now if it gets hot enough that you can cook fries in it (400F), then it runs too hot.

 
Here's an insider's sneak peek at Nvidia's GTX500 series:
http://i161.photobucket.com/albums/t229/arloomis76/pc-ez-bake.jpg

I'll bet it'll make Crysis 2 look almost as good as a short stack of buttermilk pancakes.
 
Prescott and Rambus were good for Intel, right??? So Nvidia makes a Prescott of a GPU, with Rambus-style proprietary physics. We'll see how this works out for them.

"Those who cannot learn from history are doomed to repeat it." - George Santayana
 
http://i889.photobucket.com/albums/ac93/litlefox/002.jpg

Does that answer it for you? Nothing special at all. I don't buy that 5870s idle at 55C, and I don't buy that their normal operating temps are in the 70-80 range. Perhaps under stress testing they'll reach temps like that, but until someone can show me a 480 that sits at 50C under normal conditions, I stand by my statements.
 
To be safe, I just spent my money on a watercooled 5970. Good luck, Nvidia.
 