Nvidia: GeForce GTX 480 Was Designed to Run Hot

Status
Not open for further replies.
I went to the blog, took the time to read the post, and left the following comment:



I have had many Nvidia cards over the years in different rigs: started out with a GeForce 2 MX400, had an FX5200 and a 6200 on a spare computer, a 7600GS on my main one, and now a factory overclocked 8800GT. I also have several ATI cards. And from all models and both brands only one X700 card from ATI has failed on me. Apart from that, I've had no complaints. So, I'm not a fanboy of either side, I just look at the market and try to get the best value for money.

But, truth be told, I've grown very skeptical of Nvidia's behaviour towards gamers. Nvidia clearly stated a few months ago that it is shifting its focus somewhat away from games, which is not a good sign. Worse than that, its PR department is now saying that "We want tessellation perf to rock since we deeply believe that PC games should have the same geometric complexity as movies".

Yes, and now that you finally want that, the industry will finally get it. But ATI has had tessellation support since 2001, so where was Nvidia all this time? And we had to wait for Nvidia.

Where was Nvidia when DX10.1 came out? Where was Nvidia when Ubisoft removed DX10.1 support from Assassin's Creed? And again we had to wait for Nvidia, because without Nvidia the gaming studios don't move forward.

And where are Nvidia's ethics when you rename your 8800GTs to 9800GTs (while some 9800GTs were still 65nm parts and others 55nm), and then to GTS250s? Or when you do the same to the 9800GTX+, which became the GTS250, thereby confusing your customers?

And now you say a performance card is designed to run hot? And to be power inefficient? The card idles at twice the power draw of the ATI models, despite dropping its core clock threefold. Is that by design too? Is the fact that the chip has 512 cores, but you can't get parts with that many cores enabled onto the street in volume because you didn't adapt properly to the manufacturing process, by design too? Please. You just missed a perfect opportunity to stay quiet and work on the next generation. Because calling the GTX 480 and GTX 470 "next generation" is six months late. We are already in the next generation, making it the "current generation", and, unfortunately, Nvidia doesn't yet have a single customer with one of its "current generation" cards.
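For what it's worth, that idle-power complaint can be framed with the usual first-order CMOS dynamic-power model, P ≈ C·V²·f: a threefold clock drop should cut dynamic power roughly threefold on its own, so still idling at twice a competitor's draw points at voltage and leakage not scaling down with the clock. A minimal sketch, with purely illustrative numbers (nothing here is a measured Fermi figure):

```python
# First-order CMOS dynamic power model: P_dyn ~ C_eff * V^2 * f.
# All numbers below are illustrative assumptions, not measured values.

def dynamic_power(c_eff, voltage, freq_hz):
    """Switching power of a chip modeled as one effective capacitance."""
    return c_eff * voltage ** 2 * freq_hz

# Hypothetical figures: same voltage at idle, clock dropped threefold.
load = dynamic_power(c_eff=1.0e-9, voltage=1.0, freq_hz=700e6)
idle = dynamic_power(c_eff=1.0e-9, voltage=1.0, freq_hz=700e6 / 3)

# A threefold clock drop alone gives roughly a threefold dynamic-power drop;
# anything worse than that in practice is static leakage and unscaled voltage.
print(round(load / idle, 1))
```

If idle draw stays high despite the clock drop, the remainder is mostly leakage and the fact that core voltage was not lowered proportionally.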

You have a lot of homework to do, Nvidia. I wish you well, not out of any fanboyism for either side, but mainly because you have provided me with countless hours of both work and fun, which I appreciate, and also because competition is good for everyone. That said, I surely won't do you any favors, and I don't expect anybody else to do you any favors when you screw up, especially when you ridiculously pretend everything's fine. It's a lesson I hope you learn for the future.

Take care.
 
This is why I switched to ATI. I learned my lesson on the 8800 GT. If Nvidia isn't concerned about the rest of my components, I'm not concerned about using theirs in my system.
 
Nvidia did this on purpose. Maybe they want to make these graphics cards for Canadians and Russians; it's a very green solution for people who live in the far, far north and south. You'll love having one in your room.
 
Just look at the review done here on Tom's.

According to that review, the temperature difference between, say, the 480 and the 5870 (let's take those two, since they are each side's fastest single-GPU card)...

Link to Tom's review
http://www.tomshardware.com/gallery/Temperature,0101-242418-7746-0-0-0-jpg-.html

From what Tom's tested, the difference in temperature between the 480 and the 5870 is a small 7 °C. If you ask me, that's not that big a difference. Yes, it runs hotter, I know that, but give it a break: 7 °C, come on, people 😛
 
Earlier today, there was a caption contest in the news. It was removed at around 5 P.M. I guess nVidia fired a lot of flak at you guys today, so I understand.

On the note that relates to this news: operating temps and the longevity of the card are not the only concerns. When the card runs hot, it heats up the entire computer, which affects other components. Also, in the summer, gamers will have to withstand the heat from the card on top of the summer heat. Given that many will be in a room with AC or central air, those cooling devices will have to consume more power; ultimately, that adds to the money spent just to run a GTX 400 series card. Conclusion: it's never good when the graphics card runs hot, no matter what nVidia/ATI have to say.
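The AC point above can be put on a back-of-the-envelope footing: every watt the card dumps into the room must also be pumped out by the air conditioner, which removes a few watts of heat per watt of electricity (its coefficient of performance). A sketch where every figure is an assumption for illustration, none comes from the thread:

```python
# Back-of-the-envelope cost of an extra 100 W of card heat in an
# air-conditioned room. Every figure here is an assumed example value.

extra_card_watts = 100.0   # assumed extra heat vs a cooler card
ac_cop = 3.0               # assumed AC coefficient of performance
hours_per_day = 4.0        # assumed gaming time per day
price_per_kwh = 0.12       # assumed electricity price in USD

# The AC spends 1 W of electricity to remove ac_cop watts of heat,
# so the room pays for the card's heat twice: once to make it, once to move it.
ac_extra_watts = extra_card_watts / ac_cop
total_extra_watts = extra_card_watts + ac_extra_watts

monthly_kwh = total_extra_watts * hours_per_day * 30 / 1000.0
print(round(monthly_kwh * price_per_kwh, 2))  # extra dollars per month
```

Small per month, but it compounds across the card's lifetime, and the numbers scale linearly with the wattage gap.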
 
[citation][nom]gimmeausername[/nom]Earlier today, there was a caption contest in the news. It was removed at around 5 P.M. I guess nVidia fired a lot on you guys today so I understand.On the note that relates to this news: Operating temps and longevity of the card are not the only concerns. When the card runs hot, it heats up the entire computer, that will affect other components. Also, in the summer, gamer will have to withstand the heat from the card in addition to the summer heat. Provided that many will be in a room with AC or central air, these room cooling devices will have to consume more power; ultimately, that will add more money spent just to run a GTX 400 series. Conclusion: it's never good when the graphic card runs hot, no matter what nVidia/ATI have to say.[/citation]


I still have the link to the article. By the time I arrived, the link was still posted on the main page but led nowhere; at least the URL hints at the original title:

http://www.tomshardware.com/news/Fermi-GF100-GTX-480-GTX-470-grill,10059.html

It had something to do with readers suggesting what Nvidia's CEO was thinking about the GTX 480's "grill" feature.

Tom's Hardware, care to explain what happened with the page being taken down? Was it up too soon? Is it scheduled for tomorrow, is that it?
 
People are asking why one chip might run hotter than the other company's?
They are different designs. One has 33% more transistors for other performance and operational features. That's why the GTX 480 is at least 10% faster than the 5870 at a lower clock speed, to give just one example.

Here's the next chapter in this story of the fastest GPU on Earth.
The GTX 480: ATI caught cheating in Crysis to outperform the GTX 480
http://www.tomshardware.com/forum/285352-33-caught-cheating-crysis-outperform-gtx480

 
"The chip is designed to run at high temperature so there is no effect on quality or longevity."

That's the most paradoxical statement I've ever heard.
 
I'm not a gamer, but I run a repair shop, and I've been telling people that dust and HEAT are bad for computers. Have I missed something? Personally I use ATI, but I sell Nvidia. 7 °C is a lot of heat in a case.
 
std::system("compute:'The chip is designed to run at high temperature so there is no effect on quality or longevity.'");
std::system("compute:'The GTX 480 is the performance leader with the GTX 470 being a great combination of performance and price.'");

FATAL ERROR 0xBADC0DE: Does not compute.
kernel_panic();

Silly Nvidia! Blog post fail. 🙁
 
This is a hard pill to swallow, for sure. As PC gamers we have essentially brainwashed ourselves into the mindset that hot = bad. This is why everything in our computers has fans: CPUs, GPUs, cases; even RAM has some form of heat dissipation. And for you, Nvidia, to come and tell us that this time "it's ok" and it won't hurt our cards... it's a bit confusing...
 
Basic semiconductor engineering 101: the hotter a chip runs, the shorter its lifespan. The cards are fine for the three years of consumer operation they are designed for, but they will be dropping like flies shortly after.
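That rule of thumb is usually quantified with the Arrhenius acceleration model for temperature-driven wear-out. A minimal sketch, with an assumed activation energy (real values depend on the specific failure mechanism):

```python
import math

# Arrhenius acceleration factor for temperature-driven wear-out:
#   AF = exp((Ea / k) * (1/T_use - 1/T_stress)), temperatures in kelvin.
# The activation energy Ea below is an assumed example value; real
# numbers vary by failure mode (electromigration, oxide breakdown, ...).

BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

def acceleration_factor(t_use_c, t_stress_c, ea_ev=0.7):
    """How many times faster the chip ages at t_stress_c than at t_use_c."""
    t_use = t_use_c + 273.15
    t_stress = t_stress_c + 273.15
    return math.exp((ea_ev / BOLTZMANN_EV) * (1 / t_use - 1 / t_stress))

# With these assumptions, a chip held at 90 C ages several times
# faster than the same chip held at 70 C.
print(round(acceleration_factor(70, 90), 1))
```

The exact multiplier depends heavily on the activation energy chosen, but the exponential shape is why a "designed to run hot" card makes people nervous about year four.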
 
lol, next thing you know GTX 480s will start getting Red Rings of Death
 
OK, so they're hot. Tech rule #1 is to never buy a first-gen product. Let it get out there, do its thing, and wait for the second gen with its revisions and improvements.

That's what created the GTX 285: the GTX 280 sucked and had problems, while the GTX 285 rocks.
 


Wow, one German site that translates poorly, and we aren't even sure if it's cheating or just a bug in the drivers. Better to wait for other sites to pick this up.

Personally, I don't like that the GTX 480 gets within 10 °C of its 105 °C thermal limit at times in games, but it is technically within spec, and we might have to start getting used to this as transistor counts go up.

@VeoMeTrix, some people have to buy the first gen (enthusiasts) so that a second gen gets made. I have used first-gen hardware before (8800GTS 640MB) and it's not that bad. Though, as I've said here before, I'm skipping this gen, since I have 2 x 4870 1GB, which will be plenty for at least a year.
 


I would say the two are comparable to each other, given how close they are in power draw.
 
Well, well. I am not an Nvidia fanboy, but my only gaming GPU so far has been a 9800GT.
A bad board maker, I'm afraid (INNO3D): it runs hot and doesn't have fan speed control...
I am developing something in CUDA for my BSc final project, so I have used CUDA...
It really seems that ATI is the game today, but I would support Nvidia if a cheaper card came out.
Remember: support AMD CPUs and Nvidia GPUs.
They might not be great at the moment, but their removal from the market could be disastrous.
 
I don't know about longevity, but it does seem like they intentionally run on the hot side. Overclockers Club was able to keep the cards at a reasonable temperature, even when overclocked, by manually controlling the fan speed. If you watch the videos on HardOCP, the fans don't seem to bother speeding up until around 90 °C when left on auto.
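That "auto" behaviour is essentially a fan curve that stays nearly flat until a high trigger temperature. A hypothetical sketch of the difference between such a curve and an aggressive manual one (none of these numbers come from Nvidia's actual firmware):

```python
# Two hypothetical fan curves mapping GPU temperature (C) to fan duty (%).
# Neither matches Nvidia's real firmware; they only illustrate why a lazy
# "auto" profile lets a card sit near 90 C before the fan ramps up.

def stock_auto_fan(temp_c):
    """Lazy curve: barely moves until the chip is already very hot."""
    if temp_c < 90:
        return 40                                   # quiet across most of the range
    return min(100, 40 + (temp_c - 90) * 12)        # steep ramp only past 90 C

def manual_fan(temp_c):
    """Aggressive linear curve, like setting the fan speed by hand."""
    return min(100, max(30, 30 + (temp_c - 40) * 1.5))

# Compare the two policies at a few temperatures.
for t in (60, 85, 95):
    print(t, stock_auto_fan(t), manual_fan(t))
```

The trade-off is noise versus temperature: the lazy curve is quiet but parks the GPU near its limit, while the manual curve trades constant fan noise for a much larger thermal margin.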
 
Excuse me, but "it was meant to be a hot card by design" describes what is, today, a bad and inefficient design choice, and it is inexcusable as a public statement. It is also misleading for the consumer. The only reason Nvidia, or any other big chip designer, would choose this path is that they had no other option, and you can guess that is not really a choice. I don't want to see any more monolithic Nvidia chips in the future. I hope this will be the last generation of them; otherwise, they will need to deliver very high performance next time.

By the way, I am very happy with my Quadro 580, and this card is all I needed and wanted. The best in its class!! I recently paid almost the same for it as I paid for a very bad and hot ATI 2600XT consumer card in 2008. This part only consumes 40 watts at top performance inside my small Shuttle PC. I know it's an entry-level professional part that can't be compared with the Fermi cards, but let me tell you that my next consumer card won't be from Nvidia.
 
It doesn't matter if the chip itself can handle the heat when you have a massive radiator in your case damaging every other component.

Exposed metal, without a fan, reaching over 70 degrees during gameplay? That's close to being able to melt my side window.
 