Nvidia: GeForce GTX 480 Was Designed to Run Hot

Well, I was going to post here the comment that I posted on their blog, but they removed it. It basically said: if your GPU is designed to run hot, then why is the AMD 5870 running only 5-10% slower than yours at half the power? Oh wait, I forgot about PhysX... apparently so did the rest of the gaming community. With Havok Cloth and AMD pushing the open-source "Bullet" physics engine, you have no excuse to release a card with these specs. Dump the GPGPU crap and give us more FPS at lower temps. In short: FPS or GTFO!
 
If ATI designs a GPU as complex as Fermi, then that too will run hotter than their current offerings. ATI's current GPUs are primarily designed with games in mind - GPGPU is an afterthought. Fermi was designed primarily with GPGPU in mind, and to be able to achieve a 'high level' of rendering/pixel pushing for gamers. Based on the initial reviews it has achieved that.

The bottom line is that the GF100 contains several additional features which would blow the current 5xxx series out of the water if they were used optimally *in games*. If you read any tech discussion you will see an inkling of what these features can provide: higher-precision math, multiple kernels running simultaneously, and so on. These features are not just of relevance to GPGPU, but the true potential of Fermi for gaming relies on developers exploiting this tech. If they don't then, sure, you may as well stick with a 'gaming' card which will give you better bang for buck (and consume less power).
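For readers unfamiliar with the "multiple kernels" point: Fermi-class parts can overlap independent kernels issued on separate CUDA streams, where earlier GPUs serialized them. Here is a minimal, hypothetical sketch; the kernel names and workloads are made up for illustration, not taken from any shipping game or from the article.

[code]
// Hypothetical example: two independent kernels launched on separate CUDA
// streams. On GF100-class (Fermi) hardware they may execute concurrently;
// on earlier architectures the second simply waits for the first.
#include <cuda_runtime.h>

__global__ void clothStep(float *pos, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) pos[i] += 0.016f;      // stand-in for a real cloth update
}

__global__ void particleStep(float *vel, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) vel[i] *= 0.99f;       // stand-in for a particle update
}

int main() {
    const int n = 1 << 20;
    float *pos, *vel;
    cudaMalloc(&pos, n * sizeof(float));
    cudaMalloc(&vel, n * sizeof(float));

    cudaStream_t s1, s2;
    cudaStreamCreate(&s1);
    cudaStreamCreate(&s2);

    // Independent work on independent streams -> eligible to run concurrently.
    clothStep<<<n / 256, 256, 0, s1>>>(pos, n);
    particleStep<<<n / 256, 256, 0, s2>>>(vel, n);

    cudaDeviceSynchronize();
    cudaStreamDestroy(s1);
    cudaStreamDestroy(s2);
    cudaFree(pos);
    cudaFree(vel);
    return 0;
}
[/code]

Whether game engines actually structure their GPU work this way is exactly the "if developers exploit it" caveat above.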
 
It's just... way too much power draw. I can get pretty much the same performance with less power draw and less heat output at a cheaper price.

Factor in the savings from running the card over a two-year upgrade cycle and the gap widens further (rough numbers below).

No, the 480/470 cards run too hot, use too much power, and are too expensive, even if they do have buckets full of performance.

Nvidia are fighting fires with this one; the pity is that their "HOT HOT" Fermi chip started them all!
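To put rough numbers on the running-cost point (my own back-of-the-envelope assumptions, not measured figures: a ~130 W load-power gap, 4 hours of gaming a day, $0.12 per kWh; plug in your own usage and rates):

[code]
// Back-of-the-envelope cost of an extra ~130 W over a two-year upgrade cycle.
// Every input below is an assumption, not a measured or quoted figure.
#include <stdio.h>

int main() {
    double extra_watts   = 130.0;     // assumed load-power gap between cards
    double hours_per_day = 4.0;       // assumed daily gaming time
    double days          = 365.0 * 2; // two-year upgrade cycle
    double price_per_kwh = 0.12;      // assumed electricity rate, USD

    double extra_kwh  = extra_watts * hours_per_day * days / 1000.0;
    double extra_cost = extra_kwh * price_per_kwh;

    printf("Extra energy: %.0f kWh, extra cost: $%.0f over two years\n",
           extra_kwh, extra_cost);    // ~380 kWh and ~$46 with these inputs
    return 0;
}
[/code]

Whether roughly $46 over two years matters is up to you, but it is a real line item on top of the purchase price, and it grows if the air conditioning has to work harder too.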
 

Same here; the GPUs on my 8800s can sit in the high 80s for extended periods and have done for the last three years with no ill effect on them or the rest of the machine.


It's odd that a "buggy" driver would give better performance, especially since ATi have been lauded as getting better with their drivers. And you can guarantee that if the tables were reversed, Nvidia would not be given the benefit of the doubt; instead they would be getting crucified for it.
 
[citation][nom]skevil[/nom]"The GTX 480 is the performance leader..."Wait... what?[/citation]

The GTX 480 is the performance HEATER.
 
I am not in favour of censorship in any way, so I must ask why the caption contest has been pulled. If Tom's is not independent then it becomes worthless to me; I need unbiased and informed information, not paid-for reviews or censorship.
 
The heat is a byproduct of clock speed. Had they clocked it at a speed that would max out at about 200 W, the 5870 would have stomped it into the ground in performance. Now ATI can release an uber-clocked 5890 that consumes as much power as the 480 and take the performance crown back.
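For what it's worth, the textbook first-order CMOS relation (a general approximation, not a figure from Nvidia or from this article) points the same way: dynamic power scales roughly as P ≈ α · C · V² · f, so a clock reduction, especially one that also allows a lower voltage, cuts power considerably faster than it cuts performance.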
 
[citation][nom]ColliSions[/nom]You guys here whine so much about the heat, my xfx 8800gt has been ~100c under load for the past 2 years and its still running fine. Sometimes it does reach the shutdown threshold (110c) while playing Crysis for 2 hours but so far it hasn't failed..[/citation]

THAT... is already a fail. Only two hours and it shuts down?

Btw, as for such statements, they are useless. Anything that runs HOT in a computer is NEVER a good thing. This Fermi card is a step backwards in terms of technology; any company can pump out a fast card (I bet ATI can do that as well), but they have to look out for other factors as well. I guess nVidia had no other choice if it wanted to beat the ATI 5xxx, and this is its only chance to get some $$$ from those who actually go and buy this mother of all heaters.
 
As an Nvidia user, I find this card not only a letdown but a failure, just not a total one. The architecture is showing its age in the old shader design, though it has been more or less overhauled since the G80 days, so there are still a few years left in it before they have to start from scratch like they did with the GF6 and later the GF8. The only way they could improve this is with a die shrink. It should also serve as a warning to ATI if they want to supersize the R870 on the 40nm process and go for the original design that had 2400 shaders and a sideport. 28nm looks like the only exit at this point for both of them if future generations are to cost less. At least my 8800 GTX barely goes above 70C on a hot and humid southern day.
 
[citation][nom]gimmeausername[/nom]Earlier today, there was a caption contest in the news. It was removed at around 5 P.M. I guess nVidia fired a lot on you guys today so I understand.On the note that relates to this news: Operating temps and longevity of the card are not the only concerns. When the card runs hot, it heats up the entire computer, that will affect other components. Also, in the summer, gamer will have to withstand the heat from the card in addition to the summer heat. Provided that many will be in a room with AC or central air, these room cooling devices will have to consume more power; ultimately, that will add more money spent just to run a GTX 400 series. Conclusion: it's never good when the graphic card runs hot, no matter what nVidia/ATI have to say.[/citation]

We should get rid of TVs, lamps, and other stuff too. Since electricity is so expensive, we have to start worrying about an extra 30-100 watts on an entertainment device.
 
That will also limit your OCing, seeing as a 5850 can easily OC by 38%, and at those speeds it will beat a GTX 480 while still consuming less power, being more efficient, and costing half the price.

Good luck running two of these in SLI; they will pull around 1 kW from your wall. Just for your info, a bulb is <100 W (and I'm not talking about efficient fluorescent bulbs here that run at 23 W/13 W), and a 46-inch TV is 300 W at maximum brightness, more like 200 W if not at 100% brightness. And a GPU runs for more hours than a TV: whenever a 3D app or Folding@Home is running, the GPU is at its max clocks. That's unless people leave their TV on and go on vacation :lol:
 
Everyone that is complaining about NVIDIA's new card, is probably jealous because their card isnt the top dog anymore. Stop being a little bitch about it and accept it.
 
[citation][nom]user500[/nom]Everyone that is complaining about NVIDIA's new card, is probably jealous because their card isnt the top dog anymore. Stop being a little bitch about it and accept it.[/citation]


Are probably jealous, not is. And unless we are talking surface temperature, the 5970 eats the 480 for breakfast, lunch, and dinner. Do you accept this?
 
This seems like Microsoft's idea with the Xbox 360; now we will see those 480s frying and dying, or maybe RRoD...... 😛
 
My card is an ATI 5850 and I love it. I opted for value, efficiency, cooler temps, and a quieter rig. But that was just my choice, not the only choice. The GTX 480 is a powerhouse of a chip that is simply not fully utilized by the current generation of gaming software. It's like having a rocket-powered car and only using it to go to the corner market and back: it will run way hot and be a lot less efficient than your Ford Focus. It needs to be given a task worthy of its design before its true value can be appreciated.

But then again, I have no idea what a good use of a rocket powered car would be, or who would want one.
 