Nvidia: GeForce GTX 480 Was Designed to Run Hot

But as customers we got a 20% boost for free, LOL. Why did some games remove DX10.1 support with a patch? Assassin's Creed was one of them.
Perhaps because the patch was buggy. After all, it made no difference to Nvidia, since DX10.1 is a hardware thing, not software, and one game is nowhere near enough reason to scrap an entire production line that was ready and waiting to ship long before ATI had the 2900 XTs ready.
 
[citation][nom]kennyforgames[/nom]Nvidia did this on purpose. Maybe they wanted to make these graphics cards for Canadians and Russians, which is a very green solution for people living in the far, far north and south. You would love to have one in your room.[/citation]
You're one of those people who think Canada is freezing, huh?
It's about 30 degrees Celsius right now in Ottawa.

Anyway, the GTX series will fail while ATI gets better sales.
 
[citation][nom]el3ctr0migrashun[/nom]250 or 300 W is a whole lot of power for a tiny piece of silicon. In fact, it's enough to cause electromigration in those tiny little 40 nm transistors, which will cause your card to eventually start giving hardware errors before it craps out altogether.[/citation]
I currently have a 300 W PSU. Yep, this thing will fail.

Signed, an ex-Nvidia customer
 
[citation][nom]elie3000[/nom]If Nvidia had kept the 512 SPs, the card would pull even more watts and produce more heat. I think some environmental laws should start being introduced, seeing as there is already a CO2 tax in Europe.[/citation]

You do know that even the scientists at the UN now say there hasn't been any significant warming since '95, right? All of their data from before recorded history is based on tree rings, which are at best a guess. Man-made global warming has been proven to be a hoax. The countries that implemented a "green economy" (e.g. Spain) have experienced high unemployment and, in some cases, rolling blackouts. A Spanish professor at one of their universities calculated that for every green job they create, they lose 2.2 jobs elsewhere in the economy.

On top of all of that, it's mathematically impossible to lower the temperature even 1 °F in the next 50 years unless you kill every animal and person on the planet and stop all machinery. Why? Let's assume you believe that CO2 is the cause of global warming (it was the assumption the IPCC scientists made in 1982 with no real proof, so let's just go with it for this case)... the UN says we as a planet create 30 billion tons of CO2 a year. Their scientists say that in order to lower the temperature 1 °F, we would have to cut emissions by 1 trillion tons. So we would all have to die for 33 years to lower the temperature of the planet by 1 °F, if we were causing global warming.

Sorry to say it, but the temperature has gone down on its own while our emissions have only increased. So man-made global warming is a crock of s#!t pushed by junk science paid for by politicians: power- and money-hungry bastards who just want to dictate how much of a gas you can produce before you have to pay a fine... sounds communist to me.
 
[citation][nom]shin0bi272[/nom]You do know that even the scientists at the UN now say there hasn't been any significant warming since '95, right? [...] sounds communist to me.[/citation]
Off topic
 
[citation][nom]a4mula[/nom]Maybe this card is meant to run hot, maybe it isn't. Whatever the case might be, I do know this: every other component of the PC isn't meant to run hot. When you stick this next to your brand-spanking-new 930, how do you think it's going to respond? Let's see: a 50 °C 5870 vs. a 100 °C 480. A 50 °C difference; what's that going to do to every other temp in your case?[/citation]

Surely you jest. 5870s are designed to run around 70-80 °C; just run FurMark and find out. It's actually in the fan map. If I try hard enough, I can top one out at around 90 °C.
 
To the Nvidia fanboys:
-How many of you have 3D setups? That's what I thought: two.
-How many of you will actually be buying this card? Chirp, chirp...
-It's not the difference in max temp as much as it is HOW quickly and HOW long it jumps up and stays up, and HOW noisy it is even with small loads. All the sites I've read, even the ones that loved it, complained about the excess heat and noise. Even Nvidia recommends a special case with insane airflow.
-Even if the GPU can handle all the heat, can all the other components on the board? Caps don't like getting that hot.
-Do you really buy the marketing department's "it was meant to run hot" BS?
-Beta drivers after being released six months late? Really?
-Does it perform? Yes.
-Does it perform that much better than ATI to justify its price?
-Can you, with a straight face, tell me it's superior in all aspects?

The 480 & 470 are good products, not great ones. Maybe they should have been released with water blocks too. They were late, ran too hot, consumed too much power, and cost too much for a neutered chip.

They weren't what Nvidia was trying to make them out to be, and only two monitors for 500 bucks? What a letdown. I am more likely to hook up three monitors than play in 3D. The flicker in the glasses gives me a headache; polarized 3D doesn't.

Nvidia has a lot of work to do to bring out a hands-down better product, because this isn't it.
 
Now, 10% is an extremely low performance gain for this kind of heat. At 30 fps you get an extra 3 fps... worth it? I think not.

For the cost, power consumption, and low performance gain, ATI has won this round again... and this is coming from someone who has only ever bought Nvidia. Guess my next upgrade will be ATI at this rate...
 
Yes, Tom's, what happened to the Fermi Caption contest page? Don't tell me you got hit by a bogus DMCA takedown or something equally frivolous?
 
@dreamphantom: GPU PhysX will never be more than a cheap gimmick used as an afterthought, for at least two reasons:
1) A game developer will never alienate the ATI gaming sector
2) The GPU is usually the bottleneck in the system (EDIT: for gaming), so there is no reason to make it do more calculations for a scene. The API works fine on CPUs; it's only in games where the devs didn't care (Batman: AA) that PhysX doesn't use more than one core (see the sketch below).
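To illustrate that point (this is not the PhysX API, just a hypothetical toy sketch with made-up numbers): spreading per-object physics work across CPU cores is straightforward with stock tooling, so a CPU fallback pinned to a single core is a developer choice, not a technical necessity.

[code]
# Toy illustration only: splitting a naive physics step across CPU cores
# with Python's standard library. The body count and the integration step
# are invented for the example.
from concurrent.futures import ProcessPoolExecutor

DT = 1.0 / 60.0   # 60 Hz physics tick
GRAVITY = -9.81   # m/s^2

def integrate_chunk(chunk):
    """Advance a list of (position, velocity) pairs by one time step."""
    out = []
    for pos, vel in chunk:
        vel += GRAVITY * DT
        pos += vel * DT
        out.append((pos, vel))
    return out

def step_world(bodies, workers=4):
    """Split the bodies into one chunk per worker and integrate in parallel."""
    size = max(1, len(bodies) // workers)
    chunks = [bodies[i:i + size] for i in range(0, len(bodies), size)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        results = pool.map(integrate_chunk, chunks)
    return [body for chunk in results for body in chunk]

if __name__ == "__main__":
    world = [(100.0, 0.0)] * 10_000   # 10,000 falling bodies, 1D for simplicity
    world = step_world(world)
    print(world[0])
[/code]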

FPS is not what it's all about? WTF are you smoking? This is a video card for gaming; that is all that matters in this realm.

And the GTX 480 and the 5970 are pretty fair to compare, for anyone who disagrees with the two-GPUs-vs-one argument: look at the power consumption; they are pretty close for comparison.

As for this BS from NV that they are supposed to run hot: it's just an excuse, so that if it ruins something else in your computer (say the CPU or motherboard), it's not their fault.

EDIT: And tessellation, really? By the time a game comes out that tessellates enough surfaces for this to matter, and not just a few, it would cripple both cards as well.
 
Side note: very nice setup, bro. I really want a projector, or something like 3finity with three projectors, each directed at a different wall, but I don't have my own room.

Edit: I don't think the Dell monitor supports 120 Hz, though.

OK, so you have water to cool the 480; I see that you won't really have to care about heat now. Show us some OC on the 480. 1 GHz? At 1 GHz it should be pretty close to a 5970, but I don't think it will get there, as many people on XtremeSystems haven't passed 800 MHz yet. Maybe it needs some more voltage; the new Afterburner supports that, I think. But NO way on earth would I run that fan @ 100%.
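For a rough sense of why people bring up 1 GHz: assuming (very optimistically) that frame rate scales linearly with core clock, a back-of-the-envelope estimate looks like the sketch below. The 700 MHz figure is the GTX 480's reference core clock; the baseline fps is just a placeholder, not a measurement.

[code]
# Back-of-the-envelope clock scaling. Assumes performance scales linearly
# with core clock, which is optimistic; memory bandwidth stays put.
STOCK_CLOCK_MHZ = 700      # GTX 480 reference core clock
TARGET_CLOCK_MHZ = 1000    # the hoped-for overclock
BASELINE_FPS = 60.0        # placeholder fps at stock clock, not a measurement

scaling = TARGET_CLOCK_MHZ / STOCK_CLOCK_MHZ
print(f"{scaling:.0%} of stock clock -> roughly {BASELINE_FPS * scaling:.0f} fps at best")
[/code]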

All I am seeing in front of my PC right now is an empty Heineken bottle, lol.
 
I call BS. In electronic circuits (as opposed to heating appliances), heat means wasted power, simple as that. It doesn't matter how good Fermi is, or isn't. Nvidia's marketing statement is either an outright lie, a reflection of incompetence, or an apology for problems they couldn't overcome. In any event, it adds up to no Nvidia card for me.
 


Wow, Jesus, 44,168 posts! 😱
 
Ah man, this is probably the dumbest comment I've ever read...
Yeah, heat is GREAT for computers; that's why nobody uses heatsinks, water cooling, compressor cooling, or liquid nitrogen.

If Nvidia focused on making graphics cards that endure a full six months, I'd be amazed. Since they currently can't even get competitive cards out on time, the chances seem slim.
I've returned every single GPU I've had from them due to various breakdowns.
Never a card that lasted more than six months.
 
It's a shame it had to be like this, but it looks like Nvidia had to crank the crap out of the 480 to get it to beat the 5870. I'm not saying it doesn't have the goods, but it sounds like if you buy one you are getting an F1 racer: they run at the edge of tolerance, and a lot of the time they are only good for one race. The 5870, on the other hand, is a Le Mans car: not quite as fast as the F1, but it will be there at the finish, because it is built for endurance. So I guess it all depends on how you like to race. Gentlemen, start your engines.
 
[citation][nom]beeboob[/nom]If ATI designs a GPU as complex as Fermi, then that too will run hotter than their current offerings. ATI's current GPUs are primarily designed with games in mind - GPGPU is an afterthought. Fermi was designed primarily with GPGPU in mind, and to be able to achieve a 'high level' of rendering/pixel pushing for gamers. Based on the initial reviews, it has achieved that. The bottom line is that the GF100 contains several additional features which would blow the current 5xxx series out of the water if they were used optimally *in games*. If you read any tech discussion you will see an inkling of what these features can provide - higher-precision math, multiple kernels running simultaneously, etc. These features are not just of relevance to GPGPU - but the true potential of Fermi for gaming relies on developers exploiting this tech. If they don't, then, sure, you may as well stick with a 'gaming' card which will give you better bang for buck (and consume less power).[/citation]

So what you are saying is that for present use Fermi is no good for games (compared to the 5xxx series), but in the future maybe some games will be coded in a way that lets it shine? It seems like Fermi is taking the same road to success as Itanium did...
 
Tom's should do one of the following:

Overclock a 5870 until it consumes as much power as a 480, and then compare, or

Underclock a 480 until it consumes as much power as a 5870, and then compare.


I've got a hunch that such a comparison is going to make Nvidia look reeeaal bad. Pentium 4 at 4 GHz, much?
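For what it's worth, you can already get a crude efficiency comparison out of the numbers reviews publish, without running the iso-power experiment. A hypothetical sketch (the fps and wattage values below are placeholders, not measured results):

[code]
# Crude performance-per-watt comparison from review-style numbers.
# Every figure here is a placeholder, not an actual benchmark result.
cards = {
    "GTX 480": {"avg_fps": 66.0, "board_power_w": 320.0},
    "HD 5870": {"avg_fps": 60.0, "board_power_w": 190.0},
}

for name, data in cards.items():
    print(f"{name}: {data['avg_fps'] / data['board_power_w']:.3f} fps per watt")

# The iso-power idea from the post goes one step further: clock one card up
# or down until both draw the same power, then rerun the benchmarks.
[/code]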
 
I'm curious about the performance/feasibility ratio. Does the added heat and power consumption provide equally scaled graphics improvements over ATI? Is there a 50% heat and power increase but only, say, a 5% graphics improvement? Is that worth it? Has anyone looked into THAT aspect?
 
I have built and owned many computers for myself and friends... I know ATI is now king, but I have extensive experience with Nvidia chipsets and GPUs.

I have seen the 8800 series run in the 80s (°C) and even go up to 100 at peak, and they are all still alive (100 when the card is very dirty and needs to be cleaned).

I have a friend with a cheap 8400 GS whose fan is super slow due to cat hair... running WoW and even Company of Heroes... this card barely has a heatsink... yet it is still alive and running well as my HDTV PC.

Nvidia GPUs don't easily die from heat... All of the 8800 series cards I have sold, or had and then gave to a friend, are still alive and take abuse regularly, so...

Yes, I think the man is right when he said that heat will not kill the card unless it goes over 100... and in that case it's three years later, with the blower clogged with crap.
 