Nvidia's Fermi Cards Said to Run Very Hot

I wonder if the heat issue may be a factor in enthusiasts' ability to use the cards in an 'average' case. If the heat is too high, it would definitely be a detractor for the card series. I'm all for competition in the market; I guess I'm just worried Nvidia may be doing a rush job that will brutally embarrass it.
 
"50% higher transistor count. Wider 384-bit memory interface. Huge L2 Cache.

I think fermi's going to be slow. /sarcasm."

Be that as it may, the GeForce 200-series had about 50% more transistors than the Radeon 4000-series, 1.4 billion vs. 900 million, and didn't significantly outperform those cards.

Fermi is likely going to end up faster, and more expensive, than the Radeon 5000-series, but transistor count alone doesn't really say much.
 
I think ATI's current cards were engineered right up to the point of diminishing returns in terms of transistor count and thermals for the current 40nm process. Nvidia overshot that by about a billion transistors; I think the thermals and hardware error correction are going to hurt performance, and they won't be able to clock it high enough to compete with ATI. Point, set, match and generation go to ATI.
 
[citation][nom]njkid3[/nom]another reason why fermi is going to fail. it runs absurdly hot and no one in their right mind would buy a card that requires such specialized cooling to properly work. this should be the final nail in the coffin for fermi. it will be overpriced, power hungry, scorching hot and behind the ball. to sum it up nvidia failed.[/citation]

Ability to predict the future, have you?
Wait and see. AMD/ATI & NVidia have both produced good and bad products over the years. Be patient.
 
Guys, where there's smoke there's fire. An old cliché, I know, but come on... all these negative rumors about Fermi can't all be FUD. There's trouble in the green camp, and it's not pretty. The ATI 5000 series knocked Nvidia to its knees, and it's scrambling now to recover.
 
Come on, Nvidia!! Putting up pretty little pictures and running demos with NO numbers is degrading. Unleash that Fermi fury...
 
[citation][nom]ohreally[/nom]Of course it's an issue. Two 2-billion transistor GPUs cost less to make than one 3-billion transistor GPU does, due to manufacturing issues. Fermi needs to beat the 5970 while also costing less. Don't hold your breath. The smart money is on Nvidia retreating from the gaming arena completely. Fermi will be a decent performing gaming GPU, pretty good in terms of computational stuff, but there is no way Nvidia will continue to lose money and resources trying to beat ATI.[/citation]

Optimistically, I want to NOT believe this.

Realistically, I think you may be on the right track.
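
For what it's worth, the die-cost part of that quote can be sanity-checked with a textbook Poisson yield model (Y = exp(-A * D0)). Here's a minimal Python sketch; the wafer size, die areas, and defect density below are assumed numbers for illustration only, not real Cypress or Fermi figures:

[code]
import math

def gross_dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> float:
    """Rough approximation: wafer area / die area, minus an edge-loss term."""
    wafer_area = math.pi * (wafer_diameter_mm / 2.0) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2)
    return wafer_area / die_area_mm2 - edge_loss

def poisson_yield(die_area_mm2: float, defects_per_mm2: float) -> float:
    """Fraction of dies with zero random defects: Y = exp(-A * D0)."""
    return math.exp(-die_area_mm2 * defects_per_mm2)

WAFER_MM = 300.0   # assumed 300 mm wafer
D0 = 0.004         # assumed defect density, defects per mm^2 (hypothetical)

# Hypothetical die areas: two smaller dies per card vs. one big die per card.
for name, area_mm2, dies_per_card in [("two smaller dies", 330.0, 2),
                                      ("one big die", 530.0, 1)]:
    gross = gross_dies_per_wafer(WAFER_MM, area_mm2)
    good = gross * poisson_yield(area_mm2, D0)
    cards = good / dies_per_card
    print(f"{name}: ~{gross:.0f} gross dies, ~{good:.0f} good dies, ~{cards:.0f} cards per wafer")
[/code]

With those made-up numbers, the two-smaller-dies configuration ends up with roughly twice as many sellable cards per wafer. The only real point is that yield falls off exponentially with die area, so a 3-billion transistor chip is disproportionately expensive to manufacture, which is the gist of the quoted cost argument.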
 
Gaming is just a side market next to Fermi's intended use. Nvidia is trying to sell GPU servers, and those who have the cash to buy large servers have a lot more cash than all of us gamers. AMD made a lot more money off Opterons than Athlons.
 
Amazing to see the fanboys of one corner fighting the fanboys of the other about a product that isn't even reviewed or available... It's one interesting card for sure; I hope it can deliver once it's released!
 
Is the card running hot really that much of a problem? I have an ATI 4850 that's slightly overclocked and it runs over 105°C; it's been two years now and I have never had a problem. I'm sure that Nvidia will make sure the card is stable while running hot, and that the air will be vented out the back.
 
[citation][nom]IzzyCraft[/nom]Must be the wood screws not thermally efficient *ba da psh* Haha, I'm guessing all this means is that there won't be an X2 two-GPUs-on-one-card coming any time soon, which to me doesn't sound like an issue if Fermi runs enough faster than the 5870 and falls between it and the 5970[/citation] Fermi has around 3 billion transistors versus roughly 2 billion for ATI's chip, and them running hot is a bummer, because I was planning to get two mobile versions to replace my Mobility 3870 X2s. I want my PhysX.
 
lol, a friend of mine was one of those case vendors, and he just commented back about wood screws and non-DX11; the Nvidia rep was confused and then pissed LOL
 
Who gives a crap if a card is hot or not? As long as it performs and can withstand the temp, I couldn't care less; just turn down the AC to help them cool off, lol. The way I look at it, if you have the money to purchase two of those cards, then you can afford to pay for the extra cooling since the AC will be running longer. I live in Guam and my AC runs almost 24/7, so I couldn't care less, lol.
 
[citation][nom]njkid3[/nom]another reason why fermi is going to fail. it runs absurdly hot and no one in their right mind would buy a card that requires such specialized cooling to properly work. this should be the final nail in the coffin for fermi. it will be overpriced, power hungry, scorching hot and behind the ball. to sum it up nvidia failed.[/citation]
Yes, because the ATI 2900XT was such a cool and quiet card when it first came out? Or does anyone even want to talk about that one? Not only that, performance-wise it was totally outgunned, so despite being an oven it could not hang.
 
Take a deep breath, guys and ladies! The card at CES was not using the cooler planned for release, as that is still to be finalised.
 
[citation][nom]spectrewind[/nom]Oprimistically, I want to NOT believe this.[/citation]You speak good engrish. You want flied lice and egg woll?
 
[citation][nom]ohreally[/nom]....but there is no way Nvidia will continue to lose money and resources trying to beat ATI.[/citation]

That's funny; it sounds like talk I've heard over the years about ATI video cards, and about AMD processors vs. Intel not too long ago as well. AMD did not throw in the towel just because Intel's processors were faster, even though AMD was losing money, because there has usually been a leap-frog situation where one company's product jumps over its competition's performance and back again. I agree with others about waiting, and would add: we need to see retail (not pre-production) products benchmarked for performance while measuring power and cooling efficiency. We have already been told that the Fermi product is not complete in terms of cooling, etc. So is that just a different cooling solution and/or a chip that runs cooler? The powers that be are not talking, and anything else is speculation that can't be relied on to make a purchasing decision.
 
I'd like to see some benchmarks to see if it makes up for the heat. ^^;

If it doesn't... then I'll wait until they downsize it and get it right.

I waited before; I'll wait again.
 
[citation][nom]Renegade_Warrior[/nom]Time to incorporate your Espresso Machine into your Gaming Computer! This way you can have all the hot java you want when you want.[/citation]
Let's call this the Hot Coffee Mod.
 
[citation][nom]warezme[/nom]Yes, because the ATI 2900XT was such a cool and quiet card when it first came out? Or does anyone even want to talk about that one? Not only that, performance-wise it was totally outgunned, so despite being an oven it could not hang.[/citation]Right, except this time it would be Nvidia. It was bad for ATI back then, and it would be bad for Nvidia now.

I don't think that will happen, though. I do think that Fermi will be a good performer. However, the power consumption and thermals will hold back what they can do.

I hope for Nvidia's sake that TSMC's 40nm process has improved by leaps and bounds - because AMD was having supply issues, and their chips are considerably smaller than Fermi.
 