Nvidia: GeForce GTX 480 Was Designed to Run Hot

[citation][nom]turboflame[/nom]It's not a bug it's a feature![/citation]

Isn't Apple the only company that can say that sort of thing?

Anyway, the new Nvidia cards were deliberately designed to run hot, just like Vista was deliberately designed to run like crap. Is that right?
 
"The chip is designed to run at high temperature so there is no effect on quality or longevity."

Now if only the same could be said of everything else in my case that this card will be radiating heat onto.
 
Nvidia has always been a company that pushes the limits of technology. That's what Fermi is. It's not just about frame rates, people. Fermi can do things ATI's card can't; if ATI's card could, I would have bought it already. Like husker said, "The GTX 480 is a powerhouse of a chip that is simply not fully utilized by the current generation of gaming software." I said this before the benchmarks came out, and when they did, it was proven: ATI will win on fps in older games that don't use tessellation, but in tessellation-heavy games Fermi will wipe the floor with ATI. Even the 5970 can't keep pace with the Fermi cards under extreme tessellation. A couple of benchmark reviews I've seen showed Fermi has up to 8 times ATI's tessellation performance, and a minimum of 2 times.

Couple that with PhysX, 3D across three screens, 32x AA, CUDA, and letting programmers code in C++, plus it's the fastest single GPU on the planet. Fail? I don't think so. Big deal, it runs 10°C hotter than last generation; quit whining. When new games start coming out that take advantage of everything Fermi has to offer, you will see why Nvidia chose this route, because the current ATI architecture is going to choke.

If you don't want to buy a Fermi card, then don't. If you do, then read my last post. Either way, just because the card runs hot doesn't mean it's a failure. Nvidia simply decided to accept more heat instead of investing in a crazy-expensive air cooler, because they know everyone has been complaining about the price.

The only things that "failed" at Nvidia were PR management (lying to us about the release date and doing a paper launch) and manufacturing (because of bad yields). The architecture is sound. Nvidia isn't perfect, and neither is ATI; that's the thing with PC gamers. If you have a problem, you are supposed to come up with a solution, not whine about it. Be smart, not a baby. If it runs hot, then fix it.
+1. Although you forgot to mention that these cards are not the full-monty 512-core parts and they are still faster than their ATI counterparts. Can you imagine what kind of whining we would have to put up with if the fully working GPUs had been released first?
 
I really like this site... nVidia comes out with a new card that's 7°C hotter than its ATI counterpart, the 5870, and because it's 7°C hotter every ATI fanboy has something new to talk about. Now that we have seen that it can outdo the 5870 (with beta drivers), ATI fanboys need something new to talk about, and what could be better than "OMG, 7°C hotter than ATI!!! It's gonna melt in 2 seconds!" It makes my day to come in here and read stuff like that, so thanks, ATI fanboys, for keeping my day fun. As people say, you live longer when you laugh, so keep it up :)

BTW, here's a link to the temperatures so you can see for yourself that it's only 7°C hotter:

http://www.tomshardware.com/gallery/Temperature,0101-242418-7746-0-0-0-jpg-.html

 
[citation][nom]dreamphantom_1977[/nom]Nvidia has always been a company that pushes the limits of technology. That's what Fermi is. It's not just about frame rates, people. Fermi can do things ATI's card can't.[/citation]

Actually, ATI's products can do pretty much the same things, and they can sometimes do them better (GPU-based transcoding, for example). They just haven't focused ONLY on that sort of feature set yet, preferring to concentrate on making great gaming chips.
They are OpenCL-capable, and with a hack or two they can even do PhysX.
Just because nobody has bothered to use the API doesn't mean that nobody ever will.

Also, I think you might want to check your facts about "3D across 3 screens". At the moment it's only ATI that can do that from a single card (in fact, they can do 6 screens from a single card). Nvidia needs 2 cards to do the same.
 
Yeah, that's why nVidia said DX10.1 was useless when ATI cards got a 20% boost from using it. LOL
 
[citation][nom]zybch[/nom]Actually, ATI's products can do pretty much the same things, and they can sometimes do them better (GPU-based transcoding, for example). They just haven't focused ONLY on that sort of feature set yet, preferring to concentrate on making great gaming chips. They are OpenCL-capable, and with a hack or two they can even do PhysX. Just because nobody has bothered to use the API doesn't mean that nobody ever will. Also, I think you might want to check your facts about "3D across 3 screens". At the moment it's only ATI that can do that from a single card (in fact, they can do 6 screens from a single card). Nvidia needs 2 cards to do the same.[/citation]


Yeah, and nVidia can do it in 3D; ATI can't. And when ATI has its 3D ready in, what, six months, will it run on 6 screens? I think not. Will it need two ATI cards, like nVidia needs for 3D? Most likely, but let's see :)

I think it's silly to compare them, since only one of them does 3D, and I think it's fair that you need two cards to run 3 screens in 3D.
And IMO, screw 6 screens; who even has room for that? LOL. I would much rather have 3 screens in 3D, by far :)
 
Couldn't a REALLY hot card like this fry the chipset next to it? Sure, THAT card is made to withstand the heat, but what about the motherboard it's connected to? Can the motherboard withstand the heat it's sitting next to?
 
Wow, people are still debating. The 5970 is simply the better card for GAMING. No one in his right mind would buy a card just to be able to run games a bit "better" a year or more from now. Name me some FULLY tessellated games, please. There are none. For things other than gaming, like GPGPU Folding@Home, Fermi is the thing to get, yes; but not for actual gaming, unless its price drops.

@dream, would you accept the fact that having a Fermi would raise your CPU, HD, and motherboard temps? If you're happy with that, get one. It won't fry them, but it has MORE of a chance of frying them. It's like cancer: eating veggies protects against cancer. Yes, you can still get cancer, but if you do eat them you have LESS chance of getting it, at least on the food side. Yes, he was exaggerating, but the point still stands: high heat, and a high TDP for what it delivers against the 5970 in today's gaming, not in a year's time :lol:

Technology is supposed to make each new generation more efficient, faster, cooler, and finally smaller.


Hahaha, actually there are many more nVidia trolls than ATI ones. LOL

Thanks for the -1, nVidia fanboys. LOL
 
So wait, you're saying that for $100 more, you wouldn't pay for 20% better average performance, 200% better tessellation performance, PhysX, native 3D across 3 screens, and 3x the ray-tracing performance?
It would cost you more than that just to buy an nVidia card to use with the hack that enables PhysX.

I'm not wasting any time on PhysX and how useless it is; just browse the forums. Yes, $100 more for a more efficient card that is 30%+ faster, sure, why not?
But 3D with half the fps and ridiculous monitor and equipment prices? No thanks. That is all simply marketing.

Where do you get your 200%? LOL. A 480 runs near the 5970's tessellation performance, but it is way behind in less tessellated games.

Want to discuss more? Come here: http://www.tomshardware.com/forum/284770-33-gtx480-gtx470-reviews-discussion
 
"The chip is designed to run at high temperature so there is no effect on quality or longevity" - I think everyone is taking this out of context. What they are saying is that they knew what they were creating would run hot as hell, so they made sure the chip was DESIGNED to handle that heat; otherwise, if they hadn't designed the chip to handle the heat it creates, it would fry. They are trying to ease your mind so you don't think the damn thing is gonna fry. It wasn't designed to intentionally produce heat!!! As if it would run better that way or something?? LOL
 
[citation][nom]tanjali[/nom]Best solution for competition is Intel buying Nvidia.[/citation]I highly doubt it's the best solution, but I wouldn't mind seeing it happen. Then again... I like being able to switch between cards and chipsets, and that might all change if Nvidia teamed up with Intel.
 
I have a question: if cards were limited to running at 60 fps maximum, regardless of whether they could go faster, would they consume less electricity than when the frame rate is allowed to max out?
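In principle, yes: a frame cap (like vsync) leaves the GPU idle for part of every frame instead of rendering frames you never see, and an idle GPU draws far less power. Here's a rough sketch of how a cap works in a game loop; render_frame() is a hypothetical stand-in for one frame of GPU work, not any real engine's API:

[code]
// Rough sketch, not from any real engine: why a 60 fps cap saves power.
// render_frame() is a hypothetical placeholder for one frame of GPU work.
#include <chrono>
#include <thread>

void render_frame() { /* placeholder: issue draw calls, swap buffers */ }

void run_capped_loop() {
    using clock = std::chrono::steady_clock;
    const auto frame_budget = std::chrono::microseconds(16667); // 1/60 s

    for (;;) {
        const auto start = clock::now();
        render_frame();                        // the GPU works here
        const auto spent = clock::now() - start;
        if (spent < frame_budget)              // finished early?
            std::this_thread::sleep_for(frame_budget - spent); // GPU idles
    }
}
[/code]

Uncapped, that loop would call render_frame() back to back and the GPU would stay pinned at 100%; capped, every microsecond spent in the sleep is power the card doesn't burn.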
 
You know, for as much crap as I gave these cards, they're really not that much hotter than the GTX 295 (and the 480 performs similarly).

http://www.tomshardware.com/reviews/geforce-gtx-480,2585-15.html

While the nVidia card does use way more power (about 30% more) than the AMD card, it's still in the same ballpark as the previous generation of cards, so it's not that big of a revelation. Still a load of hogwash, though, that they claim they HAVE to run that hot.
 
If nVidia had kept all 512 SPs, the card would pull even more watts and produce even more heat. I think some environmental laws should start being introduced, seeing as there is already a CO2 tax in Europe. LOL. But really, I'm sure people will run 2-3 of these in SLI, and that will pull more than 1.2 kW from the wall!!! I was hoping for a better all-around chip to force ATI to drop their prices so I could get another 5850. When there's only one side, we are the losers in the end; we need competition to drop prices. Remember the 8800 Ultra's $1K prices?
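For what it's worth, that 1.2 kW figure roughly checks out on the back of an envelope, assuming around 300 W per GTX 480 under load (reviews vary), about 200 W for the rest of the system, and roughly 85% PSU efficiency; all rough assumptions, not measurements:

[code]
// Back-of-the-envelope check of the ">1.2 kW from the wall" claim.
// Every figure below is a rough assumption, not a measurement.
#include <cstdio>

int main() {
    const double watts_per_card = 300.0; // assumed GTX 480 load draw
    const double rest_of_system = 200.0; // assumed CPU, board, drives
    const double psu_efficiency = 0.85;  // assumed PSU efficiency at load

    for (int cards = 1; cards <= 3; ++cards) {
        double dc_load = cards * watts_per_card + rest_of_system;
        std::printf("%d card(s): ~%.0f W at the wall\n",
                    cards, dc_load / psu_efficiency);
    }
    return 0; // 3 cards: (900 + 200) / 0.85 ~= 1294 W, so >1.2 kW is plausible
}
[/code]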
 
Is this Nvidia's way of saying they are having problems making efficient GPUs with less heat? ATI has managed to cut heat in its GPUs (though not in its mobile parts). The statement sounds like an admission that they could do nothing better than crank up the clocks, and the heat, to get a fast card, rather than getting there by design and ingenuity.

In English: we have inferior technology!
 
Nice talk, but what does this suggest about the future from the green camp? Are they suggesting that future higher-performance chips are going to be even more aggressive on power draw and thermals? That's scary, I say.
 
Just browse the forums? That's exactly the ATI fanboys' problem: instead of looking at benchmarks and coming up with facts, you "only" browse forums for your information. PhysX is the best physics software out there; the only reason it's not being used in more games is that developers aren't taking full advantage of it:

http://www.gamephys.com/2010/04/01/why-developers-are-not-taking-full-advantage-of-physx-and-implementing-it-into-every-game/

http://www.tomshardware.com/forum/forum2.php?config=tomshardwareus.inc&cat=33&post=285328&page=1&p=1&sondage=0&owntopic=1&trash=0&trash_post=0&print=0&numreponse=0&quote_only=0&new=0&nojs=0

Again, you aren't posting facts, only opinions not based on evidence. That's how I know you're an ATI fanboy. Where are your facts? Come to the forums? For what? I have already read just about every review of these cards out there. If you READ the reviews instead of looking at the pictures, you would know that in highly tessellated benchmarks, during the most tessellation-heavy time frames, the GTX 480 doubles even the HD 5970's performance. And I don't need to go into the forums to collect 100 people's "opinions" to make my decisions; I can read reviews on legitimate sites and base my choices on those. If you don't believe me, I don't really care. When newer games start using a lot more tessellation, which I think will happen sooner rather than later, you will see what I am talking about.

The only thing Fermi is behind the 5970 on is FPS, and that's only because games don't really use tessellation yet. When they start to, you will see the GTX 480 wipe the floor with even the 5970.

Yes, Fermi's tessellation is twice as fast as the 5870's.

http://www.overclockersclub.com/reviews/nvidia_gtx480/14.htm

http://www.brightsideofnews.com/news/2010/3/30/nvidias-island-directx-11-demo-works-on-amd-gpus.aspx

Anyway, I'm done with my rant. I'm sick of ATI fanboys who can't read.


Sure, now insult me by saying I can't read. I have told you for the second time: if you want a more in-depth discussion, please post in this thread: http://www.tomshardware.com/forum/284770-33-gtx480-gtx470-reviews-discussion. That's where it should be posted.

I can name a few things about PhysX:

- Big hit to FPS, since the GPU is doing other calculations on top of rendering.
- Any CPU above a Q6600 will handle PhysX WITHOUT any slowdown. In fact, some of the best titles chose not to use PhysX; you can ask them why. Crysis is one of them, and it is actually backed by nVidia while not using nVidia's technology.
- OpenCL and the DirectCompute that came with DX11 will render PhysX completely useless in about 2 years or even less. AND PhysX is proprietary and locked down, versus OpenCL/DirectCompute, which are open standards and free to use (see the sketch just below).
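To put that last point in concrete terms, here is a rough sketch of GPU physics done through OpenCL rather than PhysX: a toy particle integrator that runs on any OpenCL-capable GPU, ATI or nVidia alike. The kernel and names are illustrative, not from any real physics engine, and error handling is trimmed for brevity:

[code]
// Rough sketch of vendor-neutral GPU physics via OpenCL: a toy particle
// integrator. Illustrative only; not from any real physics engine.
#include <CL/cl.h>
#include <cstdio>
#include <vector>

static const char* kSource =
    "__kernel void integrate(__global float4* pos,\n"
    "                        __global float4* vel,\n"
    "                        const float dt) {\n"
    "    size_t i = get_global_id(0);\n"
    "    vel[i].y += -9.81f * dt;   /* gravity */\n"
    "    pos[i] += vel[i] * dt;     /* Euler step */\n"
    "}\n";

int main() {
    const size_t n = 4096;
    std::vector<cl_float4> pos(n), vel(n); // zero-initialized particles

    cl_platform_id platform;
    cl_device_id device;
    clGetPlatformIDs(1, &platform, NULL);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL);

    cl_int err;
    cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, &err);
    cl_command_queue queue = clCreateCommandQueue(ctx, device, 0, &err);

    // Build the kernel from source at runtime, as OpenCL 1.x requires.
    cl_program prog = clCreateProgramWithSource(ctx, 1, &kSource, NULL, &err);
    clBuildProgram(prog, 1, &device, NULL, NULL, NULL);
    cl_kernel kern = clCreateKernel(prog, "integrate", &err);

    cl_mem dPos = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                 n * sizeof(cl_float4), pos.data(), &err);
    cl_mem dVel = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                 n * sizeof(cl_float4), vel.data(), &err);

    cl_float dt = 1.0f / 60.0f; // one 60 fps physics step
    clSetKernelArg(kern, 0, sizeof(cl_mem), &dPos);
    clSetKernelArg(kern, 1, sizeof(cl_mem), &dVel);
    clSetKernelArg(kern, 2, sizeof(cl_float), &dt);

    size_t global = n;
    clEnqueueNDRangeKernel(queue, kern, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(queue, dPos, CL_TRUE, 0, n * sizeof(cl_float4),
                        pos.data(), 0, NULL, NULL);

    std::printf("first particle y after one step: %f\n", pos[0].s[1]);

    clReleaseMemObject(dPos); clReleaseMemObject(dVel);
    clReleaseKernel(kern);    clReleaseProgram(prog);
    clReleaseCommandQueue(queue);
    clReleaseContext(ctx);
    return 0;
}
[/code]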

EDIT: The OverclockersClub benchmarks use an overclocked Fermi. How did they manage to cool it? Water? It's sad they didn't show power consumption when overclocked. LOL.
nVidia GTX 480 @ 804MHz/1608MHz/1058MHz
 
Yeah, that's why nVidia said DX10.1 was useless when ATI cards got a 20% boost from using it. LOL

It's funny how most game developers really skipped DX10. Face it: if there are not enough changes over DX9, why should software houses invest in a new code path and thus spend extra money on development? That was a real problem with the DX10 generation.

Not my words, but it kinda covers the whole DX10.1 thing as well. :finger:

Source
 
But we as customers got a 20% boost for free. LOL. Why did some games remove DX10.1 support with a patch? Assassin's Creed was one of them. $$$$


I don't think ray-traced games are in the near future, at least not until next-gen GPUs :lol:
 