GeForce GTX 480 And 470: From Fermi And GF100 To Actual Cards!


aungee

Distinguished
Sep 17, 2008
12
0
18,510
Good that Fermi is finally out (well, almost!). I suspect Nvidia will hold off enabling the extra unused cores until ATI releases their next-gen cards, as this would keep ATI guessing about what performance to beat from the green team.
 

Tomtompiper

Distinguished
Jan 20, 2010
382
0
18,780
[citation][nom]aungee[/nom]Good that Fermi is finally out (well, almost!). I suspect Nvidia will hold off enabling the extra unused cores until ATI releases their next-gen cards, as this would keep ATI guessing about what performance to beat from the green team.[/citation]

What ATI has to beat is their own cards; Nvidia has nothing to bring to the table unless you are into CUDA and Folding@home. If you live in a cold house, get a Fermi; anybody else will get a 5xxx.

 

HavoCnMe

Distinguished
Jun 3, 2009
603
0
18,990
Sign me up for a GTX 295 still; the GTX 480 is such a disappointment! Maybe the GTX 480 X2 will be worth it, but then again you will need a separate PSU to power it, along with a dedicated AC unit in your case to keep the heat down. What a hyped-up p.o.s.
 

Guest

Guest
One of my acquaintances assembled a new computer a few months back, and when I asked him which graphics card he would buy for it, he said he'd wait till Fermi came out because the specs Nvidia released were really impressive and would eat the ATI 5 series for breakfast. He didn't realize they were PAPER specs that don't translate into equivalent framerates. I wonder how he feels now after reading the reviews. He'll probably have to buy a new PSU too if he still wants to go ahead and buy Nvidia for a mere few frames' advantage.
 

roadrunner343

Distinguished
Aug 22, 2009
75
10
18,635
Like most people have said: it's just too little, too late. The slight performance boost over the 5870 is nice, but it's coming too late. I have used nVidia cards since the FX 5000 series and have never been disappointed.

I just couldn't wait any longer for my next build. Had this come out three or four months ago, I would have dropped one in my rig rather than a 5870. Alas, they were just too late, and based on these benchmarks, I am not upset about choosing the 5870 at all.
 

Guest

Guest
I would like to point out, contrary to commenters who refer to this card as an "over-hyped POS" without any real arguments, that:

* this card has incredible double-precision performance, an 8x increase over the previous generation
* the CUDA 3.0 SDK just came out, and it is leaps ahead of the competition.

It may not be a card for gamers, but this is the future of computing.
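
For anyone wondering what that double-precision talk means in practice, here is a minimal CUDA sketch of a plain FP64 DAXPY kernel; the names and sizes are just for illustration, not anything from the article, but it is exactly this class of double-precision arithmetic that the 8x claim is about:

[code]
#include <cstdio>
#include <cuda_runtime.h>

// Double-precision DAXPY: y = a*x + y, one thread per element.
__global__ void daxpy(int n, double a, const double *x, double *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(double);

    // Host data.
    double *hx = new double[n], *hy = new double[n];
    for (int i = 0; i < n; ++i) { hx[i] = 1.0; hy[i] = 2.0; }

    // Device data.
    double *dx, *dy;
    cudaMalloc((void **)&dx, bytes);
    cudaMalloc((void **)&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    // 256 threads per block, enough blocks to cover all n elements.
    daxpy<<<(n + 255) / 256, 256>>>(n, 3.0, dx, dy);
    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);

    printf("y[0] = %f (expect 5.0)\n", hy[0]);  // 3.0 * 1.0 + 2.0

    cudaFree(dx); cudaFree(dy);
    delete[] hx; delete[] hy;
    return 0;
}
[/code]

On earlier GeForce generations, double-precision throughput was a small fraction of single-precision (G80 had none at all), which is why code like the kernel above is where Fermi's improvement shows.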
 

shovenose

Distinguished
wtf, nvidia?
you can do better than that.
that card will die after 6 months when some dust gathers in that already borderline heatsink!
NVIDIA FAIL!
they should release a watercooling-ready version!

p.s. don't even think about overclocking!
 

yyk71200

Distinguished
Mar 10, 2010
877
0
19,160
[citation][nom]meat81[/nom]to some of the people making comments, please realize the 5970 has two GPU's.[/citation]
And the 5970 resides in one PCIe x16 slot just like the 480, and you can CrossFire 5970s just like you can SLI 480s. So it is a fair comparison.
 

shovenose

Distinguished
[citation][nom]yyk71200[/nom]And the 5970 resides in one PCIe x16 slot just like the 480, and you can CrossFire 5970s just like you can SLI 480s. So it is a fair comparison.[/citation]
what about a dual-GPU GeForce 490!?
thing would probably blow up though!
 

Keiki646

Distinguished
Jun 22, 2008
630
0
19,010
I thought a temperature reading would be interesting. Turns out that, during normal gameplay (running Crysis, not something like FurMark), the exposed metal exceeds 71 degrees C.

Maybe that is the reason they disabled some of the cores: it runs too hot and might crash.
I am not pleased with Nvidia right now; I thought they would have a card that could go up against the ATI 5870 and 5970.
All this shows is that there were some problems keeping the card running stable at high temps.
Don't get me wrong, I have an ATI 5850, but I still have a lot of respect for Nvidia.
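
If you want to keep an eye on those temperatures yourself, here is a minimal sketch that polls the on-die sensor through NVIDIA's NVML management library. Assumptions: NVML is installed and the card is device index 0; note this reads the GPU core sensor, not the exposed metal the article measured with a probe:

[code]
// Minimal GPU temperature poll via NVML (link with -lnvidia-ml).
#include <cstdio>
#include <nvml.h>

int main()
{
    unsigned int temp = 0;
    nvmlDevice_t dev;

    if (nvmlInit() != NVML_SUCCESS) {
        fprintf(stderr, "NVML init failed\n");
        return 1;
    }
    // Assumption: the GPU of interest is device index 0.
    if (nvmlDeviceGetHandleByIndex(0, &dev) == NVML_SUCCESS &&
        nvmlDeviceGetTemperature(dev, NVML_TEMPERATURE_GPU, &temp) == NVML_SUCCESS)
        printf("GPU core temperature: %u C\n", temp);

    nvmlShutdown();
    return 0;
}
[/code]

The nvidia-smi command-line tool reports the same sensor if you'd rather not compile anything.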
 
[citation][nom]Vader3[/nom]Interesting that Anandtech came up with different outcomes on the performance of the GTX480 card.[/citation]
Oh, you mean that guy Intel pays to trash-talk everything AMD produces? Whoa, big surprise there, PAL!
[citation][nom]meat81[/nom]to some of the people making comments, please realize the 5970 has two GPU's.[/citation]
Yeah, and I'd love to see how long a dual-GPU Fermi lives; the single GPU is almost melting the card as it is! LOL
[citation][nom]aungee[/nom]Good that Fermi is finally out (well, almost!). I suspect Nvidia will hold off enabling the extra unused cores until ATI releases their next-gen cards, as this would keep ATI guessing about what performance to beat from the green team.[/citation]
The only green on the green team these days is envy. They didn't knock off the 5970, and they won't be able to.
 

siman

Distinguished
Dec 29, 2009
41
0
18,530
Umm, the new ATI cards are coming very soon... what is Nvidia trying to do anyway? Two simple little words can be used to describe this: "EPIC FAIL".
 

tobinmarch

Distinguished
Nov 23, 2009
10
0
18,510
Has ATI fixed some of their Grey Screen of Death issues with the 58xx cards? I had a 5870 for the month of January and RMA'd it for a second one that had the same issue. Sent it back for store credit and have been awaiting the new Nvidia cards. Looks like I am still very undecided.
 

bossie2000

Distinguished
Jul 30, 2008
8
0
18,510
I see some short-term problems for Nvidia here. According to plenty of Nvidia fanboy comments on this forum, they might not wait the 2+ weeks (or more) before mass availability after this review. Some of them will now just go out and buy ATI out of frustration and disappointment!! Got myself a 5850 last September and it's still rocking all the titles at max settings (well, except Crysis).
 

yyk71200

Distinguished
Mar 10, 2010
877
0
19,160
[citation][nom]shovenose[/nom]what about a dual-GPU GeForce 490!? thing would probably blow up though![/citation]
There is no such thing as a dual-GPU 490, and there isn't going to be, at least until the next die shrink.
 

brisingamen

Distinguished
Feb 3, 2009
201
0
18,680
Fermi is obviously not a mature product, but Nvidia had to release something; they released a few A2-stepping chips on some quad boards in Europe or whatever.

Once they release a full 512-core chip and can push the clock to 900 MHz while keeping the temps 20% lower, I'll consider it.

I really like the folding performance and the minimum fps,

but between driver maturity and, most of all, the silicon itself, this chip is just a while away from being made properly, IMO. By the time this thing is really ready we will have a 6xxx series, and when that time comes I will evaluate the situation. I prefer ATI, but if a new stepping or die shrink works out the current heat/core problems, I'll consider this GPU.

 

swiftsword69

Distinguished
May 26, 2009
7
0
18,510
The ATI cards have been on the market for over six months now, and the GPU segment gets an update of the lineup/architecture every 6-9 months. ATI/AMD has been keeping suspiciously quiet, and I can only guess that they are sadistically watching from the sidelines while Nvidia fails epically with the release of Fermi. I reckon any time now they are going to release something hardcore that will wipe out Fermi across the board. The current generation from AMD is already doing well enough for Fermi to be choking on.

However, on Nvidia's side, Fermi is a drastic shift to a new architecture. Perhaps they will improve this brand-new architecture in their next generation of GPUs. Also, Nvidia just doesn't really seem to be that innovative these days. They copied the idea of triple-display output from AMD, which in turn had gotten the idea from the old Matrox cards.

I know Nvidia is pushing hard on CUDA and GPGPU; however, AMD is the only company with the licenses and practical experience to make both CPUs and GPUs. In the future I think the key will be Fusion for AMD, and if Nvidia wants to survive, they should beg Intel to buy them out.
 

zodiacfml

Distinguished
Oct 2, 2008
1,228
26
19,310
Inevitable. Nvidia had to do this to protect its future from AMD and Intel, and GPUs are becoming more powerful than the games they run.

The performance is not a surprise considering the leaks we had read.
They sacrificed gaming performance to make these cards better at GPU computing.

Their problem now is how to sell enough of these cards to gamers and enthusiasts. I hope they have some impressive CUDA applications for them.
 
[citation][nom]wealljustlost[/nom]We all lose here - this is where many people are missing the point. If nVidia delivered a superior product - ATI would bring their prices down, ATI would work harder to get the next product out, we'd have some healthy competition - and that's where the market pushes forward with better products and lower prices. But now it all stagnates - the real loser here is the consumers - we're stuck with only one option for graphic cards - whose prices are going UP, might I add. So think about that for a second before being gleeful about nvidia sucking - the joke is on you.[/citation]
You're wrong. How can you even THINK that after all the CRAP nVidia has pulled due to their arrogance? ATi made them eat crow and gave them a much-needed dose of humility (I hope). Now perhaps it won't just be ATi that's treating consumers with honesty. I'm glad ATi is in the driver's seat because nVidia had become complacent, dishonest and cocky.
 
Avro Arrow

[citation][nom]swiftsword69[/nom]In the future I think the key will be Fusion for AMD, and if Nvidia wants to survive, they should beg Intel to buy them out.[/citation]
God I hope not! Can you imagine how dangerous Intel would be if they had a real GPU department? It would be better for all of us consumers if nVidia sells out to AMD. The only thing keeping AMD alive right now is ATi. If nVidia had Intel's resources behind it, AMD would be totally destroyed. And then what? You really want Intel to rule this industry unchecked? I sure as hell don't.
 

yyk71200

Distinguished
Mar 10, 2010
877
0
19,160
What cracks me up about some of the reviews, like Guru3D's, is that they say the 480 slaughters the 5870 based on low resolutions. As if 120 fps versus 100 fps matters. Why would anyone buy a 480 to game at 1680x1050? As resolution increases, the differences between the cards decrease, and at the high resolutions where it matters they are typically smaller than anything you could call a slaughter. Most current games are not very challenging even for mid-range cards at reasonable resolutions.

Computing power is another matter. The question there is whether it matters enough to you to justify buying Fermi.
 

yyk71200

Distinguished
Mar 10, 2010
877
0
19,160
[citation][nom]Avro Arrow[/nom]God I hope not! Can you imagine how dangerous Intel would be if they had a real GPU department? It would be better for all of us consumers if nVidia sells out to AMD. The only thing keeping AMD alive right now is ATi. If nVidia had Intel's resources behind it, AMD would be totally destroyed. And then what? You really want Intel to rule this industry unchecked? I sure as hell don't.[/citation]
AMD is very competitive with Intel right now in the low-end and mid-range segments of the CPU market. Oh, btw, AMD has much more affordable quad-cores now (the Athlons). But this is a topic for another thread.
 

Guest

Guest
It will be interesting to see how this Fermi architecture evolves. It doesn't only benefit the scientific side; it benefits us as gamers too (more effects, more realism), even if some of us have missed the connection between GPGPU and games. Even in this crippled/immature version, it can still compete with its rival. Remember the first G80 architecture: it had the same power and heat issues, but there is always a fix, and they fixed it in G92/G94. I think Nvidia is already on the right track; they just need to keep moving and perfect it. Just grab whatever graphics card is most affordable and suitable for you. Each person has their own needs. No need to worry.
 