GeForce GTX 480 And 470: From Fermi And GF100 To Actual Cards!

Status
Not open for further replies.

mm0zct

Distinguished
Aug 13, 2009
25
3
18,535
Why does Tom's Hardware always forget about the other stereoscopic 3D solutions? Have they simply never heard of iZ3D, TriDef, or Zalman? I'm running my Zalman Trimon 22" 3D display happily on my ATI HD 3850, and my next upgrade will probably be a 57x0-series or 5850 card if the price drops.
Being a member of the three-screen crowd (all completely different sizes though, so not much use for Eyefinity), I'd much rather go with the new ATI cards, and I'm pretty sure the iZ3D and TriDef middleware would happily span across three Zalman Trimons to give me surround 3D if I had the cash.
 

knutjb

Distinguished
Jan 11, 2009
68
0
18,630
Oh, don't forget the specially certified Nvidia cases with turbofan blowers to get all that heat out of the case. Now we have proof of why they're required.
 

cangelini

Contributing Editor
Editor
Jul 4, 2008
1,878
9
19,795
[citation][nom]mm0zct[/nom]Why does Tom's Hardware always forget about the other stereoscopic 3d solutions? Have they simply never heard of IZ3D, Tridef or Zalmon? I'm running my Zalmon trimon 22" 3D display happily on my ATI hd3850, and my next upgrade will probably be a 57x0 series or 5850 if the prices drops.Being a member of the 3 screen crowd (all completely different sizes though, so not much use for eyefinity) i'd much rather go with the new ATI cards, and I'm pretty sure that the IZ3D and Tridef middleware would happily span across 3 Zalmon trimons to give me surround 3D if I had the cash.[/citation]

Check back on Tuesday--we have something that I think you'll like!
 

nottheking

Distinguished
Jan 5, 2006
1,456
0
19,310
I sometimes wonder at the work of people here... Why would they vote down a post (page 6 of the comments) where one points out some typographical flaws in the article? Yes, I know that fixing them'd make nVidia's cards look (sorta) worse... But seriously, is it that bad a thing?

At any rate, with 15 pages now, I'm starting to feel Tom's should find a way to re-vamp their comments section, 'cuz this is ENTIRELY too long to browse through page-by-page... Either options for skipping multiple pages, options for larger pages, or perhaps a "view all" option would be nice.

[citation][nom]cangelini[/nom]Check back on Tuesday--we have something that I think you'll like![/citation]
While, like most people, (I think) I'm of the opinion that 3D gaming is perhaps a fad, it'd be nice to see a thorough, head-to-head comparison of them. Of course, that could mean benchmarks. The prospect of that sounds exciting, since we all know that benchmarks are the enthusiast's crack.
 
Guest
Come on. My GTX 275 gets to around 100 degrees and it doesn't burn anything down, because there's nothing to burn.
You realize 97°C isn't even from real gaming; it's just a stress simulation. Run Crysis with all the settings cranked up and then tell me nothing burns down.
 
Guest
Wow, Tom's readers are a bunch of clueless kids now. Sigh, this place used to have rather intelligent people who discussed things rationally, but it's now populated by fanboys, gamers, and just plain stupid people.

Let's get this straight: looking at the architecture of this card and the driver/software support Nvidia has been building, this card is NOT focused on the gamer side; it's geared much more toward the professional/scientific community. If you think otherwise, then you don't belong here posting, as you cannot think past what only matters to you. At $500, that's a bargain price for these communities to try out and use. You can bet that Nvidia is eyeing various parts of the supercomputing market, along with other professional applications such as Adobe's.

I myself was hoping for better game results, but I also see where they are heading in the future. And currently they are quite a bit ahead of AMD/ATI, and Intel for that matter; a rack of these chips with mature drivers will beat anything those two companies could come up with for straight computational power.

That aside, they also did a whole new rework for DX11, which ATI will have to do as well to keep up in the future.

Tom's, games and most of your benchmarks are NOT the end-all, be-all of computing. I understand you don't do a lot of enterprise hardware, but you should have realized what this card represents and included that in your review. As it sits, you have nearly everyone saying it's an epic fail when in fact it holds tremendous potential. This isn't Tom's Games Site yet.
 

tamalero

Distinguished
Oct 25, 2006
1,227
242
19,670
[citation][nom]edilee[/nom]To the author of this article...great in depth review! Also the retail Dirt 2 DOES in fact have a built in benchmark...it is in the "options" section outside the trailer in the graphics section at the bottom. In fact I ran the benchy a couple nights ago before and after I did a CPU swap. I do not recall the benchmark being in the demo...seems to me it was greyed out if I remember correctly but I could be mistaken but it is in the retail version for sure.Interestingly enough the Nvidia cards hit right in the margin of performance vs. ATI's card I though it would except for the marks where the 480 was extrmely close to the 5970...this should trouble ATI and I say this because the fermi cards are not fully functional cards yet with the first 2 debut cards.And for all the LOL comments from the ATI fanboys I have one question...in a few months when all cores are active on the Nvidia cards what name will be topping all the benchmark charts? Nvidia, Nvidia, Nvidia. Remember all the really good cards come towards the end of a series...and where do you think a Fermi X2 card is going to place on the charts? The top with much ease.Aside from that this Nvidia card series release will hopefully drive card prices down for all ATI and Nvidia fans alike...once the manafacturing yields improve for both companies.[/citation]

Are you serious?
How would you put two Fermi chips, each hitting 96°C on load and 86°C at idle and drawing almost 260 W, on one card? You'd need four power plugs. Even Fudzilla (Theo), the biggest Nvidia supporter right now, says such a card is "not in plans."
Also, cherry-picking benchmarks isn't exactly fair to everyone; I could easily go to a pro-ATI website and show the 480 BARELY beating a 5870 (by less than 2%) to counter your point if I wanted...
 

manitoublack

Distinguished
Jan 23, 2009
109
0
18,680
[citation][nom]Ien2222[/nom]This isn't Tom's Games Site yet.[/citation]

No, they killed Tom's Games when Ben and Rob left :(

They (along with Fringe Drinking, Tech Darling and their Second Take show) have been sorely missed.
 


You mean like the 5870 knocked the 295 off on the 5000-series launch day? Apples and oranges... the 5870 competes with the 285 and the 480, not the 295 and the 5970. Can we skip the arguments fans took the flip side of six months ago?

It's just numbers, guys... he who has the biggest wins.
 
More DX11 numbers. In the tables below, "% Faster" and "% Price Incr." compare each card to the one in the row beneath it, and "$ per Frame" is price divided by average fps.

Hardware Canucks
AvP = 1920 x 1200, Very High Settings, Tessellation + Advanced Shadows, 4xAA, 16xAF
BFBC2 = 1920 x 1200, Highest Settings, HBAO Enabled, 4xAA, 16xAF
Metro 2033 = 1920 x 1200, High, Adv. DOF and Tessellation Enabled, 0xAA, 16xAF
Heaven = 1920 x 1200, High Shaders, Tessellation Enabled, 4xAA, 16xAF

Card Price AvP % Faster % Price Incr. $per Frame
ATI 5970 $725 63.31 12.63% 45.00% 11.45
nVid 480 $500 56.21 24.22% 42.86% 8.90
nVid 470 $350 45.25 1.14% -16.67% 7.73
ATI 5870 $420 44.74 16.91% 37.70% 9.39
ATI 5850 $305 38.27 NA NA 7.97


Card Price BFBC2 8AA % Faster % Price Incr. $per Frame
ATI 5970 $725 77.33 32.39% 45.00% 9.38
nVid 480 $500 58.41 15.85% 19.05% 8.56
ATI 5870 $420 50.42 7.71% 20.00% 8.33
nVid 470 $350 46.81 9.55% 14.75% 7.48
ATI 5850 $305 42.73 NA NA 7.14


Card Price Metro 2033 % Faster % Price Incr. $per Frame
ATI 5970 $725 44.84 15.06% 45.00% 16.17
nVid 480 $500 38.97 27.73% 19.05% 12.83
ATI 5870 $420 30.51 0.20% 20.00% 13.77
nVid 470 $350 30.45 15.60% 14.75% 11.49
ATI 5850 $305 26.34 NA NA 11.58


Card Price Heaven % Faster % Price Incr. $per Frame
ATI 5970 $725 40.20 34.90% 45.00% 18.03
nVid 480 $500 29.80 21.14% 42.86% 16.78
nVid 470 $350 24.60 0.41% -16.67% 14.23
ATI 5870 $420 24.50 23.12% 37.70% 17.14
ATI 5850 $305 19.90 NA NA 15.33

For those looking to play DX11 at high settings, if these prices hold, the 470 looks like the surprise of this launch.
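If you want to sanity-check (or extend) the derived columns above, they're simple arithmetic: "% Faster" and "% Price Incr." compare each card against the next row down, and "$ per Frame" is price divided by average fps. A quick sketch, using the numbers from the AvP table as input:

```python
# Recompute the derived columns of the AvP @ 1920x1200 table above.
# Input per row: (card, price in USD, average fps).
cards = [
    ("ATI 5970", 725, 63.31),
    ("nVid 480", 500, 56.21),
    ("nVid 470", 350, 45.25),
    ("ATI 5870", 420, 44.74),
    ("ATI 5850", 305, 38.27),
]

for i, (name, price, fps) in enumerate(cards):
    dollars_per_frame = price / fps  # "$ per Frame" column
    if i + 1 < len(cards):
        # "% Faster" and "% Price Incr." are relative to the next row down.
        _, next_price, next_fps = cards[i + 1]
        pct_faster = (fps / next_fps - 1) * 100
        pct_price = (price / next_price - 1) * 100
        print(f"{name}: {pct_faster:+.2f}% fps, {pct_price:+.2f}% price, "
              f"${dollars_per_frame:.2f}/frame")
    else:
        print(f"{name}: ${dollars_per_frame:.2f}/frame")
```

Running this reproduces the table: the 5970 is 12.63% faster than the 480 for 45% more money, and the 470 comes out at $7.73 per frame, which is why it looks like the value pick.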
 
Benchmark Reviews
BattleForge = 1920 x 1200, Very High Settings, No SSAO, 8xAA, Audio MT

Card Price BF % Faster % Price Incr. $per Frame
nVid 480 $500 82.50 18.03% -31.03% 6.06
ATI 5970 $725 69.90 46.54% 72.62% 10.37
ATI 5870 $420 47.70 17.49% 37.70% 8.81
ATI 5850 $305 40.60 NA NA 7.51

More reviews

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/36.html
http://www.hardwareheaven.com/reviews.php?reviewid=950&pageid=24
http://hothardware.com/Articles/NVIDIA-GeForce-GTX-480-GF100-Has-Landed/?page=17
http://www.legitreviews.com/article/1258/16/
http://www.anandtech.com/video/showdoc.aspx?i=3783&p=20
http://benchmarkreviews.com/index.php?option=com_content&task=view&id=480&Itemid=72&limit=1&limitstart=16
http://www.tweaktown.com/news/14641/nvidias_directx_11_ready_gtx_470_and_gtx_480_cards_are_here/index.html
 

tamalero

Distinguished
Oct 25, 2006
1,227
242
19,670
[citation][nom]Ien2222[/nom]Wow, Tom's readers are a bunch of clueless kids now. Sigh, use to have rather intelligent people who used to discuss rationally but it's now populated by fanbois, gamers, and just stupid people.Let's get this straight, looking at the architecture of this card and driver/software support that Nvidia has been heading for, this card is NOT focusing on the gamer side, this card is geared much more towards the professional/scientific community. If you think otherwise, then you don't belong on here posting as you cannot think past what only matters to you. At $500, that's a bargin price for these communities to try out and use. You can bet that Nvidia is eyeing various aspects of the supercomputing market along with other profession applications such as from Adobe.I myself was hoping for better game results, however I also see where they are heading in the future. And currently, they are quite a bit ahead of AMD/ATI and Intel for that matter, a rack of these chips with mature drivers will beat anything those 2 companies could come up with for straight computational power.That aside, they also did a whole new rework for dx11 which ATI will have to do to keep up in the future.Toms, games and most of your benchmarks are NOT the end all be all of computing. I understand you don't do alot of enterprise hardware, but you should have realized what this card represents and as such included that in your review. As it sits, you have nearly everyone saying it's an epic fail when in fact it holds tremendous potential. This isn't Tom's Games Site yet.[/citation]
Problem is, this IS a GAMING card, not a pro card. If you want the full-fledged, non-crippled chip, you need to buy a Tesla or Quadro.
It's also being labeled as a GAMING CARD by NVIDIA, so that's why it's disappointing.
 
Guest
Am I missing something here?

Everyone complaining that the card is too hot: did you even bother to look at the temperatures of the other cards?
The 480 is only 2% hotter than the 5970 and 5.1% hotter than the 5870.

The power consumption, though, really is bad; Nvidia had better do something about it.
 
In all fairness, other aspects of the card must be considered.

http://benchmarkreviews.com/index.php?option=com_content&task=view&id=480&Itemid=72&limit=1&limitstart=16

While most consumers buy a discrete graphics card for the sole purpose of PC video games, there's a very small niche who expect extra features beyond video frame rates. NVIDIA is the market leader in GPGPU functionality, and it's no surprise to see CPU-level technology available in their GPU products. Fermi GF100 is also the first GPU to ever support Error Correcting Code (ECC), a feature that benefits both personal and professional users. Proprietary technologies such as NVIDIA Parallel DataCache and NVIDIA GigaThread Engine further add value to GPGPU functionality.

My primary use of a GFX card (I make my living with it) is AutoCAD, which, at this point in time, eliminates ATI's 5xxx series from consideration... I'm told the next set of drivers will address their 2D GFX problems. At the end of the day, I enjoy blasting a few monsters as much as the next guy, and my wife's gonna use that same box to do "her thing" with home movies, which leads to the next factor to consider: CUDA.

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/36.html

Finally, let's look at the value proposition. NVIDIA packs a set of features along with this card, such as CUDA, which is arguably the most popular GPU computation API among consumers. CUDA helps speed up video encoding, image fixing, among other media acceleration features, and that is a nice addition. NVIDIA doesn't leave out support for other APIs either, with support for OpenCL, and DirectCompute 5.0.

I'm not gonna agree with benchmarkreviews.com and say:

http://benchmarkreviews.com/index.php?option=com_content&task=view&id=480&Itemid=72&limit=1&limitstart=16

Conclusion: NVIDIA is back on top again, right where most gamers like to see them.

But I will say that I think both the ATI and Nvidia offerings are compelling, competitive products. I could recommend either the 480 or the 5870, though I think the 470 made the bigger splash.
 
Guest
I have been an Nvidia user for years now... 6, 7, 8 & 9 series cards. I waited and waited for Fermi. Three weeks ago I purchased the parts to build a new rig around the Asus P6X58D Premium mobo with an i7 920. I winced and bought the ATI 5870 1GB. Yesterday when I heard the 400 series was released I almost died. After reading the various reviews to hit the streets today, I smiled and breathed a sigh of relief. I did good.

No Regrets Here :)
 

qwertymac93

Distinguished
Apr 27, 2008
115
57
18,760
[citation][nom]nottheking[/nom]I sometimes wonder at the work of people here... Why would they vote down a post (page 6 of the comments) where one points out some typographical flaws in the article? Yes, I know that fixing them'd make nVidia's cards look (sorta) worse... But seriously, is it that bad a thing?At any rate, with 15 pages now, I'm starting to feel Tom's should find a way to re-vamp their comments section, 'cuz this is ENTIRELY too long to browse through page-by-page... Either options for skipping multiple pages, options for larger pages, or perhaps a "view all" option would be nice.While, like most people, (I think) I'm of the opinion that 3D gaming is perhaps a fad, it'd be nice to see a thorough, head-to-head comparison of them. Of course, that could mean benchmarks. The prospect of that sounds exciting, since we all know that benchmarks are the enthusiast's crack.[/citation]

Look up where it says COMMENTS in big red letters; right under it is the option to see the comment forums. After you click that, you have all those options you were asking for, and more. I know, I just blew your mind, right? :p
 
Guest
You lot are missing the BIG POINT >>> We enthusiasts are already looking at this card. We don't want more displays, we want 3D games, as they are so much better than simply more monitors... and guess what, if you want more monitors, then at least Nvidia is expanding 3D Vision to three monitors.

Power, heat, and cost mean nothing to us; we have big power-plant PSUs and big gaming chassis, and we understand that the reason for the heat is the size of the GPU.

Asus and EVGA are letting us overclock it with more power, and that will mean more heat, but do we care... yes, the prospect excites us.

We want the avant-garde system, which is 3D and HD, and we are willing to do whatever it takes for that. ATI, however, has no 3D, so I will never move over from Nvidia to them!
 

dragonsqrrl

Distinguished
Nov 19, 2009
1,280
0
19,290
[citation][nom]gamby1[/nom]
come on. My gtx 275 gets to around 100 degrees and it doesn't burn down anything cuz it doesnt hav anything to burn.
You realize 97C is not even in real gaming. It's just simulation.Imagine if you run crysis with all the settings crank up and tell me if you don't burn down anything[/citation]
You realize that synthetic benchmarks (3DMark, FurMark, Unigine Heaven, etc.) usually produce the highest temps, right? That's why reviewers often test load temps with these benchmarks: as a worst-case scenario. I think they even mentioned somewhere in the article that average gaming temps are significantly lower.
 
Guest
a question I'd like to see answered is: "What are the chances of unlocking the dormant SM core(s) and thereby the cards' full potential?"
 

dragonsqrrl

Distinguished
Nov 19, 2009
1,280
0
19,290
[citation][nom]gymeaper[/nom]You lot are missing the BIG POINT >>> We enthusists are looking this card already. We don't want more displays, we want 3d games as they are so much better than simply more monitors... and guess what if you want more monitors then at least Nvidia are expanding to 3d vision for 3 monitors.Power, heat, cost, mean nothing to us, we have big power plant PSU's, big gaming chassis, and understand that the reason for the heat is the size of the GPU.Asus and EVGA are letting us overclock it with more power and that will be more heat, but do we care... yes as the prospect excites us lot.We want the avant garde system, which is 3d and hd, and we are willing to do whatever for that. However ATI, no 3d, so never will I move over from Nvidia to them![/citation]
lol, I can already see what's going to happen... unfortunately, opinions like these get marked down, A LOT, in these comments. Good luck dude, just give it a few minutes.

And oh yeah, ATI does support 3D, although their solution isn't quite as mature as Nvidia's yet. As a result it hasn't had as much marketing or support up to this point.
 

newxmatrix

Distinguished
Jan 22, 2009
20
0
18,510
I AM KEEPING MY GTX 295.... :) WAITING FOR NEXT GENERATION OF FERMI.

MY GTX 295
IDLE TEMP IS 53C
LOAD TEMP IS 75C

I AM HAPPY WITH IT.
 

dragonsqrrl

Distinguished
Nov 19, 2009
1,280
0
19,290
[citation][nom]Guest[/nom]a question I'd like to see answered is: "What are the chances of unlocking the dormant SM core(s) and thereby the cards' full potential?"[/citation]
Not likely. After the massive number of 'unauthorized' Radeon 9800 to 9800 Pro upgrades, which required only a simple firmware tweak, I seriously doubt any graphics company will ever make that mistake again. Correct me if I'm wrong, but as far as I've seen, every high-end card since has had its unwanted shaders/cores disabled in hardware, making them impossible to re-enable.
 

kutark

Distinguished
Jan 10, 2007
193
0
18,680
As much as I would love to buy an ATI card, I simply refuse to until their driver team pulls its head out of its ass. I have had far too many problems with far too many ATI cards that I've NEVER had with an Nvidia card. Hell, the only driver-related issue I've had with any Nvidia card was the fan speed not auto-adjusting in-game on the 8xxx-series cards. The sad part is, it's not just me. Over the years of LANing, I've met SO many people with similar problems.

It really bums me out, because when you look at the pure hardware, ATI really is doing a damn fine job. But it's like the difference between having a Ferrari and a Saleen. Yeah, the Saleen is faster, but it constantly has weird little problems; it's uncomfortable, noisy, etc.

The point is, some people are willing to put up with the annoyances. Back when I was in my teens and early 20s and didn't care about having to do a driver wipe and reinstall every couple of weeks, I bought ATI cards. Now I don't wanna screw with it. So I buy an Nvidia card, and yeah, I pay a bit more for it, but it just works.
 

billyg45

Distinguished
Sep 8, 2008
16
0
18,510
When someone runs FurMark on a triple-SLI setup, the power plant will have to flip that switch to Auxiliary Nuclear, as in the National Lampoon's Christmas Vacation movie.
 