PAX: What Gamers Think of Nvidia's GTX 480


rmse17

Distinguished
Dec 7, 2007
While I am a 100% nVidia guy (Riva 128, GeForce 256, GeForce 3 (pre-Ti), GeForce 6600GT, GeForce 7950, GeForce 8800GTS512, GeForce GTX285), I do not plan on upgrading to either of the current options. As you may deduce from my card history, I generally buy in during the refresh of a technology, either at the die shrink or at some other optimization of the series. I did consider the Radeon 5870, but I believe ATi's drivers are generally of lesser quality than nVidia's on Windows, and on Linux nVidia is the clear winner. However, I am a fair-weather fan: if ATi improves its Linux drivers, and/or nVidia does not come up with a good refresh of the 480 monster, I might just jump ship. (For the first time, I went with an AMD Phenom II over an i7 920 when both became available, breaking my 100% Intel stand.)
 

tykus

Distinguished
May 7, 2008
[citation][nom]williamdbradley[/nom]The greatest part of these debates is when the morons cry "Nvidia drivers are better!" Tell that to the people who have fried video cards thanks to the drivers not speeding up the fan when gaming; I tend to think they'd disagree with ya. Honestly, ATI spanked Nvidia this go-round. ATI used to get spanked by Nvidia until they cleaned up their act and started working smarter. Now it's Nvidia's turn to build a better product. I hope Nvidia's next card kicks ATI's ass. No company needs to stay on top; if one does, we get overly inflated prices and crappier cards.[/citation]

I'm almost sure this is what burned out my 8800 GTS.
 

Jarmo

Distinguished
Jan 28, 2009
Thinking Nvidia should start looking into a die shrink of the GTX 275 some time soon.
That 8800 can be renamed only so many times before it starts to look a bit long in the tooth,
and the GTX 4xx is not going to transform into a mid-range part soon, not without an act of God.
 
As a gamer, the benchies and power demands of these cards just don't add up in terms of value.

Cost ... too high to recommend ... and it isn't any faster than the competition.

Thermals ... any gamer wanting to run a sustained session with one of these puppies in a blinged-up case had better look at the following:

Front and rear 120mm fans with no filters obstructing airflow, as well as a side fan and a blowhole;

Nice and tidy cabling inside to maximise airflow;

A quality 650W PSU, considering the CPU and associated bits you already have in the box ... more if you're running one of those i7's or a PII ... definitely if either is overclocked;

Making sure the case isn't jammed up against a wall or anything else that restricts airflow;

Current home and contents insurance to cover fire damage ... and get a new battery for your smoke detector while you're at it.

I imagine NVidia will shortly release a new driver package claiming a dramatic improvement in performance ... with a sneaky drop in image quality ... that's what they've done in the past.

 

roleki

Distinguished
Mar 1, 2010
[citation][nom]a4mula[/nom]CUDA, PhysX? Please. I'll take 6-monitor Eyefinity (on a single card, mind you ... lol @ Nvidia's 3 monitors only with SLI) over any gimmick Nvidia can cook up.[/citation]

This. People are more likely to have 6 monitors than they are to pick up a PhysX-enabled game.

Er, wait.
 

LukeyB

Distinguished
Mar 29, 2010
I feel the 470 offers a great price/performance ratio, but I'm so disappointed with the 480 that I'm switching from a GTX 260 to a 5870.

Keep in mind I have used Nvidia for 9 of my last 10 graphics cards.
 

Agges

Distinguished
Dec 19, 2009
[citation][nom]J3d1M1nD7r1cKs[/nom]But one thing is for sure, both companies need market share: they have their hands in too deep, and this is the generation where we (consumers) and developers will choose the standards the virtual industry will live with.

And THAT is why I will be promptly picking up a GeForce GTX 470.[/citation]

So you buy a card to make sure that one of the major competitors gets recognition, and not for your own benefit? Very philanthropic of you...
 

vaughn2k

Distinguished
Aug 6, 2008
"... Another negative reaction came from Brian R, near the Rockstar Games booth..."

It would be better if you removed the 'negative' framing... otherwise it might give the impression (as it did for me) that THG is biased against nVidia, and I think that's not good.
 

warezme

Distinguished
Dec 18, 2006
All this reading is interesting, but for the die-hard AMD fans: do you guys recall the 2900? A new architecture that showed up very late to the party, ran very hot, and could not compete with the then recently released 8800 series? I see a very similar scenario, but AMD should be concerned, because Fermi, unlike the 2900, is actually competitive with the single chips in its class. It wasn't until the 2900 was shrunk and later massaged into the 4000 and 5000 series, several generations later, that the architecture became mature and strong. If the Fermi architecture is this strong and competitive in its first iteration, what does it hold once it is also die-shrunk and performance-optimized?
 
Guest
I see it was not a big surprise, since nVidia does this most of the time. But keep in mind, like someone said, that their GTX 295 was just as hot and people still bought it; not to mention this card is about 20 dollars or so cheaper than that one. What I find funny is people complaining non-stop about the wattage issue and the price, and blah blah blah. Remember that they lowered the cores from 512 to 480 or 488, I can't remember, and that was just to salvage yields, so at least be grateful that the card is even LIVING. That way ATI can lower its prices, though not at the moment of course. P.S. "Aw geez, nVidia is pricier than ATI, I'm taking ATI" ... uh, old news, anyone? It's only an $80 difference or so; don't be that angry about it.
 

HavoCnMe

Distinguished
Jun 3, 2009
The benchmarks were not impressive, to say the least. The GTX 295 did better in most of the games they benchmarked... very pathetic. Seems like Fermi/GF100 was not worth the wait. Maybe the performance bottleneck is the 384-bit memory interface, just a thought. The GTX 260 had a 448-bit memory interface.
 

amstech

Distinguished
Feb 9, 2010
Nvidia bombed with the GTX 480.
It gets matched in most games by GPU setups that cost much less.
My 5770's in CrossFire match a GTX 480 in several games (Techspot.com).

The performance, a couple of games aside, is absolutely terrible.
Nvidia had better readjust their prices.
 

kartu

Distinguished
Mar 3, 2009
[citation][nom]pratkal[/nom]eodeo: nVidia cards are not better for 3D gaming, but they are impressive nonetheless. The thing they are indisputably better at is CUDA, and that's not a negligible thing for me. I'm having a hard time deciding whether I should trade my 4850 for a 5850 or a GTX 470. The 5850 would be a clear winner if it could run CUDA apps, like vReveal and the upcoming Mercury engine for Adobe Premiere CS5... The HD 5850 is ATI Stream ready, which makes it able to run the Mercury engine and Adobe Premiere CS5 on the GPU[/citation]

Why on earth should ATI support nVidia's proprietary stuff? Not to mention, I am curious, what "CUDA apps" do you use...
 

ajcroteau

Distinguished
Jun 18, 2008
I attended PAX East Boston on Sunday, and I thought the GTX 480 looked great... The 3D gaming was just unreal. Hopefully I can get a 120Hz monitor and the 3D gaming glasses soon myself.
 

pratkal

Distinguished
Jul 28, 2009
[citation][nom]zipzoomflyhigh[/nom]I don't think we'll see a Dual gpu 480.[/citation]

We might see a dual Fermi, but maybe based on something like a GTX 450, or maybe on a laptop version of Fermi.
 

eodeo

Distinguished
May 29, 2007
[citation][nom]kartu[/nom]Why on earth should ATI support nVidia's proprietary stuff? Not to mention, I am curious, what "CUDA apps" do you use...[/citation]

1. I'm using vReveal fairly often. In fact, I was so impressed with this little program that I adopted it as a standard part of my visual effects classes. vReveal lets you stabilize shaky footage, such as footage shot from handheld devices... while running. It also lets you upscale small resolutions 2x by analyzing sequential frames to extract relevant video information and rebuild detail.

As I don't own an nVidia card, the whole workload is done by the CPU, which takes its sweet time. My C2Q @ 3GHz does about 5fps on SD footage. From what I've seen, a GTX 260 does it faster than realtime (30+ fps).

2. mental ray, the most widespread and most used high-end renderer across 3D packages, just got an "iray" extension. It supposedly allows users to use an nVidia GPU to render things. From what I've read, it's not perfect yet, but it's an amazing start nonetheless.

3. "Mercury player engine" for Adobe Premiere CS5 (to be released in april of this year) uses CUDA on gtx 260 to allow realtime full HD playback and manipulation, opposed to highend 16 core system that does 2-5fps on the same footage.

4. And, to bring things closer to home: games with PhysX. There are more and more of these. The list goes on, and the most relevant thing to note about all of them is that they're all CUDA-bound (see the sketch at the end of this post). If ATI would support it, we could see more things like:

http://www.brightsideofnews.com/news/2010/3/16/ati-radeon-hd-5970-is-the-king-of-iphone2c-wi-fi-password-cracking.aspx

Please note that the Tesla S1070 in that graph has 960 cores (4x 240) in total. See how far behind it is in that kind of GPU crunching compared to the ATI 5970? Also note that the 4870 is actually faster than the GTX 285. This is why I want to use ATI hardware and CUDA software. Why? Because ATI has no software alternative except a few confident ideas on paper, while CUDA is here, ready and accounted for. As soon as OpenCL kicks in, all will be well... until then, I want what works now. Funny how living in the present is.
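For anyone wondering what "CUDA-bound" actually looks like, here's a minimal sketch of the one-thread-per-pixel, data-parallel pattern these apps are built on. To be clear, this is a toy illustration, not vReveal's or Adobe's actual code; the kernel, frame size, and gain value are all made up.

[code]
#include <cstdio>
#include <cuda_runtime.h>

// Toy kernel: one thread per pixel, each doing independent work.
// This is the data-parallel shape CUDA video tools exploit; it is
// NOT any shipping product's code.
__global__ void brighten(const unsigned char *in, unsigned char *out,
                         int n, int gain)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        int v = in[i] * gain;        // per-pixel work, no dependencies
        out[i] = v > 255 ? 255 : v;  // clamp to 8 bits
    }
}

int main()
{
    const int n = 720 * 576;  // one SD frame, 8-bit grayscale (assumed)
    unsigned char *in, *out;
    cudaMalloc(&in, n);
    cudaMalloc(&out, n);
    cudaMemset(in, 100, n);   // stand-in frame data

    // Launch ~400,000 threads: one per pixel, 256 per block.
    brighten<<<(n + 255) / 256, 256>>>(in, out, n, 2);
    cudaDeviceSynchronize();

    printf("processed %d pixels\n", n);
    cudaFree(in);
    cudaFree(out);
    return 0;
}
[/code]

A quad-core CPU pushes those ~400,000 pixels through four threads, frame after frame; a GPU keeps thousands of such threads in flight at once, which is the 5fps-versus-realtime gap in a nutshell.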
 
Guest
I'm only gaming at 1440x900, so I see the GTX 470 is more than enough to replace my old 9600GT. And surely I need PhysX, 3D gaming, and DX11, and I can't wait to see raytracing in action. I think each person has their own needs and interests, so pick your product based on that, as long as it is affordable for you.
Hmm... it might be nice to grab an overclocked version of the GTX 470, if there is any.
 

lowguppy

Distinguished
Apr 17, 2008
Personally, I think it's priced about right for what it is designed for: three large, 3D-capable displays. If you're willing to invest in those, you will pay for the card that can drive them to the fullest. If you have a single display, you shouldn't spend more than $200 on a video card anyway.

I was at the launch, and blowing a bridge into 1 million particles was quite cool, but I wasn't entirely convinced Drew was sober, the way he tripped over the name of the card.
 
Guest
eodeo:
Why doesn't ATI support CUDA, or even push their own ATI Stream technology? Because they don't have the right architecture to support it. They are faster than the GTX 285 (like you said) for nothing other than graphics, not GPU computing. I think it is clear why Fermi is so hard to make and so power hungry... maybe because it is very hard to make a GPU that is a graphics accelerator and, at the same time, a computing machine (like a CPU is).
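To illustrate what separates a "computing machine" from a pure graphics accelerator, here is a minimal, hypothetical CUDA sketch of a shared-memory reduction. On-chip scratchpad memory and thread barriers are exactly the compute-oriented features at issue; classic pixel shaders needed neither. This is an illustration only, not anyone's production code.

[code]
#include <cstdio>
#include <cuda_runtime.h>

// Block-wide sum using the __shared__ scratchpad and __syncthreads()
// barriers, i.e. "compute" hardware features rather than graphics ones.
// A hypothetical sketch for illustration only.
__global__ void blockSum(const float *in, float *out, int n)
{
    __shared__ float buf[256];
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    buf[threadIdx.x] = (i < n) ? in[i] : 0.0f;
    __syncthreads();

    // Tree reduction within the block: 256 -> 128 -> ... -> 1.
    for (int stride = blockDim.x / 2; stride > 0; stride /= 2) {
        if (threadIdx.x < stride)
            buf[threadIdx.x] += buf[threadIdx.x + stride];
        __syncthreads();
    }
    if (threadIdx.x == 0)
        out[blockIdx.x] = buf[0];  // one partial sum per block
}

int main()
{
    const int n = 1 << 20;  // 1M floats (arbitrary)
    float *in, *out;
    cudaMalloc(&in, n * sizeof(float));
    cudaMalloc(&out, (n / 256) * sizeof(float));
    cudaMemset(in, 0, n * sizeof(float));  // dummy input

    blockSum<<<n / 256, 256>>>(in, out, n);
    cudaDeviceSynchronize();

    printf("computed %d partial sums\n", n / 256);
    cudaFree(in);
    cudaFree(out);
    return 0;
}
[/code]

Threads cooperating through fast on-chip memory like this is what GPGPU code leans on, and it is a big part of why a chip like Fermi, with its caches and shared memory, is harder to build than a pure pixel-pusher.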
 

terr281

Distinguished
Dec 22, 2008
I, too, was hoping Nvidia's Fermi release would be better, and that this in itself would allow for price competition. Alas, it will not be so, due to Fermi's power requirements, heat, and people who are afraid of SLI or don't have motherboards capable of it. (I personally don't see a problem with Nvidia's two monitors per card for 3-monitor setups. It allows people to buy a decently priced monitor without a DisplayPort connection / a $100 adapter.)

But the real problem with Fermi's release is in the budget/low-end enthusiast area. (The exact same problem occurs in the Intel/AMD CPU battle.) Nvidia doesn't currently produce a new-generation card in the ~$100-$200 range that compares to an ATI card. Further, SLI/Crossfire allows those who buy cards in that budget to upgrade later. (Perfect example: the last SBM here, where the budget machine had two ATI 4850s because they were the best bang for the buck.) Many people build machines with one card, then buy a second card a year later.

The only options now for these builders are the 5750 or 5770. If Nvidia were smart, they would keep producing the GTS 250 & GTX 275... and price them a bit cheaper than the aforementioned ATI cards. (The 3-monitor setup will be possible in SLI now/in the future with Nvidia; the only real "loss" for current games is... nothing... since DX11 is a dream for most games at this point.)
 

Drag0nR1der

Distinguished
Oct 7, 2007
Interestingly, the AnandTech review (a website that I have a lot of respect for when it comes to reviewing hardware, due to their clear depth of knowledge, thorough testing, and impartiality) shows that when it comes to minimum frame rates, both the 470 and 480 are way ahead of the 5870, irrespective of whether the 5870 beats them in average fps or not.

For me this is the most interesting of the results they show, mainly because it is these low dips in framerate that tend to spoil my gaming experience, not the general gameplay... you know what I mean, those times when the game stutters for a few seconds as it's hit by a lot of fancy graphic effects.

I'd take a slightly worse average but a better minimum framerate over a higher maximum with lower minimums (within reason, obviously).
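To put made-up but plausible numbers on that: imagine a 100-frame stretch where 99 frames each take 16.7 ms (60 fps) and one stutter frame takes 100 ms. The average is still 100 frames over about 1.75 s, roughly 57 fps, barely dented... but the minimum instantaneous rate is 10 fps, and that single hitch is exactly what you feel.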

So I'm kinda left thinking 470... or 5870... but I do a lot of CAD work too, and CUDA-based GPGPU looks like it has some good things actually being produced for 3D Studio... of course, if that moves over to DX11's native GPGPU support, then I'd be even harder pressed to decide :D

The thing that really puts me off is how power hungry the 470 and 480 are; that, and the price... when you can get a 5850 for around £230, that's mighty tempting...
 
Guest
I'm not an Nvidia fanboy; I like the choices, but...

Question: what happens when developers start designing games around Nvidia's strengths (i.e. tessellation, PhysX, and straight-out GPU computing), since Nvidia is working WITH developers?

Even though the term "futureproof" doesn't really exist in the electronics world, could Nvidia's solution be more futureproof than the competition's?
 