GTX480 / GTX470 Reviews and Discussion

PhysX isn't as useless as you might think. It's used to render physics effects in films, etc.

I think Transformers 2 used PhysX for when one of the robots smashed through the bridge, stuff like that.

Look for a PhysX technology demo and see what it's capable of, especially the fluid dynamics. If this were in games it would be unbelievable, but I think you would need too much horsepower to render effects like that in real time. Really impressive what it can achieve, though.


Edit to add: http://www.youtube.com/watch?v=iyg9HgiD8X0&feature=related

You're telling me that PhysX is useless??
 
PhysX is hardly as useless as ATI fanboys would want you to think. Current uses of PhysX in games are usually more atmospheric, and it's something that falls under "if you don't have it, you won't miss it".

Useless? No.
Something you want if you've never had it or tried it? Not at all.
Something you want if you've played good games with it? Yes.
I mean, it's like the ultra-high settings in most games, where it's just small details that are added and most people can play just fine without them, but if you've played the game with it, going without it can be a sad thing.

Also, is anyone else noticing how much crap nVidia's GPU does now: physics rendering, compiling code, streaming audio, doing 3D, and supporting multi-monitor? I wonder how good a gaming GPU people would have if GPUs just did shader and geometry work and that's all they were designed for.
 
Look, it wasn't just ATI's gambits that made the 480 look bad; the overhype also killed what we expected. If they had just said the cards would be identical, I guarantee you most of the sites would be praising them a lot more.

I blame our parents for not beating children!
 
God, stupid, stupid, stupid....

PhysX =/= GPU-accelerated physics in general

PhysX = a dead proprietary API

It doesn't have any meaningful future and is useless as long as it does not work on ATI cards, because physics is NOT DENSER PARTICLE EFFECTS. Physics is how a wall crumbles when shot with different ammunition, how buildings collapse when a central pillar is destroyed, etc. All of this stuff affects gameplay and is important; a game developer cannot release a game that does this via PhysX GPU acceleration, because ATI cards would not be able to do the same, changing the gameplay and alienating customers.

DX11 compute shaders and OpenCL can do everything PhysX can, and more, but can do it on both companies' products.
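To make that concrete, here's a minimal CPU sketch (in Python, purely illustrative, not from any real engine) of the kind of per-particle update a DX11 compute shader or OpenCL kernel would run in parallel, one thread per particle. Nothing in it is tied to PhysX, which is the whole point:

```python
# Semi-implicit Euler integration with a simple ground-plane bounce.
# On a GPU this body would be a kernel dispatched over thousands of
# particles at once; names and constants here are made up for the example.

def step_particle(pos, vel, dt, gravity=-9.81, restitution=0.5):
    """Advance one particle by dt seconds."""
    vx, vy = vel
    vy += gravity * dt          # apply gravity to velocity first
    x, y = pos
    x += vx * dt                # then move with the updated velocity
    y += vy * dt
    if y < 0.0:                 # crude collision with the ground plane y = 0
        y = 0.0
        vy = -vy * restitution  # lose some energy on the bounce
    return (x, y), (vx, vy)

# Drop a particle from 10 m up with a little sideways drift.
pos, vel = (0.0, 10.0), (1.0, 0.0)
for _ in range(100):            # ~1.6 seconds of simulation at 60 fps steps
    pos, vel = step_particle(pos, vel, dt=0.016)
```

Any vendor-neutral compute API can run exactly this kind of loop on the GPU, which is why a proprietary path adds nothing a developer can actually ship on.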

If you think PhysX has a future and should be a factor in choosing a GPU, then you're an idiot, plain and simple.
 
Well, PhysX will always be a bonus and more of a luxury than a gaming need, unless nVidia starts distributing it. It will never catch on otherwise, and even then... it has a long road.
 
I just wanted to add that the physics in HL2 is more than enough for me. I don't know why those physics look better to me than even Crysis; I might be losing it!

Come on, HL2 Ep 3 in DX11 with 1000% tessellation and ray tracing and 3D Eyefinity!
 


edited: It's trolling, trying to stop the discussion with name-calling
 
PhysX will probably be replaced by DirectCompute and OpenCL from DX11. All of these are synthetic benches, since when PhysX is enabled the CPU has much less work to do, but there's no way that would happen in any real game. Yes, the 5970 availability is bad, but not as bad as when it launched, lol; lots of misleading websites showed stock when they didn't even have the cards.

BTW, I've had PhysX since the GTX280 launch. Yes, it adds effects, but nothing essential; more like extra effects in a game that could easily be done on any modern CPU.

Crysis 2 + PhysX ? :lol: Never going to happen.

Havok is more advanced in terms of experience, and it has already been demoed on the GPU with DirectCompute/OpenCL. Havok is also used on consoles, btw.

Being proprietary, as stated before, will kill PhysX.

I still don't understand this: the 250W-TDP 480 is sucking more power than a 300W-TDP card, the 5970.
So either the TDPs are lies, or the 5970 is pulling under 250W? :bounce:

LoL @ the pic Tempered81 posted.
 
Well, there's only one point in favor of PhysX: it put the physics topic on the table for devs. It showed the world what could be done by separating the physics engine from the game, allowing a 3rd-party API/engine to work its magic. So, as a first stone (correct me here if I'm wrong, please), PhysX really worked out nicely. Now, thanks to nVidia and ATI (according to an article saying ATI said "hell no!" to supporting PhysX, favoring its own and then open projects), it came to a stop in our eyes.

I like what we have now in terms of "physics acceleration", and it's developing fast. And remember, the whole point behind having "better video cards" has always been the EYE CANDY, and we can really tell that physics is just another form of eye candy, with the same importance as almost every other form of it (AA, AF, types of lighting, etc.).

Now, the GTX480... That thing is supposed to have a LOT of horsepower to handle both worlds fine: heavy OpenCL/CUDA/DC/PhysX calculations and DX11/OpenGL/DX10 eye-candy goodness. Down the road we're supposed to see what I said above put into practice, because my bet is that OpenCL and DirectCompute use is little to none in current games, and the GTX4xx will show its fangs when we get there; but that's just my interpretation of things. Right now, the GTX480 is a lackluster gaming card for games (as obvious as it may sound, it ain't for a lot of folks).

Better drivers and more DX11 games using OpenCL/DC is what the GTX480 needs... Maybe... 😛

I'm curious as to what the 5xxx series do in terms of OpenCL and DirectCompute right now...

Cheers!

EDIT: Typo xD
 
Yep, the 480 will be the GPGPU king as expected, but sadly not as a gaming card; well, not in the near future.

Fermi will even be up to 5x faster than the 5870 in such applications. Note that the 285 is already twice as fast as a 5870 in GPGPU.

Here's the 480's DirectCompute and OpenCL perf.: impressive! 😱

http://anandtech.com/video/showdoc.aspx?i=3783&p=6

It has already been said by some nVidia workers in the past that Fermi would be a GPGPU house with C++, but nothing was said about games other than the false benchmarks.
Fermi will revolutionize the GPU world toward uses other than gaming. But again, as a gaming card it doesn't live up to the hype of 30-50% over the 5870.
 
Holy, I bought the 5970 for $629 Canadian when it launched in November, and now it's $800 US, lol? I should have gotten 4 or even 5 and sold them; damn, I could have made $850 if I had gotten 5 back then! :pfff:

As for "GeForce GTX480: the fastest gaming GPU in the world", I think it should be more like "GeForce GTX480: the fastest gaming single GPU in the world".

BTW, some French website asked manufacturers about the Fermi TDP, and nVidia affirmed that those TDPs of 250W for the 480 and 215W for the 470 are the average load in games (even though we are seeing higher load than that, and higher than the 5970, even in games; so what game mix did they use? No one knows), and that during heavy stress like FurMark the cards can go well past the TDP limit. All in all, these TDPs are NOT the max board load, which is unknown to date, probably around 300W for the 480. So is that what explains why there is no 480 with 512 SPs? The TDP barrier is even broken when playing heavy games.

EDIT: Interesting, notty: on the posted website the 480 TDP is 300W and the 470 TDP is 225W; those look more logical than the 250W announced by nVidia.
 
nVidia fan writes: PHYSX IS SO GOOD, OMG LOOK AT METRO 2033 AND UNIGINE PERFORMANCE (mindexplosion)

ATI fans: uh... what about popular games like Battlefield?
 

That's what happens when a new champion is crowned.
 
If you water cool it there is zero sound. All computer parts should be water cooled.

The revised version shouldn't be as bad. But the card is a monster.

 
This thread has taken a downward slope. I guarantee you that there is a lot of nVidia hate going around. But that's how it is; there was ATI hate during the 2900 XT days. But people are starting to bring morals into play, such as business practices, which pisses me off. I guarantee you, if each ATI card cost the life of one dog or animal a day, people would still buy them. It's just nVidia season; hopefully it's wabbit season next year.

I'm going to say this one more time: whatever people say in the forums now will change within a month, and these cards will sell.

SLI scaling and driver support will see to that. ATI may have the better product, but nVidia has the better driver engineers. I don't care what anyone else says, I stand by that statement. I've been an ATI fan my whole life and there's something about their cards that makes me choose them, but they definitely don't have the same hit as during the 9800 XT era. A big soft-modding era, lol.

Anyways, please be civil, people. Screw arguing over features, because you'll never win; features are subjective benefits that don't deserve to be scored. You'd have to be a jackass to tell someone to buy Eyefinity if they want 3D.

nVidia made decent cards, but they come at a price; they are the single-GPU kings just like ATI are the single-slot kings.

Now play nice. If you want to start talking trash, open up another thread, but don't start acting immature in the official thread.

Anyways, I also wanted to add that I'm not pointing fingers, I'm just saying.