GTX480 / GTX470 Reviews and Discussion


Head on over to the o/c section and start spamming about power use there; here's an example of an o/c adding 120 watts. What's that, 10-20 dollars a day? /sarcasm.
[image: psu_load_power.png]
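For scale, here's a quick back-of-the-envelope on what that extra 120 W actually costs. The daily gaming hours and the $/kWh rate are assumptions for illustration, not figures from this thread:

```python
# Rough cost of an extra 120 W of overclock power draw.
# hours_per_day and rate_per_kwh are illustrative assumptions.
extra_watts = 120
hours_per_day = 8        # assumed daily gaming time
rate_per_kwh = 0.12      # assumed electricity price in dollars

kwh_per_day = extra_watts / 1000 * hours_per_day
cost_per_day = kwh_per_day * rate_per_kwh
cost_per_month = cost_per_day * 30
print(f"{kwh_per_day:.2f} kWh/day, ${cost_per_day:.2f}/day, ${cost_per_month:.2f}/month")
```

Even at a full 8 hours a day it works out to pennies per day, nowhere near "10-20 dollars a day".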

Yes, the world's fastest GPUs use the most electricity. How shocking?
 
The GTX 480 doesn't come close to the HD 5970 in most games (excluding a few titles and the Heaven benchmark).
Also, about PhysX enabled vs. disabled: with PhysX disabled, cards perform normally, as they should, but when PhysX is enabled both Nvidia and ATI cards take a performance hit compared to PhysX off. ATI cards suffer a bigger hit than Nvidia cards, though.
 
And PhysX is only useless to you. You have a Pavlov response to the word; you just bark that same comment out. You probably don't even know that PhysX is used in Metro 2033:
http://www.youtube.com/watch?v=Bt8DEEEMTHw

Actually, the 5970 uses less.

It's not the fastest GPU? Did you notice all the new launch reviews all over the web that read
"GeForce GTX 480: the fastest gaming GPU in the world"?
http://www.tweaktown.com/pressrelease/3034/galaxy_geforce_gtx480_the_fastest_gaming_gpu_in_the_world/
 



If PhysX is so useless, then please explain why ATI is rallying for an open physics initiative? Seems to me that if enabling more real-time physics in-game were JUST a marketing ploy, game developers wouldn't offer PhysX-enabled features in their games. Crysis, Cryostasis, and Warhead, just to name a few, all benefit from having PhysX enabled. I'm not saying it improves framerates; it definitely doesn't. But what it does do is provide more responsive physics effects while putting far less stress on the CPU. I can't be the only one who sees the benefits.
 


To shut up the Nvidia fanboys.
 

That's awesome, SV_Bubbles, thanks for that. It does a much better job of summarizing what I've been trying to say, in numbers which aren't ambiguous, rather than just me spewing at the mouth 😛


The fact that the 480 can come close to the 5970 at all, or beat it in one case, is pretty epic though; it speaks to the efficiency of nVidia's new architecture with things such as tessellation. I like that nVidia is the one actually trying to push the tech this time, whereas ATI just tacked DX11 onto their hardware without any real thought, just so they could be the "first ones" to use DX11. nVidia designed their arch to take full advantage of all the new DX11 goodies, among other things like better OpenCL support, better AA scaling, etc. I'm not saying the 5800s or the 5970 are bad cards; I'm just saying I'm more impressed by the new Fermi arch. It might have been a smart move by nVidia to redesign now and bite the bullet rather than wait like ATI is, 'cause they'll have to go through similar growing pains.


It isn't that it is useless; it is that nVidia and JHH are a55holes for trying to make it proprietary rather than making it an open standard so we can see what can REALLY be done with it. I like the idea of PhysX, but in its current incarnation it IS essentially useless to gamers.
 


I am with Notty here. It really DOES make a noticeable visual improvement. With PhysX enabled, watch an explosion, or look at the way the wind interacts with the environment. It does cause a framerate hit, but I am okay with that. With my sys specs, I can run Crysis, Warhead, and Far Cry 2 with 4xAA and PhysX enabled, all settings maxed at my monitor's native res (1440x900), and still get 55-61 fps. The only one I get less with is Cryostasis @ 30-40 fps.

My specs: CPU: Phenom II X4 20 @ 3480MHz; CPU cooler: Coolermaster V8; RAM: 8GB Patriot DDR2 8600 @ 960MHz, 5-5-5-12-24; GPU: MSI GTX 275 Lightning ed. @ 745-1583-1189 clocks; PSU: Sigma Monster 750W
 

LOL, at the time I bought the monitor, it was the best I could get (2006). It seemed more important to me to upgrade other components as I could afford them. Being out of work for so long, I do what I can. This monitor is just fine for now.
 

*Shrug* why not?

I'm probably going to Xfire a couple of 5850s or SLI some 470s, and I'm gaming at 1900x1200, which is mad overkill, but meh, it'll be great for newer games coming out, heh.

Did you want me to donate to charity or something?
 


So lingering smoke and unrealistic particle effects again? Tell me, Notty, how many times will you notice this while playing? Probably never, but we both know you will defend this dead API to the death.



If ATI were able to get support for PhysX, it would have a future and would not be a proprietary, useless piece of software. No one is going to do anything exciting or game-changing with PhysX if it won't work on ATI cards; it's impossible.

Oh, and Crysis and Crysis Warhead do NOT support GPU-accelerated PhysX. One of their many physics APIs is PhysX, but it is all done by the CPU, not the GPU. Cryostasis shows very little benefit from GPU-accelerated PhysX, yet it manages to cut the framerate in half, so that is a poor example.

You aren't the only one who sees the benefits of GPU-accelerated physics; I do too, and everyone else for that matter. PhysX is not the answer, because it is proprietary and only works on nVidia cards. This means game developers cannot do anything flashier than extra particle effects, which are useless. Take heart, however, that there are answers out there. DX11 has compute shaders, which can do anything on a DX11 GPU that the CPU can. There is also OpenCL. PhysX is dead; there is no way around that.
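For anyone wondering what these GPU physics kernels actually compute: per-particle integration, the same math whichever API runs it. Here's a minimal sketch in Python; on a GPU (compute shader, OpenCL, or PhysX) each particle's update would run in its own thread, and a plain loop stands in for that here. All names and numbers are illustrative, not from any real API:

```python
# Per-particle update that a GPU physics kernel would run in parallel.
# Semi-implicit Euler: integrate velocity first, then position.
GRAVITY = -9.81  # m/s^2, acting on the y axis (assumed constant)

def step_particles(positions, velocities, dt):
    """One simulation step over all particles (2D for brevity)."""
    for i in range(len(positions)):
        x, y = positions[i]
        vx, vy = velocities[i]
        vy += GRAVITY * dt          # acceleration -> velocity
        x += vx * dt                # velocity -> position
        y += vy * dt
        if y < 0.0:                 # crude ground plane: bounce with damping
            y, vy = 0.0, -vy * 0.5
        positions[i] = (x, y)
        velocities[i] = (vx, vy)

# One falling particle simulated for 10 frames at 60 fps:
pos = [(0.0, 10.0)]
vel = [(1.0, 0.0)]
for _ in range(10):
    step_particles(pos, vel, dt=1/60)
print(pos, vel)
```

The point is that this per-element loop is embarrassingly parallel, which is exactly why compute shaders and OpenCL can take it over from the CPU.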
 
Honestly, sometimes people throwing the word "overkill" around bothers me. When I had the 260 SP 216 Maxcore, I couldn't play STALKER fully maxed, or Crysis, or even Just Cause in massive areas, and this was all at 720p. Even BF2 had drops. Funny thing is, in Serious Sam HD the 5870 lagged to 30 fps in certain areas while the 260 kept to the high 70s fps. Of course, that was the only game where that ever happened that I tested.

Trust me, "overkill" is not a word to use; it depends on the game the person wants to play. I'm pissed about the 5830 from ATI... it's almost as expensive as a 5850 here... very excited about the 5870 refreshes, aka 5890 :).


P.S.

Games load faster on the 260, and I've been told it's due to the shaders.
 


Well... I'm still a big fan of the green team, but they pissed me off with the GTX 480 having 1/8 double precision performance (1/2 is only on the expensive Tesla cards with Fermi).

They crippled d.p. on the GTX 480 by a factor of 4 just to gouge on the high-end cards.
I hate that (I wanted to buy lots of GTXs for a cluster; now I don't know - must recalculate).
Sure, it's my fault, since nVidia never promised full-speed d.p. on GTXs, but...
 
The cards aren't as bad as people's hype and expectations. Look at the 6-core... overhyped and useless for 99% of daily users' gaming. Honestly, with the NV fanboys playing up the card, it was bound to flop under some of the ridiculous statements. The only thing that scares me is the heat of the cards, but what I'll do is turn up the fan past stock and hope that it peaks at 80 to 85. I'm sure the refresh cards will be much cooler. I'm either picking it up near the end of this week or the end of next week, depending on the shipments. I heard my friend is only getting 3, and I got first dibs on the EVGA haha.
 


yep, 5830=epic fail.

And I just thought that on a 19" screen, an i7 + GTX 275 was pushing it a bit...
 
Trust me, man, it isn't. I used to say the same thing until I saw what high-quality shaders and AA do. It was unbelievable seeing Just Cause 2 run at 21 fps at 720p during the city level... And I'm pretty sure a 19 inch will be a higher resolution than 1360x768 (prob 1280x1024 or 1440x900, around there).
 
It's very weird; it's pretty much dependent on the user. I personally crank it up only 2x even if the card can do more, just because, in case I keep the card, I don't get as depressed when new games come out hahah.
 



Haha, my case is way louder than yours... and I had a satisfying feeling when I saw the moderator here has the same graphics card as mine :)
 
Sorry, I'm on my iPhone; it predicts what I'm trying to say sometimes and writes it in when I push space, lol, so if it doesn't make sense I'll correct it at home.

No, my GTX 480 isn't here yet, but I fully paid it off already and will pick it up when it's in.