Analysis: PhysX On Systems With AMD Graphics Cards

I guess dedicated PhysX is a good way to make use of that old 8800 that many people seem to have when they decide it's time for an upgrade.

When I upgrade from my current dual GTX460 1GB setup, I'll probably end up using one for dedicated PhysX, regardless of whether I decide to take the red or green path (unless, of course, I sell both for the desperately needed cash).
 
There's no question that at least some games are crippled for CPU-based PhysX, no doubt at nVidia's behest. Batman: Arkham Asylum is one of them.

My setup is a 5970 4GB with an old 8800 GTX for PhysX. With the latter disabled and PhysX set to high, a benchmark run at 2560x1600 with 4xAA gets 11/33/19 for min/max/avg. CPU usage never goes above 25% on my i7 950.

With the 8800 GTX enabled, the results are 46/122/89, and CPU usage is slightly higher throughout the run, maxing out at around 30%. During one sequence, where thugs kick papers on the ground, the CPU-based PhysX had 16% CPU usage, while GPU-based PhysX had 18-20% CPU usage.

Mind you, the CPU figures are based on eight logical CPUs, so the most charitable interpretation is that the usage with hardware PhysX disabled is up to 50%, or two whole cores with no HT benefit at all.
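To make that arithmetic explicit, here's a rough back-of-envelope in Python (assuming the i7 950's 4 physical cores / 8 logical CPUs and the ~25% usage figure quoted above):

[code]
# Rough back-of-envelope: convert the reported logical-CPU utilization
# into physical-core equivalents. Assumes an i7 950: 4 cores, 8 logical CPUs.
logical_cpus = 8
physical_cores = 4

reported_usage = 0.25                                 # ~25% across all logical CPUs
busy_logical = reported_usage * logical_cpus          # = 2 logical CPUs' worth of work

# Charitable reading: those two busy threads land on two separate physical
# cores and get no Hyper-Threading benefit, i.e. half the physical cores.
physical_utilization = busy_logical / physical_cores  # = 0.5 (50%)
print(busy_logical, physical_utilization)
[/code]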

Mirror's Edge shows similar performance crippling with CPU-based PhysX, despite having very little PhysX load to speak of.
 
@ smartroad: I was going to say something like that myself.

To a rather BIG degree, Nvidia is shooting themselves in the FOOT, again. The lack of PhysX isn't going to keep me from buying an ATI card... if I want one. I use both, but in general, nVidia's marketing and naming games are a huge turn-off.

Their cards are still hotter and are having trouble in the market after the Fermi debacle. Come on, there is now the GTX 580, which is really what the GTX 480 should have been - but it could have been called the GTX 480, as it's STILL a Fermi-series chip?!

Geez, in the space of 19 months or so (March 2009 to Nov 2010), Nvidia has gone through FIVE (count them) *5* model series of products... 100, 200, 300, 400 and now the 500 series. Yes, the 100/300 series are pretty much re-treaded/relabeled older tech for the OEM market, but HOLY screw-and-confuse-the-customer-base, BATMAN. AMD has been far more consistent with their technology and their model numbers.

AMD allows nVidia to make chipsets for AMD CPUs, but since AMD started using ATI chipsets, nVidia has been losing market share for various reasons, and of course Intel shut nVidia out of future chipsets beyond Core 2. nVidia is in a weak position... and should do some actual thinking.

SO, with that said, a SMART thing for NVIDIA to do IS to allow ATI/AMD GPU owners to EASILY add a second Nvidia GPU without making them jump through hoops. A sale is a SALE, but making it so the average user can't simply add PhysX to their AMD system is plain stupid. Imagine this... someone has an AMD-chipset (Fusion) mobo and adds a low-cost PhysX Nvidia GPU card. That's a sale!

Yes Nvidia, we KNOW that PhysX can actually run on an ATI GPU and understand your restriction.

But act like you guys have more than half a brain.

- Open the driver to allow ATI owners to run an Nvidia GPU as a dedicated PhysX card.
- SELL two kinds of PhysX cards, i.e., use the EXACT same GPU as the GT 240 for a low-end model and a GTX 460 GPU for a higher-end one. They wouldn't have output ports or SLI connectors.
Example products:
PhysX 420X = $75 (GT 240-class GPU with 256MB)
PhysX 450X = $140 (GTX 460-class GPU with 512MB RAM)

Then Nvidia can SELL these to BOTH GeForce and Radeon users as a simple add-on.
As proven, an ATI 5850 + a $75 GT 240 will run very well compared to a GTX 480...

nVidia... do something smart.
 
[citation][nom]compton[/nom]This is a disaster in some respects. …. Hopefully nVidia knows something I don't.[/citation]

Would that be they know PC component consumers are mugs?
 

So do you buy things and then give them away for free?
 
???

A full GT240 costs about $75~85. The GTX 460 is about a $140~180 part.

Nvidia makes a profit on each product sold... it's their problem and their right not to expand their market share. If there were a survey, I bet a large percentage of ATI 5700~5800 users would easily drop $75 for a PhysX card that was hassle-free and supported.
 

So what, do you buy stuff and then give it away for free?
 
It seems that a GeForce card increases the performance of PhysX-supported games. I have a Radeon HD4850 card. If I add a GeForce GT240, would the combination improve my gaming experience?
 
Yes, it would in PhysX games, but officially it would not be supported... So if Nvidia somehow managed to completely lock out the use of other GPUs, your GT240 would be rendered useless... that would be somewhat problematic...
Hopefully someone would make a hack to get around it, but there is a risk there... I myself am not too comfortable with unofficial hacks.
Is the increase in playing experience worth all the hassle? I am not sure. If Nvidia allowed other brands alongside their own cards, it could be...
 

There is already a hack to get around the lockout, but I think you need an updated hack with each new driver release from Nvidia.
 
So in your world, Intel should share Hyper-Threading technology with AMD because some program was made to work best with Intel's Hyper-Threading, so we can make everybody happy except for the people who made the product. And I guess nVidia shouldn't be allowed to profit from their products at all either, because they are under some communist rule. BUT they do have an investment in Ageia and Ageia's employees to pay.

Not to mention the CPU is often slower at certain things, and the memory it uses is slower than the GPU and its much faster video RAM. They should be moving forward and making the graphics card better at its job, rather than slowing down progress because some people don't want to buy their product but still want to benefit from software made for that product. This isn't freeware. Stop complaining. We should just have a sit-down and require nVidia to support CrossFire on their boards and AMD to support SLI on theirs. How about I send in a complaint that the ATI onboard video on my CrossFire board doesn't let me run an nVidia video card without a load of trouble: going into the BIOS, turning off the onboard video, saving, killing the machine before it can load the BIOS again, then turning the onboard video back on so I can install the nVidia graphics card and get a picture.
Don't get me wrong here, I love my AMD products; I'm running my Phenom II 720 right now and had planned to get a second 3850 to CrossFire.

So what about us AMD fans who love to run AMD/nVidia and possibly run SLI or have a dedicated PhysX card, but only have maybe two AMD board choices on Newegg, and none at Fry's Electronics or Micro Center, that can run SLI, let alone tri-SLI? Only the dual-slot one is almost worth a crap; the other is a POS that you're lucky doesn't just randomly die after a week or two of no trouble. There are ample things to complain about besides PhysX, which can apparently simply be disabled.
 
The previous comment is a lot of nonsense. What people are complaining about is this:

1) nVidia disables PhysX on *their hardware* when the primary display device is not an nVidia card. If they had a higher market share, that would unquestionably violate anti-trust law. As of now, it's just incredibly unethical.

2) nVidia pays game developers to cripple CPU-based PhysX, so that it only runs well with an nVidia GPU doing the calculations.

No one is suggesting that nVidia, or anyone else, give away the results of their R&D and/or acquisitions. They're simply requesting that nVidia not engage in anti-competitive market practices, which they clearly do.
 
So PhysX is basically a physics engine... and it's supposed to "enhance" gameplay, but only when used with an Nvidia card. Hmmm...
How is it "enhancing" anything when Nvidia is being a scrooge about it?

But my real questions are:
1. Should I turn PhysX on or off on my Nvidia graphics card (just ordered an Nvidia GTX 580)?

2. If I turn off PhysX, will my games still have physics?
 


1. Since you went and bought the GTX 580, it would be a waste NOT to use it to its full extent in games that can take advantage of it.

2. As MOST games don't bother with PhysX and instead use Bullet or Havok (probably the two biggest), physics will still be present in your games. AMD/ATI users still have physics in their games; they just suffer a penalty with PhysX for not having an nVidia card.

 
Awesome article going after the truth. While I can admire Nvidia's tactic of being proprietary and controlling, it follows the same obnoxious business practices as Apple: "we're the best, make everyone else suffer."

Thanks for the reaffirming read. :)
 
When you think about it, it's both companies' fault (NV/AMD):
nVidia is ridiculously trying to gain market share and force AMD to forfeit by disabling GPU PhysX whenever a non-nVidia card is present (how can that be right if you already own an nVidia card???);
on the other hand, AMD has stayed silent all these years, letting nVidia do what it wants. If they had at least released a development framework like CUDA, then some company would have introduced GPU physics that runs on nVidia using CUDA and on AMD using whatever it uses.
Hopefully, with the introduction of OpenCL and DirectCompute, things will go the right way.
 
I just finished going through Mafia II and Metro 2033 with PhysX cranked, and I must say PhysX is awesome. I wish every game had it; it's one of those things you take for granted after not having it. It really makes the game feel immersive and full of life when all hell breaks loose.

Notably the hotel level in Mafia II, when the job goes kind of bad and there is a tommy-gun shootout at the hotel bar that's made of glass bricks, with bottles and wooden chairs splintering and shattering into a million pieces. The game with PhysX looks way better than the benchmark too, btw.

I might also add that Metro 2033 looks great with DX11 and PhysX; the debris from explosions and bullets... just awesome.

I was bitter about PhysX for a long time, but the future will be awesome when it gets opened up and Nvidia lets ATI cards play with Nvidia cards for solid gaming.

I want a rig with CrossFire 6950s and a GTX 465 for PhysX... anyone know if that will work? I like ATI's anti-aliasing modes better...
 
It would be helpful in an article like this (as I've said before about 3D tech comparisons) if there was a comparison of the visible quality of the physics in each case. A GPU-oriented tech, after all, is about how something looks. If PhysX looks better than, say, Bullet, then it would make sense for me to buy a fast PhysX rig. But if, on collision, a corpse flies 300m into the air and then explodes in an orgy of richness when I walk past it in a corridor (a la Just Cause 2), then what's the point? It sounds as though you get quite close to uncovering this when picking the game (Mafia II), but then you relent and go back to a semi-anti-PhysX bash, concluding that if you want to run the PhysX engine, you have to get faster components. Perhaps out of scope for the article?
 
[citation][nom]varfantomen[/nom]Heh why should nvidia spend their time and money to help AMD? It's as much nonsense as saying Toyota should help Ford be cause that too would be for the greater good. Yeah damn those scumbags at Toyota![/citation]

So..? I don't see any stupidity in nVidia NOT spending extra money on helping ATi, but the fact that nVidia started wasting money (with an even slower nVidia card as the result)... just to PREVENT ATi from getting any use out of PhysX? ...that's where the stupidity came in guns blazing! (Yes, I do know it's an old marketing trick, but it's still a waste and utterly stupid.)
 



Thank you for your response and information. I had always wondered. I remember watching a documentary about the NES where some third-party dev said something like what I had suggested, but probably meant something more like what you said. It's a lil old now, but I really appreciate the information. Cheers~
 
I know this is an old review, but I would like to point out how inaccurate the dual-card performance is in the Mafia II section. You can't say HD5870 + GTX 285 is better than HD5870 + GTX 260 just because of the test you show. The goal is to prove GPU PhysX performance, and the only way to do that is to run separate tests for each card: the first test could be the one in the pictures you show in this review; the second test would be HD5870 + (each card) WITHOUT PhysX turned on; and the third test would be the HD5870 ALONE (with and without PhysX). This would be the only fair way to actually determine which nVidia card gives the best PhysX performance, since that is all we are trying to determine. An easy example: nowadays there are nVidia GTX 590 cards (the best, I would say). If we test HD5870 + GTX 590 for PhysX performance against HD5870 + every single other card in the world, taking only into account the test you did, obviously the GTX 590 would win by miles, but that would only reflect the high-end graphics card's boost to overall performance, not the PhysX performance alone. Now, if we test the HD5870 alone (with and without PhysX) and also test HD5870 + GTX 590 (with and without PhysX), we can actually determine the PhysX performance value of that card. Hope it works for further reviews. I said it with the best intentions. Good luck =P
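To make the idea concrete, here is a minimal sketch in Python (the FPS numbers are purely made-up placeholders, not measurements) of how that test matrix would isolate the PhysX cost for each configuration:

[code]
# Minimal sketch of the proposed methodology (hypothetical placeholder FPS values):
# measure each configuration with PhysX off and on, then compare the costs.
results_fps = {
    # (configuration, physx_enabled): average FPS
    ("HD5870",          False): 60.0,
    ("HD5870",          True):  25.0,
    ("HD5870 + GTX260", False): 62.0,
    ("HD5870 + GTX260", True):  48.0,
}

def physx_cost(config):
    """FPS lost when PhysX is turned on for a given configuration."""
    return results_fps[(config, False)] - results_fps[(config, True)]

# The add-in card's PhysX value is how much of that cost it recovers,
# separate from any raw rendering boost it also provides.
for config in ("HD5870", "HD5870 + GTX260"):
    print(config, "PhysX cost:", physx_cost(config), "fps")
[/code]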
PS: If you said it somewhere in your review, I apologize; I only saw the pictures and hardly read the rest. Excuse me if I'm wrong.
 
Now come on, you guys, AMD and nVidia both give hours of enjoyment on our PCs, with or without PhysX. I have a GeForce, and PhysX makes physics more realistic, but there are only a few games where you can tell the difference. If you have a GPU that supports it, use it. If you don't, then disable PhysX. No big deal, you still get to play with higher detail than the Xbox, PS3, or Wii.
 
Just read this article. It seems Nvidia wants to try to monopolize the GPU market by enforcing their own products for themselves. This seems to explain the higher benchmarks when running Nvidia cards as opposed to Radeon cards.

However, one thing I've noticed is that even while Nvidia does not want users to bypass the PhysX lockout for rival cards (by ensuring it only works on Nvidia cards), they aren't entirely missing out on the workaround either. People are still having to buy GeForce cards just to get PhysX to work on a Radeon.

Developers seem not too bothered about Nvidia's monopolizing either. A lot of games I see nowadays put the Nvidia logo on the splash screen, and I have yet to see a game that parades an AMD graphics card. Either this is a shrewd marketing move by nVidia, or developers are afraid to push back. One thing that suffers in the end, though, is performance. I suppose CPU physics can only improve when physics calculations move toward multithreaded/multicore execution, since the clock-speed war is now irrelevant.
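As a toy illustration of what multicore CPU physics means in practice, here is a minimal Python sketch (hypothetical particle data, simple Euler integration) that spreads the integration step across several worker processes:

[code]
# Toy sketch: spread a physics integration step across CPU cores.
# Particle data and worker count are hypothetical; uses multiprocessing
# so the CPU-bound work actually runs in parallel.
from multiprocessing import Pool

def integrate(chunk, dt=1.0 / 60.0, gravity=-9.81):
    # Each particle is (x, y, vx, vy); apply simple Euler integration.
    out = []
    for x, y, vx, vy in chunk:
        vy += gravity * dt
        out.append((x + vx * dt, y + vy * dt, vx, vy))
    return out

if __name__ == "__main__":
    particles = [(0.0, 10.0, 1.0, 0.0)] * 100_000
    n_workers = 4
    # Split the particle list into one chunk per worker.
    chunks = [particles[i::n_workers] for i in range(n_workers)]
    with Pool(n_workers) as pool:
        updated = [p for chunk in pool.map(integrate, chunks) for p in chunk]
    print(len(updated), "particles updated")
[/code]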
 
Take it simple and from another perspective, guys: if you want to play video games, go ahead and buy nVidia; if you are a 3D designer, go buy AMD. End of story.
 