Batman: Arkham Asylum: GPUs, CPUs, And PhysX Performance

Of course the game artificially bottlenecks when the CPU does PhysX... I thought this was common knowledge. The PhysX calculations aren't that complicated, and most CPUs could handle them fine. If not for the artificial bottleneck, even people with Nvidia cards would change their settings to let the CPU do the PhysX, since their FPS in games like this would be substantially higher. The CPU is barely taxed otherwise in this title, so there's tons of headroom for anyone with a quad core.
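If you're wondering what that headroom means in practice, here's a rough sketch, my own toy particle integrator and not the actual PhysX SDK, of how this kind of effects physics splits trivially across the cores a quad core leaves idle:

[code]
// Toy particle integrator (NOT the PhysX SDK) illustrating how effects
// physics can be parallelized across the idle cores of a quad-core CPU.
#include <thread>
#include <vector>

struct Particle { float px, py, pz, vx, vy, vz; };

// Integrate one slice of the particle array for a single timestep.
void integrate(std::vector<Particle>& p, size_t begin, size_t end, float dt) {
    for (size_t i = begin; i < end; ++i) {
        p[i].vy -= 9.81f * dt;   // gravity
        p[i].px += p[i].vx * dt;
        p[i].py += p[i].vy * dt;
        p[i].pz += p[i].vz * dt;
    }
}

// One physics step, split evenly across 'cores' worker threads.
void step(std::vector<Particle>& particles, float dt, unsigned cores) {
    std::vector<std::thread> workers;
    const size_t chunk = particles.size() / cores;
    for (unsigned c = 0; c < cores; ++c) {
        size_t begin = c * chunk;
        size_t end = (c + 1 == cores) ? particles.size() : begin + chunk;
        workers.emplace_back(integrate, std::ref(particles), begin, end, dt);
    }
    for (auto& w : workers) w.join();  // all cores finish the frame's physics
}
[/code]

Nothing about this kind of math stops such a split, which is exactly why a single-threaded CPU path feels like a deliberate cap.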
 
So even if you have an nVidia card you can't play with PhysX at high... that's lame.

You'd need to buy another card for dedicated PhysX just to make the game playable on a GTX 260...
 
How much of a difference in performance would you see if you ran the benches again with a pre-186 Forceware driver? It might give some perspective on the ATI/dedicated PhysX card setup if the single 260 numbers were close to the same with both drivers.
 
You can, actually. I'm playing it on a GTX 260 @ 1080p with PhysX set to High, but only at 4xAA; setting AA higher would make the game crawl.

I don't get why people here keep bashing PhysX while still wanting those PhysX effects.

Just turn it off if "PhysX sucks." Dimwits.
 
[citation][nom]dmarvp[/nom]Can anyone show me how to enable PhysX if I only have an ATI HD 4870? I thought that wasn't possible, but I don't know now after seeing this article...[/citation]

It depends on the game; some will allow it in the game settings, but as you can see, the performance suffers. If you really want the PhysX effects, buy the cheapest Nvidia card with at least 32 stream processors (8600 GT or better) and use it alongside your ATI card (Windows XP or 7 only). Just use the 185-series or earlier Nvidia drivers and it should work.
 
PhysX is awesome. I'm all about the little details. My two EVGA GTX 260s can play anything on max settings @ 1920x1080 with high PhysX (without overclocking). DX11 won't be mainstream until the middle of next year, so I'll wait until better games come out before upgrading my graphics cards.

Batman: Arkham Asylum is an awesome game with or without PhysX. But if you want your money's worth, play it with PhysX on High. You'll wish every game used PhysX.
 
Thanks for doing this article; I was very curious about the effect a dedicated PhysX card would have, and about the performance requirements of that dedicated card.

I assume that older architectures would show similar PhysX processing results, meaning that an 8800 GTS 320MB would perform about as well as the GT 220 tested here, since they're roughly equivalent on the GPU charts. I'd love to see a test confirming that, though. :)
 
There should be a physics library usable with comparable performance on both ATI and Nvidia; otherwise we all suffer. The 4870, which pwns the 260 regularly, gets hurt by this. Software should take maximum advantage of the hardware.
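OpenCL already covers the vendor-neutral part. Here's a hedged sketch of what such a library's inner loop could look like; the kernel and names are mine, not from any shipping middleware, but the same source would drive a 4870 or a 260 alike:

[code]
// Sketch of a vendor-neutral physics step via OpenCL. The kernel is
// illustrative (my own), but the host API calls are standard OpenCL 1.x
// and run on both ATI and Nvidia GPUs.
#include <CL/cl.h>
#include <cstdio>
#include <vector>

static const char* kSrc =
    "__kernel void integrate(__global float* y, __global float* vy, float dt) {\n"
    "    size_t i = get_global_id(0);\n"
    "    vy[i] -= 9.81f * dt;   // gravity\n"
    "    y[i]  += vy[i] * dt;\n"
    "}\n";

int main() {
    cl_platform_id plat; cl_device_id dev; cl_int err;
    clGetPlatformIDs(1, &plat, NULL);  // first platform: ATI or Nvidia
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, &err);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, &err);

    std::vector<float> y(4096, 10.0f), vy(4096, 0.0f);
    size_t bytes = y.size() * sizeof(float);
    cl_mem dy  = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR, bytes, y.data(), &err);
    cl_mem dvy = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR, bytes, vy.data(), &err);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &kSrc, NULL, &err);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "integrate", &err);

    float dt = 1.0f / 60.0f;  // one 60 FPS frame
    clSetKernelArg(k, 0, sizeof(cl_mem), &dy);
    clSetKernelArg(k, 1, sizeof(cl_mem), &dvy);
    clSetKernelArg(k, 2, sizeof(float), &dt);
    size_t global = y.size();
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, dy, CL_TRUE, 0, bytes, y.data(), 0, NULL, NULL);
    printf("y[0] after one step: %f\n", y[0]);
    // cleanup of CL objects omitted for brevity
    return 0;
}
[/code]

The only thing missing is a vendor willing to ship the physics middleware on top of it.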
 
[citation][nom]ethr[/nom]Here's a vid of my rig running a Batman: AA benchmark with Eyefinity at 4800x1200 with PhysX hardware acceleration[/citation]

Looks like Tom's stripped out my link.

Go to YouTube and search for eyefinity and physx; it should be the first result listed.
 
With a bit of Googling I found a way to enable in-game anti-aliasing on my ATI 4850 with very little FPS loss (8 FPS average loss at 4x); forcing AA through the Catalyst Control Center destroys the FPS. Here is what I did, and it works perfectly:

"Using Ray Adams ATI Tray Tool utility. The advantage of doing it the alternate way is that the Nvidia card check is bypassed and you get ingame AA rather than AA forced by the display driver. Big deal you are probably thinking but the ingame AA is faster as it isn't done via brute force.

So how do you do it the ATI Tray Tools way?

First, download the ATI Tray Tools app and install it.

Create two profiles in the ATI Tray Tools app: one for BmLauncher.exe and one for ShippingPC-BmGame.exe.

In the 'Direct3D Tweaks' option, enable the tickbox to change the adapter identification. For the Vendor ID, enter 10DE; for the Device ID, enter 5E2; for the device description, enter NVIDIA GeForce GTX 260. Do this for both profiles, saving each one before starting the next.

Launch the Batman launcher. If you have done it correctly, it will say that your Nvidia drivers are out of date (because you are using ATI ones, of course) and will then allow you to set AA in the launcher. After that, start the game. Notice that the in-game AA works."

And there it is: anti-aliasing does work for ATI after all! (And it looks much better with it; it's practically necessary in this game.)
PS: Use Google to find the ATI Tray Tools program; the version I used was the 1.6.9 beta for Win7.
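For anyone curious why the spoof works: the launcher presumably just reads the adapter identification that Direct3D reports and gates the AA option on an Nvidia vendor ID. Here's a rough reconstruction of that logic, my guess rather than the game's actual code, though the D3D9 calls themselves are real:

[code]
// Sketch of the kind of vendor check the Tray Tools spoof bypasses.
// This is a guess at the launcher's logic, not Batman's actual code;
// the D3D9 API calls are the real ones.
#include <d3d9.h>
#include <cstdio>
#pragma comment(lib, "d3d9.lib")

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DADAPTER_IDENTIFIER9 id = {};
    d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id);

    // ATI Tray Tools' "change adapter identification" tweak makes the
    // driver report the spoofed values (10DE / 5E2 / GeForce GTX 260)
    // instead of the Radeon's real ones, so this test passes.
    if (id.VendorId == 0x10DE) {  // 0x10DE = NVIDIA; 0x05E2 = GTX 260
        printf("NVIDIA adapter detected (%s): AA option unlocked\n", id.Description);
    } else {
        printf("Non-NVIDIA adapter (vendor 0x%04lX): AA option hidden\n",
               (unsigned long)id.VendorId);
    }

    d3d->Release();
    return 0;
}
[/code]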

 
[citation][nom]Curnel_D[/nom]The PhysX API is a great thing. The guys who made it had a brilliant thing going. But it'd be a TON better if Nvidia converted it to the OpenCL standard. If they don't, it's going to end up as a 'could have been' technology.[/citation]

True. People are mixing up PhysX and PhysX for CUDA. The effects are good. Sadly, Nvidia continues to push CUDA ahead of OpenCL and, as you and the article pointed out, may be crippling performance on CPUs to make the CUDA support look better in comparison.

In this presentation they spend 4 min 30 sec and 5 slides on the interoperability between OpenGL and CUDA, but only 55 sec and 2 slides on the interoperability between OpenGL and OpenCL. They also encourage the use of Nvidia-specific extensions to OpenGL:
http://bit.ly/3HMPyA

About ten days ago, Nvidia held a lecture here in town, on their Scandinavian tour, about CUDA programming, even though they released beta Forceware drivers with OpenCL functionality just days later.

Real-life examples like these make their claimed OpenCL engagement less credible. They will keep pushing CUDA and Nvidia-specific programming even though open standards are available.
 
And if somebody still has doubts, the best way to take maximum advantage of hardware is to go open source. It's not about being free (IBM, Sun, etc. pay millions to hire GPL code developers); it's about the quality that comes with it, and about the programming model, similar in principle to the IDE features that improve code quality as well as performance. And if you thought open source was communism, Red Hat's CEOs are regular CEOs.
 
It is good to see the GeForce GTX 260 perform well here, yes, but the 4870 is that far behind with PhysX not because it really can't keep up but because it isn't allowed to. I too have a 260, and I was really torn between it and a 4870 a while back. It scares me how I, the customer, the gamer, would have suffered with a 4870. And the solution is not to buy only Nvidia; if ATI did this, it would still be bad for us, period.

Don't think ATI or Nvidia; think buyers and sellers, value for money, and maximum performance out of your own hardware. FTW, and thy will be done.
 
Having said that, Nvidia owns PhysX now, so they should at least enable PhysX for someone who buys a GT 220 as a cost-effective PhysX add-on card. Forcing people to migrate completely is ridiculous.
 
Yo, Woligroski... I'm happy you posted this and all... and I'mma let you finish... but [H]ard|OCP posted the best Arkham Asylum PhysX article of ALL TIME, last month! [http://www.hardocp.com/article/2009/10/19/batman_arkham_asylum_physx_gameplay_review/]

Just kiddin'! Tom's has provided a great test and covered some ground OCP didn't... but there was one cool bit featured exclusively in the HardOCP version: HardOCP ran some cross-pairing tests, using a Radeon for primary graphics and an Nvidia card for PhysX, and provided this interesting little takeaway blurb:
Now isn’t this interesting! In Batman: Arkham Asylum, PhysX, which was officially disabled on AMD graphics cards, runs better with a Radeon HD 5870 graphics card than a GeForce GTX 285 when using a GeForce GTS 250 as a dedicated PhysX accelerator.
 
Tried the demo on Steam. The controls seemed really geared toward consoles, and there's no multiplayer that I know of, so I bought this game for the PS3 instead. Too bad the PC side of things is geared toward Nvidia; it makes me like them a lot less.
 
This sucks. So, if I want maximum effects, I'd need to scrap my perfectly good HD 4850 for a pair of Nvidia cards: one to stop using ATI, and one for PhysX... sorry, not going to happen.
Nvidia, BITE ME. I used to use your cards. Never again, until you change your ways.
 
The game doesn't cap performance when ATI is detected; it caps performance when PhysX is enabled without a PhysX-capable card present.

Not sure why they didn't test it here, but from my testing, if your card does not support PhysX, regardless of brand, you lose a lot of performance when you enable it, while the game doesn't really use much more CPU or memory. It's an artificial cap meant to push you toward buying a PhysX-enabled card.
 
Would have loved to see benchmarks pairing a dedicated Nvidia PhysX card with a selection of ATI Radeon cards. It would have helped put things more in perspective.
 