Batman: Arkham Asylum: GPUs, CPUs, And PhysX Performance

Status
Not open for further replies.

K2N hater

Distinguished
Sep 15, 2009
617
0
18,980
I saw AMD side against nVidia when it acquired ATi. Now I see Intel working a little too closely with ATi... Look! All their chipsets (including the server ones) work flawlessly with CrossFire, while SLI is just the opposite. Guess what happens next...
 

cleeve

Illustrious
[citation][nom]magicbullet[/nom]i7 with 2.06ghz at the last page???[/citation]

Ouch! Wish I caught your comment earlier. Yes, it should have been 3.06 GHz. Fixed!
 

Narg

Distinguished
Mar 15, 2006
115
0
18,680
The article is very nVidia-biased. So is the game. All Unreal Engine 3 games are coded to favor nVidia and hamper ATI cards. Even so, the scores show that high-end ATI stands up very well to high-end nVidia, though the article writer's wording makes his bias very clear. When 4xAA is used, nVidia lies and doesn't actually do 4xAA; it uses 2x instead.

And for those wondering about using two video cards: get the original PhysX card instead, it'll do a lot better.
 

cleeve

Illustrious
[citation][nom]Narg[/nom]Even though the article writer still uses wording that shows his bias is very clear.[/citation]

The sad part is I think you actually believe that. You have my pity.

Please provide us with an example.
 

spoofedpacket

Distinguished
Jun 13, 2009
201
0
18,690
Is it me, or is the FUD against Nvidia in here getting thick and ignorant?

I have never understood why one company getting a top-performance product, knowing it will only stay on top a few months, causes such a flood of fanboi outpouring. Calm down, kids; ATI and NV both make their money off OEM deals with big-box companies. Seriously, most of their money comes from buyers who have no idea which chip is even in the computer.

What NV wants to do is hold on to its OEM deals while reaching toward the server market and high-performance desktop cards. While it sounds like they are stretching their efforts, they aren't: all of it will be based on the same basic platform. It's like saying that because Intel has mobile, desktop, and server processors, it is limiting itself when it comes to games. Nope. All those processors are based on similar cores.

Anyway, it's best to root for whoever comes out with the more powerful "next-gen" platform. It does nothing but send the competition back to the drawing board to surpass them. Win-win for all.
 

amirp

Distinguished
Nov 9, 2009
521
0
19,010
With a bit of googling I found a way to enable in-game anti-aliasing on my ATI 4850 with very little FPS loss (8 FPS loss average at 4x), whereas the CCC-forced anti-aliasing destroys the FPS. Here is what I did, and it works perfectly:

"Use Ray Adams' ATI Tray Tools utility. The advantage of doing it this alternate way is that the Nvidia card check is bypassed and you get in-game AA rather than AA forced by the display driver. Big deal, you are probably thinking, but the in-game AA is faster because it isn't done via brute force.

So how do you do it the ATI Tray Tools way?

Well, first download the ATI Tray Tools app and install it.

Create two profiles in the ATI Tray Tools app: one for BmLauncher.exe and one for ShippingPC-BmGame.exe.

In the 'Direct3D Tweaks' option, enable the tickbox to change the adapter identification. For the Vendor ID put 10DE, for the Device ID put 5E2, and for the device description put NVIDIA GeForce GTX 260. Do this for both profiles, saving each one before starting the next.

Launch the Batman launcher. If you have done it correctly it will say that your Nvidia drivers are out of date (because you are using ATI ones, of course), and it will then allow you to set AA in the launcher. After that, start the game. Notice that the in-game AA works."

And there it is, anti-aliasing does work for ATI after all! (And it looks much better with it; it is necessary in this game.)
PS: use Google to find the ATI Tray Tools program; the version I used was beta 1.6.9 for Win7 64-bit.
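For what it's worth, the reason the ID change works is that the check only looks at the vendor/device IDs the driver reports, not the actual silicon. Here is a purely hypothetical sketch (the game's real check is not public, and the GeForce-8-or-newer cutoff below is illustrative only) of how such a launcher-side gate behaves, using the real PCI IDs from the instructions above: 0x10DE is NVIDIA's PCI vendor ID and 0x05E2 is a GeForce GTX 260 device ID, while 0x1002 is ATI's vendor ID.

```python
# Hypothetical launcher-style vendor check -- NOT Rocksteady's actual code.
# It trusts whatever adapter IDs the driver reports, which is exactly why
# an ATI Tray Tools profile that overrides the reported IDs defeats it.

NVIDIA_VENDOR_ID = 0x10DE   # NVIDIA's PCI vendor ID
GTX_260_DEVICE_ID = 0x05E2  # GeForce GTX 260 (the ID used in the workaround)
ATI_VENDOR_ID = 0x1002      # AMD/ATI's PCI vendor ID

def ingame_aa_available(vendor_id: int, device_id: int) -> bool:
    """Return True if the *reported* adapter passes the NVIDIA-only AA gate."""
    # Illustrative cutoff standing in for "GeForce 8 series or newer";
    # the real check's logic is unknown.
    return vendor_id == NVIDIA_VENDOR_ID and device_id >= 0x0400

# A Radeon HD 4850 reports ATI's vendor ID, so the gate rejects it...
print(ingame_aa_available(ATI_VENDOR_ID, 0x9442))            # False
# ...but with the profile overriding the IDs to a GTX 260, it passes.
print(ingame_aa_available(NVIDIA_VENDOR_ID, GTX_260_DEVICE_ID))  # True
```

This also matches the "your Nvidia drivers are out of date" message: the launcher believes it is talking to a GTX 260 and applies its NVIDIA-specific logic to an ATI driver.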
 

fmenton66

Distinguished
Feb 6, 2009
5
0
18,510
Has anyone done any testing to see whether the original Ageia PhysX card works with this game? Just wondering if that card is capable of working as a dedicated PhysX processor for someone using an ATI GPU. Also, if it does work, what level of playability would it have? Would it do better than the GeForce 9500 GT as a dedicated PhysX processor?
 

silicondoc

Distinguished
Feb 7, 2008
82
0
18,630
[citation][nom]Curnel_D[/nom]With Nvidia pushing proprietary API's like CUDA and PhysX, they're at a point where these things are some of the largest selling points of their products. With this in mind, and given Nvidia's past Anti-Consumer business practices, I think we can all expect to see a lot more of this kind of thing in the future with TWIMTBP games.[/citation]

Gee, at least you admit Cuda and PhysX are great selling points. Let's take PhysX vs Dx11 = oh boy PhysX wins by a LONG SHOT!
--
Now, since ATI is supposed to be saving so much money with their tiny, blazing-hot, crammed-full (because they're tiny) cores cooking to keep up, they should have all sorts of money to put a few ATI driver writers on the job and deliver an AA implementation to Rocksteady for Batman: Arkham Asylum, as HAS BEEN REQUESTED !!!
--
Instead, ati whines Nvidia didn't give them their driver they made to use for free, and whines that Nvidia protected their work... and claims "ati isn't like that".
Well, just hop on out or over to a few Tom Clancy HAWX' benchmarks and see if you believe ATI isn't "like that".
I suppose if we all lived in the one communist internationale world, Nvidia would be required to pay their own programmers to write drivers for ATI cards, and as far as the "red trolls" go, that is EXACTLY what they expect.
All I have to say is, with that theory from the ati fans, ati should be writing DX11 drivers for DX11 games that ALSO RUN on Nvidia cards. Not that Nvidia has ever needed help writing drivers - but ATI sure does...
ATI is begging for free help writing just a simple AA implementation for one game... begging for help - for a big fat freebie from their biggest nemesis - now THAT'S LAME -
Can't blame 'em completely, amd/ati is billions in the hole, losing billions a year, so they want a handout - some driver welfare, and plenty of people are stupid enough to "go for it".
---
Scour the net and see if you can find a single AMD/ATI statement that they sent a driver team or tried to deliver an AA driver to Rocksteady for Batman: Arkham Asylum - and you'll find a BIG FAT ZERO - because they didn't do it; they didn't do the work.
ATI "the welfare" videocard company.
 

silicondoc

Distinguished
Feb 7, 2008
82
0
18,630
[citation][nom]alterecho[/nom]AA can only be enabled in this game if you have a GeForce 8 series. It is impossible to use on any ATI cards. ATI changed the hardware ID on one of its cards and was able to enable AA, and guess what, the performance was the same as that of the GeForce, without much of a performance hit. I think Don forced AA via Catalyst, which cripples the frames.[/citation]

Another example of the epic fail of the CCC - the Crazy Crumby Crippler we should all call it.
So ati can't produce their own AA driver even after RockSteady asks them to, and their CCC that ships with all their videocards is a framerate crippler...

ROFLMAO -

"I'm an ATI fan because I hate Nvidia, and I'm smart enough not to install CCC!" ROFLMAO - well said, red fanboi - talking crap about ATI and not even realizing it - gosh, you're so smart and such a techie! Hahaha.
 

silicondoc

Distinguished
Feb 7, 2008
82
0
18,630
[citation][nom]zak_mckraken[/nom]I find this concept of selling free things confusing.[/citation]

Read some Karl Marx or head on over to the public aid office - it will become painfully obvious what the talk is about.
 

silicondoc

Distinguished
Feb 7, 2008
82
0
18,630
[citation][nom]shubham1401[/nom]Physx is not a great deal.....I'm still leaning towards ATI cards...[/citation]

Yeah, especially with that DX11 stuff that you can't even see on screen, unlike PhysX. *rolls eyes*
 

silicondoc

Distinguished
Feb 7, 2008
82
0
18,630
[citation][nom]mowston[/nom]Since you basically need a second card to run Physx (or one of those cards with an extra dedicated GPU for Physx), why doesn't NVidia release a dedicated Physx card? By releasing the "Physx" card with two different GPUs, they have pretty much admitted that you need an extra GPU for Physx, so why not just release one for that alone? They already had the IP for it by buying Ageia. I would like to have a PCI-express x1 card. Just try finding a modern video card that fits into an x1 slot. I had to cut off the end of an x1 slot to get my Nvidia card to work with my ATI card; not the best option, but it worked.[/citation]

Well, that's just plain wrong. You don't need a second card to run PhysX, nor do you need a dual-GPU card. It runs great on a single 9600 GSO, for instance.

ATI has their own "Bullet physics", but since it sucks so badly, no one has been brainwashed into whining about it.
AMD and Intel had to cross-purchase and make cross-licensing agreements on each other's x86 technology; why aren't you people SCREAMING all the time that x86 should be an open standard so we get better processors?
I know why you aren't...but I'll be polite and not say.
---
As far as PhysX, or any other implementations, if some entity has Intellectual Property then the rest of the not yet full blown communist world ought to PAY UP and buy it if they want it.
We can't have any real development if all the crybabies of the world want is one big freebie for everyone. Not everyone is some gigantic government that can print and tax till everyone's blue in the face to pay for "the latest internet or gaming invention or code" and make it free for the masses - and for all the businesses that want to gank it for their own profiteering.
Why did ATI stop the independent software writer (by locking out his make-it-work program in their next driver release) who made PhysX run on ATI cards, while NVidia sent a few programmers over to help him?
I'll tell you why. AMD/ATI would HAVE TO PAY !
So instead, we have this social welfare communist mindset that says everything has to be free for anyone who wants it - or the world will explode - and then only when the commie has a big fat stick up their rear and chip on their shoulder and is filled with hatred and on the attack against their targeted object of hate.
Other than that, the welfare free for all whiner doesn't even notice - when tens of thousands of similar software/implementation situations stare them in the face on the net on a daily basis... the CLUELESS COMPLAINER passes right by without a peep.
---
PAY UP and then shut up ati.
---
I can hardly wait till ATI and Intel have to hand over a GIANT WAD of cash to NVidia for developing and patenting PhysX, even if it is changed or ported to some free-for-all Linus Torvalds community-suckers repository.
 

silicondoc

Distinguished
Feb 7, 2008
82
0
18,630
[citation][nom]jtt283[/nom]This sucks. So, if I want maximum effects, I'd need to scrap my perfectly good HD4850 for a pair of nVidia cards, one to not be using ATI, and one for PhysX...sorry, not going to happen.nVidia, BITE ME. I used to use your cards. Never again, until you change your ways.[/citation]

Hat tip / Bright Side of the News: Summer 2008 saw the arrival of a PhysX wrapper for ATI cards. Even though both companies claimed that they would support Eran from NGOHQ and his team, the truth was that the PhysX wrapper was blocked with the next version of the Catalyst drivers [according to Eran].

----
Well there you have it - you can thank your ati masters... for screwing the pooch.
 

liquidsnake718

Distinguished
Jul 8, 2009
1,379
0
19,310
I loved the PS3 version of this game. I don't see how one can FULLY enjoy this game without a 40-inch LCD at 120 Hz in full HD... even with an Xbox 360 controller it is still more of a platform game, and I bought the PC version for "fun" to compare it with the PS3 version...
 

cleeve

Illustrious
[citation][nom]liquidsnake718[/nom]I dont see how one can FULLY enjoy this game not using a 40inch LCD at 120hz and FULL HD.... [/citation]

That's because you haven't seen it at 2560x1600 on a 30" monitor. :D
 

badaxe2

Distinguished
Aug 27, 2008
491
0
18,780
[citation][nom]Jeanluc[/nom]NPD numbers show this game as currently being one of the least popular PC games - 92nd, in fact. Nice article, BTW. It's sad that the developers were paid off by Nvidia to drop support for AA on AMD cards (in-game menu AA support, that is; there is a workaround for ATI cards), as this shows just how morally bankrupt Nvidia is these days. And just for the record, this isn't a case of AMD not "supporting developers" as Nvidia would lead you to believe. Never mind the fact that it can be enabled via a hack; Richard Huddy from AMD has uploaded an email he got from the developers of Batman: Arkham Asylum saying there would be a lawsuit if they changed the game's code to enable in-game menu support for AA on ATI cards.[/citation]


So Rocksteady was in bed with Nvidia on this one? That's just great.
 


You realize you can always plug a PC into a 40" LCD, or, like I do, into a 50" plasma at work or my HDTVs or projector at home.

HDTVs are not restricted to consoles, and like Cleeve said, a 1600p 30" monitor is nice too, and something no console can do. [:thegreatgrapeape]
 