Batman: Arkham Asylum: GPUs, CPUs, And PhysX Performance


MiamiU

Distinguished
Jul 8, 2009
58
0
18,630
The way I see it, if ATI goes the same route as Nvidia, PC gaming will become like console gaming: your favorite title might only be available for your hardware of choice. Nvidia is taking a path that will ultimately only hurt customers and gamers.
 

cleeve

Illustrious
[citation][nom]scrumworks[/nom]Tom is pressing hard keeping nvidia in footlights these days. They don't have DX11 and they don't have mainstream/high end 40nm cards. Reviewing twimtbp PhysX title with nvidia's vendor lock AA implementation is one way to show how "nvidia cards rock".[/citation]

How does it show that the situation rocks? I don't think I make it sound like a good thing that AMD AA has to be accessed from outside the game. I also went out of my way to show that something looks a little strange when it comes to CPU utilization when PhysX is enabled. I'm a vocal supporter of an open physics standard, and I think that would be a far better solution.

Having said that, PhysX is used to good effect in this title and I'm not going to lie about it just so people don't start accusing me of "pressing hard to keep Nvidia in the footlights".

By your logic, would it be more fair if we totally ignored games with PhysX?
And would it also be fair if we ignored DX 11 titles because Nvidia doesn't have any DX 11 cards yet?

That wouldn't make any sense.
Try to remember that fairness doesn't come from ignoring one side or the other, it comes from examining all perspectives.

DX 11 reviews are coming as more games show up. You can bet your arse I'm going to be all over AvP 3. The screenshots I've seen are to die for, and I've been waiting for another AvP title from Rebellion for a long, long time. When I review that game, an Nvidia fan will inevitably accuse me of playing favorites with AMD. Ho hum.
 

mowston

Distinguished
Jun 12, 2007
61
0
18,630
Since you basically need a second card to run PhysX (or one of those cards with an extra dedicated GPU for PhysX), why doesn't Nvidia release a dedicated PhysX card? By releasing a "PhysX" card with two different GPUs, they have pretty much admitted that you need an extra GPU for PhysX, so why not just release one for that alone? They already have the IP from buying Ageia. I would like to have a PCI Express x1 card; just try finding a modern video card that fits into an x1 slot. I had to cut off the end of an x1 slot to get my Nvidia card to work with my ATI card - not the best option, but it worked.
 

reasonablevoice

Distinguished
May 10, 2008
71
0
18,630
I don't usually accuse the tech sites I read of "bias", and I don't believe the author is biased as such. I DO believe that Nvidia pressured you to do this review. The only thing they have to bank on right now is PhysX; otherwise they are beaten at every price point on performance and features. This isn't the only Batman: Arkham Asylum article up on certain sites, and when I see a story like this repeated on a few certain sites, I can almost guarantee that Nvidia called up for a little chat.
Anyway, with DX11 you'll see PhysX dead in a year, so I wouldn't bother investing in an Nvidia card just for that.
 

liquidsnake718

Distinguished
Jul 8, 2009
1,379
0
19,310
This game is meant to be played on a console. My PS3 runs it great; the Cell processor and the GPU (7800 GT? 7900 GT?) do a fine job, and it's one of the best-looking console action games next to Metal Gear Solid. On my 40-inch LCD this game totally rocks and definitely beats some small 20-something-inch monitor and a keyboard. I am a PC guy, but with this game the console wins hands down.
 

cleeve

Illustrious


For the record, Nvidia did not pressure me. All they did was ask if we were interested in reviewing the title; frankly, I had heard good things about it, so I said I was interested. There was no political pressure coming from anywhere that I'm aware of; I could have just as easily said no.

It's a AAA title, plain and simple, attached to a major license. PhysX or no PhysX, it's a damn good game. I try to do a game performance analysis at least once a month, and Batman interested me so I went for it. There's really nothing more to it.

 

inmytaxi

Distinguished
Dec 9, 2008
73
0
18,630
You've heard of Radeons and Nvidias being used together, with the second card as a PhysX coprocessor, so why don't you show us that it can be done, and how?

You say Nvidia didn't pressure you, but it seems like you cowed before them here.
 

cleeve

Illustrious
[citation][nom]inmytaxi[/nom]You've heard of Radeon's and Nvidia's used together, with the second as a Physx coprocesser, so why don't you show us it can be done and how?You say Nvidia didn't pressure you, but it seems like you cowed before them here.[/citation]

I mentioned that in the article, dude. If I was cowing to them, why would I put it in the article? As for why I didn't test it, my Core i7 testbed is still a Vista 64 machine and Vista can't use two video drivers at the same time, so I couldn't do it. But I mentioned it in the article, not exactly being secretive.

I can't imagine Nvidia is ecstatic about my coverage of PhysX CPU usage either, yet I didn't hold back on that.

Have you ever read my monthly "best cards for the money" piece? Check November's out; the Radeons have been ruling price/performance value for months, and I've certainly pulled no punches letting people know. I'm not looking for anyone's blood, but I'm not afraid of letting people know what I think is the truth.

There are always going to be conspiracy theories - I get it. If folks want to believe I'm compromised because I reviewed this game - a great game with or without PhysX - then so be it.

But my actions speak louder than my words. I try to be as even-handed as possible, and if that means fanATIcs and Nvidiots are accusing me from both sides, I consider it a badge of honor. :D
 
G

Guest

Guest
I think I'm the only one who ever bought an Ageia PhysX card, lol. I've had it for like two years and have never used it except for Unreal 3, and even there it didn't impress me.
 

San Pedro

Distinguished
Jul 16, 2007
1,287
12
19,295
I would like to see benchmarks using the workaround for in-game AA on ATI cards that someone mentioned in the comments. That would be really interesting.
 

mowston

Distinguished
Jun 12, 2007
61
0
18,630
[citation][nom]cynewulf[/nom]What a fine Nvidia ego stroking article this is. Utterly pointless.[/citation]

I thought the article was very balanced, especially since it showed the artificial cap (or poor CPU implementation) when an Nvidia GPU was not present, and mentioned the possibility of using an ATI + Nvidia setup (which I have, by the way). The fact that an extra Nvidia GPU is needed for the high PhysX effects will still keep most people from using them, I think. (You can get a PhysX-capable GPU for $35 after rebate right here: http://www.newegg.com/Product/Product.aspx?Item=N82E16814500072&cm_re=9500_geforce-_-14-500-072-_-Product )
 

scrumworks

Distinguished
May 22, 2009
361
0
18,780
[citation][nom]Cleeve[/nom]How does it show that the situation rocks? [/citation]

I don't even have to look at the bar graphs to know how bad ATI cards look in comparison to Nvidia's. I think Nvidia representatives are very happy about this (unlikely) free advertisement.

[citation]I don't think I make it sound like a good thing that AMD AA has to be accessed from outside the game.[/citation]

You really think most people actually read all the fine print? I usually just glance at the bars and read the conclusion, and the conclusion didn't say anything about the vendor-locked AA (which AMD representatives are quite pissed off about, by the way). There was, however, a plug for Nvidia's lackluster and unimpressive GT 220, which similarly priced ATI cards eat for breakfast.

[citation]
I also went out of my way to show that something looks a little strange when it comes to CPU utilization when PhysX is enabled. I'm a vocal supporter of an open physics standard, and I think that would be a far better solution.[/citation]

Can you start the next PhysX game article with that sentence?

[citation]
By your logic, would it be more fair if we totally ignored games with PhysX?
[/citation]

You can mention the support in a footnote. That's how history will remember it anyway. :)

[citation]
And would it also be fair if we ignored DX 11 titles because Nvidia doesn't have any DX 11 cards yet?That wouldn't make any sense.
[/citation]

If my memory serves, there hasn't been any major technology article about DX11 here, but I'm eagerly awaiting your Dirt 2 review.

[citation]
Try to remember that fairness doesn't come from ignoring one side or the other, it comes from examining all perspectives.[/citation]

Of course, but I think there have been enough PhysX articles already.

[citation]DX 11 reviews are coming as more games show up. You can bet your arse I'm going to be all over AvP 3.
[/citation]

Of course you are. Nvidia will most likely have Fermi out by the time AvP 3 arrives, but Dirt 2 is here soon, as is Stalker.

[citation]
The screeshots I've seen are to die for, and I've been waiting for another AvP title from Rebellion for a long, long time. When I review that game, an Nvidia fan will inevitably accuse me of playing favorites with AMD. Ho hum.[/citation]

Highly unlikely. As I said, Nvidia will have Fermi by then, and it will probably beat the HD 5870 by a notable margin, even if the card's availability is bad or non-existent at that point.


Hope you enjoyed the counter-arguments. Also, can we have a preview button here? I've got no idea how those tags will end up displaying.
 

CptTripps

Distinguished
Oct 25, 2006
361
0
18,780
[citation][nom]scrumworks[/nom]You really think most people actually read all the fine print? I usually quickly look at the bars and read the conclusion.[/citation]

So you are too lazy to read the article, yet you somehow found the energy to tear it apart when the conclusion was not what you wanted?
 

CptTripps

Distinguished
Oct 25, 2006
361
0
18,780
[citation][nom]scrumworks[/nom]As I said, nvidia has Fermi then and it will probably beat HD5870 with notable margin even if the availability of the card would be bad/non-existing in that time.[/citation]

Non-existent like the 5850 is now? You are already looking forward to Nvidia having shortages of a card that is several months out, just so you have something to bitch about. You think there have been enough PhysX articles; I think there has been enough conspiracy crap from people like you.

He tested the game and the results are accurate even if you don't like them.
 

jch059008

Distinguished
Nov 10, 2009
3
0
18,510
This is Nvidia's lame attempt to win back some ground in the competition, and the consumers are the ones who lose. Why don't you do a review of Dirt 2 running an ATI card and an Nvidia card in DX11? Oh wait, they don't have a DX11 card, so that review would be pointless, just like this one is... lame article!
 

G

Guest

Guest
Nvidia phoned you up to ask for a review, so you had no interest of your own until Nvidia asked you? Sounds like an Nvidia dollar handout to me...

And if you're so interested in fairness, where is the Dirt 2 review?
 

cleeve

Illustrious
[citation][nom]scrumworks[/nom]Hope you enjoyed the counter arguments[/citation]

Scrum, I'm honestly not seeing much in the way of counter-arguments there.

You say you want me to ignore PhysX games entirely because you personally consider PhysX a footnote. You say there have been enough PhysX articles already, yet I've never written a single one until now. You complain that I'm looking forward to AvP - a title that AMD has sent me tons of promotional material about - because you've pre-decided that my enthusiasm is a ruse to stall for Fermi. At the same time, you completely ignore my recommendations to purchase AMD cards in my monthly Best Cards for the Money article.

Frankly, I get the impression that the only thing that would make you happy is if I did nothing but praise one company and diss the other. I'm not willing to do that for any company, so I think there's little point in trying to convince you of the merits of what I believe is fair and even-handed coverage.

You certainly have the right to disagree with my position, as I have the right to disagree with yours. We'll have to agree to disagree and leave it at that.
 

cleeve

Illustrious
[citation][nom]email[/nom]How do you dedicate a card to physx?[/citation]

Install two cards and select one for PhysX processing in the Nvidia Control Panel. You don't need an SLI bridge.
 
G

Guest

Guest
Calm down, people. We will have games using OpenCL and DirectCompute for physics in one or two years; PhysX and CUDA will not be around much longer.
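
Just to illustrate the point (this is a toy sketch I made up, not code from any actual game or middleware, and the kernel name is my own): a vendor-neutral physics step in OpenCL C could look something like the kernel below, and it would run on AMD and Nvidia hardware alike.

[code]
// Toy OpenCL C kernel: one explicit Euler integration step for a particle system.
// pos and vel store xyz in the first three components; the fourth is unused padding.
// The host passes gravity, the timestep dt, and the particle count as arguments.
__kernel void integrate_particles(__global float4 *pos,
                                  __global float4 *vel,
                                  const float4 gravity,
                                  const float dt,
                                  const uint count)
{
    uint i = get_global_id(0);   // one work-item per particle
    if (i >= count)
        return;

    vel[i] += gravity * dt;      // accelerate
    pos[i] += vel[i] * dt;       // move
}
[/code]

The host just builds that with clBuildProgram and launches it with clEnqueueNDRangeKernel on whatever OpenCL device is installed - no vendor lock required.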
 