Nvidia Responds to AMD's Claim of PhysX Failure

Status
Not open for further replies.

randomizer

Champion
Moderator
[citation][nom]Nadeem Mohammad[/nom]And to anticipate another ridiculous claim, it would be nonsense to say we “tuned” PhysX multi-core support for this case.[/citation]

Oh absolutely, nonsense indeed. :sarcastic: In fact it's such utter nonsense that he won't even bother to provide evidence to substantiate any of his claims - which, funnily enough, is what he accused Richard Huddy of.
 

FoShizzleDizzle

Distinguished
Apr 22, 2009
144
0
18,680
Not to take sides here, as I own an Nvidia card, fwiw. But I came to the same conclusion as Richard Huddy before ever knowing he made this statement. It struck me when toying around with PhysX in Batman: Arkham Asylum.

I disabled GPU PhysX and let the CPU handle it just to see how it performed. Strangely, my CPU usage barely increased at all, and framerates suffered immensely as a result - the same thing reportedly occurs with ATI cards.

The physics being calculated in this game are not particularly intensive from a visual standpoint, especially not compared to, say, what GTA IV does (which relies solely on the CPU). They are just terribly optimized and, by my estimation, intentionally gimped when handled by the CPU.

Anyone can connect the dots and understand why this is so. It's just stupid, because I bet a quad-core CPU, or even a triple core, paired with say a measly 9800 GT could max out PhysX and the in-game settings if the CPU handled the PhysX without being gimped. But since it is gimped, owners of such a card pretty much cannot run PhysX.
 

demosthenes81

Distinguished
Jan 20, 2010
52
0
18,630
If game developers had added true multicore support in the first place, I bet this would never have come up. Even the newest games like Borderlands have bad multicore support. I know almost nobody with a single-core CPU these days; the devs need to step up.
 

Honis

Distinguished
Mar 16, 2009
702
0
18,980
[citation][nom]randomizer[/nom]Oh absolutely, nonsense indeed. In fact it's such utter nonsense that he won't even bother to provide evidence to substantiate any of his claims - which, funnily enough, is what he accused Richard Huddy of.[/citation]
I think the Batman: Arkham Asylum benchmarks are evidence enough that something fishy is going on in Nvidia's APIs.

http://www.tomshardware.com/reviews/batman-arkham-asylum,2465-10.html
 

porksmuggler

Distinguished
Apr 17, 2008
146
0
18,680
My first thought is that PhysX has been a market failure since its Ageia days. Nvidia is just using this proprietary gimmick to hawk more GPUs. I was stunned when Nvidia bought Ageia, but I guess the price was right, and their in-house development was lagging. The list of games using PhysX is just sad, and the performance hit with PhysX enabled is rough. Makes you wonder how big a carrot Nvidia has to dangle out there to get developers to bite.
 

randomizer

Champion
Moderator
[citation][nom]Honis[/nom]I think Batman Arkham Asylum benchmarks are evidence enough that something fishy is going wrong in Nvidia's APIs.http://www.tomshardware.com/review [...] 65-10.html[/citation]
Oh I wasn't doubting that at all. My post was meant to have a sarcastic tone, but text doesn't convey sarcasm well. I'll have to fix it up.

EDIT: A smilie makes all the difference :D
 

mlopinto2k1

Distinguished
Apr 25, 2006
1,433
0
19,280
[citation][nom]randomizer[/nom]Oh absolutely, nonsense indeed. In fact it's such utter nonsense that he won't even bother to provide evidence to substantiate any of his claims - which, funnily enough, is what he accused Richard Huddy of.[/citation]Funnily? :p
 

mlopinto2k1

Distinguished
Apr 25, 2006
1,433
0
19,280
I think PhysX is a bunch of bull anyway. In any game I have seen that utilizes it, the physics effects it adds aren't anything special - they wouldn't be missed. (Arkham Asylum is a good game, though.) They don't need a GPU. It's just a money maker in my opinion. Garbage.
 

mlopinto2k1

Distinguished
Apr 25, 2006
1,433
0
19,280
[citation][nom]randomizer[/nom]Yes, it's a dictionary word with basically the same meaning as "strangely", but has more of a "hehe, you fail" tone to it.[/citation]I had no clue that was a real word! Haha.. well in that case... :)
 

AMW1011

Distinguished
It is true; it has been proven many times, as FoShizzleDizzle explained.

I own nVidia as well, but their anti-competitive acts are really starting to piss me off.

Luckily DX11 will make PhysX completely useless anyway.
 

ptroen

Distinguished
Apr 1, 2009
90
0
18,630
As an amateur game developer, I was intrigued by PhysX since it's a significantly cheaper route than Havok. However, I found that PhysX has more problems than it's worth:

1) PhysX makes heavy use of the PCI bus when in pure hardware mode. How fast can it really be if you're utilizing the PCI bus, which has a maximum bandwidth of 133 megabytes/second (or about 4 MB per frame)?
2) Nvidia has already been caught locking the competition out of Ageia physics.
3) The PCI Express bus is described by Microsoft as SLOW, and it has to be shared with the graphics card.
4) There is no direct HLSL interface with the physics (you have to use a C++ call to get around it).
5) Bullet physics is free and offers cross-platform GPU-based physics.
6) To write custom physics with Ageia you have to write an event handler that is invoked by the C++ API on a PER ACTOR/ENTITY basis. This can be a problem if you wish to have LOTS of entities/actors.

So yeah, I'm not too crazy about Ageia, and Havok is costly as well. Anyway, that's my 2 cents.
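A quick sanity check on the bandwidth figure in point 1: at a typical frame rate, classic PCI's peak throughput does work out to roughly the quoted 4 MB per frame. A minimal sketch (the 33 fps frame rate is an assumption picked to match the post's estimate):

```cpp
// Peak megabytes the classic 32-bit/33 MHz PCI bus can move per rendered
// frame; 133 MB/s is the theoretical maximum, so real throughput is lower.
double mbPerFrame(double busMBPerSec, double fps) {
    return busMBPerSec / fps;
}
// mbPerFrame(133.0, 33.0) is about 4.03 MB/frame, matching the "4 megs" figure.
```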
 

welshmousepk

Distinguished
'We continue to invest substantial resources into improving PhysX support on ALL platforms--not just for those supporting GPU acceleration.'

this made me lol.

if they are so intent on supporting all platforms, why is their primary platform (PC GPU acceleration) locked out in the presence of competitor hardware?
 

climber

Distinguished
Feb 26, 2009
325
0
18,780
Every corporation or business wants to drive its competitors out of the marketplace, but doesn't want to pay the price for it in anti-competitive-practice lawsuits. Nvidia is no better than ATI/AMD (or A^2 = A Squared, as I like to think of them), or Intel.
 

elno

Distinguished
Jan 20, 2010
3
0
18,510
"Our PhysX SDK API is designed such that thread control is done explicitly by the application developer, not by the SDK functions themselves."

To me, this means that it is up to the game developers to optimize thread control for multi-core CPUs. It is not nVidia's fault that game developers choose to only spend time making PhysX work with the GPU and not optimize it for multi-core CPU use.

Can AMD point to changes within the code showing that the performance of PhysX on multi-core CPUs has deteriorated if you compare the pre-Nvidia Ageia API with the present-day Nvidia PhysX API?

Then we'll know who is telling the truth. If there is no deterioration, then nVidia is not in the wrong. Why would they spend resources making PhysX work better on multi-core CPUs? That would just be a dumb business decision unless they saw value in doing so. It may be that they should do it anyway, or risk PhysX being ditched as a widely used physics engine.
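If thread control really is left to the application, then spreading physics work across cores is the developer's job, which would explain CPU PhysX sitting on one core in titles whose developers never parallelized it. A hypothetical sketch of what that application-side fan-out looks like (`Body` and `stepBatch` are stand-ins, not real PhysX API):

```cpp
#include <algorithm>
#include <cstddef>
#include <functional>
#include <thread>
#include <vector>

// Hypothetical stand-ins: neither Body nor stepBatch is a real PhysX type.
struct Body { float x, vx; };

// Integrate one contiguous batch of bodies (the per-call unit of SDK work).
void stepBatch(std::vector<Body>& bodies, std::size_t begin, std::size_t end,
               float dt) {
    for (std::size_t i = begin; i < end; ++i)
        bodies[i].x += bodies[i].vx * dt;
}

// The application-side fan-out: split the body list into one batch per worker
// thread. If a developer never writes code like this, every batch runs
// sequentially on a single core no matter how many the CPU has.
void stepAllParallel(std::vector<Body>& bodies, float dt, unsigned workers) {
    std::vector<std::thread> pool;
    std::size_t chunk = (bodies.size() + workers - 1) / workers;
    for (unsigned w = 0; w < workers; ++w) {
        std::size_t begin = w * chunk;
        std::size_t end = std::min(bodies.size(), begin + chunk);
        if (begin >= end) break;
        pool.emplace_back(stepBatch, std::ref(bodies), begin, end, dt);
    }
    for (auto& t : pool) t.join();
}
```

The point of the sketch is only that the parallelism lives in application code, exactly where the quoted SDK statement says it belongs.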
 
G

Guest

Guest
Any modern CPU can handle physics pretty well without any slowdowns or affecting fps. All it takes is some multi-core optimizing. With octa-core CPUs coming, games are barely taking advantage of two cores; that's really, really sad, but good for GPU makers. GTA's graphics are badly optimized because it was ported from consoles. The best games never used PhysX in my book: Crysis, Stalker, MW1/2, NFS. Unreal 3 had PhysX in only one map, and the PhysX rain was really annoying during gameplay.
 
G

Guest

Guest
Developers, wake up please. It is time to make use of the quad cores that have been on the market for 3 years! That's ridiculous. Seriously, the Q6600's release date was January 7, 2007! God, developers... someone is paying them not to do so. Selling more GPUs?
 
G

Guest

Guest
EDIT: Quad cores have actually been on the market since late 2006, not just 2007. Core 2 Extreme QX6700 release date: November 2, 2006. http://www.techspot.com/review/27-intel-core2-extreme-qx6700-quad/
 
G

Guest

Guest
We gamers are mad because PhysX is actually slowing down the multicore game-optimization process. Look how badly games are optimized for multiple cores, when they could run perfectly on a quad core if done efficiently. Developers would rather let the GPU do the physics calculations and forget about the CPU.
 