Nvidia Responds to AMD's Claim of PhysX Failure


mrcmark

Distinguished
Oct 15, 2009
ATI has teamed up with the Khronos Group, which also oversees OpenGL and OpenCL. Together with Havok, Pixelux and Bullet, they are making their own OPEN SOURCE (again, OPEN SOURCE) GPU-accelerated physics engines. If this succeeds, it will make PhysX a thing of the past.
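For reference, Bullet (the one engine on that list that is unambiguously open source) is already usable on the CPU today. Here's a minimal sketch of its C++ API, simulating a single falling sphere; this is my own illustration of the library's basic setup, not anything from AMD's GPU effort:

```cpp
// Minimal Bullet rigid-body world: one sphere falling under gravity.
#include <btBulletDynamicsCommon.h>
#include <cstdio>

int main() {
    // Standard Bullet boilerplate: collision config, dispatcher,
    // broadphase, constraint solver, and the dynamics world itself.
    btDefaultCollisionConfiguration config;
    btCollisionDispatcher dispatcher(&config);
    btDbvtBroadphase broadphase;
    btSequentialImpulseConstraintSolver solver;
    btDiscreteDynamicsWorld world(&dispatcher, &broadphase, &solver, &config);
    world.setGravity(btVector3(0, -9.81f, 0));

    // One 1 kg sphere of radius 0.5, starting 50 units up.
    btSphereShape shape(0.5f);
    btVector3 inertia(0, 0, 0);
    shape.calculateLocalInertia(1.0f, inertia);
    btDefaultMotionState motion(
        btTransform(btQuaternion::getIdentity(), btVector3(0, 50, 0)));
    btRigidBody body(1.0f, &motion, &shape, inertia);
    world.addRigidBody(&body);

    // Step one simulated second at 60 Hz.
    for (int i = 0; i < 60; ++i) world.stepSimulation(1.0f / 60.0f);

    btTransform t;
    motion.getWorldTransform(t);
    std::printf("height after 1 s: %f\n", t.getOrigin().getY());

    world.removeRigidBody(&body);
    return 0;
}
```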
 

Raishi

Distinguished
Jan 12, 2010
The fanboyism here really is laughable. I suppose it's too much to ask for you all to do a little research and have some idea of what you're talking about?

As already stated, Nvidia is not locking anyone out of PhysX; in fact, they are actively supporting the third-party team trying to make it work on Radeon cards. ATI is the one trying to stop it. You don't even have to leave this site for that story.

http://www.tomshardware.com/news/nvidia-ati-physx,5841.html


For everyone claiming that PhysX should just run on the CPU and that Nvidia is trying to screw us all over by focusing it on the GPU: you have no idea what you're talking about. PhysX workloads benefit enormously from massively parallel computing architectures. The best CPUs available now have four cores. We have GPUs with well over 200 cores now, and potentially over 500 before long. You honestly don't see why PhysX is best run on a GPU?
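To make the parallelism argument concrete, here is a minimal sketch (my own illustration, not PhysX SDK code) of why per-body physics updates scale with core count: each body is integrated independently, so the exact same loop can be split across 4 CPU threads or hundreds of GPU cores.

```cpp
// Hypothetical per-body integration: no body touches another body's data,
// which is what makes the workload embarrassingly parallel.
#include <algorithm>
#include <cstddef>
#include <thread>
#include <vector>

struct Body { float px, py, pz, vx, vy, vz; };

// Integrate one contiguous chunk of bodies for a single time step.
void integrate(std::vector<Body>& bodies, std::size_t begin,
               std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i) {
        bodies[i].vy -= 9.81f * dt;        // gravity
        bodies[i].px += bodies[i].vx * dt;
        bodies[i].py += bodies[i].vy * dt;
        bodies[i].pz += bodies[i].vz * dt;
    }
}

int main() {
    std::vector<Body> bodies(100000);
    // ~4 workers on a quad core; a GPU runs the same loop hundreds-wide.
    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    const std::size_t chunk = bodies.size() / workers;

    std::vector<std::thread> pool;
    for (unsigned w = 0; w < workers; ++w) {
        std::size_t begin = w * chunk;
        std::size_t end = (w + 1 == workers) ? bodies.size() : begin + chunk;
        pool.emplace_back(integrate, std::ref(bodies), begin, end, 1.0f / 60.0f);
    }
    for (auto& t : pool) t.join();
}
```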
 

jonpaul37

Distinguished
May 29, 2008
They're doing it on purpose; it's a reserved "ace up their sleeve" for when they need it. Why else would they intentionally cripple the support? It'll make them more money in a time of need... The end.
 

scrumworks

Distinguished
May 22, 2009
[citation][nom]raishi[/nom]The fanboyism here really is laughable. I suppose it's too much to ask for you all to do a little research, have some idea of what you're talking about?As already stated, Nvidia is not locking anyone out of Physx; in fact they are actively supporting the 3rd party team trying to make it work on Radeon cards; ATI is the one trying to stop it. [/citation]

Wrong. Nvidia is not giving PhysX to AMD. AMD's Developer Relations manager Richard Huddy put it like this:

"[Nvidia] put PhsyX in there, and that's the one I've got a reasonable amount of respect for. Even though I don't think PhysX - a proprietary standard - is the right way to go, despite Nvidia touting it as an "open standard" and how it would be "more than happy to license it to AMD", but [Nvidia] won't. It's just not true! You know the way it is, it's simply something [Nvidia] would not do and they can publically say that as often as it likes and know that it won't, because we've actually had quiet conversations with them and they've made it abundantly clear that we can go whistle."

http://www.bit-tech.net/bits/interviews/2010/01/06/interview-amd-on-game-development-and-dx11/
 

Raishi

Distinguished
Jan 12, 2010
[citation][nom]scrumworks[/nom]Wrong. Nvidia is not giving physX for AMD. AMD's Development Relations manager Richard Huddy said like this:"[Nvidia] put PhsyX in there, and that's the one I've got a reasonable amount of respect for. Even though I don't think PhysX - a proprietary standard - is the right way to go, despite Nvidia touting it as an "open standard" and how it would be "more than happy to license it to AMD", but [Nvidia] won't. It's just not true! You know the way it is, it's simply something [Nvidia] would not do and they can publically say that as often as it likes and know that it won't, because we've actually had quiet conversations with them and they've made it abundantly clear that we can go whistle."http://www.bit-tech.net/bits/inter [...] -and-dx11/[/citation]

Thanks for that article; very interesting, and Richard Huddy made for quite a good interview, as well as coming across as very likable. The actual PhysX thing is Nvidia's word against ATI's, though. I'd love to hear from the NGOHQ team that was trying to port PhysX to ATI hardware and see how much support Nvidia has actually been giving them.
 

maximiza

Distinguished
Mar 12, 2007
As an Ageia card owner, I can tell you Nvidia blocks the Ageia card from working alongside an ATI video card at the driver level. Nvidia betrayed all Ageia card owners and will pay dearly one day.
 
Guest
Yes, a GPU has more raw power to do physics, I know that. But can you please name me games where disabling PhysX would result in 100% load on four cores? None. In fact, Crysis physics running on the CPU doesn't even go over 50% load on two cores, and that's with a Q9550, not an i7. What's the CPU load of a quad core in Batman with PhysX ON and OFF? http://www.tomshardware.com/reviews/batman-arkham-asylum,2465-10.html Strange that an i7 is getting double the load of a Phenom. It is much better to let the CPU do physics calculations and let the GPU do other work in order to improve FPS.
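If you want to check this sort of claim yourself, here's a rough, self-contained sketch (my own, not from any game) that estimates how many cores a workload actually keeps busy by comparing process CPU time with wall-clock time. Note that std::clock reports process CPU time on POSIX systems, which is what this relies on:

```cpp
// Estimate effective core usage: cpu_time / wall_time ~= busy cores.
#include <chrono>
#include <cmath>
#include <cstdio>
#include <ctime>
#include <thread>

int main() {
    const unsigned cores = std::thread::hardware_concurrency();
    const std::clock_t cpu0 = std::clock();          // process CPU time (POSIX)
    const auto wall0 = std::chrono::steady_clock::now();

    // Stand-in workload; replace with the physics step you want to profile.
    volatile double sink = 0.0;
    for (long i = 0; i < 200000000L; ++i) sink += std::sqrt((double)i);

    const double cpu  = double(std::clock() - cpu0) / CLOCKS_PER_SEC;
    const double wall = std::chrono::duration<double>(
        std::chrono::steady_clock::now() - wall0).count();

    // 1.0 means one core fully busy; 'cores' means every core saturated.
    std::printf("busy cores: %.2f of %u (%.0f%% total load)\n",
                cpu / wall, cores, 100.0 * cpu / (wall * cores));
}
```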
 

Pei-chen

Distinguished
Jul 3, 2007
[citation][nom]elie3000[/nom]Yes a GPU has more raw power to do Physics I Know that But can you please name me games where Disabling PhysX would Result in a 100% Load of 4 Cores?None, infact Crysis physics using the cpu does not even go over 50% load on 2 cores and that is with a Q9550 not an i7. Whats the CPU load of a quad core in batman with PhysX ON and OFF?http://www.tomshardware.com/reviews/batman-arkham-asylum,2465-10.html Strange that an i7 is getting double load of a phenom. It is much better to let the CPU do physics calculations and let GPU do other work in order to improve FPS.[/citation]
You just don't get it, do you? The point is to let the CPU handle as few tasks as possible and hardware-accelerate everything else: GPU-accelerated Flash, GPU-accelerated HD decoding, and so on.

And it doesn't matter that game developers can't properly support multi-core CPUs; if they can't even get their games running properly across multiple cores, what are the chances they can get physics running correctly on them?
 

back_by_demand

Splendid
Jul 16, 2009
[citation][nom]maximiza[/nom]Having an Ageia card I can tell you Nvidia stops the Ageia card working with an ATI video card at the driver level. Nvidia betrayed all Ageia card owners and will pay dearly one day.[/citation]
I think they already are.
If you want to use it you have to stick with nVidia, so if you upgrade you have to NOT buy a new ATI HD5000 series card. People aren't that stupid, so they buy the HD5000, stop using the Ageia card and get on with their lives. In the meantime, nVidia is left behind with warehouses full of unsold cards and a Fermi product that exists only on paper, apart from the fake mock-up they tried to push out at a show. Sorry matey, if you want to get back to big sales you need to stop using cardboard cut-outs and produce a card that is at least 2 of the following 3:
1) Faster
2) Cheaper
3) Cooler
 

fulle

Distinguished
May 31, 2008
[citation][nom]randomizer[/nom]What's a game that uses PhysX for more than special effects? If I knew one, I could probably tell you if it looks good or not. As yet, I can only think of Batman but that's just smoke and flying paper. It's hardly a game changer.[/citation]

A good game? Uhm... there isn't one. But Mirror's Edge has some noticeable PhysX effects, notably those involving breaking glass... and strangely, the game seems to take every opportunity to put you in environments with glass everywhere, sometimes to the point where it's almost ridiculous. Yay for TWIMTBP games!
 

blackened144

Distinguished
Aug 17, 2006
[citation][nom]spinaltap11[/nom]I'm quite startled by the large number of comments here that completely miss the point. I think that Nadeem Mohammad pretty clearly rebuts the original statement. If PhysX implementations are not running well on multicore, it's because the game developers did not implement multicore code for PhysX. He even cites one example of a developer who has implemented multicore PhysX code properly (Futuremark). It's not up to NVIDIA to enforce programming practices at game companies for CPUs in the "The Way it's Meant to be Played" campaign, and it's silly to assume that they would do that.[/citation]

Some people just can't miss an opportunity to bash Nvidia. Same thing with Intel.
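For context, the Nvidia statement quoted above ("thread control is done explicitly by the application developer") means something quite specific: the SDK runs on whatever threads the game hands it. Here's a hypothetical sketch of the difference; this is not the actual PhysX API, and the function names are invented:

```cpp
// Illustration only (not the PhysX SDK): when a library leaves thread
// control to the application, multi-core scaling is the developer's job.
#include <functional>
#include <thread>
#include <vector>

// Pretend SDK entry point: simulates one independent island of bodies.
// The library spawns no threads; it runs on whatever thread calls it.
void simulate_island(std::vector<float>& velocities, float dt) {
    for (auto& v : velocities) v -= 9.81f * dt;
}

int main() {
    std::vector<std::vector<float>> islands(8, std::vector<float>(10000, 0.0f));

    // Developer choice A: single-threaded (what many 2009-era titles shipped).
    // for (auto& isl : islands) simulate_island(isl, 1.0f / 60.0f);

    // Developer choice B: independent islands stepped in parallel.
    std::vector<std::thread> workers;
    for (auto& isl : islands)
        workers.emplace_back(simulate_island, std::ref(isl), 1.0f / 60.0f);
    for (auto& w : workers) w.join();
}
```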
 

fulle

Distinguished
May 31, 2008
[citation][nom]Regulas[/nom]I agree with your post. By your flames (Thumbs Down) I would say this is a AMD/ATI fanboy site. Quote from Nvidia in article, "Our PhysX SDK API is designed such that thread control is done explicitly by the application developer"I agree with you. Nvidia says it lets the game developers handle the code but for some reason theses tards on this site still want to blame Nvidia.[/citation]

Games with PhysX support are typically TWIMTBP titles... which have been known to intentionally remove normal features, like AA, on ATI cards. Sometimes the "optimizations" go as far as scrapping DX10.1 support. But I'm sure Nvidia has nothing to do with that. /sarcasm

Nvidia could be trying to build a better future for parallel computing and physics effects right now. HOWEVER, they instead chose to buy Ageia and leverage a proprietary API. They demonstrated clearly what their intentions are for this API when they DISABLED PHYSX in the presence of a competitor's GPU.

They have no intention of letting this become an open standard. Even if AMD licensed PhysX, Nvidia would go out of their way to ensure its performance was inferior on AMD's products. They wouldn't work with AMD. It'd be a freaking nightmare.

As consumers, we need to stop supporting this BS right now; otherwise we'll be stuck with another freaking 3dfx Glide situation.
 
Guest
Pei-chen, you're not getting what I wrote. Read my post again and you'll understand that hardware-accelerated physics has NO advantage over CPU physics on ANY quad-core system. Please don't repeat yourself. I'm not really bashing nVidia; in fact, I'll be the first to buy a Fermi if the performance and price are right, like they were with the GTX 260. But PhysX is useless (yes, I tried it with the 260) and a marketing strategy in my book. I buy a GPU for its raw performance. Just my 2 cents.
 

back_by_demand

Splendid
Jul 16, 2009
[citation][nom]elie3000[/nom]Pei chen your not getting what I wrote. Read again my post and you would understand that hardware accelerate physics has NO advantage over CPU physics with ANY Quad Core System. Please dont repeat your self again. I am not really bashing nVidia, in fact I will be the first to buy a Fermi if performance and price is right like it was with the GTX 260.But PhysX is useless(Yes I tried it with the 260) and a marketing strategy on my list I buy a GPU card because of the raw and performance power just my 2 cents[/citation]
I think I understand the point you are making. People don't buy a GFX card because it has PhysX; they buy it because it pumps out more frames at higher resolutions. Putting a PhysX badge on a card that performs slower is like putting spinners and a custom paint job on a Ford POS.
 

Ephebus

Distinguished
Apr 14, 2008
As an (ex-)NVIDIA fanboy, I'd like to offer a simple example of how that company is no different from any other company that wishes to dominate a specific market through anti-competitive practices.

After buying a new motherboard based on an NVIDIA chipset, I did what I always do when I get a new piece of hardware and went looking for a few basic technical documents on the component, only to find there were NONE available from NVIDIA. I then started a thread on the NZone forums, only to be swamped by a horde of NVIDIA fanboys, believers, cultists and, probably, employees trying to sell me the ridiculous argument that it was up to the motherboard manufacturer to offer technical documents on the chipset, which would in most cases also be legally impossible, since those documents are usually copyrighted by the original component manufacturer.

What I'm talking about here is really basic technical info, like a datasheet stating how much heat the chipset can endure. I then contacted NVIDIA's support, and the level 1 guys came up with the same misleading arguments. That did it for me, so I registered the domains www.boycott-nvidia.com and www.boycott-nvidia.org (the latter will be the main site, with the .com redirecting to it) and told them about it. The case was then escalated to what they call the "concern" level, and I finally got the correct reply: NVIDIA doesn't make those types of documents available, and doesn't allow manufacturers of products that include NVIDIA components to do so either, because they are all bound by NDAs (non-disclosure agreements). So Gigabyte (my motherboard's manufacturer), for example, isn't even allowed to officially tell me the maximum working temperature of the NVIDIA chipset on it, unless that information has already been made public by NVIDIA.

Now of course AMD, Intel and every other company out there would love to control (or keep their current control of) every market their products reach, BUT if you go to AMD's or Intel's (or JMicron's, ITE's, Marvell's, Realtek's, Silicon Image's, etc.) web sites, you WILL find a reasonable amount of technical documentation for all the components these companies produce. Only NVIDIA refuses to follow the same route.

I'm currently gathering material for the site (right now there's only a very basic page online) and hope to have it going in a month or so. My intention is not to harm NVIDIA in any way, but rather to contribute, within my limited possibilities, towards putting some pressure on the company to change the way it sees and treats the very people who buy its products. I'll be glad to take the sites offline as soon as I see that NVIDIA has started thinking outside the shell it's living in.
 

mlauzon76

Distinguished
Nov 18, 2008
Here, let me fix the first line of Nadeem Mohammad's quote:

"I have been a member of the PhysX team, first with AEGIA, and then with Nvidia, and I can honestly say that I am being paid to say there's nothing wrong with the SDK, but what AMD says is actually the truth!"
 

knowom

Distinguished
Jan 28, 2006
Nvidia's response: "Tough titty, said the kitty, when the milk went dry."

The bottom line is that PhysX is a good API right now, proprietary or not, but AMD is pissed because they want a piece of the pie.

Whose fault is it that AMD hasn't bothered to come out with its own physics API?
 
G

Guest

Guest
Considering ATI released OpenCL support for their cards last month, supposedly a direct open-standards competitor to CUDA (which PhysX runs on), it will be interesting to see whether someone adapts the PhysX SDK to it and gets it running on ATI cards.
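For anyone curious what such a port would involve at the lowest level, here's a hedged sketch of generic OpenCL host code running a toy physics kernel. This is plain OpenCL, not the PhysX SDK; the kernel and buffer names are invented, real PhysX is vastly more complex, and error handling is omitted for brevity:

```cpp
// Toy OpenCL "physics": integrate gravity for N bodies on whatever GPU
// the driver exposes, vendor-neutral by design.
#include <CL/cl.h>
#include <cstdio>
#include <vector>

static const char* kSrc =
    "__kernel void integrate(__global float* py, __global float* vy,\n"
    "                        const float dt) {\n"
    "    size_t i = get_global_id(0);\n"
    "    vy[i] -= 9.81f * dt;      // gravity\n"
    "    py[i] += vy[i] * dt;      // position update\n"
    "}\n";

int main() {
    const size_t n = 4096;
    std::vector<float> py(n, 100.0f), vy(n, 0.0f);
    float dt = 1.0f / 60.0f;

    cl_platform_id plat; cl_device_id dev;
    clGetPlatformIDs(1, &plat, nullptr);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, nullptr);
    cl_context ctx = clCreateContext(nullptr, 1, &dev, nullptr, nullptr, nullptr);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, nullptr);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &kSrc, nullptr, nullptr);
    clBuildProgram(prog, 1, &dev, nullptr, nullptr, nullptr);
    cl_kernel k = clCreateKernel(prog, "integrate", nullptr);

    cl_mem bp = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                               n * sizeof(float), py.data(), nullptr);
    cl_mem bv = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                               n * sizeof(float), vy.data(), nullptr);
    clSetKernelArg(k, 0, sizeof(bp), &bp);
    clSetKernelArg(k, 1, sizeof(bv), &bv);
    clSetKernelArg(k, 2, sizeof(dt), &dt);

    // One simulation step across all bodies, then read positions back.
    clEnqueueNDRangeKernel(q, k, 1, nullptr, &n, nullptr, 0, nullptr, nullptr);
    clEnqueueReadBuffer(q, bp, CL_TRUE, 0, n * sizeof(float), py.data(),
                        0, nullptr, nullptr);
    std::printf("body 0 height after one step: %f\n", py[0]);
    return 0;
}
```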
 

BoxBabaX

Distinguished
Sep 30, 2009
[citation][nom]Ciuy[/nom]Physyx is doomed anyway. Just like Lucifer ultis you at lvl 6 when u got 200hp . Dead... Dooomed[/citation]


That was an incredibly cheesy DotA reference. I love it. :D
 

bounty

Distinguished
Mar 23, 2006
Would testing PhysX CPU utilization with Shattered Horizon help? Since it's made by Futuremark, I'd assume their programmers would use multi-core PhysX, given that they apparently use it in their benchmark.
 
Guest
[citation][nom]ptroen[/nom]As an amateur game developer I was intrigued by PhysX since it's a significantly cheaper route than Havok. However, I found that PhysX just has more problems than it's worth. For instance:
1) PhysX makes heavy use of the PCI bus when in pure hardware mode. How fast can it really be if you're utilizing the PCI bus, which has a maximum bandwidth of 133 megabytes/second (or 4 megs per frame)?
2) Nvidia has already been caught locking the competition out of Ageia physics.
3) The PCI Express bus is described by Microsoft as SLOW and needs to be shared with the graphics card.
4) There is no direct HLSL interface to the physics (you have to use a C++ call to get around it).
5) Bullet physics is free and offers cross-platform GPU-based physics.
6) To write custom physics with Ageia you need to write an event handler that is invoked by the C++ API on a PER ACTOR/ENTITY basis, which is a problem if you want LOTS of entities/actors.
So yeah, not too crazy about Ageia, and Havok is costly as well. Anyway, that's my 2 cents.[/citation]

True that. Not to mention the fact that it explodes and goes mental if objects are even slightly interpenetrating, something I haven't seen other engines do. Bullet looks promising (GTA IV and the film 2012 used it quite successfully). It's also going to get OpenCL support soon enough, which means PhysX's days as the only GPU physics in town are numbered. No wonder AMD is backing it :)
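As a side note, the PCI bandwidth figure in the quoted post checks out; a quick sketch of the arithmetic, using the poster's numbers rather than any measurement:

```cpp
// Per-frame data budget over legacy PCI: bandwidth divided by frame rate.
#include <cstdio>

int main() {
    const double pci_bw_mb_s = 133.0;  // classic 32-bit/33 MHz PCI
    for (double fps : {30.0, 60.0})    // ~4.4 MB/frame at 30 fps, ~2.2 at 60
        std::printf("%.0f fps -> %.2f MB per frame\n", fps, pci_bw_mb_s / fps);
}
```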
 