Nvidia: We Didn't Bribe Anyone to Use PhysX

[citation][nom]jacobdrj[/nom]Why is PhysX any different than, say, MMX?[/citation]Well, due to cross-licensing agreements, AMD and Intel processors both got native MMX support. Intel developed it and deployed it first, but AMD followed up quickly. PhysX, by contrast, is entirely under Nvidia's control. So unless they decide to officially port their PhysX "middleware"/API to DirectCompute, it won't be running on both platforms. We'll have to wait for someone like Havok to step in and offer a DirectCompute or OpenCL solution that runs on both, officially, without unsupported hacks.

Also, I still fail to see why they blocked the use of a secondary Nvidia card as a PhysX accelerator alongside a primary AMD card for rendering.
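To put it concretely, a cross-vendor physics kernel in OpenCL would look something like this (my own rough sketch, not anything from Nvidia or Havok). The same source string gets compiled at runtime by whichever OpenCL driver is installed, AMD or Nvidia, and that's exactly the portability CUDA-only PhysX lacks:
[code]
// Hypothetical gravity-integration kernel in OpenCL C. The host hands
// this string to clCreateProgramWithSource()/clBuildProgram(), and the
// vendor's own driver compiles it for whatever GPU is present.
const char* kIntegrateSrc = R"CLC(
__kernel void integrate(__global float4* pos,
                        __global float4* vel,
                        const float dt)
{
    size_t i = get_global_id(0);
    vel[i].y -= 9.81f * dt;   // apply gravity to velocity
    pos[i]   += vel[i] * dt;  // advance position
}
)CLC";
[/code]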
 
[citation][nom]dskljdsfkl23[/nom]Lies. PhysX can work using the CPU too... but, ofc, runs much faster on an NVIDIA GPU.[/citation]Have you tried enabling software PhysX in a demanding game? If it even lets you, it will be unplayable. That's like saying "Sure, my 486 can play DVDs! It just plays them slower than a modern computer."

Although maybe if we use a $1,000 Intel hex-core CPU, we can get the same PhysX performance as a $50 Nvidia secondary card.
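Here's a back-of-the-envelope sketch of why that's a losing battle (my own illustration with made-up numbers, not real PhysX code). At 60 fps you get maybe 5 ms of frame time for physics, and a lone CPU core has to walk every particle in sequence:
[code]
// Single-threaded CPU fallback: one core steps every particle in turn.
struct Particle { float x, y, z, vx, vy, vz; };

void step_cpu(Particle* p, int n, float dt)
{
    for (int i = 0; i < n; ++i) {
        p[i].vy -= 9.81f * dt;   // gravity
        p[i].x  += p[i].vx * dt;
        p[i].y  += p[i].vy * dt;
        p[i].z  += p[i].vz * dt;
    }
}
// With n = 50,000 debris particles this loop runs 50,000 iterations
// per frame on one core; a GPU launches one lightweight thread per
// particle and retires them in parallel across hundreds of shader cores.
[/code]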
 
Maybe if AMD/ATi hadn't talked PhysX down first, Nvidia wouldn't have shut them out.
 
I would have liked them to state the prices they're charging for the middleware. Before Nvidia bought PhysX from Ageia, Ageia was giving the PhysX middleware away for free to push their cards. So you had Havok, which cost a quarter million; Ageia, which was free and did "real world" physics; or you could roll your own and add an additional 6-12 months to your dev time. If Nvidia had come out and put it like that, it would have appeared to be more of a decision made by the devs and not a total load of horse hockey put out by a corporate spin machine...
 
Bless Nvidia's heart. If only we could all be as honest as they are, the world would be a much, much better place.
 
[citation][nom]builderbobftw[/nom]I want him to explain how Crysis Warhead got optimized for Nvidia hardware and not ATI.[/citation]
Shouldn't it be AMD doing that explaining? I'm not sure nVidia could properly comment on why ATI didn't do something...

I run Nvidia, but PhysX is a joke right now... and I'm sure it will continue to be until it's either supported by ATI or Nvidia makes their own patches for each game like 3dfx used to do. Havok works on all video cards (err, well, CPUs); PhysX only works on Nvidia. If I were a game developer, Havok would be a no-brainer.

I've never owned a game that supported PhysX. The ones that do just haven't been my type of game so far. I really wish Company of Heroes supported it.
 
...and do you all remember a few months back when Nvidia released drivers that blocked an Nvidia GPU from working in tandem with an ATI card as a PhysX card in Win7?
You need to try harder, Nvidia, to convince us of your "fair" practices...
 
OK, so ATI is Nvidia's direct competitor. Claiming that they're bribing game developers because they don't want to hand something they paid money for over to a competitor is just silly, people. Disabling the ability to use PhysX with a mixed ATI/Nvidia pair goes down the same competitive-edge path. Hell, otherwise someone could buy (or talk a friend out of) the cheapest Nvidia card that supports PhysX and keep using it while buying big-dollar ATI cards (or multiples, due to upgrades, etc.).
 
[citation][nom]mousemonkey[/nom]AMD did not want to use PhysX long before Nvidia blocked their cards from using it.http://www.tgdaily.com/business-an [...] evelopment[/citation]
And now we even understand why.
 
[citation][nom]mousemonkey[/nom]Maybe if AMD/ATi hadn't talked PhysX down first, Nvidia wouldn't have shut them out.[/citation]
Right, it's ATI's fault that it prefers open/third-party standards to nVidia's homebrew crap...
 
[citation][nom]darkguset[/nom]ATI already doing it, HAVOK[/citation]

The last game I played with that (Demon's Souls), everything moved like a rag doll. Havok physics suck.
 
PhysX can run on any hardware... it's not something implemented in the video card! Remember, the original ForceWare drivers had no PhysX for the 8 and 9 series until Nvidia acquired PhysX. It's only a piece of software that runs on GPUs instead of CPUs; most likely, if some programmer put his mind to work, he could include PhysX in ATI drivers. At the moment Nvidia brags about a thing that even ATI cards could do, since it isn't hardware-related; it's a fukin piece of software made to run only on Nvidia cards.
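Concretely, the GPU path is CUDA code along these lines (a hypothetical sketch of mine, not actual PhysX source). The math itself has nothing Nvidia-specific in it; it's the CUDA packaging, which only nvcc can build and only Nvidia hardware can launch, that locks it to their cards:
[code]
// Hypothetical CUDA kernel: one thread integrates one particle.
__global__ void integrate(float4* pos, float4* vel, float dt, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    vel[i].y -= 9.81f * dt;       // gravity
    pos[i].x += vel[i].x * dt;
    pos[i].y += vel[i].y * dt;
    pos[i].z += vel[i].z * dt;
}

// Host-side launch syntax; nothing in ATI's driver stack can run this,
// so "including PhysX in ATI drivers" means rewriting the GPU back end
// in OpenCL or DirectCompute first:
// integrate<<<(n + 255) / 256, 256>>>(d_pos, d_vel, dt, n);
[/code]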
 
[citation][nom]mousemonkey[/nom]AMD did not want to use PhysX long before Nvidia blocked their cards from using it.http://www.tgdaily.com/business-an [...] evelopment[/citation]

I see... So you're sayin' nVidia might have talked with AMD so they would support the PhysX "middleware", and AMD said "hell no!"?

So, you're saying that AMD refused the possibility to "buy" support for the PhysX "middleware"? I can understand why they would do that, actually. Havok, Bullet, and DX11 on the roadmap (at that time, I think) would make much more sense than buying PhysX from nVidia =/

Cheers!
 
Frankly, I am more pissed off at developers that decide to go with Nvidia PhysX over making a cross-platform game. Metro 2033 advertisements are rife with pro-Nvidia GPU/PhysX comments and media. It goes as far as to introduce doubt that ATI/AMD will even run the game well by saying you need an Intel i7 and an Nvidia card for recommended settings. Unfortunately, I really want to play the game, so I will probably buy it anyway, but I hate how some developers are basically sycophants to one or the other graphics company. It ends up screwing the consumer over.
 
[citation][nom]Shin0bi272[/nom]ahh nice try http://www.nzone.com/object/nzone_physxgames_all.html[/citation]

The majority of those games are terrible, unless you have your heart set on the new 50 Cent game. Nice try yourself.
 
I posted a question about PhysX and its price on the CUDA dev forum. I was informed by a member there (with a link) that the PhysX SDK is 100% free for private or commercial use. So Nvidia doesn't have to bribe anyone to use it, because it's free!

The old price scale is still valid: $250,000 for Havok; 6 months to a year of dev time to roll your own (which is probably worth about $250,000) plus numerous headaches; or drop in PhysX for free and get real-world GPU-accelerated physics... gee, tough choice...
 
[citation][nom]d-block[/nom]The majority of those games are terrible, unless you have your heart set on the new 50 Cent game. Nice try yourself.[/citation]
That's not the point... he said that PhysX is only in 15 games, trying to insinuate that PhysX wasn't that big of a deal because it was only used in a few engines. It's in Unreal Engine 3, which means Gears of War 1 and 2 use it, and any other game based on Unreal Engine 3 uses it. It's also not PC-only, so he was cherry-picking his source to support his claim, and that's dishonest.
 

Re-read the article: they were approached by a third party, and they declined to help.

They didn't want it, so they don't get it, and now they complain. That's how I see it. I know it's not a POV shared by everybody, but that's the beauty of personal opinions: there are so many of them.
 
This is right up there with their "our laptop GPUs are NOT overheating..." Yeah, right! It's hard to buy this particular bridge after such epic fails...
 
It's a real PITA, but the architectures are so different that I'm not surprised some things only work with one card or the other. As far as bribery goes? It's not bribery if you call it a "financial incentive", right?
 