Nvidia Disables PhysX in Presence of ATI GPU

[citation][nom]schizofrog[/nom]No, this only regards those with ATI AND nVidia GPUs in the same machine. The chipset is something completely different, so have no fear.[/citation]
I wouldn't put it past them to require an Nvidia nForce motherboard/chipset down the road.

Besides, we still have Havok, which runs on the CPU, so this isn't the end of physics in games; there's another solid, viable option. It just sucks that the big N would shut some of us out like this.
 
SOOOO EASY TO HACK/PATCH!

Step 1: Find the detection logic (probably something like "check what other drivers or device vendor IDs are present in the graphics device section")
Step 2: jmp or NOP it out

They probably don't have a checksum. A rough sketch of what that detection might look like is below.
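
For anyone wondering what that detection might even look like, here's a hypothetical user-mode sketch in C. The PCI vendor IDs (10DE for Nvidia, 1002 for ATI/AMD) are real, but the rest is illustrative guesswork, not Nvidia's actual driver code, and how DeviceID gets populated varies by Windows version:

[code]
/* Hypothetical sketch only: enumerate display adapters the way a
 * detection routine might, and flag any ATI/AMD GPU by PCI vendor ID.
 * VEN_10DE (Nvidia) and VEN_1002 (ATI/AMD) are the real IDs; the
 * logic itself is a guess, not Nvidia's actual code.
 * Build (MinGW): gcc detect.c -o detect.exe */
#include <windows.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    DISPLAY_DEVICEA dd;
    DWORD i;
    int ati_present = 0;

    for (i = 0; ; i++) {
        memset(&dd, 0, sizeof dd);
        dd.cb = sizeof dd;
        if (!EnumDisplayDevicesA(NULL, i, &dd, 0))
            break;
        /* DeviceID typically looks like "PCI\VEN_10DE&DEV_..." */
        printf("adapter %lu: %s (%s)\n",
               (unsigned long)i, dd.DeviceString, dd.DeviceID);
        if (strstr(dd.DeviceID, "VEN_1002"))
            ati_present = 1;
    }

    /* In the driver, a branch like this is the thing you'd jmp/NOP over. */
    if (ati_present)
        puts("ATI adapter found -> PhysX disabled");
    else
        puts("no ATI adapter -> PhysX stays on");
    return 0;
}
[/code]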
 
[citation][nom]warmon6[/nom]I don't think it was ever really intended to run with an ATI card. It just happened that you could.[/citation]
It was never intended to work on an Nvidia card either. Back when it was Ageia, you had to buy their dedicated cards to run their PhysX software. Nvidia rewrote the code after buying them out and made it work on graphics cards. Apparently, after realizing the loophole was that it worked on graphics cards in general and not solely Nvidia cards, they decided to shut the competition down.
 
As someone just said, I hope an open platform wins this round, so a big plus for OpenCL. And if PhysX stays an Nvidia-only feature, the Betamax story really could happen again... ATI doesn't have the firepower to shoot Nvidia down (which is good, because the same is true in reverse), but Intel is coming on strong with Larrabee, and it definitely won't be happy competing against a restricted standard. If and when Intel gets into the GPU business in a big way, times could get hard for PhysX, with Intel pumping money into some competing physics standard...
 
I love my EVGA graphics cards, but NVIDIA has finally pushed me over the edge. They highly irritated me when they refused to adopt DX10.1 and made excuses that it wasn't worthwhile. Then the Assassin's Creed sketchiness went down, where the DX10.1 features were mysteriously removed (draw your own conclusions, but it made me wary that Nvidia was involved in some way or another), and then they fucked up the packaging on a fuckton of their GPUs so that they failed like crazy...

Now they're doing everything they can to cockblock DX11, and they're blocking PhysX features on a card you paid them for because you also purchased an ATI product?

That's enough. You know what? FUCK NVIDIA. I don't care if ATI has an inferior product or not this time around. I'm buying 5850s regardless of what the GT300 does.
 
[citation][nom]supertrek32[/nom]P...PhysX, pay Nvidia a licensing fee, and make your game only run at its best for half your consumers, or use OpenCL for free and ensure everyone can use it. There's no reason any sane developer would opt for PhysX over OpenCL. Unless there's a major overhaul, PhysX is dead.[/citation]
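
For what it's worth, the "free OpenCL" path really is that accessible. Below is a minimal, hypothetical sketch: a plain-C host program plus an OpenCL C kernel doing one simple Euler physics step. Every name in it is illustrative (not from any real engine), error checking is omitted for brevity, and the only point is that a vendor-neutral kernel like this runs on any OpenCL-capable GPU, ATI or Nvidia alike:

[code]
/* Hypothetical sketch, not production code. Link against OpenCL,
 * e.g. gcc sim.c -lOpenCL */
#include <CL/cl.h>
#include <stdio.h>

/* Tiny OpenCL C kernel: gravity + Euler integration per particle. */
static const char *src =
    "__kernel void integrate(__global float4 *pos,\n"
    "                        __global float4 *vel, float dt) {\n"
    "    size_t i = get_global_id(0);\n"
    "    float4 g = (float4)(0.0f, -9.81f, 0.0f, 0.0f); /* gravity */\n"
    "    vel[i] += g * dt;          /* accelerate */\n"
    "    pos[i] += vel[i] * dt;     /* move */\n"
    "}\n";

int main(void)
{
    enum { N = 1024 };
    static cl_float4 pos[N], vel[N];   /* zero-initialized particles */
    cl_platform_id plat;
    cl_device_id dev;
    float dt = 0.016f;                 /* roughly one 60 Hz frame */
    size_t global = N;

    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "integrate", NULL);

    cl_mem dpos = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                 sizeof pos, pos, NULL);
    cl_mem dvel = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                 sizeof vel, vel, NULL);
    clSetKernelArg(k, 0, sizeof dpos, &dpos);
    clSetKernelArg(k, 1, sizeof dvel, &dvel);
    clSetKernelArg(k, 2, sizeof dt, &dt);

    /* Run one physics step across all particles, then read back. */
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, dpos, CL_TRUE, 0, sizeof pos, pos, 0, NULL, NULL);

    printf("particle 0 y after one step: %f\n", pos[0].s[1]);
    return 0;
}
[/code]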
 
I was thinking about doing this just for kicks, but I guess that idea is out. I'm not that familiar with the nVidia driver releases, but couldn't someone just revert to an older version of the driver that doesn't have this crippling "feature"? Use release 185 or something?
 
Well, technically PhysX is better, but so was Betamax compared to VHS...

Closed standards are a hard sell in the long run if only one GPU company can make cards that utilise them. I hope Nvidia makes PhysX an open platform for any GPU maker, but it is definitely making that hard to achieve.
 
What happens if you have an AMD IGP while running two Nvidia cards in SLI?
 
Has anyone tried ATI + Nvidia + an older Nvidia driver (185 or earlier)? Does it still work? It would be nice to have PhysX enabled with my ATI setup.
 
Something I haven't read here so far: Lucid's upcoming Hydra 200 chipset, which is going to be included in some new high-end motherboards and could be an SLI/CrossFire killer, could be seriously impacted by this news.
 
I understand why they did it, but it's a really BAD idea to be doing that to people who buy your graphics cards. They will likely lose more sales than they gain by doing this.

I can understand it just not working, but to deliberately disable it... dumb.
 
Wait... you can use an AMD card as the primary and leave an nVidia card as a dedicated PhysX card in another slot? That's the impression I'm getting.
 
Apparently you can, according to ngohq.com.

Can Tom's do an article on an AMD card for rendering plus an Nvidia card for PhysX? I'm actually quite interested now to see whether this kind of setup is viable, even if I have to use old nVidia drivers.
 
[citation][nom]togenshi[/nom]Wait... you can use an AMD card as the primary and leave an nVidia card as a dedicated PhysX card in another slot? That's the impression I'm getting.[/citation]

You SHOULD be able to do just that, but that is precisely what nVidia is disallowing: you could have an nVidia card, but if its drivers detect that you also have an ATI card, NO PHYSX FOR YOU!
 
[citation][nom]08nwsula[/nom]I think someone is jealous[/citation]

You know that "Nvidia" pronounced in Spanish sounds like "envidia," which means jealousy?

Not that they are, of course. They can always rebrand and relaunch the G200 as the G300 :)
 
A few interesting topics:

1. ATI owners usually say that PhysX is crap and does nothing to improve the gaming experience, yet they are very angry when something happens and they can't use it anymore.

2. ATI owners think that because PhysX happened to work when you used an nVidia card for PhysX and an ATI card for rendering, this scenario was a supported feature. A supported feature is listed in official documents, on the manufacturer's website, and on the product box. An accident is not a feature.

3. nVidia works on their drivers all the time. They have an obligation to their customers to make improvements and add new features. That obligation does not extend to ensuring compatibility with competing products (that would be dumb).

4. ATI owners think that PhysX should die because they have Havok. Havok is PhysX for ATI under another name, so why should PhysX die while Havok lives? OpenCL and/or DX11 is another matter: nVidia already supports OpenCL on a variety of platforms, something ATI doesn't yet do very well. DX11 support will come when their DX11 hardware comes, in about three months.

5. The ability to use PhysX while using an ATI card for rendering was removed when the new DX11 cards launched. Did any of you whiners test PhysX with a DX11 ATI card doing the rendering? Could it be that ATI made changes that produced so many bugs that nVidia called it quits? What is ATI doing to ensure compatibility with PhysX? Does ATI support PhysX when you're using an nVidia card for the processing? Do their official documents or website state that they are involved in this matter in any way?
 
Mmmm, this move is going to be risky.
Nvidia is hoping it will boost sales among those who want PhysX.
The risk is that people will find it off-putting and not buy Nvidia cards. If that happens, GG Nvidia: fewer developers may target Nvidia because of this.
I like ATI.
 
Is there any legal action that can be taken here? I'll never have this setup, but anti-competitive practices such as this need to be squashed as a matter of principle. I'm buying an ATI card next time; this isn't the first story of questionable business ethics I've heard about NVIDIA.
 