News PhysX quietly retired on RTX 50 series GPUs: Nvidia ends 32-bit CUDA app support

Just a quick note: Nvidia didn't invent PhysX. They bought Ageia, and Ageia didn't develop it themselves either; the engine was created by a Swiss company named NovodeX AG, which Ageia acquired. Ageia then produced and sold stand-alone physics accelerator cards built around the NovodeX engine. Nvidia eventually bought Ageia in 2008 and implemented an API that allowed only their own GPUs to use the PhysX engine, and that's where it died.

https://en.wikipedia.org/wiki/PhysX
https://en.wikipedia.org/wiki/Ageia
 
PhysX was just the coolest... It wasn't a gimmick; it genuinely made a huge impact on the quality of the experience. I remember playing Arkham Asylum first without PhysX, and then later with it (once I upgraded my GPU). My jaw literally dropped. But then again, I don't think ray tracing is a gimmick, so what do I know.
 
Man, seems like I want to stick with a 40 series card then. Why upgrade for more drama and less functionality?
I like the Arkham games, I like Mirror's Edge.
 
So, genuine question... Does this mean that modern non-PhysX games have less physics capability than those old PhysX-supported titles, or was PhysX basically made obsolete by the increased capabilities of modern CPUs?
I think PhysX was made obsolete by lack of use.
Most of the implementations I've seen were bad, but there were a few pretty good ones, like the smoke in the first two Metro games, the fire in The Witcher 3 before they stopped using GPU PhysX, or Batman: Arkham Knight. My GT 730 (PCIe x1) running the PhysX load could keep up with SLI 1080 Tis handling the rasterization in PhysX-supporting games. When GPU PhysX stopped working in The Witcher 3, my CPU load went way up and the flames dropped their refresh rate. There are a ton of games that burn a lot of GPU processing doing a worse job at smoke than some passive trash-tier card could do with PhysX. It could have helped games look better on cheaper hardware, but it just wasn't used much. A lot of uses were things like chips that would appear for you to scatter, or some out-of-place piece of paper strangely flapping in very localized wind.

But if you really want to run PhysX in some old game with a new 50 series GPU, you could probably just pick up a $20 Quadro K620 off of eBay and have PhysX run on that card. I'm not totally sure about this one, since there might be a driver lock preventing you from doing it with a 50 series, but it works with my 3080.
 
So, genuine question... Does this mean that modern non-PhysX games have less physics capability than those old PhysX-supported titles, or was PhysX basically made obsolete by the increased capabilities of modern CPUs?
This issue is about GPU PhysX specifically. Personally, I think GPU PhysX became less relevant once Nvidia came out with PhysX 3, which had improved performance and much better multi-core CPU support. Some of the effects Nvidia claimed could only be done effectively on the GPU with the older PhysX 2 can now be done on the CPU with PhysX 3, as long as you don't go crazy/excessive with it. Personally, I think it always could have been done on the CPU, like what Havok did with the original Red Faction game. Some GPU PhysX effects were excessive to the point that I think they only ruined the look and performance rather than enhancing it (like the massive flying bricks in the Scarecrow level in Arkham Asylum).
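For the curious, here's roughly what that multi-core CPU path looks like against the public PhysX SDK. Names follow the 3.x/4.x-era API as I remember it, so treat this as an illustrative sketch rather than gospel; the whole thing runs without CUDA:

    #include <PxPhysicsAPI.h>
    using namespace physx;

    static PxDefaultAllocator gAllocator;
    static PxDefaultErrorCallback gErrorCallback;

    int main()
    {
        // Boilerplate: foundation + physics objects.
        PxFoundation* foundation =
            PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
        PxPhysics* physics =
            PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

        // The PhysX 3 improvement in one line: a dispatcher that fans
        // simulation tasks out across several CPU worker threads.
        PxDefaultCpuDispatcher* dispatcher = PxDefaultCpuDispatcherCreate(4);

        PxSceneDesc sceneDesc(physics->getTolerancesScale());
        sceneDesc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
        sceneDesc.cpuDispatcher = dispatcher;
        sceneDesc.filterShader  = PxDefaultSimulationFilterShader;
        PxScene* scene = physics->createScene(sceneDesc);

        // Step at 60 Hz, entirely on the CPU -- no GPU involved.
        for (int i = 0; i < 600; ++i) {
            scene->simulate(1.0f / 60.0f);
            scene->fetchResults(true); // block until the step finishes
        }

        scene->release();
        dispatcher->release();
        physics->release();
        foundation->release();
        return 0;
    }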
 
People seem to be very confused about what PhysX actually is.

The PhysX everyone knows from marketing: some proprietary thing that uses the GPU to make flappy fabrics and some extra particles.

The PhysX actually implemented by hundreds of games: an open source CPU-based physics engine (like Havok et al) used by several different game engines.

The former is affected by the driver no longer supporting that particular branch of the API, but it was also barely used by any games in the first place, well over a decade ago. The latter is unaffected, because it never relied on the GPU in the first place.
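To make the split concrete: in the modern SDK, GPU acceleration is an explicit opt-in on the scene. Here's a hedged sketch using PhysX 4.x/5.x-era names (illustrative, not verbatim from any shipping game); everything outside the if-block is the plain CPU engine those hundreds of games ship with:

    #include <PxPhysicsAPI.h>
    using namespace physx;

    PxScene* createScene(PxFoundation& foundation, PxPhysics& physics)
    {
        PxSceneDesc desc(physics.getTolerancesScale());
        desc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
        desc.cpuDispatcher = PxDefaultCpuDispatcherCreate(4);
        desc.filterShader  = PxDefaultSimulationFilterShader;

        // The "marketing PhysX": only this block ever touches CUDA.
        PxCudaContextManagerDesc cudaDesc;
        PxCudaContextManager* cuda = PxCreateCudaContextManager(foundation, cudaDesc);
        if (cuda && cuda->contextIsValid()) {
            desc.cudaContextManager = cuda;
            desc.flags             |= PxSceneFlag::eENABLE_GPU_DYNAMICS;
            desc.broadPhaseType     = PxBroadPhaseType::eGPU;
        }
        return physics.createScene(desc);
    }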
 
So, genuine question... Does this mean that modern non-PhysX games have less physics capability than those old PhysX-supported titles, or was PhysX basically made obsolete by the increased capabilities of modern CPUs?
There are many games that use physics extensively via engines like Unity or Unreal Engine; under the hood it's not necessarily CPU-bound, as they can use OpenCL (or CUDA) to do the same things PhysX does.
But it's true that CPUs weren't powerful enough to run physics at a good framerate back then, and now they are (I tested it when Ageia first released it).

PhysX, once bought by Nvidia, gave them a marketing tool for CUDA, just as DLSS has been the marketing tool for RTX.
They contracted many game publishers to integrate it into their games, and it worked: back when ATI Radeon cards were more powerful than Nvidia GeForce cards, people were buying GeForce for PhysX.
 
Honestly, it's extremely short-sighted of Nvidia to drop this; it makes the 5000 series even less attractive:

Less functionality
More expensive
Broken out of the gate.

Nvidia's response is just the typical 'we don't care', lol.

I can see a lot of goodwill being flushed down the toilet. There are quite a few games that use that engine, and some are popular even today.
 
Honestly, it's extremely short-sighted of Nvidia to drop this; it makes the 5000 series even less attractive:

Less functionality
More expensive
Broken out of the gate.

Nvidia's response is just the typical 'we don't care', lol.

I can see a lot of goodwill being flushed down the toilet. There are quite a few games that use that engine, and some are popular even today.
This only affects a dozen games or so versus the thousands of games out there. Most games that use PhysX didn't use the GPU-accelerated features. If enough people care about this, there will probably be modders who update the games' 32-bit CUDA to the 64-bit version, just like those DLSS swappers. Even without GPU PhysX, the games will play just fine; they'll just lack some fancy graphical effects that have no bearing on the core in-game physics.
 
So, genuine question... Does this mean that modern non-PhysX games have less physics capability than those old PhysX-supported titles, or was PhysX basically made obsolete by the increased capabilities of modern CPUs?
Good question. PhysX can be run CPU-side, but not all of it. So it's proof of the declining desire to add physics to games; all they're worried about is lighting. Out with the old, in with the new, when we want both... welcome to the "future" of innovation. The customer is always right, yet customers have zero clue what their legal rights even are. Just like they planned, because your rights don't matter.
 
This only affects a dozen games or so versus the thousands of games out there. Most games that use PhysX didn't use the GPU-accelerated features. If enough people care about this, there will probably be modders who update the games' 32-bit CUDA to the 64-bit version, just like those DLSS swappers. Even without GPU PhysX, the games will play just fine; they'll just lack some fancy graphical effects that have no bearing on the core in-game physics.
True, but the real damage is that we now know we can't trust Nvidia to continue supporting technologies they get us to buy the cards for.

It's kind of like Microsoft arbitrarily killing WMR. It's pretty much only affecting those of us with Reverbs at this point, but we're having to junk perfectly good VR hardware because of a senseless decision by Microsoft to brick it for no reason.

Doesn't impact too many people, until the next time Microsoft wants to get into the VR-powered-by-Windows game, at which point people are going to remember this and buy something else.

This move is a red flag to developers: don't make a game that depends on GPU ray tracing or any other "only Nvidia cards can do it" tech because Nvidia will eventually kill it.

And yeah, by then your game will be old but... There are a whole lot of people still playing Batman and Witcher 3.
 
True, but the real damage is that we now know we can't trust Nvidia to continue supporting technologies they get us to buy the cards for.
This is nothing new; Nvidia has a long history of doing this, and 3D Vision is one example. While it's frustrating, I also can't blame them: it's a for-profit business. If you sink money into a certain tech but it isn't adopted widely enough to be profitable, you stop supporting it. But yes, it creates the vicious circle you described.
 
True, but the real damage is that we now know we can't trust Nvidia to continue supporting technologies they get us to buy the cards for.

It's kind of like Microsoft arbitrarily killing WMR. It's pretty much only affecting those of us with Reverbs at this point, but we're having to junk perfectly good VR hardware because of a senseless decision by Microsoft to brick it for no reason.

Doesn't impact too many people, until the next time Microsoft wants to get into the VR-powered-by-Windows game, at which point people are going to remember this and buy something else.

This move is a red flag to developers: don't make a game that depends on GPU ray tracing or any other "only Nvidia cards can do it" tech because Nvidia will eventually kill it.

And yeah, by then your game will be old but... There are a whole lot of people still playing Batman and Witcher 3.
From what I can see, for the last 15 years Nvidia has usually tried to support its tech much longer than its competitors have. When it comes to GPU PhysX, the last game that used it was Batman: Arkham Knight back in 2015, more or less 10 years ago (though that game's GPU PhysX should still work on the 50 series).

Nvidia is most likely aware of this issue, hence with RT they worked with Microsoft to create the standard DXR spec. That way there's no concern about a repeat of what happened with PhysX or 3D Vision. Same thing with DX12 Ultimate: all of those features were initially Nvidia hardware features that were later adopted into the DX12 Ultimate spec, so every vendor has an equivalent rather than the features staying Nvidia-exclusive. Right now Nvidia is working with Microsoft to make neural shaders a standard part of DirectX.
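That standardization is visible at the code level, too: a game checks for ray tracing through plain D3D12 caps rather than a vendor API, so the check below returns true on any vendor's DXR-capable card. A minimal sketch, using standard D3D12 calls, with error handling trimmed:

    #include <d3d12.h>

    // True when the installed GPU/driver expose DXR, whoever made the card.
    bool SupportsDXR(ID3D12Device* device)
    {
        D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
        if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                               &opts5, sizeof(opts5))))
            return false;
        return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
    }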
 
From what I can see, for the last 15 years Nvidia has usually tried to support its tech much longer than its competitors have. When it comes to GPU PhysX, the last game that used it was Batman: Arkham Knight back in 2015, more or less 10 years ago (though that game's GPU PhysX should still work on the 50 series).

Nvidia is most likely aware of this issue, hence with RT they worked with Microsoft to create the standard DXR spec. That way there's no concern about a repeat of what happened with PhysX or 3D Vision. Same thing with DX12 Ultimate: all of those features were initially Nvidia hardware features that were later adopted into the DX12 Ultimate spec, so every vendor has an equivalent rather than the features staying Nvidia-exclusive. Right now Nvidia is working with Microsoft to make neural shaders a standard part of DirectX.
Er, no, GPU PhysX won't work on the 5000 series, period; people are already having issues with some games not even running it. It will run CPU-side, that's it, but poorly. PhysX is still used in Unity, and I can't imagine a lot of developers are thrilled. There have been newer games using PhysX; not every game gets labelled with it, since it's not as big a selling point anymore. I believe Borderlands 3 uses it, as do all the Borderlands titles.
 
Er, no, GPU PhysX won't work on the 5000 series, period; people are already having issues with some games not even running it. It will run CPU-side, that's it, but poorly. PhysX is still used in Unity, and I can't imagine a lot of developers are thrilled. There have been newer games using PhysX; not every game gets labelled with it, since it's not as big a selling point anymore. I believe Borderlands 3 uses it, as do all the Borderlands titles.
Looks like you didn't know that PhysX has a CPU portion and a GPU portion. The only part that needs CUDA is the GPU portion. GPU PhysX in Arkham Knight was expected to keep working because they said it uses the 64-bit version of CUDA; that's why the news mentions only very old games. And CPU-based PhysX doesn't use the GPU at all. Some people seem to have the misconception that PhysX will use the GPU whenever an Nvidia GPU is available, when in fact that's only true for GPU PhysX. Game engines still using PhysX as their core physics engine will work just fine on the 50 series, with no performance penalty.
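For the old titles in question, that split was explicit in the legacy API: the game asked for a hardware-accelerated scene and fell back to software when none was available. A rough sketch in the style of the PhysX 2.8-era SDK (names recalled from memory, so illustrative only):

    #include <NxPhysics.h> // legacy PhysX 2.x SDK header

    NxScene* createSceneWithFallback(NxPhysicsSDK* sdk)
    {
        NxSceneDesc sceneDesc;
        sceneDesc.gravity = NxVec3(0.0f, -9.81f, 0.0f);
        sceneDesc.simType = NX_SIMULATION_HW; // ask for the PPU/GPU accelerator

        NxScene* scene = sdk->createScene(sceneDesc);
        if (!scene) {
            // No usable accelerator (e.g. 32-bit CUDA gone on RTX 50):
            // fall back to the CPU solver, which keeps working everywhere.
            sceneDesc.simType = NX_SIMULATION_SW;
            scene = sdk->createScene(sceneDesc);
        }
        return scene;
    }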
 
PhysX was just the coolest... It wasn't a gimmick; it genuinely made a huge impact on the quality of the experience. I remember playing Arkham Asylum first without PhysX, and then later with it (once I upgraded my GPU). My jaw literally dropped. But then again, I don't think ray tracing is a gimmick, so what do I know.
If someone knows a little about CG, then they know ray tracing is the real simulation; the other way around is the gimmick. Yes, real-time ray tracing is extremely taxing and still in its early stages, but that doesn't change the fact.
 
There are many games that use physics extensively via engines like Unity or Unreal Engine; under the hood it's not necessarily CPU-bound, as they can use OpenCL (or CUDA) to do the same things PhysX does.
But it's true that CPUs weren't powerful enough to run physics at a good framerate back then, and now they are (I tested it when Ageia first released it).

PhysX, once bought by Nvidia, gave them a marketing tool for CUDA, just as DLSS has been the marketing tool for RTX.
They contracted many game publishers to integrate it into their games, and it worked: back when ATI Radeon cards were more powerful than Nvidia GeForce cards, people were buying GeForce for PhysX.
There was a way to use an Nvidia card (just for PhysX) alongside an AMD card as the main graphics card. For that reason I had two Radeon 6950s in CrossFire mode and one Nvidia card (a 560, I believe) just to run PhysX.