News: PhysX feature unlocked for RTX 5090 with RTX 3050 'helper' to enable full performance

Personally, I'm hoping somebody can figure out a compatibility mod of sorts that basically bridges a game's 32-bit PhysX over to 64-bit, so it can use hardware acceleration without requiring a second old GPU.
They could take the RTX 4000-series PhysX drivers and modify them to work for the RTX 5000 series; these GPU generations are similar enough.
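To make the bridging idea concrete: a 32-bit game can't load a 64-bit library in-process, so a shim would have to forward each PhysX call to a separate 64-bit host and marshal the results back. Below is a toy Python sketch of that call-forwarding shape; the class names and the trivial "physics step" are made up for illustration and have nothing to do with the real PhysX API (a worker thread stands in for the 64-bit host process to keep the sketch self-contained).

```python
# Toy sketch of a 32->64-bit call bridge. A real shim DLL would forward
# calls to a separate 64-bit process; a worker thread stands in here.
import threading
import queue

def physx64_host(calls, results):
    """Stands in for the 64-bit host wrapping the real runtime."""
    while True:
        msg = calls.get()
        if msg is None:                 # shutdown sentinel
            break
        op, dt, velocity = msg
        if op == "simulate_step":
            results.put(velocity * dt)  # trivial stand-in for a physics step

class PhysXBridge:
    """Stands in for the 32-bit shim the game would load in place of PhysX."""
    def __init__(self):
        self._calls = queue.Queue()
        self._results = queue.Queue()
        self._host = threading.Thread(
            target=physx64_host, args=(self._calls, self._results))
        self._host.start()

    def simulate_step(self, dt, velocity):
        # serialize the call, hand it to the "other side", wait for the reply
        self._calls.put(("simulate_step", dt, velocity))
        return self._results.get()

    def close(self):
        self._calls.put(None)
        self._host.join()

bridge = PhysXBridge()
print(bridge.simulate_step(0.5, 10.0))  # prints 5.0
bridge.close()
```

The hard part in practice isn't the forwarding itself but marshaling pointer-heavy PhysX state across the boundary with acceptable latency, which is presumably why nobody has shipped such a shim.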
 
Imagine someone using ROCm to run these old PhysX games on Linux on an RX 7900 XT and blowing a 5090 out of the water. Lol.

PhysX was always a scam to sell CUDA anyway. It didn't HAVE to run like crap on CPUs; it just did, because Nvidia put the minimum possible effort into making it work so their GPUs would look good. A source port with real multithreading and AVX2 (or even AVX-512) would easily play any of these old PhysX games on modern hardware.
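The multithreading claim is at least plausible in shape: PhysX-style effects (particles, debris, cloth points) are embarrassingly data-parallel, so a CPU port can split the particle arrays across workers. Here's a minimal sketch of that work split; the `integrate`/`integrate_chunk` names and the Euler-step "physics" are invented for illustration, and pure-Python threads won't show a real speedup under the GIL — the point is the chunked, structure-of-arrays layout that a native AVX2 port would vectorize.

```python
# Sketch of a data-parallel particle update: split the arrays into
# contiguous chunks and hand one chunk to each worker thread.
from concurrent.futures import ThreadPoolExecutor

def integrate_chunk(args):
    # advance one chunk of particles: p' = p + v * dt
    positions, velocities, dt = args
    return [p + v * dt for p, v in zip(positions, velocities)]

def integrate(positions, velocities, dt, workers=4):
    # structure-of-arrays data, chunked one piece per worker
    n = len(positions)
    step = max(1, (n + workers - 1) // workers)
    chunks = [(positions[i:i + step], velocities[i:i + step], dt)
              for i in range(0, n, step)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        out = list(pool.map(integrate_chunk, chunks))
    # stitch the chunks back together in order
    return [p for chunk in out for p in chunk]

print(integrate([0.0, 1.0, 2.0, 3.0], [1.0, 1.0, 2.0, 2.0], 0.5))
# prints [0.5, 1.5, 3.0, 4.0]
```

In a native port, each chunk body would be a SIMD loop and the workers would be OS threads, which is exactly the layout the poster is describing.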
 
I'd love this idea. Use my 2080 to run RT in its entirety, then use whatever new GPU I've got to do raster.
Well, that used to be a thing with early generations of PhysX, but nVidia blocked it almost immediately once it became somewhat popular to use (back then) an ATI card for rendering and an nVidia card only for PhysX. So in the modern world, you can bet anything nVidia will block any "hybrid" setups.

--
As for the news/article: well, since nVidia is now for people with lots of money, they just sell you the solution to a problem they created themselves. Why optimize their drivers or add a 32→64-bit wrapper when they can just tell you to buy a secondary card for it? Also, damn the devs that took that sweet PhysX money over Havok or Bullet.

The Way It's Meant To Play You All.

Regards.
 

This captures the entire story. It would be TRIVIAL for Nvidia (and it's unlikely they don't already have this fully fledged) to code around the 32-bit interface.

This is it, guys. What else do you need to know about this company? They artificially limit backwards compatibility, to the tune of a 10-year-old card performing three times better on games that people STILL PLAY, in order to trap you in their ecosystem so they can sell you a solution later.

Beyond anything I've learned about the 5090 this week, this, above all, signals how utterly wretched this unbridled corporation is. How much they absolutely f*king hate their customers and consider them nothing more than a cash register spilling their hard-earned FRNs into their coffers like the idiots they know we are.

Bring back the Sherman Antitrust Act, shatter and splinter and burn this company into a million falling ashes of confetti, and then maybe, just maybe, we can get back to some semblance of sanity.
 
In my opinion, today's news is one reason nVidia should never have killed off the dedicated PPU. Having the option to use a PPU with any graphics card could have prevented these compatibility issues.
 
It has come full circle. PhysX was originally an add-on card, back in the day. I was debating buying one, never did, and was glad I didn't when Nvidia acquired Ageia for PhysX and added it to their cards.
 
And a big enough case to fit two GPUs in; a lot of them are pretty chunky.
And a motherboard with enough space between the PCIe slots. If my card were an Nvidia GPU, I couldn't use a second card in the x8 slot, as it's hidden under the first card.

What exactly is a legacy game? Many of those are way better than the pretty things they push out now. Before they could rely on graphics, games had to be fun to sell.

These aren't completely unknown games, either (some might be 64-bit PhysX). The Witcher 3... how soon do games become legacy, exactly?
https://en.wikipedia.org/wiki/Category:Video_games_using_PhysX

Star Citizen is on there? (Funny, it became legacy before release... Duke Nukem Forever never achieved that.)
 
Also, damn the devs that took that sweet PhysX money over Havok or Bullet.

What choice did they have? Most games come with a specific, fixed physics engine baked into the game engine. All the games that use GPU PhysX have PhysX as their core physics engine (AC4: Black Flag being the exception). Plus, you weren't going to get HavokFX with Havok, thanks to Intel buying Havok; that's what prompted Nvidia to grab Ageia for themselves in the first place. As for Bullet, it's open source, but not many game engines use it.

Also, some of the games that use GPU PhysX used it simply so the PC version could claim something extra that the console version didn't have. Back then, many PC gamers complained about things like console parity.

And ultimately, game developers in general don't like GPU-accelerated physics. AMD endorsed Bullet physics back in 2009 and said games using Bullet would come out within a year. A year or so later, a tech outlet asked AMD about it, and AMD's response was that game developers just weren't interested.
 
Funny thing is, people suddenly care about GPU PhysX. No new game has used GPU PhysX for almost a decade and nobody cared. When Nvidia was still pushing hard for GPU PhysX, many people hated it, even those on Nvidia GPUs, calling it useless and saying it only killed performance; the graphical enhancements weren't worth the performance hit (people say the same thing about RT now).
 
I don't think it's worth the extra power consumption, or heat, to dedicate a 2080 to PhysX. Don't forget you'd need a pretty hefty PSU, with lots of PCIe connectors, to run the 2080 alongside whatever your primary GPU is. The 3050 6GB can handle it just fine without using more power than the PCIe slot can provide, and with less heat. A GTX 1650 would also likely work, or a GTX 1050/1050 Ti, though you may run into driver issues down the road when Turing and Pascal move to legacy driver support.
 
I mean, I play Need for Speed: Shift 2 and that uses Nvidia PhysX. Not sure about its implementation, though.
 
It uses pure CPU PhysX. Even when the team behind Shift/Shift 2 left EA to form Slightly Mad Studios and created Project CARS, they only used CPU PhysX, mainly for ragdolls, alongside their own proprietary physics engine to simulate how tires behave on the road surface.