Now here is something that caught me off guard. It appears that NVIDIA has removed GPU PhysX support for all 32-bit games in its latest RTX 50 series GPUs. As such, you will no longer be able to enjoy older GPU PhysX games at high framerates.
This means that the RTX 5090 and the RTX 5080 (and all other RTX 50 series GPUs) cannot run games like Cryostasis, Batman: Arkham City, Borderlands 2, GRAW 2, Mirror’s Edge, Assassin’s Creed IV: Black Flag and Bioshock Infinite with GPU-accelerated PhysX. Instead, you’ll have to rely on the CPU PhysX solution, which is similar to what AMD GPUs have been offering all these years.
This is such a shame, as one of the best things about PC gaming is returning to older titles. The old PhysX games were quite demanding when they came out. I don’t know if I’m in the minority here, but I really enjoyed most of them back then. And yes, when I got the RTX 4090, I tried Cryostasis’ tech demo so that I could finally see all those PhysX effects at high framerates.
NVIDIA claimed that the CUDA Driver will continue to support running 32-bit application binaries on GeForce RTX 40 series, GeForce RTX 30 series, GeForce RTX 20/GTX 16 series, GeForce GTX 10 series and GeForce GTX 900 series GPUs. However, it won’t support them on the GeForce RTX 50 series and newer architectures.
I honestly don’t know why NVIDIA has dropped support for them. It’s ironic because Mafia 2 with PhysX felt WAY BETTER than the ridiculous remaster we got in 2020. And now, if you want to replay it, you’ll have to stick with an older GPU. We are going backward here.
So, I went ahead and downloaded the Cryostasis Tech Demo. I remember that tech demo running smooth as hell on the RTX 4090. So, how does it run on the NVIDIA RTX 5090 with an AMD Ryzen 9 7950X3D? Well, see for yourselves. Behold the power of CPU PhysX: 13 FPS at 4K/Max Settings. Thanks NVIDIA. Ironically, the RTX 4090 (which still has GPU PhysX support) was able to push over 100 FPS at 4K/Max Settings. Let this sink in.
This is such a huge disappointment. NVIDIA has dropped the ball here. And I get it, these are old games. However, there is no CPU capable of running them at acceptable framerates right now. So, these games have become unplayable in the blink of an eye.
I don’t know if NVIDIA will one day decide to correct this issue. After all, it appears to be mostly a driver issue. So, let’s hope that someone will be able to hack the CUDA drivers to add support for 32-bit games.
Here’s the list of all the GPU PhysX games that will now perform horribly on the RTX 50 series.
Stay tuned for more!

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best consoles. Still, the PC platform won him over. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC graphics cards.”


BRUH
LOL
They’re trying to sell you the consolification of the PC, while asking a bloody high price for it!
High end my arsë!
Long story short: nvidia messed something up in the 50 series and, instead of finding a workaround, they just cut the feature from their product… this will further increase the demand for 40 series gpus, because they are the last gen to support it.
It is no different than when nvidia/amd stopped releasing drivers for 32-bit Windows. In the case of GPU PhysX, the last game that ever used it was Arkham Knight, released in 2015.
That is not the same: with 64-bit you got equal or better performing drivers. This isn’t a case where you get a properly performing alternative option, this is simply cutting a feature that people expect. So it would be more comparable to Microsoft cutting the ability to use a larger font in Word and telling people to just zoom in.
"RTX50 series GPUs) cannot run games like Cryostasis, Batman: Arkham City, Borderlands 2, GRAW 2, Mirror’s Edge, Assassin’s Creed IV: Black Flag, Bioshock Infinite with GPU-accelerated PhysX. Instead, you’ll have to rely on the CPU PhysX solution, which is similar to what AMD GPUs have been offering all these years."
"nvidia has features"
until it stops offering them, i wonder if ray tracing will end up like physx at some point.
Probably not, seeing as ray tracing will eventually replace baked lighting. Hell, don't we already have raytracing-only games? I think RT is a must in Indiana Jones, right? I don't think that ever happened to PhysX.
Not that I agree with it, mind you, but unfortunately it seems that's just the path the industry is heading down, and there's nothing we can do about that.
"there's nothing we can do about that"
Well, fvck me sideways!
Consumers are this braindead now?
Oh yeah, let's ignore the context where I confirmed that I don't agree with it. But hey, what can we do about it? Unless you're basically saying that you're no longer going to use modern hardware, you will be supporting the path the industry is on one way or the other. It's not like we're back in the GTX 10XX/RTX 20XX era where you could still choose not to support it; we're already in the fourth generation where every nvidia card has raytracing.
I gave up. That's why I barely come by here anymore. It's depressing to read the amount of unaccountability these people are peddling. But hey, one thing is for sure, they are ready to go back and forth with you all day long. They have so much smoke for you the individual but none for the corporation.
Tiresome, isn’t it?
People nowadays just forget the power of their wallets, all they want is to cons00me!
Bioshock Infinite does not have hardware-accelerated PhysX.
Ray tracing uses DXR, which is an MS standard. And there is no issue using CPU PhysX in all of those old games.
That's the fundamental flaw with proprietary software:
You are always at the mercy of the original creator.
Same reason why all the major players in the AI industry are working on open-source alternatives to NVIDIA's CUDA.
Which reminds me, at one point the sole developer of ZLUDA managed to get PhysX running on an AMDGPU, but unfortunately that version was never released because of legal concerns:
https://uploads.disquscdn.com/images/2ff67ad848ea6a68c1d5941acdaf622d279c7c44311b7810b0c1d38f7e6a045f.png
I'm pretty sure it may have been abandoned because Nvidia opened GPU physx up. I think it's been resurrected as ZLUDA by vosen but IDK if it supports physx.
There is nothing open about PhysX; spoken like someone who fell for the fake NVidia marketing of going open source.
You're free to head over to NVidia's public repositories, and all you'll find is SDKs for implementation.
There's no actual code to change how PhysX works, so it's quite useless for making any progress when it comes to ZLUDA.
I was going to point out DirectML, which is part of DirectX 12, but I think Microsoft released that as open source as well (it's on Github at least).
That being said, CUDA has always been a problem since it only works on NVIDIA hardware, but Open Source stuff doesn't necessarily get maintained forever either.
I'm playing Arkham Knight (finally) and those NV options add huge visual value. What a shame, the games we allow NVIDIA to play with. I was planning on upgrading from a 3080 Ti to a 5080, but given the gimping that card got and a general hatred of evil Nvidia, I may go AMD if they "Do Something" for once.
Arkham Knight is a 64-bit game, not 32-bit; support has only ended for 32-bit games using PhysX.
Some games outright refuse to work without PhysX
I wonder why it took people this much time to realize this. I hope they'll backpedal on their decision. Until now, NVIDIA has done a great job supporting old games and blacklisting games that have issues with new features through drivers. If they're not willing to do something about this, it will be a very bad situation for PC gaming.
This is part of the problem with PC gaming. They want the new hardware to work well even on 10-20 year old games. It kind of reminds me of how those professional groups want OpenGL to keep supporting older stuff in the spec even if it hinders performance. This is also probably why the other GPU makers that still exist did not want to enter the PC dGPU market.
Another reason to leave the RTX 4070 aside when I upgrade.
nvidia will automatically make AMD gpus viable with so many anti-consumer practices like this
Even AMD removes features. For example, they removed Mantle from the main driver, so you can't use Mantle on new drivers.
Is Mafia 2 DE that much worse without PhysX?? I really miss the normal maps from the original version.
The Definitive Edition is 64 bit so it's not going to be affected.
I know, but John said PhysX in the original was better.
Mafia 2 DE has the same Physx as the original.
Not true. In DE there is almost no debris.
It does not. Not even close. It's there but it's downgraded.
Not even close. PhysX cloth has a bug where it sometimes only works for the main character, and only in the base game, and the particles sometimes flicker or turn black due to broken shaders; they also vanish faster.
I must admit, I originally played Mafia 2 in 2010 and 2011, with full Physx on a 9600 GT and then a 560 Ti.
When the Mafia 2 DE dropped, I noticed the game force enables Physx with no in-game option to tune them. You can still enable/disable Medium/High Physx from the ini.
I replayed Mafia 2 DE on a 5700 XT with Medium Physx using DXVK. It looked to my eyes like good old Medium preset Physx in the OG game.
Nvidia: Pay More for Less
Most of the tards who call themselves "gamers" buy a 5090 to play some shtty f2p game anyway.
So you're telling me most of these so called "tards" are rich af?
Yes. Welfare leeches and scammers.
This is ridiculous and unacceptable! Fix it Nvidia!
Guess I'll keep my 4070 Ti Super forever
Mafia 2 was the first game I booted up when I switched back from a R9 290 to the 980Ti. PhysX was glorious.
I did the same because of the crazy 290 temps. For me, that card ran slightly hotter than a GTX 480 (90-95 Celsius).
I recently installed Mafia II, and in some areas my RTX 2060 laptop drops to 30fps when PhysX is on Medium or High. Same with my old GTX 980M laptop. For me, PhysX is hit or miss depending on the game. In Arkham City and Arkham Origins, I can turn PhysX to High and the game is always above 60fps. In Arkham Asylum, my fps always drops in the Scarecrow sections if PhysX is set to High. Same in Borderlands 2: fps tanks after 1 or 2 hours if PhysX is set to High. The GTX 980M and RTX 2060 laptop GPUs are more than capable of running these.
Laptop GPUs are considerably slower than their desktop versions.
You're not telling me anything I don't already know, but the RTX 2060 laptop GPU is faster than a desktop GTX 1070, and the GTX 980M is as fast as a desktop GTX 780. Your answer doesn't explain why turning PhysX on causes fps drops in older games, while others like Mirror's Edge 1 can run maxed-out PhysX at 1440p even on a GTX 980M.
It was always an unoptimized turd of a technology, neat to look at, awful to actually use. In the case of Arkham Asylum I'm guessing it's just a bottleneck with the game or the API, it still uses DX9 doesn't it?
Could be anything, but many people, even with the latest desktop GPUs, have issues with PhysX on in older games. Arkham Asylum has issues only in the Scarecrow sections if PhysX is on High. Arkham City: no problem, using the same DX9 API. Mirror's Edge 1: PhysX maxed, no issue at all, using DX9. Borderlands 2: severe fps drops with PhysX on High after 1 or 2 hours, using the DX9 API. Mafia 2: some sections run at 30fps with PhysX on High, and GPU/CPU usage isn't even reaching 20%. This tells me that while PhysX was cool technology, it was not optimized very well to utilize the hardware.
Borderlands The Pre-Sequel is complete garbage in spots with hardware physx. It performs no better at times with a 4090 than it does with 10-year-old cards. Very few games did it ever work very well and it was always laughable to see that GPU usage actually went down every time there were effects. And I always thought it was just ridiculous how some people were impressed with the same exact little chunks flying out of walls when you shot them.
It’s ~1660 performance, from memory, which is slower than the 1070.
I found a bench – https://technical.city/en/video/GeForce-RTX-2060-mobile-vs-GeForce-GTX-1660
With all this upscaling and frame generation, ray tracing + Nvidia PhysX would be feasible. As always, people without knowledge and with average PCs will max out everything at 4K, and Nvidia is probably afraid that people will complain about performance again. Cowards.
FFS I hope they can add an abstraction layer for this at a later date. What a clusterf*k.
oh dear…
Cryostasis is a hidden gem. Very underrated game.
It is a shame you can't buy it anymore and my disc copy requires a crack to play it.
So what? You have the right to use the disc you bought, bro.
It's an unoptimized piece of crap that, even on gpus orders of magnitude faster than when the game was released, still has issues with dropping frame rates.
AC Black Flag is locked at 60 FPS, and modern CPUs are powerful enough that no one is even going to notice that it's running on their CPU instead of their GPU. If you unlock the FPS with a mod, then you really need to switch PhysX to the CPU anyway, because FPS above 60 tends to break the PhysX physics, especially once you start running at about 90 FPS.
I played Black Flag on an RX 5700 six years ago, and there was no difference from the way it played on the 1660 Ti I had before it or the 2080 Super I had after it. That's because even the 2600 I had back then was easily powerful enough to run PhysX on a single CPU thread, and since the game is locked at 60, all three ran it at 60 FPS unless you kicked in MSAA x8, but that had nothing to do with PhysX.
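The claim that framerates above 60 break the physics is what you'd expect from a game that steps its simulation once per rendered frame with a variable dt. A toy sketch of the difference (a hypothetical 1-D spring particle in C, not actual PhysX or game code) shows why a fixed-timestep accumulator keeps physics framerate-independent:

```c
/* Toy 1-D spring "particle": explicit Euler, an integrator whose
   behavior changes visibly when the timestep changes.
   Illustration only -- not actual PhysX code. */
typedef struct { double x, v; } particle;

static void step(particle *p, double dt) {
    const double k = 100.0;   /* spring stiffness */
    double x = p->x;
    p->x += p->v * dt;        /* advance position with old velocity */
    p->v -= k * x * dt;       /* advance velocity with old position */
}

static double energy(const particle *p) {
    return 0.5 * p->v * p->v + 0.5 * 100.0 * p->x * p->x;
}

/* Naive: one physics step per rendered frame, dt = frame time.
   The outcome depends on the frame rate -- unlock a 60 FPS game
   to 90+ FPS and the simulation behaves differently. */
double run_naive(int frames, double frame_dt) {
    particle p = { 1.0, 0.0 };
    for (int i = 0; i < frames; ++i)
        step(&p, frame_dt);
    return energy(&p);
}

/* Fixed-timestep accumulator: physics always advances in 1/60 s
   slices regardless of how fast frames are rendered. */
double run_fixed(int frames, double frame_dt) {
    const double dt = 1.0 / 60.0;
    particle p = { 1.0, 0.0 };
    double acc = 0.0;
    for (int i = 0; i < frames; ++i) {
        acc += frame_dt;                         /* bank the frame time */
        while (acc >= dt) { step(&p, dt); acc -= dt; }
    }
    return energy(&p);
}
```

Over the same simulated two seconds, `run_fixed` lands on (nearly) the same state whether frames arrive at 30 or 240 "FPS", while `run_naive` diverges badly between the two, which is the kind of breakage being described with unlocked framerates.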
Maybe since the last mining craze, but definitely since the AI craze, Nvidia has been moving further away from caring much about their PC gaming customers. They make six times more revenue from professional cards, and not only that, but the profit margin is much, much higher.
They do still make billions of dollars from us yearly though so we still matter to them but it's just not their main priority.
This is disappointing af! Arkham City is one of my all-time favorite games that I like to replay every couple of years. I guess I'll just have to hold onto my RTX 2060 for the long term.
Just disable those PhysX effects.
^ negative IQ advice right there
Nvidia "the Way you meant to be played", if AMD wasn't such a bi#tch that could only compete with low to mid nvidia GPUs, nvidia wouldn't be such a money hog
Great, now we all have to keep an old PC with an old GPU around just to play old games. At least until someone comes up with a solution to the problem.
Always hold on to your old hardware.
Paper launch, overpriced, melting cables, fake frames for a limited list, removing backwards compatibility on PhysX… is there anything the 50 series can't do?
$5000 Turd
Remember, this comes after Nvidia became one of THE RICHEST companies on the planet… so what do they do? Make GREAT products? NOPE. They GIMP their cards (5080, low GB numbers), giving actual customers less value (but keeping the MIGHTY shareholders happy). Modern capitalism FTW. There will be NO repercussions.
You should try one of the forgotten nvidia features: you can make a second GPU a dedicated PhysX GPU. Like, you could get a mid-range (like xx50) older-gen GPU and set it as PhysX-only. I wonder if that, paired with a 5090, would restore PhysX in 32-bit games.
The entire point of owning a new GPU is to play games. While these games could be considered 'old', that excuse no longer holds weight: these games are not remotely 'dated', in many ways they play better than new games, and at the very least they deserve to still be playable with full compatibility, especially with physics effects that still look better than many modern games.
I think I am just going to stick with my 3070 and skip this generation.
What an unacceptable decision/design choice! Ngreedia getting out of control!
This is somewhat wrong: NVidia crippled CPU performance of PhysX on purpose. All they had to do was add the proper instructions, and any CPU released in the last decade would run it with no issues whatsoever.
That's the problem: in the early days, unlike Havok, PhysX was not coded optimally for the CPU, just for NVIDIA GPUs with PhysX support.
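The "proper instructions" complaint matches a widely reported 2010 analysis which claimed that the PhysX 2.x CPU path used scalar x87 math instead of SSE. As a rough sketch of what that difference means (generic vector math in C, not PhysX source), SSE processes four floats per instruction where scalar code handles one:

```c
#include <xmmintrin.h>  /* SSE intrinsics (x86 only) */

/* Scalar version: one float per iteration, roughly what an
   x87-era code path does. */
void scale_add_scalar(const float *a, const float *b, float *out, int n) {
    for (int i = 0; i < n; ++i)
        out[i] = a[i] * 2.0f + b[i];
}

/* SSE version: four floats per instruction, with a scalar tail
   for lengths that are not multiples of four. */
void scale_add_sse(const float *a, const float *b, float *out, int n) {
    __m128 two = _mm_set1_ps(2.0f);
    int i = 0;
    for (; i + 4 <= n; i += 4) {
        __m128 va = _mm_loadu_ps(a + i);
        __m128 vb = _mm_loadu_ps(b + i);
        _mm_storeu_ps(out + i, _mm_add_ps(_mm_mul_ps(va, two), vb));
    }
    for (; i < n; ++i)
        out[i] = a[i] * 2.0f + b[i];
}
```

Both functions compute the same results; the SSE path simply does a quarter of the iterations, which is why leaving it out of a CPU fallback leaves so much performance on the table.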
I don't think you need that much upscaling with AMD as long as RT is turned off.