It appears that PC gamers have found a way to enable DLSS 3 Frame Generation on NVIDIA’s RTX 30 and 20 series GPUs in Portal Prelude RTX.
So, let’s start with the good news. By simply using DLSS FG DLL 1.0.1 (you can download it from TechPowerUp), you can enable DLSS 3 Frame Generation on GPUs like the RTX 2080 Ti or the RTX 3080. Generally speaking, this workaround works with all RTX 20 and RTX 30 series graphics cards. Do note that the original DLL, or any DLL other than 1.0.1, does not work. Moreover, this “workaround” only works with Portal Prelude RTX. It does not work in other DLSS 3 games.
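For those curious, the swap itself is nothing more than a file replacement in the game’s install folder. Below is a rough Python sketch of the idea; the install path and the nvngx_dlssg.dll filename are assumptions based on how other DLSS 3 titles ship their Frame Generation DLL, so verify them against your own setup before touching anything.

```python
# Hypothetical sketch of the DLL swap described above.
# The paths and the nvngx_dlssg.dll filename are assumptions; check them
# against your own Portal Prelude RTX installation first.
import shutil
from pathlib import Path

game_dir = Path(r"C:\Program Files (x86)\Steam\steamapps\common\PortalPreludeRTX\bin")  # assumed install path
old_dll = game_dir / "nvngx_dlssg.dll"                    # assumed Frame Generation DLL name
new_dll = Path(r"C:\Downloads\nvngx_dlssg_1.0.1.dll")     # the v1.0.1.0 DLL downloaded from TechPowerUp

if old_dll.exists():
    shutil.copy2(old_dll, old_dll.with_suffix(".bak"))    # keep a backup (nvngx_dlssg.bak) so you can revert
shutil.copy2(new_dll, old_dll)                            # drop the old build in place of the shipped one
print("Swapped DLSS FG DLL; restore the .bak file to undo.")
```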
Now for the bad news: the game is completely unplayable. As you can see below, there are major frame pacing issues on the RTX 3060 with DLSS 3 Frame Generation. Despite the high framerate, the camera is also jittery, making the game feel awful. To put it simply, it’s a stuttering mess.
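If you want to put a number on that stutter rather than eyeballing the video, you can log frame times and look at how much they swing. Here is a minimal sketch, assuming a PresentMon-style CSV export (the frametimes.csv name is hypothetical; MsBetweenPresents is PresentMon’s column for the time between presented frames).

```python
# Rough frame-pacing check: the average FPS can look fine while individual
# frame times swing wildly, which is what shows up on screen as stutter.
import csv
import statistics

frame_times = []  # milliseconds between presented frames
with open("frametimes.csv", newline="") as f:                 # hypothetical PresentMon/CapFrameX export
    for row in csv.DictReader(f):
        frame_times.append(float(row["MsBetweenPresents"]))   # PresentMon's frame-time column

avg = statistics.mean(frame_times)
p99 = sorted(frame_times)[int(len(frame_times) * 0.99)]       # 99th-percentile frame time
print(f"avg FPS: {1000 / avg:.1f}")
print(f"avg frame time: {avg:.2f} ms, 99th percentile: {p99:.2f} ms")
print(f"frame time std dev: {statistics.stdev(frame_times):.2f} ms")
# A 99th percentile / standard deviation that is high relative to the average
# is the frame-pacing problem described above, regardless of the FPS counter.
```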
For those wondering, yes, this actually does work; this isn’t a fake video. You can also find other PC gamers here who were able to enable DLSS 3 Frame Generation on their GPUs. And yes, all of them report the exact same experience. While DLSS 3 Frame Generation manages to double their framerates, the overall gaming experience is worse with it enabled.
To be honest, I’m not really surprised by this. NVIDIA has told us that DLSS 3 Frame Generation could work on the RTX 20 and RTX 30 series GPUs, and that the reason it has not enabled it is that it currently has major issues on those cards. And this is exactly what the green team meant.
DLSS 3 Frame Generation takes advantage of the Optical Flow Accelerator that the RTX 40 series GPUs have, and that’s why it works as well as it does. So no, I don’t think you can say that NVIDIA locked this feature behind the RTX 40 series just to screw over its RTX 20 and RTX 30 customers. The proof is right here: with DLSS 3, Portal Prelude RTX is completely unplayable on the RTX 3060.
In a way, these frame pacing issues remind us of those we saw with AMD’s FSR 3.0. As we’ve reported, both Immortals of Aveum and Forspoken suffer from major frame pacing issues when using FSR 3’s Frame Generation on VRR displays. So, this could be another indication of how important the Optical Flow Accelerator actually is for DLSS 3.
It will be interesting to see whether gamers will find a way to somehow improve DLSS 3 Frame Generation on the RTX20 and RTX30 series GPUs. I doubt this is possible but hey, you never know!
UPDATE:
NVIDIA has issued an official statement about this that you can find below.
“We’ve noticed that some members of the PC gaming community, like Discord Communities before it, have encountered a bug in an old v1.0.1.0 DLSS DLL that falsely appears to generate frames on 20/30-series hardware. There is no DLSS Frame Generation occurring—just raw duplication of existing frames. Frame counters show a higher FPS, but there is no improvement in smoothness or experience. This can be verified by evaluating the duplicated frames using tools like NVIDIA ICAT, or viewing the error message in the Remix logs indicating an error with frame generation.
DLSS frame generation DLLs have since fixed this bug, and NVIDIA will be issuing an update to RTX Remix in the future to prevent further confusion when frame generation is behaving incorrectly.”
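NVIDIA points to ICAT for verifying the duplication, but you can also sanity-check a capture yourself. Below is a minimal sketch, assuming OpenCV is installed and you have a recording named capture.mp4 (a hypothetical file name); it simply hashes consecutive decoded frames and counts exact repeats. It works best on a lossless capture, since lossy encoding can perturb otherwise identical frames.

```python
# Naive duplicate-frame check on a gameplay capture. This is not ICAT,
# just a way to confirm that consecutive frames are byte-identical,
# which is what "raw duplication of existing frames" means in practice.
import hashlib
import cv2  # pip install opencv-python

cap = cv2.VideoCapture("capture.mp4")  # hypothetical recording of the game
prev_hash = None
total = duplicates = 0

while True:
    ok, frame = cap.read()
    if not ok:
        break
    h = hashlib.md5(frame.tobytes()).hexdigest()  # hash the raw decoded pixel data
    if h == prev_hash:
        duplicates += 1                           # exact copy of the previous frame
    prev_hash = h
    total += 1

cap.release()
print(f"{duplicates} of {total} frames are exact duplicates of the previous frame")
# Real frame generation would produce interpolated (different) frames, so a
# duplicate rate near 50% is consistent with NVIDIA's explanation above.
```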

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best consoles. Still, the PC platform won him over consoles. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC graphics cards.”
The same company that is fine with compromises like G-Sync Compatible FreeSync displays and software DXR support on the 10 series is suddenly concerned about getting people to buy 40 series cards, with barely a 20% performance difference from the last gen, to even more lazily brute-force lazy game optimization. Doesn’t add up.
Now they’re concerned about “a bad experience”? When they knowingly acknowledge that both of those things they mentioned are substandard compared to the real deal, and apparently don’t block FSR/XeSS from games (they do) because DLSS “makes them look bad in comparison” (at least according to the same people claiming this during the Starfield fiasco, which was just due to Bethesda being almost as low-effort with their PC porting as Japanese developers).
Things would be better if consumers actually had the brains to see through the nonsense that these publicly traded companies say and do, rather than acting like Reddit armchair developers/legal experts who clearly aren’t humble enough to be told they’re wrong by the people who actually work at said places.
Don’t expect any logical thinking from NVIDIA fanboys. Almost as bad as PlayStation ones in terms of cope. Although this is a separate thing from how AMD locked Anti-Lag+ behind a 7000 series paywall, or never did software RT support on Navi 1 like they promised. The 10 series had software RT and CUDA for stuff like Blender, while AMD’s 5000 series cards were stuck with a buggy ProRender plugin for Blender that looked worse than Cycles.
I mean, all you have to do is look at the video. The RTX 2000 and 3000 series don’t have the hardware capabilities to run frame gen at a steady frame rate. I would rather have a smooth 60 fps than 100 fps with stutters. And btw, I am running an RTX 3080 but am also a realist.
FSR3 might be the only hope, although not at the current stage.
20% difference in performance from last gen???
40 series vs 30 series???
Tell me what you smoke buddy, because it must be really strong!
I got a 40+% gain going from a 3070 Ti to a 4070 Ti
I also got an increase in FPS per Watt and a decrease in cost per Frame …. Plus an additional 4 GB of VRAM
It more than justified the $61.50 price difference between the two
The 3070 Ti cost $670 direct from EVGA and the 4070 Ti cost me $731.50 after the 5% cash back I got from buying it through Amazon Prime
What if NVIDIA actually WANTED to make this run fine on the 3000 and maybe the 2000 series?
Yup, I bet it would run just fine, maybe not as fast, but fine.
Topaz or any other video editor doesn’t need motion vectors or anything like that to make interpolation happen, and even though we’re talking offline rendering, the rendering isn’t crazy slower than the 1:1 duration of the video, so it could probably be done in real time. Just saying.
It would be at almost the same level as AMD Fluid Motion Frames, which is awful. Worse than native; go check out Hardware Unboxed’s video on it. Why would NVIDIA even want to invest resources into last-gen hardware?
No, it would not. Yes, it could work, but not nearly as well, because the 2000 and 3000 series lack the hardware (circuitry) that makes Frame Generation work as fast as it does. You can’t fake it with software, at least not at the speed and efficiency you get from the hardware solution.
Some things are simply done better with hardware, which is exactly why every GPU has a hardware video encoder instead of trying to do it in software, which is slower. You can also do DLSS (tensor math) on normal cores, but you’d get an FPS drop instead of a gain.
But what would I know? I’ve only been an Electronics Engineer for 30+ years.
Portal is the most overrated game in history.
If another developer had made it instead of Valve, everyone would call it a casual indie game and it would have gotten a 7/10.
But it was Valve, and Valve had released the excellent Half-Life 2 three years earlier. So when Portal came out afterwards, suddenly this silly game was a masterpiece.
If Valve hadn’t made it, but another developer
Actually…
what?
Valve bought the project; Portal wasn’t an original idea of the SH…
https://en.wikipedia.org/wiki/Narbacular_Drop
Portal 1 isn’t a masterpiece, but Portal 2 is. I started playing Prelude, and it’s awful. It’s Portal without proper level design.
The game isn’t overrated, you just don’t like puzzle platform games. It’s that simple. Your excellent HL2 is meh to me because I don’t like sci-fi games. Instead of sh*tting on a popular game you don’t fancy, move on with your life and play what you like.
Other companies make puzzle games too. None of them come close to being as good as Portal. The reason people love Portal isn’t simply because it was made by Valve, but because Valve actually knows how to make great single-player games. Portal is one of the reasons people love Valve so much.
Has Valve fallen from grace since Portal and Portal 2 were released? In many ways, yes they have. They release almost nothing these days, and what little they do release is overmonetized trash. That being said, Half-Life: Alyx was probably the best VR game I’ve ever played, and it set such a high bar for game mechanics and narrative-driven gameplay in VR that no one else is ever going to compare, so it’s fairly clear that Valve still knows how to make amazing single-player experiences; they just choose not to.
Even people like myself who were never into Portal (but own and played some of them) recognize its excellence in the genre. I always thought it’d be cool if it were merged with HL2, using both FPS and puzzler elements. Might induce a headache though, lol.
I think it’s called “saying sh*t to rile people up” or “gaslighting” or whatever.
Well done Sunny.
Someday you may look back on things you said on the internet with embarrassment.
Fake news.
https://www.reddit.com/r/nvidia/comments/177cwc7/portal_prelude_rtx_can_tap_into_dlss_3_on_rtx/k4s8ce3/
How is it fake news? The video is literally showing what that post suggests.
Nope. In the video, the OLDER RTX CARD is NOT shown running Frame Generation. As the name suggests, Frame Generation generates new frames, and in the video no new frames are generated. They’re duplicated, which causes stutter and has nothing to do with Frame Generation.
So yeah, it is fake news. Click the Reddit link I posted above. It’s not Frame Generation.
“On the video, the OLDER RTX CARD is NOT shown running Frame Generation” … “So yeah, it is fake news. Click the reddit link I posted above. It’s not Frame Generation”
So what you’re saying is… that the video is showing what the Reddit post suggests? Like I just said?
Awesome Post 👍
This shows the people who are always bashing NVIDIA that NVIDIA does things for a reason, and what they do always works and is best in class!!
https://uploads.disquscdn.com/images/6331e12bfc1052fa2886ea6f2c26659ec402ccc8fe250fa80289b69cac8de9bd.png
https://uploads.disquscdn.com/images/e1fe281fcc90c7ee50c2ad25463c5e86a1429d98130f0577a94e97634fe5f5f5.png
You got fooled. It’s not generating any new frames; it’s simply displaying the same frames twice.
No way would true Frame Generation ever work with version 1 DLSS, because it’s a completely different beast from DLSS 2 and up. The only similarity between DLSS 1 and DLSS 2 is that they both use Tensor Cores, but in completely different ways.
Even if it did work, DLSS 1 was crapola, and the generated frames would all be crapola squared.