A few days ago, NVIDIA released the DLSS patch for Tom Clancy’s Rainbow Six Siege. According to numerous reports, that DLSS implementation is great, and noticeably better than in other DLSS 2.0 games. As a result, some gamers took a look at the game’s DLSS DLL file and discovered that Rainbow Six Siege uses a new version of DLSS, DLSS 2.2.
Now what’s really interesting here is that owners of Rainbow Six Siege can take that “nvngx_dlss.dll” file and use it in other DLSS 2.0 games. If you don’t have Rainbow Six Siege, you can download this DLL file from here. By doing this, you can eliminate a number of visual artifacts that were present in those older games.
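For anyone who wants to script the swap rather than do it by hand, here is a minimal sketch. The paths and the `swap_dlss_dll` helper name are illustrative, not anything official; the one important detail is that it backs up the game’s original DLL before overwriting it, so you can always roll back.

```python
import shutil
from pathlib import Path

def swap_dlss_dll(game_dir: Path, new_dll: Path) -> Path:
    """Back up the game's nvngx_dlss.dll, then copy the newer build over it."""
    target = game_dir / "nvngx_dlss.dll"
    backup = target.with_name("nvngx_dlss.dll.bak")
    if not backup.exists():           # keep the original safe on the first run only
        shutil.copy2(target, backup)
    shutil.copy2(new_dll, target)     # overwrite with the DLSS 2.2 DLL
    return backup

# Example call (hypothetical Windows paths):
# swap_dlss_dll(Path(r"C:\Games\DeathStranding"),
#               Path(r"C:\Downloads\nvngx_dlss.dll"))
```

To undo the swap, just copy `nvngx_dlss.dll.bak` back over `nvngx_dlss.dll`.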
For instance, Reddit’s dedSEKTR used DLSS 2.2 in Death Stranding and the results were excellent. DLSS 2.2 was able to reduce the dark trails and the noisy checkerboard artifacts in the sky. Moreover, it did not incur any additional performance hit. Below you can find the comparison screenshots (DLSS 2.0 is on the left, whereas DLSS 2.2 is on the right).
Similarly, Reddit’s desmonds99 used DLSS 2.2 in Necromunda: Hired Gun, which reduced the ghosting on the weapon sights. Again, this is a great improvement over the previous DLSS implementation, and again, DLSS 2.0 is on the left and DLSS 2.2 is on the right.
What’s also interesting is that Reddit’s Techboah was able to improve the image quality of Call of Duty: Black Ops Cold War with DLSS 2.2. According to Techboah, the game is not as blurry as before (something we criticized in our DLSS & Ray Tracing Benchmarks article).
Lastly, here are two videos comparing Rainbow Six Siege with DLSS 2.2 Performance Mode against native 4K.

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and strongly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved the 16-bit consoles, and still does, and considers the SNES to be one of the best consoles ever. Still, the PC platform won him over. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC Graphics Cards.”
Contact: Email




Some good improvements by NVIDIA. Very happy that they’re not abandoning this tech but improving it significantly. The jump from V1 to V2 was quite big from what I remember; I hope the jump from V2 to V3 is even bigger.
Excellent!
Seems we will see more and more cases of DLSS beating 1:1 native even when it’s working from a lower base res. Why? Because devs are basically forced to use TAA, which blurs the image, since proper AA is too costly to utilize.
And here is the thing… DLSS samples from several frames, so when the picture is moving (and the motion vectors are properly added) it has way more pixels to work with. The first frame when doing heavy DLSS (Performance mode) will look like garbage, but just a few frames in, even that mode starts to look pretty good, as its number of pixels to sample from (when in motion) gets somewhat close to native.
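The pixel-count argument in the comment above can be sketched with some rough arithmetic. The 1/2-per-axis render scale for Performance mode is an assumption based on NVIDIA’s commonly cited ratios, and real temporal accumulation weights old samples rather than simply adding them, so treat this as back-of-the-envelope only:

```python
def accumulated_fraction(out_w: int, out_h: int, scale: float, frames: int) -> float:
    """Fraction of the native pixel count available after reusing `frames` frames."""
    per_frame = int(out_w * scale) * int(out_h * scale)  # internal render pixels
    return per_frame * frames / (out_w * out_h)          # vs. native pixel count

# 4K output with DLSS Performance (~0.5 per axis, i.e. 1920x1080 internal):
for n in (1, 2, 3, 4):
    print(n, accumulated_fraction(3840, 2160, 0.5, n))
# a single frame gives only 25% of the native sample count; by roughly
# four frames (in motion, with good motion vectors) it matches native
```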
I personally suspect NVIDIA’s solution will far beat AMD’s due to the Tensor cores, which won’t eat into its own potential gains, while AMD’s solution will cannibalize its own compute pipeline to process and thus eat away at its potential gains.
If you guys want to see something really cool, try using DSR to supersample to 7680×4320, then use DLSS Ultra Performance to upscale. It looks f*king phenomenal and still runs at like 80 or 90 fps. I might turn some other settings down to try and get it to 120, so I can basically get a free supersample out of it.
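As a rough check of what that DSR + DLSS combination actually renders internally, here is the arithmetic. The per-axis scale factors are approximate, commonly cited values for DLSS 2.x modes (Quality ~2/3, Balanced ~0.58, Performance 0.5, Ultra Performance ~1/3), not figures confirmed by this article:

```python
# Approximate per-axis render scales for DLSS 2.x quality modes (assumed values).
MODES = {"Quality": 2 / 3, "Balanced": 0.58,
         "Performance": 0.5, "Ultra Performance": 1 / 3}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal render resolution for a given output resolution and DLSS mode."""
    s = MODES[mode]
    return round(out_w * s), round(out_h * s)

# DSR output of 7680x4320 with Ultra Performance:
print(internal_res(7680, 4320, "Ultra Performance"))  # (2560, 1440)
```

So under these assumed ratios, an 8K DSR target with Ultra Performance renders internally at around 2560×1440, which is why the frame rate stays high.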
I feel it reduces noise in Control Ultimate Edition on Steam. Works well with CP2077. Also seems to work with No Man’s Sky.
I pasted that text into Google’s translator, and it replied ‘WTF?’.
I can’t find or download the DLL from the link posted by John… Can anyone post this file, please?
You can find the DLL in the UE4 DLSS plugin: https://developer.nvidia.com/dlss/unreal-engine-plugin