505 Games and Kojima Productions have just released Death Stranding on the PC. Death Stranding supports both DLSS 2.0 and FidelityFX Upscaling. As such, instead of sharing our initial 4K performance impressions, we’ve decided to test these reconstruction techniques. We’ve also included Native 4K screenshots in order to showcase the graphical differences between them.
In order to capture the following screenshots, we used an Intel i9 9900K with 16GB of DDR4 at 3600MHz. Naturally, we’ve paired this machine with an NVIDIA RTX 2080Ti. We also used Windows 10 64-bit and the latest version of the GeForce drivers. Lastly, we’ve included the MSI Afterburner overlay in our screenshots in order to give you an idea of the in-game performance.
As we can see, FidelityFX Upscaling and DLSS 2.0 Quality Mode perform similarly. However, FidelityFX comes with a Sharpening slider that lets you improve the overall image. Thanks to it, the FidelityFX Upscaling screenshots can look sharper than both Native 4K and DLSS 2.0.
On the other hand, DLSS 2.0 does a better job at eliminating most of the jaggies. Take a look at the fence (on the right) in the seventh comparison for example. That fence is more detailed in DLSS 2.0 than in both Native 4K and FidelityFX Upscaling.
Now while DLSS 2.0 can eliminate more jaggies, it also comes with some visual artifacts in motion. Below you can find a video showcasing the visual artifacts that DLSS 2.0 introduces. Most of the time, these artifacts are not that easy to spot.
It’s also worth noting that FidelityFX Upscaling introduced some artifacts during some cut-scenes. These artifacts were completely gone when we disabled FidelityFX (or when we restarted the game). So yeah, this is something that you should also consider before enabling it.
All in all, DLSS 2.0 is slightly better than both Native 4K and FidelityFX Upscaling. Performance-wise, FidelityFX and DLSS 2.0 perform similarly. FidelityFX Upscaling comes with a sharpening slider through which it can provide a sharper image than both Native 4K and DLSS 2.0. However, there is more aliasing with FidelityFX Upscaling than with both Native 4K and DLSS 2.0. On the other hand, DLSS 2.0 can eliminate more jaggies, but it also introduces some visual artifacts.
Below you can find our comparison screenshots. Native 4K is on the left, DLSS 2.0 is in the middle, and FidelityFX Upscaling is on the right. We also suggest opening the images in new tabs.
Stay tuned for our PC Performance Analysis!

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved, and still loves, the 16-bit consoles, and considers the SNES to be one of the best consoles. Still, the PC platform won him over. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC Graphics Cards.”

Why does DLSS 2.0, when on, look better than native 4K? I thought that when it was on, performance would be better but it would look a bit worse visually?
It’s because TAA is mediocre most of the time, to say the least, so DLSS does a better job at AA. Basically a win-win for DLSS 2.0. Hope more games implement it.
DLSS 2.0 makes games run better AND look as good if not better (especially on certain aspects such as shimmering and fine detail) than native resolution. It’s basically magic.
It’s difficult to see the differences with downscaled 4K screenshots, John. You must see these screens at their native res at 100% zoom, otherwise it’s hard to notice any difference.
Open them in new tabs and you’ll get their 4K versions 😉
Thanks man, didn’t know that
It’s IMPORTANT to point out that you can have BOTH DLSS 2.0 and a contrast adaptive sharpening filter enabled if you have NVIDIA. NVIDIA Freestyle has one, and the AMD one was ported to ReShade. The only thing the FidelityFX option is doing is rendering at a lower res and applying a CAS filter on top to regain detail, nothing complicated. At least not compared to DLSS, which is leaps and leagues beyond in terms of technology.
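For anyone wondering what “rendering at a lower res and applying a CAS filter on top” actually means, here is a rough Python/NumPy sketch of the idea. To be clear, this is only an illustration under my own assumptions (the function names, the 0..1 sharpness parameter and the exact weight mapping are made up), not AMD’s actual FidelityFX CAS shader, whose real HLSL implementation is more involved.

```python
# Rough sketch of "naive upscale + contrast adaptive sharpening", assuming a
# float RGB image in [0, 1]. This is NOT AMD's actual FidelityFX CAS shader,
# just an approximation of the same idea for illustration.
import numpy as np

def upscale_nearest(img, scale):
    """Naive (nearest-neighbour) upscale of an HxWxC image."""
    return np.repeat(np.repeat(img, scale, axis=0), scale, axis=1)

def cas_like_sharpen(img, sharpness=0.5):
    """Contrast-adaptive sharpening, loosely modelled on AMD's CAS.

    Each pixel is blended with its four cross neighbours using a negative
    weight whose magnitude shrinks where local contrast is already high
    (to limit ringing/aliasing) and grows in flatter areas (to regain detail).
    """
    padded = np.pad(img, ((1, 1), (1, 1), (0, 0)), mode="edge")
    up, down = padded[:-2, 1:-1], padded[2:, 1:-1]
    left, right = padded[1:-1, :-2], padded[1:-1, 2:]
    centre = img

    # Local min/max over the cross neighbourhood (per channel).
    mn = np.minimum.reduce([up, down, left, right, centre])
    mx = np.maximum.reduce([up, down, left, right, centre])

    # Adaptive amount: approaches 0 at strong edges, larger in flat regions.
    amount = np.sqrt(np.clip(np.minimum(mn, 1.0 - mx) / (mx + 1e-6), 0.0, 1.0))

    # Map the user-facing 0..1 "sharpness" slider to a negative lobe weight,
    # similar in spirit to CAS' -0.125..-0.2 range.
    w = amount * -(0.125 + 0.075 * sharpness)

    sharpened = (w * (up + down + left + right) + centre) / (4.0 * w + 1.0)
    return np.clip(sharpened, 0.0, 1.0)

# Usage: pretend we rendered at 1080p and want a sharpened 4K-ish output.
low_res = np.random.rand(1080, 1920, 3).astype(np.float32)
output = cas_like_sharpen(upscale_nearest(low_res, 2), sharpness=0.8)
print(output.shape)  # (2160, 3840, 3)
```

The adaptive weight is the key detail: the sharpening lobe shrinks where local contrast is already high, which is why CAS rings less than a plain unsharp mask but can still add aliasing and noise when the slider is pushed too far.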
That’s the extra sharpening layer which adds aliasing, by the way. It’s hardly visible in screenshots at less than 100% zoom, but it’s pretty noticeable in-game.
You can adjust sharpness through the NVIDIA Control Panel…
I know, I was just saying that the sharpening added by FidelityFX can add aliasing and noise, as with any sharpening shader.
You’re either blind or a fanboy. Zoom into the pictures. FidelityFX is just naive upscaling with a sharpening filter on top.
I still can’t believe that the cutscenes in this game are all in-engine. Are they really all done in-engine? You can control the camera in some of them, so I guess they are, but damn, they look beautiful. I’ve never seen anything like it.
It’s really nice seeing DLSS pull past native res (with TAA) in some parts.
The resolution race leads nowhere, and with DLSS we can stop pushing render resolution and just increase output resolution at will (at least by 100% over the internal res). We will be able to render at 1440p or even just 1080p internally for a decade to come instead of overtaxing GPUs with native 4K.
In the past I’ve often criticized NVIDIA for their gimmicks, but the Tensor cores are anything but. Machine learning is going to be crucial for advancing games in the future, not just graphically but with complex physics like fluid simulation and yes, even “AI”. Single-player enemies will remain more hand-scripted for longer, but multiplayer bots especially are going to benefit greatly from running on a neural net. They will be much less predictable and boring to play against, and they will figure out behaviors that traditional game AI never could. Massive trial and error is so much more powerful than trying to script a bot to do everything rationally and correctly from the start. NVIDIA has pushed this long before AMD even considered it, and at least the new Xbox now also has support for it, so there’s a real chance we will see it in mainstream games.
What was the internal rendering resolution for 4K DLSS 2.0? It’s important, and the article doesn’t clearly say. Is it 1080p, or 1440p?
Here are the first 2 screenshots, Native 4K vs DLSS 2.0:
https://screenshotcomparison.com/comparison/4531/picture:0
I think you have the sharpening turned up too much in the FidelityFX screenshots. By default, I think they set it to 80% for whatever reason in the AMD control panel at the driver level (and seemingly to 100% in this game). In Far Cry 5, which I only started playing recently, I found anything above 10% sharpening looked worse, but having the sharpening turned off completely made the game extremely blurry with TAA on. With TAA on and 10% sharpening, the blur is seemingly gone, and the sharpness is neither too much nor too little. The flaws I’m seeing as a result of the sharpening in the FidelityFX screenshots are probably there because it is turned up far too high.
[Edit: the optimal amount of sharpening is probably something that needs to be adjusted on a game-by-game basis (and may differ between AA methods such as TAA and FXAA, and their implementations in specific games), but 10% is probably safe in terms of not being too much. Again, in FC5, anything above 10% looked worse, in that things such as foliage looked too “harsh”, in a way.]