As we reported a couple of days ago, Eidos Montreal and Crystal Dynamics released a new patch for Marvel's Avengers that adds support for AMD's FidelityFX Super Resolution. Since the game now supports both NVIDIA DLSS and AMD FSR, we've decided to test and compare them.
For these benchmarks, we used an Intel i9 9900K with 16GB of DDR4 at 3600MHz, an NVIDIA RTX 3080, Windows 10 64-bit, and the GeForce 471.11 driver. Moreover, we did not update the game's DLSS file (by default it's using version 2.1.52.0). For NVIDIA DLSS we used the Quality Mode, and for AMD FSR we used the Ultra Quality Mode. These are the highest-quality modes the two techniques offer.
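For context, these two modes do not render from the same internal resolution. Here's a quick back-of-the-envelope sketch using the per-axis scale factors both vendors have publicly documented (1.3x for FSR Ultra Quality, 1.5x for DLSS Quality):

```python
# Internal render resolutions implied at a 4K (3840x2160) target,
# using the publicly documented per-axis scale factors of each mode.
TARGET_W, TARGET_H = 3840, 2160

MODES = {
    "FSR Ultra Quality": 1.3,  # renders at ~77% of the target per axis
    "DLSS Quality": 1.5,       # renders at ~67% of the target per axis
}

for name, factor in MODES.items():
    w, h = round(TARGET_W / factor), round(TARGET_H / factor)
    print(f"{name}: {w}x{h} ({w * h / 1e6:.1f} MP)")

# FSR Ultra Quality: 2954x1662 (4.9 MP)
# DLSS Quality: 2560x1440 (3.7 MP)
```

In other words, DLSS reconstructs its 4K output from roughly 25% fewer pixels per frame than FSR Ultra Quality, which is worth keeping in mind for the comparisons below.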
In this particular game, DLSS offers better image quality than even native resolution. Below, we've included some comparison screenshots of native 4K (left), DLSS (middle) and FSR (right).
Take, for example, the bushes in the first screenshot. As we can clearly see, the DLSS image looks sharper, with less aliasing. These improvements are also noticeable in the screenshots featuring the Helicarrier: distant objects look more refined, with fewer jaggies, in the DLSS shots. Native 4K comes in second place, whereas FSR comes in third.
Now what’s also interesting here is that DLSS also runs faster than FSR in Marvel’s Avengers (at least on our RTX 3080). As we can clearly see, DLSS was constantly faster, by around 4-5fps, than FSR.
In conclusion, NVIDIA's DLSS implementation in Marvel's Avengers is truly amazing. Not only does it look better than AMD's FSR, but it also runs faster. Additionally, DLSS reduces aliasing to such a degree that it looks even better than native resolution.

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and strongly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved the 16-bit consoles (and still does), and considers the SNES one of the best consoles ever. Still, the PC platform won him over, mainly thanks to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on "The Evolution of PC Graphics Cards."

Finally, someone who has eyes. The number of people saying that they are indistinguishable… let's just say there's no wonder why developers treat gamers the way they do.
As someone who doesn't have the choice, since my GTX 1060 can't use DLSS, we have to take what we can get. But nobody is playing Marvel's Avengers anyway, so we don't have much to lose.
It depends on the game & implementation. In Necromunda: Hired Gun they damn near ARE indistinguishable. FSR requires some developer-specific tweaking to look its best for a given game, and some developers have done this part MUCH better than others. DLSS will generally have the edge (and UNDISPUTEDLY so at lower resolutions & upscaling quality levels, FSR is basically useless for <=1080p for anything but integrated graphics), but if properly tuned for the specific game in question, FSR Ultra Quality at 4K can get very, VEEEEEEERY close in image quality to DLSS Quality also at 4K.
>developer specific tweaking
What tweaking? There is literally nothing you can adjust besides the strength of the sharpening pass.
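That one knob is documented, at least: in AMD's public FSR 1.0 source (the ffx_fsr1.h header on GPUOpen), the RCAS sharpening pass takes a single sharpness value in "stops", where 0.0 is maximum sharpening and each extra stop halves it. A minimal sketch of just that stop-to-scale mapping (not AMD's actual shader):

```python
# FSR 1.0's RCAS sharpening strength is specified in "stops":
# 0.0 = maximum sharpening, and each additional stop halves it
# (the header converts stops to a linear value via exp2(-sharpness)).
# This sketch only reproduces that stop -> scale mapping.
def rcas_sharpening_scale(stops: float) -> float:
    return 2.0 ** -stops

for stops in (0.0, 0.25, 1.0, 2.0):
    print(f"sharpness={stops} stops -> {rcas_sharpening_scale(stops):.2f}x")
```

The other integration choice developers control is where the passes run in the frame, e.g. applying FSR before film grain, chromatic aberration, and the UI, which is presumably the "tweaking" being argued about here.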
There’s no need to get hung up on this, it shows what everybody already knows, DLSS is the more advanced tool, if you can’t use it there’s FSR.
No amount whataboutism will change that…
The fanboy has a hard on for FSR.
GPU scaling: 1800p looks pretty close to 4k.
FSR: 2159p + heavy sharpening looks better than native 4k! No, really guys! Trust me.
DLSS: Here’s 1 pixel we’ve upscaled into 16k and added the missing details.
Lol Leonardo is literally the image I have of the posters on this site. Chubby, greasy hair, with a stupid looking mustache with accompanying hair on the wide neck.. Holy crap I spit out my drink it’s so damn accurate to the image in my head.
That’s on AMD from. FSR is e great addition to PC gaming, but c’mon, obviously DLSS is and will ever be better. Still FSR is a great implementation and incredible usefull. The image quality is very close to native resolution, which is impressive. DLSS is slightly better than native.
But FSR works on 1650S … Nvidia sucks
Guess what? Custom resolutions and GPU scaling work on pretty much any GPU.
Traditional bilinear GPU upscaling doesn't have ANYTHING on FSR. The quality isn't even REMOTELY close. FSR is the undisputed best spatial upscaling algorithm on the market, end of story. And it takes many temporal upscaling solutions to task as well, with similar still-frame image quality (assuming you don't completely bork the comparison with different graphics settings, a la having different Depth of Field configs and only comparing at a minuscule 1080p res, like that idiot Alex over at Digital Foundry), while lacking the awful motion artifacts inherent to TAAU like the one built into Unreal. Get lost, ya tech-illiterate dingus. The grown-ups are talking.
I love all the hurt feelings over inferior products that are repackaging, re-marketing, and oversharpening current solutions. Sure FSR works, but we all have scaling options already at hand. FSR will just become another reshade shader like AMD CAS.
I guess comparing traditional upscaling to FSR is not okay, but comparing FSR to DLSS is? Traditional may not be as good, but it can yield similar results to FSR with a variety of sharpening options available. FSR is not as good as DLSS and never will be, as it's not an intelligent upscaler. THE QUALITY ISN'T EVEN REMOTELY CLOSE. To quote an idiot.
But but but, at 4K, FSR and DLSS look almost the same. Yeah, no sh*t: you are upscaling a high res to a high res, with more data than a 1080p image. Let's see how FSR and DLSS do at 720p or 1080p. Or set a custom 3200×1800 resolution and GPU scale to 4K. Bet you couldn't tell the difference between GPU scaling and FSR except for the excessive sharpening, which doesn't equal better IQ.
Huge win for DLSS.
https://media2.giphy.com/media/8YBpKSm3uPWG9Ca0F4/giphy-downsized-medium.gif
Haha, you're always after that boy. In a new and unique way too.
Not really. When FSR is properly tuned for the game in question, it can get CRAZY close to DLSS in visual quality at 4K: Ultra Quality FSR vs Quality DLSS (aka the highest quality setting for both at 4K res). See Necromunda: Hired Gun for a much better example of FSR vs DLSS when FSR is properly tuned.
Otoh, when developers are lazy with their FSR implementations and don’t do any (or enough) game specific tweaking of the FSR parameters, you get stuff like the above example.
As FSR gets used more & more, developers will become more & more comfortable with modifying it to best suit their games, just like what's already happened with DLSS.
All that said, DLSS will remain unquestionably superior to FSR at lower resolutions & upscaling quality levels. FSR is basically worthless at <=1080p for anything but integrated graphics, where DLSS otoh can still produce acceptable results. Thankfully 1080p is a definitively "last-gen" resolution, & 4K is RAPIDLY becoming the standard developer target resolution.
You're right, but people are opining on the still images from this article. From that point of view, they're right. On still images there's no huge win for DLSS, just very close results with DLSS only slightly better.
But anyone who has seen the Digital Foundry review would know that this is true for the Marvel game. It's clearly much better, and how much better it is becomes far more noticeable in motion than in stills like these.
On the other hand, in Necromunda it's indistinguishable and FSR has virtually the same quality as DLSS. So it will always depend on the game's implementation and development.
Is it so hard to label them? At least name the fckn jpgs so we can read that and get it! -_-
“native 4K (left), DLSS (middle) and FSR (right)”
Are you retarded or something?
You are the rtard for not being able to understand basic implications.
Apparently you are, because it's easy to understand which is which. Stupid boy!
Huge win for DLSS 2.0.
Curious what native resolution FSR Ultra Quality is closest to, visually speaking, and what it is closest to performance-wise. One should do the comparison against the resolution with similar performance metrics for a better assessment.
Performance of the various quality levels is similar between FSR and DLSS, except DLSS has extra lower tiers available that AMD wisely does not allow in FSR. Each tier of DLSS looks a lot better than the same tier in FSR, and the highest tier looks MOSTLY better than native.
I am aware of all that. I didn't even mention DLSS, as I don't think anyone should be comparing the two. I'd like a comparison of native at a performance-matched resolution to FSR. That's all. That would be the fairest comparison, and no one has done it. They have been doing comparisons of the native baseline FSR resolution to FSR upscaled, and FSR upscaled compared to the FSR target resolution, but not performance-matched native resolution to FSR. This was done back in the DLSS 1.0 days (remember the 1800p vs DLSS Quality 4K comparisons back in the day?) but people seem to have forgotten about the FSR overhead for some reason. I don't know what the results would be and am too lazy to test it myself, but analyses that don't do it right are just wasting everyone's time.
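For anyone who wants to run that test, here's a rough starting point (a sketch, not measured data; the per-axis scale factors are from AMD's FSR 1.0 documentation):

```python
# Candidate "performance-matched" native resolutions for FSR at a 4K
# target: each mode's internal render resolution, derived from the
# per-axis scale factors in AMD's FSR 1.0 documentation. Benchmark
# these as custom GPU-scaled resolutions, then nudge the native
# resolution upward until frame rates match, since FSR's own pass
# adds a small cost on top of the internal render.
TARGET_W, TARGET_H = 3840, 2160

FSR_MODES = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

for name, factor in FSR_MODES.items():
    w, h = round(TARGET_W / factor), round(TARGET_H / factor)
    print(f"{name}: {w}x{h} internal ({w * h / 1e6:.1f} MP)")
```

So for FSR Ultra Quality at 4K, the fair fight would be against a custom ~2954×1662 resolution GPU-scaled to 4K, adjusted slightly until the frame rates line up.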
We had to wait 2 days for this? Really?!?!
Technically NVIDIA wins on the technology, but in popularity AMD passed over it and gave one more option to those who cannot replace their GPU at today's abusive prices worldwide. Where AMD stands out is that FSR works on virtually all GPUs with an acceptable level of graphics power; there are even GPUs not officially supported by FSR that can still run it, and you can make it work even with an Intel GPU. Where it hurt the most was that NVIDIA left out its GTX series without even a minimal option to use the technology, and AMD came to the rescue with this. So, no need to go around in circles: AMD won, and NVIDIA was left as the company that abandoned a series it could have rescued, giving nothing to those who bought GTX series GPUs that still perform very well, especially given the current price situation that isn't forgiving for everyone. Accept that it is positive that AMD has done something, even if some think it sucks. AMD helped and NVIDIA refused.
A USER FROM NVIDIA GTX 1060…
“AMD helped and NVIDIA refused”
Just history repeating itself. Still haven't forgotten NVIDIA's game libraries that were designed to not work well on non-NVIDIA GPUs.
Am I supposed to believe that the order is 4K, DLSS, FSR? 😀
https://uploads.disquscdn.com/images/f85ae6cffeee573b2f51869529745f095252a97700d799a0af28d9b88ef29df9.png
Yes. That is the result you would expect, had you been paying attention over the last year.
The only explanation would be DLSS doing some sort of MSAA under the hood. It can't make up all those details out of thin air.
And for crying out loud use PNG not JPG for comparison.
4K PNG? good luck with that lmao
11mb png vs 2mb jpg. What is there to cry about? Are you on dial-up?
Native 4K looks better than the DLSS and FSR versions, as expected.
Yup. But this is a particularly bad/lazy FSR implementation. See Necromunda: Hired Gun for a FAR superior one, at which point, at 4K, FSR Ultra Quality gets EXTREMELY close to DLSS Quality (aka the highest upscaling settings for both), and both are VERY close to native 4K. DLSS unquestionably takes the win at lower resolutions/quality levels though.
Look at the pictures next time.
Mate, you really need glasses or a new display. (I'm joking, you have the right to disagree with anyone.)
Anyway, try looking at the edges of objects, look at distant objects, and most importantly, compare the images in full screen, especially if you don't have a 4K panel.
Yeah, let's just totally leave out the litttttle fact that DLSS only works on a TINY fraction of GPUs that are currently impossible to buy, while FSR works on literally ANYTHING that supports DirectX 11!… -_-
You forgot the footnote that image quality may differ depending on the GPU (looks better on AMD cards)
Let’s also cut out the fact that we’ve had standard TAA Upsampling for several years now, which produces objectively superior results, and it’s been employed by every major game engine already.
Strange how Necromunda was virtually indistinguishable and this one vastly favors DLSS.
Blame the game-specific developer tweaking of the FSR algorithm to best suit the game in question. Or more specifically, the lack thereof in this case. Just like with DLSS, as more & more developers implement FSR into their games & engines, they will get more & more comfortable tweaking it to achieve the best possible results.
All that said, DLSS will still win at lower resolutions & upscaling quality levels, regardless of how well FSR is tuned for the specific game in question. The higher your target resolution, the better FSR works & the closer it gets to DLSS & native res in image quality.
Bunch of nonsense. FSR simply looks better in dark, low-contrast games like this; it was the same result in the Terminator game.
Who else wants to bet Digital Foundry’s Alex will completely ignore Necro and laser focus on Avengers to show how much better DLSS is period? Calling it now.
>completely ignore Necro and laser focus on Avengers
It's already been established that the shortcomings of FSR are less visible in games with low-contrast, dark art styles and lots of post-processing; they literally showed that in their first video with the Terminator game.
Necromunda, low contrast?
Are you real?
Lol, and he'll bork the comparison with different graphics settings, as well as refusing to test above 1080p again (aka completely gimping spatial upscalers like FSR vs temporal ones like Unreal's TAAU or DLSS).
Also they didn’t “break” the comparisons in any way, the articles were updated, and it made absolutely no difference for the bottom line.
It’s hilarious how people got so emotional due to them criticizing FSR and attempted to start such drama over it.
DLSS wins in Necro. https://imgsli.com/NjE4Mjg
I must be blind, because I didn't see any noticeable difference here. But in Marvel there's a clear win, though still great work from FSR too.
DLSS is never going to be "better" than native resolution, and I wish people would stop "reporting" it in this language. Perhaps you prefer the image it presents, much the same way some people prefer a vivid display profile over one that has been properly calibrated for accuracy; there is no accounting for taste, but information loss is never "better" than retaining the information. The difference is between running algorithms to try and guess the missing pixels versus knowing what those pixels actually are. You may not like what they actually are, but that doesn't change what they should be. Blame the assets and implementation if you prefer the less accurate image, but stop reporting that DLSS is better than native, because it shows a great deal of bias.
You’re just ignorant about how the rendering aspect works.
>information loss is never “better” than retaining the information
TAA used by every game leads to information loss, especially with sub-pixel detail and fine geometry.
DLSS is better at resolving such details than any form of TAA, which is what leads to CERTAIN ASPECTS looking objectively superior to native resolution with TAA.
You also don’t understand how DLSS operates, the reconstruction it does is based on the ground truth pixels of the game across multiple frames, it’s not simply hallucinating baseless information.
“information loss is never “better” than retaining the information”
Like how native rendering “loses” the information from past frames and DLSS “retains” it? The SS stands for Super Sampling, you know. DLSS uses more information than native rendering. There are more pixels in three 1440p images than in one 4K image.
Zoom in on distant trees in these screenshots and stop being in denial. DLSS can be better than native. It isn’t always as good as native, and when it produces artifacts it’s worse than FSR! But on average DLSS already looks better than native and every update does away with a few artifacts. If AMD fail to develop a competitor over the next few years then in the long term they’re finished making GPUs. In the short term AMD don’t feel the pain because they can still sell to miners. But once GPU mining goes out of fashion (Ethereum is dropping it soon) what will AMD’s GPU division produce? You can’t compete with twice the effective rendering resolution by lowering prices and running hotter. If nvidia cards only ever need to render half the resolution to produce the same image AMD GPUs are simply done.
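For what it's worth, the pixel arithmetic in that claim checks out (taking the three-frame figure at face value; my understanding is that DLSS can accumulate over even more frames than that):

```python
# Three 1440p frames vs a single native 4K frame, in raw pixel counts.
three_1440p_frames = 3 * 2560 * 1440  # 11,059,200 pixels
one_native_4k = 3840 * 2160           #  8,294,400 pixels
print(three_1440p_frames > one_native_4k)  # True
```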
Some kind of label or watermark on the photos would be nice, to know which is which.
https://twitter.com/IGN/status/1416790212815081472
It’s also worth noting that 4k is rare amongst PC gamers and that DLSS holds up well at lower resolutions, while FSR falls apart. This isn’t limited to just Marvel’s Avengers, but holds true in general.
I feel FSR is designed for consoles, since those will be plugged into 4K TVs. Again, it's AMD holding back PC gaming with console tech.
Consoles literally have no need for FSR.
Existing methods like checkerboard rendering and TAA Upsampling (current common standard), are superior to FSR and already used across multiple games/engines for multiple years.
AMD's marketing was successful because the average person, even including PC gamers and "enthusiasts", is completely clueless about game rendering and what the state of the art is.
So people literally think that FSR is the first open source/hardware agnostic upscaling technique…
Lol what a sperg
DLSS also wins in Necromunda: Hired Gun. Look at the metal chains in the background: they are reconstructed in a much better way with NVIDIA's AI tech.
https://imgsli.com/NjE4Mjg
Still a great job from FSR technology. Really amazing. Thanks for this article.
Yes, let's compare a method that has years under its belt while being under lock and key of a greedy corporation vs a whole new method that's open source and shared with everyone while still in its infancy.
John Papadopoulos is quite the shill.
In a year's time things will turn out differently. See what happened to Intel; this has NVIDIA quite worried as its market edge starts to rub out.