NVIDIA and BioWare have added DLSS support to Anthem, so we decided to capture some comparison screenshots and benchmark it on our NVIDIA GeForce RTX 2080 Ti. And we are happy to report that this is the best DLSS implementation we’ve seen to date.
For this DLSS benchmark, we used our Intel Core i7-4930K (overclocked at 4.2GHz) with 16GB of DDR3 RAM at 2133MHz, NVIDIA’s RTX 2080 Ti, Windows 10 64-bit and the GeForce 419.67 driver.
NVIDIA initially disappointed a lot of its Turing fans with the awful first implementations of DLSS in both Battlefield V and Metro Exodus. As we’ve stated, those initial implementations looked blurry as hell and were nowhere close to native 4K (or even 2560×1440). The green team was quick to react and released newer, better versions of DLSS for both games; however, many of its fans had already been disheartened by those initial results.
That disappointment does not carry over to Anthem, as BioWare’s latest looter shooter has the best DLSS implementation we’ve seen to date. Not only does it run better than native 4K, it also looks sharper.
The performance difference between native 4K and 4K DLSS was around 15fps in most cases, and as a result we were finally able to run Anthem at 60fps. As we’ve already stated, that wasn’t possible in native 4K, as there were major drops below 45fps in various places.
Below you can find some comparison screenshots between DLSS (left) and native 4K (right). We’ve also included the MSI Afterburner overlay in order to give you an idea of the performance increase you can expect (we suggest opening the images in new tabs so you can more easily spot the visual differences).
Enjoy!

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and strongly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit era, and considers the SNES to be one of the best consoles. Still, the PC platform won him over, mainly thanks to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC Graphics Cards.”

Looks like the AMD liars’ lies got debunked again.
What’s your problem?
The cool part is that it may even get better.
Yeah. Maybe if we don’t play this, EA will just die. That would be fun.
Die? After Apex? I don’t really think so.
No one is playing this junk
It was EA’s most successful launch since Mass Effect 3 but ok.
You heard it first here folks!
Captain Marvel sold well too.
And it’s still garbage.
That’s not what you said initially. You said “nobody plays…”
The game itself might be garbage. That doesn’t change the fact that a lot of people still play it.
Ok
This is sad
Who cares about this TRASH game... pointless article highlighting DLSS...
How is it a pointless article?
It highlights and assesses the validity and usefulness of a new technology, which is of use to people who own or are considering buying RTX cards, or even to people like me who are interested in new technology and its potential implications for the future.
Terrible game, terrible tech.
It clearly isn’t terrible tech at all though, and if you’re saying otherwise, you’re a) wrong, b) a little slow and c) deluded.
Should have gone to Specsavers. /facepalm
You clearly went to special school
I’m guessing you missed school, otherwise you would have learned to end a sentence with a period.
Bottom line, do images look better with DLSS? Nope. Cry more.
Oh no!!! You burned me good, I missed a period, how will I ever live with the shame?
Clearly you haven’t missed a period, as you act like a woman on the rag.
“Bottom line, do images look better with DLSS? ”
Better than what? A native image resolution?
Well, duh!!! It’s clearly not designed to look better than a native resolution image. It’s meant as a lower-cost alternative to running at native resolution, one that frees up performance to be used elsewhere. It’s a hardware/software-based cheat, or an optimisation in a sense, in much the same way that developers don’t use fully ray-traced lighting in all games, as the cost is too high. So they find ways of emulating it as best they can, using baked-in reflections etc.
If your expectation is that something will exceed native resolution, then you’re going to be disappointed a lot.
That’s like expecting a vegan “steak” to taste better than a 9 oz sirloin steak.
And if you read the article, DSOG even concludes that DLSS actually looks better than native 4K in one sense:
“Not only does it run better than native 4K, it also looks sharper.”
Having the option of an image that looks very close to native 4K, with a performance boost as found here, would be very acceptable and preferable to many people, I’m sure.
RTX GPUs are rightly criticised for being overpriced, lacking support in enough games to justify their existence, etc., but you are clearly incapable of showing an ounce of objectivity and are biased against them in an unreasonable manner.
The results for DLSS in this game are factually very good, and offer an image that bears remarkable similarity to a native 4K image, which is about all you could reasonably ask from the technology.
Should have gone to Specsavers. /facepalm
Does it undowngrade it to look like the reveal trailer from E3 2017?
It looks really good, wow. How about Metro’s DLSS in the new patch? Please inform us.
DLSS seems a promising tech, and through DLSS, ray tracing will shine on next-gen consoles.
“It looks really good” https://uploads.disquscdn.com/images/f19d53e94c9cdf0b933c4ca7d8c1d73d3a5a2b1f3149f60c8567127d8647f23e.jpg
“How about Metro’s DLSS in the new patch?”
Metro has that and it looks terrible.
DLSS can be improved as the AI learns from the game, so the DLSS in Metro could in theory be improved to levels almost as good as this
Metro is so sh*t that they need AI to improve the game?
We are discussing Deep Learning Super Sampling here.
The AI is what the feature is built on: it attempts to fill in missing data in non-native images and improves image quality by doing so, running multiple passes over the information and learning how to better fill in the gaps.
Due to the nature of how this works, it improves over time as more information is acquired (see the sketch after this comment).
Your perceived quality of the game is irrelevant in this respect.
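To make the idea above concrete, here is a minimal sketch of a learned upscaler in PyTorch. This is an illustration only, not NVIDIA’s actual DLSS network or training pipeline: a couple of convolutions extract features from a low-resolution frame and a sub-pixel (PixelShuffle) layer rearranges them into a higher-resolution image. In a real deep-learning upscaler the weights would be trained against ground-truth high-resolution frames, which is where the “learning over time” comes from.

import torch
import torch.nn as nn

class ToyUpscaler(nn.Module):
    """Tiny super-resolution sketch: feature extraction + sub-pixel upsampling.
    Illustrative only; not NVIDIA's DLSS architecture."""
    def __init__(self, scale: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1),  # extract local features
            nn.ReLU(),
            nn.Conv2d(32, 3 * scale * scale, kernel_size=3, padding=1),
        )
        # PixelShuffle rearranges the learned channels into a larger image grid.
        self.upsample = nn.PixelShuffle(scale)

    def forward(self, low_res: torch.Tensor) -> torch.Tensor:
        return self.upsample(self.features(low_res))

# Stand-in for a 1080p rendered frame, upscaled 2x toward 4K.
frame = torch.rand(1, 3, 1080, 1920)
print(ToyUpscaler(scale=2)(frame).shape)  # torch.Size([1, 3, 2160, 3840])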
“Your perceived quality of the game is irrelevant in this respect”
So EA is pumping graphics into this junk so it can be relevant??
Cuz there is not much to talk about in Anthem besides ray tracing and DLSS.
It’s still a game and not a tech demo.
“unlike scam citizen”
Are you capable of staying on topic or having a coherent conversation?
One minute you’re saying Metro is sh*t and now you’re talking about Anthem and EA.
The article in reality is not about a particular game; it’s more about the implications of a new technology within it and how effective it is.
EA, the quality of this game, Metro etc. are not relevant to that.
EA are a horrible company and this game is most probably terrible (I’ve no intention of finding out), but the technology used here is interesting regardless
Consoles are not using Nvidia hardware, so they won’t have DLSS.
DLSS can’t be on consoles since it’s Nvidia tech. Consoles are all AMD.
I’m talking about the PS5. They may use an Nvidia GPU, why not? Even if they don’t use an Nvidia GPU, something like DLSS will be used, I mean deep-learning anti-aliasing tech.
Nvidia isn’t friendly with other manufacturers. It is the same reason Apple won’t use them. They won’t even let Apple write their own drivers. They’ll never be in consoles because it is their one goal to control everything.
Advanced upscaling already exists on consoles…
Why do you have to be so uninformed? The Nintendo Switch uses an NVIDIA GPU/CPU called Tegra.
Are we talking about Nintendo? No.
That’s a handheld and completely not related to the discussion. We’re talking about consoles over here.
It is a console and can be a handheld; again, you don’t seem to know much.
Yeah, sure. Except it is built from purely mobile components. It runs on less than 10 watts, uses an ARM processor and low-power RAM, and runs off a battery.
Just because it can display an image on a TV doesn’t make it a console. My phone can do that. It isn’t a console.
Don’t play dumb ( you are playing, right?)
It’s a console, get over it. NVIDIA also made the GPU for the Xbox 360 as well; you just don’t like that you’re wrong. By your logic the Wii U wasn’t a console then, but the Switch is more powerful, idiot.
How old are you?
I ask, because you talk like a 3 year old and use the logic of a 5 year old. None of your “points” are even counter arguments.
It’s not a matter of the brand name attached to it, it is a matter of the architecture.
It’s clear you’ve never taken a computer science class in your life. Middle school I’m guessing?
Listen little boy who thinks he knows something, I can educate you only so much, Tegra is a very powerful APU, stop with the projection and admit you’re wrong.
Because both Sony and AMD have already confirmed their partnership.
And there’s no point really, AMD already uses similar techniques to upscale resolution. Classic upscaling, checkerboarding, interlacing, etc. They’re not gonna make Sony (and then the consumer) pay extra for upscaling through dedicated hardware, that’s just not a good idea.
DLSS in Anthem is far superior to all those upscaling techs you mentioned, so they will use DLSS or a similar tech in the next gen, and we will see ray tracing for sure.
So you’re just gonna ignore everything I said and keep on being dumb? Okay, suit yourself.
You say it’s like the upscaling techs before it and I say no, it’s totally different. If it has Anthem-quality DLSS it’s a game changer, because people expect nothing less than 4K from the PS5 and anything other than 4K resolution is unacceptable to them, in my opinion. They are tired of upscaling crap; otherwise it’s really good.
It’s not a game changer, it’s barely any different from regular upscaling. Anthem is the first game where it’s actually done somewhat properly but only because they used excessive sharpening.
If you don’t want upscaling crap just don’t buy a console.
Considering people are buying consoles in the first place, I don’t think they care if it is native 4K or not. Most people won’t even know the difference.
At least those PS2, PS3 and PS4 players are now bigger boys and their knowledge is far greater, so they do care, I guess. They are not children anymore.
This isn’t true, actually. There are several forms of upscaling that are as good or even better.
There are worse too, but that’s all up to the developers.
It would be a bad decision to put dedicated hardware into something that can be done for free.
This is another Gsync vs Freesync kind of thing.
There is no upscaling method better than DLSS in Anthem, and it’s far better than all previous methods.
With a loss of detail and over-sharpening? I strongly disagree…
They basically blurred the image and then applied a sharpening filter over the top of it (see the sketch after this comment). Good enough for the masses, but I’m not a fan.
The Spider-man implementation on PS4 is more convincing. No sharpening, and a nice, clear image without losing detail.
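For what it’s worth, the “blur then sharpen” idea mentioned above can be illustrated with a few lines of Pillow. This is a purely hypothetical sketch of that general technique (a Gaussian blur followed by an unsharp mask), not Anthem’s or NVIDIA’s actual post-processing:

from PIL import Image, ImageFilter
import numpy as np

# Build a synthetic frame so the snippet is self-contained
# (a stand-in for an upscaled game frame).
rng = np.random.default_rng(0)
frame = Image.fromarray(rng.integers(0, 256, (1080, 1920, 3), dtype=np.uint8))

# Soften the image, then apply a sharpening (unsharp mask) pass on top.
soft = frame.filter(ImageFilter.GaussianBlur(radius=1.5))
sharp = soft.filter(ImageFilter.UnsharpMask(radius=2, percent=150, threshold=3))
sharp.save("frame_sharpened.png")

Overdoing that sharpening step is what produces the over-sharpening and loss of fine detail being described here.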
It’s far superior, man, in my opinion. That Spider-Man was just checkerboard crap.
You’ve obviously not seen it in person.
You’re right, it’s not checkerboard. Yeah, I must check it out.
Digital Foundry talks about it in one of their Spider-Man episodes. It’s worth a watch.
Sony and Microsoft already use their own upscaling techniques. Ray tracing probably won’t come to consoles for a couple of generations at least.
AMD Partnered Games: Resident Evil 2 Remake, Devil May Cry 5, The Division 2
Nvidia Partnered Games: Battlefield V, Anthem, Metro Exodus
It turns out AMD had better taste in games to partner with.
Hey. Leave Metro alone.
I played a couple of hours then left it alone. Little scripted routines annoy me, while the AI is just awful.
Does the new patch improve the AI?
It’s a steaming turd
And they run better. Nvidia and their GimpWorks bullsh*t. But seriously though, I’m truly impressed with how well The Division 2 runs, especially in DX12. There is not one Nvidia game that runs smoothly, because GimpWorks will always be sh*t. I was with Nvidia for 8 years. I had a 690 and then a Titan X. I watched Nvidia nerf the hell out of that 690 as the years went by. I went back and tested the same games I was getting 200fps in, and now I’m lucky if I get 60-80. I’ve experienced their treachery first hand; Nvidia is trash and we the consumers are to blame, because as soon as they announce some B.S. with a fancy name, everyone fanboys out and flocks to it like it’s the second coming. They don’t even create these so-called new techs, they just latch on to them by giving them a name and saying, “look what WE have”. So tell me this, Batman: why is it that PhysX and even HairWorks run better on AMD cards than they do on Nvidia’s own hardware? The sad part is that AMD has gotten lazy and isn’t competing with Nvidia, so they are left to run amok in the industry.
I could agree on RE 2 Remake, and maybe on DMC 5, but The Division…
I am hearing great things about The Division 2. Also it has an 85 on Metacritic.
I don’t really care about Metacritic and other sources; I use myself as a source, honestly.
Same for me as well, but it seems The Division 2 is a massively improved sequel. Will eventually check it out when it’s on a deep, deep sale or it comes to Steam.
No, they’re getting paid by your clicks. You’re feeding the beast that slays you.
How many console games run in native 4K instead of upscaling 2K to 4K?
SotTR is better