Alex Tardif, Lead Graphics Engineer at ZeniMax Online Studios, announced that The Elder Scrolls Online will support NVIDIA’s DLAA tech. The team will also add support for the more “traditional” deep learning upscaling technique, DLSS.
DLAA stands for Deep Learning Anti-Aliasing. In short, the game will apply AI techniques at full render resolution in order to provide better anti-aliasing results, rather than using them to upscale a lower-resolution image.
DLAA also sounds a lot like what NVIDIA advertised as DLSS 2X back when it announced its Turing GPUs. As the green team stated at the time:
“In addition to the DLSS capability described above, which is the standard DLSS mode, we provide a second mode, called DLSS 2X. In this case, DLSS input is rendered at the final target resolution and then combined by a larger DLSS network to produce an output image that approaches the level of the 64x super sample rendering – a result that would be impossible to achieve in real time by any traditional means.”
Now, I’m pretty sure DLAA won’t approach results similar to 64X supersampling. Despite that, though, it appears to work in a similar way to DLSS 2X.
As we’ve seen, DLSS can match or even surpass native-resolution image quality. Not only that, but it can resolve details that other methods miss, thanks to its deep learning techniques. So, in theory, DLAA should provide noticeably better and crisper anti-aliasing than TAA and most other common AA methods.
There is currently no ETA on when this DLSS/DLAA patch will go live. Naturally, though, we’ll be sure to keep you posted!

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers SNES to be one of the best consoles. Still, the PC platform won him over. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC Graphics Cards.”

I’m pretty sure you couldn’t tell the difference 😉
Fuuuuck this game bring TES VI already, jesus.
Different teams, buddy.
Most of the “TES 6 team” are still making Starfield. Almost nobody is working on TES 6, it’s in planning / prototyping stage.
Finally, been waiting for a game to do this. A lot of DLSS’s weak points come down to having to work with a lower-resolution input and interpolating information that doesn’t exist into an image with 2.25x as many pixels.
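For reference, a quick sketch of the pixel math behind that “2.25x” figure. The function names and the 1.5x per-axis scale (DLSS “Quality” mode) are my assumptions for illustration, not anything from the comment itself:

```python
# Back-of-the-envelope math behind the "2.25x" figure above.
# Assumption: a 1.5x per-axis upscale, i.e. DLSS "Quality" mode.

def dlss_input_resolution(target_w: int, target_h: int, per_axis_scale: float = 1.5):
    """Internal render resolution for a given per-axis upscale factor."""
    return round(target_w / per_axis_scale), round(target_h / per_axis_scale)

def upscale_factor(in_res, out_res):
    """Ratio of output pixels to input pixels."""
    return (out_res[0] * out_res[1]) / (in_res[0] * in_res[1])

# A 4K output with a 1.5x per-axis scale renders internally at 1440p,
# so the upscaler has to fill in 1.5^2 = 2.25x as many pixels.
print(dlss_input_resolution(3840, 2160))           # (2560, 1440)
print(upscale_factor((2560, 1440), (3840, 2160)))  # 2.25
```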
DLSS is already the closest thing to decent AA approaching SGSSAA. Most TAA is hot garbage for so many reasons I could talk forever about (not that DLSS doesn’t share some of them). Hoping more games opt to add this option (always with an adjustable sharpness slider, for those people who like oversharpening everything).
It’d be nice if NVIDIA could create a shim layer in the driver with compatibility hooks to work with any game, like the old DX9 AA compatibility layer. Let the community do the rest; just put it in there.
https://docs.google.com/spreadsheets/d/1ekUZsK2YXgd5XjjH1M7QkHIQgKO_i4bHCUdPeAd6OCo/edit?usp=sharing
The thing is, DLSS actually has more pixels to work with after a few frames (except right after a scene change) than the lower base resolution would suggest, due to its temporal component (i.e., it uses data from several frames, which is why motion vectors are so important to a good DLSS implementation).
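The temporal accumulation idea this comment describes can be sketched in a few lines: reproject last frame’s accumulated history along the motion vectors, then blend it with the current frame. This is a toy sketch of generic TAA-style accumulation under my own naming assumptions, not NVIDIA’s actual algorithm (DLSS replaces the fixed blend below with a learned network):

```python
import numpy as np

def temporal_accumulate(history, current, motion_vectors, alpha=0.1):
    """Toy TAA/DLSS-style accumulation: blend reprojected history with the
    current frame. Arrays are (H, W, 3) images; motion_vectors is (H, W, 2)
    in pixels. Real DLSS replaces this fixed blend with a learned network."""
    h, w = current.shape[:2]
    ys, xs = np.indices((h, w))
    # Reproject: fetch each pixel's history from where it was last frame.
    src_x = np.clip(xs - motion_vectors[..., 0], 0, w - 1).astype(int)
    src_y = np.clip(ys - motion_vectors[..., 1], 0, h - 1).astype(int)
    reprojected = history[src_y, src_x]
    # Exponential moving average: each output pixel carries information
    # from several past frames -- the "more pixels to work with".
    return alpha * current + (1 - alpha) * reprojected
```

With jittered sample positions each frame (omitted here), this history buffer is what lets a temporal upscaler accumulate more effective samples than a single low-resolution frame contains.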
Damnable TAA blur really makes it easy for DLSS to win even when it’s upsampled, in the majority of titles (after DLSS v2.x).
https://thumbs.gfycat.com/MistyAcrobaticBonobo-size_restricted.gif
DLAA could prove interesting; most TAA destroys clarity and makes scenes way too blurry.
Why not try it on a game that needs actual AA and isn’t CPU-bound? Try it on the new Mass Effect Legendary Edition, where the AA and post-effects are so bad that you have jaggies all over the place even at 4K. Don’t waste time on a game that can be supersampled, or have SMAA eliminate any jaggies without blurring the game the way its garbage FXAA does. ANYTHING would look good compared to the FXAA it uses. CMAA, which is what WoW uses, would look brilliant as well; it’s just edge anti-aliasing, which is all the game needs.
Puzzled as to why DLSS doesn’t always come with an “enhance” option to just maximize image quality. A huge portion of games don’t NEED a reduction in render resolution, since they’re not GPU-bottlenecked, or run so well it wouldn’t be fair to attribute any bottleneck to them at all. Why shouldn’t the tensor cores on the GPUs of 10% of PC gamers be utilized to improve image quality in those games?
When DLSS 3.0 hits and becomes compatible with all games that have TAA / motion vectors, the prospect of a new, superior AA method might really make me buy one of those overpriced tensor UFOs from NVIDIA, even though my 1060 runs everything okay already. If you think about it, perfect image quality is a BIGGER selling point than upscaling for performance. You can always get more performance by just upgrading to more brute force. You CANNOT get more image quality by just paying more, unless you count downsampling, which has its own problems and runs into a performance ceiling quickly.