NVIDIA has announced a pragmatic algorithm for real-time adaptive supersampling in games that extends temporal antialiasing of rasterized images with adaptive ray tracing, and conforms to the constraints of a commercial game engine and today’s GPU ray tracing APIs.
In short, this new Adaptive Temporal Antialiasing (ATAA) technique addresses the blurring and ghosting issues that standard TAA introduces.
As NVIDIA noted:
“The algorithm removes blurring and ghosting artifacts associated with standard temporal antialiasing and achieves quality approaching 8X supersampling of geometry, shading, and materials while staying within the 33ms frame budget required of most games.”
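In broad strokes, the adaptive idea is: run standard TAA everywhere, but where the history sample is rejected (disocclusions, fast motion), fall back to ray-traced supersampling for that pixel alone. Here is a minimal sketch of that idea; it is an illustration, not NVIDIA's actual implementation, and all names, the blend weight and the sample count are assumptions:

```python
def trace_shading_sample(pixel, sample_index):
    """Stand-in for one ray-traced shading sample at this pixel (placeholder)."""
    return 1.0

def ataa_resolve(current, history, history_valid, rays_per_pixel=8):
    """Per-pixel resolve: blend with TAA history when valid, ray trace otherwise."""
    output = {}
    for pixel, colour in current.items():
        if history_valid.get(pixel, False):
            # Standard TAA path: exponential blend with reprojected history
            output[pixel] = 0.1 * colour + 0.9 * history[pixel]
        else:
            # Adaptive path: supersample only this failing pixel with rays
            samples = [trace_shading_sample(pixel, i)
                       for i in range(rays_per_pixel)]
            output[pixel] = sum(samples) / rays_per_pixel
    return output
```

Because only the failing pixels take the expensive ray-traced path, the cost scales with the size of the failure region rather than the whole frame, which is how the technique can stay inside the quoted 33ms budget.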
Unfortunately, this new anti-aliasing technique may not come to PC games anytime soon. After all, there isn’t any gaming GPU that can support ray tracing right now (though this may change next month).
Either way, this is another technique through which NVIDIA – and perhaps Microsoft, via its DirectX Raytracing API – could bring ray tracing to video games!

John is the founder and Editor-in-Chief at DSOGaming. He is a PC gaming fan and strongly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best consoles. Still, the PC platform won him over. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC graphics cards.”

I read about this news before.
This new AA feature is going to be demanding on the GPU though, because of its ray tracing/DXR implementation, and there's no way we're going to see it implemented in games anytime soon.
It's currently not supported unless we have the Windows SDK installed. We can possibly build and run DXR tests with current GPUs though, if I'm not mistaken (like a software renderer):
But ray tracing still needs hardware acceleration – a dedicated ASIC.
https://uploads.disquscdn.com/images/89ee5e641d19cdddcc031c3149f83f567ad00c581f5a5a421cd53200e99959db.png
And, apart from this, why didn't they include an MSAA comparison in the above chart as well (vs ATAA)? I still see a lot of "blurring", at least as evident from this image.
Btw, they used a TITAN V to achieve 8x supersampling within the 33ms frame budget. Is that BLUR in this image?
https://uploads.disquscdn.com/images/263abdedd8527ee634cd8d4e979764b18e90a1b3fe77891ddef957a75ec20a7b.jpg
To quote this article text:
“After all, there isn’t any gaming GPU that can support ray tracing right now (though this may change next month)”.
Nope. I doubt it will change next month….
Oh, it'll support ray tracing… at 20fps/4K. Very nice tech incoming, but all of this will be "enjoyable" in what, 2-4 years? We never know.
I truly hope Intel shakes things up with their dGPUs.
But it really impresses me to see how NVIDIA has been pushing the boundaries lately with the newest tech, releasing Tensor Core GPUs and other new techniques like RTX.
Still no news from AMD.
Well, I believe AMD will have to retaliate and offer something similar. Otherwise the whole gaming community is going to get it even worse than right now. No lube next time.
My GTX 970 will run it at 1440p max settings at 30fps, I am sure of it!
no.
I am sure my 970 will run it OK maxed at 1440p!
I find ray tracing kinda pointless; it's one of those features that are impressive on paper, but show it to an average person and they won't even be able to tell the difference. You can see better textures, better draw distance, etc. How about NVIDIA works towards improving something we can all benefit from?
It's like all this hype about 4K, yet most games don't even support textures high enough to go along with the higher resolution.
Well, in the Metro: Exodus trailer the ray tracing was clearly visible and added a nice depth to the video.
I thought it was only used to calculate softer changes in light levels within Metro Exodus? Perhaps the same thing could be done CPU-side for those with more than 4 cores / 8 threads, as such an effect can be processed at a lower frame rate than the main game.
Except it made the water look worse. But you're right that you could tell the difference.
It is not about how "photorealistic" the image is. It is about rendering objects and their interaction with light more correctly, the way it is supposed to happen in the real world. Pre-baked effects are cheap on performance, but sometimes they make the image look much worse instead of improving it. Take AO, for example: without AO everything looks flat, but sometimes introducing stuff like HBAO causes excessive "black edges", making the image look even more unreal than it already is. That's what I remember happening with Far Cry 3 back then. HBAO+ is a more refined version of HBAO, but it still can't escape certain issues, like casting the shadow on the wrong side.
Everything has to start from somewhere. Right now ray tracing is very expensive on the hardware, even with the aid of Tensor Cores, but performance will get better over time.
Someone who thinks before writing. Good job, bro.
He just doesn't know what ray tracing is. 🙂
You are in the same boat when you're talking about AMD 🙂
I was talking about Tony Hybo. Where do you see any mention of AMD?
I know; you should know as well that I was talking about your other comments.
Well, I barely talk about AMD, so I still don't know what you mean…
“…within the 33ms frame budget required of most games”
‘Most games’ run at 30fps on PC? Er…
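For anyone wondering where the 33ms figure comes from, frame budget and framerate are just reciprocals of each other. A quick sketch of the arithmetic:

```python
def frame_budget_ms(fps):
    """Milliseconds available per frame at a given target framerate."""
    return 1000.0 / fps

print(round(frame_budget_ms(30), 1))  # 33.3 ms — roughly the paper's 33ms budget
print(round(frame_budget_ms(60), 1))  # 16.7 ms — a more typical PC target
```

So "33ms" is a 30fps target; a 60fps PC game would only have half that budget for the entire frame.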
Yep, laughable. But that's what happens when you're a graphics engineer, especially if you've ever worked on consoles. You start thinking that 30fps is acceptable, because the best place to discover new algorithms to pioneer is right at the edge of playability: there's a chance nobody has done it before, and there's a chance you can get it running in "real-time" and take full credit. That's the dream.
Of course there are also graphics engineers working on, say, COD or Quake, who aim for higher framerates. I don't mean to generalize, but those guys rarely invent completely new tech. Making a game run at a high framerate is about optimizing existing tech; if you want to pioneer new tech, you're stuck in 20fps hell on SLI TITANs all day 😉
This was my takeaway. Useless at this speed. I guess in 15 years it will be viable…
Sounds like the ray tracing features within the 1st gen are going to be very weak. With a 7nm refresh next year, I’ll stick with my 1080Ti FTW3 for now.
What do you mean next year? If they announce the next video cards this year, I doubt they'll wait until next year to start selling them. Hmm, maybe they'll start making them but hold off selling them in stores? I doubt it… but who knows, I guess.
Also, a 1080 Ti should make anyone happy for like 2 years AT least; I don't even get why you'd need anything new. I've got a 1080 and I'll probably keep it for 3 or 4 years :O and the 1080 Ti is even better… hmm.
Yes, I've had a GTX 970 since 2014 and it still runs everything at 1440p maxed at 40+ fps, so no reason to change it anytime soon!
I don’t understand what any of this means, but as long as my taxes don’t go up then I guess this is okay.
What does any of this have to do with Microsoft? That was so random.