Microsoft has just announced DirectX 12 Ultimate, which it describes as the culmination of the best graphics technology it has ever introduced, in an unprecedented alignment between PC and Xbox Series X. Moreover, AMD has shared a new tech video showing DirectX Raytracing 1.1 in action.
According to Microsoft, DirectX 12 Ultimate supports all the next-generation graphics hardware features: DirectX Raytracing, Variable Rate Shading, Mesh Shaders and Sampler Feedback. Thus, DX12 Ultimate will ensure stellar "future-proof" feature support for next-generation games.
What's also important to note is that DX12 Ultimate will not impact game compatibility with existing hardware that does not support the entire breadth of DX12 Ultimate features. Therefore, next-generation games which use DX12 Ultimate features will continue to run on non-DX12 Ultimate hardware.
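For developers, checking whether a given GPU and driver expose the full DX12 Ultimate feature set boils down to a few CheckFeatureSupport queries. Here is a minimal sketch, assuming a recent Windows 10 SDK that ships the OPTIONS5/6/7 structures (error handling omitted for brevity):

```cpp
// Minimal sketch: query a D3D12 device for the four DX12 Ultimate features.
#include <d3d12.h>

bool SupportsDX12Ultimate(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 opts6 = {};
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5, &opts5, sizeof(opts5));
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6, &opts6, sizeof(opts6));
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7, &opts7, sizeof(opts7));

    // DX12 Ultimate = raytracing tier 1.1 + VRS tier 2 + mesh shaders + sampler feedback.
    bool raytracing      = opts5.RaytracingTier          >= D3D12_RAYTRACING_TIER_1_1;
    bool variableShading = opts6.VariableShadingRateTier >= D3D12_VARIABLE_SHADING_RATE_TIER_2;
    bool meshShaders     = opts7.MeshShaderTier          >= D3D12_MESH_SHADER_TIER_1;
    bool samplerFeedback = opts7.SamplerFeedbackTier     >= D3D12_SAMPLER_FEEDBACK_TIER_0_9;

    return raytracing && variableShading && meshShaders && samplerFeedback;
}
```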
Regarding DirectX Raytracing, DXR 1.1 is an incremental addition on top of DXR 1.0. Below you can find its three major new capabilities.
DirectX Raytracing 1.1 Key Features
- GPU Work Creation now allows raytracing. This enables shaders on the GPU to invoke raytracing without an intervening round-trip back to the CPU. This ability is useful for adaptive raytracing scenarios like shader-based culling, sorting, classification and refinement; basically, scenarios that prepare raytracing work on the GPU and then immediately spawn it.
- Streaming engines can more efficiently load new raytracing shaders as needed when the player moves around the world and new objects become visible.
- Inline raytracing is an alternative form of raytracing that gives developers the option to drive more of the raytracing process themselves, as opposed to handing work scheduling entirely over to the system (dynamic-shading). It is available in any shader stage, including compute shaders, pixel shaders, etc. Both the dynamic-shading and inline forms of raytracing use the same opaque acceleration structures (a host-side sketch follows this list).
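To make the inline form a little more concrete, here is a hedged host-side sketch in C++/D3D12. With inline raytracing, the acceleration structure is bound like any other shader resource to an ordinary compute pipeline, and the HLSL side traces rays with a RayQuery object; no raytracing state object, shader tables or DispatchRays call is involved. All of the names below are hypothetical:

```cpp
#include <d3d12.h>

// Sketch of dispatching a compute shader that uses DXR 1.1 inline raytracing.
// Assumptions: tlas is a committed top-level acceleration structure buffer,
// inlineRtPso is a compute PSO whose HLSL uses a RayQuery<> object, and the
// root signature binds the TLAS as a root SRV at parameter 0.
void DispatchInlineRaytracing(ID3D12GraphicsCommandList* cmdList,
                              ID3D12RootSignature* rootSig,      // hypothetical
                              ID3D12PipelineState* inlineRtPso,  // hypothetical
                              ID3D12Resource* tlas,              // hypothetical TLAS
                              UINT width, UINT height)
{
    cmdList->SetComputeRootSignature(rootSig);
    cmdList->SetPipelineState(inlineRtPso);
    // Unlike DXR 1.0 dynamic-shading, no state object or shader tables are
    // needed: the TLAS is just another SRV the compute shader traces against.
    cmdList->SetComputeRootShaderResourceView(0, tlas->GetGPUVirtualAddress());
    cmdList->Dispatch((width + 7) / 8, (height + 7) / 8, 1); // 8x8 thread groups
}
```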
As we've already said, AMD has released a new tech demo video that you can find below. As the red team noted, it has collaborated with Microsoft on the design of DXR 1.1. According to AMD, the video gives a taste of the photorealism DXR 1.1 will enable when using hardware-accelerated raytracing on its upcoming AMD RDNA 2 gaming architecture.
Lastly, NVIDIA announced that its GeForce RTX GPUs are the first and only graphics cards that support these game-changing features.
Enjoy!

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers SNES to be one of the best consoles. Still, the PC platform won him over from consoles. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on "The Evolution of PC Graphics Cards."
Contact: Email
MEMETX
excellent news.
excellent ruse
Being infected by corona – ok
Being infected by Robo Fernando – not ok
And yet old 3D Vision tech makes every game that it works with look much, much better than ray tracing can ever even dream of. What a lousy gimmick compared to a tech that actually makes your monitor a window to another world.
What the actual hell are u talking about lmfao. I can’t believe some people still think ray-tracing is just a gimmick.
Well, I have an RTX 2070, and compared to 3D Vision, ray tracing is a gimmick.
Meaning I have compared both. 3D Vision actually creates much better immersion in the game world by making the view actually 3D. Ray tracing has nothing on that.
??? They’re two completely different things lmao. One is for 3d monitors and another is for boosting visual fidelity. It’s also the last important hurdle that needed to be crossed to achieve photorealism in videogames.
I know they are different, I'm not an idiot. Point is that both techs were created to make games more immersive; one just succeeds at that much better than the other. It's a shame that nvidia killed the more immersive tech.
Yes I agree it’s a shame they killed 3d vision.
But RTX was not created to make games more immersive; it was created purely to boost the visual realism of realtime graphics. Any added immersion due to that is merely a side effect. Games can be unimmersive while being photorealistic, and games can be extremely immersive while being non-realistic.
nvidia did not kill anything, it is because game devs did not care about 3D Vision, hardly any recent game uses 3D Vision, and nobody other than you and your two friends bought 3D Vision equipment. Even TVs dropped 3D support a while ago. Can you blame nvidia for pulling the plug on a tech which was already dead?
Yep, in a sense the majority of consumers were not really interested in it. Nvidia had been pushing stereoscopic 3D long before 3D TVs became a thing. The best part of nvidia 3D Vision was that nvidia did all the profiling themselves and game developers had to do nothing at all. On AMD's side, stereoscopic 3D required game developers to build the function into their games; for games without native support, AMD pushed the responsibility for driver profiles onto monitor makers (profiles which the monitor makers did not provide for free).
Get VR instead. Way more immersive than 3D Vision in every way and you can even use 3D Vision with VR.
Also, Ray-Tracing isn’t a “gimmick”. It’s literally just a way more fancy and accurate way of rendering computer graphics. It can’t be a gimmick by nature, it’s just the way graphics are going.
Can I get DX12 Complete please so I don’t have to buy Season Passes?
Yeah, but there will be some microtransactions, you know, “cosmetics”.
Well done Microsoft, unprecedented progress is being made.
You like the abuse, don't you? I feel as if by this point you log on, comment, grab lube & Robo-DiKK then just let it RIP!!! That's how this Robo gets his rocks off.
I guess it can’t properly reply yet.
They must be working on that and will be implemented in future updates.
Haha, you kill me man.
As someone who works in the architecture industry and uses the Unreal Engine for our pre-walk-through models, where we both bake in lighting and try to use ray tracing on our renders (still broken in UE 4.24), this interests me, specifically once multi-GPU ray tracing is implemented, at least for the Unreal Engine, which they have been talking about for a while.
Interestingly, ILM (Industrial Light and Magic) has been using a modified, forked version of the Unreal 4 engine in conjunction with nVidia to run multi-GPU ray tracing for their custom render processes. They have been doing it for almost two years now.
Lastly, we use RTX 2080 Ti's on our render boxes. I'm really hoping that AMD brings some competition to the table with hardware-level ray tracing, since we also use nVidia P40 Tesla cards on a Nutanix VDI cluster for our virtual machines to handle heavy workloads in Revit and other Autodesk products. The nVidia licensing for the VM pass-through is ridiculous, and we could save a lot of money once AMD releases a server-side GPU that can compete with the P40s.
About time to show off my RTX2080TI…
Just kidding, DX12+1 won’t do anything to show off the “raw horsepower” of my video card I wasted $900 on.
and an AMD card would perhaps? Your card has the most graphics features at the moment – try Control, Wolfenstein, Call of Duty etc. There is always a price tag to be an early adopter of any new tech!
Metro Exodus maxed out in 4k is the prettiest game I’ve ever seen. Quake 2 RTX is also really impressive. I spent more than you did, but I don’t regret having a 2080 Ti.
We have seen Turing ray tracing, now RDNA 2 ray tracing. Can’t wait to see the next level above these with Ampere ray tracing.
Ray Tracing is the best <33333
If you're such a "heavy" user that you get to criticize the OS, you should know how to set it up the way you like.
But your whining shows that you're b1tching about something you don't even understand the workings of.
That!
Keep whining and never try to learn how to use a PC. Maybe if you whine enough, mom will come to help.
If anything auto-updates on your system, it's because you're nothing but a mor0n. And I don't waste my time with mor0ns, so F off!
Hello JOHN,
Have you seen this tech news ??
Earlier this week, Microsoft announced its latest DirectX 12 Ultimate API, which aims to provide a unified platform to developers for next-generation graphics on PC and consoles. One of the key features of the announcement was the addition of Mesh Shaders to the DX12 framework, and Principal Engineer at Microsoft/Xbox ATG (Advanced Technologies Group) Martin Fuller has showcased how the new technique will help devs deliver higher graphics throughput in next-gen games.
DirectX 12 Ultimate API’s Mesh Shaders Tested With NVIDIA GeForce RTX 2080 Ti & Xbox Series X – Huge Performance Gains On PCs & Consoles..
https://wccftech.com/directx-12-ultimate-mesh-shaders-nvidia-geforce-rtx-2080-ti-and-xbox-series-x/
Martin explains that there are only two platforms that currently support DirectX 12 Ultimate Mesh Shaders: the NVIDIA Turing GPU lineup and the Xbox Series X with AMD RDNA 2.
The DirectX 12 Mesh Shader demo shown by Martin features the NVIDIA GeForce RTX 2080 Ti running on Windows 10 at a resolution of 1440p, while the Xbox Series X devkit runs at 4K. The demo includes five rooms showcasing various techniques. A normal pass-through renders the 4K scene on Xbox Series X in around 100 microseconds, which drops to just 55 microseconds with meshlet sphere culling, a more advanced culling technique. The same holds true for the RTX 2080 Ti, which shows significant render time drops with advanced meshlet culling techniques. You can see the demo of the RTX 2080 Ti and Xbox Series X in the video below:
https://www.youtube.com/watch?time_continue=1420&v=0sJ_g-aWriQ&feature=emb_logo
It is also noteworthy that the RTX 2080 Ti renders the scene in about 40 microseconds using the regular pass-through method at 1440p, whereas the Xbox Series X renders it in around 100 microseconds at 4K. The Xbox Series X, however, delivers much faster render times even at 4K than the (standard pass-through) NVIDIA GeForce RTX 2080 Ti, which goes to show the benefits of the new Mesh Shaders in the DirectX 12 Ultimate API being embedded in Turing and RDNA 2 GPUs.
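For a rough idea of what a meshlet-based draw like the one in the demo looks like on the PC side, here is a minimal hedged sketch using D3D12's mesh shader entry point (DispatchMesh on ID3D12GraphicsCommandList6). The PSO, root signature and meshlet count are hypothetical, and the per-meshlet culling itself would live in the HLSL amplification/mesh shaders:

```cpp
#include <d3d12.h>

// Minimal sketch (assumes meshShaderPso and rootSig were created elsewhere and
// the device reports at least D3D12_MESH_SHADER_TIER_1).
// Mesh shaders replace the vertex/geometry stages with a compute-like dispatch:
// each thread group typically processes one meshlet (a small cluster of
// triangles), and culled meshlets simply emit no geometry.
void RecordMeshletDraw(ID3D12GraphicsCommandList6* cmdList,
                       ID3D12PipelineState* meshShaderPso, // hypothetical MS/PS PSO
                       ID3D12RootSignature* rootSig,       // hypothetical root signature
                       UINT meshletCount)
{
    cmdList->SetGraphicsRootSignature(rootSig);
    cmdList->SetPipelineState(meshShaderPso);
    // One thread group per meshlet; the shader decides per meshlet
    // (e.g. via sphere culling) whether to output its triangles.
    cmdList->DispatchMesh(meshletCount, 1, 1);
}
```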
Hi john
Some fresh findings… please read this tech article. It applies to GAMING as well.
As part of its Virtual Game Developers Conference (GDC) 2020, Intel has put a presentation online detailing the features of its oneAPI Rendering Toolkit that are applicable to games. These libraries include Embree, OSPRay, Open VKL, OpenSWR and Open Image Denoise. Intel also announced that some will receive GPU support soon.
The libraries discussed below are part of Intel’s oneAPI suite (which went into beta late last year) focused on rendering. Given the company’s push into graphics with its Xe Graphics architecture, it isn’t surprising to see it focusing more on the gaming side of the house.
You can find each of the libraries covered in detail in this Tom's Hardware article… very interesting…
https://www.tomshardware.com/news/intel-bringing-oneapi-to-gaming-rendering-toolkit-xe-graphics
https://devmesh.intel.com/projects/intel-oneapi-rendering-toolkit-and-its-application-to-games
Open VKL
Intel's Open Volume Kernel Library is a library of high-performance kernels for sampling and traversing volumetric data (scalar fields). It contains APIs for single sampling and packet sampling to aid the vectorization of ray tracing algorithms. It is optimized for x86 CPUs and includes AVX-512 support. Intel said GPU support is in the works.
Open Image Denoise
Open Image Denoise is a deep-learning-based library for denoising images rendered with ray tracing. Like Open VKL, it supports up to AVX-512, and GPU support is also coming.
OSPRay
Third is Intel's Open Scalable Portable Ray Tracing library. It is a full library for ray tracing on CPUs, with rendering options ranging from fast to photorealistic. It is scalable from laptops to supercomputers, according to Intel, and GPU support is also coming.
OpenSWR
Intel’s Open Software Rasterizer is implemented as part of the MESA driver stack. Intel says it is intended for scalable software rendering of large scenes, on the order of billions of triangles. It is designed for HPC systems and OpenGL 4.0 support is coming.
Embree
Embree is an open source library focused on solving the fundamental computations for ray tracing. It uses the latest state-of-the-art ray tracing algorithms. The kernels are highly optimized and deliver a 1.5 to 6x speedup, according to Intel. This is achieved by using SIMD (AVX-512), optimized data structures and other optimizations. It is targeted at professional rendering applications, and it has wide adoption in the film industry.
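Of these, Embree is the easiest to show in code. Below is a minimal, hedged sketch against the Embree 3 C API (the calls are the library's documented entry points; the scene and ray values are made up for illustration): it builds a one-triangle scene and intersects a single ray with it. Link against the embree3 library to build it.

```cpp
#include <embree3/rtcore.h>
#include <cstdio>

int main() {
    RTCDevice device = rtcNewDevice(nullptr);
    RTCScene scene = rtcNewScene(device);

    // One triangle; rtcSetNewGeometryBuffer allocates (and pads) the buffers.
    RTCGeometry geom = rtcNewGeometry(device, RTC_GEOMETRY_TYPE_TRIANGLE);
    float* verts = (float*)rtcSetNewGeometryBuffer(
        geom, RTC_BUFFER_TYPE_VERTEX, 0, RTC_FORMAT_FLOAT3, 3 * sizeof(float), 3);
    unsigned* idx = (unsigned*)rtcSetNewGeometryBuffer(
        geom, RTC_BUFFER_TYPE_INDEX, 0, RTC_FORMAT_UINT3, 3 * sizeof(unsigned), 1);
    verts[0] = 0; verts[1] = 0; verts[2] = 0; // v0
    verts[3] = 1; verts[4] = 0; verts[5] = 0; // v1
    verts[6] = 0; verts[7] = 1; verts[8] = 0; // v2
    idx[0] = 0; idx[1] = 1; idx[2] = 2;
    rtcCommitGeometry(geom);
    rtcAttachGeometry(scene, geom);
    rtcReleaseGeometry(geom);
    rtcCommitScene(scene); // builds the BVH

    // Trace one ray from z = -1 straight at the triangle.
    RTCIntersectContext ctx;
    rtcInitIntersectContext(&ctx);
    RTCRayHit rh = {};
    rh.ray.org_x = 0.2f; rh.ray.org_y = 0.2f; rh.ray.org_z = -1.0f;
    rh.ray.dir_z = 1.0f;
    rh.ray.tnear = 0.0f; rh.ray.tfar = 1e30f;
    rh.ray.mask = 0xFFFFFFFF;
    rh.hit.geomID = RTC_INVALID_GEOMETRY_ID;
    rtcIntersect1(scene, &ctx, &rh);

    printf("hit: %s (t = %f)\n",
           rh.hit.geomID != RTC_INVALID_GEOMETRY_ID ? "yes" : "no", rh.ray.tfar);

    rtcReleaseScene(scene);
    rtcReleaseDevice(device);
    return 0;
}
```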
I think you may find this interesting John…OFF TOPIC though…
https://www.guru3d.com/news-story/nvidia-empowers-game-developers-and-content-creators.html
NVIDIA is helping to advance industry-standard APIs and game engines that drive graphics in games, empowering game developers to add next-generation graphics features such as ray tracing, NVIDIA DLSS, mesh shaders and variable rate shading to their projects as quickly as possible.
Press release: NVIDIA’s Latest Suite of Tools Empowers Game Developers
In a continuing effort to drive graphics innovation and to make it easier for game developers to add next-generation graphics features to their games, NVIDIA released a new suite of tools today, including:
NVIDIA RTXGI SDK – The RTX Global Illumination (RTXGI) SDK provides developers with scalable solutions to compute multi-bounce indirect lighting without bake times, light leaks or expensive per-frame costs. RTXGI is supported on any DXR-enabled GPU, and is an ideal starting point to bring the benefits of ray tracing to their existing tools, knowledge and capabilities.
Why is this important? RTXGI enables high-quality ray traced lighting in games while maintaining great frame rates.
NVIDIA Texture Tools Exporter – The NVIDIA Texture Tools Exporter uses CUDA to optimize textures for game engines or other applications. It is available as a standalone tool or as an Adobe Photoshop plug-in for game developers and artists.
Why is this important? This allows game developers to use higher quality textures (for higher texture resolution) in their applications, and provide the application to consumers in a smaller (and faster) download size.
Related Links:
NVIDIA DevZone article on NVIDIA RTXGI SDK: https://news.developer.nvidia.com/announcing-nvidia-rtxgi-sdk/
NVIDIA DevZone article on NVIDIA Texture Tools Exporter: https://news.developer.nvidia.com/Texture-Tools-Exporter-2020-1/
Vulkan Game Developers Get New Tools
NVIDIA embraces industry-standard APIs, and Vulkan is no exception. Today, NVIDIA added Vulkan support to two of its most popular game development tools.
Nsight Aftermath is now available for the first time for Vulkan. Nsight Aftermath for Vulkan will be available via an SDK update along with support in Nsight Graphics for viewing the new Vulkan GPU crash dumps.
Why is this important? GPU crashes have historically been incredibly difficult to fix. With Nsight Aftermath, NVIDIA changed the game by providing precise information on where the crash occurred and why.
Nsight Graphics: GPU Trace has proven to be an invaluable tool for game developers developing on DirectX, and today NVIDIA is providing Vulkan developers with GPU Trace, a low-level profiler that provides hardware unit metrics and precise timing information.
Why is this important? Gamers demand high-fidelity graphics without compromising framerates. To do this, developers have the difficult task of profiling their applications to identify performance limiters. Game developers can achieve peak performance with the metrics that GPU Trace provides.