Earlier today, Microsoft released a new DirectX 12 feature called Work Graphs. Work Graphs introduces new types of GPU autonomy that aim to eliminate CPU bottlenecks, and below you can find its first tech demo.
Work Graphs also gives the Direct3D 12 (D3D12) API a unique capability: the GPU itself can dynamically choose and launch shaders at a micro level, without round-tripping to the CPU. And what’s really cool here is that NVIDIA has shared a tech demo for it.
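For the curious, here is roughly what driving a work graph looks like on the CPU side. Treat this as a minimal sketch based on the publicly documented Agility SDK structures, not as code from NVIDIA’s demo; the helper’s name, the entry record layout, and the assumption that a work graph state object, its program identifier and its backing memory already exist are all ours.

```cpp
#include <d3d12.h>  // Agility SDK 1.613+ headers, which add the work graphs API

// Hypothetical helper: records a single work-graph dispatch into a command list.
// Assumes a work graph was already compiled into a state object and that
// `backingMemory` satisfies the sizes reported by GetWorkGraphMemoryRequirements().
void DispatchWorkGraph(ID3D12GraphicsCommandList10* cmdList,
                       D3D12_PROGRAM_IDENTIFIER programId,
                       D3D12_GPU_VIRTUAL_ADDRESS_RANGE backingMemory)
{
    // Bind the work graph as the active "program" on the command list.
    D3D12_SET_PROGRAM_DESC program = {};
    program.Type = D3D12_PROGRAM_TYPE_WORK_GRAPH;
    program.WorkGraph.ProgramIdentifier = programId;
    program.WorkGraph.Flags = D3D12_SET_WORK_GRAPH_FLAG_INITIALIZE;
    program.WorkGraph.BackingMemory = backingMemory;
    cmdList->SetProgram(&program);

    // Feed one input record to the entry node. Its layout must match the
    // record struct declared by the entry node's HLSL (hypothetical here).
    struct EntryRecord { UINT gridSize; } record = { 256 };

    D3D12_DISPATCH_GRAPH_DESC dispatch = {};
    dispatch.Mode = D3D12_DISPATCH_MODE_NODE_CPU_INPUT;
    dispatch.NodeCPUInput.EntrypointIndex = 0;
    dispatch.NodeCPUInput.NumRecords = 1;
    dispatch.NodeCPUInput.pRecords = &record;
    dispatch.NodeCPUInput.RecordStrideInBytes = sizeof(EntryRecord);
    cmdList->DispatchGraph(&dispatch);
}
```

The point of the feature is that this is the only CPU involvement: once DispatchGraph is recorded, the nodes in the graph generate records for each other and spawn shader work entirely on the GPU, which is where the claimed removal of CPU bottlenecks comes from.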
Compusemble has shared a video in which it tested Work Graphs on an NVIDIA GeForce RTX 4090 paired with an AMD Ryzen 7 7700X. So, make sure to watch it, as it can give you an idea of what this new API feature can do.
In theory, this should improve performance in specific scenarios. As you’ll see in the video, though, these performance improvements are not universal. In other words, there are multiple scenes in which the tech demo runs exactly the same with and without Work Graphs.
It will be interesting to see whether any devs will take advantage of it. Let’s not forget that MS launched DX12 Mesh Shaders a few years ago, and right now there is ONLY ONE game that takes advantage of them. So yeah, while Work Graphs sounds cool, I’m not certain a lot of devs will use it. Unless, of course, the next PlayStation and Xbox consoles support it.
NVIDIA and Microsoft have shared more details about Work Graphs on their sites. If you are interested, we highly recommend reading their blog posts.
You can go ahead and download NVIDIA’s tech demo from this link.
Enjoy and stay tuned for more!

John is the founder and Editor-in-Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best consoles. Still, the PC platform eventually won him over. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC Graphics Cards.”
The fact that it’s only supported on RTX 30 and up and RDNA 3 suggests they are using AI cores for parts of this.
That’s fine. The RTX 30 series is 3.5 years old at this point and nearly two generations old.
If you had bothered to actually read about how Work Graphs work, you’d realize they are not making use of any AI cores whatsoever.
In fact, that doesn’t even make any sense…
If consoles do it, then PC will too. Sony and M$, two of the largest video game giants, will dictate it with their consoles and studio buyouts. M$ uses tricks to force people onto its new wares, which this is. DX12 has been out a long time now and a lot of games still use 11, so I don’t know about advanced, obscure features like this for most games that use DX12. But with the power of these cards now, they could pretty much be the whole computer for non-gamers; they cost as much too at the high end.
We need more of that. Shame it will only get adopted by devs for real once mainstream GPUs have support, so in other words by the year 3050 or so if the GPU vendors keep up their price hiking each gen. GPU vendors are basically killing PC gaming slowly at the moment.
Just like the DirectX 12 disaster, most of this stuff is too difficult for the average developer to implement.
Just because experienced studios like id Software and Asobo Studio are able to write their own rendering passes and get another 10% of performance through these types of optimizations doesn’t mean the average studio can.
The average studio is made up of diversity hires who rely on Unreal Engine.
REALITY CHECK
The real reason is that management doesn’t want to pay for the extra time and effort involved in implementing newer DX12 features like Mesh Shaders, because that would cut into their profit margins.
99% of business failures are caused by bad management, not bad workers. Companies that have good management, like Nvidia, thrive and innovate. I can pretty much guarantee that 90% of Work Graphs was developed by Nvidia’s R&D engineers. Nvidia has some of the happiest employees in America, which is why it thrives in an economy where others are failing.
Wrong.
AMD was the main driving force behind this, as confirmed by Microsoft:
https://uploads.disquscdn.com/images/a0a1a1f435424c721b35311e11acb3270423a459bb58b0100b3f50026998e0ff.png
AMD is also the one that developed the equivalent extension for the Vulkan API, publicly released in the summer of 2023:
https://uploads.disquscdn.com/images/43161424cc65fe0b3f2e35115787d132dd0bce4e8cd41561ca04fffa28354910.png
Don’t let the truth get in the way of a good story!
Nothing to do with management; writing your own low-level shader pipeline is incredibly complex. Microsoft overestimated the ability of the average coder. id Software warned of this several times. Some game companies have the right people, able to take advantage of low-level hardware access, but most do not. Most gaming companies rely on commercial engines and aren’t experienced enough to take advantage of DX12 features that would require rewriting a custom rendering pipeline.
DX12 disaster? Huh?
The promise was that low level access to hardware through DX12 would result in substantial gains in performance over DX11.
But as id Software predicted, that didn’t happen for most developers, due to the coding difficulty involved in writing a more efficient custom rendering pipeline. The average developer is not experienced enough to do this.
DX12 builds of games often perform worse than their DX11 counterparts.
No? UE4 in DX12 is VASTLY faster in CPU processing than in DX11. Massively so.
It’s just that in GPU bound scenarios, it’s usually 1:1 performance.
I’ve been playing all my UE4 games in DX12 for close to 5.5 years now.
Rise of the Tomb Raider, Shadow of the Tomb Raider and Hellblade: Senua’s Sacrifice are a few games I tested in which DX12 gives a significant FPS boost over DX11, even on older GTX 900 Maxwell series GPUs. Other than that, most games perform better in DX11, like RE2 Remake, RE3 Remake, Deus Ex: Mankind Divided and a few others. I also noted that if you’re not CPU or GPU bound, DX11 achieves the highest peak FPS. Also, most newer games that only use DX12 perform better in fullscreen windowed mode compared to exclusive fullscreen; the frametime graph is more consistent using fullscreen windowed in DX12 games.
It’s a shame that more mainstream games don’t have a Vulkan renderer. In my experience, they tend to perform better than DX (e.g. in RDR2).
Part of the reason why Vulkan generally performs better than DX12 is that efficient execution on mobile platforms is a priority, because it runs on the most popular OS in the world, namely Android (powered by the Linux kernel for all eternity, because Google’s Fuchsia micro-kernel OS experiment is practically dead by now).
While I agree that Vulkan is great, DX12 has been just as awesome. It just depends on the developer at the end of the day.
Or the average code monkey who can barely glue together 20 bloated codebases/APIs without having a clue how to optimize for them, likely written in “clean code” and all the other performance-sapping coding disciplines of today, like the overuse of polymorphic classes while fooling themselves that the compiler will fix it. Spoiler: it doesn’t. It still puts a deep dent in nifty things like the CPU’s branch prediction and causes cache bloat. And just to be sure to kill performance completely and add as much stutter as possible, let’s ice the cake with Denuvo on top 🙂
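To make the branch-prediction/cache point concrete, here is a generic C++ sketch (hypothetical types, not from any real engine or from Work Graphs itself): the polymorphic version pays an indirect call and a pointer chase per object, while the data-oriented version is one predictable, cache-friendly loop.

```cpp
#include <memory>
#include <vector>

// "Clean code" style: every object behind a pointer, updated via virtual call.
struct Entity {
    virtual ~Entity() = default;
    virtual void Update(float dt) = 0;
};
struct Particle : Entity {
    float x = 0.0f, v = 1.0f;
    void Update(float dt) override { x += v * dt; }
};

void UpdatePolymorphic(std::vector<std::unique_ptr<Entity>>& entities, float dt) {
    for (auto& e : entities)
        e->Update(dt);  // indirect branch + scattered heap access per element
}

// Data-oriented alternative: contiguous plain data, one direct loop that the
// branch predictor, the cache, and the auto-vectorizer all handle trivially.
struct ParticleData { float x, v; };

void UpdateFlat(std::vector<ParticleData>& particles, float dt) {
    for (auto& p : particles)
        p.x += p.v * dt;
}
```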
Your resume shows you are fluent in Javascript.
–“Definitely, I can write jQuery, React and Angular”
I see. We’ll be in touch…
It felt like looking at a DirectX 7 demo showcasing Transform and Lighting.
Good old vertex shader days!!!
Funny how NVIDIA failed to release a ray tracing demo back when the 2000 series launched, instead leaving users with new hardware and no RT software to experience…
Or it just didn’t want the negative publicity from the gigantic performance hit.
Nvidia made Quake 2 RTX. You must’ve been living under a rock.
When? How long AFTER the 2000 series was released, and how long BEFORE the first ray tracing game was released/updated?
Who’s been living under a f*cking rock now?
Does ReBAR need to be enabled for this, or does it work regardless on a 30 or 40 series card?
I love how almost every time MS demos some new DirectX feature, it takes like 10 years for it to be commonly used by developers. Outside of the low adoption of very unpopular versions of Windows like Vista and 8 (which are outliers, frankly), I’d make an educated guess and say that console-first, multi-platform development is probably to blame.