EIDOS Montreal and NIXXES have released a new patch for the PC version of Deus Ex: Mankind Divided. This patch adds DX12 support (no longer listed as a beta) and fixes a number of bugs that have been reported.
We will be testing the DX12 mode this weekend in order to find out whether it performs as well as the DX11 mode, so stay tuned for more.
This patch will be auto-downloaded from Steam, and below you can find its complete changelog.
Deus Ex: Mankind Divided – PC Patch Build 582.1 Changelog:
- Players can now also close their in-game menus using the keys that open them: TAB / I / M / O / P / L.
- Fixed an issue where the diagonal walk speed was too fast, causing Jensen to make noise.
- Fixed a crash at the shooting range.
- Fixed an issue where Rucker would not spawn in certain situations.
- Fixed an issue where the tranquilizer rifle would no longer shoot in certain situations.
- Fixed an issue in System Rift where the “Heat Sensor” strategy page would not trigger after loading a savegame.
- ‘Hold E’ to open Mission Objectives/Pocket Secretaries/Strategy Pages/etc. is now mapped to the TAB key by default. This removes the interference with ‘Hold E’ to drag bodies.
- Added support for DirectX 12.
- You can toggle DirectX 12 from either the launcher window or from the Display options within the game.
- Note that DirectX 12 requires a restart of the game if (de-)activated from within a running game session. (WARNING: Using the DirectX 12 API can offer better performance on some systems, however, it will not be beneficial on all. If you encounter problems with DirectX 12, we recommend turning it off again. There are no visual or gameplay differences between the two DirectX versions.)
- Multi-GPU Note: Currently we only support Single GPU on DirectX 12, see below for more details about Multi-GPU.
We feel that it is in a state where we can release it to you, but we are aware that some systems might still experience issues.
Please let us know if you are experiencing issues.
To activate Multi-GPU DirectX 12, you are required to do the following:
- Within your Steam Library, right-click on Deus Ex: Mankind Divided.
- Select ‘Properties’ from the context menu and navigate to the Betas tab.
- From the dropdown menu, select dx12_mgpu_preview, then close the window.
Known Multi-GPU DirectX 12 issue:
- Some systems may experience some graphical corruption.

And I was just about to exit the HUB for the final area of the game. Glad I can finally do the shooting range now.
what a joke, there will never be a patch that fixes this.
Just used the in-game benchmark to test DX12 multi-GPU with 2x 980ti’s at 1440p/Ultra with MSAA disabled:
DX11 SLI:
Average: 44.6fps
Minimum: 34fps
DX12 mGPU:
Average: 59.7fps
Minimum: 40fps
Big increase with DX12, and on Nvidia too, which is often #$%! with DX12.
BUT… DX12 has random frame stutters. Which is reflected above, as it nearly constantly hits 60fps but once in a while there’s a stutter. It’s still in preview so if they can fix that then DX12 will be a clear improvement.
Good. Devs will end up learning the new API and optimizing for it much better. Baby steps. I remember when DX11 came out. God it was awful. And the tessellation, AWFUL lol.
Tessellation was mostly a raw power problem. In the early days of DX11 the feature was very demanding, so enabling tessellation ended up having a big performance impact. It “feels” more optimized right now because our hardware has become very powerful at handling tessellation.
Tessellation implementations were also ridiculously stupid, however.
Ever hear of Crysis 2’s Tessellation overuse? Crytek’s implementation was absolutely sh*t.
Well, to be fair, if Volta is indeed yet another Maxwell repackaging + Pascal gimping, the only person we’ll have to thank for that is AMD, as Nvidia will most likely continue to sit on “Nvidia’s Next Gen Architecture” until AMD decides to finally move on from GCN.
Once AMD finally moves on from GCN, we’ll definitely see Nvidia moving to counter it with something more than another repackaged, die-shrunk Maxwell lineup with a few modifications slapped across its brow.
Why would they move from GCN?! They’re starting to reap some benefits from its incredible parallel scaling; they just have to bring the driver overhead down.
Because they’ve been using it across how many years now? Not to mention how many major iterations it’s gone through.
It’s a solid architecture, sure, but as long as AMD doesn’t do anything that’s actually groundbreaking, neither will Nvidia.
Not just solid; it’s insane how good its parallel workload distribution is! So building on that with new iterations is the most logical way.
Yes, they’ve been introducing major changes with each “generation” of GCN, but at its core, it’s still GCN. Like a game engine, you can only patch it so much before the core starts to drag you down simply “because age.”
True.
Absolutely true, we really need to get more developers to take proper advantage of not only DX11, but also DX12/Vulkan. While DX11 support as baseline nowadays is great, nominal support remains just that – nominal.
What? An AMD addict like yourself that hasn’t jumped to Microsoft Spyware 10?
*shocked.* 😛
Valve’s busy counting the money ;P
Maybe he realised how bad the entire thing is, & how much of a non-threat it really is to Steam’s supremacy, so he went back to countin’ da muniez?
If we want to talk delays, though, AMD’s suffered its own as well; the 14/16nm transition was so troublesome for TSMC, they ended up an entire year behind schedule.
I agree, actually, I believe they’re moving away from TSMC, aren’t they?
Agreed (on pushing Vulkan). I have a theory that Nvidia will actually begin to push Vulkan in 2017, considering how their hardware isn’t as DX12-compatible as AMD’s. AMD on the other hand will most likely remain ardently behind DX12 for as long as possible, considering how well equipped GCN is for it.
If that does happen, considering Nvidia’s dominant market position & the (alleged) ability to easily port from Vulkan to DX12, I could see DX11 being dropped relatively soon, actually, in favour of a Vulkan -> DX12 porting path. Plus, since you mentioned it, it would allow Nvidia to keep up the “new-gen API ready” facade without putting them into such a morally grey area as they are in now, while also permitting developers to continue supporting Direct3D & AMD fully.
Honestly, while I don’t support Nvidia’s market domination, or either company’s “bribe developers to make games run better on their hardware” strategies, if Vulkan does indeed have the ability to easily create DX12 ports, I look forward to seeing what sort of an impact that has on DX12 AMD vs. Nvidia Vulkan. Will those games favour Vulkan? DX12? Or will performance be equal? Tradition dictates the former, but, who knows.
Not all developers are fond of tessellation. Unity devs, for example, were quite negative about tessellation early on because of how much performance the feature used due to how it works.
That too, but Crysis 2’s problem was basically that Tessellation was being overused. It was “Tessellation for the sake of Tessellation” basically, which just caused an unnecessary strain on the hardware.
Which is basically why DX11 should never have been patched in 6 months later in the first place, but yeah.
Crytek most likely had no intention of using tessellation from the start anyway because of their focus on consoles. I heard that Nvidia gave them 2 million to make DX11 happen for Crysis 2. Nvidia probably asked them to use DX11 in a way that would give their hardware an advantage, and then Crytek just put tessellation everywhere.
But still, what I hate most is what Crytek did to Crysis as a whole (maybe we should blame EA as well). Crysis 2 is the prime example of how a game gets sacrificed to satisfy the hardware limitations of consoles, be it in graphics or gameplay.
P.S. TWO MILLION? Jesus.
They should have just threatened them with a Class-Action Lawsuit for false advertising instead. Not like they wouldn’t have deserved it >.>
Oh, that could be right. Nonetheless, a new API means a learning curve. People expecting devs to fully use DX12 the way DX11 is used right now are daydreaming in colors.
And DX12 is much more complicated than a high-level API like DX11. Some people expect DX12 to replace DX11, when in reality it isn’t meant to do so.
Are they even fully using DX11 yet, though?
I mean, saying “yes, it’s DX11 compatible” / “Minimum system requirements: DX11” is one thing. Saying “yes, it takes full advantage of DX11” is another thing entirely. Nominal DX11 support is basically just basic API integration, which means a bit of a performance boost, but no real visual upgrades.
Have we noticed an actual uptick in games that use proper Tessellation, etc.?
Might be a stupid question, to be fair, but I’ve really been wondering lately.
Very cool, I’ve been looking forward to mGPU native support in DX12. Since I have gsync, all the better I think.
I’ll take it as a sign to redownload DX:MD
Edit:
Also did you have to do anything in NvControlPanel for the profile? Or just load up the game in dx12?
Just load the game in DX12. SLI is totally a driver hack for DX11 and older (and OpenGL). DX12 actually supports it in the API, so a game with multi-GPU support will just work even if SLI/Crossfire is disabled in the control panel.
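For anyone wondering what “actually supports it in the API” means in practice: under DX12 explicit multi-adapter, the application enumerates the GPUs itself and creates a device on each one, instead of the driver pretending there is a single GPU behind an SLI/Crossfire profile. A rough sketch of the enumeration side, using standard DXGI/D3D12 calls (this is a generic illustration, not this game’s actual code; splitting the per-frame work across the GPUs is still entirely up to the engine):

```cpp
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <vector>
// link with d3d12.lib and dxgi.lib

using Microsoft::WRL::ComPtr;

// Enumerate every hardware adapter and create a D3D12 device on each one.
// Under DX11, SLI/CrossFire is a driver-level trick; under DX12 explicit
// multi-adapter, the application sees and drives each GPU itself.
std::vector<ComPtr<ID3D12Device>> CreateDevicesOnAllAdapters()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;

    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip the WARP software rasterizer

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
        {
            devices.push_back(device);
        }
    }
    return devices; // one device per GPU; the engine distributes work across them
}
```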
Wait for the Director’s Cut before playing it through. Even if it runs better, the game world itself & its contents are still a mess.
Sooo…you still prefer Dx11?
Yep. But keep in mind the multi-GPU DX12 support has only just come out in the preview branch, so it’s early beta support. If they fix the strange stutters/frame pauses it will be way better than DX11 performance.
Hmm, that’s good news I guess. Thanks for the reply 🙂
So in DX12 you sacrifice smoothness (frame times) in favor of higher frame rates?
Well it’s clearly a bug. It will run solid and feels responsive, but then there’ll be occasional random single frame pauses. DX12 mGPU support is only in the preview build branch so it will probably get fixed before it’s properly released.
Thank you so much for detailed info!
Vsync is a performance killer?
Because it caps the game at 60 fps, since the monitor is presumably 60Hz?
In some games, yes. I’ll give you an example: in Dota 2 without vsync (with a 60 fps max) my GPU is using only 18%; with vsync enabled my GPU is using 75-90%.
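To put the cap itself in code terms, here is a rough D3D-style illustration of the sync-interval parameter on the present call (a generic, hypothetical snippet; swapChain stands in for whatever swap chain the engine uses, not any particular game’s code):

```cpp
// IDXGISwapChain::Present(SyncInterval, Flags)
// SyncInterval = 1: wait for one vertical blank -> frame rate capped at the
//                   monitor refresh rate (e.g. 60 fps on a 60Hz display)
// SyncInterval = 0: present immediately -> uncapped frame rate, possible tearing
swapChain->Present(1, 0); // vsync on
swapChain->Present(0, 0); // vsync off
```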
Still too broke to buy the game at $50 🙁 Keep fixing the bugs till I gather my money.
Does the mGPU implementation through DX12 allow for pairing different GPUs? Like a Titan X and a GTX 980? I know in Ashes it works like that. Just curious…
Isn’t it that in Ashes you still need GPUs with similar performance? Or else you’ll just get worse performance because you’re using two GPUs with a gap in performance.
i7 4790K + GTX 970, 1440p high settings
DirectX 11
33 Average
28 Minimum
39 Max
CPU usage was around 30-35%
DirectX 12
32.7 Average
27.9 Minimum
38.6 Max
CPU usage 25-30%. I monitored this using the AV control app from Logitech live during the benchmark.
After Human Revolution, I wonder if this game is any good??
It’s alright. But, shock shock, Gears of War 4 just happens to be the best DX12 game on the market. Both as a game and performance-wise.
Ahh, I see :O I might get Mankind Divided on sale.
People in general have given Nixxes too much credit, since they purchased their games later on and not when they released. The Tomb Raider reboot was a prime example.
So yeah, definitely on sale.
P.S. Sorry, I buried this article in another window until I just dug it up now….. ^^
It’s arguably either the second best Deus Ex game in the entire series (including the originals), or the worst, ranking right down there next to Invisible War, depending on how you want to see it.
Story-wise, it’s a f*cking mess, with everything from mid-game basic gameplay tutorials to the overall dumb-as-dogsh*t main story that abruptly cuts off because it most probably used to be an episodic, before we told Square Enix to take its pre-order bullsh*t & go f*ck itself. Environmentally, gameplay-wise, etc? It’s great. It’s Human Revolution, improved, with great stealth, side-quests, etc. (even though the AI can be ridiculously hilarious at times). As such, it really just comes down to whether or not the eventual Director’s Cut actually fixes it, or just shores up some of its major faults.
Either way, it’s a shame, as there really is a great gem hidden under all the sh*t, unlike most other AAA’s that just get pushed out early “because release date” & go from “EPIC FAIL” to “MEDIOCRE” after 6-9 months worth of patches, if they even get that much post-launch support in the first place.
“Players can now also close their in-game menus using the keys to open it”
FUUUUUU!! I really wanted that stupidly simple feature sooo freaking bad when I played the game. Guess I can always enjoy it when I replay the game. Goddammit, I always taught myself to never buy games at launch. Should’ve listened to my own advice, goddammit!!