EIDOS Montreal and Nixxes have released the first patch for Deus Ex: Mankind Divided. According to its changelog, the patch fixes a number of crashes. In addition, Nixxes is already looking into the mouse acceleration issues that have been reported (we can confirm that the game suffers from them).
This patch will be auto-downloaded from Steam, is 77MB in size, and you can view its complete changelog below.
The following fixes are in this patch:
- Fixed crashes caused by third-party programs interfering with the game.
- Fixed crash that occurred after viewing the intro videos.
- Fixed issue that could cause a crash at the end of the Prague intro scene.
- Fixed mouse invert setting not being displayed correctly in the options menu.
Last but not least, here is what Nixxes had to say about the performance issues that PC gamers are currently experiencing.
“We are seeing people reporting performance issues when playing the game on Very High/Ultra settings with MSAA set to 2x, 4x, or 8x. We would like to emphasize again that these options are very demanding.
We recommend everyone that is running at recommended spec or higher to start with the High Preset and MSAA turned off, and then tweak the options to optimize your experience.”

Before you even get in-game, the launcher says don’t even try to run MSAA if you don’t have 4+ GB of VRAM.
Even at 1080p.
So no MSAA for the 970 and 1060 3GB people :P:P:P
The game can take upwards of 7GB at 1440p, ultra textures, very high, MSAA off :p
Yup, at 4K I have seen as high as 10GB with Very High textures set. That is with MSAA at 2x. But the game is beautiful.
“But the game is beautiful”
Well, I’ve looked at 4K clips and it doesn’t look that beautiful: the lighting seems a joke, the effects look very random, and some textures don’t even feel that good, so I don’t know what you’re saying, sir. Also, not even 5.5GB of VRAM is justified at Very High…
And I’ve played it at 4K and it’s nothing special at all. AC Unity looks far better and has superior lighting even after all these years.
I have a GTX 1080 and even at 1080p, MSAA can get the frame rate below 60fps.
But there is no reason to enable MSAA anyway, since we have an extremely efficient TAA option that looks just as good, if not better than 4x MSAA.
Most of the people I see complaining about performance are Intel and Nvidia users, so I wonder if it’s specific to that hardware. I have an AMD CPU and GPU, and while my performance isn’t always 60fps, it’s still 40+ most of the time. I’ve only had one major graphical glitch, which went away after reloading, and an icon bug that apparently goes away if you activate the tracking aug after Dubai.
On top of the performance issues, the game has a f*cking microtransaction store built right into the UI. I grabbed it for $42 on DLGamer once the reviews hit, but I honestly wish I hadn’t on principle alone. What a disgrace to the legacy of this series.
And the Season Pass is not going to be shared via Family Sharing, so family members can buy it again, lol. I knew they were going to take revenge after the backlash over the “Augment Your Pre-Order” crap.
This downloadable content is excluded from Steam’s Family Sharing service.
AAA publishers in general are notoriously slow to realize and act on (if they ever do) things they should have done differently in the first place. So maybe a couple of years from now they will slowly begin to realize that it doesn’t pay off to release their games completely unoptimized and patch them later.
Sadly that seems to be the norm nowadays.
You mean like they’ve been doing for the last 7+ years?
A day-one patch is already part of the plan for pretty much all triple-A games.
I don’t understand those people who are crying about the performance.
Go buy a Titan X Pascal if you want to run it maxed out.
I have a 1070 and I play on High; graphics don’t equal gameplay.
>graphics don’t equal gameplay
Always the same straw man.
Nobody thinks that, nobody says that.
Keep crying that it doesn’t run.
If graphics matter to you, where is the Titan X?
Excuse me, did I trigger you or something?
I pointed out your fallacious argument, buddy.
I’ve not complained about performance.
You’re so ignorant oh god
“Keep crying that it doesn’t run.”
Why is that a good thing?
If it was the best-looking game and far more advanced than any game on the market, I’d understand your point. It’s not the new Crysis.
This game runs in 1080p (high settings) on PS4.
The PS4 version is more like a combination of Medium/High (with perhaps the odd thing pinched from higher settings), with low FOV and 30fps cap.
But to take your Crysis example, Crysis has the exact same problem as this game. Try running it on max settings and then come back and tell me how well it runs (TBH, I already know: 30fps on a 970).
It doesn’t matter that Crysis was a high-tech, demanding game. What matters is that its ultimate and penultimate presets, much like Mankind Divided’s, are visually almost indistinguishable but have drastically different performance (High will net around 90fps). Being technically accomplished shouldn’t exonerate Crysis from that criticism unless you are willing to extend the same courtesy here.
Now, if the granularity of the options were lacking, forcing you to choose between visuals and performance, I’d understand your point, but that’s not the case here. You can get most of the visuals and a huge gain in performance by lowering the preset a single step.
So that means that if you are getting poor performance it’s not because you are trying to attain a certain visual bar (as that bar is largely the same between Ultra/Very High and High), but because of avarice, pure and simple.
This reminds me: the much-celebrated optimization in MGSV probably came down largely to not going crazy with the maximum graphics options. If you pay attention, that game can look rough even at max settings, but it looks rough in ways that don’t really matter much.
I think people just get mad when they see an “ultra” option that actually lowers their framerate on their top-of-the-line graphics card even if they don’t need the option because it barely makes a difference visually.
BTW, the issue with Crysis is probably CPU-related. The game is pretty old, and I wouldn’t be surprised if it’s single-threaded and can’t take full advantage of modern processors.
I think you may well be right about CPUs there, as I seem to remember getting a smoother experience back in the day on an old processor, whereas today it’s kinda stuttery.
It’s a console port. It shouldn’t require any significant hardware to run. It’s simply another bad port from Nixxes. Hardware shouldn’t be used to compensate for that. This is the reason I’m skipping this game altogether.
My condolences for YOUR loss.
Meh, I was never that impressed by the older ones either (the controls just feel off). I’m not missing much. Maybe later if they fix it I’ll buy it at twenty bucks on sale, but that’s a long time coming.
Shut up already, clown. Jeez, your posts are total corporate dck riding. I bet you give bjs to the devs in your free time.
Oh my, someone is frustrated in the pantalones. I’m sure there is someone you can pay for that. Good luck…and be safe.
Probably… the foliage in ROTR looks awful and jagged on PC even with max AA on. XB1, not so bad. From what I see of the PS4 port, it looks even better.
The PC version will look and play better than the console versions. People seem to think consoles use “high” settings when the reality is that they only really use “high” texture settings without any anisotropic filtering, so the textures are a blurred-out mess when viewed at any angle. Hell, some of the settings in console games are worse than the PC’s lowest settings: The Witcher 3’s draw distance and population count, for example. Sure, it’s the same game, but the PC versions are usually vastly superior. The ambitious ones, at least.
“I don’t understand those people who are crying about the performance.”
“Go buy a Titan X Pascal”
“I have a 1070 and I play on High; graphics don’t equal gameplay.”
????? Stupid or trolling? Hey man, graphics don’t equal gameplay, so I’ll just play old games on a 10-year-old PC then.
Truth.
“Go buy a Titan X Pascal if you want to run it maxed out”
The thing is it doesn’t run maxed out on the new Titan X either.
“graphics don’t equal gameplay.”
But performance is part of gameplay.
Ok Nixxes, now go market your game somewhere else.
And another $400 next year for 10x more power, yet the same result.
They should’ve renamed the Very High preset to Ultra, and Ultra to Insane. People with good GPUs nowadays have this mindset that their PC should be able to run all games at Very High easily.
One thing I’ve found strange is that you can’t toggle ironsights with keyboard and mouse, but you can with a controller (wat?). This means having to shuffle the controls around compared to the previous game, despite the fact that ironsight toggle functionality is clearly in the game.
Well, in DEHR you couldn’t untoggle ADS, so at least this time things are in my favor.
Something’s not right here…
Ultra preset (MSAA on, everything set to Ultra): 47fps avg in the benchmark.
Low preset (EVERYTHING off): 66fps avg in the benchmark.
What CPU are you using?
i7 3770K @ 4.6GHz, 32GB 2400MHz RAM, 780 Ti SLI
Hmm, like you said, something isn’t right there.
SLI utilisation is around the 90% mark, so it would suggest usage is good.
Their in-game benchmarks have been garbage since (at least) Thief and are not indicative of real gameplay.
Their last few releases have definitely had their share of issues.
People refuse to see that. I don’t know how one can be so oblivious to obvious issues.
Why are people still using MSAA? It’s a terrible form of AA that only works on geometry, is overly demanding, and has no temporal filter. Use FXAA from the Nvidia Driver control panel if MSAA is too demanding.
The game has a Temporal AA option that looks just as good as 4x MSAA, greatly reduces temporal aliasing, and barely impacts performance.
Yet I see people disabling that and using 2x MSAA instead.
Temporal AA is even more blurry than FXAA; MSAA doesn’t touch the textures.
There’s not much cost to it anyway. Just like other post-AA methods, it’s applied after the image is rendered (hence “post”), whereas MSAA does its work while the image is being rendered. I’ve been saying this for a long time, and there should be more coverage of what people miss about MSAA in DX11: with deferred shading it’s expensive for a reason. It’s not like the days before deferred engines were in use.
A lot of post effects are used now and rendering is a lot more complex; that’s why you need post-AA. MSAA didn’t actually work with DX9 deferred engine tech; DX10/DX11 made it possible, but it’s more expensive. All those people who think they can force MSAA in their control panel don’t realise it doesn’t work; some deferred engines don’t even support MSAA at all.
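To make that cost concrete, here is a minimal C++/Direct3D 11 sketch (illustrative only, not the game’s actual code; the 4x sample count and the G-buffer format are assumptions). Every G-buffer plane in a deferred renderer has to be multisampled like this, so memory and bandwidth scale with the sample count:

```cpp
// Minimal D3D11 sketch: why MSAA multiplies deferred-rendering cost.
// Assumes an existing ID3D11Device* device; 4x MSAA and the format
// below are illustrative, not taken from the game.
#include <d3d11.h>

HRESULT CreateMsaaGBufferPlane(ID3D11Device* device, UINT width, UINT height,
                               ID3D11Texture2D** outTex)
{
    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width  = width;
    desc.Height = height;
    desc.MipLevels = 1;                           // MSAA targets have no mips
    desc.ArraySize = 1;
    desc.Format = DXGI_FORMAT_R16G16B16A16_FLOAT; // one of several G-buffer planes
    desc.SampleDesc.Count = 4;                    // 4x MSAA: ~4x memory/bandwidth
    desc.Usage = D3D11_USAGE_DEFAULT;
    desc.BindFlags = D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE;
    return device->CreateTexture2D(&desc, nullptr, outTex);
}

// A forward renderer can resolve once at the end, which is cheap:
//   context->ResolveSubresource(dst, 0, msaaTarget, 0, desc.Format);
// A deferred renderer cannot resolve the G-buffer before lighting
// (averaging normals/depth gives wrong results), so the lighting pass
// must read individual samples (Texture2DMS in HLSL), and that
// per-sample shading on top of the fatter G-buffer is where the cost explodes.
```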
Well, Unreal Engine 4 became fully deferred so I don’t think it’s going to die off anytime soon. :p
no AA > FXAA
Ah, this game was ported by Nixxes, the same people that borked the Rise of the Tomb Raider port.
Now I see where all the performance issues are coming from. These guys do terrible PC ports.
I suppose you did not play the PC version of the previous Tomb Raider game… or did you just take that one for granted, since it ran very well?
I own the game, actually. If you have an i5 or i7 it runs great, but anyone with any other processor has to deal with sub-60fps at ANY graphical setting. What kind of rock do you live under to not know about the issues?
Many users at launch had frame rates dropping down to a literal frame per second at the lowest setting, myself included. 5 patches later, it’s better, but still runs at unacceptable levels for many. I still cannot get a solid 60fps in some areas at any setting, kind of like the old Dark Souls 1 port or Arkham Knight.
Only the first mission is OK in ROTTR; the rest is worse than Batman: Arkham Knight.
I would say this Deus Ex is better optimized than that crap called ROTTR, which was console-level optimization.
ROTR runs just fine and Nixxes are good at what they do most of the time.
Do you work for Nixxes?
Shut up dude.
get lost peasant
Get blocked kid. You’re just more cancer to this website
In a way he is right. I mean, just look at all the options they give PC gamers, far more than a lot of other studios do. This game generally runs better on AMD hardware, it’s optimised better for it, but there are plenty of games that run better on NVIDIA hardware; it’s just swings and roundabouts as to who gets the better optimisation.
Come on man, of course it is less bad than some absolute crap like Batman, but just because something is less bad than complete crap doesn’t make it fine.
Depends what your definition of complete crap is; some people like to exaggerate. Some people just can’t manage their settings well, and somehow need Ultra rather than going for the better experience at lower settings.
Crap means offering less quality for more power. The closer a game developer gets to this, the crappier the result.
Yes, ROTTR is one of the worst ports of all time.
It’s in the top 10 along with GTA IV and Batman.
That wasn’t what they said 😛
It was between the lines… but that’s what they’re saying and they know it.
Are they bullshitting, or does someone really try to play the game at HD resolution with 8x MSAA?
How much on VHigh textures?
About 5.4GB.
Just put it on High and learn to tweak; I’m getting good performance at 1440p with a 980.
Only in La La Nixxes Land would even low settings of MSAA be ‘very demanding’
Post-processing or GTFOH is what Nixxes is saying to Eidos PC customers
MSAA is demanding in any game, it’s not compatible with modern rendering techniques. That’s why all the “post-AA” / “temporal AA” stuff exists.
MSAA is compatible, it’s just more demanding with DX11 deferred engine tech. MSAA has always missed certain aliasing, and the days of 16x/32x MSAA samples are gone; it’s a mix of MSAA and post-AA now because that’s much cheaper for alpha effects. TXAA was really the first to address all the problems using a mix of MSAA, post-AA, and a temporal filter. People now know the blurring comes from the temporal/post-AA part, not the TXAA tech itself.
Basically: MSAA for quality, post-AA for transparencies and shader aliasing, and temporal for pixel breakup in motion due to the low sample rates. The TAA in this game does a really good job; they just over-sharpened it with the sharpen setting.
OK, around the town I can get a nearly perfect 60fps at 1080p with what are, in my opinion, the best image-quality-vs-performance settings on a GTX 1070 OC, using ReShade for the post-AA (SMAA):
MSAA: 2x
FoV: 100%
Texture Quality: Ultra
Anisotropic Filtering: 16x
Temporal AA: Off
Contact Hardening Shadows: Off
Depth of Field: On
Bloom: On
Volumetric Lighting: On
Subsurface Scattering: On
Cloth Physics: On
Ambient Occlusion: Very High
Tessellation: On
Parallax Occlusion Mapping: High
Screen Space Reflections: On
Sharpen: Off
Level of Detail: Very High
I literally can’t see any difference between Ultra and Very High textures; I would guess it would be the same for High vs Ultra?
Rise of the Tomb Raider was unplayable with the highest texture setting, yet lowering it by one level had nearly no visual impact; it took going through the 1:1 comparisons in Nvidia’s guide and hunting for differences to actually discern them.
It seems like it’s the same thing here: the highest textures are poorly compressed and the game does a bad job of streaming them.
By the way, you should use FXAA with 0.4 subpixel + 0.8 LumaSharpen instead of SMAA; it’s half the performance cost and looks better. ReShade’s SMAA is not very good and is almost no different from FXAA.
Actually, using temporal AA and LumaSharpen at 1.50 is about right. The in-game sharpening is pretty horrible; it almost makes the game look like a cartoon.
FXAA will still remove additional subpixel aliasing that the game’s TAA won’t catch.
If I go over 1.0 Luma it starts looking too much like the game’s sharpening.
The only problem I have with using SMAA or FXAA through ReShade is how it affects the in-game text; same goes for the FXAA enabled in the NVIDIA control panel.
I’m using 1.0 subpix now with 0.083 edge threshold and it’s not preventing me from reading it in 1440p.
Oh yeah, I mean it’s not bad, it’s just not crystal clear. SMAA gives the text a disconnected-pixel look, while FXAA gives it a too-smoothed look.
Nixxes are slacking. Poor port.
ROTTR all over again
ROTTR was fine.
GTA IV was fine; ROTTR was a disaster, a complete one.
I don’t think you played more than just the first mission or the useless benchmark.
Why did you bring up GTA IV? We all know you suck Rockstar’s d!ck, but GTA IV has nothing to do with this.
ROTTR was good; sorry your crap PC isn’t fit for it.
I think most PC gamers (true ones, not those with a console behind their desk) know that Nixxes is the worst porting company; ROTTR is a perfect case of Nixxes sucking.
As for R*, they’re by far the best developers, and we can all see what they did: even a GTX 660 Ti is able to run GTA V perfectly, while in ROTTR it gets 5 FPS on the lowest settings.
So you got my answer.
Yeah, Nixxes really sucks, man. ROTTR was crap. I wonder why some think it looked good and ran well.
Rockstar the best developers, lol.
GTA V PC and Max Payne 3 😀 OF COURSE IT’S THE BEST DEVELOPER
GTA 4 ran horrifically. I once installed Lost and Damned, played for 5 minutes, then uninstalled again. One of the worst console ports of all time.
nope, it wasn’t.
No it was crap.
They f’ed up ROTTR.
Hopefully Nixxes will take advantage of the ability to better manage textures in VRAM with DX12.
With DX12 that value is probably going to shoot up even more.
Tiled Resources have been available in DX11.2 for YEARS.
Not a single developer has bothered to use them; only now will the new Gears of War be using them on PC.
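For reference, here is roughly what opting into that feature looks like: a minimal C++ sketch, assuming an ID3D11Device2 already exists, with purely illustrative sizes:

```cpp
// Minimal D3D11.2 sketch: creating a tiled (sparse) texture so only the
// pages actually needed have to occupy VRAM. Assumes an existing
// ID3D11Device2* device2; the 16K x 16K size is illustrative.
#include <d3d11_2.h>

bool CreateTiledTexture(ID3D11Device2* device2, ID3D11Texture2D** outTex)
{
    // Tiled Resources are an optional feature: check the support tier first.
    D3D11_FEATURE_DATA_D3D11_OPTIONS1 opts = {};
    device2->CheckFeatureSupport(D3D11_FEATURE_D3D11_OPTIONS1,
                                 &opts, sizeof(opts));
    if (opts.TiledResourcesTier == D3D11_TILED_RESOURCES_NOT_SUPPORTED)
        return false;

    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width  = 16384;                        // large virtual texture
    desc.Height = 16384;
    desc.MipLevels = 1;
    desc.ArraySize = 1;
    desc.Format = DXGI_FORMAT_BC1_UNORM;
    desc.SampleDesc.Count = 1;
    desc.Usage = D3D11_USAGE_DEFAULT;
    desc.BindFlags = D3D11_BIND_SHADER_RESOURCE;
    desc.MiscFlags = D3D11_RESOURCE_MISC_TILED; // no physical memory committed yet
    if (FAILED(device2->CreateTexture2D(&desc, nullptr, outTex)))
        return false;

    // Physical VRAM is then mapped tile-by-tile on demand via
    // ID3D11DeviceContext2::UpdateTileMappings, so streaming can keep
    // only the visible texture pages resident.
    return true;
}
```

The streaming policy itself, deciding which tiles to map in and out each frame, is the hard part, which may be why so few engines have bothered.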
Ohh, the guys who f’ed up ROTTR. Screw them.
I don’t understand. The game is unoptimized because people can’t max it out at 1440p/60fps?
Yes, exactly. The game looks average at best, and a grand’s worth of GPUs should deal with this easily at 4K, but it doesn’t.
Tell me about it. This is like RotTR on the High setting on a GTX 970, without MSAA or anything taxing, at 1680×1050 resolution, not 1440p or 4K (and it also looks like a turd), with only Very High textures. The game is unoptimized, and Nixxes is not good at optimizing; they are good at adding useless options which won’t make any difference to performance most of the time. Taxing/demanding =/= unoptimized; a variety of PCs having issues with a game = unoptimized. Like almost all of the Square Enix games of the past two or three years.
Amir, I noticed the same thing. I played the first missions and the benchmark and said, OMG, such a solid port.
But after playing a couple more, my FPS drops were heavy even on the lowest settings!!
Yep, the first two missions were locked at 60fps for me at max settings, then the Russian whatever locations dropped it below 10fps. There is another big map which is even worse; it looks blurry and not that great, but eats all of the frame rate.
No, because it looks average and demands so much power.
They always have been. What did they do, one or two decent ports, and everyone kissed their butt.
They aren’t terrible; people are just r€tarded. They’re like: “OMG I’M GONNA PUT THE OPTIONS AT ULTRA AND EXPECT THE GAME TO RUN WELL ON A GTX 560 TI!”
I have a Phenom II X4 and an R9 380, and the game gives me an average of 25fps on Ultra at 1080p. That’s already amazing by itself, bearing in mind that AMD cards suck at DX11. And I expect to run it at a stable 30fps once the DX12 patch is out.
The game engine actually favors the GCN architecture. Even in DX11 the game is faster on AMD hardware, just like Hitman (almost the same engine). As for DX12, we shall see; with Hitman, even AMD cards can take a performance hit in DX12. Worse, it might improve FPS in certain areas while stability goes downhill.
If AMD cards take a performance hit in DX12 when they’re supposed to benefit the most, and certainly more than NVIDIA (not because AMD has better DX12 support, but because NVIDIA has BETTER DX11 support, so the difference between DX11 and DX12 isn’t as big for them; DX12 just happens to hit the sweet spot, fixing the areas where DX11 hurts AMD the most), then it’s DEFINITELY the dev’s fault for creating a crappy DX12 port.
But yeah, I agree that MD is an AMD-biased game. That’s why I chose the 380 over the 960 at the time.
DX12 is not easy. Low level makes sense on consoles because they only have one hardware config, and I see many people underestimating the GPU driver teams’ optimization work, thinking that with DX12 developers can do much better and get better results than the driver teams do. In most cases DX12 games gain performance in situations where the CPU is the bottleneck. A developer like Remedy readily admits that the CPU portion of DX12 is easy to do, but that on the GPU side, just matching DX11 performance is already a tall order. In their GDC presentation they talked about trying to match DX11 performance as much as possible; they didn’t talk much about exceeding the GPU makers’ optimization, because they know that is very hard to do. Even with async compute, AMD hardware does not necessarily gain performance; just look at Hitman’s DX12, for example.
I don’t care if it isn’t easy. If you’re a dev, you should either be up to it, or not implement DX12 at all if you feel you’re just going to ridicule yourself.
Excuses are for r€tarded hipsters.
Probably most of them don’t really want to use DX12 either, but they have to do it anyway when a certain company that sponsored the game asks them to use DX12.
No, they have to use it because DX12/Vulkan are the next generation of low-level APIs. Renovate or die. Otherwise we’d still be stuck with Doom 1 graphics.
Not really. DX12/Vulkan are just low-level APIs for PC; the two still cannot compare to the low-level APIs used on consoles. And a low-level API on PC is just optional: DX12 did not replace DX11 the way DX11 replaced DX10/DX9. Graphically, everything that can be done in DX12 can also be done in DX11; DX12 just has much lower CPU overhead and the ability to use async compute to better utilize available GPU resources. In a certain way, you could say DX12 is DX11 with more things controlled by the developer. But just because developers have control doesn’t mean it will always yield better results. DirectX on PC (and OpenGL) was supposed to ease game development by offering a high-level API; going low level brings in the complexity developers wanted to avoid on PC in the past.
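To make the “more things controlled by the developer” point concrete, here is a minimal C++ sketch of submitting work in D3D12 (device creation, error handling, and Release() calls omitted for brevity). All of this bookkeeping, command memory, explicit submission, and CPU/GPU fencing, is done invisibly by the driver behind DX11’s immediate context:

```cpp
// Minimal D3D12 sketch: the bookkeeping a DX11 driver used to do for you.
// Assumes an existing ID3D12Device* device; error handling and Release()
// calls are omitted for brevity.
#include <d3d12.h>
#include <windows.h>

void SubmitOneFrame(ID3D12Device* device)
{
    // 1. The developer owns command memory (hidden inside the driver in DX11).
    D3D12_COMMAND_QUEUE_DESC qdesc = {};   // zero-init = direct (graphics) queue
    ID3D12CommandQueue* queue = nullptr;
    device->CreateCommandQueue(&qdesc, IID_PPV_ARGS(&queue));

    ID3D12CommandAllocator* alloc = nullptr;
    device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                   IID_PPV_ARGS(&alloc));

    ID3D12GraphicsCommandList* list = nullptr;
    device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT, alloc,
                              nullptr, IID_PPV_ARGS(&list));

    // ... record draw calls and resource barriers here ...
    list->Close();

    // 2. The developer submits explicitly.
    ID3D12CommandList* lists[] = { list };
    queue->ExecuteCommandLists(1, lists);

    // 3. The developer synchronizes CPU and GPU explicitly (automatic in DX11).
    ID3D12Fence* fence = nullptr;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));
    HANDLE done = CreateEvent(nullptr, FALSE, FALSE, nullptr);
    queue->Signal(fence, 1);
    fence->SetEventOnCompletion(1, done);
    WaitForSingleObject(done, INFINITE);   // GPU has finished this frame

    // Reusing 'alloc' before the fence signals would be a race: in DX12
    // that hazard tracking is the developer's job, not the driver's.
    CloseHandle(done);
}
```

Multiply that by resource barriers, memory heaps, and per-frame allocator reuse, and it is easy to see why just matching a mature DX11 driver is already a tall order.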
This I call an f’ed-up port.
Contact Hardening Shadows:
Off: 72fps
Ultra: 58fps
And people call GameWorks “Gimpworks”? PCSS costs about 8-10fps.
They f’ed up ROTTR as well. At least Thief runs well from them!
Yes, but consoles need that free performance because they have a tight millisecond budget to stick to; with PC it’s just a bonus. GameWorks offers far more advanced effects than cheap SSAO. Notice how all these effects are on consoles and PC, with no special treatment for PC other than some higher settings that make little difference. GameWorks exists to give games that special treatment, at less cost to the developer in time and money than implementing something special for PC themselves; it’s often called adding value.
I know, like they did with ROTTR.
ROTTR doesn’t look impressive at all (same engine as the old one, only higher-res textures), yet the game simply runs like crap in some areas, showing 0% care/optimization for PC users.
New test done: 1080p with the Ultra preset, ignoring the 4GB warning, on a single 970.
Before Game Ready drivers: Min 21, Max 57, Avg 35.517
With Game Ready drivers: Min 27, Max 58, Avg 41.583
Worst ports of all time? I would build my list:
- ROTTR (Rise of the Console Optimization)
- GTA IV
- Batman AK
- Call of Duty: Ghosts (at release)
ROTTR ran alright. Not perfect, but certainly not bad.
Ran alright?
That means GTA IV and Batman AK were golden compared to ROTTR.
ROTTR ran better than AK and GTA 4 at launch. I made it through the entire game, barely dropped below 60.
IDK, maybe you played another game; all PC gamers seem to suffer massive FPS drops in ROTTR, even at 800×600 on the lowest settings, because it’s a POS-optimized game.
As for GTA IV and AK, even at launch they ran better for me than the ROTTR crap.
Don’t you have a crap graphics card though?
A GTX 970 OC’d to 1550MHz core and 8000MHz memory; I don’t think it’s crap.
Also, my i7 is OC’d to 4.9GHz, so I don’t know.
ROTTR is a POS crap game; the optimization is a big FAIL for all PC gamers.
I made it through with a 980. And clearly not all. Plenty of people say it was a good port.
Oh man, the GTX 970 is such an old graphics card.
You need to get a better one.
He’s come back off his meds; just look at him spamming the same thing. He did it with The Witcher 3. It’s like watching a child after you’ve taken its sweets away.
Yayaya. I’ve seen that.
No brain.
I hope the Nixxes (PC) developers die in a very bad plane accident, and the last image they see is Lara’s face crying because of the bad PC ports.
MSAA settings should have huge disclaimers on them, given how taxing MSAA has become. You can’t expect the average PC gamer to just KNOW that MSAA is a weird legacy feature that doesn’t really make sense to use, yet keeps being featured out of tradition.
Other than that, optimization is on the users. If you have performance problems, lower your settings; devs shouldn’t have to tell you this.
Anyone got results from a multi-GPU setup? I’m on the fence between buying now or waiting a few months for performance to get ironed out.
You can get near-perfect SLI scaling by using the Rise of the Tomb Raider custom bits “fix”.
The original bits weren’t too bad either.
Don’t expect performance to be “fixed”; the game simply has a couple of extremely demanding graphical settings.
This isn’t to say that the game looks amazing; they simply shoehorned in some tech and made bad use of it.
Outstanding, this was exactly the information I was looking for. Yeah, I’ve read some stuff about Contact Hardening Shadows and Volumetric Lighting needing to be tweaked back from max to address performance.