Square Enix has revealed the official PC system requirements for Deus Ex: Mankind Divided. According to the specs, PC gamers will need at least a dual-core CPU (Intel Core i3-2100 or AMD equivalent), 8GB of RAM, a 64-bit operating system and an AMD Radeon HD 7870 (2GB) or NVIDIA GeForce GTX 660 (2GB). You can view the full PC requirements below.
Deus Ex: Mankind Divided – Official PC Requirements
MINIMUM:
- OS: Windows 7 SP1 or above (64-bit operating system required)
- Processor: Intel Core i3-2100 or AMD equivalent
- Memory: 8 GB RAM
- Graphics: AMD Radeon HD 7870 (2GB) or NVIDIA GeForce GTX 660 (2GB)
- Storage: 45 GB available space
RECOMMENDED:
- OS: Windows 10 64-bit
- Processor: Intel Core i7-3770K or AMD FX-8350 Wraith
- Memory: 16 GB RAM
- Graphics: AMD Radeon RX 480 or NVIDIA GeForce GTX 970 (targeting 1920×1080)
- Storage: 55 GB available space
- Additional Notes: the 55 GB of HDD space includes DLC
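If you want to sanity-check your own rig against those figures, here is a minimal sketch in Python (a hypothetical helper, not an official tool; the VRAM value for the recommended tier is an assumption based on the listed cards):

```python
# A purely illustrative spec check against the figures listed above.
# The recommended VRAM figure is an assumption (RX 480 / GTX 970 class cards
# ship with at least 4 GB); the official list only names the GPUs.

MINIMUM     = {"ram_gb": 8,  "vram_gb": 2, "disk_gb": 45}
RECOMMENDED = {"ram_gb": 16, "vram_gb": 4, "disk_gb": 55}

def meets(spec: dict, system: dict) -> bool:
    """True if every listed figure is met or exceeded."""
    return all(system[key] >= value for key, value in spec.items())

# Placeholder numbers for a hypothetical rig; swap in your own hardware.
my_system = {"ram_gb": 8, "vram_gb": 2, "disk_gb": 60}

print("Minimum:    ", "OK" if meets(MINIMUM, my_system) else "below spec")
print("Recommended:", "OK" if meets(RECOMMENDED, my_system) else "below spec")
```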

FINALLY!
I will be playing it at 4K/60fps with my GTX 1080s.
No one cares but ok
You cared enough to reply.
:O
Hey, I care~ I think it’s awesome! Nice setup FTL! @Akirascreaming is either jealous or just a jerk… Pick one, neither is good.
Dat arrogance tho’.
Thanks.
most people aren’t proud to be this stupid, but congrats
Still smarter than Titan XP owners.
Whoa, whoa… whoa. I was on your side there for a minute FTL… I just bought a Titan X. And quite frankly, the reason I didn’t go dual GTX 1080s is because SLI (multi-GPU in general) support is on a downslide. I have owned SLI setups since my dual 7900 GTXs and I loved SLI. But once Unreal Engine 4 announced they were no longer supporting the tech, the writing was on the wall. And because of games like Dead Rising 3, Batman: Arkham Knight and The Evil Within, to name a few, that don’t support SLI (as well as games that support mods, which often end up with broken SLI utilization even when the base game supported SLI), I was waiting for a card that could run games at 4K@60fps. The Pascal Titan X (with an overclock; mine is also water cooled… 45C under load MAX! And Afterburner is releasing unlocked voltage in the beta, so I should be able to hit 2200MHz no problem. Add to that 570GB/s of memory bandwidth on 12GB and you start to see the light) is that card. So far only Assassin’s Creed Unity requires me to lower settings (and only shadows, to “High”) to lock at 60FPS. I put a lot of thought behind my purchase and quite frankly I find it ignorant that you would insinuate someone like myself was lacking in “smartness”… lol. Honestly though, it’s a great card and I am super happy with it! I know your dual GTX 1080s rock too so… WE RULE!!! Have fun man.
I like the Titan XP, but I still think it’s at least $350 overpriced for what it’s offering. Maybe the GTX 1080 Ti will fix that.
FTL is right on this one; there’s almost no reason to buy a Titan product unless it’s for heavy professional work or deep learning. Should’ve waited for the 1080 Ti honestly; it’ll probably be $300-400 cheaper and match, if not beat, the Titan in gaming scenarios, just like the 980 Ti did with its Titan.
Nope, the Pascal Titan X doesn’t have the double-precision compute advantages of the old Titan; it’s a pure gaming GPU.
Even NVIDIA and a whole ton of techtuber reviews have said it’s more of a gaming card than the other Titans, but it’s still meant for deep learning and professional work first and foremost.
Only because of the CUDA core count, not the double precision.
And the price. Some folks are not willing/able to afford a $6000 Quadro card so a Titan is way more cost effective, double precision be damned.
Cool, have fun mate!
I am aiming for 2560×1440 with my GTX 980 Ti… hope that happens 😛
I’ll be playing it with my GTX 560 ti. Come at me, bruh. 😛
Where do you live?
No way it’ll support SLI in DX12.
Seems quite reasonable. The graphics looked really good in the gameplay demos at E3, so hopefully this means it’s well optimized.
The graphics have their moments & the lighting system is solid, agreed, but if you get into the details? Just take a look at how sh*tty hair still is. I mean, if you thought hair was bad in Human Revolution, HAH, you’re going to love it now…..
AMD has had their grubby mitts on this title so I don’t expect much good on Nvidia hardware
AMD Gaming Evolved titles actually often run very well on NVIDIA, sometimes better than on AMD’s own hardware. GE titles are often among the best-optimized titles of the year they’re released. But sure, it’s from AMD and uses their fully open tech under the MIT licence, so it just has to run like crap on other hardware, right?
You forget to tell people how AMD’s HDAO killed performance in Far Cry 3 and Sleeping Dogs while talking BS about GameWorks performance, but you know, technically illiterate people just want to tell one side. Open source means nothing if you can’t provide the tools and support to back it up; AMD just release it and leave the rest to the dev. That’s why they don’t have much success: their software support is poor and has a history of being poor.
Notice how the RX480 is there instead of the R9 390, yeah AMD market their newer, current GPUs too just like NVIDIA do with the GTX 970 on their involved titles.
That or we all know how much Nvidia loves to get their hands on games and give them the ol’ Gimpworks treatment. How about them Ultra godrays in FO4?
Well, the default godrays are garbage, and as usual NVIDIA had to step in and fix the cheap console effects: the sh*tty AO, the fake low-polygon hair, the sh*t shadow quality.
So every effect developed by a party other than NVIDIA sucks. What bullshit!
In the case of Fallout 4 yeah it’s pretty sh*tty all round in terms of effects, even some of those top looking games have sh*tty AO, poor shadow quality. Remember how contact hardening shadows in Bioshock Infinite killed performance and people complain about PCSS when it’s superior to the high setting. Oh yeah, go play Tomb Raider 2013 now, TressFX kills performance at 4K on my GTX 1070, even Hairworks doesn’t drop to 18 FPS near the hair at 4K but TR 2013 does.
TressFX is open; NVIDIA could optimize for it any time, but it’s an old game and we all know how they treat old stuff (their own GPUs, for example): they don’t care about it.
On the other hand, how could AMD optimize for closed-source code with unnecessary tessellation? You’re coming to the wrong conclusions…
UPD: By the way, if you’re going for 100%, there’s a region you can’t fast-travel to; you reach it by the shore, where the mountains are. That gave me a lot of pain when I tried to get every collectible 😀
Not being able to optimize just because you can’t look at the source code is just an excuse. Even AMD have for years optimized other games without accessing their source code; they have the tools for this kind of problem. And TressFX, despite being open, was developed with GCN hardware in mind. I still remember AMD telling reviewers to hold their articles until AMD was sure a patch to the game didn’t affect their performance negatively. And despite what you believe, NVIDIA hardware usually works better in old games. Take StarCraft II, for example (a game that came out in 2010): in that game a GTX 950 is still faster than an RX 470, which is supposed to be significantly faster than a GTX 950 (nearing GTX 970 levels, to be exact).
People don’t realise why GameWorks is used: it’s because of TOOLS, not source code. AMD want devs to do all the work and don’t provide proper tools for them. GameWorks actually has an SDK and comes pre-optimised; this saves devs time, as they don’t want to go rooting through source code with limited documentation or poor tech support. John Carmack even said this, and after NVIDIA helped him out he stuck with NVIDIA.
It’s meant for top-end GPUs. I find it funny how people make this argument: they enable it on lower-end GPUs and then wonder why it runs like sh*t. Why aren’t people moaning about the ultra settings that cost 10+ FPS for little gain, yet complaining about PCSS or HBAO+ Ultra costing 6-10 FPS while giving superior image quality? PC gaming is supposed to be about the best quality you can have; if you can’t run it, DISABLE IT and stop crying about it.
Yeah, but my AMD 270 (which is low-end by modern standards) runs TressFX just fine. These effects don’t have to be “high-end only”; it’s just what NVIDIA does to sell their newest and most expensive GPUs, even on the NVIDIA side! People with a 970 couldn’t really use these “works” just because of how taxing they are, and at the time the 980 was their top-tier GPU. So it’s “get the best GPU or go home”, and you’ll quickly understand why even people with NVIDIA cards despise GameWorks.
Well, they assume that; that’s their own silly fault. “Herp derp, ultra”, never mind that it doesn’t look much different from high with a 20 FPS loss. At least when you enable GameWorks effects they do look a lot better; usually you only get slightly better quality over the console version and that’s all, while GameWorks effects are superior. There are games without GameWorks that you can’t even max out at 60 FPS on top-end hardware. Why not complain about 8xMSAA, with little quality gain over 4xMSAA but a big FPS drop (into the 30s on an i7/980 in Crysis 3)?
Pretty sure this is an AMD-sponsored game.
Godrays in FO4 actually look very nice. They were very taxing because of the tessellation.
The GTX 970 is legacy; it’s been superseded by the 1070, just like the R9 390 is legacy, but AMD won’t say it. In fact, some of the 300 series uses 4-year-old tech (GCN 1.0). That IS legacy.
Legacy: “denoting or relating to software or hardware that has been superseded but is difficult to replace because of its wide use.”
^GTX 970 right there
Too bad nobody looks at minimum specs anymore.
Are you actually being serious? The 300 series is barely a year old and AMD have already dropped it from the recommended spec of this game in favour of the 480, so please, don’t give me that BS. Both AMD and NVIDIA want to sell new GPUs; the difference is that AMD won’t officially say the 300 series is legacy, and they sold you the same GPUs as the 200 series, some of which use 4-year-old GCN.
“Recommended” for what exactly? Another B.S. list without mentioning resolution and framerate.
1080p/60fps most likely.
Resolution is mentioned right there…
Doh, dunno how I missed that (:
Damn, good requirements, even with the downgrade.
It will run great on GTX 1070.
I’ll be waiting for CPY to crack this like tomb raider, because I’m not supporting these pig skinned, neanderthal dna having, inbred school shooters.
also ban this one.
” 45 GB available space”
“55GB HD space includes DLC”
Damn
10 GB of DLC.
ban this one john they are bots.
Yep, we don’t need real racists here.
Yeah, only the fake ones.
It’s not racist if it’s real.
I’m referring to the people who call others racists because they have a different opinion on something like immigration, or want to stop Muslims coming into their country; that isn’t racist. Real racism is when you hate another race for no reason and want to do them harm or stop them from doing things in your country, not when you just don’t like their ideology and get called a racist for it.
DX12 and async are NVIDIA’s worst nightmare. Expect the 480 to continue destroying the 460 in modern games.
When trump loses, I will be sending you neanderthal dna having cave dwellers back to the caucasus mountains on a plane piloted by Omar Mateen’s father.
Expect the 480 to destroy the 460? XD
Um did you mean the GTX 1060?
Neandertals had larger brains than current humans.
Sure, but saying “Germanic, Mediterranean, Iberian, Caucasian, Scandinavian, Celtic, etc. etc. etc. to describe “light-skinned” people is ridiculous.
Yeah, but it’s informal. Same reason they use the “unoffensive” African-American, instead of just saying “black.” “Latino” instead of “brown”, etc.
Isn’t this based on the same game engine used in Hitman 2016?
Nope, this is using the Dawn engine; Hitman 2016 uses the Glacier engine.
Looked a bit more into it, and it seems the Dawn engine is a modified version of the Glacier 2 engine.
Sounds about right, reuse of their code.
My GTX 1080 is ready for this game!
My GTX 660 is ready also 😀
High textures, medium for the rest, FXAA or SMAA, 900p; with luck and some OC, maybe 1080p FTW!
If I understand those requirements correctly, are they saying they have 10 GB of day-one DLC?
Hope it supports sli and crossfire.
Not happening on DX12
Because of the Multi-GPU thing that DX12 has?
No seriously, you’ve lost me completely, but I’m quite curious about this.
Because it’s Square Enix: Hitman has no multi-GPU support in DX12, and this game’s engine is a modified version of the same engine.
Powered by Denuvo?
Tomb Raider has Denuvo, so I think this one will have it too.
It’s Square Enix, you can bet on it.
I’m finding it very random that the last few AAA games from late last year and this year are asking for 16 GB of RAM in the recommended specs. Current-gen consoles only have 8 and we’ve been fine with 8 thus far.
The PS4/Xbox One barely manage with 8 GB (unified; the Xbox One only allows games to use 5, actually) on games running at sub-1080p/60 FPS, at what the PC version generally considers to be medium settings, so yeah, the PC version recommends (needs) more RAM for higher settings, better FPS, resolution, etc.
Surprise!
It just doesn’t add up for me. I mean, I have zero issues running those AAA games that demand 16 GB while sporting 8 GB. From the benches, too, I’m not seeing insane gains from 16 GB over the usual 8.
You running a standard 1080p Rig? If so, then it probably wouldn’t matter to you either way, as you’re not really pushing the game. It’s *ss-covering stuff, they’d have put 12 but nobody does 6GB sticks anymore, so 16.
Well, people can use 12 GB. One with 8 GB and one with 4 GB. Same frequency, latency, voltage and vendor. It’s rather unusual, but still doable.
That relies on the vendor in question selling solo sticks, though.
Like, I see the point you’re making, sure, but for the sake of mainstreaming if nothing else, it’s simpler to use 16 than 12.
True. It’s still viable for those who only have 2 slots on their motherboard and have 8 GB (2 x 4GB), like yours truly right here :D. Rather than buy a whole new motherboard for more RAM, I’d just buy an 8 GB stick of RAM from the same vendor (Kingston) and be done with it. Seems like the cheaper and hassle free solution.
And for the record, I have a H61 chipset motherboard… Don’t ask. As long as it works, it’s all fine and dandy for me.
Hah! I had this epic X58 Bloomfield i7 Extreme Rig I’d managed to patch together with some great bargains for years until it all crashed & burned on me, forcing me to switch to a bloody Haswell, so yeah, no judgement here 😛
I’d still be proudly running the damn thing if I could, actually… One of Intel’s best Chipsets ever, IMO.
Awesome. I’m really into PC tech, new and old. Don’t know why, i just like it. Though my heart screams of joy when I see PC parts, my wallet says no.
I built my PC during my early college years (just finished now). Some parts were used (PSU, Core i7 2600, case), some new (RAM, Sapphire HD 7870, HDD and an ASRock H61M-DGS motherboard). Some might scoff at this setup or say it’s a POS, but it’s my POS and I take pride in it.
Add to that the fact that I live in a rather poor country, so it cost me a fortune (relative to the economy here and my income) to build it, but it was worth it. Now I’m just waiting until next year to take an entry exam, hopefully pass it and get a well-paid job. All I have now is some odd jobs and an unemployment wage for 6 months.
Hey, some people like shopping, some people like cars….. & some people like circuits 😀
My first non-pre-built rig was a piece of sh*t too 😛 (I even had one of those glass-encased fire-breathing monitors that took up half your desk because of how fat they were to prove it….. ugh!)
Add to that, the fact that my country not only has a hilarious economy, but also, that they absolutely despise hardware to the point where if you’re lucky, you’ll be able to find a single piece from a single company in the local shops, yeah……. ^^
Good luck with the exams 😀
Thanks man. I’ll surely need it…
How much pushing do I need to do to require way more RAM?
I’ve been running my games on max settings; just running a slightly higher res shouldn’t require more RAM.
“Slightly”? 😛
1440p is almost double 1080p, & 4K is exactly 4x 1080p. At 4K you’re literally pushing ~4x the pixels a 1080p monitor runs 😛
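For the record, here is the raw pixel math behind that claim (a quick illustrative sketch):

```python
# Pixel counts for the common 16:9 resolutions mentioned in this thread.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = 1920 * 1080  # 2,073,600 pixels at 1080p
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 1080p)")

# Expected output:
# 1080p: 2,073,600 pixels (1.00x 1080p)
# 1440p: 3,686,400 pixels (1.78x 1080p)
# 4K: 8,294,400 pixels (4.00x 1080p)
```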
I know, but is it entirely down to just the RAM needing to be increased? CPU and GPU, yeah, that absolutely goes without saying, but RAM?
I mean, does Battlefront legitimately need those 16 gigs? Does it make a massive, night-and-day difference that everyone can objectively see (like, seriously, we can all see it; not something you’d even have to explain, because it has to be that clear)?
Depends on how much data you need to load at a single point in time.
Battlefront? Hm.
Granted, a degree of it is down to developer laziness in any case. Games like DOOM are extremely well optimised, whereas Mankind Divided makes me wonder if those Montreal blokes even know what the f*ck file compression means.
But yeah, if they’re taking into account 4K, they’d probably exaggerate the recommended settings a little just to be on the safe side, as it’s better than having a storm of people complaining to them because “hey, I needed more memory than you said! waaah, waaah, waaaah!”
Most devs these days besides indie ones seem to have tossed file compression out the window =P.
Why not, I guess…..
I mean, Hard Drives are so cheap these days, after all >.>
/eyeroll-/facepalm combo.
Well yeah, we have cheap HDDs, but why give up on good file compression? It’s like excusing one thing with another.
Na it’s simpler than all that, really;
“F*ck the PC.”
Even more so, “f*ck doing extra sh*t for the PC version as much as possible.”
Why don’t they put up requirements for 1440p and 2160p?
Can you do simple math? 1440p is nearly double 1080p
So? By that logic, you’d need 2×970 to run it at 1440p. That’s stupid, bro. It doesn’t work like that.
Doesn’t work that way? You’re rendering twice the amount of pixels, either you get another 970 or you get a 1070-1080 or whatever other GPU that comes close to being twice as fast, if you want the same framerate.
Mmm. No. SLI or CrossFire isn’t 1:1 scaling. Some titles will indeed perform twice as fast when you add a second GPU, but many if not most will only get you between 1.5x and 2.0x the performance.
FC Primal (4k)
1080 : 44fps
1080 SLi : 72fps
Thief (4K)
1080 : 52fps
1080 SLi : 84fps
So I think we get the trend. I asked for 1440p and 4K requirements because you cannot deduce them from a simple “add a second GPU and double the framerate”.
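Working the scaling out from the numbers quoted above (a quick illustrative sketch using only the FPS figures in this comment):

```python
# SLI scaling implied by the 4K figures quoted above (single GTX 1080 vs. GTX 1080 SLI).
results = {
    "FC Primal (4K)": (44, 72),
    "Thief (4K)": (52, 84),
}

for game, (single, sli) in results.items():
    scaling = sli / single
    print(f"{game}: {single} -> {sli} fps = {scaling:.2f}x (~{(scaling - 1) * 100:.0f}% gain)")

# Expected output:
# FC Primal (4K): 44 -> 72 fps = 1.64x (~64% gain)
# Thief (4K): 52 -> 84 fps = 1.62x (~62% gain)
```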
You’re a special one aren’t you?
I clearly stated that you could deduce the GPU requirements for higher resolutions by comparing the relative performance of either a more powerful GPU or adding additional GPUs assuming the game has SLI support and it scales well.
Most AAA games I’ve played in the last couple of years have had stellar SLI scaling, near 200%.
>you cannot deduct them from a simple “add a second gpu and double that framerate”
Didn’t say that you damned dunce.
200%? Go look at a quick review of 1080 SLI and see for yourself that not “most” AAA games have stellar multi-GPU performance.
” Doesn’t work that way? You’re rendering twice the amount of pixels, either you get another 970 or you get a 1070-1080 or whatever other GPU that comes close to being twice as fast, if you want the same framerate. ”
This right here. By what you said, “get a second 970 if you want the same framerate”: in a context where 1440p is nearly double the 1080p pixel count, that logic implies that adding a second GPU doubles performance 1:1. By that logic, if you then ran at, say, 1080p, you’d get double the FPS, since you’re not rendering at 1440p (nearly double the pixel count). Which is false, since many if not most titles don’t give 1:1 SLI scaling. Some do.
Does not look like the footage shown at the PC Gaming Show at E3.
I hit minimum and come close to recommended thanks to CrossFire.
My out of stock RX 480 is ready. Kinda …
Low actually looks more realistic. You will never see godrays that “sharp” in the real world.
I was being sarcastic. There is such a small difference between the two, except when it comes to performance: on Ultra it destroys frame rates.
Have you seen them in-game yourself? I did, so I know the difference from low to ultra can be very visible to the eye.
Uh huh.
So you never played the game, then.
I have played it quite a bit. I think I’m around 45ish hours. Lost interest once a lighting mod broke my save and I didn’t feel like starting over.
Is that supposed to be them trolling?
It’s not trolling, there is literally no difference between the two.
Not you, but Nvidia. I can’t see ANY difference between both images either. It looks like THEY are trolling us.
Ohh okay, I wasn’t sure. But if you go to the GeForce page where that’s listed, they have two more images like that where you cannot see a difference, yet on Ultra it tanks the FPS.
I wouldn’t be surprised, honestly. The other thing I don’t understand is why the godrays use tessellation at all. That doesn’t make much sense to me.
It makes perfect sense. That’s how Nvidia has made AMD look bad in the past. Same thing happened with Crysis 2. They had invisible layers of tessellation hidden under the water to lower the performance of AMD cards to make them look bad in benchmarks.