Bethesda has revealed the official PC requirements for Dishonored 2, as well as its PC graphics settings. PC gamers will need at least a 64-bit operating system, an Intel Core i5-2400 or an AMD FX-8320 CPU, 8GB of RAM and an NVIDIA GTX 660 2GB or an AMD Radeon HD 7970 3GB GPU.
Here are the game’s PC requirements:
Minimum:
- Windows 7/8/10 (64-bit versions); Intel Core i5-2400/AMD FX-8320 or better
- 8 GB RAM; 60 GB free HDD space; NVIDIA GTX 660 2GB/AMD Radeon HD 7970 3GB or better
- Contains a single disc. Download of additional files from Steam is required to run the game.
- Requires Steam activation.
Recommended:
- Windows 10 (64-bit versions); Intel Core i7-4770/AMD FX-8350 or better
- 16 GB RAM; 60 GB free HDD space; NVIDIA GTX 1060 6GB/AMD Radeon RX 480 8GB or better
- Contains a single disc. Download of additional files from Steam is required to run the game.
- Requires Steam activation.
In addition, Bethesda revealed the PC graphics settings for Dishonored 2. The PC version will feature uncapped framerates, support for multiple monitors and an adjustable FOV.
Furthermore, Arkane Studios has implemented some specific NVIDIA features, such as HBAO+, TXAA, Surround technology and Ansel.
Here are Dishonored 2’s PC graphics settings:
- Adaptive Resolution
- Gamma
- Window Mode / Borderless / Fullscreen
- V-Sync
- Texture Details
- Model Details
- Environmental Details
- Antialiasing
- Rat Shadows
- Bloodfly Shadows
- Water Quality
- Shadow Quality
- View Distance
- Video Card Selection
- Resolution
- Monitor Selection
- Field of Vision
Dishonored 2 releases on November 11th!

Quite a large difference in the minimum GPUs. The 660 should be roughly on par with the 7870. I’m guessing the game is optimized more for nVidia.
But on the recommended side the GPU recommendation is about right (1060 vs. 480). I just hope this isn't another Forza 3 case, where anything that isn't Polaris didn't perform as it should.
“Requires Steam activation”
I really want to play this game but I will wait for GOG release. #irony
In your case wait for the UWP Win10 only version.
If this game will be only on Steam without GOG then I will buy it on Xbox. No problem
You're fine with Xbox's walled garden but not with Steam? lul
YEP I WILL WAIT FOR UWP WIN10 ONLY VERSION SO I CAN PLAY IT ON MY XBOX AT 720P AND 20 FPS
WITH MY BROTHER
Lol yé bouzin!
Rat shadows? WTF! And no post-processing settings means, I guess, we're gonna get forced CA (Chromatic Aberration).
id Tech based, so it probably can be forced via config. The rat shadows were in the first game too; kind of an inside joke, since it was demanding.
Another unoptimized game featured by NVidia.
Stop this stupid bullshitting, please. NVIDIA is not featuring and publishing games. If you are an AMD fan, it's your choice. But it doesn't mean you have to lie and attack its competition.
It's only BS to NVidia fanboys. NVidia is not featuring any games? Jesus…
Featured: “An item advertised or offered as particularly attractive or as an inducement”, pretty much falls in line with what NVidia is doing.
"Featured: 'An item advertised or offered as particularly attractive or as an inducement', pretty much falls in line with what NVidia is doing."
You are right. My initial reading of the word "featured" was wrong. Sorry for that.
But I insist on what I wrote: your first comment in this thread is BS. And it has nothing to do with fanboyism. If a game uses GW or other NV tech, it doesn't mean the game is unoptimized by default. Let's take a look at Dishonored 2: only HBAO+ in this game can influence performance on AMD GPUs, and this module has been used in many games without any problems. Its source code is also available on GitHub. So according to what you wrote in another comment in this discussion, it should be AMD's fault if there is a performance problem on their cards, right? 🙂 If they can see the source code, they should prepare for it in their drivers. Unfortunately it is not that simple.
No word on Screen Space Reflections. I haven't seen them in the videos released so far, and it's one of the features that really makes some games shine, pardon the pun. And why use TXAA when they could ask Tiago Sousa for help to use Doom's excellent anti-aliasing solution?
Also, no word on disk space requirements. idTech scares me; its games are gargantuan. Doom weighs more than 60GB after the latest updates, and Wolfenstein was also enormous.
I think the ZeniMax studios don't share much with each other, but yeah, agreed; why not ask id Software for a couple of techies for a few days, or whatever.
And the disk space thing, also agreed. This will definitely be a hefty install.
The PoorStation 4 uses its own OpenGL application, so "no": porting to it from Vulkan shouldn't be too difficult, whereas the Xbox One still runs on a modified version of DX12 "because Microsoft", so yeah. But if the id Tech guys are correct, it's easier to make a Vulkan game and then port it over to DX12 than vice-versa, so in theory it should become the go-to API soon.
After all, this stuff takes a ridiculous amount of time to standardise.
Really sick of games still using proprietary Nvidia graphics effects, and this is coming from a NV user. It’s bad for PC gaming.
Dude it’s HBAO+ and TXAA, who the fak cares ?!?
Honestly, HBAO+ is rarely a performance hit, and when it is, it's usually the fault of the developers for a bad implementation. Or am I thinking of HBAO?
I don't remember him mentioning HBAO+ at all; he mentioned NVidia itself.
The NVidia brand has been present on the worst PC ports we've gotten these past 5 years, while the AMD brand is present on the best PC ports.
Feel free to disagree, everyone is entitled to their own opinion, but facts speak for themselves, and by facts I mean the thousands of links you can find on Google that confirm what I just said (and honestly you won't have to go that far; right here on dsogaming we have clear examples of it).
"Proprietary Nvidia graphics effects" means the HBAO+ and TXAA that were mentioned in the article. Is this not what he said?
If you think all NVidia does is give away the SDK for those effects, you're deeply wrong.
When a game is under either the NVidia or AMD brand, they help optimize the game for PC.
We all know how NVidia goes in terms of optimization: they force expensive effects with poor optimization, especially on AMD.
When did I say that? I am not an idiot; I know what Intel, Nvidia and AMD do for PC gaming in general, but I was replying to the comment that was talking about Nvidia's GameWorks.
I was saying NoClipMode was only talking about the Nvidia graphics effects; I was talking about the subject at hand.
“We all know how NVidia goes in terms of optimization”
It's not that simple. In GW there are advanced effects which are performance-expensive. And GW is optimized, but for NV architectures, just like GPUOpen from AMD is optimized for AMD GPUs. If an advanced feature costs more performance, it doesn't mean its implementation is not optimized.
“When a game is under either the NVidia or AMD brand they help optimize the game for PC.”
Yes, they are helping to optimize games, but only helping. The main responsibility is on the developers. If a game is a mess on PC, it is not the fault of NV or AMD; it is the developer's fault.
Clearly you've been living under a rock or want to stay ignorant; bold claims aside, there are thousands of sources that confirm what I said.
Plus, NVidia effects are indeed unoptimized; they can't even run right on NVidia cards!
And you make it worse by bringing up AMD GPUOpen, which has little to do with the AMD Gaming Evolved program.
AMD GPUOpen is no more than a helping hand and can be used by developers outside the AMD Gaming Evolved program, because unlike NVidia, AMD uses open-source software.
If there's any piece of AMD tech that doesn't work well on NVidia cards, it's purely NVidia's fault, because like I said before, AMD tech is open source.
Feel free to browse dsogaming and/or Google, and good luck finding any NVidia-sponsored title that can be called optimized, and vice-versa for AMD; then come back.
PS: Don't forget to make some space for the optimized titles under the AMD brand.
"Plus, NVidia effects are indeed unoptimized; they can't even run right on NVidia cards!"
Claims like this are BS. What does it mean that they are not running right? That they cost more FPS than you expect?
"If there's any piece of AMD tech that doesn't work well on NVidia cards, it's purely NVidia's fault, because like I said before, AMD tech is open source."
It can be open source, but it's optimized for AMD GPUs. NVIDIA also releases source code for some GW features. Does that mean people from AMD or developers will rewrite the source code to be more optimized on NV GPUs? Nobody will do that. BTW, only HBAO+ in Dishonored 2 can influence performance on AMD GPUs, and the source code of that module is available on GitHub. So by your own logic, you should agree that if there are performance problems on AMD GPUs, it is AMD's fault, right?
"Clearly you've been living under a rock or want to stay ignorant; bold claims aside, there are thousands of sources that confirm what I said."
What sources do you mean? Those that claimed GW features are responsible for a game's problems even when those problems aren't connected with the GW modules that particular game uses? Or sources that make statements on topics they don't understand (like with Project Cars, which doesn't use GPU PhysX, yet there are still people who claim it does and that it ruins performance on AMD GPUs)? There are many sources like this claiming something that is wrong.
I don't live under a rock. But my life doesn't revolve around conspiracy theories. If I accuse some library of being problematic, I need proof. An accusation based only on articles somebody wrote on the internet is not proof. It has to come from a relevant person who understands computer graphics and game development, not somebody who blames GW for bugs that can't be connected with this library. And that is the case with most of the articles I've read; only a very few of them were relevant.
I'd say Mankind Divided was pretty darn bad.
"while the AMD brand is present on the best PC ports."
Actually, bad ports also happen with AMD-sponsored titles.
I generally agree, but it’s worth noting Nvidia has released the source code for HBAO+
YEAH YOU SHOULD GET IT ON XBOX
How about the open-source AMD CHS in Deus Ex: Mankind Divided? It offers worse performance and image quality, but it's open source, so we must bow to AMD.
So DX12 and Async optimized only for GCN is good for PC gaming?
You are using a backwards analogy. DX12 and async are not products of AMD; AMD just supports them better than Nvidia.
That means Nvidia is gimping their customers in one area to provide gains in other areas. Their architecture fails in that respect, but their architecture is also why their cards perform better on DX11.
Async is pretty much solving a problem AMD has with their architecture. Those ACE units have been present since the very first iteration of GCN, but they can't be accessed in DX11; the AoS dev specifically mentioned they were able to use those ACEs thanks to async compute in DX12. Personally, I don't think AMD is really that bad at DX11; rather, they want to ease the burden on their driver team, so they decided to push more of the optimization responsibility towards game developers via low-level access. When it comes down to it, Nvidia simply puts more effort into DX11 than AMD does.
wat.
These general accusations, and the people who don't understand what they're talking about, are what's actually bad for the PC community. This game uses:
- HBAO+: one of the AO implementations, used in several games without any problems (not even for AMD users, who can use it too)
- TXAA: a proprietary anti-aliasing mode available only to NVIDIA users. Other users have alternatives, so why are they complaining?
- Surround: the same kind of technology as Eyefinity. It only allows using more monitors to display the game. Just don't tell me Eyefinity support is good but Surround isn't.
- Ansel: allows NV users to take 360-degree and 3D screenshots of the game. It doesn't influence the game or other gamers. I really don't know why this is considered something bad.
Claims like yours are nothing more than trolling. This game uses no technology that causes problems for users who don't have an NVIDIA GPU. So why exactly are you complaining?
And the funny thing is that nobody forces you to use these additional effects. Everybody can turn them off. People, be reasonable.
Game developers have a reason to use them: sometimes it is cheaper to license a ready-made solution than to create their own. The funny thing is many games use other proprietary solutions, but people only complain about Nvidia GameWorks.
"funny thing is many games use other proprietary solutions, but people only complain about Nvidia GameWorks"
Exactly. This is done only by people who just don't like NVIDIA or who don't understand what the effects in GameWorks represent. They think that if developers use GPUOpen from AMD it's better because it is open, but they don't realize that this library is also optimized for AMD GPUs, just like GW is optimized for NV GPUs. And nobody rewrites a whole library just because it's open; it is used exactly like GW. It's a nice trick from AMD: call something open and people will consider it a good solution by default. 🙂
Because every game under the NVidia GameWorks program ends up among the worst ports in history.
Dishonored 2 just suffered the same fate, after the first game was properly optimized.
EVERY? Then why does Shadow Warrior 2 have excellent performance? Stuff like this does not happen to GameWorks titles only; what happened with Mankind Divided? The first game performed really well because it did not use any graphically demanding features. I still remember being a bit disappointed when I saw its graphics compared to other titles back then (and I'm not alone in this), but I understand part of it, because the developer intended the game to look that way.
Shadow Warrior 2 doesn’t use Nvidia Gameworks.
Deus Ex: Mankind Divided uses the Dawn Engine at its full potential and was not held back to run on current hardware.
You can still play it at 60 FPS by lowering the lighting settings, and it is not unoptimized (there's an article on dsogaming that shows that).
GW is already open source, isn't it?
I assume adaptive resolution just means dynamic resolution? That'd be great; it should really help with scalability on lower-end machines. Hopefully the game is as well optimized as Dishonored 1, which runs like greased lightning on pretty much anything (except consoles :P). It'll be interesting to see how the VOID Engine (a modified version of id Tech) improves on the first game's UE3, both visually and in performance.
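(For what it's worth, dynamic resolution scalers are usually just a feedback loop on GPU frame time. Here's a minimal C++ sketch of that general approach, purely illustrative and not anything from the VOID Engine itself: nudge the render scale by how far the last frame's GPU time is from the frame budget.)
```cpp
#include <algorithm>
#include <cstdio>

struct DynamicResolution {
    float scale    = 1.0f;   // current render scale (fraction of native res)
    float minScale = 0.5f;   // never drop below half resolution
    float maxScale = 1.0f;   // never exceed native resolution
    float targetMs = 16.6f;  // frame budget for a 60 fps target

    // Call once per frame with the measured GPU time of the previous frame.
    void update(float gpuTimeMs) {
        // Nudge the scale in proportion to how far we are from the budget;
        // the 0.1 damping factor keeps it from oscillating between extremes.
        float error = (targetMs - gpuTimeMs) / targetMs;
        scale = std::clamp(scale + 0.1f * error, minScale, maxScale);
    }
};

int main() {
    DynamicResolution dr;
    // Simulated GPU timings: the scene gets expensive, then cheap again.
    for (float gpuMs : {15.0f, 19.0f, 22.0f, 18.0f, 14.0f, 12.0f}) {
        dr.update(gpuMs);
        std::printf("gpu %.1f ms -> render scale %.2f\n", gpuMs, dr.scale);
    }
    return 0;
}
```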
I heard they were aiming for 60 fps on the consoles, so it's probably not that demanding.
Really? That would mean the game is quite light. In that case the req. are bullshit.
I think it's too noticeable as well when you're up close; console users don't game right up to their monitor like us. Still, always nice to see more settings.
I really, really want to get the skinny on why, in the past year or two, some AAA games require 16 GB of RAM while current-gen systems are still stuck with 8. Nothing has changed; the PC versions aren't being made night-and-day different thanks to the jump in RAM requirements. What is the actual reason they want more when so little difference is being shown?
Nvidia effects are garbage. Yes, I use Nvidia, but it's bad, really bad.
What exactly is wrong with HBAO+, Surround or Ansel? I don't use TXAA because it blurs frames, but you have the choice to turn it off or use alternatives. Nobody is pushing you to use them. And if these effects are garbage, what are the better alternatives? Do you know better solutions which give us better results with less performance cost?
The 7970 is significantly faster than the GTX 660. You can already tell it's gonna be poorly optimized for AMD, to the point where AMD users must have more powerful hardware than their Nvidia counterparts.
What is it I smell? Oh, a game that will make my 980 Ti barely achieve 50fps at 1080p max settings without looking that demanding?
well f*ck me
“Contains a single disc. Download of additional files from Steam is required to run the game.” Wow… game companies have all just started doing that now. What the hell like, come on.
then just get the game directly from steam?
‘Rat shadows’
NEXT-GEN AS F*CK
They save resources using the same old.
Looks like no 21:9 support?
On the other hand, it seems Bethesda finally managed to fix Havok to work with high framerates.
HBAO+ and TXAA are not NVIDIA-specific features; they work fine on AMD.
TXAA doesn’t.
But do you really want to use that blurry mess ?
Yes.
it does so
The minute i saw “HBAO+” i realized that the game will suck on AMD hardware…
It is because you don't understand what it is about. That's all. 🙂 HBAO+ was never a problem for AMD GPUs.
Right… HBAO+ is a problem for all GPUs regardless of the company that makes them.
Ok
Can I know why? Just don't say "because it's from NVIDIA". Try an adult answer.
Because, in general, the performance hit is greater for the fidelity it provides and because it’s not open source so AMD can’t truly optimise it
🙂
“the performance hit is greater for the fidelity it provides”
Tell me which solution is better and why. And I want technical arguments, not the open-source argument, because open source doesn't mean better.
“it’s not open source so AMD can’t truly optimise it”
The source code of HBAO+ is available on GitHub, so by your reasoning AMD can optimize their drivers for it. But it is not that simple. HBAO+ is optimized for NV GPUs, meaning it is optimized for their architecture; it's the same with GPUOpen, which is optimized for the GCN architecture. You would have to rewrite most of it, which is nearly the same as implementing your own solution, and nobody will do that. If you are capable of programming, try to implement something similar, like SSAO, and you will see how it works. Then you will stop spreading the general BS you read somewhere on the internet.
SLI ?
GPUOpen is optimized for AMD GPUs and their architectures. Even though it is open source, nobody will rewrite these features to be optimized for AMD's competition.
GPUOpen might be optimized for GCN, but it’s 100% functional with any compliant GPU (yes, even the ones from nvidia), and full source is available without any type of EULA chicanery, unlike nvidia’s approach.
Lazy devs can still opt to include AMD’s solution as an in-game option. The more choices, the better.
“Lazy devs can still opt to include AMD’s solution as an in-game option. The more choices, the better.”
It’s not about laziness. But I agree with choices.
“The AOFX library provides a scalable and GCN-optimized ambient occlusion (AO) solution.”
So features optimized for AMD HW are good, but NV-optimized features are bad? Really?
Never implied that. Just compared the closed mindset of nvidia against the open nature of AMD’s offering.
Having said that, from AOFX's page: "The AOFX library provides a scalable and GCN-optimized ambient occlusion (AO) solution."
Once again, GPUOpen is 100% functional with any compliant GPU and full source is immediately available: no EULA, no artificial restrictions.
HBAO+ runs without any problems on competitors' GPUs too. TXAA is only one option, and AMD users can use alternatives. Ansel and Surround are like Eyefinity (which is for AMD GPUs only). So Dishonored 2 should not have any problems on AMD GPUs because of GameWorks. I still don't understand why this discussion is happening under this game. I know some people are allergic to GW, but why complain when there is nothing wrong?
"GPUOpen is 100% functional with any compliant GPU and full source is immediately available: no EULA, no artificial restrictions"
So? Developers will not rewrite GPUOpen's source code to make it optimized for NV GPUs. The source code of HBAO+ is available too, and it's the same case. The best way is to use libraries from both companies to get the best solution for both sides. That's better than developers implementing one solution for both, because it will not be the best solution from a performance point of view for one or both of them. I think this "open" terminology is a good marketing strategy from AMD, aimed at people's feelings whether it makes sense or not. People never think about it: when they hear "open", they assume it's better and optimized by default. And that is wrong.
“Because if we don’t preemptively complain, we will eventually get more crap like tessellated god rays that kill performance on any card”
Then complain only when this happens. But not under a game like Dishonored 2, whose techs didn't cause any problems in previous games. It's unproductive to start a flame war against GameWorks with the release of every game that uses this library.
"Being open from the start and pushing for cross-pollination gives the movement potential to be something way bigger than GameWorks. That's the true power of openness."
I agree with that statement. There would be no problem if the GPUOpen library provided general solutions. But it doesn't: it is optimized only for one brand of GPUs. If developers have to rewrite these solutions to make them better optimized for other brands of GPUs, it's as if they were implementing a new solution that fits them. It doesn't matter that it's open if the library favors GPUs from only one company. AMD is not building general solutions; they are building libraries the same way NVIDIA does (for their own HW), just as open source. And that means nothing in many cases because of the architectural differences.
“just compare nvidia’s HairWorks as implemented in The Witcher 3, to PureHair AKA AMD’s TressFX 3.0”
This is exactly how it works. HairWorks is built mainly around tessellation because the NV architecture handles it well. How can AMD optimize it even if they have its source code? These are architectural differences. It's the same with asynchronous shaders, where GCN is better than its competitors: when future versions of the GPUOpen libraries use them, NVIDIA will have no chance to do anything about it even with the source code available. That's the point.
BTW, there would have been no problem with HairWorks in The Witcher 3 if the developers had shipped the game with options to tweak this feature. It was set to the highest level by default with no way to change it, and that was wrong. The funny thing is that AMD handled it in their drivers with a tessellation factor override, so the main problem hit players with older NVIDIA GPUs, who had no option to do that.
I have to admit that I don't know the in-depth technical differences between TressFX and HairWorks. Performance is not the only thing that matters: if HairWorks offers a better solution with more possibilities for developers, then it will cost more performance. So if you have better knowledge about that, please write it here.
Thanks for the link. It is good to know the difference between an open-source licence and the licence used for GW.
“SSAO is a good example of a balanced Ambient Occlusion between performance and quality.”
A year ago I implemented SSAO in DX11 following Frank Luna's book. SSAO is good, but HBAO+ gives a better result, and it costs around 10-15% of FPS (depending on the game and scene). Whether that is too much for a better AO solution is for everybody to decide for themselves.
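(For anyone curious, here's roughly what the per-sample occlusion term from Luna's chapter boils down to: a CPU-side C++ sketch, with hard-coded sample points standing in for the random-offset depth-buffer reads the real pixel shader does. Names and values are illustrative.)
```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static Vec3  sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static float len(Vec3 a)         { return std::sqrt(dot(a, a)); }

// Occlusion contributed by one potential occluder q, as seen from the shaded
// point p with surface normal n (all positions in view space).
float occlusion(Vec3 p, Vec3 n, Vec3 q,
                float fadeStart = 0.2f, float fadeEnd = 2.0f) {
    Vec3  toQ  = sub(q, p);
    float dist = len(toQ);
    if (dist < 1e-4f) return 0.0f;
    // Samples behind the surface plane contribute nothing (the max with 0),
    // and occlusion fades out linearly with distance, as in Luna's chapter.
    float angle = std::fmax(dot(n, {toQ.x / dist, toQ.y / dist, toQ.z / dist}), 0.0f);
    float fade  = std::fmin(std::fmax((fadeEnd - dist) / (fadeEnd - fadeStart), 0.0f), 1.0f);
    return angle * fade;
}

int main() {
    Vec3 p{0.0f, 0.0f, 5.0f};   // shaded point, 5 units along the view ray
    Vec3 n{0.0f, 0.0f, -1.0f};  // its normal, facing the camera
    // In the real shader each sample is a random offset vector projected and
    // re-read from the depth buffer; here we hard-code two occluder points.
    float total  = occlusion(p, n, {0.1f, 0.0f, 4.7f})    // nearby occluder
                 + occlusion(p, n, {0.5f, 0.5f, 3.0f});   // too far, fades to 0
    float access = 1.0f - total / 2.0f;                   // average over samples
    std::printf("ambient access = %.3f\n", access);       // darker when occluded
    return 0;
}
```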
I never claimed that HBAO+ doesn't look better. It does. But at what cost? That's my "argument". It's the same with MSAA and SSAA: the difference is there, but so is the performance hit. That's why we saw all these different anti-aliasing techniques come to life. I guess ambient occlusion will see something similar, with more companies (besides nVIDIA) releasing their "take" on the matter. And hopefully it will move toward less performance-taxing routines.
"I never claimed that HBAO+ doesn't look better. It does. But at what cost? That's my 'argument'."
Everyone should decide for themselves whether the performance cost is acceptable. I am OK with it, because I like better graphics. And HBAO+ is optional: nobody pushes people to use it. You can turn it off or use alternatives, and there would be no difference even if HBAO+ were open source. This is why I don't get why people complain about games like Dishonored 2, which doesn't use any tech that hurts performance depending on which GPU vendor's card they own.
"And hopefully it will move toward less performance-taxing routines"
I think that will come only with stronger HW, where the performance impact will not be so noticeable.
My objection is that it's weird how, in most games, a patch comes along with an updated HBAO+ version (obviously from nVIDIA). This keeps me thinking that, because it's not open source, there may be some additional "features" which cripple performance on both vendors.
This is the reason I'm in favour of open-source code. It's more transparent and can be modified by all vendors to make it suit their products.
"This keeps me thinking that, because it's not open source, there may be some additional 'features' which cripple performance on both vendors."
Why would NVIDIA do that and cripple their own GPUs? And why would developers agree to it? It doesn't make sense. There is of course ongoing development of features like HBAO+, so there are updated versions. As you wrote, more functionality could be implemented, more options for developers, some bugs can be fixed, or there can be performance optimizations. The first two cases could have an impact on overall performance, but if it's worth it, why not?
"This is the reason I'm in favour of open-source code. It's more transparent and can be modified by all vendors to make it suit their products."
It's not that simple. In cases like this (advanced graphics features), modifying source code that is optimized for a completely different architecture may be impossible without a complete rewrite, and at that point you would rather write your own solution than rewrite the existing source code. AMD is not providing general solutions, but solutions optimized for their GPU architectures.
Matter no. 1: Project Cars. The 780 Ti was crippled, and only after user outcries did nVIDIA release a proper driver.
Matter no. 2: TressFX used to run like crap on nVIDIA hardware until nVIDIA released their "updated" version.
In general, I think that open source is the better way to go.
Thanks for links. I will take a look at it during weekend.
And? Shadow Warrior 2 is not under the NVidia GameWorks program, nor does it have GameWorks features, so my point stands.
You can't guarantee it does not. Keep making up stuff, you're almost there.