The beta phase for Tom Clancy’s The Division begins next week, yet Ubisoft has not officially revealed the requirements for the PC version. However, it appears that the specs have already been leaked online. According to Steam users, Ubisoft Russia has revealed the game’s PC requirements, and you can read them below. Since the Steam page has not been updated yet, we’re marking this story as a rumour.
Tom Clancy’s The Division PC Requirements
Minimum Requirements:
- Operating System: Windows 7 SP1, Windows 8.1, Windows 10 (only 64-bit versions)
- Processor: Intel Core i5-2400 @ 3.1 GHz or AMD FX-6100 @ 3.3 GHz
- RAM: 6 GB
- Video Card: NVIDIA GeForce GTX 560 or AMD Radeon HD 7770 (2 GB VRAM)
- Hard Drive: 40 GB
- DirectX: DirectX June 2010
- Sound Card: DirectX compatible
- Input Devices: Windows-compatible keyboard, mouse, controller (optional)
Recommended Requirements:
- Operating System: Windows 7 SP1, Windows 8.1, Windows 10 (only 64-bit versions)
- Processor: Intel Core i7-3770 @ 3.5 GHz or AMD FX-8350 @ 4.0 GHz
- RAM: 8 GB
- Video Card: NVIDIA GeForce GTX 970 (4 GB) or AMD Radeon R9 290 (4 GB) or better
- Hard Drive: 40 GB
- DirectX: DirectX June 2010
- Sound Card: DirectX 9.0c support
- Input Devices: Windows-compatible keyboard, mouse, controller (optional)
Supported Video Cards at Release
NVIDIA
- GeForce GTX 500 series: GeForce GTX 560 (2 GB VRAM) or better
- GeForce GTX 600 series: GeForce GTX 660 or better
- GeForce GTX 700 series: GeForce GTX 760 or better
- GeForce GTX 900 / Titan series: GeForce GTX 960 or better
AMD
- Radeon HD 7000 series: Radeon HD 7770 (2 GB VRAM) or better
- Radeon 200 series: Radeon R9 270 or better
- Radeon 300 / Fury X series: Radeon R9 370 or better
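If you want a rough idea of how your own machine stacks up against the minimum specs before the beta, the short Python sketch below is one way to pull the relevant numbers. To be clear, this is not part of the leaked material: it’s a purely illustrative helper that assumes Windows, the third-party psutil package and the built-in wmic tool, and the thresholds are simply the figures from the lists above. It can’t judge whether your particular GPU clears the bar, so compare the reported card name against the supported-cards list yourself.

# Rough self-check against the leaked minimum specs (illustrative only, Windows).
# Assumes the third-party psutil package is installed (pip install psutil).
import platform
import subprocess

import psutil

MIN_RAM_GB = 6  # minimum RAM from the leaked specs above

def is_64bit_windows():
    # The specs list 64-bit Windows 7 SP1 / 8.1 / 10 only.
    return platform.system() == "Windows" and platform.machine().endswith("64")

def total_ram_gb():
    # Installed physical memory in GB.
    return psutil.virtual_memory().total / (1024 ** 3)

def gpu_names():
    # Query installed GPU names via WMI (wmic ships with Windows).
    out = subprocess.run(
        ["wmic", "path", "win32_VideoController", "get", "name"],
        capture_output=True, text=True, check=False,
    ).stdout
    return [line.strip() for line in out.splitlines()[1:] if line.strip()]

if __name__ == "__main__":
    print("64-bit Windows:", is_64bit_windows())
    print("RAM: %.1f GB (minimum %d GB)" % (total_ram_gb(), MIN_RAM_GB))
    print("GPU(s):", ", ".join(gpu_names()) or "none detected",
          "- compare against the supported-cards list above")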

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best consoles. Still, the PC platform won him over. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC graphics cards.”
Inevitably sh**ty PC port from Ubisoft firing in 3…
2…
It is made by Massive, a PC studio that ported FC3.
Wow those are surprisingly low.
the graphics look too crappy for these recommended specs
Gameplay circulating around is from the PS4 version.
Wrong, it’s from xbone
Really don’t care.
funny how they ALWAYS say gtx 970 for Recommended requirements…lol
for 1080p 60fps and probably max settings.
Don’t get your panties in a twist guys, it’s probably 1080p, 60FPS ultra for recommended specs. BTW, the game looks really poor, it’s like a generic third-person shooter, Ubisoft can’t come up with anything innovative.
Intel Core i7-3770 @ 3.5 GHz, 40GB HDD, 4GB video card, wtf is special about this game? It looks like s#it. I just don’t understand PC gaming these days. From bad to worse.
Mortal Kombat XL, confirmed NOT coming to PC. So now Ubisoft with Far Cry Primal and The Division will definitely be getting my money in March. People like to hate on Ubisoft but at least they treat PC gamers much better than WB and Activision and make better games than EA.
I just don’t like 64% GPU utilization in watch dogs and 80% in far cry 4, they don’t bother to optimize.. at all.
For all Ubisoft games, increase the resolution as much as you can, disable AA and V-Sync, and lock your fps to 60 or whatever you prefer.
the latest cod runs better for me than watchdogs
“GeForce GTX 700 series: GTX 760 GeForce or better”
So the 750ti is not officially supported? That’s strange, it’s quite a bit faster than the 560 that is supported as the recommended minimum.
no one has the answer yet… even the above article was marked as a rumour.
John, you should stop using that old E3 picture for these articles. That version of the game is long gone and now we have a downgraded version.
Just like Watch Dogs, just like the Witcher 3, it was never going to look that good. People need to realise those trailers are just that, vertical slices of how they want the game to look.
watch out! you just bad-mouthed the Witcher 3! prepare for fanboys to cry over the coming months at your comment and try to convince you it wasn’t a downgrade and that CD Projekt didn’t lie all the way up to release day.
I’ll support him
Let them bring it on.
Pfft, developers should stop releasing footage that the actual games can’t come close to.
How does a developer know what the final game will look like if they haven’t even begun to think about optimization, final features, or both ends of the vertical slice, and are showcasing a very early work in progress?
You are mistaken if you think they weren’t perfectly aware that the game will never look that good. It’s exactly the opposite, and it’s called false advertising. Don’t blame people for it.
Those builds were created solely for marketing purposes. Vertical slice is a different thing.
It’s not a different thing, a vertical slice is exactly what you saw at E3 2013 with The Division and Watch Dogs, it’s what they want the game to look like. There are plenty of adverts where they show better-looking faces thanks to X beauty product, but the lighting trickery makes them look better; they seem to be allowed to do that even though the lighting is misleading.
There are all sorts of ways they can get out of people claiming it’s false advertising; adverts on TV do it all the time.
A vertical slice is used to show the key points of a piece of software. It doesn’t have fancy graphics (if any at all), scripted movie-like sequences and some other sheet. What you saw at E3 2013 can be called a cinematic “gameplay” trailer. It was built only to pull money out of you.
>it’s what they want the game to look like
Please. Ubisoft knew this would never be in-game visuals.
Moreover, I don’t care what they want. What I want is a presentation that looks more or less close to the release version.
The alpha version will never look like the finished game anyway, that’s why they do vertical slices. Crysis didn’t, The Witcher 2 didn’t. People are just going to call raw alpha junk, just like what’s happening with Star Citizen now; they dare not show buggy raw alpha footage because it’s so bad. RSI do it because they always have backers during development and they need to show something for it.
Huh? So we, somehow, managed to live to this day without marketing bullsheet cinematic trailers, which you call vertical slices for some reason, and now you’re talking like it’s absolutely normal to create completely separate builds, which don’t represent the quality of the final product and have nothing to do with the game itself, because “people are just going to call raw alpha junk”.
First of all, no one is forcing you to show “raw alpha”. Show the game when it’s ready.
Second, Star Citizen is a completely different beast because they have certain obligations to backers.
Third, Witcher 2 alpha footage was unofficial, and it was a desperate measure to attract potential investors, since CDP had serious financial troubles back then. Moreover, I think it looked very good.
The devs call it a vertical slice, not me. Raw alpha footage just doesn’t look good; we never used to have access to alphas or even open betas like nowadays. Honestly, gamers should stay out of this conversation, it’s not your money you’re risking; gamers seem to claim they know when they know nothing when it comes to real-world business.
Incorrect.
All three of them showed the demos BEFORE they knew console specs. ALL three of them were then downgraded. That should tell you that it was probably because they overestimated console specs. Either that or all three dev teams are incompetent or deceptive.
For example, if Ghost Recon: Wildlands ends up being downgraded on PC, then we can rage at Ubisoft. They showed it at E3 2015.
Not for these games though. We can’t really blame them. They just severely overestimated console specs. As long as the PC version looks somewhat better than the console version, it’s fine. You can’t expect them to make a separate PC version with completely different graphics and tech solutions.
It’s still really sad though. Just watch the Snowdrop engine trailer back in 2013. The visuals were insane! It seems to me that they were literally trying to make the best-looking game of all time. Things like Global Illumination are gone, that’s for sure. The textures and level of detail… they were sooo good, but I doubt it’ll be anything like that.
>They just severely overestimated console specs.
Cheap excuse to cover their bullsheet. How in the world, being game developers, can you overestimate $400 console hardware? There is no way it could have been better than a $400 PC. And I’m pretty sure they were showing The Division’s first gameplay trailer on a monster spec.
Moreover, Ubisoft is doing this bullsheet for a very, very long time, long before PS4 and X1.
“Cheap excuse to cover their bullsheet.” Or maybe it could be the legitimate reason. Why does it have to be an excuse if it makes sense?
Since when did the devs know it was $400?
Devs didn’t know what the console would cost. Not to mention (and this settles it completely), the previous gen PS3 was about $600.
Why wouldn’t they overestimate it? What’s stopping them from doing that?
Your argument is weak.
Again, THREE separate dev teams did it. Those three happened to be the ONLY THREE graphically impressive multiplat games shown in E3 2013. Nothing else came close to that level. ALL THREE were downgraded. That’s literally ALL of the relevant data samples in the situation.
Even though it’s only three cases, it should tell you something. The explanation I’ve provided makes perfect sense.
“Moreover, Ubisoft is doing this bullsheet for a very, very long time, long before PS4 and X1.”
Then what about CDPR? What about The Witcher 3? Did you see how their game got downgraded? They also had a massive downgrade. It makes sense, of course, because they also showed their game on a high-end PC at E3 2013, before console specs were known.
Are CDPR also like Ubisoft then?
“And I’m pretty sure, they were showing Division first gameplay trailer on a monster spec.”
Of course they did. Why wouldn’t devs do that? Especially when the next-gen consoles weren’t even revealed, and a new generation was just about to start.
See, if Ghost Recon Wildlands gets downgraded (the PC version), then we can rage at Ubisoft. That game was demoed on a PC well after console specs were known.
Not for WatchDogs and The Division though. If we do that then we have to give CDPR the same treatment over The Witcher 3.
My argument is super strong, as always. Consoles themselves have to be affordable; after all, they were made and designed for the masses. They simply couldn’t be more expensive than a mid-range PC, which means $400-600.
You’re telling me that the top people at Ubisoft didn’t know all this? You’re just fooling yourself, pal. There is no way you could, somehow, sell for $500 a monster spec that would be able to run The Division in its “original” quality. Consoles would have cost $2000 or more.
That’s just simple logic. I don’t know why you can’t understand it.
Actually, they’re even worse. You see, Ubisoft was always a multiplatform developer, while CDP and the Witcher series both originated on PC. In the end, they completely switched to consoles after Witcher 1, and spat on PC gamers. That kind of betrayal from a PC developer hurts much worse.
But, speaking of the downgrade, I have, again, an unshakable argument for you. Even if we forget about E3 2013 and the old lighting system, the game looks much worse even when compared to the 35-minute gameplay demo, which came out in August 2014. That was long after the console specs were revealed.
Conclusion? This demo was built solely for marketing purposes, with visuals that players will never see in the actual game.
$400 and $600 are very different.
If the PS4 was $600, things would look much, much better on it. $200 is a HUGE difference in processing power on a console.
So, no, your argument doesn’t really work.
“the game looks much worse even when compared to the 35-minute gameplay demo, which came out in August 2014.”
No, it didn’t. People keep saying this, but it clearly shows that they don’t actually do proper graphics comparisons. The engine is so similar here it’s ridiculous. It looks sooo similar to what we have now. Heck, the final version looks BETTER in some ways. In fact, there’s reason to believe that it was running on a console! Why? The shadows – they sucked. Watch the video again. The lighting looks exactly the same, the detail level looks very similar. The shadows – some of the lowest-resolution, blockiest shadows I’ve ever seen, similar to, if not worse than, what you see on today’s console version. The shadows and lighting on the PC version by comparison… it’s gorgeous, it’s an improvement, not a downgrade from 2014.
I’d like to know where you see the downgrade though. If you can, make some specific comparisons based on the 2014 demo and tell me. I’ve watched that many times, and it’s pretty much the same. The tech aspects are all there, and they are the same, if not better, in the PC version we have now.
There are only 2 things that matter:
1) Division was shown on a top PC config
2) Consoles were designed for the masses, to be cheap and affordable, which means they couldn’t possibly have had this config
There were various comparisons of the 35-minute demo from August 2014 and the final build, clearly showing downgraded visuals (especially geometry). If you’re really interested in this, just google it and search the old threads here; I have absolutely no desire to discuss this anymore. Everything has already been said and done.
Vertical slices of how they *wanted* the game to look. Note that the three games we are discussing here were THE three graphically impressive multiplats shown at E3 2013. Nothing else was even close. These were the only 3 games pushing graphics that far.
They were shown before the console specs were known. ALL three games were then significantly downgraded over the years.
That should tell you something.
Even so, that only justifies Witcher 3, Division & Watch_Dogs on Consoles, not Ubisoft’s hiding Watch_Dogs PC Settings, CDPR changing the Lighting System, & Division being on an entirely different version of the SnowDrop Engine (I mean, that’s what it looks like these days, it’s a bigger split than BF3 vs BF4, ffs) which can only be defended with “uh, we were piss scared Console kiddies/Publishers would scream, stamp their feet & throw loud angry tantrums at us if the PC version was too fancy, & since PC isn’t a “big deal” for us the way Consoles are, we bent over & let them f*ck us, just as we bent you over & f*cked you real good.”
The Bullsh*t Campaign Ubisoft had going right until Watch_Dogs’ Launch, & the “it’s not downgrading, it’s optimizations!” bullsh*t CDPR tried to push onto us Pre-Launch, that can’t be justified by “Console Hardware Specifications weren’t released yet” either.
Just read my last post below. There is no way an experienced developer such as Ubisoft didn’t know that consoles would cost $400-600. They were designed to be affordable. Have you ever seen a console for $2000 or more? That would be the price for that kind of visuals.
And it’s not just that. Ubisoft has been doing this sheet for a very long time, long before the PS4 and X1, since… I don’t know. Far Cry 2 is the first thing I can remember.
Ghost Recon: Future Soldier’s pre-delay E3 2010 Presentation looked amazing, the actual final result was a generic shoddy FPS with bad visuals.
Nobody batted an eye because either nobody remembered, or nobody cared enough to comment on it, though. For those who’ve already forgotten, yeah, this was the Game they tried to deny they’d ever announced a PC version of, even going so far as to delete Forum posts to “prove” it.
“& certainly not Ubisoft’s hiding Watch_Dogs PC Settings”
That much, I can agree with. Sucks, forces you to think about what might be really going on :/
Has it? Or was that the console version?
Tell me one thing. Is this a game with a solid singleplayer and story, or is it a full-fledged multiplayer MMO-type grindfest??? All info points to the latter - a you-must-play-as-a-team thingy.
There will be story driven missions actually, with cutscenes and everything.
This is an mmo grindfest. You and other players are thrown into a destroyed environment in an emergency situation. You take down generic mobs of enemies and get loot, you’ve got an HQ where you meet up with other players, and you upgrade your gear on workbenches.
This is another mmo but with shooter controls because it was aimed at consoles.
There is a story to follow through 13 or so missions, and you can choose to play that by yourself. I do believe, however, that you have to be constantly online to play the game, so the world will have other players even if you choose to play alone.
at least they use proper GPUs for both teams and not way off specs like 780/290x
Still better than EA: 8 GB minimum and 16 GB recommended.
I smell a bad optimization
Man, I never got into this series. Waiting for Deus Ex and Hitman, even with Hitman being an episodic game.
It’s not a deceptive version of DirectX. Any DX11 card can run it, it’s just that those cards they listed are probably worth it. I mean, come on, a GTX 560 or better, that’s a 5-year-old card.
Some better bit-rate shots, uploaded at 1440p bitrate from XB1 footage.
http://i.imgur.com/eeq9EF9.jpg
http://i.imgur.com/SGzJ7ab.jpg
http://i.imgur.com/LSfcrpB.jpg
http://i.imgur.com/VnuUVVp.jpg
http://i.imgur.com/MNUZV1Z.jpg
Just like their laptop brethren: they will most likely work, but they haven’t been tested and won’t get proper support, which may lead to poor performance here and there, or even a glitch or two.
That’s… not nearly as steep as I was expecting honestly.
Enjoy your kiddy console, simpleton.