It’s been a couple of days since the launch of Watch_Dogs, and we are here today to see how this new open world title performs on low to mid-end systems. We’ve covered Watch_Dogs extensively these past months, so there is no reason to debate (or prove) whether it has been downgraded. Yes, Watch_Dogs has been downgraded, and as you may have heard these past few days, it also suffers from various performance issues.
For this article, we used an overclocked Q9650 (4.2GHz) with 4GB RAM, an Nvidia GTX 690, Windows 7 64-bit and the latest version of the GeForce ForceWare drivers. Nvidia has already included an SLI profile for this game, however we were CPU limited and could not determine whether SLI scaling was ideal or not. One way to find out whether SLI is behaving as it should would be to further increase the game’s graphics settings, but in that case we were also bottlenecked by our VRAM. Each GPU core of our GTX 690 features 2GB of VRAM, and as a result we experienced major stuttering the moment we enabled AA or Ultra settings for Shadows and Reflections. Hell, even enabling HBAO+ High resulted in a stuttering mess.
Our low/mid-end system was able to run Watch_Dogs at 30-50fps at 1080p with High settings. While driving, our framerate took a hit and dropped to 26-28fps. Not only that, but even at the lowest settings we were unable to hit a constant 60fps, which shows that our CPU is the bottleneck here. Also, despite Ubisoft’s claims, Watch_Dogs ran perfectly fine with our 4GB of RAM. By PC standards, our Q9650 is simply not powerful enough for Watch_Dogs. By console standards, the game runs as well as it does on PS4 or Xbox One.
What also surprised us was the game’s minimal performance difference between a tri-core and a quad-core CPU. Watch_Dogs’ Disrupt engine scales wonderfully on a quad-core, yet the performance difference between a tri-core and a quad-core is only 3fps. This makes us wonder: why is the game stressing all four CPU cores if there is next to no benefit over a tri-core? Below you can find comparison shots between a dual-core, a tri-core and a quad-core system. Moreover, we’ve included a video showing the similar performance of the game running on a tri-core and on a quad-core.
So, Watch_Dogs requires a high-end CPU to shine, and dual-cores (or even older quad-cores, if you are targeting 60fps) are not able to provide enjoyable performance. But that’s not all. The game also has some ridiculously high VRAM requirements. Nvidia claimed in its performance guide that a GTX 690 is able to handle the game with Ultra settings (albeit using High textures). Well, that’s far from the truth, as there is a ridiculous amount of stuttering with Nvidia’s settings. In order to enjoy a stutter-free experience, we had to completely disable AA and run the game at High settings (and 1080p). With the aforementioned settings, there was no stuttering even while driving. We don’t know whether Ubisoft or Nvidia will be able to provide a performance boost for 2GB GPUs, so until we get a new driver or a patch, we strongly suggest avoiding the Ultra settings (unless you have a 4GB or a 6GB GPU).
Graphics-wise, Watch_Dogs looks okay-ish at High settings. Its visuals are washed out and the game desperately needs some SweetFX treatment. Graphical effects – such as Parallax Occlusion Mapping, smoke effects, debris from explosions and the number of reflected lights – have been reduced or completely removed. The lighting system also feels ‘simplistic’ and ‘weak’, and is nowhere near as good as Ubisoft advertised. Specular and normal maps are also underused. And while the game requires more than 2GB of VRAM for its Ultra textures, those textures are nowhere near as good as those found in Crysis 3 or Battlefield 4. Truth be told, those games aren’t sandbox titles, so we can’t really compare them. Still, Watch_Dogs’ textures feel underwhelming given their high requirements. And while the game looks great when downsampled (with every graphical setting cranked up), most PC gamers won’t even be able to experience that due to the game’s ridiculous VRAM requirements.
All in all, Watch_Dogs is a mess. Its performance is underwhelming, its VRAM requirements are unjustified for what is being displayed on screen, and most gamers will encounter major stuttering. Although the Disrupt engine scales well on quad-cores, there is only a minimal performance difference between a tri-core and a quad-core CPU. Ironically, Ubisoft’s heavily modified Unreal Engine 2.5-powered Splinter Cell: Blacklist shows better CPU scaling across all cores; not only that, but there is a noteworthy performance difference between tri-cores and quad-cores in Splinter Cell: Blacklist.
Enjoy and stay tuned for our second part, in which we’ll put Intel’s i7 4930K to the test!

John is the founder and Editor-in-Chief at DSOGaming. He is a PC gaming fan and strongly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best consoles ever. Still, the PC platform won him over, mainly thanks to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC Graphics Cards.”
Contact: Email
The PC version of Watch_Dogs uses the Dunia engine.
What have you been smoking, my friend? It’s using the new Disrupt engine.
Only Far Cry 2 and 3 used the Dunia engine.
He was joking; since the graphics in the PC version suck, he made it sound as if they were FC2 graphics.
Well, no surprise there. It’s Ubicrap and their CPU-hungry crap engine at work again. It was obvious that they were trying to brute-force PC performance to hide their incompetence at porting PC games.
I mean, an i7 4770 as the recommended CPU? Seriously? That CPU’s raw power is many times greater than the PS4/Xbox One CPU’s.
Not only that, but the game won’t even run properly with an i7 4770K.
Seriously? HAHA! Way to go, Ubicrap. You did it again.
I don’t know, but I guess TotalBiscuit has SLI Titans, 32GB RAM and an Intel 3930K, which is more powerful than a 4770K, and he still can’t run it on Ultra at 1080p/60fps; the game drops to 30fps a lot. You can see the video on his YouTube channel. He’s a nice guy and a fellow PC gamer.
TotalBiscuit talked about 40fps dips with TXAA, not 30.
whatever
You mean it was actually 10fps more? REALLY?
GAME OF THE YEAR!!
I called it…
The game has poor optimisation, period. But “Amir” was exaggerating for sure. There’s a clear difference between 45-50fps dips (and only from time to time) and a game that hits 30fps a lot.
He says that when he uses Fraps for capturing it goes down to the 30s. Even so, the stuttering was there the whole time. “Exaggerating” is a strong word.
Video capturing (with Fraps) is a different story, especially in such a CPU-demanding game. TotalBiscuit has impressive GPU power, but current CPUs aren’t good enough for a rock-solid 60fps in this game.
http://prod.cloud.rockstargames.com/ugc/gta5photo/3516/CtHpwsxwE0qohUf7hA-YpA/0_0.jpg
BTW, this is “Watch Dogs” recreated in GTA V :). I know, this game was ahead of its time.
“but current CPUs aren’t good enough for a rock-solid 60fps in this game.”
Actually, the game is not optimized at all. The 3930K is a beast; how is it not good enough for this game? How? Just how? That’s Ubichimps, who can’t code/optimize/port their games.
It’s not more powerful in next-gen games; it gets the same FPS as an i7 3770/4770, and WD is just badly optimized, that’s all.
Your comment got me thinking – perhaps they ditched x64 support at the last minute or something, in order to come in under the 4GB RAM requirement. The game has trouble streaming assets in and out of memory – it seems like they could have fixed the streaming issue with more RAM usage.
Something funky is going on under the hood of this game that doesn’t make a whole lot of sense. I hope it’s just a bug that can be found/fixed.
Then that’s just sad. It really tells you how Ubicrap feels about PC gaming. If a game they made needs more than one patch just to run properly, then they’re just focking with their consumers.
Here’s the complete playthrough:
https://www.youtube.com/watch?v=smAnN0tz0uU&list=PLROJxSiaUCWesIt-4y_YA9tGs0uM4wQvx
1. It’s an i7 3770 that’s recommended, not an i7 4770.
2. The PS4 version runs at 900p with only 30fps. You don’t need a high-end CPU or GPU to match the PS4 version’s quality. You only need something like a 4th-gen Core i3 and a GTX 760, and that should beat the PS4’s graphics settings.
1. The i7 3770 and 4770 are almost the same; the difference in performance is small. So how come they are recommending such a powerful CPU when the PS4 and Xbox One CPUs can handle running the game?
2. A dual-core? Are you serious? The only dual-core that can run this game at 30fps or more is the Haswell dual-core, and that’s not because the game is optimized for PC but because of that CPU’s architecture. So again, brute forcing.
A 760 with 2GB? Ubicrap said PS4 quality is High. Ask all the readers here whether anyone with only 2GB of VRAM isn’t suffering from stuttering with textures and graphics set to High, and you will find none. I, too, have 2GB of VRAM and the game stutters a focking lot when I drive.
Ubi said that, but playing on PS4 it seems more like Medium settings. A huge difference; just more false advertising to bring PC users over to PS4.
PS4 = FX 4300/R9 260 at most!
I bet the 4930K will perform the same. They f***ed the game. There will be a patch soon, but I’m not so sure how it would fix this mess.
Thanks, John.
I have a Q9550 @ 3.4GHz and an OC’d GTX 670, and the game looks surprisingly beautiful and runs surprisingly well with Temporal SMAA and Blur enabled – I put textures on Medium and LOD on Low and the game still looks fantastic. The game was built to scale over time, as is evidenced by the 4GB RAM requirement (or more) for Ultra settings.
There are some weird lag spikes when it’s streaming assets, but I don’t think that has anything to do with the CPU. It seems they may be able to fix this with a patch.
Textures on Low look atrocious!
Do they? I use Medium and they look fine. I can’t tell the difference between Medium and High while playing. I suspect that setting affects how many high-res textures are loaded, so up close the textures still hold up but distant textures get more and more blurry.
But that may be down to the LOD setting? Ugh, it’s hard to say. The settings aren’t very consistent.
Just take a close look at any advertising poster, the floor and the vegetation with Textures set to Medium. I think that’s no medium at all; that’s like low or very low in any well-developed game. Even Sleeping Dogs has better texture resolution, and it requires less than 1GB of VRAM.
Yeah? I’d like to see a comparison of the two. The Sleeping Dogs I played looked far worse than GTA4 on my PC, and Watch Dogs looks better than GTA4 for me.
No way Sleeping Dogs looks worse than GTA IV, come on.
I mean SD has overall better textures (with high resolution pack enabled) than WD on medium, and VRAM requirements are not as high as WD.
Yeah, SD did have some high-res textures, I remember. Perhaps SD used lots of repeating textures? Watch Dogs has a lot of unique buildings. That could be a factor. And Watch Dogs also has far more advanced shaders than either Sleeping Dogs or GTA.
I base my judgement on more than just textures, tho. “Graphics” are more than just high-res textures to me. GTA5 looks fantastic and all the textures look fairly low-res. Doom 3 looks pretty disgusting up close, but overall the graphics are still pretty amazing to me.
In fact, slapping high-res textures on every surface is often a mistake. Many textures in real life are not super-sharp or bumpy, but rather soft and diffused. I’ve seen people make this mistake a million times in the Skyrim mod scene. They think adding a bump texture to everything makes it look ‘better’. I very much disagree.
Agreed, that’s a good point. But I’m only telling you that, in my opinion, Medium textures in WD look horrible, too low-res to be called “medium” quality.
I think I get what you’re saying. Yeah, some of the textures in Watch Dogs are low-res, but I think they’re done well and I don’t think they detract at all. They put high res textures on stuff that matters, mostly.
The more visual variety a game has, the less sharp the textures will need to be. It’s a pretty direct trade off. Until ALL graphics cards have 64GB of VRAM we’ll not see perfect graphics. Even then I’m sure people will complain!
They are low-res, but only when set to Medium. On High and Ultra I agree with you when you said “I think they’re done well and I don’t think they detract at all. They put high res textures on stuff that matters, mostly.”
On Medium I think the quality is unacceptable for a next-gen game.
Yeah, medium textures do look bad, I’d rather put up with the minor stutter than medium textures. Ultra textures are very good though.
Watch_Dogs™ doesn’t feature more advanced shading than GTA IV. GTA V looks horrible to me, worse than Watch_Dogs™, but Watch_Dogs™ is nothing great either; it could’ve been, but… no.
“Watch_Dogs™”
Hahaha, nice one. It will be a true next-gen game, they said; it won’t be held back by old gen, they said; the PC version is the lead, they said; it will look better than E3 2012, they said… they are lying Ubic3nts, I said.
I know, they lied, and they deserve to be sued just as EA was, but I don’t see anything wrong with spelling it “Watch_Dogs™”… it’s the proper way…
Watch Dogs has a lot more texture variation and a lot more location variation, much larger world. The traffic draw distance in Sleeping Dogs is awful, you can see cars disappear into the depth of field and on the motorway. Mouse control is awful too.
SD has awful DoF as well, you look at the water and it blurs it at short distance. The game itself is good though, nice streets feel alive but just lacks the variation of WD.
I can see cars appearing and disappearing very close by in WD too, even on Ultra LOD (test it yourself). However, it is true that SD is overall worse in draw distance and DoF.
But I’m just talking about texture quality, not the complete tech.
Besides, window and glass reflections, just as in GTA IV, are better in SD than in WD 🙂
But, obviously, WD does many other things better; I’m just trying to understand the VRAM requirements.
That so called HD texture pack isn’t that good, it barely uses 1GB of VRAM and the textures still look plastic and mushy.
Ultra reflections (they’re not that intensive and are much better quality), plus WD does have real-time reflections: the big screens that show ads appear in any nearby reflective surface, e.g. water, and the same goes for animated neon signs.
“I base my judgement on more than just textures, tho. “Graphics” are more than just high-res textures to me. GTA5 looks fantastic and all the textures look fairly low-res. Doom 3 looks pretty disgusting up close, but overall the graphics are still pretty amazing to me.”
^^^ I explained my comment in another post. I guess I could have explained this better from the start, but whatever. The point is that I didn’t like Sleeping Dogs graphics, despite its use of super-sharp high-res textures. I’m starting to think that the real magic comes from shaders or procedurally-generated textures. In Watch Dogs, look up at the buildings and notice how surfaces reflect the color of the sky. It’s really quite brilliant.
What do you mean by repeating Jay’s phrase? Do you agree or not? Because if you do, I don’t know what game you played.
That does look pretty amazing. My only issue is the road texture.
Actually, the more I look at it the more I discover what I don’t like about it. Look at how every surface is flat. The distance haze is not rendered correctly, either – the sky should be light not dark. But that’s minor. The flatness of everything is really what bothers me most.
What’s the matter with the road?
The texture is completely wrong. It looks neat, but roads don’t look like that. Roads (especially wet ones) reflect the light from above. Look at all the blue haze! The road would be a similar shade of blue. They seem to have taken lots of liberty with visual style in this game (haze ain’t blue).
That’s fine if that’s what the creators were going for, but as a 3D artist myself I understand how light behaves on surfaces. Some games get far closer than others.
I think Sleeping Dogs was pretty well optimized. The graphics were really great too.
Well, let’s see.
Sleeping Dogs has supersampling AA + FXAA applied at Very High settings; it’s like playing the game in 4K. Without SSAA it runs very fast on my PC, 120fps at 1680x1050 (GTX 680 OC).
Very high is 2xSSAA, Extreme is 4xSSAA
Textures: someimage(.)com/R904YqP (top: Ultra, bottom: High) – don’t know about Medium or Low.
https://dl.dropboxusercontent.com/u/50147967/Random%20Share/screenshot.79.png
^^^Low LOD, medium textures, high shaders, high shadow – looks fantastic, I think.
Really, I have zero complaints about the graphics in this game. It’s actually quite amazing what my ‘old’ computer can do. Maybe I would be more pissed if I had recently spent $400 on my CPU expecting to max out every game, but I’ve gotten a lot of mileage out of my Q9550!
It does indeed still look nice in that screenshot you shared, so you won’t miss that much.
There are no Low textures! In GamerProfile.xml, Low = Medium in-game.
Yeah, my bad, I meant Medium.
That’s another issue – why are all the settings names fucked up in the configuration file? Why is it so confusing? Another person found they can put “PC” for a lot of the settings. What does that even mean? Watch Dogs is a huge mess for PC graphics tweakers.
“All in all, Watch_Dogs is a mess.”
The mess is because they decided not to give priority to the PC, and to make the game look approximately the same across X1/PS4/PC.
https://www.youtube.com/watch?v=QZjfR9YrWFw
In a word, the game is a pure console port.
You’re an idiot for getting a 4GB card they said. You’ll never need that much video ram they said. Suck it, losers! 😀
They were right – you don’t “need” it.
Whoever said that is an idiot living in the last-gen world of 512x512 textures.
That’s what I thought today while playing FC3.
I’m not sure if they patched it, but when I played Far Cry 3 there was no way to change FOV other than a configuration file. That’s the #1 sign of a shoddy port job.
What was wrong with AC4? I don’t remember any glaring ‘console’ issues with that game.
lol, have you read DSOGaming’s performance article on AC4? That game was a mess when it released. I would say far worse than this.
But for some months now, with Ubi patches and the Nvidia DX11-optimized driver, it runs well. Still artificially CPU limited in some areas, but much smoother and with better framerates.
AC4 didn’t stutter though, nowhere near as badly as Watch Dogs and GTA 4.
Basically, AC4>Watch Dogs>GTA4 from best to worst.
lol, yeah it did stutter, TERRIBLY. It was all over the Ubi forums in the beginning.
And if we are talking about which is most optimized, Watch Dogs beats out AC4 and there really is no question about that, even with the texture streaming issues. The fact alone that in AC4 you get GPU and CPU bottlenecks just from the game code should speak for itself.
Once Watch Dogs gets the texture-streaming stutter bug fixed, it will be miles ahead of AC4 in optimization. Obviously not perfect, but hopefully that can be fixed too.
Nvidia has always had a history of being cheap with video memory; I’m glad it has finally bitten them in the ass.
A very quick answer: Ubisoft doesn’t know how to optimize their games on PC. It’s not the first time we’ve seen them do that (Assassin’s Creed) 😉
Performance-wise, the game runs very well on my PC (i5 3570 @ 4.4GHz + GTX 680 OC + W_D running from an SSD). I’m using High global settings + SMAA at 1680x1050 (pre-rendered frames set to 2), and the game is very fluid: 55-75fps while driving, and 65-85 on foot.
Ultra settings with only high-resolution textures take a big hit; there’s still 60fps on average, but I could see some dips to the 45-50 area (while driving), and there’s also some very minor stuttering from time to time (not a big problem).
BTW, TXAA in this game is extremely demanding; in Nvidia’s Watch Dogs game guide, even 2xTXAA is 8fps slower than 4xMSAA.
Yeah, but MSAA is a lot more demanding on VRAM, and 2xTXAA removes most of the aliasing; 4xTXAA pretty much removes every bit everywhere. MSAA doesn’t, and you need alpha-to-coverage for the alpha textures, which is even more demanding.
And now we are in the same situation we have always wanted to run away from: another messily coded game with average visuals that depends on the raw power of the PC, offering nothing worthy of the hardware it asks for. I miss the days when PC games were well optimized and offered worthy visuals. I believe you, my dear friends and PC gamers, miss those days too.
Dear John Papadopoulos, thank you for your great articles, especially the performance analyses; they are always useful and informative.
Ubisoft, Ubicrap, NoPlay, shitty console port.
Got it running on ultra but still not impressed!
Cool story bro. But yeah, I agree, it’s not graphically impressive.
Someone answer me: is Watch_Dogs™ graphics-mod compatible? I mean, will users be able to make their own ENBSeries for Watch_Dogs™ in the near future?
I am absolutely heartbroken by this news, as I own a GTX 690 and cannot max this game out at 1080p. I mean, it would be bad enough if I could only max the game at 1080p with a framerate below 60fps, but I can’t even have that, as the stuttering is simply unplayable.
I know my card only has 2GB of usable VRAM and is 2 years old, but I only game at 1080p, and most games since owning it have had basic texture details except for a few exceptional ones. Now that games are upping the texture detail, my card may not be enough, which is sad, as I don’t feel it’s been put to good use; now that games come along that I thought would utilise its resources, I find it’s not enough.
I have only been gaming on PC for coming up on 4 years, and up until this point I loved it and was considering another upgrade in, say, 18 months’ time, but this news has made me sick. The GTX 690 is around 3x more powerful than the PS4 and so is my i7. Those processors and 16GB of DDR3 should be enough to run games better than the PS4, at least at 1080p.
Don’t for a second doubt your hardware! It’s unfortunately the worst part of PC gaming: getting shitty console ports. The only way to remedy it is to invest money and time in companies that make well-optimized games (CD Projekt Red, Square Enix). You have to accept that some triple-A games are just going to be shit, because it is what it is. Disappointing, I know.
The Witcher 2 still stutters badly for me; they never fixed it, you can see the loading icon when you go into areas like the Flotsam forest, and the game’s LOD is terrible. I love The Witcher 2, but man, it was terrible on release, and it’s had so many patches that never fixed the LOD objects drawing in right in front of you or the stuttering.
Thanks mate. I am hoping they patch this game, as there is no reason why a GPU with 4.4x the texture fill rate of the PS4 and well over twice the GDDR5 bandwidth should not run the game with Ultra textures at 1080p.
Thanks.
I am very disappointed by this game. Not only does it look nothing like what we were promised (and Ubisoft knew that for the longest time), but even on Eurogamer, when specifically asked about graphics downgrades, the developer said no, there weren’t any, and that the original footage was from the PC version, which he then said looks “stunning”. LOL
It’s what I’ve been saying all along about the Titans. If a game comes along that uses more than 3GB of VRAM, then all the faster cards will be bottlenecked by their VRAM; the Titan won’t be, because of its 6GB of VRAM.
The game stutters regardless, but the VRAM point still stands: you can’t cry about a very fast card that lacks VRAM for a particular setup or settings; you can’t max it out because of that simple fact.
There are a few things you aren’t considering, though. All games use texture streaming and stream textures in when needed, and as long as you have high-bandwidth GDDR5 there shouldn’t be issues.
My PC has a Sandy Bridge i7 around 4x faster than the PS4’s CPU, 16GB RAM and a 690. Now, the 690 has 3.5x the shader compute, 4x the pixel fill rate, 2x the ROPs, 65% faster texture decompression, over twice the GDDR5 bandwidth and 4.4x the texture fill rate of the PS4.
So usually, when a game engine knows a GPU doesn’t have much VRAM, it will simply shrink the texture cache or, put another way, stream textures more often, and as long as you have a texture-pushing monster of a GPU with high memory bandwidth there is no performance loss or graphical glitching.
For example, Metro Last Light’s texture streaming tech is so good that even when I max the game out with every setting on Ultra with 4xSSAA, or run the game at an internal resolution of 4K, I am only using 1.8GB of GDDR5.
Crysis 3 is a game that can fill a 3GB frame buffer, yet I am able to max everything out with no issues, as the game’s engine reduces the size of the texture cache and relies on the fact that I have 400GB/s of memory bandwidth, which is enough to stream textures in without issues.
So there is no reason for my 690 to become memory starved at 1080p or even 2560x1600. Pixel fill rate, texel fill rate, ROPs and GDDR5 bandwidth all play a more important role than the amount of memory. That is, unless you get a shit port like Watch Dogs, where the game is designed around filling as much VRAM as possible to act as a buffer between what is being rendered on screen and what the GPU is working on, to give it more time to render the textures. That was NEVER going to work on my card or on the vast majority of gaming PCs.
Texture streaming is not about bandwidth. The XB1 has low-bandwidth DDR3 memory, and no amount of memory bandwidth will help if the streaming is set up wrong.
You still need decent bandwidth to stream textures into the game world. Go look at the Digital Foundry framerate video of Dead Rising 3, either on YouTube or Eurogamer, and see for yourself how long textures take to stream into the game world.
Or look at any older Unreal Engine 3 game on last-gen consoles; you will see texture streaming issues. If you have higher-bandwidth GDDR5 and lots of texture mapping units, then you shouldn’t see this effect.
I think Watch Dogs on PC has very poor asset streaming, which is causing these issues.
Sad indeed. There is an article saying that porting from console to Mantle, and from Mantle to DX, is easier than porting from console to DX or from DX to Mantle, so devs should start using Mantle and fix those issues.
There is a performance patch incoming per Ubisoft. It will make the game run at 60fps, make it look exactly like the E3 demo, make Unicorns real and make hot chicks want to watch you play pc games. Ubisoft said it, you better believe it!
ooh… Unicorns ? definitely want that.
Far Cry 3 was NOT a good PC port. It had some signs of “we tried” in it like a FOV slider and borderless windowed mode but the actual graphics settings were ridiculous. There was barely any difference other than draw distance and yet maxed settings barely ran on my PC while I could get 100+ fps with low settings. I repeat, barely any visual difference but HUGE performance impacts all around.
The game also had a clear downgrade for consoles and the original fidelity was removed rather than kept in the PC version. Also let’s not forget the broken mess of a multiplayer mode, no dedicated servers, strange lag compensation, bugs everywhere. Far Cry 3 got off the hook for being open world and “not as bad as other games right now”.
I get microstutter near water. Also, aiming down the sights feels weird, like the game zooms in and lowers the mouse sensitivity while doing it; it feels consolish.
What stands out in this game the most is the locations, they’re full of detail and massive. The ultra textures are really nice, lots of variation of textures on buildings and everywhere, grass LOD is really good as well, not just a grass texture but grass everywhere that draws far. I was impressed by a few new locations I found last night, one was the cemetery and the other was the docks, the docks is full of detail and very large. Traffic draw distance is not great, you can see cars disappearing and the train going across the bridge disappears 3/4 across the bridge, not as bad as Just Cause 2 or Sleeping Dogs though.
I managed to get rid of most of the stutter and have nice mouse input:
1920×1080
High settings
HBAO+ (Low)
GPU Max Buffer Frames 2
2xTXAA
Smoothness set to 0 in GamerProfile.xml
Having it on my new SSD probably helps as well. Some Ultra settings like shadows and textures, plus HBAO+ and TXAA, really make this game shine; getting rid of the jaggies in this game makes a big difference, assuming you have the VRAM, like 3GB+.
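For reference, here is a rough sketch of how those tweaks might look inside GamerProfile.xml (usually found under Documents\My Games\Watch_Dogs). I’m writing the attribute names from memory, so treat every name below as an assumption and match it against whatever your own file actually contains; back the file up before editing:

<!-- Illustrative sketch only: the attribute names here are assumptions and may not
     match your GamerProfile.xml exactly; keep a backup before changing anything. -->
<RenderProfile Quality="high"
               AntiAliasing="TXAA2X"
               MaxPrerenderedFrames="2"
               Smoothness="0" />

The idea is simply that the in-game options (High preset, 2xTXAA, GPU Max Buffer Frames 2) map to attributes in this file, and the smoothness tweak is the one that has no in-game slider.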
I wouldn’t have expected use of TXAA from you.
Here are Nvidia’s comparison images:
http://international.download.nvidia.com/geforce-com/international/images/watch-dogs/watch-dogs-anti-aliasing-01-19×10-2xtxaa.png
http://international.download.nvidia.com/geforce-com/international/images/watch-dogs/watch-dogs-anti-aliasing-01-19×10-smaa.png
I would take the 25fps more (90fps vs 115fps) that SMAA gives me any day; it even looks better in some spots, like the mountain range in the background. Also, less blurring.
I don’t usually like it, but it does do a good job in this game. SMAA just has aliasing and crawling everywhere; even temporal SMAA isn’t as good as 2xTXAA. It’s in moving images where TXAA counts, not so much static ones.
What people forget is that when the image moves, all that image quality breaks down compared to a nice static shot; it’s not a good comparison.
I have a theory about what happened to WD:
Back when they were just starting to develop WD, they were going for the “real” next-gen feel, and that’s what we saw in the E3 2012 reveal: a game shimmering with next-gen graphics. Then came the PS4/X1. Ubicrap tried to port WD to the consoles and shit hit the fan. The consoles couldn’t run the game with that kind of visual fidelity, so they had to shift the focus to developing the game for the consoles – Ubicrap themselves said they shifted development midway. Because of that unexpected situation, development took longer and compromises had to be made so it would run on the consoles, and that’s where the delay and the downgrade came from.
Now we have a shitty “next-gen” game with shit performance on PC, and we all live happily ever after.
That PS4 14-minute demo from 8 months ago looks like the finished game now; only the PC version looks better than that, and it got 60,000 likes. The game seems to micro stutter in it as well, like it does now.
Huh? I don’t get what you’re trying to say.
On YouTube, “PS4 – Watch Dogs Gameplay Demo (14 minutes)” from 8 months ago looks almost identical to what we have now on PC with higher settings, so why are people crying?
If you’re directing your question at me: I’m not talking about the “PS4 Watch Dogs Gameplay Demo.” I’m talking about what may have happened to WD after the E3 2012 reveal and why, despite Ubicrap claiming that WD was developed mainly for PC, it runs like shit on PC.
Forget E3 2012; the point is that they showed gameplay that looked identical to the release version over 8 months ago and no one seemed to complain; the video even shows micro stuttering the same way as now.
Ahh… so that was your point. I think the answer is that the majority of console gamers don’t really delve that deep into things like this, because a big percentage of the games released on their platform don’t have the problems we do on PC, like stuttering, low FPS, crashes, etc. And even if some games’ framerates do drop to 24, most of them wouldn’t notice anyway.
What do you mean? Are you telling me you’re OK with the game? That the devs saw the game STUTTERING 8 months ago, and after all this time they deliver this mess, and you’re OK with that?
Of course I’m not OK with that, but I’m just saying no one complained about it, nor did they complain about the graphics in the video.
I get hardly any stuttering on a mid-range system; it virtually never drops below 30fps at 1080p, High, 2xTXAA, and it’s always between 30-40fps driving around everywhere. My settings exceed the PS4’s, and in hardware terms mine is 2 years old: a GTX 660 and an FX 6300 @ 4.2GHz.
The consoles are not responsible for anything. The game has ridiculous requirements on the PC and it’s not very impressive visually. It’s obvious Ubisoft couldn’t optimize the game for PC; on consoles it runs just fine and looks almost like the PC version.
“on consoles it runs just fine and looks almost like the PC version.”
Get a PC on par with the consoles and run the game at console settings; ta-da, console “optimization” achieved.
+1, as it always is. Too bad the stupid F*CKER console sheep don’t get it, huh!
My 6-year-old AMD CPU and a mere $168 GTX 660 run the game better than the PS4 does.
This “next gen” is a freaking joke.
Yeah, it really is. So pathetically weak that those clowns Sony/MS shouldn’t bother selling crap like that and calling it next-gen. What’s even worse are the drones buying that garbage, huh!
First time in history that they are so far behind the PC that it’s not even funny. They are totally obsolete from day one!
Yeah, keep telling yourself that.
Even a Phenom X6 is close to a Pentium Dual Core in terms of performance in this game.
You think it runs better; that is the difference.
How do I “think” it runs better when I hit 40fps?
So what if it hits 40fps, if it can’t sustain that and constantly dips below 28fps.
I haven’t dipped below 28fps, but there is a bug with textures that causes stutter while driving; Ubisoft will fix it, they know about the issue. PC performance will improve with drivers, patches and new hardware; console performance will not.
I’d blame the X360/PS3 for that more. In any case, as another user pointed out, the lighting can be similar to E3 if you change an option in the config file from “console” to “pc”. It’s obvious that something got messed up because of the consoles, but the optimization is still crap.
Can you tell me which setting that is? I wanna see.
http://www.reddit.com/r/watch_dogs/comments/26c3ew/deferredfxqualityconsole/
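For anyone who doesn’t want to click through: the tweak discussed in that thread is reportedly a single value change in GamerProfile.xml. Going by the thread title, the attribute is DeferredFxQuality; the exact casing and placement below are my guess, so verify against your own file (and back it up first):

<!-- Reported tweak; only the DeferredFxQuality value is what the thread is about. -->
DeferredFxQuality="console"   <!-- stock value people report finding -->
DeferredFxQuality="pc"        <!-- the value people are switching it to -->

No promises it actually brings back the E3 lighting, but it’s a cheap thing to try and easy to revert.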
Part of the issue seems to be a VRAM limitation with High textures and High settings; it could be why the PS4 is only 900p. Also, HBAO+ seems to cause more stuttering, so it’s probably pushing right up to and over 2GB of VRAM.
Hopefully Ubisoft can find a way of optimising the VRAM usage, because that’s the main issue.
I run the game on my 955 and GTX 660 better than the PS4 does, but when I drive fast the screen freezes for a second or two; to fix it I lowered the texture resolution. This is a huge issue that Ubisoft needs to fix.
Anyway, the game is overall average, not particularly good or bad. It’s worth 40 bucks, I wouldn’t go higher than that. 8/10; 7.5/10 for the crappy PC version.
Also, the PC is supposedly the lead version, yet while aiming the mouse sensitivity slows down like you are on a console. There is acceleration too… Hey, it’s because the PC version was the lead version, am I right? That’s why the game controls and runs like a crappy console port.
Yeah, the controls are laggy, but you can fix that by setting GPU Max Buffer Frames to 2 (5 is supposed to help minimise the driving lag) and setting smoothness to 0 in the XML file. BTW, reflections on Ultra and HBAO+ make the game look much better, not flat like in some shots.
The game can be fun though, just as much as GTA.
Here’s where the PC version excels.
http://i61.tinypic.com/qpp1lt.png
http://i62.tinypic.com/2m7wubm.png
Sean, I’m playing the W_D PC version and it still looks washed out compared even to GTA V on PS3.
Scroll down to the bottom of that page and see for yourself what was possible on PS3. W_D on PC runs badly even on high-end hardware, and it looks average at best (although there are a few nice details, like the water and grass).
http://forums.guru3d.com/showthread.php?t=352937&page=68
And why is the game on the PC a console port??
Is it because PC is such a superior platform??
uhhhhh…yep…damn glorious superior platform. PS4 = GTX 740 at most!
The GPU in the PS4 is between a HD 7850 and a HD 7870.
And you actually believe that? Damn Sony fanboys!
So you are an idiot. It figures.
You don’t have to believe anything; you can look at the GPU specs yourself.
Kid, why get angry so fast?
What specs? AMD didn’t release them, and Sony is well known for claiming Watch Dogs would be 1080p/60FPS.
Well, both of them are suspect. But there are already some benchmarks that suggest it’s at most a GTX 750.
So now you’re also being stupid; I was talking about the specs of the APU in the PS4.
Show me those benchmarks, because I am certain they don’t exist, but I want to see how far your stupidity goes.
Angry fanboy kid.
Is there a better benchmark than exclusive games like Infamous and Killzone, with such ugly low-res textures, almost no life (humans, animals), 30FPS, and yet few advantages over last-gen?
A 7850 could easily run Watch Dogs at medium at 1080p/45FPS.
https://www.youtube.com/watch?v=7cKWqjgok5E
As you saw, it’s even more beautiful than on PS4. So with a good GPU like that, why is the PS4 version so ugly?
Someone could say that they are still learning how to develop for the consoles, but that is even more bullshit!
It’s simple… just another shitty lie from the almighty Sony/MS.
Man, the game looks more or less the same on all systems; it’s pathetic.
Damn right, and they ruin their own reputation too.
Psst, Watch Dogs is a true next-gen title, no compromises due to last gen.
Watch Dogs will look better than E3 2012 on PC.
Watch Dogs’ lead platform is PC; that’s why it controls like a crappy console port, with aiming sensitivity that has acceleration and a different sensitivity between aiming and looking around (i.e. more like a controller than a mouse).
Whatever Ubisoft shows at E3 for Far Cry 4, I’ll take it with a mountain of salt.
You can get the PC version on custom Ultra at 1080p with SMAA, and that’s where it destroys the others, even on a GTX 660. Trust me, there is more detail than these crap videos are showing.
Some people just don’t understand what they’re even seeing, the differences are more than just skin deep.
What do you mean, custom?
I.e. not everything on Ultra; just water and reflections on Ultra, with HBAO+. It seems the reflections on Ultra make quite a difference.
Comparing graphics on YouTube should be a crime punishable by death, especially when the video doesn’t even show any difference between 720p and 1080p. 30fps vs 60fps also can’t be seen and is rarely talked about enough.
Yep, not only is the game badly optimised, it needs raw performance, something AMD GPUs don’t have over NVIDIA here. It seems strange, though, that a GTX 770 beats an R9 290 on High settings; the R9 290 beats the GTX 770 on Ultra, but not by much. :/
AMD’s GPUs have more raw performance than even a GK110 GPU. Just look at the compute capabilities of a full-blown FirePro W9100; it has almost twice the double-precision performance of a Quadro K6000.
AMD GPUs are beasts in compute and GPGPU tasks.
The R9 290 is typically 25-40% faster than a GTX 770 across the board.
The problem is not AMD GPUs; the problem is the game and the way it was optimized.
Yeah, I know exactly what you mean. I think Rockstar are just great at capturing an open world in all its detail, and Watch Dogs lacks that special something R* has. That said, there are some nice locations in Watch Dogs, but like I said, it probably lacks that “special sauce” GTA has.
Despite how p*ssed off I was about GTA4 on the PC with its poor optimisation, the PC version did have more detail and that attention to detail which WD somewhat lacks.
But who knows, Sean; a few months from now we could see graphics mods that make W_D look like E3 2012?
Sean, do you remember what site your PS4 vs PC comparison comes from? On the PC side the lighting in W_D already looks better. I’m using Ultra details but it doesn’t look that bright.
Eurogamer – Digital Foundry. WatchDogs Face Off
Youtube Digital Foundry channel, those shots are from their video comparison.
I still think people should give it a chance; the screenshots don’t do it justice, because it’s a lot better in motion and while the dynamics are at work. There are a lot of things people don’t explain well when they compare it to older games.
Thanks Sean. I know how Watch Dogs looks on my PC, but in your pictures the PC version looks much better compared to the PS4 version :). Maybe this difference in lighting just comes from different “time of day” settings. But it only shows that with better lighting mods this game will shine :). For now I like how they made the grass, water and buildings (there are even “fake” interiors, like in Gran Turismo; it looks great), but I would like to see mods for better lighting, textures and car shaders (car shading looks so much better in GTA V, not to mention iCEnhancer for GTA IV). I really hope mods will bring this game to E3 2012 quality 🙂
Hell might freeze over and we might get a good patch from Ubisoft, but we are getting PhysX features in a patch, I read.
Every time I see GTA V graphics I just facepalm; how come people actually defend them? OMG, they’re just plain horrible. Watch_Dogs™ is nothing great either, but GTA V… OMG, I’m tired of all of you GTA fanboys.
This game shines on custom Ultra, and you can play at 1080p/30fps on a GTX 660. I’ve seen some strange videos where people show bad destruction when WD isn’t on Ultra; all the effects are there, you can smash cars to pieces just the same and the pieces of the car stay on the ground, same with world objects.
I’m doing more testing just to see how dynamic the world really is.
The destruction is how it is. The settings don’t have a decisive impact in the end; it doesn’t matter whether you are on Low or High, when you hit a wall it still has to leave a mark. It’s strange to see a game launched in 2008 that has better environment and car destruction than a so-called next-gen title, but that’s how it is.
I don’t understand why you defend WD; it’s clearly an example of how a next-gen game shouldn’t be.
I’m not defending it, but people shouldn’t put up videos of a few missing effects that might be bugs. WD has plenty of dynamic effects for an open-world game; you can’t just pick a few bad ones and say it’s a 2008 game.
The PS4 runs WD at High settings at least.
The game looks bad in that clip; the textures are quite washed out, you can clearly see AA is turned off, and the resolution he is playing at is 1366x768. So you just made a fool of yourself.
The game doesn’t look or play better in that clip than it does on the PS4. For WD to look better than it does on the PS4 you need a 780 or above plus an i7, plain and simple.
It’s simple: you clearly don’t know what you are talking about.
Innocent angry kid.
Who said High? That same Ubi that said 1080p/60FPS on consoles in April?
Did you at least play it on PS4? It’s not near High on PC; it’s at most Medium (and I even doubt that!). And he is playing at HD resolution because he wants to be close to 60FPS.
That’s exactly the opposite of what devs do on PS4/XO: cut the FPS in half and gain resolution.
But after you said that it needs a 780 “to look better” (a 770 does 30FPS at max settings), anyone would bet that you’re an angry Sony fanboy.
Do you at least have a high-end PC or a next-gen console?
And “The Order” got its delay; bet what… 900p/30FPS… lmao!
https://www.youtube.com/watch?v=QZjfR9YrWFw
And a 7870 running at 900p (Ultra) or 1080p (High):
https://www.youtube.com/watch?v=wtQn4abLa18
https://www.youtube.com/watch?v=jgLgLsMU5Hs
High? The PS4 = low-to-medium textures / an R9 260.
You keep coming back for more even after you lied. As I said, you are not very smart.
Ubisoft did not say the game would run at 1080p/60fps; Sony did, and they retracted that inaccurate information from their site just a few hours later. Ubisoft said the PS4 version runs pretty much at High settings at 1920×900; compared with the assumptions of a liar, I would take their word over yours any day.
The GTX 770 can’t max the game at a playable 60fps (not even the 780 Ti is able to do that), and even so you get a lot of stuttering, which is quite bad.
I played the game on my friend’s PS4 and it looked more than OK on a 1080p big-screen TV, and the game performs as expected, no frame drops whatsoever. I also played the game on a GTX 660 + i5 2500K system at medium-high settings at 1080p, and it doesn’t look or perform better than the PS4 version; the stutter is very annoying.
The clips you posted just prove you don’t know what you are talking about. It’s annoying that I have to repeat myself, but it’s not my fault.
The first clip just shows the game looks much, much better on the new consoles compared with the old ones.
The second one runs at 1600×900 (lower than the PS4 resolution) at custom High+Ultra settings and it performs very badly; the fps are all over the place. Is this a good example in your book? And the last one is even worse: the game dips even to 0.0 fps and a lot of the time to 8 or 9 fps; that is clearly unplayable.
You clearly can’t prove your point at all.
Not-so-smart, repetitive Sony fanboy kid… lmao.
UBI + SONY = liars…
Damn, did you at least read half of Papa’s analysis?
Stuttering = bad CPU-side optimization.
A 7870 is capable of running at 1080p/30FPS with High textures but Ultra settings. The PS4 can’t.
I did say a GTX 770/280X gets 35FPS at 1080p max settings.
“My friend’s PS4”… sure, we’re all hearing about it from a Sony fanboy. lol
The PS4 and XO got all the attention from Ubi; that’s why both run better.
A patch and new NVIDIA/AMD driver updates, and there will be no more stuttering on PC, and even better performance.
The second video stated an average of 31FPS at 1080p on Ultra settings, not 0 (bad CPU optimization).
My point: PS4 = R9 260 lmao.
Don’t get me wrong, I will eventually buy a PS4 to play TLOU (at a decent resolution now), The Order, and a few other great exclusives.
But still, Sony put an entry-to-mid-range PC build inside the PS4.
Again, you don’t know what you are talking about.
The CPU is not the problem; people with i7 4960Ks and GTX 780 Tis still complain they have stutter and input lag, so the CPU clearly is not the problem.
Also Ubisoft stated that WD is a next gen game and that PC is the lead platform.
You asked if I played the game on the PS4 and the answer is YES, so what’s your problem??
First you say PS4 = GTX 740 and now PS4 = R9 260 (which is actually an R7; the R9 is the OEM version).
So what do you make of this, fanboy?? I’m just wasting my time; you clearly don’t deserve any attention.
The game stutters and I can’t maintain constant fps on a GTX 660, but it runs just fine on an HD 7870. Keep those useless YT clips coming.
Quite the contrary: even a beast of a CPU can’t overcome this much bad optimization. Even an i5 isn’t being used 50% (equally) on all cores, and my GPU sits at 87% because nothing feeds it in time.
The PC will not be the lead platform until it sells more copies than the consoles, so here you are being completely naive about what Ubi says.
Ubi said this, Ubi said that; are you naive enough to believe what a recognized liar of a corporation says?!
I didn’t mean the GTX 740 literally (not yet released) but a comparable AMD GPU (I’ve never bought an AMD GPU), R7, R9, whatever.
Fanboy = no arguments, but still “my platform (PS, Xbox, PC) is better”…
I recommend you read this:
http://www.dsogaming.com/pc-performance-analyses/watch_dogs-pc-performance-analysis-part-2-high-end-system/
What great optimization for a lead platform… lmao… FANBOY!
You said the lag and stutter happen because of the CPU, and I simply pointed out that these things happen even on the fastest CPUs. If the CPU optimization is not good enough you should get lower fps, not necessarily stutter and input lag; that problem comes from the video card.
Here is an official source.
http://www.pcgamer.com/2013/02/27/watch-dogs-is-targeting-pc-as-lead-platform/
You saying Ubisoft is lying is your problem, because you have absolutely no proof of that.
You did say PS4 = GTX 740; can’t you read and remember your own words?? The GT 740 is just a rebranded GT 640 with 384 CUDA cores, a 128-bit bus and GDDR5 memory. Compare that even to the APU in the XB1 and you will see how wrong you are.
The GPU in the XB1 is equal to an R7 260 in specs, but the one in the PS4 isn’t.
I also gave you plenty of arguments, so many that you chose to ignore some of them, and I pointed out the many mistakes in your attempts to justify what you are saying.
So what if the PC was the lead platform; does that automatically mean the optimization on PC can’t be bad?? This is Ubisoft; they have a track record of bad PC ports.
Sony fanboy kid with foolish arguments.
Really?
Are you still defending Ubi for lying that PC was the lead platform (liar)?
Arguments? You didn’t say one that matters… only Ubi said it (so it’s true), Sony said that, AMD stated the specs… liars, liars, liars.
“PS4 = GTX 740 at most!” As you can read again, it was stated as a comparable AMD GPU, since we all know AMD built all the “next-gen” hardware since 2013.
What great Google research here: “The GT 740 is just a rebranded GT 640 with 384 CUDA cores, a 128-bit bus and GDDR5 memory.” lmao
You are just claiming that consoles can run WD because they are strong and the PC is weak and limited… fanboy.
Do yourself a favor and go back to the NeoGAF console forums. I ain’t a fanboy, so I won’t blame the PS4/XO for fanboys’ lame acts.
PC GLORIOUS MASTER RACE… deal with it, dear peasant.
What great optimization for a lead platform… hahaha!
http://gamegpu.ru/images/remote/http–www.gamegpu.ru-images-stories-Test_GPU-Action-Watch_Dogs-test_new-3840.jpg
Nope, the PS4 is in no way medium to low; you can tell from the PS4 videos that it uses High textures, because there is a big difference between Medium and High textures. It definitely uses High settings.
Yep… “High textures” at 6:21, 12:28 and 26:42, lmao!
https://www.youtube.com/watch?v=qApg9QAO_tA
“Sony fanboy kid with foolish arguments.”
You should look at your “arguments” first.
“Are you still defending Ubi for lying that PC was the lead platform (liar)?”
I’m not defending them; I simply pointed out your misinformation and lies.
“Arguments? You didn’t say one that matters… only Ubi said it (so it’s true), Sony said that, AMD stated the specs… liars, liars, liars.”
And you call me a kid. How old are you? Like 10, at most.
“‘PS4 = GTX 740 at most!’ As you can read again, it was stated as a comparable AMD GPU, since we all know AMD built all the ‘next-gen’ hardware since 2013.”
There are a lot of graphics cards from 2010 or 2011 that are faster than a GT 740. The GT 740 is not comparable to any of the new consoles.
“What great Google research here: ‘The GT 740 is just a rebranded GT 640 with 384 CUDA cores, a 128-bit bus and GDDR5 memory.’ lmao”
Well, that is the truth, if you ever bothered to read an article about the GT 740. Now compare it to the GPU in the XB1, which has over 700 shaders, more ROPs, etc. The GPU in the XB1 is at least 2 times faster.
“You are just claiming that consoles can run WD because they are strong and the PC is weak and limited… fanboy.”
Where did I say that, kid?? You sound more and more like a 10-year-old.
“PC GLORIOUS MASTER RACE… deal with it, dear peasant.”
Yeah, peasants do talk like this.
Uhh, why are you using a GTX 690 for a low/medium build?
I ran it on Low at 30FPS on:
i5-2310 @ 2.9GHz
64-bit Windows 7
8GB RAM
AMD Radeon HD 6670 1GB GDDR5