Watch_Dogs 2 To Be Optimized For AMD’s GPUs, Will Support DX12

It appears that Ubisoft has switched sides and is now working closely with AMD in order to optimize its PC games. During AMD’s Capsaicin event, Ubisoft revealed that AMD’s DirectX support is what makes its future games look better, like the upcoming Watch_Dogs 2.

Although Ubisoft did not reveal whether Watch_Dogs 2 will support DX12, AMD’s Roy Taylor basically confirmed it with the following tweet. Not only that, but from the sounds of it, Ubisoft’s games will be better optimized on AMD’s GPUs.

What we find really interesting here is Ubisoft’s stance towards the two GPU vendors. Back when Assassin’s Creed: Unity was announced, Ubisoft and NVIDIA teased a tessellation patch for it. As you may know, this patch was never released.

What’s also interesting is that the first Watch_Dogs game was a GameWorks title. And from the looks of it, Watch_Dogs 2 will be an AMD Gaming Evolved game.

For what it’s worth, Massive Entertainment took advantage of NVIDIA’s GameWorks features in The Division.

All in all, this is a pretty interesting thing to note regarding Ubisoft’s latest – and upcoming – titles!

65 thoughts on “Watch_Dogs 2 To Be Optimized For AMD’s GPUs, Will Support DX12”

  1. As an NVIDIA user, it’s a good thing when a game is optimised for AMD: it works well on everything (Crysis 3, BF4, Tomb Raider…),
    while GameWorks games tend to run badly even on NVIDIA hardware most of the time.

    1. Yeah, that’s right, and the crazy thing is: the more they say “AMD optimized”, the better the game will run on nVidia hardware too.
      I’ve been an nVidia fanboy from time to time, but every damn time one gets let down by GameWorks. So yeah, it may look good, but it always runs like a f*ckn turd!

      Everyone should be extremely pleased with AMD’s new GPUOpen. That will give those greedy nVidia folks something to think about. And it already seems to have had some effect: didn’t nVidia just open up GameWorks to some degree, if I remember correctly?

      This is good, go AMD. nVidia surely needs the competition, or else we’re all f*ckd, as they have more than 80% of the market. It’s crazy :/

    2. Same old AMD PR BS as always? Every AMD-sponsored game runs best on both?
      -Tomb Raider
      -Dirt Showdown
      -Dirt Rally
      -Hitman Absolution
      -Hitman 2016
      -Ashes of the Singularity

      None of these are anywhere near equally optimized for AMD and Nvidia.

      And Far Cry 4 and The Division work without problems on AMD.

      1. Far Cry 4 works better on AMD… yet it still stutters like hell. Sadly, FC Primal and The Division seem to be the exception rather than the rule. Also, nobody said that AMD-optimised games ran the same on Nvidia; it’s obvious there has to be some difference. But they often are better optimised (as in lower sys reqs) and run more stably. The recent DX12 games, for me, show only that 1. AMD architecture is better suited for it, and 2. it’s still fairly recent tech that needs improvement (it’s been said time and again that DX12 is much harder to code for than DX11).

    3. “While gameworks games tend to work bad even on nvidia hardware most of the time”

      Well, yeah, some of those features, such as VXAO, smoke and fluid physics, and HFTS, are performance-heavy because they’re very advanced. Of course there’s a performance drop; they’re meant for people with high-end hardware who want that extra option for better graphics.

      1. The thing is, is it worth it though? In Arkham Origins/City I barely noticed when PhysX was on, because I was focused on Batman not getting beat up LOL! And when I did notice it, it was kinda distracting. That’s just IMO though. As for HBAO+, on my actually pretty mid-range (?) rig (R9 270X with an i7 4770), I just experience drops in Far Cry 4, so if VXAO is around a 5 FPS drop, I can live with that. But are those really for the high end only? I mean, I was really surprised to see that SVOTI in CryEngine actually IMPROVES performance! On the other hand, and assuming the performance hit is unavoidable, it really isn’t justified when less powerful cards outperform the older but more powerful ones (say, the 960 getting better FPS than the 780).

      2. How do you know?
        You can’t access the source code of almost any GameWorks effect (FaceWorks is the only effect on GitHub, and that effect was never even used in any game), so how do you know what is computed, and how?
        How do you know it is demanding?
        You know nothing, and that is the very point behind the GameWorks black box.

        1. HairWorks is on GitHub too. So now that you have access to this tech, you could look at the source code and show us where it is unoptimised. But I think you will not do it. Talking bullshit is much easier.

  2. It’s all fine, these games siding with AMD on DX12, but if a game isn’t taking advantage of DX12 properly, AMD are going to have egg on their face like with Hitman. DX12 requires a lot more work and responsibility, and it will be years before we see the real advantages of DX12 other than Async Compute. At the moment, AMD are selling DX12 on Async, while NVIDIA are not bothered about DX12 because their architecture doesn’t suit it; NVIDIA will just play the long game with DX11 until they get real benefits from DX12.

    NVIDIA users haven’t got anything to worry about; true gains from DX12 will come in the form of proper DX12 games, not these patched, half-baked DX12 games where DX11 outperforms it. AMD users will get the biggest gains from DX12 Async initially.

    1. While you raise good points, it would be fair to say that NVIDIA got caught with their pants down. Not only that, but their Pascal architecture was most likely ready when the async bandwagon arrived, so they have no other choice but to “play along”, like you said. I’m pretty sure they really do care about that feature, because it is a “smart” feature and it should have been there for years. I’m a green-camp boy, and I’m glad to see that AMD played a big part in changing the way DX handles game engines. But like you said, REAL DX12 implementation and real benefits (not 5 FPS or 3–5%) are to be seen in the next years, not the next months. God, DX11 was a mess for sooooooo long.

      1. Yes, but DX11 isn’t going anywhere yet. NVIDIA know this, and it’s one reason why Maxwell 2 is pure DX11 and their Async support is meh; they went for efficiency over hardware Async, which was right, because they have had 2 years of Maxwell 2 before DX12. Maxwell 2 has plenty of life left in DX11 games, probably another 2–3 years, while games get properly developed for DX12.

  3. DX12 makes games look better on all cards, not just AMD. But Ubisoft devs are stupid, that’s why they think DX12 will make games look better on AMD cards, while everyone knows they will look best on Nvidia.

    1. Graphics-wise, everything that can be done with DX12 can also be done in DX11. DX12 just gives more control to the developer and has reduced CPU overhead, due to a much thinner layer between the driver and the hardware.

  4. I showed off this tweet in a previous article. However, this does not mean Watch Dogs 2 will be an AMD Gaming Evolved game. All it means is that it will be better optimized for AMD hardware, since it will most likely use GameWorks effects that are now open for AMD to optimize for.

    But at the same time, it would not surprise me if the game is an AMD title, since Nvidia took RotTR from AMD.

    But the only problem I have with these statements from Roy is what I take from devs like Stardock/Oxide, who claim that using low-level APIs is no walk in the park. Meaning, if Ubisoft has a hard time on PC with DX11 optimization for both AMD and Nvidia hardware, how will they fare with DX12, for example? I hope it’s not PR hype, because it will really make AMD look at fault alongside Ubisoft, just like people blame Nvidia when Ubisoft makes a garbage port.

    Well, it will be fun to see how this goes down. I just don’t think Ubisoft is ready for a low-level API.

    1. Just look at Hitman, it runs like crap in DX11 on ultra, so DX12 isn’t going to fix garbage games, just like NVIDIA sided with Ubisoft and their poorly optimised titles. Ubisoft are primarily console devs, and siding with AMD isn’t going to change that, though their GCN optimisations from consoles might favour AMD, as with The Division and Far Cry Primal.

      1. Well, even with Ubisoft’s backlash, their last four games have had decent performance, other than some multi-GPU issues here and there. But I still don’t think AMD landed Watch Dogs 2. It would not be a good marketing move after how bad the first one was.

        1. But what if their own engineers help them with the proper implementation? Nvidia did that to place their GameWorks stuff, so AMD can do it to ensure the game runs well. WD was a good game, but it suffered because of its crappy performance and the downgrade (thanks for being underwhelming, consoles).

      2. SHOCKED!!
        And that comes from an UBI PR W**RE!!

        Now we really know what UBI are; their PR DIPSHITS are running berserk 😉
        No love, no future… UBI are sinking fast!

        1. No, you only say that because I actually like The Division (which is better than I thought after playing it), not Ubisoft the company. The game was made by Ubisoft Massive, a studio that was a PC dev before Ubisoft bought them. Ironic that The Division is one of Ubisoft’s better-optimised games and runs well on both AMD and NVIDIA; Massive probably having a PC-centric background helped with that.

          People like you can’t tell the difference between someone liking a game and not liking the company. Credit where credit is due, something you should learn about.

          1. Yet multi-GPU setups are still having stuttering issues. I hope this gets fixed fast. XD

          2. For me, multi-GPU is dead. I really don’t remember the last game that had good support.

          3. What? The only recent game that didn’t was JC3. Every game I play seems to have good SLI support, maybe not initially, but after a short while.

          4. Exactly, and there won’t be a single card that can run 4K for a good few years.

            My 980 Ti SLI does an excellent job at 4K, 95% maxed out in most games. Plus, I’m sure that SLI Pascal cards will offer another 15 to 20 FPS when they launch.

            If you want to game at 4K with high settings, SLI is the only way to go for now and the near future.

  5. How about supporting Vulkan for a change, and Nvidia as well? I’m so tired of buying into a new GPU and getting screwed over by the devs every few years. It just points to the fact that I should flat out drop gaming, because I’m never going to experience a good number of good AAA titles on any GPU the way things are going.

  6. Ubisoft should probably work better on optimization, rather than jumping from one GPU vendor to another. But hey, anything is better than Satanworks, so I won’t complain.

    Back when Assassin’s Creed: Unity was announced, Ubisoft and NVIDIA teased a tessellation patch for it. As you may know, this patch was never released.

    The performance was bad as it was. Not to mention that tessellation applied to every slate on the roofs was a bad idea to begin with.

    1. I just loaded up Unity. It runs a lot better now than it did on release. It only still has slight stuttering when attacking mobs once in a while for an insta-kill; it might have to do with loading, since I was using my mechanical drive.

      Either way, Ubisoft’s last four games have been pretty decent on performance, other than multi-GPU support as a whole.

      However, it will be interesting to see if AMD did nab this game. WD1 got so much h e l l that I don’t think, marketing-wise, it would be a good game to bundle with cards, just because WD1 was such a letdown for people both visually and performance-wise, with its awful stuttering issues even on flagship hardware at the time.

  7. DX12 games with support of AMD:
    – Hitman
    – Total War Warhammer
    – Watch Dogs 2
    – Deus Ex Mankind Divided
    – Ashes of the Singularity

    DX12 games with support of Nvidia
    – Rise of the Tomb Raider
    – Gears of War Ultimate

        1. Yup, everybody thinks it’s DX12, but it’s not 😉
          It’s DX12 at feature level 11_3, a so-called bridge to ease the transition to real DX12 in the future.
          For real DX12 games we need to wait… 2017, IMO.

  8. “Ubisoft revealed that AMD’s DirectX support is what makes its future games look better, like the upcoming Watch_Dogs 2.”

    There are so many things wrong with that statement.

    “Ubisoft revealed that”

    ” AMD’s DirectX support is what makes its future games look better,”

    “like the upcoming Watch_Dogs 2”

  9. Awesome!
    So no extreme tessellation to gimp older (NVIDIA) cards!
    Not too many smoke particles (that you can’t even see, NVIDIA) to make the benchmarks favor the 980 Ti / TITAN X.
    Good, I am happy about this news!

  10. Tessellation is worthless crap these days. It was good when models were made of 100 polygons, lol! No wonder that patch never came out.

    As for this AMD/Ubi performance stuff, I guess they are trying to say the title will actually play well on AMD hardware. But we could also see a big FPS increase compared to NV’s cards.

    1. Agreed, but I’m glad to see major publishers supporting AMD. They’re really trying to push PC hardware/optimization forward, and the market being dominated by Nvidia hasn’t been good for anyone.

      1. Supporting AMD? They’re not supporting AMD, just as they were not supporting Nvidia before. Their game will be sponsored by AMD; that’s entirely different.

        1. Semantics. If you’re receiving money from someone to advertise their products, you’re supporting him/her in a way.

  11. It’s also steamrolling ahead in the benchmarks for Ashes of the Singularity. I see good things for AMD over the next several years if they can manage to get through the next couple.

  12. I guess Mac = PC, but he can’t talk about it yet. No way they will abandon PC and make a Mac version for the 10 people who game on Mac. It’s all GTA V again, trying to push console sales. But it could be my imagination, though.

  13. Considering current-gen consoles run AMD GPUs, it makes sense to optimize for AMD. And that usually means the game also has a good chance of performing well on Nvidia, while the opposite isn’t always true.

  14. That’s weird. The difference between the Fury X and the 980 Ti is negligible… 4 FPS one way or another. Can’t talk about groundbreaking differences.

      1. Fury X: 80 FPS
        980 Ti: 76 FPS

        +/- 5% performance

        And that’s in DX12. I would have thought that, since they have async compute, they would be much further ahead.

        1. Yep, the devs claimed it’s hard to tune, so again it comes down to making a game DX12 from the ground up, not these half-arsed DX12 games AMD are pushing.

          AMD said their Async Compute support “may” improve performance by up to 46%; looks like they have a lot of work to do, because it’s just marketing ATM.

          1. Yeah, I think you’re right. Anyway, all those DX12 titles run so badly in DX12 mode; it reminds me of DX11’s early days. I guess it’s normal…

  15. It took them this long to realize that they have to optimize for AMD? I thought having AMD GPUs in consoles was already a huge hint!

  16. “Watch_Dogs 2….”

    NO UBISOFT NO

    Let’s speculate which features from other games they will port over and pretend are novelties this time. I’ll start:
    You can crouch down and skin people after you shoot them. The main character reinforces his leather coat in this manner and by the end of the game no bullets can hurt you.
