Ubisoft and Nvidia have announced the extension of their strategic partnership for Ubisoft’s upcoming titles, including Assassin’s Creed Unity, Far Cry 4, The Crew and Tom Clancy’s The Division. This basically means that Ubisoft will use Nvidia’s GameWorks tech on all the aforementioned titles.
So, what can PC gamers expect? Well, according to the press release we got, Nvidia’s GameWorks technology includes TXAA anti-aliasing, which provides Hollywood levels of smooth animation; soft shadows; HBAO+ (horizon-based ambient occlusion); advanced DX11 tessellation; and NVIDIA PhysX technology.
Tony Key, senior vice president of sales and marketing at Ubisoft, said:
“Working with NVIDIA has enabled us to bring an enhanced gameplay experience to our PC players. We look forward to continuing our partnership with NVIDIA on our biggest upcoming titles.”
Tony Tamasi, senior vice president of Content & Technology at NVIDIA, added:
“We’re excited to continue our long-term partnership with Ubisoft in bringing our latest PC technology to their games. Through GameWorks, we have been able to add unique visual and gameplay innovations to deliver amazing experiences for these stellar Ubisoft games. I can’t wait to play them myself.”
For what it’s worth, Watch_Dogs is still broken on Nvidia’s cards, as there is still that annoying stuttering issue that most PC gamers are experiencing. It’s funny, as you’d expect such a title not to be plagued by these issues, especially when Nvidia is working so closely with the development teams.
But anyway, let’s hope that a) Ubisoft will shortly release a performance patch for Watch_Dogs and b) an Nvidia patch will add all those missing PhysX effects that were present in the E3 2012 demo.

They’ve been working pretty closely for a very long time, and where are my shiny graphics?
All I can see from Ubisoft & Nvidia is downgraded titles with shitty optimization.
Yes, it’s Nvidia’s fault that Ubisoft had to downgrade for the consoles. Seems legit.
The visuals can only be as good as the lowest common denominator, my friend. And unfortunately, the devs have the last word. Nvidia can huff and puff all they want for better visuals, but it won’t change anything if the dev already has its plans.
And optimization-wise, it isn’t the Nvidia effects added to the game that are the problem. No, those effects are optimized extremely well, as a matter of fact; go take a look at benchmarks with them on and off. Blame Ubisoft for its poor engine code ported from consoles and for blaming the PC for not having unified memory… derp…
You definitely need to read better, before accusing someone in stupidity, Nvidia fanboy.
Huh I have no idea wtf you just said. That response gave me cancer.
But continue being a jackass, please, because obviously I was so rude to you, right? Apparently I accused you of something?
Jesus…
Please, tell me more.
Ughh, you’re gonna like this one. Ready?
Show me where I called you a jackass in my first comment. Not until your jackass-like response.
Secondly, “derp” was aimed at Ubisoft trying to pawn off PC problems with a unified memory excuse.
Please tell ME more…
NVIDIA have to side with Ubisoft because they have an agreement with them, but they’re not responsible for CPU and VRAM performance. The NVIDIA render paths are fine, the HBAO+ performance is fine, the TXAA performance is fine; the CPU, VRAM and memory performance is not. That’s in Ubisoft’s court. It’s a console port; there’s nothing NVIDIA can ever do about that.
Now, go blame Ubisoft the proper way, like an adult would, instead of with misplaced, kid-like rants you have no proof for.
Ubisoft said: “PC lacks unified memory”, which is an excuse for the memory problems this game has.
http://international.download.nvidia.com/geforce-com/international/images/watch-dogs/watch-dogs-ambient-occlusion-performance-chart.png
http://international.download.nvidia.com/geforce-com/international/images/watch-dogs/watch-dogs-anti-aliasing-performance-chart.png
There is a 10fps difference between the superior HBAO+ and MHBAO.
There is a 5fps difference between 2xTXAA and 4xMSAA, and TXAA removes more aliasing.
How do you accuse someone “in” stupidity?
Ouch… http://www.youtube.com/watch?v=mqnAbRpyqfI
HBAO+ works great in Watch Dogs. I get maybe a 3 or 5fps hit for MUCH better visuals. It’s totally worth it to put HBAO+ AO on High.
I’m not so sure the stuttering is a problem with the graphics card. It looks like a streaming problem, which could come down to how the game dynamically loads assets (for instance, why should the game load the detailed interior of a pawn or weapon shop if you’re driving past it at 80mph?).
The game just doesn’t seem to be smart about what it loads into memory. It seems to load large areas without considering their relation to the player.
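To put it another way, a smarter streamer would gate interior detail on both distance and player speed, not just on which chunk of the map you’re in. Here’s a toy sketch of the idea in Python (every name and threshold here is made up for illustration; this is obviously not Ubisoft’s actual code):

```python
import math

# Toy sketch of distance- and speed-gated asset streaming.
# All names and thresholds are invented for illustration;
# this is NOT Watch Dogs' real streaming code.

INTERIOR_LOAD_RADIUS = 50.0  # metres; beyond this, interiors stay unloaded
MAX_ENTRY_SPEED = 15.0       # m/s; faster than this, the player can't plausibly enter

def should_load_interior(player_pos, player_speed, shop_pos):
    """Decide whether a shop interior is worth streaming in right now."""
    if math.dist(player_pos, shop_pos) > INTERIOR_LOAD_RADIUS:
        return False  # too far away to matter
    if player_speed > MAX_ENTRY_SPEED:
        return False  # driving past at speed: load the exterior only
    return True

# Driving past a pawn shop at ~36 m/s (80 mph): interior stays unloaded.
print(should_load_interior((0.0, 0.0), 36.0, (20.0, 5.0)))  # -> False
```

Even a crude gate like that would stop the streamer from pulling whole shop interiors into memory while you’re doing 80mph down the street.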
Nvidia is the cancer of gaming; they’re trying as hard as they can to create a monopoly. Not saying that AMD wouldn’t do it if they could, but that’s not the case.
AMD have no proof to back up what they’re saying about GameWorks hindering performance; they’re just crying about it, and now it gets removed from games and they look downgraded because of it.
Where is the proof, AMD? You got your tech into the consoles; they run on AMD hardware. They didn’t get review code for Watch Dogs; NVIDIA didn’t get review code for Tomb Raider 2013 and similar AMD logo games either, and suffered for it just the same because TressFX is so compute-heavy.
What next? Your CPUs are hindered by game devs because they don’t multi-thread properly, so who’s to blame? Intel? The game stutters on all GPUs because Ubisoft didn’t optimise it well, not because NVIDIA hindered them.
You should read the latest article on ExtremeTech; it explains that both AMD and Nvidia are right, but in the end GameWorks does pose a threat to AMD.
So AMD is right to complain; Nvidia also does it when they feel entitled to.
Also, it seems that when Nvidia partners on a game, they try to improve the performance on their own hardware as much as they can, compared to AMD, who tries to improve the performance of the game in general with their Gaming Evolved program.
So if Ubisoft continues its partnership with Nvidia, I would not be surprised to see badly performing PC games from Ubisoft in the future.
That’s funny, because I’ve seen a 7770 keeping up with the next-gen consoles in Watch Dogs, and that’s at 1080p. Sorry, there is no proof at all; it’s all Ubisoft’s fault for not optimising, and vendor-specific render paths have always been used with NVIDIA/ATI/AMD.
People’s blame is misplaced; they just want a scapegoat other than Ubisoft because their card is not performing as they think it should.
Yeah, not even a 7850 is able to play the game decently at 1080p, but a 7770 is.
It always is Ubisoft’s fault, but that won’t be enough for me when the next games come out and still perform quite badly.
People are right to blame whoever they want, especially when their card runs worse in Ubisoft’s games than it does in general for no apparent reason. The graphics in WD are not amazing at all considering the resources you need to max the game.
I don’t think any of us expected it to be super optimised for the PC, to be honest, but it would be much better than the consoles if the stutter wasn’t there on my mid-range, two-year-old hardware. It’s an AC4, really.
Well, that is what I expected, nothing less.
The game had been in development since 2009, PC was the lead platform, etc. What should I have expected??
Take it with a pinch of salt until you get the game yourself, just like theoretical game benchmarks. NO PC game is ever perfect on launch.
I did get the game.
I can’t think of a PC game that’s had a smooth launch; PCs are complex. I mean, even the great Crysis got hammered for not being optimised, and the games that are claimed “optimised” have fake effects, or baked, pre-computed ones.
This topic is getting old and stale.
Face it, AMD (as much respect as I have for what and how they do their business) has been bleeding money more and more as of late. Their driver team is a fraction of what Nvidia has, and so is their driver dev budget. They have fantastic hardware; however, their software side of things just isn’t on the same level. I would imagine they poured a lot of resources into Mantle development and support.
It’s already been explained by Nvidia that GameWorks and “The Way It’s Meant to Be Played”, while often integrated into the same game, are in fact not the same program. Do the research. TWIMTBP works with the dev from day 1 to not only integrate the GameWorks libraries but also optimize the code path and shaders for Nvidia cards. This is nothing new, and AMD does it as well with their “Gaming Evolved” program.
However, games can also just use the GameWorks libraries and have Nvidia help integrate them into the game engine. And while the effects will be optimized more for Nvidia cards, the rest of the game engine or code isn’t optimized for Nvidia specifically. This is the case with Watch Dogs. There are tons of tech sites that have benchmarked AMD vs Nvidia and actually show no to negligible performance differences. So that argument is shaky at best.
AMD can still optimize the game via shader caching etc., just like Nvidia can with AMD games. The only code that AMD cannot access is the GameWorks effects code. And really, I don’t see an issue with that. They did all the R&D to create those effects and license them out. It’s their software. And really, the only things AMD are missing are TXAA and PhysX, which are honestly overrated at times, so who cares. HBAO+ and PCSS still run quite well on AMD.
So if, in fact, AMD is getting much poorer performance (which is questionable at this point, with benchmarks from different sites showing different things), perhaps you should blame AMD’s driver team?
So honestly, apart from the garbage GameWorks accusations, please explain to me how Nvidia is a cancer to gaming. I would love to hear your unbiased explanation. Otherwise I’m just regarding you as an AMD fanboy who reads too many Forbes articles.
And for the record, I would be saying the exact same thing if the roles were reversed between AMD and Nvidia. BS accusations without any proof always deserve a call-out.
AMD are doing the same thing; not sure why everyone thinks this practice is only Nvidia’s. So AMD got beaten to the punch on Watch Dogs, big deal; they have their own shit going on in lots of other games.
Because people love to pick on the big guy who is doing well. And Forbes and ExtremeTech wrote articles about GameWorks when they didn’t even know what it actually was or how it worked.
People seem to forget how bad Tomb Raider was for Nvidia. That’s life. You win some, you lose some.
I think two shitty companies finally found each other, so let them be. After AC IV and Watch Dogs I will stay away from Ubisoft anyway.
That’s not a nice thing to say, because I can say the same thing about AMD.
But yeah, I will stay away from Ubisoft too.
They do make good games though.
The problem is Ubisoft, not Nvidia.
AMD is the cancer in gaming. They are forcing everyone to create new APIs: Mantle, Apple Metal… Soon, to play games, you will need 4 different PCs and 6 different phones.
Mantle is a very good thing. It just shows how much better the other APIs must become. Your name shows you are an AMD hater.
Many improvements came from the industry after Mantle existed… DX12, Nvidia’s DX11-optimized drivers… all credit goes to AMD for this… bad news for those who want AMD to collapse… So… I’m no AMD fan; instead, I’m on the green team… but AMD is good for the industry…
“Nvidia is the cancer of gaming” ….
You’re joking right? That’s like, the stupidest thing I’ve read today.
Fanboys are the cancer of gaming!
You & people like you are the cancer of the PC community.
Way to go, Nvidia; you’re partnering with a shitty company that doesn’t optimize their games for PC. The news would have been way better if they’d said they’ll be partnering with CDPR to bring their tech into The Witcher 3 and Cyberpunk 2077, because, you know, that company is pro-PC -,-
But they are. The Witcher 3 uses Nvidia GameWorks, including Nvidia HairWorks, which is part of GameWorks but most games don’t use it. The Witcher 3 will have hair and fur with physics because of that.
The Witcher 3 is already utilizing Nvidia HairWorks, Nvidia PhysX & Nvidia APEX, and no doubt Cyberpunk 2077 will use these as well, since both games are using the same engine.
lol
Swing and a miss, Christian
HAHA! Yeah… I knew they were using that fur physics, but I didn’t know it was Nvidia’s tech XD
Sad. I really waited for The Witcher 3, but now it’s doomed to have shitty performance.
And enlighten us on how you came to that conclusion. CDPR has always been PC-focused; don’t even compare them with the likes of Activision or Ubisoft.
AMD = all of the Square Enix titles (some of them run s**t on Nvidia: Tomb Raider, Deus Ex: HR)
AMD = all Frostbite engine games.
Nvidia = all of the Ubisoft titles (some of them run s**t on AMD: Watch Dogs…)
Nvidia = all Unreal Engine games.
I hope both of them will help developers polish their games, like AMD did with Saints Row 4 or Nvidia did with Blacklist and Batman AO.
I was lucky, I had an AMD card for Tomb Raider and an Nvidia card for Arkham Origins. I’ve never had issues with Unreal Engine titles, and I’ve had AMD cards through most of the UE3 generation of games. DX:HR was fine on AMD for me, too. Usually the annoyances were just missing features on AMD like PhysX effects.
you lucky ba***rd :))
Deus Ex: HR for me was 16fps. After a month with no word from the devs, some guy found a way to fix it. It was one registry entry that said something about AMD; I guess I changed it to false or true (0 or 1, I don’t know), and the game ran at 60fps after that :D.
Weird. I was running an HD6870 but my friend had a 560ti and had no issues.
Yeah, I didn’t have any issues with it on my GTX 280, but with the 660 Ti I lost 44fps.
I don’t mind them using GameWorks, since a lot of that is just tools to make development easier and is not Nvidia-specific. However, when we have these partnerships, it often screws gamers with the other company’s cards, because the dev doesn’t give the other side early access to optimize drivers. And it’s not just Nvidia; Tomb Raider was a mess for Nvidia users because Squeenix partnered with AMD on that title.
So. Annoying.
But yeah, GameWorks itself could be cool. And I really liked TXAA in Arkham Origins; finally all that crappy aliasing shimmer was gone.
Nvidia only had problems with TressFX in Tomb Raider (and that is because their cards are shit at compute); the game was very playable on Nvidia cards since day 1. The GTX 670 was getting like 60-70fps on Ultra with TressFX disabled at 1080p. That is not bad at all. And Nvidia did get the code a few weeks before the game was released.
But let’s look at the bad examples from Nvidia.
First: the Batman games, where Nvidia cards have an unusual performance advantage. Nvidia cards also get much better SLI support; it’s integrated into the game since day 1, and AMD has to force CF through drivers.
Second: Assassin’s Creed 3 and 4. Nvidia’s SLI support is integrated into the game; AMD still has to force CF through drivers. The performance difference, again, is bigger than usual, even though AMD did all it could to optimize the games.
All these games make use of GameWorks.
The only game where Nvidia had problems was Tomb Raider, and that was only with TressFX enabled, and maybe Dirt Showdown, although that game ran more than OK on Nvidia cards; the problem was that AMD cards were much faster.
Ah, just shut up and buy an Nvidia card already & quit bi*hing 😀 But seriously, dude, we are part of a big community here, and it’s called the PC community; we have AMD/Nvidia and DirectX/OpenGL. Some games work better with one of them, some not. Chill. Deus Ex was another problem for many Nvidia users too.
Watch Dogs is a mess on Nvidia cards too; better, yeah, but the problem is Ubisoft and their s**tty ports, not Nvidia or AMD or whatever. AC III was a mess too, but there wasn’t any GameWorks or whatever for it. It’s just Ubi and their sh**ty games.
“Nvidia cards also get much better SLI support”
So Nvidia is bad because they have good support and better SLI support?
I’m just pointing out the obvious; it’s not about “bi*hing”.
Deus Ex ran just fine on my old GTX 550; don’t know what you are talking about.
And that is not what I meant. Nvidia’s SLI support is integrated into the game, while AMD can only force CF through drivers, and then you have artificial differences in performance.
“don’t know what you are talking about.”
I’m talking about a big problem I had with this title, and lots of other Nvidia users did too, just like how some of you have problems with WD but not all of you.
What problem exactly? What you are saying is very general; maybe you don’t even remember.
I was even able to run Deus Ex: Human Revolution on a very old AMD HD 3000 IGP at 720p on low settings. How can this game have problems when it even runs on shit GPUs?
It can and it did; the game was 16fps no matter the settings, high or low. Not only for me; go and check the Steam community and see lots of people with the same problem. The problem was a registry entry named AMD-something… you have to change it to false or true, I don’t remember, and boom, the game runs at 60fps. But the devs wouldn’t even bother to help those people (me included), and the solution came from a guy on Guru3D, I guess.
AMD should get off their arse and help the game devs with CF then; you can’t just expect devs to support every PC tech without help. That’s what NVIDIA goes into the studios and does.
Deus Ex: HR stuttered like hell on my NVIDIA card; I think it was due to the AO on high.
You think AMD doesn’t want to help them with CF??
If Ubisoft goes to AMD and says, “Hey, we want to implement CF in our game,” do you think AMD would refuse them??
Deus Ex: HR did not stutter for me at all.
NVIDIA cards have better tessellation performance; AMD cards have better compute performance. You could argue that devs make their Gaming Evolved titles compute-heavy. What, because NVIDIA cards “should have” awesome compute performance? Well, no: they do have good performance, but NVIDIA didn’t want to spend as much of their GPU space on compute and invested in CUDA instead. More games use CUDA/PhysX than heavy compute anyway, and DX11 is a requirement for TressFX’s compute.
What about DX11 MTR (multi-threaded rendering)? NVIDIA implemented it in their driver before AMD; their performance then moved ahead of AMD cards in games like Civ 5, which also used DX11 Compute. NVIDIA’s work on CPU-bound driver overhead pushed their performance above AMD’s high-end Mantle cards, where before they were behind.
Hah, I hope the results will be way better than the crappy performance/visuals we got in Watch Dogs.
I just hope Nvidia works full-on with Ubisoft on making future PC games feel like they were made with PC as a TOP priority, just as we would expect from any company dealing with GPUs. Btw, I’ve been reading some of the comments on here. Some of them are very IGN-ish. Come on now, we are not console-only peasants.
Fanboys everywhere! UBI = great games/bad performance
“Working with NVIDIA has enabled us to bring an enhanced gameplay experience to our PC players”
Yeah, like Watch Dogs.
Nvidia + Ubisoft = bad ports until now. Maybe things will change, but who knows; I’m tired of waiting for a game from Ubisoft that doesn’t have performance issues at launch.
“Nvidia + Ubisoft = bad ports until now”
“Yeah, like Watch Dogs”
Or like Splinter Cell: Blacklist. That was a great port, and it was Ubi + Nvidia. I couldn’t even play AC III at all; it was 9fps in the major cities, and yes, I had a better-than-required system and I was on Nvidia. That game didn’t have any Nvidia support, but Splinter Cell did.
Blacklist was only a decent port, and they did use Unreal Engine 2 for those games. It’s hard for modern GPUs to struggle with that old game engine.
More laggy, badly optimized ports?
No thank you!
But if you do what you did with Splinter Cell and Far Cry so far, I will be happy!