It appears that GDC 2016’s schedule has ruined Avalanche’s surprise for PC gamers. As the session description hints, Just Cause 3’s engine already supports DX12, and some new PC-exclusive features will be implemented thanks to it.
According to the description, Intel and Avalanche have been working closely to make use of DX12’s features in Just Cause 3, so they can improve performance even further and bring additional visual quality to the game.
And here is what it says regarding the new PC-exclusive features that are coming thanks to DX12:
“This session will cover the changes Avalanche made to their engine to match DX12’s pipeline, the implementation process and best practices from DX12. We’ll also showcase and discuss the PC exclusive features enabled thanks to DX12 such as Ordered Independent Transparency, and G-buffer blending using Raster Ordered Views and light assignment for clustered shading using Conservative Rasterization.”
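For readers curious what these capabilities look like from the developer’s side, below is a minimal sketch, entirely ours and not Avalanche’s code, that asks D3D12 whether the GPU supports Raster Ordered Views and Conservative Rasterization, the two features the session ties to order-independent transparency, G-buffer blending and clustered-shading light assignment:

```cpp
// Minimal sketch: probe a D3D12 device for the features named in the GDC
// session description. Illustrative code only, not from Avalanche's engine.
#include <d3d12.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main()
{
    ID3D12Device* device = nullptr;
    // Create a device on the default adapter at the DX12 baseline feature level.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    D3D12_FEATURE_DATA_D3D12_OPTIONS options = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                              &options, sizeof(options))))
    {
        // ROVs gate techniques such as order-independent transparency
        // and programmable G-buffer blending.
        printf("Raster Ordered Views: %s\n",
               options.ROVsSupported ? "yes" : "no");
        // Any tier above 0 (NOT_SUPPORTED) enables conservative rasterization,
        // which the session ties to light assignment for clustered shading.
        printf("Conservative Rasterization tier: %d\n",
               (int)options.ConservativeRasterizationTier);
    }
    device->Release();
    return 0;
}
```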
Our guess is that Avalanche will release a DX12 patch for Just Cause 3 in the coming months, so stay tuned for more!

John is the founder and Editor-in-Chief at DSOGaming. He is a PC gaming fan and strongly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best consoles. Still, the PC platform won him over. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC Graphics Cards.”
Basically, what this means is that Nvidia got back at AMD and its async advantage in AotS.
How you deduced this is beyond me.
It all makes sense, really.
Don’t try to understand paranoid AMD fanboy logic; by its very nature, it’s devoid of logic.
AMD fanboy logic? I have 970s in SLI…
Yeah, it shows in your constant anti-Nvidia tirades in almost every article.
Every possible delusion, criticism of Nvidia and paranoid conspiracy theory has been advocated by you in these threads. It’s embarrassingly bad.
If you have valid, substantiated criticisms of Nvidia, then that’s fair enough. Nvidia aren’t saints, and anyone who believes that is deluded too, but the consistent agenda you display is pathetic.
Funny, because here the anti-NVIDIA people are the ones with NVIDIA cards; they continue to support a company they continue to sh*t on. It’s like they’re schizophrenic, or they simply do not understand how business works. Gamers shouldn’t care about or get involved in how NVIDIA or AMD conduct their business.
You see it here time and time again: people continue to criticize options. GameWorks can be disabled, but people can’t comprehend that for some reason, because they have read something on the internet that was wrong in the first place. People are now cheering AMD based on their DX12 support, yet no solid evidence or actual game has backed up those claims.
Well said.
Umm, I had an HD 7990 before I went back to Nvidia… I go back and forth all the time. But just because I get mad over an SDK does not mean I am gonna sell my cards in an instant for cheap and buy new GPUs…
You must be new… I have praised GameWorks many times… but Nvidia just plays dirty with it. Heck, GameWorks is the only thing that looks good in Fallout 4, since the game’s overall visuals are junk.
Fair enough, I’ll give you the benefit of the doubt.
You can’t blame Nvidia for Fallout, though. They only provide a library of features; they don’t develop the game themselves.
Of course Nvidia uses things to its benefit, in the same way AMD tries to use async to its advantage in propaganda and by utilising it within games.
Standard business warfare, IMO. At the end of the day, GameWorks offers some benefits to Nvidia users if they have hardware capable of it, and the effects add to a game’s visuals. That’s a bonus, IMO. Without Nvidia’s involvement, we simply wouldn’t get them.
Don’t want those features? Turn them off.
Much of the rest is ridiculous conspiracy, from what I’ve read. AMD and AMD users crying, because the reality is that AMD doesn’t invest in its software/drivers etc. as heavily, yet expects to benefit from Nvidia’s investments.
I’ve been gaming on PCs for almost 30 years…
And as for Fallout, the performance of an SDK has nothing to do with the performance of the game itself… If a game is garbage, then it’s garbage. Only now is Fallout 4 starting to run well. I just played the new beta and it’s running a lot better than I last remembered.
HBAO+ is what the game should have had on Day 1, since SSAO is trash. I just find it funny that they added debris using PhysX Flex, the same type of debris that was used in games as far back as Unreal Tournament 3. But Flex is a lot better on performance than any past PhysX from what I have seen. And it’s used very well in games such as Killing Floor 2 and Warframe, and that’s about it from what I own, let alone seen in games.
The only GameWorks feature I don’t like is HairWorks; unless it’s on monsters in The Witcher 3, every other game that used it has been garbage in my eyes. But overall that’s not Nvidia’s fault, since their partners just can’t make it look cool yet; I have seen people make demos that look cooler than what AAA devs produce with it. Heck, there are even Witcher 3 hair mods that tweak the look and flow of Geralt’s hair, and it looks 100x better than what CDPR did with it.
As for drivers… well, the past couple of months I don’t think the drivers have been all that great, really. I mean, all these Game Ready drivers have not been anything special to me. They seem to be game-specific, without showing much, if any, performance gain in previous titles.
Regarding Fallout, I agree, and PhysX was always generally quite awesome to see in motion, but always ridiculously punishing in terms of performance. So, if they have brought in a maybe slightly inferior end product that performs better, I think that is great news. I’ll have to try it out later, but I might have to sacrifice something somewhere to use HBAO+ and PhysX Flex, as at 4K and using G-Sync, I’m sure much more of a load will make it feel not as fluid as I like. I could sacrifice shadows slightly, though.
Agree regarding Geralt’s hair; I prefer it off myself and would have liked the option to have HairWorks on for enemy creatures but off on Geralt, as I wasn’t really a fan of the way his hair looks.
Speaking of GameWorks features that aren’t that great: god rays in Fallout 4 and Far Cry 4. I turned them to low in Fallout, as performance tanks and I can barely tell the difference, tbh. In Far Cry I hate how washed out everything seems, but I loved the way light shines through the trees. GameWorks effects are generally ahead of the technology available to the masses. They will be staple features in the future, but right now they are GPU-heavy, though a plus for those at the cutting edge.
The thing about Nvidia drivers, though, is that they are very quick and regular; you can’t always say the same about AMD, sadly.
Well, in the new Fallout beta I put god rays on Ultra and the performance seems a lot better now. But if you are playing at 4K, HBAO+ should not do any real damage to performance. Out of all the GameWorks features, I think HBAO+ is the least demanding. However, HBAO+ Ultra in AC Syndicate is pretty insane on GPUs, even a 980 Ti.
Yeah, HBAO+ shouldn’t have too much of a penalty, but combined with PhysX, they will surely begin to eat away? Tbh, though, some of the drops in the game seem to be more about Bethesda’s optimisation than GPU power anyway. If they’ve improved the performance hit on god rays too, that’s good work. I always felt that seemed disproportionate.
Yeah, the GameWorks effects in Syndicate have an insane hit. PCSS Ultra is the biggest culprit by some way. With both on, a 980 Ti can’t sustain 30fps at points at 4K. Turn them both down a notch and you generally run at 45+.
I think HBAO+ is well worth the hit; PCSS is nice for sure, but I’m not that fussed if I have to turn it off.
Well, I don’t think the new Flex will be that much of a hit on performance. I mean, if it had been used in games like Metro 2033, it would have been amazing. I like the direction Nvidia has taken PhysX with its Flex tech, and I hope more games use it instead of traditional PhysX. And yeah, PCSS Ultra covers a lot of area; that’s why it’s such a huge hit on performance. It’s just like turning grass/tessellation to Ultra in GTA V; the combination of the two is nuts.
John is more of an Nvidia fan, though he is known to give AMD credit from time to time. He’s tired of Nvidia’s dirty laundry; however, AMD isn’t giving him much better than what he sees in Nvidia. Quite the conundrum.
I’m not averse to criticism where it’s fair and just. Continually ragging on GameWorks, when there’s no proof of what you’re suggesting and the thing is optional, is frankly ridiculous.
Yet it shows in performance if you watch Digital Foundry’s test with an i7 paired with a 970 / R9 390: during explosions, the R9 390 loses a lot of frames.
Not as underwhelming as AMD’s DX11 performance.
You can’t say anything to him that will change his mind. Besides, like I’ve been saying since the whole DX12 sh*t storm arrived: current AMD GPUs will use async to make Nvidia look bad, and Nvidia’s advantage will be CR and ROVs. But he wants to go on about how it’s an 11.3 feature, and yeah, it is, but its DX12 version is gonna show way more than what is currently on the market.
All that’s going on is AMD feeding MS features, just like Nvidia is feeding MS features. Async could be used with DX11 as well, but it was only designed for a low-level API.
Totally agree; even “fully supporting DX12” as a term is dubious, as neither vendor fully supports every feature.
They both excel in different areas with their GPUs. There was an analysis of Ashes on some site that showed which features each vendor’s cards excelled in. Ashes also performs slightly better on a 980 Ti now (0.4fps, lol), and that’s without Nvidia’s async software and drivers fully in place.
It’s all fairly standard, in reality. One performs better in one area, and the other performs better in another. Nothing new, really. AMD is just relying on DX12 to remove responsibility from their drivers, because they can’t match Nvidia there, IMO.
Sorry, I misjudged you.
Well, I think async has been nothing more than hype, just like when AMD was trying to hype Mantle years back. I just want a full-on DX12 game to come out before the world blows up. I am so sick of all this pre-alpha/beta BS. It’s been article after article after article, and nothing but delay after delay.
Cool I guess.
I don’t see the obsession with a FOV slider, as long as the FOV can be changed via the console or a config file. It’s not like it’s something you set more than once.
This is a general statement; I have no clue whether the FOV is locked in JC3, as I’m yet to play it.
It’s locked.
“they can improve the performance even further”
Even further? Yeah, because it was phenomenal in terms of performance already; now people can get 10% more.
Basically, Nvidia used Maxwell tech to one-up AMD cards, but at the same time Kepler once again gets left in the dust.
This doesn’t help the PC at all. The game is GPU-bound!
Oh, I know! With the power of DX12 they will finally enable SLI and CrossFire!! Only on DX12!
But all the AMD fanboys screamed that AMD supports DX11.3, so what’s the problem? Nobody cares about async. Nobody wants Minecraft-looking games running at 1000 fps. Everyone wants eye candy.
Also, AMD lied to everyone claiming they have full DX12 support and that Mantle is DX12. It’s their own fault for holding back the industry by not supporting these DX12_1 features.
Always the same: they supported it, but hardly any games actually used 11.3. AMD always gives more out of the box, but it’s just not needed; it only gives the impression of value, just like no game will use 8GB of VRAM.
Well yeah but the thing is, it will probably give you better performance compared to its predecessor.
It’s not a gimmick; to call it a gimmick is to call async a gimmick…
We’ll see. If it’s anything like DX11 compute, AMD’s superior compute performance didn’t benefit them much, and they didn’t capitalise on it in terms of game titles.
Lol, it is a DX-exclusive feature; it’s not something Nvidia just made up, it’s a DX standard. And apparently it improves performance on Nvidia cards, as Digital Foundry showed in their performance analysis video.
If you’d seen the videos from Digital Foundry, you would not take it with a grain of salt… Besides, async could be used on DX11 as well, but it’s only used with a low-level API.
Async shading and buffers were first demonstrated in OpenGL back in 2008 by the Khronos Group… the same thing AMD put into Mantle.
Nvidia does asynchronous compute in software, don’t they? AMD has it in hardware.
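For context on what’s actually being argued: at the API level, “async compute” just means an application can submit work on a separate compute queue alongside the graphics queue. The sketch below is my own illustration, not anyone’s shipping code; it shows the D3D12 calls involved, while whether the two queues truly run concurrently is down to the hardware and driver, which is the hardware-vs-software distinction here.

```cpp
// Minimal sketch of async compute setup in D3D12. Assumes a valid device
// created elsewhere; illustrative only.
#include <d3d12.h>

void CreateQueues(ID3D12Device* device,
                  ID3D12CommandQueue** graphicsQueue,
                  ID3D12CommandQueue** computeQueue)
{
    D3D12_COMMAND_QUEUE_DESC desc = {};

    // The "direct" queue accepts graphics, compute and copy work.
    desc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(graphicsQueue));

    // A second, compute-only queue. Work submitted here *may* overlap with
    // graphics work; the API only expresses the opportunity for concurrency,
    // the GPU scheduler decides whether it actually happens.
    desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(computeQueue));
}
```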
Lol. You guys want to turn the comment section here into wccftech ver.2?
please noo..nooo…no!
Yeah, sorry, but “DX12” sounds too bold if they didn’t implement its best features. And remember, we have been waiting so long, since the first announcement of DX12, for the first game to use it fully. People are gonna rage :P.
I was excited when I saw “DX12”, but it seems it’s stripped of the most important DX12 features.
Why’s anyone complaining? Fidelity plus a performance improvement sounds great. Let’s wait to see it before we pout, shall we?
People have got to complain about something nowadays. I mean, people are still complaining about Batman AK, yet it runs vastly better now; same with AC Unity. Funny, really, because the worse Batman, AC, was never fixed in DX11 mode.
Most of the people who comment on this site are so cynical about everything. It’s insane.
No doubt.
Like I said, TB got famous for being cynical.
I stopped playing the game after reading this and will continue playing after the DX12 patch, to enjoy it with the best graphics possible!!!!!
The way I’m reading it, those features are already in the game, being used if you have a DX12 environment. I’m not seeing an announcement of any future features.
“improve the performance even further as well as bring additional visual quality to the game.”
Every news outlet covering this says the same thing. This is literally all DX12 is for: performance and visual quality.
So sad… JC3 on Win 7 can’t use DX12 features… T_T
That’s monopolistic proprietary software for you. OpenGL doesn’t need an OS change.