A few days ago, Square Enix released the official PC benchmark for Final Fantasy XV Windows Edition. And while this benchmark uses some NVIDIA GameWorks effects, it appears that the GameWorks HairWorks effects are currently bugged, resulting in a noticeable performance hit even when there aren’t any characters on screen that benefit from them.
But let’s take things from the beginning. The Final Fantasy XV benchmark currently offers three presets: Lite, Standard and High. On High settings, the following GameWorks effects are enabled by default: NVIDIA Turf Effects, NVIDIA HairWorks and NVIDIA Flow. Square Enix and NVIDIA also plan to implement VXAO and ShadowWorks effects, though these are not currently supported in the benchmark.
Resetera member ‘Kvik’ has shared a tool, created by DrDaxxy, that allows you to completely disable these GameWorks effects (or adjust other settings). Using these tweaks, which originated on Reddit, GamersNexus has run some tests and discovered, via Ansel, that the NVIDIA GameWorks HairWorks effects are active even when there aren’t any characters on screen that benefit from them.
As a result, the benchmark runs noticeably slower on both NVIDIA and AMD hardware. GamersNexus claimed that it has contacted the green team, which is currently looking into this.
Our guess is that someone forgot to enable proper LODs/culling for the HairWorks effects. Such mistakes can happen, as a single line of code can result in disastrous performance issues. And… well… could this have been done intentionally in order to make AMD GPUs look inferior? Well, that’s up to you to decide.
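For illustration, here is a minimal sketch of the kind of per-effect culling we mean. Every name in it (Character, ShouldSimulateHair, kHairMaxDistance and so on) is hypothetical and is not taken from FFXV or the HairWorks SDK; the point is simply that an expensive hair pass should only run for characters that actually use the effect, are on screen, and are close enough for the extra detail to be visible.

```cpp
#include <cmath>
#include <cstddef>
#include <cstdio>
#include <vector>

// Hypothetical illustration of per-effect LOD/culling; none of these names
// come from FFXV or the HairWorks SDK.

struct Vec3 { float x, y, z; };

static float Distance(const Vec3& a, const Vec3& b) {
    const float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

struct Character {
    Vec3 position;
    bool hasHairEffect;    // does this model use the hair simulation at all?
    bool visibleInFrustum; // result of the engine's regular visibility pass
};

// Beyond this distance the extra hair detail is invisible anyway (made-up value).
constexpr float kHairMaxDistance = 30.0f;

// Decide whether the expensive hair pass should run for this character.
bool ShouldSimulateHair(const Character& c, const Vec3& cameraPos) {
    if (!c.hasHairEffect) return false;    // effect not used by this model
    if (!c.visibleInFrustum) return false; // off-screen: don't pay for it
    return Distance(c.position, cameraPos) <= kHairMaxDistance; // LOD cutoff
}

int main() {
    const Vec3 camera{0.0f, 0.0f, 0.0f};
    const std::vector<Character> scene = {
        {{  5.0f, 0.0f, 0.0f}, true, true},  // nearby and on-screen: simulate
        {{200.0f, 0.0f, 0.0f}, true, true},  // too far away: skip
        {{  5.0f, 0.0f, 0.0f}, true, false}, // off-screen: skip
    };
    for (std::size_t i = 0; i < scene.size(); ++i)
        std::printf("character %zu: hair pass %s\n", i,
                    ShouldSimulateHair(scene[i], camera) ? "runs" : "skipped");
    return 0;
}
```

If a check like this is skipped, or always returns true, every frame pays the full simulation cost regardless of what is on screen, which matches the behavior GamersNexus observed.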
Let’s hope that in the full game, Square Enix and NVIDIA will provide options to disable the GameWorks effects, and that they will further optimize them for all graphics cards.

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers SNES to be one of the best consoles. Still, the PC platform won him over from consoles. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC graphics cards.”
Gw trash killing perf yet again
English much??
Uuuuu pettiness is bad
not as bad as your English.
“Gw trash killing perf yet again”
AMD GPU, killing performance as usual
Not as bad as your nonsense. If you even bothered to read the damn article, GW trash kills perf on the 1080 Ti and Titan Pascal. Instead of acting like a dumb F, educate yourself first.
The fact that it kills performance on those GPUs doesn’t mean the GW options are badly implemented, just that they are demanding, and even the Titan Xp and 1080 Ti are not exempt from that.
(Again, what a surprise that demanding effects are demanding, even when you have a powerful GPU.)
You have the most awful logic ever; your conclusion is contained within your premise.
SquareE should make absolutely sure that we end users can turn OFF these GameWorks effects.
Further, they could also make the benchmark tool available from the game launcher. Users could then test what’s working and what isn’t before getting into the game proper.
“Resetera member ‘Kvik’ has shared a tool, created by DrDaxxy, that allows you to completely disable these GameWorks effects (or adjust other settings). Using these tweaks, which originated on Reddit, GamersNexus has run some tests and discovered, via Ansel, that the NVIDIA GameWorks HairWorks effects are active even when there aren’t any characters on screen that benefit from them.”
A sane person would assume that Kvik’s tool simply doesn’t work. But whatever keeps the AMD propaganda wheel spinning.
AMD’s compute effects are active even when there aren’t any characters on screen that benefit from them. But whatever keeps the AMD propaganda wheel spinning.
The problem is with the implementation, yet the AMDiots are blaming Nvidia lol
Nvidia is to blame if the people they sent to SE failed to make their own SDK work, and FYI, GW kills perf on Pascal also.
“and FYI, GW kills perf on Pascal also”
Of course they do; they are very demanding effects, meant for high-end users with spare horsepower who want to utilise the highest quality of effects.
It’s like saying Ultra level settings in a game are more demanding than the low settings.
OF COURSE THEY ARE
They kill perf on the 1080 Ti and Pascal Titan, and no, they are more broken than demanding.
Sure, then why are other developers not implementing these effects in a way that doesn’t affect performance so heavily? Why is it that the weaker consoles never feature these types of effects?
Reality check: these effects are all demanding, require powerful hardware, and will in future become standard features in some form.
Statistically speaking, given the fact that not a single GW game has run properly with GW effects on, we can deduce that the GW SDK is the culprit, a.k.a. garbage.
Statistically speaking (let’s ignore the fact that you have no statistics to back up your claim), most people who play games on PC have a PC less powerful than a PS4; therefore we can deduce that PC gaming is to blame for games running badly on those PCs, and that all those games are horribly optimised.
Again, your conclusion is contained within your premise; therefore your logic will always show what you claim to demonstrate, but this doesn’t make your reasoning correct or rational.
GameWorks has been at the center of controversy since its inception.
Apart from being detrimental to AMD GPUs, because AMD is unable to properly optimize Nvidia’s libraries for its hardware, I now wonder how Nvidia video cards are going to pan out in terms of performance.
Especially OLDER-gen Nvidia GPUs, but only time will tell.
Anyone remember the tanking performance of the “Tessellation” GameWorks feature when it was implemented? It had an impact even on Nvidia GPUs (Kepler, Fermi, etc.).
Thanks to AMD for at least releasing the “GPUOpen” middleware, which is OPEN source, unlike GameWorks, which is closed in nature; it still needs to be adopted and implemented by game developers.
…
It might have been the case in the past, but Nvidia has been giving access to GameWorks source code for quite some time now (for effects that don’t need Nvidia-specific hardware to function). For AMD’s part, they can no longer make excuses about being unable to optimize GameWorks performance on their hardware. The only difference between GameWorks and GPUOpen is on the licensing front. So if we see poor performance on AMD hardware, it can only be for one of two reasons: the effect was simply made with Nvidia hardware in mind, making it slow on AMD hardware, or AMD simply did not optimize their performance for Nvidia GameWorks on purpose. The same is also true for AMD GPUOpen effects on Nvidia hardware.
Thanks for your input.
That is factually false. Yes, the libraries are “open source,” i.e. you can see the code; however, you still need a license to use it. Moreover, a large part of the GameWorks library will never be open source, as that would be akin to shooting themselves in the foot.
Ask yourself this: if GameWorks were truly open source (hint: it’s not), why would AMD not optimize it for themselves? Also, why would Nvidia allow AMD to overcome their industry advantage?
Read my comment once more: did I ever say Nvidia “open sourced” their GameWorks library effects? Just because Nvidia gives access to its source code doesn’t mean they open sourced that code. I see many people get confused on this matter. When AMD said they let GPUOpen go open source, they simply let GPUOpen be used differently than Nvidia GameWorks: AMD allowed their tech to be looked at and modified, and whoever did the modifying could then call the modified code their own tech. Nvidia allowed others to look at their code for optimizing performance (even AMD), but they did not allow modification of the core tech itself without Nvidia’s permission. But that should not affect how optimization is done on GPUOpen or GameWorks, because, remember, AMD’s initial complaint about GameWorks was about source code access, not the ability to modify the tech freely to suit their hardware. That’s why we no longer hear AMD complaining about not having access to the GameWorks source code.
“Nvidia allowed others to look at their code for optimizing performance (even AMD)” – I’d love to see a source, because there is no way that is true.
Yeah, I’m aware of that as well.
I was talking only about GameWorks, with regard to the above article. NOT blaming or comparing any of these features, be it AMD or NV, including ASYNC Compute…
Btw, Hitman is an AMD sponsored title.
Why? There’s nothing wrong with it, used correctly. I still marvel at Geralt’s hair and beard in The Witcher 3. Is it going to affect performance? YES! Of course it is; it’s an extra feature the GPU has to work on, just like AA, motion blur, or AO… Why do people think that extra eye candy from GameWorks should come for free, but expect a performance hit from all other effects? I’ll tell you why: because they’re idiots that don’t understand what’s going on.
I don’t think he was criticizing GameWorks itself, but rather how it’s implemented.
GameWrecks.
Saw this video this morning. Glad to see it’s posted here.
HairJerks in the form of two buffalo thingies..
AMD doesn’t use compute because it hurts Kepler (Nvidia did a bang-up job of that themselves); they do it because that’s long been the strength of their GPU architecture, as it was designed to cater to both gaming and “professional” workflows.
AMD GPUs have a history of having more TFLOPS than their competitors but lower performance, and AMD is trying to capitalize on that. AMD plays to its strengths, not to Nvidia’s weaknesses.
^Henry Ford
Pretty nice quote from Henry Ford!
“they do it because that’s long been the strength of their GPU architecture, as it was designed to cater to both gaming and “professional” workflows.”
For the same reason Nvidia uses tessellation and other features that their cards are stronger at than AMD’s GPUs.
So the AMD defence force can’t have it both ways.
AMD doesn’t use compute because it hurts Kepler? No such thing. What Nvidia severely cut from Kepler back then was FP64-related hardware on GK104 and below, and no games ever use FP64. We started hearing this “Kepler is weak in compute” line with DiRT Showdown, when AMD and Codemasters implemented global illumination using GPU compute. Enabling global illumination in that game crippled the performance of all Nvidia GPUs, be it Fermi or Kepler (TechReport showed this in their tests). That feature used AMD proprietary tech called the Forward+ engine, which is tuned more specifically towards AMD hardware. TressFX also uses GPU compute, but we still see performance parity between AMD and Nvidia GPUs. If AMD really had that compute advantage, no amount of optimization would be enough for Nvidia to catch up to AMD’s TressFX performance. When we see faster performance on AMD’s side, it is most often for architectural reasons.
Dsog, stop encouraging the paranoid conspiracy theory nuts please