Nvidia has released a new video for its HairWorks middleware, which enables developers to simulate and render fur and hair to provide a truly interactive game experience. The video features some animals from CD Projekt RED’s highly anticipated RPG, The Witcher 3, so be sure to check it out if you are a fan. Enjoy!

in before “Nvidia tech = the demise of the free world” comments….
This looks great; however, I hope it’s better optimized for FPS than it was in COD: Ghosts. But given that it’s CDPR, I have more faith in them than in any other developer right now. Also, hopefully AMD users will be able to use it with decent FPS too. AMD users can usually run HBAO+ and PCSS without a drastic performance difference compared to Nvidia hardware; if anything, this could follow the same trend.
I don’t see THAT big of a difference between The Witcher 2 maxed out and The Witcher 3.
Even if that were the case, that’s not necessarily a bad thing. The Witcher 2 had fantastic textures and detailed objects. But there are a few things that stood out to me as different, mainly the water and how it interacts with objects, the fire effects, and just the amount of detail in the flora, especially considering it’s open world. I’m sure once we get closer to launch we may get a breakdown from CDPR.
Here we go, the radical AMD fanboy comments showing up. Please explain how AMD’s global illumination and Forward+ look the same whether on or off, yet when these features are on they cut Nvidia performance by ~50%.
HDAO does hurt performance a lot in Sleeping Dogs, while HBAO+ doesn’t hurt so much in other games.
HDAO in Far Cry 3 specifically kills performance, at least on my Nvidia cards, but it looks way better than standard HBAO. In all other games I prefer HBAO or HBAO+. I assume HDAO is optimized for AMD, which would make sense since these are Gaming Evolved titles.
You assume; how about you actually know? Were there any restrictions stopping Nvidia from optimizing their drivers for Far Cry 3? The answer is NO.
And FC3 doesn’t use AMD-optimized effects, just DX11 effects. Generally, the goal of AMD’s Gaming Evolved program is to improve the performance and visual quality of the PC version, period; that is what a developer said, whereas Nvidia’s goal is to improve performance exclusively on their own hardware.
Great, here we go, more AMD fanboys.
Instead of going off on a tangent, how about you read properly next time. Nobody here is arguing about your beloved AMD, its optimized effects, or how its program is better, blah blah. BTW, you are wrong about how the Gaming Evolved and TWIMTBP programs work, but if you are only going to read one article and decide based on that, then fine, you’re clearly biased.
Um, not sure why you are running in here white knighting for no reason. There was no bashing of AMD or HDAO or game optimization for either card. I actually have no issues with AMD and I respect them as a company. I’m not an Nvidia fanboy, I just prefer them, end of story. I simply stated that Nvidia cards take more of a hit with HDAO than AMD cards did. At least initially that was the case, and that’s a fact you should read up on. But there was no shit talking or condescending attitude about it.
So stop being butthurt for no reason. For fuck’s sake, the amount of fanboy warriors on this site boggles my mind.
I base my opinions on information from people who know what they are talking about. A game developer knows how Gaming Evolved and TWIMTBP work better than some little nobody like Djinferno806.
Simple: you spread misinformation that has to be addressed. You are clearly Nvidia-biased; there is more than one post to prove that.
So what if Nvidia’s hardware takes more of a hit from a DX11 effect or setting? Far Cry 3 doesn’t use any AMD-specific effects, just DX11.
You are butthurt about TressFX and FC3 for no reason, it seems. Again, Nvidia has access to TressFX’s source code and they can optimize it as they see fit. They have no restrictions other than their hardware performing like crap in compute, but that is their fault.
“They have no restrictions other than their hardware performing like crap in compute”
What do you mean by this? Do you mean double precision performance, which has no impact on any game?
Yeah, right. Kepler GTX cards perform like crap in compute; that is why AMD is much better at mining.
Double precision is not the argument, as an HD 7970 is better at mining than a GTX Titan.
They don’t perform like crap; NVIDIA didn’t put as much compute power into their cards because they focus on CUDA. There is no game that uses AMD’s massive compute power anyway; it’s negated by NVIDIA’s raw power and multi-threading performance.
One could argue it’s like the ATI Pixel Shader 2.0 fiasco, but it’s not: NVIDIA doesn’t feel massive compute performance is needed, and besides, PhysX is used in more games, so CUDA is more important to them. If you want massive compute performance, that’s one of the reasons the Titan was made.
The subject of NVIDIA vs. AMD performance is more complex than you think.
They do perform like crap in compute; even the GTX 580 is better suited for compute than most Kepler cards, and GK110 is barely faster.
Honestly, I don’t care what NVIDIA feels is needed or not needed, or about the excuses you try to invent for them.
Then show me a game where AMD destroys NVIDIA in compute performance and show benchmarks. Synthetic benchmarks don’t count.
You generally see it in apps.
http://www.extremetech.com/computing/153467-amd-destroys-nvidia-bitcoin-mining/2
Nice article.
When games have some effects that make use of the compute capabilities of a card AMD will perform better in that area.
You also have the examples of Dirt Showdown and Grid 2.
So when AMD brings something that is better optimized for them, it’s proof that their performance is better, but when NVIDIA does the same, it’s proof that they are doing something wrong. Right? None of these games uses any miracle computing power from AMD GPUs. They are not using the FPUs (which is where the difference you are talking about comes from). This is a misunderstanding of what “compute” power is used for and how it benefits games. If it were as you think, we would see that kind of difference in every game.
AMD used an open standard; whose fault is it that Nvidia didn’t take the compute performance of their cards into consideration? Even so, nothing stops Nvidia from accessing the source code and optimizing the game.
Games have just started to use compute; in the future it will only become more popular.
“AMD used an open standard; whose fault is it that Nvidia didn’t take the compute performance of their cards into consideration?”
What is that open standard? DX11? Or what? I want to know if you know what you are talking about, because it doesn’t look like it.
“Even so, nothing stops Nvidia from accessing the source code and optimizing the game.”
Really? Or do you just believe that?
“Games have just started to use compute; in the future it will only become more popular.”
Can you tell us how games use this “compute power”? Even physics APIs don’t use the FPUs.
DirectCompute physics, man, DirectCompute. Any GPU can do it; some are just better at it than others.
“Really? Or do you just believe that?”
Well, didn’t Nvidia optimize TressFX for their cards in Tomb Raider?
“Can you tell us how games use this “compute power”? Even physics APIs don’t use the FPUs.”
Like explaining to a 10-year-old kid: you have TressFX; DiRT 3 employs DirectCompute for its high-definition ambient occlusion; the advanced depth of field (DOF) effect in Metro 2033 needs three rendering passes, two of which use pixel shaders while the third uses DirectCompute; and Civilization V uses DirectX 11 and DirectCompute for a variable-bit-rate texture codec.
http://www.tomshardware.com/reviews/directcompute-opencl-gpu-acceleration,3146-5.html
Satisfied???
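Since “uses DirectCompute” keeps coming up in this thread, here is a minimal, hypothetical sketch of what that actually means on the host side: compile a cs_5_0 kernel and dispatch it through D3D11. This is my own illustration, not code from any of the games or effects mentioned above; the kernel, buffer and variable names are made up for the example.

// Minimal DirectCompute sketch (illustration only, not from any shipping effect).
// Build on Windows against d3d11.lib and d3dcompiler.lib.
#include <d3d11.h>
#include <d3dcompiler.h>
#include <cstdio>
#include <cstring>

// Toy compute kernel: doubles every value in a buffer (a stand-in for an AO/DOF/hair pass).
static const char* kKernel = R"(
RWStructuredBuffer<float> data : register(u0);
[numthreads(64, 1, 1)]
void main(uint3 id : SV_DispatchThreadID) { data[id.x] *= 2.0f; }
)";

int main() {
    ID3D11Device* dev = nullptr;
    ID3D11DeviceContext* ctx = nullptr;
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION, &dev, nullptr, &ctx)))
        return 1;                      // any DX11-capable GPU (NVIDIA, AMD, Intel) will do

    // Compile the kernel with the cs_5_0 profile (DX11 compute shader).
    ID3DBlob* cso = nullptr; ID3DBlob* err = nullptr;
    if (FAILED(D3DCompile(kKernel, strlen(kKernel), nullptr, nullptr, nullptr,
                          "main", "cs_5_0", 0, 0, &cso, &err)))
        return 1;
    ID3D11ComputeShader* cs = nullptr;
    dev->CreateComputeShader(cso->GetBufferPointer(), cso->GetBufferSize(), nullptr, &cs);

    // A small structured buffer with an unordered access view so the kernel can write to it.
    float initial[64] = {};
    D3D11_BUFFER_DESC bd = {};
    bd.ByteWidth = sizeof(initial);
    bd.BindFlags = D3D11_BIND_UNORDERED_ACCESS;
    bd.MiscFlags = D3D11_RESOURCE_MISC_BUFFER_STRUCTURED;
    bd.StructureByteStride = sizeof(float);
    D3D11_SUBRESOURCE_DATA init = { initial, 0, 0 };
    ID3D11Buffer* buf = nullptr;
    dev->CreateBuffer(&bd, &init, &buf);

    D3D11_UNORDERED_ACCESS_VIEW_DESC uavd = {};
    uavd.Format = DXGI_FORMAT_UNKNOWN;
    uavd.ViewDimension = D3D11_UAV_DIMENSION_BUFFER;
    uavd.Buffer.NumElements = 64;
    ID3D11UnorderedAccessView* uav = nullptr;
    dev->CreateUnorderedAccessView(buf, &uavd, &uav);

    // Bind and dispatch: this is the whole "DirectCompute" step an effect performs per frame.
    ctx->CSSetShader(cs, nullptr, 0);
    ctx->CSSetUnorderedAccessViews(0, 1, &uav, nullptr);
    ctx->Dispatch(1, 1, 1);            // one group of 64 threads

    printf("Dispatched a DirectCompute pass.\n");
    return 0;
}

Whether a pass like this runs faster on one vendor’s GPU than another’s comes down to the architecture and the driver’s shader compiler, not to anything inherent in the API.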
It’s just your word against everyone else’s, plus AMD’s whining about GameWorks, which they’ve yet to prove. AMD denied DX12 existed, then brought out Mantle, then DX12 strangely came along with AMD supposedly not knowing about it. The R9 series re-branded 78xx/79xx cards; AMD are just as bad.
AMD made some fair points about GameWorks; if they are lying, why doesn’t Nvidia sue them for hurting their image?
Richard Huddy said he has the emails which prove what he is saying, but he won’t show them because it would hurt the developer’s image. If he is lying, why doesn’t Nvidia sue him?
AMD did not deny anything about DX12; you are a very confused guy.
Maybe NVIDIA has some emails on AMD too, and they also don’t want to hurt somebody’s image. 🙂 Richard Huddy could be lying. On the other hand, we can ask why AMD doesn’t sue NVIDIA if they have proof of this kind of behaviour.
AMD’s Roy Taylor, April 6th 2013:
“There will be no DirectX 12. That was it. As far as we know there are no plans for DirectX 12.”
Bitcoin means nothing, I’m talking about gaming. Tell us something new, though: AMD has MUCH better compute power than NVIDIA, what’s so f*cking new about that?
Bitcoin mining uses GPU compute, and why is it so hard to use Google?
http://www.tomshardware.com/reviews/directcompute-opencl-gpu-acceleration,3146.html
So what? Why do you have to state the obvious over and over again? AMD has superior compute performance; yes, we f*cking know that. However, it’s not the only factor.
And you think that using DirectCompute breaks NVIDIA’s legs, right? Have you ever programmed anything in your life? Should I be afraid of NVIDIA HairWorks because it uses DirectCompute? Should I buy an AMD GPU because of that?
Well, NVIDIA did optimise TressFX after release, but at launch the performance was poor, and even AMD users said TressFX was not worth the performance loss.
AMD-sponsored games must be using render paths optimised for the GCN architecture; it’s just that the game is probably not using NVIDIA-specific render paths for optimal performance on NVIDIA cards.
Well that is why Nvidia had access to the source code. So they don’t have excuses. TressFX was not a Black Box for them.
TressFX was pre-optimised out of the box for AMD GCN, don’t make it out to be some kind of standard that GPU makers should go and optimise for.
You really think NVIDIA can optimise for TressFX without the game code? No, because they didn’t get it prior to the game’s release.
They did get it at least 2 weeks before release and were able to put out a driver rather quickly. Nvidia was just making excuses for the poor performance on the release day. Nice to see their fanboys backing them up.
You can’t prove anything other than what you read in the media, which doesn’t prove anything; it’s just creative thinking on their part to get clicks.
Yeah yeah.
Nvidia got the code before the game was released. And didn’t they say it would be crazy to need access to a game’s code for driver optimization? So why did they complain so much?
Why is it an issue? Loads of games get driver optimised without all this FUD surrounding them.
For some problems you need access to the whole game code, genius; that was the point.
So AMD and NVIDIA have full access to all the libraries that are used in games? They have full access to all the source code? Right?
Because, just like TressFX, Forward+ uses compute to render the lighting, but more specifically it is optimized for AMD’s compute architecture, just like TressFX. Hopefully AMD can fix this to run somewhat better on Nvidia, like they did with TressFX (kind of).
Nvidia has access to the source code so they can fix it by themselves.
Also, Nvidia is just weaker at compute; that is why they have lower performance. Anyway, Nvidia was able to fix Tomb Raider and Showdown as much and however they liked, improving the performance considerably. Can AMD do the same with Nvidia games?
They fixed it, so what’s your problem then? Who spread the FUD about AMD having p*ss-poor performance in Watch Dogs? Websites have shown that’s not the case, and in some instances AMD cards beat NVIDIA cards on ultra.
I think AMD cards have better value for money but as a choice I choose NVIDIA, like I choose AMD for my CPU, so stop whining.
He made it sound like Nvidia can’t fix the problems they have in TressFX and AMD has to do it for them. That is false.
It’s stupid that I have to explain such simple things.
http://gamegpu.ru/images/remote/http–www.gamegpu.ru-images-stories-Test_GPU-Action-Watch_Dogs-test_new-1920_msaa.jpg
AMD does have poor performance in WD. That is with the last update.
Also nobody is whining.
Actually, some compute tasks are a few times slower on NVidia; AMD is not responsible for that. Kepler is just not a generation focused on computation but rather on different tasks.
“Actually, some compute tasks are a few times slower on NVidia; AMD is not responsible for that”
Yet Nvidia using tessellation, a part of DirectX, is an attack on AMD because their cards can’t handle it... huh? Is that how it is?
Nvidia is horrible for having closed and proprietary tech like CUDA and PhysX, yet PhysX is open and they have licensed it to just about everybody and offered it to AMD.
AMD’s Mantle is closed and nobody can see it.
AMD is the scum of the earth. Bottom of the barrel trash.
Not really. GI from AMD uses OpenCL, and AMD is simply better optimized for OpenCL; there is no evidence the OpenCL code favours AMD! And anyone can actually check it out.
._. I have a GTX 660 3GB GDDR5 and it is the best card I’ve ever had so far.
Because NVidia’s OpenCL compute performance sucks, not just in this game, everywhere! But it is their problem; OpenCL is an open standard, they can optimize for it or build their whole HW around it as they wish. They have a choice!!!!! With GameWorks there is none.
“Please explain how AMD’s global illumination and Forward+ look the same whether on or off, yet when these features are on they cut Nvidia performance by ~50%.”
“Because NVidia’s OpenCL compute performance sucks”
Forward+ and AMD’s global illumination don’t use OpenCL, moron.
Second, AMD doesn’t fully support OpenCL.
Third, Nvidia has good OpenCL performance, although OpenCL IS slower than CUDA.
http://compubench.com/result.jsp?benchmark=compu20&data-source=1&version=all&test=572&text-filter=&order=median&os-Android_rs=true&os-iOS_cl=true&os-Windows_cl=true&pu-CPU=true&pu-dGPU=true&pu-iGPU=true&pu-mGPU=true&pu-CPU-iGPU=true&pu-ACC=true&arch-ARM=true&arch-x86=true&base=device
The difference was pretty insane imo. It’s not just the detail level on individual things, it’s the scale and the amount of stuff that is out and visible at once.
afaik the Nvidia Gameworks stuff should work just fine on AMD hardware. The main issue with stuff like this is how it keeps the other card company out of the loop until release, leading to shoddy drivers etc. Same thing happened with Tomb Raider, which was awful on Nvidia at first because Squeenix was working with AMD.
Ya, most GameWorks features do work fine on AMD hardware with a minimal performance hit, especially after AMD has had some time to release a driver to optimize the game. Obviously, things like TXAA and hardware PhysX effects are locked to Nvidia.
Ya, all the AMD fanboys outraged and screaming bloody murder over Forbes’ and ExtremeTech’s retarded and uninformed articles about how GameWorks is bad for gaming need to do their research. They seem to think GameWorks and TWIMTBP are the same program.
Other than ad homs, do you have any articles or evidence showing that GameWorks is not evil?
We have real games where we can see that there is no problem with GameWorks. For now. The only “evidence” from ExtremeTech is that the GameWorks source code is closed. And Forbes has only one test from one game, which is completely outdated (made with an old driver; the new one works as expected). According to Forbes, Watch Dogs is proof of the bad influence of GameWorks. Please tell me how HBAO+ or TXAA influence performance on AMD HW. Are there any tests which can show us something like this? Then why should we believe that there is something wrong? Because somebody told us there is?
Yeah, good examples: Assassin’s Creed 3 and Black Flag, the Batman games, CoD, Borderlands; such a coincidence that AMD’s performance is considerably lower in all these Nvidia games, more than usual anyway.
Anyway, there is an ExtremeTech interview with a developer who works at Epic Games (Unreal Engine 3 and 4), who knows Nvidia and GameWorks, and he did say that GameWorks can have a negative impact on AMD hardware.
A negative impact, yes (but for now not a big one). But not on purpose, only because it is optimized for NVIDIA HW. I believe that’s true and logical. Developers can optimize the GameWorks source code if they choose to. But it doesn’t mean games are unplayable or play worse.
I know all the games you mentioned, and none of these examples gives us proof that there is a difference because of GameWorks (in the days of Borderlands there was nothing like that, and the GPU PhysX used by that game is different after 2 years, so it is not a good example at all). I want to see a test which gives me answers to two questions. Does a game using GameWorks run worse on AMD GPUs with all GameWorks features turned off? And when I turn these features on, how much impact do they have on AMD and NVIDIA HW? Just naming all the games using GameWorks is not enough for me. This deserves detailed analysis before this kind of judgment.
I have a bad feeling about another thing. Any time a game using NVIDIA “materials” comes out, there is always somebody who blames them for something bad (without proof, only empty words which can be applied to anything). Isn’t it some kind of hidden AMD marketing, an attack on everything NVIDIA has and AMD hasn’t? As you can see, these kinds of theories can be written about anything you want.
So it is a coincidence that AMD underperforms in all Nvidia games, while Nvidia doesn’t have problems with AMD games (yes, we know about Tomb Raider, but you can’t say it’s a problem now).
Because Nvidia has more eye candy. With AMD those features are turned off; that explains the better performance and the blank, static game world.
It doesn’t look like they have more eye candy. Nvidia games don’t look better than AMD games.
“Nvidia games don’t look better than AMD games”
Well, sometimes they do. In most cases the use of GPU PhysX significantly improves the game’s atmosphere. For me, anyway; somebody else can see it differently.
My question is: why are you not happy that you can use HBAO+, or that you will be able to use HairWorks, on your AMD GPU? Is NVIDIA really that bad? 🙂
That is your problem; I don’t even have an AMD GPU, I currently use a GTX 660.
And yes, in a lot of ways Nvidia is bad.
The quality of a game is influenced by a lot of things, and some extra colors and smoke won’t make a difference if the game’s art design is not that great. And AMD games do have physics; it’s not like you specifically need “PhysX” to have physics in a game.
After everything you wrote here I can hardly believe that you have an NVIDIA GPU. In your opinion AMD is much better, so why would you buy a GPU from their competitor? 🙂
So because I have a GTX card I should root for Nvidia? Nice logic.
I don’t stand by Nvidia’s practices, and you clearly have a problem with that.
“So because I have a GTX card I should root for Nvidia?”
I didn’t say that. But after all, you think that AMD’s approach is better and their GPUs are better (and cheaper in general), and yet you wrote that you bought a GPU from NVIDIA. Well, if I thought about AMD and NVIDIA the way you do, I would buy a GPU from AMD because it would be the better choice for me.
I know why I’m buying NVIDIA GPUs. I like their approach. I like how they work and how they bring us the new things they develop. On the other side, AMD always just announces something and lets others do the whole work. And then they say, “here we are and it’s free” (years later, while others could already enjoy “the same” thing from NVIDIA). Tell me then, with this opinion, why should I buy a GPU from AMD?
Why don’t you buy a PS4 (if you like closed solutions)? Do you realize that NVidia never succeeds with those proprietary solutions, and if they do, is that good for the PC platform? Shouldn’t the PC be about different HW and standard APIs where that HW openly competes?
The problem is that PhysX is unoptimized on the CPU in v2.8, while in v3.3+ it runs on the CPU very much the same as on the GPU (plus unified FLEX in v3.4). Shame that most games with HW PhysX use 2.8, and therefore FPS goes down regardless of what graphics card you have. I think there is only one game able to hold a stable 60 FPS. And that is not enough^^.
You mean TXAA, because that is the only NVidia-only feature in that game, and since it blurs the image like hell I do not think anyone misses it. You didn’t get the problem. The problem is that all GameWorks effects (apart from TXAA) are not HW dependent, even the new PhysX FLEX (CUDA vs. DirectCompute version); I wonder how much slower the DirectCompute path gets.
And since GameWorks is a black box distributed in DLLs, with a contract that takes away the developers’ option to change anything, every effect can potentially harm AMD HW and AMD cannot do anything about it; they cannot optimize their drivers. Hope I made that clear. No one gives a fruck about NVidia’s eye candy here^^, only about how well it runs. NVidia is certainly not the only company with AO, DOF and other effects. AMD has most of those too, in open source (HDAO, Contact Hardening Shadows, Tiled Lighting, TressFX, Global Illumination, etc.) in the Radeon SDK, and it runs everywhere because it mostly uses DirectCompute or OpenCL, and there has never been a case where the final version of such an effect hit NVidia’s GPUs harder than AMD’s.
It’s because such NVIDIA games have optimised render paths; this is nothing new at all. Also, you’re wrong about that: Watch Dogs actually performs really well on AMD cards, despite the FUD people were spreading, and because AMD cards come with more VRAM, the R9 290X holds up better than the 780 Ti, with only a slightly lower average, for a cheaper card.
Metro and Last Light favour AMD, and they’re Nvidia-backed games. Hitman Absolution favours AMD, and it’s an AMD-backed game.
How big is that “underperformance”? As I see it, when games are released there are many problems with all of them (it doesn’t matter whether they’re backed by AMD or NVIDIA). After some time (and some new drivers) every game performs the way we would expect for everybody. Look at Watch Dogs, for example. But a certain sort of people will always remember that this game performed worse on their AMD GPUs because of NVIDIA, and it won’t matter whether that is true or not. Isn’t that a marketing picture created in the background by AMD, to make everything from NVIDIA look bad? Are these theories helpful? I don’t think so.
Yeah, yeah. AC4 still performs badly on AMD after they updated the drivers as much as they could, and in Batman you still have that huge gap between the two vendors’ GPUs.
AMD did approach the developer of Batman to suggest optimizations for their GPUs, but they were refused. Is that OK with you?
Can you tell us how big the difference is between NV and AMD GPUs in AC IV and Batman (which one)?
And no, it’s not good if developers refuse AMD’s optimizations. But was it really that simple? I don’t think so. In many “NVIDIA games” AMD didn’t have any problems with cooperation and performance. Now you take one or two examples and make them the standard. I wouldn’t believe everything AMD says. Watch Dogs is a good example of their excuses.
Depends on what card you’re playing on. AMD had to massively increase tessellation performance in the HD 7000 series because of that, so pre-HD 7000 cards take a much bigger hit. The problem is that tessellation units occupy die space that could be used for the compute performance improvements AMD focuses on. You can use GPGPU for almost anything, but tessellation not so much. Better examples would be COD: Ghosts, Batman, The Secret World and many others.
Don’t worry, AC4 sucks everywhere; no problem with NVidia features, no use of tessellation there.
About 70%+ on similar HW. But they are not talking about everything from NVidia; they talk about very specific features and code. You just have to listen. But we can always say those are haters for whatever reason, and generalize.
So you mean optimization for NVidia means using tessellated objects with polygons so small you cannot even see them? C’mon, seriously? Or, in Crysis 2’s case, water that is constantly under the map; no one can see it, but it is massively tessellated. There are objects with no LOD, so they are again over-tessellated even when they are on the other side of the map. And in basically every NVidia-supported game there is no tessellation quality setting, you can only turn it on or off, and when AMD added a tessellation quality setting in Catalyst, NVidia stopped using LOD completely to make sure it would not work.
I also believe it is logical to buy a more expensive graphics card and then have its manufacturer slow down its performance with an unoptimized game so you have to buy SLI. It is definitely logical; I just have a different kind of logic.
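To make the over-tessellation/no-LOD point above concrete, here is a small, self-contained sketch (my own illustration, not code from any game, driver or from GameWorks) of a distance-based tessellation factor. With a falloff like this, distant or invisible geometry gets little or no subdivision; without it, every patch is subdivided at the maximum factor regardless of distance, and a driver-side slider like the one in Catalyst is essentially just a clamp on the same number.

#include <algorithm>
#include <cstdio>

// Pick a tessellation factor for a patch: high when the camera is close,
// falling toward 1.0 (no subdivision) as the patch gets farther away.
// maxFactor is capped at 64, the DX11 maximum tessellation factor.
float TessFactorForPatch(float distanceToCamera, float nearDist, float farDist, float maxFactor) {
    maxFactor = std::min(maxFactor, 64.0f);
    // t is 1.0 at nearDist or closer and 0.0 at farDist or beyond.
    float t = 1.0f - (distanceToCamera - nearDist) / (farDist - nearDist);
    t = std::min(std::max(t, 0.0f), 1.0f);
    // Without this curve every patch would simply get maxFactor, which is the
    // "over-tessellation with no LOD" complaint made above.
    return 1.0f + t * (maxFactor - 1.0f);
}

int main() {
    const float distances[] = { 1.0f, 10.0f, 50.0f, 200.0f };
    for (float d : distances)
        printf("distance %6.1f -> tessellation factor %5.1f\n",
               d, TessFactorForPatch(d, 2.0f, 100.0f, 64.0f));
    return 0;
}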
“such a coincidence that AMD’s performance is considerably lower in all these Nvidia games”
This just in: the pot calls the kettle black…. Read all about it.
Nvidia performance is lower in all AMD games: Shogun 2, Sleeping Dogs, Tomb Raider, etc., all of them.
AMD has a lot of nerve with their hypocrisy and demagoguery.
Nvidia more than doubled their cards’ frame rates in Tomb Raider with driver updates. That goes beyond AMD not optimizing; they put shit code in the game and swapped in good code in the driver.
AMD has been sabotaging Nvidia for years. Nvidia is not sabotaging AMD by supporting AMD’s tessellation. It’s part of DirectX.
AMD should make a product that can support their own tech.
Well TXAA is a massive pile of dung so that’s not an issue. It’s interesting however to see you adamantly defend it.
“In Nvidia’s GameWorks program, though, all the libraries are closed. You can see the files in games like Arkham City or Assassin’s Creed IV — the file names start with the GFSDK prefix. However, developers can’t see into those libraries to analyze or optimize the shader code. Since developers can’t see into the libraries, AMD can’t see into them either — and that makes it nearly impossible to optimize driver code.”
Do you dispute that?
I’m not defending TXAA; I personally never use it because of the image blur. One of the most cited examples of “bad” GameWorks is Watch Dogs, so I only wanted to know how TXAA or HBAO+ can influence performance on AMD GPUs. If somebody says that Watch Dogs has bad performance on AMD because of GameWorks, he lies.
For the games you mentioned, what is the performance impact on AMD GPUs? How many percent? Are they unplayable? I understand why you don’t like it; I have a different opinion on this.
Does Watch Dogs have bad performance on AMD because of GameWorks?
Nobody can say!!! Nobody can check the code^^
According to the new contract some developers can finally see the code, but they still cannot change it or share it. So, still the same shit. Does anyone realize that NVidia publicly supports DX12 because it is much closer to the HW and is supposed to stop being a black box, while at the same time offering us their black-box middleware called GameWorks? I like the name, hate the rest.
R. Huddy stated that they have never made a contract that would deny anyone access to their code. NVidia never said such a thing, which means this is crystal clear, and hopefully people like the PC platform open and standardized, so they will fight back, but one never knows.
“Other than ad homs, do you have any articles or evidence showing that GameWorks is not evil?”
AMD is a demagogue. They blatantly lie to stir up a mob of the uninformed and ignorant.
AMD claims: Crysis was rendering tessellated water under the ground to intentionally harm AMD performance.
Lie. CryEngine culls occluded polygons, and it certainly doesn’t tessellate culled polygons. Am I to believe that nobody at AMD understands how game engines work?
AMD claims Batman’s cape uses high levels of tessellation to harm their performance.
Lie. The cape has a high poly count to prevent it from clipping through the character’s mesh. See, in PhysX, cloth can collide with a body mesh; none of those deflection plates that have things floating on the plate or clipping, like TressFX.
AMD claims GameWorks is a black box, that devs have no control over tessellation, and that devs can’t get the code.
Lie. Devs can get the code, and GameWorks is a full software package with tools for artists that let them adjust every effect. You can even see a tool in the promo videos. Cloth physics is also defined by the artist and painted on. Want to see how easy it is?
AMD is pushing an expensive process where everything needs to be hand-coded. The artist needs to wait for the programmer to come down and explain to them what they need to code up. Do you have any idea how much wasted time that is?
That’s why AMD needs to bribe devs hand over fist to use Gaming Evolved. It’s expensive in terms of development time and lacking in innovation.
The things they claim Nvidia is doing, are in fact things they do. That’s called hypocrisy.
Everything new and innovative in computer graphics is in GameWorks. Every “next gen” demo was using it.
What does AMD have other than hollow promises, lies, demagoguery and crocodile tears?
Nvidia offered PhysX to AMD and never got a call. AMD then attacked it. That’s why you don’t have it. Nvidia even supported NGOHQ’s attempt to make PhysX drivers for AMD cards. AMD blocked that; they blocked you from getting PhysX. Nvidia can’t make AMD drivers directly, and they aren’t interested in spending the QC money doing AMD’s job for them.
Almost everybody else licensed PhysX. Why not AMD?
Pride, because they lost the bidding war?
In other news AMD hopes to move beyond GDDR3 for their graphics cards. They are looking at GDDR4.
https://www.youtube.com/watch?v=8uoD8YKwtww#t=69m30s
Stop listening to the PR clowns. Stop hanging on every word of liars.
How about you say something that is true, or would that cause you to melt?
Yes, most games do, but imagine that AMD weren’t as open as it is and actually did the same thing as NVidia; Intel could do the same thing too. Then we would have 3 companies with their own proprietary solutions that would not work, or would work very badly, on other vendors’ HW. So ideally you would need 3 PCs to get the most out of all PC games. They are literally trying to turn the PC into a console. Thank god only developers that have a contract, and that NVidia pays, use it and no one else does. This is the last thing the PC platform needs atm. A confident company would not have to do that, because they would know they have better HW and can program better drivers.
HairWorks massively uses line tessellation, and it is a black box, so the developer probably cannot change that setting. So it runs on AMD, but twice as slow, and it doesn’t run great on NVidia anyway. Overuse of tessellation is common and NVidia has done it for years (since DX11 came out, often without LOD, so drivers cannot change it either; in Crysis, water that was never visible was tessellated, and basically every NVidia-supported game that used tessellation has some form of this). I wonder why anyone buys more expensive NVidia graphics cards if NVidia purposely degrades the performance of their own cards, but anyway.
With Tomb Raider it was very different: TressFX was brand-new tech in a beta state, and immediately after TressFX was final it was downloadable from AMD’s webpage (open source code + samples, now with integration in some engines too), so NVidia could very well optimize their driver. But you are right, there was about a month’s gap. But TressFX back then was months-old tech; NVidia HairWorks has been under development much longer now, purposely slowing down HW, unlike TressFX which runs the same on any HW. HairWorks runs like shit on AMD due to tessellation overuse. 🙂
Nice FUD.
“HairWorks massively uses line tessellation”
That’s how you do hair and fur.
“it is a black box, so the developer probably cannot change that setting.”
Yes they can. It comes with all the needed tools; the artist moves a slider. Too hard? Devs have the source code.
COD was optimized for AMD by lowering tessellation and blurring it. It looked like shit, though.
Seeing as AMD/ATI invented tessellation, it’s clearly a plot by Nvidia. Did they use psychic powers to implant the idea into AMD’s heads so they could later harm them by using it? Perhaps you can explain this Nvidia plot to use DirectX 10 and 11.
“in Crysis, water that was never visible was tessellated”
Lie. CryEngine culls occluded polygons, and doesn’t tessellate culled polys.
With Tomb Raider, Nvidia got the code late and it was sabotaged; AMD inserted a fast shader at runtime.
TressFX is a piss-poor imitation of 2008 Nvidia tech. It’s a spring with gravity and a deflection plate and no LOD.
Nvidia didn’t need the source code to optimize their drivers. They just needed to write a better shader and substitute it in, like AMD did.
Yep, it’s all part of Nvidia’s dastardly plot to support tech AMD invents.
AMD’s “gaming scientist” says AMD hopes to move beyond GDDR3, even GDDR4… lol
It’s the blind leading the blind over at AMD. They should drop the demagoguery and try competing.
Cool effects, especially on the wolf from The Witcher 3. Also, I hate that there are so many games where you kill wolves. Like, what the hell? They’re majestic animals; why can’t we have a game where they’re the heroes, or where they’re celebrated in their presence and full glory instead? When I make a game, wolves will be friends, not the enemy.
Well, you can summon a wolf (familiar) to help you in Skyrim. 🙂 In The Witcher, they are a wild animal that attacks you, ergo puts you in danger. Of course you kill them.
Yeah that’s about one of the few examples where wolves are actually useful instead of an obstacle. and yeah obviously you kill them if they get in your way, but I still don’t see why devs always have to portray them as beasts or as demonic/dark creatures a la Twilight Princess, I mean seriously. I seriously will make a game where wolves are embraced instead of fully feared and people will love it with passion.
Well, except from some werewolf love here and there, they are perceived as a dangerous predator, just like a bear and the sort, especially in a RPG medieval game.
GoT has them as pets, but that’s about it from what I can remember now. Usually it’s a wild predator that you’ll meet in that environment, ergo portrayed as such.
Yeah, I suppose. I’ve actually never watched GoT, though I’ve heard it gets intense, especially with the sex scenes. But yeah, even as pets, that’s a step forward. Or something like in Far Cry 4, where you won’t be rewarded for killing elephants; something along those lines would help, to say the least. It’s still not fair that it gets put in the spotlight like that; it’s like how gender and sexual orientation need more representation in games, and what we have now is only the start of such a movement in the right direction.
They are “bad guys”, could be any animal, human or alien. Does it really matter? 🙂
Wolves are my favorite animals so yes it does matter. They are underutilized and underrepresented in the industry. I’m sure you can understand that.
Yup, agree with you. However, in games, we talk about wild, vicious creatures. It’s all about concepts, representations and so forth. Don’t play those games. 😀
Yeah that’s the way it’s always been; looks like that’s never gonna change. I won’t play games where I’m pitted against wolves. If anything I’ll do my best to dodge any encounters in The Witcher 3 and only do it if it’s part of a main story quest or get a weapon or move I will need to proceed in the game 😉
Don’t worry, Geralt himself is a white wolf 😀
Yup, that’s as good as it gets 😉
I wasn’t kidding. At the end of The Witcher 2, Geralt has to choose between killing a dragon or saving her. He himself doesn’t want to kill her (the player can do it, but it’s a moral choice) because he thinks dragons are beautiful and she was the last one. It’s not Far Cry or Assassin’s Creed, where you just kill animals for nothing, maybe some upgrades but nothing more, killing whales for achievements.
I know, I’d only read that he got the name because of his past, so I thought you were referring to that; I didn’t know all those other details, since I didn’t play The Witcher 2, nor do I plan to, or even the first game, since CD Projekt RED said they’ll give a recap of all that for The Witcher 3. But hey, thanks for that cool info. I can’t wait for The Witcher 3, even though it will be the first and last one I ever play, given it’s the finale of the Witcher saga.
“Well, except from some werewolf love here and there, they are perceived as a dangerous predator, just like a bear and the sort, especially in a RPG medieval game.”
That’s another thing: why a werewolf over a pure wolf? Pure wolves are much more interesting and graceful. I want an RPG where wolves are the saviors of humanity, or at least fight for the common good, or have a place in the world beyond being prey and predators.
What a shame, nobody said Okami. ‘-‘
(okay console exclusive please don’t hurt meeeeeeee).
Oh my god I completely forgot about that game. I never played it actually so it didn’t cross my mind.
It’s awesome, just pure awesome, and assuming that you are a Zelda fan (highest of fives 😉 ) then I just say: F’IN PLAY IT. You won’t regret it.
Cool. Another game to add to my list of stuff I still have to play, which is a huge list hahaha. Of course I’m a huge Zelda fan; I wouldn’t have this profile pic if I wasn’t 😉 and arguably my dad is a bigger Zelda fan than me, since he’s played the series from its inception except for the handheld ones.
Your dad is awesome 😀
And of course there’s Zelda TP, although I never quite enjoyed the “Wolf Link” (AND THOSE BLOODY BUG HUNT QUESTS).
Yeah, I have him to thank for me being a hardcore gamer, though there are other things I’m not grateful to him for. Anyway, he saw the trailer for the next Zelda and is pumped for it, and the same for Hyrule Warriors, both of which I am looking forward to myself. I personally hated TP; it is the least fun and interesting of all the Zelda games, even if being a wolf, going to the Twilight Realm, and weapons like the Spinner were cool when it first came out. And yeah, I didn’t like the bug hunting. It was also where Nintendo fell below its own and the game’s ambition, going below Nintendo standards, honestly, no matter how much so-called true fans say it was the only real Zelda since OoT and MM. I believe the next Zelda will finally remedy that.
Dat trailer, can’t put my emotions in words *o*.
But I have to control my hype. I did the same thing with both Zelda TP and SS (haven’t played it yet, but from what I saw, the lack of “openness”, per se, of SS made me want to tear my hair out), so I have to remain sceptical for my own good xD.
Right there with you, my friend. I didn’t know about TP until my dad bought it. And SS is very good. Yes, it lacks openness in terms of the connection between the different regions, at least from the sky, but within them there’s lots of openness, the story is actually good, and the gameplay with Remote Plus more than makes up for it. Worth playing to the end 😉
Goodie, have to play SS now 😀 , and you, find yourself a way to play Okami or else…
xD
That I shall 😉
And there we have the death of the joke called TressFX, as everything nVidia does is on a different level than AMD!
There is only one and that’s nVidia for sure; GameWorks just kills any pathetic crap those clowns at AMD could ever put out!
ATI, now AMD, has always been second-class unstable junk, and that won’t change no matter how much you retarded AMD fans say otherwise!
nVidia is superior in every way, and I am not some stupid fanboy, of course, I just speak the truth, as I have used ATI cards before but will gladly pay more, as you get a whole different level with nVidia hardware; that’s just a fact!
What an idiot.
But seriously, what are you, like 12?
And you’re not some stupid fanboy huh? I guess you’d be perfectly happy in a world where NVIDIA is the only GPU manufacturer and charges you whatever they feel like charging for their hardware (definitely not cheap).
When you see someone using the words “retard” and “fag” in their comments you pretty much know they are morons. I think this kid should have his PC taken away from him, he’d fit in a lot better on Xbox Live.
TressFX is open and can be used and optimized by anybody; that simply makes it better.
GameWorks is just a library of DX11 effects that Nvidia sells to lazy game developers.
As I read before, game developers can make changes to the GameWorks source code (according to the license) and they can optimize it as they want. They just can’t share the source code with NVIDIA’s competitors.
GameWorks is not about lazy developers. It’s about saving money and time. They do a lot of work even without using GameWorks; you don’t have to worry about that. The question is: what is better, a game without these additional effects or a game with them? Because in reality, developers who don’t use GameWorks usually don’t implement these effects at all. So who is more “lazy”: somebody who spares the time to implement support for additional libraries and gives us something interesting, or somebody who doesn’t do anything in this direction?
Yeah, yeah, in certain situations developers can optimize the source code for whoever they like; not like they ever did this, or like they didn’t refuse AMD when they approached them with suggestions. It’s nice to live in denial.
All GameWorks effects can be implemented through DX11. Until now, GameWorks has been used mostly in poor PC ports that underperform even on Nvidia hardware; yeah, nice stuff. I hope it won’t screw up the performance in The Witcher 3. That would be bad.
“All GameWorks effects can be implemented through DX11”
Yes, of course. GameWorks is proof that it can be done. But why doesn’t somebody else do it? It’s not as simple as you think. Before GameWorks existed, what was your excuse for poor performance in many games?
Are you blind or something? Most games don’t use GameWorks, and you make it sound like every game uses it.
When they first announced Unreal Engine 4 they said GameWorks was integrated into the engine, and now Epic says it’s optional for those who want it. Yeah, developers have started to see what GameWorks really is.
I don’t have excuses. If a game performed badly on a GPU solution, it was because the vendor was not able to optimize their drivers properly or the game developer was not able to deliver an optimized game.
“If a game performed badly on a GPU solution, it was because the vendor was not able to optimize their drivers properly or the game developer was not able to deliver an optimized game.”
Well, so why do you think that NVIDIA is responsible for the performance of every game which uses GameWorks? Now it’s not AMD’s (as a vendor) or the developer’s responsibility that it doesn’t work as well as it could? In Watch Dogs, a single driver from AMD released on day 1 returned things to normal.
“Yeah, developers have started to see what GameWorks really is.”
No, that’s only what you think. APEX is integrated into UE4, but it doesn’t mean that developers have to use it. They have choices (for everything); this is what Epic said about it. For example, do you think TressFX is integrated into CryEngine? That developers who use CryEngine don’t have the choice to use something else? It’s the same thing, and it doesn’t mean that “developers started to see what TressFX really is”. Have you ever seen a game engine? Do you know how one works? Because it looks like you have no experience with this.
I don’t think it is Nvidia’s responsibility, I know it is. If AMD had access to the code and were able to optimize the game, OK, it would be their fault, but if they can’t do that because of the GameWorks licences, then it’s Nvidia’s fault. A very simple concept.
Not everybody wants to give one GPU vendor an unfair advantage, and that is basically what GameWorks does.
AMD could optimize games with GameWorks without any problems. They have no access to the GameWorks source code, but developers who use it can do it instead of them. And there is no problem with the game’s own source code; nobody restricts AMD from it. So if Watch Dogs wasn’t a problem for AMD, why should other games be?
And what is an unfair advantage? That you are using something you developed on your own? That’s unfair?
First, not all developers can do it, and only in certain circumstances; even then AMD still won’t be able to do it, and that is what Nvidia’s PR said. What if Ubisoft doesn’t know how to properly optimize the game for AMD GPUs and AMD is not allowed to help them?
The idea is simple: if AMD has a problem with a GameWorks effect, they can’t fix it even if it’s fixable.
And WD is not OK.
http://gamegpu.ru/images/remote/http–www.gamegpu.ru-images-stories-Test_GPU-Action-Watch_Dogs-test_new-1920_msaa.jpg
That is with the last driver update from AMD.
Well, it is unfair for a gamer who just wants to enjoy a game and happens to have an AMD GPU which doesn’t perform how it should.
And, for example, TressFX is not locked out for Nvidia. Would it be OK with you if the performance of Tomb Raider had stayed the same with all the settings enabled?
TressFX and HairWorks are still unstable; they eat a lot of frame rate.
But if they find a way to include all those effects with a small hit, like 6-8 FPS, then it’s good; if not, then nobody will ask for this.
There will be a hit in …FX (cuz it’s not only Tress, there is also Grass, birds etc.); some say in 2.1 the hit will be a MAX of 2 FPS 😀
I don’t think the max hit in 2.1 is only 2 FPS; if so, AMD did a very good job optimizing TressFX.
What does it mean that they are unstable? Eating a lot of frame rate doesn’t mean that. And how do you know something like that when there isn’t any game which uses HairWorks yet? TressFX is not unstable either (it just doesn’t look good).
By unstable I mean the frame rate is all over the place, 70-28-35 (Tomb Raider).
In Call of Duty you will get massive hits in FPS.
No, it doesn’t look good, cuz it uses a particle system to simulate hair o_0 utopia lol, and it’s hardware hungry as hell. TressFX 2.1 is better cuz the hair is more natural there. And it’s free.
Your comment is ill-informed and doesn’t make sense. TressFX is free? What are you talking about? That is all speculation from you. Until 2.1 is actually used in a game, there is no validity to your statement. It just sounds like an AMD fanboy comment.
Yes, TressFX is free. You have access to the source code and can optimize it how you see fit.
Where did you see that it uses a particle system? How do you know that TressFX 2.1 is better? And what does it mean that TressFX is free? Because it’s not vendor-locked? HairWorks is not vendor-locked either. It is implemented using DirectCompute, so everything you need is a DX11-capable GPU (NVIDIA, AMD, Intel). BTW, developers can optimise the GameWorks libraries for AMD or Intel GPUs (if they have a license with source code). They only can’t share the source code with NVIDIA’s competitors.
That wolf’s pelt looks too perfect though, as if somebody just got done shampooing and adding conditioner to its hair. It’s too flowy and thin looking. Fur is thicker and moves much less.
Wanting to show off the new technology got in the way of the striving for realism.
Playing an FPS game just to see the dog’s hair. Superb.
Quoted from HardOCP:
“Performance: In terms of performance we were surprised how close the R9 290X and GTX 780 Ti are. There has been a lot of FUD around the internet about AMD potentially lacking in performance compared to NVIDIA. We hope we have smashed the rumors and provided facts based on gameplay and not some quick-use benchmark tool that will many times tell you little. We actually found the Radeon R9 290X slightly faster in some scenarios compared to the GeForce GTX 780 Ti. We also found out that gameplay consistency was a lot better on Radeon R9 290X with “Ultra” textures enabled thanks to its 4GB of VRAM.”
Quoted from Extremetech:
“Right now, there are three things we can say about Watch Dogs: Its multi-GPU scaling is poor, benchmark results are erratic, and it uses colossal amounts of VRAM above 1080p — which means comparisons above this point tend to favor AMD to a progressively greater degree as the resolution increases.”
Quoted from Cem Cebenoyan – NVIDIA
“I’ve heard that before from AMD and it’s a little mysterious to me. We don’t and we never have restricted anyone from getting access as part of our agreements. Not with Watch Dogs and not with any other titles,”
First, the HardOCP test is strange and contradicts almost every user test.
As for ExtremeTech, you simply quoted the guy’s opinion.
And NV PR, yes, that is proof enough for any Nvidia fanboy, especially when the guy contradicts himself later on.
You’ve posted nothing to back up what you’re saying; benchmarks are to be taken with a pinch of salt, and you can’t validate “user” benchmarks.
The benchmarks you posted hold no more validity than HardOCP’s or ExtremeTech’s.
User testing (real world) is more accurate to me. Not all tests show “good performance” from AMD GPUs in Watch Dogs; Guru3D showed the same thing.
Benchmarks are different on every system anyway, but the simple fact is Watch Dogs doesn’t play well on any system, as expected, never mind the whining AMD users on top of that.
Watch Dogs doesn’t run well on a Titan on ultra so stop your whining.
And there are other tests which are different, so everybody can find what he wants.
Everything you said in this discussion is fanboy talk. You gave us nothing as proof.
Well, I am talking with other fanboys after all.
What do you think NVIDIA uses for their features? Do you think they don’t use DirectCompute? Using DirectCompute doesn’t mean using all of this theoretical compute power. Are you talking about single or double precision performance? Because I think you are talking about double precision, which is not used in games.
BTW, the NVIDIA HairWorks shown in the video in this article uses DirectCompute too. So maybe in The Witcher 3 we can see how far ahead of NVIDIA AMD really is. 🙂
If you can’t concentrate and answer what I wrote, you should just shut up. I answered your questions, and now you come with other questions that aren’t directly connected to what I wrote.
If AMD has access to HairWorks for optimizations, it should perform better on their hardware, as simple as that.
“If AMD has access to HairWorks for optimizations, it should perform better on their hardware, as simple as that.”
And you’re saying that on what basis? It’s not as simple as that; it’s only your simplistic view of a much larger problem. If you think that merely using DirectCompute creates big performance differences between GPUs because of “compute power”, you are completely wrong. It’s only your simple imagination, based on zero experience in SW development.
Again: Watch Dogs used only HBAO+ and TXAA from the whole GameWorks library. How could these 2 features negatively influence AMD GPUs? TXAA is even exclusive to NVIDIA GPUs, so for AMD it’s irrelevant from a performance point of view. What you are saying is not logical. You are just constantly finding issues and complaints about NVIDIA without thinking about what you read.
You said “So if Watch Dogs wasn’t a problem for AMD” and I gave you the image.
I haven’t even mentioned WD; you keep bringing it up.
Warframe’s Kubrow needs this implemented.
Were any animals harmed in the making of this video?
My being a hysterical lesbian feminazi with Origin interests and all that….