Yesterday, we informed you about the exclusive PC features that will be implemented in Far Cry 4 and Assassin’s Creed: Unity. Well, today we’ve got some additional titles that will benefit from NVIDIA’s GameWorks. According to slides from NVIDIA’s event, Batman: Arkham Knight, Project CARS, The Witcher 3: Wild Hunt, and Borderlands: The Pre-Sequel will also feature GameWorks effects. In addition, we’ve got more details about the GameWorks effects that will be featured in Assassin’s Creed: Unity.
According to NVIDIA, Batman: Arkham Knight will support Turbulence, Environmental PhysX, Volumetric Lights, FaceWorks and Rain Effects. Project CARS, on the other hand, will come with DX11, Turbulence, PhysX Particles, and Enhanced 4K support.
We’ve known for a while that The Witcher 3: Wild Hunt will support specific NVIDIA features, but today we got the complete list of them. The Witcher 3 will support HairWorks, HBAO+, PhysX, Destruction and Clothing (do note that Destruction and Clothing will also be supported by the console versions).
Here is the complete list for a number of titles that will support NVIDIA’s GameWorks:
Assassin’s Creed: Unity – HBAO+, TXAA, PCSS, Tessellation
Batman: Arkham Knight – Turbulence, Environmental PhysX, Volumetric Lights, FaceWorks, Rain Effects
Borderlands: The Pre-Sequel – PhysX Particles
Far Cry 4 – HBAO+, PCSS, TXAA, God Rays, Fur, Enhanced 4K Support
Project CARS – DX11, Turbulence, PhysX Particles, Enhanced 4K Support
Strife – PhysX Particles, HairWorks
The Crew – HBAO+, TXAA
The Witcher 3: Wild Hunt – HairWorks, HBAO+, PhysX, Destruction, Clothing
Warface – PhysX Particles, Turbulence, Enhanced 4K Support
War Thunder – WaveWorks, Destruction
And here are the slides themselves. Current-gen consoles support PhysX Destruction, Clothing and WaveWorks. While additional effects will be supported at a later date, there is no ETA as of yet. Furthermore, some of these graphical features will also be supported by AMD’s GPUs on the PC.
Enjoy!
[UPDATE]
Here are some comparison shots for Far Cry 4 and Assassin’s Creed: Unity. Enjoy!
Assassin’s Creed: Unity NVIDIA GameWorks
John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers SNES to be one of the best consoles. Still, the PC platform won him over consoles. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC graphics cards.”
Hopefully this does not mean shitty optimization.
Hope this doesn’t mean you have a low-end GPU in your PC case!
PCars, Witcher 3 and Batman are gonna look so sick on PC.
Will PhysX Destruction in TW3 be more complex and better on PC than on consoles?
Duhhhhh… of course! Same as Android: consoles will receive only a capped version of it!
It’s expected to be, yes.
Hell yeah, nVidia F*CKN RULES 😉
I wish they’d support 3D vision too. Batman in 3D will be delicious.
TW3: Destruction… some barrels and such, or actual destruction? O_o
In one of the first trailers you can see windows shatter:
http://www.push-start.co.uk/wp-content/uploads/2013/12/1-2.gif
That was almost a year ago and they have improved a lot since then. But I imagine it will be stuff like that.
Aehooo… my first Gigabyte G1 970 is arriving by next Friday… Brazilian logistics!
Witcher 3 dayONE 4k maxsettings FTW!
Holy shite, already? How much?
339 bucks! But there are people already at 30 hours of gameplay with it and reselling it! lmao!
I’m Br too dude, give me the realz xD
1449 at Terabyte! G1 Gaming! They said it arrives by Friday!
And it’s better than the 780 Ti, which costs 1000 bucks more!
I wanted to get it at Kabum, but there it’s only coming in November!
Take advantage of the 15% discount.
Awesome 😀
Not to mention the 4GB, the backplate, it only requires a 450W PSU, and it runs medium/high at 4K!
lol… I’m doing advertising for these guys, I want to get paid for this! lol
steam: goCorinthiansgo!
HUE xD
huehaehuauehahehaue! kkk
Is this where all the craziness is going down?
It’ll still be a while before I do my upgrade, but it’s good to know they already have it available this fast. 🙂
Gigabyte always arrives fast, that’s why I only buy from them… especially for the cooling… reviews say the card doesn’t go above 65 degrees at 99% load…
These guys nailed it this time!
por favor
“Por favor” what? lol, only the hackers!
E-huh? :)) I just said something that I don’t know the meaning of.
“Por favor” means “Please”
😉
I don’t speak Portuguese. jk, but I do speak Spanish.
Ouch, NVIDIA games… my HD 7770 can’t handle this shit.
Wait for R3xx or go for nV 970 😀
Put that joke GFX card in the trashcan and get a GTX or GTFO 😉
lol fanboy detected
Not mentioned in this article, but Star Citizen is also getting the full GameWorks feature set, which includes Tessellation, Turbulence, PhysX Particles, PhysX Clothing, HBAO+, and TXAA.
Source?
here you go:
http://physxinfo.com/news/11822/star-citizen-and-project-cars-will-include-gpu-accelerated-physx-and-apex-effects/
That was before the AMD stuff (the Mustang etc.).
On the actual website, the only company whose technology has been talked about to any great extent is AMD, so I have a slight feeling the GPU PhysX might not be happening. I think Croberts himself said such a thing would be a bad idea, since it would only be available to NVIDIA users, and the finished game will require quite a bit of physics simulation just to be the game. This is the current issue with Project CARS: the game requires PhysX in order to work, it cannot operate without it, so it runs like sh*t on AMD cards, and on NVIDIA cards if you turn GPU PhysX off (if you have NVIDIA and CARS, just try it and watch your framerate burn).
It’s really a shame pCARS did not deliver in regards to enhanced GPU PhysX effects. There’s no Turbulence, no GPU particles, nothing; everything is “generic”/standard PhysX, strictly running on the CPU. I have the game: when I turn on the PhysX visual indicator from the driver, it says “PhysX > CPU” in game, and toggling PhysX in the driver between auto/GPU/CPU has no effect, it’s still running on the CPU with no framerate difference whatsoever. So you can stop bitching about PhysX; you got exactly what you and your beloved AMD wanted. No one will enjoy special GPU effects, just so you can feel better about your sh**ty hardware, which barely handles the game’s graphics regardless of PhysX.
The announcement was at the Nvidia Editor’s Day event.
http://physxinfo.com/news/11822/star-citizen-and-project-cars-will-include-gpu-accelerated-physx-and-apex-effects/
It will also get Mantle support (very likely). Also, since most of the PhysX effects are planned for the new consoles, they should also run on AMD HW. Or we will have to wait for FleX (for another few years?).
A gimped version is coming to consoles and mobiles, but at the end of the day you still need an NVIDIA GPU if you want the full visual effects.
It should be pointed out that that was the NVIDIA Editor’s Day event back in 2013.
On the actual Star Citizen website (in the news and announcements) there has been little in the way of talk about NVIDIA technology, but quite a bit about AMD, so things have most likely changed.
Also, if it were to use PhysX, we would have a second Project CARS, because even now the game requires many physics calculations just to work (flying the ships, etc.), so using PhysX for that would leave AMD users unable to play the game. (You cannot turn PhysX off in CARS because it is required for the game to work; the same thing might happen if SC uses PhysX.)
Yeah
DBZ: Xenoverse is coming to Steam >> http://www.saiyanisland.com/2014/09/dragon-ball-xenoverse-confirmed-for-steam/
Can someone explain what Turbulence is?
It’s a whirlwind effect on particles, all done in real time.
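[Editor’s note: for readers wondering what that means in practice, here is a tiny conceptual sketch in Python. It is not NVIDIA’s actual Turbulence API (which is a GPU-accelerated fluid-grid simulation); the swirl_velocity field and all numbers below are invented purely to illustrate particles being advected through a swirling velocity field.]

```python
# Conceptual sketch of a "turbulence" particle effect: sample a swirling
# velocity field each frame and push (advect) particles through it.
# Everything here is illustrative, not NVIDIA's API.
import math
import random

DT = 1.0 / 60.0                      # one frame at 60 fps
CENTER = (8.0, 8.0)                  # center of the swirl

def swirl_velocity(x, y):
    """A hand-rolled vortex field standing in for a real fluid simulation."""
    dx, dy = x - CENTER[0], y - CENTER[1]
    # Perpendicular vector -> rotation around the center, plus a little noise.
    return (-dy * 0.5 + 0.2 * math.sin(3.0 * y),
             dx * 0.5 + 0.2 * math.cos(3.0 * x))

particles = [[random.uniform(0.0, 16.0), random.uniform(0.0, 16.0)]
             for _ in range(1000)]

for frame in range(600):             # simulate 10 seconds
    for p in particles:
        vx, vy = swirl_velocity(p[0], p[1])
        p[0] += vx * DT              # advect: particles ride the field
        p[1] += vy * DT

print("first particle ended at", particles[0])
```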
Ubi games unoptimized for AMD: confirmed.
AC4 runs great on AMD cards; not that AC4 ran great in the first place, really, lol.
It’s about the future. Remember Watch_Dogs? AMD users were completely rekt.
At the start, yes, but driver updates fixed it. Besides, Watch Dogs didn’t run that great for anyone on ultra.
Game performance improving after driver updates is what we humans call an “unoptimized POS”. It’s simply a failure if a game doesn’t run at its optimum after release and the hardware manufacturers have to fix those issues. It’s just like a mediocre game getting fixed by the modders.
except for money.
Ubisoft Confirmed.
Cool stuff; can’t wait to play some of these games.
I really hope the PhysX works just as well as it did in the previous Arkham games… and not like Assassin’s Creed BF, which was BS!!!
That’s why nVidia F*CKN RULES, you bitches!
For me, nVidia are PC gaming, you just gotta F*CKN LOVE ’EM!
G-Sync is a perfect example of nVidia’s innovation. What they do just works.
So get an nVidia card or GTFO, it’s just that simple 😉
Only a million times faster and better, just like everything nVidia does!
You can’t expect those clowns over at AMD to do anything good; only a poor person and/or an IDIOT buys AMD gfx. It’s that simple 😉
I guess being a nice person is an unknown concept these days.
Just a stupid thing to say. You clearly don’t like competition, and if AMD wasn’t around, your Intel and NVIDIA prices would be sky high. You see, even hardened NVIDIA users and PC snobs busted blood vessels over the Titan Black price, and the R9 295X2 destroyed it at half the price.
Ouch.
http://i.imgur.com/Z0Iu9ON.png
Oh damn… I didn’t realize I’m poor and an idiot for buying an AMD CPU and GPU.
I just want decent gameplay with recent games in my free time.
So I should spend 80% more than what I paid for a 6 fps difference?
Damn, something’s wrong with me. I’ve got to talk to my psychologist now, I think.
Hell yeah, I feel sorry for you, mate, really sorry. And as stupid as you are, you’re forgetting about all those nice shiny PhysX effects. Go and try to play them on your weak AMD crap, mate, but oops, you can’t, it’s nVidia exclusive 😉
I can only hope NVIDIA told Ubisoft to get their crap together, because it’s pointless to put GameWorks into their games if they are unoptimized, stuttering pieces of garbage on PC.
Finally you admit that those effects are meant to cripple fps. On the other hand, AMD cares for its fan base. Hint: Mantle, which is meant to reduce CPU overhead and improve performance.
You are dumb… it had nothing to do with GameWorks… Ubisoft does a crap job with or without GameWorks. And it looks like NVIDIA told them to make PC optimization the number 1 thing to do. I am sure if Ubisoft blew it for AC5 and FC4, NVIDIA would end the partnership. And Mantle does not do jack at higher resolutions and ultra settings with higher-end GPUs… Is that why BF4 gets 10 more fps on a 780 Ti than a 290X does with Mantle on? Come on dude, are you trolling me, fool?
PS: don’t put words in my mouth…
1st, it’s you who’s blind, Nvidia fool. 2nd, my 2x 7950s improved a good deal, largely in minimum frames, using Mantle in BF4 64-player MP at 1080p ultra settings. Do you even know what Mantle is supposed to do?
http://www.hardocp.com/images/articles/14110637240cPED1snfp_7_3.gif
LIAR!
http://www.hardocp.com/images/articles/1393620031nTfVKdLjSj_3_1.png
That pretty much shows what I’m on about: the CPU bottleneck is exposed and Mantle uses the CPU cores a lot better.
Dude, that benchmark is old, come on man… lol
Well, the one you put up doesn’t show or prove anything you said.
The hell it doesn’t… it showed that DX11 on the 780 Ti was 1 fps from Mantle on the 290X. WTF is there not to prove?! Really? At least I showed the newest benchmark, you freaking tool, and you can’t say I did not prove anything. Wow, just piss off, man.
It’s not my fault you don’t understand Mantle, and you also contradicted yourself. The 290X wasn’t faster than the 780 Ti in the first place; Mantle allowed it to beat it, but you’re too stupid to see that, and the point of Mantle anyway.
Now what? I just freaking showed you that Mantle with a 290X only beat a 780 Ti by 1 fps… but as you can see, with a new driver DX11 beats Mantle… You are just a moron!
And like I said, I tested it in BF4 with an AMD R7 265 as well as an R9 290 (non-X version), and it’s not that much of an improvement over NVIDIA cards on DX11. I fully freaking understand Mantle. And it has a lot of work that needs to be done. See update, idiot.
By 4 fps. A more expensive card beats a cheaper card by 4 fps, LOL.
You don’t get the point… it’s not that an expensive card beats it… it’s that it does so on DX11. I am done with you.
Yes, because there is no CPU bottleneck, moron.
Oh, you are so smart… so smart… if you game, buy a decent CPU to begin with… you don’t think Mantle will let you use a dual-core CPU with a 290X, do you?! Hell, I might just buy that as a setup just for fun, YouTube it, and prove your dumbass wrong… you think you would get amazing performance with a dual-core CPU and a 290X?! Well, do you?!
I can’t really talk to you while you’re being a rabid fanboy. We don’t even know what NVIDIA did with their CPU overhead driver; they won’t tell anyone. Notice I said “CPU overhead”, which puts the 780 Ti ahead of the 290X because it was faster in the first place, all because of the CPU bottleneck.
http://www.vortez.net/articles_file/28040_bf4-4.png
Learn something.
CPU bound.
http://i.imgur.com/eGpLhGW.png
GPU bound.
http://i.imgur.com/ZX4txQs.png
Don’t try to fool kids with old benchmarks on old drivers… and those are on really old drivers, btw… nice try, lamer.
http://www.hardocp.com/images/articles/1393620031nTfVKdLjSj_3_1.png
Don’t bother with the guy, he knows nothing on a technical level.
” Is that Why BF4 gets 10 more fps on a 780ti then a 290X does with Mantle on? ”
Well, the graph you just put up proves you wrong, LOL. The cheaper 290X wins.
Wow, you get an AMD card and now you are all on Rory Read’s nuts… lol, that’s funny, by 1 fps… you don’t seem to get the point being made… if Mantle was so great, the win should be by 15 to 25 fps…
And if you read what I said right, it will only do that if there is a CPU bottleneck. Do you even know what Mantle does? It’s not always about the GPU. You wonder why so many people complain about poor GPU or CPU threading? It’s because of DirectX and driver limitations. It’s not about NVIDIA vs AMD; both cards are limited in exactly the same way, and it’s why NVIDIA’s CPU driver doesn’t actually affect games or systems with no CPU bottleneck.
Again, Mantle does NOT magically boost performance on every system, just like DirectX 12 won’t give 40% more performance on every system. BTW, I posted a screenshot of Mantle destroying DX11 in Thief just to make a point.
Even with DX11, Thief does better on AMD hardware… I am done, man. Bottom line is GameWorks is running the show now.
And your point kinda sucked with Thief, because it’s an AMD game that is supposed to hype up Mantle… I am sure the DX11.1 benchmarks would be better if it was programmed properly XD
Your logic is flawed. BioShock Infinite is an “AMD game”, but some NVIDIA cards perform better in it; funny how it’s the other way around in NVIDIA logo games.
You are in denial… like I said, I am done with you. Everything I’ve seen other than Thief, when it comes to DX11 vs Mantle, has made Mantle look like a joke. So please just stop… A 290X, a $450-$700 card, loses to a $550 card… Mantle… failed. Face it… you lost, man.
I have the very same performance with a 290X with MANTLE ^^. Please compare the price of 980s in SLI (2/3/4) with the price of one 290X.
970s are beating the 290X in benchmarks and they are only $329 cards XD
You are in denial; you did not understand what Mantle is and what it is supposed to improve. I will say it one last time (mostly for other people who can actually digest new information).
Mantle is designed to deliver performance improvements in CPU-limited scenarios: it reduces API overhead by making draw calls cheaper and maximizing the GPU’s overall utilization.
Mantle especially improves low-end performance when CPU-bound, and it really improves performance in games that tend to be very ‘CPU-heavy’, with lots of calculations and units.
Now to my question. If we know what Mantle is and what it does (you apparently do not want to know), then why are you providing irrelevant evidence from GPU-bound situations? Is there any reason for it, apart from making Mantle look bad and making yourself look stupid?
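[Editor’s note: the CPU-overhead point being argued back and forth here boils down to simple arithmetic. In this minimal Python sketch the per-draw-call costs are invented for illustration; the takeaway is that when submission work on the CPU is the bottleneck, a thinner API raises the frame-rate cap regardless of how fast the GPU is, and does nothing when the GPU is the limit.]

```python
# Back-of-the-envelope sketch of the CPU-overhead argument. The per-call
# costs below are hypothetical; the point is that per-draw-call CPU cost,
# not GPU speed, caps the frame rate in a CPU-bound scene.
DRAWS_PER_FRAME = 10_000

for api, cost_us in [("thick API (DX11-style)", 25.0),
                     ("thin API (Mantle-style)", 2.0)]:
    cpu_ms = DRAWS_PER_FRAME * cost_us / 1000.0   # CPU time spent submitting
    fps_cap = 1000.0 / cpu_ms                     # best case when CPU-bound
    print(f"{api}: {cpu_ms:.0f} ms submitting -> capped at ~{fps_cap:.0f} fps")
```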
I am not in denial, you loon. Not every game has Mantle… The majority, and this is what matters, the majority of games are not using Mantle. That has always been my point: you can’t base the overall performance of cards on Mantle when less than 1% of games use it, while everything else on PC right now runs on DX9 through DX11.1.
Well, it beats a 780; why is that a surprise? Every new GPU beats the old equivalently numbered card, just like the 770 beats the 680. Wait for the R300 series instead of comparing it to an old GPU.
The 300 series will not be out till the first half of 2015… and it’s said to be 20nm with stacked HBM.
That’s not an apples-to-apples comparison. You simply can’t compare the two, and the fact that you show a 980, a generational GPU leap, beating an old 290X by only 3 fps is a joke on your part.
See my image below of an R9 295X2 destroying a GTX Titan Black in SLI. Note that one Titan Black now costs about the same as an R9 295X2, LOL.
Keep in mind the 295X2 was $1499… AMD had enough respect towards the 900 series to make it $999.99.
Also, anybody who buys into the Titan series for games is a fool. That’s pretty much the reason I believe NVIDIA canned the Titan series.
And that was Thief, the biggest AMD-optimized game of the year; that 3 fps is something you should not take lightly… That would be like a 290X beating a 780 Ti/990 in a Batman game…
Plus, I noticed you’re typing again. Well, I am taking off for a while, got things to do.
It’s still not an apples-to-apples comparison, I’m afraid. Look at Mantle with CrossFire benchmarks: a 30 fps difference over DirectX 11. You just keep ignoring the point here; it’s got nothing to do with NVIDIA. What are you going to say now, no one uses CF, most people don’t use CF?
The simple fact of the matter is that you don’t understand where Mantle matters and where it doesn’t, and nobody ever said that ‘everyone’ will benefit. Even the Mantle slides show the CPU-bound and GPU-bound cases, and where Mantle gains and where it doesn’t.
Not everyone who uses DirectX 12 will gain 40% performance, just like not everyone gained with NVIDIA’s reduced-CPU-overhead driver.
You don’t know what DX12 will do… nobody does. Just because you got an AMD GPU, you are now all AMD fanboy about it… lol, you know who else is like that… bipolar people…
All you need to do is do some reading.
“PC gamer’s have long suffered the issues of DX11 – and even during my review and preview of Mantle on the AMD’s R9 280, Thief went from 62 FPS on DX11 to over 80 with Mantle. Battlefield 4 was similar – 1080P on a mid range GPU suddenly became a non issue. My point being that it’s clear how much of an issue it is. The R9 280 on the test rig isn’t in the same ‘league’ as the I7. The I7 is traditionally paired with a much faster GPU (say an Nvidia GTX 780 or an AMD R9 290, yet even this super beefy CPU had issues. Think of it this way – the HD 4400 from Intel is an IGP (Integrated Graphics Processor) on the CPU. It’s slower than GPU’s like Nvidia’s GTX 730. And yet, DX12 makes the frame rate jump from 19FPS to 33 FPS.”
Blah blah blah, I’ve seen that since day one, man. Besides, the new Maxwell arch shows great scaling with SLI beyond 2 GPUs without the help of a low-level API.
And right now AMD is going crazy because the 970, a $329 card, is going toe to toe in benchmarks with the 290X.
Besides, another thing you have to keep in mind is that the vast majority is what matters right now, and last time I checked, 99.9% of all PC games on the market don’t use a low-level API.
Grow up, man. I don’t choose products out of brand loyalty; I choose them on price and performance, so you have a lot of learning and growing up to do about weighing pros and cons. You know nothing about bipolar either. Your inability to see how people weigh pros and cons and then argue against them is useless; calling someone a fanboy just because they defend the pros of their product with facts is puerile.
From all that you have said, I can safely say that you are the definition of a fanboy, and not here to win an argument on the merits.
Feel better now? If you go off price/performance right now, the 970 is the king of the ring XD
And on that we can agree, but it’s beyond my price range at this time. It’s a great card.
They have respect towards their R9 300 series 🙂 The 295X2 is still the fastest card on the planet, with a great performance/price ratio.
The 295X2 is the fastest dual-GPU card, yes… but 980s in SLI beat it in some games so far. Which tells me we might just see a 990.
Small advice: try to use evidence instead of stupid insults.
I already showed the evidence…
No, you get very personal (for no reason); you insult people just because they have a different opinion.
A discussion can only be held when you avoid that. The moment you fail at it, any form of discussion ends and changes into another form of communication (unfortunately). But it is probably needless to say this anyway 🙁
I only insult when people make smart-a*s remarks and talk about a bunch of crap they have no idea about.
http://www.pcgameshardware.de/screenshots/original/2014/02/Battlefield_4_Second_Assault_Benchmarks_02-pcgh.png
For some reason DSOGaming does not want me to publish images that prove you wrong. Just check HardOCP and search for Mantle, and I assure you, you are going to get hurt. BTW, what were the original prices of the 290X and the 780 Ti? Perf/dollar, man, google it.
Who cares, man? The main point is that GameWorks is going strong, and you had to come in with Mantle this and Mantle that… all off topic…
2 things to note here.
1. HairWorks uses a lot of tessellation, and NVIDIA is known for having better tessellation performance with this gen of GPUs.
2. AMD GPUs have superior OpenCL performance, so it remains to be seen whether devs actually use OpenCL heavily. Dirt Showdown showed how bad it can be on NVIDIA GPUs, because it used forward-based lighting in OpenCL.
AND that is why NVIDIA had to pay Ubisoft millions to implement it. LOL, makes sense!
You sound like a kid… All companies pay devs to get their name on a hot product.
NVIDIA sees something in Ubisoft, and now they are stepping in with AC5 and Far Cry 4 at a much higher level than usual for a developer partnership.
Ubisoft games were garbage on PC way before GameWorks even came out. So now I am just sitting back and seeing whether NVIDIA can change that.
I am not sure where you heard me say that, but:
AMD supports more games these days, and they do not need to pay millions to everyone they want to work with (good for them).
AC4 is garbage with GW too (it is just a badly optimized game, with DX being a huge burden).
The point of the GW criticism is its distribution and conditions.
BTW, why don’t you want all PC players to have a great experience regardless of what HW they have? Are you that selfish? Do you realize that this could lead to yet more fragmentation of the whole PC platform? Or do you hate the PC as a platform? (And are you waiting for the upcoming NVIDIA PC with Denver or whatever they would use?)
If by strong you mean GameWorks is strong enough that NVIDIA has to pay almost everyone who is currently using it, then GW is very strong in that regard!
It just takes forever, Venus, because they have to review them beforehand.
OMG! Someone should probably tell you, but MANTLE, like every other new API, solves CPU overhead! You have to test in a situation where the CPU is the bottleneck (not the GPU)! Regardless of that, everyone can clearly see here that even without this, Mantle is faster than DX11! When the CPU is actually the bottleneck, MANTLE can easily almost double the FPS.
I’ve been following and testing Mantle since it came out… So far, everything with Mantle support is not as great as when I switch to my MSI 750 Ti, or my 780 Ti. The AMD GPUs I used for testing are an R7 265 as well as an R9 290 (non-X version).
I still think all low-level APIs are a gimmick for gamers; it’s all about how a developer… develops. As of right now, Sniper Elite 3 is the best-optimized game, getting amazing performance on all GPU hardware, even on Intel iGPUs at the settings they are limited to. So please don’t be such an AMD-only lover.
It’s not about AMD, you idiot; it’s the simple fact that you can’t grasp CPU/GPU bottlenecks, which come down to the API and drivers. Game optimization can only go so far before devs hit the driver and API limits. Consoles and Mantle don’t have this problem, and it’s why DX12 is now trying to solve it.
Low-level APIs are being made by almost everyone now. Not because they love AMD (I’m sure they love AMD as a technology pioneer anyway), but because CPU overhead is a very serious issue that cannot be handled otherwise.
No, GameWorks is amazing; you just can’t handle it.
Who cares what your two 7950s do… Proof is that NVIDIA cards whoop AMD cards in benchmarks using DX11 while the AMD GPUs use Mantle…
NVIDIA is going beyond what consoles are capable of, giving PC gamers a reason to build gaming PCs. That’s why GameWorks was made. And that is why AMD is now trying to copy NVIDIA with its Radeon SDK, so you are the blinded fool. Like always when it comes to gaming improvements, AMD is playing follow the leader, and they have been ever since they bought ATI.
I didn’t say GameWorks isn’t amazing; you’re trying hard to divert the conversation. I stand by the claim that GameWorks is proven to hamper performance a lot (e.g. Watch Dogs), while Mantle is proven to help minimum framerates by reducing CPU overhead in 64-player MP games.
Wow, you don’t know jack… Ubisoft jacked up GameWorks in Watch Dogs… Even with it turned off, the game is a mess. So please, you don’t stand corrected on anything. Far Cry 3 was worthless, and guess what, it did not use GameWorks. Play Metro: Last Light: amazing performance even on a 750 Ti! So please don’t talk BS, man.
And it sounds like you don’t have any idea, nor do you play games with GameWorks; you are just a net baby who reads but never does. When a developer does a game right, the biggest fps hit I’ve seen with GameWorks on is around 5 to 8 at most… and that’s not bad for looking so spectacular.
BTW, if you like PhysX that much, then please note that even Ubisoft, paid to use GameWorks, is using Havok in all its games as the main physics engine!!! Just an interesting fact.
The Radeon SDK does the very same kinds of things (TressFX, contact-hardening shadows, HDAO, etc.) in source code, so everyone can review and optimize them. The Radeon SDK has existed for years; it was made by ATI originally. The main difference is:
AMD and their users want a great experience for all users (regardless of what HW they choose), but NVIDIA for some reason does not want all users to have a good experience. If you like this approach, then you will support it. I don’t like it, so I won’t. SIMPLE.
But you don’t need to spread this rubbish w/o any evidence, because you are really making a fool of yourself! But then, if you’re ok with it…
Hush, AMD-only lover… I know all about the Radeon SDK, and it’s light years behind the PhysX SDK and the GameWorks program.
PhysX and the Radeon SDK are different things. Also, the Radeon SDK is downloadable by anyone for free, with full documentation, samples, and source code, vs GameWorks, which comes in black-box form, only for the chosen, very strict and bound by contract. No other 3rd-party SDK or API from any company uses this form of distribution!
Dude, if you look at the Radeon SDK, they are trying to build their own form of PhysX… and you are using words from AMD’s new PR hype man.
True again, it has always been so and always will be.
AMD is second class and ALWAYS WILL BE 😉
One thing you forget is that Mantle and a 290X would destroy a 780 Ti if they both were CPU limited. Benchmarks always use beast CPUs to avoid CPU bottlenecks, so you really can’t take benchmarks as the be-all and end-all.
You are just saying that because you just purchased a 280X… But guess what, if you have a decent card, why waste it by pairing it with a BS CPU…
It’s a fact: DirectX 11 and the drivers are the bottleneck; that’s why DirectX 12 is closer to the metal. Benchmarks are to be taken with a pinch of salt, end of story. DirectX 11 simply can’t multi-thread like Mantle.
I do my own benchmarks, and I don’t need a $1000 CPU to prove that Mantle is bogus… You can look it up yourself. Even cheaper i5s with higher-end GPUs on DX11 blow Mantle out of the water. Well, not really out of the water, but still, it’s a win.
Wrong.
http://i.imgur.com/afv7jvK.jpg
And everyone is an idiot, making new APIs (Metal, OGL Next, AZDO, Mantle, DX12) to fix this non-existent issue? Thank you for the information (w/o any evidence), but you should contact all the vendors making those new APIs instead.
So true, he is just a blind fanboy who is on the wrong side of the fence. The losers’ side 😉
That is why all the new APIs are also tackling CPU/driver overhead. Most of today’s games often show an average CPU usage of 20% (i7/AMD FX). Paradoxically, even with this low usage, the CPU is still the bottleneck. The new APIs try to fix this so the bottleneck sits at 100% usage. Then people can buy a CPU they will actually be able to use, not a stupidly expensive and powerful CPU whose performance no game can use more than a fraction of.
Actually, HairWorks in COD was very demanding for Radeons and GeForces alike because of line tessellation, which brings little to no visual improvement (curls) but a huge performance hit. They might fix it, but probably only for consoles; TressFX v2.1 is much faster in that regard.
I play it and it’s not bad at all, stop spreading FUD, please…
http://www.youtube.com/watch?v=I8R35-ZkjVg
COD is missing any form of transparency, and the color shading is laughable. Just watch it, please, if you didn’t notice it in your own gameplay!
That is the most bunk video I’ve ever seen…
You have no idea what you are talking about. AMD just puts words together and says things that don’t fit reality. When AMD talks about gaming, it’s like listening to a person with Down syndrome try to explain quantum physics; AMD is literally that detached from gaming. The AMD charts were also fakes: they literally said that HairWorks took 45ms to process on a 290X, yet the game, DX, driver, every shader, transform, lighting, AA, AF, HairWorks, etc., all combined, took 16-20ms on a 290X. Don’t listen to what AMD says; they don’t even try with their lies anymore. At 45ms the game would have run at 13-17 FPS.
Line tessellation is used so the hair can bend, and how much depends on what attributes the artist wants. The vertices for the hairs to attach to are also created with tessellation, and that determines how many hairs there are.
AMD just needs to optimize the ordered buffers. Those are used so the tessellation only needs to be done once; shaders then access the buffers, and that access is what AMD needs to optimize.
DX11 was first introduced in 2008. That standard calls for 0-64x tessellation. When will AMD make it to 2008?
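[Editor’s note: the cost argument around line tessellation comes down to plain multiplication. In this Python sketch the guide-hair and segment counts are hypothetical; the factor-of-64 ceiling is the actual DX11 hardware tessellation limit.]

```python
# Rough arithmetic on why per-strand line tessellation gets expensive fast:
# generated geometry scales linearly with the tessellation factor.
GUIDE_HAIRS = 20_000        # hypothetical guide strands on a character
BASE_SEGMENTS = 8           # hypothetical segments per untessellated strand

for factor in (1, 4, 16, 64):   # 64 is the DX11 hardware maximum
    segments = GUIDE_HAIRS * BASE_SEGMENTS * factor
    print(f"tess factor {factor:>2}: {segments:>10,} line segments per frame")
```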
In AC4 I had to disable all of that extra crap just to keep it running above 40 FPS with a GTX 760 OC and a 4670K at 4.5GHz. I suppose those extra features are only reserved for people with 780s and above 🙁 Though the Arkham games run flawlessly even with PhysX on the highest setting.
That’s because Ubisoft has been blowing it for years for PC gamers… Heck, even Metro: Last Light runs great on a PC with a 650 Ti series card with PhysX on.
The Batman games use a lot of baked shadows and fake effects, while AC4 uses all real-time effects.
Yeah, that’s the truth, though even when I lowered everything to low/medium I would still dip below 50 in some areas. Oh whatever, I don’t expect much from Unity considering who is porting it; again, I think I’ll wait at least 2 months before buying it.
The game isn’t very well CPU-threaded, so using low settings won’t give you much of a boost, and it doesn’t utilize the GPU much anyway.
Yes!!! This means no more fake textured hair in upcoming games!!!
Honestly, I don’t give a damn about HairWorks, HBAO, or PCSS. I just want tessellation and good optimization in these games.
If Ubi “optimizes” the physics & HBAO the same as in AC4, no thanks. No need for an unoptimized game running at 30-40 fps on max.
HairWorks was first introduced in 2008; the first game to use it was Alice: Madness Returns… so it’s more like TressFX is the same thing as HairWorks, only HairWorks is far more advanced.
The hair in Alice: Madness Returns was actually not part of the NVIDIA package they had.
http://physxinfo.com/news/5883/gpu-physx-in-alice-madness-returns/
No to everything you just said.
https://www.youtube.com/watch?v=bgouvz7vmPI&feature=youtu.be
Dragon Ball Xenoverse is coming to Steam!!
And Outland!
http://www.pcgamer.com/2014/09/19/outland-announced-for-pc-will-be-on-steam-later-this-month/
The Witcher 3 is the only game holding me to the GeForce brand at this point.
And that is why the game is aiming for 30 FPS xD
Nice, I’m gonna upgrade my 770 to a GTX 970 just in time for Witcher 3.
BTW, nice bench with old drivers from both teams… anyways, I am done; love what you got.
Yes, Mantle is a huge success even with old, unoptimized drivers ^^
No it’s not, man… not on the hardware I like to use… high-end users with AMD are still waiting for a performance boost bigger than what a normal driver can deliver…
A win is a win, moron, as if your 4 fps makes a win any more meaningful.
Yeah, and now NVIDIA cards win, moron, look at the newest bench I showed! Suck it up, AMD will be back with something new… BTW, even the greatest anti-NVIDIA reporter loves Maxwell: http://semiaccurate.com/2014/09/19/nvidias-gtx-980-takes-triple-crown/
Yeah, and those are using iGPUs… you don’t think I know that… you really suck, man.
And again you fail to understand the chart and what’s been shown to you. The best thing you can do is stay out of technical conversations, because you clearly are an NVIDIA fanboy; you truly are the definition of a fanboy.
Nope, I am a guy who is not buying into the low-level APIs just yet… I don’t care if it’s Mantle or DX12, it’s all smoke and mirrors. If you buy high-end hardware and don’t get great benefits, then what is the point…
Just because you don’t understand it doesn’t mean it’s smoke and mirrors. It’s a fact that drivers and graphics APIs limit access to the hardware, something that devs have been complaining about for a long time.
And this explains Project CARS on AMD, and probably The Witcher and Arkham Knight too. I wish GameWorks would just die; AMD games still run well on NVIDIA, but NVIDIA is not allowing the vice versa.
That’s what you get for using crap like AMD.
If I could somehow express the middle finger on my hands with words, then I would have done that right now. So here’s an alternative:
GO F*CK YOURSELF YOU F*CKIN PIECE OF SH*T.