
Crytek’s Free-To-Play Shooter, Warface, Gets NVIDIA’s GameWorks Treatment

NVIDIA’s Andrew Burnes has informed us about a video showcasing the newly implemented NVIDIA GameWorks effects in Crytek’s free-to-play shooter, Warface. Via its latest update, Warface supports dynamic smoke effects, realistic turbulence, and persistent real-time PhysX particles from explosions that interact with the virtual environment as well as with characters. As NVIDIA noted, owners of GTX 760 graphics cards and above will be able to enable all of the game’s bells and whistles (including PhysX High). Enjoy!

Warface Demo of NVIDIA GameWorks Technology

66 thoughts on “Crytek’s Free-To-Play Shooter, Warface, Gets NVIDIA’s GameWorks Treatment”

  1. You know, with all this NVIDIA news, exclusive features and graphics, that I’m seeing, I’m thinking AMD should step up their game. It would be bad for all PC gamers if AMD’s GPU division got squashed by NVIDIA. Without competition, NVIDIA GPUs will be very expensive XD

    1. Well, AMD has TrueAudio, Mantle, AMD Gaming Evolved, game bundles with AMD cards, and low pricing, so there is plenty of competition from AMD. I went with AMD for value for money and it fitted my budget perfectly.

      1. TrueAudio
        Mantle with like 3 games
        What use is Gaming Evolved?
        Both have game bundles

        It’s like 1 audio feature vs 10 graphics features.

        I’m not hating on AMD, but they really need to step up their features and driver support. The sooner their managers realize that, the better for gamers.

        1. Driver support is fine. Gaming Evolved is AMD’s version of “The Way It’s Meant To Be Played”, but you can use its graphical features on any card; plus they give you up to 3 games from a pick of many.

          Mantle has not been out for long. DX11.2/11.3 is supported by only 1 game; meanwhile, Mantle has support in 5 games with 20+ more to come.

          1. Because Fermi and Kepler support DX11 only! Nice to see that NVIDIA has finally recognized that new DX versions are worth making HW for!

          2. Really? Where can we see DX11.x tech that has no HW support on NVIDIA GPUs? There is none. 🙂

          3. You’re wrong! ALIEN: ISOLATION supports DX11.2.
            And you’re asking the wrong question.
            You will hardly see it used anywhere, and that is the point!
            No developer will use a feature level that only a few GPUs have. Only GCN, and now the 980/970, have 11.1/11.2, because NVIDIA didn’t think we should use anything better than DX11.0. Therefore this particular feature level will not be used much, if at all. And 11.3 and 12 are almost here, or in beta.

          4. OK, try to think. Which feature from DX11.x has a benefit for game development that NVIDIA doesn’t support? The truth is that NVIDIA has no full support for DX11.1 or 11.2, but they support a subset of the functionality in HW and the rest in SW. So I ask you again: name an important tech from DX11.1 or 11.2 that has no HW support in NV GPUs.

          5. “The different tier’s represent the level of Tiled Resources feature supported under DX 11.2. R7 260X, R9 290 and R9 290X will have the ability to support the entire feature set, both tiers of Tiled Resources under DX 11.2 in Windows 8.1. Remember also, DX 11.2 is only available under Windows 8.1, not Windows 8.”

            “DX11.2 will be supported by any DX11 or DX11.1 GPUs.
            – Tiled Resources support is not mandatory, it’s just an option.
            I created a list that contains the most relevant gaming features in DX11.1/11.2 and the hardware support for it.

            UAV in non-pixel-shader stages:
            NVIDIA Kepler: No (technically yes, but the DX limit this option in feature_level_11_0)
            NVIDIA Fermi: No (technically yes, but the DX limit this option in feature_level_11_0)
            AMD GCN: Yes
            AMD VLIW4/5: No (technically yes, but the DX limit this option in feature_level_11_0)

            Larger number of UAVs:
            NVIDIA Kepler: No (8)
            NVIDIA Fermi: No (8)
            AMD GCN: Yes (64)
            AMD VLIW4/5: No (8)

            Optional features:
            Tiled Resources support:
            NVIDIA Kepler: Yes (tier1)
            NVIDIA Fermi: No (tier0)
            AMD GCN: Yes (tier2)
            AMD VLIW4/5: No (tier0)”
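Editor’s note: the feature matrix quoted above can be condensed into a small lookup table. The sketch below only encodes the support levels as listed in the quote; in a real application you would query the runtime instead (e.g. Direct3D’s `CheckFeatureSupport`), so treat the table as illustrative data, not as an authoritative capability database.

```python
# DX11.1/11.2 gaming-feature support, encoded from the quoted list above.
# Tuple: (UAVs usable in non-pixel-shader stages, max UAV count, Tiled Resources tier)
# Tier 0 means "no Tiled Resources support".
DX11_FEATURE_SUPPORT = {
    "NVIDIA Kepler": (False, 8, 1),
    "NVIDIA Fermi": (False, 8, 0),
    "AMD GCN": (True, 64, 2),
    "AMD VLIW4/5": (False, 8, 0),
}

def supports_tiled_resources(arch: str) -> bool:
    """An architecture supports Tiled Resources if it exposes tier 1 or higher."""
    return DX11_FEATURE_SUPPORT[arch][2] >= 1
```

Per the quote, only GCN reaches tier 2 and raises the UAV limit above 8, while Kepler gets tier 1 Tiled Resources and Fermi and VLIW get none.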

        2. MANTLE API

          TrueAudio

          Radeon SDK middleware: TressFX 2, MLAA 2, Tiled Lighting, Separable Filter, Deep Bounds extension, Silhouette Tessellation, SSAA, GPU Particles, Forward+ algorithm.
          Those are just a few technologies that can be implemented in a variety of effects. All generally optimized for any DX11+ compliant HW.

          1. “All generally optimized for any DX11+ compliant HW”

            Please don’t lie. Or are you really that naive? These techs are optimized only for AMD GPUs. Or do you think that AMD tested these technologies on NVIDIA GPUs to get the best performance from them?

            The next thing: where can we see GPU Particles from the AMD SDK actually being used?

          2. Now you’re just arguing for the sake of it. Both companies optimise their features for their own GPUs in sponsored games; it’s not a point worth arguing and hasn’t been for a long time. The only thing worth arguing about is NVIDIA-specific GPU features that no one else can use.

            PhysX CUDA GPU acceleration is for NVIDIA cards only; people just need to get over it. I have.

          3. Yes, I know what you want to say. I just don’t know if Rodney is that naive or just lying, that’s all. He claims everywhere that AMD techs are well optimized for all HW, which is completely wrong.

          4. Evidence?
            Any game that runs significantly worse on NVIDIA after applying any AMD tech? Anyone?

          5. I want to see evidence for your claims too.

            “HairWorks: rendering one strand takes 9x longer on a comparable AMD card! In TressFX it is about the same.”

            Give me the proof, not just an AMD statement in some article. There has been one game so far that used HairWorks, COD Ghosts, so give me a test which proves that HairWorks runs significantly worse on AMD GPUs.

            Please keep to reading articles about AMD and stop spreading lies under every NVIDIA-related article. How old are you? 15?

          6. The proof is in the compute power of AMD cards. You do know that even the Titan, which has upgraded compute, can’t beat a mid/high-end AMD card in compute?

            NVIDIA GPUs have better tessellation performance, though AMD fixed that somewhat with the newer Tonga-based R9 series.

          7. OK, I know that. But doesn’t this compute power refer only to the number of double-precision units on the GPU, which are not used in games? As far as I know, HairWorks doesn’t use DP units at all. And how is that connected with the claim that AMD GPUs are “7 times slower” at computing one strand?

          8. Well, it’s similar to the claim Rodney made during the discussion. But as you know, AMD and NVIDIA don’t optimize their techs for the competitor’s HW. I think that’s a logical assumption.

          9. The developers do; NVIDIA or AMD help them. If the tech works on any GPU then it shouldn’t be an issue; take HBAO+ or HDAO, for example. Of course all GPUs have their own faster default paths, but I’ve not seen anything to suggest that HBAO+ is slower on AMD cards, or HDAO on NVIDIA cards.

          10. As I remember, NVIDIA said that they don’t stop developers from optimizing code around their features. They can’t show or send it to competitors, but developers can optimize for it. And as you say, HBAO+ runs well on cards from both manufacturers. Despite that, there are people who blamed GameWorks for the bad performance in Watch Dogs, when the only such tech that could influence it is HBAO+. The same thing here with HairWorks: Rodney claims it is badly optimized, but I have no information about any bad influence in COD (which is for now the only game where it is used). And I really don’t like that.

          11. NVIDIA-specific features cannot hurt AMD, but their HW-independent yet black-box GameWorks easily can.
            HairWorks: rendering one strand takes 9x longer on a comparable AMD card! In TressFX it is about the same.
            No one can review the code, no one can fix it or even suggest a fix. But at least Ubisoft got paid, so there is at least someone benefiting from this contract.

          12. Try googling “Radeon SDK” (they have documentation, samples, etc., free to download… now try downloading GameWorks).
            GPU Particles are described on the AMD Gaming Blog, and the Radeon SDK contains a sample (volcano).
            They certainly don’t forbid developers from optimizing the code themselves! The code is distributed in source form, easily reviewable and fixable if there is any issue.
            AMD efficiently uses standard APIs that NVIDIA supports. As evidence, you will not find a game that runs significantly worse on NVIDIA, something you can see with many NVIDIA titles! (A typical example is HairWorks vs TressFX, where rendering one strand is 9x slower for AMD.)

          13. And lie and lie and lie. 🙂

            “A typical example is HairWorks vs TressFX, where rendering one strand is 9x slower for AMD.”

            AMD claims 7 times, not 9; that’s for a start. And I read an article about that: it’s only something AMD claims, without any clue as to how they got that result. But it doesn’t matter, right? AMD is always right. They don’t have to prove anything to you, right?

            You know nothing about SW development. None of us can tell what is really optimized and what is not. We have no access to the source code (and that’s not the only reason). You simply choose to believe everything that comes from AMD. But don’t worry, in time you’ll grow up.

          14. “GPU Particles are described on the AMD Gaming Blog, and the Radeon SDK contains a sample.”

            And in some real game?

        3. BTW, AMD has more games in the GE program than NVIDIA! AMD just doesn’t promote them at all. But you can always use Google! It is free!!!

          1. Yes, that is true: NVIDIA has zero games in the GE program. Also, AMD has zero games in the TWIMTBP program. But according to my Google search skills, there are many more games in TWIMTBP than in GE.

      1. It’s like comparing a grown adult
        vs.
        a young brat:
        contract-free, open help to game devs with open technologies
        vs.
        technologies bought under contract, HW-dependent or not well optimized, implemented in “black box” DLL form

        1. “or not well optimized technologies”

          And you know that because you saw the source code, right? It’s fine that you defend AMD all the time because you like their techs, but spreading misleading information without any proof is not cool. You should look after AMD only, and let NVIDIA users enjoy their tech.

          1. What? You’re obviously referring to something, but I have no clue what. How am I not letting NVIDIA users enjoy their tech? By telling the truth and asking questions? Are you for real?

          2. What you are saying is misleading information, or lies; that’s all. You don’t have the experience and knowledge to make such statements. The only thing you know is that NV does everything badly. So, are you for real?

    2. ALIEN: ISOLATION (Deferred renderer, HDAO+, Real-time DirectCompute radiosity, BC7 texture compression DX11.2, GPU-accelerated particle physics, DirectCompute Contact Hardening Shadows, Silhouette-enhancing tessellation)
      Sniper Elite (MANTLE)
      Lichdom: Battlemage (TressFX 2.0, TrueAudio)
      Most Frostbite 2.0+ & CryEngine 3.5+ games support MANTLE, as MANTLE is part of their SDKs now
      THIEF (DirectCompute Contact Hardening Shadows, Silhouette-enhancing tessellation, HDAO, MANTLE, TrueAudio)
      Murdered: Soul Suspect (HDAO, MANTLE)
      And many other games from the AMD Gaming Evolved program. It is just bloody hard to find out which AMD tech those games use. But AMD really should promote that!

      1. “Deferred renderer, HDAO+, Real-time DirectCompute radiosity, BC7 texture compression DX11.2, GPU-accelerated particle physics, DirectCompute Contact Hardening Shadows, Silhouette-enhancing tessellation”

        The only technologies claimed as AMD’s own are HDAO+ and the contact hardening shadows tech; that is on their own site. You just took all the interesting technologies from Alien and assumed it’s all AMD’s stuff? Can you give us proof of that? This is the first time I’ve seen something like that.

        1. AMD Gaming Blog (second post).
          Hint: try googling it!
          “This is the first time I’ve seen something like that.” I am not surprised^^

          1. I saw that blog, and there are only 2 techs which AMD claims as their own from the Alien game: HDAO+ and the contact hardening shadows tech. So tell me where you got the idea that, for example, the radiosity used in this game was developed by AMD, because that blog doesn’t say so.

      2. “Most Frostbite 2.0+ & CryEngine 3.5+ games support MANTLE, as MANTLE is part of their SDKs now”

        That is not true. Mantle is NOT part of their SDK. CryEngine does not support Mantle.

          1. So? I checked the latest version, 3.6.9, and as I said: Mantle is NOT part of their SDK. CryEngine does NOT support Mantle. What do you not understand?

          2. You’re a user, not a developer; developers get the new SDK first. You can’t claim CryEngine doesn’t support Mantle just because you haven’t seen it in their SDK yet, you moron.

          3. So, are people who buy CryEngine on Steam to develop games users or developers? Or which developers do you mean? And since what version does CryEngine support Mantle?

          4. I’m saying devs got it before the public release, just like with Unreal Engine 4. Looks like you’ll have to wait for Mantle. Crytek said they’re supporting Mantle, yet you don’t believe them.

          5. “Crytek said they’re supporting Mantle.” Source? I didn’t notice that.

            “I’m saying devs got it before the public release, just like with Unreal Engine 4.” Any proof of that? Especially in the UE4 case; I don’t believe it.

  2. I wonder how this game will run with PhysX and the other features enabled. If it holds a stable 60 FPS on a GTX 760, then good job NVIDIA, but it would be a first.

      1. PhysX, and especially PhysX 2.8.x, is not CPU-optimized, yet the CPU still does some necessary computing. That means that regardless of which GPU you have, the CPU will limit PhysX performance anyway, as only parts of the code are GPU-accelerated. This should be fixed with the new unified PhysX FLEX 3.4, but it might still take many years before we see it in any game!

        PhysX is great but needs a lot of work, because it does the very opposite of what low-level APIs do: when it’s applied, you get everything BUT stable FPS.
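Editor’s note: the point that the CPU limits PhysX regardless of the GPU is essentially Amdahl’s law: if only a fraction of each frame’s physics work is GPU-accelerated, the remaining CPU-side portion caps the total speedup. A minimal sketch follows; the 60% accelerated fraction is an illustrative assumption, not a measured figure for PhysX.

```python
def amdahl_speedup(accelerated_fraction: float, accel_factor: float) -> float:
    """Overall speedup when only `accelerated_fraction` of the work
    is sped up by `accel_factor` (Amdahl's law)."""
    serial = 1.0 - accelerated_fraction
    return 1.0 / (serial + accelerated_fraction / accel_factor)

# Hypothetical: 60% of a physics step is GPU-accelerated, 40% stays on the CPU.
print(amdahl_speedup(0.6, 10.0))   # ~2.17x with a GPU 10x faster at that part
print(amdahl_speedup(0.6, 1e9))    # ~2.5x: the CPU-side 40% caps the speedup
```

Even an infinitely fast GPU cannot push the frame time below the serial CPU portion, which is why a faster card alone does not guarantee stable FPS with partial GPU acceleration.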

      1. I reviewed every single PhysX test I could find from the last 6 years, and PhysX never achieved min FPS > 60. The only exception is Batman: Arkham Origins.
        I might have missed some, so please be so kind and send me some links that prove me wrong. THX

        1. My goodness. Minimum FPS is proof that something is not well optimized? It depends on many aspects; it’s not only about the API. How that API is used, the level of detail you want to achieve, the skills of the programmers, analysts and artists: all of that influences the optimization, and I’m sure I’ve missed many other things. I don’t work in game development, but I have 9 years of experience in SW development, and trust me, your point of view is very simplistic (it doesn’t mean you are stupid, you just don’t have the knowledge that could help you understand).

          1. Yes, it does. But Rodney, as a big AMD fan, must run his campaign and spread lies about everything connected with NVIDIA. Now he is saying that HairWorks is 9 times slower on AMD GPUs compared to NV GPUs. Of course, without proof.

            For those who are interested in PhysX, here is a very good insight into PhysX development from the very beginning:

            http://www.codercorner.com/blog/?p=1129

    1. I play Hawken and Warframe every day with PhysX on and never drop below 60 FPS. In fact, most of the time my frame rate is above 100 FPS.

  3. “mostly based on GPGPU, optimized for any HW”

    Another lie. It can’t be optimized for any HW other than AMD’s. Do you even know how these features are implemented? AMD and NVIDIA primarily optimize for their own HW. They don’t test on the competitor’s HW; nobody does that.

    “AMD just does not make promotional video for every game they support”

    They are lucky to have you. 🙂 I just read this article:

    http://community.amd.com/community/amd-blogs/amd-gaming/blog/2014/10/07/high-tech-fear–alien-isolation

    And there is only one technology which AMD claims as their own: HDAO+. Where can we see proof that the other tech in Alien: Isolation was developed by AMD?

    1. Read it again and then have a look at the Radeon SDK^^
      Surprisingly, you will see the very same effects there.
      It’s interesting how people who call someone a liar then lie in the very next sentence. You’d never tell.

      Next time, before you call someone a liar again, ask them to provide evidence or to explain themselves.

      1. I found all of these techs except radiosity, though maybe it’s provided by a module I didn’t see. So I made the same mistake you did several times before. I apologize for that.
