Batman: Arkham Knight – First Tests Show No Performance Gains, Rocksteady Suggests 12GB RAM For Win10

Things are not looking good for those who were waiting patiently for the re-release of Batman: Arkham Knight. After spending five months trying to fix this mess, Rocksteady and Warner Bros re-released the game on Steam without support for multi-GPU systems.

Warner Bros claimed that it’s working closely with its GPU partners in order to enable SLI/Crossfire; however, that very same sentence was used in its latest PC status update.

NVIDIA users can try this SLI compatibility bit (0x080222F5) in order to see whether the game runs better for them. Do note that this SLI compatibility bit is not an ideal one. By using it, we managed to enable SLI, however the game did not scale well on our GTX690.

Not only that, but Warner Bros suggested that 12GB of RAM is recommended for Windows 10. I mean… seriously now… 12GB of RAM. In an Unreal Engine 3 game. Man, what a clusterf’.

As Warner Bros noted:

“For Windows 10 users, we’ve found that having at least 12GB of system RAM on a PC allows the game to operate without paging and provides a smoother gameplay experience.”
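If you want to verify that claim on your own system rather than take Warner Bros’ word for it, here is a minimal sketch (Python with the psutil package; the executable name below is our assumption, so adjust it to whatever the game’s process is actually called on your install) that logs overall RAM pressure and the game’s working set while you play. If available RAM sits near zero with the game running, Windows is paging.

    # memwatch.py – rough memory-pressure logger for checking the "no paging with 12GB" claim
    import time
    import psutil

    GAME_EXE = "BatmanAK.exe"  # assumption – check the actual process name in Task Manager

    def find_game():
        # look for the game's process among everything currently running
        for proc in psutil.process_iter(["name"]):
            if proc.info["name"] == GAME_EXE:
                return proc
        return None

    game = find_game()
    while True:
        vm = psutil.virtual_memory()
        line = "RAM used {:.1f}/{:.1f} GB ({}%)".format(
            (vm.total - vm.available) / 2**30, vm.total / 2**30, vm.percent)
        if game and game.is_running():
            # the game's resident working set, in GB
            line += " | game working set {:.1f} GB".format(game.memory_info().rss / 2**30)
        print(line)
        # if RAM usage sits near 100% while the game is running, the OS is paging
        time.sleep(5)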

Last but not least, initial tests show the game performing the same as its pre-release version (we assume the one with the Interim patch applied to it); a quick percentage breakdown of the averages follows the raw numbers below. At this point we can safely say that PC gamers will have to rely on additional raw GPU power in order to overcome the game’s un-optimized code. So here is hoping that a new SLI/Crossfire patch releases sooner rather than later.

PC Configuration: Intel Core i7 930 @ 4.2 GHz, Titan X (1425/+500), 24 GB RAM

Pre-launch averages:
with Gameworks:
1080p: 85fps
1620p: 62fps
2160p: 43fps

without Gameworks:
1080p: 131fps
1620p: 91fps
2160p: 57fps

Post-launch averages:
with Gameworks:
1080p: 85fps
1620p: 61fps
2160p: 43fps

without Gameworks:
1080p: 128fps
1620p: 89fps
2160p: 57fps
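For those who prefer relative numbers, here is a quick sketch (plain Python, with the averages above copied in) that turns those results into the percentage cost of Gameworks and the change of the re-release versus the pre-launch build:

    # quick percentage breakdown of the averages listed above
    pre  = {"gw_on":  {"1080p": 85,  "1620p": 62, "2160p": 43},
            "gw_off": {"1080p": 131, "1620p": 91, "2160p": 57}}
    post = {"gw_on":  {"1080p": 85,  "1620p": 61, "2160p": 43},
            "gw_off": {"1080p": 128, "1620p": 89, "2160p": 57}}

    for res in ("1080p", "1620p", "2160p"):
        gw_cost = 100 * (1 - post["gw_on"][res] / post["gw_off"][res])    # FPS lost by enabling Gameworks
        patch_gain = 100 * (post["gw_off"][res] / pre["gw_off"][res] - 1)  # re-release vs pre-launch build
        print("{}: Gameworks costs {:.0f}% | re-release vs pre-launch: {:+.1f}%".format(res, gw_cost, patch_gain))

In other words, Gameworks costs roughly 25-34% depending on resolution, and the re-release lands within a couple of percent of the pre-launch build.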

141 thoughts on “Batman: Arkham Knight – First Tests Show No Performance Gains, Rocksteady Suggests 12GB RAM For Win10”

    1. Are the framerates shown in the article bad for you? What do you want? 120 FPS in 4K with the best graphics? If they fixed the bugs, it seems OK to me.

      1. Honestly man if you need a 4gb titan and 12 gb to get decent framerates with no stuttering fck it, just fck it. That will not even be playable in the future.

        1. I assume that the tests in the article were done with the highest settings. In that case it’s good for me. 43 FPS in 4K with all settings on, with the best single-GPU card, is OK. Most people still have FHD, and at that resolution they should be able to play the game on highest settings even with weaker GPUs.

          1. What can you say? Nothing. Gameworks at 4K only costs 14 FPS and still people are not happy. Maybe I should do some benchmarks of Sleeping Dogs and show people just how HDAO tanks performance way more than Gameworks or HBAO+.

          2. “Maybe I should do some benchmarks of Sleeping Dogs and show people just how HDAO tanks performance way more than Gameworks or HBAO+”

            That’s a good idea, but I’m afraid people aren’t interested in that. Most of them complain about Gameworks only because NVIDIA is behind it. It’s not just HDAO or AA; there are other commonly used settings with a big performance cost, but nobody ever complains about them. If these people are not satisfied with 89 FPS in FHD with all settings maxed, what can please them?

          3. Yes, make a benchmark; it’s not mentioned in any of the reviews that HDAO is a performance killer, apart from that one game. But how many Nvidia-powered games run nicely on AMD?

          4. If you go by Digital Foundry videos, AMD runs fine now in Gameworks titles, even beating NVIDIA. Also, the idea that AMD can’t optimise their drivers for NVIDIA Gameworks is a flat-out lie, because the Omega driver solved the bad frame-rates in ACU.

          5. Enlighten me which game is it and what features exactly ? If you’re talking about Far Cry 4 only then yes it does run better on AMD but when you disable HairWorks and GodRays. It tanks AMD performance hard if you enable them. As for HBAO+ that’s the only tech from Nvidia that performs neutrally on both Nvidia and AMD.

          6. I don’t know how you can make the assumption that those effects would run well on AMD GPUs anyway; they’re optimised for NVIDIA GPUs, so they should run much better there. Optimisation means just that. I mean, if games used heavy Compute then NVIDIA would be in trouble.

          7. Never said that they should run better on AMD, but they shouldn’t be outright impossible on AMD. The problem is that AMD is not allowed to work directly with devs when Nvidia is involved, at least not on GameWorks features, and Nvidia admitted it themselves. As for compute, TressFX is compute-heavy and it was worse on Nvidia, but now it’s better on Nvidia thanks to the open-source nature of the tech. That’s called transparent optimization irrespective of hardware, but you cannot say the same for HairWorks.

          8. Well, it’s all in the hands of the devs, not one side or the other. NVIDIA have built up a relationship with devs for a very long time, way before AMD came along; NVIDIA have the tools and support to back that up, and people like John Carmack back NVIDIA for it (even though he did criticise NVIDIA on PhysX). Ask yourself why devs work with NVIDIA so much; it’s not all about money, it’s because of their support and tools.

          9. Yet where does all that support and experience go when the game performs horribly? Neither the developers nor Nvidia come to the rescue, and fanboys like you then blame the developers for it.

            It’s all about money and how much they feed into developers’ mouths. It’s Nvidia poking its nose into every developer’s business instead of the opposite; they don’t miss any opportunity to market their hardware, whether they can deliver a good title in the end or not. If they have such big SUPPORT to back up their tools, then tell me why every GameWorks title turns out horrible and requires a dozen patches before it becomes playable, while some games like this Batman S H I T are downright impossible to fix. They only have great tools for marketing, that’s it.

          10. You’re making some really bad comments levelled at Gameworks titles regarding the number of patches. NVIDIA’s problem, which I’ve said time and time again (even on GeForcedotcom), is that they associated their Gameworks tech with badly optimised games, end of story.

            You and other people forget to mention that I’ve actually backed up AMD in how they pick their Evolved titles. You know, all you need to do is look at my Disqus history; you can see how I criticise NVIDIA for multiple reasons. Would you like me to quote what I actually said about Gameworks titles being bad because NVIDIA chose badly?

          11. AMD should either offer alternatives to Gameworks or stop complaining about it, because it comes across as them trying to divert attention away from the fact their rival is offering “extras” with their GPUs and they aren’t matching them. Besides, when a Gameworks game runs poorly on AMD they are up in arms complaining; however, when a non-Gameworks game has issues on AMD it’s quietly swept under the rug.

            NVIDIA is making Gameworks features to sell high-end NVIDIA cards; everyone with half a brain knows this and should accept that they aren’t going to spend any money designing software that’s suited to anything other than the GPUs they have to sell. People who choose AMD are well aware of this and know they can choose to disable these features. Besides, if NVIDIA wasn’t creating these features then developers wouldn’t add them into games, as no developer is going to discard their own PC enhancements that are optimised for both AMD and NVIDIA and then choose to replace them with NVIDIA-specific enhancements that are designed for one GPU brand. Long story short, without NVIDIA these effects wouldn’t even exist, so if they don’t run well on AMD just disable them.

            FarCry4 runs better on AMD cards even with all the Gameworks features enabled

          12. Actually highest settings in this game doesn’t match the PS4 version mate as Digital Foundry said the PS4’s textures are the same as PC’s “normal” texture setting and there is now a “high” texture setting which is above console quality according to DF. I can play it maxed out at 60fps now but I haven’t compared the settings , I am waiting on a Digital Foundry face off

          13. yep thats true. on high, after 2 hours the game becomes unplayable for me, it dips below 20fps, lol. but textures on medium are ok.

          1. More like 40-60FPS, DF have a video on the performance, it’s in the low 40s upwards. I posted a screenshot, it’s pending.

        2. It runs great maxed out at 60fps on my Titan X despite being downsampled from 3072x1702 to my native 1440p resolution, although you would expect good performance from that kind of setup with 16gb of RAM and an i7 using Windows 10.

          However I also tried it on my GigaByte Brix, a tiny PC 2 inches tall and 3 inches wide, and it ran maxed at 1080p 50fps despite the fact the CPU is below minimum requirements, as it’s one of those low-power mobile i5 CPUs which is simply an i3 with hyper-threading only running at 2ghz. So as you can imagine the CPU is way below minimum requirements, although it has 8gb of DDR3L and a custom gtx760 with 6gb of GDDR5, which was able to run it maxed at 1080p 50fps.

          I think it runs fine for most people now if you are sensible with the settings you enable, for example the consoles use “normal” textures despite being able to use more than 3gb of RAM for graphics and have unified RAM so no streaming data from DDR3 to VRAM which means anyone using a 2gb GPU will have issues if they try to enable “high” textures as the 2gb VRAM isn’t sufficient and the texture streaming is bad in this game.

          Also people with 16gb of DDR3 seem to have no issues.

          It’s not the best but I am still happy with it on the two PC’s that I have tried it, besides, I only paid £12 for the game and the season pass that’s selling for £32. I would have paid more for the game at launch had the developer put the effort in instead of just locking the frame rate to 30fps instead of fixing the streaming software for PC gamers day one

          1. I stopped reading at the first paragraph.

            Seriously, you think everyone has those kinds of specs?

          2. Well seeing as you stopped reading after the first paragraph then you wouldn’t have read that I also tried it on my GigaByte Brix which has a CPU below minimum CPU requirements and a gtx760 yet it ran maxed out 1080p at 40fps.

            Also to answer your question, no I don’t think everyone has those kind of specs but that’s irrelevant because my point was that I have tried it on a high end PC using Windows 10 and a modest PC using Windows 7 and am happy with the results on both.

            Besides if I am able to run all settings on ultra at 3072x1702p at 60fps then any gamer with a good GPU should expect decent 1080p performance….

      2. A Titan X is a $1000 video card and not a lot of people have it. Most people have 8 gigs of RAM, which, combined with the VRAM on the video card, should be more than enough (twice as much actually) to play an Unreal Engine 3 game. To put things in perspective, the 8 gigs of GDDR5/DDR3 on the PS4 and Xbone already have 3.5 gigs taken off the top for the operating system, with only 4.5/4.0 gigs left for the game itself. See how requiring 12 gigs of RAM on Arkham Knight doesn’t make any damn sense?

        So someone with a gtx 970 and 8 gigs of RAM, enough to max any other game at 1080p and probably 1440p, cannot get a stable framerate in Arkham Knight. That is inexcusable. The elephant in the room is that Unreal Engine 3 streams textures in real time without loading screens, and having Denuvo DRM thrown into the mix, encrypting and decrypting constantly while you are playing, takes up both CPU and RAM resources. God forbid they just dropped Denuvo from the game.

        1. People in other comments say they achieve a 60FPS framerate on max settings with a GTX 970. I assume it’s in FHD. So I still don’t see problems. In one week I will try it with a GTX 780 in FHD with max settings.

      3. you dirty casual scum, a Titan X is f’ing 3000$ alone tf u talking ’bout? you can bend over all what you want with the companies, fgt, i won’t.
        god damn..

      4. “storytelling is why we are playing games”

        No. Just no. You’re either in WB’s marketing team or you got lost on your way to a console focussed site.

        1. I’m not working in WB’s marketing and I’m not a console player, so I’m not visiting console-focused sites. I appreciate games with good graphics and I’m looking forward to games which offer enhanced graphics effects. But there are games that don’t look so good (for the time they were released) but have a great story and gameplay. For me it is still Borderlands or Dead Space; I play these games even now more than new ones. So not every game can look like Crysis. But with better graphics you have to expect a bigger performance impact. If a game like Batman Arkham Knight can achieve 60 FPS with a GPU like the GTX970, I’m not considering it bad performance. I played this game for a while and it looks really good.

  1. A tip for you John!
    You should try the new updated version of the game, as I’m betting you have the old release 😉 LOL!!

    1. Look at the FPS results. They are good. So why complain about it? WB had to fix FPS drops and bugs, and it seems for now that they did their job. So what’s wrong?

  2. Wasn’t the whole problem about drops to a couple of fps and poor performance on some systems, in addition to the 30fps lock? I didn’t expect average fps to rise, since when was 57fps in 4K bad performance? Even for titan x and without gameworks.

    1. arkham origins gets an average of 62 fps in 4k with 2 980’s
      the problems were more than that; it was also fairly buggy in general, and it was missing some visual effects, like smoke and rain or something, on the PC version that the Xbone and PS4 had

    2. The average FPS did rise. This test was just done against the pre-release version, which is the one that had already gotten the FPS-increasing patches a month, maybe more, ago. So the article is as misleading as the game was bugged to sh*t on the original release.

  3. What’s broken is Gameworks. There is no improved frame rate for that; however, turning it off boosts frame rates post-patch by

    66% at 1080p
    70% at 1620p
    75% at 2160p

      1. The Titan X gives 85 FPS in FHD, 61 FPS at 1620p and 43 FPS at 2160p, all with Gameworks on. Is this really that bad? And you still have the chance to turn Gameworks effects off. So what’s the deal? And what is an acceptable performance penalty? Let’s look at one example. When you turn AA on and you compare MSAA 2x vs MSAA 16x, the performance penalty is big. But I never see people complaining about that and permanently hating AA in general. This is again only about NVIDIA and it’s pretty lame.

        1. MSAA is an old technique that used to work flawlessly (with barely any performance impact) in older engines. Deferred rendering has put a stop to that and MSAA is only still an option because it existed previously and people complain when it’s missing. No reasonable person actually uses it anymore, it’s never worth the performance impact. We will re-activate it when we replay today’s games in ten years. Until then it’s just there for the triple GPU people.

          P.S. Remember what your Titan X cost? Remember that Arkham Knight is in no way a good looking game? Get higher standards. 43 fps is barely playable, far from enjoyable.

          1. MSAA was only an example. If somebody complains about worse performance when he gets better graphics, that’s the same to me as complaining about the performance impact of high-level MSAA. But I repeat, it’s only an example. You can take whatever graphics feature costs additional performance and swap it in for MSAA.

            “Remember that Arkham Knight is in no way a good looking game”

            What is wrong with graphics in this game? I think it looks great.

            “43 fps is barely playable, far from enjoyable ”

            Depends on the game and it’s individual. I have no problem in most games with a stable 30 FPS or more. I have a high-end PC btw, so I don’t have to make many compromises. Without stuttering and frame drops, 30 FPS is fine. But I know that there are people who need a stable 60 FPS. Luckily it’s not my case. 🙂

          2. Correct on all counts.
            Also note that the children are constantly quoting averages. Who knows what the minimums are. Based on the average frametime performance of certain green hardware, I would say it’s downright unplayable in terms of minimums.

      2. Clearly they like the version to look like the console version and not have Gameworks, effects that the consoles just can’t handle. PC players complaining about enhanced effects, while crying about downgrades and console level multi-platform games, who’d of thought it.

        1. If they don’t want Gameworks, they can turn it off. Nobody forces people to use these effects, and yet they behave like they have no choice.

          “PC players complaining about enhanced effects, while crying about downgrades and console level multi-platform games, who’d of thought it.”

          The funny thing is that we are talking about the same people. When the graphics don’t look good, it’s bad. When developers offer better graphics, it’s bad too because it costs performance.

          1. And what proof do YOU actually have that GameWorks is not meant to sabotage performance on every other piece of hardware except the best from Nvidia? The irony is killing me.

          2. You want me to prove a negative, do you? It doesn’t work like that, I’m afraid; it’s up to them to prove it, not me, and they have no evidence whatsoever, just assumption.

          3. I’m not running away. You’re putting the burden of proof on me when I didn’t even make the accusation that Gameworks hurts AMD. The only reason it’s assumed is because the libs are closed and nothing more.

            Also, you always seem to argue with me in every Gameworks conversation, what does that say about you then? Where is your proof you’re so right?

          4. What proof do you need, really? Either your eyes are closed or you intentionally don’t want to see. Alright, three AMD-optimized titles are coming: Starfront, Deus Ex MD and Rise of the Tomb Raider. Mark my words that they will only perform slightly better on AMD while giving very consistent performance across the board, but you cannot say the same for Nvidia-optimized titles, especially those that rely heavily on GameWorks, unfortunately.

            Nvidia partners with every dev out there to slap us with GameWorks, because with every such title they can do heavy marketing of their tech and sell more graphics cards, but the title itself turns out average at best in the end. Yet that very same game looks SOOO good in Nvidia promo videos… What a disgusting way of fooling customers.

          5. You gave no proof at all and you never have, all you do is talk and believe what the media say. You call me biased towards NVIDIA yet you’re always in here defending AMD, we’re never going to get anywhere.

        2. We like games that work; visuals and extra eye candy come later. Besides, Nvidia-biased eye candy is more for marketing than for making games better, otherwise this S H I T shouldn’t have happened with all that false advertising and those hype videos from Nvidia.

          1. Then turn them off. Are you so stupid you have to moan about something you can turn on and off yourself? Are you that hard up you have to moan about effects you don’t need to use?

            Stop arguing about choice for f*ck sake.

          2. And are you idiot enough to not know that I also own high-end Nvidia cards, and I bought this game on all the hype built by Nvidia, just like they do every time, and yet it didn’t work? Even now Nvidia owners are complaining. I am still waiting for it to run like they showed in the GameWorks trailers. Go fck yourself with these so-called techs that are biased to the core and only make the game industry worse.

        1. Again, try as you may to avoid what I said about Sleeping Dogs, the Async Compute isn’t even an issue. Just how many games does AMD claim to destroy NVIDIA in regard to DX11 Compute, well, not even a handful, yet their performance in DX11 Compute is superior to the point of it’s not even competition. Now that Maxwell has really good Compute performance, few games actually make such a big difference, in no way is it a lot of games.

      3. No, but you add an additional layer of eye candy when you’re sure that the actual game works well. GameWorks is bringing a bad trend where devs get lazy, do a S H I T PC port and then bluff gamers by adding GameWorks on top of their POS game. This makes things even worse because GameWorks itself is not very well optimized; I cannot expect great optimization effort from Nvidia when this is one of their best ways to promote new graphics cards.

        Is it running slow on a GTX 970? Then buy a GTX 980. It runs slow on a 980? No problem, we’ve got you covered with our 980 Ti. Thing is, the innovation in GameWorks techs is not coming as fast as the rate at which they increase the requirements for them; even a blind person can see this happening, except someone who lacks a brain.

        1. If you add Gameworks to a game which is badly optimized from the ground up, it doesn’t save the game. And that’s the fault of the game developers, not NVIDIA or Gameworks.

          “GameWorks itself is not very well optimized”

          It is optimized, but only for NVIDIA GPUs. NVIDIA is using implementation methods which suit their GPUs, for example line tessellation for hair in Hairworks instead of geometry shaders, etc. This approach isn’t good for AMD HW, which is weaker at tessellation, but why should NVIDIA care about it?

          “when this is one of their best ways to promote new graphic cards”

          This is the best way for all manufacturers to promote their HW: with SW which uses it. What other reason should people have to buy new components? What do you think TressFX was about? It was AMD’s way to make something like Gameworks. Do you think that Mantle was made for the common good and for every player? In the first place it helps AMD’s CPUs. And if you think that AMD doesn’t work on Mantle with this in mind, you are naive. AMD is exactly the same kind of company as NVIDIA, only with significantly less money.

          Another thing people should realize is that the times of big graphics jumps at a small performance price are gone. That was possible because everything was faked, like bump mapping for rough surfaces, static shadows and other “effects”. Now if you implement dynamic shadows or global illumination, it has a huge performance impact.

          1. It is Nvidia’s fault to only care about selling their products instead of improving PC gaming in general. They care neither about performance nor about optimization; all they care about is partnering with every dev, letting them go lazy on the PC port and then slapping GameWorks on top of it as a lollipop for PC gamers, and Nvidia lovers happily accept that lollipop.

            Dream on all you want, but it isn’t optimized. In fact GameWorks is one of the worst SDKs: it not only performs badly on AMD cards but also performs average on old Nvidia cards, otherwise HairWorks should have run great on the GTX 780 Ti. Go to the TW3 forum and you’ll see a lot of complaining Kepler owners. This so-called tech is a gimmick meant to promote every high-end GPU from Nvidia, that’s all.

            Btw, AMD greatly improved tessellation performance in the Fury X cards, so the idea that tessellation is AMD’s weak point is outdated; however, Nvidia is making sure that no architecture other than Maxwell runs their technologies better or equal, whether it’s GCN or Kepler doesn’t matter. Seriously, a 64x tessellation factor for hair? lol, yeah, next time tell me 3 billion vertices for a nail.

            TressFX is open source, so no amount of argument is going to make it look bad; how do you think Nvidia managed to improve TressFX performance otherwise? Mantle was meant to benefit Radeon users, yes, but tell me how it gimped performance on Nvidia cards? Does it hurt DX11 performance anywhere? Does it add any eye-candy effects that only AMD owners can enjoy? The answer is NO. Please find some good facts to counter my arguments.

            And what’s the point of telling me about graphics jumps? I am talking about GameWorks, not graphics overall. With every new card it’s becoming heavier and heavier while looking the same.

          2. Then explain the 23% performance drop for TressFX on AMD GPUs compared to 17% performance drop of Hairworks in The Witcher 3.

          3. “It is Nvidia’s fault to only care about selling their products instead of improving PC gaming in general.”

            It’s not a fault, it’s business. You are talking about developers like you are one of them. I’ve been working in IT for a long time and I can guarantee that they have a lot of work with other stuff than just graphics. I bet that most of the people who accuse developers of laziness couldn’t do their job for long, and that’s not even mentioning that most of these people don’t have the capabilities to do it at all. But I understand, it’s easier to blame someone for being lazy than to try it yourself. I recommend to you and other people like you to buy some books about computer graphics and try to make something of your own. I did it that way and it taught me many interesting things.

            “TressFX is open source”

            The fact that some SW is open source doesn’t mean it’s optimised or the best solution.

            “does it add any eye candy effects that only AMD owners can enjoy”

            Most GW features are not locked only to NVIDIA users. GW is much more than TressFX, which can be compared only to one GW module, Hairworks. Hairworks works fine even on AMD GPUs when it’s properly configured (configuration is not implementation). I agree with you that a 64x tessellation factor is overkill and that The Witcher 3 should have had a better Hairworks configuration from the start. I criticized that too.

            “With every new card it’s becoming heavier and heavier while looking same”

            If the performance impact is bigger without visual improvement or other meaningful usage, that’s bad. I agree with that too.

            “Mantle was meant to benefit Radeon users “yes” but tell me how it gimped performance on Nvidia cards ? does it hurt DX11 performance on anywhere ?”

            Mantle is something different with a completely different purpose. Mantle is an API. It was built to get better performance from our HW. Gameworks was built to offer enhanced graphics effects or features which (by their very purpose) take more performance for themselves. Mantle and other similar APIs are not competition to Gameworks; they can coexist with GW.

            “but tell me how it gimped performance on Nvidia cards”

            As has been said many times, if you don’t like what GW offers, you can turn it off. If you don’t like the performance cost of its features, you are not forced to use them.

          4. If you know so much about business, then you should also know that it should be fair and not just about ripping everyone off.

            As for developers, don’t give me nonsensical excuses please. They get paid for their work and they earn a lot, so there is no excuse for shoddy ports. I also work as an IT and network administrator with some web development projects on the side, and sometimes my shift goes over 10–12 hours, but I earn well, so if I start slacking off then I am not worthy of my job.

            Open source software is transparent, if it’s optimized then anyone can see that but we cannot say this for closed source software.

            Yes I agree with the fact that most GW features work on AMD though their performance is always questionable.

            And now you’re changing your statement about Mantle. You said earlier that AMD did it for their own good, and I said yes, though the difference is that GW also hits AMD users with bad performance while Mantle doesn’t affect Nvidia users in any way. If you have an Nvidia card and you get 60 fps in some game in DX11, then Mantle won’t change that; however, if you have an AMD card and you get 60 fps in some game, GW can make it 40 while giving 50 to Nvidia users. This is a pathetic business practice, because hardware vendors should only compete on hardware grounds, or if they bring software technologies then they should be transparent for all so they can be optimized equally, not some black-box program that restricts devs from working with AMD or vice versa. Ideally, all graphics technologies should come from developers, not from hardware vendors, who are biased.

            I can turn off GameWorks, yes, but it doesn’t change the fact that it’s making developers lazy; they do shoddy PC ports, then slap GW on them and then fool us with a so-called superior PC version. Not to mention Nvidia’s half-baked support is getting ridiculous and tiring. See what happened to Batman: Nvidia heavily promoted that title, showed a lot of false performance videos, and how did it turn out? Then they told us that they were helping Rocksteady fix the game and that people would love what they’d get, but where is that claim now? Where is Nvidia now, after the re-release of this game failed miserably? What about the tessellation patch that they promised for Assassin’s Creed Unity? I feel for those who bought a GTX 970 or 980 for these games; where is Nvidia’s support for those customers?

        2. Well, AMD are losing money year on year; maybe you need to realise they’re doing something wrong with their marketing and NVIDIA is doing something right. Maybe you should look at why 3DFX, ATI and S3 are dead instead of blaming a successful company that has helped some of the top devs of all time. If NVIDIA hadn’t helped John Carmack out when they did, who knows what would have happened.

          Also, where do you get this nonsense about Gameworks being “not very well optimized”? You do know there is only a 17% hit for Hairworks, right? You do know there is a small performance hit for HBAO+, right? Do you really think thousands of hair strands are not going to hit performance much, or do you think that rendering more 2D grass somehow makes the game better? Where do you get the idea that TressFX somehow runs great? No, it doesn’t; on AMD GPUs it took a 23% performance hit just for hair alone on one character, and it tanked the frame-rate really badly once you got close to Lara’s hair.

        1. No, there isn’t, because the time and money to research it for a game engine is too much. Only big game engine studios like Epic can do it, because it’s what they do. This is why NVIDIA steps in with their tech; they provide tools and support for the devs to implement the advanced tech.

        2. Yes. You’ll find all those effects in your average Crytek or DICE game, running well. Gameworks effects are trash.

          1. Can you give me an example of these average Crytek or DICE games? And please don’t compare scripted effects with real-time computed ones.

    1. Stop blaming Gameworks; the game was broken, end of story. It seems to me you’re looking for a scapegoat. The game still runs above 60FPS on a GTX 970 with everything on max plus Gameworks.

      1. They’re not wrong, you’re just confused, my love. I was comparing pre-patch no Gameworks to post-patch no Gameworks.

        1. Then how do you explain nearly 60FPS at 4K max settings with Gameworks off, and only a 14FPS decrease with Gameworks on at 4K?

          Seems to me you’re looking for reasons to hate Gameworks and failed.

          1. That seems to be the only resolution that improved as you noticed the other ones did not.

          2. Yeah, it’s funny how you’re making such big assumptions off one benchmark on one system. The main complaint wasn’t the averages; it was the dips in frame-rate, and running uncapped above 60 caused the most issues, which it now doesn’t. Watch the video posted here of a GTX 970: always over 60fps, no drops at all.

            Also, the bigger-VRAM cards didn’t have the issues so much; it’s the lower-end cards that had the issues with frame-rate drops. TotalBiscuit even said this, and he has Titan X SLI; Sloppywetblow says exactly the same thing on his channel about the game.

    1. Sure, but the thing is, my friend even lowered the resolution to 480p… and he was still lagging on a top-of-the-line PC. There is something really odd with that game. Console gamers of course keep saying how the PC version is worse than the PS4’s… -_-

      1. Yeah, and it seems our Johnny boy is one of them!
        WTF, just STFU and enjoy this game, which now seems perfectly fine!

        Well, people like to b*tch; most of them haven’t even played the damn game, I guess. Pathetic losers :/

        1. There is NO SLI support after FU*KING MONTHS/WEEKS. I NEED SLI support because I use a 690. Nothing more to say 🙂

    1. Nope. Even Dying Light has better textures and effects than Arkham K.

      So it’s just plain $hit from Rocksteady.

    1. Not hiring more competent programmers to fix their mess. It’s clear whoever was programming the existing game was either under unreasonable time constraints or plain incompetent. And this patch proves that the situation has not changed.

  4. “…having at least 12GB of system RAM on a PC allows the game to operate without paging and provides a smoother gameplay experience.”

    Ok, so what you are saying is that we have to play this in the future and STILL get performance issues.

    Man, it isn’t worth playing even if it were free.

  5. Guys, look at the video posted by Denis Kutanjac: max settings, Gameworks on, and it seems to run perfectly and above 60FPS. I didn’t see any horrible frame-rate drops, plus it’s uncapped.

  6. This is not about fixing but about better textures. As a PC player I expect to have better visuals comparing to console version. And I know that it costs more HW resources and needs more performance.

  7. So what was Nvidia actually doing in bed with them all this time? Seriously, 12 GB for Win 10, paging issues on Win 7, and not even an SLI profile after all these months. A total piece of S H I T.

    1. The key word here is “suggested”: 12GB of RAM is recommended for Windows 10. I’m not sure why people think it’s anything other than a recommendation. If 8GB of RAM still gives bad stuttering, then yes, we could consider this update to be a failure.

      1. To be honest I don’t care, because I have 32 GB, though they are suggesting it because they saw a problem there which they cannot fix. This game is made on Unreal Engine 3 (one of the most proven engines in gaming), so those atrocious requirements don’t make any sense, and this re-release is a failure because people are already complaining about it on Steam, with some even making YouTube videos. I don’t know what other proof you want.

          1. I don’t have the game anymore, I returned it while I could, but all the Steam reviews shouldn’t be ignored, and John also mentioned that the game is utilizing 9 gigs of RAM and that it has a bug where GPU utilization drops to 50% of what it should be and the frame rate drops to 35/45.

          2. I just posted benchmark images of Sleeping Dogs: an average 28FPS performance drop with AMD’s HDAO on, and nearly a 50FPS drop in max FPS.

  8. I’ve posted this yesterday already, but I’m gonna re-post it ’cause I’m too lazy to write this again 🙂 :

    “About the Batman patch that’s coming in a couple of hours: I don’t think it’s gonna be a performance-related patch (like SLI support etc.). In the Steam statement you can’t find anything related to increasing the performance. It’s only a “content” patch.
    “…we’ll also be releasing a patch that brings the PC version fully up-to-date with content that has been released for console.”
    “This means that next week, all PC players will have access to Photo Mode, Big Head Mode, Batman: Arkham Asylum Batman Skin, and character selection in combat AR challenges.”
    etc.
    I think the game will still run like crap, but I hope I’m mistaken.
    Well, anyways, Fu*k WB!
    🙂 “

  9. played around 1 hour (after a 12-hour, 10GB update). it runs 60fps for now (two 1-sec freezes) on ultra without GW effects, but let’s see if it can maintain that 60fps over more hours. but man, those DLCs are useless, it’s just some ugly skins, and those challenge maps should’ve been in the game since day 1 for free. also, that r*tarded blue line at the bottom of the screen has been fixed.

    also, 4 free games and a free challenge map pack is ok. however, you can buy all of those games in sales for around 16$, cheaper than this re*arded season pass.

    also, don’t waste your money on this t*rd, wait for a crack, it’s not worth the money.

  10. Please, beta Windows testers… do make sure it arrives on SteamOS well optimized…

    So… complain at STEAM as much as you like!

    #SteamOSdayONEftw

  11. lol. pc gaming is dead. meanwhile, console copies of batman sold more than 5M and the PS4 has sold 40M consoles. gg pc gaming with no exclusives.

    1. Even a console pleb can’t get his sales figures right; let me, a PC user, get it right for you.

      “Our installed base now is well over 25 million”

      “Sony has admitted that, following the delay of Uncharted 4: A Thief’s End into 2016, its first-party lineup is “a little sparse” for this holiday”

      LOL

  12. RIP PC gaming… it’s so simple: optimization on PC costs more than thrice the budget of console optimization. It is business… and PC doesn’t make a good market anymore…

  13. First, the Titan X is a $1000 GPU. Second, if 89 FPS is achieved on max details with Gameworks in FHD (which is the most used resolution), then you can achieve decent framerates with weaker GPUs in the $500 range. Yes, you have to lower settings with mainstream GPUs, but that’s expected. Or do you want 60 FPS and max-detail graphics on low-end GPUs?

  14. HAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA HA HA HA HA HA HA HA HA HA

    “They’ll fix it”, said the casual plebs.

    No, they won’t. They never do. They sometimes release patches claiming the patch does this or that, and when you actually look at it you’re usually one of the mysterious, rare users who still experiences the same problems as before the patch. Then you talk to some people and it turns out everyone still has the same problems and the patch didn’t do anything, or actually made some things worse. The state in which a game is released says everything you need to know about the company behind it. If it’s broken at release, it will remain broken. It took FANS to fix terrible releases like Gothic 3, Oblivion and Dark Souls in the past. And games are more closed off than ever before; modders can’t access the important parts of most games nowadays. Don’t buy broken games, there is no excuse.

  15. eurogamer.net/articles/digitalfoundry-2015-re-released-batman-arkham-knight-performance-analysis

    FUBAR4EVER
