
Batman: Arkham Knight – Minimum PC Requirements Updated, AMD Performance Issues Reported

Rocksteady has updated the PC minimum requirements for Batman: Arkham Knight. According to the new minimum PC specs, an AMD Radeon HD 7950 will be required in order to play the game. Rocksteady has also listed a GeForce GTX 660 as a minimum GPU for NVIDIA owners.

In addition, Rocksteady reported that there are currently some known performance issues with AMD’s graphics cards. Rocksteady claimed that it is currently working with AMD to “rectify these issues as quickly as possible and will provide updates here as they become available.”

Here are the updated minimum PC requirements for Batman: Arkham Knight:

MINIMUM:
OS: Win 7 SP1, Win 8.1 (64-bit Operating System Required)
Processor: Intel Core i5-750, 2.67 GHz | AMD Phenom II X4 965, 3.4 GHz
Memory: 6 GB RAM
Graphics: NVIDIA GeForce GTX 660 (2 GB Memory Minimum) | AMD Radeon HD 7950 (3 GB Memory Minimum)
DirectX: Version 11
Network: Broadband Internet connection required
Hard Drive: 45 GB available space

  • realist

    go figure..

  • archaven

    this doesn't look good for AMD..

    • Anaron

      Except it isn't AMD's fault. And I wouldn't be surprised if NVIDIA users with Kepler cards suffered performance issues as well. NVIDIA has a history of shady behaviour. Take a look at Project CARS. They're so eager to push the GTX 900 series that they'd let their own customers suffer too. What a joke. I mean, in what world does a measly GTX 960 (with a 128-bit memory interface) outperform an R9 290X and GTX 780? (Quick bandwidth math below.)
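
      For the curious, a back-of-the-envelope bandwidth comparison (clocks taken from the public reference specs; the snippet below is just arithmetic, not a benchmark):

      ```python
      # Peak memory bandwidth in GB/s = (bus width in bits / 8) * effective clock in GT/s
      def bandwidth_gbs(bus_bits, effective_gtps):
          return bus_bits / 8 * effective_gtps

      print(bandwidth_gbs(128, 7.0))  # GTX 960:  112 GB/s
      print(bandwidth_gbs(512, 5.0))  # R9 290X: 320 GB/s
      print(bandwidth_gbs(384, 6.0))  # GTX 780: 288 GB/s
      ```

      On paper the 960 has roughly a third of the 290X's bandwidth, which is the point being made.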

      • mareknr

        The joke is these funny theories. Yes, there were some problems with older NVIDIA GPUs, but they were solved with the latest drivers. I own a GTX 780 and I can tell you that in The Witcher 3 I have significantly better performance than before this driver.

        • Frosty Mug O’ Beer

          Yep, I have two 970s and two 660 Tis. Prior to getting the second 970, the 660 Ti SLI setup performed on par with a single 970 in most games. I can attest that The Witcher 3 now shows the expected results.

  • Gutts

    I want the option of turning off all the NVIDIA GameWorks effects

    • djghost1133

      You always have that option.

      • Gutts

        not always

        • mareknr

          Can you give us an example where you didn't have the option to turn GameWorks features off?

  • PublicNuisance

    This is just being found out now? Shouldn't it have gone through QA prior to this? I guess it's good to know before launch, but finding out a day before shows sloppy QA.

    • Primey_

      AMD probably didn't work with Rocksteady during development. AMD is known for not working with devs, unlike NVIDIA.

      • Psionicinversion

        Or NVIDIA's GimpWorks screwed the performance up at the last minute.

        • Primey_

          It's just PhysX and you can turn it off. Nice try blaming NVIDIA though 🙂

        • Then tell me, why doesn't AMD send an engineer to the game dev team and help them optimise the game better for their GPUs? It's pretty simple: the game just doesn't use the optimal path for AMD GPUs, so of course it's going to run slower.

          • Sandy Bridge

            Console sales of this game will easily be 10x what PC sales are, if not more.

            You're wondering why not much money gets invested in the 1/4 of the 1/10 of the market that is PC –

            Because it is not financially worth it. It's worth it for NVIDIA because 3/4 of PC gamers use NVIDIA cards, so they sell much more than AMD.

          • Shredder Orokusaki

            Well, all the Batman games still sold millions on PC.

            http://steamspy.com/search.php?s=Batman
            More than 2 million. So console sales will not be 10 times higher when the PC version sells 2+ million copies; maybe 0.5x more than PC, at most. And before you ask how Steam Spy works: it searches Steam user profiles for owners of the game name you type, then shows the total number of Steam profiles that own the game (i.e. the people who bought it) – roughly the sketch below. Sales for this one on PC will be as good as the previous Batman games were.
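
            Something like this toy version (my own simplification; the appid, sample size and user count are made up for illustration):

            ```python
            # Toy version of the Steam Spy idea: sample public Steam profiles,
            # count how many own the game, and extrapolate to the user base.
            # All numbers here are made up.
            import random

            def estimate_owners(profiles, appid, sample_size=10_000, total_users=125_000_000):
                sample = random.sample(profiles, min(sample_size, len(profiles)))
                owners_in_sample = sum(1 for owned in sample if appid in owned)
                return total_users * owners_in_sample / len(sample)

            # Fake data: each profile is a set of owned appids; 1 in 60 owns the game.
            profiles = [{12345} if i % 60 == 0 else set() for i in range(100_000)]
            print(round(estimate_owners(profiles, 12345)))  # ~125M / 60, about 2.1 million
            ```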

          • They won't be 10x, that's just silly. Plus, PC makes more per copy than consoles do, because Microsoft and Sony both take their share of the game, plus money for patches.

          • SGTBuzzyBean

            A guy from AMD said that GameWorks titles don't allow AMD to optimize as closely as NVIDIA can. Something about not being able to access the source code.

          • But like I said, if you look at benchmarks you'll see it's not that bad for AMD. In fact, they win in some NVIDIA titles, and NVIDIA wins in some AMD titles as well. AMD just needs to optimise their drivers better, which they clearly don't.

          • Anaron

            I’ve seen a lot of benchmarks for games on AMD and NVIDIA hardware. The prevailing trend is that AMD cards perform worse than they should in GameWorks titles. No amount of driver optimizations will change that because AMD isn’t allowed to have access to the source code for GameWorks libraries. If they’re allowed access, then they can optimize it.

          • It's not GameWorks that's the problem, it's the game itself. NVIDIA picks poorly optimised games.

          • Anaron

            Poorly optimized for NVIDIA’s hardware, sure. AMD gets a bit of a head start because of the consoles but that only applies to multi-platform games. NVIDIA is definitely better than AMD at releasing optimized drivers though so team green users won’t suffer for long.

          • SGTBuzzyBean

            True, FC4 runs better on my 270X than my friend’s 660 which is the closest equivalent to my GPU. I guess the game will still run fine on AMD. DX12 will help AMD though according to DF’s DX test.

          • connos

            So you just send engineers here and there whenever you want, even if NVIDIA is behind the game? No one will notice or say anything and all will be happy, especially NVIDIA? And then they will hold hands with the dev and optimize the game together? Are you serious, man…

          • I'm not sure what you're trying to say. I mean, if you want an example of a good dev partnership, ask John Carmack, since NVIDIA helped him out when he needed it most.

          • connos

            Man, is this a good example? Of someone saying something? There are others who say AMD helped them.

            Also, for me Carmack is overrated. Since Doom 3 in 2004 he has created nothing; the Rage engine was a failure, and Doom 3 was the most limited engine ever created. Don't get me wrong, Doom 3 was a great game for me. He is also very biased towards NVIDIA. Even when AMD was doing the right thing with Mantle, he criticized it in front of Johan Andersson, who helped create it. From that day I lost all respect for him. And Johan has said in a lot of interviews that AMD helped them a lot.

          • Carmack doesn't design games, and Mantle is just a pawn to push DirectX, nothing more. He's been a figurehead for OpenGL for a long time, something AMD hasn't been, so they created their own API, now pretty much dead, for a hidden agenda; we now know it was to make DX12.

          • connos

            I didn't mention games, I mentioned engines. Other people have created much better engines than he has.

            And please stop imagining stuff. There isn't any hidden agenda. AMD was actually fighting with Microsoft, blaming them for DirectX while Microsoft didn't see any problem with it. OpenGL's Vulkan will be based on AMD's Mantle; AMD is contributing to and helping OpenGL create Vulkan. Follow the links below to educate yourself:

            2011: AMD blames DX for bad performance in PC games
            http://www.bit-tech.net/hardware/graphics/2011/03/16/farewell-to-directx/1

            OpenGL and AMD support
            https://community.amd.com/community/gaming/blog/2015/05/12/one-of-mantles-futures-vulkan

          • You're talking about Windows; their OpenGL support and drivers on Linux were poor and still are, so much for AMD supporting an "open" OS.

            Can't you see? Mantle is dead. They denied the existence of DX12 and then brought out Mantle, then DX12 came along; it's just a pawn in a bigger picture.

          • connos

            So are you using Linux with your GeForce to play games? Mantle is evolving; even DX12 is borrowing from it.

            Microsoft denied the need for a low-level API. AMD pushed in the right direction with developers who are thinking ahead, beyond the narrow-minded Carmack. From the very first day and the very first presentation, they clearly said Mantle might not take off, but that it would push the right buttons and push Microsoft toward a low-level API.

          • Carmack said OpenGL can do the same things as Mantle; he's been pushing for it for years. Carmack developed id Tech 5 with MegaTexture, which no one else did, and then Microsoft copied it in DX11.2/3/12. At least MegaTexture has been used in games, and it doesn't depend on the API, unlike crappy tiled resources.

            Carmack is just one man; he knew he couldn't change the whole API, so he did it himself in his own engine. Now DX11.2/3/12 has tiled resources and virtual texturing, which is pretty much inspired by MegaTexture (rough sketch of the idea below).
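
            For anyone wondering what virtual texturing means in practice, a toy sketch (my own simplification; the page size, texture size and coordinates are made up):

            ```python
            # Virtual texturing in miniature: a huge texture is split into pages,
            # only the pages actually needed are resident in memory, and a page
            # table maps a virtual page to its physical slot (0 = low-res fallback).
            PAGE = 128          # texels per page side
            TEX_SIZE = 16384    # virtual texture is 16K x 16K texels

            page_table = {}     # (page_x, page_y) -> physical slot index

            def lookup(u, v):
                """Map a UV coordinate to the physical slot holding its page."""
                px, py = int(u * TEX_SIZE) // PAGE, int(v * TEX_SIZE) // PAGE
                return page_table.get((px, py), 0)  # fall back to slot 0 if missing

            page_table[(3, 7)] = 1       # pretend the streamer loaded this page into slot 1
            print(lookup(0.025, 0.056))  # hits page (3, 7) -> prints 1
            print(lookup(0.9, 0.9))      # page not loaded -> prints 0 (fallback)
            ```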

          • connos

            I know what Carmack said, and that's why I looked around to see what others are saying and why no one is using it. And you know what? All the others said OpenGL is not easy; there are too many layers, it doesn't work.

            MegaTexture was a flop. Theory and execution are very different things, and we are talking about a guy who abandoned game engines.

          • It's not a flop; Microsoft copied it. Virtually no games use DX11.2/3 tiled resources, while about 5 games use id Tech's MegaTexture. And where did Microsoft get DXTC from? Oh right, from S3, formerly S3TC (OpenGL has S3TC).

            As for OpenGL, Linux and mobile use it, so let's not be small-minded about it: even NVIDIA used OpenGL in tech demos to showcase their GPUs, about 14% of NVIDIA's business comes from Linux, benchmark apps like Unigine use OpenGL, Valve uses OpenGL, Epic Games uses OpenGL, and UE3/4 support OpenGL.

          • DamZe

            Sean, you are the best NVIDIA rep I have seen on the internet in a long while, keep it up!

          • Don't tell them I buy AMD hardware, will you?

      • connos

        Alien Isolation, BF4, Crysis 3, Dragon Age: Inquisition, Tomb Raider, Bioshock Infinite, Civilization, Thief, Star Citizen, Sniper Elite, DiRT Rally, Blacklight: Retribution and many more. All these games perform equally well on any hardware and are very well optimized. There are actually more AMD games than NVIDIA games. NVIDIA is behind The Witcher, Batman and the Ubishit games.

      • HankTheSpank!

        With GameWorks, AMD is actually prevented from working too closely with developers. Here is a quote on the subject from Hallock:

        “Participation in the Gameworks program often precludes the developer from accepting AMD suggestions that would improve performance directly in the game code—the most desirable form of optimization.”

        So a partner studio like Ubisoft can suggest or write enhancements to the GameWorks libraries, but AMD isn’t allowed to see those changes or suggest their own.

        “The code obfuscation makes it difficult to perform our own after-the-fact driver optimizations, as the characteristics of the game are hidden behind many layers of circuitous and non-obvious routines,” Hallock continues. “This change coincides with NVIDIA’s decision to remove all public Direct3D code samples from their site in favor of a ‘contact us for licensing’ page. AMD does not engage in, support, or condone such activities.”

        • Then AMD should optimise at the driver level, which NVIDIA is known for doing a lot better. An example of this is TressFX: NVIDIA performed badly for two reasons, one being that their compute performance at the time was poor, the other that they didn't get the review code, just as AMD doesn't get review code.

          Notice how NVIDIA, 99% of the time, gets an optimisation driver out before AMD and before the game launches. People need to learn that games are optimised at the binary level anyway; you don't need source code, and if you want source code you need to be a partner, which is true for any proprietary software.

          • HankTheSpank!

            That’s what the second part of the quote deals with. Nvidia obfuscates code to impede AMD’s driver optimization.

          • Yes, because it's NVIDIA's code. No one, but no one, can prove they're doing anything wrong in terms of gimping performance, simply because no one has access to the code. It's just a silly assumption brought on by NVIDIA cards actually being optimised better or performing better in certain ways. Just look at Far Cry 4, which actually performs better on AMD GPUs; AMD GPUs won some benchmarks in Watch Dogs as well, and Bioshock Infinite runs better on NVIDIA GPUs despite being an AMD-labeled game. The frame times on AMD GPUs are actually pretty good as well.

          • HankTheSpank!

            Please just read the quote I posted; it is done in such a way that it affects things beyond NVIDIA's own black box. This is not something new, and this is not a theory. This is something developers already know about and have to deal with regularly. You can read their discussion for yourselves.

            These guys are Kostas Anagnostou, senior graphics programmer at Radiant Worlds; John W. Kloetzli, Jr., a graphics programmer on the Civilization team at Firaxis Games; Angelo Pesce, a rendering technical director; and Johan Andersson, the technical director on Frostbite at Electronic Arts.

          • Again, AMD titles don't dominate performance; it's swings and roundabouts in terms of which GPU has better performance, especially after a few patches and driver updates on both sides.

            I give you Bioshock Infinite as an example of this.

          • HankTheSpank!

            You've gotten away from the point. NVIDIA's actions don't mean that AMD will never do better in any game; it's just a way to get an artificial boost in certain games.

          • Then explain why AMD performs better in Far Cry 4 at 1080p medium settings. Oh, that's right, they MUST have gimped Kepler cards; AMD GPUs CANNOT possibly be faster.

            http://www.tomshardware.co.uk/far-cry-4-benchmark-performance-review,review-33099-3.html

          • Frosty Mug O’ Beer

            At the time, everybody thought there was no license available. That has since been debunked: NVIDIA does allow the source code to be viewed by those who buy a license, and any dev can buy a license to see it. Thus, the tweets you posted are inaccurate, and you are spreading FUD…

        • Primey_

          Nvidia said that was bullsh!t. Don’t believe everything AMD says.

  • Oo Dari

    This was totally unexpected /sarcasm

  • Silviu

    NO WAY!
    well well well what do we have here!
    MINIMUM:
    OS: Win 7 SP1, Win 8.1 (64-bit Operating System Required)
    Processor: Intel Core i5-750, 2.67 GHz | AMD Phenom II X4 965, 3.4 GHz
    Memory: 6 GB RAM
    Graphics: NVIDIA GeForce GTX 660 (2 GB Memory Minimum) | AMD Radeon HD 7950 (3 GB Memory Minimum)

    optimization 10/10 (IGN score also)

  • FasterThanFTL

    I am sure the new Radeon Fury X will run Batman: Arkham Knight just fine even if it's not perfectly optimized for it.

  • Psionicinversion

    NVIDIA GimpWorks wrecking AMD GPUs again. Can't wait to see the 7xx/6xx series GPUs fail under it too.

    • Silviu

      True, they're starting to turn against their own green kind, lol!!!
      The Witcher 3 = perfect example!

      • mareknr

        After the last driver everything is OK, at least with the GTX 780. Performance in The Witcher 3 is good.

        • Silviu

          No, I have access to a GTX 780, so don't lie!
          Idk about the Batman driver; to be honest, I haven't tested W3 with it.

          • Frosty Mug O’ Beer

            Post a video of you playing with this 780 card and specs….

          • Silviu

            Ok

          • KingK76

            It has been shown on multiple sites and in forum posts that when playing The Witcher 3 on a Kepler GPU (with settings not exceeding 3 GB of VRAM for "your" 780), the new driver made an appreciable improvement in performance. And not NVIDIA's numbers, but independent websites'. Even if you do "own" a GTX 780, who's to say the settings you are selecting are appropriate for it? If your gains are not what everyone else is showing, then I would guess that you're a) a liar or b) an idiot… Neither is good. So SSSHHHHHHH.

          • Big liar as usual; you don't want to eat your words about TW3 and NVIDIA gimping Kepler performance, because they fixed it.

          • Silviu

            They didn't fix it, IDIOT, you BLIND FANBOY!
            If you look at some YT reviews, plus what I tested personally, they added just 5-8 FPS more!
            Is that a fix?
            Some people are stupid and you define them!

          • mareknr

            I'm not lying. 🙂 I wasn't talking about the Batman driver, but the driver before it; I didn't know there was a newer driver already. Before that driver my framerate in The Witcher 3 was 22 to 36. All settings maxed, HairWorks MSAA at 4x instead of 8x. After that driver, my framerate is 29 to 40 with the same settings, which is a good improvement. What framerate do you expect from a GTX 780?

    • AMDlunatic

      AMD TearWorks detected.

      • John

        kek

      • Phoenix

        More like TearFX 😀

      • KingK76

        LOL!!! Good one! 🙂

    • BigRush12

      Hmm, glad I went with a 980ti

  • Silviu

    I just hope they didn't cut performance on Kepler again!
    And I'm happy; I like the Batman games, and all of them run amazingly in terms of performance!

    • Anaron

      I’d be just as angry if it was an AMD title and they purposely made it run worse on NVIDIA cards.

      • Silviu

        All AMD titles run amazingly on NVIDIA GPUs, especially now: BF4, Tomb Raider, Hitman!

        • Anaron

          Yes, that’s because AMD doesn’t pay or encourage developers to give them an unfair advantage. They also don’t support proprietary tech that is limited to their GPUs. Even Tomb Raider with TressFX runs fine on NVIDIA hardware. Heck, some NVIDIA cards outperform AMD cards with TressFX enabled. And it’s AMD tech.

          • mareknr

            “AMD doesn’t pay or encourage developers to give them an unfair advantage”

            NVIDIA didn't do that either. TressFX in Tomb Raider ran like crap on NVIDIA GPUs on release day, and it took a while before they fixed it. It works the same in some games with GameWorks too: they don't run well on some AMD GPUs on release day, then AMD releases a driver a few weeks later and mostly everything is OK (except bugs, which are the same for everybody, and which have nothing to do with GameWorks).

          • Anaron

            That’s not how it works. NVIDIA is free to optimize their drivers for TressFX but AMD can’t do that with GameWorks effects. Shortly after the release of Tomb Raider, NVIDIA released drivers that fixed any performance issues with TressFX. Some NVIDIA cards ended up outperforming AMD cards with TressFX enabled.

            AMD, on the other hand, can’t optimize their drivers for GameWorks. They’re not allowed to access the source code so as long as NVIDIA maintains that stance, it’ll always run poorly on AMD cards. Keep in mind that developers can access the source code but they’re contractually obligated to keep that information to themselves. They’re not allowed to optimize GameWorks themselves for AMD hardware. NVIDIA knows that optimizing it would benefit AMD and they don’t want that at all.

          • The payment comes in the form of NVIDIA engineers, not money.

          • connos

            You wish it were only that.

          • Prove otherwise or stop talking.

          • connos

            I am not saying that I know, like you do. It's as if you were there and closed the deal yourself. But use your "logic". AMD also said they didn't pay anything for BF4; do you actually believe that? Then the developer showed a box full of AMD cards and a thank-you note on Twitter. Come on, man, wake up.

          • Frosty Mug O’ Beer

            AMD paid 8 million to Johan and his crew… Nothing is free…

          • That’s why I said they’re paid in NVIDIA engineers, never said anything is free.

          • connos

            No, it's not only that, and you know it. They may not even send engineers, judging by the state of the game.
            It's plain marketing on both sides, a "you mention me and I'll mention you" kind of thing.

          • Again, you can't back up what you're saying. The game may run like sh*t, but that's not NVIDIA's problem; their tech is optimised for NVIDIA GPUs, and it's the game dev's job to optimise the game fully for the PC as a whole.

          • connos

            Neither can you; we are all making assumptions here. And how is it not NVIDIA's problem? If it performs well, it's because they sent engineers and supported the developers in using the OPTIMAL RENDERING PATH, but when it doesn't perform well for everyone, it's only the developers' fault? Make up your mind; again, you don't make sense.

          • ROdNEY

            And was 5 mil enough for the state of the features inside 2 Ubisoft games? Apparently not.
            Let's hope next time NVIDIA pays 50 mil for 2 games, and we will see even better "optimization" from their engineers.

        • ROdNEY

          Most of EA, Square Enix and some other developers are in the Gaming Evolved program.
          The quality of optimization in even the worst GE game is incomparable to the best that Ubisoft (an NVIDIA-supported dev) can manage.

        • JAGUARCD32X

          Actually that isn't true at all. Hitman had a funny bug where it wouldn't break 50fps whether you used a GTX 690 or a Titan, as shown by Tom's Hardware benchmarks and tested by ME. Sleeping Dogs had abysmal performance on NVIDIA GPUs, so much so that a lowly 7970M outperformed a 680, and a GTX 690 ran no better than a 7970 GHz Edition with supersampling enabled. It took many months for NVIDIA to fix this, and even now the game runs around 20fps better on AMD than on like-for-like NVIDIA hardware. Tomb Raider ran like ass on NVIDIA hardware because the game inexplicably had its whole build changed right before launch. So you very obviously don't know what you are talking about, and I can prove it. Even Tom's Hardware commented on Hitman having a "strange" bottleneck that only affects NVIDIA cards…

          Also, any tech that AMD has used in games, like TressFX, has been open source; they didn't spend money, time and effort designing it.

          Then you have NVIDIA-sponsored games like Far Cry 4 or Metro: Last Light that happen to run better on AMD cards, as does Batman: Arkham City… So you really should be giving AMD the hard time, because otherwise they won't change if it's, you know… all nasty NVIDIA's fault…

      • John

        People nowadays cannot see beyond their bellybuttons. It's how we end up getting scr*w*d as customers the way we do.

        • Silviu

          Very true!
          I was blind too in 2012, and I regret all the insults addressed to people with AMD 6970s and GTX 580s; I was too blind and foolish!
          I'm sorry!!!!!

          • Anaron

            You’re making progress. I like that. 🙂

        • Anaron

          Sad but true.

    • Aki Mikage

      I'll state this first: I'm asking honestly here. I've been using generations of NVIDIA cards, and I run benchmarks when I update drivers. The performance either increases or stays flat, but never did it decrease drastically. The most FPS I lost in a driver update was 1-2 FPS, and in exchange for something positive. Now, I see a lot of people saying that NVIDIA gimps their older-gen cards, so I searched the internet but never really saw proof of that. Are there any benchmarks that prove NVIDIA gimps their older-gen cards?

      • mareknr

        There were issues with Kepler and older GPUs in some new games, but they have already been solved with the latest drivers. Of course AMD fans will use this for the next 10 years as proof of general shady behavior; they grab onto anything they can.

        • Silviu

          They didn't solve anything!
          The FPS boost on the GTX 780 Ti, 780 and TITAN was just 5-8 FPS, LMAO.
          And on the GTX 770, 680 and 660 Ti only 3-6 FPS.
          Meanwhile, the GTX 960 2GB, a card that loses against the GTX 770 in every game, wins only in W3, where it destroys everything from Kepler… fishy marketing. I demand you open your eyes, sir, and see this dirty game of gimping performance for more money.

          • Frosty Mug O’ Beer

            I find it laughable that you fail to see the difference between the two archs. W3 is compute-heavy, something Maxwell excels at and Kepler sucks at… Imagine that… And as more compute-heavy games come, this will be more of an issue. As an owner of AMD and NVIDIA (Kepler and Maxwell), I laugh even more that you and many others would rather NVIDIA play lay-down Sally instead of advancing their arch, and gaming in general… The reality is this is about guys who do not have enough money to be PC gamers in the upper echelons. You want top-tier performance now, and for several generations… You clearly weren't a PC gamer back in the '90s, when you needed 3 brands of cards, a suitable sound card, etc., and you kept buying every year as things advanced. This isn't console gaming… things get better much sooner. If you cannot keep up, that isn't NVIDIA's fault.

          • Silviu

            Yes, it's a big difference of course, lol, when they cut driver performance to make Maxwell look decent. As for The Witcher 3, its graphics-to-performance ratio looks worse than BF4's, lol, and it runs 500% worse because of the driver performance cut. Anyway, not the first marketing scam out there. "PS: especially with poor people like the Poles at CDPR, who can be corrupted very easily."
            Same goes with Batman; look at that performance: 45 FPS average at 1080p on a GTX 980 Ti!!!
            With the upcoming Pascal, their mid-range cards will destroy Maxwell by 35-50 FPS in every game, especially the new ones, and I can't wait for that "laugh" to happen.

          • Frosty Mug O’ Beer

            Why wouldn't Pascal be that much better? As a PC gamer, I expect results from every new gen of GPUs… Obviously you do not… Look, you are obviously an AMDrone… it's cool with me. I owned 2 290s at one time, not bad cards; I plan on buying a Nano or 2, depending on unbiased benchmarks, and then I will buy next gen as well. Because I can… That is PC gaming. Go play on a console if you can't hang.

            P.S. A process node upgrade and a new arch should net a 35 to 50% gain. I am looking forward to it. If you are not, you certainly are not a PC gamer.

          • Silviu

            Ofc at higher resolutions than 1080p, and I respect the evolution of GPUs in that direction for 4K/60FPS, but nowadays 1080p/Ultra at 60 FPS should be a standard, since every PC gaming rig is much better than a PS4 or Xbox One!
            As for consoles, I will never move to them due to the 30 FPS limit (or 60 at best) and no K/M whatsoever.
            Things I love in PC gaming:
            1. I am loving GTA V at 60 FPS on Ultra
            2. I love BF4 and BFH at 120 FPS on Ultra settings with mind-blowing textures and graphics effects
            3. CS:GO at 350-400 FPS on Ultra at 1440p
            4. Just Cause 2 multiplayer with 1,500 people on a server, again on Ultra, 2x MSAA, 150+ FPS
            5. ARMA, I know it's 35-45 FPS, but it's ultra fun and smooth
            6. FIFA 15 on Ultra at 1080p, 280+ FPS
            And most games with amazing graphics that consoles can't run and can't render at higher FPS; that's why I love PC gaming, and I also love the flexibility and freedom that PC gives me.
            WHAT I DON'T LIKE is the marketing scams that people are too blind and stupid to see, and a GPU company (mostly the green one) cutting performance on an older architecture to make the newer one look better. We are not stupid, and we know that we already have hardware that is OP for 1080p/60 (besides CryEngine)!

            I am not rich enough to spend money only on the latest hardware; I spend money on whatever is worth it and has a good lifespan. Seeing a GTX 980 Ti doing 45 FPS average in Batman, well… and mid-range Pascal destroying it just through the "driver" will be just sad. It is sad in general, because NVIDIA doesn't have a real competitor, so things like this will happen more often.

      • Silviu

        Go on the NVIDIA forum: there are at least 208 guys with GTX 660 Ti/670/680/770 SLI and GTX 780/780 Ti setups who will show you the performance cut in every single game with each new driver released, plus lower 3DMark scores, and it comes purely from the NVIDIA driver, just to scam the old architecture and make the new one (Maxwell, in this case) look better.
        The same thing happened with the GTX 580, and the iCafe drivers (optimized for every architecture) slapped NVIDIA's face really badly at that time!

        • Aki Mikage

          I can't find the complaints you're talking about. Can you give me the links?

  • The simple fact is that NVIDIA works with devs to optimise games better for their GPUs, not to cripple the competition. AMD GPUs just don't run through their optimal render path. People also need to understand that newer GPU lines like Maxwell do tessellation and Direct Compute vastly better than previous architectures.

    Nothing to see here, guys. This happened before, when ATI got superior Pixel Shader 2.0 performance in Tomb Raider: AOD and the game wasn't gimped for NVIDIA GPUs at all; same with Far Cry, where PS 2.0 performance was so bad on NVIDIA GPUs that they had to use half-precision floats to claw some performance back.

    If you don't believe me, go look at Direct Compute and tessellation benchmarks for Maxwell. Compute on Maxwell is up there with AMD GPUs now, whereas before, AMD's compute performance was destroying NVIDIA's. Tessellation is even better now; that's why Fermi and Kepler perform worse.

    • connos

      Your comparison is way off. Performing better on the newest cards is not the same as underperforming on capable cards. This is a trend with every Batman game and every NVIDIA game; they had plenty of time to make it work, and there are no excuses. Do you want me to accept that they developed the game to be playable only on 900-series GPUs? What about the consoles with AMD cards? What render path do they run, the unoptimized one? Why did they bother optimizing for the consoles? Why did they bother developing the game for the consoles at all? For their great Direct Compute and tessellation? Your post and your understanding of this situation are way off.

      • 1. You can't compare to the consoles, end of; plus they have less detailed tessellation, or no tessellation at all.

        2. Do you really think Maxwell's superior tessellation performance plays no part in HairWorks and tessellated geometry? What about Maxwell's superior fillrate when it comes to heavy foliage like in TW3? It all adds up.

        3. Also, if devs don't get help from AMD, how are they supposed to optimise for AMD? Go look on their website for GPU docs? Consoles won't help them, because consoles are optimised differently, with a much lower-level API that talks directly to the metal. DX11 is the bottleneck of PC gaming at the moment, either that or the driver.

        4. Each GPU has its own optimal render path, whether for MSAA, shaders or rendering geometry, and as we know, GPUs don't all run on that optimal path unless the devs are helped by the vendors.

        5. Money talks; devs don't want to go out of their way to optimise multiplatform games well. They have a budget and do as little work as possible, that's just the way it goes.

        • connos

          1. Yes, you can compare.
          2. Don't care, shouldn't matter. And I said faster, not optimized; they're two different things.
          3. So you expect a game to have both NVIDIA and AMD engineers working on it? And then what logo goes on the box, a red NVIDIA logo? Come on, man. There are more AMD games than NVIDIA ones; how is that possible if AMD doesn't send engineers to the devs? There are plenty of videos of AMD working with devs. AMD created Mantle with the devs.
          4. Red NVIDIA logo all over again.
          5. So the problem is with the devs.

          • ROdNEY

            Yeah, AMD doesn't work with developers, yet most of those games AMD "never supported" win best-optimized-game awards each year.
            Makes real sense!

          • Jay

            Game optimization is a metric of game development, not of driver support.

        • ROdNEY

          AMD does work with developers; it's just that NVIDIA adds the newest GameWorks DLLs to the latest build only months before release, and AMD hardware usually takes a huge hit from that. AMD cannot fix it, as the code is not accessible, so all they can try are workarounds like limiting the maximum tessellation factor, etc.
          The entire AMD Radeon SDK is open to everyone, so NVIDIA never had such an issue with AMD games, or if they did, then they must really suck at optimization.
          Also, using extreme tessellation without any visual gain is not optimization; it is just making sure only some hardware will be able to run it.
          Developers in the NVIDIA program get paid; Ubisoft received millions for a single game.

    • Martin

      NO NVIDIA OBVIOUSLY SABOTAGES AMD LIKE ALL THE TIME OLOLOL, ALSO AMD DRIVERS BEING GARBAGE IS AN URBAN MYTH OLOLOLOL

    • SGTBuzzyBean

      That's understandable, yes, but if the game runs like utter garbage on AMD, then clearly something is up.

      • Their drivers. You, me and R9 200 series users learned this from AC Unity; remember how bad it was, and that a driver fixed it?

        I had a few older games that performed terribly on my old R9 280, including Crysis; performance was horrible in places.

        • SGTBuzzyBean

          Yeah, the Omega drivers improved performance a LOT, but that was the only time I got a huge increase just from a driver update. I'm not saying the game is going to run badly; I don't mind GameWorks, and I don't mind them optimizing for NVIDIA as long as it still runs decently on AMD. But, say, a 660 getting 10-15fps more than a 7950, something like that is messed up.

          • My GTX 660 used to perform better than my R9 280 in some games I had; Crysis and EVE Online had frame-rate problems on the 280 where my 660 didn't.

          • SGTBuzzyBean

            That doesn't really make sense. I can understand it with games that favor NVIDIA, such as Bioshock Infinite or, like you said, Crysis 1, but in general the R9 280 should be faster.

          • JAGUARCD32X

            I imagine that could be down to your CPU, as it's now well known that AMD cards have massive CPU bottleneck issues with weaker CPUs or i3s. It's been shown time and again on Eurogamer, where the AMD card becomes CPU-bottlenecked and drops as low as 30fps while the equivalent NVIDIA card on the same i3 continues to run at 60fps.

  • DatocanNyvengyn

    So, how's the game's performance on the current crop of AMD GPU-based consoles, with shared memory?

  • goodbyejojo

    An NVIDIA-promoted title: a 2GB 660 for NVIDIA but 3GB for AMD, yeah…

    gg

    • ZachFMorgan

      It's normal; an HD 7950 with only 2GB doesn't exist.

      • goodbyejojo

        But there are 2GB cards in the same range as the 660, if not more powerful.

        • HankTheSpank!

          Yeah, the 660 usually lands somewhere between the 7850 and 7870 depending on the game. Right now its closest equivalent is the R7 265.

          • Alawami

            The 7950 is ~10% faster than the 660 when there is no VRAM bottleneck. I don't think 10% will make that much of a difference.

  • Shredder Orokusaki

    As a GTX 970 owner, I have nothing to fear! It will run perfectly maxed!

  • bimmyz

    meh, typical WB, remember the overhyped GPU requirements for Shadow of Mordor?

  • Prometheus

    Can't wait for your performance analysis so I can decide whether to buy the game or not 😀

    • D.Zoolander

      My thoughts exactly. AC didn’t run very well so I’m not holding my breath for AK.

  • HankTheSpank!

    So nice of them to withhold this info until the day before launch

    • Silviu

      I have a cinematic feeling, and a bad one.

  • stalker420

    Do devs these days really release games without testing them on AMD cards?

  • Ground Zeroes was an exception!!!

  • Amir

    45GB of HDD space required for minimum, 55GB for recommended and ultra (also a GTX 980 for ultra). The pre-load is 33GB and a 3.5GB day-one patch is rumored; that still leaves roughly 9GB of the 45GB minimum unaccounted for, so maybe a 9GB+ HD texture pack?

    • Wahid

      There is no recommended or minimum for storage 🙂

      • Amir

        Minimum:
        Hard Drive: 45 GB available space

        Recommended:
        Hard Drive: 55 GB available space

  • Anaron

    AMD cards perform better in some titles because they were optimized for their GCN architecture. These are games that have been released for the Xbox One and PS4.

    • What does that have to do with anything? FC4 is an NVIDIA GameWorks title.

      • Anaron

        Not all developers implement GameWorks effects the same way. You mentioned AMD cards performing great in Far Cry 4 and I told you why. It isn’t limited to just Far Cry 4 either. The same applies to Ryse: Son of Rome and Dragon Age: Inquisition. Just because it’s a GameWorks title doesn’t necessarily mean AMD cards will perform worse.

        • Yet the assumption is that AMD GPUs will perform worse, and you can make that assumption purely from how late AMD releases drivers.

          • Anaron

            Late? Oh, I guess that’s why the Catalyst 15.6 drivers for Arkham Knight are out before the game is. And let’s not forget about the day one driver release for GTA V.

          • A beta yeah.

          • Anaron

            It’s only a beta because it isn’t WHQL certified. All the previous beta drivers have been stable.

          • They’re beta, so you’re a beta tester, end of story.

          • Anaron

            Really? Now I know you’re an anti-AMD troll. You’ll grasp onto anything if it means saying something negative about AMD.

            The sad thing is, you think you know what you’re talking about but you really don’t.

            What’s the matter? Does it feel bad to be called out?

          • It's not trolling; it's a beta driver, that simple. I've criticised NVIDIA for doing this as well. I see you're jumping to conclusions over one line of text.

          • Frosty Mug O’ Beer

            So, one game in the past year lol

          • Anaron

            Try again. They released drivers for Dragon Age: Inquisition, Call of Duty: Advanced Warfare, Far Cry 4, and more.

          • Frosty Mug O’ Beer

            At release??? Nope, try again…

  • SteXmaN

    AMD lol

  • Dark Moyan

    Nvidia plays dirty

  • Jay

    Strange how NVIDIA only needs 2GB of VRAM but AMD needs 3GB. What's up with that?

    • DamZe

      NGreeda: the way it's meant to be gimped for the competition…

      • Jay

        Isn’t it the other way around? It seems like AMD has gimped hardware that looks good on paper but uses lesser-quality parts in order to undercut Nvidia.

        Maybe, I don’t know – I still need an explanation for why it takes an AMD card 3GB to do something an Nvidia card only needs 2GB for.

        • JAGUARCD32X

          I think it has to do with batched draw calls. See, AMD cards can't handle lots of CPU draw calls telling the GPU what to do, so the AMD drivers make the CPU combine them to reduce the count (toy example below). Then you have things like memory controllers, how efficient they are, and basically how quickly you can flush your memory to get new data into it.
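
          Roughly this idea in toy form (my own made-up code, nothing like real driver internals):

          ```python
          # Each draw call has fixed CPU-side overhead, so merging draws that share
          # the same render state (shader/texture combo) cuts the submission cost.
          from collections import defaultdict

          def batch(draws):
              """Merge (state, vertex_count) draws that share the same state."""
              merged = defaultdict(int)
              for state, vertex_count in draws:
                  merged[state] += vertex_count  # append geometry to that state's batch
              return list(merged.items())

          # 999 tiny draws over only 3 distinct states, common in real scenes:
          scene = [("rock", 128), ("tree", 128), ("grass", 128)] * 333
          print(len(scene), "->", len(batch(scene)))  # 999 -> 3 submissions
          ```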

        • smizzoker123

          memory speed maybe

          • Jay

            Or memory BUS width, perhaps

    • The 7950 comes with 3GB of VRAM; the GTX 660 comes with 2GB.

  • Michael

    I'm sure I'll get flamed for saying this, and it's just my opinion, but AMD needs to up their game! I don't support NVIDIA intentionally hurting performance on competitors' cards, or even on their own older-generation GPUs, if that is indeed what they're doing, but AMD needs to stop playing the victim and start competing.

    I know a lot of people don't like GameWorks, and even I will admit that I question its implementation at times, but without it PC games would have precious few defining differences from their console counterparts. Why would NVIDIA invest time and resources into making these technologies only to hand over the source code to AMD so that they can piggyback off their work? They wouldn't, and they shouldn't. NVIDIA has even said that GameWorks was designed to encourage developers to include newer graphics technologies to improve gaming experiences for NVIDIA customers. You may see this as shady, but then again, that's just how business competition works.

    In my mind Nvidia has done a much better job at pioneering the industry over the last few years. They’ve consistently released better hardware and innovated newer technologies while AMD has been scrambling to catch up. AMD likes to portray themselves as the white knights of the gaming industry with releases like Freesync, but what else can they do when they’re so far behind the technology curve? The R9 390X releases this week and it’s only competing with the GTX 980 which has been out since last September! Don’t get me wrong, I don’t want to see AMD fail because that’s bad for everyone, but unless they seriously step up their game I’m afraid that they eventually will.

  • OvidiuGOA

    Same game as before; I got tired of them fast. Plus you need to play with the sh*t tank in this one, because why else would it be there if not to screw the whole campaign up? Worst Batman game of the bunch.

  • And you expect me to believe those links?

  • Don't like the look of these Steam reviews coming in about performance problems on both AMD and NVIDIA. Sh*t storm incoming.

    • connos

      Buy a console then.

      • Silviu

        nice touch

      • That’s not even funny.

        • connos

          Why??? They used their optimal render path.

          • I'll keep my i5 4690K and GTX 970, thanks.

          • connos

            HAHAHAHAHA

          • Well, I don't like cinematic frame rates like the XB1 and PS4 have. :p

          • connos

            But what about the optimal render path?

          • Don’t you think you’re taking that a bit too far now?

          • connos

            With you and your imagination of things? Never. :P But I am glad we are having these conversations.

  • prudis

    Also, Denuvo not-"DRM" confirmed… Requested a refund; can't support that sh*t.
    And the whole internet went nuts because of the poor performance.

  • Aki Mikage

    Saw the B:AK reviews on Steam and they're really bad. They say it's locked at 30fps, stutters and has shi**y performance. It's a really bad port 🙁

  • Again, you're deluded, and I'll say it again: YOU'RE NOT RUNNING GTA 5 ON ULTRA. Just keep convincing yourself of that, all because you're so proud of your GTX 680 and can't accept it.

    • Silviu

      Yes I do!
      Don't be an idiot, Sean, like you are!
      Ofc, using iCafe drivers!!!
      Damn, these haters gonna hate!!!

  • Hvd

    Seems like NVIDIA is making games bad these days; look at AC Unity and now this with Batman. If I were an AMD user, I would stay away from games with NVIDIA involved until AMD releases a patch for them.

  • The nvidia gtx660 is barely enough, I tried it and you have to use console resolution but also low settings so it looks crap XD. 2x r9 290 maxxed out at 1080p though XD Pity the r9’s suck in some games I play that are nvidia optimised – microstuuttter and low fps. And in games that you can run off laptops like wth…