
NVIDIA GeForce GTX 1080 unable to run Monster Hunter World at 60fps on highest settings at 1440p, PC screenshots [UPDATE]

Monster Hunter World releases next month on PC, and it appears that some people have already gotten their hands on it. ResetEra’s ‘FluffyQuack’, who got access to an early build, has shared some initial performance impressions, along with the PC screenshots you can find below.

As FluffyQuack noted, an NVIDIA GeForce GTX 1080 is simply unable to offer a 60fps experience at 1440p on the Highest settings. On High settings this powerful GPU manages 60fps in the hub area, and on Medium settings it reaches 65fps. Ouch.

“I did a quick performance test in the starting hub (running the game at 2560×1440):
Highest preset: 44
Highest preset (except for volumetric set to “high”): 50
High preset: 60
Mid preset: 65
Low preset: 108″
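
For context, here is how those quoted figures scale relative to the Highest preset; a quick sketch using only the numbers above (remember, these are spot readings in the starting hub, not averages):

    # Relative scaling of the presets FluffyQuack quoted (starting hub, 2560x1440).
    presets = {
        "Highest": 44,
        "Highest (volumetrics on High)": 50,
        "High": 60,
        "Mid": 65,
        "Low": 108,
    }

    baseline = presets["Highest"]
    for name, fps in presets.items():
        # e.g. "Low: 108 fps (245% of Highest)"
        print(f"{name}: {fps} fps ({fps / baseline:.0%} of Highest)")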

FluffyQuack used an Intel Core i7-4790K with 16GB of RAM, so he was presumably not CPU-limited. Moreover, he noted that Capcom has added a very taxing option called Volume Rendering Quality, and that it’s the first setting gamers should disable in order to improve performance.

Do note that the aforementioned numbers appear to be minimum framerates: in a separate comparison the user provided between the Highest settings with and without Volume Rendering Quality, the game was running at 78fps. Still, it seems clear that PC gamers will need a GTX 1080 Ti in order to play at 1440p, despite the fact that the game’s visuals are similar to those of the console version.

Last but not least, here are the PC screenshots that FluffyQuack captured!

UPDATE:

FluffyQuack got in touch with us and clarified his results. The framerates he listed aren’t meant to be read as average framerates (and, as we already said, these are initial impressions that can be treated as lowest framerates).

“Those framerates I listed aren’t supposed to be seen as average framerates. I included them to highlight the scalability when changing between graphics settings. I quickly noted the framerate with the camera still in the starting hub, and the default position just so happens to be looking at one of the more complex scenes in the hub.

I feel a bit bad as people are taking it as a serious performance test, when my post isn’t supposed to represent that at all. I’ve edited my original forum post with the numbers, so if you could edit your quote of my post to reflect that, then that would be appreciated.”

221 thoughts on “NVIDIA GeForce GTX 1080 unable to run Monster Hunter World at 60fps on highest settings at 1440p, PC screenshots [UPDATE]”

  1. hmmm The visuals don’t look that impressive… Guess we’ll just have to see how performance turns out at launch.

    1. Especially considering that Denuvo gets cracked faster and faster every time someone puts it in their game. I think the last time I heard of a company putting Denuvo in their game, it was cracked within a day of launch. Honestly, at that point, they’re really just wasting money on it.

        1. That doesn’t mean nobody else can do it. Besides, as long as there’s a precedent for Denuvo being cracked in a single day, it proves that Denuvo simply doesn’t work as a serious anti-hacking method.

    2. ‘Thunk’ is not a legit word. Sure, it’s the internet but at least write intelligibly.

    1. “til it’s fixed” Lol This IS actually impressive. Even on a PS4 Pro it barely goes over 30 fps and dips into the low 20s very often at 1080p. Staying anywhere near 60 at 1440p is a sign of a good port, really

      1. Crazy how I don’t see frames that low on my Pro. You should check out Digital Foundry’s analysis of the console versions before you continue sharing misinformation. This game isn’t very well optimized, which you’ll learn if you watch the video.

  2. Monster Hunter World can be put on the back burner for a while. At this point, I’d like to see what the first batch of patches (if any) bring to the table before deciding to splurge on the game.

    1. Not all games support Hyper-Threading, and some games use more than 4 cores, so the CPU can be a limitation if the game scales past 4 cores. I pretty much expect this game to use more than 4 even if it’s an Nvidia-optimized title, so you can’t be sure the game isn’t CPU-limited until you see some frametime graphs alongside CPU usage.

      1. I may be wrong… but I’m thinking MT Framework supports like 4 cores?
        I couldn’t find the specific number.
        But yeah…
        If that processor is bottlenecking… the game has some serious programming issues.

        1. You may be right; I was searching for it as well, and I haven’t seen any update for MT Framework to explicitly support more than 4 cores.
          So if 4 cores is the limit, the i7-4790K would probably not be bottlenecking the GPU, unless Nvidia’s GigaThread engine is pushing one of the cores too hard, since Denuvo is there as well.
          We will see.

      2. “Uh what? You might wanna update your tech knowledge, because that has not been true at all for nearly a decade.”
        So you are implying that all games have supported Hyper-Threading for nearly a decade: false.

        “So this must be crazy news to you but games (not even just games but now most applications) now scale to support any and all cores and threads available.”
        False as well. Many applications do; most of them, not necessarily; few of them scale up to use all available cores and threads.

        “Hyperthreading is also supported in pretty much everything too, since DX11.2 which was released alongside Windows 8.1 in 2013.”
        Again, everything? The support can be present and still not be used.

        “Basically, any game from 2012 onwards supports more than 4 cores. All the modern iterations of engines, CryEngine 3, Lumberyard, Unreal Engine 4, etc all support both more than 4 cores and 4 threads as well.”
        Top AAA games and engines support it; that doesn’t mean developers use it, or use it efficiently.

        “And that also goes for MT Framework, which Monster Hunter: World was made on.
        MT literally stands for Multi-Thread. Lol. That’s the ironic part about your argument here as well.”
        That may be true, I’m not sure, but being multithreaded does not necessarily mean supporting or using Hyper-Threading.
        Multithreaded means multiple threads, so if you have 4 cores you can assign 4 threads. Hyper-Threading is a somewhat special kind of threading, based on Intel technology, that can make better use of one core by running two threads per core, when the developer knows how to use it. BTW, AMD has its own Hyper-Threading-like SMT, and it does not necessarily perform well if the game is coded around Intel’s Hyper-Threading.

        So I made it very clear in my first post that it could be the case, not that it definitely is; and while it most probably supports more than 4 cores, it does not necessarily support Hyper-Threading:
        “Not all games support hyper threading, some games support more than 4 cores as well, so it can be a limitation if the game uses more than 4 cores”

        So please…
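
        As a rough illustration of the core-versus-thread distinction argued above, here is a minimal sketch (assuming Python with the third-party psutil package) that reports how many physical cores and logical processors the OS exposes; on an i7-4790K it would report 4 physical cores and 8 logical processors:

            # Physical cores vs. logical processors. With Intel Hyper-Threading
            # or AMD SMT enabled, the OS exposes two logical processors per
            # physical core; a game that spawns one thread per physical core
            # never touches the extra logical ones.
            # Requires the third-party 'psutil' package (pip install psutil).
            import os
            import psutil

            logical = os.cpu_count()                    # logical processors
            physical = psutil.cpu_count(logical=False)  # physical cores

            print(f"Physical cores: {physical}")
            print(f"Logical processors: {logical}")
            if logical and physical and logical > physical:
                print("SMT/Hyper-Threading appears to be enabled.")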

    2. Having replaced my 4790K with an 8700 at 5.2GHz, it’s really more than 20% faster. Compared to my bro’s 1700 (SMT off) at 4.0GHz, at 1080p I am far ahead, especially at the minimums.

    1. Eh? That texture resolution…
      The lack of any discernible dynamic lights…
      The lighting in general…
      Low geometric detail…
      At least the draw distance looks pretty good, I guess.

      1. What the hell, please stop the BS and look at things objectively; the game’s screenshots look very clean. Consoles suffer from temporal reconstruction and look rather bad in motion, like 720p or worse.
        This looks far better than the consoles, with a far better viewing distance.

        1. I already posted a link, but it needs to be approved by the moderation. I’ll copy and paste the text:

          Monster Hunter: World is finally coming to PC next month, but will the game take advantage of meaty PC hardware to make some visual upgrades? We put the question to producer Ryozo Tsujimoto in a written Q&A.

          “When the game launches the visuals will have parity with the console versions,” he says, “but we’re considering releasing a free update after launch.”

          See? No bullshit; it’s what the producer is saying.

          1. LOL, the visual parity is about not giving the PC exclusive extra features like GameWorks; the key visual art style is maintained. Consoles are not maxing out the engine. PC will look far better and cleaner, with better shadows, longer viewing distances, less pop-in, higher resolution, etc…

  3. To make the assumption you need a Titan-class GPU is silly; the game’s not out yet and will probably have a day-1 performance patch.

  4. Clearly a 1080 Ti is much faster, but a 4790K isn’t new anymore. If adding a Coffee Lake CPU increases the frames with the same GPU, then the CPU is still a bottleneck.

      1. You can’t say that without testing with a new CPU. Go try to find a new game where the 4790k and 8700k get the same frame rate.

        1. Most games aren’t as CPU intensive anymore though. Regardless of how old your CPU is, if a game doesn’t put it at 100% usage, it’s not CPU limited.

          1. I am not saying the GPU isn’t slowing down the system. Clearly it is.

            The only thing I am saying is: if adding a Coffee Lake CPU increases the frames with the same GPU, then the CPU is still a bottleneck.

      2. Well, going from a 4790K to an 8700, I saw massive gains at 1080p, with 32GB of RAM and a 1080 Ti. To compare, in BF1 I went from a 120 average to 155, and minimums from 40 to 120. At 1440p, being more GPU-bound, I would expect a smaller improvement, but from my experience as a software engineer, the 8700 destroys the 4790K.

        1. Hmm… kinda pointless in a 1440p discussion, wouldn’t you say? Where gaming is concerned, the real-world difference between these CPUs is actually very small, unless you’re just interested in looking at numbers that mean absolutely nothing in real-life circumstances.

          1. Well, since I have a 1440p 144Hz Asus monitor, it actually matters a lot. Again, since I actually tested about 150 games and AutoCAD to compare CPUs, the real-world difference is actually quite big, especially the minimums, the actual subject of this article. Good day.

          2. You listed 1080p, not 1440p… the difference is pretty big… and as stated earlier, the game isn’t CPU-bound, it’s GPU-bound, which makes having an 8700K or a 4790K meaningless.

          3. OK. You are absolutely right. Even though it doubled my minimum fps in BF1 and I saw a 25fps average increase at 1440p, you are right. I would like to see a test for people who actually know the PC platform, like me, a software engineer, to actually measure knowledge, in order to keep people who have no clue from stating false information. But a 20% performance drop from 1080p to 1440p makes a game ENTIRELY GPU-bound, right??

          4. If you think your CPU doubled the performance in a game, I’d say you’re clueless… there are about 500 reviews and videos showing gaming performance increasing very little across the last 3-4 generations of Intel CPUs… and please read up on what it means when a game is NOT CPU-bound… the CPU is NOT a factor in THIS game… I don’t know why the hell you’re bringing up other games, which have absolutely no bearing on this review… you’re repeatedly talking about how good CPU performance is in games you play, yet what does that mean for this particular game or setup??

    1. A 1080 Ti isn’t much faster. I say this as an owner of a 1080 Ti. The best benchmarks put it at 30% faster, but on average it’s 15-25% faster.

      1. On average it was 26% faster at 4K at time of release, and 22% at 1440p. I went from an overclocked 1080 to an overclocked 1080 Ti. Depending on the game, that can be the difference between above 60fps and below it.

    1. Even on a PS4 Pro it barely goes over 30 fps and dips into the low 20s very often at 1080p. Staying anywhere near 60 at 1440p is a sign of a good port

      1. I wouldn’t say so, honestly. The possibility that my 1080 Ti might not hit a consistent 60fps at 1440p is far from making this an acceptable port. I have the powerful hardware in my rig for a reason.

      2. You keep saying this, and you keep being wrong.
        There is a reason EVERYONE here is disagreeing with you: it’s because you’re wrong.

      3. Just because it ran like dog s**t on consoles doesn’t mean it’s optimized or a good port for PC. Unoptimized garbage (on consoles and probably on PC) is what it’s called.

  5. Brace yourselves guys, pathetic port incoming.
    Seriously though, for a game which will be released with the same graphics as the consoles, a single GTX 1080 not achieving 60fps?? Damn.

    1. Wow, everyone here is so ignorant. Even on a PS4 Pro it barely goes over 30 fps and dips into the low 20s very often at 1080p. Staying anywhere near 60 at 1440p is a sign of a good port, really

      1. A lot of games that dip to 20fps on consoles run at 60fps even on a GTX 1060, so what’s your point?
        A sign of a good port? Lmao, do you seriously think a GTX 1080 not achieving 60fps at 1440p is a sign of a good port? I would’ve understood if it were 4K, but at 1440p? Nah, I don’t think so.

          1. I do believe if you ask around, 95% of people who play on PC will agree.
            When I say budget PC, I don’t mean a potato; I mean a decent $600-800 machine.

      2. Stop copy/pasting the same comment under every post. The ignorant one is the person who dismisses the game’s problems and makes excuses.

    2. “same graphics as that of consoles”

      I just went to double-check. Capcom is claiming this, but gameplay video comparing the PS4 Pro and PC shows that the game does look better on PC. Are they planning to downgrade it or not?

  6. Keep in mind the console versions barely manage 30 with a lot of dips into the 20s, even on a PS4 Pro, at 1080p. Also, I have that CPU and it DOES hit full usage in a lot of games lately, like Far Cry 5, so it might actually be CPU-limited if it’s programmed in a similar way to Far Cry 5.

  7. Honestly, with an inevitable day-one patch and updated game drivers coming, I can see an easy 20% increase in performance over what we’re seeing here… And if that’s the case, it’s really not that bad. 1440p on a GTX 1080 maxed out getting around 75-80 fps sounds just about right for a brand new AAA game in 2018, especially an open-world one… I’m guessing I’ll be able to keep around 100 fps maxed out on my 7820X at 5GHz on all 8 cores and my Titan Xp under water with a heavy memory and core OC, and finally my DDR4 that runs at 3600MHz with CL14 timings… People don’t realize just how much RAM speed/bandwidth matters in a ton of games… It’s actually pretty wild seeing the performance difference between, say, 2133MHz DDR4 and even 3000MHz…

    1. There’s nothing wrong here at all. People are somehow expecting a 1080 to run the game over twice as well as a PS4 Pro. That’s an unrealistic and silly expectation

      1. Well, the PS4 Pro has a 4.2 TFLOP GPU whereas the GTX 1080 has a 9 TFLOP GPU… so running at more than double should be an expectation. Especially when paired with a vastly superior processor to push it.

        1. Which it is. It’s running at almost double the frame rate, with a higher resolution, higher draw distance, higher AA, and all a month before release. Before the custom drivers or day-one patch are released.

        2. You can’t measure performance in straight teraflops; it’s not a good argument. Things like core count and shaders matter. In straight-up benchmarks, a 1080 has 3.35x the power of a PS4 Pro system.

          1. Considering we can’t compare benchmarks between a console and a PC, that’s about all we have.
            Do you have some other measurable metric that’s useful here?

          2. No, but why not just wait for the final-release comparison videos? How often do on-paper predictions depict real-world performance?

            Also, I just looked at the Digital Foundry video, and now I really wonder what Capcom and the people paraphrasing them mean when they say there is parity between the PC and console versions.

            Seeing as the Pro and X have 3 options to choose from and all run like… well, garbage, whether in performance mode, resolution mode (checkerboard or something else) or graphics mode, it doesn’t seem like the game is optimized for anything right now, which is a shame.

            I prefer 60fps over better graphics, but the consoles can’t even keep a consistent 30. The more I look at analysis videos of Monster Hunter, the more I start to understand why this is even a topic.

      2. “People are somehow expecting a 1080 to run the game over twice as well as a PS4 Pro”
        Alright, now it’s clear you’re just trolling and don’t really game on PC, otherwise you wouldn’t post such a stupid comment.
        In case you do game on PC and really believe what you just wrote, then I really feel sorry for you.

      3. When a GPU that costs more than a console can barely run a game better than said console, you know there is a huge problem there.

        I know you dislike admitting this, but paying more and getting less is objectively bad, no matter how many times you try spinning it.

        Also, I haven’t seen you around here before; I wonder what brought you to this site when this article showed up.

  8. Quite the opposite, if you had actually seen how badly the console version runs. Even on a PS4 Pro it barely goes over 30 fps and dips into the low 20s very often at 1080p. Staying anywhere near 60 at 1440p is a sign of a good port

    1. Just because the console version is worse trash doesn’t make slightly less stinky trash not trash.
      If a 1060/580 can’t do 1080/60 on high settings, it’s poorly optimized in my opinion.

      1. How do you figure? If a game is using demanding rendering techniques and is really pushing the envelope, then performance is going to go down. Your “opinion” is obviously sorely lacking in any kind of legitimacy… You have a huge open world with very high image quality… I’m sure the game can be run at 1080p@60fps no problem on a GTX 1080… The GTX 1080 can’t run every AAA game at 1440p@60fps anyway. I would expect that neither a Titan Xp nor a GTX 1080 Ti can run the game at 4K@60fps either… It DOES NOT mean it is “poorly optimized”; it means that it is a beautiful game that is very demanding on GPU resources… Besides, I highly doubt that the person who ran these tests had a driver properly optimized for the game anyway… Let’s wait to judge when the game is ready to launch (meaning, very likely, a day-one patch and proper drivers will be available…).

        1. “A game is unoptimized when a game developer insists on using unnecessary extra quality effects”

          Which is why you can turn them off. These are effects gamers will be using in three years’ time, when the next generation of hardware is more common. It’s future-proofing.

        2. Which games are you referring to? I run everything just fine with a 1080 @ 1440p 60+FPS. Unless the game is a poorly optimized POS.

          1. Right so a newer game can’t run worse on older hardware……..???? You know that card is pushing three years old right?!

            People need to get back to their consoles before hurting themselves on things they don’t understand and keep talking about.

          2. But this particular game should not have any issues running on a 1080. This game is poorly optimized no doubt.

          3. And to say this is just stupid, guy, because these are the lowest framerates and not the average…. THINK ABOUT IT…….

        3. Sometimes “demanding” rendering techniques are largely demanding because they haven’t been modernized to take advantage of parallel processing or thread scheduling. So it appears demanding, but only because it’s not utilizing the hardware effectively.

          I’d like to see GPU utilization rather than just FPS numbers.

          1. Yeah, the comment from FluffyQuack (he seems to be the guy who posted the screenshots) addresses it: those are the lowest framerates, not the average, so people are complaining for no reason at all.

      2. Seems like you don’t remember the days when games actually pushed the envelope and you needed a proper GPU to run on high settings hehe

      3. Riiiiiight, because there were/are never any engines in any games released, ever, that are hard on today’s hardware because of hardware constraints.

        Yeah okay. Crysis didn’t take years to be run with one card at a high/proper resolution…..

        You people are programmers here…..? Understanding the code behind the game in its entirety?!

        How does the game fare on AMD hardware….?? Because then there would be room to talk about it being un-optimized, if there were serious gains in performance over in team red’s corner.

        Wow………. Ungrateful, everyone seems to be, these days.

        1. Sure, there are engines that push hardware, such as CryEngine. However, those engines also have modern features and effects. The engine we’re seeing here is basically the same one we saw on PS3. It only runs slower because it’s not designed to properly use today’s hardware.
          I actually DO program, although I don’t enjoy it. I instead repair and maintain computer hardware for a living.

          Testing it on a different architecture only tells you how well it performs on that particular architecture. But we can wait for the DF coverage to see how it really runs.

          I did see a later stream via IGN where it was running fine on a 980, so most of this conversation is irrelevant at this point anyway…

          Side note: gratitude isn’t warranted when you’re purchasing a product. Expectations and value are instead capital. If they were giving us the game? Sure, that’s gratitude. They’re not doing us a favor here; they’re trying to make money. Let’s make them earn it, shall we?

          I don’t expect gratitude when I do my job. I expect a paycheck.

          1. So then, what’s this argument about the engine not being optimized…? Does it run beautifully on older hardware?? Driver optimization is a real thing, you know.

    2. I’ll never understand why they make games too demanding for consoles. There is no simple upgrade path for them. At least with PC games, if a game releases that is just an insane push forward, like Crysis was, you know future hardware isn’t too far away and you can just enjoy the game on lower settings in the meantime.

      I get that they don’t want the console version to look too inferior, but performance stability is much more important than graphics.

      1. Ah I can answer that one.

        The reason why many developers / publishers (don’t really know who has the final say in this) push for better graphics is because you can sell that. Advertising that a game runs at 60 fps isn’t as eye catching as just having eye catching visuals, even if it comes at the expense of performance.

        I do agree though. It’s bullshit that this is how it is done. I’ve only played PC for 4 years or so and going to 30fps (I do play the PS exclusives) is just painful. I’d rather have lesser graphics for better frame rate.

        And then console fanboys call PC players graphics wh*res. Hm yea ok.

    3. Lmao, no, it is. A 1080 is more expensive than a PS4 Pro or Xbox One X by itself.

      It’s also massively more powerful than the One X or Pro. By GPU specs alone, it’s 335% stronger than a Pro, performance-wise. 1440p has around 80% more pixels than 1080p and usually carries around a 50% performance hit.

      It also has 300% more performance than the Xbox One X GPU. So, ignoring things like faster RAM and a WAY faster and stronger CPU, the performance is crap.
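
      For what it’s worth, the pixel-count part of that claim roughly checks out; a quick sanity check (plain arithmetic, nothing more):

          # 1440p vs. 1080p pixel counts.
          pixels_1080p = 1920 * 1080   # 2,073,600
          pixels_1440p = 2560 * 1440   # 3,686,400

          increase = pixels_1440p / pixels_1080p - 1
          # Prints "~78% more pixels", close to the 80% cited above.
          print(f"1440p has {increase:.0%} more pixels than 1080p")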

      1. What? The X1X has GPU performance that puts it between a GTX 1070 and a GTX 1060 (you can’t simply use “specs” to determine actual real-world performance when comparing a console to a PC, btw… the integrated nature of the X1X APU clearly gives it advantages over a similarly specced PC). Besides, even if you put it squarely in line with a 1060, that would also be fair… My point? A GTX 1080 is NOT 300% more powerful than a GTX 1060. That is a ridiculous statement; it is anywhere from 23% to 63% faster… A quick search for benchmarks between the two proves this. Where do you get your numbers from, btw…

      2. The Xbox One X GPU is much faster than a GTX 1060, and looking at some games like Wolfenstein II, sometimes even a GTX 1070 can’t match Xbox One X performance. I have already sold all my gaming platforms, because games can be a total waste of time (and I have problems of my own to fix), but I had an Xbox One X before, and I know its performance was very close to a GTX 1070 in GPU-limited scenarios. So the GTX 1080 is nowhere near 300% faster, and even my 1080 Ti wasn’t. In multiplatform games I could only double my framerate on the 1080 Ti compared to the Xbox One X, and that was it.

        1. I remember watching GTX 1060 2GHz OC gameplay, and even with settings at low and dynamic 4K, performance was worse than the Xbox One X, and the game is nearly maxed out on the Xbox One X. On the Xbox One X it was also possible to run Wolfenstein at native 4K with an uncapped framerate on top of that, and performance was very impressive even compared to a GTX 1070. A card like the GTX 1060 is simply not enough to match Xbox One X settings in GPU-bound scenarios, and the only time the GTX 1060 will be faster is in CPU-limited scenarios, because the Xbox One X CPU is very weak. Listen, the GTX 1060 is already comparable to an RX 580, and the Xbox One X has an RX 580 on steroids, because it HAS A MUCH MORE IMPROVED ARCHITECTURE (some Vega features like delta color compression, DX12 built into the GPU, and a much more efficient shader compiler, just to name a few). Go play Far Cry 5 on PC at the same settings as the Xbox One X on a GTX 1060 or RX 580, and you will see you can’t do that, and you know why? It’s because these GPUs are just too weak and there’s nothing you can do about it. The Xbox One X has a very weak CPU, but its GPU performance is very impressive for a console, and the GTX 1080 is nowhere near 300% faster compared to the Xbox.

          1. Yes, the Xbox One X can’t handle more fps in Destiny, but not because of GPU power; it’s a CPU bottleneck. That’s why PC is always a better option for multiplatform games, especially if you have a good CPU. I can show you Rise of the Tomb Raider gameplay on a GTX 1070, and you will see exactly the same performance as the Xbox One X, meaning 30 fps at 4K + high settings, with dips below 30fps on both machines. A GTX 1060 barely runs that game at 20fps, never mind 30fps, so obviously a GTX 1060 can’t match Xbox One X settings in GPU-bound scenarios. On the 1080 Ti I could indeed surpass Xbox One X settings in games, but the 1080 Ti is 2x as fast. Right now I don’t have an Xbox One X (and I also sold my PS4 Pro), but if someone wants to run console games on PC with the same or higher settings, he needs a GTX 1070 at least.

          2. The CPU is indeed the bottleneck… The CPU that the Xbox One and PS4 of every variety use is about as powerful as the processor in your tablet or smartphone. That has been known for quite some time. You are misinformed

          3. In some games, like Wolfenstein II, you can. But that doesn’t matter anyway, because the Xbox One X version also features a native 4K mode with an uncapped framerate, so anyone can see how fast the Xbox One X GPU is compared to PC GPUs. In Rise of the Tomb Raider the Xbox One X also is not using dynamic resolution, although there are two modes: native 4K with high details, and 4K checkerboard with maxed-out settings. Anyway, both the Xbox One X and the GTX 1060 are cheap as hell, and these are not true 4K machines. My 1080 Ti was.

        2. “Also, I like how both you and Kristofer both completely ignore the fact he said it was 335% faster than the PS4 Pro, and not specifially not the XBOX One X”

          In the last paragraph he wrote:

          “It has 300% more performance than the Xbox One X GPU as well”

          So can’t you read? Clearly he was also talking about the Xbox One X, not just the PS4 Pro.

          BTW, as I can see, you have already edited your post :).

          1. I quoted your post before you deleted it. I clearly saw your nick and avatar, and these were your words. If you hadn’t written it, I would not have quoted it.

    4. It’s 40-50 fps on PS4 Pro. Staying near 60 at 1440p is not a sign of a good port on a GTX 1080, lol. The Witcher 3 stays above 60 at 1440p on my GTX 1070 (with a few settings turned down to High and HairWorks disabled).

  9. Even on a PS4 Pro it barely goes over 30 fps and dips into the low 20s very often at 1080p. Staying anywhere near 60 at 1440p is a sign of a good port, honestly.

    But really, that’s a weird resolution to test at. The Steam survey has only 3.45% of users on 1440p, so this is unrealistic for most people. Show how it runs at 1080p for something actually relevant

    1. Dude, you’re pathetic. Clearly you work for this company; otherwise your responses are just pathetic. Please stop

  10. “visuals will have parity with the console versions” AND it will need much stronger hardware to run at the highest settings, for something that hardly looks impressive nowadays. Still, people will make up excuses for their favorite corporation while it just laughs.

    1. Like others here, you’re ignoring the fact that the game barely hits 30fps at 1080p on PS4 Pro. So while looking the same, getting almost twice the performance at a better resolution is actually not a bad sign

      1. So because it runs horribly on consoles, we should be happy it runs a little “better” on PC while looking practically the same?

      2. No, your standards are just horribly, horribly low.
        Good should not be relative to consoles. It should be relative to other games on the same platform.
        In this case, that puts it at really, really bad.

    2. Yeah I was really excited for this game but seeing how terribly optimized it is I think I’m gonna have to pass unfortunately.

  11. “FluffyQuack”…..what a name.

    If a 1080 can’t handle 1440p at highest settings then the game needs optimizing.

      1. You’re underestimating the 1080. My 1070 runs any and every AAA game at over 60 on the highest settings.

          1. pssh. My RX 580 runs most games at 1440p above 60 fps.
            People overestimate their needs, and grossly underestimate mid tier cards.

          2. I totally agree. My 390X still trucks along at 1080p and even 1440p above 60fps as well. Hell, some games I can play in 4K just fine, and this on a 2012 architecture.

          3. Some people are happy with “just fine.” For others that isn’t good enough.

            Personally speaking, just good enough isn’t good enough and I’ve spec’ed my system out accordingly.

          4. Yes, at 1440p. Wtf, obviously I was referring to 1440p. It’s more common that games run over 60 than not. I’ve only seen a SELECT FEW games which can’t do it, like Deus Ex: Mankind Divided. Idk, maybe one or two others I can’t remember right now, but the MAJORITY of AAA games easily go over 60fps at 1440p highest settings, as long as your CPU isn’t the bottleneck

          5. He’s talking about 60FPS, dude. Yeah. I have a 1080 Ti; I run 1440p 120Hz with High settings on most games and still see like 40-50% GPU usage in them. So reduce that to 60FPS, max out the GPU usage, and you get about 1070 level. Totally believable. Benchmarks support it.

      2. I was watching streams of this last night (a 3-hour demo released for the special people), and the textures were blurry, and all the graphical effects and everything else us PCers look for were the same as on consoles. If everything is going to be on par with consoles, then at the very least, and I mean the most minimal effort, the game should be optimized enough. A 1080 should be able to eat up 1440p at 60.

  12. Just one question: are there any game-ready drivers for this game? If not, we might be OK; I’ve tried several games with old drivers to see the differences, and most of the time framerates (when there are problems like on this occasion) improve by 40-50% with proper drivers.

      1. Yeah. I have a 1080 Ti, and 1440p60FPS was hard at times due to random stutters that still exist.

        Also, it is badly optimized for disk space. You can use compression software on it and reduce it from 75GB to 21GB in size, at no cost to performance or textures, and with an actual decrease in load times.

    1. Listen. This performance is crap. I’ll agree with that. But it looks way better than Nioh; just from a technical standpoint, Nioh looks PS3-level.

        1. Which is usually a great engine. But so was the Source engine, once upon a time. Now it’s overtaxed, with new games still using it. Sucks.

  13. That’s largely an unfair and inaccurate assessment given that neither AMD nor Nvidia have put out optimized drivers.
    Just look at how terribly the BF 5 alpha runs. Yeah, just blatant clickbait.

  14. This is Mafia 3 all over again… old engine, but they piled tons of next-gen visual effects on top and it performs badly… smh

    1. Doesn’t change the title. There should be no case in a game like this where a 1080 can’t handle 60FPS @ 1440p. It’s graphically pleasing, but an empty world outside of boss fights.

      1. I think the article mentioned the issue was in hubs, which are a little busy. But yeah I still agree with you.

          1. Hubs. Hmm. The CPU is always the issue in hubs. Regardless, that’s because most MMO titles don’t use them properly. I have an 8700K, and most titles max out one or two cores in hubs, causing the drops. MT Framework is supposed to be pretty well threaded…

            But I have no faith. Hell, Capcom, 2 years ago, released the DMC4 Special Edition, a re-release of the 2008 DMC4 on PC, which uses MT Framework. The original was a beautiful, well-optimized title for its time. The re-release? Which, if you disable the new motion blur and depth of field, looks identical besides some new god rays… the performance is incomparable.

            An 8800 GT and a Q6600 could max out the old one @ 1080p-60FPS… well and over.

            The new one, even with its settings set exactly the same… runs far, far worse. CPU usage is increased, massively. On my 5GHz 8700K I see 40-60% CPU usage at times. For no reason. The old one shows sub-10%. And they look the same.
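
            That “max one or two cores out” pattern is worth underlining: a single pegged thread can bottleneck a game even when total CPU usage looks low. A minimal sketch (again assuming Python with the third-party psutil package) that shows per-core load instead of the misleading overall average:

                # Per-core CPU load vs. the overall average. A game pegging one
                # core at 100% can be CPU-limited while total usage reads ~15-20%.
                # Requires the third-party 'psutil' package.
                import psutil

                per_core = psutil.cpu_percent(interval=1.0, percpu=True)
                overall = sum(per_core) / len(per_core)

                for i, load in enumerate(per_core):
                    print(f"Core {i}: {load:.0f}%")
                print(f"Overall average: {overall:.0f}%")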

      2. Yep. I can imagine the game running worse in the actual maps, such as the smoke-filled Rotten Vale. My PS4 lags like hell when fighting Vaal Hazak, for example. I can’t imagine this port faring any better, given these hub results.

        Either way we’ll see if an Nvidia patch on release can help mitigate this or fix it entirely.

  15. Did you see the article about how this has already sold 1M+ copies in China on PC? Where they tend to have weak PCs? What a mess that will be.

  16. I’m not that surprised by the poor performance, given how it runs on consoles.
    In order to run the game at its highest graphics settings, the consoles lower the resolution to 1080p, with the framerate on the PS4 Pro in the 30s and the XB1X in the 40s.

  17. That’s a shame; Capcom games usually aren’t visually impressive, but they tend to run pretty well on any hardware.

    If there’s one thing you could count on, it’s that you’d have few issues running a Capcom game; DMC, Resident Evil, and Street Fighter all run on any hardware, and they don’t look so amazing.

    Maybe the open-world factor is the problem, as the recent Dead Rising games did have some issues, if I recall correctly.

  18. Thanks for saving me $60. This game sounded awesome, until now. I will not waste my money on a poorly developed game. For the money we pay for games these days, if a game isn’t polished, you’re going to regret buying it. I’m less of a gamer now than ever, because games are more poorly developed than ever while asking for what I consider premium money. Developers are killing off their own industry. If it’s not this, it’s lootboxes or DLC. Gone are the days of paying for a game and getting what you paid for.

  19. It seems odd to me that there are no metrics in these screenshots (making them, if real, almost useless).
    I want to see GPU % usage, and also the clocks. The GPU might not be ramping up.

    It also seems strange to me that there is a “login bonus” (it’s in one of the screenshots); those only happen when talking to online servers. That said, the servers could be up..

    Also, there is a difference between unoptimized and demanding. Just because something isn’t running well doesn’t mean it was poorly done. Like running SMAA x4 at 4K; that is going to be hella demanding.
    But, given that the game isn’t going to wow anyone on the graphics front, something is off. Either the settings are too demanding, the code is poorly done, or something is odd in how it talks to the GPU. (Or this is fake.)

    Also note the Xbox One X version drops deep into the 20s… (same with the PS4 Pro)
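
    For anyone who wants to capture exactly those metrics, one option is to poll nvidia-smi (which ships with the NVIDIA driver) while the game runs; a minimal sketch in Python, assuming nvidia-smi is on the PATH:

        # Sample GPU utilization and clocks once per second via nvidia-smi.
        import subprocess
        import time

        QUERY = ["nvidia-smi",
                 "--query-gpu=utilization.gpu,clocks.sm,clocks.mem",
                 "--format=csv,noheader"]

        for _ in range(10):  # ten one-second samples
            out = subprocess.run(QUERY, capture_output=True, text=True)
            print(out.stdout.strip())  # e.g. "62 %, 1847 MHz, 5005 MHz"
            time.sleep(1)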

    1. Or the fact that it was originally developed for consoles means that AMD cards will run the game better.

  20. Hello, I’m that FluffyQuack person who posted the ResetEra posts this article got based on. I wanna quickly point out those framerate examples are not supposed to represent average framerates in any way. I noted those framerates by standing still in the starting hub by looking at one of the most complex scenes in the hub (it just happened to be the default spawn position after I restarted the game). The goal was to highlight the scalability of the graphics settings, not the performance to expect. It’s more the minimum FPS for that area rather than average.

    1. The main issue I have is that Capcom has stated that the PC version will have parity with the console versions. We can see this via the visuals themselves. The only things we’re even getting out of this are a non-blurry image and 60fps, yet getting those somehow demands even more power at 1080p-1440p.

      That just doesn’t seem like a legit port, even with that taxing setting (seriously, how is it “taxing” when it outputs so little for the power it asks for?) turned off.

      The guy who works at PC Gamer claimed during the stream that it was a good port, but he also claimed Assassin’s Creed Origins was a “demanding good port”, yet it looked practically the same as the console versions.

      I don’t see how these ports are considered legit or even demanding when they look so very close to current-gen systems. When I see games like Star Citizen, Crysis or Hunt: Showdown, I see the demand for more power from those games, but multiplatform games that are held strictly at parity with console hardware? Not by a long shot.

      The guy at PCG thinks anyone who thinks it’s not a good port is “crazy”, yet the game apparently demands much more power when you reach 1440p or turn up the settings at 1080p. My 1080 Ti should be eating this game for breakfast, but apparently it won’t, if we go by the current level of power demand we’ve seen via screens and streams.

      There have definitely been non-demanding games this gen, especially ones that some claim to be demanding but aren’t legitimately so. The way I see it, if the output matches what the power demands, then it’s legit. If the game is asking for a 1080 Ti to handle a version that looks 5-10% different from the consoles, then it’s simply not even remotely legit.

  21. Anyone want to take a guess that the DRM is probably partly responsible for this? One word: Denuvo!!!

  22. IGN is streaming this game right now on a 980 and it seems to be running just fine, normally over 60 fps.

  23. IGN is playing it live right now and it’s performing well, contrary to what this random forum user being sourced says.

  24. Superb! The game has everything: console parity, no mods, Denuvo, a delayed PC version and a s**ty port. How much is the price? $60? Nice.

    1. Plus we don’t get any of the updated monsters day one! Such a bargain! A steal! We should be paying $90 for this great deal they’ve given us.

  25. Game from a Japanese dev is poorly optimised on PC. Shocker!

    To be fair, it also reportedly runs abysmally (sub-30fps at reduced settings at 1080p, lol) on PS4 Pro, so it’s evidently a shambles all-round.

  26. I’m so shocked! Another Japanese developer screwed up a PC port job? I’m so done with their absolute dog sh-t games and ports. Resident Evil 7’s port was a fluke, it now seems.

    Seriously, Japanese developers do not seem to care about PC gaming at all, and there is zero reason anybody should care about their trash-tier video games any more.

    They can get fu-ked for all I care.

  27. Why are you amazed??

    An MSI GTX 1080 Gaming X 8GB can’t achieve a solid 1080p 60fps with the highest settings in:

    GTA V with MSAA 8x
    Deus Ex (2016) in DX12 with MSAA 4x
    Homefront: The Revolution with 2x supersampling

    ……….. >.<

    1. The ones you mentioned are far from well-optimized games; not sure about Homefront, though.
      Not that I have any interest in this game

      ……………>.<

  28. Um… the game will get a post-release patch, like all games do (maybe a “Day-1” patch), and Nvidia ALWAYS releases their game-ready drivers with every major release. This is dumb, and not news AT ALL.

  29. Clickbait. The title states a GTX 1080 won’t run it at 60fps, yet deeper in the article they come clean and state those were minimum framerates, lol. What is this, Fox tech?

  30. Well, there goes running this game at 7680×1440 @ 144Hz with SLI GTX 1080 Ti FTW3s… Let’s keep in mind the drivers for this game have yet to be released. I have a feeling this will not have SLI/Crossfire support for those of us with extreme monitor setups.

  31. 75C on Prime95 all night. Two of my friends are at 5.3 with setups similar to mine. I don’t test clock-for-clock, because I want the max out of my components, even though clock-for-clock it’s easily faster. I don’t know where you live, but 5.1 or 5.2 is quite easy, based on my experience, having built around 300 PCs.

  32. “THE LOWEST FRAMERATE……………”. NOT THE AVERAGE FRAMERATE!!!!!! That is what those numbers represent….

    You people need to pull the hater veil from over your eyes. Straight up…. It’s pretty entitled to expect damn near three-year-old hardware to keep up with newer games, and these are the “LOWEST NUMBERS RETURNED”, not the average.

    But I guess it doesn’t matter when the haters go right to the numbers they think they’re reading accurately for their assault to kick off.

    I guess no game should ever be released ahead of the hardware, because it will be called poorly optimized and not received well by the masses. HAHAHAHAHAHAHA DAMN GENIUSES OUT THERE TODAY.

  33. My understanding is this game is supposed to be graphically identical to the PS4 Pro version at the highest settings, minus the special taxing feature they added. A 1080 ought to smash this game easily. Something is wrong somewhere.

  34. Hmm, I think we’ll have to wait for the final result.

    Capcom is claiming graphics parity between the PS4 Pro and PC versions. If that’s the case then no, it’s not a good port. Just an okay one.

    But when looking at comparison videos between the two, the PC version looks better, so who knows what (optional) added details you get on PC that make the 1080 work its a*s off?

    I think when Digital Foundry makes their video comparing the two, we’ll have our answer, as they go much deeper into this than any other channel.

  35. The game runs at 60fps, 1080p, highest settings on this config: i7-8700K OC’d to 5GHz, GTX 1060 6GB, 32GB DDR4-2400 RAM. However, RAM usage is very high: 90% at all times.

  36. So the PS4 Pro’s resolution mode runs the game at a steady framerate, while a 1080, 4x more expensive, struggles at 1440p… RIP PC port, because it’s actually a freaking adorable and addictive game

  37. Several months later on PC:
    Bad PC port
    Denuvo
    Capcom

    Do I really need to say “don’t buy this cr*p”?

  38. https://youtu.be/0XqWMZVdJ_A
    A regular 980 having a minimum of 50 fps at maxed-out settings at 1080p is actually not bad at all, especially when we know the Xbox One X and PS4 Pro versions struggle with this game and even drop below 30fps.

    So after watching this video I’m positive it will be a good port, and the superior version by far.

  39. I wonder if this is a combination of poor coding and overzealous DRM causing the performance issues?
