The Cyberpunk 2077 E3 2018 demo was running at 1080p on a single NVIDIA GeForce GTX 1080 Ti

CD Projekt RED’s Community Lead, Marcin Momot, has confirmed that the E3 2018 behind-closed-doors demo of Cyberpunk 2077 was running at 1080p and not in 4K. Despite earlier reports suggesting a 4K presentation, Momot stated that the demo was shown at 1080p.

Although Momot did not comment on the framerate, reports suggest that it was locked at 30fps (with some dips here and there).

As we’ve already stated, CD Projekt RED used an Intel Core i7-8700K @ 3.70GHz with 2x16GB of 3000MHz RAM and an NVIDIA GeForce GTX 1080 Ti in order to showcase the game’s E3 2018 demo.

Now I’m pretty sure that most of you will be worried about this performance; however, we should remind you that the game is in a pre-alpha state. As such, the game will receive various performance optimizations as it moves past beta and gets closer to its final build.

The fact that the game was running at 30fps in 1080p on a single GTX 1080 Ti is perhaps one of the reasons CD Projekt RED decided to only show the demo behind closed doors.

CD Projekt RED will be present at Gamescom 2018, so let’s hope that it will show a gameplay video of Cyberpunk 2077 to the public!

50 thoughts on “The Cyberpunk 2077 E3 2018 demo was running at 1080p on a single NVIDIA GeForce GTX 1080 Ti”

  1. I’m not worried at all. 1080p @30 FPS with an 8700k and a 1080 Ti tells me they wanted that hardware because the demo material was badly optimized for now.

    CDPR is a decent publisher imo, especially when compared to the average publisher/developer, but they want to make money from this game through high sales, and there’s no way they will release a game that doesn’t even run well on a high-end GPU @1080p. I expect the game, when released, to run fine on a midrange GPU @1080p, and on a high-end GPU for the few that are running 4K.

    Looking forward to some gameplay video when it’s ready to show.

    1. “They want to make money from this game”…

      How exactly did you reach that conclusion??!

      1. Not if there’s nothing else to upgrade to….

        I’ve had a 1080 Ti for over a year and I’m still needing an upgrade (though not for any CDPR titles). I said goodbye to SLI with my last pair of 980 Ti’s because no one cares about supporting it anymore; even NVIDIA is trying to move away from it.

        I’m actually expecting this game when they’re all finished with it to run at 4k/60FPS on whatever the best card at the time is (There will most likely be a replacement for the 1080 Ti by the time the game releases). I find that perfectly acceptable for a game with visuals like what we’ve seen so far.

        1. And I will play it with my GTX 970 at max settings, 1440p, at 30+ fps, like I am doing with all games so far!

        1. Also, with my 970 I played The Witcher 3 at 1440p max settings with no issues. I played it the first time at 1080p max settings, and then in 2016, when I got a 1440p monitor, I replayed it maxed at 1440p with no issues at all.

      2. My GTX 970 runs everything at max settings, 1440p, at 30+ fps (some games even at 60fps), and there is no need to upgrade!

    2. My Ryzen 1700 and GTX 970 run everything maxed at 1440p at 30+ fps, so I don’t worry at all. It will also run at max settings at 1440p like all games do!

  2. I don’t see the point; it is obvious they use the most powerful PC they have to show off those demos.

  3. Well, it’s CDPR. It’s going to be fine. Heck, Hunt: Showdown started just a couple of months ago with a small dev team and the game was very rough, performance-wise. Now it’s WAY better. I can only imagine what a dev team like CDPR, with so much time on their hands, can achieve.

    I’m sure we will all have fun. Plus, didn’t they announce no DRM, no lootshyte, etc.?

    Gotta love’em

  4. Nothing new.
    They did a safe demo with stable fps so the game won’t F up.
    I would do the same.

  5. They would be crazy to release a game that can’t run high settings on mid-range PCs (1080p).
    This whole “pushing PCs to the limit” thing is nonsense. It doesn’t make any sense from a business perspective. PC gamers have this disturbed sense of pride when thinking back to the original Crysis, but that’s never gonna happen again, because publishers/devs these days aren’t stupid and want to make money.

    1. being stupid =/= not wanting to make money.

      They just got greedier, and less dedicated and passionate about their job as game developers; fortunately, it’s not like that for everyone.

      Doing another Crysis-style game would only give back to PC what it deserves: being the best platform to work on.

      1. “would only give back pc what it deserves”
        PC deserves having a game that only 5% of PC gamers can play at decent performance and visual quality?

        1. No, PC deserves the possibility to take advantage of its potential in full, something games don’t do anymore.

        2. 5%? What? A 9600 GT from 2008 could run Crysis at high settings no problem.

          Have you ever actually played a PC game?

          1. Does he even know that we have expensive and non-expensive things in this world?

            Like, I can’t afford a Lamborghini, but I’d never want them to stop making them, or to not allow anyone to buy them, simply because I cannot afford one; it’s just stupid to think such a thing, not to mention childish.

        3. So we should stop making expensive things like sports cars, homes and services now, only because the 1-10% can afford them?

          What are you smoking?

      2. He likes to do away with art and integrity over money, like any corporate person/slave wants. It’s not surprising that he’d rather see anything be stifled, as long as it makes them more money.

    2. It’s not even disturbing. People like seeing high-quality games running at their best, but this game is clearly not running at its best because it’s in a sodding pre-alpha stage.

      You need to stop this whole “PC gamers can never have high end games because I think it’s worthless” mentality.

      Stop thinking about just money like some greedy corp suckup. Art and integrity are just as important in life as money.

  6. >don’t back up or get stuck into visual console parity

    There will be parity, just not with the base models but rather with the Xbone X. So pretty nice graphics, but the game still has to run on 8 gigabytes of total system memory in terms of scripts, AI and physics. That is, if the game comes out in the next 3 years. I doubt it, though; I think they’ll move it to next-gen to escape the console limitation.

    1. Well this gen the only differences I’ve seen between PC and consoles has been shadow quality, shadow rendering distance and object/detail LoD’s. We’ve obviously had the resolution part down for years, but honestly we haven’t seen PS2 vs PS4 levels of difference between consoles and PC this gen. I don’t see Cyberpunk pulling that with next gen systems and PC either.

      Imagine how outraged the media would be to see next gen running the game at PS2-level graphics while PC runs it at Crysis levels. Sure, I’d love it, because that’s how I think it should be when you compare high-end hardware to consoles that always cheap out on hardware. With how the industry actually works, though, it won’t happen that way, so like you said, there will most definitely be a form of parity.

  7. Many who saw it seemed to think it was running at 4K though, which tells me the game must have high image quality, not abuse filters, and have some great AA implementation in place.

    1. I wonder what AA options they’ll put in the game. Hopefully not just TAA/FXAA/MSAA. We have more options than just those 3.

  8. That is not what this means. It means they didn’t run it at a higher resolution because it’s a pre-alpha build, and running it at a higher resolution could easily not be supported yet or could still be highly unstable.

  9. If this stays the same then most PC gamers will not be able to play the game with decent visuals and performance. Consoles?????

    1. It’s either going to be heavily optimized over the next few years (because it’s in pre-alpha, so we’re looking at yet another 3-4 years of dev time), or they’ll have to cut back a lot of what makes the game what it is for it to run on consoles and lower-end hardware.

      Making a game for both consoles and PC, but somehow crippling a 1080 Ti? That’s nowhere near legit.

      1. Also keep in mind that the game will still be designed with current-gen consoles in mind. I was hoping for next gen only, so the game could have less holding it back.

        1. Well, the thing is that by the time this game comes out, current gen will be around 8-9 years old, which would make it a poor baseline for trying to show us a “taxing” world such as Cyberpunk’s.

      1. If a 1080 Ti is having this much of an issue with a game that’s been 6 years in development, then something has to give.

        Either the game is legitimately demanding on every front, or it’s badly optimized, and 6 years of development going like that is a very bad sign.

    1. I can buy it where I want to buy it?

      I’ll support CDPR when they deliver the goods on PC. If it’s another experience toned back for consoles, both visually and performance-wise, then no, I won’t support them.

      Going by their nearly 6-7 years of development on this game and the news that a 1080 Ti is BTFO, it doesn’t even remotely give me confidence in their work.

      1. GOG is still the best option to get the game, since it will be completely free from any kind of DRM.

  10. Well, it’s apparently in “pre-alpha”, so there has presumably been little optimization involved. If current-gen systems have no chance of running the game, then it’s going to have to be scaled back severely for them to run it, and if next gen can run the game, then it’s still going to have to be optimized for GPUs like the 1080 Ti, as not everyone is going to wait for, nor flock to, an 1180 Ti.

    So really this is going to be down to how well they optimize their game and how much they will have to cut back.

    Again, people give RSI trouble for their optimization, and yet here we see a pre-alpha game being throttled badly with a 1080 Ti at 1080p.

  11. Throwing actual optimisation out the window isn’t something I celebrate, nor do I celebrate the fact that the game still has to adhere to console-level hardware. We’ve seen this gen how much AAA games “pushed” PC hardware: so much so that we got a whole 10% difference in shadows and LoDs.

    I don’t see next gen vs PC being like PS2 vs a PC from 2018; it hasn’t been like that since Crysis 1, and I don’t see it being any different with Cyberpunk.

    If Cyberpunk looks nearly identical on next-gen systems to a PC with an 1180 Ti, then I’ll call you on it.

    Asking for that much power requires actual output that measures up to what it’s asking for.

  12. What an idiotic point of view. Look how outdated The Witcher 2 is now, yet it’s still crazy demanding with ubersampling on. There are games that look orders of magnitude better while also running much better. This nonsense notion that hardware will catch up to a stupidly demanding game never pans out, as the game always ends up looking outdated before that happens. Hell, Crysis 3 looks way better than Crysis 1 and runs much better and more consistently to boot.

    1. I saw it. Overall, the first Crysis looks laughably outdated and still runs like crap even on today’s hardware, which is orders of magnitude faster than what was available back then. It already had outdated-looking indoor environments for the time, and outside there is a lot of blurry-looking crap, such as that large mountain at the beginning of the game, even on max settings. Those things looked bad then and are a complete joke now. Again, super-demanding games become outdated-looking and irrelevant before hardware catches up.
