Cyberpunk 2077’s E3 2018 demo was running on an Intel i7-8700K with an NVIDIA GeForce GTX 1080 Ti

On the official Discord server, CD Projekt RED’s community manager “Alicja” confirmed the PC specs of the machine that ran Cyberpunk 2077’s E3 2018 demo. According to Alicja, the demo was running on an Intel i7-8700K with 32GB of RAM and an NVIDIA GeForce GTX 1080 Ti.

Unfortunately, Alicja did not reveal the graphics settings used, the resolution (though we expect it was 4K), or the framerate (we suspect 30fps) at which the demo was running.

Nevertheless, we now have an idea of the PC specs that will be required to play the game. Our guess is that CD Projekt RED will be able to further optimize the game, though it remains to be seen whether the full city will be more demanding than the E3 2018 demo that was shown.

Below you can find the complete PC specs of the machine that ran the Cyberpunk 2077 E3 2018 demo.

  • CPU: Intel i7-8700K @ 3.70GHz
  • MB: ASUS ROG STRIX Z370-I GAMING
  • RAM: G.SKILL RIPJAWS V, 2X16GB, 3000MHz, CL15
  • GPU: NVIDIA GEFORCE GTX 1080 Ti
  • SSD: SAMSUNG 960 PRO 512GB M.2 PCIe
  • PSU: CORSAIR SF600 600W

130 thoughts on “Cyberpunk 2077’s E3 2018 demo was running on an Intel i7-8700K with an NVIDIA GeForce GTX 1080 Ti”

      1. This game will be released in late 2020 with the next generation of consoles.

        So current consoles like the XOX (6 TFLOPS, 12 GB GDDR5) will cost less than $300 and run this game at 1440p 30fps. The XOX is basically like a GTX 1070 – both have 6 TFLOPS of power. But there will also be “next gen” consoles in stores with a lot more power (12-15 TFLOPS minimum) which will easily run this game at full 2160p 30fps.

        1. The next Xbox will look pretty good when you buy it at release, but how will it look by the time its successor arrives? Sure, it’s easy to spend your $500 on the next-gen Xbox, but you are locking yourself into a cycle of ever-decreasing performance compared to PC GPUs, and you are at the mercy of MS’s whim as to when you will be allowed to upgrade that Xbox. Bear in mind MS has gone as long as 8 years in that cycle, with the Xbox 360 in 2005 and the Xbox One in 2013, and even when MS released the Xbox One it was not very good in performance compared to PC GPUs of that time.

        2. “will cost less than $300 and run this game at 1440p 30fps”

          HAHAHAHAHAHHAHAAH

          SURE AND MY FREAKING SMARTPHONE WILL RUN THE GAME BETTER THAN PCS

          AHAHHAHAHAHAHAAH

          NO WAIT, I’LL GET A CASIO WATCH AND RUN IT MAXED OUT

          BAHAHAHAHHAHAHAHAHHAHAHHAHA

          1. In two years I can see the 1X costing around $299-$349. It isn’t that far-fetched.

          1. Not yet, should I? I see he’s posting a lot of info about how consoles are so good, yet they achieve nothing close to what a mid-range PC can achieve.

          2. All he’s posting is a lot of BS with which he’s trying to build a point, and I blocked him like 2 years back, so…

      2. Why do morons make this comment? Even the devs said The Witcher 3 wouldn’t have been as big and AAA without consoles.

        1. If they’d kept the original, unoptimized version without the “downgrade”, lots of PC gamers would have been left out in the cold as well.

          1. Until they upgraded their PCs. If the finished game had looked as good as that E3 demo, lots of people would have purchased new GPUs to play it. The Witcher 3 would have been this generation’s Crysis, Wing Commander, Myst, Doom, 7th Guest or Quake 2, i.e. a system seller. Instead we got a watered-down version because it had to run on consoles.

        2. So The Witcher 2 wasn’t a AAA game? A day-one console release got them more money from WB, but it also got us a visually downgraded version of the game. As a direct follow-up to part 2, The Witcher 3 would have been great with or without consoles.

    1. – GeForce 1080 Ti (11 TFLOPS, 11 GB GDDR5): 2160p 30fps
      – Xbox One X (6 TFLOPS, 12 GB GDDR5): 1440p 30fps
      – PS4 Pro (4 TFLOPS, 8 GB GDDR5): 1080p 30fps
      – both PS4 (1.8 TFLOPS) and Xbox 2013 (1.4 TFLOPS): 720p 30fps

      Most new games will run at 720p or 900p on the consoles from 2013. This is the new reality of consoles without generations. Now it’s more like PC: new console hardware every 3-4 years with more power, with full backward and forward compatibility (like on PC).

      MS already confirmed during E3 that they are working on two new Xboxes:
      Xbox One X “Slim” (cheap version of the XOX made on a 7nm process – still 6 TFLOPS)
      Xbox One “Scarlet” (also 7nm but with more power – full price, probably $500)
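      The tiering above implicitly assumes that performance scales linearly with the quoted TFLOPS against the pixel count of each resolution. A rough, illustrative Python sketch of that assumption (the TFLOPS values are the published peak figures; the linear scaling itself is the commenter's simplification and ignores architecture, settings, bandwidth and CPU limits, which the replies below take issue with):

        # Rough sketch of the pixels-per-TFLOP assumption behind the tiers above.
        # The linear scaling is a simplification, not an established equivalence.
        RESOLUTIONS = {"2160p": 3840 * 2160, "1440p": 2560 * 1440,
                       "1080p": 1920 * 1080, "720p": 1280 * 720}

        def pixels_per_tflop(resolution: str, tflops: float, fps: int = 30) -> float:
            """Pixels rendered per second, per TFLOP, at the target framerate."""
            return RESOLUTIONS[resolution] * fps / (tflops * 1e12)

        for name, res, tflops in [("GTX 1080 Ti", "2160p", 11.3),
                                  ("Xbox One X", "1440p", 6.0),
                                  ("PS4 Pro", "1080p", 4.2),
                                  ("PS4", "720p", 1.84)]:
            print(f"{name}: {pixels_per_tflop(res, tflops):.2e} pixels/s per TFLOP")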

      1. Even if I do my best… I can’t come up with the dumb sh.t that I just read…

        “- Xbox One X (6 TFLOPS, 12 GB GDDR5): 1440p 30fps”
        Haven’t you told us like 5000 times that the One X is a 4K machine and not just a 1440p one?

        “New console hardware every 3-4 years”
        So the most powerful console already needs to be replaced?

        “Xbox One “Scarlet” (also 7nm but with more power – full price)”
        Wow… can’t wait to play Assassin’s Creed 12 and Far Cry 19 at 8K and 19 FPS.

        1. “So the most powerful console already needs to be replaced?”

          I read that the “most powerful GTX 1080 Ti” will be replaced this year with the GTX 2080… after only 2 years. You must be very sad about that…

          More seriously:

          MS said that when the cheap XOX Slim is released it will replace the Xbox One S as the main SKU ($300). The new Xbox “Scarlett” (next gen) will be sold as a premium device for full price. MS wants to create a continuous delivery of new hardware instead of an “old generation”.

          1. There is a new GPU series every year… like in the last 20 years. Where have you been?

            New hardware? Yeah… it will be underpowered the day it comes out, just like every new console. And like every console it will hold back PC. Don’t expect anything else; it’s a cheap console for peasants who play at 30 fps…

          2. And now there will be new console hardware every 3-4 years, with full backward and forward compatibility like on Windows. MS will always support “two generations”:

            Xbox 2013 (1.4 TFLOPS): 2013-2023
            Xbox X (6 TFLOPS): 2017-2027
            Xbox “Scarlet”: 2020-2030
            and so on…

            This is a very easy pattern. Instead of creating an expensive “new generation” (which is hard to make at $400)… MS decided to create incremental hardware updates. So they can sell one SKU as the base hardware for $300 and a second, more powerful one as a premium device for $500. No more hard generations. Xbox consoles will work more like Windows on PC… the Xbox One X already runs Windows 10.

          3. “Xbox 2013 (1.4 TFLOPS): 2013-2023
            Xbox X (6 TFLOPS): 2017-2027
            Xbox “Scarlet”: 2020-2030
            and so on…”

            Where the F do you get these things?
            And… do you realize those FLOPS mean nothing? The most powerful console throws around big numbers… and it plays games slower than a 2-core $50 CPU…

            “Xbox consoles will work more like Windows on PC…”
            HAHAHAHAHAHA… No

          4. Xbox 2013 (1.4 TFLOPS): 2013-2023
            Xbox X (6 TFLOPS): 2017-2027
            Xbox “Scarlet”: 2020-2030

            This is the new console life cycle. Phil Spencer (head of Xbox) said that he doesn’t believe in “console generations” and instead proposed continuous hardware delivery: new hardware every 3-4 years, with 10 years of support (during that period all games must run on all supported hardware to be published on Xbox).

            This is great for gamers (more hardware options) and better for game developers (long support for each piece of hardware, so they can sell games to more gamers). And of course it is good for MS (better ROI). Everyone wins. Phil has always wanted the console to be more like PC.

          5. “(during that period all games must run on all supported hardware to be published on Xbox)”
            And that’s why consoles hold back PC.

            “This is great for gamers.”
            Nope. Console peasants don’t give a crap how the game runs on consoles,
            and publishers don’t give a crap how it runs on PC. Look at Assassin’s Creed.

            “and better for game developers”
            Nope. They make a game on a PC, for a console, then they port it back to PC and it runs like sh.t.

            “long support for each piece of hardware, so they can sell games to more gamers”
            So milk players, like GTA 5 or Skyrim?

            “Phil has always wanted the console to be more like PC.”
            Phil is deluded. Consoles will never be like PC… and PC will never be like consoles. If you want your Xbox to be like a PC, then just buy a PC.

          6. I never really saw consoles holding back PC; I kinda think PC holds back PC. Not everyone out there has a 1080 Ti; most people have something like a 960 or a 560, which were pretty even. I mean, even waaaaay back in the PS1 era PC games didn’t look THAT much better; every now and again you’d get an Unreal, but they weren’t blowing the PS1/PS2 out of the water, just much higher-res, more solid-looking versions… like now. That said, you fellers do have points and I’m also sure you’ve got a solid argument sometimes. But not always; it’s been like this for a good long while. The Sega Saturn didn’t hold back a Pentium.

          7. It’s not only about the HW. Devs prioritize the console platform and then port the game to PC. And here on PC we can run whatever on whatever graphics settings. BUT! If the devs make a console game and then port it poorly to PC, then it’s the consoles’ fault. Like AC: Origins.
            Or look at GTA 5. It was designed for Xbox One and PS4. That’s why cars can’t go faster than 130 mph.
            PC exclusives look better than multi-platform games, and publishers make sure PC settings aren’t that much better than console ones.

          8. “It’s not only about the HW”

            Did you know that more than 50% of PC gamers use the 10-year-old Windows 7? That system is so old it doesn’t even support the dynamic resolution scaling introduced with DX11.2 in 2013… On Xbox nearly every game uses dynamic resolution scaling, for example to hold a constant 60fps. Every GPU since 2010 supports it… but on PC developers can’t use this feature because half of gamers are on Windows 7, which doesn’t support DX11.2 (the system is limited to DX11.0 with partial support of DX11.1).

            Every Xbox uses Windows 10; the 1804 update on my Xbox has full support of DX11.4 and DX12.1… The power of forced software updates. Developers can use the latest software features because they know that every Xbox is always up to date.

            I bet that in 2020 at least 30-35% of PC gamers will still be using Windows 7 with DX11.0 and no DX12 support 🙂

          9. Darling… I will use Win7 as long as I can, and I don’t care about DX12 or features…
            And how did we get from “Xbox One X 4K” to “Win 7 is bad because it doesn’t have Xbox One X features”?
            No one wants or needs dynamic resolution scaling… our systems aren’t underpowered, like consoles…
            Do you know why dynamic resolution scaling exists? HAH? I’ll tell you:
            it’s because the most powerful console can’t allow its FPS to drop… so it uses a smaller resolution so the FPS doesn’t drop under 15… Get it? It’s covering for your weak system… The One X can’t hold 4K res when a grenade explodes because the One X is garbage… Get it?
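            For readers following this exchange: dynamic resolution scaling is just a feedback loop that lowers the render resolution when recent frames run long and raises it again when there is headroom. A minimal sketch of that idea (the 30 fps target, step size and bounds are illustrative assumptions, not any engine's actual values):

              # Minimal dynamic-resolution controller: keep GPU frame time near a target
              # by nudging the render scale. All constants here are illustrative.
              TARGET_MS = 1000.0 / 30.0          # 30 fps target -> 33.3 ms per frame
              STEP = 0.05                        # adjust render scale in 5% increments
              MIN_SCALE, MAX_SCALE = 0.6, 1.0

              def update_render_scale(scale: float, last_frame_ms: float) -> float:
                  """Return the render scale to use for the next frame."""
                  if last_frame_ms > TARGET_MS * 1.05:    # running slow: shrink resolution
                      scale -= STEP
                  elif last_frame_ms < TARGET_MS * 0.90:  # headroom: grow it back
                      scale += STEP
                  return max(MIN_SCALE, min(MAX_SCALE, scale))

              # A heavy frame (say, an explosion) hits 40 ms, so the next frame drops
              # to 95% of 3840x2160 on each axis.
              scale = update_render_scale(1.0, 40.0)
              print(scale, int(3840 * scale), int(2160 * scale))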

          10. “I will use Win7 as long as I can”

            I’m not surprised… you also use an old GPU with 3.5 GB of memory 🙂

          11. Your math skills are amazing:

            “12 < 3” – Jinx Math 🙂

            Now I’m not surprised that you think the GTX 1070 is faster than the Xbox One X… If 3 is better than 12 then everything is possible 🙂

          12. 8GB of system memory + 4GB vram = 12GB. And unlike the X, the PC does have 3GB reserved for system use only.

          13. Yes, for 1080p it’s enough… 4K is just a marketing word for dumb people to buy new TVs… like 3D was.

          14. There was a bug in the Steam hardware survey; Windows 10 is actually at 55.5%. Go double-check it yourself.

            Still not a great source of info tbh though.

          15. Hmm, I was on PS1 in the ’90s, and even N64, and when I saw Unreal and Half-Life in 1998… yeah, I was blown away.

          16. Meanwhile a 980 Ti is surpassing the Xbox One X in every multiplatform game. Get rekt.

          17. Depends on what you mean by competition. AMD has GPUs that compete with entry-level through midrange Nvidia GPUs all the way up to the 1080. They are not as efficient as Pascal, though, and use more power. That doesn’t matter to a lot of people because electricity is cheap where they live. The only Nvidia GPUs that AMD doesn’t have an offering to compete with are the 1080 Ti and Titan Xp, but very few can afford those cards anyway.

          18. AMD is competitive on paper but not in the real world. Thanks to miners you can’t get an AMD GPU for anywhere close to MSRP. This has caused their GPUs to be way overpriced for their level of performance. Until you can get a Vega 56 for the same price as a 1070, there’s no competition.

        2. “So the most powerful console already needs to be replaced?”

          Well, in reality, it soon will be. Right now the Xbox One X is comparable in performance to a midrange PC GPU. Later this year, when Nvidia releases their next-generation GPUs, the Xbox One X will be comparable to an entry-level PC GPU. By the time the new Xbox comes in 2020 the performance of this one will be pitiful. This happens over and over with consoles. Console owners don’t get the choice to upgrade the GPU to keep up with demands.

      2. This isn’t for you Sp4ctr0 because I and others have pointed out to you often enough that you’re posting misinformation. This is for gamers that don’t have a lot of tech knowledge and may accept what you post as sensible.

        You cannot directly compare the RAM in the Xbox One X to the VRAM in a discrete card like the 1080 Ti because you are forgetting the System RAM in a PC and a lot of that is being used to run the game. The VRAM in the 1080 Ti is dedicated only to the graphics. Part of the 12GB in the Xbox One X is being used by the OS, the game engine and the data loaded in memory to run the game.

        A new silliness from you is comparing the 4K 30 FPS in this demo, which is clearly not very well optimized, to the 4K 30 FPS in games already optimized and released on the Xbox One X.

        1. “Xbox One X graphics performance is comparable to a 1060 6GB”

          The Xbox One X has a much faster GPU and 2x faster memory. It is a lot faster than the GTX 1060, which on PC is sold as a low-end GPU.

          https://uploads.disquscdn.com/images/5f2be7525bcf1ccbec3914f3abf29f9824dd2ae7b013bf31af558427e8a18119.jpg

          Nvidia Pascal family (for gamers):
          Low-end: GTX 1060
          Middle: GTX 1070
          High-end: GTX 1080TI
          Pro: Titan

          I know that you hate Xbox, but you can’t compare a low-end GPU to a console like the Xbox One X. This console is not only much faster as pure hardware, the software is also much faster (HSA memory at 326 GB/s is nearly 10x faster than the fastest available dual-channel DDR4 at ~34 GB/s).
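          The bandwidth figures in this back-and-forth come from the usual formula: peak bandwidth = bus width in bytes × data rate per pin. A quick sketch using the published bus widths and data rates (note that the fairer PC comparison is arguably the discrete card's own VRAM, not system DDR4):

            # Peak memory bandwidth = bus width (bytes) * data rate (Gbps per pin).
            def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
                return (bus_bits / 8) * gbps_per_pin  # GB/s

            print(bandwidth_gb_s(384, 6.8))    # Xbox One X: 384-bit GDDR5 @ 6.8 Gbps  -> ~326 GB/s
            print(bandwidth_gb_s(128, 2.133))  # dual-channel DDR4-2133 (2 x 64-bit)   -> ~34 GB/s
            print(bandwidth_gb_s(352, 11.0))   # GTX 1080 Ti: 352-bit GDDR5X @ 11 Gbps -> ~484 GB/s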

          1. Except the 1060 matches or beats the X in most games, and the 1070 beats it in everything. Like how many more games do I have to test to prove this to you? You are always using GameGPU’s benchmarks, all based on the PC version maxed way beyond the X’s settings, then declaring victory like there’s no difference.

          2. So you use an old-gen GTX 970 with only 3.5 GB of memory? Sorry, but you can’t use “max settings” on an old GPU with 3.5 GB of memory; you can’t use anything more than the lowest texture settings. It’s 2018 🙂

          3. The X probably doesn’t have more than that available for vram after it loads all the other game assets.

          4. You really are a special kind of stupid. 3.5 GB of VRAM is plenty for high textures, and even very high in many games.

            And these comparisons you keep making do not take into account that a PC running maxed out at 4K is NOT using the same settings the Xbox One X is running. Most games are not actually at 4K the whole time, if at all, and every game has settings reduced from the max settings on PC.

          5. Digital Foundry said the 1070 could do a locked 30fps at native 4K with Ultra/max settings. That’s better than the X, which as usual runs with lower settings than PC.

          6. LOL, it’s hopeless, mate!
            Is this Sp4ctr0 for real or just some troll? Can a person really be THAT stupid? LOL!

          7. All peasants are; that’s why they waste money on those underpowered x86 fake consoles 😀

          8. Sp4ctr0 you can’t get all of your info from that one disreputable site and spread it all over gaming sites as fact.

            For the record, I don’t hate the Xbox One X. I think it’s pretty good for people that don’t want to be bothered with tweaking a game’s settings or anything else for the best experience possible, and just want to pop a DVD in and press start.

            Pointing out the limitations of the Xbox X as compared to a PC isn’t hating and pointing out your misinformation isn’t hating either. It’s only intended to keep that misinformation from spreading and getting reposted by others all over gaming sites and eventually being accepted as fact by some because so many people are saying it.

          9. “just want to pop a DVD”

            DVD? Nobody has used DVDs for at least 10 years… Xbox has a UHD 4K Blu-ray drive with full hardware support for Dolby Atmos.

            DVD was limited to only 8 GB of data.
            UHD BD holds 100 GB of data (12.5x more than DVD).

          10. He relies on the fact that no one speaks Russian, but I do.
            Every single screenshot he posted uses the HIGHEST settings; the one from Far Cry says MAXIMUM settings and the one from State of Fail 2 says ULTRA settings.

          11. You are dumb; that Xbox spec is a COMPLETE spec (CPU+GPU) because the Xbox uses an APU.
            You compared a full system to just one GPU.

          12. It is hard to believe you are actually this stupid. You have been told over and over and over that TFLOPS are not directly comparable, as Nvidia is much faster than AMD at the same TFLOPS.

            And Digital Foundry and others have concluded that the GPU in the Xbox One X is about on par with the RX 580, which of course trades blows with the 1060.

        2. We have explained to him a million times that you can’t compare architectures directly like he’s trying to do with the X and the 1060. Nvidia generally gets more performance per TFLOP and needs less bandwidth. He thinks the X is comparable to a 1070 because it’s 6 TFLOPS vs 6.5.
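          For context on why the raw TFLOPS figure is a weak comparison: it is just shader count × 2 (for a fused multiply-add) × clock speed, so very different chips can land on similar headline numbers while behaving very differently per TFLOP. A quick sketch with the published shader counts and clocks:

            # Theoretical FP32 TFLOPS = shader units * 2 (FMA) * clock in GHz / 1000.
            def fp32_tflops(shaders: int, clock_ghz: float) -> float:
                return shaders * 2 * clock_ghz / 1000.0

            print(fp32_tflops(2560, 1.172))  # Xbox One X: 2560 shaders @ 1172 MHz        -> ~6.0
            print(fp32_tflops(1920, 1.683))  # GTX 1070: 1920 CUDA cores @ ~1683 MHz boost -> ~6.5
            print(fp32_tflops(1280, 1.708))  # GTX 1060 6GB: 1280 CUDA cores @ ~1708 MHz   -> ~4.4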

          1. I am impressed that you had the patience to try to explain to him that you can’t compare different architectures directly. Honestly I think that’s more than he wants to learn. I can’t even get him to realize that part of his 12 GB RAM on the Xbox One X is being used by the OS, game engine and program.

          2. “The Xbox One renders at a native resolution of 1536×864. The Xbox One X uses a dynamic resolution, with the native resolution ranging between 2688×2160 and 3840×2160.”

            Always comparing dynamic with native?? What a peasant 😀

      3. Either you are getting paid by console makers, or you’re simply spreading misinformation just for fun…

      4. There is no Xbox One X Slim; MS is working on Scarlett and an Xbox streamer.
        He said it right there: they are working on a game-streaming solution to stream Xbox games to any device. One of these devices will be an Xbox “light”, a dumb streaming box, probably $99 and ARM-based; the only on-device features it will have are loading apps like Netflix.

    2. It will run great because all games are designed with consoles first in mind. We should expect a combination of medium/high settings at 30 fps to run just fine.

      1. You are not getting “mid/high” at this point unless new consoles come out. Also, do you really think that consoles can maintain 30 fps in most games?

        1. I don’t keep up with console news or benchmarks for them at all, but what I wonder is whether the reported 30 FPS is always maintained or is an average. I don’t consider 30 FPS to be very good for first-person shooters, and if that 30 FPS is an average then I consider it unacceptable, because in that case it could be dipping into the teens during situations when you really, really don’t want such low FPS, like intense battles with a lot of onscreen combatants and action going on.
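          The distinction being asked about here, a locked 30 versus an average 30, is easy to see from frame times: an average can hide deep dips that a minimum or 1%-low figure would expose. A tiny sketch with made-up frame times:

            # Average FPS can hide dips; minimum / 1%-low figures expose them.
            # These frame times are made up purely to illustrate the difference.
            frame_times_ms = [30.0] * 95 + [66.7] * 5   # mostly smooth, a few big drops

            avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
            worst_1pct_ms = sorted(frame_times_ms)[int(len(frame_times_ms) * 0.99)]
            print(f"average: {avg_fps:.1f} fps, 1% low: {1000.0 / worst_1pct_ms:.1f} fps")
            # -> average ~31.4 fps, but 1% low ~15.0 fps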

  1. IMO the specs of the rig for that demo don’t mean a lot. If it was running at 4K then a 1080 Ti would make sense anyway. The 8700K and 32GB of RAM most likely won’t be necessary. They went overkill on that rig to make sure that the demo (probably not optimized very well) would run without any chance of glitches.

    My guess is that what CDPR will be aiming for on release is a 4-core/8-thread CPU and 16GB of RAM, though it will probably run OK on 8GB of RAM with some limitations. The mighty 1080 Ti will be pushed down to comparable midrange performance when Nvidia launches their next generation this year. They will probably recommend an 1180 Ti for 4K on release a couple of years from now.

    CDPR intends to make the biggest and best game that they have ever made with Cyberpunk 2077 but at the end of the day they want to sell as many copies as possible and that will mean taking into account the hardware used by mainstream PC gamers and of course console gamers a couple of years from now.

    1. Jensen Huang said that, but it was unclear what he meant by “a long way off”. There have been rumors for a while now that Nvidia is buying up GDDR6 VRAM. I can’t prove that the 1170/1180 will be released this year, but I will say that I paid attention to the release schedule of the past 4 generations from Nvidia and my gut feeling is that we will get the next-generation midrange GPUs this year. That’s not meant to sway anyone’s opinion. I could easily be wrong. Not to toot my own horn, but I have been right about the last 3 generations : )

  2. I don’t care. I keep reading about this demo and that demo. CDPR, stop being an a*** and show us the gameplay. For a game I’ve been waiting for since 2012, only showing a trailer in 2018 is a d*** move. Enough with the private demo BS. It’s just a video game; why hide it? Show the thing already.

      1. CDPR definitely know; they’re just toying with the kittens, ahem, their audience, to build as much hype as possible. From a marketing point of view they are doing everything right.

        1. There will definitely be outrage if they don’t show it at Gamescom. It’s been 5 years since the announcement; couple that with reports of an engine change, horrible working conditions, scrapping 4 years of pre-production, employees who got on board just to develop CP2077 leaving after the massive burnout that resulted from working on TW3, etc. People en masse are tired of waiting.

      1. I was exaggerating; I’m pretty sure my 980 Ti would be able to run it at high settings at 1080p.

        I was running The Witcher 3 at 1440p (custom resolution for downsampling), maxed out with HairWorks, at 50+ fps, so I’m positive I will be able to run this game well enough… besides, it was an unoptimized demo running at 4K.

  3. By the time the game comes out, 1180s/2080s will be out and most likely 1180 Tis/2080 Tis, so at that point running the game won’t be THAT hard. Let’s not forget this game isn’t close to being finished. I’d say another year and a half, but what do I know? Nothing.

    1. That sounds like a reasonable estimate. I’m expecting the game to be released sometime in 2020. We should get the 1170 and 1180 this year, and then sometime between 6 and 9 months after that the 1180 Ti should come, based on the past 3 generations of release schedules, so I would certainly expect the 1180 Ti before this game releases.

      1. Yeah, so if we go by The Witcher 3’s release state, I believe we’ll have more than enough computing power to enjoy the game at its best.

        1. I’m almost 100% positive that by release a midrange card from the next generation will be fine for 1440p, and an 1180 Ti should be fine for 4K in this game, unless you are looking for extremely high FPS for a 120 Hz or 144 Hz monitor.

    2. Yeah, but many users like me are stuck on old CPUs like Sandy Bridge, Ivy Bridge and Haswell, and putting in a new GPU that would be bottlenecked by the CPU is not an option.

      1. Really depends on the specific chip and what top framerate you are looking for. I have a 4690K and I plan on getting at least one more GPU upgrade out of it. Plus on PC you can smartly adjust settings to reduce CPU usage.

        1. What GPU do you have now? I have a GTX 1080 paired with an Ivy Bridge i7 3770K @ 4.5GHz. I use it for 4K resolution, and I think I can handle one more upgrade to a GPU more powerful than a GTX 1080 Ti.

          1. I have a 1070. I still need a monitor upgrade; I’m only on an old 1080p/60Hz 27″, though I often downsample from 1440p. I’m going to get either a 1440p monitor or a 4K TV next, not sure. But either way I’m sticking with 60fps as a target, which should keep my CPU requirements reasonable.

            Gears 4 on PC has a really nice benchmark that actually illustrates how much your CPU is bottlenecking your PC and what its lows are. The game itself is kinda meh but I kinda enjoyed playing with the benchmark. (obviously how much it bottlenecks you will vary by game but it still gives you a sense of things)

            https://uploads.disquscdn.com/images/2c39f10200f7394dc622e548b79adebc32512e5f2f69c17515e07e3e0bab9579.png
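              Roughly speaking, what a benchmark like that surfaces is whether the CPU or the GPU takes longer per frame: whichever is slower sets the frame time, so frames where the CPU time exceeds the GPU time are CPU-bound. A small sketch with invented per-frame timings (real tools capture these directly):

                # Whichever of CPU or GPU takes longer sets the frame time; frames where
                # the CPU side is slower are CPU-bound. Timings below are invented.
                cpu_ms = [12.0, 14.0, 25.0, 13.0, 24.0, 12.5]
                gpu_ms = [16.0, 16.5, 16.0, 17.0, 16.0, 16.5]

                frame_ms = [max(c, g) for c, g in zip(cpu_ms, gpu_ms)]
                cpu_bound = sum(c > g for c, g in zip(cpu_ms, gpu_ms))
                print(f"CPU-bound on {cpu_bound} of {len(frame_ms)} frames")
                print(f"frame times (ms): {frame_ms}")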

      2. It depends on resolution. For example, I have an i7 3770K @ 4.5GHz (Ivy Bridge) paired with a GTX 1080, and I use it for 4K. From this I can see that an even more powerful GPU could be used with my current CPU. And of course, like Jinx wrote, it is also about the maximum framerate you want to have. For me it is 60FPS because my monitor is 60Hz. But if you have a 120Hz monitor or above, you will need a much better CPU to get the highest FPS.

    1. I think they’re referring to the gameplay shown behind closed doors. The actual trailer said “in-engine” but I don’t expect gameplay to be quite as pretty. When you have a camera moving along a pre-determined path you can optimize more than if you are giving a player full freedom of movement.

    1. I’ve seen an EVGA 1080 Ti for $750 with an instant rebate on their store, available to buy recently, but they sold out very quickly. $750 was the launch price for the non-reference-cooler 1080 Ti, and $700 for the reference FE edition.

      Right now, unless you get lucky, you are still going to pay around $100 more for the non-reference-cooler 1080 Ti.

      But to answer your question, I think you will get a decent price on a 1080 Ti when the 1170 and 1180 get released later this year. I would be very cautious about buying a used one on eBay, though, because there are so many used cards out there that have been beaten to death mining.

  4. Currently targeting this generation of consoles, I would say the release date will be fall 2019, with an enhanced version coming later for the next gen.

    Very much looking forward to this game from what I’ve read and seen. Deus Ex with Prey 2 and Blade Runner.

    Hope CDPR releases a Complete or Enhanced version for free when it does come out for next gen.

    Why make money off one gen when you can make money off two?

    Especially if you have so much confidence in your reputation and product.

    Can’t wait for Gamescom to check out the demo!

  5. Give me a good SLI update and my two 980 Ti cards, 6700K and 64GB of RAM will flawlessly run this game at ultra settings! Can’t wait.

      1. Nope, purely for gaming. It was cheap when I got it, $500; the same sticks cost $800 today. Won’t need to upgrade my RAM for many years to come. Is it wrong to have that much RAM? Even if I can afford it? Even if all I play is Minecraft?

        1. It’s not a matter of right or wrong. It’s just a matter of at what point adding more RAM stops making any difference at all in gaming. 8GB is plenty for most games. There are some games that can make use of more than 8GB, so 16GB is of some use. Beyond that there isn’t any return on your investment from adding more.

          Also, by the time even 32GB could possibly offer some very slight performance increase, your 6700K and LGA 1151 mobo will be hopelessly outdated and pretty much worthless, and we will probably be on DDR6 by then, so all of that 64GB of RAM will be good for nothing but filling up a trash can.

        2. Yes, it’s really dumb. You should have stuck with 16GB (32 max) and put the money you saved into a faster GPU or CPU, or a bigger SSD, etc. Those things would actually affect performance; anything over 16GB of RAM is pointless for gaming.

          And by the time you *need* 64GB of RAM, what you bought will be outdated and need to be replaced with something faster anyway.

          1. I built my rig in 2015; the 980 Ti was the strongest GPU back then and the 6700K was the best gaming CPU at the time as well. I have a 1TB M.2 SSD, my rig cost me $3,500, my GPU and CPU are water-cooled, and my monitor is $1,500, 21:9, 3440×1440. When it’s time to upgrade I’ll gladly and easily do so. I don’t own my company for no reason, buddy.

          2. That doesn’t make it any less of a waste of money. But it’s yours to waste so you do you.

          3. Exactly. I can go out today and buy a $10,000 rig with a $5,000 monitor; it’s still my money I’m spending, not yours or anyone else’s.

        3. There is no harm in having more system RAM in your PC, and it isn’t wrong either.

          That said, the extra memory won’t get fully utilized if your main objective is gaming and other simple desktop tasks.

          But that extra memory will surely help if you are into heavy rendering, encoding, animation and similar work, or running virtual machines and servers.

          For CAD, other heavy animation work, and high-end audio/video production, the extra system memory might come in handy, but not for average gaming.

  6. If that is the case and they were running at 4K, that is perfect for me. I’m running:

    i7 4770K @ 4.2GHz
    16GB of DDR3
    EVGA 1080 Ti
    4K G-Sync monitor

    With my G-Sync monitor, as long as I stay above 45 frames it is butter smooth.

    1. How? The video they made about the Cyberpunk trailer didn’t say it was running on an Xbox One. They said it’s probably a tech demo for the next Xbox. I don’t see how that’s shilling. Fanboys of anything are ridiculous.

  7. I hate that I have to keep buying computer hardware to play these games. Why don’t we have the tech to just wirelessly pipe these games into my eyeballs?

  8. Regardless of whatever is going on, I will honestly say that if anything comes out on the next-gen systems that runs at less than 60 frames per second, I won’t put a penny towards any of it.

    I tried to invest in this generation of consoles and got burned right there again. I would happily accept 1440p or 1080p at 60 frames per second.
