Here is how you can run Cyberpunk 2077 on non-AVX CPUs and improve performance on AMD Ryzen CPUs

Modder ‘Scorpija’ has released a mod that enables PC gamers to run Cyberpunk 2077 on non-AVX CPUs. Furthermore, there is a hex tweak via which AMD Ryzen owners can improve the game’s performance on their systems.

Let’s start with the non-AVX CPU fix. All you have to do is download the mod from here and place the executable file inside your Cyberpunk2077\bin\x64 folder. Obviously, you’ll have to replace the existing .exe file. Do also note that this fix only works for the GOG version of Cyberpunk 2077 (which is DRM-free).
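
For context, the stock executable uses AVX instructions, which is why it crashes on older CPUs that lack them. If you are not sure whether your processor is affected, a CPUID check will tell you. Below is a minimal C++ sketch (using MSVC intrinsics; it is not part of the mod): leaf 1 reports AVX support in ECX bit 28, and bit 27 (OSXSAVE) plus an XGETBV read confirm that the OS actually saves the wider YMM registers.

    // Minimal sketch (MSVC intrinsics): does this CPU/OS combo support AVX?
    // Not part of the mod; just a quick check of whether you need it.
    #include <immintrin.h>
    #include <intrin.h>
    #include <cstdio>

    int main() {
        int regs[4]; // EAX, EBX, ECX, EDX
        __cpuid(regs, 1);                   // CPUID leaf 1: feature flags
        bool osxsave = (regs[2] >> 27) & 1; // OS uses XSAVE/XRSTOR
        bool avx     = (regs[2] >> 28) & 1; // CPU advertises AVX
        // AVX is only usable if the OS also saves the YMM state
        // (XCR0 bits 1 and 2 set).
        bool usable = avx && osxsave && ((_xgetbv(0) & 0x6) == 0x6);
        std::printf("AVX usable: %s\n", usable ? "yes" : "no");
        return usable ? 0 : 1;
    }

If this prints “no”, the stock Cyberpunk2077.exe will not run on your machine, and the mod above is for you.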

Meanwhile, Reddit’s UnhingedDoork has provided the following tweak guide, via which AMD Ryzen owners can improve the game’s performance. This tweak aims to improve multi-threading support, which should increase minimum and average framerates and improve the overall gaming experience.

Here is what you have to do in order to improve the game’s performance.

  • Download the HxD hex editor from here
  • Open Cyberpunk2077.exe in HxD
  • Search for “75 30 33 C9 B8 01 00 00 00 0F A2 8B C8 C1 F9 08”
  • Replace them with “EB 30 33 C9 B8 01 00 00 00 0F A2 8B C8 C1 F9 08”
  • Save the executable file and run the game
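
If you’d rather not do the search-and-replace by hand, the same one-byte edit can be scripted. The following is a hypothetical C++ sketch of what the steps above do: it loads the executable, finds the first occurrence of the byte pattern, and flips the single differing byte (0x75, a conditional JNZ, becomes 0xEB, an unconditional JMP). Back up the original first; the hard-coded file name and minimal error handling are for illustration only.

    // Hypothetical sketch: automates the HxD steps above by changing
    // the first byte of the matched pattern from 0x75 (JNZ) to 0xEB (JMP).
    #include <cstdio>
    #include <cstring>
    #include <fstream>
    #include <iterator>
    #include <vector>

    int main() {
        const unsigned char pattern[] = {
            0x75, 0x30, 0x33, 0xC9, 0xB8, 0x01, 0x00, 0x00,
            0x00, 0x0F, 0xA2, 0x8B, 0xC8, 0xC1, 0xF9, 0x08};

        std::ifstream in("Cyberpunk2077.exe", std::ios::binary);
        std::vector<unsigned char> buf(
            (std::istreambuf_iterator<char>(in)),
            std::istreambuf_iterator<char>());
        in.close();

        for (std::size_t i = 0; i + sizeof(pattern) <= buf.size(); ++i) {
            if (std::memcmp(&buf[i], pattern, sizeof(pattern)) == 0) {
                buf[i] = 0xEB; // only the first byte changes
                std::ofstream out("Cyberpunk2077.exe", std::ios::binary);
                out.write(reinterpret_cast<const char*>(buf.data()),
                          static_cast<std::streamsize>(buf.size()));
                std::puts("Patched.");
                return 0;
            }
        }
        std::puts("Pattern not found (already patched, or a different build).");
        return 1;
    }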

Here’s hoping that CD Projekt RED will include these tweaks in a future PC update. Yesterday, the team released Patch 1.04, and you can find its complete changelog here.

71 thoughts on “Here is how you can run Cyberpunk 2077 on non-AVX CPUs and improve performance on AMD Ryzen CPUs”

  1. Funny how I mentioned on many YouTubers’ channels how it pounded a single core like a DX11 game while being a DX12 game.

    1. That’s an issue with game development. Developers sometimes have a reason to fit everything onto a single core. Turn 10, for example, said they tried as hard as they could to make everything get processed on a single core in Forza Motorsport 7, for latency reasons, and that game is pure DX12.

      DX12 makes the API better able to leverage multiple threads, but that doesn’t mean the game itself will benefit from more multithreading.

      1. It’s just like how Forza Horizon 3 was, but then the developers fixed it, and in Horizon 4 it’s perfect or close to perfect.

  2. There’s a Steam version on Nexus as well. I had to use it too, as I have no clue how to use a hex editor. Don’t know why they didn’t do this in the recent patch.

  3. There is now a mod that does this automatically, so you won’t need to fiddle with a hex editor:
    https://www.nexusmods.com/cyberpunk2077/mods/117

    According to this presentation by AMD: https://gpuopen.com/wp-content/uploads/2018/05/gdc_2018_sponsored_optimizing_for_ryzen.pdf

    This hex code translates roughly to:

    DWORD cores, logical;
    getProcessorCount(cores, logical);
    DWORD count = logical;
    char vendor[13];
    getCpuidVendor(vendor);
    if (0 == strcmp(vendor, "AuthenticAMD")) {
        if (0x15 == getCpuidFamily()) {
            // AMD "Bulldozer" family microarchitecture
            count = logical;
        }
        else {
            count = cores;
        }
    }
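
    Reading the bytes in the guide above (my interpretation, not from the presentation): 0x75 is the x86 opcode for a short conditional jump (JNZ) and 0xEB is a short unconditional jump, so the one-byte patch makes the branch that normally skips this vendor/family check always taken. The effective post-patch logic therefore collapses to:

    DWORD cores, logical;
    getProcessorCount(cores, logical);
    DWORD count = logical; // vendor/family checks are jumped over, so
                           // Ryzen keeps all logical (SMT) threads

    In other words, Ryzen CPUs are no longer lumped in with the non-Bulldozer path that restricted the game to physical cores.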

  4. If the modders really made it perform better, it just shows CDPR’s optimization and competence are even poorer than they seemed at first.

    1. Man, I dunno. I play with a 5700 XT / R5 3600 at Ultra/Psycho, 1080p, with Static CAS at 80% res scale.

      I was GPU bound at all times before, I’m still GPU bound at all times now. No fps improvement. Still 60+ fps.

      1. It really doesn’t look anything special, yet it eats all the GPU power. I don’t know why some think the game looks gorgeous or next-gen. The game simply looks bad compared to the FPS it gives.

  5. And to all you dummies on here who piled on when I said 8GB-of-VRAM GPUs wouldn’t cut it for next gen at 1080p, all because NVidia decided to pull a fast one and you defended it: you can go do you know what to yourselves. An amateur could tell you that 8GB of VRAM ain’t gonna cut it at 1080p ultra in next-gen gaming, and this game just made you eat your words. A lot of games last gen were already topping out at 6GB of VRAM at 1080p Ultra. So it’s dumb of you to think otherwise for next gen/whatever gen we are actually in.

    In short, you can thumb me down all you want: NVIDIA is playing all of us with those 8GB cards, and they will be useless in no time. I don’t care whether you are a fan or not; start using your brain and don’t get got. This new generation will eat VRAM like a fat kid eating cake. Don’t waste your money buying those 8GB cards.

    https://uploads.disquscdn.com/images/b1991c00c5af0a7e3554847f298c5fd1f4b17382ac0687b180df2970d93350c8.png

      1. From the moment I saw the stats on those cards, I said to myself “yes, the fix is in,” and the dummies are gonna actually buy it. Nvidia gave them 2080 Ti performance for cheaper, but you ain’t gonna get 2080 Ti performance with new-generation gaming; you’re gonna run out of VRAM before that even has a chance to happen. These companies are slick with it, but the consumers are even dumber. No one should be buying those cards; let them rot on the shelves. Cyberpunk 2077 is maxing out and going over 8GB of VRAM at 1080p ultra; imagine what 1440p is doing. People are willingly paying for bottlenecks these days, it seems.

        1. Ooh, I laugh now at the people who said my RTX 2080 Ti was overkill at 1080p, that I should sell it and buy an RTX 2060 because it would “play CYBERPUNK 2077 in 1080p Ultra with RT over 144 fps.”
          “You don’t need 10–12 GB VRAM in 1080p… that’s stupid.”
          “…why buy a card with lots of VRAM for a 1080p monitor? Are you dumb?”

          Ooooh, how hard I am laughing at those old comments from 2019…

          1. Holy crap, man. Sell a 2080 Ti and buy a 2060? The word “dumb” can’t describe how foolish they were.

        2. When the 30 series was released, the VRAM was my main concern, and it seems they didn’t bring any major improvement in compression either, so yeah… That said, many games allocate whatever they can get their hands on, and it’s easy to spot in many titles; shame so few reviews report actually-used VRAM.

          I’ve seen graphically underwhelming games at 1080p allocate over 20 gigs on my 24GB card… Heck, I bet the game itself + fully decompressed assets + other rendering data would not be able to fill even 10, yet they allocate whatever they get after leaving some for the OS + other apps. So don’t always believe what you see when it comes to allocated VRAM vs. actually used.
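
          For anyone curious where those numbers come from: on Windows, a process can query its own video-memory budget and commitment through DXGI 1.4, and the “usage” figure counts committed allocations, not bytes the GPU actually touches each frame, which is one reason “VRAM used” readings can overstate real need. A rough, hypothetical C++ sketch (standard DXGI 1.4 API, error handling omitted):

          // Hypothetical sketch: query this process's VRAM budget and
          // commitment via DXGI 1.4. CurrentUsage counts committed
          // allocations, not memory the GPU is actively touching.
          #include <dxgi1_4.h>
          #include <wrl/client.h>
          #include <cstdio>
          #pragma comment(lib, "dxgi.lib")

          int main() {
              using Microsoft::WRL::ComPtr;
              ComPtr<IDXGIFactory1> factory;
              CreateDXGIFactory1(IID_PPV_ARGS(&factory));

              ComPtr<IDXGIAdapter1> adapter;
              factory->EnumAdapters1(0, &adapter);  // first (primary) GPU

              ComPtr<IDXGIAdapter3> adapter3;
              adapter.As(&adapter3);                // DXGI 1.4 interface

              DXGI_QUERY_VIDEO_MEMORY_INFO info{};
              adapter3->QueryVideoMemoryInfo(
                  0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);
              std::printf("Budget: %llu MiB, committed: %llu MiB\n",
                          info.Budget >> 20, info.CurrentUsage >> 20);
              return 0;
          }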

    1. Then don’t use ultra? I haven’t played this game yet, but I have seen videos showing that even medium can look almost as good as ultra. Well, CDPR has a history of going nuts on their max settings. Remember The Witcher 2’s Ubersampling setting?

      1. Don’t use ultra? Only if you want a subpar, console-like experience. If you are an enthusiast, buy a card and play the game. Sell the card and buy a new card. Rinse and repeat.

        1. What’s the point of using ultra and taking the performance hit when you can’t really see the visual difference? DiRT Rally 2.0, for example, has a massive performance hit going from high to ultra, but after spending really extensive time with that game, switching between the presets, I honestly did not see the difference between the two. My GTX 970 can get around 80 FPS at high settings, but ultra can drop my FPS into the mid-40s.

      2. That’s NOT the point here, dude, and I don’t play on Ultra for this game. I just did this to show people what was up, as a warning. I have no interest in playing at 35–45 FPS. I play Cyberpunk 2077 on medium, sadly. I know I’ve been quite privileged; I’ve never played a game at anything other than Ultra in all my PC gaming history.

      1. Thank you for being logical and a person of reason. Because AMD knew; AMD knew that it couldn’t continue to release 8 GB cards. This generation of gaming won’t cut it with 8GB, but too bad they paper-launched the damn cards and are further playing with us. AMD has too many SKUs to deal with, and unfortunately the desktop GPU market is at the bottom of their list. For years I’ve watched games eat 4–6.5 GB of VRAM, and I’ve been waiting patiently for 12 GB cards minimum. That’s why I sat on my Vega for this long. I had a Titan X before, and it died; then I had no choice but to get a Vega 64, which was what was available at the end of 2018. So now I will have to play on medium for the first time in my PC gaming history and patiently wait for 12GB minimum, or for these 6000 series to get their minds right over at AMD.

      1. Anyone buying a new card shouldn’t buy anything under 16 GB if you are looking for longevity & 1440p. 12 GB is the sweet spot for 1080p. I’m simply saying don’t run out to buy any damn 8GB GPUs, & so many people are getting salty because they wanna live in denial. I’m not saying you should go into your rig, rip out the 8 GB GPU that you’ve owned for years, and burn it in effigy. I’m just saying to people: DON’T WASTE YOUR MONEY ON THE new GPU OFFERINGS CURRENTLY from Nvidia. They released these extremely fast GPUs but put flat tires on them. That’s what’s happening; they are Fuqing useless at 8GB & an insult, all because Nvidia wanna come off as cheaper.

        Instead of leaving the 3000 series to rot on the shelves, people are selling their left nut to fill their systems with a very useless product. Then when you show evidence of a possibility, you get denial thrown your way. That’s alright; everyone has an opinion. But it’s not alright when people are burning up their cash all because of hype and not using their brain. The evidence is clear as day: if you continue into next gen with anything less than 12 GB, you’re gonna run into some problems and can’t run ULTRA. When I’m actually playing Cyberpunk 2077, I turn the game to medium, and VRAM usage sits at 6.5–6.9 GB. I’m currently using a Vega 64; at this point my card has finally met its match at 1080p. We had a good run. Thanks for listening, peace.

        1. Useless products being the only products on the market that can run games with ray tracing optimally. You can’t even turn on ray tracing in Cyberpunk 2077 with an RX 6800 XT, nor would you want to. They have no answer for DLSS, and FidelityFX fails badly in Cyberpunk atm.

          1. Ray tracing’s still a gimmick. I wouldn’t even worry about it till the 3rd or 4th gen. It’s the equivalent of 4K gaming when that whole craze started like 4 years ago, and they still can’t even do it without cheats/upscaling in most titles, even in 2020. Don’t buy into the hype or they’ll just keep piling it on.

        2. The 980 Ti only has 6GB, and I’m running 1080p high at 50–60 fps. I agree that if I were buying a new card, I would really want at least 16GB. You can say what you want about what the consoles have, but we are the master race, and if you want to enable all the bells and whistles while keeping a solid frame rate, you’re gonna want to out-spec ’em, unless you’re expecting console-like performance (which is obviously pathetic). So don’t buy into the hype of the little guy; have a little patience, let the holiday rush come and go, and eventually there will be stock on the shelves, and if they want to sell it, they will drop the price. Also, now that we have some competition, we will be seeing some upgraded SKUs with better specs. Remember, good things come to those who wait.

          1. I couldn’t agree more, man. The passage you just wrote should be pasted on everyone’s desktop, because it will help them get over FOMO. I wanted to buy a 5700 XT to replace my Vega 64 so badly, but I didn’t, because it’s just another 8 GB card when I should be looking at a 12–16 GB card. The 5700 XT is a very great card, but not the best buy right now given what’s to come.

    2. Yep, I said that from the moment FF XV came out on PC.
      Everyone was like “8GB VRAM for 1080p is overkill, yOu WiLL bE fINe wItH a GTX 1070 fOr YEaRs”… Yes, when you play games on LOW or MEDIUM at 1080p.
      Things change rapidly when you are going for ultra settings with 4K texture packs, or, better yet, newer next-gen games; then 6–8 GB of VRAM isn’t enough anymore, even at 1080p.
      Why people think VRAM is only needed at 1440p or 4K is beyond me.

      1. Some just don’t understand that some day even a well-written and optimized game will need more than 8GB. Even the texture resolution, apart from the render resolution, can eat the VRAM. Honestly, I haven’t seen a game worth so much VRAM yet, but believing that X amount of VRAM (especially 8GB) is gonna be enough for years is just wrong.

        1. And most importantly, some can’t READ or COMPREHEND. Now you have people saying, “what’s the point of running ULTRA when you can’t see the difference?” I wasn’t aware that this discussion or argument was about what you can or can’t see as an individual. Some people, I tell ya; very frustrating to deal with. That’s why I just don’t respond. Reading and comprehension are very difficult for a lot of these folks.

          1. So I present evidence refuting your claim that 6GB is enough, and it proves your point?

            *slow clap*

    3. In Cyberpunk 2077, 4GB is enough for 1080p and high textures. There are barely any texture issues on my GPU (4GB RX 580). Actually impressed by it, because I expected worse on the VRAM side.

      1. At what, 30fps? Can’t work with that personally… I’m running a 3800X / 980 Ti (OC’d) / 32GB DDR4-3666 and getting 50–60fps at 1080p high. I am happy, for now. But I’ve seen the benches on RX 580s, and they aren’t looking up to snuff for the old-school PC peeps… but hey, probably with a little tweaking it’s way better than consoles.

    4. Game runs fine for me at ultra with 8GB of VRAM at 1440p, even with DLSS off. RTX 2080, ultra, with no raytracing (raytracing murders the framerate but that’s a different issue and the game looks good without it).

    5. I started playing Horizon. I have a potato PC with an RX 560 4GB; setting the game at 720p and “original settings” eats most of it. At 1080p + the same settings, it struggles to get 30FPS.

      So yeah, no doubt 8GB is hardly enough to run this one, an OLD game, at 1080p + high settings with 60FPS. No way in hell will future games, as in 1 or 2 years from now, not struggle, more so when consoles already have 16GB of (shared) memory.

      1. you amd fangays are rytarded as fck. when we point out issues with owning an amd component in your system, you always cry about how this was a thing in the past while in reality if anyone’s going to run into trouble, it’s always amd users.

          1. you are a sand n*gger. If I was below average in my country, I’d be still the smartest person in your country.
            as soon as I saw your profile pic I knew I was talking to a smart person.

    1. More like, yet again, a POS-optimized “Nvidia: The Way It’s Meant to Be Played” game comes out and needs patches up the @$$ lol

        1. Like how the 6800 XT crushes the 3080? Or how Nvidia tried to blacklist Hardware Unboxed because they did not focus enough on ray tracing in reviews lol

          1. amd is great, keep buying their garbage. you’ll be here again and it’s not going to be me who’s complaining, that’s for sure.

          2. AMD is great. Suckers like you keep buying Nvidia so they can keep taxing you on BS that games are not ready for 😀 How is ray tracing going for those 2080 Ti beta testers XD

          3. what? taxing me on what? are you off your meds? ray tracing works great, looks great.
            are you mad because you can’t run it?

          4. lol looks great and runs great? you don’t even get paid to type that bs… sounds like you are the one in need of meds

          5. You’re mad because of the 30fps you get in your games. Only poor people complain about how taxing ray tracing is, because they can’t afford the hardware.

  6. Ads vs real game (both images from the PC version – console version looks even worse)
    source: YouTube – Wk6HAEsEPo8

    Update: Sorry, I checked that mission on Xbox Series and it looks like the top picture (only the hair has worse lighting). I don’t know what settings are in the picture below; maybe this is some kind of error (old drivers, perhaps?)
    https://uploads.disquscdn.com/images/a8663f8bc8fe4eb65a731b3683b561c6f455e10da03f13162256b3b70a578a6b.jpg

  7. Thanks for the info, John. It doesn’t really increase the framerate, but it does help with 1% lows, and driving is smoother now.
