
AMD Ryzen 9 7950X3D benchmarked in 10 recent CPU-heavy PC games

And the time has come to say goodbye to our beloved Intel Core i9 9900K. DSOGaming has upgraded its main PC gaming system to an AMD Ryzen 9 7950X3D. As such, we’ve decided to benchmark it in ten of the most CPU-heavy recent PC games.

Before continuing, we should list our full PC specs:

    • Corsair 7000X iCUE RGB TG Full Tower
    • Corsair PSU HX Series HX1200 1200W
    • Gigabyte Aorus Master X670E
    • AMD Ryzen 9 7950X3D
    • G.Skill Trident Z5 Neo RGB 32GB DDR5-6000MHz CL30
    • Corsair CPU Cooler H170i iCUE Elite
    • SanDisk SSD Ultra 3D 2TB
    • Samsung 980 Pro SSD 1TB M.2 NVMe PCI Express 4.0
    • Samsung 970 Evo Plus SSD 1TB M.2 NVMe PCI Express 3.0

Here are also some images from our PC system.

[Images: our AMD Ryzen 9 7950X3D PC system]

Instead of using Windows 11, we’ve decided to stick with Windows 10 64-bit. From what we can tell, there aren’t, at least for now, any major performance differences between these two operating systems.

For the following benchmarks, we used an NVIDIA GeForce RTX 4090 with the latest GeForce WHQL drivers. Since we’re talking about CPU benchmarks, we ran all games at 1080p. Furthermore, we disabled SMT, as it makes no sense to have it enabled for gaming. PC games usually use 4-6 CPU threads, and the AMD Ryzen 9 7950X3D has two CCDs with 8 CPU cores each. By enabling SMT, we are basically cutting the performance of each core in half. Take Resident Evil 4 Remake, for example. As we can clearly see, SMT brings a noticeable performance hit.
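
If you want to check whether SMT is active on your own system, a quick way is to compare the physical core count against the logical CPU count. Here is a minimal sketch in Python using the third-party psutil package (our own illustration; not a tool we used for these benchmarks):

```python
# pip install psutil
import psutil

physical = psutil.cpu_count(logical=False)  # physical cores (16 on a 7950X3D)
logical = psutil.cpu_count(logical=True)    # hardware threads (32 with SMT enabled)

if physical and logical and logical > physical:
    print(f"SMT is ON: {physical} cores exposing {logical} threads")
else:
    print(f"SMT is OFF: {physical} cores, {logical} threads")
```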

[Screenshots: RE4R with SMT On / RE4R with SMT Off]

The AMD Ryzen 9 7950X3D is a beast and had no trouble running all of the following CPU-heavy games. As we’ve already reported, some of these games rely heavily on one or two CPU threads. However, even with Ray Tracing in games like Gotham Knights, The Witcher 3 Next-Gen or The Callisto Protocol, the AMD Ryzen 9 7950X3D can provide smooth framerates. And that’s without enabling NVIDIA’s DLSS 3.

What’s also interesting is how these games behave on AMD’s CPU. For our benchmarks, we used AMD Ryzen Master to enable/disable the second CCD. And although we’ve heard reports about the 7950X3D underperforming in Hogwarts Legacy when both CCDs are enabled, we did not really see any major performance differences. Not only that, but Marvel’s Spider-Man Remastered was actually faster when both CCDs were enabled. After all, Marvel’s Spider-Man Remastered is one of the few games that can take advantage of more than 8 CPU threads.

[Benchmark charts: AMD Ryzen 9 7950X3D]

In our tests, only two games performed worse when both CCDs were active: Wanted: Dead and Resident Evil 4 Remake. For whatever reason, these two games were running on the second CCD (and not on the one with the 3D V-Cache). By disabling the second CCD, we forced them to run on the appropriate CCD, which resulted in a significant performance increase.
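
Disabling the second CCD in Ryzen Master is one way to do this; another approach some users take is to pin the game’s process to the V-Cache CCD via CPU affinity instead. Below is a minimal sketch of that idea in Python with the third-party psutil package. The process name is hypothetical, and we assume that with SMT off, CCD0 (the 3D V-Cache die) maps to logical CPUs 0-7; the exact mapping can vary, so check your own system:

```python
# pip install psutil
import psutil

GAME_EXE = "game.exe"       # hypothetical process name; replace with the actual game binary
CCD0_CPUS = list(range(8))  # assumption: CCD0 (3D V-Cache) = logical CPUs 0-7 with SMT off

# Note: changing another process's affinity requires sufficient privileges.
for proc in psutil.process_iter(["name"]):
    if proc.info["name"] and proc.info["name"].lower() == GAME_EXE:
        proc.cpu_affinity(CCD0_CPUS)  # restrict the game to the V-Cache CCD
        print(f"Pinned PID {proc.pid} to CPUs {CCD0_CPUS}")
```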

For all our future benchmarks, we’ll be testing both of the 7950X3D’s CCD configurations. Given that our CPU has 16 CPU cores, though, we won’t be testing SMT/Hyper-Threading!

73 thoughts on “AMD Ryzen 9 7950X3D benchmarked in 10 recent CPU-heavy PC games”

  1. For the games that use more than 8 threads (like Spider-Man), you could test the difference between both CCDs with SMT off and only CCD0 with SMT on.

  2. I get that SMT is unnecessary when you have all 16 cores active, but could you add a comparison of SMT on/off when only CCD0 is active? I don’t expect 8C/8T will be limiting on such a fast CPU, but it would be nice to see a comparison if you’re going to have SMT off for all future benchmarks.

    1. We’ll have such comparisons only for games that can use more than 8 CPU threads. For those that don’t, we won’t have any graphs with SMT On.

    2. Most games don’t scale past 6 cores these days … John has shown this in several games where adding more cores past 6 results in practically zero performance gains, and some don’t even scale past 4 cores very well. The reason is that experienced programmers tend to get lazy: 6 cores (with 2 cores reserved for the OS) was the standard set by last-gen consoles, so they are still programming for only 6 cores, and they aren’t taking advantage of SMT/Hyper-Threading at all since last-gen consoles couldn’t do SMT.

      “That’s the way we’ve always done it” is a death knell for engineering and programming. After a lot of engineers/programmers get 10 years under their belts, they don’t want to go through all the trouble of learning something new again like they did in college and in their first couple of years in the field.

  3. @john If you want to close the case, moddiy has a 90-degree cable; I’ve been using it for a month now without any issues. Also, some higher-resolution benchmarks would be nice.

  4. Each 8-pin power cable must be independent and not shared… come on, it says so in the instructions. You should know that.

    1. Yup … you’ll likely end up with everything on one of the two 12V rails and start tripping the OCP. I just helped a guy last month who made that mistake with a 3080; AC Origins was causing a complete reboot, which is a classic symptom of an OCP tripping. His problem was that he had an ASUS motherboard with the completely unnecessary extra 4-pin 12V plug (a 12V 8-pin can carry 336W, more than enough for any AMD CPU), so he had the motherboard on one rail and the GPU on the other rail. The solution was to just pull the unnecessary 4-pin feed to the motherboard and move one of the GPU 12V lines up to it. That balanced the GPU load across both 12V rails. Problem solved.

    2. It’s the transient spikes you have to worry about, but Nvidia managed to smooth them out with the 4090; it’s more of a problem with the 3090 and 3080. The main reason he’s not seeing a problem is that he has an oversized 1200-watt PSU, but the downside is that it will be operating well below its peak efficiency most of the time. An oversized PSU costs you more both initially and in the long run, because it will use more electricity than it would if it were operating in the middle of its efficiency curve. A 1000-watt PSU would probably have been a better choice here, at least from a cost and energy efficiency standpoint. Of course, if you load only one side with a high-powered GPU, then you stand a better chance of tripping the OCP, but that should never be a problem if you load a power supply correctly across both rails.

      Proper power supply balancing is something few if any PC building guides ever mention, but all power supplies 450 watts and above are going to have two 12V rails, each capable of handling only half the rated total power.

  5. getting Gayzen which has horrible DDR5 support was a mistake imo. DDR5 is getting faster and faster, and game devs are catching up; soon AMD and its abysmal support will be a hindrance to proper benchmarking.

    1. Most game devs can’t even do proper CPU threading these days, or take advantage of DX12’s shader pipeline, which can also be multithreaded so shader compilation doesn’t take as long. Most games can’t scale past 6 cores, much less 12 or 16 … so faster memory just puts a band-aid over their threading incompetence. The root of the problem is that programming games doesn’t pay very well, and the people who actually understand how to do proper threading on modern CPUs are working in the commercial markets, where they make considerably more because consumer markets pay sh*t in comparison.

      Just look at AMD, or even better Nvidia, and you will see that more and more of their profits are coming out of the commercial sector. The downside for consumers is that they have to sell their consumer lines for more money (higher margins), or it just isn’t worth their time when they can make more by using limited production time at TSMC on their high-margin commercial products.

    2. Nothing wrong with that, but AMD shills treat the company like it’s literal alien utopia tech. I’m just balancing it out.

  6. Who the hell wants to spend $700 on a CPU?

    Most big-budget PC titles are horrible ports as of late.

    This high price of PC hardware and the stuttering mess PC games are, is driving people away from PC to consoles at an unprecedented rate.

    I was looking over the most sold games. 12 out of 20 best selling titles are Nintendo games that run on potato hardware.

    PC gamers are busy being consumption wh-ores with 8k and doing PC build spec and RGB p-ssing contests. Meanwhile Nintendo is outselling everyone with toaster hardware, because the games are fun and enjoyable.

    PC gaming is going completely in the wrong direction, alienating anyone who isn’t wealthy or a kid living off their parents. Many people are trying to make ends meet, but they’re trying to sell PC gamers a $700 gaming CPU to play badly optimized ports, f off. PC gaming is going to tank with these prices.

    F overpriced PC hardware like these CPUs; no one wants to spend this kind of money to play badly ported, stuttering, DRM-infested PC games.

    Grow a damn spine, don’t be a consumption wh-re and reject this crap.

    1. The 7950X3D is for people who use their CPU for productivity and want top-end gaming as well. The 7800X3D is cheaper and targeted mainly at gamers.

          1. And who except you has an RTX 4090?! I mean, overkill much. Anyway, has The Callisto Protocol’s single-CPU-core issue been fixed after 15 GB, Johnny?

          2. I have one… It’s AWESOME! 🙂 Upgraded from a Pascal Titan X… That was upgraded from dual GTX 980s in SLI… Those were upgraded from dual GTX 480s in SLI… Those were upgraded from dual GTX 280s in SLI… Those were upgraded from dual 8800 GTXs in SLI… Those were upgraded from a single 7950GX2… and that was upgraded from a 7800GS. Before that I had taken a decade-long break from PC gaming, and my last video card before that was a Matrox Millennium paired with an Orchid Righteous 3D Voodoo FX card. The OG. Just to say I wait years before upgrading… But when I do… BAM!

          3. So glad John FINALLY got a CPU that is worthy of it too! I have a 12900k and my benchmarks were always so much better, results wise, than what he was getting. Good on you John!

          4. Ahhh… You missed out on the TNTs and the first iterations of the GeForces… Those were the times!

          5. Who’s “we”? I thought it was a one-man show. Of course John’s gonna say he bought it; talk is cheap. Can you provide proof?

        1. John most likely bought it himself, but even if he got it from AMD or another company, that is fine. He is not writing biased articles that are pro AMD.

    2. The fact that you think these CPUs are targeted exclusively at gaming is delusional. They are for gaming + streaming, heavy multitasking, video encoding and rendering, giving workstation performance at a consumer price. Even ~6 years ago, this class of performance cost double what the 7950X3D started at.

    3. You need to calm down a bit. No budget gamer looks at a 7950X3D. If a ‘budget’ gamer has $700, they’d get a 5800X3D for $330, a solid B550 mobo for $150, 32GB of 3600 DDR4 RAM for $90, and a 2TB Gen4 SSD for $150, and enjoy 88% of this CPU’s performance.

    4. Or you could do what the smart PC gamer does and just get the Steam Deck, which is unbeatable from a price/performance perspective right now.

      As a bonus you won’t get any shader compilation stutter on that hardware, because Valve has already pre-compiled all shaders for all games on their storefront for the Steam Deck.

      On top of that you will be able to emulate Nintendo’s Switch games with better performance, plus pretty much any other console in existence.

      Just sayin’…

    5. Well said. Totally like this, and it’s disgusting!
      I know I won’t upgrade, even if I had the money. My 2060S and old CPU (9600K) will do!

      No way I will even go close to the 40-series, and I will never look at POS AMD!

      To hell with all this overpriced CPU and GPU HW. It stinks!

      I can emulate Switch perfectly and run most games well anyway, so it will do!

  7. So no more testing the 9900K?
    It would be nice to know how the 9900K fares throughout this gen (and also, since I’m still on a 9900K, your performance analyses have helped me a lot).

    1. It’s already a pain in the a*s benchmarking the 7950X3D with its different configurations. And then we’ll have to re-download a game for the i9 9900K. Unfortunately, it’s a lot of work (which is why we also stopped using our 4930K when we bought the 9900K).

  8. For ages people have been crying about the 9900K being used in benchmarks; now you’ve got people crying about the price of the CPU.

    BE HAPPY AND SHUT UP FOR ONCE 😀

  9. Nice build bro 🙂 And are the Hogwarts Legacy results without Ultra RT? Maybe we need the Ultra RT results ^^

  10. The problem with Witcher 3 is that it is unintentionally CPU-heavy because of improper programming of the game engine’s threading … I can’t even get mine to start up since the last update, but that’s just something you have to learn to deal with when modding a game that’s still getting updates; there is always a chance an update will break your save. Not a big deal, since I finished the main story a couple of months ago, but it will mean I’ll likely have to start all over again to play Hearts of Stone and Blood and Wine. That’s not going to happen anytime soon anyway, not until they actually fix the threading problem so I can actually use RT in the game. Hopefully CDPR learned something about outsourcing ports … just don’t do it. If you want something done right, you have to do it yourself.

  11. John, first of all, congratulations on your new battlestation!
    It’s good that you listened to the people suggesting you get a new CPU after you got the RTX 4090.
    This will bring more quality to your benchmarks and thus your site.

    Second, to all readers, I need advice:
    I’m planning to upgrade soon from my i7 8700K to the AMD 7800X3D with 32GB DDR5-6000MHz.
    I’m totally new to the AMD platform and I don’t know what SMT is. Are there any other things I should know if gaming performance is all I’m looking for?

    1. AMD SMT = Intel Hyper-Threading. It basically lets one CPU core handle two threads (think of it as “cutting” a CPU core in half).

      1. That’s not true at all … it’s two hardware threads sharing one core’s execution resources, cache, and I/O/memory bus. So while it doesn’t double the performance, it does increase it considerably (30-40%), and it doesn’t “cut it in half”. The reason it got a bad rap is that Intel’s version had a security issue that didn’t exist on AMD, so Intel basically had to gimp it with a microcode update, but anything 10xxx and above doesn’t have that problem anymore either, since the hardware architecture flaw was fixed. (A rough way to sanity-check the scaling claim is sketched after these replies.)

    2. 1. The new AM5 platform can take 5+ minutes to train memory on first boot. Don’t assume a problem when your new build doesn’t boot and you get no signal—just wait it out;

      2. Ryzen loves lower latencies. Get the G.Skill Trident Z5 6000 CL30. It offers the best performance out of the box with EXPO enabled (AMD’s version of Intel’s XMP);

      3. Gigabyte is offering the better B650/X670 motherboards; however, they may require flashing a newer BIOS to work properly with memory. Just something to keep in mind;

      4. If you get a Radeon GPU, ReBar is free performance (up to 20% in some games). Make sure to use it.

    3. SMT = simultaneous multithreading … or what Intel calls Hyper-Threading, which is also just SMT with a brand name because … marketing.
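
Regarding the SMT scaling debate above, here is a rough way to measure SMT throughput scaling yourself: time a CPU-bound workload with one worker per physical core, then with one worker per logical CPU. This is our own hedged Python sketch (psutil is a third-party package); pure-Python busy loops do not behave like a game engine, so treat the resulting percentage as a ballpark, not proof either way:

```python
# pip install psutil
import time
from multiprocessing import Pool

import psutil

def burn(_):
    # CPU-bound busy work standing in for a real workload
    s = 0
    for i in range(5_000_000):
        s += i * i
    return s

def tasks_per_second(workers, tasks=64):
    start = time.perf_counter()
    with Pool(workers) as pool:
        pool.map(burn, range(tasks))
    return tasks / (time.perf_counter() - start)

if __name__ == "__main__":
    physical = psutil.cpu_count(logical=False)  # real cores
    logical = psutil.cpu_count(logical=True)    # hardware threads
    base = tasks_per_second(physical)
    smt = tasks_per_second(logical)
    print(f"{physical} workers: {base:.2f} tasks/s")
    print(f"{logical} workers: {smt:.2f} tasks/s ({(smt / base - 1) * 100:+.0f}%)")
```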

  12. @JohnDio:disqus Congratulations on the new setup. I still rock my 9900KF and am going to upgrade soon, so this is a good comparison of what I will get from this upgrade.
    Is it possible for you to run a benchmark on Destiny 2’s Lightfall expansion on Neomuna (Neptune), doing the Terminal Overload event and recording the FPS during the whole encounter?

    1. No more whiny articles saying “the 4090 only runs X game at X fps… with a 9900K” and faulting the graphics card. Ah, the relief.

      So, lesson learned I hope: at least @1080p THERE IS a massive performance difference.

    2. Not really. 100-120 fps is my target minimum framerate, and even the 7950X3D is failing to hit this in several of these games. That this is the case is a shocking indictment of the developers of these poorly coded games.

      1. Based on studies, we know the human visual system can only process between 7 and 14 Hz of information, so it’s interesting to see people like you who say they need over 120fps just to enjoy playing games :P.

        From the PC Gamer article “How many frames per second can the human eye really see?”:

        “When you want to do visual search, or multiple visual tracking, or just interpret motion direction, your brain will take only 13 images out of a second of continuous flow, and you will average the other images that are in between into one image.”

        But there are some benefits to running games at extreme framerates, for sure. Human vision is sensitive to repetitive artefacts, and a higher frame rate helps with this. Also, input lag and motion clarity are better. Even over 120 fps, a sample-and-hold display will blur the picture during motion. I have a 170Hz panel, and I can still see blur during motion very clearly, while on a CRT even 30fps looked perfectly sharp in motion. I wish they still made CRT displays, because I was perfectly happy with 60fps on a CRT, and now games even at 170Hz are blurry during motion.

  13. It’s baffling to me that gamers buy these expensive CPUs just to get gains at 1080p. I’ve been rocking a 3700X gaming at 4K for a while and I still see no reason to upgrade my CPU…

  14. Nice! You finally realised the 9900K is a relic of the past. Now do some RPCS3 benchmarks. Try RDR, GoW III, Killzone 3, TLoU.

  15. For someone who games exclusively at 3440×1440 and 4K, do you think it’s a worthwhile upgrade too? Or is it only for lower resolutions?

    1. Depends on the framerates you target. For 60fps, an Intel i9 9900K is still powerful enough (provided the games are properly optimized; the games we featured in this article are known for their CPU bottlenecks even on high-end PC systems).

      1. Yeah, some games these days can look incredible while being insanely detailed. Like Atomic Heart, which had no issue running at 120fps.

        But do you think the difference at higher resolutions would be more like 10-20fps? Or are we talking 20+fps, even at such high resolutions? 🙂

          1. Depends on your graphics card too. If you are always GPU-limited because you have something less than a 3080, a better CPU isn’t going to help much at resolutions above 1080p. This generation’s 4070 Ti is more powerful than last generation’s 3090 and just a couple percent behind a 3090 Ti, which should give you some idea of what a beast the 4090 really is. I can afford one, but it’s way overkill at 1440p. I’ll probably get a 4070 Ti eventually, but not for at least another 15 months, when my current GPU is 2 years old. By then I’ll probably just build a new system from scratch on the AM5 platform. For now, the 5800X and 3070 Ti are doing pretty much everything I want.

            1. But if you have the latest 4000-series cards, you can just enable DLSS 3 and bypass the CPU, right?

            I’m thinking my next upgrade will be in 15 months too.

    1. The developers of the games are the biggest bottleneck. None of these consoles are running anything even close to a 9900K, so y’all need to cut the rubbish. This bottlenecking argument is futile.

      1. Well, obviously, but if the CPU is at 100% usage it’s a bottleneck when the GPU is nowhere near that. The 9900K should be enough, but it isn’t.

    1. It still depends on the game, settings, GPU and resolution. With a lesser GPU it wouldn’t be as noticeable; at a higher resolution it wouldn’t be as noticeable. However, the 4090 is such a beast it can CPU-limit just about any processor, even at 4K.

      For most of us lesser mortals using, say, a 3070 Ti at 1440p with maxed-out graphics settings, we probably wouldn’t notice much difference between a 9900K and a 7950X3D unless we gimped the graphics trying to get maximum FPS. I don’t play any multiplayer games anymore because of my schedule, and none of my personal friends are into gaming. They’d rather go hunting and fishing and shoot real guns.
