It appears that MSI may have leaked the first gaming benchmarks for the upcoming AMD Ryzen 9 9950X3D CPU. Since these are coming straight from MSI, they are legit. As with most “first-party” benchmarks, though, we suggest taking them with a grain of salt. So, let’s dive in.
MSI has benchmarked three games at 1080p: Black Myth: Wukong, Shadow of the Tomb Raider and Far Cry 6. The MSI graphs include both the AMD Ryzen 7 9800X3D and the Ryzen 9 9950X3D. MSI has not named them, but their identities are obvious from the graphs and from the CPUs against which the team compares the new Ryzen 9000X3D chips.
So, the AMD Ryzen 9 9950X3D appears to be 11% faster in Far Cry 6, 4% faster in Shadow of the Tomb Raider, and 2% faster in Black Myth: Wukong. On the other hand, the AMD Ryzen 7 9800X3D seems to be 13% faster than the AMD Ryzen 7 7800X3D in Far Cry 6, and 2% faster in Shadow of the Tomb Raider and Black Myth: Wukong.
Now, I know that the Shadow of the Tomb Raider and Black Myth: Wukong results may disappoint some gamers. However, there are some things you should keep in mind.
You see, the built-in benchmarks for both Shadow of the Tomb Raider and Black Myth: Wukong are mostly GPU-bound, even at 1080p. So, even though MSI has used an NVIDIA RTX 4090, we are basically looking at GPU bottlenecks in these two games.
On the other hand, we know that Far Cry 6 is a game that relies heavily on one CPU core/thread. That was one of our biggest gripes with it when it came out. Due to this CPU optimization issue, FC6 is CPU-bound even at 1080p or 1440p on high-end GPUs like the RTX 4090. So, a performance increase of 11-13% is great news.
Again, these appear to be early gaming benchmarks, so take them with a grain of salt. My advice is to wait for some third-party gaming benchmarks.
Rumor has it that AMD will reveal or launch its new 3D V-cache CPUs on October 25th. As with all rumors, though, I suggest taking it with a grain of salt. Man, there is a lot of salt in this article, right?
Anyway, although the benchmarks are legit as they come from MSI, the release date rumor seems fishy.
Stay tuned for more!
Thanks, Hardwareluxx.

John is the founder and Editor-in-Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best consoles. Still, the PC platform won him over consoles. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC graphics cards.”
Contact: Email

I get why they use 1080p as a resolution for these game tests, as it's more likely to expose a CPU limitation than higher resolutions, but I always feel it's so misrepresentative. I only wonder how many people buying these newer X3D chips are using them to push 1080p. At the esports level they do use these beefy CPUs to push 600+fps on 360Hz and 540Hz monitors, but they rarely benchmark esports titles for these comparisons.
Exactly. It's more bad than good to me for being too abstracted from real life use cases.
I still wanna upgrade but buying any part feels like playing the lottery these days.
The era of buying individual parts is kind of coming to an end, except for the extreme high-end.
People are moving to mini PCs; they are outselling desktop PCs by a lot.
We went from ATX, to micro-ATX, to mini-ITX, and now to mini PCs. A mini PC is basically notebook components put into a shoebox; a NUC, really.
Mini PCs have killer price/performance and performance-per-watt ratios. The memory is shared and the components are bought in bulk, so they are way cheaper than buying individual parts, and there is healthy competition between mini PC vendors. Individual components were basically a Taiwanese monopoly, which is what made building PCs so expensive.
Yeah I might go with a prebuilt, it was nice to build my own stuff while it lasted. Huang played the silicon lottery, and we lost.
These will never outperform a serious, dedicated, discrete gaming GPU. Much like a laptop, you're trading performance for convenience, and some aren't willing to make that trade. Maybe in the future, when memory is fully integrated and advanced APUs are the norm. But that isn't here yet, and mini PCs are still second-class PCs.
They are perfect for desktop use, but their GPU power is at least 10 years behind.
I hope their release will bring down the price of the 7800X3D, which has skyrocketed since 13th-gen-gate started.
2% faster, holy moly! Insta buy!
What most people don't understand is that this new generation of CPUs isn't built for gaming; they are built for the corporate desktop market. All the new additions to the CPUs are there to increase business performance and business efficiency. That is why, when the 9000 series came out, they were pushing efficiency so hard, and the same is true for the new Intel lines, where a Core Ultra 285 isn't really any more powerful than a 14900K but is considerably more efficient.
When you have hundreds or even thousands of desktop CPUs, a 20% gain in efficiency translates into 20% lower energy costs, and that adds up quickly across a corporation. Thanks to AI, the corporate desktop market is booming while the consumer desktop market is still depressed, so all improvements are concentrated on the sector that is booming.
John doing spreadsheets needs a CPU with 16 cores?
He needs a celeron.
Very bad take. Silicon x86-based architectures have pretty much peaked and are at a cliff in performance and power efficiency, so they simply can't ramp up clock speeds anymore the way that brought them to this point without incurring massive inefficiency, and thus power consumption and heat, for too little gain. That's why they're doing everything else they can, like adding cores and cache, while trying to squeeze out whatever little efficiency they can and working on ARM for the near future.
Except GN did efficiency testing on the 9000 series and found that it was often identical to or worse than the 7000 series. In reality, some workloads did see noticeable improvements, but those were mostly server-related tasks, IIRC; L1T did a bunch of testing related to that.
https://www.youtube.com/watch?v=6wLXQnZjcjU
Crazy that anyone can get excited over an 11% performance increase in multitasking.
We are a long way off from the '90s, when hardware performance was doubling every generation, with massive jumps in single-core clock speed and performance.
CPUs are no longer getting faster where it matters. X3D's stacking of a bunch of L3 cache was a one-time performance freebie for games. Game developers will actually need to optimize games, learn… advanced mathematics… read through thousands of SIGGRAPH papers. The era of relying on Unreal Bloat Engine 5 is over.
And they will do none of that. These developers have the most cutting-edge tools of all time but refuse to do the work necessary. Now it's the age of "fake frames will fix it." Funny how Unreal Engine was becoming a has-been until Fortnite picked up steam and then gave way to the Epic Store. With that Fortnite money, Epic was able to throw its weight around and pretty much buy people into using its engine again through incentives. It's a cold world we live in. Unreal Engine is the oldest engine there is; you'd think it wouldn't have as much bloat and would be streamlined after being around for 25+ years. In every other field, experience is a good thing. Not for Epic, apparently.
I think the problem is silicon is giving diminishing returns atm.
They are searching for an alternative material, but that's all that it is atm. A search.
Many people think quantum computing is the future; however, as of yet, it is only suitable for other kinds of workloads.
However I still don't think chasing photorealistic graphics as of now is a solution to the gaming industry woes.
I hope once they finally get there, they will delegate more people/resources to writing meaningful stories again.
But then again – for whom? The modern mainstream audience seem really brainwashed.
Games have become political agenda pieces lately, and their purpose as a storytelling medium that connects with "normal" people is no longer a priority.
Nah, performance wasn't "doubling every generation" in the Pentium I era and onwards; clock speeds were. But if you're that old, by now you should also know that every tech moves crazy fast in its early stages and slows to almost no progress at its plateau, and here we are.
I'm as curious as anyone about how this may regain pace in the next few years, other than adding cache or cores, or maybe moving to ARM… A replacement for silicon and quantum computers are both still far away… Lithography is also slowing down. Better software optimization? I doubt it too. It will have to be a very concerted effort on various fronts.
Sure, but at least my tactic works well. I went from an i5-2500K to a 7800X3D. This is how you get a nice boost in performance. This yearly upgrade thing is just weird, and it's been going on for many years, too. Look at Intel's ++++++ refreshes. I haven't seen huge performance boosts in more than 10 years.
So if they keep going with 5-30% increases, I'll probably have a really good new CPU in 5-8 years.
Speaking of devs… yeah. They are just getting lazier. Fake frames and DLSS are not a solution for bad engines and poor optimization. It's a shame they don't optimize their games. Even with the amazing hardware we have today, games run badly and stutter. Unreal Engine 5 never runs smoothly either. What a mess. I know people had issues with UE4, but I never did. My problems started with 5. UE5… my god. Worst engine ever.
This is also what I do. Jumping from a heavily OC'd 2600K to a 13600K was an insane jump in performance a year or two ago. In the meantime, I had upgraded the GPU twice, with decent-sized perf gains each time. The CPU also helped with performance on the older GPUs, but then scaled even more with an RTX card.
I guess I'm not made of money like some game hobbyists are, but as long as I can hit 60 fps or higher, I'm happy. 90+ is of course nice, but at some point it's just diminishing returns for a LOT more money. GPU prices are flippin' insane nowadays. If I have to choose between rent, car, food or gaming, the answer's obvious. I guess there are also things people with a life do with their money, but that's gonna have to take a backseat now that I'm living that sober life.
HAIL SATAN!
This is old news now, but what I am hoping is that the 9900X3D will be as fast as or faster than the 7800X3D in gaming. I currently run a 5900X, and the only reason I didn't upgrade to the 7800X3D was that it's no faster (or even slower in some instances) in non-gaming tasks. The 9900X3D should not have that problem.
There is literally nothing you can throw at your current cpu that it can't handle. Save that money and invest instead.
Rocking an Intel 9900K. Held off on the 14900K and the 7800X3D, but I feel I could postpone again since the benefits are around 10%.
That 9900K is considered very old now, and you will for sure be CPU-bound in every game you play. The 7800X3D is roughly double the perf of that 9900K – https://youtu.be/z5HwZr5W0Hc
Save instead.
Did anyone else notice this? I saw TechPowerUp publish the wrong info because they didn't see this mistake in the graphic.
https://uploads.disquscdn.com/images/65d7cffbd1bc3a0234c4b22c2ea47981ad89aadd0699b8de3c734d131d8765bd.jpg
Gamers won't be happy until they can reach 50,000 fps, at which point they will cease to be in this continuum.
Haha, 50,000 fps 😭😂🤣 I can't even argue with that because it's true. But the technology ain't the problem anymore; it's the people using it. These crappy developers don't want us to jump continuums. 😁
The 8-core SKU is actually the AMD Ryzen 7 9800X3D processor, not a Ryzen 9.
The Ryzen 9 parts are obviously the 9950X3D and 9900X3D SKUs. We also have the Ryzen 5 9600X3D in the pipeline, as per rumors.
It appears that only the AMD Ryzen 7 9800X3D 3D V-Cache CPU will be introduced on the 25th of October, which is what was reported earlier, but the actual retail launch/sales will take place in the first week of November.
The rest of the X3D SKUs, both Ryzen 9 parts (the 9950X3D and 9900X3D), will be announced at the CES 2025 event in January next year.
Retail availability is expected to be around the late Q1 2025 timeframe for these 16- and 12-core parts.
Who are these X3D chips for?
People with 4090s who play in FHD?
Is there any benefit for 4K gamers?