Darksiders 3 has just been released and we’ve decided to use the final/retail version for our PC Performance Analysis. We postponed this analysis because we wanted to see whether things would improve with the game’s day-1 update. As such, we’ve benchmarked the final version and it’s time to see how this new action RPG performs on the PC platform.
For this PC Performance Analysis, we used an Intel i7 4930K (overclocked to 4.2GHz) with 16GB of RAM, AMD’s Radeon RX580 and RX Vega 64, NVIDIA’s RTX 2080Ti, GTX980Ti and GTX690, Windows 10 64-bit, the GeForce 417.01 driver and the Radeon Software Adrenalin 18.11.2 driver. NVIDIA has not released any SLI profile for this game, so our GTX690 performed similarly to a single GTX680.
Gunfire Games has included very few settings to tweak. PC gamers can adjust the quality of Shadows, Anti-Aliasing, View Distance, Textures, Post Processing, Effects and Foliage. The game supports Windowed, Windowed Fullscreen and Fullscreen modes, however it currently locks to your native resolution in Fullscreen. In order to run at higher resolutions, we had to increase our desktop resolution to either 1440p or 4K and then select Windowed Fullscreen, so bear that in mind if you are interested in downsampling. The game also lacks an FOV slider and is locked at 62fps (though you can remove the framerate lock via its .INI file, as shown below).
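For reference, the 62fps figure matches Unreal Engine 4’s default frame-rate smoothing range (22-62fps), and the usual way of lifting such a cap in UE4 games is to edit the engine’s Engine.ini. The snippet below is only a minimal sketch of that standard UE4 approach; the config folder name and whether Darksiders 3 honours these exact keys are assumptions on our part, not something Gunfire Games has documented.

    ; Engine.ini – typically found under %LOCALAPPDATA%\Darksiders3\Saved\Config\WindowsNoEditor\
    ; (the "Darksiders3" folder name is an assumption – look for the game's Saved\Config folder)

    ; Disable UE4's default frame-rate smoothing, which is likely what the 62fps cap comes from
    [/Script/Engine.Engine]
    bSmoothFrameRate=False

    ; Optional: 0 removes any remaining engine-side fps limit
    [SystemSettings]
    t.MaxFPS=0

As with any config edit of this kind, back the file up first and keep in mind that a future patch may overwrite or ignore it.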
Darksiders 3 is powered by Unreal Engine 4, an engine that is well known for its multi-threaded capabilities. However, even though it’s 2018, Darksiders 3 is mainly a single-threaded game. The game partially uses two or three cores/threads, but it relies heavily on a single thread. This basically means that a lot of gamers may be CPU limited due to the game’s inability to take proper advantage of multiple CPU cores/threads, and to be honest, this behaviour is inexcusable for a 2018 title. Due to these CPU issues, our simulated quad-core and six-core CPUs were actually slower when we enabled Hyper-Threading. Thankfully, and despite these CPU optimization issues, Darksiders 3 can run with 60fps on a variety of CPUs.
Now, while the game mainly uses one CPU core/thread, it can easily be described as a GPU-bound title. With the exception of our NVIDIA GeForce RTX2080Ti, all of our other graphics cards were used to their fullest at 1080p on Epic settings. Yes, even the AMD Radeon RX Vega 64 was being used at 95-98% at 1080p, something that clearly shows how GPU demanding this new game actually is.
At 1080p, our AMD Radeon RX580 was unable to offer a smooth gaming experience. At 1440p, our NVIDIA GeForce GTX980Ti and AMD Radeon RX Vega 64 were not able to run the game with 60fps, and at 4K our NVIDIA GeForce RTX2080Ti was unable to offer a constant 60fps experience. As always, we used an open area that seemed to really push our graphics cards to their limits. As such, consider our benchmarks as stress tests, meaning that there are other areas that run better/smoother. Still, we believe it’s best to use worst-case scenarios for our benchmarks.
Now, we wouldn’t mind these extremely high GPU requirements if the game’s visuals justified them. However, that’s not the case here. Darksiders 3 looks very good but it’s not the next Crysis. A lot of light sources do not cast shadows, environmental interactivity and destructibility are limited, the game’s lighting system is not as impressive as we were expecting from an Unreal Engine 4 title, and there is simply nothing here to ‘wow’ you.
All in all, Darksiders 3 is a really demanding title that simply does not justify its GPU requirements. The game also mainly uses a single CPU thread, something that is really inexcusable in 2018. At 1080p, an NVIDIA GeForce GTX980Ti or an AMD Radeon RX Vega 64 is enough for 60fps on Epic settings. At 1440p, PC gamers will need a high-end GPU like an RTX2080 or a GTX1080Ti. And at 4K… well… even the most powerful GPU is unable to offer a constant 60fps experience. There is undoubtedly room for improvement here, and Gunfire Games has stated that the game will support NVIDIA’s DLSS tech (right now it does not), so here is hoping that the team will fix at least some of the issues we’ve mentioned – especially the windowed/fullscreen issues when using resolutions higher than native – via a post-launch patch!

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and strongly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best consoles. Still, the PC platform won him over. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC Graphics Cards.”

It looks beautiful. Fight me 😉
It has a nice art style, making it look “beautiful”, but the actual graphics are mediocre.
Artistically it does look very beautiful, but other AAA games look and run way better on the same hardware.
So do the previous games, but that doesn’t justify a high-end GPU.
Yeah
“looks beautiful”
That means it’s good, eh?
I’m playing with a 1080Ti at 5K on Epic settings at a constant 60fps. If this is a really demanding game, what about the others?
others? You mean the previous ones? They can run on a toaster.
I think he meant other recent games, not the prequels.
Exactly.
Even with an overclock my KitchenAid stutters a bit…
Bad multi-slice optimization, I think. It’s an 8-slice.
I don’t think your 1080Ti managed to run this game at 5K @60fps, unless everything is set to low.
Yup. Called it. It has visuals that are simply not good enough to justify the performance.
I think the only game within the last 5 years I have seen that shows a difference between High and Ultra is Deus Ex: MD. But the Ultra settings make that game run even more like crap.
John. You have an old CPU. The age of your CPU is what’s limiting you, not the game. The IPC improvements from the 6th gen onward are considerable compared to the 4th gen you own. Therefore the only limit is your hardware, and therefore your analysis is useless, because it is not representative of the average person’s hardware, nor is it representative of what the game was intended to run on.
Developers tend to not design games around outdated hardware like a 4th gen CPU
The lowest a Dev would be likely to go would be 6th gen. 9th gen to 6th gen is likely what it’s designed for as far as Intel goes
So the way modern games are developed invalidates anything you test using your outdated CPU
Keep in mind that 4th gen was released during the 360 era. It’s not a modern platform in the slightest.
People like to say things are entirely GPU bound, but that’s inaccurate.
A CPU will always affect performance. Of course the impact is lessened at 4K, but it is definitely still there.
A CPU commonly calculates physics. I can almost guarantee that the hair is 100% CPU bound in that game.
A CPU is also what the blueprints used in UE4 run on. So obviously the CPU is being used.
And if your CPU is old and slow, like yours is, the game isn’t going to run like it should
If the CPU were the issue, the game would not continue to improve as you upgrade GPUs, but as you can see, the framerates continue to climb across the board as better GPUs are used. Now, if these charts flatlined, you would see a CPU issue.
Sure, there would likely be more frames to gain from a faster CPU, but it is by far not the bottleneck here.
Although, the multi-core utilization is obviously problematic with many cores and especially hyper-threading/SMT being a mess.
A GPU will consistently improve framerates. You can always brute force performance, which is effectively what he is doing.
Not if your CPU is pegged out. If you are maxing your CPU, it cannot deliver any more information to the GPU (draw calls).
You could increase the resolution at maximum CPU load, but not the framerate.
It’s not 4th gen. The 4930 is Ivy Bridge, not even Haswell.
Well that’s even worse then, isn’t it?
It is :))
It’s not. I’ve got 3rd, 4th, 6th and 7th gen Intel CPUs and the game runs the same on all of them. There’s a CPU bottleneck because the game engine stresses the GPU while the CPU is not being used properly; I’ve seen this issue in past games, and you can try forcing the game to run at the highest priority via Task Manager. All the Intel Core chips are the same family of CPU and games don’t really need more than 2 or 3 cores, although more cores are welcome for live streaming, virtualization and video rendering, and in those cases IPC counts. My i7 6700K smashes my i7 4770K in Premiere Pro, although when I OC the i7 4770K the performance is almost the same.
He knows it…
No he doesn’t. He doesn’t understand that his CPU is severely outdated. He keeps putting out misleading and incorrect information. So either he doesn’t understand or he just doesn’t care about DSOG’s reputation, what little it has left.
You are f*king dumb. How is an i7 4930K not representative of the average person’s hardware lol. You are a f*king moron.
I get 60fps at 1440p on Epic settings on my 980 Ti.
We need a video here.
What CPU are you on?
i7 6700k
Seems possible given the older one used in the article isn’t too far behind.
Most likely it’s his DDR3 memory speed and CPU IPC; higher clock speeds are needed.
Correct
I feel like this article would be more useful with at least High and Medium settings listed as well. Often, games look nearly identical, if not completely identical, between “Ultra” and “High” settings, while gaining back huge amounts of GPU processing time.
To me, these analyses are never complete without that information, as you’re only painting one corner of the picture.
Strong fragrance. All this ultra stuff is just fluff, and not terribly useful unless you’re just all about image quality. Being a performance analysis, you would think more people here are more interested in frame rates anyway.
I hate that every game requires a GPU.
Your CPU and memory speed are bottlenecking the system…
Hence why AMD is many times at the same level as a 980ti or worse…
You should not assume things that are not happening ;). It’s well known that AMD’s GPUs under-perform in Unreal Engine 4 games. We’ve also said that the Vega 64 was used to its fullest (so no, we were not CPU or RAM limited).
https://gamegpu.com/action-/-fps-/-tps/darksiders-iii-test-gpu-cpu
First, it’s only for the sake of comparing all vendors in their best light. Secondly, oddly enough the game scales better at 1440p for the Vega 64 for some reason, so the plausible cause is a CPU/memory-speed bottleneck somewhere.
GameGPU shows far better results for the RX580 at 1080p: 43fps minimum and 59fps average.
So yes your CPU is a limiting factor somehow.
We don’t really know what the difference is between your hardware and theirs, only that their CPU is an 8700K @ 5GHz.
I don’t know about the memory speed either.
Sigh, this is why I don’t reply to comments, and it was a mistake to even reply to this one, because the article and the previous comment already explain what I’m about to write right now. The game does not feature a benchmark tool, so the performance numbers depend on the scene each outlet is testing. We used the most demanding scene and obviously GameGPU did not test the same scene as us. This explains the difference in numbers, not your false idea that we were CPU/memory limited (if we were, the GPU would not be used to its fullest, period. That’s a fact. You won’t magically get an extra 10fps by pairing the GPU with a more powerful CPU when it’s already being used at 98%. The RTX2080Ti is a completely different story. It WAS CPU limited, as it was used at 57% at 1080p, no doubt about that. However, that wouldn’t be the case if the game wasn’t mainly single-threaded. So yeah, for the RTX2080Ti it’s a similar case to Forza Horizon 3 and Halo Wars 2).
As for comparing all vendors in their best light… AMD should simply up its DX11 game. It’s a similar situation to a game being single-threaded. Yes, you can brute force past its optimization issues by using a more powerful CPU, but that doesn’t mean the game suddenly becomes optimized. It’s inexcusable for DX11 games to have such a huge CPU/memory overhead due to the drivers on AMD’s front, especially when the competitor offers drivers with minimal overhead. This is why a lot of gamers claimed that the previous AMD GPUs aged like fine wine. It’s not because NVIDIA degraded the performance of its older GPUs; it’s because AMD’s DX11 drivers were really awful at launch and – at least to its credit – the red team has slightly improved them. Given that it was able to improve things, imagine how awful things were in the past; it still has a long road ahead, though. And yes, these overhead/optimization driver issues disappear the moment developers use Vulkan or DX12.
“The game does not feature a benchmark tool so the performance numbers depend on the scene each outlet is testing”
So bringing up the GameGPU results in the first reply was meaningless as well.
You probably know that DX11 driver overhead is not just a software problem that AMD has to solve…
John…
FineWine has little to do with AMD itself; it’s basically games ported from consoles that get better optimization from developers, since the two major consoles are based on the same architecture, hence why the 200 series is the one that gains the most. If drivers were really being slowly optimized by AMD, older games would receive a boost as well, and they simply don’t. Unreal is simply badly optimized for AMD in general.
Unreal Engine has had a very well-known partnership with NVIDIA since the beginning…
I think it’s just a waste of time discussing this matter with you. You have worked on this for many years; if you haven’t noticed the trends in the hardware business, who is with whom in terms of engines and partnerships, and that each brand has its positive and negative aspects, it’s because you don’t want to see it.
A review where the CPU is not the limitation for either vendor’s hardware, for whatever reason, is simply the way to go.
You are contradicting yourself when you bring up GameGPU results as a valid reference and then dismiss them in the next reply.
Bye John, sorry for my uneducated remarks.
I was not the only one to point this out btw.