CI Games has just released its Unreal Engine 5-powered Souls-like game, Lords of the Fallen, on PC. Lords of the Fallen takes advantage of Lumen and Nanite, and its Ultra Settings are “Next-Gen” demanding. In fact, the game is so demanding that it can drop to the low 40s at Native 4K on the NVIDIA GeForce RTX 4090.
First things first, though. For these early PC benchmarks, we used an AMD Ryzen 9 7950X3D, 32GB of DDR5 at 6000MHz, and NVIDIA’s RTX 4090. We also used Windows 10 64-bit and the GeForce 537.58 driver. Moreover, we’ve disabled the second CCD on our 7950X3D.
Lords of the Fallen does not feature any built-in benchmark tool. Therefore, for our benchmarks, we used this scene. This was one of the most demanding areas early in the game, so consider this a stress test. Other areas will definitely run better, so be sure to keep that in mind.
At Native 4K/Ultra Settings, the NVIDIA GeForce RTX 4090 cannot come close to a 60fps experience. In our benchmark/stress test, the RTX 4090 pushed a minimum of 42fps and an average of 48fps. By enabling DLSS 3 Super Resolution in Quality Mode, we got a constant 70fps. Then, by adding DLSS 3 Frame Generation on top, we increased our performance to a minimum of 105fps, more than double the native frame rate.
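For those who prefer relative numbers, here is a quick back-of-the-envelope of those uplifts. The snippet is purely illustrative Python using the figures quoted above; it is not part of any benchmarking tool.

```python
# Rough uplift math from the RTX 4090 stress-test numbers quoted above.
# Native 4K/Ultra: 42 min / 48 avg. DLSS 3 Quality: ~70fps constant. +Frame Generation: 105fps min.

def uplift(new_fps: float, old_fps: float) -> str:
    """Return the relative gain of new_fps over old_fps as a percentage string."""
    return f"{(new_fps / old_fps - 1) * 100:.0f}%"

native_min, native_avg = 42, 48
dlss_quality = 70        # constant fps with DLSS 3 Super Resolution, Quality mode
dlss_fg_min = 105        # minimum fps with Frame Generation added on top

print("DLSS Quality vs native average:", uplift(dlss_quality, native_avg))      # ~46%
print("DLSS Quality + FG vs native minimum:", uplift(dlss_fg_min, native_min))  # ~150%, i.e. 2.5x
```

In other words, upscaling alone already clears 60fps in this stress test, and Frame Generation takes the minimums to roughly 2.5x native.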
Now the good news here is that, contrary to other Unreal Engine 5 games, Lords of the Fallen is scalable. By simply dropping the settings to “High”, we were able to get over 60fps at Native 4K on the RTX 4090. This clearly proves that the game’s Ultra Settings are incredibly demanding, and you should avoid them if you don’t want to use an upscaler. And I can assure you that most of you won’t even notice the visual differences between Ultra and High.
Hell, this is Lords of the Fallen at Native 4K/Low Settings with Ultra Textures. These could have easily passed as “High” settings in other games.
In fact, Assassin’s Creed Mirage looks on par with it (or even worse). Here’s a screenshot from Mirage at Native 4K/Max Settings.
So, Lords of the Fallen looks absolutely amazing on PC and is scalable. As such, it’s really funny witnessing gamers and influencers putting the game on Ultra and complaining about its performance. This would have been a major issue IF the game could not scale down. That’s not the case here.
Now I’m not saying that there isn’t room for improvement. However, this isn’t an “unoptimized mess” (at least on PC) as some ironically describe it. Let’s also not forget that the game uses Lumen and Nanite. These techniques are really demanding, which explains why most UE5 games perform so horribly on consoles.
Our PC Performance Analysis for Lords of the Fallen will most likely go live tomorrow, so stay tuned for more!

John is the founder and Editor-in-Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best consoles. Still, the PC platform eventually won him over. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC Graphics Cards.”

Using Mirage with Chromatic Aberration to prove a point. Oh my.
It’s not his fault jewbisoft didn’t give players the option to turn CA off. This mindset of using workarounds and mods to fix games really has got to go. We need to hold developers to higher standards, especially since they now cost 70 USD at launch.
Not the point. John knows this game does not look a generation above Hogwarts Legacy (UE4) either that or he is smoking some very heavy chronic.
Instead he used a really poor image of AC Mirage to prove a point, I was expecting better from DSOgaming.
You know the poor image is from the game right? So you admit it looks poor? And ACM looks that bad plus is a bad game. LOTF seems to be a bad game that looks good.
why is Lords of The Fallen a bad game?
Don’t know. Is it good?
Been saying this for years. But was soon as I say no to modding I get jumped like a negro in the wrong bathroom during Jim Crow South. I’m so tired of hearing, “It’s PC & we have modders”. But imagine if Microsoft had it’s way with UWP, we’d never have access to a bin file again. Everything is always about giving the end user less control and it’s really pissing me off. That’s why certain developers/publishers games I don’t even care for. UBISOFT is top of the list.
Yet another day, yet another unoptimized woke trash shovelware.
Sorri that Your life is so miserable
Did I offend you because there are 0.1% sane human beings in the world that are not a 1:1 carbon copy of your glorified, mentally deranged, tr4nny artificial religion?
Bruh you are very salty. Go eat some tree bark or something.
Triggered oversensitive snowflake detected.
You will come out at some point. For your sake i hope it’s sooner rather than later. All that h0m0 rage inside you…
OK groomer.
You clearly don’t know what a groomer is….. btw per your standards your a groomer, the church, your parents, grand parents.
So yeah
Tbh only snow flake is you with your stupid a$s rants lol
OK goy.
All hail the Tr4nny God of Snip Snap.
I’ve seen your type of bigotry first hand. There’s a trans 6 month-old living next door to me and people still refer to xim as Brian instead of using xir preferred name ‘Sprinkles McNugget’. By doing so, they are completely voiding xir many weeks of lived experience.
Shame on you.
Wtf is woke about this game, because it has a black cleric? Bro, touch grass. The “based” guys are turning into bigger fannies than trans activists. It’s a gothic fantasy with basically nothing political in it. Always the victim, always so oppressed, pathetic. If we ever make the ethno-state, fruits like you aren’t getting in.
1. I have a job. I literally work for 6 hours a day every day.
2. Another one of those goyim who still plays the agenda-poisoned video gaymes masquerading as an “entertainment product!” that was secretly designed to brainwash you and everyone to slowly slowly accept and sympathize with the rainbow cult if not turning you into one of them. “Wtf is woke about this game” (((Body Type A/B))) and not Male/Female that is, silly goy.
You sound f*king stupid
“If we ever make the ethno-state, fruits like you aren’t getting in.”
Obviously if “we ever make the ethno-state” it will be full of multi-racial worshippers … 🤡
What the fuck is up with the large amount of effeminate soylent goyim normie NPCs coming to this article? Sounds like the goyim are playing the latest video gaymes and are searching the latest articles about it on Jewgle so this article popped up in front of the goyim and clicked on the article only to be triggered by sane people opposing glorified mental illness.
English if your going to ramble and cry…. maybe stop hitting the meth pipe so hard
I want whatever you’re huffing and puffin
More like Lords of the Fallen is unoptimized trash.
I don’t think having 20+ different graphics settings is helpful in games; it’s way too confusing for players. Many players on Steam are rightfully pissed at this game because it runs so poorly out of the box. The developer had to come out with statements showing which settings needed to be changed.
Digital Foundry advocated for having tons of options, but they’re wrong.
An engine needs to scale automatically to a default 60fps target out of the box. Several games do this very well.
Games that limit options and scale automatically don’t get such backlash.
Basic things are necessary, like :
-Resolution
-Borderless/Windowed/Full Screen
-Vsync ON/OFF
-DLSS/FSR
-Target FPS setting (default set to 60FPS)
(and don’t put junk like motion blur and CA in the game)
https://uploads.disquscdn.com/images/0b69e77b13c6d4870fe58a3b00f8b63102c9f68b3e93fc14ea544b4fe006a51c.png
And people will say “but I can handle all these extra graphics options, I am smart”.
Most people want a frictionless PC experience. Most people use Windows and not a custom built version of Linux Arch. Most people use the Windows GUI, and not Windows Powershell commands.
Just because you know what “trilinear filtering” is, doesn’t mean the average gamer knows what it is. These are games meant for entertainment, the experience is not supposed to be frustrating.
That doesn’t mean PC gaming needs to be like a Nintendo console; a limited set of well-understood options is good. But you can’t overwhelm players with options.
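To make the “scale automatically to a default 60fps target” idea above concrete, here is a minimal, hypothetical sketch of such a loop adjusting render-resolution scale. The names, thresholds, and step sizes are illustrative assumptions, not taken from Unreal Engine or from this game; real engines implement this as dynamic resolution scaling with much more smoothing.

```python
# Hypothetical auto-scaling loop chasing a target frame rate by adjusting
# render-resolution scale. All names and thresholds are illustrative only.

TARGET_FPS = 60
MIN_SCALE, MAX_SCALE = 0.5, 1.0   # render scale relative to output resolution
STEP = 0.05

def adjust_resolution_scale(scale: float, measured_fps: float) -> float:
    """Nudge the render scale down when below target, back up when comfortably above it."""
    if measured_fps < TARGET_FPS - 3:        # below target (with a small tolerance)
        scale -= STEP
    elif measured_fps > TARGET_FPS + 10:     # well above target, reclaim image quality
        scale += STEP
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# Example: a few simulated frame-rate samples
scale = 1.0
for fps in (48, 52, 55, 58, 61, 74):
    scale = adjust_resolution_scale(scale, fps)
    print(f"measured {fps} fps -> render scale {scale:.2f}")
```

A production implementation would react to filtered frame times rather than single-frame fps samples, but the principle is the same: the player picks a target, and the engine trades resolution for frame rate automatically.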
LOL so now PC gamers are becoming more stupid, now they want less options.
Just set it to low, ur pauper hardware can’t run it.
Please don’t confuse PC gamers with that idiot, we do want more options.
If he doesn’t know how to handle options, then he should just use the developer presets.
The same applies to the last settings he claims are unneeded. I actually play with Motion Blur and Film Grain; they’re just more settings. If you don’t like one, disable it; there’s no need to remove it entirely.
You reminded me of an argument I had with another commenter who was insisting that games should start allowing us to change the DLSS mode.
Some settings belong in the config file and some in the game menu.
Consoles exist for a reason. Stupid people expect too much from their 1060s
I’ll advocate spending some time fine-tuning my settings rather than being forced to use presets. You can have a good old “autodetect”, or we can go back to the times when you ran a benchmark for the game to calculate what the best settings were, but removing or reducing the settings is not a solution.
Go play Xbox if you want limited settings, man. You can’t be serious. The PC platform just ain’t for you then. The abundance of settings is there for scalability.
If that is the case, then they should buy a console. Nonetheless, one real issue is people mindlessly setting “ultra” and expecting decent performance. Meanwhile, more gamers are sitting on older hardware, because the current gen are massively overpriced (as were prior gen during mining).
Notice, everybody, that this is already a pattern with all these UE5 games: no matter what, they land at 40-ish fps on a 4090 at 4K Ultra, and we’re talking standard raster here. Where is the room for hybrid ray tracing, not to mention path tracing?
Cyberpunk running 4K Ultra at 60+ WITH PATH TRACING on is somewhat of a tech miracle compared to the dumpster fire UE5 currently is. The visuals aren’t even mind-blowing for that 40fps price. Something is very messed up, the whole dev industry knows it, but it’s too late because everybody already jumped on that ship.
It’s one of two possibilities: either UE5 is the problem and Epic needs to fix the damn engine, or developers aren’t using UE5 properly and Epic needs to work with them to help them use the engine more effectively.
Either way, the ball is in Epic’s court to take action one way or the other, because at present UE5 games are mostly a fail as far as performance goes.
The only people who seem to know how to use Unreal Engine are Epic themselves. It’s kinda always been that way ever since Unreal 4. Look at Splinter Cell Blacklist: that game uses Unreal 2.5 and still looks amazing to this day. Not the best game out there, but it looked and ran great. Apparently the devs had problems with it at launch. What I’m saying is, you have a major point and I don’t see a fix in sight.
The traversal stutters are the damn worst. I swear I don’t remember that being a thing. Now every damn game wants to be an open-world game. I miss hub-world/hallway games. What’s worse is hearing how these people talk about DLSS in a review as if it’s a STAPLE part of the gameplay. I have no words for that. “Just set it to this and DLSS, then the game runs great”… Oh yeah?.. 🤦🏾♂️
This idea of taking an off-the-shelf engine and building your whole game with it is pretty recent.
Usually, game companies took existing engines, like the Source or Quake engine, and used them as a base to build out their own engine. It was just a time saver; the end result was an engine that shared very little code with the original.
You now have tons of developers making 3D games, leaning too heavily on Unreal Engine, many not skilled enough to deal with Unreal’s shortcomings.
Maybe Unreal Engine will be the Photoshop of game design one day, where users don’t need to know anything about Unreal Engine’s code or inner workings, but we’re clearly not there yet.
That’s because companies like Nvidia, Google, Amazon, and Microsoft keep snatching up all the best programmers
Who would you rather work for if you were a top-tier coder? Nvidia, which has an unheard-of 97% satisfaction rate among its employees, or some crappy game developer that pays like crap in comparison, doesn’t respect its own employees, and is run by a business major who probably couldn’t program a universal remote or his home thermostat?
Exactly lumen is fake tracing designed for the consoles which can’t handle it anyway.
Lords of the Fallen uses Ray Tracing however, do your homework before spreading nonsense.
What the hell are you talking about? Cyberpunk is not running 4K Ultra at 60+ with path tracing. A 4090 gets 19 fps at those settings if you compare it to the UE5 games without DLSS. And “path tracing” is a bit misleading, as it’s not like the full game is actually path traced. And outside of the ray tracing and path tracing elements in Cyberpunk, the rest of the game looks very outdated.
Everything you’ve said is wrong.
CP2077 does not run at 4K ultra 60+ with path tracing, not even close. You need to use DLSS and frame generation to get over 60fps. Don’t bullshit people.
2160p dlss balanced all ultra. Runs nice with the 3080 10gb
Devs should add an option to disable Lumen and Nanite.
In Thaumaturge, a game from 11 bit studios that is also on UE5, you can switch from Lumen to screen space and save 20-25fps.
Nanite does not appear to impact the performance by a lot. If you reduce View Distance to High, you’ll get geometry/grass/bushes pop-ins with just 1-2fps boost.
You can also lower Global Illumination to High and get a big performance boost.
There are currently three settings that can significantly improve performance. Global Illumination, Visual Effects and Foliage. Here are some numbers for the AMD RX 7900XTX.
Ultra Settings/1080p: 79/86
Ultra Settings (with Visual Effects, Global Illumination and Foliage at High): 95/105
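To put those numbers in perspective, dropping just those three settings to High is roughly a 20% uplift on the 7900 XTX. The quick calculation below is purely illustrative and assumes each pair above is minimum/average fps, following the convention used earlier in the article.

```python
# Quick uplift math for the RX 7900 XTX numbers above
# (assuming each pair is minimum/average fps, matching the article's earlier convention).

ultra = (79, 86)             # Ultra preset, 1080p
ultra_tweaked = (95, 105)    # Visual Effects, Global Illumination and Foliage at High

for label, before, after in zip(("min", "avg"), ultra, ultra_tweaked):
    print(f"{label}: {before} -> {after} fps (+{(after / before - 1) * 100:.0f}%)")
```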
Nanite should be improving performance, or improving visuals while keeping performance the same. Lumen is the demanding thing, and it’s quite awesome: real-time global illumination via ray tracing.
I don’t think Unreal Engine necessarily sucks per se. It just ends up in the wrong hands.
Unreal Engine allows a wide range of developers with vastly different technical competence to develop 3D games.
A lot of developers are using Unreal Engine because they do not have the technical know-how to develop their own engine. These developers need a lot of handholding and they can not fix many of these shader compile and DX12 issues themselves.
Well the art design doesn’t look that great, so ultimately, who cares? Just another jumbled mess of trees and crap that people seem to think is proper gothic artwork for some sad reason.
I know… It looks like it’s a test of grayscale performance. It’s all blacks, whites and grays.
The game takes place in two worlds, the Umbral world is desaturated like that. There’s a lot of variety in locations and colour palettes throughout the game.
I get a locked 70 fps (75Hz G-Sync monitor) at 1440p Ultra on a 3080 Ti with DLSS Quality. The game looks great imo, but there’s this weird slowdown for a couple of seconds every time I enter a boss fight, pretty annoying.
I’m digging it so far, but there are a few weird design choices. Why is the roll so insanely long? Also, multiplayer is a disgrace and should have been delayed; it’s not fit for release in this state. The netcode is entirely one-sided, meaning everything is calculated on the host’s side.
Not only does this make for some of the most consistently horrendous desync I’ve seen, it also means, for example, that i-frames are on the host’s side, so if you don’t roll it on their screen, you take the damage. Good luck parrying bosses or even just avoiding their attacks. The same logic applies to everything online, co-op and PvP; it’s a mess.
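For anyone unfamiliar with what “everything is calculated on the host’s side” implies, here is a deliberately simplified, hypothetical sketch of host-authoritative hit resolution. It is not the game’s actual netcode; it only illustrates why a dodge that already happened on the client’s screen can still be ignored if the host has not seen it yet.

```python
# Simplified illustration of host-authoritative hit resolution (not actual game code).
# The host resolves damage using ITS view of each player's state, so a dodge the client
# performed locally but that hasn't reached the host yet does not grant i-frames.

from dataclasses import dataclass

@dataclass
class PlayerState:
    invulnerable: bool   # i-frames, as seen by the HOST
    hp: int

def host_resolve_hit(host_view: PlayerState, damage: int) -> None:
    """The host applies damage based on its own (possibly stale) view of the player."""
    if not host_view.invulnerable:
        host_view.hp -= damage

# The client rolled 80 ms ago, but due to latency the host still sees them as vulnerable:
host_view_of_client = PlayerState(invulnerable=False, hp=100)   # stale state on the host
host_resolve_hit(host_view_of_client, damage=30)
print(host_view_of_client.hp)   # 70 -- dodged on the client's screen, still took the hit
```

Client-side prediction, lag compensation, or rollback would each trade this problem for different ones (cheating risk, rewind complexity), which is part of why host authority still shows up in co-op games.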
Try high… Not ultra. You’ll probably get a significant FPS boost for minimal loss in image quality.
Unreal Engine doesn’t necessarily suck per se. It just ends up in the wrong hands.
Sometimes you have teams like Iron Galaxy that shouldn’t even work in the gaming space.
Devs should just stop using Unreal 5 trash until the makers of it learn to optimize it.
Every goddamn game that releases with it has horrible performance.
Thing is, devs are the ones who’re supposed to optimise it; the engine is a toolset, and how one uses it depends on the dev. You may have a stutter fest here, and then you’ll have a game like Gears of War where the devs took time to optimise their game. Hell, look at Lies of P even. I know they’re UE4 games, but still.
UE5 is a toolset program. It has cutting-edge technology to create high-end graphics. There’s nothing wrong with the engine, and Lords of the Fallen does not have horrible performance. It’s up to the devs to try and optimize the game. It’s highly demanding technology.
“Highly demanding” to the point that not even high-end GPUs can run it properly.
No, that’s just a poorly developed engine.
Lol. It’s not the engine peon. It’s the people using it, and the fact that the tech it uses is next gen. When do you think the 4090 was designed??? About 3 years ago.
You’ve got no idea.
You spout stupid nonsense with every reply.
I’ll leave you to it~
Oooooh… good one. Really got me there.
Got nothing in response, so responds with stupidity.
I can see both of your points. Where Kittenko is coming from is: what is the point of developing using UE5 if £1600 GPUs can’t run it well?
I’m sure that in the next 5 years we may see mainstream, affordable cards able to run it, but the game is not releasing 5 years from now, is it?
It seems like his point is that the Engine itself is unoptimised and poorly made, and therefore games built in it run poorly.
What he’s failing to understand is that it’s basically UE4 with extra features like Lumen, etcetera.
Games built in UE4 don’t run poorly unless they’re developed poorly; it’s the cutting-edge tech being used in UE5 that is highly demanding.
It’s up to the devs to optimize the game. Since UE5 is still new, a lot of devs are still learning it.
Huh? You know it’s up to the devs to optimize the game… The makers of the engine can optimize the engine, but if the game devs don’t optimize the game, there’s nothing Epic can do.
Why is it tested using Windows 10? Why not Windows 7? Or XP?
Because those are long obsolete, and this game won’t even run on them as it needs DX12..?
Don’t see any problems with the performance.
Getting a solid 60fps at native 4k Ultra with a 3080ti.
Seeing all these new games barely running on top-of-the-line video cards is so damn weird. Back in the day, with a new high-end card you would be pumping all your graphics settings carelessly to Ultra and getting huge frame rates. Now… you get to experience max settings at 40fps and realize you wasted your money on an instantly outdated video card.
This is why one does not “upgrade” to a 4K monitor.
What the f*k are you talking about? Games have always been unoptimized garbage that couldn’t even run well using TWO of the best GPUs in existence, or even the ones after that. It was always like this.