2K Games has just released Borderlands 4 on PC. Powered by Unreal Engine 5, the game takes advantage of Lumen and Nanite. And, since it supports DLSS 4 with Multi-Frame Gen, we’ve decided to test it first before doing our full PC performance review.
For our DLSS 4 benchmarks, I used an AMD Ryzen 9 7950X3D, 32GB of DDR5 at 6000MHz, and NVIDIA's RTX 5090. I also used Windows 10 64-bit and the GeForce 581.29 driver.
Borderlands 4 does not have a built-in benchmark tool. This came as a surprise since Borderlands 3 did have one. So, I don’t know why Gearbox did not include one.
For our benchmarks, I used this area, which is where Arjay asks you to find his stash. It appeared to be among the most demanding areas early in the game.
At Native 4K/Badass Settings, Borderlands 4 runs at 40-43FPS on the NVIDIA RTX 5090. Compared to Cronos: The New Dawn, Borderlands 4 actually runs better. I'm not saying it runs great; just that we've seen other UE5 games that run worse at Native 4K.
With DLSS 4 Quality, the NVIDIA RTX 5090 can provide a constant 60FPS at 4K with Badass Settings. This is actually great for a game that uses Lumen and Nanite. As I've said countless times, Lumen is a form of Ray Tracing. So, no, an NVIDIA RTX 5090 is not able to run most modern games that use Ray Tracing at Native 4K/Max Settings with 60FPS.
Could developers skip Lumen and go back to baking the lighting? Yes, they could. Would that make the game run better? Also yes. But here’s the thing. Baking the lighting on such big environments would take much more development time. That’s why many devs choose to use UE5’s Lumen. Lumen saves time, and it also makes dynamic lighting look more consistent. You may not like it, but that’s the harsh truth.
With DLSS 4 and Multi-Frame Gen X2, we were able to get framerates between 110 and 130FPS. With MFG X3, we got up to 150FPS. And then, with MFG X4, we got up to 200FPS.
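As a rough sanity check, the Multi-Frame Gen numbers above can be compared against their ideal multiples. This is just a sketch assuming a ~60FPS rendered base (the DLSS 4 Quality result); the "efficiency" ratio is our own illustration, not an official NVIDIA metric.

```python
# Rough MFG scaling math for the framerates reported above.
# Assumption: ~60 FPS rendered base at 4K DLSS Quality before frame generation.

BASE_FPS = 60  # rendered frames per second before frame generation

def mfg_efficiency(observed_fps: float, multiplier: int, base_fps: float = BASE_FPS) -> float:
    """Ratio of observed output FPS to the ideal base_fps * multiplier."""
    ideal = base_fps * multiplier
    return observed_fps / ideal

# Observed figures taken from the article (X2 midpoint used for 110-130FPS).
for mult, observed in [(2, 120), (3, 150), (4, 200)]:
    print(f"MFG X{mult}: observed ~{observed} FPS, "
          f"ideal {BASE_FPS * mult} FPS, "
          f"efficiency {mfg_efficiency(observed, mult):.0%}")
```

In other words, X2 lands almost exactly on its ideal doubling, while X3 and X4 fall noticeably short of a clean 180 and 240FPS, which is typical frame-generation overhead rather than anything specific to this game.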
But what about Native 4K? What settings do you need to get 60FPS on the NVIDIA RTX 5090? By dropping the settings to High, we were able to get framerates over 60FPS at all times. Below, you can find a comparison shot between the Badass and High settings.
If you look closely, you’ll notice that High Settings have less grass and slightly worse AO. Other than that, the graphics look almost the same. This is a good reminder to adjust your settings if you want to game at native resolutions. You can nearly double your performance without losing much visual quality.
From these initial tests, it appears that Borderlands 4 can run better than some other UE5 games. It’s not THE most optimized UE5 game. At the same time, it’s not among the worst we’ve seen so far.
But do its visuals justify the high GPU requirements? That’s up to you to decide. The cel-shading filter doesn’t do the game any favors either. Yes, it gives it that ‘Borderlands’ look. And if you compare Borderlands 4 to Borderlands 3, you’ll see that B4 looks much better. Still, I don’t see anything that feels truly next-gen here. It mostly comes down to how demanding Lumen, Virtual Shadows, Nanite, and the other UE5 features are. However, the final result on screen won’t make your jaw drop to the floor.
Our PC Performance Analysis for B4 will go live later this week. So, stay tuned for more!

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers SNES to be one of the best consoles. Still, the PC platform won him over consoles. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, Voodoo 2. John has also written a higher degree thesis on "The Evolution of PC graphics cards."




UE5 = Sh**t
The problem is less UE and more automatic realtime lighting. RT/software, both garbage where performance cost and frametime are destroyed to save dev time and make the CEOs more money. UE just gave the dumbest people on Earth the ability to make the cheapest, quickest game possible. It's also why games are buggy pieces of garbage. Playtesting was often done while all the baked stuff was being finished. So now you get broken, stuttering games made by the most incompetent, cheapest labor possible.
I think when UE5 is used well with baked lighting, it's still gonna look good and perform better. Tekken 8 is an example. Bloober made the SH2 remake, and it's an absolute trainwreck on PC to this day. Just wait for Silent Hill f, which is made by Neobards and, from the looks of it, performs far better on PC than any Bloober title.
Pity they don't do benchmarks on the average user's rig, as this gives us no reference for how the game runs unless you have the ultimate system.
Our PC Performance Analysis, which will include numerous AMD and NVIDIA GPUs, will go live later this week.
Nanite is well handled by the GPU, but it was obvious since its announcement that Lumen would give problems. Developers also push it too much to the limits without delivering premium graphics at those higher limits. You eventually wonder if you are missing too much going from Epic down to High settings, and maybe you're not, because developers don't really work on making Epic settings stand out; they leave it to the engine.
In any case, there's still the traversal stuttering, a legacy of Unreal Engine 3…
Many are lazy, time-constrained or simply incompetent. Some typical sh*t – high-res textures on objects that show maybe 5 pixels worth of screen space once rendered, no sense with geometry, etc., "leaks" plugging things like culling, etc. And that's before even starting with the coding habits of overusing copy/paste classes, especially poly ones and the like, all of which make sure the hardware can't run it efficiently as things like branch prediction get thrown out the window… helped by DRM like Denuvo, which makes sure things like cache bloat get exponentially worse. And bloated, easy-to-dev-for API number 100 on top of the others, yet they wonder why it runs like dogshit vs purpose-built code like back in the day… how strange that you need a supercomputer for a hello world msgbox.
When you have screen space reflections, even together with global real-time lighting, that's not real ray tracing, which doesn't need any SSR… I'm not a developer; I guess SSR is less expensive in terms of computing time, but based on my experience with recent games, I would say it's the opposite! When all reflections are handed over to the ray-traced engine, they work more efficiently… that's the point of Lumen, which is lighting without baked reflections, if SSR can be considered baked… but I may be wrong in my assumptions.
On the other hand, I see no path tracing in the game to justify a moderate framerate on a 5090 with upscalers on.
As a game, I like it very much though. I love it when shooting is difficult and you are not equipped with too-powerful weapons; it increases the thrill of the game.
I watch this guy's videos. He usually does that in his videos. This is just an example of his channel. This one does not cover what you asked for, though.
https://www.youtube.com/watch?v=s7i_1PSPmlA
Daniel is great, but for optimization and how the game is gonna run, check out BenchmarKing. He basically does what Digital Foundry does, testing every setting, but better. He tests on a 3060 Ti and a 4070 Super so that the majority of gamers know how the game is gonna perform. For reviews, check out ACG. About the only decent reviewer out there.
I usually watch BenchmarKing for optimized settings on average PCs. Fantastic quality videos; I found him just this year.
"the game takes advantage of Lumen and Nanite" so that the CEO could give himself a bigger bonus, with a quicker development time where our PCs are used as special effects studios doing pointless real-time lighting that makes the games run like garbage. This was 9 years ago on a 1080Ti with baked lighting. RT is not for us, Lumen is not for us. It is for Triple-A publishing, accelerated GPU upgrades, and ripping off the customer. People like Alex from Digital Foundry are not our allies and closest friends. They are unethical pieces of sh@^ selling you a fairy tale to make their pals at garbage Triple-A publishers and Nvidia/AMD more money. https://uploads.disquscdn.com/images/972addfb41439cba8460203b02a1519bcba7fb6e072ec333fb596c06fc6aae67.png https://uploads.disquscdn.com/images/bd28689b620a5c9dbd78453305d70a0e4d105a6a650cdbb7ce2ce6d25fca432f.png https://uploads.disquscdn.com/images/243724f89b4e83d3e0f20423e59becae5b92f428e08be8708d0a37272fc024c9.png https://uploads.disquscdn.com/images/e610710f81364eb2889e50be4c9c449b385a9b0f52c4962e7d4f620bd6bf7079.png
Frostbite never used baked lightmaps. Battlefront very likely used realtime radiosity by Enlighten. It's a lot more work to set up though.
Imagine Unreal 5 in the FIFA series… with traversal stuttering when you cross the midfield line.
agreed
I quadruple THIS! This is what I'm saying always.
I guess you got your 5090 back.
Why? How? I don't even understand. It being open world doesn't explain how it runs this poorly when the game doesn't look significantly better than the first game in the series, which is 15 years old.
I think BL3 looked better than this.
From my own quick test runs on Linux I can say that this game certainly runs worse than the recent Hell Is Us, which is even playable on the Steam Deck with low settings.
Of course it runs quite well on my workstation with a Linux 6.18 release candidate using the scx_bpfland CPU scheduler, but that is a high-end system.
Reporting suggests the Switch 2 version isn't running all that great either, so performance optimizations probably weren't a big focus for Dandy Randy, unlike his hairstyling…
54 FPS in 1080p on low settings on an RTX 4060.
It looks like a mobile game with PS2 graphics, and it can't do 60fps.
Can we ban Unreal Engine? This engine is complete garbage.
https://uploads.disquscdn.com/images/2d38d839066b8ce3fdf3835adb1e4be5cca56cfb7c108ff00e4a1dfe2286c743.jpg
Grindy games are for losers, they all look the same and they're soulless
What a disgusting game
It's grindy if you choose to make it so.
Average review score is sitting at 52%.
Game is getting slammed by performance issues.
Unreal Engine sucks so bad.
This "lumen" and "raytracing" idiocy has got to stop.
I agree.
On top of UE5 slop, it has TWO forms of DRM running at all times.
Hey, it saves the devs cost to not have to do things like properly baking lights, etc… all while pushing the cost onto the gamers, who need insane hardware to do it in real time instead. Win for them… lose-lose for the gamers… oh, and they are nice enough to push up the price of said games too – i.e., a triple loss for the gamers and a win for the lazy a*s corps.
Daniel Owen on YouTube is getting 60fps with a Ryzen 5600X and an RTX 3060 12GB on Medium settings with DLSS Quality.
This thing doesn't even have any sharpness to the textures or a setting to adjust sharpness, yet the requirements are sky high.
You haven't touched a PS2 game in a long while.
*touches silver PS2*… hmm, PS2.
That horrible character is definitely below PS3 level graphics. So yeah, PS2 is probably about right. PS3 had much more detailed character models.
PS3 had just 256MB of VRAM. Today's AAA games are completely horrendous, they are pure unoptimized slop.
Borderlands 4 vs PS3 (Beyond: Two Souls):
https://uploads.disquscdn.com/images/7866c7ece66e94c748ec171a8d1b96734da6e780a8ba60a45eedf1919c429b98.png
Nowadays, FG has become an excellent excuse for a lack of optimization.
Performance is aligned with Cronos from Bloober… it's time for Multi-Frame Gen, I guess.
Great 👍🏻👏🏻 that you reported on one settings level below the maximum because, as usual, and just as you found out, it's pretty much the same graphics but without the unoptimized BS of a massive performance penalty.
zWormz tested this heap of dogshit on a 5090 at 1440p and it couldn't maintain 60 FPS.
2 forms of DRM + UE5 + trash devs = Borderlands 4
Doom Dark Ages has fully dynamic global illumination and doesn't run like sh*t. Stop making excuses for this.
Doom: The Dark Ages? You mean the game that everyone was bashing for its performance? Look at the comments in that article, lol. Today's gamers are whiny kids because, yes, a lot were calling Doom: The Dark Ages an unoptimized sh*t when it was released. This further proves that most don't even know what they are talking about. All they do is scream like kids so that they can get someone's attention. Screaming "UE5 destroys games" does not make you look like an expert. It just makes you look like a complete clueless idiot. Oh, and this doesn't mean that B4 runs great BTW.
https://www.dsogaming.com/pc-performance-analyses/doom-the-dark-ages-benchmarks-pc-performance-analysis/
Unreal 5 does not create bad performance; too-short development deadlines do.
Dark Ages runs at least at twice the frame rate than Borderlands 4 and looks a lot better too, while still having realtime GI. So yes, I am making that comparison.
The sad part is that you could have almost achieved the same visuals with baked lighting that would have run on 10-series cards with much higher internal res and base fps. Technically, games are better, but visually they look no better than 10 years ago.
Borderlands 4 is more demanding than Cronos if we compare results with software Lumen only. GameGPU's RTX 5090 results for both games: 48fps average in Borderlands 4 and 71fps in Cronos.
On my RTX 4080S, Unreal Engine 5 games run at around 45fps with Epic settings and 70fps with DLSS Quality. Borderlands 4 needs an RTX 5090 to get a similar framerate, and that card is almost twice as fast. I don't know why John defends Borderlands 4's performance so much when this game is obviously much more demanding even compared to other UE5 games. Based on what I saw, I will need to run Borderlands at 1440p to get results comparable to 4K in other UE5 games.
Also, I'm not buying the argument that "Lumen is a form of RT and that's why the game must be very demanding." Lumen should be cheap dynamic GI compared to RT GI; that's the whole point of it. Yet in Borderlands 4 we are seeing results comparable to PT games and much worse than games with RT GI.
I don't see any defense here.
"Still, I don’t see anything that feels truly next-gen here. It mostly comes down to how demanding Lumen, Virtual Shadows, Nanite, and the other UE5 features are. However, the final result on screen won’t make your jaw drop to the floor."
That's graphics fidelity, but I was thinking about performance. You made a performance comparison with Cronos and argued that people shouldn't expect 4K 60fps from UE5 games on the RTX 5090.
Anyway, I tried out Borderlands 4 for myself, and based on my initial impression, I might start defending this game :P. When the game loaded, I saw 45-55 fps at 4K DLSSQ and default settings (a mixture of medium/high/max), which was what I had expected. I thought that maybe I would need to lower the settings even more, or run 1440p (my monitor has an excellent 24-inch 1440p mode), but suddenly the framerate improved DRASTICALLY and jumped from 50fps to 94fps with exactly the same settings (4K DLSSQ), and 151fps with FGx2 on top of that. It seems the game was initially loading something in the background, which is why the performance was much worse. I will test higher settings later on and see how the game performs in open-world sections. For now, though, I'm pleasantly surprised at how well this game runs.
4K DLSS-Quality, default settings.
https://uploads.disquscdn.com/images/40da3b87f84f2d1b3ff805a2a9f82079d3b4845aeb30f32018077ea80b58457c.jpg
4K DLSS-Quality + FGx2, default settings
https://uploads.disquscdn.com/images/0bdd40cc140093ab57e69e4b8409e304d50b5d4cde3f3f4627093a40250b1a6e.jpg
Nanite is not too heavy, but it's also not the revolution they claimed it would be.
Lumen is a disaster.
I've played around 2 hours on my 7700X and 9070 XT on the High settings preset with FSR4-B and FG to get over 160fps on a 1440p UW. It's a sin that games are still rendering 30fps-locked cutscenes in big '25. I've had some drops, but it's been pretty playable for me, and friends who are on a tad weaker hardware have had more frame issues than me: one on an 11700K and a 3070, and the other on a 12700K and a 7700XT.
Guys, the era of truly beautiful games is pretty much over, with only a few exceptions. Ever since FSR, DLSS and their companions took over, it’s all about performance optimization now – and that’s a pretty sad reality in the age of DirectX 12.
4k screens came out … 25 years ago…. just a reminder….for folks defending 1080p on $2,000 GPU.
5090 is clearly only capable of 1080p in todays "aaa" games … LOL.
Sloppy Engine 5 – where lazy devs can ignore things like properly baking lights, etc., by pushing the cost to the gamer, who needs insane hardware compared to what they get on screen. Sad, but true. Yet some defend that lazy a*s behaviour!
I wonder how many sales they lose by not doing such basics, which would allow the game to run on a far wider audience's hardware.