Last week, 2K Games released Borderlands 4, the latest entry in the series. Powered by Unreal Engine 5, it’s time to benchmark the game and examine its performance on PC.
For our benchmarks, I used an AMD Ryzen 9 7950X3D with 32GB of DDR5 at 6000MHz, AMD’s Radeon RX 6900XT, RX 7900XTX, and RX 9070XT, as well as NVIDIA’s RTX 2080Ti, RTX 3080, RTX 4090, RTX 5080, and RTX 5090. I also used Windows 10 64-bit, the GeForce 581.29 driver, and the Radeon Adrenalin Edition 25.9.1 driver.
Gearbox has included a few graphics settings to tweak. PC gamers can adjust the quality of Textures, Shadows, Global Illumination, and more. The game also supports NVIDIA DLSS 4 with MFG, as well as AMD FSR 4.0 and Intel XeSS 2.0.
Borderlands 4 does not have a built-in benchmark tool. So, for our tests, I used this open-world area. This area is more demanding than the one I used for our DLSS 4 benchmarks. As such, it should give us a better idea of how the rest of the game runs.
Borderlands 4 is easily the most demanding non-path-traced game I’ve tested. At 1080p/Badass Settings, the only GPUs that can offer a 60FPS experience are the NVIDIA RTX 4090 and the RTX 5090. The RTX 5080 came close to it, though it could frequently drop to 57FPS. As for AMD, the only GPU that is able to provide a smooth gaming experience, provided you use a FreeSync monitor, is the RX 7900XTX.
At 1440p/Badass Settings, the only GPU that can push framerates over 60FPS is the NVIDIA RTX 5090. Yep, at Native 1440p, the NVIDIA RTX 4090 is unable to provide a 60FPS experience. Crazy, right?
As for Native 4K with Badass Settings, there is no GPU that can offer a smooth gaming experience. In this area, the NVIDIA RTX 5090 pushes a minimum of 35FPS and an average of 40FPS. Ouch.
But what about the in-game settings? Can the game scale well when lowering them? Sadly, it cannot. At Native 4K, you need to lower your settings to Medium to get a 60FPS experience on the NVIDIA RTX 5090. Medium Settings. On the NVIDIA RTX 5090. In a game that doesn’t use Path Tracing.
The problem with Borderlands 4 is that its visuals do not justify these huge GPU requirements. Put Borderlands 3 and 4 side by side and you’ll see that B4 looks much better than B3. I’m not saying that the game looks bad. However, these performance numbers are abysmal for the visuals we get on screen. Borderlands 4 uses Lumen, Nanite, and Virtual Shadow Maps, yet its performance is nowhere close to what we’ve been getting in other UE5 games. You could also say that B4 is not as impressive as Hellblade 2 or WUCHANG: Fallen Feathers. Hell, the game looks worse than, and performs almost like, the path-traced versions of Black Myth: Wukong and Alan Wake 2.
And then we have some weird visual issues. Take a look at the following two screenshots. Where the hell is the sun in the left image? No seriously. You can clearly see the sun’s reflection on the water. But if you look up, you won’t find any sun. That image is so wrong in so many ways. And now look at the image on the right. Look at those atrocious low-quality terrain textures.
Not only that, but Borderlands 4 suffers from MAJOR grass pop-in issues. The game uses UE5.5.4.0, which explains them: Epic only added Nanite support for vegetation in UE5.6, so every game on an older version will have these pop-in issues with plants and grass. I also noticed some weird shadow issues here and there. The game has some stutters too, though they are not as awful as those we’ve seen in Cronos: The New Dawn (with Ray Tracing), RoboCop: Rogue City – Unfinished Business, or Oblivion Remastered.
All in all, Borderlands 4 is a mess. The game does not justify its enormous GPU requirements with the visuals it displays on screen. Right now, B4 runs like a game using Path Tracing, even though it only uses Lumen. It should run WAY BETTER than it does. And, in case you’re wondering, the game also performs horribly on consoles, so this isn’t a PC-only issue. Borderlands 4 is a mess on all platforms!

John is the founder and Editor-in-Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best consoles. Still, the PC platform won him over, mainly thanks to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC Graphics Cards.”
Contact: Email
so john, UE5 is garbage? we agreed now?
Unreal Engine 5 is one of the best engines and has some of the best tech features. If you can't differentiate an engine from a game, you are an idiot. Or one of those clueless grifters who are jumping on whatever "hate bandwagon" they can find to look "cool".
Man, you've been getting real uppity lately whenever someone makes a negative comment about UE5. Is Epic paying you to get this butthurt or something?
Just commenting since those who post such idiotic comments are so “passionate” about it, they’ll get angry and post even more, which only increases traffic. In their minds, they are justice keyboard warriors. In reality, they’re just boosting the article’s traffic (even if they use Ad Blockers).
That's exactly what you've been doing John. Have a little self-awareness, man.
He is only doing it for the money, being a shameless shill is his gig.
if a game uses Unreal Engine 5, it won't run well on my PC. I don't make enough money to buy a 5090 like you. Most people don't; this industry is making games for the 1%, in both literal and metaphorical ways. I have yet to see something made in Unreal Engine 5 that couldn't have been made in 4. They just use it for the sake of using it.
you ask this question when 1/3 of the articles are "wow look at this recreation in Unreal Engine" slop.
I don't think they look that good. The point of an engine is for games to look and play well, have good features, and run well, not to make Epic Games rich or to brag that you're using the latest technology. There's a reason people make indie games on old engines: they are tried and tested.
"New features mean nothing when no one can use them" Speaking of which abiotic factor uses unreal engine 5 but they turned down every setting and made the game look like half life so the game will run, it still has traversal stuttering at the obvious areas that it loads the new zone.
Lmfao
The Borderlands series has always had poor optimization, whether on UE3, UE4, or UE5.
Borderlands 3 was very demanding at maximum (Badass) settings, especially at 4K resolution, but the game was scalable: even my 2012 PC (i7 3770K + GTX680) was enough to run it with a better framerate and settings than the PS4 version. With a GTX1080, the game ran locked at 60fps at 1440p and high settings. Now, on a 4080S, I get 260-280fps at 1440p.
How is an engine whose traversal stuttering hasn't been fixed for 10 years still one of the best engines? Are you high?
Games using the Unity Engine also had traversal stutters. And other games using other engines had traversal stutters. These aren't exclusive to UE5. But we can agree that they need to fix them. That's why I'm always mentioning them in our analyses.
Cyberpunk 2077 has the best engine, but it's proprietary. The graphics actually warrant the fps in that game, unlike UE5.
Let's not forget how broken CP2077 was at launch. Also, CDPR switched to UE5. RED Engine was hacked together for CP 2077. However, they are talented devs that made it work. They should be able to work some magic for The Witcher 4 with UE5. The tech demo they did was impressive and ran great on a base PS5.
There are plenty of UE5 games that run well too, but people love to focus on the negative. UE5 is not the problem. Bad development practices are the issue.
They are also going to be working on one of the special builds of UE5 that gets additional attention from Epic due to the scope of the project.
Epic partnered with CDPR to help improve Unreal Engine back in 2022. So, they have been working very closely for a long time. These new features and optimizations get put into the general versions released to all developers.
There is only so much optimization that can be done at the engine level. At the end of the day, UE is just a tool. Each developer is responsible for the scope of their game and optimizing to hit performance targets. So, the blame lies on the developer or publisher who are rushing games out too early. As I said, bad practices are the problem.
And what? Unity and Unreal are the only engines on the market? That's the point we're making, there are fundamental issues with Unreal Engine such as shader compilation stutter & traversal stutter that have literally been a major issue for 10 years now. Devs select UE because it's the cheapest way to get access to an engine with very modern technology, not because it's genuinely the best product in existence.
at some point you have to admit the engine is the problem. We are at Unreal Engine 5.5 now and it still looks like sht and runs like sht. If it were one or two games that had this issue, you would have a point, but what are the odds that every game using it runs like crap? Do all the devs suck?
I've played every game I've benchmarked. Right now, the most visually impressive games are using UE5. In terms of graphics, you have to be an idiot to say that UE5 games do not look amazing.
Here is also an example of what REALLY small devs can achieve. Compare Tainted Grail: The Fall of Avalon or Blades of Fire with Echoes of the End. Visually, Echoes of the End looks a generation above these two games.
The one thing we can all agree on is that devs will have to optimize their code to avoid traversal stutters. Then again, traversal stutters have happened in other engines, too. So, this isn't a UE5-only issue.
I have played over twenty UE5 games and experienced real problems with only two of them (SH2 Remake and TES4 Oblivion Remake), so the ratio is positive.
UE5 games aren't perfect, because they usually have problems with traversal stuttering, but this only occurs occasionally in most games, so it doesn't affect the overall experience. There are also UE5 games where I haven't seen any stuttering at all.
Thanks to UE5, game graphics have reached a new level, resembling CGI (Hellblade 2, for example). Such insanely high graphics fidelity has its cost, but if you own a capable PC, performance is very good at 1440p (even at native resolution), and if someone wants a high-refresh-rate experience at 4K, DLSS can help with that.
https://uploads.disquscdn.com/images/f42495b26fdc03e0dc2153d96d71d2e7122de7b70ce57509c271d335f0ddcc82.jpg
https://uploads.disquscdn.com/images/3d9ac790aa6e2a302072b36ab06d1965fe101b115a8fac247f6a40d5e8389bf8.jpg
https://uploads.disquscdn.com/images/b501df0b1635e050242868158fba0707dd947d6cfc7ccf232fbe3ba0fe0dcf6b.jpg
https://uploads.disquscdn.com/images/16d02684ccb91de44646a52aede265d822f849e04778bd948ff0384ff0d0ac2c.jpg
https://uploads.disquscdn.com/images/7b800a3e2ab0d1298193d10c5eb307e266fe3cad0353fb4eddfeaeebeeeaed70.jpg
https://uploads.disquscdn.com/images/a53e0dc343570084162dee1e68c9285f2f0b1e11f0d3967ff49d017d9a323037.jpg
Even the ground surface is extremely detailed. I haven't seen such detail in other games.
Thanks to UE5, even smaller AA studios have made amazing-looking games like RoboCop.
4K DLAA + FGx2
https://uploads.disquscdn.com/images/05d5378fb5806748ba6487ad218b14d76718386af0622834ceefebe01803eb4b.jpg
4K DLSS Ultra Quality + FGx2
https://uploads.disquscdn.com/images/b251a8adad9fbbcb79183c687b220197bcc9a454b729b766077d4bcb3628d7c8.jpg
https://uploads.disquscdn.com/images/f84cfbb39fe13fdcec5776ca40267749fdd730ef5a00eac3fe1dcabc9fac23bf.jpg
https://uploads.disquscdn.com/images/b1cd7e477368704468883e969494ac16520485d3ba24a4440232bdd82d1989ea.jpg
https://uploads.disquscdn.com/images/6f4adefb93af23a0f8f0f397289f9bfbe2d5351103b2fa2982920f97cffebf36.jpg
1440p DLAA + FGx2
https://uploads.disquscdn.com/images/a29d702091abbc20a3b8d87d69a8d454e5fb7d18bce4bb3943c412dd72fc4f1c.jpg
https://uploads.disquscdn.com/images/f2b5402cf80287aadbca981679560ab71dd5fd194bee181b3394ceb399357ee8.jpg
https://uploads.disquscdn.com/images/aa4be010a92e480948be56067f9f53ac1898c6a9b71ee888981366fc4d15ec26.jpg
There is NOTHING special or amazing about UE5.
This is A Plague Tale Innocence, Asobo Engine.
2018, without raytracing:
https://uploads.disquscdn.com/images/e2dd2cb83ffb4a2bb29b1a005989f6883a6d477471a476c95173d5ab751c4175.jpg
I have a 4080 Super and hate UE5. That engine has been a stuttering mess since at least UE3.
I mentioned stuttering, but I only found it annoying in two of the UE5 games I played. In the vast majority of them, stuttering only occurred occasionally during traversal and didn't distract from the overall experience. There are also UE5 games that don't stutter at all (as confirmed by Digital Foundry). If you ignore all the good examples and focus only on the worst-case scenario, then sure, UE5 can be considered a mess.
Personally, I don't hate UE5. I'm an artist and I can see that UE5 games render far more detail than games on custom engines. Insanely high graphics fidelity has its cost, but these UE5 games usually run on my PC at 70-80fps at 4K DLSSQ and around 120-140fps with FGx2, so I'm satisfied with performance and image quality. Brief stutters in some UE5 games don't ruin my experience. If a brief stutter every 15 minutes ruins yours, you should play old games from the 90s or early 2000s. Back then, games had loading screens between levels that lasted 15-30 seconds, which was much more annoying than a brief stutter while loading a new area.
There is nothing special or amazing about UE5.
This is A Plague Tale Innocence, from 2018, running on Asobo Engine.
No raytracing, it runs at 60FPS on a 1050 Ti, a 2GB GPU from 10 years ago. It runs at 300+ FPS on a 5090. Custom engines can be 3x-4x more performant than UE5:
https://uploads.disquscdn.com/images/8da4c8ab83a50b6417564d9c18284ffb622f39f24ce0f9e7318ebcdc9d347cab.png https://uploads.disquscdn.com/images/b7331a64906e12ee11e9cbafaac72929f29687e76532687fdaa7c539c5c489ff.png
There is nothing special or amazing about UE5? Dude, even the ground surface in Hellblade 2 is made up of individual small rocks, whereas in Plague Tale you only have a flat texture. Rock formations in Plague Tale: Innocence have limited geometry, so they don't even look round.
Plague Tale: Innocence has beautiful scenery, and its overall aesthetic is pleasing to the eye. However, the quality of the assets looks dated by today's standards, as does the prebaked lighting. Plague Tale also doesn't use dynamic GI, so dynamic objects aren't grounded in the scene.
Take a closer look at the assets and you will see low-poly models and low-quality textures everywhere. That's the reason even old GPUs can run Plague Tale: Innocence. If you render 10x less detail and prebake the whole lighting, the game will run much faster. There's nothing surprising about that, Sherlock.
Try rendering the same number of polygons as Hellblade 2, with dynamic global illumination, on a custom engine, and then prove that it can render all of that 3–4 times faster. Then we'll talk. Until then, it's just a pipe dream.
Yes, the RoboCop games are beautiful, but as soon as there are a lot of effects on screen during bigger battles (ED-209, UED robots, Cryo gun, etc.), performance becomes hellish. Without Framegen, I get about 30-40fps at 4K DLSS Balanced on a 4080 Super.
I played RoboCop: Rogue City a couple of times. On my previous 2560x1440 170Hz monitor, I played it with a mixture of high and epic settings at 1440p DLAA (native) + FGx2, which helped a lot with aiming and improved smoothness. I also locked my framerate with RTSS (using the Reflex fps limiter instead of async) to 160fps in order to achieve the lowest possible latency (that's why my 1440p screenshots show 160fps). At this resolution, 160fps was maintained 99% of the time and, at worst, the frame rate could drop to 140fps during the most intense fights. Stutters weren't a problem either. The game ran like a dream with these settings.
Now, however, I use a 4K monitor. Performance is still very good 99% of the time (120-140fps at 4K DLSSQ + FGx2), but there's a nasty memory leak in one level (the steel mill with the bikers and the ED-209 boss later on). If the memory leak is triggered, performance keeps degrading from 140fps to as low as 30fps (15fps without FG) if I continue playing and do nothing about it. The fix is simple, though: I go back to the main menu and load the last save, and I can then complete that level at the usual 120-140fps. This memory leak happens only at 4K and only in this one level, so overall I had a blast playing this game at 4K.
I also played Unfinished Business lately. This standalone DLC doesn't have any memory leaks, and the framerate was even better (FGx2 improved the framerate to a bigger degree compared to Rogue City), so I played with Ultra Quality DLSS and still had the same 120-140fps for the ENTIRE GAME. With DLSS Balanced, the game ran between 150-170fps. Overall, I had a very good experience; however, unlike the base game, I noticed some stuttering when traversing the levels. These stutters were rare and didn't affect my overall experience.
4K DLSSB + FGx2, mixture of high / epic settings
https://uploads.disquscdn.com/images/73a5e29bf59e0aca7615c542314401cb133e36328f187921860d749099dc31eb.jpg
https://uploads.disquscdn.com/images/e26e082936ed8d982cb03d03ea2bcfa573141c6e13beaf5b636e9fb893aae0f6.jpg https://uploads.disquscdn.com/images/e49b57dee325115a5971ec16877c6b05ae9c8e87eadf601f10f04e568ddc1c83.jpg
These RoboCop games are amazing overall, and my RTX4080S can even manage 4K native 60fps if I use the high settings preset. I enabled DLSSQ and FGx2 just to play at a high refresh rate and to improve my aim (it's a fast FPS game; you need accuracy and extremely fast reaction times).
There are no performance issues in the first few hours of Unfinished Business. But once you get the cryo gun and start using it, the fps drops significantly. The same goes for the end, when you control the ED-209 and fight two other EDs, or when a big combat drone shoots through a window and there are a million effects. In these parts, the performance is horrible.
I'm pretty sure I used that gun from time to time, but I don't remember seeing dips below 120fps with the settings I was using, let alone to 30fps. I don't remember any problems during that boss fight either (I only remember destroying both EDs very quickly). Maybe Unfinished Business also had the nasty memory leak of the original game, and you were unlucky enough to trigger it? That would explain such low performance.
I have this game, so I could replay the boss fight to confirm my observations, but I have already finished the story and the game doesn't allow me to select completed levels. However, if you know how to start this boss fight without having to restart the game, I'm willing to test it again.
BTW, I heard there were some issues with "Unfinished Business" at launch. For example, some people couldn't even pick up a new weapon, and some couldn't progress through the story. I played the game after the developers had patched it and didn't have a single problem.
heck even unfinished business has a crazy stutter as soon as you walk into the first super building.
Unreal Engine relies on Lumen, a form of raytracing that requires a lot of very expensive dot products that tank performance.
There’s a consistent trend: games using Unreal Engine suffer from poor performance, it's not some wild coincidence. Dismissing and insulting your own audience for recognizing this trend, and calling them "idiots", is pure arrogance.
It’s frustration from people dropping $80 on games that launch as a broken, stuttering mess. It’s consumers getting ripped off while arrogant developers act like they’re above criticism.
You get your GPU for free from Nvidia, good for you. Most people didn’t, and they’re tired of paying premium prices for subpar performance. Your $2,000 RTX 5090 isn’t a mainstream card, let's stop pretending that your experience reflects what the average player sees, it doesn't, the average player sees horrible performance with Unreal Engine games.
Those who don't have an RTX 5090 or 4090 should not even attempt to run at 4K. Most UE5 games run with 60FPS at 1080p with a mix of High/Medium Settings on an AMD RX 6900XT. So, those without a high-end PC can certainly get a great 1080p experience.
In future analyses, I might include a 60FPS graph for every GPU (listing the settings/res you need to have to get a 60FPS experience).
It's not about "4k".
The most popular GPU on Steam is an RTX4060. A very recent $300 mainstream GPU. Not exactly cheap for most people.
Borderlands 4 UE5. 1080p LOW. 52FPS.
Metal Gear Delta UE5. 1080p. LOW. 52FPS.
This is not an isolated incident, this is game after game after game that can't even do 60FPS on the most popular GPU, on the lowest settings.
And you might argue, this is the price you pay for high-end graphics. But these games look anything but high-end. These UE5 games look like every other game, but have horrendous performance.
https://uploads.disquscdn.com/images/8cfb22798e323036775bc9192010e92012ca8cff40656364b784804748dc5172.jpg
https://uploads.disquscdn.com/images/4a82e244446bf6032cbe585f40db47993c61c73c92e161b420452d83c95acf07.jpg
I don't have a 5090, but there are UE5 games that run at 4K native 60fps (real framerate, without FG) even on my 4080S. People have every right to expect a 3x more expensive card to deliver a locked 60fps in every single UE5 game except PT games.
https://uploads.disquscdn.com/images/dbb32b963f3932003ec2b5c70fe1cf2f93a58e08ecf1647f87df67e0e862bcca.jpg
https://uploads.disquscdn.com/images/98dea297ceab3c99ab87272d2d1f7f2c60d58d0166e0b64d3f6ee8032d6d4b7e.jpg
https://uploads.disquscdn.com/images/f28d71c8db51b950c7a17bad21bb759a7cfdcaf3b5a458c29ea853fa3c1daf18.jpg
https://uploads.disquscdn.com/images/60a594e86155b4e270408761e120f9776735ffcf2bee16847c3e60f42ef846a0.gif
Let's be honest, it's a toxic combination: devs leaving optimization as the very last task in the backlog, plus UE5 being hard on the CPU/GPU without the custom addons/tooling that only the largest studios with big technical/engine teams can afford, plus relying on MFG. NO GO. I think Borderlands 4 looks very nice, but it by no means justifies how it runs on RTX 5090-class hardware…
also it's open world on UE5 and it has too many modern effects to qualify as a cartoonish game, so yeah, it runs like trash.
Proper rasterisation and baked lighting takes actual work. It takes passion and time to have programmers and artists decide on how something should look and how to accomplish it.
Using UE5 is the opposite. You turn on the Lumen raytracing switch under the Rendering category, and you have a fully lit scene. A child could do it. But you also end up with horrible performance.
Guess which direction most developers prefer. Doing actual work, or arrogantly telling gamers they should just buy a $2,000 GPU.
Frame generation does not work AT ALL on mobile RTX 50 series GPUs.
And they wanted $80 for this trash.
We as end-users may complain all we want, but this right here is all that counts for the CEOs of these publicly traded gaming corporations:
https://uploads.disquscdn.com/images/21189835d3bae71918dd18df074789d54f041f5e1de07b1030dd29fa14ea1b47.png
i legit don't get why people still buy Borderlands.
Sheep mentality
Damn, I can't believe this is real.
Actually, the all-time peak is a bit higher now (304K players). So, it sold pretty well despite its major performance issues.
Correction: it sold well by most games' standards. The question is whether it sold well for a Borderlands game.
The next highest in the franchise is BL2 which peaked at 124,000. Hard to compare BL3 since I believe that launched on the Epic store with 6 months of exclusivity which is probably why the peak on Steam was so much lower at 93,000.
the problem is that Borderlands 2, most likely unlike 4, was far more popular on console, and most who got it on PC got it either in bundles later (how I got it) or on sale way later, spreading out the numbers.
Now, this isn't to say it isn't outpacing 2, but one thing I've learned is that you cannot use Steam charts to measure success. Final Fantasy and Dragon Age proved that, imo. Yes, I know it's popular to do so, but to me it should only be part of the picture.
That said, I think it sold well, but I'm not sure if it sold well enough. Randy Pitchford kept bragging about its much bigger budget.
wait and see how many will go for a refund
another day, another crappy, terribly optimized 70-80 dollar game comes out that I don't care about. another shtshow and people complaining about it.
I tell you man, it feels good not caring about new AAA games and just playing old games and indies. It's all so tiresome..
Oh, and of course it's Unreal Engine 5 and runs like crap. Look at those graphics: what is the point of the cartoonish style anymore if it looks like a standard-looking game?
So yeah, Unreal Engine plus open world plus non-cartoonish graphics (meaning more effects and detail) and you get bad performance. Who would have thought?
It's not the game though, it's our rigs! So says Randal Pitchfork. The guy is an utter moron. Always putting his foot in his mouth.
That "CEO" pointed out a video with "tricks" to improve performance.
The tricks are: set the game to an internal 720p (1440p output + DLSS Performance) and enable Frame Gen…
https://x.com/DuvalMagic/status/1966570269952098396
Meanwhile, game from 2007: https://uploads.disquscdn.com/images/307e194cece0006c9c290632f9efa92dd6ad7ae7e0199642695a46d6be77cb80.png
The Ascension level still runs like dookie though. It's like the asset streaming system just sh*ts the bed from loading so much in and out.
Have you tried using DXVK to see if it changes anything on Windows?
It's been a while since I last tried the OG Crysis on the Steam Deck, but even back then I was positively surprised by the performance, given that I originally played the game via WINE's DX9-to-OpenGL translator, which gave me multi-second delays on that same level on Linux back in the day. Truly a sight to behold.
Thanks for the trip down memory lane, BTW! 😀
No, I haven't tried DXVK with Crysis. I might give it a try when I upgrade my CPU and GPU, about 3 years from now.
I've heard Ascension runs like trash because the AI is all scripted in Lua, which only runs on a single thread; so you get a level with lots of AI and the performance just collapses.
Easily fixable through a mod though
There's a drag-and-drop mod that fixes the performance in that level. I shared the link in another comment, but it's awaiting approval, so maybe by the time you read this it will be here. Or just google it.
Well, I'll be darned, the mod completely resolved the performance issues in Ascension! I'm getting around 120FPS on average at 4K maxed now. Dropping the resolution down to 800×600 to test for a CPU bottleneck, it still exceeds 180FPS. This is on an RX 9070 plus 7800X3D system.
It's a great CPU, and it definitely helped in the single-core-bound parts of the game too. Glad it helped. If you plan to do a playthrough, I recommend the Maximum Immersion mod for the best vanilla-friendly graphics, or Enhanced Edition if you want to push the graphics further.
There's a simple drag-and-drop mod that fixes the performance issues on this level. Get it here:
https://www.moddb.com/games/crysis/downloads/ascension-performance-fix
Cool, thanks. I’ll re-download Crysis right now, and maybe try out the mod along with DXVK later tonight.
DXVK doesn't work well with DX10, and DX10 is needed for some effects to function properly. I recommend Ccomrade's C1 Launcher, which fixes a lot of stuff, like the notorious low-refresh-rate issue, and adds QoL features like skipping the intro videos.
2025 Borderlands UE5 character (RTX4060, 8GB VRAM)
versus
2013 Beyond: Two Souls character (PS3, only 256 Megabytes of VRAM)
https://uploads.disquscdn.com/images/7866c7ece66e94c748ec171a8d1b96734da6e780a8ba60a45eedf1919c429b98.png
2018 Plague Tale Innocence.
Asobo Engine. No raytracing.
Runs at 300FPS+ on a 5090 until it hits the CPU limit. Can probably run at 500FPS+ without CPU limit.
https://uploads.disquscdn.com/images/b7331a64906e12ee11e9cbafaac72929f29687e76532687fdaa7c539c5c489ff.png https://uploads.disquscdn.com/images/8da4c8ab83a50b6417564d9c18284ffb622f39f24ce0f9e7318ebcdc9d347cab.png
"2025 Borderlands UE5 character (RTX4060, 8GB VRAM)
versus
2010 Beyond 2 Souls character (PS3, only 256 MegaBytes VRAM)":
I would agree, but it's still up to the developers how a game looks and plays. And second, the Borderlands franchise has never impressed visually, historically speaking.
and using DX9!
The peak of game graphics was reached with Crysis 1, and once developers achieved a certain level of geometric complexity, adding more polygons does not make that much difference to most people. For example, the character models in Half-Life 1 were made up of around 800–1,000 polygons. Compare that to a character model in Crysis 1 made up of 30,000-50,000 polygons and the difference is immediately noticeable. If you compare 50,000 polygons to 500,000 or even 1M polygons, however, it's much more difficult to notice the difference. You need to pay attention to details and view character models up close, and only then will you see improvements.
The graphics in Crysis 1 still may look pleasing to the eye, particularly the beautiful scenery. However, if you examine its assets closely and have an understanding of lighting, you will realise how dated the game is compared to today's standards.
It's the same with PS4-era games. For example, Uncharted 4 is still considered one of the most beautiful raster games. If you focus on the beautiful scenery, the game still impresses:
https://uploads.disquscdn.com/images/a2fef940938991ca79a8b6381534f172ce167baa553f0fe8397f88853542dbd8.jpg
But take a closer look at the assets up close and the magic disappears. That's exactly why you posted a Crysis 1 screenshot showing open scenery: from a distance, dated assets aren't as noticeable.
https://uploads.disquscdn.com/images/e7bb5a5e9b04d3707c6b0caf579be1f64da3067ada4a026e9f790713873f1da8.jpg
https://uploads.disquscdn.com/images/b05344acce20f692166b7ba73edead5b18999be2d80a0fffd19f2aa60c4b57c9.jpg
Lemons in Uncharted 4:
https://uploads.disquscdn.com/images/5b70705063a305ebfeba09a05871f331b06312a17f9a2e64bb16cda3a39b5b38.jpg
Lemons in Mafia: The Old Country (UE5). Screenshot from the PS5 version:
https://uploads.disquscdn.com/images/9b36417585274de6b01cc4d6dc3e9eef97d84e408853ab3c538d375e0e98f98d.jpg
There's a reason why UE5 requires more hardware resources than older games, and if you have knowledge of game graphics and know where to look, the difference is very big.
This silly meme argument about polygons not mattering after a certain point completely misses how games actually work in reality. Yes, you'll stop being able to notice the difference in a character model after around 10,000 polygons, but once a GPU can render 20,000 of them on screen you can have 2 characters talking in the same frame, and once that hits 1,000,000 you can render an entire crowd of people at close range. Most people don't seem to notice the stuff you pointed out at the end; small granular details now being given individual models sells the effect way better than an obvious 2D texture.
Was pretty funny to see this game literally crash when Conan O'Brien did his Clueless Gamer segment. I've never seen that happen before.
What lazy morons. UE5's realistic lighting clashes so badly with what's left of their cel-shaded art style that it makes no sense to even use this engine for this game.
GUYSSSSSSS, guys, remember it's not UE5's fault, it's just every single dev's incompetence.
WHAT GARBAGE!
I just like the helpful information you provide in your articles
This is gonna need a lot of patches and engine upgrades to be worth playing.
Or do you need to wait a few more generations, for the RTX 9000?
I don't want to defend Borderlands 4, because this game is much more demanding than a typical UE5 game and, in my opinion, the graphics do not justify the requirements. Some of the assets in the game, especially the textures and trees, remind me of the X360 era. The lighting can look stunning at times (especially the volumetric lighting), but I think a similar look could be achieved at a much lower cost. That being said, thanks to AI features, even this demanding game is perfectly playable on PC.
I will share my results soon, but I should mention that if you start the game for the first time, or even change the graphics settings, performance will drop until the game stops compiling shaders in the background. Initially I saw 45-55fps, but a few minutes later it was 94fps with exactly the same settings in exactly the same location. I don't know if John waited a couple of minutes after applying new settings or changing GPUs, but this problem could have affected some of his results.
Here's the same area where John said the RTX 5080 drops below 60fps at 1080p. On my PC (7800X3D + 4080S), at 1080p native (DLAA) and Badass (maxed out) settings, the game runs at a solid 60fps. I killed everyone in that area and the framerate was around 75fps. Perhaps the framerate would dip below 60fps during some intense boss battles later in the game (or co-op), but based on what I saw, 1080p is definitely playable.
https://uploads.disquscdn.com/images/b557aa469c0a6f1ea2cddaf753dddef6ca85cf43bda24236bd19b6808714df26.jpg
And other locations with the same settings:
https://uploads.disquscdn.com/images/beb49351b2bcc2efa23ffc6b4a2ff891c621037159c059d3d476a211ec009c39.jpg
https://uploads.disquscdn.com/images/4a7ad1146c7c8783b989512f2006d345f932657fc905a7f8d5988ad689d39b92.jpg
https://uploads.disquscdn.com/images/f63d7972f0f52b3c24dd321f429dfa9a442e9b6c34759b1e5815fcfc1c09e579.jpg
https://uploads.disquscdn.com/images/7b9238d533d0f98e64a0a3dc6117c4ff3893eb3a93b7fe87bc5ea62e131869a4.jpg
With DLAA at 1440p, my RTX 4080 Super is no longer powerful enough to run the game at 60fps with maxed-out settings.
https://uploads.disquscdn.com/images/b3bc4bb4a75763968fd4ed50f02ad8c226acb9bc0bde8e9cf92a208639671912.jpg
https://uploads.disquscdn.com/images/975ad34aa59ff96cece465966224c923ebd37f4ff5514a1a1d1e5d819e182142.jpg
https://uploads.disquscdn.com/images/3649cf1d89acecc4b0e8f597f54ffa590cee3890ac0eb4a55d88747b3d046519.jpg
I need to use DLSSQ to play the game with Badass settings at a locked 60fps:
https://uploads.disquscdn.com/images/2b56b502557dec90cfa1e4ab13ce641e88b6b0ba2b96d92f152fb14b67affcc3.jpg
https://uploads.disquscdn.com/images/080ef2efffe23b034e236354d964d90d0252579dd3948a43033f1d46896d92f8.jpg
1440p DLSSQ + FGx2, badass settings 130-140fps (28-32ms latency):
https://uploads.disquscdn.com/images/92db9c13420aed3a721f816c22845be520c2d6d5cbcfdca6b6d3a69eaad19437.jpg
https://uploads.disquscdn.com/images/5a4052614905792b96998a704739e916e7fccb1a3c0e6c0f2e6214bb6eafa284.jpg
https://uploads.disquscdn.com/images/9482b6c5df1befa44f20bd636b513b3f43db4645b637c6b1c2092bc2722a841f.jpg
Here are the recommended 1440p settings for my RTX 4080 Super, according to Gearbox.
https://uploads.disquscdn.com/images/6f854c46279820ca22124ffcc30d694638253590fe941d06fb2f99aa78751c33.jpg
With these settings I get 165fps+
https://uploads.disquscdn.com/images/dd2e471b41a8666ee7b0db48cc8de2c6835f152e8507a64895e91c8659ea2f8b.jpg
https://uploads.disquscdn.com/images/b44d2432c0545c631e3d047e54052f200a7259e6205ccfb063c89e75d830d426.jpg
https://uploads.disquscdn.com/images/c4c3e38da27e60787e1e7e28dcf4aa94b65ba5f7fd58302b95e9be4ccec314f4.jpg
At 4K native my 4080S is weak as Biden 😀 (24-27fps), so I'm not even going to share screenshots. Lowering the settings to medium and using AI features makes it possible to play this game smoothly. Some people will say these are fake frames and bla bla bla, but I'm totally convinced I'm playing at 4K at 130–140fps, and that's what matters to me. Of course, 4K DLAA (native) at 140 real fps would be even better, but it wouldn't change my experience that much.
4K DLSSQ + FGx2, medium settings:
https://uploads.disquscdn.com/images/0bdd40cc140093ab57e69e4b8409e304d50b5d4cde3f3f4627093a40250b1a6e.jpg
https://uploads.disquscdn.com/images/9f0aea265b1e78224754bbdc66f7af2ac821eec4cbca292f7f7403d573873d24.jpg
High settings are playable as well, but I need to use DLSSP instead of DLSSQ. Image quality looks pretty much the same.
https://uploads.disquscdn.com/images/8229e3860fffaeda51d2d13b890ce8ff1d9550b5a493396460172886bc1c509a.jpg
If I played competitively, however, I would use DLSS Ultra Performance and medium settings. DLSS UP still looks sharp (the image is sharper even compared to the PS5 Pro version), and 190fps even with FG is perfectly responsive (I measured around 25-28ms).
https://uploads.disquscdn.com/images/363cea0155dc95295599caa186df78684eaf0fa39eb8ef99f3cf3936f8ea7788.jpg
I did wait. When you change a setting, the game compiles the shaders. All of my CPU cores were maxed out for the first 1-2 minutes while compiling shaders, and GPU usage was not at 98%. Then CPU usage returned to normal levels, and I was GPU-limited the entire time.
Ok, I believe you, John. On my PC, however, I definitely noticed this problem, and I wanted to let you know that running this game for the first time, or just changing the graphics settings, can trigger a nasty performance drop. At first I thought the game would run like crap on my PC, but 3-5 minutes later performance improved drastically.
I know that ray tracing is demanding no matter whether a game looks good or bad, but for god's sake, this is software Lumen, yet it's as demanding as Cyberpunk's path tracing. So it's on the devs and those rich people who use ray tracing to shave a few years off development time but don't have a few months to optimize their game, because they know people will buy it anyway.
And for the people saying it's an open-world game and that's why they couldn't optimize it: keep in mind that Stalker 2, with all of its problems, runs and looks significantly better than this game.
Tbf, Stalker 2 runs better now, months later, and was similarly not terribly well optimized, as well as technically unfinished for a few months after release, as admitted by the devs. I think the biggest difference here is that other developers would admit fault, whereas Randy here is telling people that 2-3 year old hardware isn't enough for a "premium experience", or that they shouldn't be playing at 4K even if they have good hardware.
AAA Developers: "Just buy an RTX 5090, it's not like you losers have a choice with everyone switching to UE5"
Gamers: "K, we'll buy Silksong for $20 that runs on an iGPU instead"
Nvidia: "guys…guys…let's calm down here….how about some AI slop?….no?….how about some more fake frames?…guys?"
Dear Epic! Is the problem with the developers again, or can we say that UE5 is trash?
From your videos the game looks dull. Big areas with little to offer?
Sounds like a match for you.
I'm sorry, I really didn't mean to make you butthurt.
haha
"The game uses UE 5.5.4.0, which explains why we get these pop-in issues. Epic has added Nanite support for vegetation in UE 5.6"
Really? Does it use 5.5.4???
You mean Nanite Foliage? It was added in 5.1 and has since received upgrades and better functionality, though yes, 5.6 was another big step.
Also, watch BenchmarKing's video on YT for the best optimization guide, and to see for yourselves what a worthless performance hog, and what a graphics-card-selling scam, the Ultra settings are.
John, you're only perpetuating this by testing on Ultra alone. That's why I always suggest at least one more benchmark at the next-lower preset.
This game is sooo f*king boring!
The game is fun; it's like BL3 without such an annoying story setup. It's just a damn shame about the optimization, and Randy's social media posts aren't helping. I was having a grand old time on day one, playing on my 7700X and 9070XT at UW1440p with high settings at over 120fps with upscaling and frame gen, but after the Saturday patch I crashed to desktop randomly like 3 different times. Then my friend showed me the posts from Randy saying people can't be using 2-3 year old hardware to play his premium game; he posts the optimization guides, and then you see he's expecting people to use 4x frame gen + upscaling + low settings. If I hadn't bought the game from one of the key sites, it would have been refunded already because of the astronomical hubris. It's also crazy that for most people I've watched, the usual UE5 issues aren't even the main bother; the game just seems overly taxing for what it looks like. There was a post on the BL4 subreddit, deleted now, where a guy edited his ini to remove some post-processing, and the game looked way better and got back like 10% performance. I don't know what else they have baking in this title, but it's just ridiculous.
So demanding, just to achieve the same shi*ty cel-shaded visuals UE3 already managed, except now they look more realistic… which goes against the art style. LOL, what a crappy franchise.
Enable Lumen -> save development cost (vs. proper development through real baking of lights, etc.) -> cost saved, but resulting in crappy performance -> but who cares, right? That cost is, after all, pushed to consumers, who need higher-grade hardware than a properly developed title and engine would have required -> laugh on the way to the bank and give the other devs the bonus middle finger, CEO-style… oh, while also raising the price of the skimped product -> sheep still keep on buying -> rinse and repeat! All while Nvidia laughs and delivers worse gen-on-gen upticks at ever higher prices… and that weak performance uptick is needed more than ever because of all those nifty features abused to avoid doing proper development. What's next? VBA-coded games?
Can't get why people keep praising things like upscaling, frame gen and other features that will be abused to skimp further on costs instead of doing proper work. It's the hard truth: all the things that COULD have been good have been used to push dev costs onto consumers instead. Yet… to put it kindly, some less enlightened people lack the perspective to see it for what it is…
perfect!
It is so (unfortunately and obviously) crystal clear that the AAA game industry, along with the Nvidia/AMD/Intel triopoly, learned absolutely nothing from the Silksong incident.
Just like in the last 4-5 years, they:
– Release games waaay before the product is fully ready
– Either partially (or in some cases totally) disregard performance and optimization
– Make sure that more and more expensive components become "must-haves", if not the "bare minimum", to experience the game normally
– Keep people distracted with updates, patches and bug fixes for the first year or so…
– Sometimes even let saves get erased and/or become unusable after new patches
– Also make sure that fake-frame features (and the increased input lag) are unavoidable, no matter what
Therefore, all they do (and push) is:
More expensive games, more expensive components, more expensive power supply units, more heat, more electricity consumption, more pre-orders, more money collection, more rushed development, more problematic releases, more bugs, more performance issues, more stutters, more blurry visuals, more frustration…
Unfortunately, that (still) could be the case, even after several launch delays…
The script is very simple:
more profit and more earnings for them,
and (unfortunately) more loss for us…