Gearbox has just released Gunfire Games’ third-person action-survival shooter, Remnant 2. The game is powered by Unreal Engine 5 and takes advantage of Nanite, so it’s time to examine its performance on PC.
For our Remnant 2 PC Performance Analysis, we used an AMD Ryzen 9 7950X3D, 32GB of DDR5 at 6000MHz, AMD’s Radeon RX 580, RX Vega 64, RX 6900XT and RX 7900XTX, and NVIDIA’s GTX 980Ti, RTX 2080Ti, RTX 3080 and RTX 4090. We also used Windows 10 64-bit, the GeForce 536.67 and the Radeon Software Adrenalin 2020 Edition 23.7.1 drivers. Moreover, we disabled the second CCD on our 7950X3D.
Gunfire Games has added a few graphics settings to tweak. PC gamers can adjust the quality of Shadows, Post-Processing, Foliage, Effects and View Distance. The game also supports all major PC upscaling techniques: DLSS 2/3, FSR 2.0 and XeSS. Furthermore, Gunfire has included a FOV Modifier.
Remnant 2 does not feature any built-in benchmark tool. Thus, for both our CPU and GPU tests, we benchmarked the game’s central hub, Ward 13. This area features a lot of NPCs, making it ideal for our tests. For our CPU benchmarks, we also lowered our resolution to 1080p and enabled DLSS 2 Performance Mode (so that we could avoid any possible GPU bottlenecks).
In order to find out how the game scales on multiple CPU threads, we simulated a dual-core, a quad-core and a hexa-core CPU. And, similarly to Layers of Fear, Remnant 2 does not require a high-end CPU. As we can see, even our simulated dual-core system was able to provide framerates higher than 80fps at 1080p/DLSS 2 Performance/Ultra Settings.
Unreal Engine 5’s Nanite can improve overall CPU performance, making the game run smoother on a wide range of CPU configurations. By taking advantage of Nanite, this game was silky smooth, without any noticeable traversal stutters. So, not only does Nanite eliminate geometry pop-ins, but it also reduces traversal stutters. What’s also interesting is that Nanite gets disabled when you set the in-game graphics options to Low. And, on Low settings, the game suddenly suffers from major traversal stutters. We were able to replicate this numerous times, so yes, Nanite will play a major part in reducing traversal stutters in future UE5 games.
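For those who want to isolate Nanite’s impact without dropping every other setting to Low, UE5 titles typically expose the feature through a console variable that can be forced via an Engine.ini override. A minimal sketch, assuming Remnant 2 honors the standard UE5 config conventions (the file path and the `r.Nanite` variable are generic UE5 defaults, not something Gunfire has documented for this game):

```ini
; Assumed path (standard UE5 layout, unconfirmed for Remnant 2):
; %LOCALAPPDATA%\Remnant2\Saved\Config\Windows\Engine.ini
[SystemSettings]
; 1 = Nanite enabled (default), 0 = Nanite disabled
r.Nanite=1
```

Flipping this to 0 at higher settings would, in theory, let you reproduce the Low-settings traversal stutters while keeping everything else constant — though some games lock out cvar overrides, so treat this as an experiment, not a guaranteed tweak.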
Now, while Remnant 2 does not need a high-end CPU, it certainly requires a powerful GPU. At native 1080p/Ultra Settings, the only GPUs that could run the game at 60fps were the AMD RX 6900XT, the AMD RX 7900XTX and the NVIDIA RTX 4090. Our NVIDIA RTX 3080 came close to a 60fps experience, though there were drops to the mid-50s.
It’s also worth noting that Remnant 2 performs incredibly well on AMD’s GPUs. In fact, the AMD RX 7900XTX beats the NVIDIA RTX 4090 at all native resolutions. We’ve already informed NVIDIA about this, so hopefully it will be able to improve performance via future drivers.
At native 1440p/Ultra Settings, the only GPUs that could offer a 60fps experience were the AMD Radeon RX 7900XTX and the NVIDIA RTX 4090. And as for native 4K/Ultra, there wasn’t any GPU that could offer a smooth gaming experience.
But what about scalability? Well, the game was running with the exact same framerates at Ultra, High and Medium settings on our RTX 4090. At first glance, you’d assume that this was due to a CPU bottleneck. However, even at Medium settings, our RTX 4090 was used to its fullest. We don’t know what is going on here; however, we are not the only ones witnessing this odd behavior.
Graphics-wise, Remnant 2 looks great, but it does not justify its ridiculously high GPU requirements. The game has a lot of high-quality textures, and, thanks to Nanite, there aren’t any object/NPC/grass pop-ins. The game also has some amazing lighting/global illumination effects in some areas. However, its AO solution is not that great. Unlike in other games, in Remnant 2, SSAO completely disappears from the edges of the screen while panning the camera. Gunfire needs to fix this (or add support for RTAO).
All in all, Remnant 2 is a mixed bag. On one hand, we have a game that runs on a wide range of CPUs, takes advantage of Nanite, and does not have any shader compilation or traversal stutters. On the other hand, though, the game can stress even the highest-end graphics cards, despite the fact that it does not use any ray-tracing effects.
Thankfully, PC gamers can enable upscaling techniques in order to improve overall performance. And for those wondering, the performance of Remnant 2 on current-gen consoles is not that great either. Unless, of course, you’re fine with a game being rendered at 794p. So, if you have the hardware, Remnant 2 can be fully enjoyed on PC via DLSS, FSR or XeSS. However, Gunfire should not be using these upscalers as a crutch. NVIDIA should also improve the game’s performance on its hardware.
Here is hoping that Gunfire will improve the performance of Remnant 2 via future updates!

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best consoles. Still, the PC platform won him over. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC Graphics Cards.”

“ Unreal Engine 5’s Nanite can reduce overall CPU performance”
I don’t think you meant to say that. Interestingly enough, when Daniel Owen ran the game on a 9600K, he had serious stutter that was eliminated by switching to his 7950X3D. Might be a system RAM bottleneck.
Nanite resolves the traversal stutters which surface when you disable it (when using Low settings). These traversal stutters occur on Low settings because the game loads assets in a more traditional way (like most other games). And when we simulated a dual-core system, we were still getting a smooth experience with Nanite (no major traversal stutters). So, assuming you want a stutter-free experience without Nanite, you’ll need a way more powerful CPU/RAM combo to calculate and load those assets without hiccups.
So you’re saying Nanite improves CPU performance?
Yeap, fixed the error. I thought I had written “reduce overall CPU usage/requirements”. Brain fart, I know.
So UE5 could be even more CPU-efficient than UE4? UE4 already has some very good examples of great multithreaded games.
UE4 is extremely single threaded by default. Whatever games you may be thinking about, were heavily optimized by the devs.
The botched state of PC gaming continues.
I tried the game yesterday; it is very weird performance-wise.
It is heavy until you enable DLSS.
I have an old RTX 2080 and it runs great (60-80fps) at 4K High with DLSS UP.
A 4090 can’t run the game at more than 90fps at 4K native. Another mess. This game is not a next-gen looker by any means.
This is what happens when a game is optimized fully for consoles and that optimization is transitioned to the PC version. It’s not just RDNA3 winning vs the new RTX gen; it’s RDNA2 down to Polaris beating its Pascal competition, from what I have seen. This is much like when every single Unreal Engine version was fully optimized for NVIDIA only and AMD had no chance.
What are you on about? The game runs terribly on consoles; they are having to resort to absurdly low resolutions, upscaled, to get 60fps, and even fail at that consistently.
It’s as optimized for consoles as it can get, given the state of those console APUs being used.
The Xbox Series X is pretty much a 6700 XT with a 3700X.
You literally don’t know what you’re talking about.
Yeah, I do. Unlike many of you dips on here who think that just because a game is not delivering the performance you want, it must have a problem.
The game with a 6700 XT gets an avg of 62fps at 1080p Ultra settings with FSR on, paired with a 5800X3D.
So the console vs PC performance scaling is as close as it gets.
You’re utterly confused.
The game performs equally poorly on consoles and PC, literally nobody here has been talking about console vs PC scaling.
More like you are confused. And the game is rated Very Positive on Steam 😀 Learn about hardware scaling. 😀
You’re a schizo.
Welcome to the Internet ; )
It’s not even up to par with a 3700X, as it has nerfed clocks and nerfed voltage to keep the package TDP lower than a 3700X’s.
The same is true of the GPU section, which is not full RDNA2 and also needs lower clocks and voltages to keep the package TDP under control. It’s probably somewhere between a 6600XT and a 6700.
However, a console has an edge over a PC because it’s a one-trick pony and can thus be optimized to do one thing and one thing only, whereas a PC is a jack of all trades and thus has more overhead and needs slightly more powerful hardware to play a game at the same performance as a console. So in practice it is about equal to a 3700X and a 6700XT.
Hey, I want some of those smokes you’re on. Care to share the names? I need them too.
Don’t do it bro, otherwise you risk ending up just as derpy as Mellinger!
Just shows how much you wannabe PCMR kids know about game optimization. An Xbox Series X is pretty much a Zen 2 3700X CPU with a 6700XT GPU.
Another example of this type of optimization is COD’s latest game engine. It’s highly optimized for AMD hardware, which is why a 7900 XTX is on par with a 4090, and a 7900 XT beats a 4080.
Unreal Engine is destroying gaming !!!
F###K Unreal Engine !!!
They are not lazy!! They are fuc##g retards and As##Dumbs!! RIP gaming industry. I remember the years 2004-2007, when every game had its own engine:
Doom 3 = id Tech 4
HL2 = Source Engine
Painkiller = Pain Engine
Battlefield 2 = Refractor Engine 2.0
F.E.A.R. = LithTech Jupiter EX
Prince of Persia = Jade Engine
Assassin’s Creed II = Anvil
Call of Duty 2 = IW 2
Tomb Raider Legend = Crystal Engine
Killzone 2 = KZ 2 1.27 (PS3 Exclusive)
Hitman Blood Money = Glacier Engine
… etc.
It was an AWESOME, GREAT ERA!!!
Small correction: Battlefield 2 was based on the Refractor engine. HL2 was just the Source engine.
Thnx, my mistake. HL2 was built on Source 1, not 2!! Obviously!!
It’s the lazy devs, man, not the engine. This is what happens when lazy developers abuse things like DLSS: they develop badly running games and count on upscaling to compensate. We’ve had lazy devs for a long time, but now, with upscaling at their disposal, the bad situation is getting worse. Having upscaling is good, but not as a replacement for optimization.
Well, get in there and show them how it’s done…
Any fool can identify a problem; actually solving a problem takes skill and determination…
Sure, lazy devs working with UE. On top of that, look again: they’re relying heavily on the upscaler crutch, and then you see the 2080Ti and 3080 struggling at 1080p.
Looks meh and runs badly. Nothing new
Sadly, only the first area and town were tested; the game drops like 30-40% after the tutorial.
steady 140+ fps down to 90 fps for me…
From what I’m seeing, Nanite actually lowers overall performance. While it takes some of the load off the CPU, it just puts a lot more load on the GPU to make up for it, and in the end it costs performance.
Like most things, it’s a tradeoff: you have less load on the CPU, keeping it from bottlenecking, and you eliminate traversal stutters and pop-in, but it comes at the cost of lower overall FPS because it works the GPU much harder.
I get ~45fps Outside with mobs on my 7900 XTX, but also ~75fps Inside with mobs. So, it’s a ~60fps avg overall and plays great either way as it’s in the VRR range of my OLED
Problem is this game is just HARD AF. Dying all over the place… 🙁 but I do like the game