NACON and Rogue Factor have just released the PC demo for their action-adventure game, Hell is Us. Hell is Us is powered by Unreal Engine 5, and its PC requirements worried a lot of PC gamers. So, the big question is: do these PC requirements really match the game’s performance? Let’s take a closer look and find out how it actually runs.
For these initial tests, I used an AMD Ryzen 9 7950X3D, 32GB of DDR5 at 6000MHz and an NVIDIA GeForce RTX 5090 Founders Edition. I also used Windows 10 64-bit and the NVIDIA GeForce 576.40 WHQL driver.
At 4K Ultra Settings, the game cannot hit 60FPS on the NVIDIA RTX 5090. From what I can see, Hell is Us uses Lumen. So, the fact that it cannot achieve 60FPS at 4K/Ultra is understandable. Other UE5 games that also use Lumen, like Steel Seed, Oblivion Remastered and STALKER 2, cannot run with 60FPS at Native 4K on the NVIDIA RTX 5090 either.
What I’m saying is that no, Hell is Us is not THE most demanding UE5 game we’ve seen. More or less, it runs similarly to other UE5 games that use Lumen. So, if you’ve played the aforementioned UE5 games, you know what you can expect from it (in terms of performance).
To get a smooth gaming experience on the NVIDIA RTX 5090, we had to enable DLSS Quality Mode. With DLSS Quality at 4K on Ultra Settings, we were getting over 70FPS at all times. The game also supports Frame Gen, meaning that you can increase performance even more.
I should also note that I did not experience any major stuttering issues. This is a UE5 game that does not seem to suffer from stutters. This is great news for everyone who is interested in it.
In the end, the talk about Hell is Us having crazy PC requirements turned out to be no big deal. Yes, the game is demanding, but it’s not as bad as people first thought. After all, the game is using Lumen, which is a form of Ray Tracing. So, you can’t expect to be running it at Native 4K/Ultra with 60FPS, even on an NVIDIA RTX 5090.
Bottom line is that Hell is Us runs similarly to other UE5 games. To its credit, it does not suffer from any stutters. Plus, its graphics look similar to those we first saw back in September 2024. In other words, the game has not been downgraded.
Finally, I suggest trying the demo for yourselves. You can go ahead and download it from this link.
Enjoy and stay tuned for more!

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers SNES to be one of the best consoles. Still, the PC platform won him over consoles. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, Voodoo 2. John has also written a higher degree thesis on "The Evolution of PC graphics cards."
Guess I will see how my 3070 does.
It uses UE 5.5.4. Zero stutter. Amazing for a UE5 game.
To get a smooth gaming experience, you guys just shouldn't choose ULTRA.
7900 XTX, 4K, FSR 3.1.4 Performance, FSR FG + AFMF = 240-290 fps.
Game runs fine, looks fine.
Gamers like you are the reason AAA games have gone downhill. Even if they release unoptimized tr@sh games they got f@gs like you defending it
Dude went full turbo!
LOL
Come back after disabling the fake frames and resolution, until then none of your information is useful.
We want real numbers, not fake numbers.
I like DLSS FG (it improves not only aiming, but also motion clarity), but he is using the much worse AMD FG, and AFMF on top of that (two frame generators).
AMD FSR FG literally looks the same as DLSS FG x2, while offering higher FPS and implicitly, lower input lag, lol. Get with the times.
I'm not talking about image quality and artefacts, although DLSS FG probably has fewer artefacts than FSR FG anyway. I'm talking about responsiveness and general feel during aiming. Nvidia is calculating flip metering on tensor cores, ensuring perfect frame delivery and synchronization with mouse movement. The impact of this synchronization can't be shown on screenshots, but I can easily feel it during gameplay.
I can enable DLSS FG at 70 fps to achieve 120 fps and my aim (mouse responsiveness) improves and feels much closer to real 120 fps (it's almost as if Nvidia is calculating mouse movement at a higher framerate). With FSR FG, however, my aim always feels much worse compared to the base framerate (the difference is noticeable to me, even at very high base framerates). DLSS FG is also perfectly usable at 30 fps, therefore I no longer worry about sub-60fps dips, whereas FSR FG needs at least 60 fps.
I measured a difference of just 1–4 ms in Cyberpunk between DLSS 4.0 FG on and off (OSD in the upper right corner of my monitor screen). DLSS 3.0 FG had higher latency (9–12 ms) and a 10% lower framerate, but even DLSS 3.0 was already much better than FSR FG.
https://uploads.disquscdn.com/images/7181d4f6d7c02c83d6512bc6b2dee2c834b225db0ed6542654834bd2415b4c1a.jpg
https://uploads.disquscdn.com/images/9bd453d42951909a4b4ae11a6adf76eec2acea66cce6f3c15042386abc455be8.jpg
In summary, enabling DLSS 4.0 improves my experience and that's why I like to use this frame generator. That's not the case with FSR FG, because that latency difference is impossible to ignore (I would rather turn down some settings than to use FSR FG). AFMF is even worse than FSR FG, so I can't imagine how unresponsive games must feel when you're using two frame generators at the same time.
I have a 240Hz monitor, that’s how I play my games.
"To get a smooth gaming experience on the NVIDIA RTX 5090, we had to enable DLSS Quality Mode. "
So in other words, pass – If they have to duct-tape performance through a lower render resolution to get non-immersion-breaking performance, they did something wrong during the optimization pass, considering the scene/model quality – Perhaps they should have hired devs who know what they are doing to earn their salary, or have enough staff to do a proper job… And the tier-2 duct-taping, aka multi-frame generation, only makes it worse, as it adds input latency, making it even less immersive than without.
Both techs were created to allow 100% raytraced / path-traced games earlier on mid-tier hardware (at the current rate of disappointments, I would expect that to take 20+ years, given that mid-tier hardware only gains what, 5-10% per series?)… They were not created to allow devs to slack off on proper optimizations!
Just use high settings for much better performance. Why are people nowadays so mentally incapable of understanding that games have quality settings? Did TikTok destroy their critical thinking skills?!
I agree – Seems TikTok has done a number for sure, as there are those who even defend lazy developers and add to the issue, rather than use their logic center to see the fallacy in their ways.
If you've ever coded or developed anything, you can easily see the downward trend in what today's developers can get out of the hardware they have at their disposal.
If you accept sloppy work, that's on you, and in the end it's your money.
Yes, pay 2k $ for a graphics card to play on medium. You must be real smart.
It runs with 80 fps at 4K DLSS 4 Quality on a 4090, so I have no idea what you're talking about. Are you lost or something? xD
https://www.youtube.com/watch?v=WFIsLr3l1QA&t=322s&ab_channel=DanielOwen
Its still 3k+ euro in Europe
Wow, I hadn’t checked the prices for a while and I thought I might’ve high balled it but I actually went way too low
I'll have to give this a try later. The Dispatch demo is out too, and man, it seems like it'll be a really fun narrative game.
These kind of demands from a game that looks pathetically bland and mediocre is just embarrassing.
It feels like a budget AA game, but it has an interesting atmosphere and I didn't get bored. As for the requirements, they aren't that bad, because the game is very scalable. At 4K Ultra settings (90% resolution scaling) I had like 33-40fps, but with Very High settings and DLSS Quality, 62-70fps. At 1440p DLSSQ, 110-130fps. Sadly, DLSS FG doesn't work in this game, but it's still playable without it.
https://uploads.disquscdn.com/images/714d78799d6315180c900844a73fb42e9e467095189fea78a6ed60415157d287.jpg https://uploads.disquscdn.com/images/e42b2b98ce3a1026114fe9e6b174616c6e481a96cd9504bd58722dfef0b13978.jpg
https://uploads.disquscdn.com/images/64a15f8e3d3b3a98be397c520f0890a17404db95790dc9e0f95b67a45537ad6b.jpg
https://uploads.disquscdn.com/images/746950e36a26b996021d6de4def48c16c57758007e621fe77c862da7deddf5e3.jpg
"After all, the game is using Lumen, which is a form of Ray Tracing. So, you can’t expect to be running it at Native 4K/Ultra with 60FPS, even on an NVIDIA RTX 5090."
The owner of a PC-dedicated site talking like the most braindead console peasant, making excuses for why the most expensive tool still can't run the thing.
Nothing wrong with that, right Johnny Whitewool?!
I noticed DLSS seems to be broken if you load into the game with it disabled. So turn off DLSS. Load in. Then crank settings up and enable DLSS/FG. Over 120fps in 4K easily. With lower system latency than in Cyberpunk. But… regardless… the game kinda looks like crap to me. -_-