Immortals of Aveum is a new single-player magic FPS powered by Unreal Engine 5. The game uses both Lumen and Nanite, and for gaming at native 4K, it requires GPUs that have not been released yet. Even the mighty NVIDIA GeForce RTX 4090 is unable to offer 60fps at native 4K… on LOW SETTINGS.
For our initial benchmarks, we used an AMD Ryzen 9 7950X3D, 32GB of DDR5 at 6000MHz, and NVIDIA’s Founders Edition RTX 4090. We also used Windows 10 64-bit and the GeForce 537.13 driver. Moreover, we’ve disabled the second CCD on our 7950X3D.
Immortals of Aveum does not feature any built-in benchmark tool. Thus, for our GPU benchmarks, we used the first arena area (during which Jak awakens his powers). This area features numerous enemies and lots of particles on screen, so it can give us a pretty good idea of how the game performs during its combat sequences.
At Native 4K/Low Settings, the NVIDIA GeForce RTX 4090 drops to 34fps during our benchmark sequence. That’s on LOW SETTINGS. Let me repeat that. LOW SETTINGS. I don’t really know what Ascendant Studios was smoking while developing it, but I certainly would like some. This is inexcusable performance for a game that looks the way Immortals of Aveum does. Seriously, when your game cannot run at Native 4K/Low Settings with 60fps on a beast of a GPU like the RTX 4090, you know that you have majorly f’ed things up.
Performance is all over the place on both AMD’s and NVIDIA’s hardware. For instance, at Native 1080p/Ultra, the only GPUs that can offer a constant 60fps are the AMD Radeon RX 7900XTX and the NVIDIA RTX 4090. The NVIDIA RTX 3080 runs the game with a minimum of 40fps and an average of 46fps. At 1080p. Ouch.
Our PC Performance Analysis for this game will go live in a couple of days. Until then, though, we wanted to warn you about its performance/optimization issues. Yes, you can use DLSS/FSR to further increase performance. However, these upscaling techniques are used as a crutch in this particular case. I mean, we all saw that coming when Ascendant revealed the game’s PC requirements.
Before closing, I wouldn’t mind these performance figures if Immortals of Aveum offered graphics comparable to The Matrix Awakens tech demo. It does not, though. And this isn’t the next “Crysis” game.
I seriously hope that the developers will further optimize it via post-launch updates. Nanite and Lumen are great new techniques. Nanite, in particular, makes a huge difference. However, there should be settings to scale down a UE5 game. It’s inexcusable for a game like this to run so badly at Native 4K/Low Settings on an RTX 4090.
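For reference, stock UE5 already exposes console variables that can dial these features up or down. Here is a minimal sketch of the kind of scaling options we mean, assuming the game shipped with the stock cvars intact (the cvar names are from vanilla UE5, the values are illustrative, and whether Immortals of Aveum actually responds to them is an assumption on our part):

```ini
; Engine.ini sketch — stock UE5 console variables that scale down Lumen's cost.
; Whether this particular game left these exposed is an assumption.
[SystemSettings]
r.DynamicGlobalIlluminationMethod=2 ; 1 = Lumen GI, 2 = cheaper screen-space GI, 0 = none
r.ReflectionMethod=2                ; 1 = Lumen reflections, 2 = screen-space reflections
r.Lumen.HardwareRayTracing=0        ; force Lumen's cheaper software tracing path
r.ScreenPercentage=75               ; render below native resolution and upscale
```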
Stay tuned for more!

John is the founder and Editor-in-Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best consoles ever. Still, the PC platform won him over. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher-degree thesis on “The Evolution of PC Graphics Cards.”

I hope nobody purchased this game. It would be nice if every gamer just didn’t buy a game till the devs fix it 😀
They didn’t. It flopped hilariously badly.
I only just heard about this game in the last week or two. Doesn’t look that great either
Yep, this is totally disgusting. What a bad joke of a game!
It doesn’t even look that great!
Literally called it, awaiting their “apology” now:
https://uploads.disquscdn.com/images/7f77680259d91d608b8dca8f4403e950e4323be8ad1d1af6584d9dc21be3b8ca.jpg
Is it really any surprising though? It’s a gayme made by anti-video game activists who’d proudly push their Satanic degeneracies on your face through more dogcrap products masquerading as “entertainment products” like this one, it doesn’t take a genius to figure this out.
This kosher dogcrap apparently already flopped (deservedly so), on to the next Satanic heterophobic propaganda brainwashing “entertainment product.”
the message is more important than the game being good.
Satanic? You must be using the same drugs as the devs of that woke nightmare.
He’s a low IQ sh*t skin jihadist from the middle east. There are a lot of them on this site.
Yeah like half the site lmao. I hate woke garbage as much as the next guy but I hate even more how the opposition are religiously zealous retards 9 times out of 10.
Cry more XDDDDDDDD @adamslowinski @disqus_iQeuLGOatO
https://uploads.disquscdn.com/images/10b9d905aa275dcc2554a0fb76e8fd2506e2cbbe15fafdc60782fe4d4ca50156.gif
https://uploads.disquscdn.com/images/312faca4951935bfaf0600ade2adf23eb7af5a0e6c8782438d7e7b90c357ae0d.jpg
whaaat? it just poorly optimized game and U just now pushing another agenda on top of agenda agenda and being same cis snowflake as some trannysnowflake. at this point all these labels are silly.
Any time someone mentions anything satanic in non-religious discourse, it’s time to administer some pills.
Like I said, it’s like they made this game for a very small group of rich, upper-middle-class woke hipsters and no one else.
If I were a woke hipster, I would be ashamed that this game is targeting me.
Funny thing is that the game even looks dated.
The graphics look about as good as Bioshock (2007).
Don’t insult BioShock like that.
Stop sh*tting on John’s new crush.
https://uploads.disquscdn.com/images/649c7801b1cef8c6e8a194909657a10194f2c6524e847b5e84df0689abc271df.jpg
Interestingly enough, part of the performance problem is a limitation of current low-level APIs like DX12 as used by UE5.
To overcome these limitations, AMD of all companies just recently created an experimental Vulkan extension, VK_AMDX_shader_enqueue, which specifically cites UE5’s Nanite as its motivation.
Here’s the detailed description:
Sorry, but D3D12 is more efficient than Vulkan on Windows 10 in every game I’ve ever seen. The FPS problem with the game has to do with devs not taking the time to do the hundreds of little tweaks and “optimizations” to make the game run at better FPS. They’re using DLSS and FSR as an excuse to not bother spending that extra time making sure the game runs well.
As for what you quoted, it doesn’t seem to have anything to do with DirectX at all. Are you sure the point behind this new Vulkan extension doesn’t have more to do with Vulkan having issues with Nanite than with DirectX?
First of all, why do these game companies choose to develop their games with UE5 in the first place?
Because Epic Games takes care of all the low-level technical code like the renderer for them.
Therefore, you are wrong to assume that the poor performance here has anything to do with the game developers, because the poor rendering performance with DX12 is squarely on Epic Games.
As for your belief that DX12 is more efficient than Vulkan on Windows 10, here are recent benchmarks with RDR2:
https://uploads.disquscdn.com/images/47062b4c3e847f29269c5bc7c7d3a8577f7723deb1d0a6fb1ddbe1872f0be8b5.jpg
And you know what’s really funny about that Vulkan renderer of RDR2?
It only exists because Google paid Rockstar a truckload of money so that they would port it over to Stadia, which required Vulkan support, since it had to run natively on Linux.
And even though that port was only an afterthought, it performed better than the optimized DX12 renderer, since that one also needed to perform decently on the Xbox.
And ever thought about why id Software went out of their way to write a PC-exclusive Vulkan renderer for their id Tech engine, even though that same engine already has a DX12 renderer courtesy of the Xbox version?
Not to mention DXVK helps with a lot of unoptimized DirectX games even in its wrapper form. GTA 4, Sekiro on AMD cards, Borderlands 2, Kingdoms of Amalur, just to name a few. Developers won’t use it for the same reason they will keep using Unreal Engine. They just don’t care about optimization on PC.
That’s only because the devs who made those games did a piss-poor job designing them. DXVK hurts the performance of every game I’ve tried it with on Windows.
Quite the opposite. I already gave examples above. You can try looking them up on YouTube. In my experience, GTA 4 and Sekiro performed better with DXVK.
Those are examples of games where the devs did “a piss-poor job designing them”, as I said before. In all other games that I’ve seen DXVK hurts performance on Windows.
Just because DXVK helps a handful of poorly designed games perform better, that doesn’t mean it works that way in every game.
DXVK isn’t magic, and it doesn’t replace the game’s usage of DirectX with Vulkan. It’s a translation layer, meaning that the game tries to use DirectX and DXVK translates the DirectX calls into Vulkan calls, which means increased overhead and lower performance. I’ve tried various versions of DXVK in a number of different games, both old and new, and lower FPS is the result every time. It can also cause worse frame pacing and can introduce additional stuttering, although that seems more likely to happen in newer games.
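To illustrate the overhead point, here is a hypothetical sketch (this is not DXVK’s actual code, just the general shape of any translation layer): the game calls a D3D11-style method, the layer re-validates state, and only then records the equivalent Vulkan command, so every call pays an extra toll.

```cpp
#include <cstdint>
#include <cstdio>

// Stub standing in for the real Vulkan entry point; a real layer would call
// the loader-provided vkCmdDrawIndexed on a command buffer.
static void vkCmdDrawIndexedStub(uint32_t indexCount, uint32_t firstIndex,
                                 int32_t vertexOffset) {
    std::printf("vkCmdDrawIndexed(%u, %u, %d)\n", indexCount, firstIndex, vertexOffset);
}

// Hypothetical wrapper context with a D3D11-shaped entry point.
class TranslatedContext {
public:
    void DrawIndexed(uint32_t indexCount, uint32_t startIndex, int32_t baseVertex) {
        revalidateState(); // extra per-call work a native renderer wouldn't repeat
        vkCmdDrawIndexedStub(indexCount, startIndex, baseVertex);
    }
private:
    void revalidateState() {
        // Re-check resource bindings, layouts, and barriers before recording;
        // omitted in this sketch, but this is where the translation cost lives.
    }
};

int main() {
    TranslatedContext ctx;
    ctx.DrawIndexed(36, 0, 0); // one translated draw call
}
```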
I have RDR2. Let me install it and run some benchmarks.
As for your statements about game performance, game engines can have issues of course, but you’d be surprised how many things factor into game performance. The number of polygons in meshes, the number of transparent textures and meshes in a scene, the number of light sources in a scene, the engine configuration, etc. The list is seemingly endless, and these are all things that the game devs control. When a game performs badly, it is always the fault of the game developers. For instance, they could have used forward rendering instead of deferred rendering, and gotten a 20% performance uplift simply from that (according to UE5 documentation).
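For reference, that particular knob is a one-line project setting in stock UE5. A minimal sketch, and not a free win for every game, since UE’s forward renderer drops some deferred-only features and requires a shader recompile:

```ini
; DefaultEngine.ini — opting a UE5 project into the desktop forward renderer.
[/Script/Engine.RendererSettings]
r.ForwardShading=1
r.MSAACount=4 ; MSAA becomes available under forward shading
```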
RDR2 does seem to have higher FPS in Vulkan mode, however FPS seems more consistent in DirectX 12 mode while watching the benchmark run. That being said, the DirectX 12 mode had issues and the game would crash with an out of memory error if the RTSS overlay was enabled. Async compute also seems to be disabled by default in DirectX 12 mode, and there’s no apparent way to enable it in-game. One other interesting note is that both Vulkan and DirectX 12 performed better in earlier benchmarks, and then worse in later benchmarks (at the same quality presets), which does not seem to have been related to background processes using resources or to CPU/GPU or room temps (they were all under control, and room temps were dropping over time). My best guess is that this game has engine issues that Rockstar has not bothered resolving, which isn’t surprising since from what I hear GTAV is the same.
RDR2 is also not an Unreal Engine game, so benchmarks in it don’t really prove or disprove anything about Unreal Engine.
All of that being said, this is the first game I’ve seen where Vulkan outperformed DirectX 12. Usually Vulkan is 15-20 FPS lower than DirectX 12 with the same settings in the same game. This could indicate a DirectX 12 implementation issue in RDR2, or perhaps they have one or more inefficient HLSL shaders (Vulkan consumes SPIR-V bytecode, traditionally compiled from GLSL, while DirectX uses HLSL, so the same shader source doesn’t run on both without cross-compilation). There are other games that have DirectX 12 implementation issues (The Division games are an excellent example, as they have crashing issues possibly due to a broken shader cache for DirectX 12).
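For illustration, the same HLSL source can target both APIs with Microsoft’s open-source DXC compiler (the flags are from the dxc CLI; the file names are hypothetical):

```sh
# DXIL for D3D12 (dxc's default output)
dxc -T ps_6_0 -E main shader.hlsl -Fo shader.dxil
# SPIR-V for Vulkan
dxc -spirv -T ps_6_0 -E main shader.hlsl -Fo shader.spv
```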
Anyway, one game performing better in Vulkan than in DirectX 12 doesn’t prove anything. There are too many other games where DirectX clearly outperforms Vulkan.
So wrong. You can add all you like to an engine, but if people don’t implement it correctly it goes to hell, and that’s proven.
Go look at Unity, where you see amateur idiots making awful games that don’t work at all, and they cry foul at its AI or optimisation issues, especially in relation to meshes.
And then you look at Rust and you’re scratching your head, confused about what these apes are on about.
Who cares about performance when you have gender and race representation?
yeah what are you a gamer? Gamers are dead incel, videogames are art, cinematic hollywood movie art made for upstanding moral woke californians not for gamers.
You better be excited for alan woke 2 where alan wake looks like john wick because this is what the hipster actor who plays him looks like now and its really important that character models in the game look like their voice actors like how in the dead space remake issac looks like a completely different person because thats how the voice actor looks like and his girlfriend is now 30 year older because thats what the voice actor looks like, just like the calysto protocol the games are not movies and the voice actors must be the same race as the characters because you may not be able to see them but thats important thats why the voice actors of yakuza and shadow warrior had to be fired so they can hire…2 asian youtubers, its not for cutting costs, the same way they raceswap super hero in comics so they can argue that is a new character so they dont have to pay the original creators of those comics, honest, its because they are so woke and progressive, speaking of which, no white characters will be played by black actors and will be raceswapped and blackwashed, like say god of war and dont you dare call this game woke or the approve anti sjw anti woke youtubers are gonna brand you a incel, you can only like the slop or critique the cringiest parts of it and of course you must buy the latest product and get excited for the next product.
BUY THE 5000 DOLLAR RTX 5090 so you can upscale frame gen and fake skip into a playable framerate. Now of course you might be asking if all those youtubers who have made a career promoting these garbage games and the youtubers who are too friendly with the dev studios and became friends with exclusive cover, are these people shills? Nonsense its just that they love the industry so much that they get jobs in it by being good little corporate slave woke deepthroaters. Imagine thinking the video game industry is about making fun videogames, what are you a trump supporter?
What we were talking about? Ah yes alan woke 2 in which they put all that effort for a game that is a bait and switch and alan wake will be replaced my shaniqua woke. BUY THE NEXT WOKE SLOP AND GET EXCITED FOR THE NEXT WOKE SLOP….YOU BETA TESTER. Speaking of which all the people who bought baldurs gate 3 in early access for full price are beta testers.
its not tho, devs are lazy that all there is to it. You can run tech demos(playable) that look 1000x better with no problems or build your own game that will look better and run good even on older gpus using UE5. Issue is not with engine itself rather implementation of 3rd party tech and simply garbage code of diversity hires.
Tech demos have barely any gameplay, no AI, no calculations, no complex systems, they’re just… graphics, thank god they’re running normally.
This is not a good way to measure performance at all
Time to upgrade, John
At least a 6090TI
Yup. We can safely say 4090 is a good 720p card
that will be 14.999$ adjusted for inflation and nvidia raising prices.
Are you sure you know what that dot infront of 14 means?
its behind.
Take a screenshot, have you passed middle school math yet?
lol
.14 this is 14 cents
14.xxxxxxx this is 14 dollars
Clearly you both need to go back to school.
i know what i posted moron.
Well, time to get back to school then.
You and the other dude who doesn’t know how to read numbers.
because i am the only one calling you out.
I don’t know what that means, I’m talking about the ability to read numbers.
i am not the only one calling you out.
Thanks for calling me out bro, I couldn’t have done it without you.
Now, let’s talk about reading some numbers like 10 year old kids do.
Are you sure you know what infront of means? Stop having interracial s3x, it’s affecting your brain.
Do you have eyes?
https://uploads.disquscdn.com/images/8f1a4fefe14eaeb57f1a316a5f4c07beaa4c7ca21ff277fe7df9974249a58f1a.png
.14 this is 14 cents (behind)
14.xx this is 14 dollars (in front)
Clearly you both need to go back to school.
It’s a steal!
THE MORE YOU BUY THE MORE YOU SAVE
So far UE5 is a trainwreck. I’d rather have a tuned custom in-house engine than some asset flip on UE5.
Exactly why I’m buying Alan Wake 2, Forza Motorsport and Avatar, but skipping each and every single UE 5 game.
Wasn’t Remnant II on UE 5? And people seem to like it.
It sure looks a lot better than this game to me.
That also runs like liquid dogshit and barely looks any better than the first game. The lighting is trash, and the animations are what you’d expect from a generic, talentless AA studio.
I know it runs bad.
But I thought that was done by some mid budget studio.
Didn’t think it was some AAA game.
Graphics are so underwhelming, it’s a clear case of unoptimization (like there’s any other reason most of the time). That game Fort Solis looks much better, though that’s not really optimized either and is more or less a walking simulator.
On the plus side, it flopped spectacularly.
Already? To me this game was like Forspoken: aside from a few woke Kotaku-journalist San Francisco hipsters, no one asked for this.
Yeah looks like it’s a big flop.
Max ~751 players on Steam, ~64% score.
The number of sold copies must be very, very low.
Kudos to any game flopping worse than Forspoken, that is an achievement in and of itself.
And you’re still simping for that messy engine at every occasion while its widespread usage in the industry is going to send PC gaming right into dark ages, ironically blind people like you still think this engine is something positive that is happening to gaming just because it doesn’t have pop-ins while in fact it’s just turning the whole industry into asset-flipping amateurism and counting on upscaling technique to compensate for sh*t coding and optimization, Jesus i want pop-ins back, and what i want the most is seeing you reporting what’s wrong with every game made on this engine instead of s*cking its D on daily basis
…and the RTX 4090 became obsolete…
RTX 5080/90 series here i come!
Not to defend sh*tty coding by soy devs, but 4K is probably the most r3tarded fad ever to come to PC. Most hardware is simply not ready for this rubbish, and besides: Why? do you have robotic laser eyes and can see a difference between lets say 2K and 4K? what is this BS? Who did this? the hardware manufacturers? Or ret@rded OC nerds who don’t even play games just count frames? everyone needs a slap in their stupid faces. 2K is sharp enough. 4K 1diots should all be rounded up and put in a gorilla rape camp. What matters is graphics, art, narrative direction and sound quality in games. I f*ckn hate 4K snobs.
And before the 4K Defense Force: No, 27-32″ display is perfectly fine, if you need bigger monitors you’re mentally ill
I can perfectly see the difference between 1440p and 4k on a tv or on a monitor. It is pretty noticeable.
No you can’t.
what size TV? can you see the difference on a 27″ or 32″ (normal people size) display?
The point of 4K is no jaggies, no need for AA due to the higher resolution; now we’ve got DLSS and frame gen to “fix” that.
I usually just put the rendering scale up to 1.2-1.4 and the jaggies disappear on 1440p without any AA
never heard about jaggies not being present at 4K. wtf. I’m on 2K and jaggies are everywhere. does another 2K just simply eliminate them?
32″ is smaller than my d*ck. I have my PC hooked to a 55″ OLED TV like any decent human would (used to have 100″ from a projector, but constantly changing lamps was annoying, so I settled for 55″ as it’s perfect for my current setup). I have no problem running even full RT games that are well made, like Metro EE, in 4K and 120FPS on my RTX 4080. Aside from Cyberpunk in full RT mode with no DLSS, everything easily runs at 120+ FPS in native 4K.
You hate 4K because you are too poor to have 4K; that’s all there is to it. Stop being butthurt, your life will be better for it.
I’m not poor. It’s not about money you shallow NPC. “muh you’re poor that’s because you don’t like sh*t I like” typical sign of a megalomaniac consumerist goodgoy. I’m just not a smoothbrain ret3rd like you, who feels lonely if he doesn’t have the biggest, newest best thing a goy could buy. 55″ holy f*ck this zoomer got some issues
1st off tard, 2k is 1080p if you round it to the closest number, not 1440p. Ppl have been using the idiotic term for over 10 years now.
4k is absolutely worth it if you go above 27 inches, and it’s what enables gaming on TVs of 40inches+.
An LG OLED sh*ts on every gaming monitor by a wide margin especially when you consider the pricing of similar monitors, and you need 4k at that size.
Your article was clear, useful, and easy to understand. Well done!
If this runs like garbage even on a 4090… this game is probably running at 900p on consoles.
What is the author of this article talking about? I played the game on a Ryzen 7 5800X and an RTX 4070 at 2560×1440, mostly low settings, with DLSS 2 and DLSS 3 frame generation, and it was mostly running above 100 fps.
Read the article; it clearly states 4K native, without DLSS.
John’s article about the 4090 lasting a lifetime goes down the drain. Why not a new 3000-euro card that is “value” to the gamers?
What I see is that when UE5 is associated with Lumen and Nanite, the game is a total mess, so who is the culprit here: UE5, or Lumen and Nanite?
Fortnite has all the bells and whistles of UE5, and it can scale to run just fine on any hardware. Sadly, the UE5 guys are way more hands-off towards developers who license the engine when it comes to optimisation (can’t blame them, because there are a lot of them). Devs are just let loose, unless they are good developers like Ninja Theory or The Coalition, who dedicate their time to fully understanding UE5’s quirks, bells, and whistles.
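That scaling mostly comes down to UE’s stock scalability groups, which map one preset slider onto many individual cvars. A rough sketch of what a UE5 title’s low preset can look like (the group names are stock UE; the values are illustrative, not Fortnite’s actual ones):

```ini
; GameUserSettings.ini sketch — UE scalability groups a UE5 title can drive.
[ScalabilityGroups]
sg.ResolutionQuality=66.6      ; render-resolution percentage
sg.GlobalIlluminationQuality=0 ; at 0, UE5's low preset drops Lumen GI entirely
sg.ShadowQuality=1
sg.EffectsQuality=1
```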
i laugh at the idiots that bought this
Unreal 5
Remnant 2 also runs like a*s.
As a spiritual person, I warn you in the most loving way to be careful what you meditate on. I know it’s just a game, but it has real elements of spells and names of real demons, and these things can ‘haunt’ you. They bring sickness (mental and physical), try to get you into accidents, and also affect your relationships. Jesus is Lord over all these things; stay close to Him.
Shame, I was hoping this would be good; it looked so promising.
Maybe but I don’t think we should judge that based on a sh**ty game that sucks in every single aspect.
Why would you expect a game this bad to be optimized?
Somebody’s gotta be lying out of everyone saying this is unplayable, ’cause I have an i7-11700K with an RTX 3060 and 32GB of RAM. I have over 10 hours of play time and have had zero lag and zero performance issues.
This is where DLSS 3 comes into play. From benchmarks I’ve seen, it’s a rock-solid 65-75fps with DLSS 3 at 4K. The game looks amazing, and UE5 is once again delivering the goods. This is one of those rare games that will look even better at 8K resolution, hopefully something a 50-series NVIDIA card like the 5090 can run well.
Getting 65-75 WITH FG is not good, you can feel the added lag at that framerate and the anomalies also start showing up more and more. You need a final fps of at least 90+ for FG to feel good.
The game runs absolutely BEAUTIFULLY on my ROG Strix Scar G533ZS. A hybrid of high/ultra with maybe 2 things on low. I get between 90 (heavy, heavy battle scenes) and about 135 (for 90% of the game so far) fps. I might be missing a detail; I’m here to educate myself. Mine is 2560×1440, and at this exact second I’m at 118fps. I have a 3090 and 64GB RAM. A 12th-gen Intel, I believe. Is it a bug, possibly? Again, I apologize if I’m coming off as an idiot lol. I’m pretty new to PC gaming as a whole.
Nvidia shouldn’t have ditched SLI.
A second 4090 would be able to handle it no sweat.
Odd, cause at 3440×1440 at max settings I’m getting 165 fps.
I guess I can’t say I’m having the same problems; I’m playing on PS5 and haven’t had any gameplay problems. Sure, I don’t care to see a woman with arms as big as Hulk Hogan’s that you’d never see in real life, but whatever. What upset me was the hoops I had to jump through to get the pre-order and deluxe edition items. This whole thing about expired credentials and my account being linked to a deleted account? I had to live chat with 3 different people, and it took about an hour to get it figured out; otherwise the 10 extra dollars and the pre-order would be all for naught. I find it a pretty crappy way to force people onto a service.
What’s funny is that over the past few years realism isn’t getting better; graphics are hitting their peak, and it should be reasonable to assume a 4090 can carry you through 3 years of AAA games.
What we’re seeing is questionable graphics with worse and worse performance…. It’s almost like GPU vendors are paying companies to code badly!?
Starfield is another “10 years in the making” game that isn’t even co-op… Like, what the fk, Bethesda? I honestly thought the cash-cow 76 membership sub would have pushed it online, but nah.
Seriously, gpu vendors are either leaning, or coders are just getting giga lazy for a quick buck.
The gaming industry has never been in a worse spot.