Immortals of Aveum cannot run at Native 4K/60fps on NVIDIA RTX 4090, even on Low Settings

Immortals of Aveum is a new single-player magic FPS that is powered by Unreal Engine 5. The game uses both Lumen and Nanite, and for gaming at native 4K, it requires GPUs that have not been released yet. Even the mighty NVIDIA GeForce RTX 4090 is unable to offer 60fps at Native 4K… on LOW SETTINGS.

For our initial benchmarks, we used an AMD Ryzen 9 7950X3D, 32GB of DDR5 at 6000MHz, and NVIDIA’s Founders Edition RTX 4090. We also used Windows 10 64-bit and the GeForce 537.13 driver. Moreover, we’ve disabled the second CCD on our 7950X3D.

Immortals of Aveum does not feature any built-in benchmark tool. Thus, for our GPU benchmarks, we used the first arena area (during which Jak awakens his powers). This area features numerous enemies and lots of particles on screen, so it can give us a pretty good idea of how the game performs during its combat sequences.

At Native 4K/Low Settings, the NVIDIA GeForce RTX 4090 drops to 34fps during our benchmark sequence. That’s on LOW SETTINGS. Let me repeat that. LOW SETTINGS. I don’t really know what Ascendant Studios was smoking while developing it, but I certainly would like some. This is inexcusable performance for a game that looks the way Immortals of Aveum does. Seriously, when your game cannot run at Native 4K/Low Settings with 60fps on a beast of a GPU like the RTX 4090, you know that you have majorly f’ed things up.

[Gallery: Immortals of Aveum 4K/Low Settings screenshots]

Performance is all over the place on both AMD’s and NVIDIA’s hardware. For instance, at Native 1080p/Ultra, the only GPUs that can offer a constant 60fps are the AMD Radeon RX 7900XTX and the NVIDIA RTX 4090. The NVIDIA RTX 3080 runs the game with a minimum of 40fps and an average of 46fps. At 1080p. Ouch.

Our PC Performance Analysis for this game will go live in a couple of days. Until then, though, we wanted to warn you about its performance/optimization issues. Yes, you can use DLSS/FSR to further increase performance. However, these upscaling techniques are being used as a crutch in this particular case. I mean, we all saw that coming when Ascendant revealed the game’s PC requirements.

Before closing, I should say that I wouldn’t mind these performance figures if Immortals of Aveum offered graphics comparable to The Matrix Awakens tech demo. It does not, though. And this isn’t the next “Crysis” game.

I seriously hope that the developers will further optimize it via post-launch updates. Nanite and Lumen are great new techniques. Nanite, in particular, makes a huge difference. However, there should be settings to scale down a UE5 game. It’s inexcusable for a game like this to run so badly at Native 4K/Low Settings on an RTX 4090.
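For what it’s worth, UE5 itself exposes scalability knobs for both Lumen and Nanite. As a sketch, here are the kinds of Engine.ini overrides tweakers usually try (the cvar names come from Epic’s documentation; we have not tested whether Immortals of Aveum honors user overrides):

```ini
; Engine.ini -- example UE5 scalability overrides (cvar names per Epic's docs).
; Untested with this particular game; treat as a starting point only.
[SystemSettings]
r.DynamicGlobalIlluminationMethod=2   ; 1 = Lumen GI, 2 = cheaper SSGI
r.ReflectionMethod=2                  ; 1 = Lumen reflections, 2 = cheaper SSR
r.Nanite.MaxPixelsPerEdge=2           ; default 1; higher = coarser Nanite detail
```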

Stay tuned for more!

97 thoughts on “Immortals of Aveum cannot run at Native 4K/60fps on NVIDIA RTX 4090, even on Low Settings”

  1. I hope nobody purchased this game. It would be nice if every gamer would not buy a game till the devs fix it 😀

  2. Literally called it, awaiting their “apology” now:

    https://uploads.disquscdn.com/images/7f77680259d91d608b8dca8f4403e950e4323be8ad1d1af6584d9dc21be3b8ca.jpg

    Is it really any surprising though? It’s a gayme made by anti-video game activists who’d proudly push their Satanic degeneracies on your face through more dogcrap products masquerading as “entertainment products” like this one, it doesn’t take a genius to figure this out.

    This kosher dogcrap apparently already flopped (deservedly so), on to the next Satanic heterophobic propaganda brainwashing “entertainment product.”

        1. Yeah like half the site lmao. I hate woke garbage as much as the next guy but I hate even more how the opposition are religiously zealous retards 9 times out of 10.

    1. whaaat? it just poorly optimized game and U just now pushing another agenda on top of agenda agenda and being same cis snowflake as some trannysnowflake. at this point all these labels are silly.

  3. Like I said, it’s like they made this game for a very small group of rich upper-middle-class woke hipsters and no one else.

    1. Interestingly enough, part of the performance problem is a limitation of current low-level APIs like DX12 as used by UE5.

      To overcome these limitations, AMD of all companies just recently created an experimental extension for the Vulkan API, VK_AMDX_shader_enqueue, whose description specifically mentions UE5’s Nanite as the motivation.

      Here’s the detailed description:

      Applications are increasingly using more complex renderers, often incorporating multiple compute passes that classify, sort, or otherwise preprocess input data. These passes may be used to determine how future work is performed on the GPU; but triggering that future GPU work requires either a round trip to the host, or going through buffer memory and using indirect commands. Host round trips necessarily include more system bandwidth and latency as command buffers need to be built and transmitted back to the GPU. Indirect commands work well in many cases, but they have little flexibility when it comes to determining what is actually dispatched; they must be enqueued ahead of time, synchronized with heavy API barriers, and execute with a single pre-recorded pipeline.

      Whilst latency can be hidden and indirect commands can work in many cases where additional latency and bandwidth is not acceptable, recent engine developments such as Unreal 5’s Nanite technology explicitly require the flexibility of shader selection and low latency. A desirable solution should be able to have the flexibility required for these systems, while keeping the execution loop firmly on the GPU.
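      To make the quoted limitation concrete, here is a minimal sketch of the classify-then-indirect-dispatch pattern it describes (the Vulkan calls are real; the handles, sizes, and names are illustrative):

      ```cpp
      // Minimal sketch of the "indirect commands" pattern the spec text above
      // describes. Assumes valid Vulkan handles already exist.
      #include <vulkan/vulkan.h>

      void recordClassifyThenDispatch(VkCommandBuffer cmd,
                                      VkPipeline classifyPipeline,
                                      VkPipeline workPipeline,
                                      VkBuffer   argsBuffer)
      {
          // Pass 1: a compute shader classifies the input and writes a
          // VkDispatchIndirectCommand {x, y, z} into argsBuffer.
          vkCmdBindPipeline(cmd, VK_PIPELINE_BIND_POINT_COMPUTE, classifyPipeline);
          vkCmdDispatch(cmd, 64, 1, 1);

          // The "heavy API barrier": shader writes must be made visible to the
          // indirect-command read before the next dispatch can consume them.
          VkMemoryBarrier barrier = {};
          barrier.sType         = VK_STRUCTURE_TYPE_MEMORY_BARRIER;
          barrier.srcAccessMask = VK_ACCESS_SHADER_WRITE_BIT;
          barrier.dstAccessMask = VK_ACCESS_INDIRECT_COMMAND_READ_BIT;
          vkCmdPipelineBarrier(cmd,
                               VK_PIPELINE_STAGE_COMPUTE_SHADER_BIT,
                               VK_PIPELINE_STAGE_DRAW_INDIRECT_BIT,
                               0,
                               1, &barrier, 0, nullptr, 0, nullptr);

          // Pass 2: the dispatch SIZE is now GPU-driven, but the pipeline was
          // fixed when the command buffer was recorded. There is no GPU-side
          // shader selection -- exactly the inflexibility shader enqueue targets.
          vkCmdBindPipeline(cmd, VK_PIPELINE_BIND_POINT_COMPUTE, workPipeline);
          vkCmdDispatchIndirect(cmd, argsBuffer, 0);
      }
      ```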

      1. Sorry, but D3D12 is more efficient than Vulkan on Windows 10 in every game I’ve ever seen. The FPS problem with the game has to do with devs not taking the time to do the hundreds of little tweaks and “optimizations” to make the game run at better FPS. They’re using DLSS and FSR as an excuse to not bother spending that extra time making sure the game runs well.

        As for what you quoted, it doesn’t seem to have anything to do with DirectX at all. Are you sure the point behind this new Vulkan extension doesn’t have more to do with Vulkan having issues with Nanite than with DirectX?

        1. First of all, why do these game companies choose to develop their games with UE5 in the first place?

          Because Epic Games takes care of all the low-level technical code like the renderer for them.

          Therefore, you are wrong to assume that the poor performance here has anything to do with the game developers, because the poor rendering performance with DX12 is squarely on Epic Games.

          As for your belief that DX12 is more efficient than Vulkan on Windows 10, here are recent benchmarks with RDR2:

          https://uploads.disquscdn.com/images/47062b4c3e847f29269c5bc7c7d3a8577f7723deb1d0a6fb1ddbe1872f0be8b5.jpg

          And you know what’s really funny about that Vulkan renderer of RDR2?

          It only exists because Google paid Rockstar a truckload of money so that they would port it over to Stadia, which required Vulkan support, since it had to run natively on Linux.

          And even though that port was only an afterthought, it performed better than the optimized DX12 renderer, since that one also needed to perform decently on the Xbox.

          And have you ever thought about why id Software went out of their way to write a PC-exclusive Vulkan renderer for their id Tech engine, even though that same engine already has a DX12 renderer courtesy of the Xbox version?

          1. Not to mention DXVK helps with a lot of unoptimized DirectX games, even in its wrapper form: GTA 4, Sekiro on AMD cards, Borderlands 2, Kingdoms of Amalur, just to name a few. Developers won’t use it for the same reason they will keep using Unreal Engine: they just don’t care about optimization for PC.

          2. That’s only because the devs who made those games did a piss-poor job designing them. DXVK hurts the performance of every game I’ve tried it with on Windows.

          3. Quite the opposite. I already gave examples above; you can try looking them up on YouTube. In my experience, GTA 4 and Sekiro performed better with DXVK.

          4. Those are examples of games where the devs did “a piss-poor job designing them”, as I said before. In all other games that I’ve seen DXVK hurts performance on Windows.

            Just because DXVK helps a handful of poorly designed games perform better, that doesn’t mean it works that way in every game.

            DXVK isn’t magic, and it doesn’t replace the game’s usage of DirectX with Vulkan. It’s a translation layer, meaning that the game tries to use DirectX and DXVK translates the DirectX calls into Vulkan calls, which means increased overhead and lower performance. I’ve tried various versions of DXVK in a number of different games, both old and new, and lower FPS is the result every time. It can also cause worse frame pacing and can introduce additional stuttering, although that seems more likely to happen in newer games.
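            To illustrate the point, here is a hypothetical, heavily simplified sketch of what any D3D-to-Vulkan layer has to do on every draw call (this is NOT DXVK’s actual code; all names below are made up for illustration):

            ```cpp
            // Hypothetical sketch of a D3D->Vulkan translation layer's draw path.
            #include <vulkan/vulkan.h>

            struct TranslatedContext {
                VkCommandBuffer cmd;        // Vulkan command buffer being recorded
                VkPipeline      pipeline;   // Vulkan pipeline matching current D3D state
                bool            stateDirty; // has D3D state changed since the last draw?

                // The game thinks it is issuing a D3D-style Draw(). The layer must
                // first re-validate any D3D state changes and map them onto Vulkan
                // objects, then record the equivalent Vulkan draw. That per-call
                // bookkeeping is the overhead described above.
                void Draw(uint32_t vertexCount, uint32_t startVertexLocation) {
                    if (stateDirty) {
                        // e.g. look up (or build) a VkPipeline for the current
                        // shader/blend/rasterizer combination
                        vkCmdBindPipeline(cmd, VK_PIPELINE_BIND_POINT_GRAPHICS, pipeline);
                        stateDirty = false;
                    }
                    vkCmdDraw(cmd, vertexCount, /*instanceCount*/ 1,
                              startVertexLocation, /*firstInstance*/ 0);
                }
            };
            ```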

          5. I have RDR2. Let me install it and run some benchmarks.

            As for your statements about game performance, game engines can have issues of course, but you’d be surprised how many things factor into game performance. The number of polygons in meshes, the number of transparent textures and meshes in a scene, the number of light sources in a scene, the engine configuration, etc. The list is seemingly endless, and these are all things that the game devs control. When a game performs badly, it is always the fault of the game developers. For instance, they could have used forward rendering instead of deferred rendering, and gotten a 20% performance uplift simply from that (according to UE5 documentation).
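            For reference, that forward-vs-deferred choice is a one-line project setting in Unreal. A sketch of the DefaultEngine.ini entry (a real UE setting, though it requires rebuilding shaders, and the actual uplift depends entirely on the game’s content):

            ```ini
            ; DefaultEngine.ini -- enables UE's forward shading renderer (desktop)
            [/Script/Engine.RendererSettings]
            r.ForwardShading=True
            ```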

          6. RDR2 does seem to have higher FPS in Vulkan mode, however FPS seems more consistent in DirectX 12 mode while watching the benchmark run. That being said, the DirectX 12 mode had issues and the game would crash with an out of memory error if the RTSS overlay was enabled. Async compute also seems to be disabled by default in DirectX 12 mode, and there’s no apparent way to enable it in-game. One other interesting note is that both Vulkan and DirectX 12 performed better in earlier benchmarks, and then worse in later benchmarks (at the same quality presets), which does not seem to have been related to background processes using resources or to CPU/GPU or room temps (they were all under control, and room temps were dropping over time). My best guess is that this game has engine issues that Rockstar has not bothered resolving, which isn’t surprising since from what I hear GTAV is the same.

            RDR2 is also not an Unreal Engine game, so benchmarks in it don’t really prove or disprove anything about Unreal Engine.

            All of that being said, this is the first game I’ve seen where Vulkan outperformed DirectX 12. Usually Vulkan is 15-20 FPS lower than DirectX 12 with the same settings in the same game. This could indicate a DirectX 12 implementation issue in RDR2, or perhaps they have one or more inefficient HLSL shaders (Vulkan consumes SPIR-V bytecode, typically compiled from GLSL, while DirectX uses the HLSL shader language, so the same shader source doesn’t run unmodified in both). There are other games that have DirectX 12 implementation issues (The Division games are an excellent example, as they have crashing issues possibly due to a broken shader cache for DirectX 12).

            Anyway, one game performing better in Vulkan than in DirectX 12 doesn’t prove anything. There are too many other games where DirectX clearly outperforms Vulkan.

          7. So wrong. You can add all you like to an engine, but if people don’t implement it correctly, it goes to hell, and that’s proven.

            Go look at Unity, where you see amateur idiots making awful games that don’t work at all, and they cry foul at its AI or optimisation issues, especially in relation to meshes.

            And then you look at Rust and you’re scratching your head, confused about what these apes are on about.

    1. yeah what are you a gamer? Gamers are dead incel, videogames are art, cinematic hollywood movie art made for upstanding moral woke californians not for gamers.

      You better be excited for alan woke 2 where alan wake looks like john wick because this is what the hipster actor who plays him looks like now and its really important that character models in the game look like their voice actors like how in the dead space remake issac looks like a completely different person because thats how the voice actor looks like and his girlfriend is now 30 year older because thats what the voice actor looks like, just like the calysto protocol the games are not movies and the voice actors must be the same race as the characters because you may not be able to see them but thats important thats why the voice actors of yakuza and shadow warrior had to be fired so they can hire…2 asian youtubers, its not for cutting costs, the same way they raceswap super hero in comics so they can argue that is a new character so they dont have to pay the original creators of those comics, honest, its because they are so woke and progressive, speaking of which, no white characters will be played by black actors and will be raceswapped and blackwashed, like say god of war and dont you dare call this game woke or the approve anti sjw anti woke youtubers are gonna brand you a incel, you can only like the slop or critique the cringiest parts of it and of course you must buy the latest product and get excited for the next product.

      BUY THE 5000 DOLLAR RTX 5090 so you can upscale frame gen and fake skip into a playable framerate. Now of course you might be asking if all those youtubers who have made a career promoting these garbage games and the youtubers who are too friendly with the dev studios and became friends with exclusive cover, are these people shills? Nonsense its just that they love the industry so much that they get jobs in it by being good little corporate slave woke deepthroaters. Imagine thinking the video game industry is about making fun videogames, what are you a trump supporter?

      What we were talking about? Ah yes alan woke 2 in which they put all that effort for a game that is a bait and switch and alan wake will be replaced my shaniqua woke. BUY THE NEXT WOKE SLOP AND GET EXCITED FOR THE NEXT WOKE SLOP….YOU BETA TESTER. Speaking of which all the people who bought baldurs gate 3 in early access for full price are beta testers.

  4. its not tho, devs are lazy that all there is to it. You can run tech demos(playable) that look 1000x better with no problems or build your own game that will look better and run good even on older gpus using UE5. Issue is not with engine itself rather implementation of 3rd party tech and simply garbage code of diversity hires.

    1. Tech demos have barely any gameplay: no AI, no calculations, no complex systems. They’re just… graphics, thank god they run normally.
      This is not a good way to measure performance at all.

          1. .14 this is 14 cents
            14.xxxxxxx this is 14 dollars
            Clearly you both need to go back to school.

          2. Well, time to get back to school then.
            You and the other dude who doesn’t know how to read numbers.

          3. Thanks for calling me out bro, I couldn’t have done it without you.
            Now, let’s talk about reading some numbers like 10 year old kids do.

          4. Are you sure you know what infront of means? Stop having interracial s3x, it’s affecting your brain.

          6. .14 this is 14 cents (behind)
            14.xx this is 14 dollars (in front)
            Clearly you both need to go back to school.

    1. Exactly why I’m buying Alan Wake 2, Forza Motorsport and Avatar, but skipping each and every single UE 5 game.

      1. That also runs like liquid dogshit and barely looks any better than the first game. The lighting is trash, and the animations are what you’d expect from a generic, talentless AA studio.

        1. I know it runs bad.
          But I thought that was done by some mid budget studio.
          Didn’t think it was some AAA game.

  5. Graphics are so underwhelming; it’s a clear case of poor optimization (like there’s any other reason most of the time). That game Fort Solis looks much better, though that’s not really optimized either and is more or less a walking simulator.

    1. already? To me this game was like forspoken, aside from a few woke kotaku journalist san fransisco hipsters, no one asked for this.

    2. Yeah looks like it’s a big flop.
      Max ~751 players on Steam, ~64% score.
      The number of sold copies must be very, very low.

  6. And you’re still simping for that messy engine at every occasion while its widespread usage in the industry is going to send PC gaming right into dark ages, ironically blind people like you still think this engine is something positive that is happening to gaming just because it doesn’t have pop-ins while in fact it’s just turning the whole industry into asset-flipping amateurism and counting on upscaling technique to compensate for sh*t coding and optimization, Jesus i want pop-ins back, and what i want the most is seeing you reporting what’s wrong with every game made on this engine instead of s*cking its D on daily basis

  7. Not to defend sh*tty coding by soy devs, but 4K is probably the most r3tarded fad ever to come to PC. Most hardware is simply not ready for this rubbish, and besides: Why? do you have robotic laser eyes and can see a difference between lets say 2K and 4K? what is this BS? Who did this? the hardware manufacturers? Or ret@rded OC nerds who don’t even play games just count frames? everyone needs a slap in their stupid faces. 2K is sharp enough. 4K 1diots should all be rounded up and put in a gorilla rape camp. What matters is graphics, art, narrative direction and sound quality in games. I f*ckn hate 4K snobs.

    And before the 4K Defense Force: No, 27-32″ display is perfectly fine, if you need bigger monitors you’re mentally ill

    1. the point of 4k is no jaggies, no need for AA, due to the higher resolution. now we got dlss and frame gen to “fix” that.

      1. I usually just put the rendering scale up to 1.2-1.4 and the jaggies disappear at 1440p without any AA

      2. never heard about jaggies not being present at 4K. wtf. I’m on 2K and jaggies are everywhere. does another 2K just simply eliminate them?

    2. 32″ is smaller then my d*ck. I have my PC hooked to 55″ OLED TV like any decent human would have (used to have 100″ from projector but constant changing of lamps was annoying so i settled for 55″ as its perfect for my current setup). I have no problem running even full rt games that are well made like metro ee in 4k and 120FPS on my rtx4080. Aside from cyberpunk in full rt mode and no dlss everything easily runs in 120+ FPS in native 4k.
      You hate 4k because you are too poor to have 4k, its all there is to it. Stop being butthurt your life will be better for it.

      1. I’m not poor. It’s not about money you shallow NPC. “muh you’re poor that’s because you don’t like sh*t I like” typical sign of a megalomaniac consumerist goodgoy. I’m just not a smoothbrain ret3rd like you, who feels lonely if he doesn’t have the biggest, newest best thing a goy could buy. 55″ holy f*ck this zoomer got some issues

    3. 1st off tard, 2k is 1080p if you round it to the closest number, not 1440p. Ppl have been using the idiotic term for over 10 years now.

      4k is absolutely worth it if you go above 27 inches, and it’s what enables gaming on TVs of 40inches+.

      An LG OLED sh*ts on every gaming monitor by a wide margin especially when you consider the pricing of similar monitors, and you need 4k at that size.

  8. What is the author of this article talking about? I played the game on a Ryzen 7 5800X and an RTX 4070 at 2560x1440, mostly low settings, with DLSS 2 and DLSS 3 frame generation, and it was mostly running above 100 fps.

  9. John’s article for the lifetime of the 4090 goes down the drain. Why not a new 3000-euro card that is value to the gamers?

  10. What I see is that when UE5 is associated with Lumen and Nanite, the game is a total mess. So who is the culprit here: UE5, or Lumen and Nanite?

    1. Fortnite has all the bells and whistles of UE5, and it can scale to run just fine on any hardware. Sadly, the UE5 guys are far more hands-off towards the developers licensing their engine in terms of optimisation (can’t blame them, because there are a lot of them). Devs are just let loose, unless they are good developers like Ninja Theory or The Coalition, who dedicate their time to fully understanding UE5’s quirks, bells, and whistles.

  11. As a spiritual person, I warn you in the most loving way to be careful what you meditate on. I know it’s just a game, but it has real elements of spells and names of real demons, and these things can ‘haunt’ you. They bring sickness (mental and physical), try to get you into accidents, and also affect your relationships. Jesus is Lord over all these things; stay close to Him.

  12. Maybe but I don’t think we should judge that based on a sh**ty game that sucks in every single aspect.
    Why would you expect a game this bad to be optimized?

  13. Somebody’s gotta be lying out of everyone saying this is unplayable, cause I have an i7-11700K with an RTX 3060 and 32GB RAM. I have over 10 hours of play time and have had zero lag and zero performance issues.

  14. This is where DLSS 3 comes into play. From benchmarks I’ve seen, it’s a rock solid 65-75fps with DLSS 3 at 4K. The game looks amazing and UE5 once again delivering the goods. This is one of those rare games that will look even better at 8K resolution, hopefully something 50-series Nvidia like the 5090 can run well.

    1. Getting 65-75 WITH FG is not good; you can feel the added lag at that framerate, and the anomalies also start showing up more and more. You need a final fps of at least 90+ for FG to feel good.
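      Rough arithmetic on why (assuming, as with DLSS 3 in general, one generated frame per rendered frame): 70fps displayed means only ~35fps are actually rendered, so input latency is roughly that of a 35fps game plus the frame-generation overhead. At 90+fps displayed, the ~45fps+ base rate is what keeps that lag tolerable.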

  15. The game runs absolutely BEAUTIFULLY on my ROG Strix Scar G533ZS. A hybrid of high/ultra with maybe 2 things on low. I get between 90fps (heavy, heavy battle scenes) and about 135fps (for 90% of the game so far). I might be missing a detail; I’m here to educate myself. Mine is 2560x1440, and at this exact second I’m at 118fps. I have a 3090 and 64GB RAM, with a 12th-gen Intel, I believe. Is it a bug possibly? Again, I apologize if I am coming off as an idiot lol. I’m pretty new to PC gaming as a whole.

  16. I guess I can’t say I’m having the same problems; I’m playing on PS5 and haven’t had any gameplay problems. Sure, I don’t care to see a woman with arms as big as Hulk Hogan’s that you’d never see in real life, but whatever. What upset me was the hoops I had to jump through to get the pre-order and deluxe edition items. This whole thing about expired credentials and my account being linked to a deleted account? I had to live chat with 3 different people, and it took about an hour to get it figured out; otherwise the 10 extra dollars and the pre-order would have been all for naught. I find it a pretty crappy way to force people onto a service.

  17. What’s funny is that over the past few years realism isn’t getting better; graphics are hitting their peak, and it should be reasonable to assume a 4090 can carry 3 years of AAA games.

    What we’re seeing is questionable graphics with worse and worse performance…. It’s almost like GPU vendors are paying companies to code badly!?

    Starfield is another “10yrs in the making” that isn’t even coop… Like what the fk Bethesda? I honestly thought the cash cow 76 members sub would have pushed it online, but nah.

    Seriously, gpu vendors are either leaning, or coders are just getting giga lazy for a quick buck.

    The gaming industry has never been in a worse spot.
