Avowed Benchmarks & PC Performance Analysis

Microsoft has released Avowed in Early Access. The game is powered by Unreal Engine 5, and it uses both Lumen and Nanite. So, it’s time now to benchmark it and examine its performance on the PC.

For this PC Performance Analysis, we used an AMD Ryzen 9 7950X3D, 32GB of DDR5 at 6000MHz, AMD’s Radeon RX 6900XT and RX 7900XTX, as well as NVIDIA’s RTX 2080Ti, RTX 3080, RTX 4090, RTX 5080 and RTX 5090. We also used Windows 10 64-bit, the GeForce 572.42 and the Radeon Adrenalin Edition 25.2.1 drivers. Moreover, we’ve disabled the second CCD on our 7950X3D.

Avowed CPU scaling | Avowed CPU scaling-2

Obsidian has added a few graphics settings to tweak. PC gamers can adjust the quality of View Distance, Shadows, Textures and more. Plus, there is day-1 support for both AMD FSR 3 and NVIDIA DLSS 4. However, there is no support for Intel’s XeSS. So, that’s kind of a bummer.

Avowed PC graphics settings-1 | Avowed PC graphics settings-2

Before continuing, I should mention a weird game behavior. On AMD hardware, the game enables AMD FSR 3 by default, and, similarly, on NVIDIA hardware it enables DLSS Quality Mode by default. Even if you disable it, the game will re-enable it the next time you launch it. To game at Native Res, you’ll have to select FSR/DLSS, enable it, and then turn it off. Here is a video that showcases this awkward behavior. I was able to replicate it multiple times on all of our RTX GPUs.

Avowed Graphics Settings Bugged
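If you want to double-check what the game actually writes to its config after you toggle the upscaler, here is a minimal Python sketch that simply prints any upscaling-related lines from a UE-style GameUserSettings.ini. To be clear, this is purely illustrative: the config path is a guess based on how Unreal Engine titles usually store user settings (Avowed’s exact folder name is an assumption on my part), and the search terms are generic keywords, not confirmed setting names.

```python
# Illustrative only: print upscaling-related lines from a UE-style GameUserSettings.ini.
# The path below is an assumption -- UE games usually keep user settings under
# %LOCALAPPDATA%\<ProjectName>\Saved\Config\..., but Avowed's exact folder name is not confirmed.
import os
from pathlib import Path

config = (Path(os.environ.get("LOCALAPPDATA", ""))
          / "Avowed" / "Saved" / "Config" / "Windows" / "GameUserSettings.ini")
keywords = ("FSR", "DLSS", "XESS", "UPSCAL", "SUPERRES")  # generic search terms, not confirmed keys

if config.exists():
    for line in config.read_text(errors="ignore").splitlines():
        if any(key in line.upper() for key in keywords):
            print(line.strip())
else:
    print(f"Config not found at {config} -- adjust the path for your install.")
```

Running it before and after applying the in-menu workaround should tell you whether your “upscaling off” choice actually survives a restart.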

Even though it uses Unreal Engine 5, Avowed is one of the most multi-threaded games we’ve seen lately. Avowed can effectively use more than 8 CPU threads. Thus, I highly recommend enabling Hyper-Threading/SMT for this title. Also, since this can be a really CPU-bound game, I suggest disabling your background programs (which may hinder your overall performance).

Avowed CPU benchmarks
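If you want to verify the CPU scaling claim on your own system, here is a minimal sketch (not part of our benchmarking toolchain, just an illustration) that samples per-core utilization with the third-party psutil package while the game is running.

```python
# Illustrative per-core CPU load logger. Requires the third-party psutil package (pip install psutil).
import psutil

SAMPLES = 30  # take thirty one-second samples while playing a CPU-bound scene
for i in range(SAMPLES):
    # cpu_percent(percpu=True) returns one utilization value per logical core
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    busy = sum(1 for load in per_core if load > 50.0)
    print(f"sample {i + 1:02d}: {busy}/{len(per_core)} logical cores above 50% load")
```

In a truly multi-threaded title like this one, you should see well over eight logical cores under sustained load in CPU-bound areas.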

At 1080p/Epic Settings/No RT, the game will require a really powerful GPU for gaming with over 60FPS at all times. The NVIDIA RTX 3080 and AMD Radeon RX 6900XT can provide a smooth experience, provided you use a G-Sync/FreeSync monitor.

Avowed GPU benchmarks-1

Avowed is a game that appears to favor AMD’s hardware. As we can see, the AMD Radeon RX 6900XT matches the performance of the NVIDIA RTX 3080, and the AMD Radeon RX 7900XTX matches the performance of the NVIDIA RTX 5080.

Now although I used the term “No RT”, the game does use Lumen by default. And, as I’ve said multiple times, Lumen is a form of Ray Tracing. The in-game “Ray Tracing” setting simply enables Hardware Lumen. From what I’ve seen, the performance cost of Hardware Lumen on the RTX 5090 is around 5-7FPS.
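For context, in stock Unreal Engine 5 the switch between Software and Hardware Lumen is exposed through the r.Lumen.HardwareRayTracing console variable. The sketch below appends that cvar to a user-side Engine.ini override so you can experiment outside the menu; treat it as a rough sketch only, since the config path is an assumption and I have not confirmed that Avowed honors user Engine.ini overrides (back the file up first).

```python
# Rough sketch: append a Hardware Lumen override to a user-side Engine.ini.
# r.Lumen.HardwareRayTracing is a stock UE5 console variable; whether Avowed honors
# user Engine.ini overrides is NOT confirmed, and the folder name below is an assumption.
import os
from pathlib import Path

engine_ini = (Path(os.environ.get("LOCALAPPDATA", ""))
              / "Avowed" / "Saved" / "Config" / "Windows" / "Engine.ini")
override = "\n[SystemSettings]\nr.Lumen.HardwareRayTracing=1\n"  # 1 = Hardware Lumen, 0 = Software Lumen

engine_ini.parent.mkdir(parents=True, exist_ok=True)
with engine_ini.open("a", encoding="utf-8") as handle:
    handle.write(override)
print(f"Appended Hardware Lumen override to {engine_ini}")
```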

At 1440p/Epic Settings/No RT, you’ll need at least an AMD Radeon RX 7900XTX or an NVIDIA RTX 5080 for gaming with over 60FPS. As for Native 4K/Epic Settings/No RT, there is no GPU that can hold over 60FPS at all times.

Avowed GPU benchmarks-2 | Avowed GPU benchmarks-3

The good news here is that you can use the in-game settings to gain some performance back. At Native 4K/High Settings, our NVIDIA RTX 5090 was able to push over 75FPS at all times. So, we got a 39% performance boost by dropping to High Settings. Medium Settings and Low Settings offer an additional 11% and 20% performance increase, respectively.

Avowed GPU benchmarks-4
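To put those percentages in perspective, here is a quick back-of-the-envelope calculation using the RTX 5090’s Native 4K numbers. Two assumptions on my part: I treat the “over 75FPS at all times” figure as a 75FPS floor, and I read the 11% and 20% gains for Medium and Low as both being relative to High Settings.

```python
# Back-of-the-envelope check of the quoted scaling figures (illustrative arithmetic only).
high_floor = 75.0                  # RTX 5090, Native 4K/High: "over 75FPS at all times"
epic_floor = high_floor / 1.39     # High is quoted as a 39% boost over Epic
medium_floor = high_floor * 1.11   # Medium: an additional 11% (assumed relative to High)
low_floor = high_floor * 1.20      # Low: an additional 20% (assumed relative to High)

print(f"Epic   ~{epic_floor:.0f} FPS")    # ~54 FPS
print(f"High    {high_floor:.0f} FPS")    #  75 FPS
print(f"Medium ~{medium_floor:.0f} FPS")  # ~83 FPS
print(f"Low    ~{low_floor:.0f} FPS")     # ~90 FPS
```

In other words, dropping from Epic to High takes the RTX 5090 from roughly the mid-50s to the mid-70s at Native 4K, and Medium/Low push it toward the 80-90FPS range.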

Graphics-wise, Avowed looks amazing. To be honest, I was a bit disappointed with the first beach area. But once I explored its environments and went into some dungeons, the game quickly changed my mind. Thanks to Lumen, the lighting looks incredible in almost all areas. As said, Avowed also uses Nanite, meaning that there are minimal pop-ins. The only downside is the characters which, although they look great, are nowhere near those we’ve seen in other UE5 games. Characters have not been the strongest point of Obsidian’s titles (in terms of graphics). Still, they look WAY better than those of The Outer Worlds.

Before closing, I should mention how good DLSS is in this title. Here are two screenshots. The first is at Native 4K and the second is with 4K DLSS Quality. As you will see, they look the same. Hell, if you zoom in, you’ll see that the DLSS image is a bit sharper. This is a game that greatly benefits from DLSS Super Resolution without any image degradation, so make sure to use it.

Native 4K | 4K DLSS Quality

Oh, and in case you’re wondering, Avowed does not suffer from any shader compilation stutters. When you first launch it, the game will pre-compile its shaders. I also did not experience any major traversal stutters. Everything felt smooth. So, kudos to the devs for offering an almost stutter-free UE5 game.

All in all, Avowed is a really demanding game on Max Settings. Since it uses Lumen and Nanite, this shouldn’t really come as a surprise. The good news is that you can get a huge performance boost by dropping your settings, so the game is scalable. Moreover, it can scale incredibly well on modern-day multi-core CPUs. In terms of performance, the game runs exactly like Hellblade 2. However, it does not look as mind-blowing as that game did. Still, Avowed is not a bad-looking title. Far from it. This is a really beautiful game. Plus, it does not suffer from stutters. So, to my surprise, this is a really polished and optimized game on PC.

Avowed - NVIDIA RTX 5090 - 8K/DLSS 4/Epic Settings/Ray Tracing

43 thoughts on “Avowed Benchmarks & PC Performance Analysis”

    1. That gameplay, especially the combat, is dog water. Dark Messiah from 2006 has far superior combat that is also fun and satisfying.
      This thing doesn't even have an impact hit; that frame pausing is bad.
      Magic combat, well, that's done nicely.

    2. One of your companions is as gay as they come. You'd have to mod him out. More than one gay reference too aside from him. Pretty fruity game.

  1. i don't get it about nanite. it was not about pop-ins. it was advertised as meaning unlimited polygons. so do we have unlimited polygons or not? if we have unlimited polygons and path tracing, why can't we make cgi-quality games?

    1. It takes A LOT OF WORK to create the high-polygon assets. What you see in games that use Nanite is the highest level of detail of those assets. In short, that's the best the artists have created for said games. That's also why a lot of studios use packages from the Epic store (like Quixel Megascans).

      1. thanks for the answer. yeah, but i expected more from nanite, something like that ue5 matrix demo and even crazier than that. i mean, devs literally have no polygon limit? they have no limits? what stops them? they can make games with 100x more polygons than cyberpunk 2077?

        1. They have a polygon limit, but it is imposed by themselves. It's called a resource budget, and I'm not talking about money; I'm talking about CPU, GPU, RAM. They aim for a spec sheet and work around it.

    1. Is this trash worth pirating? It looks too feminine for my taste. First the gay pirate theme with pillars 2 and now this colorful slop.

  2. In the opening scene, looking towards the island, on 7900 XTX, I was doing 105-110 at 1440p, max with RT on. Not sure how much heavier later scenes are, given John is seeing 86 fps at 1080p without RT.

    1. Yeah idk where tf they're getting their stats from. I'm using a 4080 Super and at 4K Native with Epic Settings without RT, I get like 60 FPS in the area after you save that girl and talk to those two people who meet you at the docks.

    2. The game by default uses DLSS/FSR whenever you launch it (even if you've disabled it). To get Native 4K, each and every time you launch the game, you'll have to select DLSS/FSR, apply settings, and then disable Upscaling.

      There is a video that proves it. I can replicate it each and every time. And for the love of God, instead of just looking at the graphs, read the whole article :P.

  3. Before continuing, I should mention a weird game behavior.

    It's Obsidian, so I suspect there will be a large number of bugs and they'll be slow at fixing them, if the past titles – chiefly Pentiment – are any indication.

    I won't bother with this anyway; very demanding graphics aren't a motivation for me. I may give it a try out of curiosity in a few years when it's at sale price and when/if I have a new high-end system.

  4. I just noticed the cover art has the trans flag colors. Not sure if just coincidence or intentional. But it's obsidian so it might be on purpose

  5. no, it's another nonsense un-optimized game actually, it's not about multi-CPU demands. how the hell can you be excited about the 5090 card? what a waste of money when all you do is use DLSS anyway. I can easily get the same results with my 4090 OC card. Not sure what their game is other than annoying pc gamers.

  6. Correct me if I’m wrong, but a 6900 XT matching a 3080 does not a ‘favors AMD hardware’ game make. I think the 5080 is just really underperforming in this title.

  7. it's not that demanding, to be honest. with almost all options on ultra and with hardware RT, my 3080 Ti with OC runs this quite smooth at 4K with DLSS Performance, 55-80+ fps.

    The only options I changed are draw distance and shadows, to high.

  8. I don't like your journalism approach here John. You are behaving like the RTX 4090 doesn't even exist. That is simply unacceptable. No excuse/work ethic can justify this approach.
