
First third-party benchmarks for AMD Radeon RX 6800XT surface, faster than NVIDIA GeForce RTX3090 in some rasterized games

It appears that AMD is back. The red team has lifted the embargo for the AMD Radeon RX 6800 and RX 6800XT and as we can see, AMD has finally delivered a competitive product. In fact, the AMD Radeon RX 6800XT can, in some rasterized games, surpass even the NVIDIA GeForce RTX3090.

Now I want to make it crystal clear that for the most part, the AMD Radeon RX 6800XT offers performance slightly lower than the RTX3080. However, there are some situations in which AMD’s GPU can surpass NVIDIA’s high-end RTX3090. Do note that AMD is also working on a more powerful GPU, the Radeon RX 6900XT. Thus, we should theoretically see that GPU beating the RTX3090 in more rasterized games.

For instance, the AMD Radeon RX 6800XT beats the NVIDIA RTX3090 in Assassin’s Creed Valhalla. Of course, this should not surprise us as Valhalla favours AMD’s GPUs.

Guru3D Assassin's Creed Valhalla

Now what’s really interesting is the amazing 1080p performance of the AMD Radeon RX 6800XT. Multiple websites have reported that at this resolution, the GPU can come close to the RTX3090. However, when raising the resolution to 1440p or 4K, the gap between these two GPUs widens. Both Hardware Unboxed and Guru3D reported these performance differences.

Hardware Unboxed AMD Radeon RX 6800XT benchmarks

AMD Radeon RX 6800 XT Benchmark Review, Smart Access Memory, Thermals & Gaming

Tomshardware also reports that the AMD Radeon RX 6800XT is faster in some games than the RTX 3080, but slower than the RTX 3090.

Tomshardware Far Cry 5
Tomshardware Forza Horizon 4

PCGamesHardware reports that the AMD Radeon RX 6800XT is faster than the RTX 3080 in Battlefield 5 at 1440p. On the other hand, in some games that favour NVIDIA GPUs (like Black Mesa), the RX 6800XT can be even slower than the RTX2080Ti.

PCGamesHardware Battlefield 5
PCGamesHardware Black Mesa

It’s also interesting looking at the 4K numbers, especially since RTX 3080 comes with 10GB of VRAM, whereas the RX 6800XT comes with 16GB of VRAM. Despite that, the NVIDIA GPU is noticeably faster in lots of games in 4K. Kitguru reports that in Ghost Recon Breakpoint, Watch Dogs Legion and Red Dead Redemption 2, the AMD RX 6800XT is slower than the RTX3080 in 4K.

Kitguru Ghost Recon Breakpoint
Kitguru Red Dead Redemption 2
Kitguru Watch Dogs Legion

Unfortunately, we haven’t seen a lot of Ray Tracing benchmarks. As Hardware Unboxed reported, in Dirt 5 (a game that favours AMD GPUs), the Ray Tracing effects run faster on the RX 6800XT. However, in other games (like Shadow of the Tomb Raider), the RX 6800XT performs similarly to the RTX2080Ti. Likewise, Eteknix reports that in Metro Exodus, the RT performance of the RX 6800XT is similar to that of the RTX2080Ti. So yeah, the Ray Tracing performance of the RX 6800XT appears to be inconsistent. Not only that, but AMD has not yet implemented its DLSS-like technique in any RT games.

Eteknix Metro Exodus
Eteknix Shadow of the Tomb Raider

Overall, AMD has finally delivered a competitive product. However, in the majority of games, it does not beat the RTX3080 or the RTX3090 at higher resolutions. We are also really puzzled by the 4K results. We don’t know whether this is a driver issue that AMD can resolve via future updates. After all, the RX 6800XT can outperform even the RTX3090 at lower resolutions. Additionally, its Ray Tracing performance does not appear to be that great, especially if we take into account NVIDIA’s DLSS tech. Not only that, but we haven’t seen FidelityFX Super Resolution in action. As such, we don’t know whether it will be as good as DLSS 2.0 or as awful as DLSS 1.0. Lastly, and at least for now, there are no benefits to the 16GB of VRAM.

63 thoughts on “First third-party benchmarks for AMD Radeon RX 6800XT surface, faster than NVIDIA GeForce RTX3090 in some rasterized games”

  1. I have been waiting to get my hands on an RTX 3080 since it launched, and I am quite sad to see these products being hoarded and scalped while people have to spend more time at home during a pandemic. I did, however, manage to get an RX 6800 XT this morning, so I am finally free of my 1080 Ti and back on team red since the Radeon HD 4870, heh. Hopefully they do something about super resolution, and hopefully Nvidia and AMD better plan their launches in the future so customers don’t have to deal with bots on launch day and aren’t left with few options over the holiday season.

    1. It’s not. From the summary, what I’ve got is:
      1080p = 6800XT severely outperforms the RTX 3000 series
      1440p = 6800XT on par with the RTX 3080
      4K = 6800XT performs below the RTX 3080
      (Although it costs less, you have to compromise on features like 2nd-gen RT performance and DLSS. You can say the RX 6800XT has SAM, but the performance gain is 1-2 fps in most games.)

      1. You don’t have to compromise squat; devs just need to release patches for their RT games, since AMD beats NVIDIA in Dirt 5 RT, which is an AMD-optimized game.
        SAM is still in a beta phase. NVIDIA wouldn’t jump on it if it were 1-2 fps.

        1. No, a patch couldn’t bring the performance if you do not have sufficient RT cores, which do the ray tracing for Nvidia. AMD now is more or less where Nvidia was on DXR when they first brought out ray tracing in 2018; with the RTX 30 series and its 2nd-gen RT cores, Nvidia has advanced far in ray tracing performance. The same can be said for AMD: they are new to it and will subsequently gain performance to compete with Nvidia. At that point it will be interesting to see what Nvidia comes up with, because they are already 2 years ahead of AMD on ray tracing!

  2. Not long now until Cyberpunk. Will you be crying inside when you try to turn on RT because you bought a 6800XT or having a great time on your 3080?

      1. Ah, that is why the industry is moving towards RT, because most people don’t want it? Or is it that you just don’t understand what RT is?

        1. RT is a gimmick.
          Like PhysX.
          You needed a dedicated Nvidia card to have PhysX, but then AMD could also do PhysX after some time and the tech just died…

          And now history repeats itself…

          So yeah, I know what’s going on

          1. Yeah, like colour TVs. Who’d want that ……

            You do realise this is Nvidia’s gen 2 RT, AMD’s attempting to do RT and Intel have planned RT? Hell even some of the next gen mobile SOCs are offering limited RT support.

          2. Yeah. Remember TVs with 3D tech? Where are they? Oh yeah, nowhere. Just like Nvidia HairWorks, cuz who wants a single graphics effect that will tank your FPS…
            Last gen’s gimmick was 4K (and 4K TVs and 4K this and that).
            Before that it was Kinect and garbage like that…

            Or are you still using the Xbox Kinect? Cuz you are a good sheep who eats everything the marketing team tells you?

          3. AMD, Nvidia, Intel and some mobile SOCs are providing hardware support for RT. BTW, who doesn’t have a 4k TV now?

  3. “We are also really puzzled at the 4K results. We don’t know whether this is a driver issue that AMD can resolve via future updates. After all, the RX 6800XT can outperform even the RTX3090 in lower resolutions.”

    Lower bandwidth and slower memory.

    1. Yeah, that should be a no-brainer for anyone.

      It beats the 6800 at 4K due to these and nothing else, and the gap is not that big.

    2. The 256-bit bus with the Infinity Cache is proving good at 1440p and 1440p UW, but not just yet at 4K. It has the ROP power. Guess we will see if drivers can work better in the future with the new cache…

    3. No, it’s not the bandwidth.
      As Steve from Hardware Unboxed pointed out in his review, it’s not that RDNA 2 scales badly with resolution but that Ampere scales better than other architectures.

      1. The answer lies most likely in the cache that’s designed to mitigate that slower memory: it keeps losing effectiveness as the VRAM fills up, i.e. super efficient when you need it the least, and most of its capability gone when you need it the most…

        That’s my No. 1 guesstimate as the culprit for the poor scaling: as resolution, assets etc. all get bigger, its “low res, low asset” performance just isn’t there anymore.

        Suspect this series will not age like fine wine… assets will go up with next-gen games

        1. No, “poor scaling” in RDNA2 does not exist. It scales just like Turing or RDNA1 or Pascal etc.
          It’s that Nvidia’s high-end GPUs don’t work so well at lower resolutions.

          There is nothing out of the ordinary with RDNA2, but there is with Ampere.

    4. Nod – slower memory _and_ the cache that’s designed to mitigate it keeps losing effectiveness as the VRAM fills up.

    1. The problem is that AMD has no dedicated hardware for it, so it has to drop FPS to process DirectML. And if it’s not done well, it will be slated for not matching Nvidia’s DLSS.

    1. But AMD needs to get money for that R&D, haven’t you read reddit/AMD? Also yachts and expensive escorts. But shhh.

    1. According to guru3d, SAM makes a significant difference in AC Valhalla.
      112/133 fps at 1080p
      91/105 fps at 1440p
      56/63 fps at 4K.

      But two other games they tested don’t benefit from SAM.
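For perspective, those fps pairs work out to a relative uplift that shrinks as the resolution rises. A quick sketch of the arithmetic, using only the Guru3D figures quoted in the comment above:

```python
# SAM off/on fps pairs for AC Valhalla, as reported by Guru3D
# and quoted in the comment above.
results = {
    "1080p": (112, 133),
    "1440p": (91, 105),
    "4K":    (56, 63),
}

for res, (off, on) in results.items():
    gain = (on - off) / off * 100  # relative uplift in percent
    print(f"{res}: {off} -> {on} fps (+{gain:.1f}%)")
```

That comes out to roughly a 19% uplift at 1080p, shrinking to 12.5% at 4K, which is well above the “1-2 fps” claim made earlier in this thread, at least for this one title.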

  4. You could also cherry-pick games that favour Nvidia and post the same story. I have seen a lot of benchmarks where the RTX cards easily beat the AMD ones. TechPowerUp’s reviews have the 3080 beating both AMD cards at 4K 60Hz in AC Valhalla, among others.

    1. To be fair, that is a fully ray-traced game made specifically for Nvidia. Not that I disagree that Nvidia is the clear RT and perhaps 4K winner here.

      1. The only Nvidia specific feature in it is DLSS.

        With it disabled, the game allows a proper benchmark of the ray tracing capabilities of a GPU.

        RDNA2 is not even as good as Turing at RT.

  5. The perf looks promising, considering this is AMD; the gains usually come later on with driver updates, so it will get even better.

  6. Level1Techs benched the 6800 series with CL16 3200 RAM (a common spec) and a 5900X, and found the AMD card kicked the guts out of the 3090/2080 Ti. Wendell’s video was about how most people don’t buy the most expensive RAM and how that affects relative performance from the GPU perspective. Very interesting tbh.

    https://youtu.be/yMz-5uP8sBw

  7. Now I want to make it crystal clear that for the most part, the AMD Radeon RX 6800XT offers performance slightly lower than the RTX3080.

    3% lower in some select titles is not “for the most part”. Basically these GPUs are the same performance-wise, only one has a bigger VRAM buffer, is more power-efficient and $50 cheaper than the other.

    Kudos to AMD for competing in the high-end. Wish my G-Sync monitor weren’t an Nvidia lock-in; if it weren’t, I’d gladly swap my 2070 Super for a 6800XT. But now I guess I’ll have to wait for the 3070Ti or even for the 4000 series to come out. Then again, 10 gigs of VRAM might be enough for 1440p 144Hz for a couple of years. The 3070Ti with 10GB of VRAM is supposed to release no sooner than January.
