It appears that AMD is back. The red team has lifted the review embargo for the AMD Radeon RX 6800 and RX 6800 XT and, as we can see, AMD has finally delivered a competitive product. In fact, the AMD Radeon RX 6800 XT can, in some rasterized games, surpass even the NVIDIA GeForce RTX 3090.
Now I want to make it crystal clear that, for the most part, the AMD Radeon RX 6800 XT offers performance slightly lower than the RTX 3080. However, there are some situations in which AMD’s GPU can surpass NVIDIA’s high-end RTX 3090. Do also note that AMD is working on an even more powerful GPU, the Radeon RX 6900 XT. Thus, we should theoretically see that GPU beating the RTX 3090 in even more rasterized games.
For instance, the AMD Radeon RX 6800 XT beats the NVIDIA RTX 3090 in Assassin’s Creed Valhalla. Of course, this should not surprise us, as Valhalla favours AMD’s GPUs.
Now what’s really interesting is the amazing 1080p performance of the AMD Radeon RX 6800 XT. Multiple websites have reported that at this resolution, the GPU’s performance can come close to that of the RTX 3090. However, when raising the resolution to 1440p or 4K, the gap between these two GPUs widens. Both Hardware Unboxed and Guru3D reported these performance differences.
Tom’s Hardware also reports that the AMD Radeon RX 6800 XT is faster than the RTX 3080 in some games, but slower than the RTX 3090.
PCGamesHardware reports that the AMD Radeon RX 6800 XT is faster than the RTX 3080 in Battlefield 5 at 1440p. On the other hand, in some games that favour NVIDIA GPUs (like Black Mesa), the RX 6800 XT can even be slower than the RTX 2080 Ti.
It’s also interesting to look at the 4K numbers, especially since the RTX 3080 comes with 10GB of VRAM, whereas the RX 6800 XT comes with 16GB of VRAM. Despite that, the NVIDIA GPU is noticeably faster in a lot of games at 4K. KitGuru reports that in Ghost Recon Breakpoint, Watch Dogs Legion and Red Dead Redemption 2, the AMD RX 6800 XT is slower than the RTX 3080 at 4K.
Unfortunately, we haven’t seen a lot of Ray Tracing benchmarks. As Hardware Unboxed reported, in Dirt 5 (which is a game that favours AMD GPUs), the Ray Tracing effects run faster on the RX 6800 XT. However, in other games (like Shadow of the Tomb Raider), the RX 6800 XT performs similarly to the RTX 2080 Ti. Similarly, eTeknix reports that in Metro Exodus, the RT performance of the RX 6800 XT is similar to that of the RTX 2080 Ti. So yeah, the Ray Tracing performance of the RX 6800 XT appears to be inconsistent. Not only that, but AMD has not yet brought its DLSS-like technique to any RT games.
Overall, AMD has finally delivered a competitive product. However, in the majority of games, it does not beat the RTX 3080 or the RTX 3090 at higher resolutions. We are also really puzzled by the 4K results. We don’t know whether this is a driver issue that AMD can resolve via future updates. After all, the RX 6800 XT can outperform even the RTX 3090 at lower resolutions. Additionally, its Ray Tracing performance does not appear to be that great, especially if we take into account NVIDIA’s DLSS tech. Not only that, but we haven’t yet seen FidelityFX Super Resolution in action. As such, we don’t know whether it will be as good as DLSS 2.0 or as awful as DLSS 1.0. Lastly, at least for now, there are no real benefits from the 16GB of VRAM.

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved, and still loves, the 16-bit consoles, and considers the SNES to be one of the best consoles. Still, the PC platform won him over. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC graphics cards.”
Contact: Email

I have been waiting to get my hands on an RTX 3080 since it launched, and I am quite sad to see these products being hoarded and scalped while people have to spend more time at home during a pandemic. I did, however, manage to get an RX 6800 XT this morning, so I am finally free of my 1080 Ti and back on team red for the first time since the Radeon HD 4870, heh. Hopefully they do something about super resolution, and hopefully Nvidia and AMD plan their launches better in the future so customers don’t have to deal with bots on launch day and aren’t left with few options over the holiday season.
it is not slightly slower than a 3080
it goes head to head with it John
it’s not; from the summary, what I got is that:
1080p = 6800 XT significantly outperforms the RTX 3000 series
1440p = 6800 XT on par with the RTX 3080
4K = 6800 XT performs below the RTX 3080
(Although it costs less, you have to compromise on features like 2nd-gen RT performance and DLSS. You can say the RX 6800 XT has SAM, but the performance gain is 1-2 fps in most games.)
You don’t have to compromise squat; devs just need to release patches for their RT games, since AMD beats NVIDIA in Dirt 5’s RT, which is an AMD-optimized game.
SAM is still in its beta phase; NVIDIA wouldn’t jump on it if it were just 1-2 fps.
No patch could bring the performance if you don’t have sufficient RT cores, which do the ray tracing for Nvidia. AMD now is more or less where Nvidia was on DXR when it first brought ray tracing back in 2018; with the RTX 30 series and its 2nd-gen RT cores, Nvidia has advanced far in ray tracing performance. The same can be said for AMD: they are new to it, so they will gain performance and compete with Nvidia eventually, and at that point it will be interesting to see what Nvidia comes up with, because they are already 2 years ahead of AMD on ray tracing!
Good 1080p performance. Now shave 200W and I’m in.
Not long now until Cyberpunk. Will you be crying inside when you try to turn on RT because you bought a 6800XT or having a great time on your 3080?
Nah.
6800 XT should be perfect for 1440p60+ maxed visuals.
Wow tranny punk will look shiny with rt, but still disgusting, not gonna even buy that game if you ask me.
Most ppl turn off RTX for free fps.
Ah, that is why the industry is moving towards RT, because most people don’t want it? Or is it that you just don’t understand what RT is?
RT is a gimmick.
Like PhysX.
You needed a dedicated Nvidia card to have PhysX, but then AMD could also do PhysX after some time and the tech just died…
And now history repeats itself…
So yeah, I know what’s going on.
Yeah, like colour TVs. Who’d want that ……
You do realise this is Nvidia’s gen 2 RT, AMD’s attempting to do RT and Intel have planned RT? Hell even some of the next gen mobile SOCs are offering limited RT support.
Yeah. Remember TVs with 3D tech? Where are they? Oh yeah, nowhere. Just like Nvidia HairWorks, cuz who wants a single graphics effect that will tank your FPS…
Last gen’s gimmick was 4K (and 4K TVs and 4K this and that).
Before that it was Kinect and garbage like that…
Or are you still using the Xbox Kinect? Cuz you’re a good sheep who eats everything the marketing team tells you?
AMD, Nvidia, Intel and some mobile SOCs are providing hardware support for RT. BTW, who doesn’t have a 4k TV now?
“We are also really puzzled at the 4K results. We don’t know whether this is a driver issue that AMD can resolve via future updates. After all, the RX 6800XT can outperform even the RTX3090 in lower resolutions.”
Lower bandwidth and slower memory.
Yeah, that should be a no-brainer for anyone.
It beats the 6800 at 4K due to these and nothing else, and the gap is not that big.
The 256-bit bus with the Infinity Cache is proving good at 1440p and 1440p UW, but not just yet at 4K. It has the ROP power. Guess we will see if drivers can work better with the new cache in the future…
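For context on the raw numbers behind this bandwidth argument, here is a rough back-of-the-envelope calculation in Python. The bus widths and memory speeds are taken from public spec sheets rather than from this article, so treat them as my own assumptions:

# Raw memory bandwidth (GB/s) = bus width in bytes * effective data rate (Gbps).
# Spec-sheet figures (assumed, not quoted in the article):
#   RX 6800 XT: 256-bit GDDR6 @ 16 Gbps
#   RTX 3080:   320-bit GDDR6X @ 19 Gbps
#   RTX 3090:   384-bit GDDR6X @ 19.5 Gbps
cards = {
    "RX 6800 XT": (256, 16.0),
    "RTX 3080": (320, 19.0),
    "RTX 3090": (384, 19.5),
}
for name, (bus_bits, gbps) in cards.items():
    print(f"{name}: {bus_bits / 8 * gbps:.0f} GB/s")
# ~512 vs ~760 vs ~936 GB/s: the raw gap that Infinity Cache has to make up at 4K.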
No, it’s not the bandwidth.
As Steve from Hardware Unboxed pointed out in his review, it’s not that RDNA 2 scales badly with resolution; it’s that Ampere scales better than other architectures.
The answer most likely lies in the cache that’s designed to mitigate that slower memory: it keeps losing effectiveness as the VRAM fills up, i.e. it’s super efficient when you need it the least and loses most of its capability when you need it the most…
That’s my no. 1 guesstimate as the culprit for the poor scaling: as resolution, assets etc. all get bigger, its “low res, low asset” performance just isn’t there anymore.
I suspect this series will not age like fine wine… assets will go up with next-gen games.
No, “poor scaling” in RDNA2 does not exist. It scales just like Turing or RDNA1 or Pascal etc.
It’s that Nvidia’s high-end GPUs don’t work so well at lower resolutions.
There is nothing out of the ordinary with RDNA2 but there is with Ampere.
Nod. Slower memory _and_ the cache that’s designed to mitigate it keeps losing effectiveness as the VRAM fills up.
Great card….AMD is on fire!
Now AMD must implement the new DirectML Super Resolution (its DLSS equivalent) in its drivers. This should be easy because MS open-sourced a DirectML super-resolution sample on GitHub.
The problem is that AMD has no dedicated hardware for it, so it has to give up FPS to run the DirectML work on its shaders. And if it’s not done well, it will be slated for not matching Nvidia’s DLSS.
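To make the idea concrete, here is a minimal sketch of running an ML upscaler through DirectML via ONNX Runtime’s DirectML execution provider. This is not AMD’s actual driver-level approach; the model file name and tensor layout are placeholders, and it assumes the onnxruntime-directml package is installed:

import numpy as np
import onnxruntime as ort

# Load a hypothetical super-resolution network; any ONNX upscaling model would do.
session = ort.InferenceSession(
    "super_resolution.onnx",             # placeholder model file
    providers=["DmlExecutionProvider"],  # DirectML backend from onnxruntime-directml
)

# Pretend this is a rendered 1080p RGB frame in NCHW float32 layout.
frame = np.random.rand(1, 3, 1080, 1920).astype(np.float32)
input_name = session.get_inputs()[0].name
upscaled = session.run(None, {input_name: frame})[0]  # output size depends on the model
print(upscaled.shape)

On RDNA 2 this inference would run on the same shader ALUs as the game itself, which is exactly the FPS trade-off mentioned above, whereas Turing/Ampere can offload the equivalent DLSS work to dedicated Tensor cores.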
Looks like the best 1080p card, if it wasn’t $650.
But AMD needs to get money for that R&D, haven’t you read reddit/AMD? Also yachts and expensive escorts. But shhh.
Is this with or without smart memory? If it’s with, it’s highly misleading.
According to Guru3D, SAM makes a significant difference in AC Valhalla (percentages worked out below):
112/133 fps at 1080p
91/105 fps at 1440p
56/63 fps at 4K.
But two other games they tested don’t benefit from SAM.
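Reading those pairs as fps without/with SAM (my reading of the comment, not stated explicitly), the uplift works out to roughly:

# Guru3D's AC Valhalla figures quoted above, read as (SAM off, SAM on) in fps.
results = {"1080p": (112, 133), "1440p": (91, 105), "4K": (56, 63)}
for res, (off, on) in results.items():
    print(f"{res}: {off} -> {on} fps (+{(on / off - 1) * 100:.1f}%)")
# Roughly +19%, +15% and +12.5%: far more than the 1-2 fps claimed earlier in the thread.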
Interesting. Now to see Nvidia’s version and what % perf upgrade it’ll give.
I wish I could pirate graphic cards.
Cheapskate. I hope someone gets the product or service that you do as work for free.
will never happen, 90 percent of people fail at what I do.
https://www.youtube.com/watch?v=4bOpo1ZTRmQ&ab_channel=98Thew
It’s simple: 2K AMD, 4K NVIDIA.
I would call it a draw at 2K. Well done AMD, finally we have some competition in the high end.
If smart memory is off it’s a draw; otherwise I’d like to see it with and without.
He said that Smart memory showed very minimal gains or none at all, if you watched the video.
Support for SAM is still in beta from mobo makers; wait.
I had not, thx.
wut
I think you wrote the reverse of what you meant
Last I checked, modding is a core component of PC gaming…
Let the fanboy war begin.
Looking forward to comment section wars. Opens bag of popcorn
You could also cherry-pick games that favour Nvidia and post the same story. I have seen a lot of benchmarks where the RTX cards easily beat the AMD ones, etc. TechPowerUp’s reviews have the 3080 beating both AMD cards at 4K 60Hz in AC Valhalla, among others.
Just leaving this here https://uploads.disquscdn.com/images/53ad998fc1939cb262967dae5c8ffcf68bab7096f0e9a74ba016314dfe7348b2.jpg
To be fair, that is a fully ray-traced game made specifically for Nvidia. Not that I don’t agree that Nvidia is the clear RT, and perhaps 4K, winner here.
The only Nvidia specific feature in it is DLSS.
With it disabled, the game allows a proper benchmark of the ray tracing capabilities of a GPU.
RDNA2 is not even as good as Turing at RT.
Except the benchmark is made in the Portal Pioneers map, which is an RTX beta map released in partnership with Nvidia. Look, I don’t knock the fact that Nvidia does RT much better; it’s just that flaunting some biased benchmark is hardly fair. Here’s the source.
https://www.minecraft.net/en-us/article/new-worlds-rtx-windows-10-beta-
How is this relevant?
It’s just a map, it has nothing to do with the underlying technology.
The perf looks promising, considering this is AMD, and the gains usually come later on with driver updates, so it will get even better.
OK, with that logic you can also OC Nvidia cards and boom… same result.
Level1Techs benched the 6800 series with CL16 3200 RAM (a common kit) and a 5900X, and found the AMD card kicked the guts out of the 3090/2080 Ti. Wendell’s video was about how most people don’t buy the most expensive RAM and how that affects relative performance from the GPU perspective. Very interesting tbh.
https://youtu.be/yMz-5uP8sBw
Well that’s interesting to say the least.
3% lower in some select titles is not “for the most part”. Basically, these GPUs are the same performance-wise; only one has a bigger VRAM buffer, is more power-efficient, and is $50 cheaper than the other.
Kudos to AMD for competing in the high end. Wish my G-Sync monitor wasn’t an Nvidia lock-in. If it weren’t, I’d gladly swap my 2070 Super for a 6800 XT. But now I guess I’ll have to wait for the 3070 Ti or even for the 4000 series to come out. Then again, 10 GB of VRAM might be enough for 1440p 144Hz for a couple of years. The 3070 Ti with 10 GB of VRAM is supposed to release no sooner than January.
LOL
https://www.kitguru.net/wp-content/uploads/2020/11/RT.png
Mod Edit: Kitguru does not allow hotlinking
1080p is peasantry?
Bit*h please, 1080p 144Hz is VASTLY superior to higher resolutions.