
AMD confirms RX 6000-series RDNA 2 ‘Big Navi’ GPU lineup will support Ray Tracing in existing titles

Just a few days ago, AMD officially announced its RDNA 2-based ‘Big Navi’ flagship graphics cards: the Radeon RX 6900 XT, RX 6800 XT, and RX 6800. Even though the RDNA 2 live event showcased benchmarks that favored these cards over the competition, none of them included any ray tracing metrics.

Many gamers have been waiting to hear about AMD’s approach to ray tracing. A statement from AMD was recently published by AdoredTV, in which the manufacturer confirmed that it will support ray tracing in all games that do not rely on proprietary APIs and extensions, such as NVIDIA’s DirectX RTX or Vulkan RTX extensions.

AMD confirmed that it will only support industry standards, which essentially means Microsoft’s DXR API and Vulkan’s ray tracing API.

“AMD will support all ray tracing titles using industry-based standards, including the Microsoft DXR API and the upcoming Vulkan raytracing API. Games making use of proprietary raytracing APIs and extensions will not be supported.” — AMD marketing.

Both the Microsoft DXR and Vulkan ray tracing API industry standards are slowly gaining popularity, and the launch of AMD’s RX 6000-series GPUs should also push game developers away from NVIDIA’s in-house, proprietary APIs. For what it’s worth, Intel also plans to support DXR with its Xe-HPG discrete GPU lineup.

What this means in practice is that existing DXR titles are supported, as are PC titles that use the official Vulkan ray tracing API; Control, for example, uses the DXR API. However, titles such as Wolfenstein: Youngblood and Quake II RTX will not be supported, because both use NVIDIA’s proprietary Vulkan RTX API extensions.

Since Wolfenstein: Youngblood and Quake II RTX rely on non-standard, proprietary API extensions, AMD’s RDNA 2 cards won’t support them or similar PC titles, though most of these games can still run with ray tracing disabled and still offer a satisfying visual/gaming experience.
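
As a rough illustration of the policy described above, the sketch below classifies a title by the ray tracing API it uses. This is not an official tool, and the API labels are simplified strings (the real Vulkan extension identifiers are `VK_KHR_ray_tracing_pipeline` for the standard path and `VK_NV_ray_tracing` for NVIDIA’s proprietary one):

```python
# Illustrative sketch only, based on AMD's statement: titles using the
# industry standards (Microsoft DXR, the Vulkan KHR ray tracing API) are
# supported on RDNA 2; titles using NVIDIA's proprietary Vulkan extension
# are not.

STANDARD_APIS = {"DXR", "VK_KHR_ray_tracing_pipeline"}   # industry standards
PROPRIETARY_APIS = {"VK_NV_ray_tracing"}                 # NVIDIA-only extension

def rdna2_supports(rt_api: str) -> bool:
    """True if a title's ray tracing API falls under AMD's stated policy."""
    return rt_api in STANDARD_APIS

# Examples from the article:
print(rdna2_supports("DXR"))                # Control uses DXR -> True
print(rdna2_supports("VK_NV_ray_tracing"))  # Quake II RTX -> False
```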

[Image: AMD RDNA 2 next-gen features]

In case you didn’t know, the AMD Radeon RX 6000 series also features Ray Accelerators. One official slide elaborates on the hardware component that Radeon RX 6000 Series GPUs leverage for ray tracing: the Ray Accelerator (RA). Each Compute Unit carries one Ray Accelerator, as shown below:

[Image: AMD RX 6800 DXR Ray Tracing]

These RA units are responsible for the hardware acceleration of ray tracing in games. The RX 6900 XT features 80 RAs, the RX 6800 XT 72, and the RX 6800 60. The same Ray Accelerators can be found in the RDNA 2-based next-gen gaming consoles.
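
Since each Compute Unit carries exactly one Ray Accelerator, the RA counts follow directly from each card’s CU count. A trivial sketch using the figures above:

```python
# One Ray Accelerator (RA) per Compute Unit (CU), so the RA count simply
# equals the CU count. CU figures are the ones quoted in the article.

CU_COUNTS = {"RX 6900 XT": 80, "RX 6800 XT": 72, "RX 6800": 60}

def ray_accelerators(card: str) -> int:
    return CU_COUNTS[card]  # one RA per CU

for card, cus in CU_COUNTS.items():
    print(f"{card}: {cus} CUs -> {ray_accelerators(card)} Ray Accelerators")
```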

“New to the AMD RDNA 2 compute unit is the implementation of a high-performance ray tracing acceleration architecture known as the Ray Accelerator,” a description reads. “The Ray Accelerator is specialized hardware that handles the intersection of rays providing an order of magnitude increase in intersection performance compared to a software implementation.” via AMD.

AMD will launch the Radeon RX 6800 and RX 6800 XT graphics cards on November 18th. The flagship AMD Radeon RX 6900 XT will be available on December 8th and will cost $999 USD.

Thanks, OC3D.

Stay tuned for more!

48 thoughts on “AMD confirms RX 6000-series RDNA 2 ‘Big Navi’ GPU lineup will support Ray Tracing in existing titles”

  1. Meanwhile, RTX cards support Raytracing in EVERY game with Raytracing support, even boosting the performance of games with software-based Raytracing.

    1. That’s true right now. But basically starting next week everything is going to be coded for AMD first, as their hardware is now in the modern consoles. Nvidia’s stuff, as usual, is going to become an afterthought at best. Proprietary stuff never wins.

      1. both AMD’s and nvidia’s ray tracing implementations are proprietary (hence each has a different way of doing it), but both implementations are compliant with MS DXR. there’s a reason nvidia aimed to be fully DXR compliant with their RTX implementation from the very beginning: to avoid that “coded for AMD first” situation in the future.

      2. we heard that before. the “AMD next gen console advantage” rarely comes to fruition. if anything, it drives NVIDIA to dominate PC games while AMD takes a nap expecting someone else to do the job.
        Proprietary stuff drives new standards. in NVIDIA’s case it usually outperforms, or holds an advantage over, those standards during the transition timeframe, which can take years. there’s plenty of win in proprietary.

        1. This time it will be even harder because only the Xbox supports RDNA2 while the PS5 uses the older RDNA1. Features like hardware ML, VRS, Mesh Shaders and Sampler Feedback are available only on PC and Xbox. This is not good news for gamers.

          1. No, both AMD and MS confirmed that the PS5 doesn’t support:

            – hardware ML (CUs in RDNA2 support 4-bit ops, in the PS5 only 16-bit ops)
            – mesh shaders (the PS5 instead uses Primitive Shaders from GCN)
            – sampler feedback
            – VRS

            Why are you talking about things you don’t understand? Next time use Google and check. Did you even watch the AMD RDNA2 presentation?

          2. And to avoid backlash or possible lawsuits from nit-picky dumbos, they say it’s between RDNA 1 and 2, even though it’s based on RDNA 2.

          3. CU units are a core element of every GPU from AMD – shaders.

            RDNA1 CUs support 32-bit and 16-bit operations (2 per clock)
            RDNA2 CUs support 32-bit, 16-bit (2 per clock), and 4-bit (8 per clock)

            Now let’s check how this looks on all platforms:

            PC 5xxx cards – CU support 32-bit, 16-bit
            PC 6xxx cards – CU support 32-bit, 16-bit, 4-bit

            PS5 – CU support 32-bit, 16-bit
            Xbox Series X|S – CU support 32-bit, 16-bit, 4-bit

          4. RDNA2 without shader units from RDNA2…

            this is why Sony doesn’t support 4-bit operations (ML/DLSS), VRS, Mesh Shaders, or Sampler Feedback, and instead supports the old Primitive Shaders from GCN (AMD Vega), which are not available on RDNA2 cards

            Sony uses a mix of RDNA1 and older GCN – hardware-supported BC with the PS4 and some old Primitive Shaders from Vega

          5. Sounds like lies; a mix of GCN and RDNA 1 can’t be RDNA 1.5, it should be RDNA 0.5.
            Sounds like someone is jealous that the PS5, based on RDNA 2 with fewer features, will win this console battle over the fully featured RDNA 2 Xbox.

          6. Sony confirmed that the GPU will support the whole instruction set from GCN because PlayStation uses hardware-level backward compatibility. This instruction set was supported by Radeon 5700 cards, which supported both RDNA and GCN instructions. The new Radeon 6xxx cards don’t support GCN instructions (this is why they will never be used in a PS5 or PS5 Pro)

            Check “Road to PS5” by Mark Cerny about hardware BC with the PS4. This is very different from the software BC used by MS on PC and Xbox

          7. RDNA2 with fewer features, but to be sure not to trigger some butthurts, they called it RDNA 1.5.

          8. Because this term was used by lead Sony GPU engineer Rosario Leonardi, who said in a private message to a “Grid” game developer that the PS5 is RDNA1 and doesn’t support 4-bit operations. That developer published the whole conversation on reddit as proof a few days later

          9. RDNA without:
            – CU units from RDNA2 (4-bit support); Sony uses CUs from the Radeon 5700
            – VRS
            – Mesh Shaders
            – Sampler Feedback

            I think the PS5 will be 20% slower than a Radeon 5700. Let’s wait for benchmarks



  2. rt is amazing for developers, and terrible for consumers, yes it is the future, but today? still in its infancy, and not worth the headache

  3. Would love to see how the same game, e.g. CONTROL, performs on an AMD GPU. But I doubt AMD can beat Nvidia in ray tracing. They can match the perf though.

    Ray tracing hardware is still not powerful enough for a full RTX implementation. It will take 3-4 more years before this rtx becomes mainstream. yeah I know next-gen consoles also have RDNA 2 graphics, but how powerful they are remains to be seen.

    1. RTX becomes mainstream next week with the consoles. It will be extremely limited, but we will have it.
      The consoles will both usher in RTX and also limit it at the same time. Fun how that works.

      1. Honestly speaking, I wouldn’t call that mainstream though. Ray tracing hardware is still not powerful enough for a full RTX implementation.

        By mainstream I meant every GPU, be it low-end, mid-range or high-end, must give a playable frame rate at all resolutions, at least on mid-high settings, if not ultra. Also, we don’t yet know how powerful the PS5 and XBX/S are.

        1. Mainstream just means available to most, or common. Quality is really a different thing.
          We’re a long way away from getting rtx at 4k, let alone that on every GPU.
          We actually basically do know how fast they are. DF did some good breakdowns. Ps5 is roughly 2080 territory or even better. Sx is above that.

          1. The gap between 2070S and 2080 is much larger than that in actual games. Maybe PS5 is closer to a regular 2070, doesn’t matter.

            Xbox is on par with a stock 2080, that right there is not up for debate.

            You clearly don’t know wtf you’re talking about if you think that a Zen2 CPU is going to bottleneck a game at 60 fps or lower. The XSX does locked 120 fps in Gears 5 multiplayer, which is almost completely CPU bound.

            Turing TFLOPs are worth more in game performance, as is shown by the XSX and 2080 comparison…
