AMD has filed new Ray Tracing patents for future hardware

In January 2025, NVIDIA introduced some new Ray Tracing techniques using Neural Shaders. And, to the surprise of no one, AMD has filed similar patents for Ray Tracing. So, let’s see what the red team will be bringing to gamers.

The key features of these patents are “Neural Network-based Ray Tracing,” “Traversal and Procedural Shader Bounds,” “Ray Tracing Structure Traversal Based on Work Items,” and “Lossy Geometry Compression Using Interpolated Normals for Use in BVH Building and Rendering.”

Going into more detail, Neural Network-based Ray Tracing appears similar to what NVIDIA is doing with its Neural Shaders. So, I guess PC gamers will now be fine with this tech? After all, some criticized NVIDIA for introducing it as a new “gimmick.”

The point is that AMD appears to be going all in with Ray Tracing for its future hardware. So, in theory, RDNA5 and the next-gen consoles (PS6 and the next Xbox) should be more capable of running ray-traced – or even path-traced – games.

This also shows how far ahead NVIDIA is. The exact same thing happened with DLSS: NVIDIA introduced it first, and AMD followed up a year or two later. Now if only there were any RTX 50 GPUs to buy.

Realistically, I don’t expect any games to support these new RT techniques anytime soon. We are most likely looking at things that may happen two or three years from now. So, this is a nothing-burger for most of you. Still, it can give you an idea of where things are heading.

You can find links for all of AMD’s RT patents here. All of them have brief descriptions, so you can find out what AMD is trying to achieve with them.

Stay tuned for more!

55 thoughts on “AMD has filed new Ray Tracing patents for future hardware”

  1. This and the neural shaders refer to two completely different things. The term "neural shader" is really just a buzzword. In reality, these are regular compute and graphics shaders with cooperative vector support. That enables the optimization of matrix-vector operations, but that's all; it doesn't offer anything more (see the first sketch after this comment). Any hardware can do it, given enough VRAM and bandwidth for the neural network.

    The linked patents are more about how the acceleration structure is built. Cooperative vectors cannot be used for this, because DXR does not support programmable traversal in the first place. If a stage in the API can't execute a shader, then you can't use cooperative vectors for that stage.

    The "neural network-based ray tracing" patent appears to describe a technique that offloads internal traversal from the hardware while keeping memory usage particularly low.

    The "lossy geometry compression using interpolated normals for use in BVH building and rendering" patent, meanwhile, will allow the acceleration structure to be built with the Dense Geometry Format, aka DGF, which is a block-based geometry compression technology (see the second sketch below). This comes with UDNA.

    Every UDNA technique is part of the Project Amethyst concept, because Sony will use UDNA for the PS6. They partnered with AMD so that, if some UDNA technology is not supported by the standard APIs, AMD will still allow its use on PC, so Sony can port its PS6 games to PC. That way, their games can at least run with max settings on UDNA-based Radeon GPUs, even if that requires proprietary APIs.
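
    To make the cooperative-vector point above concrete, here is a minimal sketch in CUDA. It is purely illustrative: the kernel name, layer sizes, and weight layout are assumptions, not any vendor's actual API. The idea is simply that a "neural shader" is an ordinary shader whose hot loop is a matrix-vector product, which cooperative-vector-style hardware can accelerate.

        #include <cuda_runtime.h>

        #define IN_DIM  8   // assumed input feature width
        #define OUT_DIM 8   // assumed output feature width

        // One tiny MLP layer (matrix-vector product + ReLU) evaluated per
        // thread, i.e. per "pixel", inside a regular compute kernel.
        __global__ void neural_shade(const float* __restrict__ weights,  // OUT_DIM x IN_DIM
                                     const float* __restrict__ bias,     // OUT_DIM
                                     const float* __restrict__ features, // n x IN_DIM
                                     float* __restrict__ out,            // n x OUT_DIM
                                     int n)
        {
            int pix = blockIdx.x * blockDim.x + threadIdx.x;
            if (pix >= n) return;

            const float* x = features + pix * IN_DIM;
            for (int o = 0; o < OUT_DIM; ++o) {
                float acc = bias[o];
                for (int i = 0; i < IN_DIM; ++i)
                    acc += weights[o * IN_DIM + i] * x[i]; // the matrix-vector op
                out[pix * OUT_DIM + o] = fmaxf(acc, 0.0f); // ReLU activation
            }
        }

    Nothing here is exotic: any GPU with enough VRAM and bandwidth can run it, which is exactly the point above.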
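
    And a second, equally hedged sketch of the block-based compression idea behind DGF. The block layout below is made up for illustration (the real DGF spec differs): positions are quantized against a per-block bounding box, which is lossy but bounded, and a smooth normal is reconstructed by barycentric interpolation at hit time instead of being stored at full precision.

        #include <cstdint>

        // Hypothetical compressed geometry block; field sizes are assumptions.
        struct GeomBlock {
            float    aabb_min[3];   // block bounds kept in full precision
            float    aabb_max[3];
            uint16_t pos[64][3];    // up to 64 vertices, 16-bit quantized per axis
            uint8_t  vertex_count;
        };

        // Quantize one coordinate into its block: lossy, error bounded by the AABB.
        __host__ __device__ inline uint16_t quantize(float v, float lo, float hi)
        {
            float t = (v - lo) / (hi - lo);          // normalize to [0, 1]
            return (uint16_t)(t * 65535.0f + 0.5f);  // snap to the 16-bit grid
        }

        __host__ __device__ inline float dequantize(uint16_t q, float lo, float hi)
        {
            return lo + (q / 65535.0f) * (hi - lo);
        }

        // "Interpolated normals": blend the triangle's vertex normals with the
        // hit's barycentrics rather than storing a normal per compressed vertex.
        // The caller should renormalize the result before shading.
        __host__ __device__ inline void interp_normal(const float a[3], const float b[3],
                                                      const float c[3], float u, float v,
                                                      float out[3])
        {
            float w = 1.0f - u - v;
            for (int i = 0; i < 3; ++i)
                out[i] = w * a[i] + u * b[i] + v * c[i];
        }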

  2. Well, tbh, it was a gimmick when Nvidia introduced it, but they have continued to move the tech forward over the years, and now we are at a point where there are over 500 games that use RT in small implementations, with the number increasing regularly. Developers say it is easier to work with than rasterization methods.

    When AMD first started talking about much improved RT performance in the generation they are currently releasing, it was obvious to most that RT was the future. But the shift will be gradual, and developers will still have to implement it well for it to add visual value.

    There was still a vocal minority saying that RT was worthless and almost no one wanted it, but they seem to have quieted down a lot since AMD made it a serious priority. I wonder if there is some connection there? Hmmmm.

    1. I think there is still a group of people who say they don't care for RT, either because the games they play mostly don't utilize it the way newer titles do, or because they still don't think the trade-off is worth the cost in performance. We, as commenters on PC gaming forums, are part of an echo chamber that probably has an unrepresentative share of AMD users compared to the vast number of PC gamers on RTX cards. Over time, as more new games demand such tech to run, the manufacturers obviously have to bend to that larger demand rather than the few percent of vocal holdouts clinging to pure raster, especially when Microsoft and Sony are demanding RT and upscaling improvements for the consoles and Windows.

    2. Yes, there's a connection: AMD fans are vocal, big time. But that's not the only reason. It's been six years since the first RT game, and as always, people just accept things with time. It happened with Steam, it happened when Quake 3 forced people to use dedicated GPUs, it happened when games required pixel shaders…

    3. Get real. Nvidia continued with the RT BS because they made money out of it, selling it as the next big thing via snake-oil salesman Jensen. DLSS was created to keep the RT hype alive, as RT is unusable for mainstream gaming at native resolution. They are actually making you pay a lot for software-based performance enhancements on midrange cards with lower specs. Watch the shrinkflation video.

      1. Not sure if you are serious, but it's easy to find the info if you want it. Just copy/paste the following into Google:

        how many games use RT

        1. First, the nVidia count you are quoting refers to “games and apps”. Second, practically anyone you ask will have trouble naming 20 games where RT does anything, let alone 50 or more.

          1. First, I didn't quote anything. I gave you a way to learn for yourself. You seem to have read one headline and drew an erroneous conclusion. Second, whether or not anyone can name 20 of anything has nothing to do with how many there are of anything. What the hell kind of logic is that? 🙁

          2. Does it matter if 500 wild pigs simultaneously fart in a forest if there’s nobody around to hear them?

  3. OMG John, don't be so offended by what people say about your dear nVIDIA.

    Gee, take a chill pill, dude. Jensen isn’t going to make an icon out of you…

  4. My comment was "detected as spam" because I called out John?
    I guess that yearly "free speech" poll doesn't cover this site's own bias?

    Pity…

        1. Disqus does some automated filtering. I know that certain words in comments cause messages to be held for moderation, so it's possible you just so happened to trip one of their automated spam filters. They probably tweak those fairly regularly as spammers are always trying to bypass them.

        2. In addition to what GT said, it's also possible you've been downvoted or reported a lot, which affects whatever invisible metric Disqus uses to flag accounts on their service. Happens to me all the time.

    1. Blame Disqus; only God knows what comments you've made on other sites to make Disqus automatically mark your comment as "spam".

      1. I haven't made any other comments, at least not today, and you should know the kind of comments I make, since you're the owner of this website and I comment here too.

        1. Disqus could have marked you as a spammer for its own reasons. I don't know. But whenever another comment of yours gets marked as "spam", feel free to tag me and I'll manually approve it.

    1. That's true, though. AMD has been following behind NVIDIA and reproducing the same tech for years; that's not bias. If anything, the bias is denying it like an AMD fanboy.

      1. Fanboyism is strong with the writer. A different technological implementation of ray tracing and path tracing is not the same as Nvidia being far ahead in technology. It doesn't help that the 9070 is eating up the mid 4000-series and even some 5000-series cards at ray tracing and FSR 4. Proclaiming and posting such an inaccurate statement is definitely fanboying. No need to ruin the party just to hide their affinity. Just distasteful.

        1. Where have you been these past 4-5 years? AMD has always been behind Nvidia, from the development of upscaling and frame generation to ray tracing and now neural rendering technologies.

      2. Both have invented new tech, and each made their own implementation after the other's got into the mainstream APIs. Nvidia has done more of those, but both have contributed (I haven't compared market share against contribution, but I would not be surprised if the roles looked reversed from that angle). It's just a shame Nvidia is basically at the point of doing more harm than good to the PC as a gaming platform – look at its prices and the GPU series "shrinkflation" after the 30 series. When mainstream PC gamers hurt, so will the install base, and that means we all hurt in the end, as we will get fewer and worse games.

        1. AMD is also harming PC gaming by not being competitive enough; their console market made them lazy.

    2. Oh no, the evil author didn't bootlick my chosen pet company, which is, and always will be, inferior. This is heresy!

  5. POS lowlife AMtrash has always been copying nVidia, so it's nothing new!
    It would be new if these POS trash did a decent job at it, though. But they never do!

    Sadly, AMdead is garbage and always will be in the GPU space!

    Pretty sure Intel will eat them alive with their upcoming GPUs 🥳

    1. "Pretty sure Intel will eat them alive with their upcoming GPUs"

      The jury is still out on that one. AMD reported a few days ago that they had sold 10 times more GPUs than in the same time frame of the last generation's release. Obviously, part of that is due to Nvidia totally messing up their launch, with almost no availability unless you want to get scalped, the defective cards with missing ROPs, and the pitiful generational performance increases across most of the Slackwell stack.

      We will have to wait for the next big market share report, but I imagine AMD did take some of that 90% dGPU market share away from Nvidia. And since Nvidia is mostly concerned with its AI revenue going forward, with PC gamers mostly just an afterthought, they're leaving AMD a golden chance to make a serious impact.

      Remember the spanking that AMD gave the giant Intel? It didn't seem possible to most people at the time, but it did happen. It could happen to Nvidia too. Nvidia has become fat, complacent, and overly greedy in the gaming GPU sector, just like Intel was in the CPU sector.

        1. Well, I am happy they actually sold some this time, but people are brainwashed by nVidia and will likely still pay for the extremely underwhelming 50 series – yes, even I, who am an nVidia boy – and who isn't, with their supreme tech, drivers, etc., always pushing forward!

          But this time, oh my, we all know it can't really get much worse, so even though AMD overpriced theirs, at least it's less bad. But still sh*te!

          nVidia, with their 90% market share, has surely gone full Intel. So we really need a shake-up. Here's hoping Intel comes on strong, so nVidia may see where things are going. But probably not, if people keep buying into this. Then again, they probably don't care, as AI is their thing now!

      2. The thing with Intel is that they sat idle on innovation because they didn't consider AMD big enough to compete with them. NVIDIA is a different case: they keep pushing innovation and are very aggressive towards AMD GPUs. They don't hold back and, unlike Intel, they don't underestimate AMD.

        1. I say complacent in the gaming GPU sector because of the pitiful generational increase in performance with Slackwell. This is the same thing Intel did with CPUs for years and years: minimal performance increases generation after generation, and keeping more-than-4-core CPUs out of mainstream pricing.

          I'm not talking about what Nvidia is doing on the AI/professional side when I say complacent. That's none of my concern. Just the gaming side.

          Maybe it will take Rubin gaming GPUs rolling out with a sh*tty increase in performance over Slackwell and bloated real-world MSRPs before it is clear to everyone what Nvidia has become due to the lack of serious competition from AMD until now.

          1. I won't call AMD GPUs serious competition for now; they only compete on pricing, and I'm pretty sure that the day they become truly competitive, they will raise prices, just as they did with their CPUs once those became decent. Even so, I'd rather see them genuinely competitive at high prices than the opposite; it would be healthier for PC gaming.

            Even though I'm neutral, I have to admit that NVIDIA always finds a way to get PC gamers on its side. Perhaps AMD became lazy because they already have a big market in consoles and don't really care about PC the way they used to. That's my opinion, and I think it makes sense from a business perspective.

    2. Jeez, did Lisa Su steal your boyfriend or something? Every single one of your AMD-related posts is wccf-tier unhinged.

      1. Are there any normal people left on that site? I was banned years ago trying to reason with those lunatics.

        1. No, they're all varying degrees of insane. Worst of course are all the tech fanatics. The astro-turfing there is off the charts.

          The site itself is also going downhill, as they seem to keep hiring rabid ideologues to write articles for them. My posts get regularly deleted there whenever I call any of them out. One of the regular writers there (alessio) outright blocked me because he showed his face here once and I called him out on his BS. What a coward.

    3. This, children, is what is known as "lose-lose".

      Obviously, if PcRules is wrong, they lose immediately.

      But even if PcRules is right, they ultimately lose in the end, due to NVIDIA not having any competition.

      Don't be like PcRules, kids. Say yes to drugs. Just not the ones PcRules is into.

  6. Good job nvidia for pushing technology, good job AMD for keeping up (9070 XT), and Intel for catching up.
    Now, how's that "1000€ MSRP but 1800€ retail" 5080 that should be a 70-class card treating you?

  7. AMD is still behind Nvidia in terms of efficiency. For instance, a 5080 will beat a 9070 XT yet uses 14% fewer transistors to do more work (a rough check follows below). That is why AMD can't compete with a 5090, or even a 4090: the number of transistors needed to do so would far exceed the maximum package size.

    We'll just have to wait and see whether UDNA improves on that; this generation only saw a minor improvement over the last in terms of transistors needed.
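
    For reference, here is the rough arithmetic behind that figure, using publicly reported transistor counts (treat the exact numbers as an assumption: roughly 45.6 billion for the 5080's GB203 die versus roughly 53.9 billion for the 9070 XT's Navi 48):

        1 - \frac{45.6}{53.9} \approx 0.15

    i.e. on the order of the ~14% quoted above.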

    1. Wasn't the 9070 XT compared to the 5070 and 5070 Ti? Why compare a higher-tier RTX GPU to a lower-tier AMD one?

  8. lol, John Papadopoulos = nvidia fanboy, bought and paid for.

    The real truth is that nvidia is 5 years behind amd in MCM data centers, much less the gaming space, and it's going to be their downfall, just like it was Intel's.

  9. Fanboys argue without realising that Nvidia is focused on PC consumer GPUs while Radeon pushes console and mobile solutions.

  10. This is misleading; practically nobody in tech or gaming calls Neural Ray Tracing a "gimmick." The blog writer made that up. What actually gets called a gimmick is AI-generated frames. Ray tracing has always relied on approximations, so replacing parts of an algorithm that has always approximated with another approximating algorithm (AI, in this case) makes little practical difference, since objects are still rendered from real geometry. That is unlike the made-up frames in DLSS frame generation, which introduce hallucinations, input lag, and fake smoothness that breaks once you move the camera or the model encounters unusual objects in the scene. People need to understand that Neural Ray Tracing simply predicts paths from real data, with real depth and a real 3D environment, cutting work that would otherwise take minutes to path-trace (say, volumetrics, which are hard for path tracers, though we have techniques for them) down to milliseconds (a rough sketch follows). It therefore has far fewer downsides, and far more upsides, than DLSS, which completely takes over your frames with sh**ty techniques. Don't confuse the two.
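
    To illustrate the distinction, here is a heavily hedged CUDA sketch of the "predict from real data" idea. The network, its weights, the 9-float input encoding, and every name below are illustrative assumptions, not any vendor's actual technique: the primary hit comes from real geometry, and only the remaining light transport is predicted.

        #include <cuda_runtime.h>

        // Illustrative stand-in for a small trained network; a real one would
        // have more layers and a learned input encoding.
        __device__ void tiny_mlp_radiance(const float* w, const float in[9], float out[3])
        {
            for (int c = 0; c < 3; ++c) {
                float acc = 0.0f;
                for (int i = 0; i < 9; ++i)
                    acc += w[c * 9 + i] * in[i];  // single linear layer as placeholder
                out[c] = fmaxf(acc, 0.0f);
            }
        }

        __global__ void shade(const float* mlp_weights,
                              const float* hit_pos, const float* hit_dir,
                              const float* hit_nrm, float* radiance, int n)
        {
            int r = blockIdx.x * blockDim.x + threadIdx.x;
            if (r >= n) return;

            // The primary hit comes from real geometry (a real BVH intersection)...
            float in[9] = { hit_pos[3*r], hit_pos[3*r+1], hit_pos[3*r+2],
                            hit_dir[3*r], hit_dir[3*r+1], hit_dir[3*r+2],
                            hit_nrm[3*r], hit_nrm[3*r+1], hit_nrm[3*r+2] };

            // ...and instead of tracing further bounces, the tail of the path is
            // predicted from that real data. No frames are invented; only the
            // expensive indirect lighting is approximated.
            float out[3];
            tiny_mlp_radiance(mlp_weights, in, out);
            radiance[3*r+0] = out[0];
            radiance[3*r+1] = out[1];
            radiance[3*r+2] = out[2];
        }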

  11. In the future, whoever gives me the best performance per dollar, for LESS than $2,000, will get my business. I don't care if it's team red or team green. I'm sick of nVidia's absolute greed, and I'm sick of AMD never even coming close to bringing the best. Something has got to change. I'd rather buy a used motorcycle than spend $3,000 on a video game card.
