
AMD now plans to release its Radeon RX 9070XT & 9070 RDNA4 GPUs in March 2025

AMD’s David McAfee has just announced that the red team will release its Radeon RX 9070XT and 9070 RDNA4 GPUs in March 2025. In other words, AMD’s cards will arrive after all of NVIDIA’s RTX 50 series GPUs have launched.

It’s pretty obvious to everyone at this point that AMD is playing catch-up. The red team is simply waiting to see the price and performance of the NVIDIA RTX 5070 so that it can adjust the pricing of its RX 9000 series GPUs. That much became clear at CES 2025.

From what we know, the NVIDIA GeForce RTX 5070Ti and the 5070 will have MSRPs of $749 and $549, respectively. So, the AMD Radeon RX 9070XT should land somewhere between those two prices. Rumor has it that it will be closer to a $599 MSRP.

Anyway, it’s pretty obvious that AMD is not THAT confident about its products. If it were, it would have released them at really competitive prices from the get-go. Instead, the red team is now looking to adjust its prices based on the competition.

In one way, this makes sense. NVIDIA has been leading the GPU market for years now. So, there is no point at all for AMD to release its GPUs before, or close to, the launch of the RTX 50 series. By releasing its RDNA4 GPUs after the RTX 50 series, AMD will be able to attract more attention to them.

On the other hand, I’m a bit concerned about this whole approach. For all we know, AMD may even jack up the prices of these GPUs. Do not forget that AMD – like NVIDIA and Intel – is a corporation and not your friend. Not only that, but the red team hasn’t even released one or two benchmarks to somehow please its audience. Again, that’s a bad move. And don’t get me started on its software, which is miles behind NVIDIA’s.

Ultimately, I’m fine with this decision. However, we can all agree that AMD is once again playing catch-up. That’s a bit disappointing but hey, it is what it is. And… well… I told you so. I don’t want to rub it in your face, but here is a direct quote from my previous article. Guess I knew what I was talking about after all, right?

“Instead of exposing themselves, AMD is simply waiting for NVIDIA to reveal its RTX 50 series. Then, once AMD sees how NVIDIA has priced its GPUs, it will most likely adjust its own prices so that it can be competitive with them.”

Stay tuned for more!

40 thoughts on “AMD now plans to release its Radeon RX 9070XT & 9070 RDNA4 GPUs in March 2025”

  1. It is a mistake to wait so long, imo. Most gamers are impatient. If they have needed an upgrade for a long while and didn't want to buy a GPU that was so far along in its lifecycle, they will grab a new Nvidia Blackwell GPU right away. So by the time AMD finally starts releasing, it will have already lost the opportunity to sell a lot of cards. First to market is a smart business tactic.

    Also, please step up the production of 9800X3D CPUs. I badly need an upgrade from my i7-7700K, and I'm not going with Intel again until they get their sh*t together.

    1. You can go with a 14700KF. Excellent CPU. I use a 13700KF. I've probably hit the silicon lottery too; the temps and volts always seem quite good. But with their recent microcode updates, are the problems finally sorted?

      1. The 9800X3D is faster in gaming, but the i7's performance is more than enough for me. The 9800X3D is in short supply right now, and I'm not very confident that AMD will handle the situation well. Mostly they are just blaming Intel for the shortage because the new Intel CPUs are a disappointment, but really, they should have anticipated high demand for the 9800X3D months ago and stepped up production then. So will they manage to do so now? Who knows. I might end up going with the 14700K anyway.

        1. I think they would even have had stock, but scalpers were buying whole truckloads again.

          Sadly, neither AMD nor NVIDIA wants to eradicate the problem. It could be eradicated, but alas.

      2. There is no comparison between the two in any aspect of gaming, perf or power consumption.
        Plus, LGA 1700 is already dead, and AM5 has a lifespan until 2027 and probably beyond.
        Buying a 14700 over a 9800X3D makes no sense.

        1. AMD used to be slightly lower performance at an awesome price, but it's not like that anymore, as I'm sure you already know. As a computer business, I've spotted maybe 23 bad AMD Ryzen CPUs since the 2000 series. Once straight out of the box, but otherwise months down the road. I respect their advancements, but that in itself is a good reason why most people have now gone back to Intel.

          My work and gaming PCs are both on 13700Ks, and I couldn't be happier. I only paid $150 for the gaming PC's CPU because someone who had won it in a raffle and had no use for it contacted my business. New in box. It's a great CPU, and CPUs have plateaued in performance, so who cares? Just get an i5 or a 9700X; it's all fine. It'll be good for ages.

    2. Finally got one, but haven't gotten around to installing it. It still seems a lot of peeps are having memory issues with the motherboard and memory kit I will use (ASUS X670E-F; ASUS has really started to decline, imo) on the latest non-beta firmware. Feels like I'll wait a little; the old X3D still does its job nicely.

  2. That whole industry is kabuki theater. 15+ years of waiting for AMD to land a REAL punch, and it never comes (it benefits both to let Nvidia mark up 200%). So Nvidia's head gets bigger and bigger, and WE pay the price.

    1. 15+ Years?
      7970, 290X, 390X, Fury non-X, 480, 580, 5700 XT, 6800 XT / 6900 XT / 6950 XT, 7900 XTX: all top tier GPUs (both high end and mid-range).

      All HD 7000, R200-300, RX 400-500, 5000 and 6000 GPUs were top tier gems (with a rebadge here and there).

      1. The RX480/580 and the 5700XT could not compete with the high-end Nvidia cards; these were good midrange cards. RDNA2 and RDNA3 tried to compete in raster performance, but lacked AI features, and their RT performance was not nearly as good.

        The last competitive high-end GPU built by AMD was the Radeon 7970. The Radeon 290X was not bad either.

          1. If AMD were competitive at the high end, it would mean that they had achieved very good performance per watt, and this would scale across their entire GPU lineup. The 7900XTX consumed more power than my whole PC while delivering the same performance in raster, and much worse performance in RT. The AMD card also offered FSR features that were simply not good enough compared to DLSS.

            But it seems AMD has realised its mistake and is moving in the right direction. RDNA4 GPUs will finally offer ML-based FSR4 (FG will probably be ML-based too), and even ray reconstruction. Maybe they will finally be competitive when it comes to image reconstruction and frame generation. Leaked benchmarks also suggest good performance per watt. If the 9070XT has all these features and offers 4080/5070Ti performance at an attractive price point, I might finally start recommending that people go for the AMD card instead of Nvidia.

          2. UDNA might give them that, but even when AMD is superior across the board, gaming idiots still buy Nvidia, as the company has them by the balls with successful marketing that AMD simply cannot match. They can either do R&D or shill; they can't do both, and they usually stick to the former.

          3. Yeah, what's best for us consumers is heavy competition for our money. Brand loyalty when there are other viable options usually results in higher prices and less innovation. I've had both brands throughout the years, and both have done their intended job nicely.

          4. You're delusional then, since NVidia GPUs consume way more power and run way hotter than AMD cards.

            To the point that they have their own power connector.

            I don't know what reality you came from, but it wasn't this one.

          5. That's not true.

            The RX7900XTX is the fastest AMD card currently available, so let's talk about this GPU and compare its performance per watt with my RTX4080S.

            The Radeon RX7900XTX reference model consumes 350W at 99% GPU usage regardless of the game (even old games without RT will consume 350W). The non-reference model draws 460W at 99% GPU usage, as shown in this screenshot (from a bang4buckPC gamer YT video):

            https://uploads.disquscdn.com/images/fa3b6de8019c9c4b9ace0548bc1e56dae97aa42ef87538f1f11cdb218b81f44a.jpg
            My RTX4080S draws a different amount of power at 99% GPU usage depending on the game. Old games can draw as little as 220W. Modern games that use RT and Tensor cores draw between 260 and 315W (typically around 275W) at 99% GPU usage, and that's including OC.

            That's quite a big difference in power draw compared to the Radeon 7900XTX, yet both cards offer comparable raster performance, as shown here:
            https://uploads.disquscdn.com/images/39d868a9dac9b7befce81b976e5d1b2baecb0a616c83b75589f331ea42b399dd.jpg
            Techpowerup average fps, based on 25 games tested. Keep in mind my card has 59-60TF (depending on the power budget) and 820GB/s of memory bandwidth, compared to 52TF and 736GB/s in a stock RTX4080S, so it would reach 100fps in this test.

            In raster, the Radeon 7900XTX is well behind my card in terms of performance per watt, and that's not all, because Nvidia has built-in AI cores (Tensor cores) that I can use to boost performance per watt even more. In some games, such as Red Dead Redemption 2, even 'DLSS Performance' (with the latest 3.8.1 dll) offers better image quality than TAA native (especially in motion), while boosting the fps through the roof. In other games I can also use DLDSR (AI-powered downsampling) and DLSS FGx2 (AI-powered frame generation).

            TAA native in RDR2

            https://uploads.disquscdn.com/images/aeeacde9369122e7f9eef9155374c2144a5d8d6c484eeea4eb4bdafafc1e1ea8.jpg
            DLSS Performance

            https://uploads.disquscdn.com/images/fa72121f782e81b9cae261feac2b5637fcf513361159625bba5a2ed33d824a17.jpg
            My RTX4080S can run RDR2 at 4K Ultra settings at around 130-140fps with these settings, and the GPU will consume around 290-300W in this particular game. The Radeon at 4K native TAA has worse image quality and lower fps (around 100-110fps), and consumes 350W minimum on the reference model and 460W on non-reference models.
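
            To put a rough number on that efficiency gap, here's a quick back-of-the-envelope sketch in Python (the midpoint figures are my own assumptions picked from the ranges above, so treat it as an illustration, not a benchmark):

            ```python
            # Rough raster efficiency in RDR2, using the figures quoted above.
            # The midpoints (135fps/295W and 105fps/350W) are assumptions picked
            # from the stated ranges, not measured values.
            cards = {
                "RTX 4080S":   (135, 295),  # ~130-140fps at ~290-300W
                "RX 7900 XTX": (105, 350),  # ~100-110fps at 350W (reference model)
            }

            for name, (fps, watts) in cards.items():
                print(f"{name}: {fps / watts:.3f} fps per watt")

            # RTX 4080S:   0.458 fps per watt
            # RX 7900 XTX: 0.300 fps per watt  (~50% fewer frames per watt)
            ```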

            Now let's look at RT performance per watt.

            In Black Myth: Wukong my RTX4080S gets 35fps at 1440p DLAA (100% resolution scale). The Radeon 7900XTX with similar settings (FSR native instead of DLAA) gets just 10fps (according to the Techpowerup test). My card is 3.5x faster and uses much less power.
            https://uploads.disquscdn.com/images/40e5a701c43c1532d5daa064b8fce83632d28d5aaf02458a09af10c6bf180bb7.jpg
            Now let's tweak the settings, because 35fps isn't a very good experience.

            https://uploads.disquscdn.com/images/9d927e47d346351bf21cd94f5dbdad154d1962b05caf38336a20b28e5a9c606e.jpg https://uploads.disquscdn.com/images/b55fd38517b13a227f3011c44b80d1da6b4d7f8bbbc2ef832b296554ed5e1d12.jpg
            29fps on the RX7900XTX, and 104fps on my OC'ed 4080S. I get a 3.5x higher framerate, and my GPU consumes less power than the Radeon 7900XTX.

            And here are my optimised settings: 135fps in the built-in benchmark. Medium PT still looks very good, and I get over 120fps during gameplay. The Radeon with similar settings would get around 40fps.

            https://uploads.disquscdn.com/images/a98036e7fb875130a25b4b53ad4b275c20c747dec8fa47373004aed828cdf744.jpg
            As for the temps: without OC my card stays below 60C if the game doesn't use RT and Tensor cores. In normal games, though, I see around 63C at 99% GPU usage at stock clocks, and between 65-70C with OC. It's not hot by any means, and my card remains dead quiet even at full load.

            It is what it is: AMD is clearly behind Nvidia when it comes to performance per watt, especially in RT workloads. Nvidia also offers better AI-powered image reconstruction technology (DLSS Super Resolution, DLSS FG, DLDSR, Ray Reconstruction), so overall I think the RDNA3 cards weren't very competitive compared to the RTX 40 series. Maybe RDNA4 will change something, however; we'll see.

          6. The RTX 4080 Super has a power draw of 320W, not 250W, and boy do I have bad news for you when it comes to performance. Contrary to your numbers, the RX 7900 XTX matches and surpasses the RTX 4080 Super's performance in the majority of games.

          7. I have never seen 320W power draw on my RTX4080S. I have seen occasional spikes to 310-315W with OC, but I usually see under 300W in games that use both RT and Tensor cores.
            Older games that only use shader cores draw even less power. That's not the case with the RX7900XTX. Even old games will max out the power budget at 99% GPU usage on this Radeon card.

            Techpowerup is the most well-known and respected benchmark site, and they tested 25 games, which is a lot. Yes, the RX7900XTX surpasses the RTX4080S in a majority of games, by a whole 1fps, but it consumes way more power, so its performance per watt is still much worse. Also keep in mind that my OC'ed 4080S has 8TF more than the stock model, so it's even faster. There are games where my GPU is just a hair behind the RTX4090. For example, in Black Myth: Wukong the RTX4080 gets 89fps, my card 103fps (I even linked a screenshot from the built-in benchmark), and the RTX4090 111fps.

            Techpowerup has once again benchmarked GPUs for their RTX5090 review (see the attached benchmark chart), and in newer games even the stock 4080S is 1fps faster on average than the Radeon 7900XTX. Both cards offer literally the same performance in pure raster and can be considered equal in this regard.

            The RX7900XTX was $100 cheaper up front, but I chose to go with the RTX4080S because it's the better card overall and it uses less power. An additional 150W running for 8 hours is 1.2 kWh per day and 438 kWh per year. In my country I have to pay about $108 for this, so in the long run the RX7900XTX would cost me more. The RTX4080S also has much faster RT (up to 3.5x), and I wanted to play games with RT. Nvidia GPUs also have useful AI-powered features (Ray Reconstruction, DLDSR, FG, and DLSS). With the latest Transformer model update, even DLSS Performance offers better image quality than TAA native.
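
            For anyone who wants to sanity-check that electricity math, here's a quick sketch in Python (the ~$0.25/kWh rate is just an assumption backed out of the $108 figure; your local rate will differ):

            ```python
            # Sanity check of the running-cost estimate above.
            extra_watts = 150          # extra draw of the 7900XTX vs the 4080S
            hours_per_day = 8
            usd_per_kwh = 0.2466       # assumed rate, backed out of the ~$108/year figure

            kwh_per_day = extra_watts / 1000 * hours_per_day  # 1.2 kWh/day
            kwh_per_year = kwh_per_day * 365                  # 438 kWh/year
            usd_per_year = kwh_per_year * usd_per_kwh         # ~$108/year

            print(f"{kwh_per_day:.1f} kWh/day, {kwh_per_year:.0f} kWh/year, ~${usd_per_year:.0f}/year")
            ```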

            I think the reason AMD cards sell so much worse than NVIDIA's is that they're not very competitive when it comes to performance and features. The last AMD card I really considered buying was the HD7970. The Radeon 290X was also good, and everything after that has been a long period of Nvidia dominance.
            https://uploads.disquscdn.com/images/2c8a8ba8449192a2076897b4198241cbfd26a04cde4533095a278de5d8b18e1e.png

          8. I don't know what universe you're from where an RX 7900 XTX runs at 350W at all times, because it sure as hell is not the same universe as mine.

            We can chalk this up to the typical user-error scenario, especially when the RX 7900 XTX runs with passive cooling 80% of the time.

        1. Calling those cards midrange is nonsense. Especially the 5700XT.
          Just because there is an ultra-premium/ultra-luxury/ultra-performance segment doesn't mean other premium products are now downgraded to midrange.
          This would be like saying that since the Bugatti Veyron exists, Lamborghini is now a midrange brand.

          1. The 5700XT could be considered high-end in the sense that it was the fastest AMD card when it was released.

            Nvidia, however, had similar GPUs in its midrange. Cards like the RTX2060 Super or the RTX2070 were equally fast in raster and had DLSS (a very big fps boost) and HW RT on top of that. Thanks to this technology, both the 2060 and 2070 can still run games quite well, even games like Indiana Jones.

      2. And yet they never seem to catch up with Nvidia's contracts and technologies. Every time they discover something, Nvidia is a few steps ahead. It's almost as if one sibling is ahead of the other. But wait, they are cousins after all; that's not just a saying lol

    2. Yeah, the last time they made a massive impact was with the HD 4000 series back in 2008. They kicked Nvidia in the nuts with a steel-toed boot so hard that Nvidia had to drop the price of the GTX 280 by $250 US and the GTX 260 by $100 US. It was truly a glorious moment, the likes of which we may never see again without some drastic reworking of the Radeon Technologies Group.

  3. It's like that saying goes: AMD never misses an opportunity to miss an opportunity. Given how Nvidia holds practically 90% of the market, IIRC, fumbling this hard will most certainly push the green team's share to over 90%. If the 9070XT isn't $399, then it's basically dead in the water. They need to price it that low to seriously shake things up; otherwise they shouldn't even bother.

    1. I am pretty sure their FSR4 is still in a preliminary stage, plus their hardware side is not up to the mark (yet). They have been continuously pulling this $hit for over a decade. I am an Nvidia fanboy, though I have used several Radeon GPUs too (I currently have a 6700XT), but this is disgraceful from AMD to all the gamers out there. Unbelievable!

      1. I'm glad to know my child loves good quality and knows how to escape the evil clutches of society. Nvidia needs to come up with something better than DLSS4 if they want to sell us their latest line. AMD, though, just lags badly in the RT department.

    2. I've been thinking about it, and I can't see any other reason for the delay than that they are hoping that, by releasing later, they won't have the pressure to match or undercut Nvidia's price. The consumer base has been shouting all over social media that the card just should not cost more than a 5070, especially if it doesn't bring the 5070's features, and unless they are about to ship something like MFG or some other killer feature within a single month, I don't know what gets them there.

  4. Of course they want March, since Nvidia took January and people will drain their bank accounts lol

    Oh capitalism, how we love you.

  5. Some retailers already have Radeon 9070XT cards in stock, so I'm surprised AMD is willing to wait so long. AMD already knows what the price of the RTX5070 is, so why wait another 2 months?

    I have this theory that AMD may have been surprised by all the FGx4 marketing. Of course, "5070 as fast as 4090" was just a marketing chart (a gimmick), not raw performance, but now, if they want to compete, they have to offer similar technology. Maybe AMD will prepare its own FGx4 method during these two months, and then it can market the RDNA4 GPUs in a similar way to Nvidia.

    1. Yeah, and optimize & stabilize the drivers, and make sure they're compatible with the major engines, etc. Way more goes on behind the scenes than one might think.

  6. Jesus John, you are such a prophet!

    Stop blowing your own damn horn; it was anticipated, but AMD fans expected better.

    So here we go.

  7. I suspect part of the holdup is that FSR4 isn't ready for prime time, and they are betting a lot on it, hoping to catch up with DLSS. They haven't been able to make AI neural networks work on the 7000 series and are likely having problems with the 9000 series too. Their FSR4 demo was nice and all, but it's easy to make it work in one section of a game with scripted gameplay, and a lot harder to make it work in all games. As far as I can tell, most game developers haven't gotten an FSR4 SDK to help them implement it in their games, which they need if they want at least 50 titles working at launch.

    1. Launching feature-incomplete hasn't stopped them in recent years. I think it's more plausible that they either hope they can escape the pricing pressure of competing with the 5070, or that there is just an internal functional issue that the press doesn't know about but that they estimate will take about a month to fix.

  8. I hope they can compete; we sorely need more competition in the GPU space, or prices will get even more insane than they already are, along with stagnation. The latest 50 series doesn't bring that great a performance increase in raster, and that is still the most used rendering tech by far.
