
AMD FSR 4.0 appears to be better than DLSS 3 CNN but worse than DLSS 4 Transformer

AMD has lifted the review embargo for the AMD Radeon RX 9070 and RX 9070 XT GPUs. As such, Digital Foundry has shared its thoughts about AMD FSR 4.0, which may interest a lot of PC gamers. So, can AMD FSR 4.0 compete with NVIDIA DLSS 4? Let’s find out.

To cut to the chase, AMD FSR 4.0 is better than AMD FSR 3.0. It can also beat the previous DLSS model, the CNN model that was used with DLSS 2 and DLSS 3. However, it is unable to match the visual quality of the Transformer model, which the DLSS 4 update brings to DLSS 2, 3, and 4 titles alike.

AMD FSR 4.0 will be exclusive to the RX 9000 series GPUs. So, it will at least be a good option for everyone who gets these new GPUs. It sucks that this new version of FSR is locked behind the new GPUs. However, it also shows how limited AMD FSR was without dedicated AI hardware. And it proves that NVIDIA was right to use AI from the get-go.

In a way, NVIDIA RTX owners have nothing to fear from FSR 4.0. Thanks to the Transformer model, which is compatible with all RTX GPUs, they can get a better image. And yes, even those with an RTX 20 series GPU can get better visual quality than those with an RX 9070 or RX 9070 XT. I’m mentioning this because a lot of people have been using the “open” nature of AMD FSR to justify its compromises. Well, that bit everyone in the ass except the RTX owners, didn’t it?

Speaking of the RX 9000 series, the RX 9070 XT appears to be the better option of the two available GPUs. According to the reviews, it can be 26% faster than the AMD Radeon RX 7900 XTX in Ray Tracing games. As for the competition, in RT games it can be 21% slower than the NVIDIA RTX 5070 Ti. If the RX 9070 XT comes out at its MSRP, it will sell like hotcakes. And that’s the big question. Will we see it at its MSRP? Or will its street price be something like $800-900? Because if that happens, it will make more sense to get the NVIDIA RTX 5070 Ti.
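To put that street-price question in perspective, here is a quick back-of-the-envelope sketch in Python. It assumes the announced MSRPs ($749 for the RTX 5070 Ti, $599 for the RX 9070 XT), the ~21% RT deficit quoted above, and a purely hypothetical $850 street price; these are not my own benchmark numbers.

```python
# Back-of-the-envelope RT price/performance, based on the figures above.
# Assumptions: RTX 5070 Ti normalized to 1.0, RX 9070 XT ~21% slower in
# RT games, the announced MSRPs, and a hypothetical $850 street price.

RT_PERF = {
    "RTX 5070 Ti": 1.00,
    "RX 9070 XT": 1.00 - 0.21,  # "21% slower" in RT games
}

SCENARIOS = [
    ("RTX 5070 Ti", 749),  # NVIDIA MSRP
    ("RX 9070 XT", 599),   # AMD MSRP
    ("RX 9070 XT", 850),   # hypothetical street price
]

for gpu, price in SCENARIOS:
    value = RT_PERF[gpu] / price * 1000  # relative RT perf per $1000 spent
    print(f"{gpu} @ ${price}: {value:.2f} RT perf per $1000")
```

In RT terms alone the two MSRPs roughly cancel out, so the RX 9070 XT’s value case rests on raster, where reviews put the two cards much closer; at an $850 street price even that case collapses, which is exactly why the MSRP question matters.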

As I’ve said, I’ll make sure to purchase an RX 9070 XT once they are available in Greece. So, it will all come down to overall availability.

Stay tuned for more!

[Embedded video: AMD FSR 4 Upscaling Tested vs DLSS 3/4 - A Big Leap Forward - RDNA 4 Delivers! (Digital Foundry)]

36 thoughts on “AMD FSR 4.0 appears to be better than DLSS 3 CNN but worse than DLSS 4 Transformer”

  1. damn john, I got a hint of it in your previous work but I didn't think you were this partisan about the AMD cards. It's cool they did make the upgrade, but as you said, the availability of the cards close to MSRP will tell the real story. I think for what's on offer, $150-$200 less for an upgrade is a significant enough discount. It will also be very interesting to see what happens with the 60-class cards later, and whatever Intel does for its own actual mid-range options.

          1. I was able to get the card but they had like no stock of anything else remotely in that price range. Even the 5070s and 9070s were totally gone.

  2. He is comparing images with different levels of sharpness. I think it needs to be retested with the same sharpness before judging the difference.

        1. TAA has fcked your brain so hard that now you confuse a blurred mess with clarity.

          However, that ghosting on the hand kills me.

  3. I have a question for you people. Since we all know of the issue with 32-bit PhysX games and the RTX 5000 series, I would like to know if an RTX 5000 owner can bypass the issue by using a non-5000 card dedicated to PhysX. Thanks

    1. Imagine buying a GPU for hundreds or thousands of dollars but needing a second GPU to play games from a decade and a half ago with all features turned on.

      1. Nope, I'm still using a GTX 1080 Ti, I was just asking, but nice try
        PS: The second GPU can be a used $40 GTX 970, it doesn't have to be a monster

        1. The irony is that if you play Mirror's Edge using that $40 GTX 970 as a standalone GPU with all settings maxed out, you will probably have a better experience than playing it on a standalone RTX 5090…
          PS: Great OG GPU you own; a 1080 Ti will also beat an RTX 5090 as a standalone GPU in any 32-bit PhysX game. Imagine that, a GPU from almost a decade ago beating the latest high end… lol

    2. Yes, there have been many reports of people using a second card, even something like a 3050, dedicated to PhysX to reclaim the full hardware performance advantage. If I had an extra PCIe slot available I'd probably do the same, since a 3050 doesn't really cost much.

    3. Yes, when you have two GPUs there is an option in the NVIDIA Control Panel's PhysX settings to "dedicate this GPU to PhysX".
      Right now you can grab something like a 1050 Ti and keep it as a backup GPU.
      To future-proof, a 3050 will be optimal since it won't be dropped from drivers for a long time; the 1050 Ti will eventually be deprecated [your 970 is also fine if you have space for it].
      If you have the 1080 Ti and still have the 970, you can boost the FPS of ALL PhysX games on your PC: plug in that 970, select it as the PhysX card, and you'll get a nice uplift.

      I plan to do the same, probably get one of the AliExpress server-grade single-slot 1050 Ti 4GB models. Even if driver support drops, such a card will always be wanted by people building small home servers, media PCs, etc., so it will be easy to sell.

      I have a backup 4070 Super, but it's a dual-slot GPU that uses 16-pin power. I'd rather get something that has no power plug at all, and preferably single-slot, or dual-slot but small-sized like a 3050.
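      If you're wondering which of your games are even affected, a quick-and-dirty check is whether the game's executable is 32-bit, since the RTX 50 series only dropped 32-bit CUDA (and with it 32-bit GPU PhysX). Here's a minimal Python sketch; the path at the bottom is a placeholder, it only reads the PE header, and it only tells you the bitness, not whether the game actually uses GPU PhysX:

      ```python
      import struct

      def is_32bit_exe(path: str) -> bool:
          """Return True if a Windows executable is 32-bit (x86)."""
          with open(path, "rb") as f:
              if f.read(2) != b"MZ":                # DOS header magic
                  raise ValueError("not a Windows executable")
              f.seek(0x3C)                          # e_lfanew: PE header offset
              pe_offset = struct.unpack("<I", f.read(4))[0]
              f.seek(pe_offset)
              if f.read(4) != b"PE\x00\x00":        # PE signature
                  raise ValueError("PE signature not found")
              machine = struct.unpack("<H", f.read(2))[0]  # COFF Machine field
          return machine == 0x014C                  # IMAGE_FILE_MACHINE_I386

      # Placeholder path, just as an example:
      # print(is_32bit_exe(r"C:\Games\MirrorsEdge\Binaries\MirrorsEdge.exe"))
      ```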

  4. Personally, I found that the DLSS 4 transformer model introduced new artifacts. It wasn't necessarily "better" than the CNN model, it just had different deficiencies. As for ghosting, if I recall correctly, Gamers Nexus found that in certain situations the transformer model was actually worse at ghosting, whereas in others it wasn't as bad.

    1. It will be a long time until we see a "fits all" AI upscaler… and I suspect it will get even worse when they add even more fake frames on top of the already ridiculous latency-bloating fake frames the 50 series added. The 60 series will no doubt have yet another "fake frames double down" as its only improvement, at twice the price of the 50 series… I hope AMD really starts to make a dent before NVIDIA's prices ruin PC gaming as we know it. When the average Joe can't get OK performance for their money, we get a smaller install base… meaning fewer games… and the dominoes keep falling. NVIDIA used to be better, but now they only care about their bottom line, even if it means killing PC gaming in the process.

      1. I don't think NVIDIA cares about anything but AI anymore. As long as they can get sales on consumer GPUs for AI stuff, they'll do it, and once they can't, they probably won't care about the desktop graphics market anymore.

    2. As an addendum to this, I tried xHybred's fork of NVIDIA Profile Inspector, enabled Profile J for DLSS/DLAA in Cyberpunk 2077, and set the game to DLAA with the Transformer model, and there are no noticeable ghosting artifacts. There may be some slight blurring in motion, but I can't be certain if it's DLAA or my monitor. I can't get the FPS high enough even with DLSS due to one core on my CPU capping out at 100% (I guess there's some increase in CPU usage in this configuration).

      Now, I do notice a weird artifact every now and then that almost looks like a mesh or fishnet pattern on something, which slowly disappears as I stare at it, but it happens pretty infrequently. There may be other smaller artifacts too, but they are pretty minor and hard to notice.

      One downside of this configuration is that the dithering in hair is more noticeable than with the default DLSS/DLAA profile. I deal with this problem in ReShade by using a Directionally Localized Anti-Aliasing shader (old-school DLAA, from before NVIDIA stole the acronym) and the SweetFX CAS shader to sharpen the image (a rough sketch of what that CAS pass does is below). As someone who isn't used to temporal AA of any kind, it looks a bit weird to me, as if someone sharpened an oil painting, but it is playable.

      Another downside is lower FPS. Even with DLSS set to Performance I can't get the same FPS I was getting when AA was disabled via Cyber Engine Tweaks (CET) and I was using a boatload of ReShade shaders to deal with the aliasing, dithering, and other issues in the game.
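      For anyone curious what that CAS pass actually does, here is a rough numpy sketch of the core contrast-adaptive sharpening idea, modeled on AMD's FidelityFX CAS (I'm assuming the ReShade/SweetFX shader follows the same scheme); it's an illustration, not the actual shader code:

      ```python
      import numpy as np

      def cas_sharpen(img: np.ndarray, sharpness: float = 0.5) -> np.ndarray:
          """Contrast-adaptive sharpening sketch (after AMD FidelityFX CAS).

          img: float32 array in [0, 1], shape (H, W, C). Low-contrast areas
          get sharpened more; pixels near the signal limits get sharpened
          less, which limits halos and clipping.
          """
          p = np.pad(img, ((1, 1), (1, 1), (0, 0)), mode="edge")
          c = p[1:-1, 1:-1]                   # center pixel
          n, s = p[:-2, 1:-1], p[2:, 1:-1]    # north/south neighbors
          w, e = p[1:-1, :-2], p[1:-1, 2:]    # west/east neighbors

          cross = np.stack([c, n, s, w, e])
          mn, mx = cross.min(axis=0), cross.max(axis=0)

          # Per-pixel amount: distance to the nearer signal limit relative
          # to the local max, soft-shaped with a square root.
          amp = np.sqrt(np.clip(np.minimum(mn, 1.0 - mx)
                                / np.maximum(mx, 1e-5), 0.0, 1.0))

          # Negative neighbor weight; sharpness=0..1 slides the peak
          # between -1/8 and -1/5, as in the reference shader.
          wgt = amp * (-1.0 / (8.0 - 3.0 * sharpness))
          out = (c + wgt * (n + s + w + e)) / (1.0 + 4.0 * wgt)
          return np.clip(out, 0.0, 1.0)
      ```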

  5. It performs closer to NVIDIA, offers good-enough features, and is cheaper while consuming more power. It's like we've come full circle back to the GCN era.

  6. Ahem, what's all the excitement about?
    From what I see in the reviews, the RX 9070 XT is in the ballpark of the 4070 Super: sometimes faster, on average equal.
    4070 Supers are not expensive cards

    1. The f*** they aren't, and they're also hardly in stock. There are maybe two new 4070 Supers in my Micro Center, and they're $650.

      1. Seriously? When I got one 12 months ago from Amazon, there was plenty of stock; I paid $600 for an ASUS model.
        I watched more 9070 XT reviews and I would get it instead. Sure, it has sh**ty ray tracing, but sometimes it touches the 4080 Super in raster, so it's a great card.
        It has teething issues though
