First gaming benchmarks for the AMD NAVI GPUs, Radeon RX 5700XT & RX 5700, have been leaked online

The embargo on third-party gaming benchmarks for AMD’s NAVI GPUs, the AMD Radeon RX 5700XT and the AMD Radeon RX 5700, lifts on July 7th, and it appears that the first benchmarks have already leaked online.

These first benchmarks come from the Polish website Benchmark, which accidentally published its article early yesterday. While the article has since been taken down, the benchmark graphs have already spread online, giving us a small taste of how these upcoming GPUs perform against the new NVIDIA SUPER graphics cards.

As we can clearly see, the AMD Radeon RX 5700XT is unable to match the performance of the NVIDIA GeForce RTX 2070 SUPER. NVIDIA’s latest SUPER GPU will replace the original RTX 2070 model (at the very same price). The AMD Radeon RX 5700XT will be priced at $449, whereas the NVIDIA GeForce RTX 2070 SUPER is priced at $499.

Not only that, but the NVIDIA GeForce RTX 2070 SUPER is more future-proof than these first NAVI graphics cards. As we’ve already reported, both the AMD Radeon RX 5700XT and the AMD Radeon RX 5700 will not support real-time ray tracing; a feature that all of the RTX GPUs currently support. And since the next-gen consoles will support Ray Tracing, we can safely say that owners of these specific NAVI GPUs may encounter visual degradation in next-generation games.

Obviously, we’ll have to wait for more gaming benchmarks. However, if these three benchies (Shadow of the Tomb Raider, Far Cry 5 and Wolfenstein 2) are any indication of how these new NAVI GPUs perform across a wide range of games, we can safely assume that NVIDIA’s SUPER GPUs are better in every possible way.

86 thoughts on “First gaming benchmarks for the AMD NAVI GPUs, Radeon RX 5700XT & RX 5700, have been leaked online”

  1. Not impressed with anything from this generation of cards. I will see what Nvidia is doing with their 3000x series which I expect to be around with Cyberpunk 2077.

        1. I’d be happy with everything you said, but at high settings. I just want a noticeably better experience than I could get on my PS4 Pro.

  2. It really is insane how good of an investment my 1080 Ti has been so far.

    Probably the best GPU I have ever bought. I really hope the 7nm next-gen Nvidia tech will have a new Ti that will be as good as my 1080 Ti. Looks like I can easily wait till next year when they drop the new Ti, feels good man.

    1. It is an amazing card that runs everything. If I decide not to buy the new Ti next year, then my 1080Ti should last a few more years.

      1. For sure: low temps, insane performance, and it has lasted so long that you really could not ask for more.

        If you run 1080p you could easily hold on to it for a few more years, especially if you don’t care about ray tracing. Pretty sure 1440p is fine as well.

        Myself, I run 3x 42″ 1080p screens (5760×1080), so I really could use 2x a 1080 Ti to be 100% perfect on Ultra. Having said that, it has been crushing everything really well; I can’t complain whatsoever. When the 3080 Ti comes out I will 100% be getting it (unless it is a marginal upgrade, but I doubt it).

        Also, I’d really love to get new screens; those 120Hz variable refresh rate HDMI 2.1 screens sound amazing! LOL, I’m going to need to rob a bank 🙂

      1. Exactly! I think by the time Cyberpunk drops Nvidia and AMD should both be releasing their top tier cards.

        I really hope Nvidia does not hold back on the 3000s. If you and I both upgrade to the 3000 version of our cards, I bet money they will be minimum 2X the performance, which is really all I want. If the RTX is also 2x the performance, then that will just be a bonus!

        With all the tech that we have I think it will 100% be doable; price is really all that I worry about. This is where I really hope AMD picks it up so we have competition like with the CPUs right now (well, that’d be ideal, we’ll see).

  3. “As we’ve already reported, both the AMD Radeon RX 5700XT and the AMD Radeon RX 5700 will not support real-time ray tracing”
    I don’t agree with this statement. The current RTX cards can barely run RTX. Once new cards drop that leverage the tech further, and games require more power for the effects, the RTX cards that are currently available will become even more useless when it comes to these features. If they barely function now, they’ll be pointless in the future.
    Sure, the next gen will be improved, but at the sacrifice of current cards.

    1. Common sense, a breath of fresh air on this site. Thank you. 😉

      For me it has always been this: if RTX worked without a performance hit, that could be called an advantage over AMD GPUs, on top of Nvidia’s better overall performance. But if, once you activate Ray Tracing effects, your frames go down like the Titanic, that is not a good thing, since you paid not only for the overall performance but for the “extra” features, which should work far better than just “playable”.

      This reminds me of when I had my 560Ti and had to turn PhysX and Hairworks OFF so I could play at 60FPS. Yeah, the GPU could do it, but poorly… so to me it wasn’t even an advantage. With the GPU I have now I can turn that ON, but in 5 to 6 year old games… amazing, and it is not even advertised by the GPU maker.

      I said this to another guy who was saying that “RT is an advantage that Nvidia has over AMD GPUs”: imagine you buy an RTX GPU and, once installed, RT effects are enabled by default and you can’t turn them OFF. Imagine paying what you are paying for that and looking at the performance you got.

      For me, raw power is more important, and then pricing. Nvidia is also winning in performance, at least with the new “Super” ones guaranteed, but those will also perform like crap with RT enabled (I’m talking 2060/2070 GPUs here, the competition to the RX5700/XT), so you won’t end up using those effects at all, since you always want a better frame rate along with the best visuals possible. (And whoever says the contrary is either in denial or not a true gamer.)
      It is the same story as with mid-range GPUs back in the day, but instead of PhysX and Hairworks we have Ray Tracing. And they are trying to sell this “tech” as gold when it is barely working… because it is barely working. Lower frames are something you don’t want in your games; that’s why on PC we have settings to choose from. They are giving you the option to enable it or not (that is fine)… but they are charging you whether you are going to use it or not. If that sounds right to you.

      I’m using Nvidia GPUs and will probably keep using/buying them. But prices must go down, and if they want to add new tech, at least make it so it won’t have a direct impact on performance. That’s why you add a feature: to have that added detail working as it should, not at half of what it should be, and no “future proof” BS so it might work 5 years from now… instead, develop the thing, take those 5 years, and give me something that works properly; then charge for it if you want.

      They are doing “GPUs as a service”. GPUs now are like EA games: unfinished, with drivers that need fixes every week, and GPUs dying on release (a friend of mine lost two 2080Tis on release: one was DOA, and another died with that artifact image thing on screen, not OC’ed or anything, just playing FIFA 19 on a 1300€ GPU. He RMA’ed them, of course, as he builds PCs to sell to others).

      So yeah… the Ray Tracing thing doesn’t work for me. The only GPU that can do it at 1080p 60+FPS is the 2080Ti, and it is quite expensive; the others are obsolete for that function.

      I’m going to wait and see what happens with 3000 series or whatever next year.

        1. “AMD Fanboy”. I have had PCs since the late 80’s/90’s, 80% of them with Intel CPUs.
          If the only thing you can write is that, without an argument, you are another one who goes into the bin. Ignored (like most of the known trolls on this site).
          I won’t bother even reading people like you, with the mentality of a 12 year old. 😉

          1. Hairworks and RT may be gimmicky, but Nvidia still offers a better deal than AMD in every way. Nvidia is always experimenting and trying new things, setting new trends.
            The only thing AMD has done worth mentioning in the past decade is gimping gaming with their cheap console chips. The only reason I hope they stick around is so Nvidia still has some competition.

      1. This is not about common sense. It’s only about how you use your PC. Not everyone needs 60 FPS. For example, I’d rather play games in 4K with the best graphics if they give me 30 or more FPS. That is better for me than 60 FPS in 2K. Better graphics give me better immersion. So, for example, Witcher 3 with all settings on (including Hairworks) in 4K at around 35 FPS is better than 60 FPS in 2K. There are people who like better graphics more than higher FPS. That’s why I like all that stuff from NVIDIA like PhysX, Gameworks or now raytracing. I don’t have a GPU for RT now, because I do not accept current prices for RTX GPUs. But I do not agree with the statement that it is not a benefit over NAVI. The RTX 2070 and above give 60 FPS and above in FHD and above 30 FPS in 2K. The least powerful, the RTX 2060, gives you above 30 FPS in FHD. The RTX 2080 Ti gives you 60+ in 2K. How can someone say that it is not usable? Or not an advantage? For you, maybe. But that does not mean it is like that for everyone.
        BTW, are you the Foxiol who made games for VR? Or is that another guy?

        1. I believe you are in the rare minority here. For most, 60 fps is the bare minimum considered playable, and for some, even that isn’t enough.
          I, personally, consider nothing below 60 fps playable, as it just isn’t smooth, and the latency/input lag lower framerates create just isn’t fun.
          But I can definitely see where you’re coming from. You can value whatever you enjoy.
          Still, for most, the performance just isn’t there for their minimum requirements (60 fps and high resolutions).

          1. “But, I can definitely see where you’re coming from. You can value whatever you enjoy.”

            I just like innovations, and it does not bother me if they don’t have the ideal framerate for me. But I also have boundaries. 30 FPS is my minimum. Not average, but minimum.

        2. Gameworks and PhysX weren’t worth the performance penalty. Have you noticed NVIDIA doesn’t really push them like they used to? RTX is a joke currently; it will take another 5-6 years before it really shines, and these RTX cards will be just a memory by then.

          We definitely agree on the prices being too high.

          I play on a 1440p 144Hz monitor; I can’t go back to 30fps, and 60 is the lowest I will go. I don’t mind turning down settings to get that high refresh.

          1. For me, PhysX was definitely worth it, but I had an advantage from the beginning with this. I always had a second GPU just for PhysX. When I bought a new GPU, my old one became the PhysX GPU. So I had 8800 GT + 8600 GT, GTX 460 + 8800 GT, GTX 780 + GTX 460 and now GTX 1080 + GTX 780. So I did not have much of a performance penalty this way.
            With Gameworks, it depends on what modules the game uses. I like HBAO+, VXAO, PCSS or HFTS. All of these make graphics better with an acceptable performance hit for me. As I said, 30 or more FPS is good enough for me. So I did not have to lower the resolution because of it.

            “RTX is a joke currently it will take another 5-6 years before it really shines and these RTX cards will be just a memory.”

            You could say the same about, for example, the first DX12 or DX11 GPU generation. Where are they with performance now? Should we have waited until now to buy the first DX12 GPU? Or should we wait another 5 or 6 years for GPUs with significantly better performance?

        3. I and people I know find that 30 fps, no matter the kind of game, is not what we call “playable condition”…

          And… what’s the size of your display, to see that big a difference between 2K and 4K?

          1. For me it is good enough for single player games. But in most cases I have more FPS because I have a good GPU (GTX 1080). What I am saying is that I don’t have to have 60+ FPS. For example, I played Witcher 3 in 4K at 35 FPS on average. It was without problems and better than having higher FPS in 2K.

            I have a 27” 4K Acer Predator. It has an IPS display and GSync.

        4. Well, yes, you are right about some things, but I also said that just in the case of using Ray Tracing, these GPUs (2060/2070) “are obsolete”.
          The only one that is worth it (not by cost, which we both agree is expensive, but by its raw performance at everything) is the 2080Ti.
          Nvidia shouldn’t advertise “these GPUs do Ray Tracing” when the performance hit is as great as it is, because in the realm of gaming, yes, there are people like you who can play at 30FPS (heck, people still play on laptops that barely hit 20FPS in games and they are “happy” because they can at least play those games at that) with max graphics… but it is not an optimal thing to do, for sure.

          Stressing a GPU just because you “don’t mind” a certain FPS target, or because you prefer having that little “extra” detail enabled, is your choice. But again, we are talking about playing games, where latency is also a part of it; having great control over your character and your inputs during certain actions is still important. So in gaming it is more relevant to always have a better frame rate than the most beautiful graphics possible, if you can’t achieve both with whatever GPU you have. (The ideal is having both, of course… and that is what these GPUs can’t achieve, from this perspective.)

          To each their own, I guess, as always; no one has the absolute truth or is right about this matter, but there is a good example here as well. Most of my friends made the jump to 144Hz this year (I’m still at 60Hz, playing PC games on a massive but good HDTV) and they always tell me how they can’t go back to 60Hz again: even if they have to sacrifice some graphics settings, the way all the games feel, how smooth it all is in your hands, is incredible. I still have yet to try that, but I also played on consoles (since the Spectrum/Atari days) at those 30FPS for a long time, and once I stopped playing on consoles (my last one was the PS3) and did a single GPU upgrade on my PC back in the day, I also couldn’t go back to those 30FPS.
          So it is all about the feeling, more than anything, in gaming, at least for me… and again, the day I make the jump to 144Hz or 240Hz, I bet my goal will be to find a GPU that reaches that FPS mark so I can play as smoothly as possible. (It always depends on the game too; not all games will hit those higher frames if you also want the best graphics enabled, etc.)

          Answering your question about the guy who makes VR games: no, I’m not. Just a regular gamer here (I did some stuff in CryEngine back in the good old days of Crysis, but all small personal projects). Weird to see someone with the same nick as mine, hehe.

          PS: At the end of the day, we all want to have fun playing games…just that.
          I totally understand people like you, but you have to understand people like me. I want both things, and I bet you want that too (frame rate and the best graphics possible); if I can’t achieve one thing (good frame rates, because of a single feature), that feature is useless to me (as Ray Tracing is now).

          Glad to see someone actually talking instead of calling things to others. 😉

          Edit: Corrected some derp finger stuff (there might be more).

    2. “RTX cards can now barely run RTX.” —-> Have you forgotten the real performance of the RTX 2080 8GB, RTX 2080Ti 11GB, Titan RTX 24GB? ^^

      1. The real performance? You mean struggling to get 1080/60 with RTX on?
        That’s not real performance, that’s sad.

    3. The RTX Super does fine, tho.

      Regardless, AMD GPUs are holding back PC.

      But their CPUs are pushing PC forward, lol

      1. Fine? Depends on your standards I guess. I don’t run anything below 1440p /90 fps if I can avoid it.

  4. All this is another reason why Nvidia’s market share lead will keep them ahead of AMD for years in terms of R&D, and why they can hold back on GPU power. Because let’s face it: if the market were in a more even state, these “Super” cards from Nvidia would have been part of the original Turing lineup.

    The further AMD is behind, the more Nvidia will hold back on its GPUs while selling them at higher price points.

    1. The problem is or will be when Intel joins the party…and it is another company known for its higher prices as well.

      If the future battle for GPUs is against both of these companies… the horrors that await us within a mere couple of years… -sending virtual hug- xD

      1. “and it is another company known for its higher prices as well.”

        That’s only true for markets where they have dominance. Where they don’t, they won’t hesitate to start a price war; what happened in the mobile and compute accelerator markets proves this. Though in mobile Intel still failed, because Qualcomm had some sort of unfair advantage that they were able to enforce through legal means (that’s why Qualcomm got into hot water in several countries). In the compute accelerator market, they somehow managed to erode Nvidia’s market share and at the same time pretty much kill AMD in that market.

        1. Hope you are right. If Intel enters as a third GPU competitor, prices should go either down or up, depending on how their cards perform against their Nvidia/AMD counterparts.

          I don’t expect them to release a product that beats the others, but if Intel does, do you think they will set a lower price compared to Nvidia, for example? If they want market share as soon as possible, maybe yes; they could set a very low, reasonable price on a great-performing GPU and everyone would jump ship (well… maybe). But they could also do what they did with the leadership they had with their CPUs up to this point and keep raising prices relative to the competition… which I’m not so sure would be ideal for them without market share (not counting Intel HD graphics here).

          Let’s see what happens, a third competitor in this space could be great for us…or not. Just cross fingers. 😉

    2. The problem for AMD is that NVIDIA is not holding anything back. That’s the reason they are still behind. How do you know these new GPUs could have been released a year ago? They are using partially faulty GPUs from higher tiers, which didn’t have to exist in relevant numbers at the beginning of production…

  5. Current RTX cards can barely play current-gen games with their partial implementation of DXR.

    Next-gen games coming late next year onward + same or more complete implementations of DXR?

    There will be no difference between Pascal, Navi or Turing in rendering performance, given all of us will be playing with DXR off, lmao.

    1. This is the way I’m looking at it too. They already don’t have the power, in the future they’ll be even further behind. There is no future proofing here.

    2. Can barely play?
      Metro Exodus:
      - 2080Ti: 4K, average 60+fps on extreme with ultra ray tracing
      - 2080: 1440p, average 60+fps on extreme with ultra ray tracing
      - 2070: 1440p, around 60fps on high with high ray tracing; 1080p, 60+fps on extreme with ultra ray tracing
      And the 2060 will also play the game nicely in 1080p: ultra with ultra ray tracing, 60+fps.

      I don’t know where everyone is getting all that “barely playing games” from. Do you just take the performance of the game when it was released? Because then, yes, they sucked. My 2070 had problems in Exodus at 1080p on high with high ray tracing: 50-60fps. But now, after a few updates, I can run it on extreme with ultra ray tracing at 60+fps.

      I won’t talk about other games, because I think Exodus has the most performance-heavy ray tracing implementation: global illumination. Of course there’s Quake 2, but we won’t get modern games with a full implementation of raytracing in the next few years.

      If you don’t believe this performance, check recent YouTube videos where the game is updated.

      1. Exactly. There are still people spreading the idea that you need an RTX 2080 Ti for 60+ in 1080p. That’s nonsense.

  6. And since the next-gen consoles will do support Ray Tracing

    They will NOT.
    There were some marketing claims because HYPE, but no, they will NOT. The hardware for those consoles was finalized some time ago, and AMD didn’t have any hardware solution for raytracing at that time; and even if they had, it would have a lower RT budget than the RTX cards (which is not very much, and just for gimmicks).

    1. Are you sure? Sony, for one, is very obsessed with specs. There is no way they will let only the PC version of a game have ray tracing.

      1. As I said, the new GPUs won’t do it at the hardware level. Even the RTX cards can only do gimmicks; it will take a few generations for this to really mature. The new consoles might do some gimmicks, but nothing more than RTX, and more likely even less. It’s a wasted graphics budget for things the devs could utilise better.

      2. They let only the PC versions of most games run at 1080p (or higher) this gen, when they had promised that games would be 1080p. MS had promised destruction physics powered by THE CLOUD. That never happened either. Both manufacturers will claim random nonsense to hype up their hardware and then forget all about it a year later.

  7. Not that bad actually.
    On DirectX, the Anniversary Edition OC might be enough to put it on par; Navi will be competitive given a good cooler.
    Something must be wrong with the vulkan driver though.

    Still, I don’t understand the logic of launching a non-RT card when the next-gen consoles (which are using your tech, btw) are confirmed to have it.

    I thought at least Navi would have that hybrid thing; these cards are guaranteed to be obsolete in a year.

    1. And it’s also completely irrelevant from a consumer’s standpoint.
      It would be relevant if Nvidia were on 7nm with more performance, but they are not.

    1. Don’t think that they couldn’t launch an RX 5950 XT which could outperform even the 2080 Ti, if that were their business plan. However, since they hold a very limited share of the market, they need to prioritize what they spend their resources on; as a first step for this generation they are striking the upper-mid section of the market, where arguably the largest profit is to be made, possibly allowing for higher-end cards to be developed in the future.

  8. “And since the next-gen consoles will do support Ray Tracing, we can safely say that owners of these specific NAVI GPUs may encounter visual degradation with all the next-generation games.”

    “we can then safely assume that NVIDIA’s SUPER GPUs are better in every possible way”

    Read: don’t buy it, buy NV 😉 (John, please, lol, we are not in the market)

    But this Navi is (for real) a Polaris replacement! (Yes, this is a 250mm² chip.)

    So the 400-500€ price tag is not acceptable. IMO they should price them at a reasonable 349€ & 299€, and they will sell like hot cakes!

  9. So if you buy nvidia, you get slightly more performance for paying slightly more money? Wow who wudda thunk!

  10. How are they screwing their customers? The original RTX GPUs were released a year ago. Do you feel screwed every time hardware newer and better than yours is released?

  11. The only thing I see is that AMD is an incompetent company that can’t create competitive cards.

  12. AMD late to the party, and still can’t compete.

    Sadly this just doesn’t change.

    They’re not even offering good value any more, just slightly cheaper, but inferior cards.

    I’ve given up hoping this is going to change.

    Only Intel can shake things up, but I don’t think that will be at the high end.

      1. It’s ok, but it doesn’t do anything to change the market.

        Nvidia will just cut prices on other cards and make it look like less of a value.

        They are no longer a bang-per-buck competitor in my eyes; they’re slightly cheaper, but offer less performance.

        Nothing that really puts pressure on Nvidia

  13. As someone who built a PC this gen, all I can say is I WILL NEVER EVER REPEAT THAT MISTAKE AGAIN.

    1. Well, up to a point, you need to upgrade. In my case I’m on a mix of 2012 to 2014/2015 components and definitely need to upgrade, and whatever I put in here will be a very good jump in performance.

      I don’t care when they release GPUs like this, to be honest; the little gains (8-10% at best, which translates to 10-15 more FPS) are not a big deal at all if you bought the 2070 compared with the Super.
      Again, if you are coming from something like mine it really doesn’t matter; you won’t need to upgrade from a 2070, so it should mean nothing to you.

      Next upgrade, do this: buy the best thing you can, then never watch what comes next, and you’ll be happy. 😉

  14. People will get used to premium prices. Prices for mid-range cards will never go back to 250€, thanks to stupid people buying Daddy Jensen’s heavily overpriced cards.

  15. Fresh news: AMD has now cut the prices. The RX 5700XT will be priced at $399, which makes sense to me.

  16. “As we can clearly see, the AMD Radeon RX 5700XT is unable to match the performance of the NVIDIA GeForce RTX2070 SUPER.”

    Based on what? The weird Wolfenstein benchmark? As far as I can see, in the other two benchmarks the 5700 series is more or less matching the S series in performance for less money.

    Also, it seems the 5700XT will be $100 cheaper than the 2070S if the rumors concerning the price cut are true. (RX 5700 for $350, 5700XT for $400, 5700 XT 50th Anniversary for $449.)

    “Not only that, but the NVIDIA GeForce RTX2070 SUPER is more future-proof than these first NAVI graphics cards. …Radeon RX 5700XT and the AMD Radeon RX 5700 will not support real-time ray tracing; a feature that all of the RTX GPUs currently support. And since the next-gen consoles will do support Ray Tracing, we can safely say that owners of these specific NAVI GPUs may encounter visual degradation with all the next-generation games.”

    LMAO, what kind of bullshit analysis is this? RTX won’t really be relevant for another couple of years, and by that time this first line of RTX cards probably won’t be able to support improved raytracing.

    “Obviously, we’ll have to wait for more gaming benchmarks however if these three benchies (Shadow of the Tomb Raider, Far Cry 5 and Wolfenstein 2) are any indication of how these new NAVI GPUs perform on a wide range of games, we can then safely assume that NVIDIA’s SUPER GPUs are better in every possible way.”

    Thanks for the quality shillin’, the hard tech know-how on how it all actually is, and the journalistic thought put into this article. I’m sure the bigots will be stroking their e-peens over the 2070S.

    1. “And since the next-gen consoles will do support Ray Tracing, we can
      safely say that owners of these specific NAVI GPUs may encounter visual
      degradation with all the next-generation games.”

      Both Xbox “Project Scarlett” and Playstation 5 have been confirmed to be based on AMD Zen2 CPUs with Navi GPUs (essentially Ryzen 3700x CPU with X5700 graphics).

      1. Yeah… that statement was so weird. It’s not as if the change is going to happen overnight, where all the games that can use ray tracing start substituting regular rendering with full or partial ray tracing, and all AMD and NVidia GPUs from Pascal on down start getting degraded graphics.

        If anything, AMD’s products should perform better because of the optimizations studios make for games on consoles. Although that might not necessarily be true, hopefully we should at least start seeing better performance in games utilizing more than 4 cores.

        I think I’ve seen somewhere that Sony will be using their own solution for ray tracing, or something along those lines, but I don’t know if AMD is involved in that work.

        As for degradation… maybe in 10 years’ time? I don’t know if it even works like that; I’m not savvy on that front and I’m not about to speculate on something I don’t know without any evidence to back it up (talkin’ to you, Papadopoulos).

        1. Developing for consoles and PC is completely different, despite the fact that they might share a similar architecture.

      2. The PS5 and Xbox Scarlett could be using the Navi 20 architecture, which will support hardware ray tracing. If not, they will have their own custom ray tracing solution, which will have nothing to do with the 5700/XT or how first-gen Navi cards support the technology.

  17. “And since the next-gen consoles will do support Ray Tracing (…)”

    LOL, I’ll believe it when I see it…

  18. >As we’ve already reported, both the AMD Radeon RX 5700XT and the AMD Radeon RX 5700 will not support real-time ray tracing; a feature that all of the RTX GPUs currently support.

    False. They don’t support the Nvidia-specific hardware acceleration of ray tracing, aka “RTX”. However, the most likely future standard for ray tracing on GPUs will be something that works on any GPU. Crytek is already doing this, achieving perfect reflections and great indirect lighting at 1080p/30fps on a Vega 56.
    https://www.cryengine.com/news/how-we-made-neon-noir-ray-traced-reflections-in-cryengine-and-more#

  19. In case you guys don’t know, AMD just dropped the prices on these GPUs 2 days before launch! This, my friends, is why competition is good for us consumers!

    Source: wccftech(dot)com/amd-radeon-rx-5700-xt-and-rx-5700-launch-price-drop-benchmarks-leak/
    Source: digitaltrends(dot)com/computing/amd-rx-5700-xt-price-drop-nvidia-super/
