AMD Radeon VII seems to offer similar performance to the NVIDIA GeForce RTX 2080 according to these leaked benchmarks

The review embargo for the AMD Radeon VII lifts later today, and it appears that the first gaming benchmarks for it have already leaked online. These benchmarks come from HD-Technologia and, from the looks of it, AMD’s latest high-end GPU offers similar performance to the NVIDIA GeForce RTX 2080.

HD-Technologia’s test system used an Intel Core i7-7700K (4.2GHz) on an ASUS Z270F Gaming motherboard, 16GB of Corsair Vengeance LPX DDR4-3000MHz memory (2x8GB), a 2TB Seagate Barracuda 7200RPM hard drive, and the AMD 18.50-RC14-190117 / NVIDIA 417.71 WHQL drivers.

As you’d expect, the NVIDIA GeForce RTX 2080 wins some benchmarks that favour NVIDIA’s GPUs and the AMD Radeon VII wins others that favour AMD’s graphics cards. For the most part, however, we can say that, at least according to these benchmarks, the two GPUs offer similar performance.

What’s really important to note here is that, unlike the RTX 2080, the AMD Radeon VII does not support real-time ray tracing. Furthermore, while DirectML could be used for super sampling, developers have not experimented with it yet. As such, DLSS may appear in more games, giving the RTX 2080 the edge over the AMD Radeon VII in those titles.

Both the AMD Radeon VII and the NVIDIA GeForce RTX 2080 have an MSRP of $699, and according to some reports, overall availability for the AMD Radeon VII will be limited. As such, I don’t really know why someone would choose AMD’s offering over NVIDIA’s as the latter comes with more features, same price and same performance.

But anyway, I’m pretty sure that more publications will reveal their benchmarks later today. Until then, here are the benchmarks from HD-Technologia.

Kudos to our reader Metal Messiah for bringing this to our attention!

121 thoughts on “AMD Radeon VII seems to offer similar performance to the NVIDIA GeForce RTX 2080 according to these leaked benchmarks”

      1. Resident Evil 2 being “amazing” is a bit of an exaggeration; it’s good, surely. Don’t know about DMC, it always looked like a pretty stupid game, and The Division 2 speaks for itself lol. Anyway, surely better than the games NVIDIA is offering, that’s for sure.

        1. No no, it’s pretty good, it’s actually a pretty good remake, but “amazing” I find a bit of an exaggeration, anyway…

          1. I think I only died 10 times or probably less, and it was only during boss fights. I played it at normal difficulty and it was very easy; I’d never played any Resident Evil game, yet I could finish this in less than 7 hrs. I honestly expected it to last longer, at least 10+ hrs, but I’m sure that cranking up the difficulty will add an hour or so of game time. Also, I didn’t get all the items, like the magnum or the SMG.

          2. I can imagine. I just feel like there’s way too little game time (and not that much to see) to play it like this, and besides, I’m not really a fan; I found it way too slow-paced and too clumsy in certain things. Also, the horror component is basically non-existent, which I kinda appreciate, as I’m no big fan of horror, and I don’t understand people saying the game is horror or scary.

          3. I thought it was pretty amazing. There really isn’t anything else out there like it. DMC is also my jam, so to me, that adds a lot of value (although I’m honestly keeping my Vega 56 for a long time yet. No reason to upgrade).

      2. Just want to say that I do not recommend the Radeon VII over the 2080; I was just giving a reason why someone might buy the Radeon VII. Personally, I would buy the 2080.

        1. Yeah it’s fine, I understand your point, I just don’t think the bundle alone is really a valid reason to buy hardware.

  1. I have really grown to dislike Nvidia over the past few years, and if it weren’t out of absolute necessity, with my old 980 dying on me, I would not have gotten a 2080. It’s a terrible price vs performance card, and I hated having to buy one, even well before this Radeon VII came out.

    On the AMD note, I mostly hate their thermals. I’m not inclined to drop a makeshift oven into my case if I don’t have to. Aside from that, ya, they’re cheaper, but they also always underperform, and run hotter, compared to Nvidia’s counterparts. However, I still would not have bought one of these, which inarguably has fewer features than Nvidia’s RTX, for the same price, sorry.

    I have absolutely zero “brand loyalty” in anything though. If AMD, or even Intel, make a better card than Nvidia anytime in the future, I’d drop Nvidia like a bad habit, without issue. I used ATI Radeon cards for many years back in the day, and they were awesome. Then they were bought by AMD, and here we are. Sad, but there it is.

    1. And it’s incredible how AMD’s greed ruined both them and ATi when they bought them, but everyone is always talking about Nvidia’s greed or Intel’s greed, and not facing the fact that both had the better product compared to AMD for years before Ryzen. I guess AMD was hit by a pretty fast and heavy train, and that’s why they’re now crawling back to the industry-leading spot they were supposed to be in (if it wasn’t for greed), at least on the CPU side of things… About GPUs… Well, that’ll need to wait, I guess.

        1. No, they were greedy because they wanted more, and they forced themselves to buy one of Nvidia or ATi, not knowing (or ignoring) that they wouldn’t be able to fully support both teams with their finances. That’s why we had the ATi/AMD Radeon group initially doing well, and then (now) the opposite.

    2. I have Sapphire’s R9 390 8GB GDDR5. Compared with the GTX 970 my brother has, which is the same age, the R9 at 1100MHz runs stable, faster and cooler (63°C after many hours in GTA V and Fortnite) than an overclocked 970 NOW. When I bought it, the “analysts” were saying that the 970 was slightly faster (in some games it was, but not any more) and that the R9, with its two 8-pin connectors, consumed a lot and ran much hotter. Something that I have never seen!

      1. I support a company who brings balance to the GPU market
        I support a company who won’t sell the graphics settings for 1200€
        I support a company who won’t sell GPUs with 3.5GB of VRAM on a 4GB card
        I support a company who makes new tech without making it expensive af

        1. Don’t waste your time with these people, man. I’ve always known NVIDIA was scummy as hell, but because of my lack of knowledge in the past I went with them. Let’s just say marketing and fanboyism is a helluva tool. Then came 2018, when I came to my senses and dropped Nvidia like a bad habit. Nvidia is the EA of GPUs.

          1. I can really see the lack of knowledge is still there, or maybe just fanboyism. I’m not saying you have to buy Nvidia because they’re better, but at least admit they’re better, there’s evidence everywhere.

        2. Fury X rumored $800 price but Nvidia saved the day with 980 Ti

          Vega Frontier Edition $1500

          AMD’s Gaming Scientist Richard Huddy lied that 4 GB HBM = 12 GB GDDR5 (selling 4 cores CPU as 8 cores also counts)

          What’s that supposed to mean?

        3. But you are supporting a company that is making a card just as expensive as the 2080, while offering fewer technological features to justify that expense.

          If AMD are the White Knights you portray them as, why are they charging as much as the EVIL Nvidia for a card that has fewer features?

          1. I’m sure the profit AMD gets out of this card is much less than what Nvidia gets on the 2080, because AMD uses 16GB of HBM2 memory which by itself costs half the entire price of the card.

            They just can’t sell it any cheaper without losing money.

          2. I guess the idea of going HBM2 again was all theirs and not a doctor’s prescription, so they’ve only got themselves to blame.

          3. Halving the HBM2 actually halves the bandwidth of the memory on these designs. They were sort of forced to go this route due to the architecture. It will be nice for long-term results, but a harder sell in the present.

          4. Half the bandwidth would already be enough since it should be around GDDR6 capabilities, and if that’s enough for Nvidia I don’t see why it can’t be enough for AMD.

          5. I’m not sure what the technical reason is, but in the interview I watched (PC Gamer, I think), AMD talked about bandwidth having a larger impact on the Vega architecture than on others. Not sure why, but they seemed to believe 16GB was the way to go on this card. They didn’t entirely rule out an 8GB version though, so maybe they’ll replace the 64 with that?

          6. I’m not sure that’s true though; this card here seems like a hybrid between a gaming card and a content-creator card, like the Frontier. Maybe that’s the reason it has 16GB of VRAM, so they launch a single card that’s OK for all purposes.

          7. I had to look it up, but apparently it was to keep the power consumption down. Using traditional RAM, the card would consume another 70-100 watts, which would be a problem. It also would likely exceed what the memory controller could handle.
            Testing suggests that memory speed/bandwidth has a huge impact on performance, as even a modest overclock of the HBM2 speed boosts performance more than overclocking the GPU.

            https://www.gamersnexus.net/guides/3032-vega-56-cost-of-hbm2-and-necessity-to-use-it
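
            Rough back-of-the-envelope math on the “halving the HBM2 halves the bandwidth” point (a minimal sketch in Python; the bus widths and per-pin data rates below are the commonly published figures for these cards, used here as assumptions):

                # Peak memory bandwidth estimate: bus width (bits) * per-pin rate (Gb/s) / 8 = GB/s
                def bandwidth_gb_s(bus_width_bits, gbps_per_pin):
                    return bus_width_bits * gbps_per_pin / 8

                radeon_vii = bandwidth_gb_s(4096, 2.0)   # 4 HBM2 stacks (16GB) -> ~1024 GB/s
                two_stacks = bandwidth_gb_s(2048, 2.0)   # hypothetical 2-stack 8GB version -> ~512 GB/s
                rtx_2080   = bandwidth_gb_s(256, 14.0)   # 256-bit GDDR6 at 14 Gb/s -> 448 GB/s

                print(radeon_vii, two_stacks, rtx_2080)  # 1024.0 512.0 448.0

            So a two-stack card really would lose half the bandwidth, although it would still land slightly above the 2080’s GDDR6, which is what the posts above are arguing about.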

          8. If so, it’s a tricky design decision from the ground up, and they are still priced in line with the competition’s card, as they have been with all their recent high-end offerings, while not offering the same level of technological development.

            The fact Nvidia could include Tensor cores to do raytracing and still be cheaper just puts AMD’s offering in a worse light tbh.

    1. $450 is the cost of the HBM2 alone. AMD should have dropped it to just 12GB, and priced it in the $500-$600 range.

  2. “As such, I don’t really know why someone would choose AMD’s offering over NVIDIA’s as the latter comes with more features, same price and same performance.”

    Because We Aren’t Nvidia Fanboys.

    *We Know That AMD Ages Better
    *We Don’t worship Nvidia
    *We Don’t Want To Support Scumvidia
    *We like things because we just like

    Edit: They are not the same price to performance. HBM2 is extremely expensive in comparison to GDDR memory. Some people I tell you.

    1. I’d say it’s more because you’re AMD fanboys. A non-fanboy would pick the one with the most bang for the buck, which clearly is Nvidia in this case.

        1. Idiot, I’ve owned 690s, Titans and Titan Xs from Nvidia for the last 8 years, all enthusiast cards. I’m for fair practices, and Nvidia is a joke, and if you are this delusional then something is wrong with you. Why don’t you go and watch some videos on Nvidia’s scummy practices and then come talk to me. Oh wait a minute, didn’t they just overcharge everyone for their Ready To Xplode RayNotSoTracing Cards. Dude, go play in traffic man.

        With Nvidia you pay 200-400 dollars for 10-15% more performance than you you get with AMD and AMD has richer picture on screen because they are using raw compute unlike Nvidia that has washed out picture because they are doing some shady computing to achieve more frame rates. I can tell you know absolutely nothing about GPU Technology and Business practices if you think i’m an AMD fan boy. I’m just responding to the Blatant fan-boy statement in the article. To deny that’s what that was is to be disingenuous.

        Sorry dude i know way too much about business and scummy practices to ever buy an Nvidia product again. I’ve turned a blind eye for far too long. What can i say, we all gotta grow up and face the music sometime and i did.

        1. “With Nvidia you pay 200-400 dollars more for the same performance you get with AMD and AMD has richer picture on screen because they are using raw compute unlike Nvidia that has washed out picture because they are doing some shady computing to achieve more frame rates. I can tell you know absolutely nothing about GPU Technology and Business practices if you think i’m an AMD fan boy. I’m just responding to the Blatant fan-boy statement in the article. To deny that’s what that was is to be disingenuous.”

          I’m not sure you realize how this sounds. And how the earlier post sounds too.

        2. Wow, so edgy. Careful you don’t cut yourself, kid 🙂 “With Nvidia you pay 200-400 dollars for 10-15% more performance than you you get with AMD” Did we read the same article?

          In case you missed it, the article, this discussion and the lines you quoted are about this particular set of graphics cards, not the entire line-up.

          Nice strawman, though. Don’t worry, buddy, you’ll get there someday.

        3. “AMD has richer picture on screen because they are using raw compute unlike Nvidia that has washed out picture because they are doing some shady computing to achieve more frame rates.”

          You have reached a new level of stupidity.

      1. Yeah, why would I buy this card when I just bought a Vega 64, idiot? Do I look like one of you Nvidiots that buys a new shillvidia card every release? The problem with people like you is, you use dismissive language like “Fanboy” and think that validates your argument. Fuqboy…

  3. An adequate amount of VRAM for a $700 flagship GPU, that’s what. The fact that the RTX 2080 only has the same amount of VRAM that the not-even-range-topping R9 390/X launched with FOUR YEARS AGO (!!!), in an age with games capable of sucking down ~12GB of VRAM if you have it (Resident Evil 2), and with 16GB-GDDR6-packing consoles coming in literally just a year, is absolutely freaking ridiculous. Same for the RTX 2060 only getting 6GB. Every RTX card except maybe the 2070 is a ticking VRAM-deficient time bomb of the same kind as AMD’s 4GB Fiji GPU (Fury/X/Nano).

    And that’s BEFORE even considering any super memory intensive RTX effects.

    1. I can agree on the xx60 cards, but benchmarks are proving those games don’t need that much VRAM and benefit more from processor capability than anything. At least for videogames, that is.

      1. You’re completely missing the point. If it was being literally VRAM-choked from the day it came out, it’d be an outright failure, but that’s obviously not the case; but that says nothing about how long it’ll be able to stay that way (not long, because of the new 16GB consoles next year). And considering the fact that more & more games can & will use significantly more than 8GB of VRAM if it’s available, it’s literally just a matter of time before the level of load in new games reaches a point where performance is negatively affected.

        It’s literally the exact same thing that happened with AMD’s Fiji GPU from 2015, where it just BARELY had enough VRAM at launch to not lose performance, but games could already use more than 4GB if you had it. Fast forward a year or two and it’s losing performance to VRAM overload left & right when everything’s maxed out.

        Anyone who buys an RTX 2080 (or 2060) is just ASKING for problems circa 2020-2021, and that’s before even considering the huge memory demands of RTX effects.

        1. If you’re buying a 2060 you shouldn’t be expecting to have good performance 3 years from now; if you do, then something is wrong with you. And pretty much the same applies to the old cards you’re talking about. Since Pascal there hasn’t been any problem with memory, like at all, because they calculated the potential of the chip and paired it with just the right amount of VRAM. Besides, stop assuming stuff like “16GB consoles next year”, because you just don’t know it, and consoles work a different way.

      1. Which are releasing next year… And Navi isn’t as fast as the RVII, so of course it doesn’t need as much VRAM (not to mention keeping it at 8GB helps keep the costs down). But we aren’t talking about Navi/mid-range parts here, now are we??? No, we’re not. And it’s an embarrassment for your top-tier products to begin having performance issues within a year or two of release, because you didn’t just keep the VRAM amount the same from gen to gen, but bloody DOWNGRADED IT (1080 Ti = 11GB, 2080 = 8GB; 1070/Ti = 8GB, 2060 = 6GB).

        At LEAST AMD’s Fiji (Fury/X/Nano) kept the same 4GB framebuffer size as their prior flagship, Hawaii (R9 290/X), instead of pulling an Nvidia and chopping it down to cut costs while still raising the MSRP to increase profit margins.

          1. Spot on like yours, lmao, you idiots have no clue, go watch that video, also i’m done with you unluckynumber911, you truly are one piece of art.

          2. Boyfriend. Ok i got it you’re at it again, another one of your episodes. guess i better block you.

          3. You should’ve just shut your mouth, because you know nothing and as you were doing it before, you’re ranting and raving caught from another one of your episodes, spouting BS all over the place about pretty much everything you can think of. Come on sweetie, go sleep now, will you?

      1. Pretty sure that at 1080p it doesn’t go above 6GB, and if it does it’s only slightly, even with basically everything available enabled.

          1. But this isn’t 2010 or 2011; I can see why 2GB wasn’t enough, but now the minimum is like 4GB for low-end cards, and that’s enough for the capability of the graphics processor it serves.

            I agree that more is always better, but not if you sacrifice the price to do that, especially with HBM2.

          2. No, it’s not the same situation, and the proof is that the 2080 with 8GB is faster overall than this steaming turd Vega 7 here, despite it having double the memory with double the transfer speed. Also, I never heard of anybody running into problems with 8GB of VRAM, or 6GB of VRAM on the 1060, so yeah, again, do some fact checking and stop trying to defend AMD at all costs, it doesn’t make you look good.

          3. There was no card with that amount of VRAM; the one with more memory was an AMD card, I don’t remember which one, and it had 3GB, that’s why everyone was saying “AMD ages better”. Anyway, you can’t know what consoles will be like, and consoles, as I already said, work in a different way: the CPU and GPU share the overall system RAM. The 16GB of HBM2 is either a recycling of their old Instinct card, maybe to launch a single card for both gamers and content creators, or just a very very stupid idea, which dramatically increased the price of this card, which could’ve cost $100 or even $150 less and could’ve been a good competitor in price/performance.

          4. No, it’s because the GTX 680 wasn’t fast enough to play games in that future you talk about; you don’t understand the point and you keep saying you always need VRAM. YOU DON’T NEED RAM IF THE GRAPHICS PROCESSOR ISN’T FAST ENOUGH. The GTX 680 was already a slow turd by the time you actually needed those 4GB of memory. Memory is important, but not even close to being as important as a fast processor. Stop it, seriously, you’re starting to be annoyingly ignorant and stupid.

          5. To people like you the only way is rude comments because you don’t know sht yet you need to defend your stupid manufacturer for some reason. AMD is sht in terms of videocards, the earlier you realize it the better for you, otherwise don’t, keep this BS up and keep giving money for the worse product, as long as you’re fine with it.

          6. You’re contradicting yourself. I know how fast a 680 was back when it came out, but it wasn’t that fast 3 years later when those 4GB of yours weren’t enough anymore, and you’re talking about how it gets capped now in John’s reviews. You sir are stupid, you seriously are; you don’t understand that you can’t compare any of the new cards to old cards, and you can’t compare the games they were tested on, and you can’t use them as a reference. The 680’s processor was already too slow to push anything.
            https://www.youtube.com/watch?v=B0T-NBetHhY

            Get fkin lost for the love of god.

          7. (In a southern American accent) Oh lord, this time it’s pretty bad, huh? May the lawd have mercy on your soul.

          8. Oh boy, this Paul86 kid is one stubborn SOB for sure, lol….Well said.

            Damn, 2 of my comments just vanished by mistake.

          9. Like I’ve said before, this Paul aka LuckyNumber8 guy keeps on arguing too much, each time he posts a reply out here.

            He is one stubborn kid for sure.

          10. Listen you M***fing idiot.

            Like I said before, I don’t play graphically demanding games, mostly OLD titles, so I never had any VRAM issue, even with 3GB of VRAM. Get a life, you c*uk.

          12. Cut him some slack, the guy’s got problems, come on! I mean, it’s rare that I block people, but I had to with him, he was having episodes…

          13. There is no point in blocking that kind of people though. Doing this just gives him the upper hand, and puts him in a win-win situation, imho.

          14. Well he’s got mental issues, what can i do about it, i tried my best, the guy is troubled and won’t hear anything.

          15. It looks to me like this Paul86 guy is just mental, and he acts like a stubborn KID. Immature behavior, as is evident from his replies.

            It’s actually fun reading his nonsense though, to be honest. Shows his lack of knowledge and understanding. Blocking him would just take away the fun/comedy.

            You shouldn’t be blocking him in the first place, so just relax.

          16. “people who have 8GB will run into vram related issues in games”

            And how many of these so-called gamers are there, actually? They are only a minority. These people will run out of VRAM only at higher 4K resolutions, and only in selected game titles, not ALL.

            That’s a fact..

          17. No, 16GB of VRAM is not tempting for playing the majority of recent games. It is still kind of overkill, regardless of the resolution. That’s why this AMD card is expensive in the first place: because of the HBM2 memory.

            Please stop posting your gibberish nonsensical comments.

          18. As if you are some TECH expert?… You are nothing but a MORON. Period.

            Yes, I know my GPU might struggle in some games, but I still haven’t stumbled upon such an issue. That’s why I need to upgrade once the proper time comes, but 8GB would be sufficient.

            16GB is still too much as of now, for playing games.

            So yes, I know MORE than you do. You, on the other hand, just boast about all this every time you post something here on DSOG, as if you have some extra tech knowledge. LMAO.

  4. Not too bad! I mean, if this is right, the performance difference between the GPU king and AMD’s top card is around 25%, which is not too far behind. Man, GPUs are costing so much these days, ughh.

      1. If we had the SAME performance, then yes, but that is not the case. With every new gen comes new tech, new processes and R&D, and since these cost an arm and a leg, you and I end up paying more. Look at the Titan Xm. Back then I paid around $1200 CAD to get it, and now a 1080 is stronger for $850. So yes, the price dropped, but our expectations got higher meanwhile, so we’re not looking at Titan Xm performance anymore…

        I
        WANT
        MORE.

    1. Yeah, but with a manufacturing process advantage; don’t forget AMD had to jump to 7nm to (barely) match a 2080, which isn’t even the fastest one. So what happens when Nvidia goes to 7nm too?

      1. Hmm, I wonder. Fun times ahead when we’re going to see those prices. LMAO, Nvidia gonna rip me another bu**hole. I heard they’d be using Samsung’s tech for 7nm; none of that is going to be cheap.

    1. LOL, that’s not copy/paste, it’s called a recommendation. But who cares even if he is doing this, as long as that tech article is worth a read.

      At least he is contributing something. I’m not defending any member here though.

      But what’s your problem with all this? Many tech/gaming websites do copy articles from other tech articles, and they give credit where it is due. Nothing wrong with all this.

      I’ve seen this on almost every tech website I visit. Even VideoCardz/Guru3d give credit if any news comes from outside their sources.

      In this case, it was JOHN’s decision to post/publish this article, based on MM’s recommendation, and he even credited HD-Technologia for it. Some of MM’s recommendations have been very nice though. John can also decline any article recommended by others, if he feels that way.

      But in this case he didn’t. MM didn’t force him to publish this article either. Also, we can’t expect John to scour the entire internet and check for every tech/gaming website for the latest news. He has a busy schedule.

      You seem like you are getting annoyed by all this. Your reply doesn’t make any sense whatsoever. Get a breath of fresh air outside, and relax.

  5. I’m interested in the TDP. If the TDP on the Radeon VII is higher than on the RTX 2080 (and I think it will be, as usual) then Radeon is finished!

  6. THIS IS A 16 GB HBM2 7nm card against only an 8 GB GDDR6 12nm one

    WHY do people always seem to forget that this alone gives the Radeon VII the edge against a card that offers experimental features which won’t be adopted until the next console cycle, but I guess that settles it, right

    if it was the other way around JOHN you would have said the exact opposite

    not to mention board partner cards are coming and will offer better cooling and at least a small performance boost
    and the fact that there are faulty 2080s out there

    THEY offer double the VRAM

    SHAME ON YOU

    1. 980 Ti 6 GB vs Fury X 4 GB + experimental features, AMD fans claiming AMD is better
      1080 Ti 11 GB vs Vega 64 8 GB + experimental features, AMD fans claiming AMD is better

    2. “not to mention board partner cards are coming and will offer better cooling and at least a small performance boost”

      Unfortunately, no. There won’t be any custom models at launch from AIBs, though you can expect such designs at least around Computex (i.e. late May).

      As of now, the board partners are going to release the same reference-based models with a reference design which rocks three fans on the shroud (triple fan design).

      This was stated before as well, so we shouldn’t expect AIBs to release any custom variants so soon. Earlier it was also leaked that “PowerColor” is releasing custom boards of this Radeon 7 GPU, dubbed as ‘Red Devil’ and ‘Red Dragon’, but that was just a rumor, and the company also denied having any such plans.

  7. I have read that availability will be a huge problem with this card.
    Some European shops have spoken about this and claimed that there are only around 20 cards available for entire countries as big as France or Germany.
    This may just be the paper launch we were warned about weeks ago.

    Just like the VEGA cards I think these cards will never sell anywhere near MSRP for the foreseeable future.

  8. I would think that a few months of drivers should have AMD up a few %. Launch drivers are always down a few points.

  9. “As such, I don’t really know why someone would choose AMD’s offering over NVIDIA’s as the latter comes with more features, same price and same performance.”

    It is not that simple. As you know, enabling RTX comes with a huge performance hit; it is not just a case of “the same, but better because of RTX”. This is where DLSS is supposed to come in and bring the FPS back on track. However, a side effect of the upscaling process (which is what DLSS basically is) is a deterioration, very noticeable at that, especially at 4K where you are supposed to use it, of higher-resolution textures’ quality.

    What that means is that you’re essentially choosing between high-quality textures and whatever RTX effect (there is usually only one per title) the game in question allows you to use. The thing is, screen-space reflections are nigh indistinguishable from the raytraced ones during gameplay, and the same goes for raytraced shadows and even global illumination. Basically all modern engines have post-processing effects that cover the ground RTX does very well, even if they’re not technically perfect. However, you can’t replace high-resolution textures with anything: you either have them or you don’t.

    The bottom line of all that is that the image a Radeon VII (or a 1080 Ti for that matter) will display at maximum detail at native 4K will in most instances be BETTER than the image an RTX 2080 will, when running, say, raytraced reflections at 4K (i.e. 1440p + DLSS). Raytracing is indeed the future of GPUs, but at this point in time it is nowhere near the obvious choice you’re making it out to be 😉
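
    To put some rough numbers on that upscaling point (a quick sketch in Python; the assumption that DLSS “4K” renders internally at 1440p is the commonly reported figure, not something stated in this article):

        # Approximate pixel counts per frame (assumes an internal DLSS resolution of 1440p)
        native_4k = 3840 * 2160            # 8,294,400 pixels shaded at native 4K
        dlss_internal = 2560 * 1440        # 3,686,400 pixels shaded, then upscaled
        print(dlss_internal / native_4k)   # ~0.44 -> roughly 44% of the pixel work

    That gap is where the FPS comes back from, and it is also why fine texture detail can suffer after the upscale.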

  10. Hello JOHN,

    TPU did some more testing with the latest AMD drivers, and only 2 games seem to benefit from a huge performance GAIN on the RADEON VII GPU.

    Assassin’s Creed Odyssey and Battlefield V both achieve multi-digit improvements; it looks like AMD has worked some magic in those games to unlock extra performance.

    Using the latest 19.2.2 drivers added +0.45% on top of that, for a total improvement of +0.653%.

    https://www.techpowerup.com/252691/amd-radeon-vii-retested-with-latest-drivers
