AMD Radeon RX 6000 series GPUs

Rumor: AMD Big Navi RX 6000 GPU memory specs have reportedly been leaked, with 16 GB & 12 GB VRAM

It appears that some alleged specifications for AMD’s RX 6000 RDNA 2-based graphics cards have been leaked by Twitter user @Rogame, who has a strong track record of past leaks. But since AMD has not yet confirmed these specs, you should treat this leak as a rumor, at least for the time being.

We already know AMD’s next-generation Navi 2x RDNA 2 lineup of GPUs will initially come in two flavors, with two chips having different memory configurations: Navi 21 and Navi 22. Navi 21 will likely house the Big Navi RX 6900 XT flagship GPU, which AMD has teased before, whereas the Navi 22 chip will likely land in the Radeon RX 6700 XT or RX 6800 XT series of cards, the successor to the RX 5700 XT.

@Rogame has leaked the memory specs of at least two Navi 2x GPUs based on the new RDNA 2 architecture: Navi 21 and Navi 22. As for the VRAM configuration, Navi 21 will feature 16 GB of VRAM, while Navi 22 will sport 12 GB.

A photo of an alleged RDNA 2 engineering sample GPU had already leaked before, featuring 16 GB of Samsung’s GDDR6 memory across a 256-bit bus interface. We presume this GPU to be the Navi 21 RX 6900 XT, so @Rogame’s 16 GB figure does make sense in this context. The second variant, the Navi 22 GPU, will likely feature 12 GB of VRAM across a 192-bit bus interface. As mentioned before, the Navi 22 chip will be featured in the Radeon RX 6700 XT or RX 6800 XT series of cards.

The leaker does mention that he is not sure whether these two Navi 2x variants are based on the full Navi 21/22 dies or are just cut-down SKUs. The narrow memory bus on these flagship cards seems a bit odd, but there’s also a possibility that these are just early engineering samples.

16 GB and 12 GB configurations are also possible with 512-bit and 384-bit bus interfaces (using lower-density memory chips), so AMD could follow that route as well, as there are more options available.
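The arithmetic behind these pairings is straightforward: each GDDR6 package occupies a 32-bit channel, so total capacity is (bus width ÷ 32) × per-chip density. A minimal sketch of how the leaked capacities line up with different bus widths; the chip densities and the 16 Gbps data rate here are illustrative assumptions, not part of the leak:

```python
# Rough sketch of the bus-width / capacity / bandwidth arithmetic for GDDR6.
# Chip densities and the 16 Gbps per-pin rate are illustrative assumptions.

GDDR6_CHANNEL_WIDTH_BITS = 32  # each GDDR6 package sits on a 32-bit channel

def vram_capacity_gb(bus_width_bits: int, chip_density_gb: int) -> int:
    """Total VRAM with one chip per 32-bit channel (no clamshell mode)."""
    chips = bus_width_bits // GDDR6_CHANNEL_WIDTH_BITS
    return chips * chip_density_gb

def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float = 16.0) -> float:
    """Peak memory bandwidth in GB/s for a given per-pin data rate."""
    return bus_width_bits * data_rate_gbps / 8

# 2 GB (16 Gb) chips reproduce the leaked pairings on narrow buses;
# 1 GB (8 Gb) chips would give the same capacities on wider buses.
for bus, density in [(256, 2), (192, 2), (512, 1), (384, 1)]:
    cap = vram_capacity_gb(bus, density)
    bw = peak_bandwidth_gb_s(bus)
    print(f"{bus}-bit bus, {density} GB chips -> {cap} GB VRAM, {bw:.0f} GB/s")
```

This is why the same 16 GB / 12 GB capacities can sit on either narrow or wide buses: the wider bus simply spreads the same total capacity across more, lower-density chips, at the cost of a more expensive board but with much higher bandwidth.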

We’ve heard numerous rumors about the Big Navi GPUs these past few weeks. Some early synthetic benchmarks were supposedly leaked; however, we don’t consider them particularly trustworthy. On the other hand, there are reports that AMD may adopt the USB-C connector for select Big Navi 21 Radeon GPUs.

We have also seen pictures of AMD’s Radeon RX 6000 RDNA 2 GPUs before, showing triple-fan and dual-fan designs. The triple-fan variant would most probably launch as the RX 6900 series, whereas the dual-fan model might house the RX 6800 or 6700 series family. The dual-fan version also features two 8-pin PCIe power connectors.

It will be interesting to see whether the new RDNA 2 cards will be able to compete with NVIDIA’s high-end RTX 3090/3080, or with their mid-tier sibling, the RTX 3070. AMD will officially reveal these new GPUs on October 28th. We expect the RX 6000 series to officially debut in November.

Stay tuned for more!

54 thoughts on “Rumor: AMD Big Navi RX 6000 GPU memory specs have reportedly been leaked, with 16 GB & 12 GB VRAM”

  1. I don’t want AMD to beat Nvidia. They don’t have to. They should just release a competitive card, offering similar levels of performance at a lower price.

    AMD has already beaten Intel in the CPU department though.

    1. Then Nvidia responds with their own Super refresh or a price cut. In the end, the majority of sales still go into Nvidia’s pocket. Hence, after a year of “jebaiting” Nvidia with the RX 5700 XT, Nvidia’s market share rose from 70% to 80%.

  2. 12GB is still a pretty sweet spot for VRAM. Don’t know why you guys always B*** about the GPU memory capacity.

    You guys do know that extra VRAM does not come at a cheaper price? The cost of the GPU is going to be higher.

    1. VRAM is always going to get cheaper and cheaper, just like RAM over the generations. It’s not a static price that will forever make GPUs more expensive.

  3. God, I wish AMD would have announced something already. With the hype Nvidia is getting, I’m pretty sure the competition would have been great for us consumers. Not too many of their target customers have the patience to wait months to decide on a GPU, and I’m thinking it will hurt AMD’s sales.

    1. Customers are going to have to have patience anyway. It’s not like an impatient customer can buy a 3080 right now: they aren’t in stock anywhere unless they want to get scalped, and it will probably be months after the Big Navi release before they can buy at MSRP.

  4. The 5700 XT didn’t beat Nvidia, but it’s still a great card for the price after the abysmal launch driver issues.

    I reckon it’s gonna take at least several years before AMD GPUs are gonna compete with Nvidia in the top end.

    1. That’s just 7 more years of Nvidia high end dominance and more mindshare. Hardly worth even trying to fight by then if you ask me.

      1. I dunno. I think the fight is worth it, but there should probably come a third contender into the game, because Nvidia’s monopoly is not good.

        1. Not really; if they won’t bother, there’s no real point. It sounds more like an endless loop of doing nothing but claiming the best will come later, and it never does. They need to do something now, not 7 years down the line, because by then Nvidia will be well and truly fortified in their tech field.

          Nvidia’s at its current state because AMD aren’t TRYING. We don’t need a third player, because then AMD will just end up further behind, or worse, we end up with a second player who doesn’t bother competing in the high end.

          Do you know what an actual full-on monopoly is? Does anyone here, for that matter?

  5. So it turns out AMD will target the 3070 and 3070 Super customers. Good strategy, as the majority of gamers will be upgrading to the 3070/S.
    The 3080 and 3090 are best for 4K/enthusiast gamers, who are currently in the minority.

    1. What makes you say that?
      Seriously:
      1. their cards have way more VRAM
      2. we know they are more efficient than Nvidia
      3. we know that both of these cards use two 8-pin connectors and have beefy coolers; they are high-end cards

      How does this not point to AMD having at least a 3080 competitor?

      1. I didn’t say they don’t have a 3080 competitor. I just said that these 2 seem more in line with the 3070/S. Saying they are in line means they will give the same or more performance for a cheaper price.

  6. Good. We really need AMD to compete with Nvidia this time. I’m more curious to know the exact specs of these cards, rather than just the VRAM info…

    Nonetheless, that leaker is trustworthy, so at least we have some confirmation of the memory configuration.

    1. If you’re expecting AMD to “compete” at the level of the 3080, you’re probably gonna have to wait a while longer.

      AMD has always competed with the upper mid-range cards, which they do well, but their drivers suck, and changing people’s mindshare from Nvidia to AMD is going to be a lot harder than even beating Nvidia on the hardware side.

      1. Why not just say “you’re never gonna see it”? Because each time this happens they hardly bother, so it’s logically better to just say “they won’t, don’t get your hopes up”.

          1. Where do I even begin.

            Well, first, AMD said they will compete in the high end and that they will have a ‘halo product’. Those are really strong words they haven’t used in a long time.

            Secondly, we know the highest-end card has 80 CUs. That’s a lot, double that of the 5700 XT.

            A 50% perf/watt improvement is an official goal from RDNA 1 to RDNA 2, and rumors say they beat that goal.

            To put that into perspective, Nvidia got about 8% from Turing to Ampere.

            We have 2 confirmed coolers; pictures were shown on this page too. Those coolers have two 8-pin connectors. This means they are high-powered cards and not midrange.

            high power + high efficiency = high performance

          2. “Well first AMD said they will compete in the high end and that they
            will have a ‘halo product’. Those are really strong words they didn’t use in a long time.”

            “Poor Volta”, nuff said.

            “Secondly we know the highest end card is 80CUs – that’s a lot, double that of 5700XT.”

            Doesn’t necessarily directly imply certain performance.

            “50% Perf/watt improvement is an official goal from RDNA1 to RDNA2, rumors say they beat that goal.”

            Doesn’t mean it’s gonna compete with 3080

            “To put that into perspective Nvidia got about 8% from Turing to Ampere.”

            Ok?

            “We have 2 confirmed coolers, pictures were shown on this page too. Those coolers have two 8-pin connectors. This means they are high-powered cards and not midrange.”

            It’s not the first time AMD has had a high-wattage GPU.

            “high power + high efficiency = high performance”

            No, not necessarily.

            I don’t see how it’s “clear” that AMD will compete with the 3080. Obviously I want them to, but until we see third-party benchmarks, I don’t think so.

          3. This comment didn’t age well.

            Surprise, surprise, I was right, and the performance is right where I thought it would be, with the 6900XT right behind the 3090.

          4. It aged as well as it could have. Nobody knew anything.

            You didn’t have concrete evidence to back up your speculation.

            It was based on rumours and leaks.

            This time you happened to be right. But going by the graphs, the 6900XT wasn’t behind the 3090; it was matching it, give or take.

            I don’t see the reason to even make this comment anymore.

            I have no issues admitting that your speculation on rumours and leaks proved to be at least close to reality, but coming back here to say it makes you look like a jerk.

          5. The 6900XT was matching it with “RAGE MODE” on, so it was auto-overclocked, not out of the box. It also had the Zen 3 + 6000 series thing enabled for a little bit of extra performance. In other words, out of the box it’s slightly behind.

            I wasn’t just believing random people. You seem to think it’s all about luck for some reason. It’s not.

            You are also ignoring that a lot of it wasn’t based on leaks at all, but on official information from Sony, Microsoft, and AMD. My conclusions were mostly based on the official information; only the CU count was from rumors, and the count was so logical I could have guessed it anyway.

  7. I’d prefer them all to have 16GB but 12GB is the bare minimum going forward. Even 1080p games can use up to 8GB already.

    1. Of course more is better, but I don’t think games actually use 8GB at 1080p. A game can allocate more, but it doesn’t need that much.

      1. Games will use about 8–10GB soon with RT enabled at 1080p; soon as in the next 2–3 years. We are good with a 6GB card for 1080p as of today.

        1. So you feel games may need more than 10GB in ~3 years’ time, which is after Hopper’s release. RT performance is making huge jumps with Ampere, and Hopper will be no different. You will want a new GPU by the time you feel the need for more than 10GB of VRAM.

          1. Hopper is going to be the first multi-chip GPU; more like SLI, but on one board and utilizing less VRAM. Those will be true 8K GPUs. Ampere cards are actually 4K GPUs advertised as 8K.

        2. Next-gen consoles only have 16GB, with 10GB or so usable by games. 10GB at 1080p? Then next-gen consoles wouldn’t be able to run at 1440p, let alone the 4K they’re being advertised at. A game will try to “secure” as many resources as it can, but that doesn’t mean it needs all of that to run. My PC now has 16GB of RAM; before that I had 8. Back then I saw usage around 6GB; with 16GB installed, the same game now “uses” up to 10GB, and performance stays the same. This upcoming generation, game developers will have to optimize VRAM usage more. Just look at what CDPR did with The Witcher 3: when some games used 4GB at 1080p, their game used only a little over 2GB at 4K, and it’s among the better-looking games of the 8th gen.

          1. Comparing RAM with VRAM is like comparing apples to oranges. Current-gen consoles were advertised as full HD 1080p but were around 900p stretched out to 1080p. The new consoles are going to be a little less than 2K resolution with the ability to stretch it, or 4K.

            Do not fall for what’s being advertised. These consoles will always keep PC games on the back foot.

            With the release of new games on next gen, the minimum requirement will shift to the 10 series (which has started already) and, for RT on, to the 20 series (which has also begun). While these next-gen consoles are around, we will never need a 30 series GPU, except for enthusiasts who want more FPS at 4K.

            Lastly, these next-gen consoles will be the first batch; there will be a PS5 Pro/Slim with more VRAM for proper 4K by the end of 2021/2022.

            All this may make you laugh but when this happens you will recall this joke and realize how much we have been fooled by these companies.

          2. The joke is when we are made to believe we need a bajillion GB of VRAM/RAM, more than we actually need. PC gaming is about flexibility. Why do you need that bajillion of VRAM? Because some people believe they need to play everything on ultra, or else they will lose something significant.

          3. You are right. We need about 16GB of RAM, if not 32. But VRAM and teraflops (consoles) are what these companies use to target casual and uninformed gamers, by making them feel more is better. Even though more is generally good, in terms of graphics there are a lot of other variables that need to be counted too.

    2. “Even 1080p games can use up to 8GB already.”

      Could you please name those games? I asked you before, but you never answered.
      The Division 2 allocates up to 12 GB at 1080p, but it actually uses only around 5GB. You’re perfectly fine with a 6GB card.

        1. No, you DIDN’T. That was your answer:
          “Nope, because I don’t play all of them and I can’t recall every game that I’ve played that do.”

          Sorry, but that’s not an answer.
          Two recent games that use the most VRAM at 1080p are the new Flight Simulator (7GB) and SOTTR (6.5GB). All other new games use no more than 5.5GB at 1080p.

          Have a nice day.
          Edit: Is pointing out that you’re incorrect childish to you?

          1. So first you say that I didn’t answer, then literally in the next sentence you quote my ANSWER to you. Fkin moron.

            You’re a child because you won’t accept an answer if you disagree with it. And you’re blocked.

          2. It’s called “evasion”. People do it when they deliberately avoid giving a clear direct answer.

            Have a nice day, buddy.

  8. I have an Asus TUF 3080 OC on order with 10GB of VRAM. I figure that by the time I need more VRAM, I will also need a new GPU. Nvidia really screwed up with this 8nm node. I’ve been using a 1080Ti FTW3 since release and never went near its 11GB capacity.

    I wonder if AMD is screwing with Nvidia here, as there is nothing to stop AMD from launching more budget cards with 6 and 8GB of VRAM. Also remember that AMD is using GDDR6, while Nvidia is on GDDR6X and uses tensor compression with results between 20–40%.

    1. AMD is most likely going to rebrand the 5500/5600/5700 into their 6000 series as their mid-range offering. And we have already heard some talk about Navi 10 being “refreshed” into the 6000 family.

      1. They won’t; that would be stupid…
        You rebrand when you don’t have a new architecture that’s significantly better than the last.

        Why would they rebrand it if they can make a much better card at the same cost to them?

        1. Because they need to recoup the investment made in the earlier architecture. This is one of the major things Rory Read brought to the company, and Lisa Su inherited it: they will not retire a chip unless they believe it has generated enough profit to cover the original investment made in it. Just look at Pitcairn: 7870 to 270X to 370, from 2012 to 2015. Then we got Polaris 10/20, which covered AMD’s offering in this segment from mid-2016 to late 2019. Navi 10/14 will probably last several years before AMD truly replaces them.

          1. That’s not the right way to think about this.

            In the 7870–370 period, AMD couldn’t make a significantly better GPU than they already had, as the jump from GCN 1.0 to GCN 2.0 was basically nonexistent.

            Long story short, they couldn’t make a better GPU at the same die size, so why would they?

            Same story with Polaris: they couldn’t make anything better, so they rebranded.

            RDNA 1 to RDNA 2 is a completely different story: AMD claims 50% better perf/watt.

            Another thing to take into consideration is that they now have the money to make a new chip, while back then they had to be really careful about where they spent their resources.

          2. Yes, they have more money, but they also need to spend it wisely, or else they will repeat the struggles they went through a decade ago. Why is AMD only recently releasing the 5300? Why not release it together with the 5500 some 9 months ago?

  9. AMD
    >No innovation
    >always second
    >horrible software
    >slow software updates
    >Power hungry
    >Bad temps
    >Bad cooling
    >cheap tricks like “huge VRAM, don’t mind the bottleneck goy”
    >prices are only cheaper in the US; everywhere else in the world they’re the same as Nvidia’s
    >internet forums are full of AMD victims (and cultists) asking for tech support every time a new game comes out

    I’m not an Nvidia cultist. I want Nvidia to have competition. But AMD is sh*t. Canine sh*t. Will always be a cheap knockoff. But at least they have a vocal minority of pathetic cultist followers, which is about the only plus

  10. I think 12GB for the 3070 competitor and 16GB for the 3080/3090 competitor is perfect.

    Nvidia is going to release the 3070 and 3080 with double the VRAM, but that’s overkill and it’s not going to come cheap.
    Nvidia made some bad choices this time around and ended up with a meh architecture for games. This is a great chance for AMD.

  11. AMD, take me home, can’t wait. All I need is 12GB; that’s a sweet spot for me. I don’t want anything less, because I know what’s coming down the pipeline for games and how much VRAM they’ll demand at 1080p/1440p/4K. AMD, in the words of Shannon “The Cannon” Briggs: LET’S GO CHAMP! LET’S GO CHAMP! LET’S GO CHAMP!
