We have some new info to share regarding AMD's Big Navi GPU lineup. Last month, AMD officially announced its RDNA 2-based flagship Big Navi graphics cards: the Radeon RX 6900 XT, RX 6800 XT, and RX 6800.
AMD launched the Radeon RX 6800 and RX 6800 XT on November 18th, and many third-party reviews have since been published. The RX 6800 XT is priced at $649, while the RX 6800 comes in at $579. The flagship AMD Radeon RX 6900 XT, on the other hand, will be available on December 8th and will cost $999.
All three of these GPUs are based on the Navi 21 silicon. Other SKUs in the Navi lineup were notably missing from the announcement, such as cards based on the Navi 22 GPU, which will target the upper mid-range market segment.
In a recent tweet, Patrick Schur shared some details on the power requirements and specs of the Navi 22 GPU. AMD has not officially confirmed any of these figures, so treat them as rumor.
NV22 XT 186-211 W TGP (RX 6700 XT) 🧐
NV22 XTL 146-156 W TGP (RX 6700?)
12 GB GDDR6
— Patrick Schur (@patrickschur_) November 20, 2020
It's also possible that NV22 XTL has a lower min. TGP than 146 W. I'm not 100 % sure about that. So far I haven't seen any NV22 XTL with less than 146 W TGP.
— Patrick Schur (@patrickschur_) November 21, 2020
According to the leaked info, the upcoming AMD Radeon RX 6700 series will succeed the RX 5700 series with the same core count, but with better power efficiency and higher clock speeds. The Navi 22 GPU will initially come in two SKU flavors: the Radeon RX 6700 XT and the Radeon RX 6700.
The Radeon RX 6700 XT will be based on the Navi 22 XT GPU, whereas the non-XT RX 6700 variant will sport the Navi 22 XTL silicon.
According to Patrick, the RX 6700 XT is expected to have a total graphics power (TGP) between 186 and 211 W. That is still lower than the RX 5700 XT's 225 W. The RX 6700 XT will have a total of 40 compute units, the same as the previous-gen RDNA 1 Navi 10 GPU.
The non-XT RX 6700, on the other hand, would have a TGP between 146 and 156 W. Both Navi 22-based SKUs will have a 192-bit memory bus and sport 12 GB of GDDR6 VRAM, though a 6 GB configuration is also possible.
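To put the narrower 192-bit bus in perspective, here is a rough bandwidth estimate. The 16 Gbps GDDR6 data rate for Navi 22 is an assumption for illustration only; AMD has not confirmed the memory speed.

```python
# Rough peak-bandwidth estimate for the rumored Navi 22 memory config.
# The 16 Gbps GDDR6 data rate is an assumption, not a confirmed spec.

def gddr6_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s = (bus width in bytes) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

navi22 = gddr6_bandwidth_gbs(192, 16.0)  # rumored RX 6700 series (assumed speed)
navi10 = gddr6_bandwidth_gbs(256, 14.0)  # RX 5700 XT, actual spec

print(f"192-bit @ 16 Gbps: {navi22:.0f} GB/s")  # 384 GB/s
print(f"256-bit @ 14 Gbps: {navi10:.0f} GB/s")  # 448 GB/s
```

So even at a faster assumed memory speed, the narrower bus yields less raw bandwidth than the RX 5700 XT; AMD's Infinity Cache is presumably meant to make up the difference.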
AMD is currently expected to launch the Radeon RX 6700 series in January 2021. AMD Navi 22 will also come to notebooks/laptops as a high-end gaming GPU, similar to NVIDIA’s GA104 Mobile chip.
A note on power-consumption figures: AMD refers to the power consumption of the entire card as TBP (total/typical board power), while NVIDIA calls its equivalent metric TGP (total graphics power). These values are not equal and do not represent the same thing across the two vendors.
Previously reported rumors on Navi 21's TGP only took into account the GPU socket power (i.e. the GFX, SOC, and VDDCI rails) and VRAM.
NVIDIA's TGP figures cover the entire board's power consumption, including GPU, VRAM, VRM, fans, and everything else that draws power. The AMD equivalent of that is TBP, and the values reported earlier were in fact just the GPU and VRAM figures alone.
As Igor explained before, AMD's "graphics power" includes only the GPU supply voltages for the GFX controller (VDDCR_GFX) and SOC controller (VDDCR_SOC) as the most important components, plus the memory supply voltages VDDMEM and VDDCI (controller and bus).
So AMD's graphics power is not equal to NVIDIA's TGP, since the latter represents the power consumption of the whole graphics card.
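The distinction can be made concrete with a quick sketch that sums per-rail figures. All wattage values below are made-up examples chosen to illustrate the bookkeeping, not measurements of any real card:

```python
# Illustrative breakdown: AMD "graphics power" (GPU + VRAM rails only)
# vs. total board power (TBP, the whole card). All numbers are invented
# examples, not measurements.

rails = {
    "VDDCR_GFX": 150.0,  # GPU core (GFX controller)
    "VDDCR_SOC": 20.0,   # SoC controller
    "VDDMEM":    18.0,   # memory supply
    "VDDCI":     8.0,    # memory controller/bus
}

board_overhead = {
    "VRM losses": 20.0,
    "fans":       6.0,
    "misc":       3.0,   # LEDs, display I/O, etc.
}

graphics_power = sum(rails.values())                 # what the leaks report
tbp = graphics_power + sum(board_overhead.values())  # whole-card figure

print(f"Graphics power: {graphics_power:.0f} W")  # 196 W
print(f"TBP:            {tbp:.0f} W")             # 225 W
```

The point is simply that a leaked "TGP" of ~200 W for an AMD card implies a noticeably higher wall-level board power once VRM losses, fans, and the rest are added.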
RDNA 2 GPUs are fabbed on an optimized 7nm process node, support hardware-level ray tracing, and are expected to deliver a 50% increase in performance per watt over the previous-gen RDNA architecture. That gain comes from improved performance per clock (IPC) and logic enhancements that reduce design complexity and switching power; GPU clock speeds are also getting a boost.
Stay tuned for more!
Hello, my name is Nick Richardson. I've been an avid PC and tech fan since the good old days of the RIVA TNT2 and 3dfx Interactive "Voodoo" gaming cards. I mostly play first-person shooters, and I've been a die-hard fan of the FPS genre since the good old Doom and Wolfenstein days.
Music has always been my passion and my roots, but I started gaming casually when I was young on Nvidia's GeForce3 series of cards. I'm by no means a hardcore gamer, but I love everything related to PCs, games, and technology in general. I've been involved with many indie metal bands worldwide and have helped them promote their albums to record labels. I'm a broad-minded, down-to-earth guy. Music is my inner expression and soul.
Contact: Email
Putting Nvidia’s pitiful 8GB of VRAM offering to shame.
Having less VRAM doesn't put the Green Team to shame though. They will respond with higher-VRAM Ampere cards in the future for sure.
Nvidia has already released some SKUs with more VRAM than previous models.
I wonder how the people who bought the 3070 and 3080 will feel when they find out that nVidia made fools of them with last-gen amounts of VRAM. If that isn't putting someone to shame then I don't know what is. Oh yeah, the customer is put to shame, not the greedy corporation.
The solution is simple: if they think the VRAM is not enough, then don't buy it. But if they buy them anyway, they should know what they're getting themselves into. AMD is offering cards with more VRAM; this was well known even before nvidia put their 30 series on sale. Even if they are not available right now, they just have to wait. Doesn't matter if it's 3 months or even 6 months. The key is being patient.
Personally I'm itching to get a new card as well, to replace my 970 that's almost 6 years old at this point. But I've decided to wait until the entry-level GPUs are released (like the 3050 Ti) before deciding which one to get. I might even look for used parts if nothing from the new generation is worth getting.
But the difference is usually no more than 0.5 GB of VRAM, and in most of those games quite a bit less.
One exception is The Division 2. It can allocate up to 12 GB, but the game still runs very smoothly even on a GPU with only 6 GB of VRAM.
Yes, allocation, that’s what I meant. The thing with some games is, the more VRAM a GPU has, the more the game will allocate, but it doesn’t mean that all of it will be used. That’s why The Division 2 is happy with even 6 GB.
Anyway, not that I question this, but could you name the games where AMD cards need 2 GB more? I know that with MS Flight Simulator 2020, Gears 5, or Far Cry 5 AMD cards use more VRAM than Nvidia cards, but it’s only around 0.5 GB or less.
Take dat green card up the behind with no lube like the good nVidia fanboi that you are.
From a marketing perspective that will be the case. But I assume this card is not meant for 4K. From what I can see, new games end up using more VRAM even at 1080p, but the increase in memory usage is not that dramatic as you go up in resolution. One game, for example, might use more than 6 GB at 1080p but only 8 GB or less at 4K.
And to be honest, comparing VRAM between AMD and nvidia cards can be like comparing apples and oranges. In AMD's case they sometimes really need that extra VRAM or their card will be severely impacted (in graphics quality or performance), but that's not the case with nvidia cards. Remember what happened with Horizon Zero Dawn?
Good. The more GPU variants, the better.
120W TBP “green” RX 6700 for me!
I don't think the card will go as low as 120 watts.
then i will wait… patience is my best friend
I expect the RX 6700 XT to be much faster than the 5700 XT. The CU count is the same, but if RDNA 2 packs much better efficiency and higher clocks, the card should easily dethrone the previous-gen cards by a reasonable margin.
It's best to hold out until the rtx 4xxx / rx 7xxx series release, which should be able to handle TRUE next-gen graphics at 2K/4K.
And then you’ll say “it’s best to hold out until rtx 5xxx / rx 8xxx” because RT 2.0 /DLSS 3.0 / MASS TESSELATION NUCLEAR POWER COMPUTER is coming out the year after.
Live in the present and worry about future problems once they arrive, jesus. With that thinking you'd still be on a Pentium 3 at 180mhz.
That’s the problem with putting things off. There will always be something better and faster just around the corner. If a person doesn’t need to upgrade right now then waiting for the next gen makes sense but if an upgrade is needed right now then you might as well upgrade.
One good reason to not upgrade immediately is that it’s hard to find a GPU that isn’t twice the price of the MSRP.
That's true. Availability for PS5/Xbox/RTX 30/RX 6000 is so scarce, my god, idk wtf these companies are thinking.
There’s currently nothing I can’t play on an 8gb rx 580.
Just because I won’t partake in this gen’s first peek at real time raytracing @ 4k, doesn’t mean I won’t get to enjoy 80% of the upcoming games.
Plus, I have a large enough selection of games to play as it is. It’s more about looking for the best value proposition, especially when gaming is far from being my main priority.
Also, it goes without saying that you can't simply "buy" the latest GPUs right now. It's safe to assume that by the time they become widely available at their respective MSRPs, the next refresh will be around the corner.
I could also remind you that the RX 5000 series came out just over a year ago (July 7, 2019). Mindless consumerism will only make me resent my poor decision making and the market.
OP was misleading then. If you'd said that to begin with, everybody and their mother would be doing something else instead of replying, me included.
And yes, you're perfectly fine with what you have if it fills your needs; that wasn't the point.
You could always buy now and sell, but then again you can’t even buy now so whatever.
Or simply wait for the 7nm Ampere Super refresh with more VRAM in Q1 2021.
The 3070 and 3080 are nowhere to be found anyway, and even then they go for scalper prices. If I had one of those bought at MSRP, I would sell it now for a bit more to cover the Super's MSRP, which would probably be $50-100 higher.
Lol there won’t be a 7nm refresh of Ampere.
It will be a cut down 3090 with less VRAM and different memory bus.
Right. I bet you also believed AMD has hundreds of thousands of RX 6000’s ready at launch.
In the end there was what, 10 RX 6800 XTs per Micro Center? Jayz2Cents talked about it and it's worse than what nvidia did, and nvidia was fkin bad with their launch, so… imagine launching a good GPU with jack-sheet stock…
Both AMD and Nvidia are trying to pass the buck by citing “unprecedented demand” as the reason for shortages but the real reason was extremely low supplies on release. Both companies were in such a hurry to release that they released with nowhere near enough GPUs available to meet demand.
Wanna tell me what AMD has to do with baseless claims about a 7nm Ampere refresh out of nowhere?
Do you even have the most basic understanding of what goes into engineering a GPU die, and how it relates to the fab node?
Not exactly a refresh, but nvidia can make an even faster and bigger chip than GA102 at TSMC. And they just need this one chip (for their gaming lineup) to be built at TSMC so they can secure the performance crown. Everything else would still be made by Samsung.
Some people say nvidia can't just port Ampere to TSMC, but they already have the blueprint in the form of the A100 at TSMC. Yes, the A100 is more compute-oriented, but it's still Ampere. The only difference between the A100 and the rest of Ampere is how the chip is arranged for its purpose: gaming or compute. It's similar to Volta and Turing, although back then nvidia decided to give the two different architecture names.
lol
no 7nm super in q1 2021 for u
Made me giggle 🙂 i like this comment
Haha He might snag one in Q2 2022
So we're regressing now?
The X700 series rocking a 192-bit bus?
AMD could've easily kept the 256-bit bus, but then they couldn't put 12 GB on it, because they have to segment the lineup.
Infinity cache is there to help
It's not because there's "infinity" in the name that it'll solve every problem.
Laughs in thanos 🙂
hahahahahah
How about making more than 10 GPUs a week, huh?
If only the ray-tracing performance were better, this would be a card to look out for. But seeing how even the top-end RDNA 2 cards are getting beaten in ray-tracing tasks by cheaper Nvidia cards, the lower-power Nvidia cards will be more future-proof, even before you consider DLSS 3.0.