It appears that product SKU codes for several unreleased Nvidia GeForce RTX 30 series cards have been leaked by Gigabyte, as spotted by VideoCardz. Nvidia has not yet announced any of these GPU variants.
Gigabyte has spilled the beans on three upcoming RTX 30 series graphics cards: the GeForce RTX 3080 20 GB, GeForce RTX 3070 16 GB, and GeForce RTX 3060 8 GB. The new SKUs were listed on Gigabyte's Watch Dogs: Legion code-redemption website.
The list includes several variants from Gigabyte. Some are listed with an S suffix, which could indicate SUPER series branding or perhaps a Ti SKU; we are not fully sure which (see the sketch after the list for the apparent naming pattern). Other models are listed with double the memory buffer but without the S suffix.
- Gigabyte GeForce RTX 3080 (SUPER/Ti) AORUS Master 20 GB (GV-308SAORUS M-20GD)
- Gigabyte GeForce RTX 3080 Gaming OC 20 GB (GV-308GAMING OC-20GD)
- Gigabyte GeForce RTX 3070 (SUPER/Ti) AORUS Master 16 GB (GV-307SAORUS M-16GD)
- Gigabyte GeForce RTX 3070 (SUPER/Ti) Gaming OC 16 GB (GV-3070SGAMING OC-16GD)
- Gigabyte GeForce RTX 3060 (SUPER/Ti) Eagle OC 8 GB (GV-3060SEAGLE OC-8GD)
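The S suffix is easiest to see by decoding the product codes themselves. Below is a minimal sketch of the apparent pattern; the regex is our own guess at Gigabyte's undocumented naming scheme and is meant purely as illustration:

```python
import re

# Guessed pattern: GV-<model digits><optional S><cooler name>-<memory>GD.
# Gigabyte has not documented this scheme, so treat the breakdown as speculative.
SKU = re.compile(r"^GV-(?P<model>\d{3,4})(?P<suffix>S?)(?P<cooler>[A-Z ]+?)-(?P<mem>\d+)GD$")

codes = [
    "GV-308SAORUS M-20GD",
    "GV-308GAMING OC-20GD",
    "GV-307SAORUS M-16GD",
    "GV-3070SGAMING OC-16GD",
    "GV-3060SEAGLE OC-8GD",
]

for code in codes:
    match = SKU.match(code)
    if match:
        d = match.groupdict()
        print(f"{code}: model {d['model']}, S suffix: {bool(d['suffix'])}, "
              f"cooler {d['cooler']!r}, {d['mem']} GB")
```

Running this over the five codes shows every 16 GB and 8 GB SKU carrying the S, while the 20 GB cards appear both with and without it, which is exactly the ambiguity noted above.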
The GeForce RTX 3080 20 GB model is expected to carry the same 8704 CUDA cores as the 10 GB SKU, so it is unclear whether these product codes are final or whether the GPU will simply get Ti/SUPER branding. The listed codes point to two variants of the RTX 3080 20 GB. It would make little sense for Nvidia to release a GPU with more VRAM but otherwise unchanged specs, so there is a chance Nvidia will also release a higher-spec'd x80-series variant to counter AMD's Big Navi GPUs.
The GeForce RTX 3070, on the other hand, would get a 16 GB VRAM buffer, which fits Nvidia's claim that the card will be faster than the RTX 2080 Ti and its 11 GB of memory. The leak also corroborates the memory specifications rumored earlier. More importantly, the RTX 3070 16 GB GDDR6 variant is said to use the full GA104 die with 6144 CUDA cores, while the plain non-S/Ti RTX 3070 has 5888 CUDA cores, 256 fewer.
Finally, there is the GeForce RTX 3060 SUPER with 8 GB of GDDR6 VRAM, which is reported to use a cut-down GA104 die rather than the GA106 core designed for mid-tier and mainstream SKUs.
According to the well-known leaker @Kopite7kimi, the RTX 3060 Ti/SUPER would allegedly use the GA104-200 GPU with 4864 CUDA cores across 38 SMs, 1024 cores fewer than the RTX 3070's 5888. The card is also said to feature only 8 GB of GDDR6 memory, since it is more of a successor to the GeForce RTX 2060 SUPER.
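As a sanity check on those numbers: Ampere GA10x parts have 128 FP32 CUDA cores per SM, so the leaked counts line up. A few lines of Python show the arithmetic; note that the SM counts for the unreleased cards come from the rumors above, not from confirmed specifications:

```python
# Ampere GA10x: 128 FP32 CUDA cores per SM (per Nvidia's whitepaper).
CORES_PER_SM = 128

# SM counts for the unannounced SKUs are rumored, not confirmed.
skus = {
    "RTX 3070 16 GB (full GA104, rumored)":   48,  # 48 x 128 = 6144
    "RTX 3070 (GA104, as announced)":         46,  # 46 x 128 = 5888
    "RTX 3060 Ti/SUPER (GA104-200, rumored)": 38,  # 38 x 128 = 4864
}

for name, sms in skus.items():
    print(f"{name}: {sms} SMs -> {sms * CORES_PER_SM} CUDA cores")

print("3070 16 GB advantage over plain 3070:", (48 - 46) * CORES_PER_SM)  # 256
print("3060 Ti/SUPER deficit vs plain 3070:", (46 - 38) * CORES_PER_SM)   # 1024
```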
But do note that the Ti and SUPER variants might turn out to be the same GPU, since the RTX 3060 is based on the same board, the GA104-200 SKU on the PG142 PCB. Also, don't expect similar pricing for these Ti/SUPER variants: the GeForce RTX 3070 with 16 GB of VRAM could end up close to the price of the GeForce RTX 3080, and the RTX 3080 20 GB would cost roughly $799-$899.
These Ti/SUPER variants aren't meant to replace the existing RTX 30 series lineup; rather, they offer a step up in memory and core specs at higher prices. Much of this depends on how AMD responds with its Navi 2X lineup based on the RDNA 2 architecture.
Stay tuned for more!
Hello, my name is Nick Richardson. I've been an avid PC and tech fan since the good old days of the RIVA TNT2 and the 3dfx Interactive "Voodoo" gaming cards. I mostly play first-person shooters, and I've been a die-hard fan of the FPS genre since the good old Doom and Wolfenstein days.
Music has always been my passion and my roots, but I started gaming casually when I was young, on Nvidia's GeForce3 series of cards. I'm by no means a hardcore gamer; I just love everything related to PCs, games, and technology in general. I've been involved with many indie metal bands worldwide and have helped them promote their albums through record labels. I'm a very broad-minded, down-to-earth guy. Music is my inner expression and soul.

I really wonder how AMD will respond to this
They don't have to, tbh… they're profitable through consoles
Consoles are not enough for them to stay viable and competitive.
And these will go out of stock too on launch day lol
Early adopters buying the 3080 are getting f***ed as per usual by the marketing approach NV has been taking lately. For those who want to buy the cards: hold your horses just yet.
They know this. That’s why they do it. Stupid impatient people are the reason lol
They are not necessarily stupid.
For some people money is no object, they upgrade their high-end GPUs every year. I wish I could fit a new expensive GPU into my budget every year.
I've noticed people with lots of money actually being very stupid. They waste their money on sh*t without thinking and then wonder where it went when they really need it.
But what really bothers me is that they are so horribly uncritical of the things they buy. They don't care if their new car makes weird noises or doesn't deliver the promised performance. They don't care if a very expensive piece of furniture is scratched. And of course they DGAF if their new graphics card has massive frame drops, as seen last gen with the 2080 stutter.
So Nvidia only made a 10GB variant to make the announcement price point right on the border of appealing, and the true 3080 with enough VRAM for 4K just went up another $100-$200, creeping ever closer to that $1000 price point. I really hope RDNA2 is competitive, these prices really need a check.
That's why I wait. And even if I don't buy a new GPU, I'm happy with my 5700 XT.
There is no game/reason why I should upgrade
Why would you upgrade from a 5700XT? That card’s barely a year old.
And one bad-a$$ card too. It just makes no sense to me, but maybe some people have more money than brains.
Has nothing to do with brains.
I could see selling it for resale value, but you’re not gonna get anything for last gen cards at this rate. So I don’t see the point in upgrading unless you’re going for an ultra 4k @120hz setup.
People do sh*t because they can. I'm pretty sure if you were a billionaire you'd have at least one car worth $100k+ or a house worth $1M+.
It’s the world we live in. No need for a point to do something when you just can.
Some gamers have money some don’t.
Ahem, billionaires own cars that cost over $1 million ($100k+ for a car is pauper trash), mansions that cost $100 million at minimum, and apartments in central locations for $10+ million.
I was very conservative for the purpose of the argument. But in a nutshell YES you’re right lmao.
Look man, yes, people do things because they can. But you know I'm not talking about that as a reason to move from that card to the next. That's not a very valid argument here. As a matter of fact, it's not really an argument, and you know it. "Do what you wanna do because you wanna do it" is not what's on trial here, but if you wanna go down that road, I can't argue that, because no one can or should. That is not what I'm even remotely talking about. But that's cool, play the fool.
No, what is on trial here is you saying that something is useless or makes no sense. You are not everyone and neither am I, so who are we to decide what is right in this situation? Common sense? Perspective? Everybody has a different angle on every subject.
Both our points are good. Matter of fact, your point is very much valid, since power consumption creates mucho e-waste etc. But needs are needs.
Well, not really. The games I play at 1080p 60 Hz are enough.
Because my kid wants her own PC, and I have some old parts like an i5-3570K and a GTX 970. But I don't really wanna give her a sh*t PC. I know really well what it's like to have sh*t FPS. And it's really old HW.
I have an AMD 3600 CPU and a 5700 XT.
I might just give her that and get something for myself. You know, Christmas is coming and all that sh*t
Absolutely. We knew it would be coming!
Happy to wait it out and see. My 2060S is good enough to hold me over. I'll most probably look at the 3070S 16 GB; I really want more memory than these regular cards have. We will surely benefit from 16 GB for next-gen gaming, I'm sure!
So cynical of you. I approve!
Still, the RTX 3080 (10GB) will be a great card with enough VRAM for those wanting to play at Ultra settings on ultra-widescreen 3440×1440 monitors and in VR. It’ll also be fine for many games at 4K albeit perhaps not always with 8K textures for the minority of games featuring them.
10 GB WILL NOT SERVE 4K GAMING! That is the biggest bait-and-switch Nvidia is pulling, and y'all are falling for it. The most you will do with 10 GB next gen is 1440p, if you're lucky. They knew what they were doing with that reveal and price point. That is still a 1080p/1440p card for next gen. 4K ultra is already maxing out 8-10 GB this generation; 10 GB will be a joke for 4K come next gen.
That's a 100% lie. Gamers Nexus confirmed that 10 GB is more than enough for 4K. He also confirmed that the VRAM usage you see in software is misleading: allocated VRAM != needed VRAM. Most games allocate everything they can, even if they don't need it.
See, you know how I know you're full of $hit and don't actually have any of this tech? Because you need Gamers Nexus to trump my experience. Well, you keep believing what Gamers Nexus tells you, and I'm just gonna keep on having my experiences. Good luck experiencing life through a YouTuber. Haha, "Gamers Nexus said so": that is what we call analytical, not factual. Just like how most of these idiots on YouTube run the benchmarking tools and call it a day; none of that reflects real-world scenarios and usage. Since I actually use this $hit, I'm gonna go with what I saw and have seen. You're a joke.
That's the dumbest comment ever. Gamers Nexus has the tools and experience from doing this day to day, versus our hobby.
My last card was an 11 GB 2080 Ti, and I NEVER EVER felt that I needed more VRAM for 4K. 11 GB or 10 GB is the same thing; 1 GB won't change much.
You're just arguing against smarter people with more experience [me and Gamers Nexus], and that's why you look and sound dumb.
You're pretty close to the dumbest comment ever yourself. 8 GB was already easily filled at 1080p back in 2016. The 2070 should have had 10 GB already; instead we see a 3080 with 10 GB now. Lol
Gamers Nexus doesn't get what VRAM allocation means either. ALLOCATED DOESN'T HAVE TO MEAN NEEDED.
But keep believing some tar*d who scratches up his 3080 completely when taking it apart.
Sure bro
Yes, very sure. Fire up Rise of the Tomb Raider, for example, at 1080p with max details, and be amazed.
And look at the 3080 at the end of their disassembly video.
And then accept that some things in VRAM aren't needed simply because you as a player chose a path that won't show them, though they would have been needed had you gone the other way. Yet in both cases frame-time drops were avoided because of the big VRAM.
Not to mention that low VRAM is hindering development of bigger and better games in general.
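For anyone who wants to check what their own card reports, here is a minimal sketch using the pynvml bindings (pip install nvidia-ml-py). Note that the "used" counter these tools expose is allocated memory, which is exactly the allocation-versus-need distinction being argued over above:

```python
# Minimal sketch: read the VRAM counters that monitoring tools display.
# "used" here means memory *allocated* to processes -- engines often grab
# headroom opportunistically, so this alone doesn't prove the game needs it.
from pynvml import (nvmlInit, nvmlShutdown,
                    nvmlDeviceGetHandleByIndex, nvmlDeviceGetMemoryInfo)

nvmlInit()
try:
    handle = nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
    mem = nvmlDeviceGetMemoryInfo(handle)
    gib = 1024 ** 3
    print(f"total:     {mem.total / gib:.1f} GiB")
    print(f"allocated: {mem.used / gib:.1f} GiB   (reported as 'used')")
    print(f"free:      {mem.free / gib:.1f} GiB")
finally:
    nvmlShutdown()
```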
I've been gaming in 4K since about 2014-2015. First I had a 40-inch Philips 4K monitor, then a 4K OLED C6 from 2016, and a C9 since 2019.
In that period I used these GPUs and had ZERO VRAM issues:
780 Ti 3 GB
SLI 970 3.5 GB
980 Ti 6 GB
GTX 1080 8 GB
GTX 1080 Ti 11 GB
RTX 2080 Ti 11 GB
P.S. Let's not forget that Nvidia has been using VRAM compression for generations, improving it each time; right now it doubles the effective VRAM. Also, the new RTX 3000 series has that console-style technology for streaming from the SSD and using it as VRAM.
Great. Doesn’t matter. If you don’t care about frame time drops, then you don’t care about it. Doesn’t mean others won’t notice them.
Compression is only 10%.
And the API for that has to be implemented in games by the developers. Seeing how long even other DirectX features took to get supported, you can imagine how long this will take. Plus nobody has tested yet whether it's really as good as they claim.
Here is an article about Pascal's 4th-gen Delta Color Compression.
You get a 20% increase in effective memory bandwidth, plus 2:1, 4:1, and even 8:1 compression. Even at the lowest 2:1 you can load twice as many textures, and at 4:1 four times as many. Of course, it all runs automatically on the fly on the GPU based on pattern recognition: some blocks get 2:1, some 4:1, and some 8:1.
https://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/8
Since then we've had Turing's 5th-gen DCC and now Ampere's 6th-gen DCC, so it has only gotten better in both bandwidth savings and level of compression.
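For what it's worth, here is a rough back-of-the-envelope model of how mixed per-block ratios translate into effective bandwidth. The block mix below is invented purely for illustration, and DCC is generally described as a lossless bandwidth optimization rather than extra usable VRAM capacity:

```python
# Hypothetical mix of per-block compression outcomes (ratio -> fraction of
# blocks); real ratios depend entirely on frame content. DCC saves memory
# *traffic* -- data is still addressed at full size, so capacity is unchanged.
block_mix = {1: 0.55, 2: 0.25, 4: 0.15, 8: 0.05}

# Average bytes actually moved per byte of uncompressed data.
bytes_per_byte = sum(frac / ratio for ratio, frac in block_mix.items())

print(f"traffic reduced to {bytes_per_byte:.1%} of uncompressed")    # 71.9%
print(f"effective bandwidth multiplier: {1 / bytes_per_byte:.2f}x")  # 1.39x
```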
Gamers Nexus is wrong.
We should thank this teenage specialist for telling us the truth.
I've worked in the industry since before said teenager was born, and I know for a fact that this 10 GB claim is false. I don't need ignorant 'tubers telling me lies about things they don't understand, based on pure theory, when it has been repeatedly disproven in real tests.
Nope, sorry. Anyone with eyes who's actually played a game can confirm 10 GB is absolutely not enough.
10 GB isn't even enough for 1080p.
Play a game like RE2 Remake with everything on ultra. It takes 12.5 GB, and that's just at 1080p without DSR turned on.
Yeah, but imagine the price of the 3080 with 20 GB
Starting from $999, I bet
More like $1299.
This is one of the major reasons I hated dealing with/owning Nvidia in the past: their regular-versus-Ti bull$hit. It's manipulative and annoying at the same time, and people actually fall for this rubbish. Fifty SKUs of the same damn card, with a 5-10% performance increase for $200 more. Nothing has changed, and they know just what to say, and what to leave out, to get people excited.
Thank you, but I will never EVER buy anything from GIGABYTE… by far the worst manufacturer: motherboards, BIOS, DPC latency, bad VRMs, and low-quality GPU builds.
I had a card that caught on fire.
Gigabyte has always been my favorite, outside of Sapphire.
They'd have to pay me to use Gigabyte :)))
Any proof of that? Especially the DPC latency and bad VRMs. I've had several Gigabyte boards, and the only time I got high DPC latency was when an Nvidia card caused it. Gigabyte boards are also the most durable ones I know; I even use them in cars.
Game on an Asus mobo/GPU combo versus a Gigabyte or MSI one: the difference in micro-latency, stability, and BIOS support is like a 5-star hotel versus a 1-star one down by the bay.
That's not proof. That's just you claiming something. Give me some proof, FFS. What's so hard about that? What you claim can't be seen in any review I've seen so far. Yet according to you it should be as obvious as having a GTX 260 instead of a 2070. No typo on the 260.
Once you game on an Asus mobo, you will never go back to Crapbyte.
I had an Asus MB in between two Gigabyte ones; the Asus was far superior for me.
Ewww, a Gigabyte mobo? Really? I am too poor and respect myself too much to go with a Gigabyte mobo for my PC.
Had 5 mobos from them, all terrible.
Show proof then. It should be easy if what you say is true. I've had several Asus boards; no difference in performance, and I'm someone who panics at the slightest frame-time drop.
Those Asus boards all died very early, with the notorious Asus black screen. And after having to deal with the horrible Asus support, I won't ever buy one again.
It's easy: test a Gigabyte mobo + 60 Hz
versus an Asus mobo + 240 Hz
Asus always, but always, will feel more snappy and fast, no matter what.
Yeah, drive a Ferrari with square wheels and then drive a Golf with round wheels. The Golf is superior.
They'd have to pay me to use Gigabyte :)))
I bet they do. I bet they do.
Holy sh*t, take my $1500. My 1080 is ready for retirement, i.e. serving in the Plex mines.
"It would make little sense for Nvidia to release a GPU with more VRAM but otherwise unchanged specs"… why not? For example, I am waiting for the 20 GB version to buy the RTX 3080.