The specifications of the next-gen Ampere cards, the GeForce RTX 3090, GeForce RTX 3080, and GeForce RTX 3070, have been leaked by Videocardz.
According to Videocardz’s sources, Nvidia will initially launch three SKUs in September: the RTX 3090 24 GB, RTX 3080 10 GB, and RTX 3070 8 GB. All three cards will be announced on September 1st during the GeForce live digital event.
Custom AIB board partners are also preparing a second variant of the RTX 3080 featuring twice the memory (20 GB), but this SKU will launch at a later date and has yet to be announced. The GeForce RTX 3090 and GeForce RTX 3080 will be the first cards to hit retail shelves, with a hard launch planned for mid-September.
According to this leak from Videocardz, the GeForce RTX 30 series cards will feature 2nd-generation ray tracing (RT) cores and 3rd-generation Tensor cores. The GeForce RTX 30 lineup will also support HDMI 2.1 and DisplayPort 1.4a for display output, and will pack a brand-new NVLink connector. These new cards will also support the PCI Express 4.0 interface.
Coming to the specs, the GeForce RTX 3090 will feature the GA102-300 GPU with 5248 CUDA cores and 24 GB of GDDR6X memory across a 384-bit wide bus, giving a maximum theoretical bandwidth of 936 GB/s, close to 1 TB/s.
Most importantly, according to Videocardz’s sources, the custom AIB cards will utilize dual 8-pin power connectors, as this flagship card has a total graphics power (TGP) of 350W. The RTX 3090 represents a significant increase in core count over the RTX 2080 Ti, which featured 4352 CUDA cores. The GA102-300 GPU will have a boost clock frequency of 1695 MHz.
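As a rough sanity check on why dual 8-pin connectors cover a 350W TGP, the numbers work out under the usual PCIe spec limits (75 W from the slot, 150 W per 8-pin connector); this is a minimal sketch, not anything from the leak itself:

```python
# Rough power-delivery sanity check, assuming the PCIe spec limits:
# the motherboard slot supplies up to 75 W and each 8-pin connector up to 150 W.
SLOT_W = 75
EIGHT_PIN_W = 150

def board_power_budget(num_8pin: int) -> int:
    """Maximum board power deliverable with the given number of 8-pin connectors."""
    return SLOT_W + num_8pin * EIGHT_PIN_W

print(board_power_budget(2))  # 375 W, just enough headroom for the leaked 350 W TGP
```

This also hints at why heavily overclocked custom models may move to three 8-pin connectors (a 525 W budget under the same assumptions).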
We don’t have any details on the TMU (texture mapping unit) or ROP (raster operations pipeline) counts yet. The RTX 3080, on the other hand, gets 10 GB of GDDR6X memory and the GA102-200-KD-A1 GPU die, a cut-down SKU with 68 SMs and the same 4352 CUDA cores as the RTX 2080 Ti.
Assuming the memory runs at 19 Gbps across a 320-bit wide bus, we can expect a bandwidth of up to 760 GB/s. This model has a TGP of 320W, and the custom AIB models will again require dual 8-pin PCIe connectors. The RTX 3080 will feature a maximum boost clock of 1710 MHz. The memory speeds for both the RTX 3090 and RTX 3080 are expected to be around 19 Gbps.
The RTX 3070 will also launch at the end of next month. This GPU is rumored to get 8 GB of GDDR6 memory (the non-X variant) at 16 Gbps, and will have a TGP of 220W. The GeForce RTX 3070 will feature the GA104-300 GPU core. The CUDA core count is expected to be around 2944-3072, similar to the existing Turing RTX 2080 SUPER. This card will have a bandwidth of around 512 GB/s.
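The bandwidth figures above follow directly from the per-pin data rate and bus width. A quick arithmetic check (the 256-bit bus for the RTX 3070 is an assumption inferred from the 512 GB/s figure, and the 936 GB/s figure for the RTX 3090 implies a 19.5 Gbps data rate rather than a flat 19 Gbps):

```python
# Peak memory bandwidth (GB/s) = per-pin data rate (Gbps) * bus width (bits) / 8
def peak_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

print(peak_bandwidth_gbs(19.5, 384))  # RTX 3090: 936.0 GB/s
print(peak_bandwidth_gbs(19, 320))    # RTX 3080: 760.0 GB/s
print(peak_bandwidth_gbs(16, 256))    # RTX 3070: 512.0 GB/s, assuming a 256-bit bus
```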
Nvidia is using the high-end GA102 GPU for the RTX 3080 as well, which appears to be an upgrade over the previous TU104 core featured on the RTX 2080. This could also mean that the new Ampere card has higher wattage and thermal requirements, and it will fall into the high-end enthusiast segment.
There is also a possibility that the Ampere GeForce RTX 30 series will be utilizing the 7nm process node, though this is yet to be confirmed.
Nvidia is hosting a GeForce Special Event on September 1st, and we expect the company to announce the next-gen Ampere gaming GPUs.
Stay tuned for more!
Hello, my name is Nick Richardson. I’ve been an avid PC and tech fan since the good old days of the RIVA TNT2 and 3dfx Interactive “Voodoo” gaming cards. I love playing mostly first-person shooters, and I’ve been a die-hard fan of the FPS genre since the good old Doom and Wolfenstein days.
MUSIC has always been my passion and my roots, but I started gaming “casually” when I was young on Nvidia’s GeForce3 series of cards. I’m by no means a hardcore gamer, but I just love stuff related to PCs, games, and technology in general. I’ve been involved with many indie metal bands worldwide and have helped them promote their albums to record labels. I’m a very broad-minded, down-to-earth guy. MUSIC is my inner expression and soul.
Contact: Email

My GTX 1080 is starting to get old, but I still might skip this gen.
I have a 3770K + GTX 1080. I’m planning a new build with a 3090.
Get rid of the CPU. I also upgraded from an i7 3770K to the latest Ryzen, and even with the same GPU, performance improved a lot. To be fair, the i7 3770K is also nearly a decade old.
That’s why I’m doing a new build. 4950x + 3090 should be pretty nice. 🙂
What resolution are you using? I also have an i7 3770K, overclocked to 4.5 GHz, and I’m also using a GTX 1080 for 4K and VR. I would like to upgrade to an RTX 3070 or 3080. I know my CPU won’t let me get the full performance out of these GPUs, but at 4K and in VR it shouldn’t be that big a deal. Or am I wrong?
I’m using WQHD on my monitor and 4K on my TV. I didn’t have the 4K TV back then, so I can’t really say anything about 2160p.
I didn’t overclock my i7, as it couldn’t handle it well.
I bought a 3700X and a B450 mobo with 3200 MHz RAM.
TBH I would upgrade the CPU first for an overall improvement in performance; it will also smooth out your frametimes.
Just consider that the i7 3770K suffered heavy performance losses from all the security patches.
In GW2 my minimum fps doubled. In FC5 I went from stuttering, sluggish 1080p at medium-high settings and 40-60 fps to rock-stable 60 fps at 1440p on very high to ultra settings.
Couldn’t believe how different and smooth the majority of games became.
The performance in TESO also improved a lot, and FF XII doesn’t stutter anymore (though that could also be due to an update; I hadn’t played it in a long time).
I know I have to upgrade the whole base of the computer, but now is not a good time for that. Zen 3 is going to be released by the end of the year, so I’m thinking about buying an RTX 3080 now and everything else next year. And by then, I believe I’ll end up buying a new GPU as well. 🙂
RIP wallet. I just hope an AIB RTX 3080 with 16 GB of VRAM will be announced soon.
I have an i7 5820K @ 5 GHz, and I’m also buying a 3090 and then a new CPU later down the road. I think I could make do with the 5820K, but there’s no NVMe slot, and games will require fast storage going forward. So I’ll upgrade from a Titan X Maxwell to a 3090, and later build a new PC for the 3090 to go into. I haven’t decided on the CPU yet; maybe when Intel gets off 14nm, as I have usually gone Intel for my main CPUs.
I am gaming on the Titan X Maxwell, so I’ll be upgrading to the 3090 when it launches. The whole reason the Titan has lasted 5-6 years is that it has 12 GB of VRAM, so I’m shocked to see the 3080 launching with 10 GB of VRAM, which will age it very quickly, as even current games on low settings are asking for 4 GB of VRAM.
If money is no object…
I can build a really nice gaming PC for the price of a 3090.
I suppose you could, mate. But I like to treat myself now and then, and it’s been 5-6 years since I bought a high-end GPU, so now is the time to do so. Out of interest, what PC parts would you choose for the price of one 3090?
What PC parts would I choose?
Let me wait for Ryzen 4000 and all those new, fast graphics cards from both companies. It won’t be that long.
One thing that contributed to the significant jump in VRAM usage was the 8th-gen consoles. Back then we went from something like 256 MB (PS3) and 512 MB (360) to 8 GB, which is 32 times and 16 times more memory respectively. The upcoming consoles only double that to 16 GB despite targeting 4K resolution, so memory usage probably won’t jump as significantly this time around as it did last gen. I could be wrong, though. But from what I heard, a 20 GB variant is also going to exist, hence the significant difference in VRAM between the 3080 and 3090. Knowing how Nvidia has done things since Kepler, they won’t want the lower-tier card to have more VRAM than the higher-tier one. The only caveat is that the model with double the memory will cost significantly more, and board partners will be glad for the kind of money they’ll be making.
Don’t upgrade now; wait till 2022. You’ll get DDR5 RAM with the Ryzen 6000 series and Nvidia’s RTX 4000 series.
I will wait until 2022 to fully upgrade my system. I currently have a GTX 1080 Ti, an i7 4790K, and 16 GB of RAM.
Well, good for you. You have low expectations, as a 4790K can’t even keep 60 fps in some games and is fully pegged quite often in many newer titles. In Watch Dogs 2 I get literally double the minimum fps with a stock 9900K compared to my OCed 4770K. Your CPU is only going to crap itself even more next year, as many games will be made with modern 8-core/16-thread CPUs as the baseline.
Metal, killing it with the articles these days. Can’t wait to hear your take on AMD when their showcase begins. Good work, man!
If AMD’s history is anything to go by, you can expect low- to mid-range products by the end of the year, then a card a year later that competes with the 3080 while being slightly cheaper, only to see Nvidia release performance-tweaked cards with Super labels that outperform them all. I think we’re better off waiting for Intel to offer up some competition at this point.
I requested METAL MESSIAH’s opinion, NOT a Metal fanboy’s, a.k.a. Nvidia’s fuqboy. I’m not looking for COMPETITION; I’m looking for well-performing cards at reasonable prices.
Competition is what’s gonna give you good cards at reasonable prices, you idiot!
Meh, if you just sell your last Nvidia card when you buy the new one, it’s not really a huge amount of money. When it all comes down to it, you could sell your card after all the upgrades and not lose much money at all, and you get years of being at the high end of gaming.
Well, you got both all the same. Not to mention, my dollar has always considered price-to-performance; I have probably owned more AMD/ATI cards than Nvidia cards. My criticism is more disappointment than fanboyism, as only consumers win from competitive products. I never considered paying over $1K for a GPU, yet here I am rubbing my chin at a 3090 because of AMD failing to compete.
Hey, I’m desperately waiting myself for some leak on the Big Navi GPU and the next-gen RDNA2 lineup of cards. But sadly, call it a coincidence, we’ve only gotten rumors/leaks on Ampere cards this whole season.
I think by now there are 20+ articles/leaks on Ampere, and not a single one on Big Navi. Nonetheless, I’m more excited for AMD.
I’m pretty sure the next-gen Navi cards are going to give a much better price/performance ratio than Nvidia’s Ampere. I don’t want AMD to compete with Nvidia, at least on the high end; they are already very competitive in the CPU department.
They just need to focus more on the GPU side of things, and if previous rumors are anything to go by, the next-gen RDNA2 arch packs a lot of punch! It will also have hardware-level support for ray tracing.
Even though the flagship Ampere cards like the RTX 3090/3080 will again take the performance crown on the high end, the RDNA2 cards won’t be left far behind either. But like I said before, I don’t want AMD to beat or compete with Nvidia’s high-end offerings.
Their “mainstream” Navi 2X cards are once again going to be equally competitive, and we also expect a better price/performance ratio from these cards.
How come it says “will be utilizing 7nm” in the headline, and then, “There is also a possibility that the Ampere GeForce RTX 30 series will be utilizing the 7nm process node, though this is yet to be confirmed.” at the end of the article?
I’m aware of that. I wanted to change it, but according to Videocardz, “The data that they saw clearly mentioned the 7nm fabrication node.”
It is still not 100% confirmed, but chances are high these cards will be utilizing a 7nm process node.
Anyway, kindly wait; I will edit it soon, since it appears contradictory.
EDIT:
Done! The title has been changed to “NVIDIA’s GeForce RTX 3090, RTX 3080, RTX 3070 specs have been leaked, may utilize 7nm Process Node”.
Good deal, because it’s 8nm, haha.
Yup, it’s Samsung 8nm. I guess the previous rumors were true, and Videocardz was wrong. Don’t know, but when I asked them, they told me it would be TSMC 7nm for sure.
Anyway, thanks for correcting the title of this article. 😀
“Ampere second-gen RTX cards are all built using the Samsung 8N node, as opposed to the TSMC 7nm production process used for its initial Ampere GA100 GPU.” So it seems they switched. Cool stuff!
And no, thank you for being a good mod and writer here.
skipity skipy skip
So, given these specs, do we have a ballpark of how the 3080 will compare to the 2080ti performance wise?
My guess is that it will perform better with ray tracing by quite a bit, and without ray tracing I expect you will see similar performance from the two cards.
Well, that’s a disappointment, to say the least. Let’s hope Nvidia has some additional tricks up its sleeve; otherwise, I’d be tempted to go for the 3090, depending on the price.
I need a solid 1440p card with 90 fps in mind. My GTX 1070 is just not doing it for me. I hope AMD can deliver this time, because I don’t want to pay $800 for a GPU.
I’ve seen 1080 Tis floating around on eBay and FB Marketplace for $400-500. Maybe track down one of those; that’ll get you 1440p 90 fps in most titles. I had one previously, and I can vouch for how solid that card is, even in 2020.
Buying a 1080Ti now makes zero sense.
For $400, yes it does. Not everyone can drop $800 on a 3080; the 20 series is still super expensive even used, and $400 or even $500 for a 1080 Ti that handily beats the 2070 absolutely makes sense.
You would be a damn moron to spend $400 on a 1080 Ti right now. The brand-new 3060 will be around that price, be quite a bit faster, and have ray tracing and DLSS.
“TGP of 350W”
Expect ~400W with a typical overclock.
It looks like I’m skipping another gen, waiting for Big Navi, or acquiring a 2080 Ti to pair with my new LG OLED CX 55. The 3090 seems unoptimized, and the 3080 getting only 2 GB more VRAM over my current 2080 is a slap in the face at what these are going to cost. AMD needs to give Nvidia the same swift kick in the pants they gave Intel.
OK, keyboard warrior, lol. Also, watch the 3080 tie the 2080 Ti in benchmarks, if not slightly lose to it, since the 2080 Ti has 11 GB and the 3080 has 10 GB. Sure, it’s faster memory on the 3080, but we’re increasingly seeing games use every last bit of available VRAM. The fact that they’re even offering a 10 GB card in a “new gen” for probably $800 or so is laughable; the only ones who can defend this garbage are the Nvidia shills and bootlickers, so which one are you? Moving to the other point, the 3090: IT USES 400 watts. WHO WANTS A MASSIVE, HEAT-SPEWING GPU IN THEIR EFFIN CASE? The thing is gigantic, ugly, expensive as all hell, and it’s gonna be able to cook filet mignon. A lot of people with ATX cases won’t even be able to fit the thing in their case, forget the markets with mATX and ITX cases. The 3090 is a joke when it comes to engineering. I’m sure performance is gonna be great, but at what cost? No thanks. Third point you threw in, the LG OLED: it’s a good TV, 120 Hz, 4K, HDMI 2.1 once the new cards support the 2.1 standard. I’m playing at 1440p 120 Hz with adaptive sync on an OLED with HDR, insane color quality, and 3 ms of response time, so I don’t see why that was even brought up. When I do upgrade, either this gen or next, I’ll use the full HDMI 2.1 to go from 1440p to 4K without buying a new display, keep all the other features, and still have an awesome TV for Netflix and other media.
Did you even read? They said there might be a 20 GB model of the 3080.
I was specifically talking about the launch 3080. I have no doubt the 20 GB model will be $900 or more. Also, way to jump down my f**king throat before anything is even confirmed; that 20 GB model they’re SPECULATING on might be the 3080 Ti instead, if it even happens at all. So way to act like a console fanboy or an Apple apologist; all those people do is try to defend companies against any sort of criticism.
Jeez! Kindly put some space between the paragraphs; I can’t read your comment properly.
Great!
These custom cards are indeed huge. The ZOTAC GeForce RTX 3090 Trinity is pictured below. The Trinity RTX 3090 is a triple-fan, triple-slot design which requires three 8-pin power connectors.
The ZOTAC GeForce RTX 30 series lineup includes a total of eight graphics cards across the next-generation Trinity, Extreme, and Twin Edge Holo series variants. The Trinity and Extreme cards will feature triple-fan cooling designs, while the Twin Edge series will feature dual-fan cooling and will only be available with the RTX 3070.
https://www.hd-tecnologia.com/se-filtra-la-zotac-gaming-geforce-rtx-3090-trinity-holo/
https://twitter.com/momomo_us/status/1299399692632031232
https://uploads.disquscdn.com/images/8d1da7f85f3eda4e082ee43be710ad70fc6e0f1538259e0ae41ee719f4c04b08.jpg
Oh boy, three 8-pin cables, WTF!! These cards are surely going to draw a lot of power.
Some custom models will require two 8-pin connectors as well, whereas other high-end SKUs will get their power from three 8-pin cables.
The RTX 3070, on the other hand, shouldn’t have very high power requirements, IMO; most probably some models will come with 6-pin and 8-pin connectors.
Wait, I thought Nvidia was using Samsung’s 8nm process instead of TSMC’s 7nm node??
Two years for the same performance; PC is just closing its doors.
No, it is not. Go back to your console hub.
The 8nm node always seemed like bullsh1t to me. The A100 is made on a 7nm node, and Titan chips, now the 3090, were always on the same process as Quadro, just with disabled SM units and upped frequencies to keep within TDP limits.