It appears that NVIDIA is prepping a new RTX 30-series SKU positioned between the already released RTX 3070 and RTX 3080 GPUs. This leak comes just after NVIDIA was rumored to have cancelled production of the RTX 3080 20 GB and RTX 3070 16 GB Ampere-based SKUs.
Kopite7kimi, a known and reliable NVIDIA leaker, claims that the green team is preparing a new Ampere card based on the 8nm GA102 GPU silicon, carrying the ASIC code GA102-150-KD-A1.
GA102-150-KD-A1, 7424FP32, 320bits
— kopite7kimi (@kopite7kimi) October 23, 2020
The cancellation of the RTX 3070 Ti/16 GB graphics card may hold weight, as that card has now allegedly been pushed up the GPU tier/hierarchy; the originally planned model was supposed to be based on the full GA104 GPU silicon, with 6144 CUDA cores.
NVIDIA may think the previously planned SKUs wouldn't be able to tackle AMD's RX 6000-series Big Navi lineup in terms of performance, so the company may have a new SKU in the works with better specs, rather than simply more VRAM slapped onto the PCB.
This new card might be the GeForce RTX 3070 Ti or even the GeForce RTX 3070 SUPER SKU, but we don’t know for sure. So you should take this leak with a grain of salt, since this is just a rumor.
Anyway, this GPU appears to be the third entry based on the GA102 silicon.
Coming to the rumored specs, the GA102-150-KD-A1 silicon will feature 7424 CUDA cores across 58 SMs (streaming multiprocessors), or 29 TPCs. That gives the card 1536 more cores than the current RTX 3070 8GB and 1280 fewer than the current RTX 3080 10GB.
This amounts to roughly 26% more CUDA cores than the GeForce RTX 3070 and around 15% fewer than the GeForce RTX 3080. The ROP count is not known, but we expect this SKU to feature 232 tensor cores, 232 TMUs, and 58 RT cores.
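As a quick sanity check on those percentages, here is a minimal sketch comparing the rumored core count against the publicly listed CUDA core counts of the RTX 3070 (5888) and RTX 3080 (8704); only the 7424 figure comes from the leak, the reference specs are the known retail configurations.

```python
# Compare the rumored GA102-150 core count with the published
# RTX 3070 and RTX 3080 CUDA core counts (reference specs, not leaked).
RUMORED_CORES = 7424  # 58 SMs x 128 FP32 cores per Ampere SM

known = {"RTX 3070": 5888, "RTX 3080": 8704}

for name, cores in known.items():
    diff = RUMORED_CORES - cores
    pct = diff / cores * 100
    print(f"vs {name}: {diff:+d} cores ({pct:+.1f}%)")

# Output:
# vs RTX 3070: +1536 cores (+26.1%)
# vs RTX 3080: -1280 cores (-14.7%)
```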
According to the leaker @kopite7kimi, the GPU is configured with a 320-bit memory bus, and we expect the card to sport 10GB of GDDR6X VRAM. The TDP should fall between 250 and 280 watts.
As for pricing, given that the current RTX 3070 8GB is positioned at $500 and the RTX 3080 10GB at $700, we expect this new SKU to land somewhere in between, i.e. roughly $599.
Stay tuned for more!
UPDATE:
It appears we have some more details on this new SKU, once again courtesy of @kopite7kimi.
Here is some confirmation on the board number and device ID/SKU number of this GPU. The card is going to use the 'PG132 SKU 35' board and will be based on the GA102-150 GPU with 7424 CUDA cores. 2207 has been mentioned as the device ID. Kopite has not only confirmed the board number but also the TGP, which would be 320 watts, the same TGP as the RTX 3080.
https://twitter.com/kopite7kimi/status/1320702114977304577
He expects this model to compete with the Radeon RX 6800 XL, a lower-tier Navi 21-based graphics card. The RX 6800 XL name was confirmed by @KittyYYuko.
6800 XT
6800 XL— Elysian Realm (@KittyYYuko) October 26, 2020
Sounds like Nvidia has soiled their Depends and is trying to Febreze the mess they made, if you ask me.
AMD is gonna fk Nvidia in terms of hardware if the leaks are true; the problem is AMD's software.
Nvidia is 10 steps ahead if not more.
Eh, this one looks promising. Even if we don't believe the leaks (which look super good), just from the console version of RDNA2 (which is obviously nerfed because of the small form factor) it looks tremendous.
Even RDNA1 was okay, however we didn't have a big one (the 5700 XT was ~200mm² iirc?)
The problem is the software, always has been. AMD has inferior software, big time.
You win the average consumer if he has a good experience, and that mostly comes from software, not so much hardware. Obviously the hardware has to be stable, but ±10% performance doesn't matter when Nvidia offers TONS of tools like ShadowPlay (named Share atm), Reflex, RTX Voice, etc., which are leagues above AMD's offerings; hell, they don't even have alternatives.
Let's hope the 2021 Adrenalin is good. Otherwise we're gonna have one more year of okay hardware with mediocre software, to say the least and be polite.
Every new game will have it. It's the best tech for AA ever and it increases performance. Unless AMD has something to combat it, it doesn't even matter if they have better hardware; Nvidia will still have the better output.
Agreed, I really don’t care if AMDs new CPUs are 2x as power efficient if they don’t provide an attractive upgrade in single core performance from my 2014 i7 4790k.
you must not pay attention to hardware news. amd now holds the performance crown. single/multi/gaming/workstation whatever AMD is king of performance now in every aspect. where have you been hiding at? joe’s basement?
I know what the performance gains are and sure I could spend $700 on a new reasonably priced CPU, Mobo, and RAM but it wouldn’t translate to much real world performance improvement since I only need it for gaming.
to me both are 300W ahead if not more.
AMD did beat Nvidia's original 2060/2070 with the 5700/XT until Nvidia released the Super variants. So it's possible for AMD to win. But Nvidia has a lot of performance to spare, so they can create a new card and be the leader again.
It is rumoured that AMD has surpassed Intel in the IPC department and they will probably even match the clocks too. So good news for us consumers.
GPU-wise, they are releasing their most capable contender in the high-performance market since the 7xxx/290 line, so that's also great.
I wouldn’t call Nvidia “playing alone in the field” when you can’t buy a 3070 or 3080 or 3090 anyway unless you want to get scalped. The worst part is that Nvidia did this to themselves. They released Ampere before they had enough stock to go around and they allowed customers to buy more than 1 card per address.
I was on eBay a couple of weeks ago and there was a guy advertising the 3080 for $1,050 (buy it now). He said “only 9 left” in his ad. How did he get so many 3080s? Is it any wonder that the average customer can't get one at MSRP? BTW, he sold every one of those within 2 hours.
Local stores here literally sell the Asus 3080 TUF OC for $1,210, and they say “It's Nvidia's pricing + 14% VAT.” It's BS.
Well they still win with RT. The people who say they don’t care about RT are still watching black & white CRT TVs. They also have that magic known as DLSS though I’d rather have the performance so as not to rely on it.
What this may do is push Nvidia on to the 40 series much earlier than expected. Bad news for 30-series buyers if they do; any such intent will be kept very much under wraps.
I’m 50/50 on cancelling my 3080 TUF OC order. Looking forward to AMDs launch.
Oh, my 3080 TUF OC queue position has gone from 167 to 153 since shops started taking orders….
RT is good. Current RT hardware is trash.
Wake me up when it becomes widely available in SKUs of every range, from €100 to €10,000, and when enabling it won't tank your FPS.
Until then it's irrelevant, because when there aren't enough models at all prices using this hardware, you won't see all the games using it and it won't become common.
Especially now that $399 consoles have RT, we need RT on PC at a normal price more than ever, not the garbage Nvidia offers.
Yep. As long as RT halves your framerate and resolution, you might as well get a next-gen console and save yourself a lot of money. Once you turn on RT, there isn't much difference.
We already have the solution with DLSS. RT will always lower your resolution. It’s always going to be the most hardware intensive thing. Soon hopefully it will simply be the only option because it will save a ton of time in game development.
DLSS is not supported in all games, so DLSS is not the solution; it's basically upscaling. And RT will always lower performance, obviously, but right now RT TANKS performance rather than just lowering it.
We need better hardware and more SKUs at accessible prices for the masses so the tech can be adopted and normalized. Until then it's almost irrelevant.
Consoles manage 1/4 res RT during a 4k scene. That’s 1080p RT at 4k. Good enough if you are playing fast paced games.
Devil May Cry has a 4K/30 mode with RT or 1080p/60 with it on. To me, for a launch game this is very impressive. Better than what the 2080 managed. I think it may improve a little over time.
Turing failed to deliver what Jensen promised. Should have been 1440p with RT on. Anyway, maybe the PS6 and XSX2 will get the job done properly?
You expect a 500 dollar console to perform like a 1200 dollar GPU?
You really have some personality issues if you’re getting this upset over video games.
Nvidia’s 3080 just manages a good RT experience at £650. Check out Quake 2 RTX, which can deliver 60+ FPS at 1440p using path tracing. I’m not a pro gamer though so I am happy with 60 FPS.
Consoles manage 1/4 res RT during a 4k scene. That’s 1080p RT at 4k. Good enough if you are playing fast paced games. I’m sure AMD’s desktop GPUs will manage a bit better, but looks as though they will lag behind Nvidia.
dude quake 2 came out in 1997
Right…it seems people are just starting out in to gaming…thinking Q2 is new age hahaha
I mentioned Quake 2 RTX, which is the most impressive display of RT using full scene denoised path tracing ‘dude’.
So when Digital Foundry talks about how impressive RT is in the new Spider-Man game, they are pushing a false narrative? Is it not as they are advertising it?
RT is in its infancy and we need a few generations (maybe 2/3 minimum?) to get some powerful hardware, get multiple SKUs across all price ranges and get more games supporting it, true.
However, it's a fact that compared to their last generation, the $399 conslows have tremendous hardware (Jaguar cores vs Zen 2 cores lol) which you can't get on PC for even double the price. This benefits us – PC gamers – because sadly most games in the market are made for conslows, so now games will actually make use of SSDs, RT hardware, etc.
So like it or not, I want the console garbage to improve, not so I can buy it. The better they get, the better ports we will get… I hope? Never underestimate the power of trash devs, I suppose.
AMD will be behind Nvidia in RT for sure, however it doesn't matter, I believe; hardly anyone plays with RT enabled anyway, so AMD being, idk, 30% behind in RT is irrelevant. And in the next RT gen or two there will be such a huge performance jump that the current RT hardware is gonna be irrelevant.
On the other hand, they're probably gonna be really close in raster performance, I feel; they look super confident this time around and some leaks make sense.
At the end of the day it all depends on the software tho, and AMD's Radeon drivers suck compared to Nvidia's. It could change tho with the 2021 Adrenalin… who knows.
Also, Nvidia has no availability this time around; the 3000 series is nowhere to be seen. I swear to God this lineup is a myth.
Remember PhysX cards? Now Nvidia sells RT cards; it's the same, a costly gimmick. And an exclusive one.
Ray tracing is not a gimmick; it's the natural progression of the tech, and everyone has been using it for years in movies, but they needed giant workstations to render it, even for Toy Story all those years back.
Reality isn't binary; RT is both a technical achievement and a colossal gimmick (in its current proprietary state).
It's not really proprietary since it's been channeled through DX.
… and Vulkan now has RT extensions.
DLSS adds detail and actually looks better than native resolution. While I get your feelings on not needing it in the first place (like I wish I could find the perfect headphone that doesn’t need EQ’ing) the fact is that it does actually look better and gives you ridiculous performance gains so I really have NO reason to hate on it or not want it.
Oh it’s coming…. and won’t have been rushed to market with a faulty design with bad capacitors to try to grab market share with a trash product. and i’m sure they have bought share in febreeze too at this point! LOL! wow.
wonder why both PS and XB both used AMD tech in their new gaming machines if it’s “trash” as you imply?
Because it’s cheaper. That’s why they always use AMD tech.
You do know that with the Ryzen 5000 series processors AMD has now regained the performance crown, right?
What a fking mess. They can't even meet half the demand for the original SKUs and they keep making more of them. They can't make a proper SKU with enough VRAM for 2021's standards and beyond.
Using almost the same amount of VRAM we had back in 2016 in a GPU made for late 2020 – basically 2021
“Low” prices in theory, but not in practice, because demand is tenfold the production.
BLEH
Bro, “half the demand”? Bro, it's more like 10-20% of the demand. I mean they've completed 10-20% of orders on etailers that do backordering. That number also doesn't account for the bunch of people F5-ing everything and not “backordering” the part.
You’re damn right. A fkin mess
I got stuck having to buy a 2080 when my previous GPU blew up on me. I sure as hell hope AMD has a card at least in competitive range by the time I look to upgrade in a year or two.
This is a clusterf**k at this point, and I just want AMD to be comparable to Nvidia, and I’ll happily jump ship.
ok AMD cultist
Bye.
Your comment screams of BS and bias. First off, why didn't you buy AMD in the first place if you like them so much? Secondly, how does the current 3xxx series debacle have ANYTHING to do with your 2080 GPU?
Also, AMD will never be comparable to Nvidia because Nvidia is better at what they do and pours way more money into R&D and creating new technologies and software.
You do understand English……… right?
I’m pretty sure AMD will be competitive in rasterization, i doubt in RT since its gonna be their first implementation, the problem is the Software.
Let’s hope AMD won’t fk it up like in Navi 1.
Looking more like I’m gonna skip this generation. The memory configurations of NV cards are ridiculous, no 12-16GB cards, c’mon. Going for AMD makes no sense for me (I have RTX 2070) now that DLSS 2.0 is getting traction in AAA titles and even some indies, unless they come up with a competing tech.
Not gonna happen; DLSS will remain limited to whatever games NVIDIA sponsors.
Not a feature to base your purchase on. Besides, your card is already history; new games will just get bug fixes from the driver, not performance.
Good thing NV sponsors plenty of games, I guess. And obviously I didn’t buy the card for DLSS; it was sh*t back in 2018. I bought it because I needed a new card. Now that 2.0 works great, no reason to upgrade for now, just a waste of money.
you’re gonna want one of the new gen cards if you want to be able to run microsofts new directStorage tech when it’s released.
Asus just teased RX 6000 series.
https://uploads.disquscdn.com/images/2db0d6015ddaffe1c7275b5caf01fc4b122492dc642905598651aea0307c6ea2.jpg
Nvidia is afraid of the power of the Xbox SX!!
RTX 3080 Remastered
Lmao
More like a 3070 remastered because performance will be below the 3080 still.
Keep giving me reasons to jump to AMD, hell, even Intel isn’t out of the question, as long as they don’t fall ‘completely’ on their face.
So tired of Nvidia and their antics.
You mean In”secure”tel? Sure, go right ahead. They no longer hold the crown; no reason to go Intel unless you want to run the new Avengers cash grab, I mean game, in all its glory.
You got me. In”secure”tel. Once I stop slapping my knee and laughing, I’ll give you all the kudos you deserve. Man, I never saw that one coming.
I would really just like to buy a card. That should be the biggest priority for Nvidia right now.