Custom AIB RTX 3090, RTX 3080, RTX 3070 Ampere GPU models’ prices unveiled, premium SKUs to cost more

It appears that prices for some of the custom AIB Ampere GeForce RTX 3090, RTX 3080 & RTX 3070 cards have been listed over at Overclockers.co.uk. The UK-based retailer has listed several custom AIB cards with varying prices, depending on the GPU model, PCB quality, and cooling design.

Some of the custom AIB cards are priced at the base MSRP, though several premium models are listed as well, and they carry a higher price tag. These premium models come with a higher factory overclock and feature a high-end cooling solution for the PCB.

Even though these models are listed online, not all of them will be available on launch day. We don’t know whether these AIB custom models will be available the same day as the Founders Edition SKUs, or if they will launch at a later date. The RTX 3080 Founders Edition (FE) model will launch on the 17th of September for $699. The RTX 3090 FE card, on the other hand, will launch on the 24th of September for $1,499, and lastly the GeForce RTX 3070 is planned for an October launch at a price of $499.

Several NVIDIA AIBs have showcased their custom GPU designs, including ASUS, Gigabyte AORUS, MSI, EVGA, ZOTAC, Inno3D, Palit, GALAX, Gainward, Colorful, PNY, EMTEK, etc. Most of the high-end models are bulky, come in triple-fan, 2.5- or triple-slot designs, and feature different cooling solutions for the PCB. The custom RTX 3090 appears to be a monstrous GPU. You can read Wccftech’s full roundup to get an idea of these custom models, since it is not possible to cover every detail for each AIB in one article.

With that said, let me list the prices of these models, as posted by Wccftech.

NVIDIA GeForce RTX 3090 Custom Model Price (coming via OCUK):

NVIDIA GeForce RTX 3080 Custom Model Price (coming via OCUK):

NVIDIA GeForce RTX 3070 Custom Model Price (coming via OCUK):

Update

TechPowerUp also has some official info on the international pricing of these cards. My guess is that this is the Founders Edition pricing.

UK: RTX 3070: GBP 469, RTX 3080: GBP 649, RTX 3090: GBP 1399

Europe: RTX 3070: EUR 499, RTX 3080: EUR 699, RTX 3090: EUR 1499 (this might vary a bit depending on local VAT)

Australia: RTX 3070: AUD 809, RTX 3080: AUD 1139, RTX 3090: AUD 2429

80 thoughts on “Custom AIB RTX 3090, RTX 3080, RTX 3070 Ampere GPU models’ prices unveiled, premium SKUs to cost more”

    1. We need to wait till the cards launch. Though I hope someone breaks the embargo and leaks some of the benches, lol.

      1. Digital Foundry is reporting 1.7-2x better performance on the RTX 3080 compared to the RTX 2080 in a variety of games (Borderlands 3, Control, Doom Eternal, Quake II RTX…), and that’s without Game Ready drivers.

    2. I’m going to wait till Tim and Steve from Hardware Unboxed YT channel get their hands on the new Ampere GPUs and thoroughly benchmark them. It’s also prudent and practical to wait for AMD’s new lineup to release so that they can be compared head to head when it comes to price-to-performance ratio.

  1. I would really like to see some detailed 3080 vs 3090 benchmarks. Since the 3080 is the same “core” with slightly lower specs, maybe the 3090 is useless overkill by comparison. Time will tell.

  2. Some of the 3080 listings don’t mention a DisplayPort; I assume that’s just a mistake and the cards will have DisplayPort?

      1. Hard not to be “overpriced” when the most expensive video card from your competition can barely scratch a non-Ti 1080 from 4 years ago. And with Ampere coming, AMD may as well be on suicide watch.

        1. Also, DLSS 2.0 is a joke; it has to be programmed into games. You can’t just apply it to any game without it being integrated into that game. AMD will come up with a better solution that doesn’t require per-game integration.

          1. The 1660 Super is a bad buy compared to the RX 580/RX 590; it’s only a tiny bit faster yet costs 100 dollars more, lol.

            AMD drivers are fine; I’ve had no issues with my RX 580, still going strong.

            NVIDIA is just overpriced for the performance they offer. It’s like saying “omg, my 200k supercar is faster than your 90k car,” yet that same company couldn’t make a better car at 90k.

          2. Radeon Image Sharpening will be expanded upon and be useful for all games, unlike DLSS 2.0.

          3. NVIDIA drivers are outdated; they’re still using the same UI from 15 years ago, and there’s a lag/delay turning things on and off, unlike AMD drivers.

        2. lol, NVIDIA only puts out these high-end cards so they can claim the performance crown. High-end cards aren’t what sells the most or makes NVIDIA the most money in the DIY market; that’s mid-range, and ATI/AMD makes the best mid-range cards. NVIDIA can never offer better performance at the same price tag in apples-to-apples comparisons, though.
          RDNA2 will be a lot better than Ampere, period, just like the RX 480/RX 580 was better than the GTX 1060.

          Look at the 2060 Super costing 400 dollars while the 5600 XT at 260 dollars has the same performance, lmao.

          Look at the 5700 XT, which can be had for 350 dollars; it can even outperform a 2080 Super that costs 700 dollars in certain games.

          1. Forza Horizon 4? Driver, my friend, graphics driver.

            This video is from July 2019. In August, a new GeForce driver was released. With the new driver, even the 2070 Super, not to mention the 2080 Ti, outperformed the 5700 XT.

        3. Also, NVIDIA DLSS 2.0 is a joke; it has to be added to games on a game-by-game basis. It can’t just be applied to any game like Radeon Image Sharpening can.

      2. Mid-range sells the most, and AMD/ATI makes the best mid-range cards. The only reason NVIDIA puts out these high-end cards is so they can claim the performance “crown”; these high-end cards aren’t what’s making NVIDIA money, though.

    1. Check the exact dimensions of the AIB models. Are you talking about this enclosure/cabinet? https://www.louqe.com/ghost-s1/#specstable

      Well, as per the specs, the GPU compartment is listed as dual-slot, with room for cards up to 145 x 45 x 305 mm. Check the length/width/height of the AIB card before purchasing, and measure how much free room there currently is in this cabinet/Louqe Ghost S1.

      Check the following GPU database for info on custom card dimensions and slot width from all the AIBs. The database is updated frequently, so check back if there isn’t any info yet for a particular SKU you are looking for. For reference, the RTX 3090 measures 313 mm in length, 138 mm in width, and features a triple-slot cooling solution.

      But yours is a Mini-ITX cabinet, so longer/thicker cards won’t fit easily (see the quick fit check below).

      https://www.techpowerup.com/gpu-specs/geforce-rtx-3090.c3622

      https://videocardz.net/nvidia-geforce-rtx-3090
      https://uploads.disquscdn.com/images/0b4486d80a6a8983189a938bec48d07c0a82665991837d01e482e30dd9fac223.png
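
      For a rough sanity check, here is a minimal Python sketch of that fit comparison, using the Ghost S1 clearance from the Louqe spec table and the reference RTX 3090 dimensions from the TechPowerUp entry above; the ~20 mm-per-slot thickness is my assumption, so treat the numbers as illustrative only.

```python
# Fit check: reference RTX 3090 dimensions vs. the Ghost S1's GPU clearance.
# Clearance (mm) from the Louqe spec table; card dims from TechPowerUp's entry.
# ASSUMPTION: ~20 mm per expansion slot, used to estimate card thickness.

CASE_MM = {"length": 305, "height": 145, "thickness": 45}      # Ghost S1 GPU room
CARD_MM = {"length": 313, "height": 138, "thickness": 3 * 20}  # triple-slot card

def fits(card: dict, case: dict) -> bool:
    """True only if every card dimension is within the case clearance."""
    return all(card[k] <= case[k] for k in case)

print(fits(CARD_MM, CASE_MM))  # False: 313 mm > 305 mm, and 60 mm > 45 mm
```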

  3. Update. TechPowerUp also has some official info on the international pricing of these cards. My guess is that this is the Founders Edition pricing.

    UK: RTX 3070: GBP 469, RTX 3080: GBP 649, RTX 3090: GBP 1399

    Europe: RTX 3070: EUR 499, RTX 3080: EUR 699, RTX 3090: EUR 1499 (this might vary a bit depending on local VAT)

    Australia: RTX 3070: AUD 809, RTX 3080: AUD 1139, RTX 3090: AUD 2429

    https://www.techpowerup.com/271628/nvidia-announces-geforce-ampere-rtx-3000-series-graphics-cards-over-10000-cuda-cores

    1. A couple of reasons: the Australian dollar is only worth about 73 US cents, and Australia automatically applies a 10% GST to goods, included in the listed price.

      The prices listed for goods and services in the USA don’t reflect the sales tax that applies at the time of purchase. For example, in the state where I live the sales tax is 9.25%, so a $1,500 card actually costs about $1,640 when purchased.
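
      To make that arithmetic concrete, here is a rough Python sketch; the 0.73 exchange rate and the 10% tax are taken from the comment above, so the results are approximate.

```python
# Rough reconstruction of the AU pricing arithmetic described above.
USD_PER_AUD = 0.73            # exchange rate quoted in the comment
AU_TAX = 0.10                 # ~10% tax baked into Australian listed prices
US_MSRP = {"RTX 3070": 499, "RTX 3080": 699, "RTX 3090": 1499}

for card, usd in US_MSRP.items():
    aud = usd / USD_PER_AUD * (1 + AU_TAX)   # convert to AUD, then add tax
    print(f"{card}: ~AUD {aud:,.0f}")

# Prints ~AUD 752 / ~AUD 1,053 / ~AUD 2,259, the same ballpark as the official
# AUD 809 / 1,139 / 2,429; the gap comes down to local margins and rounding.
```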

        1. I didn’t know about the AU tax added on top of the VAT. The thing I do know is that the government is going to get their tax money somehow. For example in my state their is no State Income Tax but property taxes are higher than normal and the State Sales Tax is 9.25%

    2. Are they really high? How much is a 2080 Ti in Australia?

      I’m asking because the Canadian dollar is almost at parity with the Australian, and in Canada the 2080 Ti sells for $1,600-1,800 + tax, some as high as $2,000 + tax.

      If the RTX 3070, which is reportedly faster than the 2080 Ti, is priced at around $800, that will be awesome.

  4. Do you guys think 10GB of VRAM will be enough for the 3080? I know programs often falsely report VRAM usage, but my 1080 Ti apparently uses up to 9.7GB in HZD and 9.5GB in the COD: WW2 campaign. What about Cyberpunk and other future games? I feel like they may use even more.

        1. Then I think 10GB of VRAM is going to be a bit short for your needs. It depends on the game being played, but future games may require more VRAM, so you might need a bit more memory.

          Though remember, the amount of VRAM filled is not the same as the amount actually consumed by the game.

          1. Exactly. REmake 2, for example, claims I’m using well over the 11GB limit of my 1080 Ti (11GB), yet that’s been proven to be false; it doesn’t actually consume anywhere near that much.

            I doubt NVIDIA would release brand new cards, using a new memory architecture, that are limited right out of the gate. People need to understand that VRAM size is not the ultimate be-all and end-all. There’s a reason why MS stopped touting the XSX’s VRAM after a few months and moved on to something else to tout.

            A lot of console folk atm think that VRAM size is the be-all and end-all, but forget how the memory itself is actually used, as opposed to its total size. Otherwise they’d believe false readings like the ones in REmake 2.

          2. It’s a common misconception that the amount of VRAM being used is the amount that is actually needed. Some engines fill VRAM with the textures needed for a level but don’t evict old textures that are no longer needed, simply because there is no reason to.

            You will know for certain when you don’t have enough VRAM, because the engine will start using system RAM, which is considerably slower than VRAM, and it will drag the FPS down.
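
            If you want to watch this on your own machine, NVIDIA’s NVML library (the pynvml Python bindings) reports device-wide VRAM allocation; keep in mind it shows what has been allocated, not what a game strictly needs. A minimal sketch:

```python
# Minimal VRAM monitor using NVIDIA's NVML bindings (pip install nvidia-ml-py).
# NVML reports memory *allocated* on the device, an upper bound rather than
# the amount a game actually requires (the misconception discussed above).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)        # first GPU in the system
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"VRAM used: {mem.used / 2**30:.1f} GiB of {mem.total / 2**30:.1f} GiB")
pynvml.nvmlShutdown()
```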

        2. If you need a card asap, I wish you luck, but smart money says wait just a bit longer. I honestly don’t care who wins as long as these prices keep coming down. I have the same ultrawide specs and a 4K QLED TV, and I’m pretty happy with my 2080 Super for now.
          If AMD has a 16GB card that hangs with a 20GB 3080 (whatever they call it), I will just go for whatever the best deal is on a high-quality card when I decide to upgrade.

          1. That is a wise comment. If a gamer is happy with their card, there is no reason to feel pressured to upgrade. I have a 2070 Super and game at 1440p. I’m happy with it even though I have to lower the settings in a few games.

            I plan to upgrade to the 4xxx series or an AMD card that can deliver the performance needed at that time.

  5. PS5/Xbox and PC using RDNA2 = the same architecture.

    RTX isn’t a shared ecosystem like RDNA2 is on console/PC.

    It’s an epic fail.

      1. There is no trolling; it’s just the truth. NVIDIA’s proprietary solutions are always expensive and always fail compared to AMD, which takes a more open-source approach to their stuff.

      2. He’s actually pretty good at trolling. Look at all the people responding to his nonsense and BS. He is a troll; ignore him and he will go away to another site looking for attention.

    1. AMD has even talked about this RDNA2 ecosystem. NVIDIA was scared of AMD; that’s the reason they had to jump the gun and launch ray tracing cards early.

      NVIDIA is failing just like G-Sync failed, and NVIDIA had to give in to FreeSync, lmao, trying to rebadge FreeSync monitors as “G-Sync Compatible”.

      Ray tracing doesn’t truly start until the RDNA2 ecosystem, with PS5/Xbox and RDNA2 for PC.

      1. Umm, no they didn’t, bro, lmao.

        NVIDIA was forced to jump into ray tracing early because AMD was gonna bring it to the masses with RDNA2 on PS5/Xbox/PC.

        NVIDIA always fails, man; I don’t think people realize this.

        1. AMD has already confirmed the high-end RDNA2 card will be a 4K 60 FPS card.

          NVIDIA can’t keep their prices cheap, because doing ray tracing entirely with hardware acceleration makes the GPU dies more expensive to produce per wafer. AMD’s ray tracing solution is cheaper; it’s a hybrid approach splitting the load between dedicated hardware and shaders, so AMD will be able to produce their GPUs at a cheaper price and pass that on to the consumer.

          Just like G-Sync is done via a hardware chip that needs to be put inside the monitor, raising the monitor’s cost; that’s why FreeSync dominated G-Sync, since it doesn’t need a chip inside the monitor.

          It will be the same with ray tracing; AMD’s solution is just cheaper and better.

          1. FreeSync does not need a “chip” inside the monitor? You clearly don’t know what you’re talking about.

          2. He’s right about that. NVIDIA commercializes everything, which drives prices up (licenses, etc.), whereas AMD is more for “open standards”. Also, NVIDIA-specific optimizations/tech benefit ONLY NVIDIA cards, but when AMD makes optimizations for its cards, they also benefit NVIDIA cards. This is one thing I like about AMD…

            Too bad only NVIDIA can compete on the high-end front… and we really do need competition on the GPU front.

        2. He’s right about one thing: NVIDIA wants to commercialize everything, unlike AMD (at least most of the time). AMD is more for making a “standard” where NVIDIA is more for “proprietary” stuff (commercializing it), and that’s basically been the worst thing about NVIDIA since forever. But now the RTX 3000 price/performance ratio is almost perfect; it’s a step in the right direction in that department. This is where NVIDIA was not that competitive, but now they are, which is good.

    2. Just block him. I’ve blocked all the overzealous AMD fanboys that have been infecting this site for over a year now.

  6. I’m excited for the 3080, but still curious about AMD’s answer. Maybe they’ll even force an NVIDIA price drop!

  7. So if I buy a 3080, I will be screwed by a 3080 Ti that might have 16 or 20 GB of VRAM.

    Nice one, NVIDIA, as always.

    I was ready to make the switch, but I will wait for AMD and the Ti version, then decide.

    1. I dunno; imagine the 3080 Ti costing 200 more. To me, 200 more wouldn’t make it worth it; I’d just go for the 3080.

      Also, why not go for the 3090 if you’re that specifically VRAM-hungry? I own a 1080 Ti and I find the 3080 perfect as an upgrade over it. I’m not going to get stuck in the “what if I wait for the next model, then the next model, then the next model, etc.” mindset (that becomes circular logic and never ends, and it’s something console folk are spouting about these new cards, so don’t fall into that cycle).

      1. I feel like I’ve seen plenty of games this gen that push 7+ GB when cranking up the settings. For me, I’m kinda worried that I’ll be gimped come next gen.

        1. I’m honestly not worried at all. It’d be suicide if NVIDIA made brand new cards, using a new memory architecture, and somehow, as soon as next gen starts, all those 10GB VRAM cards suddenly “can’t handle” next-gen games.

          I think some of you here are putting an insane amount of stock into the size of the VRAM pool, and not into how the new memory, let alone VRAM in general, is being used.

      2. The Ti will release close to the 3080; they are just waiting for AMD.

        There is no upgrade needed beyond 20GB of VRAM, as DXR at full spec will need that headroom.

    2. 20GB – they can’t make it 16.
      They get 20 by swapping the 1GB chips for 2GB chips; they can’t just remove or add chips, though, or mix them.
      I mean, they could do some of that, but it would make things way too complicated, and stuff would need to be redesigned.

      1. Why not? The number of chips corresponds to the memory interface.
        16GB would mean a 256-bit memory interface.
        Remember, the 3080 Ti can’t be very close to the 3090, otherwise no one will buy the 3090; if it’s going to be 20GB at the same speed as the 3090 for 1200, then it’s a loss.
        But 16GB, with speed in the middle between the 3080 and the 3090, is more realistic.
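
        For what it’s worth, the constraint you are both describing is that each GDDR6X chip sits on its own 32-bit channel, so capacity = (bus width / 32) x per-chip density. A small Python sketch of the possible configurations (the 3090 is the exception: it runs two chips per channel in clamshell mode to reach 24GB on its 384-bit bus):

```python
# VRAM capacity follows from the memory bus: each GDDR6X chip occupies a
# 32-bit channel, so chip count = bus_width / 32 and capacity = chips * density.
def vram_options(bus_width_bits: int, densities_gb=(1, 2)) -> dict:
    chips = bus_width_bits // 32
    return {f"{d}GB chips": f"{chips * d}GB" for d in densities_gb}

print(vram_options(320))  # 3080's 320-bit bus: {'1GB chips': '10GB', '2GB chips': '20GB'}
print(vram_options(256))  # a 256-bit cut-down: {'1GB chips': '8GB', '2GB chips': '16GB'}
```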

  8. I’m planning to upgrade this generation, now that it’s gen-2 RTX with a lot of architectural and performance improvements.
    Though I wish the 3080 had more VRAM, at least 12GB…

    1. That’s the thing: the rumor is that NVIDIA will release Ampere cards with more VRAM to counter AMD’s Navi 2 GPUs.

    2. Wait for the 20GB version.

      It is common practice for NVIDIA to wait for AMD to counter and then counter back, against AMD and their own customers on top of that.

    3. Wait for RDNA2 and/or the 20GB variant.
      AMD will have a 3080 competitor for sure, and it’s hard to imagine they would go for less than 12GB.

      1. A 3080 with at least 12GB, or RDNA2 that competes with the 3080… For the 3090 and similar I don’t really care; it’s much more expensive, and I’m not even on 8K or 4K. My goal is 2K, maxed out, at 144Hz.

  9. AORUS GeForce RTX 3080 Xtreme 10GB – £799.99
    AORUS GeForce RTX 3080 Master 10GB – £749.99

    Excuse me, what? Going by both cards’ descriptions, they are exactly the same, except one costs nearly £50 more…

    I’ll wait and see what Zotac has to offer, because I wasn’t exactly thrilled with my AORUS 1080 Ti.
