
First leaked NVIDIA GeForce RTX 3090 Ampere GPU gaming benchmarks

It appears that we have some preliminary gaming benchmarks of the upcoming RTX 3090, the Titan-class flagship GPU. TecLab has once again broken an embargo, following its earlier RTX 3080 review leak.

The Chinese Bilibili channel @TecLab has just leaked some alleged benchmark results for the GeForce RTX 3090 GPU. There are both synthetic and gaming benchmark scores, but we will focus only on the gaming part of this leak. Before I continue, do take these results and FPS scores with a grain of salt.

The test was done on the same system as before, using an Intel Core i9-10900K CPU with DDR4 memory clocked at 4133 MHz. Coming to the gaming benchmarks, the following games were tested: Far Cry 5, Borderlands 3, Horizon Zero Dawn, Assassin’s Creed Odyssey, Forza Horizon 4, Shadow of the Tomb Raider (RTX/DLSS on/off), Control (DLSS on/off), and lastly Death Stranding (DLSS on/off). All of these are 4K in-game benchmarks.

NVIDIA RTX3090 leaked benchmarks-1

In the above system specs, the line just before the RTX 3080 translates to “10-thousand-yuan flagship graphics card”, which is the RTX 3090. The reviewer never refers to the RTX 3090 by its actual name; instead, the leaker pits the “10-thousand-yuan flagship” against the “5-thousand-yuan flagship”, i.e., the $1,500 RTX 3090 versus the $700 RTX 3080, respectively.

Seriously, why can’t this guy just show some solid PROOF, like a GPU-Z screenshot, to demonstrate that he really owns an RTX 3090? Anyway, overall it appears that the RTX 3090 is only about 10% faster than the RTX 3080. These benchmarks may have been run on early GPU drivers, though, so the final results are going to vary. Just 10% faster seems a bit off, at least in my opinion.
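To see why a flat 10% gap looks low, here is a quick back-of-the-envelope sketch in Python, purely illustrative, using NVIDIA's published spec-sheet figures for the two cards (CUDA core counts, boost clocks, and approximate memory bandwidth). Raw throughput ratios like these set a rough ceiling, not a prediction; real games rarely scale linearly with shader count.

```python
# Naive spec-sheet comparison of the RTX 3090 vs. the RTX 3080.
# Figures are NVIDIA's published launch specs (bandwidth rounded);
# this is an upper-bound sketch, not a performance prediction.

specs = {
    # card: (CUDA cores, boost clock in GHz, memory bandwidth in GB/s)
    "RTX 3080": (8704, 1.710, 760),
    "RTX 3090": (10496, 1.695, 936),
}

cores_80, clock_80, bw_80 = specs["RTX 3080"]
cores_90, clock_90, bw_90 = specs["RTX 3090"]

# Theoretical shader throughput scales with cores x clock.
compute_uplift = (cores_90 * clock_90) / (cores_80 * clock_80) - 1
bandwidth_uplift = bw_90 / bw_80 - 1

print(f"Shader throughput uplift: {compute_uplift:.1%}")   # ~19.5%
print(f"Memory bandwidth uplift:  {bandwidth_uplift:.1%}")  # ~23.2%
```

On paper, then, something closer to 20% separates the two cards, which is why a measured 10% gap reads as driver-limited or bottlenecked elsewhere rather than a true reflection of the silicon.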

Below you can find the chart as compiled by VideoCardz, to give you a better idea of the performance difference between these two flagship GPUs. Again, do take these results and FPS scores with a grain of salt. The small gap could also be due to poor scaling.

NVIDIA RTX3090 leaked benchmarks-2

Far Cry 5 / Forza Horizon 4:

NVIDIA RTX3090 leaked benchmarks-3
NVIDIA RTX3090 leaked benchmarks-4

Assassin’s Creed Odyssey / Horizon Zero Dawn:

NVIDIA RTX3090 leaked benchmarks-5
NVIDIA RTX3090 leaked benchmarks-6

Shadow of the Tomb Raider (RTX/DLSS On/Off):

NVIDIA RTX3090 leaked benchmarks-7
NVIDIA RTX3090 leaked benchmarks-8

Borderlands 3:

NVIDIA RTX3090 leaked benchmarks-9

Death Stranding (DLSS Off/On; note that the images here are tagged incorrectly):

NVIDIA RTX3090 leaked benchmarks-10
NVIDIA RTX3090 leaked benchmarks-11

Control (DLSS On/Off):

NVIDIA RTX3090 leaked benchmarks-12
NVIDIA RTX3090 leaked benchmarks-13

65 thoughts on “First leaked NVIDIA GeForce RTX 3090 Ampere GPU gaming benchmarks”

      1. This is Fermi all over again: compared to the 2080 Ti, you get 30% more performance for 30% more power consumption. So much for 8 nm…

      1. How so? I wanna hear your take. This is the first time I’ve seen anyone criticizing “Nvidia The Saviors” 3000 lineup. I’m eager to hear it. Thanks.

        1. The 3000 series has subpar power efficiency; in fact, it is dismal when you consider it is on 8 nm. The 3080 is on average 50-60 percent faster than the 2080 at 4K (80-something percent in ideal scenarios) but consumes around 100+ W more, and it has a 10 GB framebuffer, which WILL become a bottleneck at 4K in the not-too-distant future.

          The Ampere cards were overhyped by nVidia. They are good performers, especially given that the 2000 series was a dud in rasterized performance compared to Pascal (except for the 2080 Ti).

          But groundbreaking these Ampere cards are not; heck, even the RT performance is perhaps only 15% more efficient/powerful than the last generation of cards, and it is carried more by the uplift in rasterization.

          1. There is quite an uplift in RT performance, actually, but you need most of the scene done in RT. With the hybrid rendering we are seeing right now? Not so much. Hence why in pro apps such as Blender (with RT) there is quite a significant uplift on Ampere vs. Turing.

    1. I own a 2080 Ti and bought it the day it was released. It pretty much gives me the performance I want at 3440 × 1440, from 90 to 120 FPS; sometimes a bit less, but for the most part I’m close to where I want to be.

      I really want to get into a 4K 21:9 monitor, but my card would obviously fall well below my desired framerate target of 100+ FPS. That means I’d need the 3090 to get closer to keeping my framerate target at 4K. Is this pathological? I don’t want it simply because it’s “the best”, and I don’t care about bragging rights, but if you really want 4K at 120+ Hz, you can’t expect it in newer games without a 3080 or 3090. If I were happy with 4K60, I could probably stick with my current 2080 Ti. Is it still pathological for me to want the 3090?

      And I have a hard time believing the 3090 has only a 10% performance lead over the 3080. The 20% increase in CUDA cores alone should guarantee it a little more than that, plus there’s the extra bandwidth of the wider memory bus. This benchmark is almost certainly lacking the appropriate drivers NVIDIA has yet to release for the 3090, and I’d expect, at the very least, a 30-40% lead over the 3080, especially given the huge price difference. This isn’t the usual $150 premium over the xx80 for the xx80 Ti of past generations. They are still marketing this card entirely at gamers, though, not prosumers, so I don’t believe the cost increase is for prosumer features alone.

  1. Another leak popped up on VideoCardz yesterday, I think: about 20% in synthetic benchmarks. All in all, a 2× price tag for 20%? If that’s the case, I’m not touching this. They’re trying to profit from the fact that people are reluctant to buy a 10 GB VRAM GPU, so they marketed this one at 24 GB of VRAM; people who can’t wait, for whatever reason, will go for the 24 GB.

    There are also medical, scientific, and other reasons to get the 3090; I was only talking about the gaming side.

    1. People rush to buy products right after launch, which is the worst time to do it: inflated prices due to lack of stock, bugs, and missing features due to the lack of initial competition. If you want better models from both AMD and Nvidia, you should wait at least until AMD’s releases.

          1. Lmao, don’t let the downvotes get to you! You are a fine young person. You’ll make it out of this alive.

            From my POV it is insane indeed, cuz I’m poor asfk, bruh.

          2. Hey, the downvotes amuse me! They’re typically all from the same small bunch of obsessive clowns and their sock accounts, intent on following me around and seemingly wishing to advertise how triggered they are, lol.

          1. As if every 20+-year-old gamer makes 20k a year and lives in their mama’s basement.

            Making 80k/year, which is not that much money depending on your priorities, could give you the buying power needed for this kind of GPU.

    2. Nvidia has done this from the beginning with the Titans. They released the first Kepler Titan for $1,000 and left people doubting that there would be a 780 Ti. Quite a few people bought the Kepler Titan, and later on Nvidia released the 780 Ti for $650. There was considerable buyer’s remorse among the Titan buyers.

  2. I get that this is a leak, and I get that people expected more out of the 3080. But with only a 10% difference, that leaves no room for a 3080 Ti; what would that be, a 5%-difference card? Would Nvidia really do that to themselves? They like three things: chewing bubblegum, $$$$$$, and undercutting AMD at every corner, and they’re out of bubblegum.

    1. I doubt the 3080 Super/Ti will be faster; it will just have more VRAM. I know the TDP will be the same, so I don’t expect a performance bump, maybe a slightly higher factory clock.

      The real question is: if the 3080 can handle 4K easily on 10 GB, why would anyone need a 20 GB framebuffer? “Prosumers” will get either the 3090 or a Quadro.

  3. I have some serious doubts that the 3070 is going to end up being more powerful than a 2080 Ti, as was stated in the presentation. But we will see…

      1. Jesus Christ, man, these Robo-Fernando memes are too much. Never have I seen one person on a site inspire so many memes about himself. This was priceless.

  4. I don’t know how anyone could justify the higher cost of the 3090 when it looks like it’s only giving you a 10 to 14 percent increase in FPS. I think I’ll just wait for the 3080 Super/Ti; seems to me that will be the better buy.

  5. 10-15% more performance than the 3080 at double the price? Emmm… no thanks! Shunt-mod a 3080 for a 10% performance increase? Yes please!

    1. Even at 30% more, it’s still a huge meh for that kind of money. Might be just me, though. Seeing how last year’s Titan is dirt cheap now really makes me wonder why people buy stuff on release day at those outrageous prices. I get the 3080, but the 3090? Sure, if you are made of money. Spending above 1K on one single part = GG.

      Technically speaking, would even 1% of PC users buy the card if it gave 60% more performance at $1,500? Fk it, let’s make it 100% more! I doubt it, which is kind of why this generation is so well priced: nobody wanted to buy a 2080 Ti.

      Steam statistics don’t lie, yo.

  6. I’d rather buy a 3080 and, after 2 years, buy the 4080; that way I’ll sell my 3080 and add some cash to get the 4080.

    This would be more ideal for me, as I’d save money and gain better performance.

    1. Having to buy a new “flagship” every 2 years makes you an nVidia hamster. Flagship cards should be able to last more than 2 years, but the 3080 is perhaps not as “flagship” as nVidia wants you to believe.

      1. I know what you mean, but to be honest, the price difference for me is only $200, because people in my country sell their used graphics cards at close to new-card prices, thanks to the local market. I buy my graphics cards abroad and sell them after 2 years of use. Yes, these cards are meant to last at least 4 years, but that’s when the huge price drop happens, so I find that paying $200 every 2 years isn’t so bad.

        1. If you can sell a second-hand card for only $200 less than new after two years, that is pretty damn impressive!

          That would be impossible in my country, where you would need to slash around 33%, and often even more, off a flagship card’s original price after two years to have a chance of selling it.

          1. Yeah, because I buy the card for $699 and, let’s say, add another $38 for delivery to my country. Here they sell Zotac cards for $904, to be exact; after 2 years the price will be around $650, and I will sell mine for $500.

          2. If you paid $904 and sell it for around $500, you still lose a hefty 45% of what you paid for it ($404 of the $904), so in that regard you aren’t really just spending $200 every 2 years; you are also losing an extra $200 in the process.

      1. 24 GB of VRAM would be very useful for 4K gaming, which is the only resolution that truly benefits from a 3080 or 3090 upgrade.

      2. Yeah, it seems to be more for running something like GPT-2 under CUDA in a server setting, since with 24 GB you could fit two instances on one card.

      3. Meanwhile, the 3070 Ti/Super with 16 GB and the 3080 with 20 GB of VRAM are coming soon… so I don’t see how that much of a markup, just for double the VRAM and some more shaders, is worth it when all you get is a pitiful 10% uplift in performance. If anything, this launch is yet another slap in the face to anybody who buys now instead of waiting a few months.

  7. The 3090 with all the extra VRAM seems like a waste of money, since no game is going to utilize it at all. The 3080 definitely seems like the sweet spot for price/performance.

  8. 8-10% faster on average… and a ~115% price bump, really?!
    The 3090 is a real dud judging from these numbers! I guess the only real boost you get is in your credit card statement, with that $800 more on the MSRP compared to the 3080, if you are dumb enough to buy it for gaming!

  9. Pretty much the same story as with Turing. The Titan RTX had a small performance boost, a larger memory bus width, and an increase in VRAM over the 2080 Ti, and it cost twice as much as a 2080 Ti. I have never considered the Titans to be gaming cards, from the very first Kepler Titan on, although Nvidia does market them as such. I see the Titans as a poor man’s workstation card.
