It appears that the Chinese Bilibili channel @TecLab has leaked alleged benchmark results for the GeForce RTX 3080 GPU. The leak contains both synthetic and gaming benchmark scores, but we will focus only on the gaming portion. The original link has been taken down, but you can still watch the video here.
But before we continue, do take these results with a grain of salt. Some of the scores have been compiled by @_rogame and @davideneco25320. The leaker claims to have one of the Ampere GPU samples and a working driver, both still under NDA; he reportedly used press driver version 456.16, the build issued for GeForce RTX 3080 reviews.
To quote TecLab:
“RTX 3080 has 30% (performance) increase compared to 2080Ti and 50% increase compared to 2080S. Power consumption of the whole machine is 500W.”
Coming to the gaming benchmarks, the following games were tested: Far Cry 5, Borderlands 3, Horizon Zero Dawn, Assassin’s Creed Odyssey, Forza Horizon 4, Shadow of the Tomb Raider, Control (DLSS on/off), and lastly Death Stranding (DLSS on/off). All of these are 4K in-game benchmarks. From the looks of it, the RTX 3080 appears to be 48-62% faster than the RTX 2080 SUPER.
RTX 3080 4K Gaming vs 2080 SUPER:
- Far Cry 5 +62%
- Borderlands 3 +56%
- AC Odyssey +48%
- Forza Horizon 4 +48%
— _rogame 🇵🇸 (@_rogame) September 9, 2020
RTX 3080 vs 2080 Ti:
- BL3: 1.34x
- Doom Eternal: 1.34x
- RDR2: 1.30x
RTX 3070 vs 2080 Ti:
- BL3: 0.97x
- Doom Eternal: 1.00x
- RDR2: 1.01x
https://twitter.com/davideneco25320/status/1303693882975227904
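The multipliers in the 3080/3070 list above and the percentage uplifts quoted earlier are the same comparison expressed two ways. A minimal sketch of the conversion, using the leak's Borderlands 3 figure and the ~44 FPS 2080 Ti number cited in the comments (both unverified, used here purely for illustration):

```python
def ratio(new_fps: float, old_fps: float) -> float:
    """Relative speedup as a multiplier, e.g. 1.34x."""
    return new_fps / old_fps

def uplift_pct(new_fps: float, old_fps: float) -> float:
    """Relative speedup as a percentage uplift, e.g. +34%."""
    return (new_fps / old_fps - 1.0) * 100.0

# Borderlands 3 at 4K: the leak's 61 FPS for the RTX 3080 against
# the ~44 FPS commonly reported for the RTX 2080 Ti.
print(f"{ratio(61, 44):.2f}x")        # multiplier form
print(f"+{uplift_pct(61, 44):.0f}%")  # percentage form
```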
Death Stranding also hits 100 FPS at 4K with DLSS off, and 160-170 FPS with DLSS on.
RTX 3080:
- Far Cry 5: 100 FPS (4K)
- Borderlands 3: 61 FPS (4K)
- Horizon Zero Dawn: 76 FPS (4K)
- Assassin’s Creed Odyssey: 67 FPS (4K)
- Forza Horizon 4: 150 FPS (4K)
- Shadow of the Tomb Raider: RTX with DLSS 100 FPS, RTX without DLSS 84 FPS (4K)
- Control: DLSS on 100 FPS, DLSS off 50-60 FPS (4K)
https://twitter.com/davideneco25320/status/1303687388988833792
Hello, my name is Nick Richardson. I’ve been an avid PC and tech fan since the good old days of the RIVA TNT2 and 3dfx Interactive “Voodoo” gaming cards. I mostly play first-person shooters, and I’ve been a die-hard fan of the FPS genre since the old Doom and Wolfenstein days.
Music has always been my passion and my roots, but I started gaming casually when I was young on NVIDIA’s GeForce3 series of cards. I’m by no means a hardcore gamer, but I love everything related to PCs, games, and technology in general. I’ve been involved with many indie metal bands worldwide and have helped promote their albums with record labels. I’m a very broad-minded, down-to-earth guy. Music is my inner expression and soul.
Contact: Email

Interesting performance jump nonetheless….
Thanks METAL MESSIAH! Some good stuff you have been posting lately. Much appreciated.
Which CPU did he pair this GPU with? Also, what’s the refresh rate of the monitor?
The Core i9 10900K CPU. Don’t know about the refresh rate though.
4K is trash; 1080p at 144Hz is the right way to go.
I agree
If you like aliasing in your games, sure. 1080p is the way to go.
Please. 4K HDR is amazing for single-player games. 4K 144Hz is doable now with DLSS. There is no reason to play at 1440p now with DLSS. Just need to wait till the monitors are cheaper or you find a deal.
4K 60 FPS is overrated…
I’d rather play on a 144Hz or 120Hz 1080p monitor, or even better one with ULMB, than go back to 60Hz at any resolution.
I don’t care about resolution; I only care about frame rate and refresh rate.
Going to get the 360Hz monitor?
1440p@144hz is actually the sweet spot IMO.
I agree.
I’d love to get a 4K, 120 FPS+, FreeSync HDR monitor with HDMI 2.1 support for consoles and my PC, but they aren’t even out yet and will be damn expensive… and current 4K monitors are limited to 98Hz if you run 4:4:4 10-bit, as DisplayPort is insufficient.
I use a 1080p 240Hz G-Sync IPS; that’s the way to go.
Underwhelming considering the hype, but the price makes them really good. I was expecting more than a 30% increase vs the 2080 Ti.
It was never going to be more than a 30% increase.
Yeah, 60 FPS isn’t something I would consider a “win” for 4K gaming. Especially when we’re on the brink of next gen and probably more demanding games.
I mean, why 4K though? 2K at 165Hz is pretty juicy, no?
that becomes a cpu limit NOT a gpu limit.
Hmm, it depends on the CPU, but it’s mostly unlikely. If you’re spending $1,500 on a GPU and still have a bad CPU, you deserve it.
Are you saying it’s capable of that, or is that just what you were hoping for? It sure as hell is not even remotely capable of 4K 144Hz in many if not most modern games now, and games are only getting more demanding. How clueless do you have to be to think the 3080 can do 4K 144Hz maxed out when the 2080 Ti can’t even do half that in many games?
Won’t happen… mark my words. I’d really like it, but I highly doubt it.
I’m aiming for 1440p 144Hz with a 120 FPS minimum. This generation will most likely make that happen.
Kyne,
The RTX 3080 will play at 4K@144 FPS for up to four years?
Sorry, you are really misinformed. Some games, maybe, especially as DLSS improves, but game developers will keep raising visual quality too, such as with more ray-tracing, which is why hardware requirements keep going up.
If you’re talking medium settings, it’s a moot point, since you shouldn’t be running at 4K only to drop the quality.
For myself, all I ever wanted was to be able to play all my games that support it at 5760×1080 and 60 FPS on Ultra. The 3080 will 100% do that without my having to get a 3090 for a truckload of cash.
I mean, sure, once I get 3x 4K OLED screens for a new NVIDIA Surround setup I’m sure I will need more GPU power, but I’m not sure I really want to run 11520×2160, LOL. Honestly, if anything I will run 5760×1080 at 120 FPS, but I am more than happy to stick with 60 FPS. My resolution is less than 4K anyway, so I will be fine for years, and 120 should work too!
Also, I would wait until someone benches these cards on PCIe Gen 4 with a new motherboard and CPU, for I have a feeling the benchmarks will be even better. I’d love to see a new Zen 3 system running benches on both the 3080 and 3090, or even on a 3970X with a Gen 4 NVMe drive.
Three screens aren’t 3x as demanding; it’s probably closer to 2x on average, since the image is much wider and less of the content sits at the edges, so you can probably just halve the FPS that 4K benchmarks show (and possibly DLSS will scale well)… I really hate triple monitors for gaming, though… why not consider a single ultrawide?
The NVIDIA Doom video comparing a 3080 with a 2080 Ti shows between a 50% and 70% increase, so these results don’t seem accurate. The Digital Foundry video shows increases between 70% and more than 100% over a regular 2080, which means at least 50% compared to a 2080 Ti. These are some funky Chinese results.
The Doom video had plenty of sections where the increase was more like 40%, even lower.
No it didn’t. Go back and look at the video; it’s below 50% on average. And those 2080 Ti numbers they were showing in Doom Eternal are quite a bit lower than I have seen from tons of other users on the same settings.
It depends on the scene. In the DF video they couldn’t show the exact framerate, but they showed the percentage difference between a regular 2080 and a 3080: from 70% all the way to over 200%.
Now, a 2080 Ti is only around 25% faster than a 2080; it’s not that big of a jump. So even at the low end of that range (+70% over a 2080), that works out to the 3080 being roughly 36% more powerful than a 2080 Ti, and far more at the high end.
Some games will perform better than others as always.
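The conversion in that comment is multiplicative, not additive: if the 3080 is some multiple of a 2080, and the 2080 Ti is itself a multiple of a 2080, the two ratios divide. A minimal sketch using the commenter's own figures (the 1.25x estimate for the 2080 Ti over the 2080 is theirs, not a verified number):

```python
def uplift_vs_ti(gain_vs_2080: float, ti_vs_2080: float = 1.25) -> float:
    """Percent uplift of a card over the 2080 Ti, given both cards'
    multipliers relative to a plain RTX 2080."""
    return (gain_vs_2080 / ti_vs_2080 - 1.0) * 100.0

# The claimed low end (+70% over a 2080) and high end (+200%):
print(f"+{uplift_vs_ti(1.70):.0f}%")  # low end vs the 2080 Ti
print(f"+{uplift_vs_ti(3.00):.0f}%")  # high end vs the 2080 Ti
```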
They used a game which, like its predecessor, is optimized far more than the average game. Not really the best game to use benchmark-wise. By the looks of it, that was done on purpose, if these results are anything to go by.
Doom is highly optimized, so it’s a best-case scenario. Digital Foundry themselves said to wait for third-party benchmarks because the games they tested were dictated by NVIDIA.
This is what I was afraid of. NVIDIA played it well, I’ll give them that.
Considering upgrading my 1080 Ti Founders. It’s still amazing, though. But I guess this means a new case, new mobo, new RAM, basically a new PC, right?
While you’re at it, yea 🙂
Not at all
These are games from 2018 to 2020; now imagine next-gen titles cutting this performance in half at 4K, and these cards become obsolete, hype-wise…
And then they will announce the 3rd gen RTX..
Q3 2022: RTX 4090, RTX 4080, RTX 4070
Q2 2023: RTX 4080 Ti, RTX 4070 Ti, RTX 4060 Ti
with a performance bump of 25-30% compared to the previous-gen cards.
“…we benchmarked Cyberpunk 2077 version 4.8 with NVIDIA driver 889.43 from March 2023 at 1080p, 1440p, 4K, and 8K…
with the newest NVIDIA tech, SDDLSs 4.2, and here are the results…
1080p 37 %
1440p 28 %
4K 17%
8K 10%
…..”
Which is why Ampere is pointless: if you truly want to experience next-gen, instead of the pre-next-gen crap we received this year and potentially next, then wait for Hopper/RDNA3. Ampere is only worthwhile if you want to play current titles at a much higher resolution than Turing can handle. DLSS is improving, but it most likely won’t be fully fleshed out until Hopper.
Jesus, everybody said “don’t buy the RTX 2000 series” due to the high price and ray-tracing being barely used (which I agreed with for the most part), and now it’s “wait some more”… I’m not sure why you think DLSS needs a new architecture. DLSS 2.0 has already improved, and it’s not even the hardware that’s the main issue; it’s simply refining the algorithm that’s trained on the supercomputer. All a newer architecture is going to do is increase how much upscaling you can do (how fast). Possibly they’ll have a dedicated chip or section of the GPU that does nothing but upscale each frame, but again, DLSS on Ampere is going to get damn impressive.
With that take, the 2000 series and lower will be worthless. A 75% performance cut at 4K…
I don’t know what you are talking about with “next gen”, as PC is always next gen. Have you seen the PS5/Xbox Series X teasers? They look like 2018-2020 games running at high 4K graphics settings. Bright Memory is a good example. All that post-processing may look good in screenshots, but in motion even Shadow Warrior looks far better.
I think you’ll be surprised at how good some games will end up looking on console when they start utilizing added ray-tracing, AI upscaling, SSD asset streaming, mesh shading, sampler feedback, etc. Games coming to PC won’t be optimized for the less than 1% of the market who have high-end cards. Some games on PS5 will be exclusive to that machine and coded to leverage it in a way you won’t see on PC, beyond brute-forcing with a much more powerful machine…
Not saying every game will be like that but probably the best looking game will come to PS5 in the next year or two.
I’m guessing old games will not take full advantage of new hardware.
It’s the same reason why SOTR plays much better than ROTR: better game optimization.
Right. There are many features that old games can’t benefit from: VRS (variable-rate shading), mesh shading, sampler feedback, DLSS (though it may come to all games with TAA), ray-tracing of course, SSD streaming of assets (no game uses it yet)… and of course old games aren’t using the general architecture as efficiently as they could, either.
On top of this, there are driver optimizations for newer games that don’t usually include optimizations for older hardware.
IDIOT
It’s quite the jump. Glad I sold my 2080; I’m getting a 3080/3090.
Yes, I bought 2 RTX 2080 Tis for the price of 1 RTX 3080 and will still have better performance than the RTX 3080… see you in 2022 when you’re going to sell your RTX 3080 to me for €250.
SLI…
Lol, see you when your SLI profiles aren’t giving you 100%/100% GPU usage…
SLI and Crossfire are dead and gone.
Pretty much yes. Hence my comment.
I’ll just game at 1440p. 4K is still just too much for gaming atm, lol. Especially with next-gen games that will look even better in the next few years.
Gamers who chase after 4K will spend a fortune on cards. 1440p is where it’s at.
4K doesn’t always look much better than 2560×1440 anyway. Many games have great anti-aliasing. Some don’t though, but generally speaking aliasing is becoming less of an issue as time goes on.
Obviously 4K sometimes is a big improvement. But I was staring at Shadow of the Tomb Raider and thought, “would 4K improve this that much?” Meh.
Cyberpunk at 4K/ultra setting will be a great test, assuming the port is good of course.
How much do you wanna bet it won’t be, lol! Cyberpunk looks cool and all, but CDPR doesn’t exactly have a good record in terms of their games being technically efficient at launch. In fact, their record is horrible, lol. I expect to be able to play a decently optimized build of Cyberpunk hopefully sometime in Q1 of next year, after the first dozen or so updates.
You can easily spot the bad ports by the lower FPS.
Downgraded GPUs?
Go look at some of the rumor videos; most of them said NVIDIA was aiming for it to be 50% better than a 2080, which comes out to around 20-30% better than a 2080 Ti.
Borderlands 3 : 61 FPS 4K
Far cry 5 : 100 FPS 4K
The RTX 2080 Ti averages 44 FPS in Borderlands 3 benchmark test, if you are curious.
Just shows how severely unoptimized that game is.
NVIDIA sure is twisting those efficiency numbers. These cards are power hogs with only a tiny increase in performance per watt.
DAMN!!! September 17th, 2020 will go down in history as the official start of 4K gaming taking over the 1440p de facto throne.
1440p isn’t the de facto king; 1080p is by far the most used monitor resolution. 4K isn’t even relevant to gamers. When did it come out? Like 8 years ago, and the adoption rate is still around 2%.
4K adoption has been low with PC gamers due to A) price (it’s remarkable how many members of the “master race” are el cheapos) and B) lack of hardware to push modern games at 4K at at least 60 FPS. Personally, I’m a couch PC user; my PC has been hooked up to a 4K display in my living room since around 2017. And while 1440p has looked decent enough, I’m very happy to be able to get a 3080 and finally run games at a decent clip in 4K. Now that the hardware is getting to the point of easily handling it, we might see 4K adoption go up.
PCMR was a joke all along, yet people are too obtuse to realize that, it’s basically a cult that people take too seriously when in reality nobody bloody cares what PC you have.
For the PC gamers, get this: look at the monkey, hahahaha.
Go change your diaper, child, hahahaha!!!
I love how so many know-it-alls in here are looking at these results and treating them as if they were verified. Hilarious.
It aligns with Nvidia’s DOOM Eternal benchmark.
Current games maxed out at 4K use less than 8GB of VRAM. And when RTX IO takes off and assets can be moved speedily from storage to VRAM on the fly, like on the new consoles, it will matter even less.
500W with what other hardware?
Who knows, though if I had to guess it would be 150W + 350W, where the 350W is the graphics card’s power draw… if that’s accurate, and it does seem likely, then that’s about 200W more than what I draw when gaming with my R9 3900X + GTX 1080… my GTX 1080 draws about 175W.
Power draw is an issue for me solely because my room gets hot about half the year, so any extra heat is unwanted (hot when the air conditioning runs, or in winter because the window is closed).
Are you assuming that because of the game consoles?
I wouldn’t assume that’s the case. The Xbox One/PS4 series use an AMD GPU based on the GCN architecture, and that didn’t translate into any desktop advantage for AMD cards. While I’m sure the architecture is closer this time between console and RDNA2, AMD is also at a disadvantage in having a much smaller driver development team to optimize for new games.