In 2018, NVIDIA released the high-end model of its Turing GPU lineup, the NVIDIA GeForce RTX2080Ti. Back then, NVIDIA advertised that GPU as a 4K/60fps graphics card. Thus, we’ve decided to collect in a single article all of the games we’ve benchmarked these past two years, and see whether this GPU was able to achieve that initial goal.
For these benchmarks, we used an Intel i9 9900K with 16GB of DDR4 at 3600MHz. We also used the latest version of Windows 10.
Do note that in this article we’ll be focusing on pure rasterization performance. We’ve also excluded any DLSS results, as the earlier versions of this tech weren’t that good. With DLSS 2.0, NVIDIA has managed to achieve something extraordinary. However, there are currently only five games that support DLSS 2.0 and these are: CONTROL, Mechwarrior 5, Wolfenstein Youngblood, Death Stranding and Deliver Us The Moon.
We also won’t focus on this GPU’s Ray Tracing support/performance. Similarly to DLSS, there are only a few games that support real-time ray tracing. Among them, there are only two that offer a full ray/path tracing renderer, and these are Minecraft RTX and Quake 2 RTX. Both of them look absolutely stunning, and will undoubtedly give you a glimpse at the future of PC gaming graphics. However, the overall adoption of real-time ray tracing wasn’t that high these past two years.
It’s also worth explaining what 4K/60fps means to us and to companies. For us, a GPU that has a minimum framerate of 60fps in 4K is a GPU that can be truly described as a 4K/60fps graphics card. However, most companies advertise their GPUs as “4K/60fps” when their average framerate hits 60fps or above (regardless of whether there are drops below 60fps in some scenes). Therefore, keep that in mind when viewing the following results.
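To make that distinction concrete, here is a minimal sketch (hypothetical frame-time numbers and illustrative Python, not our actual benchmarking tools) of how an average framerate can clear 60fps while the minimum does not:

```python
# Hypothetical frame times (in milliseconds) captured during one benchmark run.
frame_times_ms = [14.0, 14.5, 13.8, 15.0, 22.0, 14.2, 14.8, 21.0, 14.4, 14.6]

fps_per_frame = [1000.0 / ft for ft in frame_times_ms]
average_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
minimum_fps = min(fps_per_frame)

# A marketing-style "4K/60fps" claim only checks the average;
# our stricter definition also requires the minimum to stay at or above 60fps.
print(f"average: {average_fps:.1f}fps, minimum: {minimum_fps:.1f}fps")
print("4K/60fps by average:", average_fps >= 60)
print("4K/60fps by our definition:", minimum_fps >= 60)
```

For this made-up run the two checks disagree (the average is just above 63fps while the minimum dips to around 45fps), and that gap is exactly why the same GPU can be “4K/60fps” in a press release and not in our tables.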
So, with all that out of the way, let’s see how the NVIDIA GeForce RTX2080Ti performed in 83 PC games on 4K/Ultra settings. Needless to say, you can find below benchmarks from a lot of triple-A games, from Battlefield 5 and Doom Eternal to Red Dead Redemption 2 and Horizon Zero Dawn.
As we can see, the NVIDIA GeForce RTX2080Ti was unable to offer a 60fps experience in 40 games. Additionally, in 21 games the average framerate was above 60fps but the minimum framerate dropped below it. Thus, in only 22 games was the NVIDIA GeForce RTX2080Ti able to offer a constant 60fps experience on 4K/Ultra.
These results also justify why we were describing the RTX2080Ti as a “1440p/Ultra settings” GPU. Despite what some PC gamers thought when this GPU came out, the RTX2080Ti was only ideal for 1440p. While you can hit an average of 60fps in 43 games in 4K, there are drops below 60fps in roughly half of them. In total, 73% of our benchmarked games (61 of the 83) were dropping below 60fps on the RTX2080Ti in 4K. On the other hand, the RTX2080Ti can offer a smooth and constant 60fps gaming experience in all of these games on 1440p/Ultra settings.
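As a quick sanity check of that tally, here is a hedged sketch (illustrative Python with made-up game names; the real figures come from our 83-game data set) of how the 40/21/22 split and the 73% figure fall out of the per-game average and minimum framerates:

```python
# Hypothetical per-game 4K/Ultra results as (average_fps, minimum_fps) pairs.
results = {
    "Game A": (48.0, 39.0),   # average already below 60fps
    "Game B": (66.0, 52.0),   # average above 60fps, but with drops below it
    "Game C": (74.0, 63.0),   # constant 60fps experience
    # ...83 entries in the full data set
}

below_60_avg   = [g for g, (avg, mn) in results.items() if avg < 60]        # 40 games in our data
drops_below_60 = [g for g, (avg, mn) in results.items() if avg >= 60 > mn]  # 21 games
constant_60    = [g for g, (avg, mn) in results.items() if mn >= 60]        # 22 games

total = len(results)
dropping = len(below_60_avg) + len(drops_below_60)
print(f"{dropping}/{total} games drop below 60fps "
      f"({100.0 * dropping / total:.0f}% of the benchmarked titles)")
# With the real 83-game data set: (40 + 21) / 83 ≈ 73%
```

The buckets mirror the definition above: only the last one qualifies as a true 4K/60fps experience.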
With the release of next-gen games, the RTX2080Ti may even struggle to run some of them at 1440p/Ultra. Thus, it will be interesting to see how the RTX2080Ti handles the upcoming next-gen games. It will also be interesting to see whether more games will adopt DLSS 2.0.
Before closing, we’d like to let you know that we’ll get an NVIDIA GeForce RTX3080 when it comes out. As always, we’ll purchase this GPU ourselves (as a customer, just like you). Since availability may be limited, we may not get one on launch day. Still, we plan to get both the RTX3080 and Big Navi. Naturally, we’ll be sure to benchmark the RTX3080 with our most demanding games. It will be interesting to see whether games like Quantum Break can finally run at 4K/Ultra with 60fps. It will also be interesting to see for how long this GPU can keep running games at 60fps on 4K/Ultra settings!

John is the founder and Editor-in-Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best of them. Still, the PC platform won him over, mainly thanks to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC Graphics Cards.”




It would be nice if you would also build a Ryzen-based test system to make CPU comparisons.
No point, as Intel beats Ryzen at gaming for now.
How about in a couple weeks when the monster PCIE 4.0 GPUs come out? What if Big Navi cards have a noticeable performance benefit on Ryzen systems? Seems like “for now” could apply to a very short period of time..
I’ve heard that story about AMD before
First of all, that’s what AMD always says, and I always get excited about moving from Intel to AMD, and I ALWAYS get disappointed by how Intel still wins at gaming every. Single. Time.
Secondly, tests have already been done for PCIE 3.0 vs PCIE 4.0 regarding GPUs, and all current GPUs don’t even get NEAR the limitations of PCIE 3.0, let alone benefit even slightly from PCIE 4.0, soooooo…
Problem is, at 1440p the CPU becomes the major bottleneck for the 2080Ti, even using one of the fastest CPUs with an overclock.
That’s not true. I can’t think of a single game that struggles CPU-wise at 1440p. Even my 3600 runs every title at higher than 60 fps, generally well beyond 120 fps. CPUs are not that important in 2020.
I’m not talking about how much framerate we are getting, but about how much of the 2080Ti’s true potential is being held back by the CPU.
There is no CPU bottleneck at all at 1440p. You can find all of these benchmarks in our PC Performance Analysis section (we have CPU benchmarks for each and every game).
Yes, there is. Just because we can get, for example, over 100fps using a 2080Ti at 1440p doesn’t mean there is no bottleneck.
Of course there is. It’s only at 4K that, as far as we know, you lose the CPU bottleneck. Digital Foundry has several sequences in different games where it shows that there is.
The problem with game benchmarking is that they handle the GPU side of things. They benchmark a GPU-focused area and say there’s no CPU bottleneck, when it’s proven that there is one at 1440p. You need to find the specific areas where the load shifts onto the CPU, not run the in-game benchmarks and call it a day.
Another humongous flaw in your methodology is using ultra settings and calling the GPU unsuited for 4K. Which is laughable nonsense. The card can do 4K/60 in every game on the market if you adjust the settings accordingly. Why would you blindly set everything in any game to ultra and conclude that it’s not a 4K card?
Actually, we always benchmark the worst-case scenario (when it comes to both CPU and GPU), which is why our results may be lower than what other people report. For instance, Horizon Zero Dawn’s benchmark tool is not that demanding (the game has way more demanding areas, which are those that we used). The same can be said about Deus Ex: Mankind Divided.
Of course, if you had bothered reading our individual PC Performance Analysis articles, you’d know how we benchmarked all those games. You’d also know how these games perform on both an Intel i9 9900K and an Intel i7 4930K. You’d also know whether there were performance differences between 1440p and 1080p.
Worst-case scenario decided by whom? On what grounds? You don’t seem to understand. We know for A FACT that 1440p is CPU bottlenecked. FACT. Documented, logged frametimes. But you need to find the specific scenes, in the specific games. There are areas in certain games that have frametime issues at both 1080p and 1440p, which only go away at 4K. You can absolutely be CPU limited at 1440p.
You don’t seem to know what a CPU bottleneck means or is, based on what you write. You don’t find those areas by using your 2 CPUs, wtf…
Determined by US. Because it’s our job and we know what we’re doing (fun fact: we are the only ones that discovered the VRAM issues in Marvel’s Avengers on the RTX2080Ti in 4K/Ultra Textures. You don’t see this being reported anywhere else). We were also among the first that reported Hitman’s CPU issues, Doom 2016’s OpenGL driver overhead issues, etc. Also fun fact, we were the only tech site that criticized Forza Horizon 3’s single-threaded CPU issues. Back then, some even said that we should upgrade our CPU. Hilarious. Fast forward a few months and Playground released a patch that offered proper multi-threading CPU support and significantly improved performance on all CPUs. You may not like it, but we know what we’re doing, so you should at least show some respect given our solid track record.
We know for a fact, documented in our articles, that an Intel i9 9900K does not bottleneck the RTX2080Ti in most games. I’m saying “most” because every now and then you get a single-threaded CPU-bound game like Flight Simulator. If you have evidence that an Intel i9 9900K is bottlenecking THE MAJORITY of modern games, and not one or two games like the Far Cry, Assassin’s Creed and Microsoft Flight Simulator games (for which we, once again, were among the first to say that they are CPU limited due to their single-threaded behavior), you should present it. Plain and simple.
You’re not addressing what I said, which is that 1440p can be CPU bottlenecked. And it is, and it’s documented, and it’s demonstrable.
The only res which escapes the CPU bottleneck is 4K, apparently. This was the original point, that there is no CPU bottleneck at all at 1440p. But there is. Of course it’s not going to be in every game at all times. You need particular scenes in particular games and you need to look for them. But the point remains, 1440p can be CPU bottlenecked, even if popular opinion is the opposite.
I think that at the base frequency of 3.6 GHz there could be some bottleneck issues with engines that are poorly optimized for multi-core (more than 4 – 6)… But these are certainly exceptions…
BTW great article John!
The CPU auto-boosts to 4.7GHz when an application/game is running. It’s not set at 3.6GHz 😉. We can run our i9 9900K at 5.0GHz (24/7) but there are only minor performance improvements at 1080p in some games (and to be honest, those minimal performance gains do not justify the extra heat/voltage needed to hit 5.0GHz).
It’s not though. Even with a 200 dollar CPU you’re GPU bound at anything higher than 1080p.
this is utter nonsense
i hope RTX 3060 Super gives RTX 2080 level of performance while keeping the cost down to $399
Sweet spot 1440p 144hz
Sweet spot 1080p 144hz
Nobody’s got a PC for that. 1080p120? Infinitely more achievable.
It is but 1440p is the unsung hero. For me that is
I’d agree we should be moving on from 1080p by now (hell, I’ve been on 1080p since 2008), but hardware requirements kept increasing exponentially in the last 12 years and now I’m used to high framerate gaming.
Short of a 2080 Ti/3070/Big Navi, you can’t enjoy 1440p120 yet in most titles.
Agreed!
Most titles, yes. I have a 1440p 165Hz monitor and a 1080 Ti. In most titles I’m lucky to get to 120.
Everything 60fps or higher at 1440p is bliss to me. Once you go 1440p it is hard as hell to ever think of going back to 1080p. 1440p is the mafugin sweet spot.
I went from a 27 inch 1080p monitor years back to a 27 inch 1440p and I noticed a big difference in sharpness and clarity. I will never go back to 1080p but I also won’t spend more than $700 on a GPU.
Funny thing but I have that same reaction, but with framerate. I had a 1440p@144hz monitor for a while, but had to return it as it had some issue. The thing that bugged me most about going back to 1080p@60hz, wasn’t the resolution, but the framerate. So I actually ended up getting a 1080p@240hz from Alienware, and I don’t feel the need for 1440p, at least not as much as I felt the need for 120fps.
Good technical analysis.
The 2080Ti, when launched, was hyped as a 4K card when it actually was a 2K card. The 30 series are 4K cards hyped as 8K.
With the release of the 30 series and next gen consoles, the 2080ti will be the card that gives 100 fps give or take with all settings maxed on 1080p…60 with RT on.
I may get downvoted for this but it’s the hard truth.
It’s always the same story. People who chase after 4K gaming are going to spend a fortune on cards. They will probably need to spend $1,500 on a RTX 3090. That’s why so few do game at 4K.
1440p/60fps or 1080p/120-144fps
I am sorry, but I find your results and conclusion totally inaccurate.
I own a 2080Ti and have more fps than yours in all games. E.g. in Shadow of the Tomb Raider I have 72fps avg with 60 min.
The 2080Ti IS a 4K/60fps capable card with ultra settings, whilst there is no reason to use AA, and of course it is 2K/144 capable in several games.
Furthermore, compared to a stock 2080Ti, my Asus Strix OC for example makes almost a 25% difference. Just sayin.
I just checked a few benchmarks online, and that’s what I found: 57-60fps avg with 49-50fps min.
There was one benchmark with Asus ROG Strix, 60fps avg, 51fps min.
Asus Strix is usually 2-3fps faster in games than vanilla 2080Ti. Very, very far from your 25%. Just sayin.
Seeing as every game I play ran at 60fps or more at 4K, this is a crap article. E.g.: who uses ultra clouds in AC Odyssey when high looks pretty much the same with a huge fps gain? Also, no one uses TAA at 4K. Clickbait crap.
You are a jerk!!! Just a huge garbage jerk. Just goto a different site then. Jeez man.
Why are you getting so defensive are you John’s lover?
Defensive… please. Just pointing out jerks when I see them
A bunch of people need to chill lmao. Bunch of angry mad gamers in here
lmao yesss
2080ti owners be angry
We’re not angry, but the results are flawed. I played with only a 2080 and got consistently around 60FPS on Ultra with Kingdom Come, so how could I have gotten better results with less hardware?
Nobody even plays 4K on PC, let’s be honest here. This year’s Steam survey, I believe, said only 2% were actually playing at 4K; the rest were at 1080p or below.
Personally I feel like 60fps is a thing of the past in general. People are getting high refresh rate monitors, playing games like League at 200+fps; they aren’t thinking about 4K. That’s not what PC gaming is about, and the most popular games on PC show it.
All these 2080ti owners playing at 1080p 😛
At 144hz 120hz or ULMB yes
And good for them. They can get 1080p144 locked in anything.
No you can’t. Lots of games are cpu bottlenecked at ultra settings, 144fps locked is a no go.
Exceptions and exceptions, yes.
They are just too lame to OC their CPU… it is not very difficult these days and it removes most of the CPU bottlenecks… I had an i5 2500K @ 4.6GHz until this year and it was holding up well with my GTX 1070…
So in other words you are delusional. A 2500K at 4.6 can’t even hold 60 fps in many modern games and has minimums in the 40s and 30s in some games. And most modern CPUs are already close to their limits, so you don’t even gain 5% more actual performance most times, yet you do all that tinkering and make power consumption and heat go up.
It depends on the GPU; with a GTX 1070 it’s quite homogeneous. The GPU is always near 100% usage in game with a 2500K @ 4.6GHz.
I do and have done for the last 4 years at least.
I guess I am part of that 2%, and I have to tell you, once you go 4k there’s no going back. I have been gaming for over 25 years and one of the biggest changes to gaming ever made was 4k, it’s basically like getting glasses for the first time. The games are so clear and detailed, once I go back to 1080 it’s like I lost my glasses, it’s very blurry. Most people are at 1080 and high refresh because it’s MUCH cheaper than trying 4k gaming.
What’s the point of playing in 4K with a bad framerate????
I’ve got a 1080 Ti and just about all of my games run at 60fps or more. I played the latest Metro at 50-60+ fps and it was fine. It just depends on how picky you are; I am happy with 60fps. Also, if they’re slightly older games, you’re going to see a massive difference when weighing 4K against the fps gains.
1080p on a 4K screen is blurry.
1080p on a 1080p screen isn’t blurry.
Fake news. Integer scaling is a thing, you know. 1080p to a 4K screen can now be done with a sharp, perfect scale on both axes.
Even that isn’t as sharp as native screen rez with no scaling involved.
It is as sharp as native res, but obviously 1080p on a 4K screen is not 4K detail. Compare a 27″ 4K screen doing 1080p integer scaling to a 27″ native 1080p screen and the integer scale is much nicer on the 4K screen.
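For what it’s worth, the reason integer scaling can be pixel-sharp is that 4K is exactly twice 1080p on each axis (3840/1920 = 2160/1080 = 2), so every 1080p pixel maps onto a clean 2×2 block of screen pixels with no interpolation. A minimal sketch of the idea (hypothetical NumPy code, not how any driver actually implements it):

```python
import numpy as np

def integer_scale(frame: np.ndarray, factor: int = 2) -> np.ndarray:
    """Nearest-neighbour integer scaling: repeat each pixel `factor` times
    along both axes, so a 1080p frame becomes a pixel-exact 4K frame."""
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

# Hypothetical 1080p frame (height x width x RGB).
frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)
frame_4k = integer_scale(frame_1080p, factor=2)
print(frame_4k.shape)  # (2160, 3840, 3) -- every source pixel fills a 2x2 block
```

This also shows why it only works for exact integer ratios: 1440p on a 4K panel is a 1.5× factor, so it still has to be interpolated, which is where the blur complaints come from.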
That’s far from true. I even run custom resolutions, 1620p and 1800p. By your claim, 1440p and my custom resolutions would look blurry, which they don’t. Crappy monitor with a crappy scaler. Try GPU scaling if it allows it.
This is true. It’s like playing N64 on an HDTV… blurry aF. But if you play on a tube TV it looks fn mint sexy.
1440p FTW !!!
LOVE That 1440p/Ultra Juice settings!
I’ve played at 4K since it began and I don’t do Steam surveys, so yeah. Take that data with a huge pinch of salt.
Exactly. 4K gaming is irrelevant these days. The majority still have a 1080p monitor.
You thumbed me down when I agreed with you?? 😛
I think 1440p@120fps is still the sweetspot
I think 1080p120 is the sweet spot. And even for that, you need a 5700 XT / 2070S / 1080 Ti.
4K60 and 1440p120 is not yet achievable for the mass market.
Who cares about the masses of the stupid and poor.
Devs.
To claim that people are only playing at 1080p using the Steam Survey is either stupid or dishonest. What you’re talking about is only the primary display resolution.
To be fair, Steam’s metric is for people with 4K displays. It doesn’t (or can’t) account for people that use downsampling from 4K to 1080p or 1440p for various older games where 4K60-144 is achievable.
Random fun fact: I played the Prince of Persia Sands of time Trilogy and Splinter Cell 1-3 in 4K120 (downsampled to 1080p). Visual benefits were nonexistent, but heh, I could do it.
4K/60fps isn’t near the holy grail of PC gaming; that would be 120/144 fps or ULMB at ANY resolution.
Keep in mind 2% is about 1.98 million Steam users. That is still a decent number.
Pointless article… This does NOT prove any damn thing about gaming. Just pure clickbait BS for the weekend.
You also need to consider other parameters to justify a 4K/60fps gaming experience. Also, don’t forget about game optimization. Poorly optimized games are releasing these days, and it’s not fair to blame the GPU just because it cannot maintain 60 fps. Blame the game.
This article is a JOKE. Don’t forget about CPU bottlenecks either, if any.
This actually proves…everything? I mean, if you run the tests and you show the results in a plain, numerical form, there is nothing closer to truth than that.
Some games are harder to run than others, and some are less optimized than others, but it is still the card’s job to run them, especially at 1200 bucks.
So, it does prove something. No, it’s not a 4K/60 card, not quite. Also, the 3090 won’t be an 8K/60 card (except with 4x or 8x upscaling like their benchmarks showed).
Reality has to set in at some point.
I think the guy above forgot what his role is as a consumer. Using all these terms like “lay the blame” and “it’s not fair”, like it’s a personal relationship or something.
Probably sleeps with his GPU under his pillow.
To be fair, the claim of 4K/60 is not the same thing as 4K/60 @ ultra. The quality setting restriction is an arbitrary limit of this article.
It is fair to expect ultra settings from what was an extremely expensive GPU. When it is advertised as a 4k 60fps GPU, it is fair to expect ultra at this price.
Otherwise NVIDIA could advertise a 2060 as a 4k GPU as long as you run every game at low settings with all the extra bells and whistles disabled (an extreme example but that is the idea).
For a GPU as expensive as the 2080Ti, and one that was also the top tier of its time and advertised as 4K/60fps, ultra settings are absolutely expected. Otherwise, advertise it as a 1440p ultra and 4K high settings card.
Anyone can adjust graphics settings to get any GPU into a higher FPS/resolution bracket, but the very top end GPU is expected to be tested at ultra because that is what you paid for it and what I would expect from it at the price.
So I think it would be disingenuous to benchmark it at lower settings and claim it can do 4K/60fps. Even my 2080S can do that if I lower enough settings. But I didn’t purchase my 2080S to play at 4K/60fps with medium/high settings. And this article isn’t titled “How to achieve 4K/60fps in games by lowering graphics quality settings”. And it isn’t titled “How to make a 2080Ti run at 4K/60fps in games like NVIDIA advertised, by lowering settings”. That makes no sense; anyone can do that with any GPU to achieve higher fps/resolutions, and this is not an optimization guide article. It is not fair to have to adjust settings in so many titles to achieve NVIDIA’s claims with such an expensive GPU.
Obviously you can’t expect 4K 60fps ultra from every single game, I think, especially for more demanding games that come out near the end of the GPU’s time on store shelves (before its replacement is released, like right now 2 years later). But I expect it from a large chunk of games, much closer to the 80%-85% range (as in no drops below 60fps in that percentage of games at 4K with ultra settings). So I am not unreasonable, and it is the price/performance and the 2080Ti’s place at the top end that raises expectations.
And I do understand that everyone has different standards and has different ideas of what is acceptable from such an expensive GPU, but personally I think the 2080ti did not live up to NVIDIAs claims. And NVIDIA didn’t market the 2080ti as 4k/60fps with medium/high settings (even my 2080s can do that).
Don’t get me wrong, the 2080ti is still a very decent GPU though performance wise (excluding the original MSRP) and no one should be upgrading just yet if they don’t want to, and I am sure anyone who owns one is happy with it. But I am glad I went with the 2080s and saved some money instead (PC parts are a lot more expensive in my country though).
But I think the RTX3000 series looks much better value in general, so let us hope they are and let us see what DSOGaming find in their GPU tests. And I am also excited and hopeful that RDNA2 also brings a great lineup on the price/performance front. I really want to see the GPU market become as competitive as the CPU market currently is, so we can see performance go up and prices go down in the GPU market, like we currently have in the CPU market.
I have a feeling the MSRP for RTX 30 is a smokescreen. They will sell a few FE models for that price and get amazing reviews from youtubers about performance and value.. and then the real mass produced cards will be available later for a higher price. I really hope people haven’t fallen for the marketing and written AMD off yet. It’s such a bad idea to buy before seeing everything that’s coming out first.
Absolutely, I agree. Those are wise words and I don’t disagree with you at all. Always best to wait and see what performance really is (not NVIDIA in house benchmarks, or biased reviews) and also to wait and see where the prices actually settle at after launch (and some time after launch). Prices could very easily increase above what is currently meant to be MSRP, and that would absolutely alter any price/performance ratio, and this should be considered by everyone. If NVIDIA is planning this, it will heavily alter the value of the 3000 series (and also a very dodgy move by NVIDIA). I won’t jump to any conclusions just yet, but it should be something we must all look very closely at.
So I think it is very possible that you are correct and exactly that could happen. And I will also be watching keenly to see if that is what is going to happen. And it is solid advice that everyone should wait to see what the real prices will end up at a little later, and also where the real performance falls.
Nothing wrong with your advice, so I absolutely agree. And I am really hoping AMD knocks it out of the park; we really need them to bring some much needed competition to the market, and I do think RDNA2 is going to be a performer (from the leaks I have seen). So it is wise to actually see how RDNA2 turns out; I do think it will be a very viable alternative to the 3000 series. AMD has also done incredibly well in the price/performance metric with RDNA1, and RDNA2 should improve that ratio, I think. I am excited for its release.
I jumped on a Zen2 3700X last year, and am thrilled with its performance. And I will be upgrading to a 4950X come release (I do rendering and tons of video editing) and will be giving my son my 3700X and a B550.
And I will be purchasing at least one RDNA2 GPU for my kids as well, maybe more (they have been waiting for them, and they have done great this year with schooling even with all the cv19 issues, so they have earned it).
But I will also be waiting a good while to see how things actually turn out with concern to actual pricing and performance. I am in no rush to upgrade my GPU (I use it mostly for encoding nowadays, not as much time to play games at the moment), but I will be grabbing Cyberpunk and make some time I think.
Lol, the name of the game is to blame. You’re a joke, buddy.
But you are correct, the article doesn’t prove anything about gaming, it proves that 2080Ti wasn’t a truly 4K/60 GPU.
4k is a waste of fps. 1440p still very good.
Thanks for the work John.
If the RTX 2080 Ti is ideal for 1440p/ultra, then “next gen” consoles are ideal for 960p/ultra.
Those settings are overkill for consoles.
If you launch those games on PC at 4K with console settings you’ll be getting over 120 fps!
To be fair, all games benchmarked here run at far beyond console visual settings.
That’s exactly what I’m talking about.
Ultra stupid settings make -any card- run like a*s. The brute force era is over, you will see…
It’s amusing how quick people are to dismiss perfectly good tech as soon as something better comes out. Suddenly the (still current) fastest gpu is total trash lol.
Why would you put unoptimized TRASH like Quantum Break in the mix to drag the averages down?!
It’s not unoptimized, it’s just very demanding. Easily more demanding than Control (in pure rasterization). It’s also running with 4xMSAA at all times.
The crappy running game is f*king unoptimized and so is the damned Control. Stop sucking Remedy’s bulls. What is wrong with you?
It’s 720p on Xbox One and 1440p on X1X (with reconstruction enabled of course) with 4xMSAA.
The game is full of unique props, SSR on all materials, high-res textures and deformation of the environment in real time at any moment, plus a ton of physics interactions.
The game is heavy, not unoptimized.
Cryio, the game looks average for the hardware it needs and the fps it gives. Wasting resources and looking average is not optimization.
good idea, poor execution.
I love how suddenly now the 2080 is the worst card on the planet. Nonsense. I don’t have one, but my gut/brain tells me it is stronger than the PS5 and Xbox Series X, so no, I don’t think NVIDIA lied (this time). It is a 4K/60fps card if you remove the sticks from the wheels (crap ports). Mark my words, the 3080 will have the same “problems” eventually, in time. Does this mean it is also an inferior card?
Good luck trying to get one these cards, John. I will look forward to your review.
I assume the scores here are for a stock or reference clock speed 2080 Ti? Because I can definitely get about 100-120 fps at 4K in Doom Eternal in many situations on the Ultra Nightmare preset. RDR2 with the settings I use runs at a pretty stable 60 fps with a 2080 Ti that is overclocked to about 1900-1950 MHz boost clocks.
To me these benchmarks read more like if you give up some settings you can easily get a lot of games to run at a solid 4K 60 fps. Insisting on ultra settings for often very marginal image quality improvements is ridiculous especially if games offer stuff like MSAA. Ultra settings should be just that: punishing stuff for future GPUs. The rest of us will be perfectly fine on high or very high.
john really wants a free rtx3080 from nvidia https://media3.giphy.com/media/UVvnHO0RskFm9W8wa3/giphy.gif
ps: hope he gets at least an rtx3070, he really tries
DSO deserves a free card for all the bs and toxic ppl they deal with in the comments most of the time.
people in the comments sound like console fanboys. BUUT DUURRR MY PS4 had different frame rates… Not to mention the complaining… Seriously… so many toxic idiots here. Go back to PC Gamer or just stfu… https://media2.giphy.com/media/XcR9fGtkZzG2lAlzEK/giphy.gif
All the bitter 2080ti owners mad their card is worth like $400 now and the article didn’t make them feel better about their sh1tty purchase https://media3.giphy.com/media/VypUGhOpPUUs8/giphy.gif
You really think all these people own a $1200 gpu?
News flash… $1200 is not a ton of money and there is no reason for anybody to be mad unless they bought one.
Most of them are from Western Europe; it’s not unreasonable to believe so.
Well I’m laughing at those who laugh at us when they can’t even buy a f*cking 3080 until probably 2021:)
Always interesting to see that the new NVIDIA card is a for-sure case, while AMD is always a maybe. The 2080Ti is still to me one of the worst cards from NVIDIA ever, and I would say the 20 series as a whole was the worst bar none. Cards that can barely do RT, heavily overpriced with a very insubstantial performance upgrade over Pascal, and full of problems of their own, from bad drivers to sudden deaths… What a mess…
That is usually the case with new technology though, for practically everything. Everything new is expensive aF with terrible performance compared to newer, cheaper stuff. Look at VR: the first headsets, how much they cost, and how crappy they look compared to cheaper and better models now.
Not really.
Ok… well I guess you are new to the world. There was a time, believe it or not, when I went to a rich friend’s home… they had a plasma TV in the living room. I had never seen one in my life. Stores were still selling massive thick HDTVs… I asked the guy how much it cost him… $50,000, and gawd damn, by the time plasma TVs came to stores that thing had already started to become outdated. New tech always costs way more. RTX 2000 series cards were new tech… of course they were overpriced.
Not every new thing is overpriced; you are losing track. Not sure how many years you have been buying graphics cards, but if you have ever seen a worse deal than the 20 series from NVIDIA, you are lying. It was simply not worth it.
Who gives a f*k about a Ubis*it game?
Exactly. With their track record i wouldn’t hold my breath for anything ubisoft releases. That being said, we could be surprised one day, we never know.
Moral of the story, 2000 cards were sold to suckers
Something in your test is flawed. I only have a 2080 and got around 60FPS on Kingdom Come on Ultra. How are you getting less with a better card?
From the beginning, for me the 2080 Ti was never a 4K GPU.
From all the tests/reviews/benchmarks that I saw and some tests I made myself, I kind of got to this conclusion:
Performance of a 3080 in 2160p is the same as a 2080 in 1440p and the same as 2060 Super in 1080p (including Ray Tracing and DLSS).
So if you’ve got a monitor with said resolution, in practice we would all get the same performance. So for this, I would say that this 3080 is the bare minimum to play all AAA games in 4K (max settings), imho. And those who want 2160p at 240fps in AAA games, for example, should (if possible) wait for the next GPU that performs at least the same as a 2080 Ti at 1440p, but this time in real 2160p. And let’s not forget 21:9 and 32:9 monitors.
It would be nice if someone with more resources, like a channel, could try to make these kinds of tests too.