RDR2 releases in a few hours on the PC, and most PC gamers are really looking forward to it. However, it appears that Rockstar’s latest open-world game is a really demanding title. From the looks of it, the NVIDIA GeForce RTX2080Ti will only be capable of offering optimal performance at 2560×1440 in Red Dead Redemption 2.
By optimal performance, we obviously mean 60fps on Ultra settings. According to NVIDIA, the RTX2080Ti will only be able to offer 60fps in 4K with a mix of High and Medium settings. Yeap, you read that right; there won’t be any GPU on the market able to run the game in 4K/Ultra at 60fps. To be honest, this is the first time I’ve seen NVIDIA recommend a mix of Medium/High settings for the RTX2080Ti, even at 4K.
For gaming at 2560×1440 with High settings and 60fps, NVIDIA recommends an RTX 2070 SUPER. As for 1920×1080, NVIDIA recommends using an RTX 2060 SUPER.
Truth be told, we’ve already seen other triple-A games that have trouble running at a constant 60fps in native 4K on Ultra settings on the RTX2080Ti. Such titles are Anthem, The Outer Worlds and Tom Clancy’s Ghost Recon Breakpoint. In most of them, we were able to get a 60fps experience on Ultra by lowering the resolution to 1872p. However, even that might not be enough for Red Dead Redemption 2. Still, we are fine with this if the game’s graphics justify its high GPU requirements.
Unfortunately, Rockstar has not sent us a review code yet, so we won’t have a day-1 PC Performance Analysis. It sucks, we know, but there is nothing we can do. Therefore, if Rockstar does not send us a review code tomorrow, we’ll go ahead and purchase the game ourselves.
Stay tuned for more!

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best consoles. Still, the PC platform won him over. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC Graphics Cards.”

Is that newsworthy? Seriously, what’s up with all those weird articles about “2080ti can or can’t run X game at 4k/60fps…”
It’s bound to happen as engines are extremely scalable nowadays.
“This RDR2 mod now lets you play as a nude female”
“Find out the details below“
Sh*t. I thought it was a real article… Don’t do that!
I’m waiting for the nude horse mod with endowment scaler.
Yep I am waiting for those Nude Wh*res mods too
It’s dirt being kicked in Nvidia’s face, and deservedly so. They released a completely overpriced card when looking at the raw power gains over its predecessor. A card sold on the notion of being capable of ray-tracing, even though it’s barely noticeable, or even playable in some cases, with the few implementations available and the limitations of the hardware. The 2080 series should have been aligned as the replacement for the 1080Ti and priced competitively; then people wouldn’t be making articles about why a card that costs $1500 in some cases continues to prove it’s not worth the price.
The 2080ti is a $700 card at best, based on the performance.
But, people keep spending stupid money on things so they won’t lower the price (they would still make money at 700 bucks).
Ok, I hate your guts. There is no one on here that I hate more than you. I even hate you more than Robo-Fernando, Timothy Sweeney’s most advanced A.I. But I 100% agree with you. I’d even say the highest that card should’ve cost was $799 at launch. I still hate your guts though, but thank you for having a sound mind on this one. Respect. Still hate your guts though.
Why do you hate his guts?
Sad but true. I am still holding on to my 1080 Ti because there’s no compelling reason to upgrade. The 2080 Ti is a joke for what they’re asking for it.
Yes go buy a better card from AMD
Lmao AMD is for budget builds.
I think it is in the case of this game. The game was amazingly well optimized on Xbox One X to run at native 4K as it did, and given previous PC releases from Rockstar, people expected excellent optimization for the PC release to take full advantage of the hardware. If it can’t run just as it did on console at native 4K on PC at 60 FPS with the most expensive card you can buy at the moment, then that doesn’t look good for optimization. I’m only hoping that this isn’t completely accurate, or that there’s an extreme setting or two intended for “future hardware” that you wouldn’t put on “Ultra” for current hardware, sort of like Gameworks features of the past or Hairworks in The Witcher 3.
I’m not sure that’s true. You might be able to run the game well but at medium settings just like the consoles.
GTA 5 has some of those extreme settings like advanced shadow draw distance so it most likely does.
PC version has vastly superior quality settings. If you matched the settings on PC to the XboneX, it’d be like medium-high, and at those settings it would kill the console. Don’t forget, we’re also talking about 60fps, not the 30fps the consoles target; that’s a straight double-up of the compute right there…
Probably because they added pointless settings like extended shadow distance that murder your fps.
Thing is, the “Ultra” settings in RDR2 probably would require 3 Xbox One Xs 😀
some are saying 1pm GMT unlock time
2080 Ti can’t even run GTA V at 4k 60fps on ultra settings.
use some titans, 2080 ti is an overrated turd sandwich.
True
Yeah, we’re still not there yet. But I am pretty sure Ampere will finally “tame” the 4K resolution in ALL games.
The ultra high-end variants of the GPU series after Ampere will likely begin the process of taming the 4K resolution, breaking 30 teraflops.
I mean they will run current gen games at 4K, but you’re a fool if you think that means the next gen card will be able to run any game until the end of time at 4K… as hardware improves, games become more demanding. Over and over and over.
So in English that means it’s a BULLSHIT PORT!!!
In English it means that the fxckbags over at Nvidia are taking advantage of RDR2’s launch to advertise their crappy videocards. If you look at the system reqs provided by Rockstar you’ll see that a GTX 1060 is the recommended GPU for 1080p.
Exactly, not to mention the RTX screenshots bullshit when the game doesn’t support ray tracing.
I think it’s more relevant to know if it runs fine on a mid-range or mid/high card at 1080p/1440p. I don’t know how many people care about these kinds of articles on super high-end products that less than 10% of gamers own, at the least-used resolution…
2K is enough
RTX is a fraud series
4k does not define PC gaming. It’s super rare anyway. 1080p is standard and 1440p is the next thing
It’s pretty standard on console, and if you have a 4K TV it’s easy enough to plug into a PC. I’m planning on playing this at 4K with HDR, like I did Resident Evil 2.
1440p was “the next thing” five years ago. Taming the 4K UHD resolution is now “the next thing.”
1080p is just fine
I really want to see 1440p benchmarks, especially with the 2080ti, because I’m sure it’ll be bogged down with those “demanding” settings.
Funny how PC gets the latest hw, consoles can only do 30fps with R* games, yet when the PC ports finally arrive, they too are bogged down in some way or another.
Wake me up when R* make a Crysis style game, that actually delivers on the visuals and legit demands.
OH NO!!
The world is gonna end..
So what can I expect from a GTX 1080 Ti??
RTX 2080 Ti – 20 fps
Because RAGE is outdated.
As a 2080 Ti owner, I’m definitely not fine with this. I play at 3440 x 1440, but it looks like at higher settings, I’ll only be able to expect 60 FPS, and if this title is supposedly as well-optimized as Rockstar titles usually are, then I feel I should be able to expect closer to 90-100 FPS on a 2080 Ti at a 1440 resolution.
Why is that bizarre? I play at 3440 x 1440, so more intensive than 16:9 1440p, plus I prefer higher than 60 FPS whenever possible, so that’s why I’m not using 4K. Plus IPS, high-refresh rate ultrawide G-SYNC monitors were hard to come by at the time that were ALSO 4K.
Who are you and why are you upvoting your own comments?
Excellent news, this means that they have increased visual fidelity in order to accommodate future graphics cards on ultra settings.
That’s like the most stupid sh#t I’ve read all day lol
As an A.I he’s getting better though. Stupid or not, he’s stringing together different words to appear as if he’s real. Robo-Fernando is an incredible piece of Tech but not a very good pass for human. The first of his kind to survive outside of the Unreal Engine 4. Timothy Sweeney’s best work to date. Haha, with that said… It is the dumbest Schitt I’ve read all day too.
That would make sense if a modded Skyrim setup from 2016 didn’t look better than the material they released to advertise the PC version.
Modded Skyrim can look pretty but it’s still static like a photograph
What isn’t excellent for you?
You windowlicker
Haha @windowlicker, I’m gonna use this
Or the game runs like garbage.
Runs great. Very smooth even at a sub-60 frame rate using a GTX 1080 and a 5.2GHz 9700K.
What resolution?
1440p with high settings in game. Even medium settings look very very good
hahahaahahahahahahahahahahahahahahahahaahha lol no
Even though I don’t agree with that, I must hand it to you for being the eternal optimist (on every topic lol).
The article says that it’s based on the game’s benchmark tool, which might not be representative of average in-game performance; take the example of the Metro Exodus benchmark, which has some extreme scenarios you almost never see in the actual game.
We also don’t know what kind of clocks the GPU was running at.
They also might have some unoptimized foliage settings like GTA V which still to this day causes a 2080Ti to drop below 60 in many wooded areas.
This is what I’m hoping for… Either something like that, or a Hairworks type setting like in the Witcher 3 that is completely unnecessary, yet halves the frame rate.
Have you not seen how gorgeous Hairworks makes Geralt look?
I’ve played the game thoroughly three times and messed with lots of mods to improve the Hairworks so I’m very familiar with it. Hairworks does not axiomatically look better in that game; it’s pretty controversial whether it’s an improvement or looks worse.
Anything related to Gameworks is garbage. All Nvidia does is punch tessellation to the max, X64. The same thing can be done with X16. They are such a snake-oil salesman. Gameworks was designed to gimp the competition, and the funny thing is it even gimps themselves just as bad. Nvidia and their garbage tech that NEVER works properly. Designed to trick dummies into buying a new card every 3 years.
They are such a snake-oil salesman
Still remember their “beating” 3Dfx to 32-bit color, even though it was unplayable.
I know. Its a joke.
Doesn’t make much difference on Geralt, but all the beasts do look much better with Hairworks on.
To this day, Ultra grass settings on GTA 5 PC murder the fps due to each blade of grass having shadows on Ultra; even at Very High it’s still demanding.
4K gaming is dumb on a PC, and the Ultra preset is for screenshots.
I think 1440p (144Hz) with mixed Ultra/High settings is the perfect sweet spot for visuals and performance.
So rather, test out all the graphics settings and see which ones have a high impact on performance but a negligible improvement to visuals.
Find and provide the best settings that we should keep for the best performance with negligible visual difference compared to the Ultra preset.
Also try out the new NVIDIA Sharpening option in the control panel with GPU scaling enabled and let us know if it helps in this title.
I’m playing at 4K, and I’m loving it
1440p doesn’t look great on a 4K TV with HDR. For most games I agree, but this game is more cinematic and slow paced than an FPS game.
It’s not all about raw pixels and native resolution. It also depends on the contrast modulation of the display itself. A 4K screen with great contrast modulation for instance will produce a crisper image than an 8K resolution screen with lesser contrast modulation.
I wasn’t talking about TVs. I am talking about gaming on a desktop PC. On a big-screen TV, 4K is obviously better. But on a computer monitor, it’s not really that distinguishable. Rather, 1440p 144Hz makes more sense in PC gaming.
I see your point, but 4K is distinguishable on a 24″+ screen. It’s actually more so than on a lot of TVs because you are sitting so close to it. With a 50″ 4K TV, you can’t tell much difference past 5 feet away. There are 4K phones btw.
I’d much rather have a 4K 27″ monitor at 60hz for strategy games which is why I’m thinking of getting one to replace one of my side monitors.
You are full of crap if you think native 4K is distinguishable from native 1440p in any meaningful way at all on a 27 inch or smaller screen. Only in a game that has some horrible aliasing would you see any difference, and even then you have to stop and look for it in most cases.
I have them side by side and the difference is insane. I still choose my 1440p/165 panels for gaming but that doesn’t mean I can’t tell a difference lmao
“You are full of crap if you think native 4K is distinguishable from native 1440p in any meaningful way at all on a 27 inch or smaller screen”
what a full of crap sentence.
Remember when they also said playing above 60fps was dumb because no one can tell the difference?
FPS matters more than resolution. Every PC gamer would agree with this. So at 1440p, if you can get 60+ FPS at all times, it’s better than struggling at 4K resolution with 30-40 FPS.
For FPS players, yes. There’s not much point in having high fps over resolution if it’s something like a turn-based strategy game. That’s why I have a 1440p 165Hz monitor with G-SYNC and a 52″ 4K HDR TV.
Agreed
That’s what I’m saying. Man you guys are dense…
Dislikes are from b*tthurt 4K display owners with mid-range graphics cards who realize that if they want RDR2 on PC they will get a PowerPoint slideshow.
or maybe some people get butthurt because the truth is that 4k is actually better?!
4k is dumb?! dude seriously?
4k is flat out retarded, and 2k is borderline (which i’m sitting on now, often scratching my head “why would i do that”).
Higher refresh rate beats the sh*t out of higher resolution either way.
2 years?
The Ultra grass setting must be the cause.
I like how the Nvidia chart you linked literally says “RTX 2060” and you claim it says SUPER. Gtfo
Grosss… 30 FPS
If it’s 40-60fps it’s playable for a third-person shooter, as long as there is no stuttering.
Maybe if the game supported SLI and CF we would not have problems like this…
This just means the game has been “future-proofed” in that future graphics cards will be able to run it at even higher settings just like Crysis back in the day. I’m willing to bet the 2080 Ti can handle 4k60 on high settings instead of ultra settings. Still leaps and bounds ahead of the consoles.
This game looked amazing on the One X at 30fps 4K, so 60fps on PC at Medium ought to look awesome.
Wasn’t DX12 supposed to integrate some crazy new tech, like being able to compress gigabytes of textures down to a few hundred megabytes… among other things?
F*k, a $1500 card should have the power to run ALL games at 4k60 Ultra, today!
If game consoles were coded at the same level as a pc, it would require 4 or 5 ps4 pro consoles running in parallel, just to be able to run a game like God of War.
Have you noticed how Nvidia always comes up with a new tech that absolutely destroys the fps in the name of better visuals, thus requiring a much superior card:
16 bits… 32 bits… 1024… 1280… 1600… 1080… suddenly, when a simple $400 card can run games at max settings, they come up with a new tech that will bring the hardware to its knees, with only 5 or 10fps… and there you go, the race for more fps is launched, again.
Dual vision this, VR that… now that a top model was able to run all games at max… ‘ohhh, let’s introduce something that will destroy the fps of any modern game: raytracing’.
And here we go again… you want to play at 4K? Do you like your 7fps?
Again, the race has started. Gamers will need a much more powerful graphics card.
And guess what? 12-18 months from now, when 2 new gens of cards will have been released and 4K 60fps with raytracing enabled is possible, they will introduce raytracing v2.0, which will handle many more objects and stuff to be raytraced, as opposed to today’s tech.
AGAIN, gamers will install and run the latest game with raytracing v2.0, and will get 11fps!
GUESS WHAT? Another race will start, in order to achieve 60fps, 4K Ultra, with raytracing v2.0 enabled…
And the day a 4080ti, or 5080, or 6080ti, at 1’900 bucks, will be able to use raytracing v2.0 with ALL 3D objects in the scene, Nvidia will invent/introduce a new rendering technique, or a DX13, or any other set of features, or a lighting engine v2.0, or whatever, that, once enabled, will require so much processing power that, again, gamers will only reach 10-15fps!
And on… and on…. and on. …
Have you noticed that today, many years after DX12 came out, with all the power we have, games rely 95% on the GPU and only a little on the CPU…?
How is it possible that, close to 2020, most triple-A titles still run in a single gigantic thread and are only optimized for 1 CPU core… max 2… when the gamer has a 10, 12, or even an 18-core CPU at 4GHz?
And please, don’t talk about latency, bla bla… operating system… bla bla… devs are able to extract 100% of the juice from 2, 4, 8, or even 128-core CPUs on consoles, so why can’t it be done on PCs?
Because that way, instead of going out and buying a $1’500 CPU (since games barely use any CPU), the gamer will have to buy a better GPU instead! And that’s exactly what Nvidia wants!
The first thing they use, to cripple even the most expensive gaming gpu, are drivers.
Ask yourself, why can’t there be any alternative drivers for these expensive cards?
Ohhh, maybe Nvidia doesn’t want 3rd-party drivers to unlock 50… 100… 150% extra processing power…?
Drivers must be super important, as nobody knows what they are doing and how they are supposed to perform…
…but hey, just look at the biiig, expensive, professional graphics cards, which probably have 99.9% of the same components as a gaming card… that can even have dozens and dozens of gigabytes of memory… that probably are 4 or 5 times more powerful than a top-end gaming card… and LOOK AT what happens when Nvidia doesn’t want gamers to buy such cards and be able to run ALL GAMES at 4K ULTRA, at 120 or 240fps, or even 8K 60fps HIGH.
WELL, they simply release a crippled driver that will turn that 10’000-dollar PRO graphics card into a $100 card, performance-wise!
YOU SEE what can be achieved through drivers?
Give me a reason for Nvidia to not PURPOSELY code bad drivers and CRIPPLE those high-end gaming cards.
Then, tell me WHY can’t Nvidia release a 4-GPUs-in-one card, or even 8 GPUs in one card, add 32GB of GDDR6, add 2 or 3 times more engines, more stuff… sell it for 3’000 or 4’000 bucks, and the gamers who buy such a monster won’t need a new graphics card before Dec 2021, or even later…? WHY?
AGAIN, they need to sell cards just powerful enough to run the latest games at 2560×1440, 60fps, mid-high settings, IN ORDER FOR that gamer to NEED to buy a new graphics card within the next months… or when the next big game is released (say hi to RDR2…)
Do you remember how the lazy Intel, without any desktop competition, spent the last 10 years releasing a new CPU, often on a new socket, for between 400 and 1500 bucks, with 4 cores or 8-10 cores, and every 8-9 months that new CPU would barely be 4 or 5% faster…
…but people kept upgrading and upgrading… and Intel kept making a ton of cash, with barely any R&D effort…
….
…and suddenly AMD pops up out of nowhere with the Ryzen CPUs, and barely 2 years later we are talking about 64 and 128 cores for desktop CPUs!
Nvidia has had very little competition… another reason why they don’t need to come up with incredible GPUs and cards.
And 2 or 3 years from now, when a $1’000 graphics card will be able to run all games at 4K, 60fps, Ultra, with raytracing v2 or v3 enabled, which strategy will Nvidia use in order to bring those fps and visual settings down to 5-10fps and low-mid visuals?
‘SAY HI TO 8K GRAPHICS’!
What better way to reset the race and the counters than 8K graphics?
Again, gamers will have to buy the new $1’700 graphics card to reach 8K at mid settings.
Long comment… my apologies…
Poor coding needs more hardware: software developers working alongside hardware developers to milk the public.
Did anyone seriously think a 2080Ti would run this game at 60fps in 4K??? This is the best-looking game that exists; y’all need to be brought back down to reality.
It’s well optimized and it’s the best-looking game ever; don’t be delusional.
People complain when games don’t use their hardware to its breaking point, and now that a game looks so damn beautiful and lifelike, people are complaining because their hardware can’t handle it. This reminds me of 2007, when the original Crysis first launched. Most people couldn’t play past Medium without massive bouts of lag. It was a few years later before gamers could play it as intended. Red Dead Redemption 2 is the new original Crysis.