As we’ve already stated, the first third-party benchmarks for the NVIDIA GeForce RTX 2080Ti have surfaced; however, I believe we should focus on some triple-A games that cannot run at 60fps in 4K. You see, NVIDIA has advertised this graphics card as one that can offer a 4K/60fps experience on Ultra settings, yet that isn’t the case in numerous games.
The reason we are focusing on 4K/60fps is that NVIDIA advertised the RTX 2080Ti as a graphics card that would be able to deliver such an experience. Now this does not mean that RTX 2080Ti is not an impressive product. It really is. However, we all knew that it wouldn’t be able to run some really demanding games on Ultra settings and at 4K/60fps.
Let’s start with Tom Clancy’s Ghost Recon Wildlands. Ubisoft’s open-world title was one of the most demanding games we’ve tested and, while the NVIDIA GeForce RTX 2080Ti can run it at more than 60fps at 1080p and 1440p, it’s unable to run it at 60fps in 4K on Ultra settings. Basically, we’re looking at an average of 46fps (so the minimum framerate dips even lower than that).

Next up is Kingdom Come: Deliverance. Again, this title was one of the most demanding games we’ve tested and, similarly to Ghost Recon Wildlands, it’s unable to run at 60fps in 4K on Ultra settings on NVIDIA’s latest flagship GPU.
Deus Ex: Mankind Divided is an older triple-A game that is also unable to run at 60fps in 4K on the NVIDIA GeForce RTX 2080Ti.
Monster Hunter World was also running at an average of 49fps in 4K on Ultra settings on this brand new graphics card, though we are not sure whether OC3D used the workaround that – according to reports – significantly improves performance on NVIDIA’s hardware.
Hellblade: Senua’s Sacrifice was also a bit below 60fps (meaning that the minimum framerate was definitely below that number), though we should mention that this game will support DLSS in the future, so we may see a significant performance boost.
Last but not least, and even with DLSS enabled, Final Fantasy XV was unable to run at 60fps in 4K on the NVIDIA GeForce RTX 2080Ti according to Guru3D. We should note that NVIDIA claimed that its new GPU would be able to run the game at 60fps in 4K.
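To put those averages in perspective, here is a quick back-of-the-envelope sketch (Python, purely illustrative, using only the average framerates quoted above) of how far each result falls from the 60fps target in frame-time terms:

```python
# Illustration only: how far the quoted 4K/Ultra averages fall short of 60fps.
# The fps figures are the averages reported above; minimum framerates dip lower still.
TARGET_FPS = 60.0
BUDGET_MS = 1000.0 / TARGET_FPS  # ~16.7 ms per frame for 60fps

averages = {
    "Ghost Recon Wildlands": 46,
    "Monster Hunter World": 49,
}

for game, fps in averages.items():
    frame_time_ms = 1000.0 / fps
    shortfall_pct = (TARGET_FPS - fps) / TARGET_FPS * 100.0
    print(f"{game}: {fps} fps avg -> {frame_time_ms:.1f} ms/frame "
          f"({frame_time_ms - BUDGET_MS:.1f} ms over the 60fps budget, {shortfall_pct:.0f}% short)")
```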

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best consoles. Still, the PC platform won him over. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC Graphics Cards.”

I will give a solid PASS to this RTX series of video cards… Will wait for the next-gen lineup instead.
Not really worth the extra cash, at least in my opinion. The GTX 1080 Ti still seems like a better option, especially if we consider the price/perf ratio.
Same here, but I’m going to have to do something soon. I’ve been wanting to get a 4K monitor for a long while now but my 980 Ti just isn’t going to cut it. 1080 Tis are heading down in price. Maybe I can pick one up when/if they hit $600.
I would like to hold out until next year to see what AMD’s 7nm gaming GPUs look like but I’m getting impatient to upgrade to 4K.
Even 4K 144Hz HDR monitors are expensive, to say the least.
I’ve got a GTX 1070 and a 55″ 4K TV. I get 60 fps on some titles and I don’t see it being worth it to upgrade. Not only is ray tracing not yet viable or worth the money, but not getting the full capabilities of this card is a shame at this price.
I use a similar setup to yours (55″ 4K TV). I built my PC back in 2012 but upgraded my 680 to a 1070 when it first came out, and I’m happy to play 4K with mixed settings, getting 60 fps, or at least higher than 30 fps, in most games. At times I lower my res to 1800p or 1440p at max settings, which still looks good on the 4K TV.
Def happy with running 1440p 60fps on Far Cry 5 maxed. Still more detailed than consoles with their higher res and 30 fps. I’m also thinking I’ll probably do DMC 5 on PC.
I’m in the same boat… just like you. Not really worth the extra cash. I’m gonna keep playing on my 1080 ti till the next VGA gen.
Me too, since my 970 still runs everything at max settings at 30+ fps, some games even at 60 fps!
If you are on a 970 you can upgrade to a 10xx till the 3xxx series or a new AMD card.
?? Because I have a 1440p monitor; if the RTX 2080TI can’t run max settings at 1440p with ray tracing then it will not be worth it. So far it runs everything at 1440p max settings and most games run at 30-35 fps, some games even at 60 fps.
Agreed, this lineup doesn’t make me feel like I am left behind with a 1080ti.
https://uploads.disquscdn.com/images/c0a658ff4e6e81f08fba99bdf4e0e4d9e0a895760be4ee7d66b23661e6dd4f31.png
That’s what I’m doing.
Passing on this entirely and waiting to see just how expensive the next series is. If they go even higher in price with the 3000 series then I’m passing on those as well.
Agreed. Also seeing that Nvidia Marketing’s comment armies are out in force today on a bunch of sites defending this Apple-like business model to no end.
+1. Same here.
“The GTX 1080 Ti still seems like a better option”
I would rather buy an RTX 2080 instead of a GTX 1080 Ti. It has a bit better performance, support for new technologies, and better support for older ones (not only RT and DLSS but also DX12 and Vulkan).
it is 1 percent faster 😀 and 200 bucks more expensive
It is 5-15% faster. It depends on the game. It has better support for DX12 and Vulkan, and you can use RT and DLSS. But if you are not a fan of those technologies, then you can go with an older GPU.
Not even close… check Hardware Unboxed: 1 percent on average, and it loses to the 1080ti in some games.. sorry m8, it is what it is
It’s a new arch, lots of time to optimize. Also, Pascal and Maxwell were very close in design so driver work is topped out there, not on the 2080/2080ti. Let’s have this convo again in 6 months.
Bullshit, mr fanboy. When Pascal came out it was much better than Maxwell, end of it.
https://uploads.disquscdn.com/images/2632068298b2446c0326a240bac51ee8bc203f830beda276ec59b6ecab18760b.png
hah joke
DLSS is on a per-game basis, not every game will have it, so that is a dud. As for RT, LOL, yeah, 1080p/60 in 2019.. no thank you m8
When the first DX10, DX11 or DX12 GPUs came out, did you not buy them because of the lack of games supporting those APIs? This is the same story. There are 30 games which will support DLSS and/or RTX in the next 2 years (I think others will be announced too). How many games supported DX12 after 2 years? How many games supported DX10 or DX11 in the first 2 years after the first GPUs with their support were released? If you don’t want RT or DLSS, it is obviously not for you. You can skip this generation and wait for another.
By the time DX12, Vulkan and above all DLSS and RT are widely used, your card will be obsolete by far.. so paying 50% more for a bit more performance is so dumb
That’s valid for all of today’s GPUs with DX12 and Vulkan support. Are people who bought them dumb? If I jump from a GTX 1080 to an RTX 2080 Ti, I get a 90% performance boost in 4K. Is it nothing? Of course the jump from a GTX 1080 to an RTX 2080 or from a GTX 1080 Ti to an RTX 2080 Ti is not good enough from a performance point of view. But that’s not the only possible combination.
Yes, it’s dumb because price per performance is very bad… you have some 1080 Tis at 700 dollars, so why pay 500 more for at most 30% more performance going from a 1080 ti to a 2080 ti?
I have a GTX 1080. So if I buy an RTX 2080 Ti, it would be a huge performance jump. More than 80% in classic gaming and 2x the performance in VR. Plus I would get better DX12 and Vulkan support and support for new tech like RT and DLSS. For me it’s good enough even when the GPU is overpriced. If it’s not for you then do not buy it. None of this is dumb. Price per performance is not the only valid parameter. Maybe for you, but not for everybody.
MHW, Kingdom Come, and Mankind Divided are horribly optimized. Not sure about others though
Kingdom Come still bad? I haven’t tried it for 6 months.
Bad in terms of what, performance? It’s still demanding, though it looks great even on lower settings.
As for bugs and other stuff, it’s practically fixed now.
They ironed out over 500 bugs & glitches since the launch of the game.
Though I’d wait if I were you; some more content + updates are coming to the game in the first quarter of 2019.
KCD is more graphically advanced than most games. It’s also got more simulation going on than most games.
Also, it uses voxelized global illumination (SVOTI), which can be demanding.
Yeah, it uses SVOGI (Sparse Voxel Octree Global Illumination).
“graphically advanced”
Except for its own downgrades, especially to plant and forest assets, as well as building texture detail.
Won’t change the fact that it’s still more graphically advanced than most games, like I said…
Except it really isn’t, but nice try…
Sure, it’s because of the game, never because of the card.
How disappointing, but this is Nvidia, as shady as they can get. We need competition, because with this type of bull crap it’s clear competition is on life support. PC gaming might end up out of reach, which is sad.
There’s a strong point here that’s easy to make. The price is what makes people actually hate Nvidia/the RTX series. I don’t believe anyone is disregarding the fact that this GPU is a BEAST, nothing less. I am looking to buy the FTW3 Hybrid version from EVGA when it comes out. 5-10 years back I would always get the fastest GPU around, price didn’t matter; I got the GTX 690/Titan Xm etc. Now with that price tag it makes people think twice, like… at $2000 you can get a complete “average” system.
Price tag is the problem, nothing else.
Worst Nvidia lunch ever
What is on the menu?
Well, of course it can’t. It depends on the game and how demanding it is.
This ‘4k 60fps at ultra settings’ standard needs to stop. Judge a game by what it does in the graphics department vs performance. Judge a graphics card by performance under reasonable settings + resolutions, performance increase over the previous gen, and the price.
The problem is that some ultra settings still depend more on the CPU than the GPU, like rendering range and scene complexity.
53 FPS is VERY close to 60 FPS, especially in a demanding game like KCD,
so it gets a pass from me.
In The Witcher 3 it gets a whopping 80 FPS in 4K, which is above and beyond the 60 FPS gold standard.
The Very High setting has MSAA at 2x if memory serves;
the thing that absolutely kills the FPS in this game is the draw distance.
If you go above 50% draw distance (which is the default in the Very High setting), the game will lag like a b*tch.
Personally, I tune down all the useless effects (bloom, blur etc…)
and reduce shadows to Medium or just High;
that will give you a huge FPS boost if you don’t mind the slight lack of detail.
I personally run the game at 3440x1440 on the High setting with textures set to Very High and everything else on High (draw distance at 50%).
I get about 40 FPS on average with a GTX 980TI, but with a little OC I could get it to 50 if I wanted to (at the cost of a VERY noisy fan and a chance of an instability crash).
So if I take this card for 3440x1440 instead of 4K, I could definitely crank everything to 11 and still get buttery smooth 60 FPS.
The only problem is the huge battles; they’ll kill your FPS no matter what rig you use.
Even at 4K you still need basic AA;
I’d say just knock MSAA down to 1x and call it a day.
And as for the article, this isn’t about “optimization”, this is about 60 FPS at Ultra (or at least the Very High setting, cause this game is INSANELY demanding at Ultra)
and the bang for the buck you get from that 2080TI.
Though I’ll be honest, $1200 is a f**king horrible price, especially with taxes and import fees for retail stores (a warranty for such an expensive card is a f**king must),
and if you want an aftermarket OC version, that’s gonna be at least another $100-200.
Though like I said, give it time.
there is nothing to play anyway…
Sure, but Witcher 3 was downgraded a lot.
There’s a reason it runs so well. Part of it I’m sure is good optimization and whatnot. Part of it is that it was lacking in the graphics department.
Remember that the vegetation in KCD was also downgraded from the beta,
among a few other non-crucial details (water reflections and such).
Supposedly there is a way to restore these old settings via console commands;
The Witcher, however, requires you to use mods to get rid of the internal downgrades…
no mods will get rid of the downgrades. The lighting model itself was noticeably downgraded, as well as things like Screen Space Reflections on certain surfaces etc.
Judging it by graphics alone is stupid since those games were designed around consoles, not a $1200 GPU, so that idea goes right out the window, and performance so far is.
Judging it by going to 4K medium-to-low settings just makes the argument against spending $1200 all the stronger.
“Judging it by graphics alone is stupid since those games were designed around consoles, not a $1200 GPU”
So what? Different games offer different options on PC. Some of them offer huge improvements on PC. Some don’t.
Judging it by an arbitrary standard is stupid.
Why should a $1200 GPU run every game at 60fps 4K? Where did this standard come from? PC gamers can’t have their cake and eat it too. If you jump the framerate by 2x and the resolution by 4x over the standard console game, why do you expect any graphics card to run a demanding game, especially with every single additional graphics option ticked?
And why ‘ultra’ settings? Different games will have varying levels of graphical fidelity for their highest options.
“So what”, so it’s a complete waste to show off a console-designed game which doesn’t come with 4K assets. It’s like trying to show off a game like Human Revolution, a game from back in 2011, which is dated by today’s standards, and even then it wouldn’t be showing off much. Trying to show off a game like Creed Origins wouldn’t help either, as it has been confirmed that the game is sporting the same texture assets as consoles, meaning the graphical part is useless to judge with a $1200 card.
The only options you get are to tweak settings from low to high, as well as resolution and AA options. You shouldn’t even be forcing 8x MSAA at 4K, for example. DLSS is the only tech with that card right now that actually makes sense.
I find it a bit crazy that you think it’s fine for a very, very expensive GPU that touts 4k 60 to not do 4k 60.
It’s not a 1440p card, and it was never advertised or touted as one either, and if you called it a 1080p card at that price you’d be laughed out the door, so that leaves only 4K, which even the consoles are trying to aim for.
Actually yes, PC gamers can have their cake and eat it too; that’s why we have options, that’s why we have upgrades in hardware, while consoles stay stuck with mid-range hardware for 8 years. Why do you think games are designed around consoles these days?
You don’t even come off as a knowledgeable PC gamer, but more like one who either got into it recently, or is a console gamer. There are plenty of tech experts out there who know far more than you do, let alone having the right mind to judge the tech on its merits and what it advertises.
1. Why are ‘4k assets’ the only thing that matters? Most games allow you to increase a wide variety of things, from shadow quality, reflections, draw distance, object detail, volumetrics quality, terrain and geometry quality, etc. There are many games in which a maxed-out PC version is far beyond any console version graphically. This is true especially in open world games, where console versions have to be limited greatly. Battlefield on consoles completely lacks Screen Space Reflections, for example, and this is a game that runs generally well. There are plenty of games where the PC version has a much bigger boost in quality.
You’re dismissing how good different games look simply because they don’t have ‘4k assets’. What is the purpose of that? Textures aren’t the only things that matter in games. They’re one part, but by no means the only part and certainly not the most important.
2. “I find it a bit crazy that you think it’s fine for a very, very expensive GPU that touts 4k 60 to not do 4k 60.”
Touts 4k 60 on which game? And on what settings? It will do 4k 60 fps on some games, but obviously not on others, at least not on the maximum available settings. Different games vary wildly in the graphics department. Some games don’t offer PC many extra options. Some of them offer the PC significant quality increases that tank the frame-rate if you turn them up.
So it only makes sense that it won’t get 4k 60 fps on every single game, especially when some games offer more extra options to push on PC, and your comparison is based on ‘ultra’, which means the maximum available options. This is why you don’t judge based on 4k 60fps at Ultra. You judge based on what it offers in the graphics department at a particular graphics setting. And once again, you don’t get to dismiss everything because a game doesn’t have 4k textures (you don’t even need 4k textures in many games to have very high detail surfaces, e.g. Star Citizen).
3. “Actually yes, PC gamers can have their cake and eat it too, that’s why we have options, that’s why we have upgrades in hw”
No, you can’t. Do you think it is a coincidence that the most demanding games that push more in terms of graphical features are the ones that don’t run at 4k 60fps? Top of the line GPUs are much more powerful than consoles, but how much more? Enough for some games to run at ultra 4k 60 fps (but these are generally the games that don’t offer extra demanding graphical options on PC). Obviously not enough for others.
Plenty of games are too demanding to run at 4k 60fps ultra when you turn up settings, and you can’t get around that unless you want those developers to downgrade their graphics options. These games generally tend to be the open world ones that allow you to vastly increase draw distance, shadow quality, density, even GI (in more recent games).
“most demanding games”
I’m going to stop you right there mate.
Those games are designed for consoles first and PC last. Do you really think they are all extremely well polished and well optimized games? Because if the answer is “yes, objectively” then I’ve nothing more to discuss.
The giant wall of text certainly does not help either.
You’re just dismissing everything by making more generalizing statements, and by saying “oh that wall of text doesn’t change anything”
I should also just say “your comment doesn’t change anything”, but that wouldn’t make any sense as a response.
“Those games are designed for consoles first and PC last. Do you really think they are all extremely well polished and well optimized games?”
No, but some of them are more optimized than others, and some are definitely more demanding than others, especially at ‘ultra’, because some games offer more demanding options on the PC. This is what I’ve been saying from the beginning, maybe you aren’t understanding? You haven’t addressed this even once.
The fact that they are primarily designed around consoles first changes nothing here. It won’t change the fact that different games still have different options on PC and are demanding to different extents.
Therefore, your ‘one size fits all’ 4k 60fps at Ultra standard does not make sense. You would expect some to run at 4k 60fps and others (which are more demanding depending on the options they offer at higher settings) to run below that.
And no, ‘but it was made for consoles first’ isn’t going to change this.
That’s true, it is quite the shame. People feed on negativity, finding and criticising all that is wrong instead of focusing on what’s good and right, let alone the advancements in technologies.
Because they weren’t paid by Nvidia to suck their dicks.
Who’d get these cards? Streamers, youtubers, reviewers and media in general, autistic kids who have rich parents, and those who have a salary of $300-500 per month but are autistic as hell and don’t have any friends or outside activities, so they’re OK with renting a place and paying for top-end gaming equipment.
Oh, I nearly forgot virgin miners, but it’s unlikely.
I work at a coal mine and I get more teenage underage poontang than you can ever hope to get.
I think you got him wrong. ROFL
no he doesn’t i like both goats and underage girls like a true muslim.
whoosh.
it’s fine, these days we have camel breeding farm and i do my thing with underage camels when i’m tired of my goat wife or adult camel wife.
A ~750mm² die can’t run those games in 4K? Wonder why?
Bloat the price, gimp the performance. No thanks, Nvidia.
REEEEEEEEEEEEEEEEEEEEEEEEEEEETX 2080TI
Someone didn’t even bother reading the article.
“Now this does not mean that RTX 2080Ti is not an impressive product. It really is. However, we all knew that it wouldn’t be able to run some really demanding games on Ultra settings and at 4K/60fps.”
The GPU can’t run numerous triple-A games at 4K/60fps on Ultra settings. That’s not hyperbolic. That’s a fact. I don’t see any sensationalist title either. It only states the facts. It’s not like “The NVIDIA GeForce RTX 2080Ti is a big disappointment” or anything. The RTX 2080Ti is an amazing product, we even said it ourselves (that, however, comes at a premium price; the GTX 1080Ti had a similar 30% performance jump over its predecessor and was $300 cheaper at launch). HOWEVER, it does not offer what NVIDIA advertised (unless of course we are cherry-picking results).
You posted 4 games. “Numerous AAA titles”
You remain as embarrassing as ever, I see. Cherry-picking 4 outliers, then making a news article about how the new card can’t run numerous games at 60 frames. You don’t even try anymore to at least sound like you have an ounce of an idea about what you’re talking about.
Agree. 4K is extremely demanding and there will always be some games that will not run at 4K 60fps totally maxed out. For example, the new Tomb Raider runs at 59 fps average maxed out, but with high settings the game looks almost identical and runs around 80fps. But what’s interesting is that the 2080ti offers performance in 4K like the 1080ti did in 1440p, and the 980ti in 1080p. Well, price aside, the 2080ti is a real performance killer. In current games it’s 30% up to 50% (Wolfenstein 2: 64 fps on the 1080ti vs 103 on the 2080ti, and 111fps with OC), and as Digital Foundry have said, another 20% when developers use the new shading features, so not long from now new games will show 50-70% more performance in favor of Turing. And let’s not forget about DLSS; that thing alone can add 70%-100%. So it will be around 150% performance improvement in total. I have seen 4K TAA vs 4K DLSS comparison screenshots from Final Fantasy and the picture quality was even better with 4K DLSS. More sharpness and less aliasing. And of course there’s also RTX performance. Some sites have also benchmarked the 1080ti in that Star Wars RTX tech demo and it was a 9 fps vs 45 fps difference in 4K.
So far John has written only about the negatives, but price aside, the new Turing GPUs are really revolutionary.
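For what it’s worth, the uplift percentages being argued about in this thread are easy to sanity-check. A minimal sketch (Python, purely illustrative; the fps figures are the ones quoted in the comment above and are not independently verified):

```python
def uplift(old_fps: float, new_fps: float) -> float:
    """Percentage increase when going from old_fps to new_fps."""
    return (new_fps / old_fps - 1.0) * 100.0

# Wolfenstein II 4K figures quoted above (commenter's numbers, not verified).
print(f"1080 Ti (64 fps) -> 2080 Ti (103 fps): {uplift(64, 103):.0f}%")     # ~61%
print(f"1080 Ti (64 fps) -> 2080 Ti OC (111 fps): {uplift(64, 111):.0f}%")  # ~73%
```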
Almost 60 FPS at the Very High setting in 4K is pretty damn impressive!
(Especially considering it jumped from 37 FPS on the 1080TI to 53 FPS, a whopping ~43% increase.)
I’d want some LIVE benchmarks in crowded areas, especially battles, to see how it holds up,
but I’m definitely tempted to grab one.
The biggest downside (not card related) is the lack of GAMES to play with it.
Sure, I can revisit KCD for the DLCs, but I already finished it 3 times,
and as for the upcoming games, aside from Metro (which looks like it got R A P E D by Far Cry 5’s dumb$hit A.I),
I can’t really see how it’s worth the investment.
All of 2019’s lineups are SJW garbage and lame remasters/ports (much like every year).
I MIGHT grab this for Bannerlord in 2020 however; maybe the price by then will be knocked down to $1000.
You should never believe Nvidia.
https://uploads.disquscdn.com/images/1ce20003395ab811ed52624d471f667f058375c4cc2cd82e1d39881eef603dba.png
Dude we’ve seen this like 9999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999 times already. Just stop posting this jesus christ. We all know that’s expensive.
WE GET IT.
Their marketing mistook us for Apple customers.
Passing on it entirely until the 3000 series, and they had better have it sorted by then, both price and performance, otherwise I’m just going to wait for AMD in the future. I’m content with my 1080ti, so none of that fanboyish crap like “gl waiting on AMD, you’re gonna have to go Nvidia lololol”.
Curious what NVLink SLI can offer. This is a different beast from what we know of multi-GPU tech altogether. GR: Wildlands and Deus Ex: MD support a form of SLI already, the former scaling really well. I’d like to see the results with AA minimized or off.
OK, soooo sad, cause it’s too expensive and not so fast compared to my SLI of 1080s.
Apple likes customers like you.
Stupidest comment you could make; I loathe Apple… Apple’s policy = make beautiful, well-made tools that are far overpriced for what they can technically do; an SLI of 1080s = expensive but the fastest thing you can buy, so it’s a stupid comparison… and don’t play the “oh it was humour bla bla bla” card, I don’t believe that bullshit, thx.
The reviews are out for both cards, and the results are unanimous: they are expensive and underwhelming.
And why exactly do you assume it would, with such an arbitrary thing as “Ultra” that could mean anything? Talk about a clickbait title.
It generally means the highest settings. However, the features those may include vary on a game by game basis.
I want to know what AA is being used. 8xMSAA is idiotic at 4K.
If anyone ever thought one 2080ti was ever gonna properly run AAA games at 60fps 4K, then I really got news for ya, kiddos.
Why not focus on what it can run, and the simple fact that it offers 20-30 more fps in games at 4K compared to the 1080 Ti? The new cards are a huge leap in performance, but unfortunately in price as well. Negativity sells though, and understandably you’re after the clicks.
Huge leap? Dude, the 2080 is 1 percent over the 1080ti on avg and the 2080ti is 25-30 percent.. where the hell is that yugeee leap?
It’s 40-50 percent faster and that is a huge leap. There is a huge advancement in technologies that move the industry forward. If you don’t care for it don’t buy it. The cards just came out, there are still optimized drivers coming and developers need to optimize for the new cores. Again if you don’t want it don’t buy it. There is absolutely no need though for such negativity at this point.
Lol, what??? Lol, no, the 2080ti is not 40-50 percent over the 1080 Ti, LMFAO. It ranges from 15 percent (worst case) to 31 percent on average, up to 50 percent in, ironically, TWO AMD Vulkan-sponsored games.. The cards are optimized already, and who the hell would buy the 2080ti over the 1080ti for 700 to 1000 euro more? Turing is an outright disappointment, and it is a DO NOT BUY from 99 percent of the tech press, either written or YouTube.. It is not negativity, it’s just facts.
Drop settings? Are you serious? You pay $1200 for a GPU to max out all games, and not to compromise on some graphics settings, no matter how demanding the game engine might be.
Everyone who pays 1000+ USD for a card should expect their GPU to max out all games, and not just give a playable gaming experience by reducing some graphics settings, which you seem to claim.
Makes little sense.. because even after spending $1200 on a card, if it can’t max out the game, then it’s really not worth the upgrade, given how expensive these cards are to start with.
People expect the PS5 to do native 4K 60fps when the 2080ti cannot achieve that on every game. They need to get their expectations in check for what a $400 price point can offer.
It’ll probably be able to do 4k 60 with medium settings, but mostly 4k/30.
By the time it releases, a high-end PC (with a high-end CPU, RAM and graphics card of that time) will be at least 10 times more powerful than it, probably even more! Just like when the Xbox One X came out, or the PS4 Pro, or even the PS3, PS2 etc. PCs are wayyy ahead!
Hey man, you’re preaching to the choir here
Why? With how fast PCs are becoming more and more powerful, this will happen sooner or later. When Intel starts releasing their graphics cards we will see a much faster advancement in PC graphics compared to now, when it is Nvidia and AMD only.
BUT the RTX 2080TI runs everything maxed at 4K and most games are at 60 fps, and the few that don’t run at 60 fps run at 50+, so it is OK. My GTX 970 runs most current games at max settings at 30-35 fps, however they are fully playable; that’s why I have kept it for so long, because it still runs max settings at 1440p even if most games run at 30-35 fps instead of 60. We don’t need 60 fps. 30+ is what you need for games to be playable!
https://uploads.disquscdn.com/images/b06a506fe331f6dae1342a3d8ecf379a72834f8c7f14ef18c4c572c8cb1a7c49.jpg
THIS CAT IS FAT WITH THE LIES OF NVIDIA! OF COURSE THIS ENDS UP AS A MASSIVE WHIFF IN THE BREEZE!
Damn! Just when I stockpiled $2000 for a new card! Ah well, skipping that until they come up with something that works at 16K at 100fps; sticking with my 1050 ti until then!
Me too! Until they release a card that can run 16K with 16K textures and graphics that look like real life, there’s no point to upgrade. My 970 still runs everything maxed at 1440p at 30+ fps!
Never buy a new video card at launch. Never. There are no optimized drivers yet and it takes time. Not worth forking out that dough up front. Then again, if you already have components that are capable, then you probably have the money to do whatever you want.
The 1070 Ti is still good enough for gaming at 1080p 60fps. The RTX cards are a waste unless you’re super into 4K@60fps.
Why do I get the feeling that when they say max they also mean let’s jack up the MSAA as well? At 4K that’s just wasted resources. Just saying, if that’s the case.
Jesus John, the cards just released. Let drivers mature. Also, you didn’t take into consideration that those ran on stock clocks, which sit around 1600/14000MHz; this card easily does 2000-2100/16000, outputting about 5-10 extra FPS in 4K.
PS, Kingdom Come is unoptimized at the highest settings because at 4K the in-house global illumination system (SVOTI) of CryEngine is extremely taxing.
Overpriced card with meh results and a new gimmick, in a s**ty industry where no AAA game of the past few years is even worth playing.
Fake news!
…. AND WHEN WE THINK these graphics cards could be outputting 50, 100, even 200% more processing power, IF ONLY NVIDIA allowed it…
Let’s look at a Quadro card. It can cost $5,000+, with huge Tflops figures, tons of memory, pretty much the same components as a ‘normal’ card… BUT, if a guy tries to play a game with it, he might get 10fps at low settings… WHEN, with such hardware, he could be having 3 or 4 times more processing power than, say, a 1080ti!
BUT NO. They limit/lock the card at the driver level, and probably also add 2 or 3 tiny components.. in case a genius coder would write a driver to unleash the beast.
Obviously, NVIDIA doesn’t want gamers to buy ONE super powerful card every 5 years; they’d prefer they buy FIVE cards in 5 years.
Imagine if this new 2080 card could run ALL titles at 4K 120fps ULTRA SETTINGS: yeah, you see what I mean… that gamer would have a card powerful enough UNTIL 2020 or 2021!
DON’T you find it WEIRD that these new cards ARE ALWAYS just powerful ENOUGH to run the games released 6-12 months ago…? Strange, huh…?
Let’s say an Nvidia card has 5 major components.
Let’s say for component #1 they use 4,000 units…
WHY couldn’t Nvidia SIMPLY use 5,000 instead of 4,000…?
I mean, they JUST need to add more components; in the end, the card might be 10 degrees hotter… might cost 300 or 400 extra bucks… BUT…
… when a gamer buys one, he will be able to run 6-12 month old games, BUT ALSO the games that will be released in 1 or 2 years!
That way, he might pay a bit more, but he will be able to keep the same card 1 or 2 extra years!
WHICH, OF COURSE, Nvidia doesn’t want him to do!
Nvidia will make that card just powerful enough to run 1-year-old games… OBVIOUSLY, in 6 or 12 months from now, that card won’t be powerful enough to run the latest games….
… and the BUYING CYCLE kicks IN: the gamer will need to buy the 3080 TI!
What I mean is Nvidia could make their top-end gaming cards MUCH more powerful… more memory… more processing units… more everything… and even if those cards would cost 1500 bucks instead of 1000, the gamer might be able to USE that card 1 or 2 extra years, while still being able to run the latest games at ultra settings.
BUT NO. Nvidia ALWAYS releases new cards just powerful enough for the already released games, SO that even the most powerful card basically becomes obsolete / not powerful enough a few months later, and the gamer goes and BUYS another ‘best’ card…
THEY could do much more, much better, much more powerful… BUT NO. THEY make it SO the gamer needs to buy A NEW CARD, every single year.
So YES, Nvidia locks/limits their cards ON PURPOSE. They don’t want gamers to keep the same card 3 or 4 years. They want them to buy a NEW ONE every single year.
Think about that, for 1 second…
More fanboy bullshit; sorry, but a 1400-dollar card not doing 60 fps at 4K is a no no no no no.
I’m the idiot, you moron?