Shadow of the Tomb Raider releases in a few days and, from the looks of it, it will be a demanding title in 4K. Insomnia’s Spartan117 has shared an image from the built-in benchmark in which the NVIDIA GeForce GTX 1080Ti runs the game with an average of 47fps and a minimum of 40fps in 4K on Ultra settings.
NVIDIA has stated that its upcoming graphics card, the NVIDIA GeForce RTX 2080Ti, is around 25-35% faster than the GeForce GTX 1080Ti. This basically means that in this GPU-bound title, the NVIDIA RTX 2080Ti should run the game with a minimum of 50fps and an average of 59fps in the worst-case scenario (25%), and with a minimum of 54fps and an average of 63fps in the best-case scenario (35%).
However, we also know that Shadow of the Tomb Raider will support Deep Learning Super Sampling. DLSS is a new technique that applies deep learning and AI to rendering techniques, resulting in crisp, smooth edges on rendered objects in games. According to NVIDIA’s own graph, the RTX graphics cards are around 75% faster when DLSS is enabled than their previous generation counterparts. This means that the NVIDIA GeForce RTX 2080Ti should, theoretically, be offering a minimum of 70fps and an average of 82fps.
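For anyone who wants to sanity-check the arithmetic, here is a quick back-of-the-envelope sketch in Python. The 1080Ti baseline figures come from the benchmark image above, while the speed-up factors are NVIDIA’s own claims rather than measured results:

```python
# Back-of-the-envelope projection from the leaked GTX 1080Ti figures.
# The speed-up factors are NVIDIA's own claims, not measured results.
baseline = {"min": 40, "avg": 47}  # GTX 1080Ti, 4K, Ultra settings

for label, speedup in [("RTX 2080Ti, worst case", 1.25),
                       ("RTX 2080Ti, best case", 1.35),
                       ("RTX 2080Ti + DLSS", 1.75)]:
    projected = {key: round(fps * speedup) for key, fps in baseline.items()}
    print(f"{label}: ~{projected['min']}fps min / ~{projected['avg']}fps avg")
```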
We won’t be certain about these figures until we test Shadow of the Tomb Raider ourselves with an NVIDIA GeForce RTX 2080Ti. For what it’s worth, we’ve already purchased one and we’re patiently awaiting it (though it will most likely not arrive on launch day).
Still, if DLSS works as intended, it should provide a smooth gaming experience in 4K. While in theory the NVIDIA GeForce RTX 2080Ti will be able to run Shadow of the Tomb Raider with an average of 60fps in native 4K, we are almost certain that a lot of RTX 2080Ti owners will prefer using DLSS in order to maintain higher framerates.
On the other hand, and as we’ve already stated, this new graphics card currently has a lot of trouble with real-time ray tracing. At Gamescom 2018, Shadow of the Tomb Raider ran at 40fps on the RTX 2080Ti most of the time. EIDOS Montreal and Nixxes have claimed that they will further optimize the ray tracing effects, so hopefully the final product will run better.
Stay tuned for more!

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers SNES to be one of the best consoles. Still, the PC platform eventually won him over. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC graphics cards.”


I will quote some tech guy about DLSS:
(It’s not my statement; I’ll form my own opinion after tests and comparisons emerge)
“DLSS is a myth, just like their “upsampling” technology or whatever the hell it was called. How the hell do you take a 4k image and “downscale” it to 1080p, when 1080p literally doesn’t have enough pixels to fit a 4k image in it? That’s like saying a 4k image will look better on a 2×2 pixel monitor than a 2k image. How? You can’t fit the extra information in only 4 pixels.”
Another AMDumb who knows everything without even having the card.
As for “upsampling”, DSR and AMD VSR are the same thing, but somehow he only attacks Nvidia.
It’s based on the resolution that you sample at before applying a filter. Then the image is downscaled to the desired resolution. The best advice I can give you is don’t listen to ret@rds.
DLSS is very different in what it does.
One key part you’re missing here is that when downsampling, the resolution isn’t just evenly divided down with the pixels displayed exactly as they would have been at the native resolution. There is a trilinear algorithm that averages the color information across the extra pixels provided by the higher resolution and then uses that averaged color in the downsampled pixel displayed natively.
What you end up with is significantly more accurate anti-aliasing, as opposed to the blending algorithms of post effects like FXAA or SMAA.
The other part you’re missing is that DLSS isn’t traditional downsampling. It’s actually rendering your game at a lower-than-native resolution (which seriously lowers the performance hit of the game), and then the Tensor cores are capable of upscaling the lower-res image to a higher-res image using AI in real time. That gives you a game that looks upscaled beyond native, but it’s actually running at roughly half your native resolution. So it’s an image quality improvement PLUS a big performance improvement over native resolution.
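To make the downsampling point above concrete, here is a rough sketch of the idea, assuming the simplest possible filter (a 2×2 box average; actual DSR/VSR implementations use fancier filters):

```python
# Rough sketch of downsampling with a 2x2 box filter (think 4K -> 1080p).
# Real implementations (DSR/VSR) use more sophisticated filters.
import numpy as np

def downsample_2x(image: np.ndarray) -> np.ndarray:
    """Average each 2x2 block of pixels into one output pixel."""
    h, w, c = image.shape
    blocks = image.reshape(h // 2, 2, w // 2, 2, c)
    return blocks.mean(axis=(1, 3))

# A hard black/white edge rendered at "high res"...
hi_res = np.zeros((4, 4, 3))
hi_res[:, 2:] = 1.0
hi_res[1, 1] = 1.0  # one extra lit sample along the jagged edge

# ...comes out with intermediate values along the edge after averaging,
# which is exactly the smoother anti-aliased look supersampling gives you.
print(downsample_2x(hi_res)[..., 0])
```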
Heh, that sounds pretty cool, actually.
lol…stop spamming…u haven’t got a clue.
Something else I’d like to add is the distinction between “downsampling” and “upsampling”. The two terms create a lot of confusion and are therefore often used interchangeably, but they are quite different.
Downsampling = Rendering/sampling at a resolution higher than the native resolution, and then using a trilinear filtering algorithm that combines the additional color information gained from higher sample rates and then smooths out the results before scaling the final image back DOWN to the native resolution, thus DOWNsampling.
Upsampling = Rendering/sampling at a resolution less than or equal to the native resolution and then using an algorithm to scale the final image UP to a higher resolution. This algorithm can vary greatly.
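And for the “up” direction, here is the crudest possible example (nearest-neighbor; real upscalers use bilinear or bicubic filtering, and DLSS swaps the filter for a neural network):

```python
# Crudest possible upsampling: nearest-neighbor, which simply repeats
# pixels. Bilinear/bicubic filters blend instead, and DLSS replaces the
# filter with a trained neural network.
import numpy as np

def upsample_nearest_2x(image: np.ndarray) -> np.ndarray:
    """Double the resolution by repeating every pixel as a 2x2 block."""
    return image.repeat(2, axis=0).repeat(2, axis=1)

lo_res = np.array([[0.0, 1.0],
                   [1.0, 0.0]])
print(upsample_nearest_2x(lo_res))
# No new information is created: the output has more pixels but the same
# detail, which is why naive upscaling looks soft or blocky.
```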
DLSS is hardly a “myth”, as according to NVIDIA themselves it exists and uses specific hardware to do what it does. Whether or not it’s worthwhile remains to be seen, but it’s hardly mythical at this point…
And more to the point, DLSS is NEITHER of the two aforementioned rendering techniques. It uses neither upsampling nor downsampling. It uses AI to learn how an image’s details change when it is rendered at higher resolutions, and applies those learned changes to the final image without ever having to actually render it at a higher resolution and “downsample” it. It’s basically guessing at downsampling without actually doing it. This AI is trained by running numerous simulations of a particular game, rendering each image produced in the simulations at a higher resolution, and learning from those renders how the image changes. It is then able to very efficiently and accurately guess at how to improve an image without needing to upscale it.
Think of FXAA/SMAA for a moment… These were born from the application of human intelligence that noticed the artifact of aliasing at lower resolutions, and then noticed how the appearance of these artifacts was reduced when rendered at higher resolutions. Then, they put into a specific algorithm a way to detect the presence of edges and blend color around them to make them less noticeable, simulating the effect of a higher resolution in those places. DLSS is doing exactly this, but with machine intelligence instead of human intelligence, making it much more accurate than FXAA/SMAA/TAA.
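For anyone curious what that kind of training might look like, here is a loose, hypothetical sketch in PyTorch. This is not NVIDIA’s actual pipeline (which is proprietary); it only illustrates the general idea of learning a low-res-to-high-res mapping from rendered image pairs:

```python
# Hypothetical toy illustration of DLSS-style training: learn a mapping
# from low-res frames to high-res "ground truth" frames. Random tensors
# stand in for real rendered frame pairs; NVIDIA's real pipeline differs.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyUpscaler(nn.Module):
    def __init__(self):
        super().__init__()
        self.refine = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1),
        )

    def forward(self, x):
        # Naively upscale first, then let the network refine the guess.
        x = F.interpolate(x, scale_factor=2, mode="bilinear",
                          align_corners=False)
        return x + self.refine(x)

model = TinyUpscaler()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(100):
    hi = torch.rand(8, 3, 64, 64)    # stand-in for high-res renders
    lo = F.avg_pool2d(hi, 2)         # stand-in for the low-res render
    loss = F.l1_loss(model(lo), hi)  # learn to reconstruct the high-res frame
    opt.zero_grad()
    loss.backward()
    opt.step()
```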
Of course it should! More so if you disable or minimize AA at 4K. A reminder, everyone: this is with RTX OFF, and maybe even with DLSS OFF.
https://uploads.disquscdn.com/images/1ce20003395ab811ed52624d471f667f058375c4cc2cd82e1d39881eef603dba.png Nvidia already has a big fail
Shush, peasant.
https://uploads.disquscdn.com/images/efe2427c6ed67749b8540ca618486577148b49050770c74adf0ef5f94d697309.png
LOL
RIP RTX
https://uploads.disquscdn.com/images/870c4b776119aadd95312641bf2c42bfd35fb353e68c75b37ba5e73945d6bfb2.png
The ones who voted “It’s worth it” are the world’s dumbest; they should be stuffed into a rocket and fired off into space lol
Poor kids like you voted “It’s Not”. You can’t afford the best of the best and you cry like a b*tch.
If you’re so rich, do be kind and send me an RTX 2080Ti. I have a GTX 1080, but I still want the best of the best, and I’m obviously poor.
You’re a d1(k.
This guy wants to eliminate people he thinks are dumber than him.
Or perhaps, he just wants to eliminate people that make more money than him…. AHA! I think we’re onto something!
God@m peasants….
https://uploads.disquscdn.com/images/5da71610a4e6217158113544fa33b092c0d72578befa0c757363ac9fb1987c00.jpg
Sadly our category of 2 x 2080 Ti is not on there. Grief befalls me….
I’m buying one as soon as Amazon has stock.
Early adopters of new tech always pay a premium; the image has no context for “1080p@30fps” and fails.
Look at the history of GPUs and realize you’re wrong. But since Nvidia claims this for this generation because they want to sell cards with very little performance info, you just go right ahead. Pfft, common sense!
You didn’t cite any argument to back up what you’re saying, you just said it. We’ve known for a while now that performance isn’t going up as fast as people want, so you have to push new graphics tech along with it. The people who just want double the performance every time, and then claim NVIDIA doesn’t innovate, are just hypocrites.
I hope this calms people down a little bit about the new cards. People are getting way too worried that this card is going to be a useless flop for no good reason.
Everything currently indicates that without ray tracing, you can very reasonably expect near 50% increases in performance vs. previous-gen models. That is REALLY GOOD, everyone. Pascal spoiled us a bit with higher gains last time, but typically the gains from gen to gen are only around 30%.
This gen, it looks like you can expect a 50% increase in traditional performance, PLUS a really neat new technology for ray tracing that you can experiment with if you want to. Not to mention the really cool application of AI to graphics with the Tensor cores, which will, at the very least, bring some very welcome improvements with DLSS in the games that support it. Though I grant you, this can’t exactly be relied upon, as it is up to developers to choose to implement it, or at least have NVIDIA implement it for them, which probably won’t happen for many games.
I agree, the cost increase per model is a bit disappointing, but you really are getting some impressive new technology. If you don’t want it, then buy a Pascal card for a better price. I’m justifying the 2080 Ti’s price by the absence of a Titan, as the Ti cards typically are Titans at half the price for all intents and purposes. Maybe NVIDIA finally realized most people that need Ti performance will still pay Titan price tags if they have no other option, and you can’t blame them as a company for making that decision.
Nobody said it’s gonna be useless.
But the price for what you’re gonna get is too high. Gaming should be more affordable, even high-end gaming; a high-end gaming GPU should not cost 3 times the price of a console.
If you think you’ll ever find me on the level of my serfs and peasants, playing on a console, I’ve got a fat one you can suck on.
Don’t be an immature elitist, you can’t agree to that pricing!
In 2015 I bought my 980Ti. It was almost 2 times more expensive than buying a console ($700-750), and it was the best gaming GPU available. Fast forward to Pascal: the GTX 1080’s launch price was the same as the 980Ti’s, and now the 2080Ti is around $1200. Nvidia is being greedy because some people are sheep who are ready to pay stupid amounts of money for gaming.
What’s next? A 3080 at $1000?
I love gaming, but not at those prices. I’m sorry, but Nvidia is being anti-consumer with these shady practices.
You got trolled.
I agree things have gotten expensive.
I just needed an excuse earlier to ask you to suck me off in a public setting.
Well take me to dinner first at least XD
If you have a professional use for the 2080 series, then you will adore the new cards, otherwise …. ROFLMFAO!
With the 2080Ti, 4K 60fps should be the minimum. Let’s see how the 2080 and 2070 perform. That’s the real question.
Never mind…
You have to admit, that graph was VERY poorly presented. Nothing in that graph is as self-evident as what you just said. Considering the performance increase with/without DLSS is on the same exact axial point, you wouldn’t be crazy for assuming that the performance increase factor is in fact relative to the performance achieved on the same card in the same game before DLSS was turned on.
You really need to reconsider your insecurities before bashing someone else’s intelligence with lots of capital letters and demeaning remarks for something not remotely as obvious as you think it is. You either lack perspective or love the way that putting other people down makes you feel. I don’t really see many alternatives, but if there is one, forgive me for not giving it consideration.
Nvidia clearly doesn’t respect the intelligence of its consumers
They do respect the money in their pockets, though.
Wow. Nvidia sure s**t the bed with this one. Definitely trying an AMD card this time around.
Maybe consider not pricing your cards so stupidly Nvidia. The arrogance is comical.
“may” is a strong word
for a $1200 card
I’ll wait for the Metro benchmark, since I know for sure Metro was designed to be as graphically demanding as possible.
Is that with DX12 on? Then presumably the game was running with VXAO enabled?
After watching the Gamersyde videos and the DF analysis, I have to say it’s the best looking game right now, and of course with RTX shadows it will look even better :). But the requirements are very high; the pchardwarede benchmark suggests that in 1080p the game runs at 50-60fps on a 1070.
List of DLCs added for you if you bought the “Croft Edition”. Big list 😉
https://uploads.disquscdn.com/images/9a6eac72c90b0bb055fe1adf7e11b89df1db90ee544a80b918f9feb6a456b503.jpg