NVIDIA has announced that its new GeForce RTX graphics cards support several advanced shading techniques that will enable developers to further improve performance. One of them, called Adaptive Shading, will debut in Wolfenstein II: The New Colossus via a new patch.
According to the green team, Adaptive Shading, formerly known as Content Adaptive Shading, adjusts the rate at which portions of the screen are shaded, so the GPU has less work to do, which boosts performance.
The GeForce RTX GPUs (RTX 2070, RTX 2080 and RTX 2080 Ti) can measure spatial and temporal color coherence per frame, and in areas where detail remains unchanged from frame to frame (such as skyboxes or walls) the shading rate can be lowered in successive frames. This can improve overall performance in games that support the feature.
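To illustrate the idea, here is a minimal sketch in Python/NumPy of how a renderer might pick a coarser shading rate for tiles that are both low-detail and unchanged since the previous frame. This is purely illustrative and not NVIDIA's actual algorithm; the tile size, luma weights and thresholds below are assumptions.

```python
# Illustrative sketch only -- not NVIDIA's actual Content Adaptive Shading algorithm.
# Picks a per-tile shading rate from spatial (within-frame) and temporal (frame-to-frame) coherence.
import numpy as np

TILE = 16                               # hypothetical tile size in pixels
SPATIAL_T, TEMPORAL_T = 2e-3, 1e-3      # made-up variance / difference thresholds

def luminance(rgb):
    """Rec. 709 luma of an HxWx3 float image in [0, 1]."""
    return rgb @ np.array([0.2126, 0.7152, 0.0722])

def pick_shading_rates(curr_rgb, prev_rgb):
    """Return one shading rate per tile: 1 = shade every pixel, 2 = half rate, 4 = quarter rate."""
    curr, prev = luminance(curr_rgb), luminance(prev_rgb)
    th, tw = curr.shape[0] // TILE, curr.shape[1] // TILE
    # Split both frames into (tiles_y, tiles_x, TILE, TILE) blocks.
    blocks_c = curr[:th * TILE, :tw * TILE].reshape(th, TILE, tw, TILE).swapaxes(1, 2)
    blocks_p = prev[:th * TILE, :tw * TILE].reshape(th, TILE, tw, TILE).swapaxes(1, 2)
    spatial  = blocks_c.var(axis=(2, 3))                       # detail inside each tile
    temporal = np.abs(blocks_c - blocks_p).mean(axis=(2, 3))   # change since the last frame
    rates = np.ones((th, tw), dtype=int)                       # default: full shading rate
    static = temporal < TEMPORAL_T
    flat   = spatial < SPATIAL_T
    rates[static & ~flat] = 2          # unchanged but detailed: shade at half rate
    rates[static & flat]  = 4          # unchanged and flat (sky, walls): shade at quarter rate
    return rates
```

In a real engine this decision would run on the GPU and feed the hardware's variable rate shading support; the sketch only shows the core idea that flat, static regions can safely be shaded at a fraction of the full pixel rate.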
Of course, developers will have to manually implement this feature in their games so that the RTX GPUs can take advantage of it. Moreover, NVIDIA has not provided any figures for the performance boost that PC gamers can expect from this technique.
NVIDIA stated that the first game to support Adaptive Shading will be Wolfenstein II: The New Colossus, and that the patch adding support for it will release later this month!
UPDATE:
MachineGames has just released the patch that adds support for Adaptive Shading in Wolfenstein II: The New Colossus. The patch will be auto-downloaded via Steam, and you can find its complete changelog below.
- Added support for NVIDIA Adaptive Shading on NVIDIA RTX series GPUs. (Improves frame rate by dynamically adjusting the shading resolution in different areas of the screen, without affecting fidelity).
- Ensured that, on multiple GPU systems, the discrete GPU is preferred over an integrated GPU.
- Players can now choose to ignore/suppress warnings when the selected video settings exceed the amount of dedicated VRAM available on the GPU.
- Fixes for skinning issues on GTX 970.

John is the founder and Editor-in-Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best consoles. Still, the PC platform eventually won him over. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on "The Evolution of PC Graphics Cards."
Cool, but why has the RTX 2080 Ti been delisted from the official Nvidia store?
It’s already been debunked…
Clarified how ? What happened ?
I just checked and it’s still on their store. It’s just been out of stock for a while. The EVGA store is out of stock as well. I saw a couple on Amazon but they are price gouging. My guess is that the shortage is due to a combination of yield issues at TSMC and people buying every one that hits the stores even at this absurd price.
The best way to improve performance is for programmers to become more competent and not lazy, not by helping their laziness.
There are a lot of beautiful games that run very well and are well optimized, like BF4, The Phantom Pain & Crysis 3.
This is one of the dumbest things I’ve read all day.
Creating innovative new techniques that everyone in the industry can use to boost performance in their games is not "helping developers' laziness".
It’s just one of the many other ways to boost performance with the goal being the highest performance possible.
The Wolfenstein devs are some of the most talented in the business and they've already done a phenomenal job with Wolfenstein 2.
I really couldn’t care less what the comment section thinks about Wolfenstein 2’s performance. It’s empirically shown that it runs really well for what it looks like.
That’s unfortunate. Performance matters. And it’s directly related to Content Adaptive Shading. But I agree, most of the people on DSOG don’t care much for this stuff. Which is why it’s annoying when they open their mouth and bs comes flying out.
dumbest…really?
innovative?
do you even know what this is
Yes. It’s a way to boost performance by saving the GPU from doing extra work on areas of the screen that have already been shaded and don’t change much.
It’s a good feature that boosts performance for free. Read the article.
By downsampling the quality, like their new anti-aliasing?
The word "adaptive" means $hit here, just like adaptive AA.
Current hardware has already shown that it is capable of running everything at high FPS; no need for cheap tricks.
good luck
Do you know what optimization is? 90% of it is finding cheap approximations that do the same thing with no noticeable difference. Why should a GPU waste resources shading something that has barely changed? This isn't a "cheap trick", it's just a smarter way to use the GPU.
Yeah, they did a great job optimizing it. It's one of the few triple-A games to run at near 4K at 60 FPS.
“Never underestimate the predictability of stupidity”.
Bullet Tooth Tony, Snatch
Very well said sir!
It makes perfect sense to provide devs with tools to assist them with saving performance that can be used elsewhere.
It's like optimisation of the architecture, which in turn gives them more time and money to develop elsewhere, as they have saved in this area.
LOL on Crysis 3 btw, it had horrible cpu utilisation and optimisation.
And what about AMD users? Can they get that too?
Not yet, this is exclusive to the NVIDIA Turing architecture from what I’ve gathered so far
No. Not even Nvidia users can get it unless they have the 2000 series
Yep, it’s an unfortunate choice of game for the unveiling of said tech because anything SJW-related should be kept in permanent shade!
I haven’t seen a steeper drop in quality between games in a very long time. It’s like someone kidnapped the devs of The New Order, replaced them with pod people and had those new devs make The New Colossus.
Unfortunately, I think the devs had to rein themselves in for TNO, actually make a good game, but after it and The Old Blood did really well, they got stricken with George Lucas syndrome, got too big for their britches and decided to show their true colors and make the game they wanted to make: an awful 6-hour interactive movie exclusive to the NPR crowd and it ended up dying on delivery, rightfully so.
https://uploads.disquscdn.com/images/f6c0359d34c4a0300b8b0784f65578b160c98a596b4d425a5487c2299f7298c3.jpg
Ray tracing was always going to bring a phenomenal hit; anyone who thought otherwise is a numpty and clearly knows nothing about these processes.
Secondly, RTX is the brand of the GPU, not Nvidia's term for the ray tracing feature.
RTX can refer to DLSS, raytracing or Adaptive Shading.
Don’t waste your breath, they are salty it’s expensive, and act like the first generation of a new, cutting edge technology should be the same price as the old one that’s been around for decades.
I think we all agree the cards are ridiculously priced, and I don't think ray tracing is really ready for the masses at all.
But it's stupid to disregard the things it's bringing to the table and the things that will positively impact the future of gaming.
phenomenal hit
Haaaaaaaaa
Yeah Go buy a RTX 2080 Turn on gaytracing an enjoy the phenomenal FPS.
Secondly . Its a gimic. No one wanted this and no one need this
Are you a child? Or maybe just special needs?
either way, you are incapable of having adult conversation in sensible, rational manner it seems.
IF, I wanted to, I would buy a 2080ti, and not the 2080, as it's the only GPU actually worth buying for raytracing imo.
Secondly, as I've stated multiple times, I have less than zero interest in these cards, as I don't believe it's worth the hit, it requires me buying a new 1080p monitor with gsync and HDR, and finally, the prices are insane.
I see it as amazing technologically, but ultimately very premature, as the industry and hardware are not ready to make this mass market.
However, I can admire it technologically and also appreciate these other features aside from Raytracing.
I’d certainly love to have this feature and DLSS on my 1080ti, as I would enjoy better performance in games on my 4k monitor.
Grow up
“, you are incapable of having adult conversation in sensible, rational manner it seems”
Yeah cuz the RTX series is a joke And anyone supporting it have more money then sence …. You got something expensive and you have to defendit it cuz its natural to defent your purchuse. . …
“IF, I wanted to, I would buy a 2080ti,”
No you dont.
“the prices are insane.”
But but GSYNC AND HDR and 4k Are the future.!
“performance in games on my 4k monitor.”
You have money for 1080ti and 4k monitor? Wow taking abou useless HW ….. . You paid so mutch for 60fps .. . .
“Grow up”
I did Thats why i am happy with i 5 3570k and 970s
1080p 60hz and 60fps is sweet and cheap.. Just like u mum
“IF, I wanted to, I would buy a 2080ti,”
No you dont.
That’s not even English you clown lol
“You have money for 1080ti and 4k monitor? Wow taking abou useless HW ….. . You paid so mutch for 60fps ..”
Again, English is clearly not your strong point.
But, I’ll play along.
Yes, I bought a 4k monitor and and a 1080ti, so what?
How are those low res textures you have on your 970? 60fps is not even guaranteed on a 970 these days unless you are running on medium settings.
Medium settings…mmmmmm
“sweet and cheap.. Just like u mum”
Better than rabid and infected with the clap, like yours.
🙂
English is my 3rd language what i leard home not in school. So it you wanna defend the GAYTRACING pls use a better insult than my english. . .
My rig can run farcry 5 max settings 4k 30+ fps
And i can run what ever on max with AA disabled cuz i dont wanna tank my fps…..
“4k monitor and and a 1080ti, ”
Error 404 english not found
English is my 3rd language what i leard home not in school.
It shows.
“My rig can run farcry 5 max settings 4k 30+ fps”
Lies
https://www.youtube.com/watch?v=xaxtp1kQ3bY
“And i can run what ever on max with AA disabled ”
More lies.
Various games need more VRAM than you have available on your 970 to run the highest-res textures.
Go back to school, and not just for your English
https://uploads.disquscdn.com/images/144f39b3a127549d68bc4e7474deb9d273d897661a59c91b854a5ef50f7f5d80.png https://uploads.disquscdn.com/images/8d7991ce1c091ed40cf4a4be1953f42e9c143796fb3a38cde02ac34b397d3d11.png
Ooooooh nooooo
Woud you looooook at that
You paid so much just to se 40fps uuuuuuuuuuufff
How about downgrade the graphics to medium hmmmmm?
Medium settings…mmmmmmm
How dose it feel to taste you own poison ehhh?
:-*
I fail to see your point.
Firstly, I wouldn't attempt to run at Ultra settings; I'd adjust to ensure I had a good mix of settings that allows for better performance with minimal visual impact in most games. Secondly, I would also alter the resolution if required to keep a more stable frame rate. And thirdly, I have a gsync monitor and don't have to hit a rock-steady 60fps to avoid tearing.
Finally I made no mention of hitting 60fps, unlike you, who claimed it was easy to hit 1080p at 60fps on your 970.
Enjoy those muddy less than console level textures
I fail to see your point.
kids these days
https://uploads.disquscdn.com/images/10539fca53316654b3b26edd6de5645edc2db5a0830c1bec484d5709dafa3e2f.jpg
oh look a benchmarked your lies . looks like my old rig can run Far Cry 5 on 1440p just fine on 60 FPS on max Settings ….
Enjoy those muddy lies you
Ha ha
Great, but that was not your claim
You said 4k 30 FPS and the evidence suggests otherwise in the game you claimed would run at those settings.
I've never claimed my GPU ran anything other than what it does. It's clearly not really ideal for 4k in many games, especially at ultra settings at 60fps, which is where the conversation started, lest you forget, as I stated I'd like DLSS as a feature on my current set up and this other feature would also improve performance too.
And yes, having gsync helps mitigate frame rate drops where 60fps isn’t possible at all times.
What a fool I am lol
You're also either lying or just wrong if you think your set up can achieve 1080p maximum settings in anything other than select games which aren't CPU intensive, as there are numerous games which can't run at higher settings on a 3570. I'd love to see you get 60fps in Assassin's Creed Origins on maximum; in fact it will drop below 30 at points.
Your GPU is less capable than the PS4 Pro and Xbox One X, and that's fine, I don't care what you play on if you're happy, but you can't expect people to swallow the BS you're coming out with about your hardware, as it's a total lie.
Furthermore, your pathetic insults about the RTX GPUs and my set up just make you sound like a bitter person who's envious because they can't afford better hardware.
I am sorry you can’t afford better, and so envious.
But, if you can’t discuss things like an adult, you should stick to kid’s play area for discussions
https://uploads.disquscdn.com/images/f6e04a2c6f4a203f4f7762837c9021f3f535a8dc8ba835a7b61cfe0d65de8b48.jpg https://uploads.disquscdn.com/images/733baeaa35ceb54dce61d8c12f33c2ef6925dcccd244eae31f91b001a0022119.jpg
Just SHUT up kid…You are the same moron, “luckynumber8” I presume. Was reading another previous topic where you were being exposed.
Why the heck are you back ? Just to defend NVIDIA, and spit out garbage from your mouth ?
Get a life outside this forum. You are nothing but a nuisance. Even "Kiran Kara" is using rude language/insults against BFG, not just him.
If you don’t find this insulting, and/or racist, then you are BLIND. This is what Kiran Kara wrote:
“”That’s not even English you clown lol””
Shut up, again. This is what "kiran kira" wrote in the first place, which made BFG reply. Read ALL the comments carefully, to see who first started the fight/insults:
“”” Are you a child? Or maybe just special needs?
either way, you are incapable of having adult conversation in sensible, rational manner it seems “””.
After reading the above comment, BFG made this reply, “Just like u mum”…..
Which made you think that bfg started posting rude comments.
” phenomenal hit
Haaaaaaaaa
Yeah Go buy a RTX 2080 Turn on gaytracing an enjoy the phenomenal FPS.
Secondly . Its a gimic. No one wanted this and no one need this”
No technically this is where it started.
I think you will find that’s homophobia.
You can start attacking your friend BFG for homophobia when you’re ready .
Have fun
How is that racist ?
It’s not correct English, that’s just a fact.
He comes on forums trying to make fun of people with broken English and I’m supposed to be sensitive to his feelings about his poor English? Then
Behave lol
If you're going to start throwing insults in broken English, which he did, including insults about my mother, then he should expect to get taken to town for his terrible English.
It isn't like I'm just picking up on his English randomly.
Feeling guilty, now? Yup, sort of, I can sense that….lol
“””He comes on forums trying to make fun of people with broken English”””
When did BFG make fun of others with regard to English? Kindly quote his post/comment, because I don't see any such remarks in his posting history.
well i did it now but only cuz he made fun of mine sooo …
“4k monitor and and a 1080ti, ”
Error 404 english not found
Gtx 970… Error ultra settings not found, not enough vram found.
Your 970 can run the game better at 1080p… And there we have it: only a few million fewer pixels, with more aliasing, plus screen tear and stutter too.
Far Cry 5 is also one of the least demanding AAA titles out there too.
The very fact I can run it at 60fps at 4k ultra says it all
triggered my sun ?
cant defend your argument ?
i just prooved you wrong …..
triggered ?
Not feeling guilty at all.
Hence why I just said it’s not racist to say someone’s English is poor.
It’s a fact his English is poor.
I also was not saying he makes fun of other people's broken English (that would be ironic to say the least), I was stating that he comes on the forum making fun of people with HIS broken English.
So, no I’m not feeling guilty.
He started with insults, acting like a child, all because his poor a*s can't afford a decent GPU.
I've no time for him or your left-wing social justice warrior self.
Oh, and Caitlyn Jenner is a man, yes I went there.
I'm also half Indian and half white, whilst married to a black Venezuelan, so good luck with the racism claims lol
I don’t give a damn about your ethnicity. I can only judge by reading other’s comments.
I’m not saying that you are indeed a racist, but that comment sounded like one. Not trying to stir things up here.
Lol
Best to stop conversing with these guys I think.
It was amusing at first, as they're just embarrassing themselves at every point, but as grown-ups we shouldn't engage with them.
BFG just sounds like he's bitter because he can't afford better hardware, and ordinarily I'd feel sorry for someone in that scenario, as I'm not rich; I just don't drink, smoke etc. and use my hard-earned money for gaming.
I've even sold my old GPU at a discounted price in the past, as when I first got into PC gaming I was broke and someone helped me out by selling their old 590 cheap to me as they'd bought a 690.
But this guy is a tool.
As for the other clown, who’s crying on behalf of BFG, he needs to grow a pair.
I don’t even care BFG insulted my mom.
It’s just words.
Just can't help but laugh at his attempt to put down my set up whilst he's using a 970.
I didn't even come to praise the 2080ti or RTX cards (they're extortionate, no one is really using the features yet, and the likely outcome is they will be replaced by the time people are using them), but to say that these features are useless is just silly.
Anything that frees up resources for a developer and improves the performance a gamer gets is also great news.
His 970 could really do with some tbh 😉
look my guy . i can run games better at 1080p than you at 4K
https://uploads.disquscdn.com/images/c03d0eefd5d81701b070b93f7e4668b75598c483a83e395340c0ad8b7cf02f8b.jpg even in 1080p, and 3570 can run CPU intensive games only with around 35-45fps
https://uploads.disquscdn.com/images/d9fa1b1eb618b047f01efaa14488bb05d6d33f323aa4511df85acf9022244ca7.jpg
OH LOOK AT THAT .
A 2018 GAME
AND A i5 3570k with only one GTX 970 can run it AWSOME at max settings …..
woooo you are wrong my guy …. just stop my child ….
https://uploads.disquscdn.com/images/d9fa1b1eb618b047f01efaa14488bb05d6d33f323aa4511df85acf9022244ca7.jpg https://uploads.disquscdn.com/images/c03d0eefd5d81701b070b93f7e4668b75598c483a83e395340c0ad8b7cf02f8b.jpg
i am sue my rig is still dooign ok on max settings …
Congratulations, you just proved you can get a minimum of 24fps.
Wish I could game like that lol
Must be epic playing at that frame rate.
Someone's mad because they couldn't afford one… I bought a 2080 because of DLSS, and VR games get a 50 percent boost in performance; add on DLSS support in that game.. come on.
You got a 2080 or a 2080 Ti?
I assume they will have a hard time lowering the shading rate of the RED color in Wolfenstein II: The New Colossus…
You can use it on your 2000 series GPU, unless it bricks and turns into an Atari 2600…
https://uploads.disquscdn.com/images/7a9c1a2b5a0084d26a06b084e5c8f26c3ffac45a1d988f98a8f4c7a1d0cdc864.png
https://uploads.disquscdn.com/images/af5993b2d5ecd27f9e2addc6abeec7fce6ebc1de657334272cada1fea9801a41.png
I would like to believe that too but it’s doubtful that they aren’t selling a lot of these Turings even at this absurd pricing. I’m curious to see their next Quarterly Financial Statement. I won’t be surprised to see record revenue and profit quarter over quarter.
they are selling a ton of the 2080 ti’s so that statement is not valid.
https://uploads.disquscdn.com/images/8dd63e6feeffdba923f98525e306cbd3cb9f9b433e2e0cddbc9036756863c2c4.jpg
i killed the turtles!
Liking the kidney one lol
Great last paragraph, would be a shame if you happen to post it on a comment section that hates the game…
Wasn’t aware of this feature. Good if it ends up being implemented whatever the game..
lol
What’s funny about that ?
Sorry, but I'm not laughing at his comment though.
It's more about the claim which Nvidia is making about the performance difference in the PDF file, because I knew those figures wouldn't be accurate in real-life scenarios.
They can claim all they want, but I’ve kind of lost faith in NVidia.
No way! I'm NOT trolling you here by any chance. Read my response posted above to MM. This is my only account.
I was actually laughing at the bold CLAIM which Nvidia always makes. Sorry, I didn't mean to disparage your comment.
RTCW was one of the best FPS games that I have ever played. There is even a site dedicated to mods for the game that was still active a few years ago when I was last on it.
I would like to see a remaster of that game. Give it a graphics face lift and today’s gamers would enjoy it.
This is what I like about nVidia. They're not only pushing visual technology, but also exploring ways to improve performance with little to no cost in visuals. First DLSS, now this? Good job nvidia!
+1.
Another nice feature for the upcoming 3 or 5 AAA titles. You'll want to play 1 or 2 of these.
And suddenly we all need an RTX…
If raytracing sucks so bad, why do movies use it? They should save millions and do some crappy costumes instead.
Cuz movies and games are 2 dufferent things my guy
So …..
Movies are pre rendered with effects only used in movies on super PC and play on 24fps
Games are locali rendered on your PC with what ever fps you want ..
Yeah you are right they are the same … I was wrong …
https://youtu.be/tb2Ct3yyB4g
“there both visual entertainment”
And what did you expect?
Person who say games and movies are the same……
Why dont you use wiki a bit……
Or in your case some tiepods ….
you hate graphics https://uploads.disquscdn.com/images/5513d631791efe4b87d87f97917833abf67197e6fb4104b1c76ec8395ec7c44f.jpg
Yess
Tell me how i live my life
Performance gain?
Not yet determined. I guess we need to wait for some proper benchmark results. The perf gain is surely there, but I’m also interested to see the difference in Image quality as well.
As much as I'm not that interested in this whole RTX release, it's good to know how NVidia keeps on pushing new TECH, and implements features that are compatible with their latest hardware.
They never fail to impress me though. That being said, this game performs great on both AMD and NVidia hardware. AMD VEGA cards really perform well in this game as well, mostly due to the FP16 (double-speed) capabilities, Rapid Packed Math.
But TURING GPUs also really shine in this game, thanks to the optimized id Tech 6 game engine, and also partly due to the Vulkan API.
Though it seems PASCAL cards suffer some perf loss, due to FP16 basically being cut, or rather operating at a very low rate (more like 1/32 of FP32 compute). Turing does FP16 at twice the FP32 rate, so the performance gain is huge, as per some benchmarks.
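Just to put rough numbers on that claim, here is a back-of-the-envelope sketch using the rates cited above as assumptions, not official specs:

```python
# Illustrative arithmetic only, based on the relative rates mentioned above (assumed, not official specs).
fp32_rate   = 1.0                      # normalize FP32 throughput to 1.0 for both architectures
pascal_fp16 = fp32_rate / 32           # FP16 path running at 1/32 of the FP32 rate, as claimed
turing_fp16 = fp32_rate * 2            # FP16 at twice the FP32 rate (Rapid Packed Math style)
print(turing_fp16 / pascal_fp16)       # -> 64.0x relative FP16 throughput under these assumptions
```

Real-world in-game gains are of course far smaller, since only part of a frame's shader work uses FP16.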
Anyways, I think this new "Adaptive Shading" post-process will come with some small trade-off in image quality though, if I'm not mistaken.
But like I mentioned before, for lower-detail areas like skies and walls the shading rate/shading ratio is reduced, so the previous frame might help determine the quality of the next frame being calculated. I guess we still need some comparison screenshots.
NVidia has already introduced a few more features like Mesh Shading and Texture-Space Shading as well. It looks like this new shading tech is a derivative of similar technologies like 'Variable Rate Shading' and 'Foveated Rendering'.
Btw, on a side note, turn OFF GPU Culling via the in-game video settings, if using an NV GPU, because it will reduce your performance to some extent. I have an RX 480, so it runs fine.
You can see the 4K Turing GPU performance here, without using DLSS.
https://uploads.disquscdn.com/images/9e8720ebe9cdc6570de52b01ab81801a3c9c9d3d65fd1d0812d5da37777871fa.png
170 at 4k? Damn
Another excuse to put more NV Bloatware into game code… not bad.
I bet this adaptive shading thing can be done by devs in their in-house engines, without any 3rd-party tools, just like plenty of games have an adaptive quality setting which changes a whole bunch of settings in one go.
A thing like that, which changes just one thing on the fly, can also be done.
CUDA cores could handle this; more CUDA cores = more effective/more flexible.
But of course you need to push something to make big sales of that FAIL RTX.
AMD must do something to stop this madness, or they will always be left behind…
GURU3D did some early testing on this “Adaptive Shading” setting, to check the performance difference on 1440p, as well as 4K.
Adaptive shading can be selected in the following settings/options, i.e. Off, Quality, Balanced and Performance.
Sadly, according to them, there was roughly a 5% perf difference, going directly from off to adaptive performance mode. Not a very huge gain, imo. But I think this feature needs more testing on other cards as well.
https://uploads.disquscdn.com/images/7293c02ec302b199e74850d58f2cac22bbe697447105470cde0449ec43e4f1d0.png
https://uploads.disquscdn.com/images/ba40616c57bac20e6f61e7ce7abec3c6a59be1f5047b31944cf3fe1094c5aa1a.png
OT: Another game using RTX has been detailed. But it's a Chinese-exclusive MMO. Very low adoption rate in other countries though, imo.
https://www.youtube.com/watch?v=4gqzZREHhMY