NVIDIA and CD Projekt RED have announced that a free update for Cyberpunk 2077 will add a new Ray Tracing mode, called Overdrive Mode. In addition, this upcoming patch will add support for DLSS 3, the next version of DLSS.
Here are the key features of the Ray Tracing Overdrive Mode for Cyberpunk 2077.
- NVIDIA RTX Direct Illumination (RTXDI) gives each neon sign, street lamp, car headlight, LED billboard and TV accurate ray-traced lighting and shadows, bathing objects, walls, passing cars and pedestrians in accurate colored lighting
- Ray-traced indirect lighting and reflections now bounce multiple times, compared to the previous solution’s single bounce. The result is even more accurate, realistic and immersive global illumination, reflections, and self-reflections
- Ray-traced reflections are now rendered at full resolution, further improving their quality
- Improved, more physically-based lighting removes the need for any other occlusion techniques
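The multi-bounce change above is easy to see with toy numbers. The sketch below is purely illustrative (the light energy, albedo, and bounce counts are made-up values, not anything from CDPR's renderer): each extra bounce contributes indirect light that a single-bounce solution simply discards.

```python
# Toy model: why extra indirect bounces matter. Light loses energy at
# each surface hit (scaled by that surface's albedo), and each bounce
# contributes indirect illumination. A single-bounce solution stops
# after the first hit, losing everything that arrives via two or more
# surfaces.

def indirect_light(direct_light: float, albedo: float, bounces: int) -> float:
    """Total indirect light gathered after up to `bounces` diffuse bounces."""
    total = 0.0
    carried = direct_light
    for _ in range(bounces):
        carried *= albedo      # energy attenuated at each surface hit
        total += carried       # each bounce adds an indirect contribution
    return total

one_bounce = indirect_light(1.0, 0.5, bounces=1)    # the previous solution
multi_bounce = indirect_light(1.0, 0.5, bounces=4)  # Overdrive-style
# multi_bounce > one_bounce: the scene picks up light single-bounce GI misses
```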
There is currently no ETA on when this Ray Tracing patch will come out. If I had to guess, I’d say that we might get it in November 2022. However, that’s merely a guess and nothing more.
Stay tuned for more!

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers SNES to be one of the best consoles. Still, the PC platform won him over consoles. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC graphics cards.”
Yowza. Expecting another 30 fps slideshow with all this enabled even on the new cards unless dlss 3 is the bees knees.
They demoed it running maxed out at ~90 fps with DLSS 3 lol.
DLSS3 is shaping up to be incredible.
Wow. It’s a whole new world then. Excited to try it out
Haha, yes more sorcery and no rasterization in sight.
? Why does CG need to be rasterized? Also, it’s not sorcery just because you don’t understand it. They explained how it works, and there’s very detailed info on how neural upsampling works all over SIGGRAPH papers and the GDC Vault.
Oh I understand it very well. Good luck paying 1200 dollars for AI. Yeah read more charts please. Parrot what Jensen says, you’re unbearable dude. Have a nice day man.
The only thing unbearable is your complete lack of knowledge on anything related to CG. The fact that you think siggraph white papers are just charts is hilarious. Don’t be arrogant and ignorant at the same time. Choose one.
I never said the pricing was good or fair. I’m talking solely about the tech.
For god’s sake, *read*, man.
They also demoed it at 22 fps on 4090 without DLSS at 4K.
And? 4k with raytracing is extremely demanding, and will be for a long time. I don’t understand why DLSS 3 (which uses hardware within the 40 series) is considered distinct from the hardware.
As long as GPU manufacturers are improving performance for the end user in one way or another, it’s a good thing.
DLSS 3 sounds nice on paper, but it’s definitely 40 series exclusive (even though that’s more for software reasons, as the hardware used isn’t actually exclusive to the 40 series), and it’s the main way they’re touting the up to 2x-4x performance increase.
And while they’re technically rendering more rays now, they’re using DLSS to render a lot fewer pixels. It’s “smart”, but kind of a magical cheat that only makes the cards look better than they are at native resolutions.
I have a feeling DLSS 3 might not see as extensive an adoption as DLSS 2, especially on titles already out of support. Mods or DLL injects might be possible, but definitely won’t be as easy to plug and play.
Well actually, some of the DLSS engineers answered this in an AMA. DLSS 3 has three parts to it: optical flow calculation (which uses hardware exclusive to the RTX 40 series for the time being), NVIDIA Reflex (to reduce the added latency caused by optical flow), and the traditional DLSS upscale. The last two are pure software and work on the 30 series as well.
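As a rough mental model of those three parts (toy math only: the “frames” here are single brightness values, and the function names are illustrative, not NVIDIA’s actual API):

```python
# Sketch of the three-stage pipeline described in the AMA, reduced to scalars.

def optical_flow(prev: float, curr: float) -> float:
    # Part 1: motion estimation (runs on the 40 series' Optical Flow Accelerator)
    return curr - prev

def generate_frame(prev: float, flow: float) -> float:
    # The generated frame sits halfway along the estimated motion vector
    return prev + flow / 2

def upscale(frame: float) -> float:
    # Part 3: the traditional DLSS super-resolution pass (pure software,
    # 30-series capable); an identity here, since our "frames" are scalars
    return frame

# Part 2, Reflex, isn't modeled here: it shortens the render queue to
# offset the latency that the inserted frame adds.
prev, curr = 10.0, 14.0
inserted = generate_frame(prev, optical_flow(prev, curr))
display = [upscale(prev), upscale(inserted), upscale(curr)]
```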
I don’t really understand why people think that GPU manufacturers are only allowed to deliver performance improvements via brute force hardware improvements alone. CG is full of optimizations or “cheats” but if you can’t tell, does it really matter?
However, yes you’re 100% correct regarding adoption. It’s not as simple as a drag and drop replacement of the DLL like with prior versions.
Base hardware performance is still a fair perspective, especially in benchmarking, and it shows what you’d get when DLSS support isn’t available. That applies especially to DLSS 3, which likely won’t be supported as widely, since it’s pretty much exclusive to the 40 series.
But nobody understands a thing. Jensen said the frames created by the Tensor cores will be shown on screen WITHOUT any calculation from the CPU = more FPS but no more input sampling = it feels like 60 fps or less when you see 90 fps on screen. So, I love the DLSS tech, but this is sh*t.
I’m talking about the new DLSS system. It will work like upscaling a low-res game, so you get 3x your fps because your game runs at 1080p in performance mode, AND with the extra performance from the huge Tensor core uplift, they will generate frames between the upscaled ones.
That’s what I’m talking about: they will raise your fps along with your latency.
Because currently, a native game samples your input (mouse movement, keys pressed) once per frame.
https://developer.nvidia.com/blog/accelerating-ultrarealistic-game-development-with-nvidia-dlss-3-and-rtx-path-tracing/
Well, I don’t know why I waste my time writing this in my poor English. They say 4x fps, 2x latency. Bye.
I appreciate you taking the time to write this out even when English is hard for you, and I’m aware of the issue you’re describing.
While it’s true that this is inserting frames *between* rendered frames, and that this will only increase the visual smoothness without actually decreasing input latency, there’s a ton of evidence that people prefer the visual smoothness of a higher frame rate even if that frame rate isn’t tied to the user input on a frame by frame basis.
For example, 60fps motion while rendering user input at 30 fps still feels dramatically smoother than 30fps for both input and frame rate.
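Back-of-envelope numbers for that point (assumed values, not measurements): frame generation doubles what you see, but input is still only sampled on rendered frames, so input latency tracks the render rate rather than the display rate.

```python
# Toy arithmetic: one generated frame inserted per rendered frame.
rendered_fps = 30                             # assumed base render rate
displayed_fps = rendered_fps * 2              # what the screen shows
input_interval_ms = 1000 / rendered_fps       # ~33.3 ms: unchanged by frame generation
display_interval_ms = 1000 / displayed_fps    # ~16.7 ms: motion looks 60fps-smooth
```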
Tried to replay this thing last week out of pure boredom and dude, it’s worse than when it came out. I saw literally ONE bug with the first version, but now, with the 1.6 update, there are bugs all around, and the optimization, which wasn’t the best to begin with, is way worse now.
They should have some decency and just ditch this shltty project to start another from the ground up.
The game and gameplay loop still suck to high heavens no matter how fancy the wrapping is.
I can’t believe they are still doing the ray-traced puddles of water to sell GPUs. They don’t even look realistic; they look like mirrors.
Nvidia is the “upsell” company, and their customer base is so vapid. Nvidia’s been doing this same ole schtick. But there is an ole saying: “the fool and his money shall part”. Be prepared for their fanbase to downvote you into oblivion.
Did you even watch the demo? They’re raytracing far more than just reflections. They’re doing multi bounce GI in real time which is incredible.
And even before this news CP2077 looked amazing with all RTX effects on.
lmao, it looks exactly the same outside of the stupid raytraced puddle that looks too reflective
https://uploads.disquscdn.com/images/45566f5b4880fb6291efbb001eefb52b4f4bff7aa4ebf02a826487909131bef1.jpg
https://uploads.disquscdn.com/images/01c434e7d877ad2d0808f8fce84de808b7239fff22b229591887af6d4c8e9669.jpg
https://uploads.disquscdn.com/images/fb88ea53f7f52aba09e2074b2386b201a1c2be0d174d44588b31afc1632bd7ba.jpg
Dude, there are people high on copium with all this ray tracing shlt. So if you come and tell them the truth, you must expect them to ignore all the evidence you might have and hate you to the core.
Got a problem with RT? So you don’t like realistic games? I’m talking about the technique, WITHOUT ANY GPU BRAND.
RT is the futur of gaming because it’s the best way to simulate light in 3D space. If you don’t understand that, stfu.
PS: I GOT A 6900XT
Yeah, it may be the “futur” of gaming, cause in the present it svcks, and it’s just a name used primarily to sell overpriced cards to morons and consoles to the peasantry.
You’re showing demo pics from the DLSS trailer lmao.
Look up old CP2077 comparisons (like the ones done by Digital Foundry) between RTGI on and off. That’s before the multi bounce upgrade, and it’s still very noticeably better.
It looks great! I don’t see why people hate better graphics. The most obvious reason I can think of would be the paywall, or simply hating Nvidia for leading that tech. It’s sad that DLSS 3.0 is locked to the 4000 series. I can’t justify that price at the moment; maybe next year, when Cyberpunk’s DLC releases and prices come down. Until then I’m eager for Digital Foundry’s analysis.
Yep, I think people are conflating their anger at Nvidia’s scummy business practices and pricing with their actual tech, which is undeniably impressive.
with a COD-length story
What does that have to do with the tech being demoed lol
Dude, chill, you reply to everyone. Take a breather, nobody is talking to you ffs.
Kind of agree on how mirror-like reflections are done. Never saw puddles in Chicago, where I’m from, almost perfectly reflect all the buildings, lights, etc. It’s a bit overdone imo, but I guess they’re not going for realism but instead video game eye candy.
Clean puddles do look like mirrors. The ones that don’t reflect much are usually mixed with mud. As a 3D modeller, I spend at least an hour a day observing lighting, reflections and shadows wherever I go. Maybe go out sometimes, eh?
You sound like one of those people who do 3D modelling at home. Too much of an amateur to have an actual job in 3D modelling.
Man, I can’t even get 60fps with a Ryzen 3700X and an RTX 3080 with RT mid/high settings without dropping DLSS for 4K. How the f*ck should that run additional RT settings????
the 4090 will be a 500W heater
It’s been said (though not proven) to be able to spike into ~600W range.
Overdrive! Now instead of one ray bounce we get 2! We also get more rays... about 5% more, and they have allowed every light to actually act as a light.
DLSS3… Wow, amazing when moving in a predictable straight line. Wonder how the controls will feel and game will look when the algorithm starts adding frames but we are giving inputs that don’t match.
It’s basically motion smoothing from your TV but in video game form. The performance gains were also vs DLSS 2, which DIDN’T run on the tensor cores, and I wonder if this will, and why exactly it can’t run on the 3000 series.
So that 4080 12 gig with the tiny memory bus… That’s a 4070 folks. Welcome to higher prices across the board!
We have the 3090ti aka titan, 4080 16gig which also oddly has a small memory bus but that’s the real 4080. Then the 4070… I mean 12 gig 4080. Don’t expect to see much stock outside of the 4090 and maybe 16 gig 4080 for the first 3 to 5 months.
AMD is going to come out with more than enough performance but they are waiting until PCIE G5 gets more widespread adoption and then they will crank up the 7995xt which should easily trade blows with the 4090ti except maybe in RT where it will probably be about 15% behind but still much better than the 3000 series. Of course they can add chiplets and cache at will.
They should also have no issue hitting 4.2GHz boost, being cheaper, and likely quieter, considering EVGA is gone... Good job, nVidia. All I ever buy is the FTW3 Ultra Gaming series of cards... Had the 1080ti, 2080ti and 3080ti... I think I’m moving to AMD. 4K is not going to be an issue, so that’s all I need, and if I can save power and money then I’m in. One thing I do hear is RDNA4 is a dominant architecture and should just strut in like Ric Flair and take the performance crown with no issue.
Interesting times ahead
Apparently BF5 used like 54 rays total in any given scene while CB77 with Overdrive mode will be closer to 700 rays.
Who cares. Everyone wants New Game+ but you keep avoiding it like covid. Get your sh*t together. I will not buy another one of your games, period.
Man…I’m feeling like a chump dropping $2K on a 3080Ti during the Great GPU drought of ’21