It appears that CD Projekt RED’s new role-playing game, Cyberpunk 2077, will support real-time ray tracing as well as NVIDIA’s HairWorks tech. This information comes from a leaked video that showcases the in-game graphics menu of Cyberpunk 2077.
We do know that CD Projekt RED showcased Cyberpunk 2077 running on an NVIDIA GTX 1080 Ti at 1080p and 30fps. As such, we don’t know whether an NVIDIA GeForce RTX 2080 Ti will be able to handle this game with RTX enabled, especially since the game will not be coming out anytime soon.
Still, it’s pretty cool witnessing another highly anticipated triple-A game support real-time ray tracing. Moreover, while we can’t see the remaining options, the Cyberpunk 2077 demo was most likely running with a mix of Ultra and High settings, and with HairWorks enabled.
Neither NVIDIA nor CD Projekt RED has officially announced that Cyberpunk 2077 will support HairWorks or real-time ray tracing (RTX) effects. However, it’s pretty clear from this leaked image that the game will indeed support them. Still, since the companies have not announced anything yet, we are marking this as a rumour (so take it with a grain of salt)!
UPDATE:
It appears this screenshot is fake, as it originates from this fake video. Marcin Momot has also confirmed that the screenshot is fake!
Yup! That screenshot wasn’t real.
— Marcin Momot (@Marcin360) August 28, 2018

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best consoles. Still, the PC platform eventually won him over. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC Graphics Cards.”

Nice. Dual 2080s will come in handy after all.
Dual 3080Ti more likely…
gotta love that OVERKILL preset
Now that’s a setting that perfectly says “future-proof, do not play for now, even on your 2080 Ti”… I think devs should adopt that approach… Ultra for useless crap you will barely notice and for powerful rigs, and Overkill that you won’t be able to use for a couple of years 🙂
Don’t forget Ubersampling in Witcher 2 — that should actually be usable these days xD
TW3 still can’t manage 4K @ 60FPS on a 1080Ti… But that’s probably all those mods I have installed TBH 🙂
that preset will probably turn rtx on
I’m actually quite excited for it, because it looks really good without raytracing, but has quite a few noticeable issues. Raytracing could potentially fix a lot of those, but it would be very demanding for sure.
And I’m gonna have both of those off :p
True! 😀
Rather have 4k than 1080p these days…
If this game comes out in 2020 it will be perfectly playable at 4K with RTX on with a shiny new RTX 4080 Ti.
Let’s be honest: you’re gonna need TWO RTX cards to play at 4K UHD.
I highly doubt even that will run ray tracing at 4K
Absolutely, I’d rather downsample than run that awful performance-hogging nonsense at this point. When ray tracing becomes something more than just an overly expensive GPU graphical option, maybe I’ll care. Having both of them off will change absolutely not a single thing about the game.
Enjoy console visuals then.
I don’t think consoles will hit 60fps+ in this game, or even high settings; they will probably be on medium.
Consoles won’t run this game; it will release in 2020 and will be a PC, PS5 and Xbox-something (2?) “exclusive”.
It’s confirmed for PS4 and X1. Unless CDPR are lying.
HairWorks: YAY! Raytracing: NAY!
I have to admit I really hate reflections the way they are done nowadays. Ray tracing is amazing for that… but what is the point if you have to lower your resolution to 1080p to enjoy it…
Nice, but we won’t be able to achieve 4K 60fps with those settings ON for the next 5-6 years… hell, we may not even be able to achieve 1080p 60fps with a 2080 Ti…
That is why Nvidia is pushing NVLink dual GPU again (the new SLI).
If weak-af 2013 consoles can run this at even 900p with all the Nvidia garbage off, then a 2017 1080 Ti had better be able to run this game at 4K 60, which I have already been doing in many games except the very latest, which are getting gimped on purpose. These games are not getting upgraded hardware on consoles, yet they are looking better and better because of how studios are fine-tuning x86-based engines – software, not hardware. And don’t get me started on the 2080 Ti, which should fly us to the moon.
Ubisoft fooled everyone with AC Origins and made everyone think their latest games are oh so hard to run, to get people to buy an Xbox One X, when we know damn well they were partnering with MS to debut the new console and the game – and they also added their heavy DRM garbage. And any game publisher partnering with Nvidia has us getting updates that gimp our games so we upgrade to better cards.
How do we know? Because, again, games are still being scaled for weak console hardware launched back in 2013. It’s all hardware marketing BS.
Honestly, I prefer developers pushing graphics to the max. I still remember how in the old days my 670 SLI couldn’t handle Uber settings on Witcher 2, but I didn’t complain, since we appreciated having future-proof settings. Now people complain about everything. I don’t have a single issue if my current high-end build can’t push 4K, as long as the game has mind-blowing effects, physics, tessellation, distance scaling, reflections, realistic water, and the list goes on.
bot
plagiarist bot….
Lagworks is godlike.
Witcher 3 has HairWorks, which caused AMD to add a tessellation slider to their drivers. AMD were then able to run HairWorks more efficiently than Nvidia. In response, CDPR added a tessellation slider in-game (probably under instruction from Nvidia). Why don’t we see the same here?
Perhaps this is just more BS, as it does look awful.
“AMD were then able to run with HairWorks more efficiently than Nvidia.”
They ran it at lower quality, you mean.
Oddly enough, the quality looked the same. Even Nvidia users were using the new slider.
If you’re referring to the varying levels of HairWorks available in the game menu (to Nvidia users too, after AMD introduced varying levels of AA), then the difference is noticeable for sure.
If there’s some very recent development I’m unaware of and haven’t seen, I would like to see it, and I apologise if I stand corrected.
x32 vs x64 (native) in the AMD driver was 100% IDENTICAL and it was like 5% faster performance.
x16 to x64 in AMD driver was 90-95% identical and it was like 10-15% performance improvement.
x8 to x64 was like 80-85% identical with 30% performance improvement.
“Witcher 3 has HairWorks, which caused AMD to add a tessellation slider to their drivers. ”
LOL NO. The tessellation slider has been in AMD’s drivers since 2011, when Crysis 2 added overkill tessellation to pointless things. Because AMD’s slider was SO EFFICIENT, CD Projekt added a tessellation slider IN THE GAME for HairWorks.
I can’t agree that they’re identical tbh.
You can spot aliasing instantly, and it gets worse the further you go.
It’s not bad enough that you can’t reduce it and tolerate it, but the original is still preferable.
That’s just gonna destroy anything that’s put before it.
We will all end up going back to 1080p screens at this rate
I wonder if NASA has a super computer I might borrow….
>Ray-tracing
>1080P
ok now i’m getting really suspicious about the performance cost of ray-tracing
It wasn’t enabled.
So did that demo have any of those active (ray tracing and Nvidia hair), or none of them?
Suspicious? Man, it’s very heavy; it’s pretty safe to assume so. There isn’t a single new game able to run above 1080p at much more than 60fps with RTX on, so well…
The E3 demo definitely had Nvidia hair enabled, but ray tracing would have been disabled, since the demo ran on a GTX 1080 Ti.
You have provided no source for the video, I can’t find it anywhere.
Where is this screen from? ’Cause it looks fake as hell…
uhoh! ray tracing and hairwerks at the SAME TIME!! well, might run that, but my character shall be optimized! straight bald!
Why is almost nobody questioning the legitimacy of this? We have NO SOURCE, and it looks like an edited Witcher 3 menu.
Identical. Nothing but the font and colors are changed.
You mean HAIRPATCHWORKS?
Well, I just read an article about the first leaked Time Spy benchmark of the RTX 2080, and it’s 6% faster than the GTX 1080 Ti, and that is with an improper driver (proper drivers will only make performance better). So the RTX 2080 Ti will be an absolute beast… I’m betting 50% faster than the 1080 Ti. With that in mind, there is no reason you shouldn’t be able to use the RTX 2080 Ti with ray tracing and get similar results to a 1080 Ti without ray tracing (4K@30fps). Time will tell… Until then, use your brains; it’s not hard to figure this crap out. Everything I have said so far about the RTX 2080 Ti series is turning out to be true (you can look up my posts), and it goes against 90% of what is out there, even from sites that should know better… What a joke.
“SHADOWS HIGH”
SLIDE IT TO ULTRA DAMN IT.
The gameplay demo was released to the public in 4K. Yes, the one showcased at E3 ran on a PC using an i7 and a 1080 Ti. The Gamescom demo was probably using the same configuration, although since the new RTX cards were announced at the event, maybe they upgraded.
crapworks and ray tracing, RIP your PCs boyz… lol.
Settings -> ‘turn off’
lol
For anyone wondering, the video was made in UE4
I don’t buy that it’s fake.
Marcin Momot has lied to us before and more than once.
No, i didn’t mean RTX tech. Just in general. I wouldn’t buy an RTX card either.
You’re welcome to those frame rates; I’d rather cut my nut sack off than play games at that frame rate.