Back in October 2018, we informed you about a Path Tracing mod for Quake 2 that could run on NVIDIA's high-end gaming graphics card, the GeForce RTX 2080 Ti. Unfortunately, that Path Tracing renderer could not take advantage of the RT cores, resulting in really low performance. Thankfully, a group of developers has now released a special version that does take advantage of the RTX GPUs.
Q2VKPT is described as the first playable game that is entirely raytraced and efficiently simulates fully dynamic lighting in real-time. While some games have started to explore improvements in shadow and reflection rendering, Q2VKPT is the first project to implement an efficient unified solution for all types of light transport: direct, scattered, and reflected light.
According to its team, this project is meant to serve as a proof-of-concept for computer graphics research and the game industry alike, and to give enthusiasts a glimpse into the potential future of game graphics. Besides the use of hardware-accelerated raytracing, Q2VKPT mainly gains its efficiency from an adaptive image filtering technique that intelligently tracks changes in the scene illumination to re-use as much information as possible from previous computations.
It is said that thanks to the Vulkan API and the RT cores of the NVIDIA RTX series, an NVIDIA GeForce RTX 2080 Ti can come close to a 60fps experience in 2560×1440. Now keep in mind that this RTX version does not take advantage of NVIDIA’s DLSS (in order to boost performance) and that it uses real-time ray tracing for numerous effects. In other words, we’re looking at fully dynamic global illumination using path tracing, with raytraced shadows, glossy reflections and one bounce of indirect lighting.
As we’ve already stated, this Quake 2 project mainly uses path tracing. In case you weren’t aware, path tracing is an elegant algorithm that can simulate many of the complex ways that light travels and scatters in virtual scenes.
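To make that concrete with a toy sketch (this is illustrative Python, not the mod’s actual Vulkan renderer): at its core, path tracing is Monte Carlo integration of incoming light. The snippet below estimates the irradiance at a surface point under a hypothetical uniform sky by averaging random hemisphere samples; the analytic answer is π, and the estimate converges to it as the sample count grows.

```python
import math
import random

def estimate_irradiance(num_samples, sky_radiance=1.0, seed=0):
    """Monte Carlo estimate of irradiance E = integral of L*cos(theta)
    over the upper hemisphere, for a uniform sky of constant radiance L.
    The analytic answer is L*pi, so the estimate should converge to pi."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(num_samples):
        # Picking z = cos(theta) uniformly in [0, 1) (with a uniform
        # azimuth) samples directions uniformly over the hemisphere,
        # so the pdf is 1 / (2*pi).
        cos_theta = rng.random()
        # Accumulate f(w) / pdf(w) = (L * cos_theta) * 2*pi
        total += sky_radiance * cos_theta * 2.0 * math.pi
    return total / num_samples

print(estimate_irradiance(200_000))  # ≈ 3.14 (pi), with residual noise
```

The estimator’s error shrinks only as 1/√N, which is exactly why naive path tracing is so noisy at real-time sample counts.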
“Its physically-based simulation of light allows highly realistic rendering. Path tracing uses raytracing in order to determine the visibility in-between scattering events. However, raytracing is merely a primitive operation that can be used for many things. Therefore, raytracing alone does not automatically produce realistic images. Light transport algorithms like path tracing can be used for that. However, while elegant and very powerful, naive path tracing is very costly and takes a long time to produce stable images. This project uses a smart adaptive filter that re-uses as much information as possible across many frames and pixels in order to produce robust and stable images.”
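The simplest possible form of that cross-frame reuse is an exponential moving average over per-pixel samples. The sketch below is a hypothetical, heavily simplified stand-in for Q2VKPT’s adaptive filter, which additionally tracks changes in scene illumination rather than blending blindly:

```python
import random

def temporal_accumulate(noisy_frames, alpha=0.2):
    """Blend each incoming noisy frame into a running history buffer:
    history = alpha * new + (1 - alpha) * history.  Newer frames carry
    the most weight, so the buffer can still react when the scene changes."""
    history = None
    for frame in noisy_frames:
        if history is None:
            history = list(frame)
        else:
            history = [alpha * n + (1.0 - alpha) * h
                       for n, h in zip(frame, history)]
    return history

# A 4-"pixel" image whose true value is 0.5 everywhere, rendered for
# 200 frames with heavy per-frame noise (like one path-traced sample each).
rng = random.Random(0)
frames = [[0.5 + rng.uniform(-0.4, 0.4) for _ in range(4)]
          for _ in range(200)]
stable = temporal_accumulate(frames)  # each pixel ends up near 0.5
```

The trade-off the article describes falls out of this directly: the stronger the blending, the less noise survives, but fine shading detail carried by individual samples gets averaged away too.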
Those interested can download this real-time ray tracing mod for Quake 2 from here. The team has also included an interactive screenshot comparison between this RTX version and the original path tracing version. The only downside here is that the original path tracing version looks somewhat more realistic and crisp. Due to the excessive noise, the textures appeared more realistic, detailed and crisp than they actually are. The de-noising of the RTX version takes away that effect; while we get a more consistent image, it looks blurrier (or should I say closer to the vanilla Quake 2 game).
Have fun everyone!

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best consoles. Still, the PC platform won him over consoles. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC graphics cards.”
Man O’man, when will this Ready To Xplode nonsense stop? Ray Tracing is basically one big giant troll. Now the modding community is buying into this rubbish as well?
Giant troll? People said the same thing about 3d acceleration at the time of the PS1 and yet…
Dumbass
That’s all i can say
Oh you are so innovative. You inspire us all to be nothing like you.
+1. Can’t disagree on that though.
Tetris RTX incoming!
That might actually be cool, on Tetris Worlds and above. Would love real time reflections from special effects applied to the textures of my favorite block shapes :P.
Haha, probably true. The Tetris spin-off by the name of Wetrix on PC, Dreamcast and N64 had some rather nice reflection effects for the time which could look better still with some ray-tracing.
Pauper.
The only real reason to own an RTX card? Honestly, this is the best use of the tech I’ve seen yet.
I think this kind of tells us how far away we actually are to getting full, real-time ray tracing running across an entire game, as the only game that’s so far been capable is from the 90’s.
Man I just bought a 1080TI a few hours ago.
I feel so silly now.
But in all honesty though Q2 is one of my favourite games and I love that modders are still all over it.
Too much of this RetardTaX. Let’s wait for some hardware that really makes proper use of DXR. Even DLSS looks a bit meh since DirectML can do the same thing, and more.
Jen-Hsun Haaa may be along soon to ‘correct’ your opinion!
He always gets triggered with anything related to NVIDIA.
lol, do you people even know what you are crying about anymore? You don’t, do you? Or did you miss that AMD’s announced GPU is as expensive as an RTX card with none of the features? Then they announced they are working on these features, so in your eyes, they should be just like Nvidia: expensive with useless tech.
I guess when you whine so much, you can reach a point where you forget what you are whining about in the first place. Enjoy your $700 AMD card with no new features, buddy.
TRIGGERED!
He’s kinda right tho.
As always with Jen-Hsun Haaa he’s guilty of projection by assuming all manner of falsehoods as to people’s motives and more when those people don’t worship his chosen corporate overlord like he does. So, no, he’s not “kinda right” at all.
In common with many fanboys his child-like emotional instability, angry ranting and blind loyalty to a corporation that doesn’t care for him at all does provide a modicum of amusement value. In the absence of Sp4ctr0 Spencer he, along with bizzysgs, gives us all something to laugh at so his existence isn’t entirely without merit! ?
I don’t know about his other posts, i’m talking about this one he made here, and there’s honestly nothing wrong with what he said here.
There “honestly” is plenty “wrong” with his comment in question. Like I said, he’s been guilty of projection by assuming all manner of falsehoods as to people’s motives and more when those people don’t worship his chosen corporate overlord like he does.
How so?
Radeon 7 is as expensive as 2080
Radeon 7 doesn’t have the same cores and hardware to support RTX
Radeon 7 is roughly as fast as 2080
RTX from nvidia was labelled as gimmick and useless, but RT from AMD doesn’t seem to have the same negative talk around. How is that?
“How so?”
Like I said, he’s guilty of projection by assuming all manner of falsehoods as to people’s motives. I’m solely referring to the remainder of his comment, i.e. his false assumptions of what people here may be thinking as though he believes he’s some kind of mind-reader.
So what you bash the guy whenever possible just because? What you’re saying is basically “I wasn’t trying to say that what he said is wrong, i’m just bashing the guy because he worships nvidia”. Correct me if i’m wrong.
I don’t know how many more times I need to state that he’s guilty of projection by assuming all manner of falsehoods as to people’s motives. He consistently exhibits a child-like emotional instability with his angry and abusive ranting in defence of his chosen corporate overlord. Yet you’re seemingly seeking to paint him as being some kind of victim. Sheesh!
I don’t know how many more times i need to say that i don’t care about what he usually does, it’s THIS comment we’re talking about, and what he stated about the card is objectively right, but yet you keep going back to “guilty of projection by assuming all manner of falsehoods as to people’s motive” whatever that’s supposed to mean.
And i’m not painting sh**, i don’t even know who the guy is.
RTX Off
https://uploads.disquscdn.com/images/23abc92ba4773a534d26c4284ad87e6a993315c3a1efb4020743e1ecdd167519.jpg
RTX On
https://uploads.disquscdn.com/images/f7c34127d9e2a66238ddd01fdd76c7c5cd1917bf7aa9730da501e1d1bed9134d.jpg
Excellent! Ray tracing rig on FIRE!
There were a lot of problems with RTX cards dying when Nvidia first released them. I can’t see how Nvidia couldn’t have known about the issues before release.
Below is an extreme example but one owner of a stock 2080 Ti posted some pics on [H]ardOCP forum when his PC shut down and a flame shot out of his card:
https://uploads.disquscdn.com/images/c7a9395dff375eb0de683c168615e84ff88e61281ff878d9fb704bc09b6f418d.jpg
https://uploads.disquscdn.com/images/8582f52a438546f7551b5fd1f132c00b38c72c7e32403ef8a0d9c62c7217dceb.jpg
Yeah, I’m aware of this issue.
Please let go of your mothers teat, grow up.
What has AMD got to do with Nvidia’s failure in producing a full scene raytracing card at a price suitable for the masses?
(lol)
As always, he and this “Paul86” lunatic guy are on full trigger when it comes to anything related to nvidia/rtx/dlss. lol
How do you know what DirectML can do?
He doesn’t, he just heard the term yesterday and wanted to use it.
Yeah, as a 50yo software developer, there is no way I would know such a thing.
Ah, the 50yo software developer who said “RTX = RetardTaX. Just an awful attempt at implementing Microsoft’s DXR”? Yeah, you have no idea what are you talking about.
What part did I get wrong?
You didn’t openly worship Nvidia enough for Jen-Hsun Haaa’s liking.
Pretty much everything you said. RTX is a combination of ray-tracing and AI tensor cores; the ray-tracing part does not require, and most likely predates, DXR, since Nvidia has been working on ray-tracing for more than a decade. DXR is a software API exclusive to Windows platforms; it does not require RTX cores and can run on compute cores, but it will not run as fast. So you are comparing hardware cores to a software API, instead of comparing it to OptiX, the Nvidia equivalent of DXR.
DLSS is an upscaler that runs on the tensor cores; you are comparing it to DirectML, which is a framework, as if they were equivalent, instead of comparing DirectML to NGX, Nvidia’s AI framework.
Try Google.
I mean, that thing is yet to come out, how do you know it performs better and does more things than DLSS?
ML, Machine Learning, can be used for many things, of which AMD (and Nvidia) have been experimenting with supersampling. I didn’t say anything to suggest performance. DLSS is just Deep Learning Super Sampling, nothing more.
https://uploads.disquscdn.com/images/244087d6eb45a4bc1fca66792d378de27158b25bc0683a186c9c8a82895b74db.jpg
It is EXACTLY what you said, besides we’re talking about DirectML and not ML in general here.
Nothing that you highlighted defines performance. I also wonder why you would think that machine learning would be restricted to algorithms used with super sampling? Are you also under the impression that Nvidia’s Deep Learning is also restricted to just Super Sampling?
So with “more” what you refer to exactly? I’m not thinking anything, i’m just asking how do you know DirectML is better than DLSS, if one of those still isn’t available to the public. I’m not sure but sounds like you’re avoiding to answer that.
“well someone would have to create a DLSS alternative using DirectML to see what the performance difference is”
Exactly my point.
DLSS just uses fp16 operations, I think, so it’s nothing AMD can’t do, but tensor cores are significantly faster at it. Even with, say, the Radeon VII’s 28 TFLOPS of fp16, tensor cores would still be like 3-4 times faster, depending on how many the GPU has, of course.
Yeah, I know it’s not going to have much left over after rendering the game, but that’s still the point: DLSS-style upscaling isn’t limited to tensor cores.
It doesn’t have to hammer performance at all; it defeats the objective if it costs more to use than just running it natively. I’m sure there are other ways of implementing it than Nvidia’s approach.
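As a rough sanity check of that “3-4 times” figure, using approximate spec-sheet peak numbers rather than benchmarks (treat both values as assumptions):

```python
# Approximate spec-sheet peak fp16 throughputs (assumed figures,
# not measured benchmarks):
vega_fp16_tflops = 28.0     # Radeon VII, packed (2:1) fp16 rate
tensor_fp16_tflops = 108.0  # RTX 2080 Ti, tensor-core fp16 FMA
ratio = tensor_fp16_tflops / vega_fp16_tflops
print(f"~{ratio:.1f}x")  # prints ~3.9x
```

Peak throughput ratios like this say nothing about real-world DLSS cost, since the upscaling pass competes with rendering for the same frame budget.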
You do know the supersampling part comes from Nvidia servers running the game at a much higher resolution; the devs then get back whatever data is generated from it. AMD’s version could use a temporal reconstruction technique using that data to create a better-quality image than what would normally be achieved. It may not be as good as Nvidia’s, but game devs are a heck of a lot smarter than me or you at figuring this stuff out, so they may be able to do something – don’t sit there and pretend they couldn’t. Fact is, me and you don’t really know, and they come up with stuff to get around problems all the time.
You haven’t got any clue what you are talking about. It doesn’t have to work that way around. DLSS is a different technique.
Well said.
Agree.
You don’t have to wonder too much, since you are brainless/DUMB as always. lol
DirectML will work depending on how AMD, and other gpu vendors optimize the directml operations for dx12 on any Gpu. There will only be a difference in performance. Learn something next time…
It seems this guy is again getting triggered. I’m done arguing with people like him.
He’s a jobless and homeless kid, who is 24/7 online on the internet. Nothing else to do in life. So what else you can expect from him ! lol.
So glad someone is embracing the new tech – as it has potential for making much nicer graphics than anything we have today – and possibly (down the line) with less effort for developers. Even if you don’t want to be bleeding edge (your wallet will bleed for sure), I cannot understand why anyone is not behind this tech. Yes, it might take time (years) before you personally can benefit fully from it, and it might even fail – but that is the case with any large change.
You guys are focusing too much on the fact that it’s using RTX and not enough on the fact that this is (afaik) the first time we’ve seen pathtraced quake run without insane noise. Really cool to see that.
By the way, long live QUAKE 2! One of the best FPS of all time.
MEH… Who really cares, since the majority of gamers don’t have an RTX card, let alone the 2080 Ti. Yeah, it’s good to see some progress though, but modders are also falling for this so-called tech!
No one is going to buy an RTX card just to play some old modded games. Not saying ray tracing is a gimmick, but there has been too much hype created around this tech lately. Enough!
Aren’t Nvidia making a non-ray-tracing card?
Yup. They have the “Turing” GTX series of cards coming in a few months’ time, which lacks ray tracing and tensor cores as well.
Wow, such a confident answer! And your source is? You mean the famous 11 series that rumors have been talking about since last year but which failed to materialize?
I thought the 2060 was one of those cards? Don’t listen to this guy’s worthless rumors.
Yes, the GTX 1660 Ti but Nvidia is gimping it. It will only have 1536 CUDA cores vs the RTX 2060 with 1920 CUDA cores. Nvidia won’t allow the 1660 Ti to perform as well as the 2060 even in games that don’t have ray tracing.
Hopefully the price will be more reasonable so people can afford to upgrade and I also hope they release a full lineup of Turing GTX cards at more reasonable prices as well.
I think it is amazing what modders are investing their time in. They are not falling for anything. They are Following their passion and desire to breath new life into the games that they have love for. Pure bliss if you ask me. I hope to be well off enough one day, and educated enough to be able to pursue my own desired modifications.
I know that there is nothing to realize here.
No need to put your ideas on me. I’m done talking about this tech, which is still a far cry from mainstream and will take many more years to get there. It’s in an early-adoption state. End of discussion. Just FYI, you can go to their profile and report them directly to Disqus for harassment. Disqus is much less forgiving than John is.
Just like 3d acceleration was a few years ago, yet see where we are today.
Okay….I’m sorry if I overacted.
You’re wrong, but ok.
No one is bothered about what YOU personally think about that guy. Get a life, dude.
Keep your personal vendetta aside. That just proves you are still butthurt by others attacking you over here.
Everyone can see here that there was absolutely NO reason for you to interfere, and post such images as proof, but yet you still persist on posting nonsense. Totally uncalled for.
But under this specific Topic, he hasn’t attacked/abused you. That was my point. He was arguing with some other guy, @psionicinversion:disqus
Anyways, I’m done talking with you.
Take a tissue paper, and start crying…lmao
So end of discussion because no one agreed with you, lol.
How is this prediction working out for you? I’m pretty sure many years is more than two.
That’s 1080p, without denoising and without textures – not even a comparison.
lol what a salty re**rd
It’s for people who own the cards; modders aren’t expecting RTX to sell better just because of their mods.
Dude, you can say it. It’s a God damn Gimmick. Don’t be afraid of the fanboys. They can’t hurt you. It’s all hype, especially about something so small and minuscule. Look man, most of Nvidia’s tech is not innovative and not as “exclusive” as you might think. AMD cards have been able to run ray tracing for quite some time; they just don’t harp on B.S. tech and brainwash people to sell GPUs. I’ve owned many Nvidia cards and I too can admit that I’ve been scammed. Nvidia is full of Schitt and always will be, because their fans permit them to be.
Back in Q1 of 2018 AMD announced widened support for Radeon Rays, its GPU accelerated ray-tracing developer software. Formerly known as AMD FireRays, Radeon Rays 2.0 is targeted at content developers who want to utilize high-performance ray-tracing capabilities with AMD GPUs, CPUs, and APUs via asynchronous compute. Unlike Nvidia’s RTX ray-tracing technology, which runs on Microsoft’s DirectX Raytracing API, Radeon Rays is open source and conforms to the OpenCL 1.2 standard, so it could be deployed with non-AMD hardware and on multiple OS environments.
Totally agree with you m8. Well said.
Please i want ray tracing for pac man
I don’t think Pac Man is supported in the RTX drivers but Minesweeper is if you’re interested. You will need an RTX 2080 Ti for 4K at 60 FPS :p
https://uploads.disquscdn.com/images/43558f49c090f3e5089a5aaad50f3bb474d1f5b55342526a2ac40030ef47ad3c.jpg
LOL. Nice one.
RTX is a joke as a product, but as a stab at playable ray-tracing this is cool any way. Unlike BF5 and other RTX games this is full path-tracing, tracing the proper way. Lighting = shadows = reflections. We’re more likely to see tracing used for just reflections in the coming years, like nvidia’s own tech demo. But if I’m not mistaken this version of Quake 2 is the first playable (full) path-tracing game without the noise problem. That’s progress.
And the guy says in the comments he wants to do Quake 1 next. Quake 1’s mod and mapping community is still active. If a robust path-tracing renderer for that becomes playable for the masses in the future it could revive that community and bring a whole new generation into Quake singleplayer, so that’s also exciting imo.
Excellent comment!!
Excellent comment!!
Is this not the same mod project, just with denoising now?
Look at some of the comparisons on the guy’s site where the full presentation is (link in the video description). There are noisy shots that look exactly like the demo we saw some months ago. The denoising removes a lot of perceived fidelity; you can see realistic shadows in the noise that simply disappear with the filter, especially “ambient occlusion,” like you pointed out. AO means subtle shades, which means low sample count, and those details are the first to go with a denoising filter. It’s sort of ironic how, in order to get path-tracing running in real-time without visible noise, you have to blur it in such a way that it ends up looking more like traditional rendering, with its binary light/shadow feeling (fewer gradients in between).
I’m not impressed, although it’s great modders are finally using RTX in old games.
For once, I agree with you on that part.
Bah, I’m done with this comment section. A total waste of time each time we browse DSOG’s forum. There is always chaos and much unrest/discomfort out here. LOL.
https://uploads.disquscdn.com/images/4d8298aa0990019cb1b7d635e59295f113dc5763287bf7b691627fb68d06f1e5.jpg
https://uploads.disquscdn.com/images/dcc1f128d8c292c071f76b4b2c3fcfcaa5bcb6dd9ddf17cf7e9058dc470cfe96.jpg
Haha. not so soon though….:D
It’s a mess recently, though i blocked 2-3 users and it’s sane again
Now i know why nvidia pulled the 1070 back
https://uploads.disquscdn.com/images/c801537cdb82b249cb243905da924d64cfc32c1cd0a4909da4c7aca7cf7664ae.png
This is my favorite all-time game; I’ve beaten it at least 30 times. This is the game that keeps on giving. Got the Oculus VR mod. Originally had an Intel MMX at 200MHz, then upgraded to a P2 233 with a 4MB ATI Diamond, then went to a P2 at 350 with a Voodoo 2 8MB card. Then got an Nvidia TNT 16MB that could run Q2-engine games at 120Hz on my 19-inch 1600×1200 Trinitron. Now I’m still addicted and have an RTX 2080 / i7 setup.
I’ll game till i die guys.
“I’ll game till i die guys.”
You won’t be alone. I am 56 and I started gaming on an Atari 2600 back in 1980. Later a C-64 and then the IBM compatibles.
I met a fellow on the internet a few years back that was in his mid 70s and he was still enjoying gaming and still had a love for gaming. I will be like him one day and you probably will be as well.
Old soldiers never die, they just fade away.
You seem to be a pretty old Nvidia Gamer, from the riva tnt and voodoo days. Glad to hear that. Same here.
lol, they just RUST away like the color scheme in q2
42 with a massive backlog.
I won’t stop till I drop.
Started in 79 at 3 on handheld games then actual computers in 82.
Like @disqus_eNwMfpj0Z2:disqus and you, I probably played everything you both did and maybe more due to having managed a game store at 16 after 3 years of hard labor until we sold the entire thing for millions at 19, started at 13 if anyone is bad at math. One regret, giving up my entire collection of systems and games into the thousands of titles because I moved, I can’t fathom the estimated value for the simple fact I specifically kept boxes and manuals for everything. I would have been richer today…and not come out of retirement.
Anyways, enough ego talk, back to reading the trolls idiocy posted throughout Disqus.
(nearly all liked comments for that low IQ brain dead dude are from his alt accounts)
Don’t feed into the abyss of stupid.
My Voodoo 2 was limited to 800×600 in Glide; you could get 1024×768 with SLI. There were 8MB and 12MB cards that 3dfx made, and they had to be used in the same memory configuration or no go. My TNT 16MB card opened up 1600×1200 at 120Hz refresh on my Sony Trinitron CRT monitor. Color bit depth was also a factor, like you said. I believe my TNT did 24-bit color; remember those console commands.
My first video card was a GeForce 256 32MB SDR paired with an Intel Pentium 3 533B in Jan 2000.
Don’t remember the Asus Intel motherboard, but it was recalled mid-2000 due to losing packets, so I traded it in for an AMD CPU & mobo, but can’t remember what it was.
The early 2000s were a crazy time with AMD & Intel memory: SDR, DDR, and the RAMBUS that Intel wanted!
I knew 3dfx was headed out the door when Nvidia dropped in with big performance others couldn’t touch! Strange how these things work out; wondering if the same will happen in mobile/small form factor in the early 2020s like Nvidia did 20 years prior.
Hoping Intel adds value to graphics so Nvidia isn’t setting the stage on pricing.
Funny that Intel wants in on graphics; why didn’t they do it earlier, and what have they learned in the last 20 years that may surprise us?
real time raytracing on super mario bros from 1986 is next!
@John,
Some related info/update. Nothing worth noting right now, as it is NOT fully complete/final, but it’s just a Ray Tracing analysis:
https://www.youtube.com/watch?v=BRCAfdBMe2Y