During GTC 2018, NVIDIA revealed that Epic Games’ impressive real-time ray-tracing tech demo in Unreal Engine 4, Reflections, was running on four Tesla V100 graphics cards. And while it took that much hardware to power it, it’s still pretty incredible that the demo ran in real time at a smooth framerate.
Furthermore, NVIDIA revealed that its NVIDIA RTX technology will be supported by both Microsoft’s DirectX Raytracing API and the Vulkan API.
Last but not least, the green team revealed its new graphics card targeted at workstations, the NVIDIA Quadro GV100. The Quadro GV100 features 32GB of HBM2 memory, 5,120 CUDA cores, and tensor cores delivering 118 TFLOPS of performance.
Oh, and in case you’re wondering… no word yet about the rumoured GTX1180 or GTX2080 GPUs.
Edit: Here are the full specifications for the NVIDIA Quadro GV100 (thanks Metal Messiah).

John is the founder and Editor-in-Chief at DSOGaming. He is a PC gaming fan and strongly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES one of the best of them. Still, the PC platform won him over, mainly thanks to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher-degree thesis on “The Evolution of PC Graphics Cards.”
Contact: Email


2-3 years to see this in action, can’t wait…
Why not? The top-tier cards will be able to run it, and some PC devs will start implementing it. Like everything else, it will start slow.
Exactly, and Metro: Exodus is out this year with real-time RT in an actual AAA game, which is insane.
If it gains traction and AMD somehow picks up its GPU sector, RT could blow up really fast and we could have an epic GPU RT war. I really hope so, fingers crossed.
No more than 5… tops. You haven’t been around long if 8-10 is truly your prediction. Remember Epic’s “Samaritan” demo? When it launched, it took three GTX 580s to run it (2nd-gen Fermi). Less than 2 years later, a single Kepler-based GTX 780 Ti could run the same demo. Less than 2 years for a single card to match three. So yeah… 5 years tops for this (more like 3 in reality).
That’s if we assume that Nvidia will pick up all, and I mean all, of its slack and give 110% effort over the next 5 years to deliver very affordable GPUs that can handle this tech properly, and not another future-proofed piece of tech that we can’t touch for another 5 years on top of the 5 we already have to wait.
Meanwhile, AMD is still putting a massive focus on the medium to low end, so yeah, 10 sounds like a better estimate if we go by how the market is currently operating.
Also, don’t forget the miners, who are single-handedly ruining the GPU market for the majority of general consumers out there.
780 Ti? Nah, it took a single GTX 680 to run it at the same quality and speed, although with FXAA instead of 4xMSAA.
In action? You mean downgraded for consoles? And running on a single GPU?
You’re right, but there are always some who will take advantage of it.
I know. Like AAA devs who put out an awesome-looking game full of microtransactions.
See you in a decade.
Nah.
The Unreal Engine 3 DX11 Samaritan demo was done in 2011 on three overclocked GTX 580s (probably 3GB versions too). Later it ran on a single GTX 680 using FXAA instead of 4xMSAA.
Four years later, Batman: Arkham Knight featured basically every feature and design aesthetic from the Samaritan demo. Give it a 4GB GTX 680 and you can even play Arkham Knight at 1080p30 maxed out easily (no extra PhysX though).
Games had such great SLI scaling too! I’ll have to get around to throwing up some videos of the games that rock four-way SLI!
Oh, we’ll see how this plays out in just 5. If it’s not being handled perfectly by then, then I’ll be on the money.
5 years tops, and it has to run smoothly; if not, it’s a crock.
https://www.youtube.com/watch?v=RSXyztq_0uM
I say 5 years till ray tracing
Dude, what 5 years. We’ll have ray tracing in Metro Exodus this year.
And ray tracing in Remedy’s Project 7 that’s coming in less than 2 years.
I would say more like 5 years… Very few devs will dare implement it in actual games, since only the top <10% might be able to actually use it!
2-3? Hahaha, oh, that’s a rich estimate.
Yeah, I think you’re right. If it takes friggin’ four Quadros with 32GB of memory to render this scene (which I will admit looks damn good), we are looking at 8-10 years minimum.
Nvidia RTX: comes with four Titan Vs preinstalled. Cherry included.
Four Titan V graphics cards? That must be some SLI multi-GPU tech.
Too bad we gamers don’t have that kind of tech. We would play games at 8K 120fps. But that would make the 4K machine, the Xbox One X, look bad.
Could you really afford 4 GPUs in a row in your system, supposing gamers had this tech?
This just proves that we are just not ready for this tech yet. Hardly anyone is bothering with the Titan series cards, and we all know that consoles won’t be touching them.
At this point, only the 0.01% will be able to afford something like that, and that’s honestly not worth bothering about.
Makes me wonder why on earth they showed this tech so soon, in such an early state?
Don’t get me wrong, I think it is epic tech, but four Titan Vs! The software mode must use generic OpenCL, compute shaders, etc., with GPU acceleration (no reason not to), along with offloading some work to multi-core CPUs for more calculations. I just don’t understand how some of this tech will be in Metro: Exodus if it doesn’t do this; there must be something they are not disclosing. I believe you must be able to do partial ray tracing on certain effects only, like reflections, in a software GPU-accelerated hybrid CPU mode, or else Metro would never have it if it needed crazy multi-GPU configs no one has to run it.
I am going to wait and see when Metro is released. I bet the new GPUs will be out by then, and they will disclose that the software mode is in fact GPU accelerated and also uses multi-core CPUs to help calculate the ray-tracing effects. I would bet money it will work on the 1000 series too for partial effects in software mode, and that is exactly what Metro will have; nothing else makes any sense.
If we knew how powerful the 2000 series is compared to the 1000 series, then we’d really know if the above is correct. Man… I really need to know now LOL 🙂
Cool, I think you’re right; anything else would just be really bad design at this point. Baby steps are cool with me. I really can’t wait for Metro: Exodus regardless of the RT features. If the RT features in Exodus are really cool and work well, then that will just be icing on the cake.
Pretty epic times ahead for us PC gamers and video games in general, I can’t wait! Now if only we could settle the damn crypto miner stuff and move forward.
They’ve been doing this for years. Remember Sony showing off some of their tech early last gen? Half of it was never even used, and some of it flat-out couldn’t be used on last-gen hardware.
It mostly feels like drummed up hype of what we “could” have, rather than what we *can* actually have right now.
Going by the recent Metro demo of said RT tech, I honestly found it hard to notice any major difference from what they had shown before the tech demo was released.
All I know, is that if the 10 series cannot handle the tech, then it’s up to the 11 series, and if that series cannot master the tech within 5 years, then we’re clearly left with waiting for the 12 series.
People think this tech is going to be ingrained into every game going forward within 2-3 years, yet it’s clearly very, very demanding and hasn’t at all been refined. On top of that, the high-end sector of GPUs faces very little competition from AMD, which has of course left Nvidia free to stagnate and remain in a comfortable position where it doesn’t have to do much.
This proves quite the opposite. Real-time ray tracing has never been possible, and the fact that it is now means that in a few years it will be a consumer feature. “Pro”-level technology trickles down within a few years to what most normal people can afford, but if no one works on the tech, then it will never come.
Just look at the original Titan, for example. That was unobtainable power at the time, and shortly after, the 980 Ti made it completely obsolete for nearly half the cost.
By a “few” years you mean a decade, because if you just take a simple gander at the rate at which we actually get our hands on consumer-level, 100% stable hardware for new tech, it’s awfully slow.
You can easily look at Nvidia’s cards and how they perform with GameWorks tech: only a tiny handful of high-end cards can actually make full use of it, and even then they can’t fully “master” it or perform 100% better with it, which shows that we’re still a fair bit off from having hardware that can actually handle the tech introduced over the years.
Look at how the competition between Nvidia and AMD is going; it’s utterly abysmal, especially at the high end.
@JOHN, kindly correct the article content. The specs are wrong.
The Quadro GV100 is a single-GPU card with 5120 CUDA cores, and 32GB of HBM2 memory.
The info you posted (as quoted below) is valid for two of these Quadro GPUs connected via NVLink 2, not one. When paired via NVLink, it can offer up to 10,240 cores:
“” NVIDIA Quadro GV100 features 64GB of HBM2 memory, 10.240 CUDA cores and 236 Teraflops tensor cores “”.
https://uploads.disquscdn.com/images/bff9e1676186049f745448d841855ba140f303c1152031a7d0c65c6ffe23e510.jpg
Right on the money, that seems to be the case.
yeah, I was away for a couple of hours. Fixed.
Hey John, are you going to be giving away that spec as a prize in a future competition here? ;-D
Hahahahaha, definitely :P. Still, next month we’ll have a surprise (not a giveaway) for our AMD fans 😉
What resolution was it running at with these 4 cards ?
Don’t tell me it’s 1080p…
yeah, I was away for a couple of hours so didn’t have Internet access to fix some things. Fixed.
Soooooooooo… Nvidia does like “SLI” and/or multi-GPU configurations!? To show off their tech that won’t be available for years to come, of course.
Starting to get an extremely strong sense of hypocrisy from this company…
People here are seriously underestimating the growth in GPU performance. In 3 years I can easily see this demo running on a single flagship GPU at 1080p/60fps. The performance evolution of prior GPUs points to a pretty fast growth rate.
Awesome! So 5-10 years from now this will be the standard, running on my spiffy new x54 holographic “display” and a single NVIDIA X-Force laser memory module GPU. Can’t wait to see how many GPU generations NVIDIA dribbles out between now and then.