NVIDIA recently introduced to the public its new dynamic GI solution, called VXGI. VXGI has been integrated into a separate branch of Unreal Engine 4, and 3D artist “Byzantos” has been experimenting with it. Below you can find some screenshots captured with this new Global Illumination technique in full effect. According to Byzantos, this is the most accurate solution available to date (especially when it comes to the performance/visuals ratio). Enjoy!
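For context on what VXGI actually does under the hood: it voxelizes the scene and then gathers indirect light by marching a handful of wide cones through pre-filtered voxel data instead of tracing thousands of individual rays. The snippet below is a purely conceptual C++ sketch of that diffuse cone-gathering step, not NVIDIA's implementation or the UE4 branch's API; the VoxelGrid type, its sample() method and the cone layout are illustrative assumptions only.

```cpp
// Conceptual sketch of diffuse voxel cone tracing (the core idea behind VXGI).
// NOT NVIDIA's code: VoxelGrid, sample() and the cone set are assumptions.
#include <array>
#include <cmath>

struct Vec3 { float x, y, z; };
static Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 operator*(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }

// Radiance and opacity stored per voxel, pre-filtered into mip levels.
struct VoxelSample { Vec3 radiance; float opacity; };

struct VoxelGrid {
    float voxelSize = 0.1f;  // world-space size of a finest-level voxel
    // Returns pre-filtered radiance/opacity at 'pos' for a given footprint
    // 'diameter' (i.e. it would pick an appropriate mip level). Stubbed here.
    VoxelSample sample(Vec3 /*pos*/, float /*diameter*/) const { return {{0, 0, 0}, 0.0f}; }
};

// March one cone through the grid, compositing pre-filtered radiance
// front-to-back until the cone is fully occluded or leaves the scene.
Vec3 traceCone(const VoxelGrid& grid, Vec3 origin, Vec3 dir,
               float halfAngleTan, float maxDist) {
    Vec3 accumulated{0, 0, 0};
    float alpha = 0.0f;
    float dist = grid.voxelSize;                 // start one voxel out to avoid self-lighting
    while (dist < maxDist && alpha < 0.95f) {
        float diameter = 2.0f * halfAngleTan * dist;   // cone footprint grows with distance
        VoxelSample s = grid.sample(origin + dir * dist, diameter);
        accumulated = accumulated + s.radiance * ((1.0f - alpha) * s.opacity);
        alpha += (1.0f - alpha) * s.opacity;
        dist += diameter * 0.5f;                 // step proportional to footprint
    }
    return accumulated;
}

// Diffuse GI for a surface point: a few wide cones spread over the hemisphere
// around the normal stand in for thousands of diffuse rays.
Vec3 diffuseGI(const VoxelGrid& grid, Vec3 pos, const std::array<Vec3, 5>& hemisphereCones) {
    Vec3 total{0, 0, 0};
    for (const Vec3& dir : hemisphereCones)
        total = total + traceCone(grid, pos, dir, std::tan(0.5f), 20.0f);
    return total * (1.0f / hemisphereCones.size());
}
```

The performance/visuals ratio the artist praises comes from exactly this trade-off: each cone reads coarser voxel mips as its footprint widens, so a few cone marches approximate a full hemisphere of diffuse rays.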

John is the founder and Editor-in-Chief at DSOGaming. He is a PC gaming fan and strongly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES one of the best consoles ever made. Still, the PC platform eventually won him over. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC graphics cards.”

Some of the effects are great, but overall it’s an average effort (not the creator’s fault 😉 )
Here, I’ll give you something:
https://youtu.be/0DG51glKipU
Nice!
Spectacular.
specular* 😀
Look at the graphics
Epic! No pun intended. Love that Global Illumination stuff. GI adds so much to immersion.
This is some amazing dynamic global illumination!
I don’t like this lighting. It doesn’t look realistic at all.
I don’t know. Too much bloom and DOF filters.
The video looks nice but the pics look very crappy.
Looks good, but nothing can shock me anymore. Maybe VR can, but that’s all I can think of. Graphics alone can’t WoW me like they used to back in the day. Back when Half-Life 1 and 2 came out. Back when Doom 3 came out and people were amazed at the dynamic lights and shadows.
Seems like the better graphics get, the less stuff is in motion. Kinda like anime shows. Lots of motion? Low-quality drawing. One or two things moving? High-quality drawing.
Looks pretty cool.
Looks awesome! If only NVIDIA was interested in pushing PC gaming forward as a whole instead of letting this die as proprietary billsh!t!!
Oh well, at least we get low-level APIs thanks to AMD’s push.
I’ve come to respect NVIDIA less.
From proprietary, more expensive G-Sync to barely used fur, physics and face techs over the years, ugh.
AMD is helping us all out while NVIDIA is making a crapload of AAs and paying 5% of devs to make awful-looking, floaty, slow-motion hair movements.
Funny, months ago I liked NVIDIA, but it’s apparent to me that PC gaming as a whole matters little to them.
I’m not a long-time PC gamer, but it’s obvious.
Somehow AMD profits less but does more for us.
Think I will support them when HBM comes.
NVIDIA can take this tech and shove it!
Yes, I have a GTX 970 for now though.
Hope DX12/Vulkan can reproduce something like this so all devs use it on PC.
NVIDIA released VXGI a while ago, and now we have concept art from a very good artist.
We need to max out UE3 first lol
I want Samaritan graphics for all!
Dude, they opened PhysX. Many parts of the GameWorks library work on AMD, and so does VXGI; current AMD GPUs, along with previous NVIDIA GPUs, don’t support it due to hardware limitations.
They had to open PhysX in order to get it inside UE, and GameWorks is still black-boxed; no access to its source code is allowed to anyone but NVIDIA partners, so no one can optimize it or offer any optimizations.
If that’s the case, then why do FC4 and other GameWorks games run great on AMD cards with GameWorks on? Come on, dude… I’ve been PC gaming since the freaking mid-’80s, before all this discrete GPU BS with these companies was even around.
And NVIDIA at first had everything under GameWorks closed off, but that’s not the case anymore. Besides, NVIDIA already fired back at those “black box” statements.
AMD only made such a big deal out of it because their Radeon SDK is still going nowhere after three years, only being in three games… And the third game is not even out yet, and it’s timed exclusive to Xbox One…
Nvidia’s Bryan Del Rizzo, who was also on the call, was quick to point out that AMD’s claim of Nvidia removing “public Direct3D code samples from their site in favor of a ‘contact us for licensing’ page” simply isn’t true. Nvidia’s developer site contains open source, freely downloadable code samples as recent as March of 2014…
AMD was just acting like overgrown babies, and all the BS you keep typing is from back when R Reed was telling Huddy what to say. Now that Lisa is in charge, you don’t see AMD making stupid, ignorant comments about GameWorks anymore.
FYI, it has been inside UE since UE3.
So when it’s not proprietary, it’s awesome, but when the same thing is proprietary, it’s “billsh!t”? 🙂 Come on. If something is good, then it is good even if it’s not for everybody.
It can be proprietary, but it has to have source code availability. You cannot expect other companies’ hardware to run code they cannot review, optimize, or fix problems with. NVIDIA has to do it all, and since NVIDIA is well known for optimizing its code to work only on certain hardware, it is certainly not the best situation for developers, customers, or the PC platform.
It depends. NVIDIA is offering source code to developers under a special licence. They cannot share it, but I think they can make changes to it. Am I wrong?
UE4 … Almost as good as Cryengine.
I can’t help but think that every time I see one of these articles.
UE has features that CE doesn’t have, and the same goes for CE.
So you can’t really say that one is better than the other.
UE4 is ten times better than CryEngine. Especially when it comes to tools provided.
Is it better than the removed SVOGI? It certainly doesn’t look better.
AC Unity indoors look way better than this imho
That is because it uses baked lighting.
Unreal Engine 4.6? 4.6, really? They passed the halfway mark before a noteworthy game even launches based on UE4. Strange how they seem eager to move on to number 5, when UE3 lasted a decade.
P.S. The video is great but the screenshots look terrible. Aesthetically it reminds me of Crysis 2, and that is not a good memory.
It’s not necessarily the case that they’ll jump to 5 after 4.9. Lots of software has a 10th iteration or above, like 4.10, 4.11, etc.
It hurts my eyes just reading that. How could tech people do something like that? 4.10 bigger than 4.9? MY EYES!!!
Actually it’s pretty common. Usually the version numbering is x.x.x, so 4.10 isn’t always the same as 4.1.0.
I think they are already on 4.8
I’m fine with additional dots as long as we’re talking single digits. 4.1.1 is fine because there is no rule for additional dots in maths. You see that second dot and you know it’s a special kind of notation. But you don’t know that with an atrocity like 4.10. 4.10 would suggest 4.1 with a higher level of accuracy. It also can’t mean 4.TEN. If you want to count to ten after the dot, you’re going to have to start at 4.01, not at 4.1.
It’s best to follow the release notes where available. There’s no standardized software version numbering, so like I said it’s random and doesn’t necessarily follow any mathematical system.
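To make that versioning argument concrete: strings like 4.9 and 4.10 are compared component by component as integers, not as decimal fractions, which is why 4.10 comes after 4.9 and is not the same as 4.1.0. A minimal C++ sketch of that comparison; the parseVersion helper and the example strings are purely illustrative, not any engine’s actual API.

```cpp
// Component-wise version comparison (e.g. why UE 4.10 > 4.9, and 4.10 != 4.1.0).
#include <algorithm>
#include <iostream>
#include <sstream>
#include <string>
#include <vector>

// Split "4.10.1" into {4, 10, 1}: each dot-separated part is a whole integer.
std::vector<int> parseVersion(const std::string& v) {
    std::vector<int> parts;
    std::stringstream ss(v);
    std::string item;
    while (std::getline(ss, item, '.'))
        parts.push_back(std::stoi(item));
    return parts;
}

// Compare component by component; missing trailing components count as 0.
bool isNewer(const std::string& a, const std::string& b) {
    std::vector<int> pa = parseVersion(a), pb = parseVersion(b);
    size_t n = std::max(pa.size(), pb.size());
    for (size_t i = 0; i < n; ++i) {
        int x = i < pa.size() ? pa[i] : 0;
        int y = i < pb.size() ? pb[i] : 0;
        if (x != y) return x > y;
    }
    return false;  // equal versions are not "newer"
}

int main() {
    std::cout << std::boolalpha
              << "4.10 newer than 4.9: " << isNewer("4.10", "4.9") << '\n'      // true
              << "4.10 equal to 4.1.0: " << (!isNewer("4.10", "4.1.0") &&
                                             !isNewer("4.1.0", "4.10")) << '\n'; // false
}
```

Reading the part after the dot as a decimal fraction is what creates the “4.10 is smaller than 4.9” illusion; the release notes order versions by this component-wise integer rule.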
It’s awesome; however, there’s not a single tutorial on how to build the damn NVIDIA stuff, and EVERY coder/programmer I know acts like they’re the kings of the world, and they’re not. Actually, a coder can barely draw a circle or a straight line on paper, and without the 3D models/assets they would be nothing.
Hopefully it’ll be in our video games soon.