At GTC 2017, NVIDIA revealed its first Volta GPU, which is aimed at the very high end of the compute market. And while a gamer/consumer GPU based on Volta most probably won't be coming out until 2018, the green team showcased Square Enix's Kingsglaive: Final Fantasy XV running on it.
In case you didn't know, Kingsglaive: Final Fantasy XV is a CG movie based on Final Fantasy XV, and this tech demo features its protagonist, Nyx Ulric.
The fact that this scene runs in real time is simply mind-blowing, even though it features only one detailed character in a somewhat basic environment.
Enjoy!

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best consoles ever. Still, the PC platform eventually won him over. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on "The Evolution of PC Graphics Cards."
I'm holding off on upgrading my 980 Ti until 2nd-gen Volta comes out, but when it does, it'll be glorious. Mostly waiting for a single GPU that can drive 4K with few issues.
Saaaaammeeeeee
Same here with a 980, hoping for a big upgrade.
2018 then
Yeah same here. Still going strong on my 780 🙂
But when I do upgrade, I'll pair a good 4K graphics card with a 34-inch ultrawide monitor with FreeSync or G-Sync. For now, at 1080p, my card still handles everything with some tweaks.
we can only dream
This is the most sane decision I've heard in a long while pertaining to GPUs. Anyone that owns a Titan X/980 Ti shouldn't be buying a GPU till 2019/2020… and I'm sure you could even stretch it longer.
I'm going to hold off on upgrading until second-gen VR headsets drop. By then, in 2018 or so, we'll probably have an 1180 Ti or something.
This means nothing. People are getting excited over nothing. I'm a green-team fan, but meh. Volta is kinda far away for consumers. NVIDIA is annoying for this: you get the best GPU available, and two months later, BAM, they're showcasing something stronger that will cost less.
I guess that's the way it is with AMD being nowhere in the high-end GPU market.
I'm on the green team atm, but Volta just feels like a more expensive repeat of what we already have. We won't be getting games that make full use of it either; these days we get current-gen games designed around what console specs can handle, not what high-end PCs can do.
Also, Volta's price isn't likely to be nice and cheap; probably more like the 1080 Ti's price point.
We should be reaching a stage where high end doesn't cost an arm and a leg, not one where it gets more and more expensive. But I guess humanity loves ramping up prices for tiny percentage gains.
It is only logical for the high end to become more expensive as development costs continue to go up. Yesterday JHH mentioned that $3 billion was spent on R&D to make Volta happen, and if I'm not mistaken it was $2 billion for Pascal. That's $5 billion for just two architectures. Technically NVIDIA could keep selling their GPUs at the same prices as back in 2010, but if they did, we probably wouldn't have what we have today either.
Not really; otherwise, by that logic, we should all be minted enough to cover the ever-rising costs of future high-end cards. Except that's not how economics works, so we won't be able to cover rising costs indefinitely.
Is it really going to be worth buying a single GPU in the future that goes for the price of a high-end Lamborghini? I find that hard to defend.
For consumers, that is the case. But a company always has to think not just about increasing revenue but also about the profit margin it can get.
And if they aren't careful, their future sales can tank if the whole strategy is just raising prices each year. Bubbles burst and markets shift; those are very, very important changes to look out for. Simply raising the price each year isn't going to work forever.
NVIDIA is aware of that. People say the pricing increases every year, but after looking back at NVIDIA's pricing over several generations, I don't really think it has. The 1080 Ti's MSRP is $700; that is the same MSRP the 780 Ti had back in 2013. Aside from the x80 tier (and the FE scheme), the other cards pretty much still cost the same across generations:
entry level:
650 Ti: $150
750 Ti: $150
950: $160
1050 Ti: $140
mid range:
660: $230
760: $250
960: $200
1060: $250 ($300 FE)
high end (x70):
670: $400
770: $400
970: $330
1070: $380 ($450 FE)
high end (x80):
680: $500
780: $650
980: $550
1080: $600 ($700 FE)
ultra high end (x80 Ti):
780 Ti: $700
980 Ti: $650
1080 Ti: $700
This is what NVIDIA should have shown us last year. As per the original roadmap, Volta was supposed to arrive directly after Maxwell, but some hurdles in development forced NVIDIA to slip Pascal in between Maxwell and Volta.
What did you expect? For NVIDIA to just stop releasing GPUs? Nobody is forcing you to buy the latest and greatest. Don't blame NVIDIA for your buyer's remorse.
The latest and greatest should cost pennies; that's the online communist way of thinking.
Wow, you jump to conclusions quite fast, bro; maybe you should step down a little? And NVIDIA is very greedy, since they practically own the high-end market, so they do whatever they want. If AMD could get their high end out AND be successful, NVIDIA would have no choice but to change their over-the-top pricing strategies.
Lol, you didn't respond to my comment at all. What you said has nothing to do with your original comment. I agree that NVIDIA is greedy; that's a perk you get to have when your competition is (currently) trash. But my OP was in response to you saying that it wasn't fair that NVIDIA keeps releasing better cards.
It is true that I didn't go into details. It feels like NVIDIA shafts its consumers. Take the Titan X and its price/performance ratio: I know the Titan X isn't a "gamer" card, but still, the Ti that follows fares better or equally in nearly every aspect. They're making it a trend to outperform their kingpin card three months later, and at a much lower price. Why not launch the Ti version at the same time as, or before, the Titan version of a given chip? Money. And I hate it.
I still don’t see myself going to AMD anyways so…
That frame rate... and we're talking about the next GTX Titan (Volta) here...
I was thinking the same thing. Either that was the camera slowing down, or that supposedly powerful GPU was choking under that load.
If you watch the keynote, JHH mentions that 10 days ago or so he asked Tabata to give him something to show at GTC. This was something done quickly at JHH's request, so there might not be much optimization going on. The frame rate might be choppy, but I think this footage was originally supposed to be rendered on a render farm, not in real time by a single GPU.
Isn't it running on the high-end compute Volta? Remember, with Pascal we had GP100, which consumers never got, and GP102, which the Titan X variants and 1080 Ti used. From the wording, it seems like this was running on "GV100".
The video itself was just really choppy.
Call me when we get games that make full use of the GPU and look miles ahead of what the current gen can do. Until then, I'll stick with what I've got. I also don't see Volta launching at a good price; if the 1080 Ti is $700, then I can see Volta costing even more.
Wow. It can render one character from the movie in real time. Just wow. Will I get an autographed leather jacket if I pre-order a DGX-1V?
Ooohh I saw that movie months ago. Will check this out soon. I didn’t think any computer would be able to run that level of detail in real-time any time soon. The movie looked incredible, I recommend it for those who haven’t seen it, just for the CG and action alone. The story was really confusing though.
The sad thing about all of this, though, is that most of the stuff I've seen NVIDIA demonstrate, even from years back, has yet to be implemented in games properly. And I don't mean NVIDIA's proprietary tools; I'm talking about stuff developers could try to implement in their own engines: things like realistic hair and water simulation or subsurface scattering. I know the problem really comes down to the limitations of consoles, but publishers need to realise that the majority of PC gamers buy computers to get the best out of their games. It's the reason why so many games age so badly.
But I understand, there's probably a lot more to it (e.g. Sony and Microsoft bribing devs and publishers to ensure there's parity on all platforms).
Hm, I guess I just haven’t noticed it well enough. I know games like MGSV: The Phantom Pain featured some really nice tech on their engine.
Sea of Thieves? I remember watching playthroughs of the game in the past. The water in it does look nice, especially for a cartoony-looking game, but it certainly didn't seem real or dynamic to me. Perhaps the way the boat moves in the ocean seemed convincing, but not so much how the ship becomes flooded inside. You don't see pressure through the damaged parts or water trickling down the staircases as you're running around fixing the hull of the ship. Those kinds of things were faked back in old games like Modern Warfare.
Perhaps you're right, though, that it may be too early for those kinds of things to run in real time in a proper game, even though there was one indie game, I believe, which tried to emulate realistic water physics (Hydrophobia).
Ah, I see. What I meant specifically was the way light would shine on someone and through some layers, like on the hand; if the light is bright enough, it would show subtle details like the silhouette of the bones and veins within. These are features that obviously won't make a game any better, or things you'd only notice on a subconscious level as you're playing, but they can be impressive to have nonetheless. Especially for those who like taking screenshots, like myself.
Perhaps I've just grown from being a standard gamer into a perfectionist, but I really did think back during the last generation that we would be at this stage by now, where everything can be dynamically simulated and looks lifelike. Especially AI. That's not to say nothing is impressive currently; I was indeed impressed by some things, like the environments in Ghost Recon (the character models look awful, in my opinion).
Well, perhaps I was overthinking it when it comes to realism xD But my point still stands with the whole transparency through the hands and such. That's right, with games like Star Citizen I feel it's important that they nail those aspects correctly. It's striving to be great at everything anyway, not just with visuals.
Like you said, games like The Last of Us 2 are already demonstrating some convincing-looking skin. If I recall, when Ellie clenches her hand, you see some subtle detail around the knuckle areas. I'm not sure if those things will be present in the final product either, but they usually help to make the cinematics more engaging, or when you're seeing it all in first person. I never made any direct comparisons, but Uncharted 4 seems largely the same compared to its reveal. Being a console-only game, though, I could be wrong.
Interesting. Well, I may give TLoU2 a look at some point, but it being a console-only game kinda takes away a lot of the intrigue. I'm really liking the amount of detail that goes into Star Citizen, but yeah, you're right. I remember one of the solutions for decent performance in that game was to let the engine cull all kinds of rendered objects and materials that can't be seen from the player's position. Even things like shirts and so on would have parts removed if there's really no point in rendering them in the first place. Usually I'm not all for cutting corners, but for things like optimisation it makes sense. It's something a player wouldn't notice anyway.
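The technique being described is what engine programmers usually call visibility culling: anything the camera can't see never gets sent to the GPU. Below is a minimal C++ sketch of the idea; all the names are invented for illustration, the test is a simplified distance/facing check rather than a full frustum or occlusion test, and this is generic culling, not Star Citizen's actual code.

    #include <cmath>
    #include <cstdio>
    #include <vector>

    struct Vec3 { float x, y, z; };

    struct Renderable {
        Vec3  position;
        float boundingRadius;   // sphere that fully contains the object
    };

    // Toy visibility test: cull by draw distance and by a simple
    // "behind the camera" check. A real engine would test all six
    // frustum planes and often an occlusion structure on top.
    bool isPotentiallyVisible(const Renderable& r, const Vec3& camPos,
                              const Vec3& camForward, float maxDrawDistance) {
        Vec3 toObj = { r.position.x - camPos.x,
                       r.position.y - camPos.y,
                       r.position.z - camPos.z };
        float dist = std::sqrt(toObj.x * toObj.x + toObj.y * toObj.y + toObj.z * toObj.z);
        if (dist - r.boundingRadius > maxDrawDistance)
            return false;   // too far away to matter
        // A negative dot product means the object is behind the camera.
        float facing = toObj.x * camForward.x + toObj.y * camForward.y + toObj.z * camForward.z;
        return facing + r.boundingRadius >= 0.0f;
    }

    int main() {
        std::vector<Renderable> scene = {
            { { 0.0f, 0.0f,  50.0f }, 5.0f },  // in front of the camera
            { { 0.0f, 0.0f, -50.0f }, 5.0f },  // behind it: gets culled
        };
        Vec3 camPos     = { 0.0f, 0.0f, 0.0f };
        Vec3 camForward = { 0.0f, 0.0f, 1.0f };

        std::size_t drawn = 0;
        for (const Renderable& r : scene)
            if (isPotentiallyVisible(r, camPos, camForward, 10000.0f))
                ++drawn;    // only these would reach the GPU
        std::printf("%zu of %zu objects survive culling\n", drawn, scene.size());
        return 0;
    }

The payoff is exactly what the comment above describes: objects behind the camera or beyond the draw distance cost nothing, and the player never notices they were skipped.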
Ah, fair enough. Well, now I know the name of it 😛 Some good points you stated. I can also see how pointless it would be, as it would make any memory stick and GPU with their limited storage pointless to use. Even if there was potential headroom for a less demanding game, it would still be a case of a badly optimised one, as people would need much more powerful hardware to run it, I guess. Especially if it's a game that looks like it could easily run on a dual core or less at max settings.
I'm guessing that's similar to the method games like the Half-Life series used, although in a more primitive manner, since the progression in those games felt kinda seamless as you moved from one point to another back then, despite how long those loading screens took.
I'm totally amped for 3.0 as well; seeing all the updates is always reassuring 😛 But I'm just not trying to expect a lot from it. I sure hope it provides a lot more replayability. I actually never had the chance to play the game again ever since 2.0 came out, since my computer is below the minimum specs. But 3.0 is certainly making me wanna upgrade my PC so I can run it again 🙂 When the game is finally out, or when SQ42 comes out, that's when I plan to have a 4K system ready… assuming it's possible to run it at 4K max settings at 60fps lol. Wanna get an ultrawide monitor for it too. If there's one game worth going all out with fancy hardware for, it's this one for sure.
Oh, I'm aware. But I meant the flow of the game: rather than being kicked back out to the main menu, you're basically loaded into the next portion of the game right away in HL. Battlefield campaigns were able to let you advance through quite huge maps once you finished an objective in a certain area. Star Citizen, I think, is doing something similar to Elite Dangerous, except it's not using the hyperspace stuff as a way to mask the loading, since you can get from point A to point B without using it. Which is pretty amazing. I don't know if there is any other space sim that does the same thing.
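What usually makes that "no loading screen" flow work is background streaming: the engine starts loading the next zone on a worker thread well before the player gets there, so crossing the boundary is near-instant. A rough C++ sketch of the pattern, with every name invented for illustration (this is the general idea, not Star Citizen's or Elite's actual code):

    #include <cstdio>
    #include <future>
    #include <map>
    #include <string>

    struct Zone { std::string name; /* meshes, textures, entities... */ };

    // Stand-in for slow disk reads and decompression.
    Zone loadZoneFromDisk(std::string name) {
        std::printf("loading %s in the background...\n", name.c_str());
        return Zone{ name };
    }

    class ZoneStreamer {
        std::map<std::string, std::future<Zone>> pending;
    public:
        // Kick off a load well before the player reaches the zone boundary.
        void prefetch(const std::string& name) {
            if (pending.count(name) == 0)
                pending[name] = std::async(std::launch::async, loadZoneFromDisk, name);
        }
        // By the time the player crosses over, the load is usually finished,
        // so this returns almost instantly and the transition feels seamless.
        Zone enter(const std::string& name) {
            prefetch(name);  // fallback in case we never prefetched
            Zone z = pending[name].get();
            pending.erase(name);
            return z;
        }
    };

    int main() {
        ZoneStreamer streamer;
        streamer.prefetch("crusader_orbit");        // player is flying toward it
        Zone z = streamer.enter("crusader_orbit");  // no visible loading screen
        std::printf("entered %s\n", z.name.c_str());
        return 0;
    }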
I'm actually quite behind on those techs; I'm still trying to get a better understanding of G-Sync and AMD's equivalent, but I've got some materials I can look into. I need to see HDR in action before I can be sold on it, but yeah, I'm hoping there'll be a decent price drop by then. It is really expensive, but what new tech isn't?
I imagine the method SC is trying to achieve is a more realistic one, since I don't think you're simply being teleported from one system to another. I do recall them mentioning that it's not a simple process; you would have to traverse many types of locations in order to get to a certain point. Kind of like Elite, I believe, but I think Elite has you simply jumping from one system to another with no way of stopping once you've made the jump?
And oh, okay, so it sounds like the V-Sync method but with the monitor doing all the work, rather than part of it being done by the graphics card. So I take it that makes the V-Sync option in games kinda useless then, giving GPUs perhaps more headroom for better performance. Or maybe I missed the point entirely lol. The names do sound very similar to V-Sync.
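That's roughly the right picture. With V-Sync, the GPU has to wait for the panel's next fixed refresh tick, so a frame that just misses the tick is held for a whole extra interval; with G-Sync/FreeSync, the panel refreshes the moment a frame is ready. A toy C++ timeline makes the difference visible; the frame times are made up and no real display API is involved:

    #include <cstdio>

    int main() {
        const double refreshMs = 1000.0 / 60.0;                 // fixed 60 Hz panel
        const double frameTimesMs[] = { 14.0, 18.0, 25.0, 16.0 }; // GPU render times

        for (double ft : frameTimesMs) {
            // V-Sync: the frame is shown at the next refresh boundary after
            // it finishes, so an 18 ms frame waits until 33.3 ms.
            double vsyncShowMs = refreshMs;
            while (vsyncShowMs < ft) vsyncShowMs += refreshMs;

            // Adaptive sync: the monitor refreshes as soon as the frame is
            // ready (within the panel's supported range), so nothing is wasted.
            double adaptiveShowMs = ft;

            std::printf("render %5.1f ms -> vsync displays at %5.1f ms, adaptive sync at %5.1f ms\n",
                        ft, vsyncShowMs, adaptiveShowMs);
        }
        return 0;
    }

So yes: with adaptive sync active, the in-game V-Sync option is largely redundant as long as the frame rate stays within the panel's supported refresh range.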
Talking of Volta, still haven’t seen Vega (:
Can’t wait to buy a Volta Ti.
very big downgrade potential x’)
This doesn't really mean anything. These assets were already being rendered on existing graphics cards, otherwise this movie wouldn't have been made. Video games aren't going to look anything like this once you factor in AI, NPCs, unpredictable movement, world rendering, and the need to hit any sort of playable frame rate.