NVIDIA Marbles RTX Demo looks insane in these screenshots, running in real-time on a single RTX GPU

Gavriil Klimov, Senior Art Director at NVIDIA, has shared some amazing screenshots from NVIDIA’s latest real-time RTX tech demo, Marbles. The NVIDIA Marbles RTX demo uses fully ray-traced effects and showcases some truly incredible next-generation graphics.

As Klimov stated, this tech demo runs in real-time on a single NVIDIA Quadro RTX 8000 graphics card. Unfortunately, Klimov did not specify the framerate or resolution. Still, the fact that we can have this kind of graphics on a current-gen graphics card is insane.

Going into more detail, NVIDIA Marbles is a fully playable game that is entirely ray-traced, denoised by NVIDIA’s AI and DLSS, and obeys the laws of physics. Yep, NVIDIA describes this as a minigame that players can actually play.

Sadly, we don’t know whether the green team plans to release this tech demo/game to the public. To be honest, though, NVIDIA should simply hit the “Publish” button. This tech demo looks insane, and it’s one of the best ways to showcase what developers can achieve with Ray Tracing.

Below you can find some screenshots from that tech demo. Keep in mind that these are real-time graphics and not pre-rendered stuff. This is a look at the future of PC video-game graphics.

Enjoy!

65 thoughts on “NVIDIA Marbles RTX Demo looks insane in these screenshots, running in real-time on a single RTX GPU”

  1. Very realistic indeed. Those showcases are usually great, but never too real… until now. Now it’s finally the time where we can all get fooled.

    Show a gaming pic (or a tech demo) from 5-10 years ago, and people would never think it’s real. The question now is: when will this kind of graphics be in common $60 games?

    1. I give it 5-10 more years until we actually see this in common triple-A games.
      I know it sounds rather cynical, but that’s because of how poorly optimized games seem to run nowadays.
      With graphics that really don’t look all that different from a few years back.

      The RTX demo looks very real though, but there is one thing I saw that immediately ruined the realism for me: the edges of the screen have a brutal amount of chromatic aberration.
      I know this effect can actually happen when taking pictures with a smartphone or camera, but it’s definitely something I don’t want to see in games or in RTX demos.

      1. Yeah, chromatic aberration is an auto turn-off for me. I hate how games look with it, and I hate it even more when devs don’t let us remove it.

        1. For me, I don’t want better graphics… yet. I do want better graphics eventually, but first I want games to be better optimized and more fun overall. Then we can add better graphics.

          For the last couple of years, I feel like games have declined too much in performance optimization and become less fun overall.

          Not gonna list any examples since it would open up another discussion entirely.

          I just want publishers to start caring about releasing games, especially PC games, that run well again. Then we can focus on implementing those huge graphical changes, but it should not come at the cost of performance. I mean, a little performance cost is just how it is, but we have had unreasonable performance costs lately.

        1. Exactly. It’s small, and therefore easier to render than a huge open-world map.

          An open world with these visuals would probably melt even the high-end RTX cards.
          I know it sounds exaggerated, but I’ll stick to it.

          Would be nice to see this level of graphics in triple-A games, running well optimized, but I’m not gonna hold my breath for it in the near future. I’ll wait and see.

          1. Rockstar pulled off really nice-looking graphics with RDR2. And yeah, I played on PS4 Pro, so the resolution was kinda bad, but I could still appreciate what they did.
            What’s cool about RDR2’s graphics is that it’s not the models or textures per se, though those are good. Somehow they produce lighting that fakes the real world incredibly well.

          2. RDR2 does look really nice. I just wish the PC version was better optimized; even with high-end GPUs and graphics settings on par with the PS4’s, the game still has stability issues and weird performance degradation.

            I do get that RDR2 is kind of a benchmark title, like GTA V, in that it has so many settings it future-proofs itself for newer GPUs down the road. But it still has optimization issues I really didn’t expect from Rockstar today, given how well they handled optimization after the huge mess in GTA IV.

    2. Possibility to do these visuals has existed for some time now. Devs always target base consoles and then fold in some PC flourishes (future-proofing them a bit).

      But no one is really pushing the envelope *too hard*, given they know PC hardware will take a long time to get to acceptable performance. And devs don’t want PC visuals looking all that drastically improved vs console visuals.

      Think back to 2009-2011, when we got DirectX 11 hardware and devs started adding DX11 features to games that were DX9 at their core. Modern hardware of the day could push really high fps, high resolutions, higher LODs and better-precision effects at same-ish console visual output. But add the relatively light DX11 visual improvements and performance would plummet.

      We’re a long way off from DXR becoming a standard.

      1. >Possibility to do these visuals has existed for some time now.

        That is flat out wrong.

        Sure you could remake this scene with baked lighting and have it run much faster even, but this is entirely ray traced at playable game framerates, it’s entirely dynamic, without the limitations of traditional rasterization.

          1. Not sure how this statement is supposed to support your point.
            Wtf do visuals matter if you can’t get a playable framerate, or if it’s just the equivalent of a 3D picture without the interactivity a game needs?

          2. All in all, I’m saying we had the tech; we didn’t have the performance. Nor will we have it for the next 8 years on PC, for next-gen-looking games that are fully, 100% ray-traced, with zero rasterization.

        1. Yeah, but if a game were to launch like this right now, today, lots of people would complain on forums and review-bomb the game on Steam, saying that the game is “unplayable”, “too demanding”, “Nvidia-biased” (because AMD does not support ray tracing), using “demanding effects which necessitate a high end card which is out of reach for most gamers”, “could have used cheaper, traditional techniques for a similar appearance”, not “well optimized” (because it runs poorly on their integrated graphics), etc. So of course developers find cheaper techniques that produce similar-looking results to please the crowds.

    3. Arma 3 can still fool me sometimes; great lighting, texturing and art design play a big part.
      If ray-traced lighting becomes the standard, though, great lighting will be the norm.

      It’s a shame no game in the next 15 years will look like this though. And no, walking sims don’t count.

  2. $5,500 card, probably still less than 1 fps (why no video?).
    Still, very impressive to look at. Even if it likely pushes the boundaries of what “real time” could mean.

    1. There is a video, just search on YouTube…
      It’s 30 fps, and it would run the same on a 2080 Ti; they use a Quadro because this is demoing new software aimed at professionals.

    1. The Unreal 5 demo was done without ray tracing and runs much faster… it’s representative of what you can do in an actual game, and it ran on a $500 console.

      This is a glimpse into the far future of game rendering, not something you’ll see anytime soon. It’s also not running on a commercial game engine.

      1. It’s funny how people perceive this kind of stuff as unobtainable. This is real and we could do it right now; the issue is consoles couldn’t, and graphics don’t really matter in the grand scheme of things when it comes to selling games, so it’s not really worth going the extra mile for any dev.

        That’s all there is to it.

        1. What are you on about?
          You would need a 2080 Ti on the consumer side to run this demo at similar performance. And again, it’s a controlled playable demo in a small environment, and it runs at 30 fps at an unspecified resolution using DLSS.

      2. “This is a glimpse into the far future of game rendering, not something you’ll see anytime soon”

        More like never. Too expensive.

      3. But to be honest, I doubt the things done in the UE5 demo will ever be fully implemented, even in next-gen games.

    2. Running on an RTX 8000 Quadro, a $5,600 GPU… almost 3 times the price of a very high-end PC with an RTX 2080 Ti.

      1. The 2080 Ti is only 10% slower at ray tracing than the RTX 8000, and it runs at much faster CUDA clocks. The demo would perform about the same on it, but this is marketed at industry people and content creators. It’s a demo of an upcoming software suite from Nvidia.

      2. The Quadro RTX 8000 is just the fully enabled TU102 chip. RTX 2080 Ti performance is not that far from the RTX 8000; in fact, it’s possible the RTX 2080 Ti ends up with similar or faster performance, since Quadros usually clock much lower than GeForce cards.

  3. Truly high quality. Hope to see this level of visual quality available for gaming before I die.

  4. The RTX 8000 only has 10% faster ray tracing: 11 GigaRays/s vs 10.

    They have the same number of tensor cores; the 2080 Ti has a few fewer RT cores, which accounts for the difference above.

    The big difference is 48 GB of VRAM, but that certainly wouldn’t be necessary for this. They demoed it on a Quadro because it’s a show-off of their new content-creation platform meant for industry people.

    This would run about the same on a 2080 Ti.

    1. Oh yeah man, that’s indeed correct. My bad. I mistook this card for some other totally different model, or maybe not.

      Anyways, I’ve been having a ‘brain fart’ since this morning, LOL! Btw, yeah, I know this card is for content creation, hence the premium price.

      1. Gotta like a guy who knows his s**t, but can admit when he messes up. Nobody’s perfect, bud.

  5. I don’t trust these showcases. We never got anything as nice as what the Unreal 4 demo showed, and now “we” are already hyping Unreal 5.

    1. UE4 demos ran on the highest end PCs available at the time.

      The UE5 demo ran on an actual PS5.

      Can you spot the difference?

    1. Looks less impressive in motion. The shadows and reflections are much lower detail, and the static objects stand out and look a lot less natural.
      Still, not bad.

    1. The mention of the Quadro RTX 8000 is just for marketing purposes. In reality, the RTX 8000 more or less sits at RTX 2080 Ti performance, so yes, a single consumer-grade RTX card can run it.

  6. Why wouldn’t you share this with everyone? It does nothing but sell and push graphics cards. The only reason not to share it is if it was fake or BS, like the PS5 hype. LOOOOOOOOOOOOOOL

  7. Meanwhile, yesterday, I was checking what the best games ever were on RPGCodex…
    Results: 95% are from the late 90s/early 2000s.
    BECAUSE GAMEPLAY IS WHAT MATTERS.
    And back then, games had actual fun gameplay.
    Who cares about such graphics if gameplay is dead?
    For the last 2 years, I haven’t even played AAA games anymore, only indie games, which are usually in 2D, or even pixel art…

  8. Not surprising, knowing that PC is always a generation or two ahead of consoles. Just like the Epic demo for PS5, PC gets the Marbles demo. The next generation of NVIDIA cards, the RTX 3080, will be able to do this.

  9. And the most-played Steam games will remain low-end graphics games.

    Most gamers don’t care; they want Global Offensive and cheaper mid-range cards.

    I expect the RTX 2060 to dominate on Steam by 2021-2022.

    The RTX 2080 Ti is 0.73% of Steam users.

    Most popular is the GTX 1060.

    PC gamers will hold back consoles for about 2-3 years.

  10. …talking about ‘publish’,

    Nvidia has the power, not by pressing a button, but by typing 2 or 3 lines of code, TO TRANSFORM a powerful graphics card, with 40 or 50 GB of memory, 100 or 200 CUs (compute units) and 1 trillion transistors, INTO an $80 graphics card…

    Think… meditate… (and also remember: 10 years ago, a high-end gaming graphics card, the best money could buy, would cost 600 or 700 bucks. Today, $1,600. By 2021-2022, $2,000 to $2,500???)

    IF NVIDIA can take a hypothetical $50,000 ‘pro’ graphics card and TURN IT INTO a cheap, 25-year-old Sega Megadrive/Nintendo SNES-like graphics chip WITH some drivers and a couple of lines of code,

    and at the same time they CAN TURN a $1,600 gaming card into a miserable, ultra-low-performance ‘pro’ card (2D/3D design, rendering, AutoCAD/ArchiCAD/3ds Max/etc.), capable of encoding only 10 frames per minute,
    WHY,
    TELL ME WHY…
    … why WOULDN’T NVIDIA, on purpose, with their beautiful drivers, seriously CRIPPLE even the best and most expensive gaming cards, SO THESE CARDS are never powerful enough to even run the freshly released AAA title:

    – it creates an eternal, never-ending NEED for an upgrade, in the quest for 4K/120 FPS/ALL ULTRA settings…
    – the gamer NEVER HAS too much power (otherwise he would spend 1,500 or 2,000 bucks ONCE, and for the next 2 or 3 years he wouldn’t need to upgrade his setup)
    – Nvidia supports ultra-old computer setups, because the guy will not buy a new CPU, but he can still BUY a high-end graphics card, at max $$$

    TELL ME,
    what would happen to Nvidia if tomorrow you could buy a $2,500 gaming card?
    Well, Nvidia wouldn’t be seeing a single dime from you before 2023… because you would have extra power to run all your games at super high quality.
    WOULDN’T IT BE BETTER, for Nvidia, if you bought a new $1,500 card in 2020, a $1,600 card in 2021, a $1,700 card in 2022, and an $1,800 card in 2023?

    WHY can’t Nvidia release a $10,000 gaming card, with 4 RTX 2080 Ti chips, 48 GB of memory, and 300 CUs, even if it would require a $500 liquid-cooling solution?

    THIS is not a conspiracy theory…
    How can a $400 Xbox One X console generate RDR2 graphics almost as good as a $5,000 high-end PC?

    Could it be BECAUSE NO DRIVERS are slowing down the hardware on purpose, and everything can run at its maximal capacity?

    How is it possible that a game like RDR2, with 7 years of dev, extremely optimized, when run on a $6,000 super gaming PC, with countless CPU cores, NVMe, RAM, etc., and the best current Nvidia graphics card, can barely do 4K at 60 fps, and ONLY at Very High to High (global) settings, with the hundreds of visual options (shadows, AA, textures, lighting, etc.) ALL SET at MED or LOW…

    Damn, it’s just RDR2, with a horse, a character and a ton of tree and grass textures, not a 3D movie at 8K with full real-time ray tracing!

    Yes, I am sure Nvidia, via their drivers, can slow down the most powerful gaming card! Of course they do it.

    Instead of attacking me and calling me an idiot, just think for 1 minute about what can be achieved with drivers alone!

Leave a Reply

Your email address will not be published. Required fields are marked *