Ray-tracing in games requires 100X more powerful GPUs, photorealistic virtual reality requires 40X

As we’ve said numerous times, we believe that ray-tracing is the future of lighting in video games. While there have been some attempts in various tech demos to implement a fully ray-traced rendering system, we haven’t seen any triple-A game featuring one, and from the looks of it we won’t see such a thing anytime soon. According to NVIDIA’s president in Brazil, Richard Cameron, we need graphics cards 100 times more powerful to achieve this.

As Richard told TecMundo regarding the future of PC gaming, we need 100 times more powerful GPUs for real-time ray-tracing. In addition, Richard claimed that for photorealistic virtual reality we need 40 times more powerful GPUs.

“In order to achieve ray tracing in games, the GPU must have 100 times the processing power it has today. To have virtual reality, you need 40 times more powerful GPUs.”

Now, while VR is possible in video games today, Richard is talking about photorealistic graphics. As Richard concluded, when the hardware is ready, players will put on their VR glasses, play, and be unable to distinguish the real from the virtual.

In other words, PC gaming has not reached its peak yet. However, it will be interesting to see whether NVIDIA will keep pushing the boundaries, especially now that AMD seems unable to compete with them. Will the green team invest in more powerful hardware each and every year, or will it slow down?
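
To put the quoted multipliers in perspective, here is a minimal back-of-the-envelope sketch in Python. The ~1.5x per-generation uplift is our own illustrative assumption (NVIDIA gave no such figure), so treat the output as a rough order of magnitude rather than a prediction.

```python
import math

# Multipliers quoted above (relative to today's GPUs).
RAY_TRACING_TARGET = 100   # "100 times the processing power"
PHOTOREAL_VR_TARGET = 40   # "40 times more powerful GPUs"

# Assumed average uplift per GPU generation -- purely illustrative.
UPLIFT_PER_GEN = 1.5

def generations_needed(target, uplift=UPLIFT_PER_GEN):
    """Smallest n such that uplift**n >= target."""
    return math.ceil(math.log(target) / math.log(uplift))

for name, target in [("real-time ray tracing", RAY_TRACING_TARGET),
                     ("photorealistic VR", PHOTOREAL_VR_TARGET)]:
    print(f"{name}: ~{generations_needed(target)} generations "
          f"at {UPLIFT_PER_GEN}x per generation")
```

At roughly two years per generation, even the smaller 40x figure lands a couple of decades out, which lines up with the “not anytime soon” sentiment above.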

81 thoughts on “Ray-tracing in games requires 100X more powerful GPUs, photorealistic virtual reality requires 40X”

  1. Nvidia will keep on delivering small performance updates across new generations and ask for higher prices since there is no healthy competition.

    1. Performance difference is impressive when you look at the best GPU that each generation has to offer. TitanM/980ti vs titanXP/1080ti is not a small difference; sometimes the 1080ti even doubles the 980ti’s framerates.

      1. Nvidia was scared of the new AMD RX Vega. Now they know it’s not as powerful as expected, so Nvidia is going to go slower while charging more. Hence a possible Pascal refresh, since they have no competition at the higher end. Otherwise, if they had been threatened, they probably would’ve got Volta GV100 out.

        2 cents.

        1. Rather than scared, it’s more like Nvidia never looks down on AMD, because they know AMD often makes unexpected moves, and when they do it hits Nvidia hard. That’s why they not only released the 1080ti but also cut the 1080 price along with it. Logically, Nvidia didn’t need to drop the 1080 price back then; they could have just charged more money for the 1080ti.

        2. Huang had to know that Vega was a fail from the beginning. AMD was putting so much R&D into their CPU division and succeeded quite well there, but AMD doesn’t have the resources to compete with Intel and Nvidia at the same time.

          Next year we might see a GPU from AMD that could be impressive. If they can bring the Navi 7nm GPU before 2019 then they will probably have a good competitor for Volta.

          If not, then expect really high prices on Volta GPUs.

      2. That’s if we talk at stock; if we compare the difference between them when OC’d, the 980ti reduces the gap significantly (to about 40/50%), depending on how good the cooling is on your 980ti and whether you modded the BIOS or not (the 980ti is basically the old Vega: it can be overclocked a lot but the power consumption skyrockets). A mere 45% gap is not enough to justify a purchase, mostly because the 1080ti is still not there in performance for 4K max settings and high framerates (if we exclude FPS games). What I am really afraid of is that the GPU performance increase generation by generation will slow down as it did in the CPU market (albeit that’s better from a customer standpoint because what you buy will last longer). What Ryzen did is add more cores, but the IPC is still too low at the clocks they can push the chip to (there is no point in comparing Intel and Ryzen clock for clock if Ryzen is at max and Intel is downclocked). Hence, the real IPC should be the IPC at max stable clock (for X and K chips). If you see it this way, both companies are slacking and trying to shovel us these new procs that have no real gain: Ryzen has a low IPC and compensates by adding cores, Intel compensates by raising the clock (with the same IPC).

        1. The 1080ti can also be OC’ed, so that gap will remain. Of course many people were OC’ing the 980ti to the extreme, into the 1400-1500MHz area (from 1075MHz!), but on various forums I have a few times seen people admit that not all games were stable with an OC like that.

          Below is a 1080ti vs 980ti SLI comparison:
          https://www.youtube.com/watch?v=pdIt2s04KPo&t=61s
          Here you can see the Maxwell vs Pascal performance gap is HUGE; the 1080ti has the same framerate as two 980tis (while people here call it a small difference).

          The 980ti was a good card, but today even the 1070GTX is a better option because of its extra VRAM. In COD WW2 the 1070GTX is way faster compared to the 980ti because of the VRAM limitation on the 980ti.

          1080p uses 5.5GB VRAM – the 980ti and 1070GTX results are the same

          1440p uses 6.5GB VRAM – the 980ti is noticeably slower (10fps)

          4K uses 10GB VRAM – the 980ti is even slower (15-20fps), and while the 1070GTX can provide a 4K60fps experience (65fps average), the 980ti cannot (45fps)

          1. Well, I am one of the few people that has a watercooled 980ti that does 1500mhz (1560 in a few games) and 8000mhz on the memory (+500mhz on the VRAM = 1000mhz), and I can say that my performance is most of the time like an OC’d 1070 (I did benchmarks comparing a PC I built for a friend of mine, a 6700k @ 4.6GHz + 1070 OC’d +50 on the core, against my 2600k @ 4.5GHz and 980ti) and most of the time we were basically the same; sometimes he had slightly lower results.
            I do know that a 1080ti is more powerful, but Pascal overclocks way, way less than Maxwell when I try to squeeze out all the performance I can get. In a non-VRAM-limited scenario the difference is about 45% more, even less.
            Of course, if you compare a scenario where the VRAM usage exceeds the 6GB of the 980ti, you will see it lose badly since it will stutter or crash, hence it’s natural that a 1070 is a better buy now, but once again we are talking about an almost 2-year-old card.
            Also, those cards are running at ~1200mhz… which is really low, 300mhz less than mine, and they are also aircooled.

            Also, the test you posted was in SLI. SLI does not scale as well as a single card (in framerate smoothness at the very least), hence you cannot really compare an SLI setup vs a single card; in the scenario you posted you are seeing maybe a 980ti +50%. Also, were the SLI cards OC’d and under water, or at least on an open test bench? They will be limited a lot by thermals if the test wasn’t done like that.

            Lastly, about the VRAM usage, I am not really sure it’s correct. When I played The Witcher 3, my VRAM @ 2880×1620 (my favourite DSR resolution) did not exceed 6GB, and I always play with an on-screen display for the first hours of a game to see if my overclocks are stable.

            Therefore, is it possible that the usage you are referring to is the usage at stock memory clocks?

            Edit: in the test you linked, the usage for TW3 at 1440p is 3.5GB VRAM… not 6.5…

          2. “Also, those cards are running at ~1200mhz… which is really low, 300mhz less than mine, and they are also aircooled.”
            And do you really think most 980ti users have their GPUs watercooled? Your 1500MHz OC on that 980ti no longer represents performance that most 980ti users see.

            When it comes to the 6.5GB VRAM usage, I was talking about the Call of Duty WW2 example, not The Witcher 3. Look at the gamegpu benchmark site; they include VRAM details and you can see the 980ti vs 1070GTX results for yourself.

          3. My 980ti is a G1 and it sits comfortably above 1420mhz after OC, very stable and with good temps.

          4. Well, first of all, mine out of the box did 1279mhz (it’s a Gigabyte G1 Gaming). Second of all, if you have a 980ti, in my small opinion it is better to spend 100€ on a waterblock than to buy a new card (this applies of course if you already have some of the WC gear, albeit the Fluid Gaming line from EKWB is pretty cheap). Third, if you are not interested in OC, modding BIOSes etc. (not to the point of power mods like soldering a whole PCB onto the VRMs, of course, which needs a whole lot of IT background), why buy a 700€ card if you are not going to use it fully? I mean, a 980ti is an enthusiast card, and being an enthusiast for me consists of squeezing all the performance out of a component. If for you being an enthusiast consists of just dropping money on the most expensive piece of hardware you can get, then I cannot agree.

            P.S.
            Of course I do not believe most people watercool their cards, but I believe if you paid 700€ for a card you should learn how to use it fully, hence at least learn how to OC. Also, SLI on air is really a way to waste money (been there, done that with 670s), since the top card will hold back the other due to higher thermals, which will also result in a noise increase.
            Really, for SLI most of the time it is better to just watercool the cards (especially if you play newer games, since most of the time the second card won’t be used because SLI is almost never implemented in game-ready drivers).

          5. Yes, but that Gigabyte is a non-reference card. Reference 980ti clocks are much lower: 1002MHz base and 1072MHz boost.

          6. Who the hell purchases reference cards anyway? Braindead console peasants and special enthusiasts who want to run triple or quad SLI maybe, but definitely not the majority.

          7. “Your 1500MHz OC on that 980ti no longer represents performance that most 980ti users see”

            That’s just not true. I can run 1500mhz easily on my air-cooled 980ti. One of my friends has that card too, and he can also do 1500. I’d say most 980Tis can run in the 1400-1500mhz area with the right configuration (and unlocked voltage control and decent cooling, of course). I can even do 1530-1550 in some games/engines.

          8. When people talk about each GPU generation, they mean stock clocks, not OC’ed, period. The 980ti stock clock is 1072MHz, and this is how that card should be compared with the next GPU generations. The biggest site in my country has benchmarked 10 models and concluded that the best 980ti samples were stable at 1500MHz (Gigabyte G1 1516MHz, Asus Matrix 1515MHz, Zotac AMP! 1505MHz) while other models had problems even with 1414MHz. These speeds were achieved with air cooling (so temps and noise were higher), and for that reason I doubt average 980ti users are running their 980ti above 1400MHz. Of course on this site many users are enthusiasts that like to OC their cards to the max, but the average Joe doesn’t do that.

          9. No enthusiast buys the top-line GPU and doesn’t OC. Regarding GPU generation comparisons, yes, default clocks are used, but that’s not what I responded to.

          10. Top-line GPUs are bought by those who can afford them, not necessarily those who are willing and able to tweak. A lot of people buy the most expensive (“give me the best card”) and expect to get the best result out of the box, without doing more than a driver install.

          11. Well, I’m not sure if they are crispy, but those I know buy the top GPU, CPU etc. intentionally to avoid tweaking, in contrast to “us poor folks”, who are excited about 10 FPS more after hours of OC tests…

        2. Ryzen IPC = somewhere between Ivy Bridge and Haswell in latency-based applications, and for throughput it’s around Haswell-Broadwell.

          If one has a 2600K clocked at 4.5+GHz, they do not need to upgrade their CPU for gaming. Ryzen at 3.9 or so is about equal to Sandy at 4.5 in gaming and emulation, and far above it in productivity.

    2. Tbh Nvidia is the company that has been pushing hardest for performance over the past few years. The 1080Ti even at stock is head and shoulders faster than any AMD GPU out there.

      1. You’re right, but Nvidia is like Intel: they’re capable of much more than what they’re offering, but since they have next to no competition, it feels like they’re not even trying. Price increases with low performance increases, etc. There lies my problem with them.

        I still buy and will buy Nvidia though. It’s just that before I would buy one of their high-end GPUs every generation, but meh, now it’s every two gens out of principle.

        1. Not 4K, since that wasn’t even a concept back then. But CRTs usually had refresh rates of 85 Hz and up. Then came CCFL LCD displays with 60 Hz. It took more than a decade to fix this and start offering 75 Hz+ panels.

          Plasma had problems with ghosting on the lower models; higher-end panels were OK (my parents still have a Panasonic plasma at home and it’s very fast for average gaming). Also, image retention was not a problem as long as you didn’t do something stupid like leaving a game menu on the screen for hours and hours.

          BUT the screen, oh my god, the beautiful and natural plasma screen. On par with OLED, without that many problems.
          But LED technology is cheap, so manufacturers can squeeze more margin out of it. That’s one of the reasons why it dominated the market.

          1. “On par with OLED, without that many problems.”

            I would hardly call plasma on par with OLED, but it does have some advantages over OLED while being less expensive.

          2. Agreed. The two technologies are the best atm, OLED of course being more modern, but the problem is that advancement in it goes so slowly. I mean, we expected to have perfectly good OLED monitors somewhere back in 2012/13.

            I think plasma should’ve been the TV standard since then, but it wasn’t as power efficient and manufacturers wouldn’t make as much money selling it compared to LED.

            So we got the W-LED abomination.

          3. Oh well, I’ve learned. Question: would CRT today be able to achieve 4K? I figured back then that we switched from CRT to LCD because of the lack of a future for the former. That, and also the fact that they were so big… Maybe LCD, after all, was the right aesthetic/practical choice to make.

      1. LED is cheaper than plasma and less power-hungry. Plasma is still high-end. It depends on your perspective. As for CRT, the pros of having LED-backlit displays over CRT outweigh the cons.

      2. There are many dimensions of improvement. Overall technology tends to improve in most dimensions. I’m usually the first to bring up how bad LED monitors are but the reason they ever became popular was that they did offer real advantages over CRT right away, crucially the size and weight decrease. LEDs enabled people to sit a healthy distance away from their monitor while using a normal desk. LEDs allowed for bigger screens that a normal person could carry and set up. Clearly, these benefits outweighed the drawbacks in the eyes of most consumers so in majority opinion technology did improve. The fact that gamers had to wait a decade for gaming monitors to become decent again is just an unfortunate tradeoff.

  2. Not really sure PC gamers at large desperately cheer for ray-traced lighting or photorealistic graphics… the latest graphics engines (UE4, CE-V/Lumberyard, Frostbite, Snowdrop… etc.) have been doing pretty well in the lighting department while also reasonably raising the polygon count since their previous iterations.

    What I feel is really stagnant in PC gaming is the physics… and most importantly the AI.

    Over 90% of triple-A big-budget games basically feature vast amounts of indestructible, non-interactive scripted worlds… with sometimes borderline ret°rded NPCs/enemies.

    1. No, just no. We are nowhere near photo realism, except in extremely polished tech demos of small areas that are presented in perfect conditions, and even those are far from perfect in motion.

      If you believe even the heaviest modified version of Fallout 4 even approaches photo realism, you need to get your eyes checked, no offense meant.

      1. Hehe, I’m glad that there are people out there that want more. I know I’m a graphics wh*re, so it kinda makes me happy to see the wish for even better.

        But it seems that no one gets my point (can anybody read nowadays?), so I think I’m done here.

      2. Oh, those mods again. They make games look good or even great, but only on screenshots. Just move and you’ll see stiff animations and poor models. They will quickly destroy the illusion.

      3. Photorealistic means looking like a photograph. All three of these pictures are immediately evident as NOT images of the real world, even from the thumbnail. I don’t even have to open them in a new tab to see that. It would be easier to name elements in the pictures that are realistic, than those that are unrealistic.

    2. LOL nope. people said the same thing when Half-Life 2 came out. Look where we are now…? This trend in disparity will continue until we’re no longer able to differentiate between reality and fantasy. Ray tracing is a fraction of that equation.

    3. Nah, we aren’t even close. We can make things that look good, very good in fact (photogrammetry helps this too), but truly photo-realistic games are years away!

    4. AAA devs stopped caring. Many games have really poor AI, physics are horrible (not only destructible environments; vehicles also feel bad and move like toys, even Rockstar stooped to Watch Dogs’ level there), and animations lack any weight. And what about difficulty? Games, contrary to movies, revolve around interaction, so they should offer some challenge! Meanwhile, way too many titles are mindless “LMB simulators”.
      Visuals are nice, but I feel they became the white whale of many developers. It’s possibly a deliberate move (a pretty wrapper will sell even a poor product).

  3. At our present rate this will not be possible. We’ve got a few die shrinks left with silicon which will mean improved efficiency and so greater speeds with the same wattage but not close to enough performance. We’re going to hit the limit of what silicon is capable of in a few years. Some multi-GPU tech is probably on the way but there are limits to what wattage will be acceptable and the cost of manufacturing so many cores. Huang said recently that the Volta GPU is costing them $1,000 to make right now but the costs will come down over time.

    Still, I think we will get there using different materials such as carbon nanotubes. They have the capability of being up to 5 times faster than silicon while using up to 5 times less power. There are difficulties involved, but new materials for semiconductors are being researched, and just because we don’t get a lot of news on it doesn’t really mean anything. We will probably only hear about a breakthrough right when it happens and not too much beforehand. The company that gets to patent the new material will stand to make a huge fortune, and they won’t want to help their competitors out with any R&D results. It will probably seem to come right out of the blue, but the research has been going on for years.

    1. ………and yet you are here on a regular basis. Not to brown-nose John but he does a lot for the size of this site imo.

      1. I’m here for the actual stories, and to talk shìt in the comments.
        That doesn’t mean I can’t be critical, in my opinion whenever he posts this sort of garbage just to pad his stats, it taints some of the good that he does.

      1. What an astute observation, what does that have to do with my comment?
        Did you see me telling him what to do or demanding something?
        Just posted an opinion…

        1. You’re b##ching about the nature of the content on the site when you’ve risked nothing and spent nothing to receive it.

          Bricktop: “If I give a dog a bone, I don’t want to know what it tastes like.”

          When I don’t like a site’s content, I just leave.

          1. Let him have his moments… That guy is Olek reincarnated after he was banned. He often hates, attacks people in the comment sections and posts his harsh opinions, some of which are actually very true and valid. But sometimes he takes his hate a bit too far. But since we now run under the true freedom of speech he will fire away and if you disagree, be prepared for a war, this guy is mean and cunning. Or, you could just ignore it and instead participate in real discussion about the topic further up 🙂

          2. It’s not a big deal. I just click the ignore button. “War” averted. I value Freedom of Speech more than getting censored for arbitrary reason.

  4. “According to NVIDIA’s”
    Stopped reading here; these guys only know how to charge more for gimped performance. Not reliable.

    1. That has never actually been proven to be true. Actually, Linus made a video proving that they never gimped performance but in fact improved it.

          1. These people hate that Linus tells the truth.
            I’ve encountered a few of them. Too bad they can’t contest the fact that his results are reproducible and similar to everyone else’s, aka NVIDIA delivers more performance than AMD on a regular basis. Took them a year to get something remotely close to the 1080 Ti.

  5. Don’t worry guys, the next PlayStation and Xbox will promise you ray tracing along with cloud rendering and other overhyped nonsense that will never happen.

    1. the next playstation and xbox will promise you ray tracing along with cloud rendering and other overhyped nonsense that will never happen.

      hahahahahahah

  6. No **** it does. I’ve been trying to explain this to tards on the console and PC side alike for nearly 10 years. Anyone who does any 3D rendering, either for a hobby or a job, knows this – raytracing is EXTREMELY processor and memory intensive, even if you limit the number of bounces down to 1 or 2 (rough numbers sketched below). With current abilities it would still take literally hours to even render 1 frame, without extreme rendering tricks, lighting diffusion and intentionally placed planes to limit how far the lighting actually has to render.

    It’s highly doubtful actual ray-traced rendering will *ever* be possible in games. Would they look EFFing amazing? Yes. They would. Depending on textures and how well the lighting is done. Do we really need it at the moment? No.

    UE4, Unity, CryEngine, Frostbite, idtech and its descendants already look extremely good. Just stop slathering everything in ****ing post processing and start working on physics, destruction, etc.
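
    To put some very rough numbers on the “bounces” point above, here is a minimal sketch in Python. The frame size, sample count, bounce limit and framerate are all illustrative assumptions, and real renderers add shadow rays, denoising and shading work on top of this.

    ```python
    # Naive ray-count estimate for real-time path tracing (illustrative only).
    WIDTH, HEIGHT = 3840, 2160   # 4K frame
    SAMPLES_PER_PIXEL = 4        # very low for path tracing
    MAX_BOUNCES = 2              # limiting bounces to 1-2, as mentioned above
    FPS = 60

    primary_rays = WIDTH * HEIGHT * SAMPLES_PER_PIXEL
    rays_per_frame = primary_rays * (1 + MAX_BOUNCES)   # one extra ray per bounce
    rays_per_second = rays_per_frame * FPS

    print(f"{rays_per_frame / 1e6:.0f} million rays per frame")
    print(f"{rays_per_second / 1e9:.1f} billion rays per second at {FPS} fps")
    ```

    Even at these deliberately low settings the budget is billions of rays per second before any shading or denoising, which is why offline renderers happily take minutes to hours per frame at far higher quality.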

  7. Sounds like he’s just throwing numbers around. I doubt we need 100 times the power of a 1080Ti to run real-time ray-tracing. And photorealistic VR? That’s something further down the road; we’re slowly getting there for now. An example that takes real-time graphics a bit closer to photorealism is UE4 architecture demos like Loft In London. And running that at 4K (8K effectively) in VR isn’t a limitation of engines or performance but rather of the hardware; it’s the headsets being too low-res. Running two displays at 4K (8K in total) in Loft In London isn’t that hard to achieve. We’ll see when Pimax releases its new headset. But 40 times more power? No, just no.

    Also, how is he defining photorealism? He can’t just say photorealism, as we’ll probably get ray-tracing running smoothly long before actual photorealism is achieved.

    The closer we get to achieving photorealism, the more techniques are required from an engine, the higher the textures and shaders must be, and monitor pixels must also become indistinguishable. The first real step towards photorealism was in 2007 with the original Crysis… Now, 10 years later, we have only excelled in some areas of real-time visuals, but nowhere near the big leap Crytek made in 2007. Just going from 1993 to 1996 was an enormous step up, going from Doom to Quake; then another two years gives us the original Unreal, which leaped visuals forward in yet another big way. Then it was 2004, when Valve, Crytek and id broke new ground with HL2, Far Cry and Doom 3. The next jump was Crysis and, as I said, no new big jump has come after that, only minor improvements here and there: HairWorks and TressFX for example, path-tracing by Brigade, etc. CryEngine, Unreal Engine, Unity and Frostbite also continued to improve in some areas like voxel-based ambient occlusion and realistic GI; then physically based rendering came along, photogrammetry, Nvidia experimenting with soft particle physics for fluids and smoke in realistic ways.

    The rest is up to the artists and developers to push visuals further. A great example of that is how visually impressive Ethan Carter can look on a rather old engine like UE3, and how Dear Esther manages to look so amazing on the Source engine (post-2009 engine version) even by today’s standards. Then we see games like Assassin’s Creed Origins, with a multi-million-dollar triple-A budget behind it, and the graphics are quite sh*tty to be honest.

    I wonder when the next big jump in visuals will come. Crysis with mods still holds up really well by today’s standards…

    1. Don’t talk nonsense. Games like Uncharted 4 and Horizon surpassed Crysis in rendering quality tremendously. Speaking of animation, nobody can compete with the last Uncharted.

      1. Nah, it’s really just cut-scenes, characters and artistic love that stands out. From a technical perspective they really aren’t any more or less advanced.

  8. “In order to achieve ray tracing in games, the GPU must have 100 times the processing power it has today. To have virtual reality, you need 40 times more powerful GPUs.”

    Non-geeky people keep asking the PC community why we need more power lol.

    We need to find a better way to do ray tracing than brute-forcing it.

  9. Nah… instead of ray tracing, devs are going to use that power to render the same open world borefests they do today but at 300000 x 175000 resolution.

  10. I’ve been hearing about ray tracing since the mid 90s and nothing has come of it since then. As a concept it’s really interesting, but in practice not worth it.

  11. Is that VR number on top of the regular number? So, 100x what we have now for photorealistic, and 4000x for VR?

    Also, I am inclined to think that before we get a GPU that is 100x more powerful than what we have now, we will probably have adopted cloud gaming streaming, and because they will use large servers consisting of hundreds, if not thousands, of GPUs, it is possible that developers will take advantage of this to offer us games that use a dozen or more GPUs simultaneously… probably at a higher cost.

    Also, I’m sure if ray-tracing becomes a serious goal for games then we will probably start seeing hardware designed specifically for it, that might help us achieve it faster.

  12. Photorealism can happen with still images, but once there is movement, goodbye realism. We keep getting closer with each generation of hardware and graphics engines, but we are a long way off. Resolution keeps increasing; framerate, color and lighting needs keep increasing; texture size keeps increasing; and hardware has to handle all of that. I remember talking about photorealistic ray tracing before HD resolution was even a thing; now we are at 4K, with 8K on the horizon, and others saying we need 16 to 24K for VR (rough pixel math sketched below). It never ends.
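
    To illustrate how fast that pixel budget grows, here is a minimal sketch in Python. The resolutions and the 90 fps VR refresh target are assumptions chosen for illustration, not figures from the article.

    ```python
    # Raw pixel throughput at common and hypothetical resolutions (illustrative).
    resolutions = {
        "1080p": (1920, 1080),
        "4K":    (3840, 2160),
        "8K":    (7680, 4320),
        "16K":   (15360, 8640),
    }
    VR_FPS = 90  # a typical VR refresh target

    for name, (w, h) in resolutions.items():
        pixels = w * h
        print(f"{name}: {pixels / 1e6:6.1f} MP per frame, "
              f"{pixels * VR_FPS / 1e9:5.2f} GPix/s at {VR_FPS} fps")
    ```

    Each resolution step is roughly a 4x jump in raw pixel work before lighting quality improves at all, which is the point: the target keeps moving.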

  13. Full-scene ray tracing will likely never be the future. It’s too demanding. Rasterization has become so advanced now that you get fantastic results with ‘just’ shaders and polygons. There might be some element of ray tracing in lighting scenes, as the article suggests, but full-scene ray tracing is so inefficient that it will never be the ultimate goal.

  14. It’s actually not clear how he meant that. You added “in addition” to your paraphrase, but it’s not there in the original. He may well be saying that you’ll see photorealistic VR BEFORE you see ray-tracing. I personally believe so. Hacks always beat brute force. Ray tracing is brute force, our current methods are all hacks. Our hacks will continue to outperform RT forever. Compare current VR games with current ray tracing. Current VR holds up really well in that comparison, running at high resolutions and 90+ fps while tracing demos run at tiny resolutions and low framerates and are plagued by crippling noise. You need way more processing power to make tracing happen than you need for photorealistic graphics with hacks.

  15. That’s a pretty random criticism tbh. What’s some Nvidia guy supposed to do about gameplay? Blame publisher meddling for the decline in gameplay.

  16. Well, this article didn’t age well at all. Here we are, just 4 years later, and ray-tracing has already become a basic standard for any console or graphics card that wants to be considered part of the latest gen instead of already outdated and only suitable at a steep discount… something that just ain’t gonna happen in this age of covid-related shortages and inflated prices. Video card scalpers became a thing. Amazing.

    Anyways, I’m pretty sure the best of the newest cards from Nvidia are considered to be about 2.5x more powerful than the best cards they were putting out in 2017, and 2.5x is a far cry from 100x (quick math below).
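
    Taking that 2.5x-per-four-years figure at face value (an assumption, not a measurement), here is a quick sketch of how long reaching 100x would take at that pace:

    ```python
    import math

    UPLIFT = 2.5          # assumed uplift every...
    YEARS_PER_UPLIFT = 4  # ...four years, per the comment above
    TARGET = 100          # the article's 100x claim

    periods = math.log(TARGET) / math.log(UPLIFT)
    print(f"~{periods * YEARS_PER_UPLIFT:.0f} years to reach {TARGET}x at this pace")
    ```

    Roughly two decades, assuming the pace holds; dedicated RT hardware and denoising changed what “enough power” means well before raw throughput got there.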
