
NVIDIA RTX 5090 appears to be 30-40% faster than the RTX 4090

Last week, NVIDIA officially revealed its RTX 50 series GPUs. The NVIDIA GeForce RTX 5090 and RTX 5080 are currently slated to launch on January 30th. And, from the looks of it, the high-end model will be around 30-40% faster than the RTX 4090.

Do note that these are preliminary results. We’ll know the exact performance gap between the RTX 5090 and the RTX 4090 once we get the RTX 5090 in our hands and test it.

For these preliminary results, we have two games that can give us a pretty good idea of how the two GPUs compare: Black Myth: Wukong and Cyberpunk 2077.

For our tests, we used an AMD Ryzen 9 7950X3D with 32GB of DDR5 at 6000MHz, and the NVIDIA GeForce RTX 4090. We also used Windows 10 64-bit and the GeForce 566.14 driver. Moreover, we’ve disabled the second CCD on our 7950X3D.

Before continuing, it’s crucial to note that we have a Founders Edition RTX 4090. That’s the key to these preliminary comparisons. By using an FE GPU, we should get a pretty good idea of how much faster the RTX 5090 is. If we had an AIB GPU, our comparisons wouldn’t be valid. After all, most, if not all, AIB GPUs are factory overclocked, and you can’t fairly compare an overclocked RTX 4090 with an RTX 5090 FE.

So, with this out of the way, let’s start. Frame Chasers was able to capture some gameplay footage from Black Myth: Wukong with and without DLSS 4. Without DLSS 4, the game ran at 29FPS on the RTX 5090. In the exact same location, our RTX 4090 was pushing 21FPS. So, we’re looking at a 38% performance improvement.

[Image: Black Myth: Wukong on the RTX 5090]

For Cyberpunk 2077, we used NVIDIA’s own video. In that video, the RTX 5090 was pushing 27FPS at Native 4K with Path Tracing. In that very same scene, our RTX 4090 was pushing 20FPS. This means that the RTX 5090 was 35% faster than the RTX 4090.
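
If you want to double-check those numbers, the uplift is just the simple relative difference between the two framerates. Here is a minimal sketch of the math (the FPS values are the ones quoted above):

# Relative performance uplift in percent: (new_fps / old_fps - 1) * 100
def uplift(new_fps: float, old_fps: float) -> float:
    return (new_fps / old_fps - 1.0) * 100.0

print(f"Black Myth: Wukong: {uplift(29, 21):.0f}%")  # -> 38%
print(f"Cyberpunk 2077:     {uplift(27, 20):.0f}%")  # -> 35%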

[Video: DLSS 4 / DLSS 3 / DLSS 2 / DLSS Off Comparison | Cyberpunk 2077]

NVIDIA also shared some performance graphs when it announced its new GPUs. Without the new DLSS 4, the NVIDIA RTX 5090 appeared to be around 30-40% faster than the RTX 4090, which falls in line with our findings.

[Image: NVIDIA RTX 5090 compared with the RTX 4090]

As I said, we won’t know the exact performance difference between the RTX 4090 and RTX 5090 until we test them ourselves. However, it’s safe to say that without DLSS 4, the RTX 5090 will be about 30-40% faster than the RTX 4090.

Now, I don’t know whether this percentage will disappoint any of you; however, it’s exactly what I was expecting from the RTX 5090. And let’s be realistic: a 30-40% performance increase per generation is what we typically get. For instance, the RTX 3080 Ti was 25-40% faster than the RTX 2080 Ti, and the RTX 4080 Super was 25-35% faster than the RTX 3080 Ti.

All in all, the RTX 5090 will be noticeably faster than the RTX 4090, even without DLSS 4. However, you should temper your expectations. You will NOT be able to run path-traced games at Native 4K at 60FPS. If you are expecting something like that, you are simply delusional. For Ray Tracing games (or for UE5 games using Lumen), you’ll still have to use DLSS Super Resolution. For Path Tracing games, you’ll need DLSS 4 Frame Generation. And that’s in today’s games; future titles will, obviously, have even higher GPU requirements.

Stay tuned for more!

77 thoughts on “NVIDIA RTX 5090 appears to be 30-40% faster than the RTX 4090”

  1. The 5080 will be 20-30% faster than the 4080, the 5070 will be 15-20% faster than the 4070, and the 5060 will be 10-15% faster than the 4060. So might as well buy the 5090, guys.

    1. My RTX4080S is OC'ed to 59.9TF and 820GB/s memory bandwidth, so I will not be surprised if a stock RTX5080 (56TF, that's 3.9TF less than my card) ends up a little bit slower. Of course, the RTX5080 can also be OC'ed, but I think it's the least impressive performance jump from one generation to the next since I started gaming on PCs in 1999.
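
       For context, a rough sketch of where those TF figures come from: FP32 TFLOPS is usually estimated as 2 (FMA) x shader cores x clock. The core counts below are the commonly quoted specs for these cards; the clock speeds are only what the quoted TF numbers imply, not measurements.

       # Estimated FP32 throughput: 2 ops per FMA x cores x clock (GHz) / 1000 = TFLOPS
       def est_tflops(cuda_cores: int, clock_ghz: float) -> float:
           return 2 * cuda_cores * clock_ghz / 1000

       print(est_tflops(10240, 2.93))  # RTX 4080 Super OC'ed to ~2.93 GHz -> ~60 TF
       print(est_tflops(10752, 2.62))  # RTX 5080 at its ~2.62 GHz stock boost -> ~56 TF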

          1. Dude, are you stupid? Why would I look at an NVIDIA benchmark? We're talking about leaked scores from benchmarking tools.

          2. Except it isn't, as it's nowhere on NVIDIA's website. It's a slide created by Game Day, as evidenced by Paul's screenshot above.

      1. You're high. I have a 4090. OCing the 40-series cards adds almost nothing in actual gaming, maybe +8fps. Undervolting is better. The 5080 will whoop your 4080; you made a poor purchase decision.

        1. I tried undervolting my GPU, but it reduced performance too much in certain games, and it also made absolutely no difference in a few games like Metro Exodus at 4K (only a power limit reduced power draw in this game, but then my fps dropped from 86 down to 62). Undervolting makes more sense on your RTX4090 because it consumes a lot more power (450W on reference models and up to 600W on non-reference models), and that's more than my entire PC at max (around 415-430W).

          "Benchmarkboy" recently tested Hellblade 2 and had 70fps on his 4080S, while I had 82fps in the exact same spot. I don't know if the OC made this difference, or if my whole PC is better optimised, but I usually see a noticeable increase in performance compared to other youtubers with similar PC. My card with OC has 59.9TF, the RTX5080 56TF, so IDK why you think 5080 will provide much better performance with less TF. Maybe with RT and especially FGx4, but that's to be expected.

          I made a poor purchase decision? I have bought several GPUs since I started gaming on the PC in 1999, and not even the legendary GeForce 8800 Ultra or GTX 1080 Ti impressed me as much as my current RTX4080S. The vast majority of games in my Steam library run at well over 100fps at 4K native, some even with RT. For example, RE3 Remake runs at 130-190fps and RE8 Village at 120-160fps. Only the most demanding UE5 games or RT/PT games dip below 60fps on my PC at 4K native, but then I can use DLSS features and still get around 100-120fps. Even if I were to buy the RTX5090, it would not improve my gaming experience that much. I'd just be able to run more games at 4K native instead of with DLSS, and that's about it.

          As for the RTX4090, in my country the cheapest RTX4090 was twice as expensive ($2000), and I would rather save that $1000 for my next upgrade (RTX6080) than just buy a 30% faster card now. In some games John only gets 10fps more on his RTX4090; that's literally the difference between DLSS Quality and Balanced. IMO that's ABSOLUTELY not worth an extra $1000, especially for a product with a design flaw, so maybe it's you who made a poor purchase decision :P.

          I have known a number of people whose 4090s have died: either the connector melted or the board cracked under the weight of the card. My RTX 4080 Super has a redesigned power connector, so I don't have to worry about it burning my house down, dude :P. I also chose the model with the anti-gravity plate, so the board can't bend even without a GPU bracket (although I still bought one just to be sure).

          From my point of view, the RTX4080S has been worth every penny. I paid $1000 for my card, and I am running games at 4x the resolution of the $299 RTX4060 while still getting a much higher frame rate. That's not bad for a 3.3x price difference. I'm 100% happy with the purchase and have absolutely no regrets. That would not have been the case if I had bought the RTX4090, because IMHO the additional $1000 is absolutely not worth it unless you are extremely rich and don't really care about value at all. The new RTX5080 offers very similar raster performance to my card, so it seems my card will remain strong for 2-3 years and run every single PS5 port without issues. My next upgrade will be the RTX6080, assuming it has more VRAM than the PS6; otherwise I'll wait a bit longer, as it's not a good idea to buy a high-end card with less VRAM than the console.

          1. Most of the 40 series seems to benefit from a slight undervolt or a lower power budget. For instance, I have a 4090 (Strix OC) running at a 90% power budget with a slight undervolt, and it runs slightly higher average clocks and can hold its boost longer thanks to better thermals. And the chassis is well ventilated, so it's not the limiting thermal factor.
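
             If anyone wants to try the same thing, this is roughly how you would cap the power budget from the command line with nvidia-smi (a minimal sketch; 405W is simply 90% of the 4090's 450W reference limit, adjust for your own card, and the command needs admin rights):

             # Query the current/default power limits, then cap board power at ~90% of 450 W
             import subprocess

             subprocess.run(["nvidia-smi", "-q", "-d", "POWER"], check=True)  # show default/current limits
             subprocess.run(["nvidia-smi", "-pl", "405"], check=True)         # set the power limit to 405 W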

  2. A little bit disappointed, but only because the 4090 was 60-70% faster than the 3090/3090 Ti.
    I expected 50% minimum, but we'll see the 5090 tests on the 24th.

    1. Keep in mind that adding cores doesn't scale linearly: 40% more cores don't usually translate into 40% more performance. But this generation it seems like it does, almost as if Ada Lovelace was intentionally bandwidth- and power-starved because NVIDIA knew AMD wasn't going to be competitive, and as if Blackwell is "what should have been". The architectural changes have probably been very small, so in a sense Blackwell is a supercharged Ada Lovelace with adjusted clocks, unleashing the full power. Usually you expect a 20-40% improvement for the same price. Now it's a 25% higher price for 40% more performance, so only around 12-15% more performance per dollar, at least for the 5090 (see the quick math below). Probably the worst generation ever, not kidding. Worse than refreshes, because refreshes at least used to make overpriced previous-gen cards affordable, but nowadays not even that holds; refreshes mean nothing anymore, just another opportunity for profit milking, because people are desperate animals. Animals who don't need more fps in League of Legends or Minecraft anyway, or brainless foam floating on another hype lake.
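
       Roughly, the performance-per-dollar point works out like this (using the $1,599 and $1,999 MSRPs and the upper ~40% preliminary estimate from the article; a back-of-the-envelope sketch, not a measurement):

       # Performance per dollar, generation over generation
       perf_ratio = 1.40          # ~40% faster (upper end of the preliminary estimate)
       price_ratio = 1999 / 1599  # ~1.25, i.e. ~25% more expensive
       print(f"{(perf_ratio / price_ratio - 1) * 100:.0f}% more performance per dollar")  # ~12%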

      1. Two theories I've heard presented a lot about many Ada SKUs' "poor scaling", along with some fairly compelling evidence, are bandwidth starvation and the schedulers leaving too many resources untapped. That said, digging really deep into the silicon's performance and seeing all its metrics isn't something the regular guy can do. NVIDIA also seems to dedicate more and more silicon away from gaming and instead allocate it towards AI-centric functions. I really hope they split it up and do a more gaming-centric architecture than they do at the moment.

      1. Lossless Scaling's main selling point these days is framegen, not the original scaling feature. As with any framegen, LSFG works a treat when going from 60-72fps up to 120fps. The fact that you say it is a*s tells me that you have never used it.

        1. Lossless Scaling for me is a game-changer for emulation. I play many PS2 games at 4K, which looks great, but many PS2 games were locked to 30fps; Lossless Scaling turns those games into 60fps, and the latest version, 3.0, looks really great.

      2. The latest version of Lossless Scaling works really well, and it has its uses. I use it with emulators and it really shines there: many great PS2 games in PCSX2 look great at 4K but were locked to 30fps, and Lossless Scaling turns them into 60fps. Honestly, it looks really good.

    1. The 6090 will no doubt feature twice as many "fake frames" with DLSS 5, all while adding Reflex 3, which they will claim won't add any latency at all; it will even somehow give you negative latency, i.e. the AI will predict when you shoot before you do it 🙂

    1. You tell him we warned you about fake frames, you tell him we told you it's fake, you tell him we warned you that if you tolerate this, more devs are going to use it instead of optimizing. You tell him that.

      "My lord he says he doesnt like it when people try to improve things"

  3. MEH, this one is even more overpriced than the 4090.
    It will not be acceptable to pay so much money for a GPU that needs so much crap like upscaling, frame generation and whatnot to get high fps.

  4. Soon games will just be a postcard with one image and AI will create the game! I wish they were more expensive. This way all you poors can also get it. Disgraceful…

    1. The crazy thing is that NVIDIA has almost been hiding all the most impressive stuff. The updated RR in DLSS 4 looks WAY better than 3.5. DF, in their CES video uploaded today, showed a comparison of Alan Wake 2, which will be getting lots of updates to implement these new features, and a single table in that game looked way better than the current version. Indiana Jones will also be getting updated with, in addition to the new frame gen and RR, a new hair model that is ray traced. NVIDIA is also working on a mega geometry feature, which is like UE's Nanite, but they're working with MS to implement it into DX so that it becomes more of a standard. If I recall correctly, mega geometry wasn't mentioned at all in their presentation. And that demo they showed with the neural textures was on UE5; they've already customized UE5 to feature neural materials as well as a much more advanced version of PT than hardware Lumen. It's all very impressive stuff. And as mentioned with Alan Wake 2, we're going to see some of these things added to existing games. Can't wait to see the new Doom with PT. I'm just laughing at people arguing over the semantics of "real" or "fake" frames. It's all fake, guys. We just have a new version of fake that allows much more performance.

      1. Yeah, this neural shading and rendering is so magical and revolutionary. The more I research it, the more it blows me away. NVIDIA's game developer channel has great videos about it, and they say the applications of neural rendering are countless.
        Better geometry, better hair and faces, better ray reconstruction, better everything, literally, with much better performance than traditional methods. We will soon see some showcase games for it, like Alan Wake 2, Indiana Jones, Black State and Half-Life RTX. Game graphics will peak in 2 years. End of story.

        1. Black State is very interesting, even more so than I originally thought. DF discussed it with new footage in their video today. The game is using UE5, which we knew, but they're NOT using Nanite and Lumen; they've done some hefty customization of the engine, with their own RT implementation and high-density models. They're currently working with a "PC-only" mindset (no current console plans) to get it as smooth as possible across a wide range of configs; they don't want any UE stutter to be present. The demo build they played, with RT enabled at 4K with DLSS Quality, was nearly hitting 100fps. When they enabled MFG, it was over 300fps, and the gameplay looked smooth as butter.

          1. Yeah, it will be a showcase for neural rendering, I think. It was something special even in the older trailers. It's a showcase for NVIDIA's new tech, like Cyberpunk was a showcase for path tracing.

        2. Many mind-blowing things often come with a price, be it art, craftsmanship, taste, or even culture, down to the core of human identity. It is the robots taking over, in a cynical sense.

      2. The problem with neural shading/textures is that they won't work with older games without an update. The other downside is that you have to ship two complete sets of textures: normal textures and the neural compressed textures, which will only work with NVIDIA GPUs and likely only really well with the 5000 series. I expect it will be several years before it becomes a commonly used standard, and Intel and AMD need to figure out how to do it on their AI cores. Cripes, AMD is 4 generations behind NVIDIA in that regard, since the 7000 series AI cores seem to be broken and worthless. Intel will likely figure it out by next generation, and possibly later in the current generation.

        Variable Rate Shading has been in DX12 since at least 2018, and hardly any games utilize it yet, despite it being a way to increase optimization.

        1. None of that is really anything new or unique to this feature set. Of course games will have to be updated if they want to take advantage of a new feature, but I think the main draw will be for future titles. I think that's how we get a Witcher IV that actually comes close to replicating the fidelity of that pre-rendered trailer. And games have, for literal decades, shipped with optional features that not all hardware can take advantage of. Even for some game I downloaded recently (Indy, maybe?) there was a separate high-res texture pack to download. We're already seeing a solid number of games employing RT and PT pipelines that would render them unplayable on AMD/Intel hardware if those features were enabled. So it doesn't seem devs have an issue with including features like that if they feel it aids in furthering their vision. Not everyone will be able to experience it fully, but hey, not everyone gets to see movies in top-of-the-line Dolby Cinemas or IMAX, and filmmakers still strive to push the quality of those experiences. And I dunno about VRS; I feel like devs have just come to avoid it because it tends to cause more problems than it's worth. It usually looks obviously worse, with a negligible performance improvement, if any.

      3. They read the room and know they can only advertise something once it's mature. That's NVIDIA's way, unlike AMD or Intel, who are desperate to sell based on promises. That's the Valve and NVIDIA magic; both companies understand how to actually get customers to buy their technology. The RR and mega geometry stuff is amazing, honestly. I hate the neural stuff, though. The issue with neural rendering is that it touches the whole image, so who does the quality control? Pixel quality control? No one, so we just let them open Pandora's box? Well, a new measurement needs to be added: raster/RT frames per second vs AI-generated frames per second, lol.

    2. Honestly, look at NFS 2015, or any other raster-optimized title; under certain scrutiny they already look photoreal. The trick is getting photorealism in actual gameplay, because we achieved photorealism long ago in the professional space when viewed from further away; it's all about the details and the fine-tuning, which costs too much AI "magic". Consumers don't really want photorealism, because it homogenizes art styles, and art styles are something much more valuable than graphics. But the fact is, some genres benefit from it. Arma 4 is going to be insane. It won't be photoreal, but goddamn, if they include a path tracer just for future-proofing… imagine a 2035 machine running that game. Training soldiers in games is going to be a reality by 2050.

  5. 30% to 40% is pretty normal going from one generation to the next. The disappointment will be the 5080, which will probably not be a 30% to 40% increase over the 4080, but we need reviews and benchmarks from TPU to see.

    1. You forget that the price also increased. And $50 less for lower-tier customers like us is irrelevant, since the prices are still supercharged and anti-consumer.

    1. Where did that 720W TDP come from?

      One big mistake everyone is making is that the 4000 series used the TDP rating, which covers the chip itself, while the 5000 series is measured in TBP, which is all the power it will use, including memory and VRM losses.

      I think when Gamers Nexus does its frames-per-joule tests, the 5000 series is going to be more efficient than it first appears because of the different power measurements.

        1. GN changed the way they measure efficiency, and they do indeed measure only the GPU power, using special tools from Cybernetics/Elmore Labs. They also moved to a frames-per-joule, or FPS-per-watt, measurement. You can see the new test methodology in their video on the new Battlemage GPUs:

          https://youtu.be/JjdCkSsLYLk
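
           For what it's worth, "frames per joule" is just average FPS divided by average board power in watts, since 1 W = 1 J/s. A minimal sketch (the numbers are made-up placeholders, not test results):

           # frames/joule = (frames/second) / (joules/second) = fps / watts
           def frames_per_joule(avg_fps: float, avg_board_power_w: float) -> float:
               return avg_fps / avg_board_power_w

           print(frames_per_joule(120, 450))  # e.g. 120 fps at 450 W -> ~0.27 frames per joule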

      1. The RTX5090 (reference model) has a TDP of 575W. However, YouTuber "Game Day" has done calculations for certain scenarios, one of which would require 720W. Watch the whole video and you will know everything:

        I can't post links, but you can easily find it by searching for this video on YT:

        "RTX 50 Series: Real Performance Preview + Titan AI Uncovered"

        https://uploads.disquscdn.com/images/2b6f2905601cf740b9d39b3525ecfe9a13487de25c38515fb7abbe751203ba21.jpg

        1. Actually, the 5090 has a TBP of 575W, and yes, there is a difference between TDP and TBP.

          I repeat it again: the 4000 series was measured in TDP, while the 5000 series is measured in TBP.

          TDP = Thermal Design Power
          TBP = Total Board Power

          Those are obviously two different things: the first is a measure of heat produced in watts, and the second is a measurement of total power used in watts.

          TDP tells you how big a heatsink you need; TBP tells you how big a power supply you need. TDP is basically worthless for GPUs because they all come with a heatsink provided, but it is useful for CPUs because many don't come with a heatsink, so you need to supply your own, and TDP tells you the minimum size needed.

    2. The moment the 720W figure popped up, I just had to laugh my a*s off at whatever message this picture is supposed to send.

  6. I really hoped the RTX5080 would have 20-24GB of VRAM; it would've been a good choice for gamers who could afford it. Effin Ngreedia intentionally kept it at 16GB so they could release a more expensive Super/Ti in the near future…

    1. I would be willing to bet money that the 5080 Ti will have more than 16GB of VRAM and will be considerably faster than this 5080. The 5080 Ti will probably be a cut-down salvage part from 5090s that weren't perfect. It's the waiting until NVIDIA releases the Ti that will be hard for a lot of people.

      They may release a 5080 Super in a year as a refresh of the 5080, but the 5080 Ti will be a different GPU altogether.

      1. Exactly. I even heard somewhere that it was leaked to have 24GB at launch, but now it's instead going to be released next year at $1200 as a Super/Ti variant.

      2. I think the 5080 Ti will be a premium+ GPU. I am looking forward to a 1.75-slot design. Fingers crossed. But I also think it will be priced at $1200.

      3. There won't be a 5080 Ti unless there's a market need for it. The 5080 is more than most games need, and ahead of all competitors. Creating a 5080 Ti from a cut-down 5090 die, which is itself already a cut-down DGX die, would require either an unusually low yield rate or intentionally crippling chips that they could more easily sell at substantially higher margins elsewhere.

        Not happening.

        1. True, the 5080 will be more than most gamers need right now, but two things. First, no one buys a GPU, especially a really expensive GPU, just for playing today's games; they invest in the future. Most upgrade every other generation, so about every 4 years, and some keep a GPU for 6 years or even longer. The games coming out during a GPU's lifecycle are a consideration. We don't know how much more demanding games will get during that time, but the universal rule is that games always continue to get more demanding. UE5 is proof of that.

          The second thing is that, even though it's not mainstream, there are gamers using really high-refresh-rate monitors who want really high fps.

          The 5080 Ti will come. I'm just not sure what NVIDIA will call it, but the gap between the 5090 and the 5080 is 2x the cores. That gap will be filled, and not by a measly 5080 Super refresh a year from now, although I do expect a 5080 Super as well.

    1. The moron that downvoted you has a 4090, as if frame generation matters on that card, and he says "I won't upgrade from my 4090". Duh, why not? If frame gen is so good, get a 5070; NVIDIA says that with frame gen it's better than the 4090. Lies.

      1. FGx4 will be unusable.

        With DLSS FGx2, artefacts are practically unnoticeable, because even if one frame has some artefacts, the next frame looks perfect, making them much harder to spot. However, FGx4 generates three extra "fake" frames, so your eyes see artefacts for longer, making them impossible to ignore. People who have already played on the RTX5090 noticed FGx4 artefacts very easily.

        DLSS FGx2 works extremely well, and it doesn't make sense to turn it off, as you get a much smoother and sharper image (on sample-and-hold displays, a higher frame rate improves motion resolution) and only minimal input lag (I did my own measurements and it's in placebo territory). FGx4, however, is too much, and I doubt many people will want to use it.

        BY THE WAY, even the RTX5090 needs FGx2 to run the most demanding PT games, such as Black Myth: Wukong, at 4K at high refresh rates. So yes, FG is absolutely essential, even on the best GPU in the world.

        1. I run Black Myth: Wukong at over 60fps maxed out at 4K with DLSS Ultra Performance. FG is green eggs and ham; I do not want or need them, Sam I Am. My system is a custom build with an i7 12700K and an RTX 4090. I use DLSS. For demanding stuff I use Ultra Performance instead of FG. In most games I use DLAA if available. FG ruined AW2; it looked better maxed out running at Ultra Performance, IMO.

          1. 4K DLSS Performance with my own ReShade sharpening filter, as the in-game sharpening is excessive.
            https://i.ibb.co/g39CcfC/b1-Win64-Shipping-2024-09-01-00-30-46-747.jpg
            https://i.ibb.co/BLMw5Ns/b1-Win64-Shipping-2024-09-01-00-07-52-582.jpg
            4K DLSS Performance with the latest 3.8.1 DLL still looks like 4K to my eyes, but when I tried Ultra Performance I started to see very noticeable image degradation. I'm not willing to play with DLSS Ultra Performance, and I'm surprised anyone wants to use this mode. Maybe you are sitting further away from the screen?

            I would rather use DLSS FG, and here's why:

            4K DLSS Performance, very high settings, full RT (PT): 57fps, 63ms latency (shown in the top right-hand corner of my monitor)

            https://i.ibb.co/56y8c04/20250107-215732.jpg
            With DLSS FG I get 88fps and 54ms.
            https://i.ibb.co/NtHjWF5/20250107-215926.jpg
            Not only do I get a sharper and smoother image, but I also get the lowest possible latency, because in this particular game, FG activates NVIDIA Reflex.

            I always enable DLSS FG in all my games because it always improves image quality (more fps equals a sharper image in motion) and it also improves my aiming, because my eyes can track moving objects much more easily.

            Cyberpunk at 1440p TAA native + Psycho RT: I get 73fps with these settings and 28ms latency.
            https://i.ibb.co/f9rvRQv/20250108-211531.jpg
            The same settings, just with DLSS FG (without DLSS Super Resolution): 124fps and 37ms latency. That's just 9ms more (placebo territory), but it made the game look MUCH sharper in motion and I could aim much more easily.
            https://i.ibb.co/nCD4tDR/20250108-211622.jpg

            As for Alan Wake 2, it's probably the worst FG showcase, because in this particular game FG adds 20ms of latency instead of the usual 9-10ms. The game is still perfectly playable on a gamepad, but with M+KB I started to feel the increased input delay (it still wasn't that bad; some 60fps games have higher latency even without FG).

            4K DLSS performance, max RT/PT – 75fps, 40ms input latency
            https://i.ibb.co/rcphd0M/20250113-205120.jpg
            DLSS FGx2 108fps, 60ms latency
            https://i.ibb.co/7tQq6kx/20250113-205159.jpg
            I still played Alan Wake 2 with FG, however, because it made the image sharper in motion, and the artefacts during gameplay were unnoticeable. The only artefacts I saw were during text scrolling in the "Mind Place", but that didn't bother me at all.

            Overall, I'm very happy with the results of DLSS FG. I can see all kinds of problems with Lossless Scaling FG (people said that LSFG adds very little input lag, but it was still too much for me), but DLSS FG works so well that I consider it a free performance boost and use it in every single game (even in Alan Wake 2, because it improves motion clarity quite a bit). What's more, I enable DLSS FG even if I already have a decent average fps (100fps), because some games might still dip below 60fps from time to time. Normally this would bother me, but since I started using DLSS FG, I no longer care about sub-60fps dips, because I no longer notice them.

            NVIDIA FG can add very high input lag (100ms on top of your base input latency) if you have V-Sync turned on or try to limit the framerate with RTSS (RivaTuner). Some people might use the wrong settings and then think that DLSS FG is to blame. Just a few days ago I talked with a guy who said he never uses DLSS FG because he thought it destroys input latency; it turned out he was using RTSS, and that was the reason he didn't like it.

          2. Your comparisons are just bad (and misleading). Take your Cyberpunk example: you are comparing native at 73fps / 28ms with FG at 124fps / 37ms. You DO realize that 124fps is NOT the true framerate, and that it's really 62 real and 62 fake frames? That means the latency comparison is really just 73fps vs 62fps. The frame-time difference between 60fps and 120fps (purely, not considering the CPU/GPU pipeline) is 8.33ms, so why are you surprised that the difference is only 9ms? That's actually bad for framerates that are so close to one another (see the quick frame-time sketch at the end of this reply).

            Also, it's not a placebo, since people can tell the difference between 60fps and 120fps. Now imagine you turned on DLSS instead of FG and ran it at Quality (or Balanced): 73fps turns into 113fps and latency drops to 20ms (and drops further with Reflex). The frame-time difference between 30fps and 60fps (purely, not considering the CPU/GPU pipeline) is 16.67ms, and people can definitely tell the difference there.

            Now, if you don't like DLSS, then fine, you really have no choice but to use FG. But most people consider DLSS viable, and plenty of people reduce settings that don't really matter (or have little impact) to enjoy a TRUE higher framerate, and IF that doesn't cut it, THEN they look at possibly using FG. The other case is when they already have a high framerate AND a high-refresh monitor and want to max out its capabilities.

            Also you don't need FG to enable Reflex. It can be used without FG.
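
            To make the frame-time argument above concrete, here is a minimal sketch of the numbers (using the fps figures quoted in this thread; with FGx2, half of the displayed frames are generated, so the rendered rate is displayed / 2):

            # Frame time in milliseconds for a given frame rate
            def frame_time_ms(fps: float) -> float:
                return 1000.0 / fps

            displayed_fps = 124                   # FGx2 figure from the Cyberpunk example
            rendered_fps = displayed_fps / 2      # only 62 frames per second are actually rendered
            print(frame_time_ms(73))              # native: ~13.7 ms per rendered frame
            print(frame_time_ms(rendered_fps))    # with FGx2: ~16.1 ms per rendered frame
            print(frame_time_ms(60) - frame_time_ms(120))  # the ~8.3 ms gap mentioned above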

          3. I tried it in BMWK, and you are right, it's very enjoyable with FG. Alan Wake 2 was truly the last time I had tried it, and I was so disappointed because it had graphical issues when it was turned on, plus the input lag. I just assumed all games with FG would be like that, but after trying BMWK with it I was very surprised at how well it was implemented. I'm going to retry it in AW2 soon. Thanks for the feedback.

  7. At least nobody is buying a flagship card just to play a game at 20-ish FPS. We all knew that ray tracing would be better on the 5000 series than on the 4000 series; the upscaling factor itself may contribute more than half of the performance improvement. What about benchmarks with all tracing and DLSS turned off?

  8. Let's get real. Almost no gamers are buying a 4090 or 5090. Even as a high earner (top 10% in the UK), I don't think blowing over a grand on a GPU or a smartphone is a good idea.

    1. The scary part is when you look at the Steam survey after it's been refreshed and updated: the number of 4090 users matches some of AMD's budget GPUs, and in fact, 80- and 90-class cards combined match mid-range AMD in numbers. Now, I don't usually take the Steam numbers as accurate, but they're accurate enough to understand that not everyone in the first world is poor, despite the messed-up economy. Don't project. My poverty is due to a social and mental health crisis. Normies come in all shapes and forms, including rich, mentally healthy and socially conformist. Some have too much money, others have none. Isn't this how it has always worked?

  9. As expected. Pricing is comparatively higher for the 5090, but with all the advanced tech considered, I think buyers will be in for a treat. Let's see.

  10. Let's see: human, brain, IQ above 100. Sees the clock speeds, sees no architectural improvements being advertised (even if they exist, they're too small to advertise), nothing new except multi-frame fake-frame generation. OK, so who expected anything more? There is a war, not on our pockets, but on developing big changes. We will see more and more of these small jumps the smaller the nodes become. Expect a generation of "refreshes" and patched-up hardware. The 8800 GTX days are long over; innovation can stagnate, and we will get there. And despite little to no hype outside of internet forums, 30 to 40% is actually quite big. But $1,999? Goddamn, NVIDIA has started milking the real customers now, huh, as if they didn't already… Maybe they'll ease off the brand hate by giving the peasants better prices while squeezing the oil barons, who simply can't help but suck Jensen's d*ck.

  11. Aaaaand I won't change my RTX 3070 anytime soon. 😌
    Why would I? Have you even seen ONE recent AAA game that is good?
    What's even the use of having such a GPU? And indie games don't need it either.
    Call me back when there's an RTX 100 series in 5 years or something, and when good AAA games are back (no hope whatsoever, though).

    1. Yep, there isn't a single good AAA game that actually needs that kind of power. Game development has been going backwards for years, and AI is only going to accelerate that as assets are generated instead of created. Real creativity is dying a slow death.

      1. Basically ALL AAA games today "need" that kind of power, and most of them sadly do because of sloppy optimization more than anything else, i.e. letting the hardware brute-force its way past the lack of proper work.

  12. Without the AI gimmicks, it was predicted to be around 35% faster in raster. It seems those predictions were in line with these results.

    Framegen will serve one thing: making devs even 300% more likely to skimp on proper optimization. The initial idea, that it would allow proper PT/RT games to arrive earlier, was not bad at all… shame it went from being a potentially good thing to yet another excuse to only half-bake games, because the hardware will allow it.

  13. That had better be a low guess. We're talking about the x90 series here; shouldn't it be something like 100% faster than the previous one from years ago by now? The power requirements for these cards are crazy now, too.

  14. Once the real raster numbers come out, we will see why NVIDIA could not raise the prices of the other cards. If they had any shame (only ~30% real performance increase), the 5090 would be the same price. I mean, the VRAM speed is SLOWER on the x90 card (again) AND it runs at a lower clock speed. WHY? If it's the flagship and the ONLY card to increase in price, WHY gimp it? We all know NO 5090 Ti is coming.

  15. If only developers dedicated part of their team's resources to REAL game optimization, as they used to, this mess wouldn't exist!
    Instead, they don't optimize games anymore; they solve performance problems by leaning entirely on the brute force of GPUs… and for us mere mortals with an XX60 or XX70 series card in the case, the devs fall back on modern upscaling algorithms and fake frames!
    Soon the scenario we'll find ourselves in will be nothing but AI-generated frames and no real optimization of the game code… this is the death of gaming!
    Actually, it won't even take that long, because this crap already started a while ago!!!
