We’ve tried Lossless Scaling Frame Generation, here are our thoughts

Last week, we informed you about a new version of Lossless Scaling that allows you to enable Frame Generation in all PC games on all GPUs. And, thankfully, its creator has provided us with a code for it. As such, we’ve decided to test it in a variety of games and share our thoughts.

Now, as we’ve already said, Lossless Scaling Frame Generation has some limitations. The program needs the game to be locked at half of your monitor’s refresh rate. So, if you have a 120Hz monitor, you’ll have to lock games at 60fps. Anything other than that and you’ll get frame pacing issues. Thankfully, you can easily do this via NVIDIA’s Control Panel. And no, LSFG is not compatible with VRR monitors.
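
The half-refresh-rate requirement is really just frame-pacing arithmetic: with 2x frame generation, the output rate is double the cap, and on a fixed-refresh display every generated frame lines up with a vblank only when that doubled rate equals the refresh rate. Here’s a toy Python sketch (our own illustration, not anything from the Lossless Scaling code) that simulates how long each generated frame stays on screen:

```python
from collections import Counter

REFRESH_HZ = 120  # fixed-refresh monitor, as in our test setup

def scanout_counts(capped_fps, vblanks=240):
    """Assuming 2x frame generation doubles the capped frame rate and the
    display scans out the newest completed frame at every vblank, return
    how many refresh cycles each output frame stays on screen."""
    output_fps = capped_fps * 2
    # Index of the newest output frame that is ready at each vblank.
    shown = [v * output_fps // REFRESH_HZ for v in range(vblanks)]
    return Counter(shown)

# 60fps cap -> 120 output fps: every frame is held for exactly 1 refresh.
print(set(scanout_counts(60).values()))          # {1}
# 55fps cap -> 110 output fps: some frames are held for 2 refreshes.
print(sorted(set(scanout_counts(55).values())))  # [1, 2]
```

At a 60fps cap every frame is held for exactly one refresh; at a 55fps cap about one frame in eleven is held for two refreshes, which is the kind of uneven frame pacing that reads as judder.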

In fact, owners of G-Sync/FreeSync monitors will have to completely disable G-Sync from their Control Panel. With G-Sync enabled on our LG CX and LG 32GK850G-B, we had major frame pacing issues, even when we set the refresh rate to 120Hz and locked a game’s framerate at 60fps. The only way we could resolve this was by switching our TV and PC monitor to “Fixed Refresh Rate.” Once we did this, Lossless Scaling Frame Generation worked the way it was supposed to. So, if you own a G-Sync monitor, you’ll have to set it to “Fixed Refresh Rate.”

Another thing worth noting is how BAD 60fps looks on a non-VRR monitor. With G-Sync enabled, 60fps felt smooth on both our LG CX and LG 32GK850G-B. Even when panning the camera in Elden Ring or Cities Skylines 2, 60fps felt great. However, when we set them to “Fixed Refresh Rate”, 60fps wasn’t as smooth. While close objects moved smoothly, distant objects felt a bit juddery.

We’ve tested Lossless Scaling Frame Generation in six games. These were Elden Ring, Kingdom Come: Deliverance, The Medium, Assetto Corsa Competizione, Cities Skylines 2 and Chernobylite. For our tests, we used an AMD Ryzen 9 7950X3D, 32GB of DDR5 at 6000MHz, and NVIDIA’s GeForce RTX 4090. We also ran the latest version of Windows 10 64-bit and used the GeForce 546.33 driver.

At 2560×1440, our NVIDIA RTX 4090 was able to maintain 60fps in pretty much all of the tested games. This was crucial, as lower framerates can introduce major frame pacing issues. However, even with a constant 60fps, there were some minor judders during quick mouse movements. These could be easily noticed in Elden Ring, Kingdom Come: Deliverance and Cities Skylines 2. Similarly, in Elden Ring with its Ray Tracing effects, we had numerous frame-pacing issues when exploring the game’s environments.

We also experienced MAJOR ghosting issues in all of the games we tested. Some games, like Elden Ring, felt completely broken. Not only that, but I’ve also noticed major input latency issues in some titles. For instance, I could easily notice the extra input latency in Assetto Corsa Competizione. Assetto Corsa Competizione also had numerous ghosting issues, so it was completely unplayable for me. In this game, I’d gladly take plain 60fps over LSFG any given day.

The games that heavily benefited from LSFG were The Medium, Kingdom Come: Deliverance, and Cities Skylines 2. The Medium has fixed camera angles and was one of the games in which LSFG worked incredibly well. And while Kingdom Come: Deliverance and Cities Skylines 2 had some framerate judders, they felt way smoother than the default 60fps. However, that’s on a non-VRR monitor. On a VRR monitor, all of the games felt smooth at 60fps, without having any of the issues of LSFG.

Here’s a summary of our experience per game.

  • Cities Skylines 2: Laggy, overlay/icon ghosting issues, smoother gameplay
  • Elden Ring: Ghosting issues, frame-pacing/judder issues, slightly laggy mouse movement
  • Chernobylite: Minor ghosting issues, smoother gameplay, minor judders
  • Kingdom Come: Deliverance: Ghosting issues, smoother gameplay, minor judders
  • The Medium: Best example, LSFG works great here
  • Assetto Corsa Competizione: Really laggy, ghosting issues, frame-pacing issues, the game didn’t feel right

Now while LSFG was a bit disappointing in most modern-day PC games, it can make a big difference in older titles that are locked at 30fps. Or you can use LSFG to boost the performance of console-emulated games that run at 30fps. So, if you’re an emulator enthusiast, LSFG will be a godsend.
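
To be clear about why this helps emulated titles: an external frame-generation tool like LSFG only ever sees finished frames, so the emulated game’s 30fps logic is untouched while presentation is doubled to 60. Here’s a deliberately naive Python sketch of that idea (a plain 50/50 blend; LSFG’s actual, proprietary algorithm is far more sophisticated):

```python
# Hypothetical toy model of external frame generation: the tool only sees
# finished frames, so the game's internal tick rate and physics stay intact.
# A naive 50/50 blend stands in for LSFG's real (proprietary) algorithm.

def blend(frame_a, frame_b, t=0.5):
    """Linearly interpolate two frames (flat lists of pixel intensities)."""
    return [round(a + (b - a) * t) for a, b in zip(frame_a, frame_b)]

def double_framerate(frames):
    """Insert one interpolated frame between every pair of real frames,
    e.g. turning a 30fps sequence into a 60fps one."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.extend([a, blend(a, b)])
    out.append(frames[-1])
    return out

real = [[0, 0], [10, 20], [20, 40]]  # three tiny 2-pixel "frames"
print(double_framerate(real))
# [[0, 0], [5, 10], [10, 20], [15, 30], [20, 40]]
```

The point of the sketch is that `frames` could come from anywhere, a 30fps emulator included: the interpolator never touches game state, only pixels.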

For everyone else, LSFG is a mixed bag. Yes, LSFG can improve your gaming experience if you are still gaming on a non-VRR monitor and if your GPU can maintain 60fps in games. In this particular case, Lossless Scaling Frame Generation will improve the smoothness of games. However, if you have a G-Sync monitor, LSFG is a big NO-NO. And no, LSFG is nowhere close to matching the quality of DLSS 3. DLSS 3 Frame Generation doesn’t have as many ghosting or input latency issues as LSFG has. Plus, DLSS 3 works wonderfully with VRR. Also, you can use DLSS 3 FG to get better performance, something that is impossible with LSFG (as even minor drops below 60fps can introduce frame-pacing/judder issues).

LSFG is the work of a single developer, so I have to give kudos to THS for it. Don’t get me wrong, I’m not criticizing his work. For an individual, this is an incredible achievement. However, I was expecting more from LSFG.

23 thoughts on “We’ve tried Lossless Scaling Frame Generation, here are our thoughts”

  1. So basically they somehow “ripped” TVs’ judder/de-judder function and applied it software-wise via the Windows API. Something NVIDIA and AMD did, but with the pro coding part. Cool :)

    1. I don’t like the trend, because it makes companies skimp on the pure rasterization performance.

      I understand the cloud convergence being a trend, but damn I will be sad when that comes around.

      1. Nice observation about the cloud convergence there, really well spotted!

        Frankly, I’m not a fan of using “the power of the cloud” within games, but it does open up opportunities for studios which are otherwise hard to handle on local hardware, which has many magnitudes less processing power than “the cloud”.

        Take for example “THE FINALS”, which has server-side processing of the destruction physics:

        https://uploads.disquscdn.com/images/60a4e1134b0c4137ac9bc6de943a89bb4ff42a2cc3578cf300f1012c2d5dc521.jpg

        Or even the Crackdown 3 tech-demo many years prior to that:

        https://uploads.disquscdn.com/images/112d11f3da13e3a9b46c6bab884014e7b47a462fbef64f7a68524b3de4894c96.jpg

        And Hideo Kojima’s next game in partnership with Microsoft will also require “the cloud”, which means it will be unplayable without an active Internet connection.

        Of course, “the cloud” will bring an end to game preservation, but since most consumers don’t seem to care anyway, a lot of the big game companies are very much looking forward to “cloud-native” games, because, as a byproduct, it will destroy piracy once and for all, too.

        Dystopia is just around the corner; it’s just that the majority has too low an IQ to see it, even when it’s dangling right in front of their eyes…

        1. The Finals doesn’t have server-side processing of physics.

          It’s all done locally and synced with the server in order to work properly in multiplayer. It’s the same system that games like Bad Company 2 used, just with more detail. You’re completely misinformed.

          Until the time that every person has a server/render farm on their street with a 1ms latency to their device, the cloud is not going to be viable in any way.

          1. You know, you could have saved me 5 minutes of googling around if you had just told me that you got that info from Digital Foundry, because all the other sites I saw indeed conveyed the wrong info that the physics processing is all done server-side.

            That being said, I’m pretty sure that the first games with cloud integration won’t require it for real-time compute tasks, but rather for something like “consistent worlds” multi-player components even within traditionally single-player focused gaming franchises.

            In fact, Microsoft already said as much within their internal pitch for the next-gen Xbox, which accidentally leaked to the public:

            https://uploads.disquscdn.com/images/1818649a2a8e220dd86449ecef94c064f91b4151711d14203ab4b04e163e501e.png

            And even today’s Internet infrastructure around the world has already improved to the point that NVIDIA is able to offer the GeForce NOW cloud streaming service in a lot of regions, which is way more computationally intensive:

            https://uploads.disquscdn.com/images/2e1c0100445907eb1e897cc6f5867d650435a72af68ae9ebb257a0a60d15bd25.png

            BTW, GeForce NOW works by having a Linux hypervisor spin up Windows VMs dynamically as needed, by leveraging their vGPU feature in the data center:

            https://uploads.disquscdn.com/images/cd97967585b1cfc6c7f32423b98404c21dcbae9140dc35227d25f6992cd5d305.png

            In fact, I moved to a rural area, where the closest data-center is several hundred miles from where I live, and even I could hit 60 FPS if I wanted to with my Internet connection for cloud streaming:

            https://uploads.disquscdn.com/images/f42fedb98f2c46d72158ce51c8c5c9a14e84ecbd806533f38203e704cc258134.png

            As you can see, I have a round-trip latency of 16ms with no packets lost along the way.

            If you can, please provide a screenshot of the quality of your Internet connection via the M-Lab speedtest.

            Thanks!

        3. Cloud games don’t exist and never will; the whole cloud angle in the next Kojima game is just DRM and/or PR bullshit, as always.
          All the cloud stuff is just tech demos, but in the real world it falls apart in seconds. It’s also an unsustainable model. I would not worry about it at all; it will be a gimmick at best, DRM at worst.

          1. Even if it just ends up being a gimmick, the outcome is the same nonetheless:

            An active Internet connection will be required, even if the game will be single-player focused.

      2. Still believing in “cloud convergence”?
        That was a scam. How is Stadia going?
        There is no cloud convergence. Everything that runs in the cloud can run on your PC.
        At a time when people buy big GPUs like the 4090, 4080, etc., “cloud convergence” makes no sense.
        And servers have nothing to do with “cloud convergence”.

  2. Well, it’s one guy doing this instead of a big syndicate worth billions/trillions or whatever they throw out there, so just from that it’s impressive. The problem is that it applies to games that most people with 3-year-old systems can already max out anyway, and if the fps is locked, there are usually simple mods that unlock it.

    I think emulation needs to be tested as well, not games that it really wouldn’t help with. Turn down a couple of settings, switch the silly ones off, and you don’t need FG of any kind. A locked 30 is actually fine for gaming; it’s just not what PC gamers want with their expensive equipment.

    FG is really just a tech for gaining back the frames taken by RT/PT and 2K/4K for those on the cusp, and also for these insane high-Hz screens above 120. Yeah, using the FSR3 mod in 77 I noticed going from 60 to 90, almost up to my 120Hz refresh rate, but I also noticed the huge spike mostly in CPU usage, which is not to my liking. 30 to 45 is barely noticeable in any case. And 45-60, if you have a 60Hz screen, would be noticeable, but not by much. 30-60 would be noticeable, but like I said, a simple unlocker mod can do that.

    FG will likely just end up like PhysX and how that’s handled now in a few years. NVIDIA just comes up with gimmicks to sell cards every few years. Mining was likely never real, just a manipulation that was juiced up for “scarcity” and such. Benefits of a virtual monopoly.

    1. Well, believe it or not, there are still games that have a locked framerate with no mod to unlock it, or only incomplete/broken solutions.
      What’s great about LSFG is the fact that you can unlock the framerate of some games without ruining their timing/logic, tick-rate, or physics simulation, because internally the game is still running at its default framerate, which is awesome when you think about it.

      1. Agreed, that’s one thing I missed: some games mess up if they go above a certain fps, since the game logic is tied to the fps limit. I have this software, but since I also tried the FSR3 mod for 77 and decided any FG wasn’t worth it, I haven’t tested anything myself with LS. One game that I might try is RDR1 on RYU, now that I played it so much a while back and have the shaders mostly done.

        The CPU overhead introduced by the FSR3 mod in 77 kind of makes turning on RT worse as far as system usage goes, and I’ve got to be careful with my only laptop, which isn’t new. The 20+fps hit I get with just basic RT can be clawed back with FG, but it’s not free. I also got ghosting and such. You need much better specs overall for FG, which kind of defeats the point we think it’s for, imo.

        1. Yep, one must approach this kind of stuff with one thing in mind: there is no such thing as magic free FPS.
          It’s bound to have negative side effects; it must be considered an option at best, not a final solution, and any option is welcome. If it can serve you sometimes in the right context, then it’s better than nothing.
          As for RDR, I tried LSFG in Xenia with 2x scaling and it works great; the slow pacing of the game kinda helps a lot to mitigate the side effects, if your game manages to hold a steady 30fps.

          1. Oh nice, I just used the newer Ryu release since it was what most of the repackers said was better, and it’s much less hassle than my old Xenia setup; it just works and includes the DLC all in one package. I also have the pre-DLSS version of RDR2, and once I got it dialed in, it looks amazing; I lock it to 45 for no stutters, and system usage means no fans, or barely noticeable fans, on a laptop at 99% of 1080p using the built-in res scaler. Everyone has their unique situation with PC.

            What’s terrible is that the 20 and 30 series desktops and higher-end laptop cards/CPUs could benefit the most, but ngreedia decided to lock those out. Even just an FSR3-style software version with a smaller fps boost than the hardware-based one could have been released by NV, but wasn’t. And if you think about it, LS can work with any game, so even the “official support” needed from the devs is just another gimmick to sell cards.
            8K is obviously going to be the next thing in gaming, but we’re well into the 4K era, and outside of TVs most don’t even use that in games, despite the very vocal first-worlders and the wealthy online.

  3. Kudos to the dev nonetheless; he gave PC gamers an option that AMD or NVIDIA could’ve offered a long time ago, but they know that a lot of people with old GPUs could be satisfied with the end result, and that would ultimately mean lower sales for them. This is far from perfect, but in the right context it can make a huge difference; I noticed that slow-paced games benefit the most.

  4. Told you this is sh*t. Why the hell bother with this garbage!
    I didn’t even bother to read your wall of text. Trash and nothing more 👎

  5. Interesting, but I will move on later this year to a G-Sync Pulsar monitor; I can’t play any games at less than 120fps and without the smoothness of G-Sync.
    I’m using FG in many games that support it (although some games don’t use it as they should) to get that 120fps.
    So it’s interesting how technology is growing.

  6. So it’s not good. I guess if there is no other option, it’s something. Hate NVIDIA all you want, DLSS 3.5 is freaking amazing and no one will win against them.

  7. And now you know why Nvidia’s new Optical Flow Accelerator hardware circuitry is so important to DLSS Frame Generation. It’s what lets you use G-Sync/FreeSync and VRR with Frame Generation without all the issues ALL the others have.

    It’s not “just to make players buy a 4000 series card”; it’s because the other methods didn’t meet the STANDARDS Nvidia was going for. The solution was to do it in hardware rather than software, for basically the exact same reasons we do video encoding with the hardware in a modern GPU rather than in software. Sometimes hardware can do things better, faster, or both, than software can.
