Dying Light The Beast feature

Dying Light: The Beast does not support Ray Tracing at launch, runs with 80-100FPS at Native 4K/DLAA/Max Settings on an NVIDIA RTX 5090

Techland has just released Dying Light: The Beast on PC. However, it appears that the game does not support any of its advertised Ray Tracing effects. The devs have stated that they will add them via a post-launch update. I was able to test the game, so it’s time to share my initial performance impressions.

To test the game, I used an AMD Ryzen 9 7950X3D, 32GB of DDR5 at 6000MHz, and an NVIDIA RTX 5090. I also used Windows 10 64-bit and the GeForce 581.29 driver.

At Native 4K/Max Settings with DLAA, the game was running with an average of 94-98FPS. Interior areas appeared to run way better than the open-world area. However, I was able to find an area early in the game that appeared to be quite demanding.

As you can see below, this scene runs with 82FPS on the NVIDIA RTX 5090. So, this will be the area I’ll use for our upcoming PC Performance Analysis. This will give us a better idea of how the rest of the game will run.
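As a side note, here is a minimal sketch (purely illustrative; not the capture tooling used for this article) of how logged per-frame times turn into average and 1%-low FPS figures, and why a demanding area can sit well below the overall average:

```python
# Illustrative only: per-frame times in milliseconds (the kind of data
# tools like PresentMon or CapFrameX log) converted into average and
# 1%-low FPS figures.

def fps_stats(frame_times_ms):
    """Return (average FPS, 1% low FPS) for a list of frame times in ms."""
    fps = sorted(1000.0 / t for t in frame_times_ms)
    average = sum(fps) / len(fps)
    # 1% low: mean FPS of the slowest 1% of frames (at least one frame).
    worst = fps[:max(1, len(fps) // 100)]
    return average, sum(worst) / len(worst)

# A mostly smooth run (~96 FPS) with a short demanding stretch (~82 FPS):
sample = [10.4] * 950 + [12.2] * 50
avg, low = fps_stats(sample)
```

The average here lands in the mid-90s even though the demanding stretch sits around 82FPS, which is why testing the worst-case area gives a better picture of how the rest of the game will run.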

Dying Light The Beast 4K PC screenshots-17

For a rasterized game, Dying Light: The Beast looks great. However, it suffers from major pop-in issues. The lighting can also feel a bit underwhelming at times. In other words, the game will most likely greatly benefit from its upcoming RT update. On top of that, the 3D models of all main characters cannot match the ones we’ve seen in some other games, like Metal Gear Solid Delta: Snake Eater.

Overall, my first performance impressions of Dying Light: The Beast are positive. I did not experience any major stutters, and the game felt great. There is also support for DLSS 4, FSR 4.0, and Intel XeSS 2.0 for those who do not own a high-end GPU.

Now, I know I’m using an RTX 5090 here. So, it will be interesting to see how the game runs on less powerful GPUs. Still, first impressions appear to be positive. The game also has Very Positive reviews on Steam. If it had major performance issues, it would have Mixed or Mostly Negative reviews.

Our PC Performance Analysis will go live later this week. So, stay tuned for more!

[Gallery: Dying Light The Beast 4K PC screenshots 1-17]

90 thoughts on “Dying Light: The Beast does not support Ray Tracing at launch, runs with 80-100FPS at Native 4K/DLAA/Max Settings on an NVIDIA RTX 5090”

  1. While I haven't had time to test it myself yet, it should be noted that this game is already Steam Deck Verified, so in theory at least, even a potato PC should have no trouble running it with reduced settings.

  2. It's powered by Techland's Chrome Engine, which runs great, delivers crisp visuals and nice draw distances. Also, given their history, the graphics and performance are gonna get even better with updates over time. But yeah, it won't satisfy UE5 kisser John and gang with its presentation, because it's not running at ~30ish fps leaving eye-candy blurry trails. Game's good and a true sequel to the first game.

      1. It's Arnold Vosloo; he is a South African actor who is an Afrikaner. He was made famous starring in the first two Mummy movies (as the titular character) with Brendan Fraser, and in Hard Target with JCVD.

          1. A little trivia: he spent months on a single scene in the movie, as ILM wanted to scan & match every angle of his face to the mummy Imhotep. In the end we got the best villain & best Mummy movies ever created. Goated actor for sure.

  4. "The lighting can also feel a bit underwhelming at times. In other words, the game will most likely greatly benefit from its upcoming RT update."

    No it won't, because of the performance cost. Also, what the hell does "underwhelming" mean when discussing how the lighting "feels"? Made-up buzzwords.

    Good for Techland for leaving unoptimized trash for later updates.

    1. The feeling is that of butthurt.

      Nvidia and Gaming sites spread lies:
      -they said raytracing would look drastically better, it doesn't
      -they said raytracing would become mainstream, it never did
      -they said raytracing performance would improve, it doesn't, because it is a flat dot product cost that can't be optimized

      1. RT has been mainstream for quite some time now, with even the RTX 5070 providing decent performance in RT.

        Dying Light 2 looks so much better with RT enabled compared to raster mode.

        1. Every 3D GPU ever made supports hardware raytracing.

          Doing dot products and intersection tests is supported by every 3D GPU, an old Voodoo GPU from the 90s can also do this.

          In fact, there are thousands of rendering engines using realtime raytracing. Look up some Assembly and Symposium demos from the demoscene from the 90s, I made my own demo using raytracing too. Realtime raytracing on PC has existed for decades. Every other Computer Science major in Europe in the 90s was making some demo using raytracing, many of these people are behind today's game engines, or working for Nvidia.

          However, realtime raytracing makes no sense for complex environments like games. Raytracing's Achilles heel is the number of dot products and intersection tests it needs to do. A dot product and intersection test is a very expensive mathematical operation. And unlike rasterisation (Blinn-Phong or Lambert), which just needs a single hit per polygon, raytracing requires thousands of rays for each pixel on your screen. And then lots of denoisers, because the algorithm often scores no hits at all (i.e. it was unable to find a light source, and the algorithm basically shut off that pixel and will approximate it with a denoiser).

          Just because a GPU supports "raytracing" doesn't mean it is actually usable in practice.

          And you can't "optimize" raytracing, it will keep on scaling linearly. Meaning, in 10 years from now, raytracing will still take away half or more of your performance. The amount of dot products and intersection tests scales linearly, so raytracing will remain very costly. Unlike rasterisation, there is no way to optimize raytracing.
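          The "dot products and intersection tests" mentioned above can be made concrete with a toy ray-sphere intersection test (illustrative Python, not any engine's actual code):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return distance t to the nearest hit, or None if the ray misses.

    Solves |origin + t*direction - center|^2 = radius^2 for t, which
    reduces to a quadratic built from a handful of dot products: the
    per-ray arithmetic referred to above.
    """
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    oc = [o - c for o, c in zip(origin, center)]
    a = dot(direction, direction)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * a * c  # negative discriminant means a miss
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / (2.0 * a)  # nearest of the two roots
    return t if t > 0.0 else None

# A ray fired along +Z hits a unit sphere centered 5 units away at t = 4.
hit = ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)
```

For what it's worth, real engines wrap tests like this in acceleration structures (BVHs), so each ray only tests a small subset of the scene's geometry rather than every primitive.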

          1. tl;dr:

            Realtime raytracing for games is a really bad idea. You don't want to be doing millions of expensive dot products and intersection tests each frame, just to figure out what color a pixel should be.

            Raytracing also requires constant denoising because lots of pixels will never find a matching light source in time, no matter how many rays you shoot; rays get stuck in occlusion shadows, for example.

            Foley and van Dam figured this out in the 80s. SIGGRAPH papers described it as unfeasible in the 90s.

            Nvidia also knows this, but Nvidia is not interested in games performing well. If games perform well, there is no reason to upgrade your GPU.
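            The denoising point above can be illustrated with a toy Monte Carlo shading estimate (illustrative Python with made-up probabilities, not renderer code): the noise in the estimate only shrinks as 1/sqrt(samples), which is why low-sample renders get cleaned up with a denoiser instead of brute force.

```python
import random

def shade_pixel(samples, hit_probability=0.2, light=1.0):
    """Toy Monte Carlo estimate of one pixel's brightness.

    Each random ray either reaches the light (with the given probability)
    or scores no hit at all. The true answer is hit_probability * light,
    but the estimate's noise only shrinks as 1/sqrt(samples).
    """
    hits = sum(1 for _ in range(samples) if random.random() < hit_probability)
    return hits / samples * light

random.seed(42)  # deterministic demo
few = [shade_pixel(8) for _ in range(1000)]     # 8 rays per pixel
many = [shade_pixel(512) for _ in range(1000)]  # 64x the work per pixel

def spread(values):
    """Standard deviation: how noisy the per-pixel estimates are."""
    mean = sum(values) / len(values)
    return (sum((v - mean) ** 2 for v in values) / len(values)) ** 0.5

# 64x more rays per pixel only cuts the noise by about 8x (sqrt of 64).
```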

      2. Well said. If a feature can be easily replicated on non-RT hardware, why do I need to brute-force it? Other than to sell expensive hardware, that is. Suddenly you can get 100 fps at 4K native when you take RT out of the equation.

      1. People should stop obsessing over lighting for its own sake. We had this figured out two generations ago; now we make games centered around this one graphical feature to the detriment of everything else. From what I have seen, the lighting in DL:B is great, but since it's not RTX there will be whining.

    2. It means the game looks underwhelming for a 2025 current-gen exclusive release.

      Dying Light 1 looked like this a decade ago on weaker hardware.

      1. Dying Light 1 still looks good, just a bit of low-res assets. I wouldn't mind if they made the game optimized for the hardware of the era.

  5. Focusing only on performance : Well, gotta consider it's not carrying the performance cost of RT/PT, while it also lacks its graphical fidelity and you can tell there's a lot of shadowing missing in exteriors, trees and foliage… But…
    HEY!! At least we can run it well!!

    When RT is implemented though… I suspect they'll choose to not make it too thorough, so it won't be too heavy… But let's see it.

  6. DLSS frame generation appears to cause frame drops sometimes. If you have that issue, use FSR frame generation. You can mix any upscaling with any frame generation technique.

    1. Crisp textures don't mean anything lmao. The textures are objectively not very detailed; the game's level of performance makes sense for its visuals.

    1. What?? Playing on a PS5 Pro with an LG C3 and it looks as good as or better than any game I've played. Ppl just b*tch and whine about everything. Not to mention the game JUST came out. Gone are the days of buying a game day 1 and it being the best, most optimized version of itself.

    2. Particularly exteriors and especially because of vegetation that looks quite flat, both are lacking some finer overall shadowing. Some good Ambient Occlusion would solve a lot of that for the whole scene.

  7. Doesn't support RT and it's a good thing.

    KB/M customization is in a much worse state, as expected, judging from what I saw on the forums.

  8. Theory: no RT at release because too many players will just "turn everything to max" and get mad about bad performance. But if you delay RT, performance for the initial reviews will be great, and you can also sell a "free graphics update" later.

    Not even criticizing this, it's a good idea. Stupid people with angry early reviews are a pain in the a*s, and that's a clever way of dealing with them.

    1. How about no RT at all?

      How about developers spend time on gameplay instead of on raytracing that makes no difference at all and tanks performance.

      Silksong is outselling everything by a massive margin.

      Raytracing and $2,000 GPUs are completely irrelevant. Developers should never cater to that 1%; they are irrelevant to the success of a game.

      The only way catering to people with high-end GPUs would make sense is if those people paid more for the game, A LOT MORE, to justify alienating a much bigger consumer base with lower-end GPUs. But they don't; people with $2,000 GPUs still pay the same for games.

      It makes 0% economic sense to make games that require a high-end GPU when that is a tiny part of the market. Almost all games that do use RT are massively subsidized by Nvidia to compensate for the loss of sales.

      1. Just because you can't shell out the bread for high-end equipment means what? Everyone else should play on cheap, outdated hardware? Goes both ways, kid.

        1. No one cares one iota what card you personally have, numbskull.

          What developers care about is how many people will buy their game, and using raytracing to target the 1% of the market with high-end GPUs that can run it is pointless.

          Battlefield 6 understands this; they categorically said there will be no raytracing, now or in the future, because they do not want to limit their audience.

          1. BF6 is a fast-paced multiplayer shooter, so no RT is no big deal; most people would turn it off anyway to get max fps and minimum input latency. But Dying Light is a single-player story game, so it will benefit greatly from the RT mode for improved visuals. An RT patch is coming soon.

    2. Yeah, that's spot on. People are mad if the path tracing mode doesn't run at 100 fps on their Radeon 6700, so it might be a pretty clever move to delay the RT update.

      I finished DL2 with RT, and the ray-traced mode was quite a massive upgrade to the visuals, a generational leap in many places, so I'll wait for the RT patch for the new game too.

      1. No they aren't, stop blatantly lying.

        People complain about performance for games that are confirmed to have shtty performance.

        Go find me a DF video for a game that was review bombed for poor performance which wasn't confirmed in their review.

    3. What's with this braindead take of unruly, unreasonable players "turning everything to max and then crying", when every single time a game is criticized for awful performance it DOES have awful performance on any config.

      Care to give a single example of a game that runs well for its fidelity, has no godforsaken stuttering from traversal and shader comp, no geriatric frame pacing, and also one that scales well with hardware/settings that was then panned for performance by unreasonable gamers?

      You won't because it doesn't happen, only dogsht ports get slammed and rightfully so. Cronos, Silent Hill 2, Borderlands 4, Jedi Survivor, most/all UE5 games, etc etc. All pieces of garbage.

  9. When the graphics don't wow, gamers complain. When developers push the graphical boundaries, gamers call it "unoptimized."

    1. What game released in the last 10 years "pushed the graphical boundaries" to justify their ridiculous system requirements though?

      Oh wait, there's none, because almost every "game" released in the last 10 years is an unoptimized mess that requires the most powerful GPU on the planet to barely even give half-playable performance.

      1. All big games released during the X1/PS4 era had higher system requirements than games on X360/PS3, but the visual jump was always there to justify it.

        This looks like DL1, which ran on a quarter or less of the power of the current-gen consoles.

  10. Let's compare this to the cartoony gfx in Borderlands 4, which needs a 5090 to get barely passable 1080p fps without gimmicks that were supposed to be there for early ray/path-tracing adoption rather than to cover up a complete lack of raster optimization.

    BL4 will earn the title of worst-optimized game of 2025, where a 5090 is barely a passable 1080p card… and look at what that insane gfx power gives you in that turd – yikes!

  11. One of the best-optimized games I have ever seen. CPU usage is well balanced across all cores, which gives you low CPU usage, while GPU usage stays maxed at 99% without any stutter and with a super flat frame time. That's how DX12 should work in any game. Kudos to Techland.

  12. Sorry if this is showing up more than once; I posted a couple of hours ago but it's not showing.

    Does anyone know if this game has Nvidia Reflex 2.0? A couple of YouTubers were claiming it did before it was released, and I am curious, as it would be the first game with it implemented.

    1. Nvidia has said DL:TB was going to be one of the games the feature is used in. The setting is called Latency Reduction in the game, and there are two options: Reflex and Reflex + Boost.

      1. Still, the graphical "improvements" over a 10-year-old game are barely even worth mentioning, and that is when keeping in mind all of what's written above (the aforementioned GPUs, DLSS, etc.)

          1. I get your point, but if we were to compare how graphics used to evolve back then, a 1997 game's High settings would equal a 2007 game's GARBAGE settings.

            For real though, no sarcasm. It's funny that this is actually the real thing of how things were back then VS how things are today.

  13. I'm playing on a 7800X3D / 5090 and it's been great so far. I am at max settings but decided to use DLSS Quality for a locked 120 FPS on my LG OLED. The game itself is a lot of fun.

  14. hey John, you should consider removing the light/dark theme, having that as an option feels like a betrayal of your branding with this website

  15. I LOVE Dying Light!
    IMHO it's one of the best game series developed in the past ~15 years…genuinely good game, and arguably the best Zombie game besides L4D2!

  16. This is bad. I can run Cyberpunk 2077 at around 60 fps with my 9070 XT and 9800X3D at ultra settings, including ray tracing and plenty of mods, especially texture and graphics/AI/NPC-count-enhancing ones with native seeX AA. It looks a lot better and is a lot bigger than Dying Light: The Beast, which you're telling me runs at nearly 100 fps, with some more demanding areas, on a 5090 in 4K with no ray tracing? It doesn't sound as impressive as you make it out to be.

  17. 3 years after DL2 and this looks dated. Zero improvements. Dated AO, dated shadows; geometry often looks like something from the PS3 for rocks and vegetation. The probe-based indirect lighting used for interiors is something straight from the 7th gen.

    The game is still too GPU-demanding for the visuals on display and too CPU-intensive. My former 5700 XT could just about do 1080p120 on the original Dying Light, and this doesn't look better, and I saw a 3060 can't hold 1080p60.

    Still NO long-distance shadows. What year is this? Look at Days Gone or Gears 5. I know it's a different engine, but the tech problem is solved. IMPLEMENT IT.

    Even among the upcoming RT effects, RT reflections look like a downgrade from SSR.

    The game should not still have DX11. It should've also shipped with XeSS 2.1, which unlocks XeFG for all. The game exposing FSR 2.3.4 is stupid. The game still NOT having a HUDless mode, when DL2 didn't have one either, is also stupid.
