Techland has just released Dying Light: The Beast on PC. However, the game does not currently support any of its advertised Ray Tracing effects; the devs have stated that they will add them via a post-launch update. I was able to test the game, so it’s time to share my initial performance impressions.
To test the game, I used an AMD Ryzen 9 7950X3D, 32GB of DDR5 at 6000MHz, and an NVIDIA RTX 5090. I also used Windows 10 64-bit and the GeForce 581.29 driver.
At native 4K/Max Settings with DLAA, the game ran at an average of 94-98FPS. Interior areas ran noticeably better than the open world. However, I was able to find an area early in the game that proved quite demanding.
As you can see below, this scene runs at 82FPS on the NVIDIA RTX 5090. So, this is the area I’ll use for our upcoming PC Performance Analysis, as it should give us a better idea of how the rest of the game will run.
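For those who prefer frame times, the conversion is simply 1000 divided by the framerate: 94-98FPS works out to roughly 10.2-10.6ms per frame, while the 82FPS scene sits at about 12.2ms per frame.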
For a rasterized game, Dying Light: The Beast looks great. However, it suffers from major pop-in issues. The lighting can also feel a bit underwhelming at times. In other words, the game will most likely greatly benefit from its upcoming RT update. On top of that, the 3D models of all main characters cannot match the ones we’ve seen in some other games, like Metal Gear Solid Delta: Snake Eater.
Overall, my first performance impressions of Dying Light: The Beast are positive. I did not experience any major stutters, and the game felt great. There is also support for DLSS 4, FSR 4.0, and Intel XeSS 2.0 for those who do not own a high-end GPU.
Now, I know I’m using an RTX 5090 here, so it will be interesting to see how the game runs on less powerful GPUs. Still, first impressions are positive. The game also has Very Positive reviews on Steam; if it had major performance issues, it would likely be sitting at Mixed or Mostly Negative.
Our PC Performance Analysis will go live later this week. So, stay tuned for more!

John is the founder and Editor-in-Chief at DSOGaming. He is a PC gaming fan and strongly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved the 16-bit consoles, and still does, considering the SNES one of the best consoles ever made. Still, the PC platform won him over, mainly thanks to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on "The Evolution of PC Graphics Cards."
Contact: Email
While I haven't had the time to test it myself yet, it should be noted that this game is already Steam Deck Verified, so in theory even a potato PC should have no trouble running it with reduced settings.
It's powered by Techland's Chrome Engine, which runs great and delivers crisp visuals and nice draw distances. Also, given their history, the graphics and performance are going to get even better with updates over time. But yeah, it won't satisfy UE5 kisser John and gang with its presentation, because it's not running at ~30ish fps leaving blurry eye-candy trails. The game's good and a true sequel to the 1st game.
Any particular reason your profile pic is of a jew?
I don't think he is. A quick google search will clear you doubt.
🌽 🤡 🤡 🌽
It's Arnold Vosloo, a South African actor who is an Afrikaner. He was made famous starring in the first two Mummy movies (as the titular character) with Brendan Fraser, and in Hard Target with JCVD.
Man sure knows his actors and movies.
A little trivia: he spent months on a single scene in the movie, as ILM wanted to scan and match every angle of his face to the mummy Imhotep. In the end, we got the best villain and the best mummy movies ever created. Goated actor for sure.
You are correct. I mistook him for this other mummy jew. https://uploads.disquscdn.com/images/5cfb460e403356a332234a9e2a17819ca845d07fc3182f853bccb9ed0ea860c1.jpg
https://uploads.disquscdn.com/images/da28b4317c4037934a1b64cce9e14c47f5e0fee1ed68fcf22c60e05381d770d2.gif
"The lighting can also feel a bit underwhelming at times. In other words, the game will most likely greatly benefit from its upcoming RT update."
No it won't, because of the performance costs. Also, what the hell does "underwhelming" mean when discussing how the lighting "feels"? Made-up buzzwords.
Good for Techland for leaving unoptimized trash for later updates.
The feeling is that of butthurt.
Nvidia and gaming sites spread lies:
- They said raytracing would look drastically better; it doesn't.
- They said raytracing would become mainstream; it never did.
- They said raytracing performance would improve; it doesn't, because it is a flat dot-product cost that can't be optimized.
RT has been mainstream for quite some time now, with even the RTX 5070 providing decent performance in RT.
Dying Light 2 looks so much better with RT enabled compared to raster mode.
EVERY graphics card made in the last 5 years has Ray Tracing… If THAT isn't mainstream, then what is?
Every 3D GPU ever made supports hardware raytracing.
Doing dot products and intersection tests is supported by every 3D GPU; an old Voodoo card from the 90s could do it too.
In fact, there are thousands of rendering engines using realtime raytracing. Look up some Assembly and Symposium demos from the 90s demoscene; I made my own demo using raytracing too. Realtime raytracing on PC has existed for decades. Every other Computer Science major in Europe in the 90s was making some demo using raytracing, and many of these people are behind today's game engines or working for Nvidia.
However, realtime raytracing makes little sense for complex environments like games. Raytracing's Achilles heel is the sheer number of dot products and intersection tests it needs to do, which adds up to a very expensive amount of math (a minimal sketch of one such test is shown below). And unlike rasterisation (with Blinn-Phong or Lambert shading), which just needs a single hit per polygon, raytracing needs many rays for each pixel on your screen to converge, plus lots of denoising, because the algorithm often scores no hits at all (i.e. it was unable to find a light source, so that pixel is essentially written off and approximated by a denoiser).
Just because a GPU supports "raytracing" doesn't mean it is actually usable in practice.
And you can't "optimize" away raytracing's cost; it keeps scaling with resolution and ray count. Meaning, ten years from now, raytracing will still take away half or more of your performance. The number of dot products and intersection tests scales linearly with the rays you shoot, so raytracing will remain very costly. Unlike rasterisation, there is no trick that makes that per-ray work disappear.
tl;dr:
Realtime raytracing for games is a really bad idea. You don't want to be doing millions of expensive dot products and intersection tests each frame, just to figure out what color a pixel should be.
Raytracing also requires constant denoising, because lots of pixels will never find a matching light source in time no matter how many rays you shoot; rays get stuck in occlusion shadows, for example.
Foley and van Dam figured this out in the 80s. SIGGRAPH papers described it as unfeasible in the 90s.
Nvidia also knows this, but Nvidia is not interested in games performing well. If games perform well, there is no reason to upgrade your GPU.
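For the curious, here is a minimal, self-contained sketch of the per-ray work described above: a single ray-sphere intersection test built from dot products. It's plain C++ with a made-up one-sphere scene, purely an illustration of the math and not code from any actual engine; a real renderer repeats something like this across millions of rays per frame and leans on BVH acceleration structures and denoisers to keep the cost manageable.

```cpp
// Minimal ray-sphere intersection sketch (hypothetical one-sphere scene, illustration only).
// Shows the dot products behind a single primary-ray hit test.
#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };

Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Returns true (and the nearest hit distance t) if the ray with origin o and
// unit direction d intersects a sphere of radius r centred at c.
bool intersectSphere(Vec3 o, Vec3 d, Vec3 c, double r, double& t) {
    Vec3 oc = sub(o, c);
    double b  = dot(oc, d);            // dot product #1
    double cc = dot(oc, oc) - r * r;   // dot product #2
    double disc = b * b - cc;          // discriminant of the quadratic
    if (disc < 0.0) return false;      // ray misses the sphere entirely
    t = -b - std::sqrt(disc);          // nearest intersection along the ray
    return t > 0.0;
}

int main() {
    // One ray, one sphere: camera at the origin looking down +Z, sphere 5 units away.
    Vec3 origin{0, 0, 0}, dir{0, 0, 1}, centre{0, 0, 5};
    double t;
    if (intersectSphere(origin, dir, centre, 1.0, t))
        std::printf("hit at t = %.2f\n", t);   // expected output: hit at t = 4.00
    return 0;
}
```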
Well said. If a feature can be easily replicated on non-RT hardware, why do I need to brute-force it? Other than to sell expensive hardware, that is. Suddenly you can get 100 fps at 4K native when you take RT out of the equation.
Underwhelming means mid, mediocre, OK, not great, etc. It's self-explanatory.
People should stop obsessing over lighting for their own sake. We had this figured out two generations ago; now we make games centered around this one graphical feature to the detriment of everything else. From what I have seen, the lighting in DL:B is great, but since it’s not RTX there will be whining.
It means the game looks underwhelming for a 2025 current-gen exclusive release.
Dying Light 1 looked like this a decade ago on weaker hardware.
Dying Light 1 still looks good, aside from some low-res assets. I wouldn’t mind if they made the game optimized for the hardware of the era.
Focusing only on performance: well, you've got to consider it's not carrying the performance cost of RT/PT, but it also lacks that graphical fidelity, and you can tell there's a lot of shadowing missing in exteriors, trees and foliage… But…
HEY!! At least we can run it well!!
When RT is implemented though… I suspect they'll choose not to make it too thorough, so it won't be too heavy… But let's see it.
DLSS frame generation appears to cause frame drops sometimes. If you have that issue, use FSR frame generation. You can mix any upscaling with any frame generation technique.
John, can you disable that hideous chromatic aberration?
Yep, you can disable both Chromatic Aberration and Film Grain.
Cool, thanks, my friend.
The game looks damn crisp! Great textures!
Yeah, a decade ago.
Crisp textures don't mean anything lmao. The textures are objectively not very detailed, and the game's level of performance makes sense for its visuals.
The game looks past-gen as hell, a little better than DL1, but somehow needs far superior hardware.
Screen below. It's native 1440p, max settings, no film grain, no chromatic aberration, no motion blur. Runs and looks like that on a 7800X3D and 4090.
DL2 without RT looks far better. https://uploads.disquscdn.com/images/a36ae5a1391610ba3e869940c1620e6ac4a866b9794899f67e5007f58a5d308c.jpg
What?? Playing on a PS5 Pro with an LG C3 and it looks as good as or better than any game I've played. People just b*tch and whine about everything. Not to mention the game JUST came out. Gone are the days of buying a game day 1 and it being the best, most optimized version of itself.
Exteriors in particular, and especially the vegetation (which looks quite flat), are both lacking some finer overall shadowing. Some good Ambient Occlusion would solve a lot of that for the whole scene.
Why do you test on a system that is beyond most people's reach and wallet?
Stick to consoles.
Check out BenchmarKing and RandomGaminginHD on YT these days.
Doesn't support RT and it's a good thing.
KB/M customization is in a much worse state, as expected, judging from what I saw on the forums.
That's never a good thing, so many rtards on these boards lmao.
Theory: no RT at release because too many players will just "turn everything to max" and get mad about bad performance. But if you delay RT, performance for the initial reviews will be great, and you can also sell a "free graphics update" later.
Not even criticizing this, it's a good idea. Stupid people with angry early reviews are a pain in the a*s, and that's a clever way of dealing with them.
This makes sense and is a good spin for the company.
Not a tin foil hat theory at all, makes a lot of sense.
How about no retarded RT at all?
How about developers spend time on gameplay instead of on raytracing that makes no difference at all and tanks performance.
Silksong is outselling everything by a massive margin.
Raytracing and $2,000 GPUs are completely irrelevant. Developers should never cater to that 1%; they are irrelevant to the success of a game.
The only way catering to people with high-end GPUs would make sense is if those people paid more for the game, A LOT MORE, to justify alienating a much bigger consumer base with lower-end GPUs. But they don't; people with a $2,000 GPU still pay the same for games.
It makes zero economic sense to make games that require a high-end GPU when that is a tiny part of the market. Almost all games that do use RT are massively subsidized by Nvidia to compensate for the loss of sales.
Just because you can't shell out the bread for high-end equipment means what? Everyone else should play on cheap, outdated hardware? Goes both ways, kid.
No one cares one iota what card you personally have, numbskull.
What developers care about is how many people will buy their game, and using raytracing to target 1% of the market with high-end GPU that can run it, is retarded.
Battlefield 6 understands this: they categorically said there will be no raytracing, now or in the future, because they do not want to limit their audience.
BF6 is a fast-paced multiplayer shooter, so no RT is no big deal; most people would turn it off anyway to get max fps and minimum input latency. But Dying Light is a single-player story game, so it will benefit greatly from an RT mode for improved visuals, and an RT patch is coming soon.
The RT in Dying Light 2 was incredible. It was arguably the best part.
Yes, that's spot on. People get mad if the path tracing mode doesn't run at 100 fps on their Radeon 6700, so it might be a pretty clever move to delay the RT update.
I finished DL2 with RT, and the ray-traced mode was quite a massive upgrade to the visuals, a generational leap in many places, so I'll wait for the RT patch for the new game too.
No they aren't, stop blatantly lying.
People complain about performance for games that are confirmed to have shtty performance.
Go find me a DF video for a game that was review bombed for poor performance which wasn't confirmed in their review.
Dying Light 2 has great RT performance, stop whining.
Nobody has expectations about PT on an RX 6700, so calm down.
What's with this braindead take of unruly, unreasonable players "turning everything to max and then crying", when every single time a game is criticized for awful performance it DOES have awful performance on any config.
Care to give a single example of a game that runs well for its fidelity, has no godforsaken stuttering from traversal and shader comp, no geriatric frame pacing, and also one that scales well with hardware/settings that was then panned for performance by unreasonable gamers?
You won't because it doesn't happen, only dogsht ports get slammed and rightfully so. Cronos, Silent Hill 2, Borderlands 4, Jedi Survivor, most/all UE5 games, etc etc. All pieces of garbage.
Too bad the raster visuals are too underwhelming anyway. Performance is too.
This… actually makes a lot of sense?!
When the graphics don't wow, gamers complain. When developers push the graphical boundaries, gamers call it "unoptimized."
What game released in the last 10 years "pushed the graphical boundaries" to justify their ridiculous system requirements though?
Oh wait, there's none, because almost all "games" released in the last 10 years are agenda-poisoned cancer made by fake so-called "gamers" who in reality are actually pink-haired trannies who worship Satan, hence why they release unoptimized pile of Talmudick feces that require the most powerful GPU on the planet to barely even give a half-playable performance, because these pink-haired trannies who worship Satan don't know how to code shit, because brainwashing people to normalize their filth was the sole reason for making these agenda-poisoned cancer "games" just like Immortals of Aveum.
Did your med delivery get delayed due to tariffs? That sucks.
He's either a 12-year-old or autistic.
Did your infantophilic Vampiric rabbi subhumans get delayed from sucking your newborn 0-year-old infants' blood out of their peepees because they're too busy with the literal worst genocide of the 21st century and then using a non-existent fairy tale that supposedly "happened" a century ago to get away with it? That sucks.
Hey @disqus_DEktqYvCcx:disqus who blocks then replies like a coward, have you finished your Talmudick lessons at your synagogue today?
I'm neither 12 nor autistic, I go outside every single day and I make money by myself unlike you, you spoiled 12-year-old soy 'MuriKKKan buyfag with no source of income who burns $500-$1000 on microtransactions on COD and Fortnite annually from its tranny "mom" and "dad's" credit cards before ending up on the streets, feeding on trash cans.
This is the comment you get from a person that thinks they're sophisticated and intelligent, but clearly, cannot hide their ignorance and stupidity. Kid definitely skipped English class.
You should've changed your diapers before publishing this comment, I'm sure it's full of diarrhea.
https://uploads.disquscdn.com/images/cd07bd40a4e21ca77cbfb316a9664ab83b3cadce114c6ae58162a95e3e29a12e.png
Nah bro you're just a rabid schizophrenic who should be medicated, it's not a conspiracy.
https://uploads.disquscdn.com/images/ff3232109a73b6a9be3793b8d85a2873280056360df6f8daea98617d6f11074c.png
I'm the one who should be "medicated" or you the one who should be sent to a mental asylum because you think this type of shit is acceptable just because it is done by none other than your favorite infantophiles, you infantophile terrorist-worshipper?
https://uploads.disquscdn.com/images/e9534f9ad17943e8de6c15aeae0a91d01230bd4e7a406c89470d8936e7497c02.jpg
Uncle Leo??
Don't know about your gay and cringe Jewish terminologies and don't care.
All big games released during X1/PS4 era had higher system requirements than games on X360/PS3, but the visual jump was always there to justify it.
This looks like DL1, which ran on hardware with a quarter or less of the power of current-gen consoles.
Both can be true at once if your IQ isn't at the bottom of a well.
This is 3 years newer than DL2 which was a cross-gen game and it looks the same. Zero progress.
Let's compare this to the cartoony graphics in Borderlands 4, which needs a 5090 to get barely passable 1080p fps without gimmicks, gimmicks that were supposed to be there for early ray/path-tracing adoption rather than to cover for a complete lack of raster optimization.
BL4 will earn the title of worst-optimized game of 2025, where a 5090 is barely a passable 1080p card… and look at what that insane GPU power gives you in that turd. Yikes!
I just like the helpful information you provide in your articles
One of the best optimized games I have ever seen: CPU usage is well balanced across all cores, which keeps CPU load low, while GPU usage stays maxed at 99% without any stutter and with a super flat frame time. That's how DX12 should work in any game. Kudos to Techland.
How can you say this? It looks the same as DL1 and it's over twice as demanding for no reason.
Typo in the 6th paragraph, John.
Dying Light: The Light, lol
Sorry if this is showing up more than once, I posted a couple of hours ago but it's not showing.
Does anyone know if this game has Nvidia Reflex 2.0? A couple of YouTubers were claiming it did before it was released, and I am curious, as it would be the first game with it implemented.
Nvidia has said DL:TB was going to be one of the games the feature would be used in. The setting is called Latency Reduction in the game, and there are two options: Reflex and Reflex + Boost.
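For context, those two menu options correspond to the classic Reflex low-latency modes. Below is a rough, hypothetical sketch of how such a setting is typically wired up through NVIDIA's publicly documented NvAPI low-latency calls; this is a generic illustration, not Techland's actual code, and NvAPI initialization plus error handling are omitted.

```cpp
// Hypothetical sketch: mapping a "Latency Reduction" menu setting to NvAPI's
// low-latency sleep mode. Assumes a D3D11 device and that NvAPI_Initialize()
// has already succeeded.
#include <d3d11.h>
#include <nvapi.h>

void ApplyLatencyReduction(IUnknown* device, bool reflex, bool boost)
{
    NV_SET_SLEEP_MODE_PARAMS params = {};
    params.version           = NV_SET_SLEEP_MODE_PARAMS_VER;
    params.bLowLatencyMode   = reflex;            // menu option "Reflex"
    params.bLowLatencyBoost  = reflex && boost;   // menu option "Reflex + Boost"
    params.minimumIntervalUs = 0;                 // no additional frame rate cap
    NvAPI_D3D_SetSleepMode(device, &params);

    // Each frame the game would then call NvAPI_D3D_Sleep(device) so the driver
    // can pace the CPU and keep the render queue short, which is where the
    // latency reduction comes from.
}
```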
This is a 2015 game that runs at ~60FPS on a $330 GTX 970 on High Settings at 1440p with NO DLSS:
https://uploads.disquscdn.com/images/d13347e1b38a3b57283044049ae6747c0370b46aad468bd2dba7222dc3d1923c.png
https://uploads.disquscdn.com/images/a9255fc6f35678dc41b489831b4eb687fe4eddcf40083a95f710ccf886256530.png
While this is a 2025 game that runs on a $1200 RTX 4080 at ~90 FPS on High Settings at 1440p WITH DLSS:
https://uploads.disquscdn.com/images/5b5aeab63b214354b413e22a7cbe9a642939e17cc4d0a0419527f6d76e43ae19.jpg
https://uploads.disquscdn.com/images/bf6810db24dbcf69ff192c83bb2eb9081ac83371b29de2ee2b8553b1cb85c92f.jpg
Man, the gayming industry can't just crash and burn fast enough.
What can we do if you're blind?
Still, the graphical "improvements" over a 10-year-old game are barely even worth mentioning, and that's when keeping in mind everything written above (the aforementioned GPUs, DLSS, etc.).
The game can run on a GTX 1650. High settings from 2015 are Medium settings today.
I get your point, but if we compare how graphics used to evolve back then, a 1997 game's High settings would equal a 2007 game's GARBAGE settings.
For real though, no sarcasm. It's funny that this is actually the real thing of how things were back then VS how things are today.
I'm playing on a 7800X3D / 5090 and it's been great so far. I am at max settings but decided to use DLSS Quality for a locked 120 FPS on my LG OLED. The game itself is a lot of fun.
AC Mirage level of pandering.
Hey John, you should consider removing the light/dark theme; having that as an option feels like a betrayal of your branding with this website.
Wow another non-UE5 game runs smoothly at launch! What a shocking revelation
I LOVE Dying Light!
IMHO it's one of the best game series developed in the past ~15 years… a genuinely good game, and arguably the best zombie game besides L4D2!
This is bad. I can run Cyberpunk 2077 at around 60 fps with my 9070 XT and 9800X3D at ultra settings, including ray tracing and plenty of mods (especially texture and graphics/AI/NPC-count enhancing ones) with native seeX AA. It looks a lot better and is a lot bigger than Dying Light: The Beast, which you are telling me runs at nearly 100 fps, with some more demanding areas, on a 5090 at 4K with no ray tracing? It doesn't sound as impressive as you make it out to be.
Three years after DL2, and this looks dated. Zero improvements. Dated AO, dated shadows, and the geometry for rocks and vegetation often looks like something from the PS3. The probe-based indirect lighting used for interiors is straight from the 7th gen.
The game is still too GPU-demanding for the visuals on display, and too CPU-intensive. My former 5700 XT could just about do 1080p120 in the original Dying Light; this doesn't look better, and I saw that a 3060 can't hold 1080p60.
Still NO long-distance shadows. What year is this? Look at Days Gone or Gears 5. I know it's a different engine, but that tech problem has been solved. IMPLEMENT IT.
Even with the upcoming RT effects, the RT reflections look like a downgrade compared to SSR.
The game should not still have a DX11 mode. It should've also shipped with XeSS 2.1, which unlocks XeFG for everyone. The game exposing FSR 2.3.4 is stupid. The game still NOT having a HUD-less option, when DL2 didn't have one either, is also stupid.
RT in Dying Light 2 was incredible. Also, I'll just leave this here.
https://youtu.be/c-qduZ_FQ98?si=exCTLJRHTibdXsat