Nexon has just announced a new third-person co-op action RPG shooter that will be using Unreal Engine 5, called The First Descendant. In order to celebrate this announcement, the company released the game’s first in-engine trailer that you can find below.
In this game, you play as Descendants who have inherited unknown powers, and you can strengthen those powers in order to fight the invaders and protect humanity.
Now what’s really interesting here is that The First Descendant will be the first free-to-play game using Unreal Engine 5. Hell, since STALKER 2 and Hellblade 2 are nowhere to be found, The First Descendant will also be the first Unreal Engine 5 game we can play.
Nexon will launch a beta phase this October. The company will also share a new official trailer soon, so stay tuned for more.
Lastly, you can find more details about this game on its official Steam store page. I’ve also included below its official PC system requirements.
Enjoy!
The First Descendant Official PC System Requirements
MINIMUM:
- OS: Windows 10 x64
- Processor: Intel i5-3570 / AMD FX-8350
- Memory: 8 GB RAM
- Graphics: GeForce GTX 1050Ti or AMD Radeon RX 570
- DirectX: Version 12
- Network: Broadband Internet connection
- Storage: 30 GB available space
RECOMMENDED:
- OS: Windows 10 x64
- Processor: Intel i7-7700K / AMD Ryzen 5 2600X
- Memory: 16 GB RAM
- Graphics: GeForce RTX 2060 or AMD Radeon RX 5600XT
- DirectX: Version 12
- Network: Broadband Internet connection
- Storage: 30 GB available space

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best consoles ever. Still, the PC platform won him over consoles. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC Graphics Cards.”
Contact: Email
https://uploads.disquscdn.com/images/9da5c6189685dd7b924c02f82b4f735e75569bf22eea51853eea13b821420463.jpg
Music sounded like stock template for generic action movie LOL !!!
All of Nexon is just template garbage
Donno about Nexon tbh
I wouldn’t install anything they made on my system. Nope nope nope.
this.
“Will be the first free-to-play game.”
https://uploads.disquscdn.com/images/061c76f1fefde7415497a4a93efd1405b37445ab1960281971b03be77a3412b6.jpg
Haha, good one
got excited there for a second then i saw NEXON on the trailer and i was like NOPE.
“third-person co-op action RPG shooter’ – now that’s a mouthful especially when it comes to genres. Do the devs actually know what they want their new game to be or they just decided to jumble all of the elements together hoping that something good will come out of it?
DX12, Nexon, “free2play”, making sure to spook as many people as possible, I see.
Being generous and depending on how much the shills get paid i give it from 6 months to a year.
This is long, so I’ll put a 2nd section that simplifies everything under “Section 2” in bold. If you’re really interested in why it happens, read the whole thing. It looks huge, but I read it back just now in about a minute and 17 seconds (that’s a lie!) & I was looking for big typos.
DX12 won’t have the stutter issue in UE5 or any new, genuinely DX12 engine. The reason we see stutter in some DX12 games is that they are really DX11 engines using a few DX12-specific function calls like RT or mesh shaders. DX12 works completely differently: it allows developers to send instructions directly to the GPU driver, whereas from DX1 through DX11 the developer sent instructions to the API (let’s say DX11, for example), and the API then sent those instructions to the driver. That extra step adds overhead, which means performance loss. It also means no direct control of the GPU, what people call programming “to the metal”.
Removing that extra step not only allows developers to do significantly more advanced things (things we haven’t even seen yet), but it also REQUIRES them to handle a large number of things that the API used to handle for them. (See the DX12U documentation for info; it’s readily available.)
This is also how huge performance gains can be achieved. The problem is that developers have had to learn a new way of doing things, since they control the hardware now. The API (DX12) takes a back seat for most things, so a PC works more like a console. On console they can even skip talking to a driver and talk directly to the GPU, which is better than even DX12 & Vulkan, but it would be dangerous on PC (imagine an unkillable, undetectable GPU virus).
SECTION 2
More words but an easier faster read.
Anyway, let me fast-forward. Basically, we see stutter because devs want to have their cake and eat it too. They likely want just one or two visual features like RT, or they want a performance boost. The issue is that since it’s not really a DX12 engine at the core, they don’t have the full control they need. To get proper frame cadence, they would need to write code that specifically controls every step of the GPU rendering pipeline, whereas in DX11 they just follow the right steps and the API tells the driver how and when to do what.
It’s an issue of either laziness or cost cutting, because there are plenty of DX12 games on DX12 engines (proper ones, not DX11 with some DX12 function calls like, say, Elden Ring). It’s happening because, if you look at the Steam hardware survey, the vast majority of PC gamers are still on the nVidia 1000 series or below. The number of people running full DX12, let alone DX12U-compliant, GPUs is sad.
People need to upgrade, because anyone who isn’t on an nVidia 2000 series or newer (not counting the 16-series cards, which are really part of the 1000-series tier even if they’re newer, derp), or who is still on an old AMD GPU like the 200 series (technically only DX12 tier 1 compliant), is literally holding PC gaming back and in part helping cause these performance problems.
A developer isn’t going to build a proper DX12 pipeline for their game if only 15% of people can use it, at least not a developer who wants to spend the exact minimum to get the game out in a working state. So the DX12 mode with better visuals, DLSS, etc. has stutter… sorry. They will keep doing this until the share of people in the 2000-series-or-newer category hits 50%. OR until people stop buying games that stutter, hold their ground until it’s fixed, and make it known why they didn’t buy. A large number of people need to do the same, because why fix it if people will buy anyway?
The DX12 games that don’t stutter, and get an easy 20% to theoretically 50% performance boost, are the ones with two development pipelines: they build the game for DX11 and then take the time to write proper DX12 code. The only reason this is less of a problem on Vulkan is that it’s based on Mantle, and AMD gave everyone the code. For the couple of Mantle games that exist, the developers made the DX11 version, then AMD made the necessary engine modifications for Mantle, and that work is open source. They did this to show off that their GPUs weren’t crap and that there was a significantly better way to make games. They stopped because they knew Microsoft was finishing up the last parts of rev 1 of DX12. That’s why AMD had such huge gains in Thief: on a 290X at 1080p high, DX11 ran 46-60 fps while Mantle ran 140-160 fps, probably hitting thermal limits. It got that boost from the “to the metal” Mantle API, and the uarch focus in GCN was definitely async compute.
With DX12, developers write their own code, and they aren’t going to just share it. Microsoft isn’t at fault either: the AMD situation was very unique, and Microsoft has given developers the instructions on how to code for DX12 just as they did for every version. But because cutting corners is cheaper, and because people keep buying the games anyway, developers haven’t done what’s necessary, and the end result is stutter. Hell, Microsoft helped with UE5 by sending The Coalition, the MS studio that has been making Gears of War and can literally make UE4 on DX12 perform like NOBODY else. Look at Gears 5, then remind yourself it not only works on a base Xbox One but looks incredible for that hardware… and they were only using the tip of the DX12 iceberg, with no stutter. They also helped make that crazy Matrix UE5 demo that people have done cool stuff with.
The reason why it happens is irrelevant, since it is well known and keeps happening. You come off to me as being in the camp of those who still “hope”, especially since you are aware of the two things that should rid you of that hope: “cost” and “laziness”.
All DX12 games, across a variety of engines, have had stuttering, even on hardware that fully supports the API. You are hoping that somehow a change of engine will fix that for an online-only game from a Korean developer/publisher that WILL not spend the amount of money MS did on Gears of War 5 to make their game look good.
All i know is, i will believe it when i see it.
Warframe clone.
ew, microtransaction garbage
no gameplay, don’t care!
Nexon and ‘free to play’ are words that do not align. Expect to shell out big cash to keep up.
Post apocalypse + nuGears characters and guns, rocks from UE5 demo map, awkward name. This game will have 3 hours of content that will run at 25 fps on high end PCs. Dead on arrival.
Ah Nexon.
I remember when these clowns added pay to win lootboxes in Maplestory and killed it.
I wonder what new “Fun” ways of monetization these Koreans are going to push now.
How original! Can’t wait to play this masterpiece and brand new idea of a game.
Guess the GOTY is already decided.
Pew pew pew.
Finally something that looks next gen. Looks, and kinda plays, like Gears of War.