MAWI United GmbH has shared a new tech demo for Unreal Engine 5.1 that uses full Nanite for foliage and trees, Lumen, and Virtual Shadow Maps.
This tech demo showcases all of the new features that Unreal Engine 5.1 brings to the table. Moreover, it can give you a glimpse at what the forest areas in some upcoming games may look like.
According to MAWI United GmbH, the minimum required GPU for running this tech demo is the NVIDIA GeForce RTX 2080. The team also recommends using an NVIDIA GeForce RTX 3080.
You can download the Full Nanite Redwood Forest Demo from here, here or here.
Lastly, and speaking of Unreal Engine 5, we suggest taking a look at these other fan remakes. Right now, you can download a Superman UE5 Demo, a Halo 3: ODST Remake, and a Spider-Man UE5 Demo. Moreover, these videos show Resident Evil, Star Wars KOTOR and Counter-Strike: Global Offensive in UE5. Additionally, you can find a Portal Remake and an NFS3 Remake. And finally, here are a Half-Life 2 Fan Remake, a World of Warcraft Remake, a GTA San Andreas Remake, a Doom 3 Remake, The Elder Scrolls V: Skyrim, a God of War Remake and a GTA IV Remake.
Enjoy!

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best consoles. Still, the PC platform won him over consoles. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC Graphics Cards.”
Minimum 2080 lol
Forget about somewhat soonish games going for this level of detail. It’s nice to know the engine can handle it, but consumer hardware can’t. The consoles aren’t close to a 2080 and the PC mainstream is even further behind at this point, thanks to mining BS.
I honestly thought it would be higher than the 2080 when I first clicked on it.
RTX 2080 Super < XSX < RTX 2080 Ti
Name one game the series X runs better than a 2080 super. I’ll wait.
It is not about running better than the RTX 2080 Super. The discussion was about raw TFLOPS, where the XSX has an advantage over the RTX 2080 Super. If we are talking about gaming performance, then I am already calling the XSX a plastic box.
are you f*king out of your mind? go and see a psychologist, or better, go and watch some Digital Foundry videos about how the XBSX runs games at much, much lower internal settings and uses upscaling techniques, and even so still runs most games at 30FPS, when an RTX 2080 Super can run all those games at higher settings while maintaining a 60FPS target at NATIVE 4K in 99% of the games out there!… i’m sure you don’t even know what a GPU is!
and you better buy a pair of goggles before writing inconsistent comments. have you even read the text above, or are you just spitting this nonsense outta nowhere?
Consoles generally have less overhead and additional hardware-specific optimizations that help them slightly surpass their PC equivalents. However, with that said, there is no way this demo will run anywhere close to playable on any of the current-gen consoles. I tried it just now on my RTX 3060 Ti/Ryzen 7 5700X machine and I got like 9-11FPS at max settings at 1080p lol. To get a steady 30FPS I had to drop it down to high settings and balanced TSR, which looks somewhat visually noisy and blurry because their temporal solution isn’t quite as good as DLSS or XeSS at low internal resolutions. It’s going to take next-gen consoles to run demos like this one at native 4K and 60FPS.
I have one objection, however: “additional hardware-specific optimizations”, those days are done and dusted. It was those weird X360 (less weird) and PS3 (more weird) architectures that caused that enormous backlog in graphics technology. I am glad Crysis came out and slapped the $hit outta everything.
Yep, the problem with chasing graphics advancement is the diminishing returns compared to the resources needed to run it. More hardware resources need to be poured into NPC A.I., physics, and gameplay.
Actually, the XSX (12TF) is even faster than the RTX 2080 (10TF), and I’m also sure the PS5 (10TF) is close. RDNA1 (not to mention RDNA2) has already matched Turing’s efficiency per flop, so there’s no way a 10TF RTX 2080 could be faster than the 12TF XSX in raster. But it’s a different story when a game uses RT. In such games even a 2060S can match current-gen consoles. Also, nvidia cards have the amazing DLSS technology, so sometimes a 2060S can run games with much better picture quality than the consoles. For example, Doom Eternal looks better on a 2060S at 4K with DLSS Performance + RT.
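For what it’s worth, the TFLOPS figures being thrown around here come from simple arithmetic: peak FP32 = shader cores × 2 ops per clock (one fused multiply-add) × clock in GHz. A quick sketch using the commonly published core counts and clocks (note that peak TFLOPS says nothing about real-world performance across different architectures):

```python
# Rough peak-FP32 arithmetic behind the quoted TFLOPS figures.
# Peak TFLOPS = shader cores * 2 ops/clock (FMA) * clock (GHz) / 1000
def peak_tflops(cores: int, ghz: float) -> float:
    return cores * 2 * ghz / 1000

gpus = {
    "RTX 2080 (2944 cores @ ~1.71 GHz boost)": peak_tflops(2944, 1.71),
    "Xbox Series X (3328 cores @ 1.825 GHz)": peak_tflops(3328, 1.825),
    "PS5 (2304 cores @ up to 2.23 GHz)": peak_tflops(2304, 2.23),
}
for name, tf in gpus.items():
    print(f"{name}: {tf:.1f} TFLOPS")
# RTX 2080 (2944 cores @ ~1.71 GHz boost): 10.1 TFLOPS
# Xbox Series X (3328 cores @ 1.825 GHz): 12.1 TFLOPS
# PS5 (2304 cores @ up to 2.23 GHz): 10.3 TFLOPS
```

This is exactly why the next reply is right that the number alone is a poor yardstick: it measures theoretical throughput, not how efficiently an architecture turns those flops into frames.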
You can’t compare AMD TFLOPS to nvidia’s. You can hardly even compare nvidia to nvidia (the 1080 Ti has more TFLOPS than a 2080 Super?!). I mean, name one game that runs better on an XSX than on a 2080. I’ll wait.
John, you might want to include the alternate download links that the creator posted in his video’s description as the one in your article doesn’t work anymore.
I know the graphics in Unreal Engine 5.1 are amazing, but if Epic Games doesn’t fix the shader compilation problem in the engine for developers, it’ll hurt the developers who carefully craft their games, as well as the customers who buy them but end up facing performance issues.
They already addressed the issue in UE5.1. A new PSO (Pipeline State Object) caching system has been implemented to minimise stalls caused by shader compilation. It pre-compiles PSOs early, before they are needed for rendering on screen. They promised to work on it further.
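For the curious, here is a minimal sketch of how a project might opt into that behaviour via config. The exact console variables (`r.PSOPrecaching`, `r.ShaderPipelineCache.Enabled`) and section names are assumptions based on Unreal’s shader pipeline cache system; check them against the engine documentation for your UE version.

```ini
; DefaultEngine.ini (sketch only; cvar names are assumptions, verify
; against your engine version's documentation)
[SystemSettings]
; Compile pipeline state objects ahead of first use instead of at
; draw time, which is what causes the in-game hitches.
r.PSOPrecaching=1
; Replay a PSO cache recorded during playtests to warm the driver
; cache at startup, before gameplay begins.
r.ShaderPipelineCache.Enabled=1
```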
Shader compilation stutters are still there in UE5.1, according to Digital Foundry’s Fortnite update analysis.
Of course there are. It’s right in the documentation, where they say this will not eliminate stutters and may trade stalls for delays from immediate PSO calls.
imagine time traveling back to the 70’s and showing this to the “Pong” developers
Imagine time traveling to the year 2000 and showing myself this in high school. I’d have sh*t myself lol
This level of fidelity, at least at first glance, was achieved by DICE in Star Wars Battlefront. An almost 8-year-old game. The demo looks like the Endor map running on a PC with a 2GB video card (a GTX 660, for example). Also, all this tech is pointless if your engine has shader compilation issues…
LMAO..Dream on buddy ..LOL!
Star Wars Battlefront has similar scenery, but it doesn’t look nearly as realistic.
This chasing of realism is a bit concerning for me. Like you said, this kind of detail has actually been present for years. It’s fakery, of course, but realistically it’s enough to fool us into thinking it looks real, with all the traditional techniques that can still run on modest hardware. Even with Unreal’s Matrix demo, I can’t find the excitement; it still looks like a ‘game’ to me, especially on the character front (the face department is still not as good as L.A. Noire from years ago, or Injustice a few years back). I’m actually fine with that level of graphics in a game, but when it still looks like a game while taxing the hardware far more heavily, it’s just unacceptable. Those resources need to go to other departments, like physics, NPC A.I. behaviour, and the gameplay system as a whole, while graphics can be put on hold for a while, because they already show pretty diminishing returns from a resource perspective.
Less than 60fps on a Radeon 6600 at FHD at minimum settings…
So? 6600 is weak compute-wise.
Yeah, but still, not everybody has a third kidney to sell to buy a 4090 Ti.
Mid-range cards don’t last long, especially at the start of a new console generation. I believe Ray Tracing will stay optional on PC for a while until AMD and Nvidia are humbled by lower demands and start giving us good cards again.
This demo is extremely unoptimized. I am getting around 50-60% of the performance I get in the Conifer Forest Demo, which doesn’t even use Nanite foliage like this one does.
It’s because it features Virtual Shadow Maps this time. You’d get a similar performance penalty if you activated VSM in the older demos too.
Skyrim with mods.
Shader compilation issues will always f*k up UNREAL games. i don’t give a sh*t about how beautiful or advanced an engine looks if i can’t get rid of those stutters every 10 seconds, even with an RTX 3080 and such!
I don’t think the title of your article matches the content lol. Just kidding, mainly because I had some doubts after reading the article.