The Lord of the Rings: Gollum releases tomorrow and, from the looks of it, NVIDIA’s high-end GPU, the RTX 4090, can run it at 48fps at Native 4K with Ray Tracing.
For those unaware, Daedalic Entertainment has used Ray Tracing in order to enhance the game’s reflections and shadows. Moreover, the game will support DLSS 3 from the get-go.
According to NVIDIA, the RTX 4090 can push 48fps at Native 4K with Ray Tracing. Its second most powerful GPU, the RTX 4080, can push an average of 30fps at Native 4K with Ray Tracing.
In short, PC gamers will need DLSS 2 (or DLSS 3) in order to enjoy the game with smooth framerates at 4K. NVIDIA has shared some performance numbers; however, DLSS 3’s Super Resolution was set to Performance Mode. Thus, expect lower framerates when using DLSS 3’s Quality Mode.
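For context on what that mode switch means: DLSS renders the game internally at a fraction of the output resolution and upscales the result. The sketch below uses the commonly cited per-axis scale factors for the DLSS presets; these are the publicly documented defaults, and individual titles can override them, so treat the exact numbers as assumptions rather than measured values.

    #include <cstdio>

    // Commonly cited per-axis DLSS scale factors (defaults; titles can override).
    struct DlssMode { const char* name; double scale; };

    int main() {
        const DlssMode modes[] = {
            {"Quality",           2.0 / 3.0},  // ~66.7% per axis
            {"Balanced",          0.58},
            {"Performance",       0.50},
            {"Ultra Performance", 1.0 / 3.0},
        };
        const int outW = 3840, outH = 2160;  // 4K output

        for (const auto& m : modes) {
            const int w = static_cast<int>(outW * m.scale);
            const int h = static_cast<int>(outH * m.scale);
            std::printf("%-18s internal render: %dx%d\n", m.name, w, h);
        }
        return 0;
    }

At 4K output, Quality mode renders 2560x1440 internally versus Performance mode’s 1920x1080, roughly 78% more pixels per frame, which is why the numbers NVIDIA shared would drop noticeably at Quality.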
Although Daedalic has not provided us with a review code, we’ll be sure to test and benchmark the game once it releases.
Stay tuned for more!

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers SNES to be one of the best consoles. Still, the PC platform won him over. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC graphics cards.”

So these devs are now literally just working with Nvidia to show off how good DLSS is instead of optimizing their games? God damn.
Highly plausible, as “suddenly” every game seems unoptimized without much of a visual or physics upgrade. Well, nothing is impossible in the corrupt AAA gaming industry ruled by corporations.
Yeah, but this one seems particularly in-your-face. Reviewers really need to update their 40 series reviews with games that launched this year, because based on older games, those reviews are already looking outdated and misleading.
AMD shovelware cards with zero innovation but lots of VRAM are making devs lazy. They just send everything to the VRAM pool instead of properly optimizing.
Yeah, the rasterized performance gains on the 30 series in particular were disappointing, but this game just doesn’t look good enough to put even the 30 series to shame like this without DLSS.
VRAM is overrated; it’s the only edge AMD has over Nvidia, and the AMD diehards are gobbling it up.
A 4070 Ti with 12GB of VRAM is on par with a 3090, which has double the VRAM. Really activates your almonds. What’s going on here?
Anyway, AMD sucks.
And with a jumping simulator no doubt.
I’ve been playing Days Gone lately. It’s an early UE4-based game (4.26 or 4.11), and it looks very good, on par with today’s titles. There are no shader compilation issues or weird stutterings, no DLSS or other ‘methods’, and the game has a nice volumetric cloud system, better than anything the UE5 tech demos have shown so far. And here’s the thing: 4K maxed on a 4090 is a constant 120fps+, with no magical drops below 100. That said, do the math.
I’m planning to play Days Gone soon…
The PC version was not that great on Day 1, but it was not a mess.
Still, the game has fluctuating game speed during gameplay.
4.13*
Check out the dismemberment mods and restored content (P90 and auto shotgun). Really had a blast with those.
I played Days Gone and it did look good, but only good for an old title. It has all these problems with screen space reflections, flat-looking vistas, and issues with light bleed and shadows. Imagine it had a remaster, something like CP2077’s Overdrive patch. Now that would look good.
First: there is no ray tracing in any game. People need to stop believing in this horse sh*t; it’s just half-baked fake crap.
Second: is DLSS supposed to cover for crappily written code?
Not only are incompetence and lackluster development commonplace in games these days, I still can’t believe the BS that DLSS 3 can’t run on RTX 30 series cards (at least the high-end ones)!
It can, yet it won’t. It’s called Huang magic.
You wrote it quite realistically.
I won’t go AMD (at least on the GPU side), but it’ll be at least 10 years before I ever do another PC upgrade. PC gaming is in such a mess right now. Even John, with his mighty enthusiast CPU and enthusiast GPU, is telling people to play Jedi at 30fps. What a sh*t show.
Nyah, I only showcased that at 30fps there is no traversal stutter at all. I personally finished Jedi Survivor with an unlocked framerate. The reports of the traversal stutters are exaggerated. They are there, but they are not game-breaking on a high-end CPU.
Yeah, the base at Koboh is the only place in Jedi Survivor where things really go bad; everywhere else the stutter was not that big a deal. But the stutter in Koboh was enough to put me in a foul mood every time I went back.
Aren’t the traversal stutters more to do with CPU and RAM speed (rather than GPU)?
By limiting your framerate to a lower value, you give enough time to the CPU to handle the streaming data without causing a stutter. This is a workaround and not how things are supposed to work, especially given the raw power of current PC systems.
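To make that concrete, here is a minimal sketch of the idea, purely illustrative: the 30fps cap and the fake 12ms workload are made-up numbers, and no shipping engine is structured this simply. The point is just that a frame cap leaves per-frame slack the CPU can spend on streaming instead of immediately starting the next frame.

    #include <chrono>
    #include <cstdio>
    #include <thread>

    int main() {
        using clock = std::chrono::steady_clock;
        const auto frame_budget = std::chrono::milliseconds(33);  // ~30fps cap

        auto next_frame = clock::now();
        for (int frame = 0; frame < 5; ++frame) {
            const auto start = clock::now();

            // Pretend simulation + render submission takes ~12ms this frame.
            std::this_thread::sleep_for(std::chrono::milliseconds(12));

            const auto work = std::chrono::duration_cast<std::chrono::milliseconds>(
                clock::now() - start);
            // The leftover ~21ms is headroom that asset-streaming threads can
            // use without pushing any single frame over its budget.
            std::printf("frame %d: work %lld ms, headroom %lld ms\n", frame,
                        static_cast<long long>(work.count()),
                        static_cast<long long>((frame_budget - work).count()));

            next_frame += frame_budget;
            std::this_thread::sleep_until(next_frame);  // the actual cap
        }
        return 0;
    }

Uncapped, the main thread starts the next frame immediately, so a heavy streaming burst has to compete with frame work and shows up as a spike; that is the traversal stutter being described.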
Ah good to hear you played it at an unlocked framerate.
My PC wasn’t the best in 2004, nor was it the best in 2007. But I never opted to play Half Life 2 or Crysis at lower than 60fps. That’d have been a true shame.
Frame Generation can run on Ampere cards, but they don’t have the hardware to let it run well, so Ampere is left out. It’s like saying you can’t believe you need at least an 8GB GPU when you are trying to run games on a 2GB GPU with all the issues that brings.
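To make the hardware argument concrete, below is a toy sketch of what optical-flow frame interpolation involves. This is not NVIDIA’s algorithm; the 1D “frames” and the already-known motion field are contrived assumptions. The takeaway is that generating a frame needs a per-pixel motion estimate plus a warp, and the motion-estimation half is what Ada’s dedicated optical flow unit accelerates; doing it in shaders on Ampere is the claimed bottleneck.

    #include <array>
    #include <cstdio>

    constexpr int W = 8;
    using Frame = std::array<float, W>;  // toy 1D "frame" of brightness values

    // Warp frame A halfway along a (contrived, already-known) motion field to
    // synthesize the in-between frame. Real frame generation must *estimate*
    // the motion per pixel first, which is the expensive part.
    Frame interpolate_midpoint(const Frame& a, const std::array<int, W>& flow_x) {
        Frame mid{};
        for (int x = 0; x < W; ++x) {
            int src = x - flow_x[x] / 2;  // step halfway back along the flow
            if (src < 0) src = 0;
            if (src >= W) src = W - 1;
            mid[x] = a[src];
        }
        return mid;
    }

    int main() {
        Frame a{};
        a[2] = 1.0f;                    // bright pixel at x = 2 in frame A
        std::array<int, W> flow{};
        flow.fill(2);                   // it moves +2 px by frame B (x = 4)

        const Frame mid = interpolate_midpoint(a, flow);
        for (float v : mid) std::printf("%.0f ", v);  // pixel lands at x = 3
        std::printf("\n");
        return 0;
    }

The real thing runs on full-resolution 2D frames, has to estimate the flow rather than being handed it, and has to handle occlusions; whether Ampere genuinely can’t do that fast enough is NVIDIA’s claim, not something a sketch can settle.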
The Lag of The Rings: Unoptimized
Smeagol is the part where the devs still cared; Gollum is the part where they just go: f*k it, give us your money. In the end, Gollum took over.
https://media0.giphy.com/media/JQ3XW6tIzEJYHyc6w0/giphy-downsized-small.mp4
4K is a meme for idiots.
And Gollum was probably developed by baboons, it’s so unoptimized.
Have fun at 720p then, since you’re so clever.
2K chad reporting in.
Gaming (especially PC) has been in a bad state for the last 2~3 years.
I would argue for much longer.
The Hobbit movies ran at 48fps, so maybe it’s not an unoptimized pile of sh1te that can’t hit 60fps on a 2-grand GPU, but a true cinematic experience.
The Hobbit movies were proof that more frames isn’t always a good thing. It absolutely destroyed the VFX scenes.
The hobba.
Devs just gave up on optimizing a long time ago. The 4090 was supposed to be a card that would destroy all games this generation by a landslide, and it’s already struggling without the glorified “Tru Motion” BS. I won’t subsidize these hacks anymore. I’ll stick to old games and mods.
Watch this game run at 4K30 on the PS5, as I am sure they will put in the work on that version.
Most people don’t game at 4K with ray tracing, though. I’m interested to know how the game runs at 1440p Ultra with no ray tracing, so I know what to expect when I pick it up for $10 in 2 months.
I’d rather run 1920p downscaled to 1440p on sensible settings (i.e., mostly High); it’ll likely look nicer and run just as well.
Are people just crying over not having a modern GPU?
MINIMUM:
Requires a 64-bit processor and operating system
OS: Windows 10/11
Processor: Intel Core i5-4690 / AMD Ryzen 3 1300X
Memory: 8 GB RAM
Graphics: Nvidia GTX 1060, 6GB / AMD Radeon R9 290X, 4GB
DirectX: Version 12
Storage: 45 GB available space
Additional Notes: at Low preset and 1080p, Ray Tracing off
I could understand the tears if you were getting sub 60 FPS at 1080p with a 4090.
“With such fast frame rates on many GPUs […]”
-Nvidia
Ahah HAHAHAHAHA https://media3.giphy.com/media/D6XJsMA2cfIdy/giphy-downsized-small.mp4
lol DLSS 3 is still a really sh**ty excuse by devs to stop optimizing their games. Have fun with fake frames I guess xD
Nothing graphically impressive here to warrant such low fps on a 4090.
Aaaand it still looks like sh*t.
Ray Tracing is just a gimmick no one needs.