
The Lord of the Rings: Gollum suffers from major stuttering issues, has insane VRAM requirements in 4K

Daedalic Entertainment has released The Lord of the Rings: Gollum on PC, and things are not looking particularly good. Unfortunately, this is another Unreal Engine 4 game that suffers from major stuttering issues on PC. Not only that, but The Lord of the Rings: Gollum also has insane VRAM requirements in 4K.

For our initial tests, we used an AMD Ryzen 9 7950X3D, 32GB of DDR5 at 6000MHz, and NVIDIA’s RTX 4090. We also used Windows 10 64-bit and the GeForce 532.03 WHQL driver.

There isn’t any shader compilation procedure upon launching the game. As a result, it appears to suffer from both shader compilation and traversal stutters. You can get stutters even when simply moving the camera. Below you can find a screenshot that showcases these stutters; pay attention to our frametime graph, which clearly captures them.

[Screenshot: in-game frametime graph showing the stutters]
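If you want to quantify these spikes rather than just eyeball a graph, below is a minimal sketch that flags outlier frames in a frametime log. It assumes a PresentMon-style CSV with an MsBetweenPresents column; the filename and the “2.5x the median” threshold are our own illustrative choices, not something any of these tools prescribe.

```python
# Minimal stutter detector for a frametime log (illustrative sketch).
# Assumes a CSV with a frametime-in-milliseconds column, such as the
# MsBetweenPresents column written by PresentMon-style capture tools.
import csv
import statistics

def find_stutters(path, column="MsBetweenPresents", factor=2.5):
    """Return (frame_index, frametime_ms) pairs spiking above
    factor * median frametime -- a rough stutter heuristic."""
    with open(path, newline="") as f:
        frametimes = [float(row[column]) for row in csv.DictReader(f)]
    median = statistics.median(frametimes)
    return [(i, ft) for i, ft in enumerate(frametimes) if ft > factor * median]

spikes = find_stutters("gollum_frametimes.csv")  # hypothetical capture file
print(f"{len(spikes)} frames above 2.5x the median frametime")
for i, ft in spikes[:10]:
    print(f"frame {i}: {ft:.1f} ms")
```

As a rule of thumb, shader compilation stutters show up as one-off spikes the first time an effect or area appears, while traversal stutters recur in the same spots no matter how often you revisit them.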

It’s also worth noting that the game only uses two CPU cores/threads. Even with Ray Tracing enabled, however, we were GPU-bound and did not witness any CPU limitation/bottleneck. The two main CPU threads were not maxed out, so in theory, even those with older CPUs will be able to run the game.
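If you want to sanity-check this on your own system, a quick way is to sample per-core load while the game runs; if no single core is pinned near 100%, a CPU bottleneck is unlikely. Here is a minimal sketch using the psutil package (the sample count and the 95% threshold are arbitrary choices of ours):

```python
# Sample per-core CPU load while the game is running (sketch).
# Requires: pip install psutil
import psutil

for _ in range(10):  # ten one-second samples
    per_core = psutil.cpu_percent(interval=1, percpu=True)
    busiest = max(per_core)
    print(f"busiest core: {busiest:.0f}%  all cores: {per_core}")
    if busiest > 95:
        print("-> a core is close to maxed out; a CPU limit is plausible")
```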

As the title suggests, though, the game has some insane VRAM requirements, especially when gaming at 4K/Epic settings. With Texture Streaming on and without Ray Tracing, the game could use 14-15GB of VRAM. Enabling Ray Tracing increased its VRAM requirements to 16-18GB.

[Screenshot: VRAM usage at 4K/Epic settings]

Things get even worse when you disable Texture Streaming. Disabling it may let you avoid low-res textures, as the game will load most textures into your GPU’s VRAM. Without Ray Tracing at 4K/Epic, the game can then use between 16-19GB of VRAM. With Ray Tracing, we saw it using between 20-22GB of VRAM.

Since Texture Streaming can significantly reduce VRAM usage, we suggest keeping this setting enabled at all times. Still, even with it enabled, owners of GPUs with less than 16GB of VRAM will have trouble running the game at native 4K/Epic settings.

For those wondering, DLSS 3 Frame Generation WILL NOT decrease your VRAM usage, as the game is still rendered at native 4K. So, if you are VRAM-limited, you won’t see any performance benefit at all from DLSS 3 Frame Generation. The solution here is DLSS Super Resolution. By rendering the game at a lower internal resolution, you can decrease your overall VRAM usage. Common sense, right? Then, and only then, should you use DLSS 3 Frame Generation. Once you’ve eliminated your VRAM bottleneck, DLSS 3 Frame Generation will improve your overall performance. Again, common sense.
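For a sense of the pixel counts involved, here is a back-of-the-envelope sketch. The per-axis scale factors below are the commonly cited DLSS Super Resolution presets, not game-specific data, and actual VRAM savings are smaller than the raw pixel ratio suggests, since textures and many buffers do not scale with resolution.

```python
# Internal render resolutions for 3840x2160 output at the commonly
# cited DLSS Super Resolution presets (per-axis scale factors), and
# the resulting share of native pixels actually rendered each frame.
OUTPUT_W, OUTPUT_H = 3840, 2160
PRESETS = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

native = OUTPUT_W * OUTPUT_H
for name, s in PRESETS.items():
    w, h = round(OUTPUT_W * s), round(OUTPUT_H * s)
    print(f"{name:>17}: {w}x{h} ({w * h / native:.0%} of native pixels)")
```

At the Quality preset, for example, the game renders 2560x1440 internally, roughly 44% of the pixels of native 4K, which is why resolution-dependent buffers shrink accordingly.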

We’ll have more to say about The Lord of the Rings: Gollum in our upcoming PC Performance Analysis article. As we said, the game can use a lot of VRAM, especially in 4K, so make sure to monitor it via tools like RivaTuner.
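If you’d rather log VRAM usage from a script than from an overlay, NVIDIA’s NVML library exposes the same counter that tools like RivaTuner display. Below is a minimal polling sketch via the pynvml bindings; keep in mind that NVML reports device-wide allocated memory, which is not the same thing as what the game strictly needs.

```python
# Poll total VRAM usage once per second via NVML (sketch).
# Requires: pip install nvidia-ml-py (imported as pynvml)
import time
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM used: {mem.used / 1024**3:.1f} / {mem.total / 1024**3:.1f} GB")
        time.sleep(1.0)
finally:
    pynvml.nvmlShutdown()
```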

Stay tuned for more!

51 thoughts on “The Lord of the Rings: Gollum suffers from major stuttering issues, has insane VRAM requirements in 4K”

    1. Japanese developers “work hard” weeaboo?

      90% of so-called Japanese games are made in India. If Elden Ring had been made by Japanese workers instead of Indian ones, it would have come out in 2030.

      Japanese workers have the world’s lowest productivity, they still use fax machines and Yahoo is Japan’s most popular site. They’re too lazy to figure out how a PC works.

      https://uploads.disquscdn.com/images/d99e522b66935d9f2f548918da8b6d9bbe82a067f7b04f8aac3af0c7593f35db.jpg

        1. Why does Japan have low productivity workers?

          1) Because they don’t work. While Japanese workers spend a lot of time behind their desks, they do little actual work.

          2) Because it is almost impossible to legally fire anyone in Japan, so whether you are productive or not has no effect on your job. Most Japanese workers have only ever worked at one company and never leave; they know their employer can’t fire them.

          3) Because Japan is technologically in the stone age. They don’t know how to use a PC. They print and scan everything, and almost no one can touch-type. Almost no one in Japan bothers to properly learn another language, so they need translators for everything. All of this incompetence results in low productivity.

          4) Because many Japanese women don’t think they should work, so they don’t. While the productivity of Japanese men is abysmal, the productivity of women in Japan is off-the-charts abysmal compared to any other country.

        2. They have one of the world’s leading economies, with a smaller population and fewer natural resources than the US. They need to let us in on their secret to getting so much done with so little work.

        1. About 40 companies were involved in making Tears of the Kingdom. The largest Japanese company besides Nintendo was Black Beard Design Studio, a Japanese outsourcing firm that farms out development for Japanese companies to other countries. These are just some of them:

          AnimationCafe, India
          Astro Production, Malaysia
          Black Beard Design Studio, Japan/China
          CGCG STUDIO, China
          PTW International, UK
          Imageworks Studio, Canada
          PTW, Singapore
          Polycraft Products, US
          Q’tron Inc., US

    2. …and then they mostly release half-broken games about half-nude, armored medieval underage girls and dudes fighting giant monsters with giant swords and sparkly magic.
      Weeb.

  1. This game looks like it came from the mid-2000s in terms of PC graphics, with annoying missions and gameplay. There is nothing special about it to make me buy it for $60 on PS5. How the hell did Sony approve this game for the PS5 console? Its graphics make Life of Black Tiger on PS4 look like a masterpiece. Games that don’t take advantage of new technology should not exist on current-gen consoles.

  2. OK, this trend of games being broken at release has to stop, and rather soon, or it will seriously dwindle the gaming industry, especially on PC.
    Like we didn’t have enough with Nvidia already.

    1. I’ve got no complaints about Nvidia. My Nvidia stock blew up yesterday: my $122,000 investment from mid-October is now worth $380,000. Nvidia didn’t just hit a home run Wednesday with their quarterly earnings report and future guidance, they hit a freaking grand slam, and they are just shy of the rare trillion-dollar market cap.

      1. In America we say “A fool and his money are soon parted”

        Granted we do have a lot of fools here ……

      2. Well, Greece being a sh**ty country, always depending on Germany’s euros, must have a lot of foxes.

    1. Depends on the monitor size. It’s stupid on anything less than 32″, because you are just throwing money and performance away for essentially nothing but bragging rights. Unless, of course, you are a reviewer; then you do need to be able to review games in 4K.

      What kills me is people complaining about a $1,000 GPU and then plunking down $1,000 on a 4K monitor.

  3. VRAM issues with a game that looks like it’s from 2015? Then it has the boring Havok climbing physics, meaning you can only climb things that have the “magic paint” on them. It still amazes me that Ubisoft, of all people, are the only ones who have figured out climbing physics that don’t use “magic paint” and aren’t severely limited.

    1. It seems correct to use VRAM if it’s available. The reviewer has no idea what he’s talking about. It’s not bad to cache textures; that’s what VRAM should be for.

  4. Allocated != required, at least not all the time. Some games/engines actually allocate VRAM/RAM smartly, meaning that if it’s available, it will be used, so the game won’t have to stream in/load as much.

    And that’s very good when VRAM is utilized that way, rather than sitting idle while the game keeps loading/streaming in the same data, which could have fit in VRAM, over and over again.

    I hope the base requirements are way lower than what was used here, though; it seems quite excessive if that much is actually required.

    As for the shader stutter: they should place the coders (from both Epic and the game devs) behind a firing squad at this point. It’s a well-known performance issue that can be solved with a few cleverly written lines of code; there’s no excuse not to do it properly.

  5. Guys, this game looks like Vaseline, blurry as hell. The sharpening is also broken, as the image becomes way too sharp. This game really does look like a PS3/Xbox 360 title; it is that BAD! Not sure how you need such high-end hardware to run something like this. Absolutely shocking. Aside from all the bad reviews, the game itself isn’t that bad and has a decent atmosphere, but it is completely linear and the gameplay is broken as hell. In terms of performance, it stutters like you won’t believe, and the transition from 60+ fps to 30fps when cutscenes play out just ruins the experience. Clearly, this game could have been so much better and needed much more development time. A rushed release of an incomplete game.

    1. One YouTube reviewer said he had some cutscenes capped at 30 FPS, others capped at 120 FPS and still others that had no cap at all. He also said it looked like a game developed in a computer class by high school or college students.

  6. Good thing it ran poorly on the 6800 XT too. So much for 16GB becoming the new 8GB problem, when games are now going to need more than 20GB to run.

    1. This has nothing to do with the amount of VRAM. It’s coded with their butts and fugly as hell. Don’t fall into the VRAM trap!

  7. I wouldn’t even call this shït a game. I mean, playing as Gollum is already stüpid enough, and then it comes out looking like a PS3 game with tons of issues? Fück off!

  8. It seems UE4 should be renamed Stutter Engine 4. Epic seems unable to fix the stutter in the engine itself and instead expects the coders to do their work (LMAO, like that will happen, aside from the few who actually know how to do things right)... So to me, UE4 games nowadays are mostly just a trademark for something that will likely run like crap.

  9. And as always, no one talks about the story and the gameplay. Nowadays it’s all about the graphics and how the game runs on my useless high-end system... In the past, high-quality video games ran on almost any PC; now, with top-of-the-line hardware, these stupid s#!ty gaymes run like a*s.
    Is the “PC Master Race” still a thing?!

    1. Who cares about the gameplay or the story if you can’t even play the damn game, because it’s ugly as f*k and stutters like hell? Graphics are a big part of the gaming experience, and as technology and power progress in GPUs/CPUs/RAM etc., gamers have the right to expect a well-optimized, beautiful game that takes advantage of the raw power of their machines. And yes, love it or not, the PC is still the best platform to experience games at their full potential. Sadly, developers and publishers don’t care anymore about the quality of their games. All they want is to put as much money as possible into their own pockets and the pockets of the shareholders, as quickly as possible, helped by the idiots that fall into the “day one”, “early access”, “pre-order” and monetization traps. Money IS the key, and wallets are our weapons against those scamming companies.

      1. So why did you downvote me?! I totally agree with you, but it looks like you didn’t understand my comment. My comment is not about this BS game; I’m talking in general.

  10. https://www.youtube.com/watch?v=Yhxs3CNsXJE
    And another “master professional”, running his 4090 at 50% on a PCIe 3.0 motherboard. Why do you think that having DDR5 RAM means you also have PCIe 5.0? I see you bought the lowest-cost board, so PCIe 3.0, probably a “Lite” model. Do you need a lesson on how to run a GPU at full speed?
    OK, back to school then. Every motherboard has two PCIe parameters: one for the processor and one for the GPU. In your case, you got PCIe 5.0 for the processor but PCIe 3.0 for the GPU, so all you can use is 8k TPS, while 30- and 40-series cards need 14-21k TPS, so you NEED at least PCIe 4.0 to run them at full speed. You bought a GPU that costs €2,600 and you’re running it at the speed of a €600 3070. In your case, full speed is what a 1660 uses, and a 2080 doubles that, because a 4K screen can use double the PCIe speed: bigger pictures need more resources to render and more speed to load those resources. And on your PCIe 3.0 you’re trying not to double that speed but to quadruple it... Dude, please buy an “Extreme” motherboard instead of a “Lite” one; you’re embarrassing yourself.

    1. Did you test the supposedly huge in-game difference between PCIe 3.0 and PCIe 4.0 yourself, or are you just buying into the bullshit from motherboard/CPU manufacturers? In practice, the GPU benefit of PCIe 4.0 over 3.0 is minor, less than 10%. And yes, DDR5’s higher frequencies are a big step up from DDR4, but in gaming scenarios you’ll gain about 10 frames at most; it isn’t worth the money. Game engines and APIs need to be coded in an all-new way to take advantage of these new technologies, and the way the gaming industry is going right now, the coders are apparently working with their butts. Plus, only a tiny fraction of people really game at 4K; 1080p is still the most popular resolution out there. 4K is a trap to make you buy an expensive GPU/CPU/motherboard/monitor. For me, as long as I get at least 60fps at 1440p in every game I own, I’m happy.
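For reference, the theoretical link bandwidth behind this exchange is easy to work out from the spec transfer rates; this is a quick sketch of the spec maximums, and real game traffic rarely saturates an x16 link, which is why measured in-game differences tend to be small.

```python
# Theoretical PCIe x16 bandwidth per generation (spec values).
# Gen 3 and later links use 128b/130b encoding.
LANES = 16
GENS = {3: 8.0, 4: 16.0, 5: 32.0}  # GT/s per lane

for gen, gts in GENS.items():
    gbs = gts * (128 / 130) * LANES / 8  # GT/s -> GB/s over 16 lanes
    print(f"PCIe {gen}.0 x16: {gbs:.2f} GB/s")
```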

  11. This is my #2 GOTY for 2023.
    I don’t know what people want, but the game is fun and money well spent; it runs at 4K/120 on my 4090.
    It’s unique and so much fun, just give it a chance.
