While our PC Performance Analysis has been slightly delayed, below you can find a new comparison between the High and the Ultra textures for Middle-earth: Shadow of Mordor. As we can see, there are barely any differences between them, suggesting that the Ultra textures are basically just uncompressed versions of the High ones. There are slight differences, but for the most part they look similar.
Monolith claimed that 3GB of VRAM is required for High settings, however that was nothing more than an exaggeration. Our GTX 690 was able to run the game at a constant 70-75fps at 1080p with Ultra Details and High Textures (thanks to a custom SLI profile).
On the other hand, the game was simply unplayable when we enabled Ultra textures. There was severe stuttering and the overall gaming experience was as bad as it could get. This also means that the game was loading those uncompressed textures when we selected them (otherwise there wouldn’t be any performance difference at all between High and Ultra textures).
As always, High textures are on the left, whereas Ultra textures are on the right. As you can see, the difference between them is really minimal. Do note that we had to re-run our playthroughs in order to offer proper comparisons, as the game alters its time of day (TOD) whenever you restart it. Also note that PCGamesHardware’s previous comparison had different TODs (therefore, additional detail due to shaders could be an issue in that comparison).
Furthermore, it appears that Eurogamer is reporting the very same thing; that the difference between the Ultra and the High textures is really, really minimal. Hell, in their article you can only spot the differences by using a magnifying glass.
Enjoy!

John is the founder and Editor-in-Chief at DSOGaming. He is a PC gaming fan and strongly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved, and still does, the 16-bit consoles, and considers the SNES to be one of the best consoles ever. Still, the PC platform won him over consoles. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher-degree thesis on “The Evolution of PC Graphics Cards.”
Besides the whole texture controversy, the majority are saying that the PC version is pretty good and well optimized. It seems that Monolith did a good job. I’m glad that high textures work fine on 2GB GPUs. Here I thought I was going to have to use medium textures on my 270X 🙂
I very much doubt these people with only 2GB of VRAM can actually play the game without issues (stutter, lag, etc.), since the textures on High use almost 3GB of VRAM.
Did you not read the article then?
“Monolith claimed that 3GB of VRAM is required for High settings, however that was nothing more than an exaggeration. Our GTX 690 was able to run the game at a constant 70-75fps at 1080p with Ultra Details and High Textures”
Yes, but judging from that test on NeoGAF, the game uses 2,900MB of VRAM on High settings.
speak for yourself, son. my 680 runs it very well at max with Ultra textures. the performance hitters here are the shadows on Ultra and the motion blur on Ultra. disable motion blur and put shadows on High and watch the magic happen. never dips below 45fps.
Come on guys, we were warned when Watch Dogs used 3.8GB of VRAM. The textures are really nice on Ultra; it’s just that the game stuttered even within VRAM limits. Look now: Watch Dogs’ Ultra textures are indeed worth it and look rather better than High, but require 4GB VRAM cards. It’s not like 4GB of VRAM is new; you can get it on non-reference cards, cheap R9 280s have 3GB as standard, and R9 290s have 4GB as standard and are only £250 as of now. Non-reference NVIDIA GTX 760s come in 4GB versions as well.
John, can you take close-up shots of the textures and compare them? Because from a distance you can’t see the detail very well.
Yeah, but the majority of PC gamers today have cards with only 2GB of VRAM, and many of those 2GB GPUs are pretty good, such as the 660, 670, 7870, etc. Hell, according to the Steam survey, most PC gamers still use 1GB cards. And at least the texture quality will be scalable, so it’s not a must to get 4GB+ GPUs.
Watch Dogs is not worth what it asks for, mate.
In my view it is; this game isn’t, because of the lack of variation.
Performance is great on my 2GB card also. With 1080p Ultra details and HIGH textures, there’s no stuttering at all, 50-65fps. If I use FXAA and downsampling from 1440p, image quality is great because there’s less aliasing (and there’s still no stuttering, and good fps, 40+). I have an SSD, so maybe that helps?
With Ultra textures my performance drops (30-45fps), and indeed there’s stuttering. But I tried a 30fps lock, and it helped somewhat with that issue. I played like that without major problems for 15 minutes (there are still games on PS3 that run worse :D). But because Ultra textures look similar to High, I prefer to play with downsampling at High textures and better fps.
Your card is?
GTX 680 OC (770 performance)
this game even at “ULTRA” looks worse than The Witcher 2, and The Witcher 2 is a 5 YEAR OLD GAME. and performance-wise, The Witcher 2 runs at 1080p/60fps even on a Radeon 7970.
welcome to next-gen games….
Witcher 2 was something special. man, those textures were great and still are, and none of them look the same or copy/pasted like in other games.
I still believe W3 will have better textures than Mordor, and will run with lower VRAM requirements. I can even bet on it.
LOL!! Witcher games have a tendency of FORCING you to upgrade your rig. It’s a little something called progress.
The Witcher 2 was launched in May 2011, so it’s 3 years old.
no, The Witcher 3 is coming Feb 2015 😀, just joking
Sry, I mistyped xD
As much as I love The Witcher 2 and its visuals, there are a few points.
1. It’s not open world.
2. It has some bad streaming issues when loading areas.
3. It has bad LOD issues.
4. It was very buggy when it came out.
5. Tons of patches.
6. Poor draw distance.
7. Limited movement: no jump, and you can only climb where it says you can (though there are plenty of paths).
8. Invisible walls at the edges of water.
I doubt that requirements for Witcher 2 would have been higher.
Separated zones and limited movement are more of a design decision. Open world games are hard to make, and Witcher 2 was even buggier than the first game on release. Now imagine all those bugs in an open world.
Well, a dynamically generated world can give you more bugs; I prefer a hand-crafted world like Risen 3’s. Did you ever see the grass in Far Cry 3? It looks like it was just dynamically generated with no care at all on hills; it looks dreadful, with no variation.
Modern open world games usually aren’t dynamically generated (unless they’re procedural), and that’s not the problem. The problem is in the open world itself. You need to keep in mind a lot of things, things that do not matter when you make a game with linear or completely separated levels. These things can lead to a train of new problems.
Yeah, but the problem is that these so-called bugs have nothing to do with the game being a huge open world; the bugs from previous games that never got fixed are still simple bugs, like dragons flying backwards. 🙂
It’s not an excuse; community patches often fix bugs in games that the devs just didn’t fix. I’ve seen very few bugs in Risen 3 because they fixed issues from the previous game and took time to test it in their hand-crafted world.
I suppose the bigger a game, the more manpower you need. Judging from HLTB, Risen 3 is even shorter than Gothic 2: Gold Edition, lasts almost as long as Witcher 2, and is six times shorter than Skyrim: Legendary Edition. And as far as I understand, the world still consists of separate islands, which means no open world.
I agree with you; some of the bugs, like no support for more than 2GB of RAM, are ridiculous BS, and that has nothing to do with open world. But CDPR followed the path of BioWare (linear design, no jumps, etc.); they had no experience with open world at all. Only after Skyrim did they decide to try it, and I still have doubts about its quality in Witcher 3.
As for Piranha Bytes, I’m more interested in the new project of Michael Hoge (the guy who was responsible for Gothic 1-3 and Risen 1, a co-founder of PB, and the face of the Nameless One in the Gothic series). He left Piranha some time ago (as did many old-timers) and is working on a new project.
facepalm.
still was GOTY 🙂
and also, would you agree that those High textures requiring 3GB need double the memory for the Ultra preset, while there is barely any difference? very little.
Seems to me we want ultra because it’s ultra.
That’s like saying that Assassin’s Creed should be more demanding than The Witcher, when all the way up to AC3 that was not the case.
Middle-earth has very simplistic environments.
Yeah, looks like a lot of open space with broken walls, all looks pretty boring.
“even”?
A 7970 should comfortably max any game from 3+ years ago, don’t you think? It’s an upper midrange card, bordering on high-end.
also, by the looks of it this game has simplistic environments; all the textures are the same.
Dude, the draw distance & number of enemies on-screen are the largest factors determining its system requirements >_<
What do you think, guys? Witcher 2 at 4K (taken from a YouTube 4K video) doesn’t need 6GB of VRAM for lovely textures.
http://i.imgur.com/t3OjP10.jpg
http://i.imgur.com/0qSOGeC.jpg
http://i.imgur.com/c0ADJKM.jpg
DAT A*S
You have to play at 1440p or higher to see the Ultra textures. At 1080p you can’t see the pixels… duhhh
At 3440x1440 a GTX 780 Ti can’t handle it. Neither can a GTX 980 with its 4GB of RAM; it’s a stutter fest. Only my Titan Black handles it, with VRAM maxing out at 4.8GB. And yes, the textures are awesome and the visuals incredible. But the port is awful.
What a F*CKN ugly turd. Console to PC, and then they boast about yada yada 6GB VRAM bla bla bla, just to make it sound like the game will push PCs and look fantastic!
Pathetic crap!!
And still PCs are on the rise, but developers don’t seem to care. To hell with those who don’t give out a proper PC version like we deserve.
F*CKN console boys are to blame for buying into those consoles!
wait, what? you can play it with High textures on 2GB? gotta try that out now.
nobody here is talking about how the game is, how it plays, how the Nemesis stuff works, etc. As for how it looks, it looks better than I expected. Only when you are in daylight and it’s clear can you see how the first zone is actually laid out and get a sense of just how the region in front of the Black Gate is. I was pleasantly surprised, because at night or in rain the environment doesn’t make that much sense; it really comes together when you see the big picture.
Facial animations are great, the lip sync is perfect, and no two orcs look the same, and that for me is huge. It really is cool to kill captains and watch how new ones are generated, with no one looking the same. I can’t understand why technology such as this, which makes NPCs different, is not more common; if I think about it, only Kingpin on PC tried to do something like this. What makes the game cool for me is the combat brutality and the Nemesis system with its many different orcs.
I was playing it on High and switched to Ultra: almost unplayable, choppy as hell and not much better looking. Also, tessellation kinda ruins the orcs’ faces.
Well, DSO does performance analyses; you can check TB for that.
The LithTech Jupiter EX engine is old anyway, and it never really looked that special in the previous games that used it.
The main character looks like Bradley Cooper.
Aside from the decent character model, all of these screenshots look absolutely hideous, and I can barely tell a difference in texture quality. Looks to me like another case of an “ultra” feature being more placebo and marketing than anything else. Arkham Creed of Mordor looks like a last-gen console game with bad art direction, and it would still look like that if you doubled the texture resolution once more.
Please tell me that Shadow of Mordor uses more than four CPU cores. It would be nice to have a reason to go hexa-core or octa-core.
marvelous