Middle-earth: Shadow Of Mordor is almost upon us and YouTube member ‘teoKrazia’ has shared a video showing the in-game benchmark tool that will be featured in it. Moreover, teoKrazia tested Middle-earth: Shadow Of Mordor on an Intel i5 2500K@4.4GHz, with a GeForce GTX 970 4GB, ForceWare 344.11 WHQL drivers and 8GB RAM. This combo ran the benchmark (at 1080p with max settings) with an average framerate of 70fps (minimum framerate was 41fps). Do note that while teoKrazia selected the Ultra textures, this HD Texture Pack has not been made available yet (meaning that those are High and not Ultra textures). Enjoy the video after the jump!
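For context, in-game benchmark tools like this one typically derive the average and minimum framerate figures from per-frame render times. A minimal sketch of that calculation (illustrative only — the function name and frame-time samples are made up, not data from teoKrazia's run):

```python
def fps_stats(frame_times_ms):
    """Convert per-frame render times (milliseconds) to framerate stats."""
    fps_per_frame = [1000.0 / t for t in frame_times_ms]
    average = sum(fps_per_frame) / len(fps_per_frame)
    minimum = min(fps_per_frame)
    return average, minimum

# Hypothetical frame times: one 24.4ms spike registers as a ~41fps minimum
samples = [14.3, 14.3, 24.4, 14.3, 14.3]
avg, low = fps_stats(samples)
```

Note how a single slow frame drags the minimum down sharply while barely moving the average — which is why a one-off loading stutter can make a benchmark's minimum figure misleading.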

John is the founder and Editor-in-Chief at DSOGaming. He is a PC gaming fan and strongly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best consoles ever. Still, the PC platform won him over consoles. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC Graphics Cards.”
This is without the Ultra textures, by the way, since they haven’t been released yet — so even if you select Ultra textures, it will use High.
This can be verified by the settings screenshot which says Ultra will only display if you have the texture pack installed.
Hopefully this game has SLI support because it sounds like running it at 2560×1440 on a single 770 is not going to be enjoyable for me.
Lolx…. My prayers are with you! Though running it at 1440p with a single GTX 770 might be a bad decision!!
P.S.: I’m sure it will be supported — Nvidia will soon release an SLI profile update for this game! So don’t worry about that.
No official SLI support yet, but you can use the FEAR 3 SLI profile (reported to offer good scaling).
Great! But Ultra textures are not included! So this benchmark doesn’t matter for those who want to play at Ultra settings on a GTX 970 with Ultra textures!
ignore the minimum fps, because it’s the initial loading stutter that causes it, like in the Thief 4 benchmark. from what the video shows, 59 is the real minimum
That’s right.
Why ignore anything? 41fps is an acceptable minimum framerate, especially while using G-Sync.
Yeah, G-Sync rules — the best thing to happen EVER!
My ROG Swift can’t wait to play this the way it’s meant to (NVIDIA) be played 😉
You guys haven’t really gamed until you’ve used the Asus ROG Swift, that’s for sure 😉
Worth every penny and more!!
The ROG Swift is nice, but a TN panel + matte AG filter over top makes it upsetting. Considering a 60Hz 4K G-Sync panel from Acer for the same price right now. But that’s also TN… which is upsetting. Need some more IPS displays with G-Sync.
I hate TN panels but that ROG one is much better than anything else. Not IPS but you get 144Hz.
as it’s not actually going to be 41 when you play — it’s loading stutter. i said that so people won’t get misinformed
soo.. i guess this means it’s a good port?
no, it should be around 100fps for this card, since the PS4 does 60fps as well…
PS4 is probably equivalent to 1080p @ medium settings with only fxaa
you would probably be correct, my friend.
Would be great if someone made a comparison between the PS4 version and the PC version on High and Ultra settings. That would help a lot.
ok now i feel better , because im about to get 970
this card just came out — they haven’t even made proper drivers for it yet, since the game was in development before the 970 was released
What definition of 60fps might that be? 🙂
PS4 = capped version…
you sure ps4 version is 60fps ? it drops below 30fps in close encounters
I don’t know, that’s just what I read online.. people were saying 60fps solid on PS4, so I was pretty surprised about only 70 on a 970…
They are going off some stupid GamingBolt report about the devs shooting for 1080p/60fps (why am I not surprised that it’s GamingBolt, bastion of peasant ignorance). WB Games have been secretive about the console resolution, and the footage and screens from the PS4 version I’ve seen look 900p. So I’m assuming it’s a Watch Dogs-style situation — 900p on PS4 and 792p on Xbone — and they don’t want console owners to know.
But the peasants won’t be able to tell, as usual.
The game doesn’t even look great. It looks worse than AC IV, honestly.
so those people :)) of course they say it’s 60fps, because they say that about all the games, even those that are sub-30fps. today or tomorrow Digital Foundry will publish an article about that
”they say it’s 60fps, because they say that about all the games even those games that are sub 30fps.”
Care to explain more? Who said that about what game?
console gamers — to them anything is smooth as butter. silky 60fps until you (someone) prove them wrong, then it becomes cinematic 30fps. btw, why should i explain anything? those people who said it’s 60fps on PS4 should explain how they know it was 60fps. even in IGN’s (or GameSpot’s) review the reviewer said the game has frame rate issues in some areas.
Oh, I thought you meant some dev or similar said a game was 60fps when it was 30fps. But yeah, the new PS4 Infamous ”is 60fps” when in actuality it’s closer to 40-50fps.
It’s been confirmed. PS4 1080P/30FPS, Xbone 900P/30FPS
Console peasants wrong again.
Rather p*ss-poor benchmark, because there is hardly anything happening. I mean, come on, stress the damn thing like Metro: Last Light does.
Agree with you on this, Sean. There’s no real battle scenario in this benchmark, and something like that would tank performance much more for sure.
So if this is the case, my dual ACX 780’s should average 80-90fps at 1440p.
I have the same setup. Hopefully 3GB of VRAM will be enough for 1440p, but I’m afraid it’s not going to be for Ultra.
Ultra at 1440p takes ~5.5GB of VRAM. But… as others have said, the Ultra texture pack hasn’t actually been released. So that may be 5.5GB for just High textures at 1440p. A little ridiculous…
Also, no SLI profile currently. Should be in another week or so. Setting to alternate frame rendering seems to be…problematic. I’ll be at the 100fps cap, get into combat, and for some reason frames drop to as low as 35fps on a GTX Titan OC’d to 1300MHz.
And the game is capped at 100fps. Though I did find the 100fps to be smooth enough as to not bother those with 120Hz/144Hz monitors. Now it’s just a matter of being able to get 100fps…which I can’t do at 1440p with all maxed out settings on a single card (due to lack of SLI profile).
Oh, so he tried Ultra textures after all, and he says that they consume ~3580MB of VRAM while High textures consume ~2889MB.
http://www.neogaf.com/forum/showpost.php?p=132208286&postcount=107
Although, for Ultra textures he says:
Ultra approximate average: 60-70, but with some very bad swaps [1-2 secs of freeze] every, mmh, 40 minutes?
Seems like the VRAM needs to ‘re-cache’ at certain spots.
how was he able to enable ultra if it’s a separate download that isn’t even available yet?
Maybe he has the texture pack? Otherwise Ultra textures wouldn’t consume more VRAM.
Larger texture maps do consume more VRAM. Ultra textures are probably 2048×2048 or higher, meaning they will take up more VRAM than 1024×1024 textures.
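To put rough numbers on that: an uncompressed square texture costs side × side × bytes-per-pixel, plus roughly a third more for its mipmap chain, so doubling the side length quadruples the memory. A back-of-the-envelope sketch (assumes uncompressed RGBA8; real games use block compression like DXT/BC, which shrinks the absolute figures but preserves the 4× ratio):

```python
def texture_bytes(side, bytes_per_pixel=4, with_mipmaps=True):
    """Approximate VRAM cost of one square texture."""
    base = side * side * bytes_per_pixel
    # A full mipmap chain adds roughly 1/3 on top of the base level.
    return base * 4 // 3 if with_mipmaps else base

MIB = 1024 * 1024
high = texture_bytes(1024) / MIB   # "High"-style 1024x1024, ~5.3 MiB each
ultra = texture_bytes(2048) / MIB  # "Ultra"-style 2048x2048, ~21.3 MiB each
```

So each step up in texture resolution quadruples per-texture cost, which is why an HD texture pack can push total VRAM use up so dramatically.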
The settings do say that, regardless of whether you set it to Ultra or not, without the texture pack the textures will still stay at High.
But what about AA? I can’t see any AA options in the video. There is a resolution option though and I wonder if you can go higher than 100%.
My single 760 just started crying.
my EVGA is arriving tomorrow… let’s see what perf/TDP that loving card can do!
My single GTX 650 Ti Boost will melt if I run this game on Ultra at 1080p 🙁