After a lot of controversy surrounding its micro-transactions, Middle-earth: Shadow of War has been released on the PC. The game uses the Denuvo anti-tamper tech and is powered by Monolith's in-house Firebird Engine. As such, it's now time to see how this title performs on the PC.
For this PC Performance Analysis, we used an Intel i7 4930K (overclocked to 4.2GHz) with 8GB of RAM, AMD's Radeon RX 580, NVIDIA's GTX 980 Ti and GTX 690, Windows 10 64-bit, and the latest GeForce and Radeon drivers. Thankfully, NVIDIA has included an SLI profile for this title, which means that those with such systems won't have to mess around with third-party tools in order to enable it.
Monolith has included a good number of graphics settings. PC gamers can adjust the quality of Lighting, Meshes, Anti-Aliasing, Shadows, Texture Filtering, Textures, Ambient Occlusion and Vegetation Range. Players can also enable/disable Motion Blur, Depth of Field, Tessellation and Large Page Mode.
In order to find out how the game performs on a variety of CPUs, we simulated a dual-core and a quad-core CPU. For our CPU tests, we chose a highly populated area of the first chapter and dropped the resolution to 720p; we did this to eliminate any possible GPU limitation in this scenario. And Middle-earth: Shadow of War appears to suffer from CPU scaling issues: only two of our six CPU cores were stressed while running the game.
Due to the game's inability to scale and take advantage of multiple CPU cores, we were CPU limited on pretty much all systems. Despite that, Middle-earth: Shadow of War does not require a high-end CPU for 60fps, provided your CPU has good IPC performance. We've heard complaints from Ryzen owners, and there is a high chance they are CPU limited due to these scaling issues.
Our simulated dual-core system was able to offer an almost enjoyable experience. With Hyper-Threading disabled, it pushed an average of 61fps and a minimum of 50fps. With Hyper-Threading enabled, performance increased and we were able to get a constant 60fps experience on Ultra settings at 1080p. On the other hand, our six-core and our simulated quad-core systems performed similarly, suggesting that the game does not take advantage of more than four CPU cores/threads.
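For readers who want to reproduce this kind of test, a common alternative to disabling cores in the BIOS is pinning the game's process to a subset of logical CPUs. Below is a minimal sketch using Python's psutil; the executable name is hypothetical, and note that affinity pinning only approximates a true lower-core-count machine.

```python
import psutil

def simulate_core_count(process_name: str, cores: int) -> None:
    """Pin all processes matching `process_name` to the first `cores`
    logical CPUs, roughly approximating a machine with fewer cores."""
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == process_name:
            proc.cpu_affinity(list(range(cores)))
            print(f"Pinned PID {proc.pid} to {cores} logical CPU(s)")

# Hypothetical executable name; 4 logical CPUs approximates a
# quad-core CPU without Hyper-Threading.
simulate_core_count("ShadowOfWar.exe", 4)
```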
Regarding its GPU requirements, our AMD Radeon RX 580 was unable to offer a smooth gaming experience on Ultra settings at 1080p. While it was able to run the built-in benchmark with an average of 63fps, the framerate dropped to 40fps in some scenes. Surprisingly enough, there were also framerate drops on our NVIDIA GTX 980 Ti: it pushed an average of 81fps, but we noticed drops to 52fps. Do note that the first two chapters of the game performed better; our GTX 980 Ti ran them with a minimum of 70fps. As such, the benchmark may be a stress test, or it may reflect the performance of later stages.
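As a side note, average and minimum figures like the ones above are easy to derive from a frame-time log. Here is a small sketch, assuming a one-column CSV of frame times in milliseconds (the file layout is our assumption, not something the game exports):

```python
import csv

def fps_stats(path: str) -> tuple[float, float]:
    """Return (average, minimum) FPS from a one-column CSV of
    frame times in milliseconds (assumed layout)."""
    with open(path, newline="") as f:
        frame_times_ms = [float(row[0]) for row in csv.reader(f)]
    fps = [1000.0 / ms for ms in frame_times_ms]
    return sum(fps) / len(fps), min(fps)

avg_fps, min_fps = fps_stats("frametimes.csv")
print(f"Average: {avg_fps:.0f}fps, minimum: {min_fps:.0f}fps")
```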
Middle-earth: Shadow of War comes with six presets: Lowest, Low, Medium, High, Very High and Ultra. Given the number of presets, we were expecting the game to be scalable on older hardware. However, the performance gain between Ultra and Lowest is only 47fps. Not only that, but the game looks horrendous even on Medium settings. So yeah, we were expecting better performance on lower settings than what we're getting.
Monolith has also released a 4K High Resolution Texture Pack. This pack requires more than 8GB of VRAM, so you'd expect a huge difference in quality between High and Ultra textures. However, there is almost no difference between them. Below you can find some comparison screenshots between High (left) and Ultra (right) textures. Can you spot the differences?
Graphics-wise, Middle-earth: Shadow of War looks great. Most characters are highly detailed, the Orcs look great, and the environments are cool. We did notice some pop-in of shadows and objects, but let's not forget that this is a game with some really huge environments; we can't expect LOD values similar to those of games with more linear levels. Still, it would have been great if Monolith had pushed some settings (like LOD, lighting and tessellation) even further on Ultra. So while Middle-earth: Shadow of War is not as visually spectacular as The Witcher 3, it still looks great.
All in all, Middle-earth: Shadow of War performs quite well on the PC. While the game uses the Denuvo anti-tamper tech, we did not notice any stuttering issues (or any other side effects). The game also works great with mouse and keyboard; like most triple-A releases, it allows players to rebind most keys, and there are proper on-screen keyboard indicators. Performance on the lower settings could have been better, and owners of CPUs with low IPC performance may encounter performance issues. There is definitely room for improvement here, no doubt about that. However, we believe that the majority of PC gamers won't encounter many performance issues with it.
Enjoy!

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and strongly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best consoles ever. Still, the PC platform won him over, mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on "The Evolution of PC Graphics Cards."
Contact: Email

Off topic, guys:
Looks like The Evil Within 2 got cracked day one, just like Shadow of Mordor. Checkmate! It's official: DENUVO IS DEAD.
Edit: Looks like Bethesda silently removed Denuvo from The Evil Within 2, lmao.
Both of them are bad ports.
They are by far the worst ports out there.
Those two are OK.
"OK"? It's Destiny 2, and it runs fantastic on pretty much every hardware configuration 🙂
No, Destiny 2 is not "OK", it's a great port.
Ports range from shite: (Dishonored 2, Forza 7, Arkham Knight)
mediocre: (Koei Tecmo games mostly)
OK: (this and The Evil Within 2)
great: (Destiny 2 etc…)
Dishonored 2 with the latest update can run at 60+ fps on Ultra, even at 1440p, with just a GTX 1060/970/480.
That's true; even on my 980Ti it runs great now… but at launch it was a mess.
Bruh, you're only running at 4.5GHz bruh, where is your 5.2GHz OC bruh? You lied again bruh, oh my god bruh.
Forza 7, even without proper multithreading, runs at 60fps, so IMHO it's not a demanding game at all, and putting it next to Dishonored 2 and Arkham Knight is just not right.
Dude, yes, even on my rig it's above 60fps… BUT because I have an old i7 3770K at 4.2GHz, my single-core performance isn't enough and it results in stuttering… and stuttering in a racing game is worse than a 30fps lock.
Actually, your 3770K is above Ryzen 7 single-thread performance, so… idk.
So Ryzen and Ivy Bridge both stutter?
Got a 7700K at 5.2GHz btw, and it's basically the same shat as my old 3770K; minor differences in some specific games.
I was talking about Forza 7, and also Forza 6.
I still don't know how Shadow of War will run on my PC.
Sandy and Ivy Bridge have basically the same IPC. Ryzen's IPC is a good 10-15% above that in several benchmarks, including the Dolphin emulator, which is highly single-threaded.
Ryzen's IPC is between Ivy Bridge and Haswell in latency-sensitive applications and games, while in applications that depend more on throughput than latency, such as encoding, Ryzen is slightly above Haswell's IPC.
I know, but AMD fanboys are crazy.
Show us a screen grab of your 7700K at 5.2GHz, you lying monkey. Make sure your name is in the picture too, you Google picture thief.
Go back to WCCFTech where you belong.
You were banished from WCCFTech for lying about your computer specs. Your only home now is here, where you can lie and have the majority of users turn a blind eye.
I have a GTX 1080Ti and an i7 7700K, wtf… look at my benchmarks 🙂
https://uploads.disquscdn.com/images/2a62277da4fe8e487e05690512c6c01aed7b9d2e7ff8a31f38804aca21b751cb.png https://uploads.disquscdn.com/images/883288827b0c5d2a67914a3eee01723e9da22b34b7e0f9ebf61280f34b2546e8.png
bruh
Ah, let's not forget how the performance degrades the more you play this cancerous UWP game.
It's no surprise that it runs at 60 FPS; it's made with weaksauce consoles in mind. Also, you can't compare a game like Destiny 2 or Dishonored 2 to a linear, consolized racing game with cut-back graphics. It's evident from how the Shitbox X version looks identical to the PC's maxed-out settings (except for AA). It's called parity and money greediness, and we hate it!
Even your 980ti can run this game very well in 4K, so you should be happy.
BTW, the 980Ti is still a good card. I had a GTX 1070 before I bought the 1080Ti, and those were comparable cards. I have no idea why you didn't want to tell me what GPU you have earlier.
I thought you had put me on block lol…
Listen, I don't know what you're on about, besides bailing out of our discussions like a coward when you run out of arguments to fuel your BS with. Yeah, the 980Ti still holds up pretty well. It wasn't a matter of "I didn't want to tell you"; I thought you knew what GPU I had, since I've said it so many times in previous discussions on this board, and I also told you directly before.
Here’s some advice: If you’re entering a discussion and throwing claims and assumptions left and right, you’re gonna need to be ready to back those claims up with something valid. And when someone comes along and invalidates your claims with proof, then you should either prove that person wrong with evidence or apologize for faulty assumptions. Not just ignore it like you did with me. I don’t have any respect for you, not before, not now. You’re just a console defending clown with very vague arguments and no real proof on what you say.
You still owe me an explanation: why does the Crapbox X run Titanfall 2 at PS4 Pro settings while the 1070 can run it maxed out in 4K at around 60 FPS? You're also gonna have to explain why the Box X can't run ROTTR maxed out in its native 4K mode, but a 1070 can. Because you claim that the Shitbox X has the same GPU power as a 1070. Then why do devs need to cut down on the settings so much?
I'm sure you know, but basically I'd say it comes down to the tablet-like CPU that's in that console. But then again, I think you already know that haha.
“destiny 2 and it runs fantastic”
I don't think so; in most modern games the 1080Ti is capable of 60 FPS at the highest settings @ 4K. But in this game? 49 FPS average… I don't call that "fantastic".
DOF off and it's already a solid 60fps on a 1080Ti.
DOF off? You shouldn't have to turn off graphical options to get good performance. It's like having an ATI Radeon 9800 Pro from 2003 and claiming that Crysis runs excellently; yeah, when you turn all settings to low, it does.
The definition of a game running excellently is when the highest-end GPUs can run it at a high framerate (at 60 or close to 60 FPS) in 4K with maxed-out settings. If settings have to be turned off, then it's badly optimized and not considered excellent.
Try harder kid.
Standards have certainly fallen. Doom runs “fantastic”. Most other games, not so much.
It supports 60fps with two 1080 TI cards though… lol
We should just rename it to DEADUVO.
It never launched with it… I don't think.
I prefer it my way:
Denuvo-Earth: Shadow of Microtransactions
Pathetic!
No, the real title is “Spoopy Elf Creed: Lootboxes of Lore Rape Edition”.
The only pathetic thing here was that “Joke”.
That wasn’t a joke, it’s the truth. What’s also true is that YOU are pathetic.
Sh*t performance (probably DENUVO) – check
Micro-transactions – check
Pirated day one – check
Looks the same as the previous one and has worse performance – check
TOTAL fail – check
You will never change, WB… Mad Max was maybe the only game from your company that was amazing on day 1.
And that wasn't even done by one of their studios; they just licensed the IP to Avalanche & let them do their thing.
Well, Mad Max was made by the branch studio, whereas Just Cause 3 was made by the main studio, so WB just lucked out that they needed something to give their branch & their branch handled it well, I think.
But yeah, that was the other thing: the branch studio had access to an older version of the Avalanche tech for Mad Max, unlike the main studio, which used the latest version with a new lighting system for Just Cause 3 (they even said as much themselves in an interview with DSOG, if I recall correctly). As a result, Mad Max ended up with the older but more stable version, whereas Just Cause 3 got the newer but broken version of their tech…
I have read a few articles regarding Shadow of War's performance, and it looks like this game is very demanding, even at 1080p.
Results from gamegpu (at 1080p):
RX 480 – 52fps average, 32fps dips
GTX 1060 – 53fps average, 36fps dips
980Ti – 70fps average, 44fps dips
GTX 1070 – 73fps average, 49fps dips
GTX 1080 – 89fps average, 60fps dips
Vega 64 – 89fps average, 60fps dips (the same as the GTX 1080)
1080Ti – 119fps average, 82fps dips
Only the GTX 1080, 1080Ti and Vega 64 were able to deliver a solid 60fps experience at that resolution; just unbelievable. If only this game looked much better than the first one…
Devs' excuses nowadays:
badly optimized = demanding, ayyy LMAO
Performance in 4K is really bad on PC. I can't wait for Xbox One X benchmarks in 4K
(the game supports native 4K at 30fps on the XOX).
PC performance without ultra quality textures
https://uploads.disquscdn.com/images/4ebc85122cdf72945e3a744fa927033429dbe5708aa912cb68e66dce4e55d8f2.jpg
Well, I don't need to wait for XOX benchmarks, because it will be 30fps 99% of the time for sure. But I do wonder what settings the game will use on the XOX in 4K; even a GTX 1080 dips below 30fps in the gamegpu benchmark, so I doubt this game will run on max settings there, maybe a high and very high mix.
Digital Foundry will check all settings. I read that XOX will support “ultra textures” in this game. Three more weeks…
The developers released more information about the XOX version. It will support two modes (both with ultra quality textures):
– native 4K (8 million pixels)
– checkerboard 4K (4 million pixels), but with ultra details
https://uploads.disquscdn.com/images/a2cfc4ef6f19a089071c03044a7fc9ea1f6de5ab218effce2f5ccbc6383c909c.jpg
Thanks for that picture. I'm guessing native 4K will use standard Xbox One details (maybe even higher), while the checkerboard resolution mode will use max settings (like Tomb Raider, for example). Personally, I play all multiplatform games on PC, but if I only had the XOX console, I would use checkerboard and max settings rather than native 4K.
Details in both modes will look better than on the 2013 Xbox or the PS4 Pro, because the XOX uses the PC's ultra textures in all modes.
Well, I would hope it looks and runs better than on the antique 2013 Xbox One. That console was pitiful even on release compared to real gaming hardware (PC) back then. It will be the same story next year on the Xbox One X. Pitiful compared to PC hardware next year : )
https://uploads.disquscdn.com/images/c62607b3010a8048fad5e80e3fe12f52707a4c303c2e59991dd0c6daf377cebf.png
The red slice… that's you, Sp4ctr0.
Personally, I don't know anyone who plays only on console these days, but that's just me :).
I'm on the green slice 😛
Well, maybe in other games the ultra texture packs for the XOX will make a difference, but the ultra textures in this particular game look similar to me, judging from John's comparison screenshots.
That picture, such misleading BS… "Will always play the game in 4K"
Not in "favor quality" mode it won't. And checkerboard 4K isn't 4K and shouldn't even be called that.
Oh, and: "Renders the game in 4K and uses a technique called supersampling to scale the image back to 1080p"
They don't know sh*t! That's not supersampling, that's called downsampling, and it isn't the same thing. Real supersampling uses different grid patterns and requires MSAA to hook onto.
Misleading, and peasants fall for it, like with most BS like this.
It's true that it's very strange to call "checkerboard 4K" just "4K" or "dynamic 4K". I don't like this. I think "checkerboard 4K" should be named "half resolution 4K" or "half 4K".
Checkerboard renders only half of the full resolution, so it should be named accordingly. "Half 4K" is a much better name than "Dynamic 4K" or "Checkerboard 4K".
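For what it's worth, the raw pixel arithmetic supports the "half 4K" label. A quick sketch, assuming checkerboard shades half of the target pixels each frame and reconstructs the rest:

```python
# Pixel counts behind the "half 4K" argument. Checkerboard is assumed
# to shade half of the target pixels per frame and reconstruct the rest.
native_4k = 3840 * 2160           # 8,294,400 pixels (~8 million)
checkerboard_4k = native_4k // 2  # 4,147,200 pixels (~4 million)
qhd_1440p = 2560 * 1440           # 3,686,400 pixels (~3.7 million)

print(f"Native 4K:       {native_4k:,} pixels")
print(f"Checkerboard 4K: {checkerboard_4k:,} pixels")
print(f"1440p:           {qhd_1440p:,} pixels")
```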
I can’t believe it. You just said something that actually made sense.
PC 1440p (3.7 million pixels) vs Xbox One X checkerboard 2160p (4.1 million pixels) – ultra quality with ultra textures benchmark.
https://uploads.disquscdn.com/images/25db927b31bc7f3c5ec2d3607e19fa74714b3c5577421dd2410bdb1652d91797.jpg
Standard upscaling (like in Quantum Break, for example) just stretches the picture to the native resolution, while checkerboard reuses half of the rendered pixels from the previous frame. So "half resolution 4K", as you described it, may be correct in technical terms, but it's also very misleading, because checkerboard looks WAAAAAAAAYYYYYY better compared with old, simple upscaling, and it even gives the impression of native resolution from a normal viewing distance (even Digital Foundry admits that) because checkerboard artifacts are pixel-sized.
Microsoft promotes native 4K on the XOX as if it were some great achievement, but it's very hard to tell the difference from a normal viewing distance. I would rather see XOX exclusives getting 60fps and better details than see the XOX's GPU power wasted on just a resolution increase. 1440p at 60fps on consoles, with even higher details (better lighting, shadows, etc.), would look and feel much, much better than native 4K at 30fps.
“because checkerboard looks WAAAAAAAAYYYYYY better compared with old and simple upsclaing, and it even gives native resolution impression from normal viewing distance”
I'm gonna let you in on a little secret… 4K for console gaming is a waste of resources, since 99% are going to play on a TV, watching from far away, and the difference between 1080p and 2160p isn't very noticeable from that distance anyway. Those resources could be put to better use, like higher graphical settings. The real reason Sony & MS are marketing "4K" nowadays is $$$… Plus, they don't wanna look like they're lagging behind the technical evolution of PC gaming.
Console peasants are blindly happy about 4K, but they don't understand that it is THEM who will suffer the drawbacks of that decision. Drawbacks like being stuck at 30 FPS, or games coming with cut-back visuals; heck, the new Assassin's Creed game comes with BOTH of those drawbacks. Far from every game has the nice option that ROTTR provides in terms of choosing. Yet you can't choose to run a 1080p 60 FPS mode with even higher graphical settings. I'm predicting that in the future, consoles will be just like PCs, upgradeable and with different options in the game menu like ROTTR. Either that, or consoles will eventually fade away.
"Native 4K"
ahhahahaahha, on consoles.
"At 30 fps"
Yeah, at mid-high settings.
Here we go again…
Apart from the ultra textures, the game will probably run all the other settings on high/medium.
It's not "unbelievable", it's very believable, and very much the reality of gaming today. Games come out (often half-finished or rushed) and drivers aren't optimized. Patches always follow, which optimize the game to run better. Shadow of Microtransactions is just another example. You can't expect a game like this to run great in the first couple of days. Just look at the history…
We can’t, but we should.
Once we drop expectations, they’ll think they can get away with this & then some.
Expectations shouldn't be dropped; what needs to change is morons paying full price on day one (or even pre-ordering) for these crappy games.
That's on Ultra. There's almost no difference between Ultra and Very High, and on Very High a 1060/480 can achieve a higher-than-60fps average.
Weak visuals.
You can never make everyone happy. Personally, I will gladly take a game that looks slightly worse but performs well. IMO, if the devs had pushed the graphics, a completely different subset of people would be complaining about a poor port.
It has the same visuals as the first one and it performs way worse…
LOL, are you kidding? The first game ran like trash at launch. We were forced to use that horrible letterbox mode, and once you stepped foot into an open area, your fps tanked, for everyone.
Stop chasing ultra settings, people; it's for fools that like to waste money. Games are designed for "high" settings. On high/very high, this game runs above 60fps on mid-range gear.
Almost 2018 and games still can't even use 4 cores correctly. Use MODERN engines, idiot developers; god, junk your crap and just use something like Unity, Unreal Engine 4, hell, even CryEngine.
“Money.”
Low resolution 1440p vs checkerboard 2160p
https://uploads.disquscdn.com/images/25db927b31bc7f3c5ec2d3607e19fa74714b3c5577421dd2410bdb1652d91797.jpg
"Low resolution 1440p vs checkerboard 2160p"
Can you remind us of the resolution of ARK on the X?
Low resolution 1440p… Oops…
980Ti level – 47fps/50fps average… Nope.
This is the Xbox One X's ballpark…
https://uploads.disquscdn.com/images/7fb27832d8489391725f86861a022cf31cb5d4debf344306d55b6cf7db8ac057.png
ARK kills every GPU out there; even at 1080p a 2GHz-OC'd 1080Ti drops into 50fps territory, so it's no wonder the console has to use medium settings in order to hold a solid 60fps. The XOX GPU is not an RX 580 as you're trying to suggest. Maybe at some point the XOX GPU was indeed based on the RX 580, but after removing the biggest bottlenecks that chip had (you can read about that in the articles covering the XOX), and after adding Vega architecture features, that chip should be way faster. Also, hardware resources on consoles are used much more efficiently. Even the PS4 Pro, with its 4.2 TFLOPs Polaris-architecture GPU, is sometimes above the GTX 980 and almost catching the 980Ti's performance level on PC (for example in COD games), and the XOX is 2x as fast compared to the PS4 Pro, as you can even tell by looking at game resolutions (4K on the XOX instead of 1440p on the PS4 Pro). In Gears of War 4 at max settings the XOX offers native 4K at 30fps, and an RX 580 on PC can't do that; the first card that can is the 980Ti (33fps average on Guru3D's benchmark chart), but I think the XOX would need to push somewhere around 40+fps in order to hold 30fps most of the time during normal gameplay. So in order to match XOX settings in that particular game, you'd have to own a 980Ti at minimum (drops below 30fps will be frequent with only a 33fps average), or simply have a GTX 1080.
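For context on the TFLOPs numbers thrown around here: the figure usually quoted is theoretical FP32 throughput, i.e. shader count × 2 ops per cycle (one fused multiply-add) × clock speed. A quick sketch using the commonly cited specs (the shader counts and clocks are our assumption of the published figures, not measurements):

```python
# Theoretical FP32 throughput, the number behind "TFLOPs" claims:
#   TFLOPs = shaders * 2 ops/cycle (FMA) * clock_MHz / 1e6
# Specs below are commonly cited figures (assumed, not measured).
def tflops(shaders: int, clock_mhz: float) -> float:
    return shaders * 2 * clock_mhz / 1_000_000

for name, shaders, mhz in [
    ("PS4 Pro", 2304, 911),
    ("Xbox One X", 2560, 1172),
    ("RX 580 (boost)", 2304, 1340),
]:
    print(f"{name}: {tflops(shaders, mhz):.1f} TFLOPs")
```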
Oh my dear Sp4ctr0, so cute that you use your second account…
Nevertheless, still the same BS.
"ARK kills every GPU out there; even at 1080p a 1080Ti drops into 50fps territory, so it's no wonder the console has to use medium settings in order to hold a solid 60fps. The XOX GPU is not an RX 580 as you're trying to suggest. Maybe at some point the XOX GPU was indeed based on the RX 580, but after removing the biggest bottlenecks that chip had, and after adding Vega architecture features, that chip should be way faster……"
Yeah sure, sure…
From Jeremy Stieglitz:
“On Xbox One X, ARK runs at 1440p 30 FPS, with settings approximately equivalent to “PC High””
“It can run at 60fps 1080p with settings equivalent to PC “Medium”, we’ll probably provide option to toggle between that or 30 FPS 1440p High”
1440p High – benchmark from last february
https://uploads.disquscdn.com/images/4e77e9d59a04d7235652b7e5e25e207e314c9bc96cf9332c1ef7c0ba28c2b2d2.jpg
1080p Medium – benchmark from last february
https://uploads.disquscdn.com/images/70c32528f7aa26e71a92045508ed1db08aecbd36dda277bb89b6fefc9de981d4.jpg
and from PCGamer:
“GTX 1060 6GB can almost manage a steady 60+ fps at 1080p medium”
That’s coherent…
Oops….
Concerning Gears of War 4, we can easily guess that The Coalition won’t deliver a full Ultra version on the X…
And what about the best-case scenario, FM7?
Since the latest NVIDIA drivers, a GTX 1070 is more than enough to run the demo at 4K maxed out with 8xMSAA.
https://uploads.disquscdn.com/images/5035ade5f99f10156f6d7f39acbe44b66059b065549815923cdd9984e08d58e9.png
Can you remind us of the AA solution used on the X?
Less demanding EQAA, for an inferior result… Oops…
“Concerning Gears of War 4, we can easily guess that The Coalition won’t deliver a full Ultra version on the X…”
I have seen an interview with a Gears of War 4 developer on YT, and in that interview they explained that the XOX will run the equivalent of max PC settings at native 4K 30fps, but of course you know better from your guesses.
That Forza 7 screenshot is just a static thing. There's just one opponent in front, and we don't know what settings the game is running. It would be much better if you showed YT 4K gameplay of that game without any dips below 60fps on a GTX 1060 or even a GTX 1070 (just remember: max settings, no dynamic scaling LOD, and at least 2xMSAA). After the last NV driver update you're sure a GTX 1070 can run Forza 7 even with 8xMSAA, so finding perfect 60fps gameplay with even lower settings should be no problem for you.
But I'm glad you posted that ARK benchmark at 1080p and medium settings; anyone can see that even those settings are very demanding, with the GTX 1060 dipping below 60fps and managing just a 61fps average. And this is just one benchmark chart showing performance in one simple scenario. In real games there are many situations where performance will drop way more, especially in this game. It's safe to assume the GTX 1060 will drop even more than that benchmark chart suggests during normal gameplay, and in order to match the XOX experience you would have to own a much faster GPU.
"Oh my dear Sp4ctr0, so cute that you use your second account…
Nevertheless, still the same BS."
Even in this thread I have disagreed with Spectro on a few things, but of course people like you are willing to post absurd conspiracy theories just to distract from the real facts. I could just as well suggest that you and Oscar are the same person, because both of you are poor and can't even afford a proper 4K PC setup, let alone a cheap console.
More like "no wonder consoles need to use medium settings to hold a solid 30fps", not even 60. Poor optimization = pure laziness, IMHO. But then I see games like PUBG that made half a billion in half a year, and they can't afford to optimize their games properly.
Bet the opening of the lootboxes runs at 100+ FPS on all systems…
Hmmm I’ll just max the game out and lock it to 30.
Yeah, and you could run Crysis 1 on an ATI Radeon 9700 from 2002. A frigging potato can run Destiny 2 on the lowest settings.
That's not the point.
If something runs fantastic, then a 1080Ti should be able to max it out in 4K at around 60 FPS. Which doesn't happen in Destiny 2. Fu*king console port!
It’s a sh*tty console port. Get over it.
The term "great port" shouldn't even exist. Games should be developed for their respective platforms.
No, go cuddle with your childish and casual D2 like the true noob you are.
That guy doesn't have a 1080Ti; at best he can look at benchmark charts that show 49fps dips on that card. DOF on max kills performance in this game; a 1080Ti can hold 60fps most of the time at that resolution if you tweak just the DOF setting alone, so the game runs great even in 4K. This is a good result, because a 1080Ti can't hold 60fps at that resolution in every new game; you would need 1080Ti SLI in order to keep 60fps in all games.
You seem stuck on "can hold 60fps"; holding exactly 60 FPS seems to be such an important thing for you. Dipping under 60 here and there still provides a great and fluid gameplay experience.
Secondly, tweaking settings shouldn't be necessary on a 1080Ti; it just shows that either the devs are really bad at optimizing their games and/or settings, or it's an overall sh*ttily optimized port. I'm leaning towards both in the case of D2.
Thirdly, I was looking at the AVERAGE framerate in D2 for the 1080Ti, not the min FPS. The min FPS was 42; that is no longer considered fluid and certainly isn't enough for a fast-paced FPS which requires good precision.
Lastly, you seem stuck on the fact that I don't have a 1080Ti; why does it matter? I can look at gameplay videos and benchmarks and draw the easy conclusion that the performance in D2 isn't "excellent", and that most other modern AAA titles run great maxed out in 4K on the 1080Ti. And having to turn off effects in order to maintain playability shouldn't be required.
Your pathetic attempts at invalidating what I say just keep failing. The only person pulling faulty assumptions out of your scrawny a$$ is you, and you can't even admit it; instead you ignore me when I prove you wrong. Pathetic as always.
Everyone can edit their posts; that's why the edit button exists here, and after reading your post a second time I added an important observation that I had forgotten to include in the first place. The Gears of War developer has publicly explained how their game looks on the XOX, and you ignored that completely with some stupid line because you didn't want to hear facts.
But man, you have really surprised me. I would never have thought people on this site would behave like psychos and take screenshots of my posts :P. That's so weird, man; first you make up some crazy conspiracy theories and accuse me of being Spectro, and now something like this. I almost feel sorry for you. You should definitely stop playing games for at least a few days; please, do it for your own safety.
When it comes to the Forza 7 video that you posted: you have proven that the game runs great on those settings with the new NV drivers, that's for sure, but you failed to provide perfect 60fps gameplay, because the video shows drops below 60fps a few times, and that GTX 1070 is OC'ed on top of that.
On my 2GHz 1080Ti I will be able to match XOX settings in every multiplatform game, and use even higher settings; that's for sure. But I guess people with a GTX 1070/980Ti or slower cards will have problems accepting that a cheap console will provide the same or an even better experience than their gaming PC, and this explains why some of you guys are so angry at consoles.
"…You should definitely stop playing games for at least a few days; please, do it for your own safety."
So cute…
"When it comes to the Forza 7 video that you posted: you have proven that the game runs great on those settings with the new NV drivers, that's for sure, but you failed to provide perfect 60fps gameplay, because the video shows drops below 60fps a few times, and that GTX 1070 is OC'ed on top of that."
Framerate drops with plenty of unused resources; that's just unbelievable. Maybe in 8 months the game will be finished.
What did you tell me?
"You didn't want to hear facts" – it suits you very well.
"But I guess people with a GTX 1070/980Ti or slower cards will have problems accepting that a cheap console will provide the same or an even better experience than their gaming PC, and this explains why some of you guys are so angry at consoles."
Oops, angry? Nope; you and your alias Sp4ctr0 are just spreading BS.
The Xbox will launch next month, and we will see whose estimates were better. You estimate GTX 1060-level performance; I estimate something better than a GTX 1070.
"You quoted one guy talking about the dynamic resolution system in Titanfall 2, but you only accepted the one line that supported your beliefs. When the same guy later says that the XOX can internally render 6K even during a shooting scenario, you say it's impossible and refuse to believe him."
6K, yeah, perhaps, but he was playing multiplayer, and those maps are smaller and don't represent single-player performance.
"Then you said that a 1070 is enough for Titanfall 2 maxed out in 4K"
Yeah, which is true; the 1070 runs it at 50-60 FPS: /watch?v=qcUv0flLKsA
The point was: why does the One X run at PS4 Pro settings if a 1070 runs it maxed out in 4K with high framerates? When you claim the One X is just as fast as, or faster than, the 1070.
"while even a 1080Ti (a card nearly twice as fast) sometimes dips below 60"
It doesn't matter if a 1080Ti or a 1070 dips a few fps, or up to 10 fps in the 1070's case. It's still very playable and considered a high framerate. The fact that these cards dip below 60 isn't relevant to my point or the discussion. The question was why the Box X runs at PS4 Pro settings and with NO AO.
"Next you said that a 1070 can run Tomb Raider at 1440p 60fps maxed out, while in the Geothermal Valley that card can drop to even 40fps"
Yes, it drops in the Geothermal Valley, and so does the console version; it's a stalemate, and the fault lies in the developers' level optimization. For the most part the 1070 runs at around 50-60 FPS at 1440p maxed out; that was my point, while the Box X runs it at just 30. Explain that.
"The problem with you is that you base your knowledge on benchmark charts alone, but these represent performance in just one place/scenario, while real games have very different scenarios."
Yes, that is true; the performance differs a little depending on what level is played. But we're talking about the majority of the time, the average performance throughout the game; the fact that it drops frames in some locations is not important at 60 FPS. Going from 60 to 50 isn't as big a deal as going from 30 to 25 or even 20; understand that. And yes, benchmarks only give a hint of the performance, I'll give you that, but I've watched gameplay at said settings, and performance is between 50-60 FPS, which is a big step over the 30 that the Box X runs at, so my point is still valid.
"so I linked you 1080Ti gameplay with dips below 50fps in this game. But of course, you ignored that material"
You posted a video from some random guy who didn't show his PC specs; he could be running an AMD Bulldozer CPU for all we know, so that proof isn't valid. And I didn't ignore it; I responded to it, saying the exact same things I just did now.
"unlike you, I don't need to dream about that card and base my opinions just on benchmark charts from various sites"
I definitely don't dream about that card; I chose to skip that generation of GPUs, as I'm happy with my current performance. It has nothing to do with this discussion. And I didn't rely only on benchmarks; I've also provided gameplay at said settings/setups to prove my points. Did you look past that, boy?
"I know much better than you what that card can or can't do"
That doesn't have anything to do with the discussion. The discussion is still about why the Box X runs at PS4 Pro settings in TF2, and why ROTTR is limited to only 30 FPS in its Enhanced Mode. And why the 1070 runs maxed out in 4K at 30+ FPS while the Box X has its settings dialed back significantly. Which you still haven't answered.
That only proves that the Box X isn't nearly as powerful as a 1070; why else would the developers do that? It proves that what you said isn't true and that you just made that claim up. This is also the reason why you keep ignoring exactly those questions while responding to everything else. Which is quite pathetic.
So… I'm still waiting…
Your claims remain false until proven otherwise; until then you can keep yapping and running that cool mouth of yours, it still doesn't do you any favors. If you want to be respected, back up your claims with something that holds.
Let’s see:
First video: he's playing on Medium settings, no FPS counter.
Second video: he doesn't show what settings he runs, and no FPS counter.
Third video: again Medium settings and no FPS counter.
Fourth video: that wasn't Ultra settings, if you check the video description.
Sixth video: High settings.
Seventh video: running around the low 50s isn't a solid 60, like I already stated. You were talking about solid 60 FPS smoothness; nope, didn't happen.
You just proved NOTHING with those videos… So yeah, my point is still valid. The game doesn't run as well as most other triple-A titles maxed out in 4K on an overclocked 1080Ti. What was your point again, boy? Try to have something valid to back things up with next time you reply.