Shadow Warrior 2 has just been released and it appears to be the first PC game that takes advantage of both HDR and NVIDIA’s Multi-Res Shading Technique.
HDR is the new gaming trend these days, first widely showcased during Sony’s PlayStation Meeting back in September. HDR offers richer colours and a wider brightness range, and in order to enjoy it you need a TV that supports this tech. Microsoft and Sony have been adding HDR to a number of console games, and Shadow Warrior 2 is the first game to support it on the PC.
On the other hand, NVIDIA’s Multi-Res Shading is a technique that can be used to render games at higher resolutions for less cost. While the center of the screen is rendered at your chosen resolution, the edges of the screen are rendered at a lower resolution. Basically, think of it as an upscaling method that allows you to play games at higher resolutions.
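To get a feel for where the savings come from, here is a rough back-of-the-envelope sketch of the shading work Multi-Res Shading skips. The center fraction and edge scale below are illustrative assumptions, not NVIDIA’s actual presets:

```python
def mrs_shaded_pixels(width, height, center_frac=0.6, edge_scale=0.6):
    """Estimate pixels shaded with Multi-Res Shading vs. full-rate rendering.

    The frame is treated as a center region kept at full shading rate,
    surrounded by edge regions shaded at a reduced rate (edge_scale per
    dimension). These fractions are illustrative, not NVIDIA's presets.
    """
    center = (width * center_frac) * (height * center_frac)
    edges = width * height - center
    shaded = center + edges * edge_scale ** 2
    return shaded

full = 3840 * 2160                    # native 4K pixel count
mrs = mrs_shaded_pixels(3840, 2160)
print(f"Shading work: {mrs / full:.0%} of native 4K")  # roughly 59% here
```

With these made-up numbers, only about six in ten pixels get shaded at full cost, which is why the edges need a post-process AA pass to hide the lower sample rate.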
NVIDIA’s tech has been used in VR games, however this is the first time we’ve seen the technique used in a non-VR game. The good news is that it actually works. NVIDIA offers two options: conservative and aggressive. The aggressive method offers greater performance gains, however – in order to avoid noticeable aliasing – we strongly suggest using the conservative option.
Below you can find some comparison screenshots between proper native 1080p (left) and 4K via the aggressive method of NVIDIA’s Multi-Res Shading Tech (right). As we can see, the MRS images look sharper (floor in the third image), however in some cases the aggressive method brings more aliasing (which is why we suggest using the conservative method).

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best consoles ever. Still, the PC platform won him over from consoles. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC Graphics Cards.”






I thought the first game that supported HDR on PC was Rise of the Tomb Raider.
Not sure. I found the “Implementing HDR in ‘Rise of the Tomb Raider’” Nvidia blog post, but there they’re talking specifically about HDR on HDTVs, so it may be a feature exclusive to the console version of ROTR, much like how Gears 4 only has partial HDR support on PC (“Gears 4 has HDR on Xbox One S. It does not support it on PC. PC just started incorporating HDR support on cards recently and many monitors don’t have the functionality.”).
This is after all a game that launched back in January.
Umm, nope, consoles are based on PC tech.
Well, GCN was designed primarily for the low-level APIs that consoles had been using long before the PC. They both use the same technologies now, but consoles were first in that regard.
Though you can’t call it PC tech or console tech; it’s just IP which is currently the same across all those platforms.
Going further, calling anything “PC tech” makes no sense, as the platform is democratic and open. No technology on the PC is currently PC-proprietary; it can’t be, as there is no owner or restriction involved. Any technology used on the PC can be used on any other platform; the PC imposes no restrictions in that regard, and there is no one to impose them.
ROTTR on PC still hasn’t got the HDR patch
I dislike the term “HDR” being used so confusingly. Whenever I think of HDR, HL2 Lost Coast comes to mind. Then I have to remind myself this isn’t “that” HDR, but some new screen technology.
HA 😛
In HL2 and most other subsequent games, the term HDR referred to the full range of light intensity that the game could produce, which was then tonemapped down to work on normal monitors.
All ‘HDR’ refers to in this case is support for newer TVs that support a greater range of brightness, likely mostly just involving employing a different tonemapper as far as game engines are concerned. I imagine that a lot of other PC games should be pretty quick to follow suit.
Thank you, I was wondering.
Not just brightness, but color as well
DUDE, SAME THING. I was like, WTF, I experienced that 10 years ago.
That sort of HDR is supported by almost all game engines now although the term applies to both screens and renderers. HDR in monitors actually allows renderers to output the full color (and tone) range to the monitor and have it look accurate.
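The difference between the two “HDRs” discussed above can be sketched in a few lines: a classic SDR pipeline tonemaps unbounded scene luminance down into [0, 1) for a normal monitor, while an HDR-display path can pass much brighter values through. The peak-nits and paper-white values below are illustrative assumptions, not any engine’s defaults:

```python
def reinhard_tonemap(luminance):
    """Classic SDR path: compress unbounded HDR scene luminance into [0, 1)."""
    return luminance / (1.0 + luminance)

def hdr_output(luminance, max_nits=1000.0, paper_white=200.0):
    """Simplified HDR-display path: map scene luminance to display nits,
    clipping only at the panel's peak brightness. The nits values here
    are illustrative assumptions."""
    return min(luminance * paper_white, max_nits)

scene = [0.1, 1.0, 4.0, 20.0]  # linear HDR luminance values
print([round(reinhard_tonemap(v), 3) for v in scene])  # everything squeezed under 1.0
print([hdr_output(v) for v in scene])                  # bright values survive, up to 1000 nits
```

The engine’s internal HDR rendering is the same in both cases; only the final mapping step changes, which is why supporting HDR TVs is plausibly a small change for most engines.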
Yeah, one can’t go wrong with nVidia, they simply are in a league of their own. Same goes for Intel on the CPU side. There is no contest, AMD is a f*ckn joke and as such no one should ever touch that POS!
On the other hand it’s bad for us that nVidia/Intel can set pretty much any price they like for their products :/
LOL!
Right sis, one has to be careful as these crazies are all over!
I myself only have a 970 that’s OC’d as much as I could, and of course it has problems with my new 1440p G-Sync monitor, but not as bad as I had thought. A 1070 would be pretty awesome, but I will hold off until the next card drops, if possible 🙂
Actually AMD has very similar tech in LiquidVR SDK.
Variable Rate Shading
All this talk about HDR. I’m more interested in the Multi-Res shading. From what I’ve gathered, you basically get up to 30% free performance using it in Shadow Warrior 2.
Nice feature with not so much of a loss. Barely noticeable when the action is going on. This could prove very useful for consoles too.
MRS looks like it could be a great tool to increase FPS. Would be even better if it could be dynamic in some way, so when you stop to look around, you get the full res, then when moving it switches. Best of both worlds then.
Multi-res has been a godsend in graphically intensive VR games like Raw Data. From what I hear on the Steam forums, people are reporting an 11–15 fps boost while sacrificing minimal image quality. I’ll def have to check it out.
Yeah, some VR units track where your eyes move and only render that part at high resolution. If they could do that with PC using some kind of eye tracking in the monitor that would be brilliant.
Kinect 3.0?
F*ck that sh*t.
I guess, look at it as a way of attaining the highest possible settings your hardware is capable of. If you have a card that does 4K at a flat 60, then no, it’s probably not going to be of use. But say you have a card capable of 4K at 45 fps, or 1440p at 60 fps: this would allow you to reach that 60 fps standard, and while it wouldn’t be true 4K, it would still be higher than 1440p, as if you ended up with, say, 3096×1680 at 60 fps.
You could look at it as cheating 4K, but if you were never going to hit your performance target anyway, it’s about finding a spot between the common resolutions where you get the maximum performance for the resolution your card can support.
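The “spot between the common resolutions” idea above can be roughed out numerically. This sketch assumes frame time scales linearly with pixel count (a simplification; real games are rarely purely fill-rate bound), and asks what 16:9 resolution a given card could drive at a target frame rate:

```python
def max_resolution_for_target(measured_fps, measured_res, target_fps=60, aspect=16/9):
    """Rough estimate of the largest 16:9 resolution that hits target_fps,
    assuming performance scales linearly with pixel count (a simplifying
    assumption, not a measured scaling law)."""
    mw, mh = measured_res
    budget = mw * mh * measured_fps / target_fps   # affordable pixels per frame
    height = int((budget / aspect) ** 0.5)
    width = int(height * aspect)
    return width, height

# e.g. a card that manages 45 fps at 4K: what would fit at 60 fps?
print(max_resolution_for_target(45, (3840, 2160)))  # roughly 3324x1870 under these assumptions
```

That lands between 1440p and 4K, which matches the comment’s point: MRS lets you claim most of that in-between pixel budget without dropping the whole frame to a lower resolution.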
Didn’t even know about the Multi-Res thingy. I won’t use it but i can see people actually needing it (older computers).
Dude 1080 is already legacy with nvidia. 🙁
I feel you. The problem is that NV holds the performance crown alone and they can do whatever they like. Even though I am a green fan, I do not like that.
Indeed. It’s a pain.
Ya, I see it as more of a future-proofing feature than anything. Besides being capped at 60 Hz, a lot of TVs have terrible latency in HDR mode (it can be as high as an additional 150–200 ms) on top of the already terrible latency TVs normally have.
Some have already been announced, like a 4K one from Asus; the only problem is that it will probably cost around $1,500 USD.
Next year it will be multi-multi-multi-res or whatever Nvidia calls it, which will be just one square cm in the middle of your screen that is rendered while everything else is blurry. You should get 200 fps on your 10-year-old GPU at 4K resolution! lol
Think I’ll pass and leave the option off, but cheers for at least having options (choices).
P.S. A big thank you to Flying Wild Hog developers, No DRM = Winner!
Didn’t even have to hesitate on this purchase. 😉
This can be fixed easily in the future. They just need some sort of full-scene AA that works in post-process (SMAA, probably) and runs after Nvidia’s MRS pass.
I think it’s less of an issue in a first-person shooter, where you’re mostly focused on the center of the screen rather than the edges.
What’s even better is combining DSR + MRS. You can then make sure to have a native or otherwise satisfying minimum resolution at the edges, while having better image quality (downsampling) at the center where it matters most, with no huge performance penalty.
Yea, true, but older GPUs can’t handle 1440p (at a stable frame rate) anyway…
https://uploads.disquscdn.com/images/956d940a9a07403bd57faacdd8e7de1269926809987026bfe6a6d647cba604d5.jpg https://uploads.disquscdn.com/images/d8120ee66bee6c7fdc5f8ed5ab2218975c517721e604b7891ae623d1f754da92.jpg On low end rig.