IO Interactive announced today that HITMAN will support High Dynamic Range (HDR) from January 31st. The new game update will allow for more vibrant graphics on HDR displays.
IO Interactive programmer Anders Wang Kristensen said:
"As an example, think of 47 in a dark room looking out a bright window. Without HDR you would not be able to make out what's outside, whereas with HDR you can simultaneously see details inside the room and outside the window. Another extreme example is the sun. Without HDR we could not render the sun brighter than a white piece of paper, so we had to 'simulate' the sun being blinding. With HDR we can draw the sun several times brighter, so that it is actually close to being blinding, just like in real life."
IO Interactive has also released the first image of HITMAN with HDR enabled, which can be viewed below.
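To make the quote concrete, here is a minimal C++ sketch of the idea. This is not IO Interactive's actual code; the scene luminance values, the 200-nit paper white, and the 1000-nit panel peak are all made-up assumptions chosen purely for illustration.

```cpp
#include <algorithm>
#include <cstdio>

// SDR pipeline: everything is clamped to 1.0 before display, so white
// paper (1.0) and the sun (50.0) come out identically bright.
float to_sdr(float scene_luminance) {
    return std::min(scene_luminance, 1.0f);
}

// HDR output (e.g. HDR10): values above 1.0 survive and map to
// physically brighter nits, up to the panel's peak brightness.
float to_hdr_nits(float scene_luminance,
                  float paper_white_nits = 200.0f,  // assumed calibration
                  float peak_nits = 1000.0f) {      // assumed panel peak
    return std::min(scene_luminance * paper_white_nits, peak_nits);
}

int main() {
    // Hypothetical scene values: white paper = 1.0, the sun = 50.0.
    std::printf("SDR: paper %.1f, sun %.1f\n", to_sdr(1.0f), to_sdr(50.0f));
    std::printf("HDR: paper %.0f nits, sun %.0f nits\n",
                to_hdr_nits(1.0f), to_hdr_nits(50.0f));
}
```

In SDR both values collapse to 1.0, which is why the sun had to be "simulated" as blinding; in HDR the sun comes out five times brighter than paper white.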


HDR is great, but what about BT.2020? Some games support both.
BT.2020?
The UHD standard is all about 4K, HDR and BT.2020. HDR improves detail in bright and dark areas, while BT.2020 is all about better colors. On consoles most games use just HDR, but some use both HDR and BT.2020, for example inFamous Second Son. Lately Polyphony has talked a lot about BT.2020 in the new Gran Turismo, because thanks to BT.2020 they were able to show car paint colors as close to real life as possible.
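For anyone wondering what "better colors" means in practice: the gamut difference boils down to a change of RGB primaries, so linear color values have to be run through a 3x3 conversion matrix. A rough C++ sketch, assuming the standard BT.709-to-BT.2020 matrix from ITU-R BT.2087 (the sample color is made up):

```cpp
#include <array>
#include <cstdio>

// Convert linear BT.709 RGB to linear BT.2020 RGB using the standard
// ITU-R BT.2087 3x3 conversion matrix.
std::array<float, 3> bt709_to_bt2020(const std::array<float, 3>& rgb) {
    return {
        0.6274f * rgb[0] + 0.3293f * rgb[1] + 0.0433f * rgb[2],
        0.0691f * rgb[0] + 0.9195f * rgb[1] + 0.0114f * rgb[2],
        0.0164f * rgb[0] + 0.0880f * rgb[1] + 0.8956f * rgb[2],
    };
}

int main() {
    // A fully saturated BT.709 red lands well inside the BT.2020 gamut.
    // That headroom is exactly what lets a BT.2020 display show redder
    // reds (think real-world car paint) than BT.709 can encode.
    auto out = bt709_to_bt2020({1.0f, 0.0f, 0.0f});
    std::printf("BT.709 red in BT.2020: %.4f %.4f %.4f\n",
                out[0], out[1], out[2]);
}
```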
Hardware has to support it first, then the software (I think).
Oh, I will read up on that, this is very interesting!
Good post, just one thing; REC.2020 was deprecated back in July, in favour of its immediate successor, REC.2100, which is now part of the ITU-R recommendations for HDR HDTVs & UHDTVs.
Other than that, agreed.
P.S. If anyone is wondering, yes, this is what they call "10-bit" & "12-bit", which is moving to phase out the now antiquated 8-bit colour depth we're currently using, all thanks to technological advancements in video bandwidth ^^
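The arithmetic behind those labels, as a quick C++ sanity check (the numbers are exact, the snippet is just for illustration):

```cpp
#include <cstdint>
#include <cstdio>
#include <initializer_list>

// Levels per channel is 2^bits; total colors is that value cubed,
// one factor each for R, G and B.
int main() {
    for (int bits : {8, 10, 12}) {
        std::uint64_t per_channel = 1ull << bits;
        std::uint64_t total = per_channel * per_channel * per_channel;
        std::printf("%2d-bit: %6llu levels/channel, ~%.2e total colors\n",
                    bits, (unsigned long long)per_channel, (double)total);
    }
}
```

That works out to roughly 16.7 million colors at 8-bit, 1.07 billion at 10-bit, and 68.7 billion at 12-bit.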
UHD discs have the BT.2020 logo, and BT.2020 is also mentioned in HDTV reviews.
From how long ago, though? REC.2100 is a recent addition, so anything “not-too-recent”, or even just movies that were prepared for their UHD releases a few months ago most likely won’t have it.
All new UHD movies use BT.2020; in fact, I have never seen even one that mentions 2100. Are you suggesting they can change the UHD specification even right now (when players are already made with BT.2020 in mind) and replace BT.2020 with 2100?
Either way, it’s a minor detail, as REC.2100 isn’t a new HDR format, it’s just the ITU standardising all the various HDR implementations from over the years into a single format, including (but obviously, not limited to) Ultra HD Premium.
Regardless, I can definitely see future UHD Blu-Ray players being REC.2100, rather than REC.2020.
Well, I think these people should know what they are doing, and both the BT.2020 and 2100 parameters are better compared to what we are used to :).
I remember when HDR was all the rage in 2006
Right along with BLOOM and lens flare.
Yeah, artificial HDR, that is.
As if there is a type of HDR that isn't artificial?
ar·ti·fi·cial
/ˌärdəˈfiSHəl/
adjective
1.
made or produced by human beings rather than occurring naturally, typically as a copy of something natural.
Lawyered.
This definition fits, though. 🙂
Different thing.
Nobody is talking about that type of HDR. They are two different things.
Where are HDR monitors, though? They’re virtually nonexistent at the moment, let alone reasonably priced.
They are coming. A few were shown at CES 2017
But they will cost more than 1000 euros. The cheapest 4K monitors currently are 27" or 28" 4K IPS 60 Hz and they cost 500 euros. They should make the 4K HDR monitors at the same price.
Honestly, I don't give a damn about the whole 4K thing. I'm not keen on having half the FPS after dishing out loads of money. The graphics don't get much better, so I don't think it's a good tradeoff.
On the other hand, I really like the idea of HDR. I won’t mind having more colors without any FPS hit.
So yeah, I’m waiting for HDR, but in normal, Full HD monitors.
Agreed. I’m on 1080p right now, & my next upgrade will be to 1440p, not 4K. It’s just not worth it, both in terms of price, & how young the technology is right now. Next time, sure, but right now, hell no.
Also, sub-4K HDR monitors? Oh, I do hope so O.o
Get a 1440p, not Full HD. Who buys Full HD in 2017?
My display has an even lower resolution than Full HD, namely 1280×1024.
1440p would take a solid chunk of FPS from my pool. I don’t want this to happen. I vastly prefer frame rate over resolution.
Even though it’s 2017, PCs aren’t ready for such high resolutions yet, in my opinion. Unless you buy some top-tier rig or are okay with choppy gameplay.
I always try to keep my frame rate at over 75 FPS, as I’m playing at 75Hz.
Correction; PC isn’t quite ready for Full UHD + 60-120 FPS, we’re more than ready for 1440p if you also move beyond that 3-5 year old GPU 😛
It seems the frame rate with a GTX 1070 tends to drop below 60 FPS in many games at 1440p. So I wouldn’t be satisfied with it at 60Hz, let alone at 75Hz (my current setting).
I wouldn’t call it more than ready, to be honest. And don’t forget 1440p will age faster than good, old HD.
I agree, though, that 1440p is way more sensible than 4K at the moment.
O.O
Interesting, last I checked 1440p was a stable 60+ FPS. My bad, then.
Also, 1440p means I’ll have to move on from all those 720p videos to full 1080p on everything, now, too 🙁 My Hard Drives shall weep like the Niagara Falls, I tell you!
I have a 2500K, a GTX 970 and 12 GB DDR3 1600 MHz, and I run most games at max settings at 1440p with no issues. The only games that can't run maxed at 1440p are badly optimized ones like Deus Ex: Mankind Divided and Quantum Break. Everything else I run maxed perfectly, not at 60 FPS all the time, but still at max settings. A friend of mine who upgraded 4 months ago and got a Core i5 6600K, 16 GB DDR4 2800 MHz and a GTX 1070 8 GB can run even the badly ported games on max settings at 1440p.
1280x1024 in 2017?! Are you crazy? I had a 1024 monitor from 2006 till 2014, when it burned out from old age. Then I got 1920x1080 and I could never go back to 1024 again. And then in 2016 I decided to go 1440p after I saw how much better games look at 1440p! I advise you to get a 1440p monitor. A cheap 1440p like mine costs only 300 euros, but games look so much better than at 1280x1024 that it is worth it. You need a PC like mine or better to max games at 1440p, not a very high-end one like you think.
Also, I am going to upgrade my CPU and RAM in February and get the AMD 8-core with 16 GB DDR4 3600 MHz, and finally get rid of the 5.5-year-old Core i5 2500K and 12 GB DDR3 1600 MHz! For 4K max settings you need an i7 6700K and a GTX 1080 or better. So saying that PCs are not ready for 4K or 1440p is not true, as you can see.
OK, they are just ready, but not “more than ready”. And why are so many games unoptimized? It seems like every other game is very demanding. It’s a valid excuse, but still an excuse. Devs ain’t gonna fix it, so either you buy a powerful GPU which can compensate for it, or you can forget about playing those titles at your desired resolution and frame rate.
I don't know about you, but in my case, if the frame rate drops even a little below my refresh rate, the stuttering is really visible and makes the game considerably less fun.
My desired frame rate is 75 FPS, not 60 FPS, so it's even harder to keep up.
But yeah, different strokes… Some people can have different FPS requirements. What are yours? Do you mind when your frame rate drops below your refresh rate?
I don't mind it, really… When I had a 1080p monitor (until June 2016), everything was running at 60 FPS at 1080p max settings. Since I got 1440p, while the FPS dropped, I didn't really notice any issues with games except Deus Ex: Mankind Divided, Watch Dogs 2 and Quantum Break, but even at 1080p those games don't run much better; I tested them. It's still not worth changing GPU for 2-3 games that don't run maxed at 1440p when everything else runs OK. A GTX 1080 costs 700 euros in my country, so unless all the games releasing each month start to not run maxed, there is no reason to change it yet. CPU and RAM have priority this year because I haven't changed them since 2011.
I think sub-4K monitors (1440p or 1080p) but with HDR and WCG support are just a matter of time. This year Sony is already making Full HD 1080p HDTVs with HDR and BT.2020 support, and it will be the same with monitors. 4K is only really visible on big screens, so there would be no point in limiting HDR and BT.2020 technology to 4K resolution only.
Yeah, it's not like 4K is an indisputable upgrade, at least for now. So if HDR catches on in the mainstream before 4K at 60+ FPS becomes easy to get, we'll see HDR monitors smaller than 4K. I bet that will be the case.
Hitman's pretty good with a controller too, so it helps for people with PCs hooked up to HDR TVs. The monitors will come soon enough.
TVs are laggy, though, and with HDR even more so.
New Samsung HDTVs have the same input lag with HDR on or off.
So HDR isn't bound to cause big lag? I thought it wasn't initially, but then I saw an article about FreeSync 2, which implied latency was an issue with this technology.
It can, but it depends on who is doing it.
I don't know about the articles you are talking about, but I have read HDTV reviews that covered HDR and input lag, and they concluded that HDR alone makes no difference in certain HDTVs and resolution modes. For example, some HDTVs have no additional input lag at 1080p with HDR, but they do at 4K with HDR. Samsung HDTVs were the best, because they had the same input lag with HDR on or off even at 4K.
That’s good news.
Hold on, if they are bringing HDR to Hitman (hopefully on all platforms), why not also have HDR available for Tomb Raider on all platforms?
And here I am, still boycotting this game over its r*tarded always-online DRM.
First I bought the Intro Pack, then I bought the Sapienza and Marrakesh episodes separately. I didn't get the bonus mission like everyone else did back then, so I had to buy the upgrade pack, which meant rebuying episodes I already paid for. Shady and greedy developers; don't support them. If I could, I would refund the whole game.
How can you check if your PC display is HDR compatible?
To make your work easy, I'm going to tell you that yours doesn't support HDR, but you can be sure by checking your display specs on the manufacturer's website.
Basically, anything that isn't 4K, or came out earlier than mid-2016, doesn't support HDR, period, as this is a hardware-side thing, ergo not something that they can just patch in through the GPU drivers or whatnot. & even if it is 4K, if it doesn't actually, specifically state that it's HDR-capable on the box, then it near-definitely isn't, as this has quickly become a major marketing bullet point.
Oh, & even if it does say it's HDR compatible, some of the earlier HDR integrations were sh*t, so unless it specifically says "Ultra HD Premium" (or one of the equivalents, assuming there are any yet), you might as well cross your fingers, because it's more likely than not an earlier, prototype-era, below-standard integration.
Footnote: This is all going by the currently-established UHDTV rules, since UHD HDR PC monitors are still being finalised, & even just standard 4K PC monitors are still an "early adopter's" technology. As such, there might, in the end, be a different tag for PC monitors, though to be fair, it is unlikely – but you never know.
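There is also a programmatic route on Windows: newer Windows 10 builds expose the display's active color space through DXGI, so a program can ask whether HDR10 is actually engaged rather than trusting the box. A hedged C++ sketch; these are real DXGI interfaces from dxgi1_6.h, but they need a recent OS and driver, error handling is omitted, and only the first adapter and output are checked:

```cpp
#include <dxgi1_6.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory1* factory = nullptr;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    IDXGIAdapter1* adapter = nullptr;
    factory->EnumAdapters1(0, &adapter);   // first GPU only

    IDXGIOutput* output = nullptr;
    adapter->EnumOutputs(0, &output);      // first attached display only

    IDXGIOutput6* output6 = nullptr;
    output->QueryInterface(IID_PPV_ARGS(&output6));

    DXGI_OUTPUT_DESC1 desc = {};
    output6->GetDesc1(&desc);

    // DXGI reports the ST.2084 (PQ) / BT.2020 color space when HDR10
    // is active on the output.
    bool hdr10 = desc.ColorSpace == DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P2020;
    std::printf("HDR10 active: %s, peak luminance: %.0f nits\n",
                hdr10 ? "yes" : "no", desc.MaxLuminance);
}
```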
Did my last round of upgrades 3 years ago, so it shouldn’t be long now, I’m just waiting on a couple of final parts – hopefully.
Either way, even if it’s a small market, I can at the very least see the higher-end monitors getting HDR support, as it’s a big marketing bullet point.
P.S. Your DELL has a pretty terrible response time……
Do you think HDR support will come to current monitors, or are developers trying to force people to buy the new 2017 models by making it exclusive to them?
Yes, it has a 9 ms response time, but the 1440p 1 ms monitor costs 500 euros and I didn't want to pay 200 euros more for just a lower response time and 3 more inches. It wasn't worth the additional cost compared to a 60 Hz 9 ms monitor. But compared to the 21.5" 1080p 60 Hz IPS 5 ms that I had until then, this Dell 24" 1440p 60 Hz IPS is much, much better and was worth the 300 euros that I paid for it.
What parts are you waiting for? I am waiting for the new AMD Ryzen. I will get the 8-core with 16 GB DDR4 3600 MHz, but I will keep the GTX 970 for at least a year or even more, since it can still max games at 1440p, and I will upgrade when cards with 16 GB or more of HBM2 memory are out. Possibly not until 2018 or even 2019.