In January, NVIDIA released the GeForce 551.23 WHQL driver, which introduced a new feature called RTX Video HDR (or TrueHDR). This feature lets RTX owners convert any streamed non-HDR video into HDR in real time. And now, modder 'emoose' has adapted it to work in PC games, so that you can enjoy them in HDR.
As the modder noted, the latest driver features some hidden "TrueHDR" profile settings, which allow you to apply RTX HDR to non-HDR games as well. So, basically, think of it as an alternative to Windows Auto HDR.
From the looks of it, all DX9/DX10/DX11/DX12 games are compatible with it. There is also a chance that it may work with OpenGL/Vulkan games.
You can download this tool from this link. Do note that you'll need driver 551.23 (or a future version) and WDDM 3.1 to enable it. And yes, this tool will only work with RTX GPUs. So, sadly, our AMD fans won't be able to use it.
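For the technically curious, the mod works by flipping those hidden driver profile settings. One public way to write NVIDIA driver profile settings is the NVAPI driver-settings (DRS) interface; whether the mod uses exactly this route is my assumption, and TRUEHDR_SETTING_ID below is a hypothetical placeholder, since the real TrueHDR setting IDs are undocumented and not published by NVIDIA. A minimal sketch:

```cpp
// Minimal sketch: writing a driver profile setting via NVAPI DRS.
// Assumes the NVAPI SDK is installed and linked. TRUEHDR_SETTING_ID and
// TRUEHDR_ENABLED are hypothetical placeholders; the real TrueHDR IDs
// are undocumented and were dug out of the driver by the modder.
#include <nvapi.h>
#include <cstdio>

static const NvU32 TRUEHDR_SETTING_ID = 0x00000000; // placeholder, not the real ID
static const NvU32 TRUEHDR_ENABLED    = 1;          // placeholder value

int main()
{
    if (NvAPI_Initialize() != NVAPI_OK) return 1;

    NvDRSSessionHandle session = nullptr;
    NvAPI_DRS_CreateSession(&session);
    NvAPI_DRS_LoadSettings(session);             // read the current driver profiles

    NvDRSProfileHandle profile = nullptr;
    NvAPI_DRS_GetBaseProfile(session, &profile); // base profile applies to all games

    NVDRS_SETTING setting   = {};
    setting.version         = NVDRS_SETTING_VER;
    setting.settingId       = TRUEHDR_SETTING_ID;
    setting.settingType     = NVDRS_DWORD_TYPE;
    setting.u32CurrentValue = TRUEHDR_ENABLED;

    if (NvAPI_DRS_SetSetting(session, profile, &setting) == NVAPI_OK)
        NvAPI_DRS_SaveSettings(session);         // persist so the driver picks it up
    else
        std::printf("Failed to set the profile setting.\n");

    NvAPI_DRS_DestroySession(session);
    NvAPI_Unload();
    return 0;
}
```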
Do note that TrueHDR currently isn't compatible with NVIDIA's NIS, nor with DLDSR/DSR resolutions. So, make sure to disable them if you want to use this mod. I also don't expect newer versions to fix these compatibility issues.
Since this feature appears to work in games, I'd love to see NVIDIA implement it natively. This came out of nowhere, and it could be a big feature for NVIDIA. And, if the green team does add it to its Control Panel, I'd like to see a more robust version of it.
Have fun and stay tuned for more!

John is the founder and Editor-in-Chief at DSOGaming. He is a PC gaming fan and strongly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved, and still does, the 16-bit consoles, and considers the SNES to be one of the best consoles. Still, the PC platform eventually won him over, mainly thanks to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher-degree thesis on "The Evolution of PC Graphics Cards."
They will definitely capitalize on this. There's no way in hell they'll let this go for free.
“this tool will only work with RTX GPUs”
You have to buy an NVIDIA GPU to use it.
There are a lot of green team fans; they'll squeeze anything and everything out of us.
Tried SN5 with HDR on a friend's OLED. Looked all over the place. Some areas looked good, some looked awful. Imagine how many fixes and shiet you have to adjust for older games. Yeah, it's gonna die just like 3D.
You can already use SpecialK to play D3D11, D3D12, and OpenGL games in HDR. Its HDR mode may work with Vulkan as well, but I don't remember.
Of course, for those who like ReShade, there's an AutoHDR plugin that works with D3D11 and D3D12 (and possibly Vulkan now too).
Neither of these solutions requires pseudo-AI nonsense like NVIDIA's does, and thus they won't put a heavy load on any GPU cores. They should also work with any GPU and display that supports HDR.
Personally, I use the ReShade AutoHDR plugin in games where it's supported, and I use SpecialK with VLC Media Player to watch SDR movies and TV shows in HDR. I leave NVIDIA's nonsense turned off, since it pretty much only works in Chromium-based web browsers (neither VLC nor SMPlayer supports it).
It actually heavily improved non-HDR videos for me within the browser. Now I can finally leave HDR on when I'm just browsing. SpecialK unfortunately does not work with DX9 games, and this solution improves the situation a bit, so I am using it.
There are ways to get D3D9 games to work with SpecialK's HDR features. To be fair, those methods aren't always successful, and the "best" way to do it seems to keep changing, but I'm pretty sure I was playing Dragon Age Origins in HDR at one point. I was also able to get Resident Evil 6 in HDR with SpecialK a couple of years ago, but it had issues with the shadows and certain dark objects/effects turning bright yellow/gold in color (I think due to dgVoodoo 2, which I used to translate the game's D3D9 API calls to D3D11 for SpecialK's HDR mode).
Granted, I can see this tool being an easier solution than fumbling around with things like dgVoodoo and DXVK to get SpecialK's HDR mode (or ReShade's AutoHDR) working.
You know that Winshat 11's HDR slider (0-100) impacts the luminosity of images/videos? Set it to 0 and the videos are dim.
I noticed that NVIDIA's RTX Video HDR in the Opera browser bypasses it, though.
4) Try at least the superior mpc-be player, not the inferior VLC (or go with clsid2's mpc-hc or the mpv player).
There is even a new fork of Aleksoid1978's mpc VideoRenderer, written by emoose with the help of clsid2 (the one who maintains mpc-hc: github(.)com/clsid2/mpc-hc), with code to enable RTX Video HDR (NVIDIA driver 551.23 required) on videos played through mpc-be:
github(.)com/emoose/VideoRenderer
And, yes, it works great. Though in the end it's subjective.
It doesn't make 8-bit colors magically go DCI-P3 or BT.2020, but at least it elevates color luminosity and gets the contrast, dynamic range, and tone mapping right.
I've tried MPC-BE (I actually do keep it installed), and I've tried the madVR DirectShow filter, which has been able to tonemap SDR content to HDR for some time. It has some nice features, but in the end I actually prefer SMPlayer over everything else. Still, I use VLC Media Player for watching movies and TV shows because I can change the volume quickly by tapping the up and down arrow keys on my keyboard. VLC Media Player is also the only media player I've been able to get working with SpecialK and its HDR features, which are less annoying than the DirectShow filters for MPC-BE: those black out my screen for a few seconds when they activate/deactivate, and SpecialK doesn't do that.
What's so good about SMPlayer? Haven't tried it.
Looks to me like mpc-be, mpc-hc, and mpv have all the features that are needed, along with madVR and RIFE-AI (or other trained models).
Haven't used VLC for ages. I guess "perceptual boost" via Special K can also be activated for better HDR?
SMPlayer can use mpv as its backend for playing media, which is an extremely good and efficient media player. SMPlayer also allows me to quickly scroll through a video with the scroll wheel on my mouse, and with the left and right arrow keys (I have scroll wheel set for 10 seconds and the arrow keys for 5 seconds). It will play almost any video, it supports HDR, and it’s surprisingly good at playing online videos (not just YouTube) thanks to yt-dlp integration. It’s also open source, completely free, and cross platform (meaning I get to use it on Windows and Linux and have the same user experience).
If you launch VLC Media Player with SpecialK Injection Frontend (SKIF) and then open a video, the SpecialK interface will appear in the video player and you can open it and enable the HDR mode. The remastering modes don't work since there are no 3D rendered graphics to remaster, but the basic HDR tonemapping works just fine. I watch all of my movies and TV shows this way, and without that annoying screen blackout that happens in MPC-BE every time a video starts or stops playing. Just note that SpecialK's HDR is a bit bright and the color vibrance is set a little too high by default, so some of the settings need to be tweaked to fix this, but once you've got things where they should be, it looks pretty good.
Will give it a try. Thanks for the chat.
1) It's not pseudo-AI nonsense. Don't embarrass yourself by being ignorant or a dumbo (for whatever reason you have that opinion about NVIDIA's RTX technology sets) and calling superior technologies such names.
2) I use Winshat 11's Auto HDR myself, and Special K on my Samsung S95C TV, so I know Special K's SDR-to-HDR injection and tone mapping is superior to Winshat's machine-learned network (which is in fact strange, as a trained deep-learning approach like that should offer better results).
3) I haven't tested it and see no comparisons out right now, but from what I commented here myself, "RTX Video HDR" applied to video games should be on a complete level above even Special K in terms of accuracy and image quality.
This is NVIDIA and their technologies: the leaders of machine-learned image reconstruction, sampling, and other technology suites.
dsogaming(.)com/news/nvidia-geforce-551-23-whql-driver-released-optimized-for-tekken-8-suicide-squad-kill-the-justice-league/#comment-6375755841
Real AI would be something along the lines of Data from Star Trek or EDI from Mass Effect. What we have right now is machine learning that has been rebranded as “AI” for marketing, and thus I have taken to calling it pseudo-AI. Training a computer program to mimic a human isn’t the same thing as real artificial intelligence.
As for RTX Video HDR vs SpecialK, I tried it with a video against SpecialK in VLC Media Player playing the same video, and I preferred SpecialK's output. That being said, I think the only real noticeable difference was brightness: RTX Video HDR doesn't get anywhere near as bright as SpecialK does (even after I turned SpecialK's brightness down, as the default was too much for media).
Correct, and thus I also don't call it "AI"; just "machine learning" or "trained networks and inference".
Okay, I haven't tried it, so I can't say anything about the comparison.
Thanks for the feedback, will try it then.
I don't know. Maybe the thing where you perceived Special K's approach as brighter could be a limitation of Windows 11?
For my Samsung S95C, the OS limits screen luminosity to 1000 cd/m² (or 1200 cd/m² in gaming mode, HGiG), despite the display being capable of 1350 cd/m² white.
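(For reference, since we're throwing absolute numbers around: HDR10 encodes luminance with the PQ curve, SMPTE ST 2084, which maps a normalized luminance $Y = L / 10000\ \mathrm{cd/m^2}$ to the signal value

$$E' = \left( \frac{c_1 + c_2 Y^{m_1}}{1 + c_3 Y^{m_1}} \right)^{m_2}$$

with $m_1 = 2610/16384$, $m_2 = (2523/4096) \times 128$, $c_1 = 3424/4096$, $c_2 = (2413/4096) \times 32$ and $c_3 = (2392/4096) \times 32$. The signal can describe up to 10,000 cd/m², so a display or OS capped at 1000-1200 cd/m² has to tone-map everything above its own peak, which may be exactly where such perceived brightness differences creep in.)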
I'm pretty skeptical of, and interested in, how a manual approach via Special K could be superior to NVIDIA's machine-learned technology suite. Sounds pretty strange.
In the end, HDR is about dynamic range and elevated contrast: higher range and higher contrast for luminance and color luminance, plus static/dynamic tone mapping, leading to higher color volume and so on.
I'm already baffled how NVIDIA's approach, a network trained for millions of hours specifically for that task, can lead to an inferior perceived experience.
Strange things happening.
I use Windows 10.
SpecialK’s brightness is just a setting you can tweak (“SDR Gamma Boost” I think). I have it turned down a bit because by default it’s way too bright and washes out bright colors in videos too badly.
https://uploads.disquscdn.com/images/c3377ffad12b65238ec94a7db77170e270c3e17ad4d565bbe8b0c76add4f97c9.png
I think the reason NVIDIA's approach doesn't seem better than SpecialK's is that machine learning isn't necessary to tonemap SDR content into HDR effectively. There's also the fact that SpecialK has settings that allow you to tweak the experience to your liking, whereas NVIDIA's RTX Video HDR doesn't have any such settings. Does NVIDIA's technology make SDR content look more like real HDR video? Probably. Does that make it better? Whether something looks better visually is highly subjective and comes down to personal opinion, so I'm sure some people will prefer RTX Video HDR.
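To illustrate the point, here's a toy sketch of my own (not SpecialK's actual code; the gamma_boost knob merely mimics a tweakable setting like SpecialK's SDR Gamma Boost): a basic non-ML SDR-to-HDR tonemap just linearizes the SDR signal, keeps midtones near the SDR reference white, and stretches highlights toward a target peak luminance.

```cpp
// Toy non-ML SDR->HDR inverse tonemap, evaluated per color channel.
// Illustrative sketch only; not SpecialK's actual algorithm.
#include <cmath>

// Decode an 8-bit sRGB value to linear light (0..1).
float srgb_to_linear(float s) {
    return (s <= 0.04045f) ? s / 12.92f
                           : std::pow((s + 0.055f) / 1.055f, 2.4f);
}

// Map an SDR pixel value onto an HDR luminance range (in nits).
// 'peak_nits' is the chosen output peak (e.g. 1000.0f); 'gamma_boost'
// mimics a user-tweakable brightness setting (hypothetical knob).
float sdr_to_hdr_nits(float srgb8, float peak_nits, float gamma_boost) {
    float lin = srgb_to_linear(srgb8 / 255.0f); // 0..1 linear light
    lin = std::pow(lin, gamma_boost);           // user brightness/contrast tweak

    const float sdr_white = 203.0f; // nits commonly used for SDR reference white
    // Keep dark/mid tones near their SDR brightness, and let only the
    // brightest values ramp up toward the display's peak luminance.
    float boost = 1.0f + (peak_nits / sdr_white - 1.0f) * lin * lin;
    return lin * sdr_white * boost; // absolute output luminance in nits
}
```

No training data involved; just a transfer curve plus a couple of user settings.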
The issue that I see with NVIDIA’s RTX Video HDR is tensor core usage. While games that don’t have native HDR support probably aren’t going to be using the tensor cores anyway, and the tensor core usage seemed fairly light when I tested it (10%-20% if I remember right), it’s still an extra power drain on the GPU while gaming and thus may impact FPS.
Hmmm, yes. That's why I use "HDR preset 1" and tweaked my settings a bit too, as my S95C has 150% sRGB color volume or so, to match it.
What settings do you recommend tweaking for HDR?
Yeah, that's the problem. I can't know what "real HDR" is, and neither can you, if there's no native HDR version of the movie/game to compare against hahaha.
Well, I agree and disagree with that. I'd say your order is wrong:
1) There is a pretty big objective reason why RTX Video HDR, for videos and games, could be objectively better: it's trained on real-world shot/mastered/game HDR content. Special K is not.
That's the baseline, and it's true. It can only be untrue if the people at NVIDIA somehow screw their machine learning and technology up (I doubt they do).
2) We then simply come and put our subjective opinion on top of that objectivity.
In any case, I give them the benefit of the doubt, more than I do Windows Auto HDR.
About the performance:
Auto HDR draws 7-9% more watts with my RTX 4090 when the GPU is not power- or performance-limited.
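(If anyone wants to verify numbers like that themselves, GPU power draw can be polled through NVIDIA's NVML library, which ships with the driver. A minimal sketch, assuming a single NVIDIA GPU at index 0:)

```cpp
// Minimal sketch: polling GPU power draw via NVML to compare the
// overhead of an HDR feature (run once with it on, once with it off).
// Assumes the NVML library from the NVIDIA driver is linked.
#include <nvml.h>
#include <cstdio>
#include <chrono>
#include <thread>

int main()
{
    if (nvmlInit_v2() != NVML_SUCCESS) return 1;

    nvmlDevice_t device;
    nvmlDeviceGetHandleByIndex_v2(0, &device); // first GPU in the system

    // Sample power draw once per second for 30 seconds; compare the
    // averages between the two runs.
    for (int i = 0; i < 30; ++i) {
        unsigned int milliwatts = 0;
        if (nvmlDeviceGetPowerUsage(device, &milliwatts) == NVML_SUCCESS)
            std::printf("power draw: %.1f W\n", milliwatts / 1000.0);
        std::this_thread::sleep_for(std::chrono::seconds(1));
    }

    nvmlShutdown();
    return 0;
}
```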
In SpecialK I use HDR preset 0 for most things, and I set the Saturation to 85% and the SDR Gamma Boost to 0.85 for VLC Media Player. In games I tend to leave the settings on default unless I feel they need changing, although sometimes I will select one of the remaster options depending on the game. I also increase the maximum luminance setting in SpecialK to the max for my monitor.
Understood, makes sense.
Then it looks like 1) saturation and 2) SDR gamma boost are the tweaks people should change, based on their display's native color volume.
This dude here also does that with his LG C1
youtube(.)com/@plasmatvforgaming9648
As W-OLED is very washed out in comparison to QD-OLED (4th white pixel dilution), he always keeps that setting lower.
I tried going above a value of 1 on my S95C with the other setting, but it all looks unnatural and oversaturated. Looks like the default "HDR 0 & 1" presets are already very good.
Btw, "Remaster 10-bit render pass": why did you disable it? 8- and 10-bit should be left activated, as that's what counts.
None of the remaster options are enabled by default. They aren’t usually necessary, and they alter the rendering pipeline of the games in some way. They can also cause issues in some games (the remaster 8-bit option has a tooltip that says “May Break FMV in Unreal Engine Games”).
In this case, I don't remember why I turned the remaster 8-bit option on, as I don't think it does anything in VLC Media Player. There are no computer-generated graphics in a video, so I don't know if messing with the render pipeline would have any beneficial effect on the output.
Anyway, my monitor (despite supporting HDR) only supports 8-bit output modes. There’s probably no reason to enable the remaster 10-bit option in my case due to my monitor’s lack of support for it.
Good to know the disadvantages, thanks.
Are you sure the "remaster 8-bit" setting is only recommendable if the display supports native 10-bit color depth (no dithering/FRC)?
I thought it works by making use of the game's engine/graphics pipeline.
When to use each remastering mode is an assumption on my part. I don’t usually see much of a difference with them, but since my monitor only supports 8-bit it’s possible that some of the changes get lost due to that. Since they reduce performance and the HDR tonemapping works fine without them I generally leave them off in games anyway.
TechPowerUp seems to think that NVIDIA intends to release RTX TrueHDR for games on their own at some point, rather than requiring a mod to enable it:
https://www.techpowerup.com/318775/nvidia-readies-rtx-truehdr-converts-sdr-games-to-hdr-in-real-time-using-ai
Thanks for the information.
Yeah, it makes sense for them to take that step. Yes, you already mentioned it.
Let's see if machine learning, i.e. trained neural networks, will "beat" Special K's injection and SDR-to-HDR tone mapping.
Hehehe.
Technically I think SpecialK is just using a shader made by someone else (unless they changed it again).
Upvoted for the point about "AI" being a marketing buzzword.
I don't agree with the rest of the text.
It's even strange calling it artificial.
If nature made us and we made an intelligence, what's so artificial about it?
Just because it's made of silicon: non-organic, non-tissue, digital?
If there is no foreign thing/concept such as a soul/consciousness, and animals like us are biological machines, would it be right to call AI artificial?
I don't know. Let the theologians ponder.
https://www.youtube.com/watch?v=An4gF0Gbpyw
"Neither of these solutions require pseudo-AI nonsense like NVIDIA's does" – uh oh, another manbaby that cannot afford NV.
Yes, it's AI; it's how LLMs are categorized in tech. Of course it's not actually real intelligence, but "AI" in tech covers a wide spectrum. Gonna b*tch about AI in games too? That's also not AI, just scripts, but we call it AI anyway. LLMs and generative models are considered AI. It's not rebranding anything; it's simply how it has been referred to for years.
You may hate NV all you want, but they have the best tech in the AI visual category and they really make great stuff with it. It's way more accurate and gives better results than your solutions, which actually provide an absolutely garbage experience that has nothing to do with HDR, because HDR is not just brighter colors. At the end of the day, it doesn't matter if it's NV AI or your garbage injectors: if content wasn't natively made for HDR, it will suck anyway when you try to force it.
I have an RTX 3070 Ti… I can’t do NVIDIA’s frame generation, but I can do most of their other “AI” tech, and I regularly do AI video upscaling and I occasionally mess around with Stable Diffusion (locally installed).
To me, real AI is something like Data from Star Trek, or EDI from Mass Effect: a computer program (or collection thereof) that is actually intelligent. Modern "AI" is just rebranded machine learning, which is software that has been trained to do something. In the case of LLMs, the software has been trained to mimic human conversational abilities; however, as we have seen, they aren't particularly good at doing anything more than that.
I have taken to calling all of these Machine Learning things “pseudo-AI” because of this. They aren’t real AI. We haven’t developed that technology yet. It’s just marketing BS.
Still a dumbo with exaggerated censorship (even "shietty") and anti-spam whatever.
PLEASE, at least be smart and disable your censorship for your own domain "dsogaming" and other mainstream domains such as YouTube.
Thanks.
I assume 99% of PC gamers won't appreciate how big this is, as there is no "PC master race".
The majority are dumbos wasting thousands on PC hardware every two years or so, yet in the end they still have an inferior experience and shietty image quality, because they view everything on overpriced and cheap moniturds, ahem, monitors; and that goes even for the current 2023/2024 OLED moniturds.
No amount of "maximum cranked graphical settings" and "ray/path tracing" will change that.
– Current premium/high-end QD-OLED TVs are the best image quality, contrast, color brightness, and accuracy you will get.
– They offer the best display coating; any monitor out there has inferior coating and thus flushes image quality and the experience down the toilet, even for OLED.
– 55 inches and above for the biggest immersion (cinema feeling at home). Sitting closer to tiny 27-32 inch screens won't give the same immersion.
– A whole bunch of AI-driven (deep/neural learning) image processing, which elevates image quality and even does frame interpolation (Sony's is to some degree on par with AMD's FSR).
– 4K UHD for fidelity.
Even worse: some are such big dumbos, they spend 3000 bucks on their PC but then have some utter-trash tiny 32-inch LCD moniturd on their desk.
None of them know, because they haven't experienced it, how awesome gaming on a big premium W-OLED/QD-OLED TV with HDR looks today on PC.
Meanwhile, console plebs, while gaming at 30 fps, enjoy superior image quality and experience, because they play on superior TVs.
Strange things.
How’s LG C2?
How come you say that?
Never had one. I've only had the 2020 and 2021 LG CX and C1.
Swapped both for the S95C, which is vastly superior in multiple ways.
1) QD-OLED is superior to W-OLED. Don't fall for any media, reviewer, tester, etc. telling you otherwise.
They are all liars or dishonest, only telling the audience half-truths, because they want to keep getting review samples from all TV manufacturers (W-OLED and QD-OLED).
There are differences in technology (image processing, for example) which elevate the experience, but that fact stands true no matter what, and it is the biggest deciding factor.
2) QD-OLED has double the color luminosity (brightness) of W-OLED, and thus double the color saturation and double the HDR impact.
All colors on W-OLED are washed out, because half of the brightness comes from the OLED emitter (layers) pushing the 4th, white pixel.
That even goes for DCI-P3/BT.2020 content limited to 1000 cd/m².
Look at rtings' color volume measurement for the S95C. It's almost 40% higher than for LG's G3.
rtings(.)com/tv/reviews/samsung/s95c-oled#test_144
The 1400 cd/m² luminosity (nits) of the W-OLED G3 is in fact 50% pushed by the white pixel.
These are the things the majority of lying reviewers won't tell you.
What TV do you want to buy or own?
Unfortunately, it doesn't work if you have a receiver connected to your PC, since it counts as an additional display.