AMD has just released the first version of its AMD Radeon Software Crimson Edition graphics driver. This software is – more or less – AMD’s answer to NVIDIA’s GeForce Experience and comes with the following features:
- Radeon Settings
- New Install UI
- Liquid VR
- Asynchronous Shaders
- Shader Cache
- Optimized Flip Queue Size
- Freesync™ Enhancements
- Custom Resolution Support
- Frame Pacing Enhancements
- Frame Rate Target Control Enhancements
- Updated Video Feature support for ‘Carrizo’ products
- Power Optimization
- Directional Scaling
- Dynamic Contrast Update
- DisplayPort to HDMI 2.0 support
Those interested can download it from here.
Here are some additional notes for AMD’s Radeon Software Crimson Edition’s features:
Radeon Settings:
Radeon Settings is the new, streamlined user interface, engineered to bring out the best of AMD graphics hardware. User-friendly and feature-rich, Radeon Settings is lightning fast and starts up to 10x faster than the previous AMD Catalyst™ Control Center. Radeon Settings provides a brand new game manager, improved per-game AMD Overdrive options, and new video, display and Eyefinity tabs.
New Install UI:
The driver installer now provides a brand new, streamlined user experience that requires fewer clicks, making installation easier and more user-friendly.
Liquid VR:
The AMD Radeon Software Crimson edition is the first publicly available driver that enables all LiquidVR features, which are currently being validated and tested by VR headset manufacturers and ecosystem partners.
Asynchronous Shaders:
A feature that has been used extensively by game console developers is now available to PC gamers. Asynchronous Shaders break complex serial workloads into smaller parallel tasks, allowing idle GPU resources to be used more efficiently and enabling those parallel workloads to complete much faster.
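The scheduling idea behind this can be illustrated on the CPU. This is a conceptual sketch only, not actual GPU or driver code: independent sub-tasks are dispatched to a pool of workers so idle execution units absorb queued work, instead of everything running as one serial job.

```python
# Conceptual CPU-side sketch of the async-shaders idea: split a serial
# workload into independent tasks and let idle workers pick them up.
# All names here are illustrative, not any real graphics API.
from concurrent.futures import ThreadPoolExecutor

def shade_tile(tile):
    # Stand-in for one small, independent shading task.
    return sum(i * i for i in range(tile, tile + 100))

def render_serial(tiles):
    # One long serial job: each tile waits for the previous one.
    return [shade_tile(t) for t in tiles]

def render_async(tiles):
    # Independent tiles are queued to a worker pool; idle workers
    # absorb pending tasks, mirroring how idle GPU compute units
    # can absorb asynchronous shader work.
    with ThreadPoolExecutor(max_workers=4) as pool:
        return list(pool.map(shade_tile, tiles))

tiles = list(range(0, 800, 100))
assert render_serial(tiles) == render_async(tiles)  # same result, different scheduling
```

The result is identical either way; only the scheduling differs, which is the point of the feature.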
Shader Cache:
The shader cache feature allows complex shaders to be cached, resulting in reduced game load times, lower CPU usage, and reduced stuttering and latency during gameplay.
Optimized Flip Queue Size:
The optimized Flip Queue Size provides users with the very latest keyboard and mouse positional information during gameplay to reduce input latency in DirectX® 9, DirectX® 10 and DirectX® 11 applications. This feature is a driver optimization; it is automatically enabled and requires no user configuration or setup.
Freesync™ Enhancements:
- Minimum/Maximum display rate is now listed in Radeon Settings
- Low framerate compensation to reduce or eliminate judder when application FPS falls below the minimum refresh rate of an AMD Freesync™ enabled display
- New support for AMD Freesync™ with AMD Crossfire™ in DirectX® 9 titles
Custom Resolution Support:
This feature gives users more control over display capabilities, allowing them to create custom display profiles that attempt to drive their display with a chosen resolution, timings, refresh rate and pixel clock.
Frame Pacing Enhancements:
Frame pacing support is now extended to DirectX® 9 titles.
Frame Rate Target Control Enhancements:
FRTC enhancements include: power saving capability, support for DirectX® 9 titles, and an extended target range (30–200 FPS).
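The power-saving aspect of a frame rate target can be sketched in a few lines: if a frame finishes under its time budget, the remainder is spent idle instead of rendering ahead. This is an illustrative model only; `run_capped` and its parameters are invented for the example, not any driver API.

```python
# Illustrative frame-rate-target loop: sleep out the unused part of
# each frame's time budget so hardware idles instead of rendering
# faster than the target. run_capped is a hypothetical name.
import time

def run_capped(render, target_fps: int, frames: int) -> float:
    budget = 1.0 / target_fps
    start = time.perf_counter()
    for _ in range(frames):
        t0 = time.perf_counter()
        render()                          # do the frame's work
        elapsed = time.perf_counter() - t0
        if elapsed < budget:
            time.sleep(budget - elapsed)  # idle: this is the power saving
    return time.perf_counter() - start

total = run_capped(lambda: None, target_fps=100, frames=10)
assert total >= 0.09  # 10 frames at a 100 FPS cap take at least ~0.1 s
```

A trivially cheap frame still takes the full budget, which is exactly why capping the frame rate reduces power draw on light workloads.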
Updated Video Feature support for ‘Carrizo’ products:
- FluidMotion for smoothing playback of 24/30 FPS video using CyberLink PowerDVD 15 for Blu-ray playback
- Improved edge enhancement for sharper images
- Improved de-interlacing for interlaced content
Power Optimization:
Improved power optimizations for video, gaming and FRTC enabled gaming environments (AMD Radeon™ R7 360, AMD Radeon™ R9 380, AMD Radeon™ R9 390 series and AMD Radeon™ Fury series).
Directional Scaling:
Enhanced image quality for 1080p media content scaled to 4K resolution using adaptive directional filtering technology (AMD Radeon™ Fury products).
Dynamic Contrast Update:
Quality adjustments are now content adaptive, providing video enthusiasts with improved video quality and contrast settings (AMD Radeon™ R9 285, AMD Radeon™ R9 380 and AMD Radeon™ Fury products).
DisplayPort to HDMI 2.0 support.

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best consoles. Still, the PC platform won him over. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC Graphics Cards.”
A much deserved update for the AMD brand. Let’s hope they can bring the fight to NVIDIA. Healthy competition between the green and red giants would benefit everybody.
If this keeps up I might go back to AMD next year. I am starting to disable these crappy GameWorks features anyway so I won’t miss much.
For some, it’s about software and for others it’s about hardware andddd for others it’s the company image that’s important. I could give amd a try just like you. Problem is that when i buy a card, it’s most likely to cost me 600-700$. If i end up buying amd and getting one of their 700$ cards, and end up being deceived or whatever i would be pissed. It’s something to think about.
But surely you would have nothing to complain about if you did that?
GameWorks features have always been such resource hog that I never use them. As long as those features can still be turned off and not essential to the main gaming experience then I don’t mind.
However, where I live the price difference between the two graphics card brands is almost negligible, especially among the top-tier models, so I always buy Nvidia out of habit. I have bought several Radeon cards, starting from the ATI X800, but then the 8800 GTX came along and converted me back to Nvidia.
Yet in other articles he’s trashing GameWorks, then saying he likes TXAA in AC Syndicate, even though before ACS even launched he said “no thanks” to TXAA. The guy is a contradiction in everything he says.
So they copied the shader cache option from NVIDIA.
LOL
Just like Nvidia copied TressFX with HairWorks.
What else have Nvidia copied as well as the hair tech?
At least their tech actually works. TressFX was very demanding but never as bad as HW. Just try HW in Witcher 3: you double performance and lose almost no hair physics. Then do the same thing with Tomb Raider.
That’s a flat out lie and you know it.
Quoted by Techspot:
“The average frame rate of the R9 290X was 49% faster with TressFX disabled”
Quoted from EuroGamer:
“From our perspective, HairWorks – just like the TressFX tech found in Tomb Raider – is best considered as an expensive luxury to be enabled only if you have a vast surfeit of GPU compute power available. With regards AMD performance generally, you’ll note that the GTX 970 and the R9 290X offer similar frame-rates once HairWorks is disabled – so the good news is that GameWorks integration hasn’t crippled AMD GPUs overall”
What a load of bullsh*t! you’re cherry picking sentences, quote the full sentence for everyone if you’re that confident.
Techspot:
“The average frame rate of the R9 290X was 49% faster with TressFX disabled, that is certainly a significant performance gain, but not quite the 75% gap we saw when disabling HairWorks on The Witcher.”
“With TressFX disabled the R9 290’s minimum frame is 1.5x greater while the GTX 780 and GTX 980 saw a 1.6x increase — roughly half the impact we saw HairWorks have in The Witcher 3. We realize you can’t directly compare the two but it’s interesting nonetheless.”
“The issue with HairWorks is the minimum frame rate dips. This isn’t something we found when testing TressFX in Tomb Raider back in 2013”.
EuroGamer:
“Whether you own an Nvidia or AMD graphics card, our suggestion is to run The Witcher 3 with HairWorks turned off for best performance. The bottom line is that if a superb card like the GTX 970 (which is running the optimised HairWorks code, remember) produces sub-30fps performance dips with the technology enabled, that’s a lot of GPU resource that could be better spent elsewhere, like aiming for a consistent 60fps for example.”
From me:
Also don’t forget that we’re still comparing an initial iteration of the tech (TressFX 1.0) with more optimized and mature iteration of HairWorks. We’ll see real comparison when Tomb Raider will be released on PC.
that’s not true Nvidia showcased hair physics back in 2008
Then why did they hold it back that long? They showcased a few slides in 2008, released a demo in 2010 and then went silent on it. We also don’t know whether they were making it PhysX-based or Compute-based. It’s only after TressFX that they finally put some effort into it and released it as HairWorks. I can also bet they would have made it another “Nvidia only” tech had AMD not released compute-based TressFX first, which works on all GPUs.
Because it was too demanding on hardware at the time. And besides, everything under GameWorks that deals with effects under APEX is PhysX at the core.
Which is why Alice’s hair in Alice in Wonderland did not use PhysX like I once thought it did. Besides, the tech just wasn’t there yet for developers. And developers still can’t seem to make something without a 10–15 fps hit, just like COD Ghosts when it used hair tech on that dog.
HairWorks has nothing to do with PhysX whether by name or by tech so please stop confusing things, if you check dependencies of HairWorks, it says DX11 not APEX or anything else, all hair rendering and simulation techs use DirectCompute which is something set by AMD.
Back in 2008 – 2010 even PhysX was too expensive but Nvidia kept pushing it in games like Batman Arkham Asylum and Metro 2033 so I don’t see a reason not to use HairWorks even if it was expensive ? Nvidia just went silent on it for no reason and then came back with HairWorks when AMD released TressFX, the tech itself was just pieces according to Nvidia’s own definition on HairWorks page (Rendering Technologies – GDC 2008) and simulation technologies (Fur Demo – GDC 2012), I bet it was only meant for tech demos and they only put everything together when they saw AMD’s work on TressFX.
Sorry but all gameworks that is not TXAA/HBAO/HBAO+ is under PhysX as a Core. Nvidia just has names for everything. However they are not mainly GPU accel
Nvidia does that on purpose because most of the people believe PhysX = “Nvidia and Nvidia only” without getting into the nitty gritty, that’s why a lot of people still face trouble understanding how HairWorks work on AMD. In reality anything mentioned under PhysX means it’s either processed by CPU (Software) or Nvidia CUDA (Hardware), HairWorks/TressFX use none of them and run on DirectCompute which comes from DirectX so it’s completely misleading to mention it under PhysX.
But what I am saying is the Core of all Gameworks is PhysX. Does not mean it’s on the Hardware accel side. Since PhySX can be used CPU/GPU/PPU Just like how a AMD GPU can play witcher 3 in all of it’s Glory as well as see Destruction.
PhysX is the Core while Visual FX deals with TXAA/HBAO+
I said this ages ago, you must have got that information from me.
I get information when I judge everything myself because I am not a biased person like you who sided with AMD when he had AMD card and now siding with Nvidia just because he bought an Nvidia card, that’s the reason everything coming from you looks like joke to me.
Go f*ck yourself man, don’t reply to me if you’re going to talk such bullshit and twist my words you stupid c*nt. If I’m that biased why do I admit AMD’s Compute performance is superior? I criticise NVIDIA for poor Compute performance in Kepler and Femi, I criticised GeForce Experience for being slow and buggy, I said NVIDIA has poorly picked game titles from Ubisoft and AMD picked good ones for their Gaming Evolved.
Biased, really? I have data and videos to prove Gameworks doesn’t cost much in performance as claimed. I said the Omega driver was a good driver. Where is the bias?
Which word of yours have I twisted ? I have seen you defending AMD when you had an AMD card and now you defend Nvidia to the death just because you switched to Nvidia and that makes you nothing but a jester. B*rk all you want but that’s not going to change the facts.
Oh F*ck off you worthless pr*ck, go get a brain. You counter my facts with nothing but your opinion. All you’re doing is defending AMD, nothing but.
Your facts ? if abusive language is what you call facts then by all means your facts are great… Other than that you have nothing.
All those points you mention for your so called unbiased nature are crystal clear as water, several benchmarks and mining community indicate that AMD has better compute performance, Geforce Experience is criticized by Nvidia’s own users most of the time, hell some don’t even bother installing it, Nvidia’s bad selection of developers is apparent with the number of failed GameWorks titles, specially those that are heavily marketed and hyped.
Ha, data and videos ? I can also make similar videos on my GTX 980s and call it a day but what does it prove ? a vast majority of users including Nvidia users don’t like the performance cost of GameWorks (see Geforce forums for your own reference). Only the most niche high end or fanboy class deny it’s damage and say it’s great.
And it has nothing to do with me not liking you, I have nothing personal against you, I just don’t like how you always criticize AMD even if the article has nothing to do with Nvidia, their attempt to improve their weak areas isn’t something worth praising to you ? they did several improvements in Crimson but all you mentioned is that they copied shader cache from Nvidia so what ? DX12 is copied from Mantle, Vulkan is sharing Mantle’s idea, HBM is going to be utilized by Nvidia so what ? one technology provide inspiration for the other and it’s always good to see progress.
When I saw AMD had nothing to compete with Maxwell I switched, and I’ve also said several times that Nvidia has better driver and SLI support. The only thing I dislike about Nvidia is its self-centered thinking and black-boxed SDKs. But here you’re criticizing AMD on something that doesn’t make sense, for example TressFX. What’s wrong with it, seriously? Just that it runs on consoles? Or are you mad because it’s open source, or maybe because AMD doesn’t restrict other IHVs from working with devs? The only thing you’re mad about is that people compare TressFX performance with HairWorks, and that’s super stupid. You’re so blind in supporting Nvidia that you’re even refusing to admit that Batman is the biggest fail of 2015.
Despite my dislike I also praised GameWorks when Nvidia showcased Batman Arkham Knight features in trailers, I said it’s something that’s going to set PC version apart from consoles but the game turned out to be a fk*ing mess and all that Nvidia’s false marketing fell apart. Blame the devs all you want but the fact is clear that Nvidia is helping devs in giving a lollipop to PC gamers, they don’t think twice before picking devs and they have no quality control over their promoted titles, they are only interested in marketing, promotions and attracting more fools into buying their cards.
You make assumptions you can’t prove. AMD did nothing with Compute up until 2013 with Tomb Raider. Compute has been around for a very long time, they did nothing with their superior Compute performance.
Because compute was always targeted towards another market segment like bitcoin mining and AMD never showed off anything like Hair rendering in 2008 that never made out, when they made something they released it quick and Nvidia followed afterwards.
Again you can’t prove what you’re saying, you just say it and assume it’s true. Again One game has TressFX, one f*cking game back in 2013, nobody but Square Enix uses TressFX and then it’s used in the consoles as well. The fact you can run it on consoles tells you something, it’s used on one character LOL.
Your shouting and crying won’t change anything, one studio use it but they do it properly not like a dozen studios clusterf*cki*g with it, Nvidia just want more names associated with it, they don’t care about freaking quality and performance.
And don’t tell me TW3 run HairWorks in it’s full glory, it was shown with much better versions in trailers but were severely reduced in quality for example this is what we saw in trailers
http://i.imgur.com/LGKhzEb.gif
but in reality I can’t find hair like that on Roach, LOL. I had another gif that showed a side-by-side comparison of Roach’s tail as well but I can’t find it anymore. It was much smoother before; now I don’t think it even uses HairWorks anymore. Bears look horrible with next to no fur, and the rest of the animals just use fur instead of full hair strands like you’re trying to suggest. Only two characters (Geralt and Ciri) use proper Lara-like hair, and even then Ciri has just a few moving hair parts. Now, if HairWorks is a PC-only tech, then why was it severely reduced in quality compared to the trailers? Because no sensible GPU out there can handle it. It’s far better to stick with one character than to show off higher versions in trailers and then reduce them.
And show me one character from The Witcher 3 whose hairs look as good and detailed as these, if you can’t show then don’t bother replying.
https://www.youtube.com/watch?v=IlUxKO5zYOA&feature=youtu.be&t=161
Yes, and only one developer uses TressFX LOL, which happens to be a multiplat dev that implements it on consoles as well. NVIDIA effects are for PC, not some pleb effect made for consoles too.
Better stick with one good dev rather than several half a$$ed devs. AAA games coming from Square Enix are shining on PC as opposed to trash like Batman Arkham Knight and Assassin’s Creed Unity. Call of Duty Ghosts was a huge fps hog with HairWorks while FarCry 4 had terrible LOD issues with HairWorks causing fps drops even with animals at long distance, only CDPR managed to implement it properly but with huge performance cost even with Geralt alone.
So we have one good HairWorks game and one good TressFX game with another two TressFX titles coming soon, Lichdom was also a TressFX game but I exclude it since it was an indie title.
And don’t give me that PC vs Console sh*t here, there is nothing wrong if a tech works fine on both PC and consoles and even if I agree to your argument that HairWorks is a PC only tech then why TressFX looks superior to HairWorks ? see some RoTR videos and come back, from movement to shading and reacting to external objects like ice, water etc it just looks better.
You’re funny man, you keep referring to Hairworks to defend TressFX’s 49% performance loss.
And you’re a stupid NVtard who has no idea what he’s saying, first bring me proof of that 49% percent performance loss with TressFX right now with current drivers and then we’ll talk, you’re just taking that number out of your A$$. Additionally see this for your own reference, oh let me guess now you’ll say he’s an AMD fanboy too ?
https://www.youtube.com/watch?v=esm8YzFBPOk
Wrong.
and how is that ? if you’re basing your answer on John’s reply then let me tell you that ideas sound great on papers and tech demos but it’s the real implementation that matters and something that public can use, AMD brought the tech to public and did it very fast unlike Nvidia who held it back for years and only came back to senses once they saw AMD’s work.
Wow, one game supports TressFX and that was back in 2013, hardly a shining example of a ton of developers using it because only Square Enix is. The Witcher 3 uses hair on multiple objects, not just on a main character.
The new Tomb Raider uses TressFX, XB1 exclusivity deal ,AMD picked a right stupid dev there to showcase their TressFX.
As they copy eyefinity (nvsurround)
Not just the shader cache but pretty much the down sampling, dynamic resolution , half refresh rate or 30fps with proper frame pacing etc.
lol
Is there a built-in uninstaller, or do we have to download 3rd-party programs to uninstall it all again?
You’ve never needed to download a 3rd-party program to uninstall AMD (or Nvidia) drivers.
And what do you do to PROPERLY uninstall the previous Catalyst?
Nothing. Just uninstall it from the control panel.
RS Crimson will automatically un-install Catalyst while installing.
I’ve always used AMD’s uninstall utility called “AMD Clean-up Utility”
Thinking of swapping the GPU just to test it.
Same from me. Very clean interface, no issues with FO4 thus far.
Can’t say about games, my HD busted with all my library in it (only downloaded a few retro ones, and some indies, not enough to measure the driver improvements). 🙁
Not that it would help anyway; the GPU I’m using to test is an entry-level one (6570), so my impressions for now remain software only.
Sounds great and all, now let’s test it out. Catalyst CC was quite rubbish, I hope this is better and gives more options.
Cant wait to test them.
Some would still find not having that time to make coffee a con. AMD, WHY HATH YOU FORSAKEN THE COFFEE DRINKERS.
Joking, of course.
Good for you, you could make coffee in 1 second.
Good job AMD! Still need to buy a new GPU…I’m between a GTX970 4GB or AMD R390 8GB…hum…(same price here at Spain, 360€)
R390 is good, but very hot, i mean VERY VERY HOT!
Is Hot? I dig it. 😉 lol
Hot my a**. It’s as hot or even less hot than the 970 nowadays.
-10 fps in AC Unity after that driver on R9 390… wtf!
Is there a way to install it on windows 8? I’m not running 8.1
Well go on and install W8.1 or W10.
Welp, it locks up on my system…Win 10.
This driver made wonders for me. BF4 dx11 increase is from 60 to 90+. Mantle from 90 to 120+.
Witcher 3 is running so smooth. No stuttering at all, even in big cities. 40 fps minimum in cities with most settings on ultra except shadows.
Also the profiles are very good. It auto-detects all my games. I can now overclock my card to 990 with a 70° maximum temp; before I was reaching 80. The fps limiter now works.
This is an amazing driver for AMD.
If you’re using Win 8/10 -> Please go to C:\Program Files\AMD\CNext
and enable administrator privileges for every .exe in it!
As for Now the New Drivers are Very Promising.
+2FPS to +27FPS (depends on Game and API)
Great Job AMD/ATI
What exactly does this do ?
So happy for AMD … put the pressure on nVidia 🙂 What we need is competition.