
NVIDIA GeForce 531.61 WHQL Driver released, fixes issues with The Last of Us Part I, Assassin’s Creed Origins & Hogwarts Legacy

A couple of days ago, NVIDIA released a new driver for its graphics cards. According to the release notes, the NVIDIA GeForce 531.61 WHQL driver adds support for the RTX 4070, and fixes some game issues.

Going into more detail, this new driver resolves some stability issues in Assassin’s Creed Origins. Moreover, it includes the crash fix for The Last of Us Part I that shipped in the previous hotfix driver. Additionally, it fixes a black screen issue that could occur on launch at the shader compilation screen in Hogwarts Legacy.

As always, you can download this new driver from here. Below you can also find its complete changelog.

NVIDIA GeForce 531.61 WHQL Driver Release Notes

Gaming Technology
  • Introduces support for the GeForce RTX 4070.
Fixed Issues
  • [Assassin’s Creed Origins] Game may have stability issues when using 531.18.
  • [The Last of Us Part 1] Game may randomly crash during gameplay on GeForce RTX 30 series GPUs.
  • [Hogwarts Legacy] [Regression] Black Screen/Hang on Launch at Shader Compilation Screen using Driver 531.18.
Known Issues
  • Toggling HDR on and off in-game causes game stability issues when non-native resolution is used. 
  • Monitor may briefly flicker on waking from display sleep if DSR/DLDSR is enabled. 
  • [Halo Wars 2] In-game foliage is larger than normal and displays constant flickering.
  • [GeForce RTX 4090] Watch Dogs 2 may display flickering when staring at the sky.
  • Increase in DPC latency observed in LatencyMon.
  • Applying GeForce Experience Freestyle filters causes games to crash.

17 thoughts on “NVIDIA GeForce 531.61 WHQL Driver released, fixes issues with The Last of Us Part I, Assassin’s Creed Origins & Hogwarts Legacy”

    1. You can likely get 531.58 to work on a 4070 by whitelisting it, a trick we used way back in the day before Nvidia went to the unified driver. Back then every card had its own driver, and with some models you could get a free boost in performance by using the driver for the next model up the stack and whitelisting your lower-tier model so the driver would install. Basically it would just increase the clock (which was a fixed frequency) and change the power profile. Sometimes it would even enable features that were on the chip but just not enabled in the drivers for the cheaper card.

      Ironically, I had to do that shortly after the GTX 1660 Ti came out. A week after I bought it, Ubisoft did a major update of Odyssey, and the only drivers available for the 1660 Ti would constantly crash, so I took the previous driver, whitelisted the 1660 Ti so it would install, and sure enough it worked just fine.
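
      For anyone curious what that whitelisting involves in practice, here is a minimal, hypothetical sketch of scripting the INF edit. The file name, device IDs, and section layout are placeholders rather than values from any specific driver package; the real PCI device ID (the DEV_XXXX part) comes from the card's hardware IDs in Device Manager, and a modified package will also fail signature checks unless driver signature enforcement is disabled.

      # Hypothetical sketch: clone an existing device entry in an NVIDIA driver INF
      # so an unlisted card will still install ("whitelisting"). The file name and
      # device IDs below are placeholders; check the INF itself for the real entries.
      # Note: the INF encoding can vary (some packages use UTF-16); adjust if needed.
      from pathlib import Path

      INF_PATH = Path("nv_dispi.inf")       # placeholder: display INF inside the extracted package
      TEMPLATE_DEV = "AAAA"                 # placeholder: a device ID already listed in the INF
      NEW_DEV = "BBBB"                      # placeholder: device ID of the card to whitelist
      NEW_NAME = "NVIDIA GeForce RTX 4070"  # display name for the new [Strings] entry

      lines = INF_PATH.read_text(encoding="utf-8").splitlines()
      patched = []
      for line in lines:
          patched.append(line)
          # Duplicate every install line or [Strings] entry that references the
          # template device, rewriting the copy for the new device ID.
          if f"DEV_{TEMPLATE_DEV}" in line or f"NVIDIA_DEV.{TEMPLATE_DEV}" in line:
              clone = line.replace(TEMPLATE_DEV, NEW_DEV)
              if clone.lstrip().startswith(f"NVIDIA_DEV.{NEW_DEV}"):
                  # [Strings] entry: swap in the new card's display name as well.
                  clone = clone.split("=")[0] + f'= "{NEW_NAME}"'
              patched.append(clone)

      INF_PATH.with_suffix(".modded.inf").write_text("\n".join(patched) + "\n", encoding="utf-8")
      print("wrote", INF_PATH.with_suffix(".modded.inf"))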

      1. This, and we also used to flash the BIOS meant for better models in hopes of getting a performance boost.

        It worked, but that was a long time ago, lol.

        1. I did that with an RX 5700: I flashed the BIOS from the AIB’s 5700 XT and got a major boost in performance. My nephew still runs it at 1080p with a 3700X, which for him was a major upgrade from an Xbox One, and he forgot all about trying to get a Series X when they came out a couple of months later.

          I still wonder if you could flash a 7900 XT with a 7900 XTX BIOS and get a performance boost. The 7900 XT should be capable of running higher clocks than it does, because as a rule of thumb a chip with fewer CUs can be clocked higher than one with more CUs, since the one with more CUs is thermally limited. Yet at stock the 7900 XT clocks several hundred MHz lower than the 7900 XTX. A 7900 XT with dual BIOS would be an excellent candidate for that experiment, because if it didn’t work you could fall back to the second BIOS and reflash the other one back to stock.

          1. You can’t these days; they check the BIOS version, and if there is a mismatch the card refuses to accept it. Also, flashing tools often report the card as being “Not Supported”. We know why they are doing that ^^

  1. Why the link to Guru3D instead of Nvidia’s official website? The Guru3D link also goes to the wrong driver version.

  2. Would be interesting to see someone make a video on actual VRAM usage between Nvidia and AMD across titles. From what I’ve seen, Nvidia’s compression allows them to use about 1.5-2 gigs less VRAM than AMD in newer games like Resident Evil or TLOU.

    The problem, though, is that Frame Generation uses VRAM, so that VRAM savings is gone.

    This has got to be the worst generation of GPUs ever: 12 GB VRAM cards whose selling points are features that need VRAM, and AMD helping keep prices high while barely cutting prices on last-gen cards that use 100 watts more power for similar rasterization.

    The only affordable card that might be good for people is the 7800 XT, and AMD will probably price it at 650-700, or price it at 600 and make it a paper launch due to small profit margins, so the MSRP won’t exist in practice.

    The 6800 XT should be 500 bucks max right now, not 550. The 6900 XT should be 550, and the 6950 XT can be 600.

    1. I haven’t checked anything after the RX 5700, but as a rule of thumb, with the exact same settings Nvidia cards would use about 10% less memory than AMD.

      You can gain even more by setting “Texture Filtering” (which is really the compression level of the textures) to Performance, with little difference in quality if they are already high-quality textures. That will gain you another 5%-10%, depending on how much of the memory is textures.

  3. The base game now works much better, but the Left Behind DLC crashes like a MF. And why didn’t they integrate it into the story when they did the remake, like the HBO show did?
