NVIDIA header image

NVIDIA will stop supporting 32-bit operating systems after its GeForce 390 drivers

NVIDIA has announced that it will soon end support for 32-bit operating systems. After its upcoming GeForce 390 drivers, the green team will no longer support 32-bit versions of Windows 7, Windows 8/8.1, Windows 10, Linux and FreeBSD.

As NVIDIA noted, driver enhancements, driver optimizations, and operating system features in driver versions after Release 390 will not be incorporated back into Release 390 or earlier versions. However, the green team intends to support critical driver security fixes until January 2019.

To be honest, this makes perfect sense. In case you haven’t noticed, most – if not all – modern-day games require a 64-bit operating system. As such, there is no reason at all to release optimized drivers for newer games on 32-bit operating systems, as these games can’t even run on them.

NVIDIA has not revealed when it will release the GeForce 390 driver. However, since there aren’t any triple-A games coming out in the remaining days of 2017, we can expect to see this driver in 2018.

13 thoughts on “NVIDIA will stop supporting 32-bit operating systems after its GeForce 390 drivers”

  1. What’s the point in using a 32-bit OS these days? Is there anyone who would not be able to use a 64-bit OS? If so, why?

    1. I guess there are still some people running really old PCs, and some running a 32-bit OS for legacy software – but they don’t have to. With some versions of Windows 7 you can run 32-bit XP Mode within Windows 7.

      I think the vast majority of PCs are running 64-bit OSes, so this isn’t really a big deal. They can just stick with older drivers for 32-bit support, I guess.

    1. 32-bit needs to die now – Windows needs to dump 32-bit support next, too.

      I wonder how long it will take till we see 128-bit CPUs and OSes. I wonder if there is already a roadmap, or would it be pointless?

      For my game dev, going to 64-bit was massive. Really, 32-bit is pointless to use at this point, especially for games, and it is just bad practice if you still use it.

      1. 128-bit would be pointless, since 64-bit is already an exponential increase over 32-bit. But you’re right, it’s bad practice to use 32-bit in nearly anything nowadays.

        1. But is that right about 128-bit, though? I know that for memory addressing, 64-bit is at the exabyte level – insanely big – and there is no need for more, but what about raw CPU calculation performance?

          So say if all motherboards, buses and CPUs from now on went 128-bit, wouldn’t that be massive for bandwidth, throughput etc., and for hardcore calculations? Wouldn’t it make some sort of difference? It seems it would.

          1. It’s true that 128-bit would be a MASSIVE increase. HOWEVER, we are barely even close to maxing out the capacity of 64-bit, so going to 128-bit would just be a waste of power and manufacturing costs. Maybe 20 years down the line we will need to switch.
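The address-space arithmetic in the exchange above is easy to verify. A minimal Python sketch (not part of the original discussion) showing why each step up is not a doubling but a squaring of the address space:

```python
# Each extra address bit doubles the addressable space, so 64-bit is not
# "twice" 32-bit but 2**32 times larger.
space_32 = 2 ** 32   # 4 GiB of addressable memory
space_64 = 2 ** 64   # 16 EiB (exbibytes) - the "exabyte level" mentioned above

print(space_32)               # 4294967296 (4 GiB)
print(space_64 // 2 ** 60)    # 16 -> 16 EiB
print(space_64 // space_32)   # 4294967296 -> 64-bit is 2**32 times larger
```

This is why 128-bit addressing has no practical pull: 64-bit already addresses far more memory than any machine ships with.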

    1. DX11 is fine. It was only adopted en masse about 3 years ago, and people are still learning to properly leverage it. And since there are a lot of older-gen GPUs limited to DX11 that are still faster than the base consoles – and games can usually be tweaked to run below console visuals – they don’t need to go anywhere yet.
