NVIDIA has released a new WHQL driver for its graphics cards. The GeForce 347.25 WHQL driver expands Multi-Frame Sampled Anti-Aliasing (MFAA) support to DX10/DX11 games, and is described as the ideal driver for Dying Light (which releases next week).
Those interested can download these new drivers from here.
And here are the full release notes for NVIDIA’s latest driver:
GeForce Game Ready Driver
The new GeForce Game Ready driver, release 347.25 WHQL, is aligned with today’s launch of the new GeForce GTX 960. This new GPU brings the features, power, and efficiency of the second-generation Maxwell architecture to a more mainstream gaming audience. One of the most popular features of the Maxwell architecture is Multi-Frame Sampled Anti-Aliasing (MFAA) support. With this latest release, MFAA support is extended to nearly every DX10 and DX11 title. In addition, this Game Ready WHQL driver ensures you’ll have the best possible gaming experience for the latest new blockbuster titles, including Dying Light.
Game Ready
Best gaming experience for Dying Light
New GeForce GPU
Supports the new GeForce GTX 960 GPU, based upon the second-generation Maxwell architecture
Gaming Technology
Expanded Multi-Frame Sampled Anti-Aliasing (MFAA) support for DX10/DX11 games

This is why I respect Nvidia: they always keep their drivers up to date and always deliver on time. AMD, on the other hand, has always been lackluster in the driver department. Here’s hoping AMD somehow doesn’t mess it up with the new DX12 drivers.
I know, expect hate, but I’m still waiting for an SMAA or FXAA injector in AMD drivers, or adaptive v-sync. Still waiting, and just wondering how much longer they will let Nvidia be the only one doing these things.
You make a good point about adaptive v-sync, which I find to be increasingly necessary as lazy ports (mostly Ubisoft) have problems with proper v-sync implementation. It was unfortunate that the AMD CCC did not have adaptive v-sync built in, but RadeonPro solved this problem for a long time. However, RadeonPro hasn’t been updated for about two years, and I was having all kinds of problems with AMD drivers on my Radeon HD 7950. I recently upgraded to a 980 and it’s good to have adaptive v-sync actually supported at the driver level; I seriously can’t understand why AMD haven’t officially implemented something similar.
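For anyone unfamiliar with the feature, adaptive v-sync boils down to a per-frame decision the driver makes. The sketch below is purely illustrative (a hypothetical helper, not NVIDIA’s actual driver code) but captures the idea:

    // Illustrative only: sync to the display while the GPU keeps up with the
    // refresh rate, otherwise present immediately instead of dropping to half
    // the refresh rate the way plain v-sync would.
    #include <chrono>

    int ChooseSyncInterval(std::chrono::duration<double> lastFrameTime, double refreshHz)
    {
        const double refreshInterval = 1.0 / refreshHz;  // e.g. ~16.7 ms at 60 Hz
        // Frame finished in time: present with v-sync (interval 1).
        // Frame missed the deadline: present without waiting (interval 0).
        return lastFrameTime.count() <= refreshInterval ? 1 : 0;
    }

    // Hypothetical usage with a D3D11 swap chain (swapChain and frameTime assumed):
    //   swapChain->Present(ChooseSyncInterval(frameTime, 60.0), 0);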
The only reason we are getting DX12 next year is AMD, since they made Mantle (so not giving them credit for that is very low). Plus, MFAA is a feature AMD (ATI) had a decade ago (2004-05); they called it Temporal AA, abandoned it a few years later and made EQAA instead (similar to CSAA, coverage sampling AA, from Nvidia). While MFAA / Temporal AA is interesting in theory, practical use brings problems in many engines while in motion, because the sample pattern alternates on every other frame. Nvidia made this a bit more interesting with a framework where you can set up the samples, which really doesn’t solve the major problem this technique has in the first place. CSAA/EQAA have the very same performance benefit and do not suffer from those issues. There are only a few games that support MFAA at the moment, so saying they are always up to date is kind of funny, especially since it comes after AMD’s Omega release, which brought a bunch of features that actually bring something new to the table (HSA, FreeSync, etc.).
Plus, we already have so many AA techniques in drivers from both vendors that bring more… How many AA modes do you actually use? The future is definitely in post-process AA like SMAA, which is pretty much superior in every way, especially at its higher levels.
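To make the objection above concrete: Temporal AA / MFAA alternates the sample pattern every frame and relies on combining consecutive frames, which is exactly why it breaks down in motion. A purely conceptual sketch follows (the real work happens inside the driver/GPU, and the pattern values here are made up for illustration):

    #include <array>
    #include <cstdint>

    struct Sample { float x, y; };   // sub-pixel sample offsets

    // Pattern A on even frames, a shifted pattern B on odd frames.
    std::array<Sample, 2> SamplePattern(uint64_t frameIndex)
    {
        if (frameIndex % 2 == 0)
            return {{{ 0.25f, 0.25f }, { 0.75f, 0.75f }}};
        return {{{ 0.75f, 0.25f }, { 0.25f, 0.75f }}};
    }

    // Combining the current 2x resolve with the previous frame's 2x resolve
    // approximates 4x coverage -- but only for pixels that are not moving,
    // which is the weakness pointed out above.
    float TemporalResolve(float current2xResolve, float previous2xResolve)
    {
        return 0.5f * (current2xResolve + previous2xResolve);
    }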
No. The reason we are getting DX12 next year is that Win10 needs to be fully finished. MS is not going to bring out DX12 separately, so that nobody will have a reason not to get Win10.
DX12 was being worked on well before AMD even spoke of Mantle… I mean, come on, really dude. DX12 is pretty much done, since right now AMD’s 285 has the full hardware feature set for D3D12, as do Nvidia’s 980, 970 and 960 GPUs.
And for games, you mention CSAA/EQAA, but guess what… not even Crysis, or the most optimized games with the most AA options, uses it in its settings.
Stop sounding like AMD is paying you per word please.
Really? Then why didn’t we hear any rumor about DX12 or any indication from Microsoft itself? I remember when DX11 was being made, Microsoft revealed it a long time in advance. The industry is very fast at getting rumors out.
Also, it’s surprising to see Microsoft developing something that works exactly like Mantle. Oh, it must be a coincidence for you, please…
The only reason Microsoft got back to PC is that it came under pressure from Valve’s Steamboxes and AMD’s Mantle. Otherwise, I don’t see any reason why Microsoft would take such an active interest in PC, especially when the Xbox One is so young in its life; all their focus should be on the Xbox One, like always.
It’s funny you call others fanboys when you wrote this yourself in one of your later posts:
“I am a huge Nvidia fan these days”.
You guys are pretty funny if you think it’s possible to create a new DirectX API in such a short time that it could be a response to anything. It’s been in development for years.
Then maybe they took something from Mantle; Microsoft is known to pay huge sums to buy technologies, or parts of them. The thing is, we heard absolutely nothing about DX12 until Mantle was shown, and it does exactly the same thing to free up the CPU. Funny are those who don’t accept facts.
Because it takes time… and Nvidia and AMD both knew of it. If MS was not making DX12, then guess what… Nvidia would have made its own low-level API, plain and simple, to rival AMD.
All Mantle was… was a cheap PR stunt where they paid developers to use their API in order to sell AMD hardware… since, guess what, AMD’s 2014 annual results were not that great at all. You can read about it on tech sites…
AMD has a lot of money it needs to make, or it will be bought out by a company like Samsung.
And yeah, you can quote me all you want. I’ve been building my own gaming rigs since 1999… My first GPU was a Voodoo2 that I later ran in SLI… the same SLI that Nvidia purchased from 3DFX… My first AMD card, well, it was under ATI at the time, was a Rage 128 Pro, and my first Nvidia card was a GeForce2 MX 200.
You mean AMD paid devs like EA, Eidos Montreal, Firaxis, Rebellion and Capcom to showcase Mantle? And that they spent a huge sum of money on the development of Mantle even after knowing of the existence of DX12? I think you don’t even know what you’re talking about.
Mantle was a slap in Microsoft’s face: even with such powerful hardware, we are lagging behind in performance due to the limitations of DX11. The cheap PR stunt came from Nvidia, when they fooled their customers by releasing drivers with “Shader Cache”. That’s not a technological advancement like Mantle or DX12.
Once again they fooled their customers with the GTX 970, which has problems allocating its 4 GB of VRAM. Didn’t their cards pass through extensive quality testing? Even in your own posted screenshot of ACU, your VRAM is stuck around 3.5 GB.
And what should I do with your rig-building history? I have also been building rigs since the R100 series, with the ATI Radeon 7000 being my first card. My first NV card was an MSI GeForce FX 5200, then an MSI 6600 GT and then an XFX 8800 GTX; I finally went to an HD 4870 and never tried Nvidia again. Right now I own an R9 290 and was willing to get a GTX 970 before The Witcher 3, but reports of excessive coil whine and the VRAM allocation issue killed my excitement for the card.
We are getting DX12 because Microsoft.
Also, AMD did not in fact cure Ebola.
lol. You think Microsoft saw Mantle and thought “Oh s**t, we better do something”!? DX12 has been in development for years. Stop fanboi-ing.
For years since when? Did they say something in the press that we don’t know about?
M$$$, together with AMD, only recently revealed they were working on the DX12 API.
This “new” API from M$$$ would never have been revealed if AMD hadn’t thrown Mantle onto the market and into games.
If a GPU maker didn’t push things forward, M$$$ together with their Xbone would prevail over PC games and gamers.
Learn to read other sources, not M$$$’s or Nvidia’s brainwashing marketing.
Thanks
DX12 had been in development before Mantle existed; it just was not talked about until AMD decided they would do a low-level API for their graphics cards. That did force MS’s hand to reveal DX12 was coming sooner rather than later. Both Nvidia and AMD had been working with MS on DX12 for quite some time. However, AMD decided on Mantle because DX12 was originally not the low-level API they needed.
Not everything gets leaked to the press. Valve is still good at keeping Half-Life 3 a secret, and so is Blizzard with its products. When something does get leaked, though, Blizzard responds by cancelling the project; Ghost and Titan were both discontinued.
BTW, the only reason Nvidia brought back the obsolete MFAA approach is that CSAA cannot work on Maxwell due to a HW limitation; so much for the drivers you’re talking about. Also, delivering on time? There is no schedule! The only company that had a periodic driver release was, surprise, AMD, with their monthly driver updates. They no longer use that model, though, and just recently presented the new Omega scheme that should bring a huge amount of new features every year. I am not saying one or the other has a better model, because both companies have pretty much the same one, just as both have pretty much the same issues. If you think Nvidia users have no issues, go to the GeForce forum, and then you can do pretty much the same with AMD; there is no clear winner in that department, and AMD is definitely not losing there! So next time, please, if you want to criticize, at least get your facts straight.
Nobody uses CSAA… Nobody… Show me a game that came out in 2014 that uses it… You are sounding like an AMD shill again, and it’s been six weeks since Omega hit the downloads…
We all know you don’t care for anything Nvidia, so why bother trolling Nvidia articles or users on here all the time about it?
The only games I recall that use CSAA are the recent Final Fantasy XIII ports, which are coded in such a way that at least some CSAA has to be on or else the engine can’t render hair and eyelashes properly. Many of the early fixes for these crappy ports involved forcing the driver to turn off all AA, but this had the domino effect of causing other engine problems related to the CSAA implementation. This is all according to Durante, the creator of GeDoSaTo, in this very interesting article:
http://www.pcgamer.com/final-fantasy-xiii-and-xiii-2-port-analysis-durantes-verdict/
So yes, CSAA seems to be more of a liability than anything else.
MSAA vs MFAA comparison:
http://www.pcper.com/image/view/49403?return=node%2F61710
Well, I had to do an ACU test… 1080p, Ultra everything, using 2xSMAA on my 970. And I am getting better frames than people were getting with FXAA when the game was last patched!!!
I’m in a similar position, in ACU before the last patch I was using the game’s 2X TXAA just fine on my 980 and it looked quite good at 60 fps, but then after the patch any form of TXAA tanked the fps. I switched to using the Transparency AA setting from the Nvidia control panel and I get a similar improvement without the frame drop.
Ah yeah, I am just testing out the new MFAA option when using 2xSMAA. Seems to be working well. It will be interesting if they really do bring MFAA support to DX9 games. For some reason TXAA just makes the image look really blurry, like smeared image quality, even next to TXAA’s blur in ACU.
You do know that MFAA only works with/improves MSAA, not SMAA, which is post-process AA?
There is no MSAA support in DX9 with a deferred renderer, so MFAA won’t work either. MFAA is really aimed at DX10/DX11 deferred renderers, where MSAA’s performance cost is so high; it’s pretty irrelevant for older games using a non-deferred renderer.
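For context, MFAA can only kick in when the game itself renders into a multisampled surface. A minimal D3D11 sketch of what that looks like on the application side (assuming an existing ID3D11Device; illustration only, not anything the driver exposes):

    #include <d3d11.h>

    // Create the multisampled render target that driver-level MFAA piggybacks on.
    ID3D11Texture2D* CreateMsaaTarget(ID3D11Device* device, UINT width, UINT height)
    {
        D3D11_TEXTURE2D_DESC desc = {};
        desc.Width              = width;
        desc.Height             = height;
        desc.MipLevels          = 1;
        desc.ArraySize          = 1;
        desc.Format             = DXGI_FORMAT_R8G8B8A8_UNORM;
        desc.SampleDesc.Count   = 4;    // 4x MSAA; MFAA varies the pattern per frame
        desc.SampleDesc.Quality = 0;
        desc.Usage              = D3D11_USAGE_DEFAULT;
        desc.BindFlags          = D3D11_BIND_RENDER_TARGET;

        ID3D11Texture2D* tex = nullptr;
        device->CreateTexture2D(&desc, nullptr, &tex);
        return tex;
    }

    // After rendering, the multisampled image is resolved to a single-sample
    // texture before presentation, e.g.:
    //   context->ResolveSubresource(backBuffer, 0, msaaTarget, 0,
    //                               DXGI_FORMAT_R8G8B8A8_UNORM);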
I am just talking about the video on PC Perspective where the guy from Nvidia talks about bringing MFAA to DX9 games. It was posted a day ago.
He said they are working on bringing MFAA to DX9. No given date.
So uh, where did you enable SMAA for ACU? I don’t see it in Nvidia control panel or ACU settings.
It’s in-game; you don’t need to use the control panel.
Yeah, that’s what I’m asking: where in-game? The only options I see for AA are FXAA, MSAA x2 through x8, and TXAA.
This driver says nothing about SMAA being enabled on DX11 games.
When you use MSAA, it automatically turns on MFAA.
You could actually argue that the new 960 is in fact third-generation Maxwell, due to the earlier release of the 750/Ti. But I guess the architecture is the same, more or less.
I am a huge Nvidia fan these days… but I am not impressed with the 960 at all. What I am impressed with are the new drivers: MFAA on all DX10/11 games that have an SMAA option in them. I am seeing a 5-10 fps difference in performance just from using SMAA x2 or x4. I am gonna do more testing and see if I get more performance in other games. But so far in Dragon Age Inquisition, using the 2xSMAA option with everything else set to Ultra, I am getting around 76 and up to 114 fps on my 970 now, lol.
Side note: Nvidia plans on making MFAA work for DX9 games as well. It will be fun to see how that works out.
Agreed, Nvidia are really on it at the moment, IMO. My 970 is taking everything I throw at it; I can now play Arma 3 maxed out 🙂
Yeah, the new drivers are amazing. And the MFAA option is just the icing on the cake!
Can you please stop using the wrong term?
You keep talking about 2x SMAA, but SMAA is a post-process AA like FXAA, with barely any performance impact, and you’re talking about using it in Dragon Age Inquisition, which doesn’t even have SMAA.
You are confusing it with MSAA (multisample anti-aliasing); that’s what MFAA builds on, and that’s what games need to support natively in order to be able to use MFAA now.
MFAA looks worse.
NVIDIA drivers are a joke in the Metro Last Light benchmark; it’s a micro-stutter mess.
http://i.imgur.com/i8V3qgn.png
I hope it improves the performance of Far Cry 4!!!