
NVIDIA GeForce 344.75 WHQL Driver – Optimal For Latest Triple-A Games, Introduces MFAA

As promised, NVIDIA has released a new driver for its graphics cards. According to its release notes, this latest GeForce Game Ready driver provides support for Maxwell’s new Multi-Frame Sampled Anti-Aliasing (MFAA) mode.

In addition, this Game Ready WHQL driver ensures you’ll have the best possible gaming experience for Far Cry 4, Dragon Age: Inquisition, The Crew, and World of Warcraft: Warlords of Draenor.

Those interested can download this new driver from here.

Do note that there is no MFAA support for Kepler-based graphics cards.
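
In case you’re wondering how MFAA actually works: rather than being a brand-new sampling mode, it alternates the MSAA sample positions from one frame to the next and blends the results temporally, so 4xMFAA targets roughly 4xMSAA image quality at closer to the cost of 2xMSAA. Below is a minimal toy sketch of that idea in Python; the sample positions and the plain two-frame average are illustrative assumptions on our part, not NVIDIA’s actual patterns or filter.

```python
# Toy model of MFAA's core idea: alternate a 2-sample pattern across frames,
# then blend the frames, approximating a single 4-sample pattern.

def coverage(edge_pos, samples):
    """Fraction of sub-pixel sample positions covered by geometry
    spanning the pixel from x = 0 up to x = edge_pos."""
    return sum(1 for s in samples if s < edge_pos) / len(samples)

edge = 0.7                      # geometry covers the pixel up to x = 0.7
pattern_even = [0.25, 0.75]     # 2x pattern on even frames (assumed values)
pattern_odd  = [0.125, 0.625]   # shifted 2x pattern on odd frames (assumed)
pattern_4x   = pattern_even + pattern_odd

frame_n  = coverage(edge, pattern_even)   # 0.5  (2x result, frame N)
frame_n1 = coverage(edge, pattern_odd)    # 1.0  (2x result, frame N+1)
print((frame_n + frame_n1) / 2)           # 0.75 -> MFAA-style temporal blend
print(coverage(edge, pattern_4x))         # 0.75 -> true 4x coverage
```

The blended two-frame result lands on the same coverage value as the full 4-sample pattern while only ever paying for 2 samples per frame, which is the whole trick (and also why fast motion is the natural weak spot of any temporal technique).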

Enjoy!

New in Release 344.75:

The latest GeForce Game Ready driver, release 344.75 WHQL, provides support for Maxwell’s new Multi-Frame Sampled Anti-Aliasing (MFAA) mode. In addition, this Game Ready WHQL driver ensures you’ll have the best possible gaming experience for Far Cry 4, Dragon Age: Inquisition, The Crew, and World of Warcraft: Warlords of Draenor.

Game Ready
Best gaming experience for Far Cry 4, Dragon Age: Inquisition, The Crew, and World of Warcraft: Warlords of Draenor

Gaming Technology
Supports Multi-Frame Sampled Anti-Aliasing (MFAA) mode. (No MFAA support for Kepler-based graphics cards)

MFAA Supported Games
Assassin’s Creed IV Black Flag
Assassin’s Creed: Unity
Battlefield 4
Crysis 3
Civilization V
Civilization: Beyond Earth
DiRT Showdown
DiRT 3
GRID Autosport
F1 2013
F1 2014
Far Cry 3
Far Cry: Blood Dragon
GRID 2
Wargame: European Escalation
Hitman: Absolution
Just Cause 2
Saints Row IV
Splinter Cell: Blacklist
Titanfall

133 thoughts on “NVIDIA GeForce 344.75 WHQL Driver – Optimal For Latest Triple-A Games, Introduces MFAA”

    1. LOL, didn’t you use to be a big nVidia fanboy? Based on your previous comments, you always defended them in AMD/nVidia arguments. Just saying.

      1. No I didn’t; I used to criticise NVIDIA. It’s not my fault people tagged me as an NVIDIA fanboy for views. I don’t give a sh*t anyway; I have all AMD now and buy what’s best for me.

        1. It’s good that you aren’t a fanboy then. I myself made the jump to AMD when I got myself an R9 270X and I couldn’t be happier. At the end of the day, I think people should just buy the card that gives them the best performance for the price, irrespective of whether it’s nVidia or AMD.

    2. TXAA was supported by Kepler, so don’t be such a cry baby. It’s new tech; move along with it. Besides, MFAA does not look like garbage next to TXAA.

      When you whine like this it makes you sound like a new-age gamer… aka somebody who has only been dealing with this for 5 years or less.

          1. Now you are sounding like a little kid. That’s sad… Besides, a 750 Ti is not going to push new games using MSAA anyway, so there’s no point in putting MFAA on it.

          2. Well, it’s a fact. I know you don’t like to hear it, but their first-gen Maxwell can’t even support MFAA. I guess NVIDIA will have to sell those people a 950 Ti.

          3. I don’t have a problem with it… you just seem to be making a problem with 1st gen Maxwell, i.e. the 750/750 Ti. So what if it’s a fact? They made new tech, and the 970/980, and later the 960, will be dealing with it.

            And everybody knows the 750/750 Ti were the test runs for Maxwell. They are great cards, and I pushed my 750 Ti to 1450MHz on air.

          4. You are just making a problem out of nothing. Besides, what’s the point of MFAA on a card that would not play games at playable frame rates anyway?

          5. That’s a stupid way to look at it, considering MFAA is supposed to perform better than MSAA, yet MSAA is supported by ALL GPUs. MFAA supports old games, and a 750 Ti can play those games with MSAA easily.

          6. And what games have you played on a 750 Ti with MSAA x2 or x4? Either way, it’s new tech for 2nd gen Maxwell, i.e. the 900 series, and it has been in the works for a while now. Besides, it was not even ready to begin with when the 900 series came out in September, since it’s only now ready for games.

          7. Why don’t you try it and find out? You’re the one with a 750 Ti. But wait, you can’t get the benefits of MFAA anyway, so you’ll never know.

          8. Yeah, I got a 750 Ti because I wanted to see how far I could push it for fun, next to how games run on the Xbox One/PS4.

            And like AnandTech says: “One nice thing with MFAA is that it currently ties into the existing MSAA support in games, so there’s no need for extra programming work on the part of the developers (unlike TXAA). Of course there are drawbacks with MSAA, specifically the fact that it doesn’t work with deferred rendering techniques, which is why some games only support FXAA or SSAA for anti-aliasing. MFAA doesn’t do anything for such titles, but that’s no surprise. Considering the performance benefit of MFAA over MSAA and the fact that it can be enabled in the control panel and will automatically work on supported games, I don’t see much problem in simply leaving it enabled for most users.”

            Besides, if you follow GPUs and wanted the newest tech, NVIDIA users all expected this of the 900 series when NVIDIA first announced it. Again, you are just making problems out of thin air.

          9. Well, MSAA does work with deferred rendering games, just the DX10/DX11 ones. Far Cry 3 averages 40fps with 2xMSAA, so MFAA would make that better, same with BF4, and they’re deferred rendering games. DX9 Far Cry 3 doesn’t support MSAA for the reason stated above.

          10. Either way, it’s new tech that was made for 2nd gen Maxwell, dealing with higher-end cards. Well, I am done.

          11. I like hardware. Been like that since I was a teen, and it was not for bragging rights.

            BTW, the Radeon R9 280, 280X, 270, and 270X will not work with FreeSync monitors… So how does that make you feel, knowing that only 290/290X users get it? Oh, and Tonga, the R9 285…

          12. Why not care? Smoother gameplay. What’s not to care about there? And the cards I listed will not support it…

          13. I’m not in the market for a new monitor; I bought my BenQ about a month ago. :p

            BTW

            “Currently, AMD’s R9 295X2, R9 290X, R9 290, R9 285, R7 260X and R7 260 are the first wave of FreeSync-compatible graphics cards.”

          14. Oh, but what if you wanted it? You are left out, kinda like you saying I would be left out of MFAA on my 750 Ti if I did not own a 970… But one would think AMD would have given all its 200 series cards support for it, since the other cards came out first…

            See, the point being: AMD/NVIDIA give certain cards tech in order to sell cards. That’s business.

          15. Fair enough, but personally it’s not something I bought an R9 280 for, so yes, other R9 280 users will probably be disappointed in that respect.

          16. Just like 750/750 Ti users did not buy them for MFAA, since NVIDIA said ahead of time that MFAA was being designed for the 900 series, 2nd gen Maxwell. They got them to play games a bit above console level, since those cards are considered mid-range cards, not high-end or enthusiast level.

          17. I’m glad you could find a defence for MFAA not being on the 750 Ti by referring to AMD. If you can’t defend your point, use the opposition to do it for you. :p

          18. I am defending it because it’s freaking true… but hey, at least 750/750 Tis can run on G-Sync monitors… eh eh eh. Besides, like I said, NEW TECH, NEW CARDS. That’s how it is with both AMD and NVIDIA. It’s been like that since I first started building my own game rigs using ATI/3dfx cards.

            Besides, my wife does not bother with the problems you are making up for the 750 Ti rig, since she plays Guild Wars 2 on it on the highest settings and gets 60fps. Not that she cares about that… 🙂

          19. All the 750 Ti was released for was to counter the consoles. It’s just a marketing card, a black sheep, a mutt of the Maxwell series.

          20. So? And guess what, it proved that for just a 50-series card it’s pretty Grrrrreat! Even Titanfall on it with an i5 makes the Xbox One cry. Same goes for BF4 vs the PS4.

          21. That’s not the point… since an i5 is more than just a gaming CPU. Even an i3 with a 750 Ti will work well.

          22. Of course it does, because devs don’t multi-thread properly, so the i3’s single-thread performance looks great, just like in AC Unity and Far Cry 4.

          23. Stutter has nothing to do with not having an SLI profile; the game just stutters in general, like Far Cry 3 did. FC4 stutters at 100fps on his card.

          24. The stutters are from something that needs a patch, I would assume. I’ve seen it go from an average of 60/65 on Ultra to a minimum of 20. Either way, CPU load and GPU load are fine. It just seems like a funky bug; general gameplay is fine, but something is eating up performance. It will probably be fixed in the next patch.

          25. Yeah, I’m happy with the performance in FC4 too; it’s just the stuttering. I put up a video in the FC4 HairWorks article, it just needs to be approved.

          26. Yeah, it will be interesting to see how the patch goes. I say 60/65fps is an average because of the crazy spikes: I get 73fps all the way up to 105fps, and sometimes it wants to stay around the mid 80s, eh eh… then crazy dips to as low as 20, ever since it came out.

          27. The Very High preset seems to work best for me and works around that blur bug, since custom settings seem to make the gun blurry. I can keep 60fps+ more often, but that damn stutter hits every so often, FFS.

        1. Remember how hard it was to pull his head out of… when we talked about Unity? Now you will need 1000x more horsepower to do the same thing when it comes to Nvidia.

          In other words, just forget about it, at least until we see a proper comparison.

      1. 2nd gen Maxwell, aka the 970/980 cards, to be exact. Which does not shock me, since it’s AA built for higher-end GPUs and not the 750/750 Ti.

  1. Check out PC Perspective’s article on MFAA. Underwhelming right now, and it only benefits a handful of games. I am sure it will get better and also become available to Kepler owners soon enough.

    1. Not more, but better, with a smaller performance impact. They say MFAA looks about the same as or better than MSAA but doesn’t impact the fps as much.

        1. TXAA is just meh. Some FXAA implementations are good for their performance cost; the blur isn’t bad, and most people wouldn’t even notice it, while the AA itself seems rather good. I can run BF4 with 4xMSAA, but it just doesn’t get the job done, so I use a low amount of FXAA as well. Then there is SMAA: no blur and a relatively good result with a very small fps impact.

          1. The snobbish hatred of FXAA annoys me. Early implementations were poor, but it’s really good now and I like that it works on transparent objects as well.

          2. Yeah, the FXAA in games like BF4 and Crysis 3 is great. It’s horrible in Skyrim, and people tend to think all FXAA is like that.

    1. MFAA is temporal AA, the same as the ubersampling we used to have in drivers (ATI’s temporal AA in 2004! and ubersampling was used in old DX versions), which were then replaced (for their obvious drawbacks) by newer CSAA / EQAA / SSAA / SGSSAA etc.
      Temporal AA needed 60FPS to run well, and even then there were artifacts when moving. Ubersampling (DSR) has the same performance impact as running the game natively at a 2x-4x higher resolution.
      Those are not new AA techniques, but old and remastered ones!
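
      To put numbers on that last claim, here is a quick hedged Python calc; the pixel counts are exact, while the fps scaling assumes a purely fill-rate-bound game (a simplification, not a benchmark):

```python
# Pixel-count cost of DSR/ubersampling factors relative to native 1080p.
# Assumes frame time scales linearly with pixels rendered (simplified).

native_w, native_h = 1920, 1080
for factor in (2, 4):                    # DSR 2x / 4x = total pixel ratio
    scale = factor ** 0.5                # per-axis scale factor
    w, h = round(native_w * scale), round(native_h * scale)
    print(f"DSR {factor}x: renders {w}x{h}, i.e. {factor}x the pixels,"
          f" so expect very roughly 1/{factor} of the native fps")
```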

          1. Doesn’t make it any less terrible, to the point that NVIDIA have to pay devs to use it. Hardware AA is terrible with deferred rendering engines today; well actually, it always was and still is. Software AA was supposed to break that, and it does, but it needs to be better.

          2. It’s all TWIMTBP games. And not all games with TXAA are bad; Warhammer, for example, is really good. It’s all up to the developers and how they use it, like anything that comes in the form of an SDK.

          3. No, half-truth: they are given everything they need for the SDK to run on everything from 600 series cards to what is currently on the market now.

          4. That has nothing to do with it. That would be like saying devs don’t have everything for AMD’s HDAO in games because it’s hardware AO…

          5. You don’t understand the difference between software and hardware AA. Hardware AA has to go through DirectX, still has to be optimised for different GPUs, and still has to run through a GPU’s fast render path; software AA doesn’t need to do any of that.
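
            To make the distinction concrete, here is a toy Python sketch of the “software AA” idea: a pure post-process pass over the finished image, with no extra geometry samples and no GPU-specific render path. It is a crude edge blur, far simpler than real FXAA/SMAA, and purely illustrative:

```python
# Toy "software AA": post-process the finished frame, blending pixels that
# sit on strong luma edges. No MSAA hardware involved.

def post_process_aa(img):
    """img: 2D list of luma values in [0, 1]; returns a smoothed copy."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # local contrast against the 4 direct neighbours
            n = [img[y-1][x], img[y+1][x], img[y][x-1], img[y][x+1]]
            if max(n) - min(n) > 0.25:             # crude edge threshold
                out[y][x] = 0.5 * img[y][x] + 0.5 * sum(n) / 4
    return out

# a hard vertical edge: left half black, right half white
frame = [[0.0] * 4 + [1.0] * 4 for _ in range(4)]
print(post_process_aa(frame)[2])   # edge pixels are now blended
```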

          6. You don’t understand my main point! They can fully control TXAA, just like MFAA, for their games and how well it’s used. And NVIDIA just uses its drivers to boost performance.

          7. Well, it sure does not make much of a difference in performance. I’ve been testing 4xMFAA with ACU on Very High settings and getting around 45fps, and when I use 4xMSAA I get around 34/36fps. Not to mention MFAA can also do transparency and some other things MSAA can’t, so it is superior in a lot of ways.

            NVIDIA did very well with it, and I hope support for it grows in the future; with DX12 on the way, it will eliminate what you see as a problem with DX11.
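
            (For the record, 45fps over ~35fps works out to roughly a 28% uplift, 45 / 35 ≈ 1.29, which is in the same ballpark as the roughly 30% improvement over 4xMSAA that NVIDIA itself advertised.)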

          8. MSAA can do transparency as well. All NVIDIA are doing is trying to find a hardware solution to aliasing with deferred rendering while cutting MSAA’s performance cost. They stopped with software and simply didn’t advance FXAA any further; now they have two different hardware AA methods, both of which only work on their GPUs, and MFAA only works on Maxwell.

          9. With MSAA, devs have to implement order-independent transparency to make it work; just enabling MSAA screws up the transparency, as the layered pixels are resolved as many times as there are sample levels. MFAA does not have that problem for devs.

            Either way, it’s a great AA and I am not complaining as a customer. And that is what matters in the end.
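
            For anyone following along, the resolve step being argued about looks roughly like this; real hardware stores and blends per colour sample before this average, which is exactly where layered transparency gets into trouble. A toy Python sketch, not any particular GPU’s implementation:

```python
# Toy MSAA resolve: average the N stored colour samples of one pixel.

def msaa_resolve(samples):
    """samples: list of N (r, g, b) colour samples for a single pixel."""
    n = len(samples)
    return tuple(sum(s[c] for s in samples) / n for c in range(3))

# 4x pixel on a triangle edge: three samples hit red geometry,
# one sample hits the white background.
print(msaa_resolve([(1.0, 0.0, 0.0)] * 3 + [(1.0, 1.0, 1.0)]))
# -> (1.0, 0.25, 0.25)
```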

          10. Even if I only had a 780 Ti I would not complain, because I would wait for the 1000 series. I don’t upgrade every single year.

  2. MFAA is the best AA out there and the first AA that focuses on more quality while being less demanding!
    Awesome, NVIDIA. Those are just a few of the reasons you are NR1 in the PC gaming industry!

      1. I thought TXAA was very good in Origins; I even did some very close-up image checks and it was not “blurry” at all. The image looks softer, but so does downsampling.

          1. Yeah, I think I used 4x. It was better than FXAA for sure when it came to small details like power lines and stuff in the distance that faintly disappeared with FXAA.

          2. I still don’t think it’s that bad; I look at up-close shots and see no degradation in texture quality in games.

          3. What I really noticed was the lack of “shimmer” along edges. That’s what TXAA was really about as I recall, and it did a good job imo. I hate the “crawlies” you get along aliased edges >.<

          4. Ah, never played it. I actually find in-game AA and AF tend to be crappier than just forcing them yourself a lot of the time. I’m still shocked that a lot of games, like Watch Dogs, don’t have in-game texture filtering. That game looked like garbage without it. All those bloated textures filling your VRAM, and they look like crap…

          5. I just override it in the control panel, and since I’ve had my R9 280 I’ve noticed image quality is just sharper overall. One thing I have noticed, though: enabling high-quality filtering and AF x16 lowers the frame rate. I can get at least 10fps more with the performance setting rather than high quality, but high quality looks damn good.

      2. TXAA, when done right, is not blurry. Most devs add it in at the last second and never do anything to patch it.

          1. Nothing wrong with that; it just shows that AMD don’t gimp the competition. What a great AA method, where devs put it in at the last second, yet SMAA temporal beats it, and CryEngine and UE4 have their own temporal AA methods which work on all GPUs.

          2. It’s in the game, and that’s all that matters; it does very well. And UE4 is pretty much the most universal engine on the planet now. Not to mention NVIDIA’s GameWorks is perfect for it, with Epic and NVIDIA working together. The Linux support is interesting too… an OpenGL 5.0 API for the future, maybe? I guess that depends on whether DX12 is a flop or not.

          3. TXAA looks like sh*t in Crysis 3, and that’s just an AMD logo game for some reason, like BioShock Infinite, where NVIDIA beats AMD in the benchmarks. In the end the fastest GPU should win, not the one a game was exclusively optimised for, or that makes the other look worse with crazy tessellation and tech unoptimised for anything other than NVIDIA GPUs.

            DiRT Showdown runs better on AMD GPUs because of their superior compute performance, not because the devs didn’t optimise it for NVIDIA.

          4. HardOCP:

            Crysis 3

            “However, MSAA just doesn’t have what it takes to reduce specular and shader aliasing. This is where shader based antialiasing comes in. TXAA, avoid it, and you’ll be happier.”

            SC Blacklist:

            “We’d prefer to have No AA on Alpha Textures, than blurring the heck out of these with FXAA or TXAA. Just like Crysis 3, we can’t recommend TXAA due to the loss in texture quality. TXAA therefore, is moot.”

          5. Oh, you looked that up on Google… sad to say, it was fixed. BTW, try playing games and not just reading about them… it’s more fun that way XD

          6. Well, perhaps you should do an article about it and send it to those people, instead of defending it with BS. Yes, I used TXAA with my GTX 660; I know how blurry it is.

          7. Yeah, TXAA 4XT 1 does not look bad at all.

            Edit: but one of the best examples of TXAA when done right would have to be Titanfall.

  3. “Best gaming experience for Dragon Age: Inquisition”
    Not even a single frame improved; with or without it, the game runs the same.

      1. Dragon Age runs pretty well even without the driver; for me the difference before and after the driver was none. I get 45fps to 60fps on Ultra in gameplay (cutscenes locked to 30; I unlocked them, but I saw 17fps twice in some cutscenes). My problem is the hitching: every few hundred meters I get one, and also in cutscenes when the camera changes. Some textures are really beautiful. As always, EA’s servers are F’ed at launch, so I couldn’t import my saves from the Keep.

        sorry for gamepad:

        1. Don’t be sorry for the gamepad; most newer games are designed for gamepads anyway. Pretty much the only things you still need a keyboard and mouse for on PC these days are MMOs, FPS, and RTS games, when they are, cough cough, even made these days.

  4. Yes, it is absolutely the same thing, with even the very same drawbacks and a very similar performance cost. ATI’s Temporal AA from 2004.

  5. I switched to NVIDIA a year ago and I haven’t regretted it at all. So many more options built into their drivers and software. The built-in downsampling has been wonderful; it’s a great way to test performance at higher-end monitor resolutions (e.g. 1440p/4K), and it gives me great image quality in games where I can downsample.

    Really shocked to see certain games like Skyrim running incredibly well at 4K, too.

    And now this neat MFAA to check out. So many toys.

    1. Says the pro-AMD guy… MFAA is pretty amazing so far. And as long as developers use it right and don’t rush it into their games, it will be used for years to come.

        1. Shame for you. I am enjoying it on my 970. Besides, saying “was” implies the past… TXAA is still being used, and it’s not such garbage now that more developers are using it right.

          1. I thought you had come to your senses after all, but you continue to sound like a shill. TXAA is bullsh*t. If you need performant antialiasing, you use SMAA / MSAA, and if you need proper antialiasing, you use SSAA / downsampling.

            Why would anyone want to use MFAA instead of supersampling / downsampling if he has enough power to enable it?

          2. Yeah, “shill” is an overused word, like “optimization” and “bottlenecking”… If you’ve never used TXAA in Titanfall, don’t speak. It’s a nice AA when used right by devs, but the majority of the time I use MSAA, and now MFAA, over everything.

          3. If you think that MSAA / MFAA is better than SGSSAA / downsampling, then you are even more clueless than I thought.

          4. Did I say it was better? Did I even mention SGSSAA?… I am 34 years old; you just sound like a punk teenager, and I am sure I have been building my own gaming PCs since before you were even thought of. BTW, SGSSAA can be better or worse than OGSSAA depending on the game. In some games it can be blurry while OGSSAA isn’t, and vice versa. It works on pretty much any graphics engine regardless of API.

            And you say things like “then you even more clueless than I thought.” Come on, you sound like the clueless one, just naming off AA options that you probably don’t even use yourself. And just so you know, SGSSAA does produce blur in some games, just like OGSSAA does in the x-x-x and xS modes. And once again, it has nothing to do with LOD. The xS modes have an automatic LOD adjustment, but that doesn’t help when a game produces a blurry image due to a conflicting post-processing shader.
