As promised, NVIDIA has released a new driver for its graphics cards. According to its release notes, this latest GeForce Game Ready driver provides support for Maxwell’s new Multi-Frame Sampled Anti-Aliasing (MFAA) mode.
In addition, this Game Ready WHQL driver ensures you’ll have the best possible gaming experience for Far Cry 4, Dragon Age: Inquisition, The Crew, and World of Warcraft: Warlords of Draenor.
Those interested can download this new driver from here.
Do note that there is no MFAA support for Kepler-based graphics cards.
Enjoy!
New in Release 344.75:
The latest GeForce Game Ready driver, release 344.75 WHQL, provides support for Maxwell’s new Multi-Frame Sampled Anti-Aliasing (MFAA) mode. In addition, this Game Ready WHQL driver ensures you’ll have the best possible gaming experience for Far Cry 4, Dragon Age: Inquisition, The Crew, and World of Warcraft: Warlords of Draenor.
Game Ready
Best gaming experience for Far Cry 4, Dragon Age: Inquisition, The Crew, and World of Warcraft: Warlords of Draenor

Gaming Technology
Supports Multi-Frame Sampled Anti-Aliasing (MFAA) mode. (No MFAA support for Kepler-based graphics cards)

MFAA Supported Games
Assassin’s Creed IV Black Flag
Assassin’s Creed: Unity
Battlefield 4
Crysis 3
Civilization V
Civilization: Beyond Earth
DiRT Showdown
DiRT 3
GRID Autosport
F1 2013
F1 2014
Far Cry 3
Far Cry: Blood Dragon
GRID 2
Wargame: European Escalation
Hitman: Absolution
Just Cause 2
Saints Row IV
Splinter Cell: Blacklist
Titanfall

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers SNES to be one of the best consoles. Still, the PC platform won him over consoles. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, Voodoo 2. John has also written a higher degree thesis on the “The Evolution of PC graphics cards.”
I like how far cry 3 has support but not 4
Another hardware anti-aliasing method from NVIDIA and on top of that, it’s only supported by Maxwell, christ NVIDIA.
I’m going to check it out now 🙂
LoL, didn’t you use to be a big nVidia fanboy? Based on your previous comments, you always defended them in AMD/nVidia arguments. Just saying.
No I didn’t, I used to criticise NVIDIA; it’s not my fault people tagged me as an NVIDIA fanboy for views. I don’t give a sh*t anyway, I have all AMD now and buy what’s best for me.
It’s good that you aren’t a fanboy then. I myself made the jump to AMD when I got myself an R9 270X and I couldn’t be happier. At the end of the day, I think people should just buy a card that gives them the best performance for the price, irrespective of whether it’s nVidia or AMD.
So what’s the point of buying a new GPU without any new features?
http://www.geforce.co.uk/hardware/technology/mfaa/technology
“MFAA is still in development, but once finished it will improve frame rates and image quality in traditional games, as well as in Virtual Reality titles, giving Maxwell owners a superior experience that cannot be found elsewhere.”
Well designed with virtual reality in mind. That’s something interesting.
Well, Win7 doesn’t get DX12 either, so we will be forced to upgrade, or AMD will save us with Mantle.
TXAA was supported by Kepler, don’t be such a cry baby. It’s new tech, move along with it. Besides, MFAA does not look like garbage next to TXAA.
When you whine like this it makes you sound like a new age gamer… aka somebody who has only been dealing with this for 5 years or less.
Grow up man.
I am grown up; it’s you that needs to grow up dealing with this subject at hand.
Enjoy MFAA on your 750 Ti, oh wait, you can’t.
I own a 970 as well.
Can’t use it on a 750 TI Maxwell, how sad.
Now you are sounding like a little kid… That’s sad… Besides, a 750 Ti is not going to push new games using MSAA anyway… So no point in putting MFAA on it.
Well, it’s a fact. I know you don’t like to hear it, but their Maxwell can’t even support MFAA. I guess NVIDIA will have to sell those people a 950 Ti.
I don’t have a problem with it… you just seem to be making a problem with 1st gen Maxwell, which was the 750/750 Ti. So what if it’s a fact? They made new tech, and the 970/980, and later the 960, will be dealing with it.
And everybody knows the 750/750 Ti were the test runs for Maxwell. They are great cards, and I pushed my 750 Ti to 1450MHz on air.
You don’t care because you choose not to care, since you have both cards and you can’t see past that.
You are just making a problem out of nothing. Besides, what’s the point of MFAA on a card that would not play games at playable frame rates…
That’s a stupid way to look at it, considering MFAA is supposed to perform better than MSAA, yet MSAA is supported by ALL GPUs. MFAA supports old games, and a 750 Ti can play those games with MSAA easy.
And what games have you played on a 750 Ti with MSAA x2 or x4… Either way it’s new tech for 2nd gen Maxwell, dealing with the 900 series, and it has been in the works for a while now. Besides, it was not even ready to begin with when the 900 series came out in September, since it’s only now ready for games.
Why don’t you try it and find out? You’re the one with a 750 Ti. But wait, you can’t get the benefits of MFAA anyway, so you’ll never know.
Yeah, I got a 750 Ti because I wanted to see how far I could push it for fun, next to how games run on Xbox One/PS4.
And like AnandTech says… “One nice thing with MFAA is that it currently ties into the existing MSAA support in games, so there’s no need for extra programming work on the part of the developers (unlike TXAA). Of course there are drawbacks with MSAA, specifically the fact that it doesn’t work with deferred rendering techniques, which is why some games only support FXAA or SSAA for anti-aliasing. MFAA doesn’t do anything for such titles, but that’s no surprise. Considering the performance benefit of MFAA over MSAA and the fact that it can be enabled in the control panel and will automatically work on supported games, I don’t see much problem in simply leaving it enabled for most users.”
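The quote above describes the basic idea well. As a rough illustration (a conceptual sketch only, not NVIDIA’s actual implementation; the sample positions here are made up), alternating two 2-sample patterns across consecutive frames and blending the resolved results covers the same sample positions as a single 4x MSAA frame:

```python
# Conceptual sketch of MFAA's idea (not NVIDIA's actual algorithm):
# alternate the 2-sample MSAA pattern every frame and blend the resolved
# frames, so two frames together cover four distinct sample positions,
# approximating 4x MSAA quality at roughly 2x MSAA cost.

def resolve(sample_offsets, edge_x=0.7):
    """Fraction of samples covered by geometry filling the pixel up to edge_x."""
    inside = [1.0 if x < edge_x else 0.0 for x, _y in sample_offsets]
    return sum(inside) / len(inside)

PATTERN_A = [(0.125, 0.25), (0.625, 0.75)]  # frame N: 2 samples
PATTERN_B = [(0.375, 0.25), (0.875, 0.75)]  # frame N+1: shifted 2 samples
PATTERN_4X = PATTERN_A + PATTERN_B          # reference 4x MSAA pattern

frame_a = resolve(PATTERN_A)                # 1.0 (both samples covered)
frame_b = resolve(PATTERN_B)                # 0.5 (one sample covered)
mfaa_like = (frame_a + frame_b) / 2         # temporal blend of two frames
msaa_4x = resolve(PATTERN_4X)

print(mfaa_like, msaa_4x)                   # both 0.75 for this static edge
```

For a static edge the blend matches the 4x result; in motion the two frames see different geometry, which is where temporal methods can show artifacts.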
Besides, if you follow up on GPUs and you wanted the newest tech, Nvidia users all expected it for the 900 series when Nvidia first announced it. Again, you are just making problems out of thin air.
Well, MSAA does work with deferred rendering games, just DX10/DX11 ones. Far Cry 3 averages 40fps with 2xMSAA, so MFAA would make that better; same with BF4, and they’re deferred rendering games. DX9 Far Cry 3 doesn’t support MSAA for the reason stated above.
Either way it’s new tech that was made for 2nd gen Maxwell, dealing with higher end cards. Well, I am done.
You bought a 750 Ti for bragging rights to PS4/XB1 users; that says a lot.
I like hardware… Been like that since I was a teen, and it was not for bragging rights.
Btw, the Radeon R9 280, 280X, 270, and 270X will not work with FreeSync monitors… So how does that make you feel, knowing that only 290/290X users get it… Oh, and Tonga R9 285…
I don’t really care much about it, but you read it wrong: that’s only the first “wave of cards” to be supported.
Why not care? Smoother gameplay. What’s not to care about? And the cards I listed will not support it…
I’m not in the market for a new monitor; I bought my BenQ about a month ago. :p
BTW
“Currently, AMD’s R9 295X2, R9 290X, R9 290, R9 285, R7 260X and R7 260 are the first wave of FreeSync-compatible graphics cards.”
Oh, but what if you wanted it… you are left out… kinda like you saying I would be left out of MFAA on my 750 Ti if I did not own a 970… But one would think AMD would have given all its 200 series cards support for it, since the other cards came out 1st…
See, point being, AMD/Nvidia give certain cards tech to sell cards; that’s business.
As far as I know, all those cards came out after the R9 280. The R9 285 is a mix of old and new AMD tech.
The 290 came out in September last year… the 280 came out March 4th, 2014…
Fair enough, but personally it’s not something I bought an R9 280 for, so yes, other R9 280 users will probably be disappointed in that respect.
Just like 750/750 Ti users did not buy them for MFAA, since Nvidia said ahead of time that MFAA was being designed for the 900 series 2nd gen Maxwell. They got them to play games a bit higher than console lvl, since those cards are considered mid range cards, not higher end or enthusiast lvl.
I’m glad you could find a defence for MFAA not being on the 750 Ti, referring to AMD. If you can’t defend your point, use the opposition to do it for you. :p
I am defending it because it’s freaking true… but hey, at least 750/750 Tis can run on G-Sync monitors… eh eh eh. Besides, like I said, NEW TECH, NEW CARDS. That’s how it is with both AMD and Nvidia. Been like that since I first started building my own game rigs using ATI/3DFX cards.
Besides, my wife does not bother with the problems you are making with the 750 Ti rig, since she plays Guild Wars 2 on it on the highest settings and gets 60fps, not that she cares about that… 🙂
All the 750 Ti was released for was to counter the consoles; it’s just a marketing card, a black sheep, a mutt of the Maxwell series.
So? And guess what, it proved for just a 50 series card that it’s pretty Grrrrrrrrrrrrrrrrrrrreat! Even Titanfall on it with an i5 makes the Xbox One cry. Same goes with BF4 vs PS4.
Yeah, except you have to pay more for an Intel CPU than the card itself, LOL.
That’s not the point… since an i5 is more than just a gaming CPU. Even an i3 with a 750 Ti will work well.
Of course it does, because devs don’t multi-thread properly, so the i3’s single-thread looks great, just like in AC Unity and Far Cry 4.
Well, I am having no problems with Far Cry 4 on my i7 4770K.
lol, why would you? The game stutters anyway; even Total Biscuit has bad stuttering on his monster rig, GTX 980 SLI.
The game does not have an SLI profile…
The stutter has nothing to do with not having an SLI profile; the game just stutters in general, like Far Cry 3 did. FC4 stutters at 100fps on his card.
The stutter comes from needing a patch, I would assume… I go from an avg of 60/65 on Ultra to a min of 20 I’ve seen. Either way, CPU load and GPU load are fine. Just seems like a funky bug, and general gameplay is just fine, but something is eating up performance. Prolly fixed in the next patch.
Yeah, I’m happy with the performance in FC4 too; it’s just the stuttering. I put up a video in the FC4 HairWorks article, it just needs to be approved.
Yeah, it will be interesting to see how the patch goes. I say 60/65fps is an avg because of the crazy spikes: I get 73fps, all the way to 105fps, and sometimes it wants to stay around the mid 80s, eh eh… Then crazy dips to as low as 20, since it came out.
The Very High pre-set seems to work best for me and works around that blur bug, since custom seems to make the gun blurry. I can keep 60fps+ more often, but that damn stutter every so often, FFS.
Remember how hard it was to pull his head out of… when we talked about Unity? Now you will need 1000x more horsepower to do the same thing when it comes to Nvidia.
In other words, just forget about it, at least until we’ll see a proper comparison.
Only Maxwell has the hardware to support MFAA…
2nd gen Maxwell, aka the 970/980 cards, to be exact. Which does not shock me, since it’s AA built for higher end GPUs and not 750/750 Ti GPUs.
Enjoy your locked in hardware AA.
I’m good with the old and good MSAA 🙂 Can’t stand the jaggies… but can’t stand the blur of FXAA and TXAA.
Glorious NVIDIA!
At first I thought that you were a console peasant… I got it wrong… you’re just a butthurt AMD user!
Check out PC Perspective’s article on MFAA. Underwhelming right now and only benefits a handful of games. I am sure it will get better and also be available to Kepler owners soon enough.
Geee, how many more AA methods do we still need?
One good one.:)
Not more, but better ones with a smaller performance impact. They say MFAA is about the same as or better than MSAA, but it doesn’t impact the fps so much.
But what about the image blur? Cuz TXAA and FXAA are a blurry mess.
TXAA is just meh… Some FXAA implementations are good for their performance cost. The blur isn’t bad and most people wouldn’t even notice, but the AA itself seems rather good. I can run BF4 with 4xMSAA but it just doesn’t get the job done, so I use a low amount of FXAA also. Then there is SMAA: no blur and a relatively good result with a very small fps impact.
The snobbish hatred of FXAA annoys me. Early implementations were poor, but it’s really good now and I like that it works on transparent objects as well.
Yeah, the FXAA in games like BF4 and Crysis 3 is great. It’s horrible in Skyrim, and people tend to think FXAA is that.
TXAA was a flop so now we have MFAA.
MFAA is temporal AA, the same as the ubersampling we used to have in drivers (ATI’s temporal AA in 2004! and ubersampling was used in old DX versions), which was then replaced (for obvious drawbacks) by newer CSAA / EQAA / SSAA / SGSSAA etc.
Temporal AA needed 60FPS to run well, and even then there were artifacts when moving. Ubersampling (DSR) has the same performance impact as running the game at a native 2x-4x higher resolution.
Those are not new (but old and re-mastered) AA techniques!
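On the cost point: rendering at k× the linear resolution shades k² as many pixels, which is why DSR-style downsampling costs about the same as natively running at that higher resolution. A quick back-of-the-envelope check (the resolutions are just an example):

```python
# Why ubersampling / downsampling (DSR-style) costs the same as running at a
# higher native resolution: the GPU shades every pixel of the internal
# render target before scaling the image back down to the display size.
def shaded_pixels(width, height, linear_scale):
    return round(width * linear_scale) * round(height * linear_scale)

native = shaded_pixels(1920, 1080, 1.0)  # 2,073,600 pixels at 1080p
dsr_4x = shaded_pixels(1920, 1080, 2.0)  # 2x per axis -> 4x the pixels

print(dsr_4x / native)  # -> 4.0
```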
Is it really the same?
https://www.youtube.com/watch?v=Nef6yWYu0-I&feature=youtu.be
If the performance is as good as they say in the video above, that’s great. Was the older temporal AA’s performance impact that small?
mfaa – motherfukin aa
:))
Another heavily promoted flop AA technique just like TXAA.
Nope lol
Notice how you people with 900 series are defending it, oh that’s right, because it’s only available to you.
It was the same way when TXAA was introduced, when Kepler hit the market.
Doesn’t make it any less terrible, to the point that NVIDIA have to pay devs to use it, because hardware AA is terrible with deferred rendering engines today. Well, actually, it always was and still is. Software AA was supposed to break that, and does, but it needs to be better.
It’s all TWIMTBP games. And not all games with TXAA are bad; Warhammer, for example, is really good. It’s all up to developers and how they use it, like anything that comes in the form of an SDK.
You say that, but it’s only half true: devs can’t fully control hardware AA, whereas with software AA they can.
No half truth; they are given everything they need for the SDK to run on 600 series cards up to what is currently on the market now.
Doesn’t matter; the software is not low level enough, it has to go through DirectX.
That has nothing to do with it. That would be like saying devs don’t have everything for AMD’s HDAO for games because it’s hardware AO…
You don’t understand the difference between software and hardware AA. Hardware AA has to go through DirectX, still has to be optimised for different GPUs, and still has to run through a GPU’s fast render path; software AA doesn’t need to do that.
You don’t understand my main point! They can fully control TXAA, just like MFAA, for their games and how well it’s used. And Nvidia just uses its drivers to boost performance.
Which is limited by DirectX and Direct3D, which as you know are not very close to the metal.
Well, it sure does not make much of a difference in performance. I’ve been testing out 4xMFAA with ACU on Very High settings and getting around 45fps, and when I use 4xMSAA I get around 34/36fps. Not to mention MFAA can also do transparency and some other things MSAA can’t, so it is superior in a lot of ways.
Nvidia did very well with it, and I hope support for it grows in the future; with DX12 on the way it will eliminate what you see as a problem with DX11.
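Taking the frame rates quoted above at face value (roughly 45fps with 4xMFAA vs 35fps with 4xMSAA in ACU), the difference is easier to appreciate in per-frame render time:

```python
# Converting the frame rates quoted above (~45fps with 4xMFAA vs ~35fps
# with 4xMSAA) into per-frame render times makes the saving concrete.
def frame_time_ms(fps):
    return 1000.0 / fps

msaa_ms = frame_time_ms(35)   # ~28.6 ms per frame with 4xMSAA
mfaa_ms = frame_time_ms(45)   # ~22.2 ms per frame with 4xMFAA
saved = msaa_ms - mfaa_ms     # ~6.3 ms shaved off every frame

print(round(saved, 1))  # -> 6.3
```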
MSAA can do transparency as well. All NVIDIA are doing is trying to find a hardware solution to aliasing with deferred rendering and cut the performance cost of MSAA. They stopped with software and simply didn’t advance FXAA any more; now they have two different hardware AA methods, both of which only work on their GPUs, and MFAA only works on Maxwell.
With MSAA, devs have to implement “order independent transparency with MSAA”; just enabling MSAA screws up the transparency, as the layered pixels are resolved x times, equal to the number of sample levels. MFAA does not have that problem for devs.
Either way it’s a great AA and I am not complaining as a customer. And that is what matters in the end.
If you had a 700 series you’d complain.
Even if I only had a 780 Ti I would not complain, because I would wait for the 1000 series. I don’t upgrade every year all the time.
Good job Nvidia!
But still FC4 need some serious patching!
MFAA is the best AA out there and the first AA that focuses on more quality while being less demanding!
Awesome, Nvidia! Those are just a few of the reasons that you are NR1 in the PC gaming industry!
Well, you can’t even use it, so why are you shouting about it? LOL. Looks like you’ll have to stick with the second best, blurry TXAA.
I thought TXAA was very good in Origins, even did some very close-up image checks and it was not “blurry” at all. The image looks softer, but so does down-sampling.
It did a good job in Watch Dogs in regard to getting rid of nasty aliasing around windows and such; 2x is awful, 4x seems to be sharper.
Yeah, I think I used 4x. It was better than FXAA for sure when it came to small details like power lines and stuff in the distance that faintly disappeared with FXAA.
Anything is better than FXAA LOL :p
I still don’t think it’s that bad, I look at up close shots and see no degradation to texture quality in games.
What I really noticed was the lack of “shimmer” along edges. That’s what TXAA was really about as I recall, and it did a good job imo. I hate the “crawlies” you get along aliased edges >.<
Yep but it’s absolutely awful in Neverwinter, so blurry.
Ah, never played it. I actually find in-game AA and AF tends to be crappier than just forcing it yourself a lot of the time. I’m still shocked that a lot of games like Watch Dogs don’t have in-game texture filtering. That game looked like garbage without it. All those bloated textures filling your vram, and they look like crap…
I just override in the control panel, and since I’ve had my R9 280 I’ve noticed image quality is just sharper overall. One thing I have noticed, though: enabling high quality filtering and AF x16 lowers the frame-rate. I can get at least 10fps more with performance mode on rather than high quality, but high quality looks damn good.
TXAA, when done right, is not blurry. Most devs add it in at the last second and never do anything to patch it.
Lies.
Crysis 3 says otherwise, and that’s an AMD game XD
Nothing wrong with that; it just shows that AMD don’t GIMP the competition. What a great AA method where devs put it in at the last second, yet SMAA temporal beats it, and CryEngine and UE4 have their own temporal AA method which works on all GPUs.
It’s in the game, that’s all that matters, and it does very well. And UE4 is pretty much the most universal engine on the planet now. Not to mention Nvidia’s GameWorks is perfect for it, with Epic/Nvidia working together. As well as its Linux support, which is interesting… an OpenGL 5.0 API for the future, maybe? Guess that depends on whether DX12 is a flop or not.
TXAA looks like sh*t in Crysis 3, it’s just an AMD logo game for some reason, like Bioshock Infinite, NVIDIA beats AMD in benchmarks in that. In the end the fastest GPU should win, not one that’s only optimised for or tried to make the other look worse with crazy tessellation and unoptimised tech for any other GPU than NVIDIA.
Dirt Showdown runs better on AMD GPUs because of it’s superior Compute performance, not because the devs didn’t optimise it for NVIDIA.
lol, take it you never played Crysis 3 with TXAA… oh well
HardOCP:
Crysis 3
“However, MSAA just doesn’t have what it takes to reduce specular and shader aliasing. This is where shader based antialiasing comes in. TXAA, avoid it, and you’ll be happier.”
SC Blacklist:
“We’d prefer to have No AA on Alpha Textures, than blurring the heck out of these with FXAA or TXAA. Just like Crysis 3, we can’t recommend TXAA due to the loss in texture quality. TXAA therefore, is moot.”
Oh, you looked that up on Google… sad to say it was fixed. Btw, try playing games and not just reading about them… it’s more fun that way XD
Well, perhaps you should do an article about it and send it to those people, and keep defending it with BS. Yes, I used TXAA with my GTX 660; I know how blurry it is.
Well, sounds like it’s been a while, because it looks nice now XD
I’ve only had this R9 280 for just over a month and you claim they’ve fixed TXAA, LOL.
yeah TXAA 4XT 1 does not look bad at all.
Edit: but one of the best examples of TXAA when done right would have to be Titanfall.
Ever heard of upgrades? That’s why I got a PC and not consoles, lol!
“Best gaming experience for Dragon Age: Inquisition”
Not even a single frame improved; with or without it the game runs the same.
5/10fps difference during parts of the game so far… at least on my 970 at 1440p.
Dragon Age runs pretty good even without the driver; for me the difference before and after the driver was none. 45fps to 60fps on Ultra in gameplay (cutscenes locked to 30; I unlocked them, but I saw 17fps twice in some cutscenes). But my problem is the hitching: every few hundred meters I get one, and in cutscenes when the camera changes. Some textures are really beautiful. Like always, EA servers are F’ed at launch, so I couldn’t import my saves from the Keep.
sorry for gamepad:
Don’t apologize for using a controller, having options is what PC gaming is all about imo
Don’t be sorry for a gamepad; most newer games are being designed for gamepads anyway. Pretty much the only things you need a keyboard and mouse for anymore on PC these days are MMOs, FPS, and RTS games, when they are, cough cough, made these days.
True. DAI is perfect with KB&M, but I’m 32yo; I can’t sit for 8 hours+ behind my computer desk.
Great performance scaling in Far Cry 4 with SLI, both cards are above 90% usage.
Yes, it is absolutely the same thing, with even the very same drawbacks and a very similar performance cost. ATI’s temporal AA from 2004.
I switched to Nvidia a year ago and I haven’t regretted it at all. So many more options built into their drivers and software. The built in downsampling has been wonderful, it’s a great way to test performance on higher-end monitors (eg 1440p/4K), and it gives me great image quality in games where I can downsample.
Really shocked to see certain games like Skyrim running incredibly well at 4K, too.
And now this neat MFAA to check out. So many toys.
It’s either SMAA / MSAA or SSAA / Downsampling. Everything else is garbage.
Says the pro-AMD guy… MFAA is pretty amazing so far. And as long as developers use it right and don’t rush it into their games, it will be used for years to come.
Pro-AMD guy, lol. I don’t care who made it; I don’t need it. TXAA was garbage, and this will be garbage too.
Shame for you; I am enjoying it on my 970. Besides, saying “was” implies the past… TXAA is still being used, and it’s not garbage when more developers are using it right.
I thought you came to your senses after all, but you continue to sound like a shill. TXAA is bullsh*t. If you need performance antialiasing you use SMAA / MSAA, and if you need proper antialiasing you use SSAA / downsampling.
Why would anyone want to use MFAA instead of supersampling / downsampling if he has enough power to enable it?
Yeah, “shill”, an overused word like optimization and bottlenecking… If you never used TXAA in Titanfall, don’t speak. It’s a nice AA when used right by devs, but the majority of the time I use MSAA, and now MFAA, over everything.
If you think that MSAA / MFAA is better than SGSSAA / downsampling then you’re even more clueless than I thought.
Did I say it was better? Did I even mention SGSSAA?… I am 34 years old; you just sound like a punk teenager, and I am sure I’ve been building my own gaming PCs since before you were even thought of. BTW, SGSSAA can be better or worse than OGSSAA depending on the game. In some games it can be blurry while OGSSAA isn’t, and vice versa. It works on pretty much any graphics engine regardless of API.
And you say things like “then you’re even more clueless than I thought.” Come on, you sound like the clueless one, just naming off AA options that you probably don’t even use yourself. And just so you know, SGSSAA does produce blur in some games, just like OGSSAA does in the x-x-x and xS modes. And once again, it has nothing to do with LOD. The xS modes have an auto LOD adjustment, but that doesn’t help you when a game is producing a blurry image due to a conflicting post-processing shader.
At least you can use google. That’s a plus, I guess.