
Nvidia’s GeForce 337.50 Beta Drivers Are Great But Not Equivalent To AMD’s Mantle Performance Boost

Nvidia today released the latest beta driver for its graphics cards. According to the release notes, the GeForce 337.50 Beta drivers pack key DirectX optimizations that reduce game-loading times and deliver significant performance increases across a wide variety of games. Moreover, these gains were said to be most noticeable in CPU-bound titles, so we decided to put the drivers to the test. The end result pleased us, but the performance boost is not as great as the one we witnessed with AMD’s Mantle.

Our Q9650 was the perfect CPU to test with. While this CPU can still run – without major issues – all of the latest titles, it is also the kind of CPU that sees the highest performance gains from such optimizations. AMD’s Mantle shined on low/mid-end CPUs, so we expected Nvidia’s latest driver to give us a noticeable performance boost.

Well, after testing both GeForce 334.89 and GeForce 337.50, we have to admit that there were noticeable improvements in just a couple of games. Unfortunately, a lot of titles were barely affected by this new driver, suggesting that Nvidia’s optimizations were not made for all DX titles.

Arma 3 and Shadow Warrior (with the Mirrors option enabled; an option that stresses both the CPU and the GPU) did not receive any performance boost at all. Tomb Raider, on the other hand, saw a 2fps improvement. Lost Planet 2’s Test B benchmark improved by 3.5fps, while THIEF’s benchmark saw a minor 4fps improvement.

Battlefield 4 and Sleeping Dogs, however, surprised us with their performance boost. Battlefield 4 saw a 7-10fps increase in scenes where we were CPU-limited, and Sleeping Dogs’ benchmark ran without any stuttering issues whatsoever. Not only that, but the framerate in Sleeping Dogs increased by 7-9fps.

[Benchmark chart: GeForce 337.50 Beta vs GeForce 334.89 Beta]

Do note that Resident Evil 6’s performance was measured in points. The game scored 11.693 points with the GeForce 334.89 Beta drivers and 11.867 points with the GeForce 337.50 Beta drivers.

Unfortunately, we were unable to test Assassin’s Creed IV: Black Flag, as our key was not working (Ubisoft deactivated review codes after the game’s official release).

All in all, while this new driver is better than the previous ones released by the green team, it is not up to what AMD achieved with Mantle. Although there are improvements that will help a lot of gamers in a number of CPU-bound titles, this new driver does not put AMD’s API to shame. Mantle was able to double the performance of AMD’s GPUs when paired with low/mid-tier CPUs, something this driver does not even come close to.

Below you can find the scenes that were tested in Tomb Raider, Shadow Warrior, Arma 3 and Battlefield 4.

[Screenshots of the tested scenes from Arma 3, Battlefield 4, Shadow Warrior and Tomb Raider]

61 thoughts on “Nvidia’s GeForce 337.50 Beta Drivers Are Great But Not Equivalent To AMD’s Mantle Performance Boost”

    1. GAH I should have benched Metro before I installed them to compare. Though I think I uninstalled it, even with an Nvidia card it runs like crap.

      1. Didn’t have a problem with Metro, especially with the 335 driver. The lowest fps I got was 46fps on Max with Advanced PhysX and without SSAA.

    2. A friend of mine told me Crysis 3 runs better for him, but no Fraps numbers were provided; just try it and see if you have a GeForce^^.

    3. Crysis 3 runs UBER better for me. I actually wanna go back and play it all the way through again. lol

        1. 4770k ( Stock)
          16 gigs Mushkin Blackline 2133
          ADATA 256 gig SSD
          1TB WD Caviar Black
          128 gig Kingston V300 SSD
          2 EVGA GTX 770s
          Corsair H100 cooler
          Corsair HX 1050 PSU
          Windows 8.1
          Asus VG248QE @ 144Hz

          1. The gains come from the min FPS not the max. Before I was getting:
            25 min
            75 max

            I get now:
            75 min
            115 max

            The gains come from keeping that min FPS above 60 FPS. Much better stability that way. The max never really counts if it is over 144 FPS. Heck, I get 320 FPS in Warframe. lol After 144 (my monitor’s limit) it doesn’t really matter.

          2. That’s an incredible difference in min frames.

            I have that monitor, I love it, but I don’t think I’ve ever maxed out the frames, not in a modern AAA game, anyway, lol

          3. Yeah that monitor is the best I have ever owned! Let me ask you something, isn’t it hard to convince people how good the monitor is? I do hardware reviews on Youtube and all I get for comments is, “The human eye can’t see above 60 FPS” or “there is no difference between 60Hz and 120Hz.” Bugs the crap out of me. lol

          4. Haha, I know – I think people just fear what they don’t understand (or can’t afford?). Mostly I hear people going on about IPS monitors, but if there’s one thing I don’t care about when I’m playing a game it’s perfect color gamut, lol. Even as an artist I don’t give a shit about how one hue of color is a tiny bit different than the next – we all perceive colors differently, anyway – and if you’re an artist sweating over such a tiny discrepancy then you’re doing it wrong.

            The higher Hz is far more beneficial to the human body, constantly streaming more information to your brain. At 120Hz you’re seeing twice as much picture as at 60Hz – that’s just amazing (and even more at 144Hz).

            Honestly, I don’t know why they even still make 60Hz monitors or TVs. It’s the difference between a laser mouse and a mouse ball in terms of a technological leap. It benefits all around – even if you’re watching a movie that’s only filmed at 24fps, it’s still sending more information to your face.

    4. I did Saints Row 3 earlier. I’ve been playing it and I’m pretty certain the CPU is bottlenecking me; I often got down to 30fps when blasting through the city or in heavy weather. Now it’s never below 40 and generally over 50 or 60. It just FEELS so much better, I love it. I’d guesstimate an average 10fps improvement.

      1. Yup, same here! I’ve been checking, like, EVERY driver since the game’s release; it never worked well enough on my old dual core and GTX 260. Driving in the streets used to mean 50-67% GPU usage and 20fps; now it’s 30fps at 99% usage.
        Glad that’s finally fixed!

      1. True, same issues here; my CPU is way too slow for my VGA but still no fps gain.
        The benchmark ran smoother and I had fewer fps drops though.

    1. Well, it’s DX, and they’ve been working on those drivers for months; it will not get much better. If CPU overhead could be eliminated by driver techniques, then why would they even support DX12?

      1. The issue isn’t just “overhead”, it’s the fact that so many games are not programmed to use more than a couple of hardware threads. I’m not sure you can do a lot about that with driver tweaks or even an API. That’s what’s frustrating about my Phenom II x6, and AMD CPUs in general: half your CPU power is just sitting there unused most of the time because games tend to only use a couple of cores 🙁

    1. +1 nothing more to be said 🙂
      Nvidia DOES put Mantle to shame, as we all know about that spectacle with Mantle, which gives sh*tty image quality just to make it work and gain some more fps, not to mention it takes a piss-poor, weak AMD CPU that’s 10 years old before you can give kudos to AMD. Yeah, right :/
      Everything AMD has done, Nvidia has done MUCH better in every way; that’s just a fact, and those who say otherwise are just butthurt AMD fanboys!

        1. MANTLE doesn’t boost performance by lowering quality, LOL. If there is a problem in one game, it doesn’t mean there is a problem in the API. By the same logic, you’d have to blame the DX API for every graphics corruption in DX games. That is what driver updates and application/game updates are for^^

      1. Mantle is still beta; all that stuff was bugs. And Mantle didn’t give any boost to graphics, which is actually the game developer’s task anyway. What Mantle and the incoming DX12 do is reduce CPU load and draw more power from the hardware, the way console APIs can. I don’t care how much you love Nvidia, but facts are facts.

          1. MANTLE’s advantage is in much, much better CPU utilization (light-years better than any DX11 driver patch can ever bring; by the numbers, up to 10 times better), and the RTS view makes sure you actually see most of those 5,000 ships that are computed, so it is the best representation of the improvement MANTLE gives you. You also see the biggest difference in CPU utilization by using low settings (in that case a 400% improvement). Also, for MANTLE the average FPS is not really relevant, as MANTLE brings a much more stable framerate even if the number is very much the same – no stuttering, no FPS drops.

    2. Actually, the first benchmarks provided by Nvidia were publicity tricks: they made a showdown comparison between APIs with different GPUs, which is obviously unfair. The 780Ti’s performance already surpassed the R9 290 even with Mantle, so even with this ‘new’ DX11 driver it doesn’t prove anything. Now, the benchmark up there is a real showdown. What a trick.

      1. It is AMD who made the claim that Mantle would enable the 290X to surpass the 780Ti. It didn’t, especially after this new driver. Even in the carefully constructed best case for Mantle (i.e. the Star Swarm demo), NVIDIA pulled through with higher scores using only DX11: the 780Ti was anywhere between 20-30% faster than the 290X’s Mantle path, and between 60-80% faster than the 290X’s DX11 path!

      1. i5 3570K at stock speed (3.40GHz) with 8GB RAM. I tested the game in Riverwood. My OS is Win8 64-bit.

  1. What about Rome 2? I’m going to test it myself tomorrow; it’s a CPU-bound title, and I want to know how much this boosted it. A 64% boost for Rome 2 is too much and too good to be true, but in the real world? 1fps? 30fps? I will test it.

  2. Oh man FFS… TRY PLANETSIDE 2… there is nothing… NOTHING heavier on a CPU than that game. Go to a heavy battle and see what happens…

    1. I just tried it and it ran remarkably better than last time I played. It’s hard to compare fairly, though, since your framerate really can go up and down in that game based on what’s going on. Hopefully someone can do some serious comparisons.

      I will say this- I’d pretty much given up on the game *because of performance* after playing a couple times a few weeks ago. Now it’s running fine, 40-60fps and it felt pretty smooth in general. Could just be better servers tonight, though.

    1. Your CPU isn’t weak, mate. Can’t test it myself because I don’t play the game, but it would probably benefit me more, because I’m CPU-bound and my CPU is weak for my GTX 660.

      1. I guess the benchmark in Rome II is somehow screwed, because with those settings the campaign map runs at 50 to 60fps; I tried playing for 5hrs today. Last time I checked it was 33 to 45fps, so the boost is real, just not in the game’s benchmark. In real-time battles I can see a little boost, but it depends on unit counts; with more than 70,000 units it drops to 5fps 😉

        Btw, that “Vegetation Alpha” option is hungry for something. All of those 8 and 9 frame results were down to this option; I turned it off and the average in the benchmark goes up to 48fps.

  3. I did some tests before and after… in most cases I got a 3-5fps boost in games across the board. Star Swarm’s demo was the big winner, which makes sense since Nvidia is trying to show up AMD there.

    Star Swarm (“high”) – 37 to 50 (35% increase)
    Skyrim Markarth – 32 to 35
    Skyrim Windhelm Stables – 76 to 80
    Call of Pripyat benchmark – 3fps avg improvement
    Unigine Heaven – 46 to 49, score went from 1179 to 1256
    Lost Planet 2 DX11 Test A – 86 to 91
    3DMark (newest) Fire Strike (not Extreme) – 5561 to 5574 combined; Physics went from 6348 to 6843 and Graphics went from 6412 to 6388
    Shogun 2 DX11 High 1080p benchmark – 60.39 to 62.64

    System:
    Phenom II x6 1090T 3.2ghz
    GTX760 4GB OC
    12GB DDR3 1333
    Win7 64-bit

    The most promising thing to me was the 3DMark score. The overall wasn’t much better, and the Graphics score went down slightly. But the Physics score – which is basically the CPU test – went up 7.8%
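    As a sanity check on the percentages quoted in this comment, the gains can be recomputed from the raw before/after numbers. A minimal sketch (the labels and figures below are just the ones listed in the comment, nothing more):

    ```python
    def pct_change(before: float, after: float) -> float:
        """Percentage improvement going from `before` to `after`."""
        return (after - before) / before * 100.0

    # Before/after pairs quoted in the comment above
    results = {
        "Star Swarm (high) fps": (37, 50),
        "Unigine Heaven score": (1179, 1256),
        "3DMark Physics score": (6348, 6843),
    }

    for name, (before, after) in results.items():
        print(f"{name}: {pct_change(before, after):+.1f}%")
    ```

    Star Swarm works out to roughly +35% and the 3DMark Physics score to roughly +7.8%, consistent with the figures quoted.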

    1. A Phenom is pretty weak compared to the newest i7, so a driver that eliminates CPU overhead should have brought completely different results there. Shame.

      1. Yeah, I was hoping for a bit more given the way people were talking about it. But expectations aside, an increase of 3-4 fps across a range of titles is a very good gain for any single driver release.

        Mantle is a nice idea but until it’s in more (good) games – and comes supported *at release* – it’s not going to be the game-changer AMD wants it to be.

        1. In THIEF I have about a 200% improvement in min FPS, and the FPS is much more stable. The problem is that Nvidia is using completely wrong numbers in their presentations, so their comparisons with MANTLE are totally irrelevant, and they should focus on their own tech, as after 6 years there is no PhysX game that runs stably at 60FPS (regardless of what HW is used: 780 SLI with a dedicated PhysX card, etc.). After they fix this they can criticize other tech, because MANTLE in its first year has much better results than Nvidia ever achieved.

  4. “Nvidia’s GeForce 337.50 Beta Are Great But Not Equivalent To AMD’s Mantle Performance Boost”

    Mantle is an API. This is a driver. Also, how many games currently support Mantle? Two? And weren’t there comparisons between DX11 and Mantle showing that Mantle actually had decreased quality?

    Either way, Nvidia has always been known for having amazing driver support, and that’s something that can’t be said for AMD. I love the fact that my video card just becomes better and better every couple months.

    1. I think *some* comparison is fair because Nvidia themselves are pushing that by benching Star Swarm and comparing their drivers to AMD’s Mantle performance. Though based on my benchmarks they did a hell of a job with “just” drivers in Star Swarm, 35% boost.

      1. The earlier benchmark was already unfair: before this there was a benchmark showing the 780Ti (old driver) vs the R9 290 with Mantle, which the 780Ti won. Those tests were Star Swarm and BF4 under Mantle. So even with this new driver it really doesn’t prove anything; some tricks by Nvidia.

  5. “Up to 64% in Total War: Rome II”

    No one wants to approve my comment? I sent it 12hrs ago.

    Btw: I tested it on Rome II.

    Not even a single fps boost:

    335: (in-game benchmark = 34.2fps)
    337: (in-game benchmark = 34.3fps)

    Where is that 64% boost in Rome 2, Nvidia? Or was it 0.64%?
    (i5 3570K – 8GB – 660 Ti 2GB @ 1306MHz)

    Or maybe I should have an i7 4999x and 120GB of DDR4 and 2x Titan Z to see those 64% boosts.

    1. Whenever there are links in a comment, the comment is awaiting approval. It should be visible now 😉

      1. I understand that you have that to avoid spam etc., but it’s really frustrating sometimes and kills conversations :/

  6. Actually, the first benchmark by Nvidia was already obviously unfair: they wanted to show a comparison between APIs, but doing it with different GPUs is seriously unfair. Quite a trick Nvidia pulled there; the 780Ti already surpassed the R9 290 even with Mantle, so this ‘new’ driver doesn’t really prove much. Now, with the ‘real’ benchmarks provided up there, we can see what’s actually happening. I wonder, if DX11 is capable of pulling off the stuff Mantle did, why on earth are they making DX12 instead? Why doesn’t Microsoft just release a new DX11.3 or something…
    Waiting to see what OpenGL can do. Rumors say OpenGL can draw up to 15x more power.

  7. I have definite improvements in Battlefield 4, Tomb Raider and Assassin’s Creed 3. AC4 Black Flag still runs like a giant turd. I am on 2x 660 SLI and an FX-8350.

    1. Actually, every single DX11 game I have played feels much smoother, and I have been able to increase graphics settings with no performance hit.

    2. I just switched to an FX 6300 and AC4 runs a lot better, in the 40+ range consistently now – 1080p, all high settings, very high shadows, SMAA, HBAO+, on a GTX 660.
