
Nvidia Finally Officially Speaks About AMD’s Mantle – Will Not Support It, No Real Benefit Using It

On Maximum PC Magazine’s latest podcast, Nvidia finally took a stance on AMD’s Mantle. The magazine invited Nvidia’s Distinguished Engineer Tom Petersen and Senior Director of Engineering Rev Lebaredian, who shared their thoughts on the API and confirmed that Nvidia is not part of it, will not support it, and sees – at least from its own perspective – no big benefit in using it.

Of course, most of what Nvidia claimed is PR stuff and nothing more. But let’s take things from the beginning. When Maximum PC asked about Mantle (at the 16:00 mark), Tom Petersen said that AMD is free to develop and innovate in any area it thinks is important.

“We don’t know much about Mantle, we are not part of Mantle. And clearly if they see value there they should go for it. And if they can convince game developers to go for it, go for it. It’s not an Nvidia thing. The key thing is to develop great technologies that deliver benefits to gamers. Now in the case of Mantle it’s not so clear to me that there is a lot of obvious benefits there.”

Rev Lebaredian then said that ‘if you go look at the numbers, compare our DX drivers today to Mantle games, I don’t think you’re going to notice any big improvement’, to which Tom added ‘in configurations that matter’.

But later on, Rev and Tom contradicted themselves. While Tom claimed (and Rev agreed) that there is no benefit from using Mantle, Rev said:

“It’s possible to pull performance out of DirectX, we’re proving that, and so you can argue that maybe it’s not a good idea to put extra effort into yet another API that does the same thing essentially. Feature-wise there is nothing more.”

And just a couple of seconds afterwards Rev said:

“DX12 is coming and a lot of the features, the benefits of having a lower level API (the extra calls and stuff), it’s going to be in DX12.”

So, how come Mantle does not have any features and is similar to the other APIs when DX12 will pack the same new features that Mantle already supports? Especially when Nvidia is using DX12 as an excuse for not supporting Mantle?
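
For readers wondering what “the extra calls and stuff” actually buys, the pitch behind both Mantle and DX12 is lower CPU cost per draw call, which only matters when the processor rather than the graphics card is the bottleneck. The toy C++ model below is purely illustrative – the per-call costs and the draw-call count are made-up assumptions, not measurements of any real driver – but it shows why a thinner API can lift framerates in CPU-limited scenes while doing little in GPU-limited ones, which is exactly the ground both companies are fighting over.

```cpp
// Toy model of CPU-side draw-call overhead (not real D3D/Mantle code).
// Assumption: every draw call costs a fixed slice of driver work; a
// lower-overhead API (or batching) shrinks that per-call cost.
#include <chrono>
#include <cstdio>

// Stand-in for per-call driver work (state validation, hazard tracking, ...).
static void simulate_driver_work(double microseconds) {
    auto start = std::chrono::steady_clock::now();
    while (std::chrono::duration<double, std::micro>(
               std::chrono::steady_clock::now() - start).count() < microseconds) {
        // busy-wait to "spend" the assumed CPU time
    }
}

// Total CPU time spent submitting one frame's worth of draw calls.
static double frame_cpu_ms(int draw_calls, double per_call_us) {
    auto start = std::chrono::steady_clock::now();
    for (int i = 0; i < draw_calls; ++i)
        simulate_driver_work(per_call_us);
    return std::chrono::duration<double, std::milli>(
               std::chrono::steady_clock::now() - start).count();
}

int main() {
    const int draws = 5000;                    // a draw-call-heavy scene (assumed)
    double thick = frame_cpu_ms(draws, 10.0);  // "thick" driver path, 10 us/call (assumed)
    double thin  = frame_cpu_ms(draws, 2.0);   // "thin" driver path, 2 us/call (assumed)
    std::printf("high-overhead API: %.1f ms of CPU time per frame\n", thick);
    std::printf("low-overhead API:  %.1f ms of CPU time per frame\n", thin);
    // If the GPU itself only needs ~16 ms per frame, the first case is
    // CPU-bound and the second is not; that gap is the whole sales pitch.
    return 0;
}
```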

All in all, it comes down to each company pushing the specific technologies it can take advantage of while downplaying the features of its rival. This basically means that Nvidia will never support AMD’s tech/features, and the same applies to AMD (which explains why AMD did not adopt G-Sync and went ahead with FreeSync instead).

I think Gordon Mah Ung, Editor In Chief at Maximum PC Magazine, was spot on when he said:

“G-Sync is a perfect example. You know what? G-Sync is pretty damn cool. But you know what? Only works for Nvidia GPUs. Cool stuff, doesn’t work with AMD, Intel ‘meh’ and of course AMD is like ‘Hey, we’ve got Freesync’ right? Can’t we just all get together?”

To which Tom of course replied “why can’t we be friendnemies? (laughs)”, with Rev adding “I mean, would you ask BMW to give their technology to Mercedes?”

And that is that!

Nvidia Responds to AMD's Cheating Allegations (No BS Podcast 229)

210 thoughts on “Nvidia Finally Officially Speaks About AMD’s Mantle – Will Not Support It, No Real Benefit Using It”

  1. Completely disgusted and fed up with Nvidia’s attitude. They have become the bane of PC gaming. Now watch the Nvidia ass-licking fanboys attack me.

  2. AMD offered FreeSync because NVIDIA got G-Sync out onto the market first and with FreeSync you need a card and monitor with DisplayPort 1.2a anyway.

    Why should NVIDIA adopt Mantle? Well, it’s true that devs don’t really give a crap which API they have to use; what they care about is the different versions of that API for each platform and how easy it is to port. NVIDIA are right to say that Mantle only benefits certain setups, especially CPU-limited ones, but games are rarely that CPU-bound nowadays. DirectX just needs to cut the fat, and that’s being done; devs will get the low-level access they have been wanting for years with DX12, but in the end bad ports are bad ports and Mantle won’t change that.

    If you want proof of devs not supporting different API versions, look at DirectX 11.2: although tiled resources are great tech, no one is supporting it on Windows 8 (except BF4); they would rather just skip to DX12. But again, it wouldn’t have changed games like Watch Dogs much, which was optimised for the PS4’s unified memory and just dumped onto the PC. Mantle has a place, I’m just not sure it’s a long-term one, or one where all games will support it. Mantle really needs to prove itself in all situations in real games.

    1. I think people focus too much on potential raw performance, too. Mantle is potentially better for framerates, but with developers familiar with DirectX they are going to have a lot of ways of tweaking that. There’s also the issue of ease of development. I’m guessing a higher-level API like DirectX is easier to program for because it would have a more extensive library to draw from. DX12 could potentially be the best if it allowed lower-level access like Mantle when wanted/needed, but kept the convenience of DX11 when it isn’t. (I’m not a programmer, though, so this is all me extrapolating from what I read)

      1. Actually, we don’t need different APIs for gaming if the games are optimized effectively. Every game has a minimum specification and a recommended specification; if the game can run at 30-60 fps at the lowest settings and lowest resolution, then there’s no problem. No API or driver can fix a game’s coding/engine problems, so developers should optimize their games to those specifications.

        1. If you’re going to make someone run a new game on the lowest settings at 480p, then I do not think we need any new technologies whatsoever.

          1. Bro, please check the resolution settings in games. In every game the lowest resolution is 480p or 600p, including Watch_Dogs.

      2. MANTLE is better for framerate stability, not necessarily for a higher framerate.
        Big news: highly abstract APIs are easier to adopt. We’ll see how well DX12 gets adopted; MANTLE is doing very well at the moment.

    2. “Well it’s true that devs don’t really give a crap what API they have to use”

      Very good point which is why everyone is still using 3dFX

    3. “AMD offered FreeSync because NVIDIA got G-Sync out onto the market first and with FreeSync you need a card and monitor with DisplayPort 1.2a anyway.”

      Right, because Nvidia’s G-Sync doesn’t require special hardware either.

      1. It was always like that from the outset; you know what you get with NVIDIA, and that’s proprietary, just like Microsoft and their locked-in Windows/DirectX ecosystem. Funny though, the vendors don’t really care: more money for them and NVIDIA, which is the point of a proprietary business and why NVIDIA have lasted this long, unlike ATI, S3, and 3DFX.

      2. Pretty sure G-Sync is actually better, though. People act like they are the same thing but as I recall G-Sync addresses lag in a way Freesync does not.

        “Furthermore the [AMD] spokesperson believed that many of the problems solved by G-Sync could simply be resolved with triple-buffering, pointing out that there used to be an option in AMD’s drivers to force this on and that it could easily add this back in.”

        Great, throw some input lag back in there, w00.

        1. Triple buffering only solves the frame-rate drops below the refresh rate; it doesn’t solve latency or input lag, and you have to apply extra options to negate it. Triple buffering also requires more frame-buffer memory, so it’s not a fix.

          G-Sync fixes all these things on a hardware level, but clearly people still don’t understand it and think it’s just a money maker, because they’ve seen a video on YouTube that doesn’t show any difference, or they lack an understanding of how display-to-GPU timing works.
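
To put rough numbers on that argument: with a fixed refresh rate, a finished frame can sit in a back buffer for up to one full refresh interval before it is scanned out, and that queuing delay is the part triple buffering cannot remove, while a variable-refresh display starts scan-out as soon as the frame is ready. The figures below are purely illustrative assumptions (a hypothetical 20 ms render time on a 60 Hz panel), not measurements of either technology.

```cpp
// Back-of-the-envelope latency comparison with made-up numbers (not a benchmark).
#include <cstdio>

int main() {
    const double refresh_ms = 1000.0 / 60.0; // fixed 60 Hz panel: ~16.7 ms between scan-outs
    const double render_ms  = 20.0;          // assumed GPU time per frame

    // Triple buffering keeps the GPU busy, but a completed frame still waits
    // for the next scheduled refresh; worst case it just missed one.
    double fixed_worst = render_ms + refresh_ms;

    // A variable-refresh display (the G-Sync / Adaptive-Sync idea) begins
    // scan-out as soon as the frame is done, so no queuing wait is added.
    double variable    = render_ms;

    std::printf("fixed 60 Hz + triple buffering, worst case: ~%.1f ms\n", fixed_worst);
    std::printf("variable refresh:                           ~%.1f ms\n", variable);
    return 0;
}
```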

          1. Tom’s Hardware had a big article about it (before Freesync was even a thing) and man did it sound wonderful. I think Freesync is definitely a good thing in general, but I hope Gsync isn’t forgotten because of it.

        2. Well, you can be pretty sure (because NVidia said it is better), but the rest of us will hopefully wait until Adaptive-Sync monitors come out and see for ourselves.

      3. That’s not what he said. Basically, first-gen FreeSync users will need to upgrade their monitors just like G-Sync users, and if you watched the Huddy interview with PC Perspective you’d know that cheaper monitors with FreeSync will actually have a smaller usable frame-rate range than more expensive monitors, so if you want FreeSync that works across a full range of fps you’ll need a high-cost model. Another point to note is that the current FreeSync-compatible GPUs are the 260 and 290 ranges; people running 270s, 280s or any other AMD GPU can’t use FreeSync.

        1. Yep, All you need to run a G-Sync monitor is a 2 year old GTX 650Ti Boost or above and, all support DisplayPort and you can have one now. FreeSync isn’t actually free, you still have to upgrade to DisplayPort 1.2a monitors

          Also, yeah you need a specific AMD card as well to run VESA Adaptive V-Sync.

          1. All you need to run a G-Sync monitor is… guess what? A BRAND NEW G-Sync monitor, or buy the G-Sync module and mod an old monitor.

            Sense?

            Oh and none of those G-Sync monitors supports DisplayPort 1.2a … waste of money.

          2. DisplayPort 1.2a compatible monitors won’t be out for another 6-12 months anyway, G-Sync monitors are out.

          3. Wow! That’s how you avoid your ridiculous post?

            I’ll bring it back to you: you said that G-Sync is like the obvious, easiest choice, because you only need a 650 Ti! And for Adaptive Sync you need a new monitor with DisplayPort 1.2a, and a specific GPU.

            When the reality is: for G-Sync you need a NEW MONITOR, the same as for Adaptive Sync – yet G-Sync monitors are going to be OUTDATED in 6-12 months since they DON’T SUPPORT DisplayPort 1.2a.

            And you need a SPECIFIC GPU for G-Sync as well as for Adaptive Sync. For G-Sync you need an NVIDIA GPU; for Adaptive Sync you need a GPU that supports DisplayPort 1.2a… in this case only AMD has them, in GCN cards.

            Avoid this.

          4. What’s so good about DisplayPort 1.2a that a G-Sync monitor needs it?

            Specific GPU for G-Sync? Yes, a friggin’ two-year-old one or newer; why is that an issue? You can get a GTX 660 for £130 or a GTX 750 Ti for £120, and they have DisplayPort.

          5. It supports Adaptive Sync… that’s something, no? It will work on any piece of hardware: Intel, AMD, NVIDIA.

            Or you think we should just stop developing DisplayPort because G-Sync is out?

          6. Do you know what you’re talking about? G-Sync doesn’t need that, that’s the point of it, WTF would a G-Sync monitor need Adaptive Sync when it solves the problem on a hardware level in the first place?

            FreeSync tries to solve it on the software level and claims to be the same when it’s actually not.

          7. Wow, you try really hard to avoid the point of the question. The point is: you won’t be saving any money with G-Sync; in fact, in the long run you will have to spend more if you decide to change your GPU to another vendor.

            Yes, I know what I’m talking about.

            So if you choose to buy a different GPU, from another brand, you will have a €150 block of useless hardware in your monitor called G-Sync?

            Or do you change monitors at the same rate you change GPUs?

            I change GPU every year, monitor probably every 3 or 4 years.

          8. Well, you just answered your own question, what’s up, can’t you make choices based on what you buy for the future? If you switch GPU vendor that’s your own fault, it’s not my fault you change GPUs every year. It’s simple, it sounds like G-Sync isn’t for you, really simple isn’t it? So wait for FreeSync, problem solved.

          9. Still avoiding the point? Damn lol

            Just don’t misinform people by saying that G-Sync is the cheaper/better option than Adaptive Sync. It’s a blunt lie.

            They might be the same cost-wise (which I highly doubt, since DisplayPort 1.2a support does not add €150 to any monitor), but in the long run you will have to spend more money with a G-Sync monitor because:
            a) the foundation is on old standards;
            b) limited GPU support.

            I won’t spend any more time with a delusional fanboy.

            Have a nice day

          10. I didn’t say G-Sync is cheaper and better than Adaptive V-sync, maybe you should read what I said properly, I said FreeSync isn’t actually as “free” as people might think.

            1.2a isn’t even out yet, you got a time machine?

          11. Don’t mind him, he is a die-hard AMD fan from WCCFTech, VideoCardz as well as Fudzilla. You can slap facts in his face all day and it won’t matter.

          12. I will quote you:
            “All you need to run a G-Sync monitor is a 2 year old GTX 650Ti Boost or above and, all support DisplayPort and you can have one now. FreeSync isn’t actually free, you still have to upgrade to DisplayPort 1.2a monitors

            Also, yeah you need a specific AMD card as well to run VESA Adaptive V-Sync.”

            I’ll translate:

            “All you need to run G-Sync is a G-Sync monitor that costs €150+ and an NVIDIA GPU… even GPUs that don’t have the power to cause tearing, which makes G-Sync useless, but OK!”

            “To run Adaptive Sync you need a standard monitor that supports DP 1.2a, which carries no mention of an extra price for DP 1.2a, and any GPU from any vendor that supports DP 1.2a – only AMD GCN cards support it currently, because NVIDIA lags behind as usual, just like with DirectX 11.2 support.”

            G-Sync = NEW MONITOR (+€150) + NVIDIA GPU

            Adaptive Sync = NEW STANDARD MONITOR WITH DP1.2a + ANY GPU WITH DP 1.2a.

            Now do you understand?

          13. You’re completely missing the point, Jizzus, and then you start throwing fanboy claims at someone. That’s comical. If anyone is acting like a butthurt fanboy it is you. Once monitors that support displayport 1.2a are available I'm sure the range of G-sync monitors that are released after then will include it. You can’t bitch about a future tech not being on current models, for Christ’s sake. Nvidia got the jump on AMD with G-Sync and are several months ahead, so what?

          14. Thank you for giving me a reason why G-Sync monitors will always be more expensive than monitors with Adaptive Sync!

            Like you said: G-Sync monitors in the future will be released with DP 1.2a – so on top of the €150+ for the G-Sync module you will have to pay for the DP 1.2a (according to Sean).

            There are no problems, sister, who cares about another bit of proprietary crap? Just don’t make posts saying “with G-Sync you ONLY need an old 650 Ti, and with Adaptive Sync you will need a new monitor and an AMD GPU”.

            When both need new monitors and specific GPUs.

          15. Again you miss the whole point and make presumptions that no one can know at this moment in time; of course future monitors will embrace future tech updates, the same as current ones do not include them. As I explained in an earlier post, AMD’s Huddy explained that the cheaper FreeSync-capable monitors will, as an example, only work when the fps is between 30 and 50, or maybe 30 and 60 (his own words), so in order to get one that covers the monitor’s full fps range (i.e. 10-60, or 10 to 120/144) like G-Sync-capable monitors do, you’ll need to buy a high-end FreeSync-capable monitor. If you need verification, go to the PC Perspective interview with Richard Huddy. I imagine I’ll get a FreeSync-capable screen to try it out when it is available, but we already know the tech does not do as good a job as G-Sync does, and that info came from AMD.

          16. Again, another clueless post: Huddy said that as an example.

            “Potential ranges include 36-240Hz, 21-144Hz, 17-120Hz and 9-60Hz.”

            You make assumptions that it won’t work as well as G-Sync when you are clearly uninformed! In reality the tech is way better than what G-Sync offers, lol. Wake up, son.

          17. Sure he said that, but we are talking about a different part of the discussion, where he specifically said that cheaper monitors with FreeSync will “as an example” only work when the fps is between 30 and 50 fps. So stop the lies and stop all the condescending “son” bullshit. My grandkids are more open to differing opinions than you clearly are, so go back to living in your bubble, boy.

          18. So how does one example state that G-Sync is superior to Adaptive Sync?

            Where does it say the more expensive, top-end monitors will be EVEN MORE expensive due to DisplayPort 1.2a?

            Because that’s what G-Sync does – it adds to the price.

            A cheap monitor will be cheap because it uses inferior hardware. An expensive top-end model will be expensive because it uses superior hardware.

            By your logic we are going to see an increase in the price of monitors because of DisplayPort 1.2a? Where is the sense in that?

          19. You’re doing it again.

            When was it said they’d be more expensive due to DisplayPort 1.2a?

            It wasn’t. I said the more expensive monitors would allow a wider working fps band for FreeSync than the cheaper monitors will.
            I said that because Huddy said it. That isn’t the competition making claims, but because it’s a negative point you disagree with it.

          20. Negative point? That’s not a negative point, that makes complete sense.

            Who would buy an expensive monitor to be used with a low-end GPU? That’s mismanagement of the money spent on your build. The example given here was a 650 Ti.

            Who would buy a G-Sync monitor, paying an extra €150+, to use with a 650 Ti?

          21. You’re now moving the subject to something I haven’t remarked on. What has a 650 Ti got to do with FreeSync? The other guy’s point was that all Nvidia GPUs from that model on can use G-Sync, unlike AMD, who currently only have the 260 and 290 ranges that work with FreeSync. Remember there are cheaper monitors coming with G-Sync too, and there is also the situation of what if you need a temporary card for some reason and you can just get a cheaper one which also benefits from G-Sync (which works at low frame rates too) that you can use temporarily.
            It’s a loaded question with too many variables and, more to the point, it’s got nothing to do with what I was talking about. So there’s no reason to be using it in response to me.

          22. Yes, GCN 1.1 cards are the only GPUs on the market that support DisplayPort 1.2a.

            Just like GCN 1.0 and 1.1 cards are the only ones that fully support DirectX 11.2 features.

            “Cheaper” monitors with G-Sync are going to be expensive compared to the FreeSync solution, because G-Sync requires extra, non-standard hardware.

            Anyone who buys a temporary card because of a monitor doesn’t deserve to be called a PC user – it’s just a dumb fck who doesn’t know how to make balanced builds.

          23. Here’s the link you need. Now do away with your selective hearing and listen from 58 to 63 minutes. Their example on display was capable of working only as long as the fps stayed between 40 and 60, so sure, top-end monitors can have a fuller range, but that’s exactly what I said, so again we get down to you selectively choosing what you hear or read. Grow the hell up, Jizzus.

            https://www.youtube.com/watch?v=8uoD8YKwtww&index=15&list=UUtKh7t3br1obEQL6EyiAq0w

          24. Yes, G-Sync monitors will always be more expensive than FreeSync monitors because G-Sync is hardware, FreeSync is software. Go learn more about it before whining.

          25. I work at a place that manufactures monitors. I can say beyond a shadow of a doubt that, at first, G-Sync is better than FreeSync. I can also say that G-Sync will cost more. They will both stabilize as a premium option aimed mostly at gamers. The R&D for FreeSync is not free. AMD is not paying for it like NVidia did for G-Sync. This R&D has to be paid for by the manufacturers (new scalers for DP 1.2a) and then passed on to customers. Also, it will be a non-mandatory standard for manufacturers. It will be marketed the same way G-Sync is, period. Just for gamers, and it will start off cheaper and may remain cheaper, but not by that much… say $35 to $75. G-Sync is here now. I can also tell you that AMD is having issues trying to get manufacturers on board for FreeSync. Basically they want AMD to foot the bill like Nvidia did for G-Sync. They see it as a risk and wonder, if it is so awesome and is going to revolutionize gaming monitors, why AMD is reluctant to pay the way forward. There is a cost to all new tech. It ain’t free, kids… carry on.

          26. Now I’m waiting for an apology to the community for your shameful behavior trying to misinform the readers, just because you are butthurt 😐

          27. After your earlier immature rant at me I decided to do a bit more checking, and I was quite right: the next gen can be built to include DisplayPort 1.2a as well as G-Sync, so your abusive tirade was not only uncalled for but wrong. And you have the audacity to run around calling people fanboys and liars when you do not even know if your claims are accurate. Can I suggest you rein your anger in when someone makes a point that’s negative towards a brand you love so strongly, and check into what you’re about to write before embarrassing yourself. Try to remember: “IGNORANCE IS NOT AN EXCUSE”.

          28. You are the one completely and ultimately missing the point because you are blinded by your stupid faggot moronic nvidia fanboism…

            i’ll quote this:

            “Once monitors that support displayport 1.2a are available I’m sure the range of G-sync monitors that are released after then will include it.”

            from your statement. Includes what? I suppose DisplayPort 1.2a, right?

            Then if that happens, I think your G-Sync will be obsolete “IF” you use DisplayPort 1.2a as the primary adapter… ^_^ If not, you are tied to an old technology for the next couple of years if you are still not using DisplayPort 1.2a as an adapter. Hello 4K-8K gaming??? I don’t think so..

            Damn!! Fanboys will be fanboys: no matter how stupid the idea is or whatever evidence you present them, they will actually knock it down just to win, ahaha!!

          29. G-Sync is an option, what makes you think it will become obsolete when it’s an option? It can’t be a standard because it’s used with NVIDIA cards only and DisplayPort 1.2a doesn’t fix the core problem of Monitor GPU timing because it’s software based.

            There will always be a market for G-Sync, NVIDIA own 63% of the market, so their customers will buy G-Sync or not.

          30. You’re quite right, I don’t know the full ins and outs of compatibility regarding whether G-Sync monitors are capable of also having DisplayPort 1.2a. I made a presumption based on the fact that that’s what generally happens when new tech is released. So how does that make me an Nvidia fanboy? I’m willing to use whichever brand offers the best experience, and funnily enough I’m currently running a 290X, so how does that fill in the blanks to make me an Nvidia fanboy?

          31. Don’t worry about it, these people don’t even understand G-Sync or FreeSync or what they do properly. it’s just NVIDIA Vs AMD to them and NVIDIA are evil because they only give G-Sync to their customers.

          32. G-Sync monitors are aimed at NVIDIA owners, plus NVIDIA are hoping to get more NVIDIA GPUs sold; high risk, but that’s what they do. AMD? Meh, no wonder they’re struggling against NVIDIA and Intel because they can’t put a proper plan together and invest in higher risk and lost out to the competition time and time again.

          33. So, that was all about Nvidia making more money? So Jizzus has a point: if I bought a G-Sync monitor to get all the advantages of that technology, do I need an Nvidia GPU then? I thought we were carrying PC gaming to a standardized market where everything is compatible with everything, but it seems we are going backwards.
            “AMD? Meh, no wonder they’re struggling against NVIDIA and Intel because they can’t put a proper plan together and invest in higher risk and lost out to the competition time and time again.”
            OK, so AMD just have to develop a new sync technology compatible only with their cards, in order to get new monitors that only reach their fullest performance working together with AMD cards?
            I don’t like your idea, it sounds much like the console market.

          34. You guys live in a different world, don’t you? NVidia not only wanted to solve a problem but also make money; it’s actually pretty risky doing G-Sync, because if you haven’t got the backing all that R&D is wasted, and no one likes wasted money in a business.

            Business is business, you can’t give your tech out free or to competitors; that’s pretty much why NVIDIA have been around so long: smart business, good tech for their customers.

          35. QUOTE: “I thought we were carrying PC gaming to a standardized market where everything is compatible with everything”

            If that was the case, what happened with Mantle? It’s optimized for GCN, built from the ground up for GCN, meaning it will always offer GCN more.

          36. G-Sync doesn’t limit panel types; if people want IPS monitors with G-Sync they will probably make them. G-Sync is about gaming, while IPS is more about image processing and colour accuracy, which is not what G-Sync is about.

          37. There will be IGZO panels and IPS panels in the future. Yes, it will take some time. The IPS panels coming first will be 60 Hz panels, which work great with G-Sync. You do not have to have a high-speed panel; that claim is false.

          38. G-Sync and Adaptive Sync are the same, but with different implementations:

            G-Sync: needs a G-Sync-capable monitor (or a specific old monitor plus a G-Sync module) and an Nvidia card.

            Adaptive Sync: a DisplayPort 1.2a-compatible monitor and any GPU.

            So which do you think is the pricey one? If you don’t get this, you really are a 100 percent Nvidia fanboy ^_^

          39. Okay..

            What does Gsync do??

            From Nvidia Website:

            http://www.geforce.com/hardware/technology/g-sync

            NVIDIA G-SYNC is groundbreaking new display technology that delivers the smoothest and fastest gaming experience ever. G-SYNC’s revolutionary performance is achieved by synchronizing display refresh rates to the GPU in your GeForce GTX-powered PC, eliminating screen tearing and minimizing display stutter and input lag. The result: scenes appear instantly, objects look sharper, and gameplay is super smooth, giving you a stunning visual experience and a serious competitive edge.

            And what does Adaptive Sync do??

            From Nvidia Website:

            http://www.geforce.com/hardware/technology/adaptive-vsync

            NVIDIA Adaptive VSync makes your gaming experience smoother and more responsive by eliminating frame rate stuttering and screen tearing. To learn more check out the Technology tab.

            From VESA

            http://www.vesa.org/news/vesa-adds-adaptive-sync-to-popular-displayport-video-standard/

            DisplayPort Adaptive-Sync enables the display to dynamically match a GPU’s rendering rate, on a frame-by-frame basis, to produce a smoother, low latency, gaming experience.

            Same right?

  3. Just because everyone does the same thing doesn’t mean it’s the right thing.

    This is a bad response; when I buy my next card it will probably be AMD.
    Not that AMD doesn’t do the same, but at least their stuff is mostly open source?
    Correct me, anyone, if I’m wrong, thanks.

    1. AMD tends to be more about open tech than Nvidia, and some of their stuff will work on Nvidia cards as well (eg TressFX). However, they also *do* less than Nvidia tech-wise, which I think is part of the reason Nvidia is hesitant to share too much.

      And it’s not just about performance, it’s other stuff. I’ve really enjoyed the switch to Nvidia after having three AMD cards in a row. The drivers are better imo, and there are other things like PhysX which can be great (especially if you love Borderlands 2). It does piss me off that they seem to intentionally gimp its CPU support, though >.<

      And I love Nvidia Shadowplay, the recording and Twitch stream stuff they integrated. That stuff usually has to be done with an external program, and comes with a big CPU and framerate hit. Shadowplay uses the hardware encoder in the video card to do all that work, with next to no framerate hit.

      1. I’ve just switched back to AMD after my last 2 cards being Nvidia, and I wish I hadn’t. I sold a 780 Classified and bought an MSI 290X Gaming, and my load temps are 20 degrees hotter while performance is practically the same; with both overclocked the 780 is faster a lot of the time. I wanted to try the new AMD chip out and now I have, but I’ll be moving back ASAP. As you can see from the link the 290X is ahead at box clocks, but once I overclock both GPUs the 780 is faster. The 2nd link is overclocked, but the 290X o/c is not registering for some reason.

        http://www.3dmark.com/compare/fs/1583473/fs/2464938

        http://www.3dmark.com/compare/fs/2499677/fs/2055784

  4. “So, how come Mantle does not have any features and is similar to the other APIs when DX12 will pack the same new features that Mantle already supports? Especially when Nvidia is using DX12 as an excuse for not supporting Mantle?”

    Because only one 1diot Charlie Demerjian said that Mantle is DX12.

    You can say that x264, Divx, FFmpeg encoders are the same but they aren’t.

    1. Not only Charlie. Johan Andersson, and all the developers who attended the Mantle presentation, received the Mantle documentation and then went to the Direct3D presentation – which was presented by the creators of Mantle – and saw they were the same.

    2. Well, the only reason DX12 is coming is the driver overhead problem. And according to the presentations and reviews it does exactly the same thing as MANTLE. DX12 will be supported by DX11 hardware, so it cannot bring anything new for those cards; it might, and probably will, bring some new stuff for new generations. But mainly, yes, it is the same thing.
      AZDO is also the same thing, just for OpenGL.
      They’re trying to solve the same problem, and since MANTLE is on Windows and DX is Windows-only, then logically they are doing the same thing.

      1. It’s not because of driver overhead; that has little performance cost, like Carmack said. DX12 will allow low-level hardware access. Also, DX12 will make tiled resources more accessible rather than just on Windows 8.1.

        1. Also, mate, do not forget DX12 will likely be a Win9-only feature; MANTLE is for Vista and newer. So if “Tiled Resources” was not really used anywhere, I wonder how the new version will do if it is Win9-only. Also, “Tiled Resources” was a feature more for the new X1, but I might be wrong.

  5. “I mean would you ask BMW to give their technology to Mercedes?” exactly!

    The Asus ROG Swift is out next Friday here, so I’ll finally get to see what all the fuss is about with G Sync.

    1. But you cannot buy a PC from NVidia, can you? BMW and Mercedes make a whole package.
      So this point has no validity. The PC is based on standards; without them we would just have PC consoles.
      If you want a PC console then I suggest you ask NVidia to make one (or just buy “one of” those available today).
      If NVidia has such great technologies it should not be a problem, right?
      AMD could, and practically does anyway (PS4/X1), but on the PC platform it remains as open to standards as possible.
      And NVidia can babble about how much money they put into their tech, but the fact is they will never put in enough to make developers use it massively.

  6. Where are my quantum chips? It’s almost 2020 and Moore must die, but there’s not even a hint. I’m tired of this Mantle/DX sh*t. They can’t do sh*t when games are broken. Watch_Dogs.

    1. Yeah, all this arguing about performance gimping yet the devs don’t even optimise properly anyway for the PC in the first place. Also, did NVIDIA moan about AMD’s superior performance in Dirt Showdown? No, because the devs decided to use their forward rendering techniques, which used heavy compute.

      I’m sick of this bullshit being plastered over the internet for conspiracy theories to propagate. AMD have buggy drivers; funny how Batman Origins and Watch Dogs now actually perform well on AMD cards, and funny how NVIDIA drivers perform well in AMD logo games when they didn’t before.

      1. I think a big issue is how certain developers get in bed with AMD or Nvidia to produce particular games, and lock out the other company. So a game like Tomb Raider or Arkham Origins comes out that is a buggy mess if you don’t own the “right” brand of video card, because the other guys didn’t have pre-release access to properly optimize and configure their drivers. It often gets fixed, but it’s rotten for people who bought the game on release.

      2. It’s pretty much a combination of these two posts – Nvidia can’t be shown supporting a “competing” Technology, especially since year after year they still refuse to open up PhysX (AMD needs to license it, which they of course refuse to do), preventing it from becoming the standard Physics Engine throughout the Industry, & thus continuously limiting its usage to shit like “leaves on the ground” (Batman: Arkham City), etc.

        Instead, they opt to hide behind the already Industry Standard Direct3D (DX) Tech that they’ve been using in their GPUs for over a decade now, that “just happens” to be doing all the same things Mantle does in its next upcoming major iteration, making it all the better for their Anti-Mantle position.

        That’s the crucial difference. When your competitor comes out with something new – eh, it’s nothing big. When the 3rd Party creating Industry Standard Software does it however; “see, that’s real advancements right there, & we are proud to call ourselves early adopters of this feature that the Industry has long been wanting…. What? Who did it first? What? Oh, no, that’s just a shoddy second-rate watered down version of what’s included in this. This is the real deal right here. Did I mention we’re (very proud) early adopters?”

        Far as Mantle itself goes: in the short term, there are numerous benchmarks that showcase how lower-end systems have gained precious FPS in heavy games like Battlefield 4, etc. In the long term, considering DX12 is going to be doing the same thing, I’m not sure exactly why AMD opted to do this (considering they likely knew long before we did what DX12 would be all about).

        1. I do dislike how PhysX is always sort of “tacked on”. It can only ever be cosmetic, which is unfortunate because that means it can never really affect *gameplay* :/

          1. Exactly. It’s one of the best Physics Engines on the market hands down, but due to its severe support limitations Developers are forced to limit its use to superfluous things. Hair, leaves, etc. Complete bloody waste.

          2. The same kind of waste is “line tessellation” in HairWorks, but then NVidia has huge experience with tessellation optimization 🙂

          3. You know what makes me laugh about these conspiracy theories about overused effects? Like Crytek would purposely drag their game down on all GPUs, for what reason? I mean, Crysis 2 used pre-tessellation because their engine didn’t support all these new methods at the time; that insane barrier didn’t actually take much performance because the whole scene wasn’t heavily tessellated anyway.

            The whole water-under-the-ground-being-tessellated thing is normal, because water meshes cover other parts of the level; go look at old games, the mesh is massive under the level, and besides, if the tessellation is LOD-based it’s not an issue.

          4. What is conspiratorial about one competitor trying to hurt another? This is a scenario as old as civilisation itself (if we can agree we are actually civilised, which this particular scenario would not suggest).

          5. Like a lot of this crap, it’s never been proven; it’s just media sites getting clicks because AMD made groundless insinuations about GameWorks, which they don’t even have access to, because like all licences, they don’t allow you to give sensitive information to the competitor or anyone not licensed.

            The mass public don’t really understand such terms, so they make stuff up. I wish people would just stop peddling this crap all because AMD said so: oh, we have lower market share, NVIDIA are being bad to us and the gaming industry, because AMD can’t do the legwork themselves, which NVIDIA have built up over many, many years. AMD bought ATI, and now it seems they want a hand in a business NVIDIA have been in a lot longer and earned a lot more respect in.

          6. BTW, overuse of tessellation was proven many times (and I am not saying it was always NVidia’s fault), but it took a few years to find the sweet spot and learn how to work with this technology (use it only where it brings some benefit from a visual point of view). Just look at how tessellation is used today versus how it was used 3 years ago.

          7. Because it is proprietary and also terribly optimized. It is funny when NVidia talks about PhysX development when they use an old version in their newest games.

          8. Yeah, I don’t know why the hell it’s taking so long to see 3.xx PhysX used in games. It’s supposed to be much better optimized.

          9. Well, the use of PhysX SDK 2.x was influenced by the game engine developers used for their games. Unreal Engine 3 and Unity 4 had native support for the older PhysX SDK. But now Unreal Engine 4 and Unity 5 have PhysX SDK 3.3 built in, and you will see that games using these engines will use the new PhysX SDK instead of the old one. I already wrote this to you under another article, but you are still only looking for reasons to blame NVIDIA for something bad. And as Sean wrote below, game developers are choosing the PhysX API, not NVIDIA.

        2. It’s going to be an uphill battle for AMD. NVIDIA’s 63% share is dead territory for Mantle, and if Intel don’t support it that’s a massive integrated market gone for them too, unless AMD somehow manages to overtake Intel’s share of the laptop and APU market.

      3. “all this arguing about performance gimping yet the devs don’t even optimise properly anyway for the PC in the first place.”

        Man, you really said it all right there. Making a game run beautifully starts with the developers. AMD and Nvidia fighting over the 5% performance gained through driver optimization seems a bit asinine. Of course, hardware and software developers rely on each other, but Nvidia isn’t really responsible for content.

    2. Yep bro, coders are needed, not some school guys with Rat$on$the$Run 🙁
      I like good games, like Crysis 3 (some of the coders there are from the SCENE 😀 w-w-w pouet net). They make awesome grass with no frame impact 😉 so it can be done, but they have to work, not just make $$. Good games sell great (Skyrim etc.) and that is great work.
      We need the people who LOVE games -> like Epic and Unreal being free 😉
      I think Kickstarter is the future of awesome games.

    3. Well, NVidia with their proprietary tech do not help either (not that Ubi would), because then devs do not have much time to focus on other stuff. If you have a standard, HW-independent technology then you know that everybody can use it, but with a proprietary one you spend time on something only some can use, time that you could have used for optimizing. But the problem of PC gaming is much larger, and a lot of companies take the PC for granted (Microsoft, Intel, etc.).

      1. But you’re missing the point: the game studio is the one who picks the vendor or tech they want to use. NVIDIA is known for their great support and tools. Ever heard of John Carmack? Yeah, that guy got help from NVIDIA when he needed it, so don’t come with that crap about NVIDIA not helping; they send people out to game studios and provide tools for better optimisation of graphics tech and new tech.

        You do know that NVIDIA tech does run great alongside the core game, no? TXAA and HBAO are very well optimised, despite what you may think about the tech itself or the game.

        1. Well, since they have their TEGRA, they do not support that many games, far fewer than AMD.

          They have great tools, but no one wants them if they cannot review or change the code themselves.

          Also, you are missing the point that NVidia pay money to get their tech used; lately it was Ubisoft. How many developers you think pay AMD to use MANTLE?

          AMD has done the same for years, they just do not promote it like NVidia did. NVidia always had great PR. AMD on the other hand has NO PR 🙂

          I am not even gonna get to Carmack, who uses open OpenGL with CUDA (instead of OpenCL/DC); he is another promoter, apart from being a great dev.

          But again, the point is not whether the tech is well optimized, but whether the other vendor can optimize drivers for it, and for that, review the source code. AMD is a company confident enough that all their tech is delivered in source (HDAO, GI, Tiled Lighting, TressFX and many more), so everyone – not just NVidia – can review it and optimize drivers for it easily.
          The main point is whether you are for standards or against them. Both have their pros and cons.
          But apart from that, a confident company should compete openly; otherwise I have to ask why it does not.

          1. “How many developers you think pay AMD to use MANTLE?” I really dont know, but my guess is, every single one of them.

            Also, do you how much paid nvidia to ubisoft?

          2. so you don’t know sh*t and don’t have any clue. i just wanted to check now i’m sure kid.

          3. I guess do not respond to people who do not know sh*t and you save yourself a time.

          4. he said the exact same thing (my replay) to me in another article, i am just returning the favor 😀

          5. .. paid by, my bad, but you could pick that up.
            $5 million for AC and WD. They expand the contract recently, I am not sure how much more it cost them. But you can find out and tell me 🙂

          6. I meant that AMD paid to developers.

            “$5 million for AC and WD” Can you provide a source? I only find Guru3D, where they clearly say “It LOOKS LIKE NVIDIA spend $5 million to optimize Ubisoft’s Assassin’s Creed 5 and Watch Dogs.”. So it is only speculation without a single piece of evidence, and I don’t give a sh*t about speculation. But you have no problem acting like it is actually true, right?

          7. AMD has no PR? Funny, isn’t it, how AMD CPUs get really close to top Intel CPUs in BF4; want to know why that could be? BF4 is quite CPU-heavy in places yet manages to do well, when normally it would get destroyed by Intel on single-threaded performance, but DICE optimised BF4 heavily for multi-threading. I wonder why? 😀

            Well, NVIDIA helped Carmack; he’s worked together with NVIDIA for a long time. Not to say he doesn’t criticise NVIDIA, he did on PhysX, but CUDA is big on the workstation side, and that’s what the Titan is for as well.

          8. He did not, he just expressed a hope that NVidia did not pay a lot for that tech. But you can hardly take that as criticism. It is just a different business opinion.

          9. I assumed AMD paid companies to use Mantle. Otherwise I’m not sure why they would bother.

        2. Gawd I love TXAA. Gets rid of the “crawlies” along edges so nicely, something regular AA doesn’t always manage.

          1. You’d be surprised how many people think it’s just like FXAA, when it’s MSAA+Shader AA+temporal filter combined.

      2. “devs do not have much time to focus on other stuff”
        Yeah, marketing: marketing a broken game to create hype. It’s not just PC, yes it has its own problems etc…
        I’m not saying Nvidia is the best; yes, I choose Nvidia most of the time because I like their PhysX and it has been optimized very well in the past years. However, I’d like them to give their code to AMD so my friends with AMD cards can enjoy it too. My problem is not AMD/Nvidia related; some devs are my problem. They don’t do sh*t to optimize their games, yet they are marketed as the best possible game out there, which is a lie and won’t be fixed with Mantle/DX or Nvidia helping some devs to create some extras in their games. Actually those extras are not the problem, the game itself is. Watch Dogs and AC. Of course optimizing a game like Watch Dogs is not an easy task for PC, but man, they screwed it up even on consoles.

          1. Not at first, but now I can say it is optimized (at least for Nvidia cards); it ran well in Borderlands 2 and Batman AO and the other game after that I can’t remember. I like it; sure, they can still do better.
            After all, I like Nvidia but I respect AMD too; in the end we are all still brothers 😀 I don’t give a sh*t which is better, we are in the same boat.

          2. But show me a GPU-accelerated physics, fluid, volumetric smoke, or particle system that is… ahh, that’s right, there is none, because it’s only recently that engines have actually properly started using GPU-accelerated particles, and you won’t find a game with any kind of real-time physics-based particle, fluid, or smoke effects. Crysis 3 has GPU particles but it’s nowhere near PhysX.

          3. Yes it is well optimised. NVIDIA did great work on PhysX SDK (and APEX of course).

  7. LOL CHECK OUT HOW THEY SLIP THEIR TONGUE TO THE TRUTH ABOUT DX12: https://www.youtube.com/watch?v=aG2kIUerD4c&feature=youtu.be&t=20m59s

    “DIRECT X 12 IS COMING AND ALOT OF THE FEATURES OF… [MANTLE]… UH… THE BENEFITS OF HAVING A LOW LEVEL API, EXTRA DRAWCALLS AND STUFF, WILL BE IN DIRECT X 12”

    What a joke of an interview! The contracts talk, I cringed so hard! “You can ask this to developers… NOT THIS! Ask what we want you to ask, not what you want!”

    NVIDIA forgot to say, “Thank you AMD for letting Microsoft rename Mantle to Direct3D so we can use it without supporting Mantle and its lack of real benefits.”

  8. I have an MSI 290X Gaming and temperatures are OK (this card has low-speed fans – probably for low noise, but you should ask MSI why they used them; with faster fans temps would get lower, though).
    BTW, why the hell did you change from the 780? It was not for performance reasons, because the 290X is not much faster. If you were happy with NVidia, then it makes no sense.

    1. I do a lot of modding and I hit close to 3 GB at 1920 x 1080. I’m changing to a higher-res monitor so I need more VRAM. I was going to get a 6 GB 780, but because I used to be happy with AMD and because I’ve wanted to try the current flagship out, I decided to go with a cheapish non-reference 290X. Regardless of whether I got the 780 or 290X I will be upgrading to an 800 series as soon as I can anyway, so it’s only a temporary purchase. Having tried both, I prefer the 780.

      1. Well, next time try to search the web for a while, because your only complaint is the temperature and you can find out very easily how the card does there. If the temperature is the main concern then I would suggest the 295 or another water-cooling solution, which is the best card out there.

        But to complain that a roughly 30% smaller chip with better performance, running at a higher clock, has higher temperatures is, well…

        A friend of mine almost fried his card when he tried new NVidia drivers some time ago, so let’s hope that is not going to repeat itself and you should be fine with the 800 series.

        Honestly, I do not get why you didn’t get a TITAN in the first place, because the amount of VRAM you need is easily predictable according to your usage.

        Also, I personally wouldn’t buy a new card on the very same die shrink; been there, it is not worth it. But good luck with that 🙂

        1. Search the web for what exactly? MSI and most of the reviews claim lower temps than I’m getting, so how exactly does searching the web even help when you can’t account for the silicon lottery anyway? The reason I went with the cheapest non-reference 290X and not an overpriced Titan Black is because we’re getting (debatable) info that the next gen will offer a decent performance boost over them, so if I can wait 4 or 5 months and get a card with enough performance to negate the need for a second card, why wouldn’t I? That’s another reason why I went with the cheapest 290X and not a Matrix Platinum that would have matched my motherboard, or some other high-end example. Lastly, calling this a better chip is debatable. I wouldn’t call it that. The aggressive base clock gives it a lead, but once both are manually overclocked that lead dwindles, and the temps on the Hawaii chip are terrible. I wouldn’t call that a win.

          1. OK. Smaller chip + higher clock + higher/same performance
            =
            higher temperature.

            You have reviews of that card all over the net, and I’m not sure about the US, but in the EU you can try it and return it if you are not happy with it.
            Like I said, if the temperature is the main concern and you wish not to go over roughly 70C, then you have to underclock, use water cooling or another chip, because with the MSI R9 290X Gaming you will get over 90C.
            I cannot see that as a problem for a quiet high-end card, but if someone does…

            Every chip is a lottery ticket anyway, especially the biggest.
            Well, the reason you choose what you are going to buy is your concern and I wouldn’t really go there. But since the temperature of the card is predictable, I would never call that a loss, not for the card anyway.
            But, I mean, I love the card (it could be quieter with water cooling, yes, and if I wanted that I would get it).

            But if you do not like the card for a reason I think is partially your own mistake, just give it away. Somebody will want that card, I am pretty sure of that.

          2. I haven’t decided what to do yet. I’m in the UK and I have a 4-week window to return it in, so I might do that. However, it does perform well and I want to give it a decent chance. I did a few tweaks, i.e. spun my case’s side fans around and upped the card fans to 50%, so my temps are a little lower now. So I may keep it for a 3- or 4-month period while we wait on the 800s. Isn’t there meant to be a 290X GHz-style edition due soon (295?).

  9. Lol, nobody attacked you. You and 6 of your friends were wrong. The funny thing is your attitude is worse than that of the people you called the bane.

      1. To me, an AMD fanboy started it. An AMD-licking fanboy starts attacking Nvidia-licking fanboys, simple as that.

        1. John Richardson expressed his opinion about something. If I am not mistaken, that is NOT an attack. Or I hope it is not considered to be, at least not by the majority.

          1. The first comment on the article, and these words:
            “Nvidia’s attitude, bane of PC gaming”
            “Nvidia a*s-licking fanboys”

            Those are not opinions, but trolling and fanboyism.

          2. you’re just a plain fan boy. get a life and burn your money for nvidia. ^_^

          3. I am a PC fanboy, yes. Nvidia fanboy? I don’t think so. But read your comment again and see who the real fanboy is.

          4. Burn your money with Nvidia, morons, but don’t say AMD is bad, low, and no good…

  10. Stuff like G-Sync actually isn’t impossible on AMD GPUs; it’s just that Nvidia doesn’t enable it… same with PhysX… even the Witcher 3 devs think it should work on AMD GPUs.
    Meanwhile, AMD’s version of G-Sync has already become a standard for incoming televisions… how cool is that?
    Also, the talk about DX11 still being able to be optimized is bullshit… remember the driver they released before? The one they claimed was capable of drawing out as much power as Mantle did? Turns out that one was a complete bluff.
    I’m not an AMD or Nvidia fan or a console fan; what I see is that Nvidia is arrogant, and I respect AMD for sharing its tech, unlike Nvidia, who only think about money and monopoly.

    1. I actually got big improvements from that driver. The big thing was that it got rid of hitches in open-world games like Skyrim and Saints Row 3, and I got a 10-20 fps boost in SR3 to boot. Funniest bit is that I have an older AMD cpu that struggles sometimes… and Nvidia’s driver helped AMD’s CPU cope with newer games lol

  11. The only bane is AMD. Mantle for AMD, Metal for Apple, an Nvidia API, an Intel API, a Qualcomm API, the DirectX API, the OpenGL API. AMD is forcing everyone to go with different APIs because nobody wants to go with Mantle.

    1. Developing a technology that improves the performance of your product is not “forcing” anyone else to do anything.

      I’d prefer that technology be shared, but each business needs to create and sustain a competitive advantage.

      Anyway, I’m still waiting for new Voodoo5 drivers.

    2. NVAPI is not an API like Mantle, Metal, DX or OpenGL xD

      No one forces anyone to do anything; in fact, you are living in the golden age of APIs, every piece of software/hardware has its own APIs.

      You are just clueless and butthurt.

    3. And how do you presume AMD does that?
      AMD only presented the problem, very much like NVidia did with G-Sync, and since Apple, Microsoft and Khronos are doing the same work that MANTLE presented, I would say it is a good way to go. Anyway, I wonder how AMD could force anyone to do anything. They actually have not even opened up the MANTLE program, and they choose who will be in it, since it is still in a BETA state.

      1. At least they took the first step, FreeSync, a technology that can be compatible with Intel, AMD and Nvidia cards, while G-Sync is just an Nvidia thing.

        1. FreeSync doesn’t solve the core problem. G-Sync does. I guess heavyweights in the business that put their name behind it are wrong then, so go and say Carmack and Sweeney are wrong and they are only pawns for NVIDIA because they know nothing about the core problem that G-Sync fixes LOL, they’re only the two main people behind the most successful game engines in the computer game business.

          All you NVIDIA, G-Sync haters, say now that Carmack and Sweeney are wrong.

      2. I think people like you are missing something. AMD’s Mantle makes their CPUs look much better compared to Intel’s because it makes much better use of multithreading, something which AMD’s architectures are very strong at, like the FX and Jaguar lines. One is in consoles, one is in desktop PCs, and their APUs also benefit from it, so you see there is an ulterior motive, but all people do is fight about something that’s really not the main issue as such.

        If you want proof of that, look at AMD CPUs with Mantle: if you’re CPU-bottlenecked, AMD CPUs benefit a lot. Also, the NVIDIA driver update made CPU-bound DX11 games perform much better.

    4. “AMD is forcing everyone to go with different APIs”
      At first I was thinking like that and I was angry about it. But they pushed MS’s s*itty DX forward with their API, Mantle. I was proven wrong. AMD doesn’t own the major share of the PC market, it comes after Intel and Nvidia, but it’s a nice thing for AMD users and a nice thing for others, because they pushed DirectX forward. Thank god an API war didn’t happen.

  12. What all you Nvidia haters fail to understand is that Nvidia….

    1 Does not want to support Mantle because it’s under AMD’s control.

    2 Why would they want to split up driver dev teams to work on DX12 and Mantle when they can go full force on DX12 which will be used by all gaming hardware in the PC market…

    3 Nvidia makes GPUs, not CPUs! If anything, Intel should be more worried about Mantle than anybody else in the market.

    4. STOP acting like 12-year-olds on IGN yapping about how the PS4 is better than the Xbox One.

    1. $0N¥ paupers are desperate now, because this will bridge PC and Xbox development even more and make it a no-brainer for developers.

    2. Let me address point number 2, because it shows you don’t understand how Mantle works and how NVIDIA PR is messing with you.

      With Mantle you don’t have to have a driver team working on it like you have with DX11. With Mantle the GAME DEVS are the ones who optimize the game, not the hardware vendors.

      For example, BF4:

      DX11 optimizations are made by AMD, with driver releases.

      Mantle optimizations are made by DICE, with patch releases. So far they have released 2 or 3 patches with Mantle improvements, frame-pacing fixes, etc.

      So NVIDIA is just misinforming people.

      1. Again, totally wrong. The optimization patches are not the same as driver patches; they’re fixing teething issues that come from using a new tech, in this case the Mantle API. Are you seriously going to claim AMD are not doing any BF4 driver work?

        1. You’ve proved, again, that you’re clueless.

          I’m not even going to waste time; get informed and then talk to me.

          1. Ah look, your stupidity is really shining through today. Check the AMD driver info: they’ve been doing Mantle-specific driver tweaks.

          2. The guy doesn’t even read the proof.

            “Mantle performance for the AMD Radeon™ HD 7000/HD 8000 Series GPUs and AMD Radeon™ R9 280X and R9 270X GPUs will be optimized for BattleField 4™ in future AMD Catalyst™ releases. These products will see limited gains in BattleField 4™ and AMD is currently investigating optimizations for them.”

          3. Ah, so you think the API is finished? They have it in a closed beta program just because?

            And I’m the stupid one? You’re the ignorant twits who ignore the context just to push your butthurt points.

            So yeah, AMD will be optimizing Mantle until its final release, maybe because the API is in BETA.

            If you don’t know what BETA means, I can’t help you there.

          4. DICE must be stupid, or AMD is paying them to beta test their API in their stable game. I heard bad things about BF4 and Mantle; turns out DICE exposed their player base to a beta API in a stable, final-code game.

          5. Oh my god, I have to constantly correct you…

            DICE isn’t stupid at all, they were the first to deliver a game that supports the most advanced API in the market.

            And if you want to criticize, point your finger at Frostbite. Frostbite =/= DICE. DICE is a game studio; Frostbite is one of EA’s R&D departments.

            They exposed their player base to a game that fully supports a generic API called DirectX 11.2, and also gave them EARLY ACCESS to the best API in the market.

          6. How is this statement related to what I said? You claimed that AMD doesn’t do driver updates for Mantle, and I corrected you; what has that got to do with Mantle being in beta?

          7. Since you came in acting like a big shot, you set yourself up for failure.

            There is one thing you lack, and I’ve noticed it in other replies you’ve made: you lack the notion of context.

            Our dear friend John said this: “2 Why would they want to split up driver dev teams to work on DX12 and Mantle when they can go full force on DX12 which will be used by all gaming hardware in the PC market…”

            To which I replied something like: “With Mantle you don’t have to have a driver team working for it like you have with DX11.”

            I never said AMD won’t do drivers for Mantle; what I said is that the GAME OPTIMIZATION is in the hands of the DEVELOPERS, not the hardware vendors, like on DX11/10/9/etc.

            I dare you to check whether there is any claim of a “game performance boost of X%” for Mantle in the drivers, like there is for DirectX. What you will most likely see are bugfixes that may harm the performance of the game. Why? BECAUSE MANTLE IS STILL IN BETA.

            And that’s how it should be done.

          8. You claim I lack context, yet you reply to me and question me about what someone else said as if I had said it?
            If you’re incapable of holding more than one conversation at a time, the forums aren’t the place for you.

            You continue to attempt to mask your lack of knowledge, and your laziness towards gaining it, by being abusive and trying to turn conversations into nothing more than people slagging each other off and saying this brand’s better than that brand.

            As I explained, I’m currently running AMD graphics and I often change which brand I use, so you can’t call me an Nvidia (or AMD) fanboy just because I corrected you over something you claimed that was wrong.
            It’s also worth pointing out that if I were an AMD (or Nvidia) fanboy, I wouldn’t want someone like you making things up as he goes.

            People want to know the facts and to hear about the experiences people have with the products. They don’t want to listen to idiots waving a flag shouting “go go (insert brand) gadget”.

            And lastly, you said this: “With Mantle you don’t have to have a driver team working for it like you have with DX11. With Mantle the GAME DEVS are the ones who optimize the game, not hardware vendors.” So don’t start telling lies about what you did or didn’t say.

          9. I quoted someone else to you because that’s where all of these posts began. If you can’t cope with that, I just can’t help you there.
            Maybe your first post was just so out of context that you think you started a conversation from thin air.

            I attempt nothing but to show the light to those who refuse to see it.

            I have two builds: one holds a 780 Ti (home office), the other holds a 290X, both with great Intel CPUs – a 4670K and a 4770K respectively. So I guess there goes the fanboy thing, no? My god…

            Exactly, and I say that again. With Mantle you don’t have to have a driver team WORKING FOR IT LIKE YOU HAVE WITH DX11.

            Because currently, my limited and simple friend, most of what drivers do is game optimization. And that’s just wrong. That’s what John was talking about – the driver teams optimizing drivers for specific games (“X% performance gain on this game, X% on this other, etc.”). With Mantle you won’t see THAT KIND OF DRIVER RELEASE, because, and I quote myself:

            “With mantle the GAME DEVS are the ones who optimize the game, not hardware vendors.”

            It’s exhausting trying to make you see the light; it’s like explaining things to a five-year-old. You either do it on purpose, or you surely have issues. GEEZ!

            Before you have a meltdown, I’ll post you the definition of a driver from Wikipedia: http://en.wikipedia.org/wiki/Device_driver

            Do yourself a favor and educate yourself.

          10. You made a post, I corrected it, and that has nothing to do with what someone else wrote beforehand. Also, you’re the one continuously throwing fanboy accusations around. What’s with the CAPITAL LETTERS? Saying it louder doesn’t make it right, son, or was that the way you were brought up? You’re a waste of space and posting replies to you is a waste of time. You’ll never learn a thing with your attitude.

          11. There you go! See it’s not that hard!

            You just said it – you ignored the context of the conversation.

            Capital letters are used to help people understand what is written, focus their attention because they are big – so they must be important – since we don’t have a way to highlight with bold.

            Saying it louder? OK, that explains a lot… written letters… saying it louder. Yeah buddy, that’s it 🙂

            I didn’t realise you were a special little guy!

            You had to avoid what I said because you either didn’t understand it or you simply saw how dumb your previous posts were xD

            Get well, see a doctor!

            Love,
            Jizzus

          12. Sorry for being rude, but honesty is the best policy. The amount of drivel you come up with is shocking. You’re obviously one of those children who needs to have the final say, so feel free to make up whatever you need to in order to feel better about yourself.

          13. You are not being rude; you just clearly show you have no understanding of the matter.

            I think you have no clue what Mantle is, how it works, or how it’s similar to console APIs.

            By your logic, Sony, Microsoft and Nintendo release drivers to improve games… lol, I don’t even know what to say lol

      2. Really dude?

        “Mantle performance for the AMD Radeon™ HD 7000/HD 8000 Series GPUs and AMD Radeon™ R9 280X and R9 270X GPUs will be optimized for BattleField 4™ in future AMD Catalyst™ releases. These products will see limited gains in BattleField 4™ and AMD is currently investigating optimizations for them.”

        1. Oh wait!

          Maybe!

          Let’s just say… MAYBE!

          Maybe AMD is investigating because… are you ready?… MANTLE IS IN BETA! Just maybe!!!

          LOL, the things I read

          1. Using fully fledged games as your beta test, exposing users to bugs that they wouldn’t normally have in a stable game release. What a joke.

          2. Gamers were always able to play with the generic DX11.2.

            They just gave them EARLY ACCESS to the most advanced API in the market, and let them use it whenever they wanted to. The same API has had at least 3 patches with major fixes and stunning improvements.

            And they use it.

            Can you fail more?

          3. DX11 is mature and DX11.2 adds some new features; it’s not a f*cking beta API for testing. “Look, look, AMD got new toys, let’s play and expose AMD users to beta code that might f*ck up their game performance” (and it did).

    3. Sorry, but saying DirectX 12 will be used by all gaming hardware is not an accurate statement. OpenGL and Mantle could be used on all that hardware too, so devs can choose which API is better for their engine. I don’t want to be tied to an API developed by a company that treats PC gaming as a second-class thing. And did I forget to mention that DirectX 12 will be just on Windows 9? What about Mac and Linux? I would like to hear that Nvidia is working harder on their OpenGL extensions, but there’s too much money in between, and MS has that money, sadly.

      1. Yeah, MS messed up many things. It was Nvidia that showed off low-level OpenGL back at GDC in 2008…

      2. NVIDIA has the best OpenGL drivers for Linux; they also use OpenGL for their tech demos and for the Linux workstation side, which is about 15% of their business.

          1. Again, who uses OpenGL? Maybe a few, but what they are showing is clearly not the way things are in the market today. Yes, Nvidia can brag about their OpenGL demos, but clearly it won’t have much impact on gaming. Maybe they should address the problems that are present right now.

    4. Getting a little late to the argument here, but you got your point one wrong (more so now that the Mantle SDK is officially open): Mantle is free, NVidia could use it without paying a fee, and I think Mantle will actually grow larger considering that DX12 is still expected to launch next year (not sure about the date), only on Win 10 (being compatible with Win 8 and 8.1 if I’m not mistaken), without compatibility on Win 7, which is still a big part of the PC gaming community. And regarding your point three: Intel is actually showing interest in Mantle. Look, I’m not an NVidia hater nor an AMD fanboy, but I see very different practices on display (now more than ever given the AC:U fiasco) regarding new technologies.

      EDIT: I also read that Mantle could have helped avoid some of the performance issues seen in heavily populated areas of the game, since the NPCs’ AI is handled by the CPU. The specifics of that I don’t know but, as I understand it, it helps the multithreading work of the CPU cores. Maybe I got it all wrong though 🙂

    1. If DX12 delivers what Mantle does, it will surely kill Mantle. But if not, Mantle all the way.

  13. I’m such a die-hard fan that the only piece of AMD hardware I own is an R9 290X that was gifted to me.

    AMD hasn’t gotten money from me since I bought an ATI 9600XT… Oh wait! That wasn’t money for AMD, it was for ATI.

    So much for die hard.

    1. You are, dude… you take everything AMD says like it’s golden bread and you eat it. It’s getting bad as of late.

      1. No, I take most of what NVIDIA says as trash.

        I take Mantle seriously and I see great potential in it. I’ve always supported it because it simply made sense.

        It had the power to move the industry… a simple BETA API made its way into shaping Direct3D in DX12, and it has massive developer support. You should put the green aside and actually read the information available on Mantle.

        The problem is that AMD is not making an effort to inform people; I see so many mistakes and ridiculous statements being made.

        The rest: I like GCN because most developers really enjoy working with it – just go to Beyond3D for example and check it out for yourself.

        I don’t give a single fck about their CPUs.

          1. Nice trolling on NVIDIA articles and speaking positively on AMD articles; you left a trail of AMD-fanboy breadcrumbs for me to follow.

          2. You left a trail of delusional NVIDIA fanboyism as usual. You mistake reality for fanboyism.

            Do you want me to say bad things about AMD?

  14. You are right, but that last statement wasn’t necessary. Nvidia has always been closed towards new technologies developed by others; OpenGL took its time to be implemented on Nvidia cards.

      1. Linux accounts for 1.5% of all desktop installations per user worldwide. Why would AMD waste time and money catering to 1.5% of people? If you’re mentioning Linux because it’s the father of Android, then you have a valid point: Android saw 48% OS distribution last year as opposed to 14% for Windows and 11% for OSX.
        If not, then mentioning Linux is kinda pointless. Its only real use at this point is either affordable corporate servers or as a toy for some of the more techy nerds in the world who are bored with Windows.

  15. “It’s possible to pull performance out of DirectX, we’re approving that,
    and so you can argue that maybe it’s not a good idea to put extra
    effort into yet another API that does the same thing essentially.
    Feature wise there is nothing more.”
    This means they can pull more performance out of DirectX, which is exactly what Mantle already does. But they said earlier that Mantle doesn’t bring any important performance boost. Now I know why Linus Torvalds was pissed off at Nvidia.

  16. The game developers must be part of that “bane” too, because they implement NVIDIA technology; it’s called added value. AMD offer nothing outside the DirectX spec, no innovation at all; they love DirectX so much they made Mantle instead of pushing DirectX or OpenGL.

    AMD’s drivers for other platforms are crap. AMD didn’t even push for OpenGL on Windows because they know it will go nowhere (Carmack actually did more than AMD could ever do); they made a new API and expect everyone to use it without properly consulting the other vendors first, and those vendors are now declaring they don’t want to use it. NVIDIA never wanted to use Mantle; it’s pre-optimised for GCN, a very biased API from the outset.

  17. NVIDIA’s Adaptive V-Sync switches on and off dynamically: if your frame rate goes over the monitor refresh it enables V-Sync, and below the monitor refresh it tears instead of dropping the frame rate (and causing judder) the way plain V-Sync does (a rough sketch of that rule follows after this comment).

    Note that nobody has actual comparison numbers between G-Sync and FreeSync yet. BTW, note that AMD demonstrated FreeSync on laptops; they’re actually quite different to desktops with regard to the core issue at hand.
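    For what it’s worth, the on/off behaviour described above boils down to a single comparison. This is only a sketch of that rule as described in the comment, assuming a simple frame-rate-vs-refresh test; it is not NVIDIA’s actual driver logic.

```cpp
// Sketch of the Adaptive V-Sync rule described above (an assumption drawn from
// the comment, not NVIDIA driver code): sync at or above the refresh rate,
// allow tearing below it so the frame rate isn't halved by V-Sync.
#include <iostream>

bool adaptiveVSyncOn(double currentFps, double refreshHz) {
    return currentFps >= refreshHz;
}

int main() {
    std::cout << std::boolalpha;
    std::cout << adaptiveVSyncOn(75.0, 60.0) << '\n';  // true:  V-Sync engaged, no tearing
    std::cout << adaptiveVSyncOn(45.0, 60.0) << '\n';  // false: V-Sync off, tearing instead of judder
    return 0;
}
```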

  18. So right now, using Mantle gets you almost no extra performance, surely not enough to talk about it this much. To really use Mantle to its full potential, I think, software must be written for it from the start, not have it baked onto the Frostbite engine or whatever. So by the time games written for Mantle actually come out, the extra performance Mantle can give you now will be available to everyone in the form of the new DX12. Mantle is just low-level access to hardware, something consoles have had for a long time; AMD tried to apply it to the PC and got Mantle. And that is great, BUT it’s a bit redundant, because they want to use it as something only they have that makes games run better on AMD hardware, and that is going to fail because DX12 will offer the same performance without being tied to AMD.
    Considering that, plus PhysX and G-Sync, there is no reason to prefer AMD hardware over Nvidia starting next year when DX12 cards come out.
