DX12 Features To Be Announced At GDC 2015, New GPUs Required For DX12’s Full Set Of Instructions

By now, most of you are aware that Microsoft’s new API, DirectX 12, will be exclusive to Windows 10. And while NVIDIA was quick to announce the GPUs that will support it, it seems – as expected, basically – that new GPUs will be required in order to take advantage of DX12’s full set of instructions.

Mike Ybarra, partner director of program management who leads engineering efforts for console and PC, told Polygon that while a number of DX11 and DX11.1 GPUs will support some of the features of DX12, new GPUs will be required for full support.

“To get the full support of DX12, users will need to get a new graphics card,” said Ybarra, who continued:

“The power and frame rate wins we demonstrated come from improvements in CPU usage in the OS runtime and device drivers.  And this was on DX11 devices.”

In short, those – theoretical – 50% performance boosts that Microsoft claimed will be enjoyed even by gamers who own a Fermi, Kepler or Maxwell NVIDIA GPU.

But what new graphical features will be supported only by DX12 cards? Ybarra did not reveal any additional details, but said that Microsoft will outline them at this year’s GDC.

51 thoughts on “DX12 Features To Be Announced At GDC 2015, New GPUs Required For DX12’s Full Set Of Instructions”

      1. It’s not like I’m buying it urgently… Of course I know about this issue; that’s why I’ll wait for a while until it is resolved.

        1. Maybe wait to see if it gets resolved, and if not, buy it anyway. Then return it over this issue for a full refund xD – a free card for some time.

        2. Most people with the issue are playing at 1080p, and no game at 1080p will eat up 4 gigs of VRAM… I played Crysis 3 with 4K DSR on and it ate all 4 gigs no problem.

          Even Max Payne 3 at 4K with 4xMSAA will only use 3.334 gigs, while at 8xMSAA it will want 5.777 gigs of VRAM, which a 970 does not have. But if you are going to play at 4K, you are not going to need AA that high unless you are on a 42-inch or larger 4K TV. (Rough math on what the render targets themselves cost comes after this thread.)

          I have a Gigabyte G1 Gaming 970

          1. I play games at 1080p… but I’d like to play at higher resolutions through NVIDIA’s DSR… and for higher resolutions you need more VRAM.
            Also, this issue isn’t affecting everyone; it might be a software bug as well… I hope NVIDIA resolves it soon.
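
          A rough sanity check on those figures (illustrative arithmetic, not measurements from the game): a single 32-bit color buffer at 3840 × 2160 is 3840 × 2160 × 4 bytes ≈ 33 MB, and 4xMSAA stores four samples per pixel, so even a color + depth pair at 4K with 4xMSAA comes to only about 33 × 2 × 4 ≈ 265 MB. The multi-gigabyte totals games report come mostly from textures, geometry and streaming caches, which scale with quality settings rather than resolution – which is why 1080p alone rarely fills 4 gigs.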

      2. It can’t be reproduced in games, maybe because all the VRAM needed gets allocated while the game loads rather than during play. People need to get a few things straight about the issue instead of blowing it up into something it’s not: it’s not an allocation problem, it’s a performance problem above 3.5GB, and it shows up when testing, not in real-world games.

        Stress benchmarks can always show problems that don’t show up otherwise.

        1. Nvidia has said it’s a software bug they are working on anyway.

          I’d still wait, as both NVIDIA and AMD will go into a price war once AMD announces the 300 series.

          1. NVIDIA has only said that they are looking into it. They have not said that it’s a software bug, but they suspect it is.

      3. Yeah, it’s bogus… playing games at 1080p will not max out your VRAM, even Crysis 3 with the highest SMAA…

        1. People aren’t testing it at only 1080p. There are people running games at 4K with AA and it never reaches 4GB. Not only that, the bandwidth goes to sh*t above 3.3GB.
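
        For readers wondering what “testing, not real-world games” means here: below is a minimal, hypothetical CUDA sketch of the general idea behind the stress tools people were running (not the actual benchmark that circulated). It allocates VRAM in 128 MB chunks and times a read-modify-write pass over each one; on a 970, reported bandwidth dropped sharply for the chunks landing in the last 0.5GB.

        ```cuda
        // Illustrative sketch only: allocate VRAM in 128 MB chunks, then time a
        // read-modify-write pass over each chunk to estimate its bandwidth.
        #include <cstdio>
        #include <cuda_runtime.h>

        __global__ void touchChunk(float *p, size_t n) {
            // Grid-stride loop: stream through the whole chunk once.
            for (size_t i = blockIdx.x * blockDim.x + threadIdx.x;
                 i < n; i += (size_t)gridDim.x * blockDim.x)
                p[i] = p[i] * 2.0f + 1.0f;
        }

        int main() {
            const size_t chunkBytes = 128ull << 20;   // 128 MB per chunk
            const size_t n = chunkBytes / sizeof(float);
            float *chunks[64];                        // caps the test at 8 GB
            int count = 0;

            // Keep allocating until the driver refuses.
            while (count < 64 &&
                   cudaMalloc((void **)&chunks[count], chunkBytes) == cudaSuccess)
                ++count;

            for (int c = 0; c < count; ++c) {
                cudaEvent_t start, stop;
                cudaEventCreate(&start);
                cudaEventCreate(&stop);
                cudaEventRecord(start);
                touchChunk<<<1024, 256>>>(chunks[c], n);
                cudaEventRecord(stop);
                cudaEventSynchronize(stop);
                float ms = 0.0f;
                cudaEventElapsedTime(&ms, start, stop);
                // One read plus one write of the whole chunk per launch.
                double gbPerS = (2.0 * chunkBytes / 1e9) / (ms / 1e3);
                printf("chunk %2d (%5zu MB allocated): %6.1f GB/s\n",
                       c, (size_t)(c + 1) * (chunkBytes >> 20), gbPerS);
                cudaEventDestroy(start);
                cudaEventDestroy(stop);
            }
            for (int c = 0; c < count; ++c)
                cudaFree(chunks[c]);
            return 0;
        }
        ```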

    1. The first DX11 video cards (Radeon HD 5850/5870) were released by AMD in September 2009. NVIDIA was pretty late to the game when it came to DX11 (over 6 months late, to be exact).

      1. Yeah, but the point is none of the hardware features were ever used in games. If I remember right, though, NVIDIA’s 400 series was its first line of DX11 cards, and it hit the market in spring 2010…

        Also, AMD’s 300 series is late to the party, since those cards will not start coming out until 18 months after the 290X hit the market.

        1. There’s a difference between “the first DX11 card” and “NVIDIA’s first DX11 card”. Also, we were on the topic of releasing cards that support a new graphics API, not card releases in general.

          AMD’s R9 300 series cards have an expected release date of Q2 2015 which is later than usual. With that said, their current cards are still great performers. And so are NVIDIA’s cards. It’ll be interesting to see how NVIDIA reacts to the release of AMD’s new cards with high bandwidth memory.

          1. 1st-gen HBM will not be all that impressive. It’s with the 2nd gen that the magic will come into play, and by that time NVIDIA will be using it too. That’s when the real AMD vs. NVIDIA battle over stacked VRAM will happen: AMD’s 400 series vs. NVIDIA’s Pascal.

          2. 640 GB/s of memory bandwidth isn’t impressive? Granted, first-gen HBM is limited to 4GB, but it’s still a huge improvement over the R9 290X’s 320 GB/s.
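
          For anyone checking those figures: peak memory bandwidth is simply bus width times per-pin data rate. The R9 290X has a 512-bit bus with 5.0 Gbps GDDR5, giving 512 / 8 × 5.0 = 320 GB/s; a 4096-bit first-gen HBM interface at the rumored 1.25 Gbps per pin gives 4096 / 8 × 1.25 = 640 GB/s.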

    1. The 900 series supports the DX12 hardware features, and so does the R9 285, but apparently those are not that important compared to the software improvements. DX12’s low-level nature is one of its most important features, since it exposes the hardware more directly to the software.

  1. Phil Spencer needs to shut his mouth. Everything that comes out of this… mouth is vague, stupid and undermining. We KNOW better than to pay attention to any of these deceptive promotions; it is just business as usual for M$, and anyone who has his hopes up should wake up.

    1. Where did you find Phil Spencer in this article? Seems like you came here to troll without the basic ability to read.

  2. I’ve been excited since DX7 and am still waiting for something revolutionary; DX12 sounds like they are just trying to catch DX up to Mantle/OpenGL features. Anyway, I wanted to upgrade to a Radeon 7950 3GB, and the AMD site says that GPU supports DX12, but then Microsoft came out with this. Kinda confusing.

    1. If the card is rated as “supports DX12”, take it at face value. AMD & NVIDIA have SDKs for DX12, so they know whether their cards fully support DX12 or not.

      Ignore the Microsoft “MUST-SELL-MORE” propaganda. NVIDIA came out & said Fermi, Kepler & Maxwell cards would be DX12-compatible, so this is probably some kind of attempted damage control in order to showcase how truly “advanced” DX12 really is.

      1. There’s no propaganda involved here, since Microsoft isn’t “selling” DX12. NVIDIA might want to sell the hardware, though. However, Fermi, Kepler and Maxwell are DX12-compatible as far as the majority of features are concerned; there are just a couple of additional minor hardware features that require DX12-grade GPUs. Technically, if you have a Fermi/Kepler/Maxwell card and want to experience DX12, you can stick with your current card.

  3. New cards? Marketing stuff & crap.
    And the games that support DX12 – when will they come?

    In 2014 we had very few DX11 games and a looot of DX9 games, so what’s the catch?
    NVIDIA pushes many expensive cards to play 20 DX9 games and only 5 DX11 games?
    Full DX12 will never appear ’till Q3 2016. 😉

    1. Well, at least we have W3, Arkham Knight and many other AAA titles. Last year’s sh*t optimization was probably big devs trying to sell games that should have run on DX12. If the speculated May-June release date of Windows 10 is true, then most Windows users will have upgraded to W10 and DX12 for free by the time those titles come out. The reason DX9 is still so popular is that the performance difference is small and requiring full DX11 limits the player base to about 50%. Also, games can still look beautiful and run well as long as 1. the engine is good and 2. the game doesn’t have much CPU work (AI, etc.) – see The Vanishing of Ethan Carter.

  4. The AMD R9 series cards are actually DX11.2 cards, but I suppose there is some sort of overlap between DX11.2 and DX12 that makes for minimum support of the new API.

    1. Nope. They only have DX12 support for the feature levels that were decided at that time. DX12 has multiple feature levels, and it’s still a spec that’s not finalized – a work in progress.

  5. So, DX10 was unified shader support, DX11 was… tessellation? Maybe DX12 will finally give us better AI or physics, or maybe ‘free’ anti-aliasing. That would be nice.

    1. They’ve already had DirectCompute since DX11 for AI and physics, as those are highly parallel computations (see the compute sketch further down this thread). And they will not interfere with engine makers – but they could support development, sure. New performance-free AA would be awesome, though. On the other hand, SMAA is very good and runs with a very low performance hit.

      1. Physics can be handled by the GPU, yeah (at a massive performance hit – PhysX is meant for SLI/dual-GPU setups), but since when are they doing hardware-accelerated AI? I don’t think I’ve ever heard of that.

        Now that I think of it, I guess ‘free’ AA will happen ‘naturally’ once cards reach a certain computing threshold, given that we now have native downsampling (NVIDIA just got this recently). Downsampling will only be ‘free’ in the sense that the card needs no computing overhead to detect edges; it will still need to compute the downsample, although cards already do that to an extent anyway.

        On another note, maybe DX12 will finally bring hardware-accelerated voice or facial recognition to GPUs. Eventually I can imagine a single GPU acting like an SoC, with physics, AI and shaders all working together. I think we’re currently hitting the limitations of the unified shader architecture in graphics cards – a shader engine is not a good solution for AI or physics calculations.
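
      On the physics point above: the reason physics maps so well to compute APIs like DirectCompute is that each body can be updated independently, one thread per particle. Here is a minimal, hypothetical sketch of a single Euler integration step – written in CUDA purely for illustration, since a DirectCompute version would be an HLSL compute shader doing the same per-thread work.

      ```cuda
      // Illustrative sketch only: one Euler integration step for n independent
      // particles. Each thread owns exactly one particle.
      #include <cuda_runtime.h>

      struct Particle {
          float3 pos;
          float3 vel;
      };

      __global__ void eulerStep(Particle *p, int n, float3 g, float dt) {
          int i = blockIdx.x * blockDim.x + threadIdx.x;
          if (i >= n) return;
          // Accumulate gravity, then advance the position: no thread ever
          // touches another particle, which is why this scales across a GPU.
          p[i].vel.x += g.x * dt;
          p[i].vel.y += g.y * dt;
          p[i].vel.z += g.z * dt;
          p[i].pos.x += p[i].vel.x * dt;
          p[i].pos.y += p[i].vel.y * dt;
          p[i].pos.z += p[i].vel.z * dt;
      }

      // Host-side launch: enough 256-thread blocks to cover every particle.
      void stepParticles(Particle *devParticles, int n, float dt) {
          float3 gravity = make_float3(0.0f, -9.81f, 0.0f);
          eulerStep<<<(n + 255) / 256, 256>>>(devParticles, n, gravity, dt);
      }
      ```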

    2. Brad Wardell said it’s mainly going to be revolutionary for AI and for handling draw calls. Stardock does grand strategy with a lot of AI processing, so he’s seeing massive improvements given how CPU-bound his game is and how much it relies on draw calls.

      I’m interested to see the extra feature sets DX12 has that will be announced at GDC. I wonder if we will have the 300 series announced by then.
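
      To put the draw-call point in perspective with purely illustrative numbers: if each draw call costs the driver roughly 40 µs of single-threaded CPU time, then the 10,000 draws per frame a unit-heavy strategy game might want would burn 400 ms of CPU per frame on submission alone – far beyond a 16.7 ms 60fps budget. Cutting the per-call cost and spreading submission across cores is exactly what the new low-level APIs promise.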

  6. The only thing we should be hoping for is that developers are ready for low-level APIs. If they are, then we will finally get the most out of our hardware, and it will not feel so out of date after a year or two…

  7. About damn time – there SHOULD be DX12-exclusive instructions that ONLY a new DX12 GPU can provide!

    That goes without saying, or else it’s just the same crap with better CPU utilization!

    Bring on Win 10 with DX12, as it will be the next frontier for PC gaming – and about damn time!

  8. If DirectX 12 is compatible with DX11 video cards, why isn’t Windows 7/8.1 compatible with DX12? (I’m not talking about the 2 useless features that NVIDIA is advertising for marketing purposes.)
