Over the past few days, Stardock's CEO, Brad Wardell, has been tweeting some really interesting details about DX12. As Wardell noted, in a recent test he saw an over-100fps difference between DX11 and DX12. The test was conducted on an unreleased GPU and, as Wardell claimed, the performance boost was achieved on a system that was 'way beyond console stuff'.
As Brad Wardell tweeted a few days ago:
https://twitter.com/draginol/status/567428591630426112
Wardell then answered a number of questions regarding this performance boost and the PC setup he used. As he claimed, he used a Crossfire system with either an Intel Core i5 or an AMD CPU (though the exact model was not revealed).
While conducting this new test, Wardell said that he was “moving around the camera and the unit AI was doing the rest.”
When a fan asked Wardell whether his PC setup was close to Xbox One's specs (probably in order to get an idea of the performance boost console gamers will get with DX12), Wardell replied that his machine was way beyond consoles.
https://twitter.com/draginol/status/567434702798458881
Now, while we don't know the specifications of the Crossfire system that Wardell used (especially which graphics cards were in it), we do know that DX12 will further boost performance on multi-GPU setups.
When a fan asked about some of the benefits of DX12, Wardell replied:
https://twitter.com/draginol/status/567435782789795841
Microsoft, Stardock and many other developers will reveal more details about DX12 at this year’s GDC, so stay tuned for more!

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers SNES to be one of the best consoles. Still, the PC platform won him over consoles. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, Voodoo 2. John has also written a higher degree thesis on "The Evolution of PC graphics cards."
So will this make AMD compete even more with Intel then, since AMD is cheaper?
After Mantle… I'm still very skeptical.
Why? They are two different APIs: one supported by one company, the other supported by three.
we'll see friend. I hope you are right.
Mantle is a huge success. Just not many adopters of the technology. DX12 is mainly a Mantle copy. Mantle is a game changer. The end
Mantle is just as good as DX12, try Star Swarm Benchmark and do the comparison (at least 300% improvement on my PC).
The problem with Mantle is that not many adopted it and not many really took full advantage of it.
Devs were slow to use DX11 and they’ll be just as slow to use DX12.
Maybe, but I think that if one developer releases a good DX12 game which performs really well and is praised by the whole media, the others would be stupid not to follow.
Yep. No point using DX11 when you don't need the extra effects. DX12, on the other hand, benefits all games that use it, so it will be adopted much quicker.
They were slow because of the adoption rates of Windows Vista and Win 8. Windows 10 being free to 7 and 8 consumers means there's going to be a massive amount of consumers moving on instantly, so there's already one huge reason to shift your dev environment to the new APIs.
Considering most devs are creaming themselves on twitter over DX12 and glNext, it will very likely be adopted from the start, as the whole barrier that prevented adoption before (that being a $100 new Windows purchase) is removed. When you add in both GLN and DX12 having what developers have been wanting for years, almost every developer will be straight onto DX12 and GLN as there is no reason not to do so.
Not exactly, they were slow cause consoles supported DX9 only. Even if adoption rates of Windows were higher, they still would’ve stalled.
Exactly. Things will move a lot faster this time around because the Xbox One will support DX12.
OK, OK, OK you got a point there, but what about people having to buy a new videocard with DX12 support just to join the party, isn't that another barrier?
We're waiting for the specifics of "Full DX12" at GDC. Everything else is compatible as far back as Fermi on Nvidia and likely since GCN 1.0 on AMD, so there's no barrier of needing a new card, which was another adoption problem before.
They seem to have their **** together on the adoption problem, software and hardware wise. We just have to wait till GDC to see how DX12 and GLN complete the puzzle.
It’s an exciting time to be a PC gamer :^)
Not to mention better hardware support. All GPUs supporting DX11 will support DX12 at the software level. The smooth consumer transition from DX11 to DX12 will make developer adoption very quick.
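For anyone wondering how a game or tool would actually check that, here is a minimal sketch (assuming the D3D12-style interfaces shown in early previews; details may change before release) that probes the primary GPU for D3D12 support at feature level 11_0 without creating a full device:

#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
using Microsoft::WRL::ComPtr;

bool GpuSupportsD3D12()
{
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return false;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter)))  // primary adapter only
        return false;

    // Feature level 11_0 is the floor being talked about for existing DX11-class cards.
    // Passing nullptr for the device pointer asks "could this be created?" without creating it.
    return SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                       __uuidof(ID3D12Device), nullptr));
}

int main()
{
    std::printf("D3D12-capable GPU: %s\n", GpuSupportsD3D12() ? "yes" : "no");
}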
Sounds like you’re hoping..
What?
I don't think so. This time around we have the Xbox One being fully compatible with DX12, and DX12 is a huge upgrade in terms of performance, unlike DX11. DX12 is also already supported in Unreal Engine 4.
It's probably the rumored 380X, the only unreleased GPU close to market right now.
It’s the GTX 980 Ti
Yeah, Wardell said Crossfire so it's def AMD.
Actually, the 380x is rumored to basically be a 290x
Pretty cool. Even though it will be very expensive. I'm skeptical about doing this with a $250-300 card… but in time every expensive card ends up at the bottom and gets beaten by some budget cards. This means in a few years a 750 Ti-like card will play 4K just fine.
You’re asking for too much XD
One high end card runs games with minimum frame rates dipping lower than 20fps today, you need 2-3 cards to get high frame rates at 4k. And you think a low end card will handle 4k fine in a few years? Right..
You still need fairly high end cards to maintain high frames @ 1080p in new games, ignoring higher resolutions.
I think the guy misunderstands what Direct X12 and Mantle are doing by efficiently using more CPU cores to feed the DGPU. A 750ti should easily be fed by 90% of processors that are 2 to 3 years of age meaning you will only see a low fps boost at the current standard resolutions. The 750 ti will never hit 4k resolutions … just not powerful enough.
YOU do NOT need 2-3 cards to get high fps at 4k, I do it with one 780, sure I can’t max every damn game out fully with high levels of AA, but usually no MSAA and/or a couple settings knocked down gets me 60fps. Only reason you need 2 cards is if you want to max out all your games at 4k. I’m fine with mixtures of high and ultra at 4k. Works fine until something like the 1080/390 comes out.
why do you need AA at 4k ? Does it still make a difference ? What display do you use ?
1. AA does make a difference, but you don't need near as much, if any (it's one of the first things that I turn off if I run a game at 4K). 2. I use an Acer IPS 1080p display and I downsample to 4K in many games via the NCP. About the same performance impact (a little more actually) as running on a native 4K display.
I can bet my hand that the test he used was Star Swarm or something similar in concept – a sh*t load of draw calls. It's no surprise that DX11 can't handle this amount, but in a real-world scenario draw calls aren't everything (but they still matter).
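To make the draw-call point concrete, here is a rough D3D11-style sketch of what a Star Swarm-like scene asks of the API; the Unit struct and the loop are illustrative, not real engine code. Every visible object costs a handful of state changes plus a draw, all of which go through the single immediate context and pay driver overhead on one CPU thread:

#include <d3d11.h>
#include <vector>

// Illustrative only: one entry per on-screen ship/unit.
struct Unit {
    ID3D11Buffer*             vertexBuffer;
    ID3D11ShaderResourceView* texture;
    UINT                      indexCount;
};

void DrawUnits(ID3D11DeviceContext* ctx, const std::vector<Unit>& units)
{
    const UINT stride = 32, offset = 0;
    // In a Star Swarm-like test this loop runs 10,000+ times per frame.
    for (const Unit& u : units)
    {
        ctx->IASetVertexBuffers(0, 1, &u.vertexBuffer, &stride, &offset);
        ctx->PSSetShaderResources(0, 1, &u.texture);
        ctx->DrawIndexed(u.indexCount, 0, 0);  // each call pays per-draw driver overhead,
                                               // serialized on a single CPU thread in DX11
    }
}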
A GTX 750 Ti OC already plays games better than the PS4 and Xbox One. Now with the help of DX12 we can hope for better framerates.
I have an i3/750ti $538 budget build. It outperformed the PS4 in cross platform titles in terms of FPS, resolution and graphics… except for any Ubisoft game. Ha. I think the 750ti is like $30 cheaper for the overclocked one now too!
I think the main reason why is that weak 8-core Jaguar. I mean, people don't even realize how weak it is; in my benchmarking scores some things were actually slightly faster with the Cell CPU in the PS3!!!
That 8-core was a very bad choice, and even using a dual-core Intel CPU would lead to gains, something they should have thought of. I just think AMD was pushing multi-core processing so much that it got to Microsoft's and Sony's heads.
TBH the GPU in the PS4 is actually quite nice for a console and good enough for 1080p 30fps playback; it's the CPU holding it back.
you need to do more research
What type of benchmarks have you actually used on both systems and what are your personal results?
Do you guys really want to pay $1000 for a ps4? jaguar 8 core is fine for the console
No it's not. A dual-core Intel would have been a better choice; the point is that in some things the 8-core is slower than the Cell CPU in the PS3. That Jaguar is a horrible choice and a pain to work on (like the Cell).
Many devs complain about it. The GPU in the PS4 would actually be quite capable of 1080p 30fps if the CPU didn't hold the performance back. Actually, it's to the point where GCN might have to pick up some of the physics and AI work.
Many dev’s complain about it? LOL i seriously doubt you know anything about game design let alone any devs
Yeah, we can go to Ubisoft, we can go to EA, we can go to many other devs that complain about its lack of performance.
http://www.mcvuk.com/news/read/ps4-and-xbox-one-s-amd-jaguar-cpu-examined/0116297
More sites claiming the CPU is just worthless for anything. Probably why the consoles can’t even keep a solid frame rate.
That’s the problem when you use an APU, mediocre graphics processing and mediocre central processing power.
Remember the power constraints the new gen of consoles faced.
Their choice of HW was very limited by that as well.
Cheers!
No, he is right about the Jaguar architecture, which was designed for low-power devices like tablets or netbooks, even if clocked to 2.5GHz.
Ubisoft’s cloth physics simulation actually ran faster on PS3’s Cell so they moved it over to the GPU.
So basically MisterC is the biggest xturd…. The fake insider thought the Xbox One ran on a "future GPU"! $3B R&D LMAO
No one cares about the APU trash boxes. Back to neogaf or dualshillers.
A downclocked GTX 480 is stronger than both the Xbone and PS4.
PS4 is using an enhanced HD 7850 with GDDR5 memory (5.5GB VRAM for games and 2.5GB for the OS/background applications) and the Xbox One uses an HD 7790 with 32MB of eSRAM memory.
And yet they still both underperform compared to an underclocked nearly 5 year old GPU.
The GTX 480 launched as a high-end card in March 2010. The Xbox One’s GPU is based on a low- to medium-end card (in between the HD 7770 and HD 7790). It’s unreasonable to expect high-end performance in a mass-produced gaming console.
It's also unreasonable to expect mid-end hardware from several years ago to be anywhere near the performance of modern mid-end graphics cards. The 7850 has little going for it over the 270 besides power consumption, and even then on a budget it's possible to squeeze out a 270X for a little bit more that trumps the consoles' graphics entirely. Call it the difference between a PC and a console with a year's subscription to their online service to compensate for the extra cost.
Not really. If you are making reference to Linus' video, they used a 6-core Extreme Edition as the CPU, which contributes A MASSIVE performance jump. If they had used the same CPU as in the Xbox One and PS4, both consoles would compare better.
Depends largely on the game. Many games are far more GPU dependent than CPU dependent. At least, the AAA games that follow the corridor set-piece style of gameplay most notably. Most games can run great with a mid-end GPU and an i3 processor. AMD's 270 is a great little console killer thanks to its price and performance, and unless you're going for very physics-intensive games, will be more than enough to handle mid-high settings at 1080p.
Well, it's not like they needed to use a 6-core CPU, as Eurogamer has shown that a two-generations-old i3 paired with a 750 Ti is enough to match or outperform the PS4 even in demanding games like Ass Creed Unity, and as you know the 750 Ti is below minimum spec for that particular game.
Which makes it especially terrible that it’s surpassed by the GTX 480?
GTX 480 is weak for new games, it lacks VRAM and is too slow compared to the GTX 680/780 and so on. You can't run new games at native 1080p, high to maxed-out graphics and 30-60 FPS at the same time with that GPU (GTX 480). Even a single GTX 980/970/R9 290/290X can't max out (which means all graphics settings to max) new games at 1080p with 60 FPS (you need SLI/CF for that). The purpose of consoles was and always will be: 1. Profitability (in the long run) 2. Affordable price. The PS4/Xbox One consoles are one year old, launch price was $400 for the PS4 (controller included); high-end GPUs alone are way more expensive. I'm not defending consoles in any way, but some uneducated PC gamers are going a little too far (stupid comparisons). BTW the Xbox 360/PS3 consoles were high end at launch (2005/06). For example, the Xbox 360 GPU supported some features before PC; those features were introduced for PC when the HD 2000 and HD 3000 GPUs came out. Not only that, it had 512 MB of unified memory, aka system memory and VRAM combined (the OS eats only about 32 MB), the Xbox 360 also has 10MB of eDRAM (also for graphics stuff), and on top of that it came with a tri-core CPU clocked at 3.2 GHz (the CPU was built from the Cell architecture). So, all in all… the Xbox 360 was high end in 2005.
“Even a single GTX 980/970/R9 290/290x cant max out (which means all graphics settings to max) new games in 1080p with 60 FPS”
http://i.imgur.com/VOpD6gP.jpg
My single GTX 970 is happily playing Shadow of Mordor, Evolve, Metal Gear Solid V Ground Zeroes, Borderlands Pre Sequel, GRID Autosport, Next Car Game, Assassin’s Creed IV Black Flag, Alien Isolation at maximum settings at 2560×1080 at 60fps
What about really next gen graphics then? Like Unity and Watchdogs? I bet it lags like a hag, which was exactly the point.
Ground Zeroes and Evolve are both on "next-gen graphics" engines and look far better than the games you mentioned while running beautifully. Unity and Watchdogs are both horribly unoptimized pieces of sh*t and I don't own either one. TotalBiscuit shreds both for their ineptitude in programming, as did most reviewers. Evolve is running on the newest CryEngine (not 3, the unnamed one after it) and Ground Zeroes is the first game out with the brand new Fox Engine.
Picking games that are notoriously horribly optimized as your examples(even on console they run like sh*t) fits your very narrow narrative.
I doubt Assassin's Creed IV Black Flag runs at a stable 60 FPS, as it is a poor port. But I gotta agree with you that current GPUs can easily play current/next-gen games at 1080p with 60 FPS, as the architecture gets better every year, not to mention DX12 coming out anytime soon, clearing up DX11's poor API structure.
You’re not defending consoles in any way, but you are defending consoles. Based on misconceptions of console fanboys, might I add.
There are plenty of guides on how to PC game on a budget, with builds made to show how you can make a console-priced PC that can outperform it, by being able to clock 60fps 1080p on multiplatform games at the same relative graphics settings.
And while the 360 was indeed pretty good for its time, it took PC about a year and a half to get the 8800 gtx, and beyond that it outpaced it by far. Going on about unified memory is pointless because it has certain detrimental effects on performance, and edram hasn’t been shown to be utilised in a way that improves performance noticeably.
Hell, I’m currently working on a YouTube video discussing options regarding budget gaming. This includes me going online with £350, and making a PC for that money to outpace a PS4. Including extra options and deals you will likely only see on PC, such as massively reduced game prices, peripherals, and taking into account the increased yearly cost of a console for online features.
I'm a PC gamer, so obviously I know the difference between consoles and PC. BTW "8800 outperformed consoles etc etc"… ofc, that's the purpose of consoles (profitability, relatively cheap at launch and ofc no upgrades possible).
Yes like AnonGGThrowaway already said, my gtx980 is maxing out nearly everything at 60fps 2560x1440p with the odd poor port that won’t run at 60fps even if I am running at 720p
Yes, you are defending consoles, and stupidly so, because they may have cost "only" $400, but then you have the $400 in mandatory online fees that gamers will need to pay if they want to game online that gen, which brings the cost up to $800, and if you can't max out games with high frame rates for $800 then you don't know what you are doing. Then of course we have more expensive game prices despite them being inferior to PC versions because of the license fees Sony and MS charge third-party developers.
Also, right now a £90 750 Ti is enough to match or outperform the PS4, even in demanding games like Assassin's Creed Unity, when paired with a two-generations-old i3, which you can read about week in and week out on Eurogamer.
Keep in mind a 750 Ti is weaker than the 7850 which the PS4 has, but also keep in mind that the 8-core Jaguar is the real bottleneck.
The PS4 doesn't have a 7850, it has two APUs with a Crossfire performance equal to an underclocked 7850.
I agree with you however the £90 cards are 1gb 750’s, not the 2gb ti’s.
I thought I saw someone mention that the 750 Ti could be found starting at $95? I may have made a mistake, mate, as I buy things in £s.
Your rant makes no sense to me, you start by saying that the GTX 480 is too slow and end saying that the Xbox 360 was high end in 2005. Good for you, I guess, but nothing you said has anything to do with the GTX 480 being faster than the Xbox One’s GPU.
My point was not to say that consoles are stronger than PCs. Consoles obviously can't win the "performance" game in the long run. The Xbox 360 was just an example (high end at launch), but as we all know it's not upgradable, so it obviously couldn't keep up for long. The Xbox One GPU, unlike the GTX 480, has more VRAM (VRAM is a bottleneck on the GTX 480), and the Xbox One GPU is much more efficient; they are using a low-level API currently, but DX12 is coming soon as well, plus tiled resources for Xbox One, and the Xbox One GPU supports new functions, shaders etc., all of which means the Xbox One GPU will be able to use many more optimisations in the future… so, it's not only about power, software and function set matter as well. And AMD and nVidia aren't innovative at all, they just rebrand new GPUs (only 10-15% changes on average in many titles), no big and revolutionary changes when it comes to pure performance (functions are something different). So, it's nothing strange when we see an older GPU which can outperform the PS4/X1 GPUs or offer similar performance. But the point is, consoles are relatively cheap and they have their own exclusive games, features etc.
yes but ps4 and xbone have the advantage because games are made for their specific hardware. PC games are never made for your specific hardware unless you’re lucky
I really don't understand why people like you are so hyped. They are comparing DX12 with a stupid console. Even in six months tablets and smartphones will be more powerful than those crappy boxes.
API’s like DX12 and Mantle are game changing across all platforms ie console, pc, mobile
LOL, DirectX 12 on mobile won't have the slightest impact on anything; OpenGL Next is the future there.
The ignorance is strong with this one
Yeah, since DX12 is only supported on Windows OS, so yeah, you're truly the ignorant one. Edit: and Mantle only works on Windows atm. OpenGL Next is the future for PS4, Android, iOS and Linux distros. Mantle and DX12 have zero marketshare in any of those unless Microsoft opens up their API.
for real?
Probably no games in 2015 will use DX12, so why care about it?
Why not Halo 5? MS are the only ones who have the latest upgraded version of DX12.
who cares about halo here?
Halo..it’s a PC news site.
Wardell’s already said the major 3rd party devs have it and some 3rd party games this year will use it that are being released for the holidays but it’s guaranteed 2016 for everything else. DOTA 2 will be using GLN when it’s ported to Source 2 as well.
There will be new API games this year. That’s why everyone should care.
I doubt that sir
So the guy asks if the unannounced graphics card is similar in specs to the 5 year old chip which powers the Xbox One? HAHAHAHAHAHAHAHA!
Should ask about the 5 year old chip that powers the PS4 as well, just to be clear :^)
Seriously……. *Hurt is prevalent in this one.*
Oh please. We’re equal opportunity here at DSOG. We declare both consoles as outdated trash :^)
No, he is pissed off that the Sony fanboys assume that the Xbone is crap but the PS4 isn't. WAKE UP, the PS4 is also weak.
The HD 7850 was released in March 2012. It won’t be 5 years old until March 2017.
Yeah…… Some people have seriously high hopes.
So they figured out how to make 2+ GPUs stack VRAM; now how about fixing the issue that has been plaguing multi-GPU setups since they've been around, micro stuttering? All that's been done so far is driver optimizations to minimize it, but nothing to solve it.
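On the multi-GPU point, the way this has been previewed is that D3D12 stops hiding the second card behind the driver: the application enumerates each adapter itself and creates an explicit device on it, which is what makes per-GPU memory addressable and puts frame pacing (and therefore micro-stutter) in the engine's hands rather than the driver's. A minimal sketch, assuming the D3D12/DXGI interfaces from early previews:

#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <vector>
using Microsoft::WRL::ComPtr;

// Create one explicit D3D12 device per hardware adapter (e.g. each card in a Crossfire/SLI rig).
std::vector<ComPtr<ID3D12Device>> CreateDevicePerAdapter()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue;  // skip WARP / software adapters

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            devices.push_back(device);
        // The engine now decides what lives in each GPU's memory and when each GPU's
        // work is submitted, instead of relying on the driver's alternate-frame-rendering heuristics.
    }
    return devices;
}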
So, while it will not be a panacea for the XB1, DX12 will provide some performance increases, if for no other reason than that the coding will not be as complicated for the games that support it.
No, it's going to be a huge performance increase on PC, as the DX11 and OpenGL APIs were already constrained by the PC operating system abstraction layer (despite hardware outpacing the consoles already). DX12 and GLN remove that layer and allow full hardware access.
It will be a placebo on the bone but PC gaming will have nothing but massive performance gains out of it.
So, what is the whole “no” about? I never mentioned the PC.
don’t be dumb
The only dumb thing is making an assumption I understand what someone else is talking about. I have no idea what the “no” is referring to since I agree with him. Of course instead of calling me dumb you could explain it to me?
Now we wait for glNext! 🙂
Which will have the same uptake as OpenGL! 🙂
If this is true, WINDOWS 10 will be my MAIN OS !!!!
lol,
DirectX 10 was released back in November 2006; only a few titles fully implemented it, even 3-4 years later.
DirectX 11 was released in 2009; two years later, there were only a few titles fully implementing it.
DirectX 12 release date? Probably June 2016 or later. There won't be any released PC games fully implementing DX12 before late 2017 or the first half of 2018, and I'm being optimistic.
That's because the PS3 and 360 were both limited to DX9. DX11 is supported on the PS4 and the X1 will support DX12.
Then that just means systems like mine will be better off/suited for even longer than I had anticipated. I’m not complaining about that. If you have somewhat newer hardware today it is going to be “mostly” backwards compatible with DX12……minus some hardware specific features of course.
“One thing it does is make it easy to treat multiple GPUs as a single entity.” “TICKET PLEASE!!”
The adoption level will be magnitudes higher than previous editions simply due to the fact that the OS is made from the ground up with DX12 in mind. Not to mention the OS is going to be free for the first year of its release. The advantages of this DX release are unprecedented in comparison to the last couple. The last good DX IMO was 9, but this one has me stoked.
DX11 wasn't a big boost in perf from DX10, nor even DX9 for that matter. Most of what it did was add more special effects like HQ bokeh, tessellation, etc., which slow down the GPU. Sure, there were perf increases, but they were minuscule compared to the perf loss when applying the aforementioned FX. DX12, however, benefits the gaming industry as a whole by making it easier to optimize games. I can guarantee you that at least most of the AAA devs will use it. They can't afford not to when open-world games which require sh*t tons of drawcalls are trending, unless they want to be the next UbiLOL.
Someone's salty about their APU station being about to get murdered by PCs with glNext and DirectX 12, so they have to spread FUD even though DX12 is confirmed to launch with Windows 10 this year.
I included a helpful chart for dealing with your buyer's remorse before you run back to Dualshillers crying about GPGPU magically upgrading their games. Which is useless on the trashstation 4 since it needs a high-powered GPU and CPU. And the POS4 has neither :^)
Fastest selling console ever, 19+ million PS4 sold in barely more than a year and counting, what’s hard to accept about it? :p
It’s still a piece of s**t :^)
Huh.. Why don’t you go signing yet another petition to get Bloodborne on PC? lol
Why don’t you go back to shilling for The Odor 2/5 :^)
Have you morons seen those petitions? They barely got 1,000 people, and that's just when the game is important; Halo petitions got like 500 people.
It’s still a piece of s**t :^)
” what’s hard to accept about it”
That it's a weak piece of crap with no games; might as well claim the Wii was the best-selling system and therefore the best console.
Mantle release Feb 2014. Battlefield 4 Mantle implementation Feb 2014. Direct X 12 will have similar results but on a much larger scale.
Greatness is coming….
One benchmark showed that these low-level APIs only benefit you greatly if you are CPU bottlenecked. Comments?
If you had paid attention, even 4- and 6-core CPUs are bottlenecked by DX11. DX12 unleashes even more fury in 4- and 6-core CPUs.
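The "unleashes more cores" claim refers to command-list recording: instead of funneling every draw through DX11's single immediate context, several threads each record their own command list and the results are submitted together. A rough sketch using D3D12-style interfaces from the early previews (RecordDrawsForChunk is a placeholder, not a real call):

#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>
using Microsoft::WRL::ComPtr;

// One frame's worth of rendering work, split across workerCount CPU cores.
void SubmitFrame(ID3D12Device* device, ID3D12CommandQueue* queue, int workerCount)
{
    std::vector<ComPtr<ID3D12CommandAllocator>>    allocators(workerCount);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(workerCount);
    std::vector<std::thread>                       workers;

    for (int i = 0; i < workerCount; ++i)
    {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocators[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i].Get(), nullptr, IID_PPV_ARGS(&lists[i]));
        workers.emplace_back([&lists, i] {
            // Each core records its slice of the scene's draw calls in parallel.
            // RecordDrawsForChunk(lists[i].Get(), i);  // placeholder for real recording code
            lists[i]->Close();
        });
    }
    for (std::thread& t : workers) t.join();

    // Submit everything in one go; the GPU sees one stream of work.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
}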
DX12 is exactly what PC Gaming needed for over a decade. I know results may not be as big as best case scenario benchmarks but it’ll still be pretty big and will make it a lot easier for devs to work on PC. Some devs like Ubisoft will still be doing their thing but who cares.
Pretty exciting!
dx12 is not going to give 100fps increases, if you believe this sh*t you deserve to work at ubisoft or something
Some very exciting times ahead, I’m looking forward to GDC in a few weeks.
LOL!! Really? Same thing they said for DirectX 11; I'm still here sitting and waiting for the FPS boost DirectX 11 was supposed to give to our videocards.
It did give a boost on the GPU side. Only, that boost was negated by tessellation, PCSS shadows, etc., all those GPU-intensive effects. DX12 will boost CPU more than GPU, if my understanding is correct.
It’s a Sony peasant. Don’t bother.
So basically that means the so-much-touted DX11 is giving out 10% of the intended performance? What a piece of junk!! ^_^
Really embracing that peasantry by not being able to read. Guess you got tired of defending the odor 800P and try to damage control your APU tardbox’s inability to compete with PC’s which are being held back now and will be unleashed later this year. Not even Sony selling off some more buildings can stop this train :^)
Isn’t it true that according to this article DX11 is performing at 10% of DX12? I rest my case.
Funny since the much touted DX 12 is the focus of the article. Someone’s in damage control mode. Back to shilling The Odor 1886.5 :^)
lol, that's adorable… is his PC close to "xbone specs"?
On one hand, I like the idea that I don’t have to sacrifice quality for frame rates. On the other hand, I don’t think any of my older game titles will see this benefit (Anno 2070, World in Conflict, Crysis 2, etc).
Couple thoughts:
1) Nothing is really stopping MS from applying a custom-version of their DX12 to their XB1 in a future update; it would make up for some of the performance gap between it and PS4.
2) DX12 would certainly become another ironic pitfall in PC gaming to the ‘good enough’ crowd that can both hold onto their new systems for a lot longer and get a cheaper system to perform like a more expensive DX11 system. The death of high-end is near, unless companies overcharge to compensate.
3) I still find it amusing that people don’t focus on the fact that the point for DX12/Mantle is to reduce CPU overhead– which means defeating the purpose of Intel’s advantage over AMD by making every game GPU-bound; hell, it defeats the purpose of upgrading the CPU or even overclocking. But you guys are looking past that.
wondering if a plx chip would get in the way of this new ‘treating the multiple gpus as one gpu’ as it simply duplicates the i/o from one graphics card to the other?
Finally I get to use my GTX 690 as it should be used. I have been so frustrated with the number of games that do not take advantage of multi-GPU rigs.
I’m kind of disappointed in dsog with this article. The title seems very click baitish to me and the article as a whole doesn’t really point out obvious things that people need to know.
Like, if I were just a casual PC user and came across this article, what it's telling me is that future AMD GPUs with DX12 are going to have a 100fps gain over previous GPUs just from the changeover from DX11 to DX12.
That clearly isn't going to be the case, and my personal thoughts are based on previous advertising when it comes to new GPU driver releases/Mantle/older DX iterations, where they always give you those ridiculous graphs showing "30/40/50% increase in performance!" when in reality anyone who's at least a little tech savvy knows it's not true.
I don’t understand why xbox users still seem to think dx12 will give them anything noticeable. Phil Spencer himself has said on a few occasions that it won’t as well as many others. I don’t think dx12 will even give anything noticeable to PC users either so don’t get the impression I’m just hating on xbox.
Performance/graphical improvements are slow and almost imperceptible (especially as improvements as time goes by are getting less noticeable) and it’s crazy to think that dx12 is going to be some magic wand that gives all your PCs a huge boost. I’d be seriously impressed if it even gave a 5 fps improvement in dx12 enabled games.
I'm pretty sure I read somewhere (might not be true) that the reason there's such a big jump in fps with Star Swarm on AMD GPUs is that the DX11 drivers are broken for AMD and it runs poorly as a result, but DX12 (or maybe a fix from AMD) resolves this issue; I remember seeing that the fps Nvidia GPUs were getting was much higher right from the start on DX11 compared to AMD.
I usually find dsogaming a refreshing experience when I’m looking at articles as they get to the point, don’t skew the topic to make it bigger than it really is and are usually realistic in their interpretations of the information that is being given to them but this one seems like a miss to me with what I find to be the common hallmarks of a click bait article.
“I don’t understand why xbox users still seem to think dx12 will give them anything noticeable. Phil Spencer himself”
Because Phil knows everything and is always right, because MS would bother with DX12 if it didn't do anything for Xbox.
Nowhere did I say he was always right and I didn’t just say Phil Spencer was the only one to say this, he just so happens to be the one I can remember the name of who mentioned it hence why I said “as well as many others”.
I think the opinion of the head of the xbox division is worth hearing on this issue especially since it’s something that doesn’t promote xbox and if anything harms it.
What makes you think dx12 wouldn’t be worked on if it wasn’t for xbox? I think the large number of PC users in the world is more than enough to justify its existence as with the previous iterations of dx.
“What makes you think dx12 wouldn’t be worked on if it wasn’t for xbox?”
Ms amazing support for pc gaming….NOT
MS knows how to create software; glad this is why I game on MS platforms.
Not for less than $400. For that price, Dualshock 4 included, it’s an amazing piece of technology.
And for $400 you can get a PC that destroys it, and you don't need a cheap pad that has the rubber of its sticks wear off.
Back to Dualshillers
For $300 it's good, but "amazing piece of technology"?
AAHAHAHAHAHAH
Good one.
Yes, keep in mind this isn't about the 7850 being weak but about the 8-core 1.6GHz Jaguar being a POS.
The 7850 isn't weak, but the PS4's GPU suffers from lower bandwidth as it has to share memory bandwidth with the CPU. This site reported that memory bandwidth for the GPU can be as low as 140GB/s when the CPU is being worked. You are right though, the CPU really is the weak link in the chain.
Yes, that is indeed an issue, but at the same time it can share information without having to be recopied over, so it's not as big as you think and it's not really the big limiting factor.
Well, it's so far proving a bottleneck, as games are using no texture filtering due to the bandwidth limitation. Either way, the GPU still only has 140GB/s of bandwidth even if data doesn't need to be moved over PCIe.
And yet DirectX 12 will still not support windows 7. I bet adoption rates of Windows 10 will be slow, despite the ability to upgrade for free. I initially started to like where Windows 10 was starting to go, but that instantly turned into hate as they rolled in more updates to it.
I’d like to see OpenGL make a large comeback again.
EVERYONE, consider giving this a read!! http://blog.wolfire.com/2010/01/Why-you-should-use-OpenGL-and-not-DirectX
anyone tried radeon hd 7950 on dx12 yet ?
Microsoft and Sony both should have had Intel reproduce some older-gen i5 2400Ts at a way cut-down cost (thanks to the much cheaper manufacturing process of the larger-nanometer dies), with GDDR5 support enabled on the CPU, and gotten something along the lines of a 7870 XT discrete GPU with no VRAM, so they could use GDDR5 asynchronously between both GPU and CPU. Both consoles would have been able to get premium performance by using older tech with no increase in price, because licensing with manufacturers would drop the price of parts to a little more than the cost of the manufacturing process, due to the production quantity needed for consoles. Plus, if they did it this way, with how optimized consoles are, they would perform around 70% better in GPU-intensive applications and 200-300% better in CPU-intensive applications.