Gears of War: Ultimate Edition – PC Requirements Revealed – DX12-Only, 8GB RAM & Quad-Core Minimum

Microsoft has revealed (via the game’s Windows 10 store page) the official PC requirements for Gears of War: Ultimate Edition. According to the specs, PC gamers will need at least a modern-day quad-core CPU with 8GB of RAM and a DX12 graphics card. And as you may have guessed, this is another title that will be exclusive to Windows 10.

Here are the PC requirements for Gears of War: Ultimate Edition:

MINIMUM SYSTEM REQUIREMENTS

  • OS: 64-bit Windows 10 – version 1511
  • Processor: Intel Core i5 @ 2.7GHz or AMD FX 6-core
  • Memory: 8 GB RAM (2 GB VRAM)
  • Graphics: GeForce GTX 650 Ti or Radeon R7 260X
  • Hard Drive: 60 GB available space
  • DirectX 12

RECOMMENDED SYSTEM REQUIREMENTS FOR 1080P (1920×1080)

  • OS: 64-bit Windows 10 – version 1511
  • Processor: Intel Core i5 @ 3.5GHz+ or AMD FX 8-core
  • Memory: 16 GB RAM (4 GB VRAM)
  • Graphics: GeForce GTX 970 or Radeon R9 290X
  • Hard Drive: 60 GB available space
  • DirectX 12

RECOMMENDED SYSTEM REQUIREMENTS FOR 4K

  • OS: 64-bit Windows 10 – version 1511
  • Processor: Intel Core i7 @ 4GHz or AMD FX 8-core
  • Memory: 16 GB RAM (6+ GB VRAM)
  • Graphics: GeForce GTX 980 Ti or Radeon R9 390X
  • Hard Drive: SSD + 60 GB available space
  • DirectX 12

 

162 thoughts on “Gears of War: Ultimate Edition – PC Requirements Revealed – DX12-Only, 8GB RAM & Quad-Core Minimum”

  1. As I figured. They are selling GoW UE via the Windows Store. So it’s a UWP App. Great.

    I hope they enjoy low sales

        1. I bet once more games hit, people will excuse the fact that DX12 does nothing but demand more power, instead of saving or making better use of the power you already have.

          2 years ago we were told the power of DX12 was going to make gaming more refined and smooth, with your parts being used properly; all we see now is console ports that suddenly require even more power than the standard for other AAA console ports. Even with that 980 Ti requirement for max settings at 4K, we won't be getting absolutely smooth performance with a constant 60fps; you'd need two cards, and that creates a stupid requirement rift and a massive chunk of money for so little gain.

    1. Of course, it makes use of newer technology. Everyone that bought cheap builds using the Pentium G3258 just to bash consoles is now stuck upgrading their rigs (those that didn't already upgrade for Far Cry 4).

    1. I’m probably buying it Day 1 if the port is good. I loved the first Gears and still think that it’s the best in the series.


  2. Windows 10 is being treated like a console now, with its supposed system sellers and all this crap… It’s upsetting.

  3. I’ll see how it looks before I judge it.

    If this version doesn't look way better than the Xbone version, then I think that is MS' way of saying, "see, our console is way more powerful than you think."

    1. There is almost certainly some of this going on. Now we get to hear the doofus cabal at IGN say "see, you need a high-end computer to play even last-gen console games xd, can't see over 720p/30fps anyway xd". Can't wait.

      1. Many forget how difficult it is going from the 360 to a custom DX11 API for the Xbone and then on to DX12. Going from a serialized environment to a multithreaded one could alleviate many software bottlenecks. Everyone's still stuck on Crysis and its horrible software bottlenecks, and assumes Crysis ran badly purely because of its amazing graphics. Just wait till they get a load of Star Citizen looking gorgeous while not melting hardware, and watch them squirm in agony.

        1. Vulkan is the future. It is in its early stages and currently performs a bit worse than DX11. With optimization I really hope we can see huge increases.

      1. What are you doing?!? You can't come to the comment section and expect people to act appropriately, let alone know anything at all! I advise you not to come here to share knowledge.

      1. Oh look, it's one of those gamers who believes in Unicorns, the Easter Bunny and conslol optimuhzation lol. For the last two and a half years an i3 CPU and a 750 Ti have been enough to match if not outperform the consoles.

        For example, this is how a budget £350 PC managed to run the dreaded Batman AK, and that footage was taken BEFORE the game was pulled from sale so it could receive patches allowing it to run at 60fps:

        https://www.youtube.com/watch?v=VzvjjdXzyVs

        So I don't actually see what's streamlined about console development; it's not like Sony or Microsoft use magic or hardware warped in from another dimension. So to answer your question, the plebs won't realise that consoles run differently to PC because it's mostly errant nonsense. The only area where consoles "had" an advantage was that PC used DirectX 11, which only allowed one CPU core to send draw calls to the GPU. That has been fixed with DirectX 12 on PC, but even then it wasn't really an advantage for consoles, as they both use low-power Jaguar CPU cores designed for tablets, so even a dual-core i3 under DX11 was enough to vastly outperform the Jaguar architecture: one core would send draw calls to the GPU and the other core would help prepare said draw calls.

        Oh and lastly, NVIDIA and AMD do lots of low-level optimisations for their hardware, which works out well as they designed the actual hardware and understand it inside out, something a console developer would take years to achieve. So I hope this answers your question…

        1. You just spoke a bunch of nonsense. It is astronomically easier to code and optimize for 1 system which runs on the same operating system than to optimize for myriad hardware configs and OS’s. You even said it yourself, the consoles are using a low power tablet architecture for their CPU cores and an integrated GPU. How can you compare that to a desktop i3 and an overclocked 750ti? Indeed that should serve as a testament to what “streamlined” means. You can’t get that level of performance with the console’s APUs on regular windows 10. I would also like to see a game on PC that allows for dynamic resolution scaling.

          Also DX12 is irrelevant because hardly any DX12 games are out right now.

          1. The consoles are using low-power tablet hardware, however they have EIGHT cores to an i3's 2 cores, so that kind of evens out. Even then the i3 is usually seen outperforming the Jaguar CPUs by some distance; for example it's not uncommon to see i3s powering games at 60fps while the consoles are struggling for 30fps.

            How can I compare a desktop i3 and an overclocked 750 Ti, you ask? Well, for one thing the 750 Ti is actually WEAKER than the PS4's GPU, yet over the last two years it has managed to match the PS4 if not outperform it. Oh and did I mention that an i3 CPU and 750 Ti build costs only £350. So there's your special sauce optimisation: a weaker GPU in a £350 budget PC that was built 2+ years ago matching the performance of the …single platform… console… duhdaaaa

            You really don't know your earhole from your a$$hole; are you a console gamer by any chance? Don't you know that code compilers do all that converting work automatically? Then AMD or NVIDIA optimise the game and release drivers. The end…

          2. Please go to pcpartpicker and build me a mini itx based core i3 / 750 ti/ 8 gb system with Windows, a game, and a controller for $350. The reason why I ask is because I can currently purchase a PS4 with a free game and a controller for $299.99.

            Also, wtf are you talking about? A compiler isn’t going to do any kind of optimization work. A compiler just compiles the code. An extensive amount of work is required to make sure a game runs smoothly on all hardware. This work is infinitely easier to do on a console with fixed hardware across millions of users.

            Developers also create custom game profiles for console games. That's how a game can dynamically change its resolution without any input from the user. Please find me a game on PC that automatically changes settings without you pausing and changing them yourself. Every single time I buy a game and start it up, it takes me at least 30 minutes of changing settings to get it perfect. (i7 / 390 / 16gb / 1080p) You don't need to do that on a console. The reason why I do it is because I enjoy doing it, but not everyone feels the same way.

            Don’t get all PCMR on me friend as I don’t even own a console. I’m just telling you that you’re wrong. I don’t buy consoles because I can afford high end computers, but what I really dislike is the fact that PCMR idiots with false consensus bias cannot for the life of them see the value that consoles bring to people who do not want a gaming PC, don’t want to build one or maintain it, or simply want a platform that plays games with their friends.

            Also, for a GPU that was released an entire year after the PS4 was released, I would hope that it would at least match the PS4 – a small computer that runs on an APU.

          3. Look, Digital Foundry built a PC with the i3 and 750ti for £350, the Windows cost was £80 but I was comparing only hardware costs, that is unless you believe the £400 in online fees that most PS4 gamers will pay for online this generation isn’t considered when Sony price the PS4?

            I am not getting PCMR on you; what is it with spackers squealing Master Race any time their bullsh1t is challenged?

            A compiler converts the code easily, and AMD and NVIDIA make the optimisations for their hardware… which is why a lowly 750 Ti, which is weaker than the PS4, has been able to match or outperform it over the last two and a half years… the end.

          4. So you can’t build it then? That’s exactly what I thought. Digital Foundry also thinks that you can only overclock Nvidia cards so I’m starting to take what they say with a grain of salt.

            Also what $400 in online fees are you blabbering about? Their online service can be found for as low as $30 a year if you know where to look (slickdeals is your friend).

          5. What do you mean you can't build it? Also, even if you could save an extra $10 it's still $30 every year, which subsidises the cost of the console. Oh and let's not talk about finding software cheaper, when the PC CD key sites sell games for a fraction of the price of the console versions. Fact is most people pay £40 a year charged through their debit/credit card, so there's no blabbering; the online fees are a complete rip-off and the cheap console isn't really cheap. I mean they use Jaguar CPUs for fcuk sake, you know, those £33 Kabini Athlons…

          6. Also, DirectX 12 is irrelevant because hardly any games use it? All future games are going to be using DX12 from now on, so hardly irrelevant; hell, even games that have already launched are receiving DX12 upgrades right now.

            What does DX12 do for PC? Well, for one thing it takes better advantage of multicore CPUs, which means games will better max out i7s and FX-8350s. You want to know why i3 CPUs are even still relevant in PC gaming? It's because the PC version of DirectX 11 doesn't take advantage of multicore CPUs: only ONE CPU core can send data to the GPU to work on, and any other cores working on data for the GPU need to first hand it to the one core that has a direct link to the GPU. Now that DirectX 12 is here, ALL CPU cores have direct communication with the GPU; instead of handing their data to the core that has contact with the GPU, they can simply send it themselves, which unlocks the raw power of i7s, i5s and FX-8350s (see the sketch below this comment).

            That's the only area where consoles differed from PCs, and it has now been fixed thanks to Microsoft giving Windows 10 away free to ensure ALL PC gamers are using DirectX 12.

            So no secret sauce optimisations. AMD and NVIDIA know their GPUs better than any console developer, as they made them, and they have software engineers that optimise the performance of all the big games. Also, code compilers LOL
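
The DX11-vs-DX12 draw-call point made above is the core technical claim in this thread. Below is a minimal, illustrative C++ sketch of the D3D12 pattern being described: several worker threads each record their own command list (something DX11's immediate context could not do for submission), and the results are executed together on one queue. It is a sketch only, not anyone's shipping code: device/adapter selection, error handling, and actual draw recording are omitted, and the thread count of 4 is an arbitrary choice for the example.

```cpp
// Minimal illustration of DX12-style multi-threaded command recording.
// Build (MSVC): cl /EHsc dx12_threads.cpp d3d12.lib
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#include <thread>
#include <vector>

using Microsoft::WRL::ComPtr;

int main() {
    // Create a device on the default adapter (requires a DX12-capable GPU/driver).
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device)))) {
        std::printf("No DX12-capable device found.\n");
        return 1;
    }

    // One hardware queue receives the work produced by every thread.
    D3D12_COMMAND_QUEUE_DESC qdesc = {};
    qdesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&qdesc, IID_PPV_ARGS(&queue));

    // Each worker gets its own allocator + command list, so recording needs no locks.
    const int kThreads = 4; // arbitrary for the sketch
    std::vector<ComPtr<ID3D12CommandAllocator>>    allocators(kThreads);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(kThreads);
    for (int i = 0; i < kThreads; ++i) {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocators[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i].Get(), nullptr, IID_PPV_ARGS(&lists[i]));
    }

    // Record in parallel. A real engine would record draws/dispatches here;
    // empty lists keep the sketch short.
    std::vector<std::thread> workers;
    for (int i = 0; i < kThreads; ++i)
        workers.emplace_back([&lists, i] { lists[i]->Close(); });
    for (auto& t : workers) t.join();

    // Submit everything to the single queue in one call.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());

    // Fence-wait so the program does not exit before the GPU is done.
    ComPtr<ID3D12Fence> fence;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));
    queue->Signal(fence.Get(), 1);
    HANDLE evt = CreateEventW(nullptr, FALSE, FALSE, nullptr);
    fence->SetEventOnCompletion(1, evt);
    WaitForSingleObject(evt, INFINITE);
    CloseHandle(evt);

    std::printf("Recorded %d command lists on %d threads and executed them.\n",
                kThreads, kThreads);
    return 0;
}
```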

  4. Love this game. I'm so happy all the whiners with crappy PCs in here won't join the chat. Although it would be fun owning you with the gnasher and listening to your cries.

    1. Some of us have no problem matching the hardware specs, but we don't want to install Windows 10 or use the rotten damn Windows Store…

  5. As far as the Windows Store is concerned I am only interested in Quantum Break, and I will pay full price only if it is not as limited as Rise of the Tomb Raider was on the Windows Store (I got the Steam version, obviously, but heard nasty stuff about its Windows Store version).

  6. A 390X for 4K 60fps. I'm not sure what people are complaining about. Tomb Raider wasn't 60fps at 1440p on a Fury X…

  7. It's very difficult for me to complain when you meet the RECOMMENDED SYSTEM REQUIREMENTS FOR 4K.

    Though I hate that we cannot use SweetFX; then again, since it's DX12, SweetFX would not have worked anyway.

      1. I will be playing the game at 4K/60fps if those requirements are to be believed, so what else should I complain about?

        1. My question was regarding SweetFX + DX12. I have since googled it and it seems to have to do with the injection method. They'll find a workaround eventually.

          1. ReShade/SweetFX currently does not work with DX12 or Vulkan; we will need updated versions for them to work in games using the DX12 and Vulkan APIs.

  8. FU MICROSOFT, omg I can't believe they even dare to post these ridiculous specs.
    What did they smoke? Or sniff?
    Cocaine is a hell of a drug – Rick James

    1. Seeing as I could probably run this game on a 260X, I am pleased. Ridiculous requirements? They are good. A 980 Ti for 4K isn't bad, if one is able to achieve 60fps. If it were quad-SLI Titan X recommended for 1080p it would have been another story.

      1. Well, I am ready for 1080p maxed out at 60fps, so I hope Microsoft will not mess things up; it's been a long time since they released something for us PC gamers!

        1. 1920×1080 = 1080p = 1k
          2560×1440 = 1440p = 1.78k
          2715×1527 = 1527p = 2k
          2880×1620 = 1620p = 2.25k
          3325×1871 = 1871p = 3k
          3840×2160 = 2160p = 4k

          1. I bet that he copied those numbers directly from DSR factors on his 1080P monitor.

          2. 4k is called 4k because it has 4 times the pixel count as 1080p. Hence 2k and 3k will have exactly 2 and 3 times the pixel count as 1080p while also maintaining the standard 16:9 ratio and that is why we get those odd resolutions.

          3. Nonsense. 4k means 4 thousand. And what do you know, it turns out that the horizontal pixel count of “4k” is 3840, close to 4000. This is the real reason they named it 4k, not because “four times another arbitrary resolution”. Likewise, 3k and 2k are roughly that many pixels horizontally. 1080p is not “1k”, 1k would be 1024*576 or something like that.

          4. No, very obviously not. The number represents HORIZONTAL pixels, not vertical. 1080 is vertical, 1920 is horizontal. 4k is roughly 4000 horizontal, not vertical. Please stop for a second and actually think about this.

          5. k is just a nickname for the 1080p standard resolution; it does not literally mean 1000 in this case. Hence 4k is just 4 times the pixel count of 1080p.

          6. So it stands for vertical pixels in 1080p and for total pixel count in the multiplications of that resolution? Because F logic, right? lmao, you’re so far gone. Find me a source that says 1080p is “1k”. You can’t because it isn’t.

          7. The problem is that 8K, where 8 stands for 7680, is only valid for 16:9. For 21:9, 8K technically is 10240×4320. It's easier to go by the 2560×4 = 10240, 4×2 = 8K scheme.

          8. You have to count pixels:

            1k = 1920×1080 = 2,073,600
            2k = 2715×1527 = 4,145,805
            3k = 3325×1871 = 6,221,075
            4k = 3840×2160 = 8,294,400
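
For what it's worth, the arithmetic the two sides are using above is easy to check. The short C++ sketch below just reproduces the two competing interpretations: "Nk" as N times the pixel count of 1920×1080 (which yields the odd 2715×1527-style resolutions quoted in the list), versus the usual naming where "4K" refers to roughly 4000 horizontal pixels (3840×2160 UHD). The variable names are purely illustrative.

```cpp
// Checks the pixel arithmetic quoted in the thread above.
#include <cmath>
#include <cstdio>

int main() {
    const double baseW = 1920.0, baseH = 1080.0;   // 1080p reference
    const double basePixels = baseW * baseH;       // 2,073,600 pixels

    // Interpretation 1: "Nk" = N times the pixel count of 1080p at 16:9.
    // Scaling both axes by sqrt(N) multiplies the pixel count by N.
    for (int n = 1; n <= 4; ++n) {
        double s = std::sqrt(static_cast<double>(n));
        std::printf("%dx the pixels of 1080p -> %.0f x %.0f (%.0f pixels)\n",
                    n, std::round(baseW * s), std::round(baseH * s), basePixels * n);
    }

    // Interpretation 2: "4K" = roughly 4000 horizontal pixels (3840x2160 UHD).
    std::printf("UHD 3840 x 2160 = %.0f pixels = %.2fx the pixels of 1080p\n",
                3840.0 * 2160.0, (3840.0 * 2160.0) / basePixels);
    return 0;
}
```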

    1. Clearly they don't want to sell their game, I would say! I meet and surpass the recommended specs, but I won't give a single cent to Microsoft and their disgusting policy. This whole Windows Store, Win 10, DX12-exclusive thing with insane specs is just a farce. I feel sorry for the people that continue to support these companies by spending their money…

        1. Of course they won't sell much. It will stay a Windows Store exclusive, like Quantum Break, because the people at MS are a bunch of idiots; I can't explain their recent actions otherwise. I have worked in sales for years, and this forced, desperate "do whatever I can to sell my product" approach works on one target group but leaves a big target group that hears your name and changes path. There is no need for the policy MS has. They could sell even more games and OS copies if they had a different, less aggressive approach. It is so entertaining that people who work for a multi-billion-dollar company are not even able to see the basics. Or they just think of the consumer as a brain-dead addict…

    2. Is it that hard to think that Intel/AMD/NVIDIA/G.Skill/Corsair etc. paid M$ to reveal these ridiculous requirements in order to "force" people to upgrade?

        1. So you're telling me that the game will have longer loading times if you play at 4K instead of 1080p? That doesn't make sense…

          1. My point is that they recommend an SSD only for 4K. If the game had problems with streaming and needed an SSD, they would recommend an SSD for 1080P too.

  9. lol, Gears of War from Epic, the company that hates all things PC. But anywho, I won't be buying it. The Windows Store is just another GFWL failure.

      1. The game does have async, but in no magical world is the 390X comparable to the Fury X/980 Ti at 4K. The Fury X even beats the Titan X stock vs. stock in RotTR at 4K… this is utter nonsense.

        1. I'm also thinking of VRAM as to why the 390X was chosen over the Fury. Or the game isn't that demanding even at 4K, but it needs boatloads of VRAM.

  10. The 970 is pretty much the recommended card for AAA titles at 1080p, and has been for most of the recent major releases.

  11. The differences (if any, lol) we've seen between this version and the original DO NOT justify these requirements. I would rather play the original version with 5K downsampling and SweetFX and still maintain an acceptable frame rate.

    1. Yes, I figured that out from your comments. That is your issue, but you should look beyond it: MS' aggressive policy in games, their even more aggressive selling of their OS, the privacy issues, etc. are something everyone should be against!! If you are just a teenager I hope you will eventually understand things better besides just playing a game.

      1. You are free to not update your PC, but don't blame MS when you're left behind. And those privacy articles you've been reading are retarded. When YOU grow up, maybe you will realize that if there is a headline and no proof, it's bullshit.

    1. Well, that's not an issue anyway; it will make performance better if the games are async-heavy, so no high requirements are needed. Async Compute is a performance optimisation.

      1. It doesn't make sense, because an R9 290X is recommended. NVIDIA can't brute-force async; it's going to be shifted to the CPU, so an i5 or a strong multi-threaded CPU will be required. An i3 just won't cut it.

        Well, NVIDIA probably didn't bother with async because no one has been using it; AMD implemented it in their older cards but it never got used. NVIDIA focused on DX11 and tessellation, and it's worked out well for them, which is why AMD lost out in geometry-heavy DX11 games. DX12 is all about shaders, unlike DX11, which is mostly about geometry, something AMD has been weak at.
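
"Async compute", as discussed in the replies above, just means submitting work on an additional compute-type queue so it can overlap with graphics work; GPUs with hardware schedulers for this (AMD's ACEs) tend to benefit the most. The C++ sketch below shows only the queue setup that enables it on D3D12: a direct (graphics) queue plus a separate compute queue on the same device. Actual overlapping workloads, resources, and cross-queue synchronization are omitted; this is an assumption-laden illustration, not the game's code.

```cpp
// Minimal D3D12 setup for "async compute": a graphics queue plus a compute queue.
// Build (MSVC): cl /EHsc async_queues.cpp d3d12.lib
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device)))) {
        std::printf("No DX12-capable device found.\n");
        return 1;
    }

    // The usual graphics queue: accepts draw, compute and copy commands.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> gfxQueue;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&gfxQueue));

    // A second, compute-only queue. Work submitted here may overlap with the
    // graphics queue; whether it truly runs concurrently depends on the GPU
    // and driver (this is where hardware schedulers such as AMD's ACEs matter).
    D3D12_COMMAND_QUEUE_DESC computeDesc = {};
    computeDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    ComPtr<ID3D12CommandQueue> computeQueue;
    device->CreateCommandQueue(&computeDesc, IID_PPV_ARGS(&computeQueue));

    // A fence is the usual way to synchronize the two streams of work:
    // e.g. the graphics queue can Wait() on a value the compute queue Signal()s.
    ComPtr<ID3D12Fence> computeFence;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&computeFence));

    std::printf("Created a DIRECT queue and a COMPUTE queue on the same device.\n");
    return 0;
}
```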

    2. Then why does 1080p need a 290X? The 390X isn't that much faster than the 290X to begin with. If the title is really heavy on async compute, then something lesser should be good enough for 1080p.

      1. I didn't even mention async; all I meant is that the 390X doesn't make sense against the 980 Ti. Most games now (despite their official requirements) need at least a 970 or 290X for decent performance at good graphical quality. The reason I mentioned the aging factor is that those old 290 and 290X cards are still fighting back against new Maxwell-based GPUs, while the 970 and 980 are getting slower at a faster rate. The 390 (a rebrand, according to several NV fanboys) is already the new price/performance king, and now the 390X is being compared to a 980 Ti for some reason, so what would you call it? Maxwell is certainly aging faster… Async, on the other hand, is an entirely different beast and something Nvidia is not good at.

        1. That's what happens when you have all the consoles under your wing, especially when the architecture itself is downright the same (GCN). It has nothing to do with Maxwell aging faster (same with Kepler). You didn't see this trend in the previous console generation, when AMD did not dominate the console roster. With Maxwell, Nvidia addressed Kepler's weaknesses, but the one thing they probably never expected was AMD's ACE engine in GCN finally getting utilized under DX12. The good thing is that DX12 doesn't strictly need new hardware to take advantage of it. If it did, we would see a repeat of what happened with AMD's 3000 and 4000 series, when DX11 required new hardware before you could access the tessellation feature.

          1. It has nothing to do with the console cycle; you guys just overblow consoles because AMD is currently dominant there. In rough terms DX12 is mostly about making better use of the available resources, which is why it's compatible with older hardware. Only the highest-level features require new hardware, so when we are talking about making better use of available resources we immediately think about parallelism, and this is where Nvidia lacks.

            We all know that Nvidia kept their archs strongly tied to DX11, with strong performance in single-threaded work and tessellation, while AMD kept parallelism in mind since the HD 7000 series. So now that everyone is shifting to this new trend, we are suddenly bashing AMD because Nvidia cannot handle async very well. This is a problem coming from Nvidia's side, just like poor tessellation performance came from AMD's side; it has nothing to do with consoles or anything. We have seen this trend of Nvidia GPUs becoming obsolete faster; for example, tell me which Kepler card currently qualifies for VR? Even the most expensive original Titan cannot do it, and it's a pity because that thing was sold for $1000. That's what happens when you make too many archs without thinking about their longevity and value to customers.

          2. Single-threaded stuff? True, DX12 does it better than DX11, but saying DX11 solely focuses on single-threading is false, even if multithreading is hard to do. Now, which company was it again that refused to support DX11 multithreading? And people should stop thinking DX12/Vulkan is the same as the low-level APIs used on consoles, where such access lets developers maximize hardware resources and potential. Tell me why BF4 under Mantle needs more VRAM than its DX11 counterpart? Why does the R9 285 show negative performance in DX12 vs DX11 as of the latest beta 2 of AoS? (AnandTech suspects the same issue behind the R9 285's performance hit in Mantle.) If devs were maximizing resource usage, such performance regression vs DX11 should not happen at all.

            And you were saying Nvidia doesn't value their customers because they release too many architectures. Do you think AMD is any different? Do you honestly think AMD is still using GCN because they care about their customers using it for a very long time? If anything, AMD is much worse in this regard. So after 3 years, how well did the 7970 hold up in games? Then think about how 4890 users felt when AMD officially dropped the card from main driver support after only 3 years. People feel that Nvidia forgot about Kepler when they came out with Maxwell; that's also what 6000-series users felt when AMD came up with the 7000 series. AMD did not come up with a new architecture because of their financial burden, not because they care about people using it for a very long time. And if anything they want you to upgrade, more so than Nvidia, as they need the money to keep the company afloat.

          3. Read anywhere on the internet and you'll find that DX11 is mostly about single-threading; even if you try to encapsulate things in multiple threads, most of the graphics workload runs on a single thread and thus we get poor CPU utilization. This is what DX12 is trying to solve. And who said that DX12 is as low-level as a console API? I didn't say that anywhere; you're just making things up on your own. But it's far better than the overhead associated with DX11. Do you even consider how young DX12 is? It's still new and developers are still experimenting with it; even the drivers aren't as mature as DX11 drivers. Once these things become mainstream, we'll see much better output from DX12.

            Yes, AMD values their customers more than Nvidia. GCN is aging far better than Maxwell; history will repeat itself and Maxwell will be dumped in the trash once Pascal appears. Tell me, can you run a game like RotTR on very high textures using a GTX 980? The GPU itself is very capable of doing that, but it can't, because Nvidia made both the 970 and 980 only 4 GB, since they want to sell the 980 Ti in a 6 GB and the Titan X in a 12 GB flavor. It would have been great if they had made an 8 GB variant of at least the 980, while making the 980 Ti 12 GB and dumping the Titan X altogether; a much better option for customers. AMD made 8 GB variants of the 290 and 290X, while the 390 and 390X are readily available in 8 GB, so which is going to die faster, a 4 GB card or an 8 GB card?

            Also, I didn't feel anything was wrong when I had a 6970 and AMD came up with the 7000 series; the transition was graceful and nothing like the Kepler cut-off. On paper a 780 Ti is a better-equipped card than a 970, so it should be VR-ready, shouldn't it? Not to mention those high-end cards were made with only 3 GB of VRAM (a weird size), so that's a good reason to dump them.

            And where are you getting your facts from? AMD didn't drop support for the HD 4000 series after three years; they just moved it to a quarterly update schedule instead of monthly. The last driver was released in 4/2013, which means 5 years of support since the 2008 release. All other archs after the 4000 series were supported up to the first version of Crimson.

            Yes, AMD has a financial burden, yet they make bigger leaps in technology than Nvidia: GDDR5, HBM, the interposer, Mantle, XDMA, and the list goes on. They upgrade their arch gracefully, not like Nvidia. The $1000 original Titan is now getting beaten by a 280X; does that sound normal to you? The $3000 Titan Z was dumped immediately after the R9 295X2, which is still relevant; is that normal to you? None of the 700-series cards are recommended for VR, while a 290X (a card that competed against the 700 series) is, and here you're arguing about who supports their archs better. Even the worst Nvidia fanboys accept the fact that Hawaii-based cards are aging really well.

          4. DX11 is quite capable of multithreading, but GPU makers need to support it in their drivers, and most games did not support DX11 multithreading for a few reasons: it is hard to do, and furthermore AMD refused to support the feature. And just because I said all these things about AMD doesn't mean I like what Nvidia did either; the 980/970 being limited to 4GB is dumb from a consumer point of view no matter how you want to spin it. And I said AMD dropped the 4000 series from the MAIN DRIVER, not that they completely stopped releasing drivers for it. True, they still release a quarterly driver for it, but that is strictly for bug fixing, usually not game related. Game optimization? There is none. That's what it means when they move support to legacy. If TSMC had somehow gone through with their 20nm HP process, you would probably have seen the 7970/280X meet the same fate as the 4890. And as I said, AMD hardware in the consoles is giving them the advantage they have right now, yet you refuse this straight-up fact even when AMD themselves use it as a reason why you should choose Radeon instead of GeForce. That's why, during the Kepler generation, Nvidia publicly stated that AMD winning all those console contracts was also going to help PC gaming in general and not just AMD (they wanted to assure people that AMD winning those contracts would not make their solution less appealing than Radeon).

            And please don't joke about AMD valuing their customers more than Nvidia does. It is just a coincidence that they have had to use GCN for a very long period of time, because they don't have the money for an 'emergency' plan if certain things don't go well. If they did value their customers they would not be dropping the 5000 and 6000 series now, even if supporting such old architectures is quite a burden on their finances. They would not have left those 4000-series users on legacy support after only 3-plus years. They would not have left the users that chose AMD's stereoscopic 3D solution to look up compatible drivers themselves. And for Kepler, most people just accept the stories about how Nvidia abandoned the architecture without even looking at what changes Nvidia made in Maxwell. Even worse, they take it as purposeful gimping, when tests show that even Kepler gets performance improvements with new drivers.

          5. I don't have time to write the same big replies again and again, but it's a joke in itself that you're comparing the support of the HD 4000 series against the GTX 700 series; the former was released in 2008 while the latter was released in 2013. AMD dropped HD 4000 support in 2013, while Nvidia dropped Kepler in 2014. They are still releasing drivers for it, but Kepler optimization is an afterthought for them; that's why we saw Kepler Witcher 3 optimizations so late, after so many users complained about bad performance, and that's why the 780 Ti was slower than the 290 in Fallout 4 with Extreme God Rays. It means AMD is still trying to optimize those Hawaii-based cards, while the 700 series is an afterthought for Nvidia.

            Now, practically speaking, tell me who is still using an HD 4000 series card? They had 1–2 GB of VRAM, quite limited compute capabilities, and a GPU that's a dozen times weaker than what we have today, so what game optimizations are you expecting from AMD there? Now look at the green side: there are still plenty of users on a 780, 780 Ti or even 770. What about them? On paper a 780 Ti is as capable as a 980, with more CUDA cores, but it's taking a back seat now. And what about the GTX 960 2 GB, which got discontinued suddenly because it couldn't fight against the 380? Several mainstream users bought that card for casual gaming and now they have that obsolete-product feeling. Nvidia could have tried to optimize it further with 2 GB and 4 GB variants, but they picked the easier path and discontinued the product, just like they did with Kepler.

            If AMD is enjoying the benefit of the console market right now, then Nvidia is also enjoying the benefit of 80% of the PC market. That's business, according to Nvidia fans. So Nvidia is allowed to do business and take advantage of things like tessellation to show that AMD is weak, but if AMD does the same then they are bad? Our hardware is getting stronger day by day and we need things like async to utilize it better, so what's wrong with AMD pushing it? Nvidia pushed tessellation abusively, but if AMD pushes the right tech for the future then that's wrong?

          6. I never compared Kepler and the AMD 4000 series in my post. You complain about Nvidia's Kepler getting second-class attention once Maxwell arrived, but you were fine with AMD completely stopping optimization of the 4000 series in new games when they dropped the cards from the main driver in 2012. Which one is worse: having new drivers that still receive performance optimizations (even if they were late) in every release, or getting new drivers that only fix minor bugs (once a quarter, to boot) with no new game optimizations at all?

            And my comment about the 4000 series was about back in 2012. The highest-end cards still had decent performance at low resolutions (more so in CF setups), but despite that AMD decided to officially move the series to legacy. Back then the ones taking the biggest blow were CF users; these are the customers willing to spend more money on your product. AMD really values their customers very nicely, right?

            And for the record, I never said it is wrong for AMD to push for async compute. Even I can understand the benefit of having more performance. It's just unlucky for GeForce users that their GPUs do not have something similar to AMD's ACE engine. For Nvidia's part, if the market dictates that async compute is important then they will do something about it, just like when everyone was screaming that power efficiency is an important metric to have.

    1. The only differences would be textures and a few effects here and there. Obviously they don't make separate versions; this is the Xbox version ported to PC.

  12. >win 10 required via dx12

    >windows store required for the hell of it

    I mean, if you don’t WANT to sell any copies that’s okay. Don’t.

  13. This is GFWL all over again. It's the second try by MS, and this time… they will probably work it out better. It looks like people will finally bend to MS's will.

    DX 12 is something everyone will wanna use.
    Once people go to W10 and upgrade for DX12, what's stopping them from using the store as well? Nothing. Right now it sucks because a ton of people don't wanna upgrade to W10. Give it a few years and everyone will use it/have it.

    P.S. I thought DX12 was supposed to magically give you 20–50% better performance? What's up with the crazy high requirements?

  14. Want this @$%# to end? Don't upgrade to Windows 10. MS's console loses billions, and their Windows 10 adoption is a disaster apart from the PC gamers they treat like @$%#.

    Problem solved. Devs will adopt the Vulkan API, and MS will be remembered as a bad dream. Anyone that upgrades to Windows 10 and then complains about MS inflating specs to make their console look good? I do not feel sorry for you. YOU ARE THE PROBLEM to begin with. The only reason Xbox still exists is that you keep upgrading to the next POS MS OS. They lose money on that to keep the API monopoly going, then make it back from you idiots.

    BTW, go look at the Vulkan API running Unreal Engine 4 on a PHONE. Epic Games posted demos to YouTube. That is what you could have: an API made by the hardware makers themselves.

    1. MS are desperate: a one-day exclusive on The Division beta, Rise of the Tomb Raider bombs, they stop reporting XB1 sales, they port their exclusive-game failures to PC, Fallout 4 sells less on XB1 than on PC, and Rise of the Tomb Raider outsells the XB1 version on PC.

    2. Who said anything about putting up with this? As far as I am concerned I am not spending one dime on the Windows Store, and my Win10 upgrade was free. They can't expect any support if their delivery is trash, so why are you so hysterical about this? Either way you look at it, Microsoft loses. PC gamers for the most part aren't sheep, and they will punish Microsoft and not buy from their store as long as these bad practices continue.

      1. Sorry dude, but I get Bruce's "hysteria". A LOT of people are defending MS, still supporting Win 10, and will buy their games from the Windows Store. You upgraded to Win 10. The message to send is NOT to upgrade, especially not if you have to spend money to upgrade!! What we are talking about here should be the rule for 70–80% of users, so MS realizes that consumers/PC users are not brain-dead addicts that just want to play games no matter what, or upgrade to a fancy new OS. So I feel him. I find it difficult myself to believe that there are still people supporting them after ALL this… And the show goes on…

  15. Okay… so now we are complaining that this game has too-high specs? From what I have read this game is redone almost 100%. I could see a game that pushes the envelope of graphics, think Star Citizen or Crysis, coming to warrant these requirements. I mean, we haven't really seen much of it yet. If it turns out to be crap on release, then we can bash it. We all want DX12-only (or Vulkan)… I for one have no problem with this till we see more.

    1. There’s absolutely no reason for it to require a bloody GTX 970 and R9 390/290X. My HD 4870 from 2008 ran Gears of War just fine. I could understand it if the game was rebuilt with Unreal Engine 4 but that isn’t the case. It’s running a modified version of UE3. In 2016.

      I’m gonna skip this.

  16. I think if MS really wants to take marketing advantage on the PC, they have to seriously improve their Windows Store or make an app tailored to MS games for the PC. Origin and Uplay are getting better, and we all know about Steam's success; I hope MS sees this and really, I mean really, understands how they failed with Games for Windows a couple of years ago. They really have the potential to make money here in the PC market. We'll see what they have in store this Thursday… Let's hope their plan is not dead on arrival…

      1. Of course.
        Unless Microsoft changes UWP; at this moment the Windows Store is a terrible game distribution platform.

        1. MS is too focused on a console agenda. They will gimp PC users no matter what PC hardware they use.

          And this is one of the main reasons I want to see success with Vulkan API

  17. So, tell me again, what are the benefits of DX12? Higher system requirements?
    FFS, this game is almost a decade old and they mostly improved the cutscenes, all of which are pre-rendered (video files). A GTX 970 and 16GB of RAM for 1080p for this? lol.

    1. I'd bet it's just BS requirements so the game looks "up to date". If they really managed such "outstanding" performance that it actually needs those system requirements, then just wow, M$!

    2. Yeah, the 16 gigs made me laugh so hard… I mean, I only have 16 gigs because I got my RAM on sale, and yeah, I wanted 16 gigs because everybody else I knew had 4/8… eh eh. But a game that really wants that now… that is a console port!?!? LMAO

  18. Haha, AMD has won this. A 390X next to a 980 Ti at 4K, when the Fury X is what the 980 Ti is meant to be battling against. RIP Maxwell, no async compute; can't wait for Nvidia to abandon Maxwell once Pascal is released, just like they did with Kepler. Look where the GTX 770 stacks up now compared to when it was released: it's essentially an R9 270X, while the R9 280X is almost the same performance as a GTX 780, which shouldn't be happening. RIP Nvidia driver support. Never buying an Nvidia card again; I'm sticking with AMD this time. My poor GTX 770.

  19. Come on dudes, are you seriously telling me that you need 16GB of RAM (which I have, btw) to fully experience this PotatoBoXONE port at 1080p on PC? What, does it have ultra-giga textures that take up 10GB of RAM for swapping? GTFO Microsoft!

  20. And there you have it folks. The MicroSoft Store OF GIMP! If this does not scream they only want you to play their games on Console then wow I don’t know what else they would do.

  21. This is why you don’t build cheap gaming PCs. All those people that were building budget PCs with the Pentium G3258 just to say “hurr durr my pc runs better than an xbox/ps4” now have to upgrade. I love it.

      1. Nah, I was smart enough not to build a low-end rig just to barely be able to bash consoles. I'll be running GoW just fine when it comes out. 1080p is so 2 years ago.

        1. I don't think anyone buys a cheap PC to bash consoles, what a stupid thing to say. People bought 20th Anniversary Edition Pentiums to have fun overclocking them and seeing how far they could go, you know, fun?

          That doesn’t change the fact that a budget PC with an i3 CPU and entry level graphics card under the £100 range will match XB1 as that’s pretty much how it plays out.

          1. Matching an XB1 is the dumbest thing you can do when building a PC. If you are a gamer on an extremely low budget, you're better off buying an XB1 or PS4. Just look at all the guys that built low-end PCs with the G3258 that were barely better than consoles; they'll have to upgrade now (if they haven't already) just to keep up with the newer games releasing soon.

            Again, this is why you never build a cheap PC. If you are primarily gaming and can’t afford a good PC it’s better to just buy a console.

          2. It all depends though, for example I doubt very many gamers decided to build a PC with a 20th Anniversary Ed Pentium to match consoles. Most people bought that CPU to have fun with overclocking it just to see what it could do, others still like me bought one just because they remembered when Pentiums used to rule.

            I am a hardware enthusiast, so I happen to have lots of hardware. My main PC has a watercooled i7 5820K with 16GB of DDR4 running at 3000MHz and a Titan X. I also have an i7 2600 PC with a GTX 980 that I don't use anymore. However, I also own an A10-7850K PC and 2 Gigabyte Brix PCs, as I love how you can get all that raw power in something so tiny and unique. The first has an i3 Haswell but also a built-in mini projector, so it can be hooked up to a monitor with HDMI or used with the mini projector; the best aspect, though, is being able to use it as a standalone mini projector with any other device hooked up to it via HDMI. Then I have an i5 Haswell Brix with 8GB DDR3 and a custom Kepler GPU with 1344 shaders and 6GB GDDR5. The other Brix has an AMD APU as well as a discrete R9 275X.

            So I am a hardware enthusiast who loves using/owning high-end hardware; however, if I were struggling for cash I would still want to game on a budget PC over a console, and here's why.

            1. The PC receives the most developer support, so we have the best game library. I love PC-only titles; however, there are now over 50 so-called console exclusives that are either PC-and-PS4 or PC-and-XB1 exclusives, with another 10 games like that coming this year alone. So not only does PC have the most varied game library, it also has lots of so-called console exclusives which you would otherwise need to own 2 consoles to play.

            2. Free online multiplayer: over the span of this generation a console gamer will have to fork out £400 to game online, which is money I could use to upgrade my PC.

            3. I may only be able to buy a budget PC today because I lack funds; however, if I get spare cash tomorrow I can use it to upgrade my PC. If I instead buy a console today because I can't afford a good PC and then come into money in the future, I can't use that money to upgrade my console.

            So there are reasons for and against a budget PC. I love the types of games that are exclusive to PC, and a budget PC would allow me to play them as well as all these so-called console exclusives that come to PC. It's not just Xbox exclusives on PC: Tales of Zestiria, Street Fighter V, No Man's Sky, Soma, Dragon Quest, Hellblade and a good few other PS4 exclusives are also on PC.

            So a budget PC with at least an i3 or a quad-core AMD Athlon CPU and a lower-end GPU like a 950, or even a 4GB lower-end card, would suit me just fine if I had money issues.

    1. Well the main reason for building a cheap PC with the Pentium was fun, just messing around with it and overclocking it to see how far it could be pushed.

      That being said, I still expect a budget PC with an i3 CPU and a budget GPU like the 750 Ti to hand in similar performance to the XB1, as that's how it has played out over the last two and a half years.


  23. Lmao, I didn't say that's what makes PCs great, you PC fanboys did back in 2013. I game on PC and on console, so I'm not blind like the majority of the PC-only gamers on here. It's funny seeing the new requirements on games coming out, and now PC gamers that bought a Pentium or cheap PCs 2–3 years ago having to upgrade.
    Again, this is why you never build a cheap PC.
