RAGE 2 Official PC System Requirements Revealed

Bethesda has revealed the official PC system requirements for RAGE 2. According to the specifications, PC gamers will at least need an Intel Core i5-3570 or an AMD Ryzen 3 1300X CPU with 8GB of RAM and an Nvidia GTX 780 3GB or an AMD R9 280 3GB.

Bethesda recommends an Intel Core i7-4770 or AMD Ryzen 5 1600X CPU with 8GB of RAM and an Nvidia GTX 1070 8GB or AMD Vega 56 8GB graphics card. It’s also worth noting that the game will require 50GB of free hard-disk space.

Unfortunately, the publisher did not specify the graphics settings or framerate these requirements target. Our guess is that the recommended specs are for 1080p gaming on High settings at 60fps (do note that we are merely guessing here).

RAGE 2 releases on May 14th, and you can find its full PC requirements below.

RAGE 2 Official PC System Requirements

Minimum System Requirements:
A 64-bit processor and operating system is required
OS: Windows 7, 8.1, or 10 (64-Bit versions)
Processor: Intel Core i5-3570 or AMD Ryzen 3 1300X
Memory: 8GB RAM
Graphics: Nvidia GTX 780 3GB or AMD R9 280 3GB
Storage: 50 GB available space
Recommended System Requirements:
A 64-bit processor and operating system is required
OS: Windows 7, 8.1, or 10 (64-Bit versions)
Processor: Intel Core i7-4770 or AMD Ryzen 5 1600X
Memory: 8GB RAM
Graphics: Nvidia GTX 1070 8GB or AMD Vega 56 8GB
Storage: 50 GB available space
Note: Persistent Broadband Internet Connection required to play and access certain features.

79 thoughts on “RAGE 2 Official PC System Requirements Revealed”

  1. Currently playing RAGE 1 in preparation for RAGE 2. The game is not as bad as I thought it would be. The shooting mechanics and AI especially are what are keeping me engaged.

    1. The gameplay is pretty solid. Great shooter imo.

      HOWEVER, the ending to the game is quite possibly the worst ending in videogame history.

      And I’m being 100% serious. They did such a good job of building the world to be believable and immersive, and then they throw it all out the window at the end.

      1. IDK how anyone says the world in Rage is immersive. It’s 99% dead, empty nonsense. No object interaction, you can’t even open doors inside buildings or interact with anything that isn’t a quest object. Just nothing.

    2. It’s a great game, just buggy as hell, and I wish it used some other engine.
      I also don’t like the REAL bullet sponges that game has, and SPOILER SPOILER SPOILER

      the story ends on a cliffhanger

    3. Too bad they relied on stupendously high-res textures which needed to be decompressed on the fly, all of the time. The higher the resolution and FPS you try to play the game at, the faster the CPU has to be to keep up with texture decompression. The game doesn’t leverage CUDA on Nvidia GPUs in a meaningful way, and neither is it massively multithreaded to take advantage of many CPU threads.

      I mean, it’s massively disappointing that it lacked any kind of dynamic shadows and environment destruction, but great that it could manage 60 fps even on last-gen consoles and on moderate PC hardware. I think it relied on outdated OpenGL as well; it wasn’t using anything close to OpenGL 4.x, AFAIR.

    4. Enemies are bullet sponges. But the best things about RAGE 1 are the environments and character animations, which are awful in 2.

    5. It is very good, like all id Software games; you should have played it when it was released. How are the graphics now? Still as terrible as then? I had an 8800 GT and 1280×1024 at release, so I’d like to know if they look any better now or as bad as they looked on PCs of that time.

  2. I still cannot for the life of me understand how devs are listing PS4 Pro-level specs as “the min reqs”, lmao.

    Min specs should be less than X1/PS4 specs.

        1. Not even close to high. On base consoles it’s low to mid; on the X there are some high settings.

    1. Funny how recommended PC specs get higher and higher every year while console hardware stays the same. Until now, the GTX 1060 and 970 were the recommended GPUs in every game. Now all of a sudden they moved it up to the 1070. Is it down to laziness from developers, or is PC gaming one huge scam that wants you to upgrade every f-ing year?

      1. …and new consoles come out every few years, upgraded to run the latest and upcoming games at 30 FPS.

        PCs and consoles require upgrading every few years. It’s always been that way.

        1. The base PS4 (2013) still runs games at 30 fps just fine after 5 or 6 years, and games have only looked better with each passing year. The only enhancement the Pro version brings is a higher resolution. The recommended GPU in 2015 was a GTX 770, and here we are in 2019 where the 1070 is the bare minimum for 1080p/60fps at high settings. Last year the 1070 was an ultra-settings GPU, and when it launched, the 1070 was a 1440p GPU.

          1. No offense, but you don’t know what you’re talking about. It’s a combination of medium-high on the base versions. Go watch some Digital Foundry videos if you don’t believe me.

      2. “Funny how recommended PC specs get higher and higher every year while console hardware stays the same”

        That’s not weird. While console games get more complex, running those same games at higher resolutions and framerates becomes more intensive than it was 4 years ago. Not to mention games that push visuals on PC much further than on consoles.

      3. I have a GTX 970 and still run everything at max settings, 1440p, 30+ fps, some games even at 60 fps (like Mortal Kombat 11). The GTX 970 is still a monster even now, which is why I’m still keeping it.

          1. Some games run at 60 fps, like Mortal Kombat 11, but with the crazy prices Nvidia has currently, while I have the money to buy 2 RTX 2080 Tis if I want, I don’t want to support Nvidia, because then they will continue doing it. Most games run at about 40 fps maxed at 1440p, or 30-35. That’s impressive for a 4.5-year-old graphics card, don’t you think?

      4. That’s because the specs are literally almost always overstated.

        RE7 said it needed such and such, and it worked fine on 2009 CPUs. Just an example. I remember a game came out this year or last year saying it needed an i7-8700K (maybe RE2?), and that was clearly false.

        On the opposite end, a lot of games come out saying “Recommended specs such and such” and they run like absolute dog turd on machines that are WAY above those specs.

        It’s partially a scam to drive needless hardware sales, and partially devs with limited lab builds just tossing out nonsense to cover themselves.

    2. All games on consoles use a low-level API. On PC, a lot of developers still use the slower DX11 because they want to support Windows 7.

        1. What does that prove? That two devs (id and Machine Games) that were always OpenGL devs and then naturally switched to Vulkan got amazing results using the same engine?

          DX12 gets amazing results nowadays as well. Rise of the Tomb Raider, Shadow of the Tomb Raider, Hitman 1/2, Division 1/2, Gears of War 4, Forza 7, Horizon 3/4, Sniper Elite 4, Strange Brigade. DX12 is a tremendous improvement compared to DX11.

          DICE seems incompetent at getting good results out of DX12

          4A were always great at implementing all DX versions. They leveraged DX9 for amazing visuals. They got DX10 running at almost on-par performance but with per-object motion blur. DX11 was faster than DX9 and 10 while adding extra features. And Metro Exodus runs better in DX12 than in DX11, as it should.

          I mean, Rebellion implemented Mantle (the base for Vulkan) in Sniper Elite 3, so they’re familiar with the tech. And they implemented both DX12 and Vulkan in Strange Brigade. And somehow, DX12 is as fast as or very slightly faster than Vulkan in Strange Brigade.

          So what was your point?

      1. Sniper Elite 4 uses DX12 and scales almost 100% on a dual-GPU setup; if used right, DX12 is great. There is another issue: most game engines are built on DX11, so building a new engine from scratch on DX12 takes a lot of time and manpower, which these greedy companies won’t do. I mean, look at AC: they keep pushing DX11 to its limits when clearly a new API is needed in open-world games.

      2. This is absolutely false. Most games have a half assed DX12 implementation.

        Why do you think consoles are so efficient with the mediocre hardware that they have? Magic?

        They are extremely low level. Often times even lower level (and more explicit) than Vulkan or DX12.

        It’s crucial that more and more devs start using DX12 and Vulkan to figure out their full potential. This is the same as when devs started using DX11 after DX9. There was a long transition period where DX9 was better. Then DX11 caught up, surpassed it, and dominated as devs got more comfortable with the new APIs.

      3. A low-level API is one thing; asking for a 280/780 as the minimum GPU, which are 3 times as fast as the X1/PS4, is something else.

      4. Vulkan should be on this puppy like Doom (which launched with OGL), since it is the same engine, capable of both APIs. It had better look A LOT better than Doom and W2 if they want people upgrading. The 1080 Ti was pitched on being able to run Doom at 4K 60+ fps on OGL (Vulkan was hitting the 100s for me with a 4790K and 1080 Ti). W2 is a locked 60fps at 4K on Vulkan. Regardless, if it is riddled with microtransactions as they’ve alluded, then it’s yet another pass until the “full” edition at half or more off for me.

    3. PS4 probably doesn’t run at 1080p and/or won’t be able to hold 30fps.

      Base PS4/Xbox1 are some of the worst consoles ever made. Don’t compare PC specs to those, cause the minimum spec on PC will at least be somewhat enjoyable.

      1. Worst consoles ever made? That’s a massive reach, considering they were the most powerful consoles ever made at the time of their initial release.

        1. Considering what hardware was available at the time, both consoles massively cheaped out on components. Consoles had never been that weak compared to PC. Not a massive reach at all.

          1. Well, to be fair, thermals were an issue at the time. If they had clocked the processors higher, they would have run hot. They could have gone Intel/Nvidia, but it would have been too expensive.
            They had what they had.

          2. They didn’t run hot at all despite the heatsinks being laughably small. Most cheap laptops have better cooling than a PS4/XB1.

            They could have clocked them faster if they wanted to or gotten a better chip from AMD. They chose what they did for the sake of saving a buck or two and the result was the weakest consoles in history.

          3. Which chips are you referring to not running hot? The Jaguar chips certainly put out some heat (I had a few), especially at higher clocks. Combine that with the RX 470-ish GPU under the same cooling and you would have had a problem.
            Most cheap laptops have horrible cooling. Have you ever had one apart? I work on them daily and they’re laughable…the only reason they work at all is the severely downclocked chips and low core counts.
            I’m not sure you have much experience here…

          4. Of course I have, I work on all kinds of tech as a hobby. You should try disassembling a PS4 or XB1, you’ll be shocked

          5. I’ve watched teardowns of them, they’re at least a couple inches thick unless I’m remembering wrong.
            Most laptops have basically a sliver of a heatsink, usually not even 1 inch thick. They’re pretty pitiful…
            But you are correct. The ones in consoles are pretty tiny when you compare it to even say a ryzen boxed heatsink. But, that’s what they’re working with as they’re trying to keep things small and cheap.

          6. Which Jaguar based chips have you had that ran hot? The architecture is specifically designed to be low power, so I’m a bit confused here

          7. Maybe “hot” is an exaggeration, but certainly warmer than would be preferred (at least I felt that way at the time).
            I was using the Jaguar based server chips. I’m assuming aside from clocks they were pretty similar.

    4. Well, sadly that is not the case… A console is a lot easier to optimize for than a PC with varied hardware and Windows 7, 8, 10+.

      Unless the min-spec configuration somehow runs better than the Xbox One / PS4 versions…

        1. With today’s games, I think not… Most games go well over the 750 Ti’s 2GB frame buffer.

          1. RX590, he got it a couple months ago cause AMD marketed it as very future proof.

          2. No I got it for the free games 😀 And November is not a couple months ago lol

          3. I’ve had a RX 590 since November. He is just p i s s e d because I switched from Nvidia to AMD lol

  3. Recommended requirement of a GTX 1070! Did they task the Fallout 76 team with game optimisation duties?

  4. and I will kill them all, ALL monsters! MAX SETTINGS 1440P 30+ FPS ON MY MONSTER PC: RYZEN 7 1700, 16GB DDR4 3200MHZ CL15 G.SKILL RIPJAWS, GTX 970 G1 GAMING, DELL P2416D 24” 2560X1440 60HZ IPS!

  5. I swear, if this doesn’t run at 60 on max with my 1080 Ti and 6700K, there will be hell to pay for Bethesda.

    Consoles have lower specs than my hw.
