
Official PC system requirements for Hogwarts Legacy

Warner Bros. has revealed the official PC system requirements for Hogwarts Legacy. According to the PC specs, the game will require 85GB of free storage space and will use the DX12 API.

PC gamers will need at least an Intel Core i5-8400 or AMD Ryzen 5 2600 with 8GB of RAM in order to run the game. Avalanche Software has also listed the NVIDIA GeForce GTX 1070 and the AMD RX Vega 56 as the minimum required GPUs.

Avalanche Software recommends an Intel Core i5-8400 or AMD Ryzen 5 3600 with 16GB of RAM, and an NVIDIA GeForce GTX 1080 Ti or AMD RX 5700 XT graphics card.

Surprisingly enough, these requirements are for using an upscaling setting, which is most likely AMD FSR. Avalanche Software has not revealed the requirements for running the game at native resolutions.
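
For context, the Performance and Quality upscale presets referenced in the requirement sheet below most likely correspond to the standard FSR 2 / DLSS 2 per-axis scale factors (Quality 1.5x, Balanced 1.7x, Performance 2.0x, Ultra Performance 3.0x); which presets the game actually uses is an assumption here. A minimal sketch of the internal render resolutions those factors would imply at a 1080p output:

```python
# Internal render resolution implied by the standard FSR 2 / DLSS 2 per-axis
# scale factors. Mapping Hogwarts Legacy's "Upscale Performance/Quality"
# settings to these exact factors is an assumption, not confirmed by the devs.
SCALE_FACTORS = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def internal_resolution(output_w: int, output_h: int, mode: str) -> tuple[int, int]:
    """Return the (width, height) the GPU actually renders before upscaling."""
    factor = SCALE_FACTORS[mode]
    return round(output_w / factor), round(output_h / factor)

for mode in SCALE_FACTORS:
    w, h = internal_resolution(1920, 1080, mode)
    print(f"1080p output, {mode}: renders at {w}x{h}")
```

Under those assumptions, the Performance preset at 1080p works out to 960x540, which is where the "540p" figures in the comments come from.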

Hogwarts Legacy will release on February 10th, 2023!

Hogwarts Legacy Official PC System Requirements

MINIMUM:

    • Requires a 64-bit processor and operating system
    • OS: Windows 10
    • Processor: Intel Core i5-8400 OR AMD Ryzen 5 2600
    • Memory: 8 GB RAM
    • Graphics: NVIDIA GeForce GTX 1070 or AMD RX Vega 56
    • DirectX: Version 12
    • Storage: 85 GB available space
    • Additional Notes: SSD (Preferred), HDD (Supported), 1080p/60 fps, Low Quality Settings, Upscale Performance Setting

RECOMMENDED:

    • Requires a 64-bit processor and operating system
    • OS: Windows 10
    • Processor: Intel Core i5-8400 OR AMD Ryzen 5 3600
    • Memory: 16 GB RAM
    • Graphics: NVIDIA GeForce GTX 1080 Ti or AMD RX 5700 XT
    • DirectX: Version 12
    • Storage: 85 GB available space
    • Additional Notes: SSD, 1080p/60 fps, High Quality Settings, Upscale Quality Setting
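
For readers who want a quick sanity check against the figures above, the RAM and storage lines are easy to verify programmatically. The sketch below is illustrative only: it assumes Python with the third-party psutil package and an install target on the C: drive, and it skips the CPU/GPU/DirectX comparison, which is harder to automate reliably.

```python
# Quick check against the minimum spec listed above: 8 GB RAM, 85 GB free space.
# Assumes the third-party psutil package (pip install psutil) and that the game
# would be installed on the C: drive; CPU, GPU and DirectX checks are not covered.
import shutil

import psutil

MIN_RAM_GB = 8      # "Memory: 8 GB RAM" (minimum spec)
MIN_FREE_GB = 85    # "Storage: 85 GB available space"

ram_gb = psutil.virtual_memory().total / 1024**3
free_gb = shutil.disk_usage("C:\\").free / 1024**3

print(f"RAM: {ram_gb:.1f} GB -> {'OK' if ram_gb >= MIN_RAM_GB else 'below minimum'}")
print(f"Free space on C: {free_gb:.1f} GB -> {'OK' if free_gb >= MIN_FREE_GB else 'below minimum'}")
```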

38 thoughts on “Official PC system requirements for Hogwarts Legacy”

  1. An X360/PS3-era looking game, and to run it at LOW + Upscale Performance Setting (WTF is this?!)
    you need a PC with a Vega 56/GTX 1070, 8GB RAM and an 8th-gen i5, hahaha!
    A PC like this runs far better-looking games at higher framerates…

    Upscale Performance Setting sounds like the game will be rendered at 720p? 540p? LOL

    Also, the GTX 1080 Ti in the recommended specs is a bad joke; this card is still a powerhouse, with the raw performance of an RTX 2070/2070 Super.

    PS: The Vega 56 is on par with or better than the GTX 1080.

    1. The current-gen equivalents to the 1080 Ti are the Radeon 6600 and GeForce 3050, low-end cards which would have cost around $200 or less back in the good old times when the world was normal.

      1. The RTX 3060 can match the 1080 Ti in raster. I guess the 3050 could also match it (or even surpass it), but the game would have to use mesh shaders or DLSS.

          1. Depending on the test, the 3060 can be faster or slower than the 1080 Ti, but I think it’s fair to say 3060 performance is very similar, and something like the 3060 Ti outperforms the 1080 Ti even without RT, DLSS, or mesh shaders.

        1. Man, please stop taking psycho mushrooms and look at some tests before talking sh*t! Do you have any idea how much 45% more is?? It’s +/- 15% tops, depending on the situation!

          1. TechPowerUp has a good GPU tier list and has the 1080 Ti 48% faster than a 3050 across resolutions. Tom’s Hardware’s GPU list isn’t as easy to interpret percentage-wise, but it has the 1080 Ti roughly 35% faster at 1440p ultra.

          2. If I’m not mistaken, the TechPowerUp list is based on 1080p results, so the gap would be even bigger at higher resolutions, because the 3050 has limited memory bandwidth.
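
For what it’s worth, part of the disagreement in this sub-thread is just percentage arithmetic: “card A is 48% faster than card B” and “card B is about 32% slower than card A” describe exactly the same gap. A short sketch using the TechPowerUp-style relative figure quoted above (the figure itself is the commenter’s, not independently verified here):

```python
# "48% faster" and "~32% slower" are the same performance gap, read in
# opposite directions. The 1.48 ratio is the figure quoted in the comment above.
rel_1080ti = 1.48   # GTX 1080 Ti performance relative to an RTX 3050 (= 1.00)
rel_3050 = 1.00

faster_pct = (rel_1080ti / rel_3050 - 1) * 100   # how much faster the 1080 Ti is
slower_pct = (1 - rel_3050 / rel_1080ti) * 100   # how much slower the 3050 is

print(f"1080 Ti is {faster_pct:.0f}% faster than the 3050,")
print(f"so the 3050 is {slower_pct:.0f}% slower than the 1080 Ti.")
```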

        1. The Vega 64 chip was power hungry and hot to begin with. Depending on the model, we are looking at a 300-320 watt GPU (that’s more than even a 1080 Ti, and almost twice as much as a GTX 1070).

          Some people prefer AMD cards for whatever reason, but I’m very happy with my GTX 1080. I had a 1080 Ti before, but it was way too loud, so I replaced it with the much quieter (I can barely hear anything even at full load) and cooler (60-72°C max) GTX 1080.

          1. It all depends on the Vega 56. If it has Samsung HBM like the 64, it’s gonna dust that 1070 and, on an undervolt, beat the 1080. Without the undervolt it trades blows, and the 1080 is just better for someone not tinkering with the card. The biggest problem with Vega is that it LOVES bandwidth. The overclock on the core didn’t mean nearly as much, and because the core was so power hungry it couldn’t maintain clocks with the temps.

            On an undervolt (I like about the same wattage you do and prefer cool and quiet) I have a 56 with Samsung HBM that runs at the 64’s 950 RAM speeds without touching voltage, and that undervolts to 1520ish clocks at under 200 watts in most games. It’s almost passively cooled with the MASSIVE Red Devil heatsink. The highest I ever saw was in a heavily modded Witcher 3, in the 210s. It’s just faster than a 1080. Dialing in the undervolt and lucking out on the RAM is not what the common user is gonna do or experience. Neither is water. With water the Vega 64 will destroy the 1080, but you will be using absurd power to get the max core clocks.

            For most users I would recommend the 1080 if buying used. The market is coming back to normal though, so used cards aren’t as in demand as they used to be.

            My new GPU will just be whatever stays in the low 200 watts on an undervolt. I don’t care who makes it. I want a card that is cool and quiet, and if they can get 3080-tier performance at those numbers, I’m in. Now that AMD is OK on DX11, drivers mean nothing to me. I actually prefer AMD’s GUI, and I like their sharpening and use it in a lot of games; NVIDIA users can get it through ReShade.

  2. All these upscalers are crap… It doesn’t matter how many FSR 10.0 or DLSS 10.0 versions they add, one thing to always remember: native is native, and none of those upscaling techniques can beat native. Every game I’ve played so far hasn’t needed this stuff; it makes everything a big blurry mess and image quality takes a big hit while the render resolution gets downscaled to nothing. No thanks. Always set it to native and play at high quality. And if you’re struggling for fps, just stay at native resolution and lower the other graphics options, but don’t touch the resolution option. Native is always best.

    1. I agree with your points, except for a few titles, e.g. Death Stranding, where DLSS is actually better than native. But this ImAGe UpScAle exists because of weak ConSlow hardware. Sony and Microsoft should just make games and not sell hardware. AMD already has V-Cache-enabled CPUs which are purpose-built for gaming. Windows, however, has to become more like Linux; they should release a gaming edition free of every single piece of bloat.

      1. Maybe it’s because I have my PC set up in the living room, but I think advanced upscaling tech like DLSS and XeSS are quite good when you’re playing at 4K or above. Hardly any difference between native and quality DLSS while certain things such as hair actually look better than native due to how the tech works. Plus, it helps offset the performance loss from ray tracing. IMO a slight resolution loss (practically imperceptible when playing on a TV) for the sake of advanced visuals is a worthy trade-off.

        1. Yep. It really pays off in higher-res territory (and that’s exactly what these techniques were developed for). I was trying RDR 2 with the DLSS2FSR mod and the visual loss was pretty much unnoticeable. The ghosting was a headache initially, but I think they have mitigated it through the latest updates.

  3. 1070 or V56 for 1080p60 Low, using Performance Upscaling? Like, 540p60?

    Yeah, whoever came up with those specs was smoking something.

    1. It seems like some games’ requirements are just pulled out of thin air, either without proper testing of the hardware in the actual game, or they come from a marketing team that doesn’t know much about hardware anyway.

  4. Insane requirements for just upscaled 1080p. UE5 has similar requirements, so maybe they are also using software GI.

    1. A Steam Deck running SteamOS, because Valve is going to pre-compile all shaders in advance on their Linux server farm for all games available via Steam! 😉

  5. For those wondering what the heck these SyStEm rEQuIrEmeNtS are, they are actually PR stunt figures and nothing more. Just relax and shrug this mediocrity out of your mind.

    Edit: Hard to believe I’m seeing this from Avalanche Studios. These guys are PC-first devs. If these BS sys req numbers really turn out to be true at day one, they really have lost their soul and passion.

    1. Again, they are not the Swedish studio responsible for the Just Cause series, but rather a Salt Lake City-based US game developer, most likely run by a bunch of perpetually money-hungry Mormons married to multiple wives…

      1. wtf!!! I really thought they were the “Cause it is Just” developers. Finally I am feeling alright. thanks for the heads up.
