Warner Bros has provided us with a review code for Hogwarts Legacy. Since Warner Bros has not showcased the game’s PC version, we’ve decided to share its first ten minutes. We’ve also enabled the game’s Ray Tracing effects, and compared native 4K resolution with DLSS 3 Quality.
In order to capture this gameplay footage, we used an Intel i9 9900K overclocked to 5GHz, 16GB of DDR4 at 3800MHz, and NVIDIA’s RTX 4090. We also used Windows 10 64-bit and the GeForce 528.24 driver. Do note that NVIDIA has not yet released a Game Ready driver for this game. In fact, there isn’t even a game profile for it in the drivers yet. As such, we may see a performance boost once NVIDIA releases its new drivers, so make sure to keep that in mind throughout this article.
Avalanche has used Ray Tracing to enhance the game’s reflections, shadows and ambient occlusion. Unfortunately, though, the game requires a restart whenever you enable/disable its RT effects. Thus, we could not provide any comparison scenes in this video.
At native 4K/Max Settings/Ray Tracing, we were getting a minimum of 42fps and an average of 46fps. With DLSS 2 Quality, we were able to hit a minimum of 72fps and an average of 76fps. And with DLSS 3 Quality, we were getting over 100fps at all times.
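For those wondering why the Quality mode gains so much: DLSS renders the game internally at a lower resolution and reconstructs the 4K output. Below is a quick sketch using the commonly cited per-axis scale factors (these are the usual SDK defaults, not values confirmed for this particular game):

```python
# Internal render resolutions for DLSS upscaling modes at 4K output.
# The per-axis scale factors below are the commonly cited defaults and
# may vary per game/SDK version.
OUTPUT = (3840, 2160)

MODES = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

for mode, scale in MODES.items():
    w, h = (round(d * scale) for d in OUTPUT)
    print(f"{mode:>17}: renders internally at ~{w}x{h}")

# Quality mode at 4K renders at roughly 2560x1440 internally, which is
# why the average jumped from 46fps (native) to 76fps here.
```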
Later today, we’ll share another Hogwarts Legacy article, in which we’ll cover almost everything you need to know about the PC version.
Enjoy and stay tuned for more!

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES one of the best consoles ever. Still, the PC platform won him over, mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC graphics cards.”
Too bad it’s yet another stutterfest so far when you get to the actual Hogwarts castle.
Don’t you have some game journalism sites to kiss up to on WCCFTECH by saying that Rowling stating biological facts is problematic? I’m sure if you clowns keep agreeing with THE MESSAGE you will get hired at IGN any second now. J/K they are losing money and firing people like all game journalist sites.
Anyways enjoy the 5 real viewers you have on articles and the 50 astroturfing accounts arguing for and against Nvidia, Intel and AMD in lazy, inaccurate tech articles.
Wtf are you talking about rofl, he was saying the game is stuttering. Did you reach that stage in the culture war where you’re incapable of comprehending written text and your mind overrides every sentence with a different, more triggering one?
Alessio works for WCCFTECH and his comment was not in good faith. ALL Unreal Engine 4 games have a stutter when entering new areas or when breaking apart larger ones. They are basically invisible loading screens. Not every Unreal 4 game also has shader compilation stutter, and the Harry Potter game doesn’t. To imply it does is a bald-faced lie.
Guarantee that when Alessio or his clown cohorts at WCCFTECH cover Jedi Fallen Order 2, they’ll fail to mention this “stutter,” which is really Unreal 4 asset loading and something that no developer can fix.
Maybe if gaming journalism was HONEST with you and informed you why things happen instead of crying about politics like Alessio and his fellow clown Nathan Birch (who sometimes writes for IGN) do, you wouldn’t be so misinformed.
Have a nice day and keep reading failed journalists and activists like Alessio who hates what he does, hates gaming and is only there for the politics and to make people as misinformed as you.
Speaking of wccf, you should read the beginning part of their review of this game. They had that fat communist chris gwray review it, so he couldn’t help himself from including BS about JK Rowling’s “problematic” views. That self-hating PoS has a long history of being a moonbat and yet they keep giving him work. Sounds about right for WCCFTECH.
Haha, you are hilarious from beginning to end. I’ve never implied this was about compilation stutter, but there is noticeable stuttering anyway, as also noted by El Analista de Bits, in all versions of the game.
If Jedi Fallen Order 2, or any other game, suffers from stuttering, rest assured that we will point it out in all cases.
You are also completely clueless with regards to our Hogwarts Legacy coverage. I’ve never brought politics of any kind into it, nor have I ever mentioned the books’ author in any way in my posts. We’re also continuing our coverage of the game as normal, unlike other publications which have campaigned against it.
But you are not really interested in the truth, are you? You are just a keyboard lion with too much time on your hands and an ax to grind with nearly everybody, facts be damned. Well, I’ve already spent too much time writing this. Enjoy your little imaginary world where everything is upside down and you’re the man.
F*k your feelings, snowflake.
Don’t you have another failed boycott to run? All the otherkin community did was show the world just how crazy they are by doing things like stalking streamers and keeping a list of streamers to harass for playing the game. They do the things they claim “fascists” intend to do with their baseless fear mongering.
BTW this Chick-Fil-A is DELICIOUS. Had to wait an hour in line because people care as much about your cause as they do this failed boycott.
Nice to see, but who cares about performance when it’s just another walking simulator for brainwashed kids.
Apparently you care
Nice bottleneck there, using a 9900K with a 4090?
Is it really a bottleneck if you are playing in 4K? I have an i7 8700K with my RTX 4080 and everything runs fantastically.
I would understand if you were playing at 1080p or 1440p.
Just open your eyes and look at his GPU usage hovering around 80%
You are the one who assumed we were bottlenecked in Dead Space Remake when our GPU was at 98% usage (and I’m sooooo glad that those who criticized that article faced performance issues in the game’s later levels, which are as demanding as the benchmark scene we used – it did put a smile on my face, as it proved we did good work 😛). So I don’t think you can use GPU usage to your advantage when you’ve criticized it 😛
Hint: Something else is going on here that explains the GPU usage, and it’s related to the NVIDIA drivers (or the lack of a game profile for the game). Things should get better with the Game Ready driver and the game’s day-1 patch.
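For anyone who wants to watch their own GPU usage while playing, here’s a minimal sketch (assuming NVIDIA’s nvidia-smi tool is installed and on PATH) that polls utilization once per second:

```python
# Minimal GPU utilization logger via nvidia-smi (polls once per second).
import subprocess
import time

for _ in range(10):
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True)
    print(f"GPU utilization: {result.stdout.strip()}%")
    time.sleep(1)
```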
Unfortunately, yes, nowadays. With a 4090/7900XTX, and to some extent a 4080-class card, any CPU below a 5800X3D or a 12th Gen Intel chip severely holds back this class of GPU from running at its full potential, even at native 4K with maxed-out settings. It won’t be every game, but it is a good majority now. And if you use DLSS, which renders at an even lower resolution internally, it’s a very bad CPU bottleneck; add frame generation and ray tracing on top and you’re making the CPU work even harder when it may already be struggling to keep up with the GPU.
I switched from a 5950X to a 5800X3D with a 4090, as my GPU struggled to hit 100% load in certain games; the 5800X3D fixed that issue. Same board, BIOS and Windows install: my averages in The Division 2 went up by 10-15fps, and even in Far Cry 6 averages went up by 20fps and minimums went up as well – and this is at native 4K with all graphics settings maxed.
Yes it is
Any shader compilation stutters?
Nope, the game compiles the shaders when you first launch it. It has, however, some traversal stutters.
Of fcking course it does. We somehow overcome one problem only to be hit by another. I’m getting disgusted by stuttering in games at this point and am skipping them altogether until/if it is ever fixed.
Unreal 5 supposedly fixes that, if you have a Gen 3 or better M.2 SSD. Problem is that those games are a couple of years away.
Some comedy gold already from reviewers.
“the context of this game’s release and the troubling subject matter of its main narrative are impossible to ignore…While not every goblin character in Hogwarts Legacy is vilified, those who stand with Ranrok – the one who’s leading the ‘rebellion’ for goblinkind’s equality – are beyond a doubt.
For every option Hogwarts Legacy offers you, seeing Ranrok as a villain isn’t one of them; you’re positioned in unwavering opposition towards Ranrok and his Loyalists, whether you like it or not. He isn’t the friendliest character in Hogwarts Legacy, sure, but from what I have seen so far, the perceived evilness of his agenda doesn’t quite stack up with the severity of your fight against him. Ranrok is using violence to achieve his goals, but this is a world where non-violence has clearly been ineffective. ”
“Hogwarts Legacy doesn’t contemplate your morality within the wizarding world.”
Jesus so this is what all the bullshit on twitter was about? These people are insane. Justifying the use of violence in their tactics.
of course they are…bunch of batshit crazy racists.
Goblins aren’t people so that doesn’t change much
I’m going to play Hogwarts Legacy like a Sith.
Might makes right…
Dear John, is RAM a bottleneck in this one? Because I only have 16GB and I really don’t wish to upgrade to 32GB if I don’t have to.
you’ll be fine with 16GB of RAM (as long as you don’t run numerous programs in the background) 😉
Zero in fact, haha. I’m still acting like I own an old school 90s processor with my mindset.
Download ISLC, and you can also turn on Windows Game Mode (not the Game Bar), since it’s actually decent now. ISLC is by the people who made DDU; it’s free and it solves that problem in all games. Windows 10, since the Creators Update, has not released standby memory correctly. They probably do this to make switching between apps snappier for laptops and business applications, but it ends up destroying game performance when you run out of standby memory in longer play sessions.
You can set the utility to start at boot, and I haven’t had stutter from running out of standby RAM since.
Set the ISLC polling rate to 1000, keep the min the same, and I usually have the max at around 4096. It just means that when the standby memory reaches 1000 and free memory is less than 4GB, it will clear the list. You could set this higher for just gaming, like 8192, but I haven’t seen a game shift around that much RAM yet.
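For the technically curious, here’s a rough sketch (in Python, via ctypes) of what standby-list cleaners like ISLC are generally believed to do under the hood – asking the Windows kernel to purge the standby (cached) memory list via NtSetSystemInformation. An illustration only, not a replacement for ISLC; it needs an elevated (admin) prompt:

```python
# Rough sketch: purge the Windows standby memory list, the same kind of
# call standby-list cleaners are believed to make. Windows-only.
import ctypes
from ctypes import wintypes

SystemMemoryListInformation = 0x50  # system information class
MemoryPurgeStandbyList = 4          # SYSTEM_MEMORY_LIST_COMMAND value
SE_PRIVILEGE_ENABLED = 0x2
TOKEN_ADJUST_PRIVILEGES, TOKEN_QUERY = 0x20, 0x8

class LUID(ctypes.Structure):
    _fields_ = [("LowPart", wintypes.DWORD), ("HighPart", wintypes.LONG)]

class LUID_AND_ATTRIBUTES(ctypes.Structure):
    _fields_ = [("Luid", LUID), ("Attributes", wintypes.DWORD)]

class TOKEN_PRIVILEGES(ctypes.Structure):
    _fields_ = [("PrivilegeCount", wintypes.DWORD),
                ("Privileges", LUID_AND_ATTRIBUTES * 1)]

def purge_standby_list():
    advapi32 = ctypes.WinDLL("advapi32", use_last_error=True)
    kernel32 = ctypes.WinDLL("kernel32", use_last_error=True)
    ntdll = ctypes.WinDLL("ntdll")
    kernel32.GetCurrentProcess.restype = wintypes.HANDLE
    advapi32.OpenProcessToken.argtypes = [
        wintypes.HANDLE, wintypes.DWORD, ctypes.POINTER(wintypes.HANDLE)]

    # Purging the standby list requires SeProfileSingleProcessPrivilege.
    token = wintypes.HANDLE()
    advapi32.OpenProcessToken(kernel32.GetCurrentProcess(),
                              TOKEN_ADJUST_PRIVILEGES | TOKEN_QUERY,
                              ctypes.byref(token))
    luid = LUID()
    advapi32.LookupPrivilegeValueW(None, "SeProfileSingleProcessPrivilege",
                                   ctypes.byref(luid))
    tp = TOKEN_PRIVILEGES(1, (LUID_AND_ATTRIBUTES * 1)(
        LUID_AND_ATTRIBUTES(luid, SE_PRIVILEGE_ENABLED)))
    advapi32.AdjustTokenPrivileges(token, False, ctypes.byref(tp),
                                   0, None, None)

    command = ctypes.c_ulong(MemoryPurgeStandbyList)
    status = ntdll.NtSetSystemInformation(
        SystemMemoryListInformation, ctypes.byref(command),
        ctypes.sizeof(command))
    if status != 0:
        raise OSError(f"NtSetSystemInformation: 0x{status & 0xFFFFFFFF:08X}")

if __name__ == "__main__":
    purge_standby_list()
```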
That sounds very helpful, thanks
150fps with a 13900K; your 9900K causes a bottleneck.
What annoys me are all the console sites that talk about only 100fps, because they copy the articles from DSOGaming.
https://www.google.com/search?q=4090+hogwarts+legacy+100fps
Not having THE top of the line CPU is a bottleneck?
GTFOH
at 4K
this is a straight up lie
The game looks incredible. Looks like I’m gonna have to grab a 4090 eventually and sell the 3080. It’s just not enough anymore for these new games.
Looks bog standard to me
Muddy textures, TAA anti-aliasing blurring the whole screen, uncanny-valley mocap, and bad lighting that makes every model, including clothes, a greasy shiny mess while making every dark area grey.
This was intended. Nvidia is completely fine with making devs rely on DLSS to get a remotely passable frame rate, which will drive people to buy higher end cards for the entire cost of a PC.
These games are made for consoles in the first place.
The fact that you invent such absurd conspiracies shows that you have 0 understanding of game development, as well as the market.
The people with high-end GPUs, much less high-end 4000-series GPUs, number maybe a few tens of thousands on the whole planet.
This does not and cannot dictate game development and business decisions from publishers.
Then why does Forspoken run like trash at native resolutions (without upscaling tech) on the very latest GPUs? The 4000-series is not representative of consoles whatsoever, so there’s no reason for games to perform poorly on them, other than dev shops choosing not to optimize.
DLSS is a crutch, where dev shops / publishers can simply wave their hands and say: “Ah, it hits 60fps with DLSS enabled, just release it. We don’t need to optimize further.”
Before DLSS existed, this never would have been acceptable.
Don’t care about Flopspoken.
Agreed — but it does set a bad precedent. Hopefully dev shops will be all the wiser!
It’s not setting any precedent, game is a critical and commercial failure. The performance is equally terrible on console. You don’t have to invent conspiracies to explain it, incompetent Japanese developer with a garbage engine is what happened.
Time will tell, and neither you nor I know whether my hunch will come true. No need to gaslight with the “conspiracy” label. It’s called pattern recognition and critical thinking leading to educated guesses. Industries tend to mimic and sway to trends, just sayin’.
What you’re doing is the opposite of critical thinking.
Lol okay, and what you’re doing is denying what’s perfectly plausible purely out of spite. Expressing disagreement would have been just fine, but nah, being an arrogant a**hole seems better I guess.
Have fun introspecting, if you have enough emotional intelligence of course.
The 4080/4090 is amazing… though I feel the 4090 has the better value, even though the price is on the high side. DLSS 3 is incredible.
Happy Pride Month to all people who like the Harry Potter universe
Pride is not a virtue
Calm down lady, I don’t even know if it’s the pride month or not. I just wanted to call Harry Potter fans gay because they are.
https://media4.giphy.com/media/UAJpANY0bGPhS/giphy.gif
It’s 2023, you can be gay, it’s perfectly fine.
https://media1.giphy.com/media/d6Ni9aqSatPfq/giphy.gif
Make sure to cry about Harry Potter for hours, drone.
I’m totally fine with gay people. Why would I “cry” about men who like to have s*x with other men?
Some women NPCs in this game sound like men. It’s a MAM
We should themcott this game
Does the PC version support dual sense features – adaptive triggers and haptic feedback?
Nope, sadly it does not. Might get patched in, but don’t hold out hope.
is this a joke? or what?
They’re actually in stock at a lot of places now.
https://www.nowinstock.net/computers/videocards/nvidia/rtx4090/
Heavy!!!
If this keeps going on, we may see reduced CUDA cores and increased Tensor cores in the future, since people are now focusing more on the DLSS hype than on native rendering.
Anyway, I don’t like this trend; I prefer more raster performance over proprietary tech.
Completely agree. I’m worried devs will use DLSS as a crutch to not optimize games for native raster performance especially for those with older cards right now.
Forspoken comes to mind, barely performing well on a 4090 without upscaling tech. Absolutely ridiculous.
This is what is happening now. Only so much fps even with the 4090 and DLSS 3.
Thanks! Eagerly waiting for the DSO performance review (as always)! This one seems tough for my 2080, but I pray to the DLSS gods to grant me 4K-medium at 60fps!
No one needs a top-shelf card for any recent game. Just use the PS5 60 FPS settings you’ll get from the Digital Foundry video, disable RT (because the consoles play RT games at 30 FPS), use better AF, use ReShade CAS.fx on NVIDIA (on AMD, just use Radeon Image Sharpening built into the driver, because it’s the same thing), and play at 1440p.
Grats, you can now play any game at 1440p native, have the game look BETTER than console (1440p with CAS.fx looks stupidly good) and get higher than 60 FPS. You can do this on an old Vega 64 or a 1080, and a 6600 is faster and uses like 90 watts; I have seen those as low as 200 bucks on Amazon refurbished from Sapphire and 230 bucks at Microcenter. If the 6650 XT is close in price, just buy that.
If you take 1440p with CAS.fx and put it next to native 4K or DLSS 4K, people will choose the non-native 4K every time. That is due to sharpening, and to TAA looking horrendously bad without it, especially in Unreal games. DLSS does this natively (they are stopping that so they can sell higher-end cards to dummies).
If you bought a mid-range GPU last gen, you don’t need a new GPU. You simply need to use optimized game settings, use ReShade, and use what’s in your driver. Play RT Witcher 3 next gen; skip it for now. Lumen RT is the future anyway, and that won’t require stupidly expensive proprietary BS that is only useful in a couple of games.
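In case anyone wonders what the “contrast adaptive” part of CAS means: the sharpening weight shrinks wherever the local neighbourhood is already close to clipping, so you get detail without the ringing of a plain sharpen filter. A heavily simplified, grayscale-only sketch of the idea (the real FidelityFX CAS shader differs in its exact math):

```python
# Heavily simplified, grayscale-only sketch of contrast-adaptive
# sharpening: sharpen with a per-pixel weight that shrinks where the
# local neighbourhood is already close to clipping (0.0 or 1.0).
import numpy as np

def cas_like_sharpen(img: np.ndarray, strength: float = 0.5) -> np.ndarray:
    """img: 2D array in [0, 1]; strength: 0 (off) .. 1 (max)."""
    p = np.pad(img, 1, mode="edge")
    # 4-neighbour cross around every pixel
    n, s = p[:-2, 1:-1], p[2:, 1:-1]
    w, e = p[1:-1, :-2], p[1:-1, 2:]
    c = img
    lo = np.minimum.reduce([n, s, w, e, c])
    hi = np.maximum.reduce([n, s, w, e, c])
    # Headroom to clipping drives the adaptive sharpening amount:
    headroom = np.minimum(lo, 1.0 - hi)
    amount = np.sqrt(np.clip(headroom / np.maximum(hi, 1e-5), 0.0, 1.0))
    w_px = -amount * strength * 0.2   # negative neighbour weight = sharpen
    out = (c + w_px * (n + s + w + e)) / (1.0 + 4.0 * w_px)
    return np.clip(out, 0.0, 1.0)
```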
Thanks for the tips! Definitely going to be looking into CAS.FX. I’ve used Reshade in the past for various games with great success, but it’s been a while so I haven’t tried any of the newer techniques.
Not really interested in DLSS 3.0 framerate. Interpolated frames don’t increase responsiveness.
Turning 30 FPS into 60 FPS by shoving a frame in between with DLSS 3.0 still only gives you 30 FPS responsiveness. In fact, it gives you slightly less, since you need to delay the most recent frame.
If I’m not mistaken, it’s actually worse: it renders frame 1, renders frame 3, displays frame 1, interpolates between them, and then adds and displays frame 2, then 3. Reflex etc. can go a long way to lower that latency significantly, and some optimizations to game code could make it even less.
That said, if I played a latency-sensitive game, I would never have such an interpolation feature on – just Reflex and other latency reductions. I toyed around with the latest Dying Light 2 patch and, at least for me, input latency was noticeably worse with DLSS 3 than without. So basically, it’s a choice between smoothness and input latency, and in the end it’s always good to have a choice depending on what you want.
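To put very rough numbers on the exchange above, here’s a simplified model (my own assumption of a half-frame hold-back; real-world added latency varies per game, and Reflex claws some of it back):

```python
# Back-of-the-envelope model for frame interpolation latency. Simplified
# assumption: the newest real frame is held back roughly half a render
# interval so the interpolated frame can be shown in between.
def frame_gen_model(base_fps: float):
    render_ms = 1000.0 / base_fps   # time per real (rendered) frame
    displayed_fps = base_fps * 2    # one generated frame per real frame
    extra_delay_ms = render_ms / 2  # hold-back before the real frame shows
    return displayed_fps, render_ms, extra_delay_ms

for fps in (30, 60):
    shown, frame_ms, extra = frame_gen_model(fps)
    print(f"{fps} fps base -> {shown:.0f} fps displayed, "
          f"{frame_ms:.1f} ms per real frame, ~{extra:.1f} ms added delay")
```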
So, finally a game that doesn’t run like sh*t, unlike most games released recently.
How does it run without 4k, dlss and RT though….
Wouldn’t Low Latency (though preferred in most games) decrease framerate further?
For those wondering, yes, it does decrease fps
“Hogwarts Legacy features a transgender character, Sirona Ryan, and she can be met early on in the game.”
LMFAO
Kek told me to pre order the deluxe edition because based.
This makes no sense, as there is no quality preset for DLSS 3; it’s either frame generation on or off. It’s binary. DLSS 2 has quality presets. Stupid
So basically it runs at 50 FPS at least from a latency standpoint
Actually, DLSS 2 is really the better choice here, because the latency is lower and thus the game is more responsive. DLSS 3 is stupid IMO, because the entire idea of higher framerates is to lower latency and make the game more responsive, but DLSS 3 doesn’t actually do that; it just artificially boosts the framerate while keeping latency the same or perhaps making it even worse
Only 46fps with everything that’s visually possible enabled and without DLSS. This means we’ll rely heavily on DLSS and similar tech in the future. It’s like traditional methods can’t keep up with the rising visual standard anymore.