The NVIDIA GeForce RTX 4090 is the most powerful GPU on the market. When it came out, we claimed it was a true 4K GPU. However, Dead Space Remake is one of the first games in which this GPU is unable to offer a constant 60fps at native 4K with Ultra settings.
Now the good news is that Dead Space Remake does not suffer from any shader compilation stutters; the game builds its shaders when you first launch it. However, there are a few noticeable traversal stutters.
Regarding the game’s performance, its prologue scene was running at 50fps on our NVIDIA RTX 4090. As you can clearly see, our GPU was used to its fullest (100% usage) and we were not CPU-bottlenecked. Thankfully, once you are inside the USG Ishimura, performance improves. Still, there are some scenes – for instance when the Necromorphs attack your crew – in which the framerate drops below 60fps.
What this basically means is that DLSS 2 and FSR 2 will be essential for everyone who targets really high resolutions. Yes, the game supports both NVIDIA DLSS 2 and AMD FSR 2. Furthermore, there is support for ray-traced Ambient Occlusion.
It’s also worth noting that the game does not feature an FOV slider. And, while playing it on my LG CX OLED TV, I felt really uncomfortable with the camera. In numerous cases, the camera was too close to Isaac (and I was unable to see where I should go). Things may be better on a PC monitor, but yeah… the lack of an FOV slider is a bummer.
I’m currently uploading a video featuring the game’s first ten minutes at native 4K/Ultra settings/Ray Tracing. Once the video has been uploaded, I’ll be sure to add it.
Lastly, since a lot of people have criticized the performance of The Callisto Protocol, it’s worth pointing out that that game runs WAY BETTER than Dead Space. Without its RT effects, The Callisto Protocol only drops to 63fps at native 4K/Ultra settings. On the other hand, Dead Space Remake can drop to 50fps on the RTX 4090. Furthermore, The Callisto Protocol looks better.
Now I’m not saying that Dead Space Remake looks bad or that it’s unoptimized. However, it’s obvious that a lot of people were a bit harsh on The Callisto Protocol. I mean, you can’t have double standards just because you like a particular game. In these initial 4K tests, Dead Space Remake runs worse than The Callisto Protocol (and IMO, The Callisto Protocol looks better).
But anyway, our PC Performance Analysis for this game will go live this weekend, so stay tuned for more!

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best consoles. Still, the PC platform won him over from consoles. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher-degree thesis on “The Evolution of PC graphics cards.”



Wasn’t Frostbite meant to be optimized as hell?
Is this 9th gen exclusive? I didn’t check.
Frostbite hasn’t been optimized since Battlefield 3
Aside from BF 2042, all Frostbite stuff ran wonderfully.
I distinctly remember both BF 1 and BF V having random frame-pacing/stutter issues at framerates over 100 fps [when playing with a 1080 Ti / 10600K @4.9 GHz, 4000 MHz CL15 RAM]
Yeah, BF4, released 2 years later, has no issues whatsoever…
2042? Oops.
People were harsh on Callisto because it also kept crashing and had MASSIVE stutter problems. Next article, try not to let your bias creep in so obviously.
Which were resolved in just a week, yet people still to this day say that it runs horribly. At native resolutions and without RT (BTW, TCP has more advanced RT effects than DS), TCP looks and runs better than DS. The good things about DS are that it supports both DLSS 2 and FSR 2 (which is a win-win for both camps) and that it does not suffer from single-threaded CPU issues.
First impressions matter. Turns out people don’t like paying money for something that’s nearly broken on release. Launch broke, become broke.
That combined with the slew of other problems (repetitive combat, no enemy variety, repeated boss 4+ times, hallway design, etc.) adds up to a dead game with garbage reputation.
Say that again when a Bethesda game releases.
WTH??? You just reported “The Callisto Protocol gets a new 548MB patch, does not fix its performance/optimization issues”
And? This doesn’t mean that Dead Space is amazing on PC.
PS: We’re also referring to its CPU single-threaded issues. The game runs with constant 60fps on a variety of PC systems without RT. This doesn’t mean that it doesn’t have optimization issues.
I mixed games up!
Callisto protocol runs at sub 50fps at MEDIUM settings (no rt) on my 3080 + 5800x3D. There’s no way this game is worse
Whether it’s true or not, the perception is now that denuvo is partially or completely to blame. 4k on pc is still a dream for most. But there is no reason to include this garbage DRM if there is even the possibility of it causing performance issues. Oh well. Another 80 bucks saved.
Why not wait for the evidence rather than believing the hearsay (and propagating it)?
That’s what you get for using a cheap video card!
https://media3.giphy.com/media/3o72F8t9TDi2xVnxOE/giphy.gif
My guess is they did not even bother optimizing it. Star Wars Battlefront looks better. Same engine. And that game ran very well.
Prey 2017 doesn’t look good? Time to check ophthalmologists near you.
Yeah prey is a damn decent looking game, good game too
I finished Prey 3 times to check all the ways to play and it’s a mediocre game.
Just like Bioshock.
DS Remake looks great too; Motive are the ones that know the engine best, that’s for sure. I think they did a great job pushing a last-gen engine as far as possible on the current systems. Now we need DICE to make Frostbite 2 moving forward.
Kingdom come deliverance looks amazing!
the rating on metacritic is very high
So what?
Mainstream critics are paid off and that’s been known for decades and the User score is filled with bots with no way to know if they even own the game.
Callisto doesn’t look better
It does, but only on Josh Duhamel… it’s like Josh Duhamel visiting the PS2 years.
Is it on the graphics card or the game? Hmmm…
I mean there is basically no reason to ever not run dlss quality at 4k especially with raytracing on.
This game barely has RT.
It’s only AO and the performance hit is less than 10%
If you are running DLSS it is not 4K and it looks noticeably worse.
Something is wrong with your setup. The game averages 70 fps at native 4K on a 4090.
with dlss? he said Native 4K
Trouble reading?
In the same area he described? With the RT on / maxed?
It shouldn’t be impossible for a card worth 3,000 USD.
try next gen
That RTX 4090 must be a bootleg or something.
Here on a 4080 + 10700K: 100+ fps at 4K on Ultra + DLSS Quality
He said NATIVE 4K + Ultra, no DLSS
Maybe… just maybe your CPU (9900K) is the culprit? Just maybe, I’m not sure. Just a brainstorm…
LMAO, is that what you call a “brainstorm”?
It is his cpu… I have much better performance and apart from the intro with dips into the 50s everything else ran above 60 with room to spare. (12900k+4090 here).
He is in defence mode; there is no reason to try to converse with him. Is he poor or something? Just F’ing upgrade already. People with a 10900K got awful fps in cities in Elex 2 compared to my 12900K with the same GPU, and his CPU is even worse. At least they listened, and we compared fps, stutters etc. and learned something. This guy is just deluded for some reason.
He’s waiting for the new AMD 3D cpus
Ive been waiting since i5-2500k too lol. Time for an upgrade i suppose.
It’s the only thing i can think of by now, or really short on funds.
There’s something not right about that performance he’s getting but i don’t think it’s his cpu.
I’m getting around 50 to 60fps with everything maxed out (DLSS balanced) on 4K.
6700K+3080.
You and the moron that upvoted you have probably never had a brainstorm in your life. Maybe try at least skimming the article and you will see the GPU was at full utilization, dimwit.
99% GPU usage means jack these days. Having faster RAM or a better CPU can still give better fps even if your GPU looks maxed, especially in the 1% and 0.1% lows.
Sometimes a CPU bottleneck can even happen at lower reported utilization percentages. A card can show 100% usage but still not be at its full potential. That’s why more powerful CPUs often give better results.
My 3080 only stretched her legs after I replaced the 8086k with a 13600kf
Well, since it wasn’t bottlenecking, you are maybe talking about 2 or 3 fewer FPS…
These days 99% gpu usage means jack all. Look at many articles with better cpus or ddr5 or OC’d or not, more cache, less cache. It’s complicated these days and yes a 9900k is old and sh*te these days and ddr4 is bad on games like Spider-man, get over it and read and watch other stuff as this website is now becoming a meme.
You have zero idea what the hell you’re talking about. If the GPU is being fully utilized and it is the absolute limitation which it is in many cases then you can throw more CPU and RAM speed at it all you want and it won’t make any difference. I now have a 13700k and there are plenty of games that there is no FPS difference at all over my 9900k and I knew that was going to happen because my GPU was already at 100% usage in those cases. If a faster CPU helps or faster ram helps then the GPU was not even at 100% or even 99%.
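For anyone who wants to move this from argument to measurement: here is a minimal sketch (my own rough example, assuming an NVIDIA GPU with the driver and the NVML development headers installed; link against the NVML library) that polls GPU utilization once per second while the game runs:

```cpp
// Minimal NVML polling sketch: watch GPU utilization while the game runs.
#include <nvml.h>      // ships with the NVIDIA driver / CUDA toolkit
#include <cstdio>
#include <chrono>
#include <thread>

int main() {
    if (nvmlInit() != NVML_SUCCESS) {
        std::fprintf(stderr, "NVML init failed (NVIDIA driver not available?)\n");
        return 1;
    }
    nvmlDevice_t gpu;
    if (nvmlDeviceGetHandleByIndex(0, &gpu) != NVML_SUCCESS) {   // first GPU in the system
        std::fprintf(stderr, "No NVIDIA GPU found\n");
        nvmlShutdown();
        return 1;
    }
    for (int i = 0; i < 60; ++i) {                               // sample once per second for a minute
        nvmlUtilization_t util{};
        if (nvmlDeviceGetUtilizationRates(gpu, &util) == NVML_SUCCESS) {
            // util.gpu    = % of the last sample period in which a kernel was executing
            // util.memory = % of the period the memory controller was busy
            std::printf("GPU core: %u%%   VRAM controller: %u%%\n", util.gpu, util.memory);
        }
        std::this_thread::sleep_for(std::chrono::seconds(1));
    }
    nvmlShutdown();
    return 0;
}
```

If the core utilization stays pinned in the high 90s while the framerate sits below your target, the GPU is the limiter in that scene; if it keeps dipping well below that, the CPU or engine isn’t feeding it fast enough. As noted above, even that isn’t the whole story for 1%/0.1% lows, but it’s more useful than arguing over a single overlay number.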
From 4K videos Dead Space looks better than Callisto, but maybe it is due to art direction. Also, why gripe about FPS drops without going into what might be causing them? The “can it run Crysis” mentality of “it’s not running flawlessly on my X GPU/CPU combo” leaves so much to be desired as an opinion. Like, give us some data!
Ooof… the disappointment!!
Apparently it’s pretty good, but why bother? I played the first one enough times; no need to replay it with slightly better graphics that you can’t see due to super-dark environments, and needing a card as expensive as a PS5.
The only good thing I find in this remake is the top notch graphics and level design.
That team did outstanding job.
Apart from that, the animations are mediocre, the sound design is bland, and basic game mechanics like aiming, shooting, punching etc. feel like they were done by one man in a day, just for the sake of completing the game. And everything else lacks that boom factor of the original.
I naturally like your website; however, you need to take a look at the spelling in several of your posts. A number of them are rife with spelling problems, and I find it very bothersome, to tell the truth. On the other hand, I will surely come again.
I don’t blame it on the hardware makers, I blame it entirely on the developers who can’t code for sh1t3 and forgot things like optimization, file compression and so on because they’re too busy virtue signalling on social media.
and they are so annoying at doing it
I blame both. The developers fuel the hardware manufacturer to not innovate because no matter what, you’ll be playing 60 FPS all over again. In the year 2099, it will be 54k@60fps…
I’ll just play at 52K at 120-144 fps. I can’t tell the difference.
Complaining about “virtue signaling” is also a form of virtue signaling for the Right Wing these days ….
You get played like a low IQ midget that you are.
Stay mad sand rat
It’s 1440p/30 fps at high settings on consoles, FWIW.
Rockstar about GTAV performance: those graphics are for computers of the future.
GTA IV, but yeah, that’s what they said. Such a future-proof game that it didn’t even have anti-aliasing.
I for one cannot wait for Atomic Heart to come out and not be able to run on a super rig despite having very low requirements, due to Denuvo, VMProtect and who knows what other bloat they will shove into it, only to be told to get an RTX card so I can use DLSS to offset the performance cost, like the studio CEO said. God, I hate modern AAA games; you have to wait half a year until a game is playable and need video cards that cost as much as a console to run them well.
The 8K (lol) high-end card can do 60 fps at 4K in a decade-old engine. Wow.
Except for the opening scene (around 80fps), this guy gets well over 100fps during normal gameplay at 4K DLSS Quality on the RTX 4090. There are also gameplay videos on much slower cards like the 3070 Ti with over 100fps at 1440p DLSS Quality, so I don’t think The Callisto Protocol runs better, even in DX11 and without RT.
https://www.youtube.com/watch?v=Sg_HG6L6880
If someone uses DLSS in Dead Space, they should also use FSR in The Callisto Protocol. You can’t compare a game in native 4K (TCP) with a game with DLSS (DS). Also, DLSS and FSR have MIP LOD issues in Dead Space, resulting in blurrier textures (the developers need to fix this, this doesn’t happen in other games using DLSS/FSR). You can find examples of this issue on our Discord server.
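For reference, the usual fix for this kind of blur is a negative texture MIP LOD bias when upscaling, commonly in the ballpark of log2(render resolution / display resolution); the exact offsets NVIDIA and AMD recommend in their DLSS and FSR 2 integration guides differ slightly, so treat the numbers below as a rough sketch rather than the game’s actual values:

```cpp
// Back-of-the-envelope mip bias for an upscaler: sample textures as if rendering
// at the output resolution rather than the lower internal resolution.
#include <cmath>
#include <cstdio>

float UpscalerMipBias(float renderWidth, float displayWidth) {
    return std::log2(renderWidth / displayWidth);   // negative when rendering below output res
}

int main() {
    // Quality mode at 4K output renders internally at 2560x1440 (0.667x per axis).
    std::printf("4K Quality mode bias:     %.3f\n", UpscalerMipBias(2560.0f, 3840.0f)); // ~ -0.585
    // Performance mode renders internally at 1920x1080 (0.5x per axis).
    std::printf("4K Performance mode bias: %.3f\n", UpscalerMipBias(1920.0f, 3840.0f)); // -1.0
    return 0;
}
```

Without a bias along those lines, the sampler keeps picking MIP levels for the lower internal resolution, which is exactly the blurrier-texture symptom described here.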
But even at native 4K this guy also has over 60fps during gameplay (80-90fps). John you have to come up with a better explanation :D.
And BTW, The Callisto Protocol undoubtedly runs worse. On something like a 3070 Ti it’s impossible to get over 100fps at 1440p even with FSR, and that’s how Dead Space Remake runs on a 3070 Ti.
We usually get 80-100fps at native 4K. The problem is that there are scenes that can tank the framerate. The intro scene is one of them, and the first time the Necromorphs attack your crew is another one (later in the game there might be more demanding scenes). I can certainly cherry-pick scenes in which the game runs at 100fps at native 4K. This doesn’t mean that there aren’t more demanding scenes (and we always prefer benchmarking the most demanding areas).
The Callisto Protocol runs a lot worse, and you do not have to look for a few rare places where its framerate drops below 60fps.
And BTW, John, why is the temperature of your CPU so high? 100°C is a lot :D, you could cook an egg on it (3rd screenshot).
Because he is a fool, not to mention the fact that a 9900K is outdated and bottlenecks a 4090 like crazy.
Using Ultra settings is rarely smart. It usually provides marginal increase in image quality for an fps impact that isn’t worth it. I only really use Ultra when I am playing games that are more than 5 years old, or those that otherwise have light hardware demands [RTX 3080, 1440p, targeting 120 fps].
Also AA on a 4k screen ?
THIS should be repeated at every chance. However… It isn’t.
F***in the story of every AAA game these days!
That being said, I see no point in using native res; to my eyes DLSS Quality is overall comparable to, or sometimes better than, native, plus better performance = a much better experience.
Not in Forspoken it’s not. DLSS quality at 4k looks awful, FSR 2 at quality looks much closer to native and much sharper. It’s on a game by game basis these days.
Again the 9900K strikes back. Apart from a small dip into the 50s in the intro of the game, afterwards it runs above 60 at 4K native with everything maxed out, with some room to spare. I have a 12900K and a 4090, both at stock clocks.
Please stop using such outdated cpu. It makes no sense to pair such cpu with a 4090. Even when I had the 3090 I gained a lot of performance after switching my old 9900k@5ghz to the 12900k at 1440p in most titles with RT.
It makes no sense for the garbage Nvidia RT middleware to use one CPU core. Which is why this crap will be abandoned for Lumen lighting going forward, and DirectStorage will eventually require even less from the CPU.
No one is going to buy CPU upgrades for what is essentially a broken feature to begin with. They are just gonna use frame generation on their current CPUs, and any game that uses RT lighting and isn’t software Lumen is just gonna be broken without it.
If the RT lighting middleware wasn’t crap, Dead Space would barely touch the CPU. It’s a SLOW walking simulator with laughable combat that you do standing still half the time. Meanwhile you have Doom Eternal with RT reflections in the hundreds of FPS with a hundred enemies on screen while you run around at the speed of light.
Game developers have probably already abandoned this garbage outside releases coming out this year and they have upgraded from Unreal 4 to 5.
If you think people are going to do a CPU/MB/RAM/M.2 upgrade for creatively bankrupt niche remakes that sold a whole 1 million copies you are delusional.
Quit complaining like a little kid who doesn’t know anything. At 3:43, we enable DLSS Quality which increases performance (so no, we’re not CPU-bottlenecked. If we were, we wouldn’t be getting any performance boost from DLSS 2). And even with DLSS Quality, the RTX 4090 is still being used at 99%.
Anyway, since we’re always benchmarking the most demanding areas (which everyone should be doing), the intro/prologue is the scene we’ll be using for our PC Performance Analysis. It will perfectly show the performance gap between native 1440p and 4K.
Nvidia shills are desperate. Their cards aren’t selling and they desperately have to defend current RT that will be a dinosaur when all the future games in development release with Lumen on Unreal 5. If they were smart they would push reflections. That is all the RT hardware on Nvidia cards is going to be great at.
UE5 can do software or hardware RT lumen. Hardware RT looks far better and only Nvidia have hardware RT. At least be educated before making daft comments. Look at Fortnite as an example between software RT on console and hardware RT on Nvidia or running slower on AMD gpus as always.
Lumen is fake ray tracing designed with the consoles in mind because they are incapable of proper ray tracing.
First of all there’s no need for insults, we can be more civil than that.
Second, your argument about enabling DLSS increasing performance is essentially misleading. I ran the exact same scene as your video, and where you get an almost constant 50-51fps I get 54-55fps. That’s a 10% difference, which in my book is not insignificant (also, your instant dips when changing locations are much more pronounced in your video than the ones I have).
Third, you were the one publishing an article titled “NVIDIA GeForce RTX 4090 cannot run Dead Space Remake with constant 60fps at native 4K/Ultra Settings” when it only fails to do that in the intro sequence, and in the rest of the game you get pretty much a solid 60/70fps or above (I played for quite a bit into it). Of course there’s the occasional instant dip here and there when you change locations, but it’s just for a brief fraction of a second. Clearly the intro is not representative of the actual game, and making a judgement of the performance of the entire game based on an insignificant portion of it renders your conclusion misleading.
Fourth, pairing a 2000€+ gpu with an outdated cpu makes no sense. Even an amd r5 7600 or an intel i5 13600k would wipe the floor with the 9900k (even if it’s overclocked at 5GHz). I consider this simple common sense and even Alex from Digital Foundry was made to see this flaw and changed from his i9 10900k to a 12900k.
Finally, I won’t be bothering you more about this since I essentially already presented all my arguments and I’m clearly not going to waste more of my time to end up being insulted. I honestly hope you can be more professional about this.
Lol, those shill clowns from Digital Foundry did not know this? Even in RPCS3 the 13600K mops the floor with my 10700K.
But at least he got a 9900K. I’ve seen comments on the Nvidia subreddit of someone using a 2600K with a 3090! That’s a CPU with a max RAM speed of 1333MHz?
Article title: “NVIDIA GeForce RTX 4090 cannot run Dead Space Remake with constant 60fps at native 4K/Ultra Settings”
And then you say “But the intro runs with 54fps”.
So you basically proved us right, unless you don’t know what “constant” means and why we used that word. Moreover, later levels may be as demanding as the intro. We’ve seen this in numerous games. I mean, if you want to judge a game’s performance in less taxing areas, that’s up to you (it doesn’t mean that there aren’t drops in other areas, you simply overlook them. And that’s bad reporting). We always benchmark the most demanding areas.
I said the conclusion in the article title is misleading, unless you don’t know what “misleading” means. Also, I find it interesting that you ignored the part where I proved that you were actually being CPU-limited by 10% vs a more recent CPU.
I’ve talked with other people that have practically finished the game, and all say the game oscillates between the high 60s and 80s. None of them are using DLSS because the textures look blurrier (probably some issue with the implementation, like not using a negative LOD bias, which is necessary to retain texture clarity). The FSR implementation seems to have the same issue.
Your article title gives the impression that the outlier part is representative of the game experience when in fact it’s the 4-5 minute exception. If I had to classify your article it would be bad press. You’re essentially skewing the truth based on a single data point, and made it worse by using a nonsensical combination of hardware.
Ahem, you didn’t prove that the CPU is limiting the RTX4090 by 10%. We are not limited by the 9900K in this particular game at native 4K. You should learn how to read the charts and data, they are all in the video.
You know why there are minor performance differences? That’s because different RTX 4090 models have different clock frequencies. An OC’ed RTX4090 will perform better than the RTX 4090 FE we have. And, surprise surprise, in general, most OC’ed third-party models are around 6-10% faster than the FE models.
Where did you come up with those 6-10% numbers? It’s 1-3% at best. And besides, my model is a non-OC Gigabyte Windforce, because I have a mini-ITX build and it was the only one that fit with the air cooler.
https://tpucdn.com/review/asus-geforce-rtx-4090-strix-oc/images/relative-performance_3840-2160.png
The game has RT, yes. Using DLSS makes the RT effects run at a lower res, yes. So it’s less CPU-demanding at a lower RT res, yes. The only moron here is you, who keeps thinking this old CPU can run modern AAA games with RT and without stutter. Wake up, jeez. You are just deluded now; we can see benchmarks with better CPUs not have massive stutters or whatever. Spider-Man runs way faster with DDR5, never mind your sh*te CPU and DDR4. Please just stop, upgrade and get with the modern age. These articles are just becoming embarrassing. Calling people kids is just a joke; we all know your RAM and CPU are so old they can’t cope with a 4090. Even my 12900K loses badly to the 13900K on Far Cry 6, as it is so speed- and core-dependent. Evidence speaks for itself, and I didn’t think my 12900K would be that much worse, but it was, OC vs OC on that game, even though we both had DDR5. Just admit it and move on like most sensible people; my CPU is one year old and loses by a lot, over 30fps, when both are OC’d in one game.
At this point, and until you understand how to read the charts and stats for specific games, we’ll stick with the 9900K 😛
My god, the stupidity. You really don’t understand anything at all, do you? You are proven wrong about stutters and low fps, like everyone always says, yet you still think benching games with such an old CPU is fine when even budget builds these days have a better setup?
Not sure how you run it at 50 fps. Just tested on a 5800X and 4090 FE: it runs 90-100 with TAA High, and on DLSS Quality I get 100-150. Sure, you might get occasional dips, but that’s for the most part on the game engine, not the hardware being incapable. The biggest missed opportunity in this game is the lack of ray tracing, it being an all-dark game.
He said NATIVE 4K + Ultra
Good grief you idiots are so hell bent on replying not even reading the damn article. How many times does it have to be said he was talking about native 4K??
The 90-100 I am getting is at NATIVE 4K, Ultra, all maxed… who is the idiot here now?
You can’t technologically outpace trash developers and no optimization. At this point, if a game is running at less than 90 fps at 4K native without RT on a f*cking 4090, it’s safe to say the optimization is abysmal.
I could say yes, but no.
Optimization, especially when it comes to the coding side of it, is an expensive process of development.
Therefore, naturally, as time advances and hardware becomes more powerful, there won’t be any focus on making it run as smoothly as possible.
Therefore, when a new generation launches there will always be a lot of games that run far below expectations…
Also, if you mathematically take the performance metrics apart and consider that PC uses higher settings than console, I think the performance even scales logically between devices.
And seeing as this remake is only on current-gen, you just can’t expect it to run great on anything that is far below current-gen performance-wise.
Currentgen runs it below 4k to achieve 60 fps at lower settings and without RT.
To do it with RT they have to run it at 30 fps.
So all in all, things are not as bad as they seem; it’s just the way things progress naturally. Developers are not all 140-IQ people who get paid 8k a month (yeah, if they got that, we would have great performance in all our games), but it’s business, and they’d rather cheap out on most things, and sadly nothing will change that.
Current-gen also runs below 4K to do Ray Tracing at 30 FPS… All they have is basically an RX 5700 with beta 1st-gen AMD Ray Tracing cores.
That isn’t true either. The “current gen” isn’t a monolithic gaming system. The Series X and the PS5 vary wildly in GPU core count, clock speed, memory bandwidth and even hardware-level features. I don’t know why these journalists are so focused on 4K when resolution is nowhere near as important as it used to be.
It’s for enthusiasts who want the latest and greatest, without knowing most people still play at 1440p.
AMD Ryzen cores are AMD Ryzen cores …. RDNA graphics are RDNA graphics ….. Sony uses 36 CUs and they are basically the same CUs and count that a 5700 non-XT has with the exception of 1. Being lower TDP and 2. Using lower clock speeds because of number 1 …. The CPU is a Ryzen 3700 chiplet with the same rules as the GPU it has, 1. Being lower TDP and 2. Using lower clock speeds because of number 1
To gain more performance, MS added a few more CUs (52) while Sony decided they would up the clock speed of the GPU at the expense of the clock speed of the CPU, needing a more expensive and possibly problematic cooling solution (liquid metal in a consumer-grade device is just asking for trouble down the road). The problem with that is Ray Tracing needs both more GPU and CPU power, so they upscale using a method that’s not even as good as FSR 1.0 and nowhere close to DLSS 2.x.
4k @30 FPS with RT on.
OK what exactly isn’t being optimized here …. and be specific ….. It’s not CPU threading, that looks really good …. The shaders are pre-compiled and not causing any stuttering so that’s good …..
What exactly would you suggest they do to “optimize” it?
Cpu threading means nothing if your cpu has old low IPC and cache. A modern Intel 4c 8t cpu can beat a high core ryzen cpu from some years back.
Bro, comparing old Intel and old AMD sounds like bias; why not compare today’s 13th-gen series vs the 7000 series?
He has no idea. He’s the typical end-user consumer with a child’s level of experience when it comes to how the technology actually works. He sees a big graphics card and a big CPU and thinks “It should work. It has a big number on the side of the box!” With the number of folks upvoting him, it’s very likely most are folks/bots whose highest level of skill is trolling. Videos covering the actual tech issues are starting to come out, so stick to those outlets vs feeding the low-level trolls.
These are the same type that just use the highest preset and never go in and do their own optimization to the settings so it works best with what they have …. Just about every game has one or two settings you can lower from Ultra to High, gain a buttload of FPS and not even notice the difference when you are actually playing the game. Many games have INIs you can also use to get more performance or get better graphics if you have some headroom. Then there are settings I find just worthless like Motion Blur and Depth of Field …. Turn them both off and you can gain 5-10% in many scenes
Of course these things take actual effort and a bit of knowledge/experience, none of which the Entitled want to be bothered with ……
The game takes place in a similar setting to The Callisto Protocol (i.e. not open world). So since TCP runs better and looks better, it is reasonable to assume DSR is not optimized well enough.
I’m sorry but that’s BS. When devs make a game, they consider a performance budget. They consider what configuration they target as an absolute top limit… and then step down from that. Because, of course, no GPU is powerful enough to do 1:1 real-life graphics and probably won’t be for another 20 years. Yet we are talking 4090 here. A $2000 video card that struggles with the game. That means devs have effed up big time, plain and simple.
Of course if devs really aimed for the “future GPUs” like it was the case with Crysis and Kingdom Come Deliverance – they would’ve stated as much in the game. But Dead Space certainly doesn’t look like it needs a future tech.
So it shouldn’t work well on the best card out after you’ve forked out all that money for the game? Shilling for the studio, huh?
Yeah, it’s something. I get it, it takes time and costs (lots of) money to optimize, but they need to, cause the goal is for 4K to replace 1440p. People can hate it or love it. I myself go to console for some games because the lack of 4K HDR ruins immersion or enjoyment, period. I was happy PC got GOW, for instance; the same day I finally got it I bought a stupidly expensive 4K gaming monitor cause I just couldn’t, I just couldn’t. I’ll wait another card or 2 before I spend on a 4090; I don’t feel it is as future-proof as they claim it to be. Not even from a hardware standpoint: just like you and I’m sure some others have said, devs just won’t do their job like they used to from the rip. It seems like we only get good (optimized) games right towards the end of a console generation lol. Point is, I want 4K HDR at 120+ FPS, preferably 200, cause some games I refuse to sacrifice for frames, while others do the opposite cause they only play COD, Fortnite or Overwatch (examples).
Soon DLSS 3.0 will be required to run games at 60 fps
Like “I can’t run this game on my RTX 3070 at 1440p 60fps, what should I do?”
“Buy a 4060 and turn on DLSS 3.0 Frame Generation”
They already use this excuse for DLSS 2.0…
So… The new question is…
“Can it run without DLSS?” ®
The Witcher 3 developers over at CDPR should be taking notes on what proper threading in DX12 is supposed to look like …… Nothing worse than having 16 threads and only 2 of them are doing all the work ……
A lot of that has to do with the RT middleware. Witcher 3 was probably the best DX 11 CPU threaded game of all time.
Not really, because DX11 can only handle about 4 threads efficiently and the rest not so much; when it came out in late 2009 everything was still dual-core, with a handful of 4-core parts at the high end, and Hyper-Threading was still rare outside high-end CPUs.
DX11 is a high level API meaning it is easier to use but has limitations in threading, memory allocation/transfers and basically a FIFO shader pipeline and the API itself handles the CPU threading
DX12 is a low level API which means it’s harder to use and it’s up to the game engine developers to handle threading/ memory and it has parallel and out of order shader pipeline that again is harder to use
DX11 is like BASIC and DX12 is like C, and the biggest problem is developers thinking they can just modify an old DX11 game engine and make it work with DX12. But just like converting a program written in BASIC to work in C, you can’t directly convert it; you have to start over from scratch.
CDPR uses a DX11 game engine (Red Engine) in Witcher 3, Ubisoft uses a DX11 game engine (AnvilNext 2.0) for Assassin’s Creed and Dead Space is using a DX11 game engine (Frostbite 3), and they all tried to convert it to DX12 and failed… It took well over a year after release before CDPR got the Red Engine working properly in Cyberpunk, and then they turned around and made the exact same mistake with Witcher 3, mainly because all their best devs are tied up with Cyberpunk and still are. That’s why I don’t expect a proper fix until they announce the date for the Cyberpunk DLC, which will free up those guys to work on Witcher 3.
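To make the threading difference concrete, here is a rough, stripped-down C++ sketch (not taken from any of these engines) of the pattern a DX12 renderer has to implement itself: each worker thread records its own command list against its own allocator, and one thread submits them in order. Swap chain, pipeline state, barriers and the actual draw calls are omitted, and error handling is minimal.

```cpp
// Rough sketch of DX12-style multi-threaded command list recording.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>

#pragma comment(lib, "d3d12.lib")
using Microsoft::WRL::ComPtr;

int main() {
    // In DX11 the runtime owns a single immediate context and does this juggling
    // for you (badly, past ~4 threads). In DX12 the engine owns all of it.
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device))))
        return 1;

    D3D12_COMMAND_QUEUE_DESC queueDesc{};
    queueDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> queue;
    if (FAILED(device->CreateCommandQueue(&queueDesc, IID_PPV_ARGS(&queue))))
        return 1;

    const int workerCount = 4;
    std::vector<ComPtr<ID3D12CommandAllocator>>    allocators(workerCount);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(workerCount);

    for (int i = 0; i < workerCount; ++i) {
        // One allocator per thread: allocators are not thread-safe and must not be shared.
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT, IID_PPV_ARGS(&allocators[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i].Get(), nullptr, IID_PPV_ARGS(&lists[i]));
    }

    // Recording happens fully in parallel because each thread only touches its own list.
    std::vector<std::thread> workers;
    for (int i = 0; i < workerCount; ++i) {
        workers.emplace_back([&lists, i] {
            // A real engine would record this thread's slice of the frame here
            // (draw calls, resource barriers, etc.).
            lists[i]->Close();   // a list must be closed before it can be submitted
        });
    }
    for (auto& w : workers) w.join();

    // Submission is serialized on one thread, in whatever order the frame requires.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());

    // Block until the GPU is done so nothing is released while still in flight.
    ComPtr<ID3D12Fence> fence;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));
    queue->Signal(fence.Get(), 1);
    HANDLE done = CreateEvent(nullptr, FALSE, FALSE, nullptr);
    fence->SetEventOnCompletion(1, done);
    WaitForSingleObject(done, INFINITE);
    CloseHandle(done);
    return 0;
}
```

The hard part in a real engine isn’t this skeleton; it’s splitting the frame sensibly, tracking resource states across those lists and recycling allocators every frame, which is exactly the work a DX11-era engine never had to do and why these conversions keep going wrong.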
Performance of this game is really disappointing. I have a RTX 3080 and I have to play the game on Medium settings to get a stable 60 FPS. That is just terrible! What a pile of dog poop. Money wasted.
https://uploads.disquscdn.com/images/3ffeb4124b4fef9d523e2717622444e26f38c34a89a66977fabe8b7a345a4492.png
FFS… couldn’t help themselves.
It gets worse https://uploads.disquscdn.com/images/eb202ec448bfea917356e6d346b11e43c24e717c30513b4b32c27a0478846a17.jpg
They want to force it down the throats of younger generation at any cost!no one should buy this crap and support degeneracy
It’s called social engineering…wiring their brains that it is the ONLY world they will know!
DISGUSTING : (
P.S – The one aspect, Im all about racial equality and truly believe that color of the skin doesn’t make you superior. BUT, BUT, if it is forced to the point that a black is forcibly put in the front of everything, whether it fits or not…is just cringe beyond comprehension!
Neither Greek nor Jew, bond man or free man, male nor female; all are equal in the sight of God.
Redefining racism to fit agendas and narratives is not true racism . Treating or feeling about someone differently because of their skin color is racism.
Is this the kind of virtue signalling that is acceptable among some centrists?
Someone pretending to be one maybe
They did it just to Trigger guys like you …… They are probably sitting around laughing about it right now ….. Kind of like I am …..
I’m not so called “triggered”, I just think it’s pathetic they’re projecting mental illness into these games.
EA Motive is also developing the upcoming Iron Man game.
Well, I may not be playing in native 4k, currently playing at 3440 x 1440, and I’m not getting anything less than 130 FPS – during the prologue or the game using Ultra settings and DLSS Performance. Tried with DLSS Balanced (how I’ll primarily play) and got as low as 118. Also, tried it with TAA High, which saw FPS drop as low as 103. Nothing below 100 FPS however and GPU temps never were over 68 degrees Celsius.
Tons of people are playing it at 100fps on Ultra 4K with a 4090; I think the guy you picked has something wrong with his system.
https://www.youtube.com/watch?v=PslqN4oh_pg
“Tons of people”. Links video to one of the most highly tweaked, best gaming PC’s in the world from an enthusiast who usually has two machines with the top GPU from Nvidia/AMD. That video also shows dips into the 50’s in the first minute. Did you even watch the video?
Something must be wrong with your build, maybe OS performance allocation issues, RAM, driver trouble, runaway background programs, can’t say. What I can say is in the same area ultra native 4k for me averaged 70-75 on a 4090 with a Ryzen 5900x. I didn’t even see peak GPU utilization. In fact as far as I can tell at 4k I’m in a slight cpu bottleneck and at 1440p I’m in a strong CPU bottleneck. It had a few traversal stutters like you detail to below 50 but the average was still upper 60s-lower 70s. Highest utilization I’ve seen was 80% from my 4090.
4090, 32GB 3600MHz, Win 10 most current build, Ryzen 5900X, most current drivers for everything. Additionally, the opening area is the hardest area to run; anywhere else in the story I saw higher frame rates than the first area. Anytime you revisit that area you see a frame drop. Just got done watching the full video. The OP’s frame rate was consistently over 60fps for like 90% of the video. The opening cinematic drops/stutters and the few traversal stutters they experienced were not enough to drop the average below 60fps. Misleading tech journalism, what else is new. God forbid a game, within 12 hours of launch, doesn’t have every single stutter ironed out.
He uses a sh*te old cpu with ddr4 what do you expect. He keeps defending this in every article, it’s becoming a bit of a joke at this point.
I used to do this also in my videos. I would ignore all the heads in the comments telling me it was CPU bottleneck and would defend to the bone my aged CPU. Until I finally upgraded and did a comparison video and noticed a 15-20% increase in FPS. DUH!!!
I’d rather play the original at 4k 60 with mods on an RX 580. No remake so far is as good as REmake and that ran on a GameCube.
Game developers are not creative and programmers are lazy as f*k. All these improvements on GPUs brought by RTX were focused on them: unnecessarily demanding ray tracing so that they don’t have to bake lighting, upscaling so that they don’t have to optimize as much…
PC gamers these days are begging for devs to implement their favorite upscaling algorithm in their games. Only a handful of games will use the tech they paid for.
What happened to buying new HW and getting improvements across the board in all games? That’s what PC gaming is all about.
Wait till the GeForce RTX 6090 EX Plus Alpha Super Ti hits one week after the GeForce RTX 6090 EX Plus Alpha is released. With 10% more performance.
It’s just the intro scene; after that I’m getting 70-80fps.
4K maxed, 4090 and 5800X3D. It varies with the scene; the more opened-up areas run in the mid-60s at the lowest after the beginning. But with DLSS on Quality it runs great and I can’t tell much difference from native. I will be picking up a 7950X3D when it comes out. I didn’t expect this game to push the system this hard; I’m getting the highest CPU temps of any game I’ve played.
https://www.youtube.com/watch?v=J0LOWFhDAF4 check this graphics mod john.
4090 Ti will solve this.
Maybe so but Nvidia will probably set The MSRP at $2,000 and retailers will try to gouge even more.
CP 2077 also does not run at native 4K/60 FPS. Even less so with RT on and no DLSS.
Is it using some DRM that could be hurting the framerate like Denuvo?
Denuvo tanks CPU and memory performance and even the SSD. It’s an invasive DRM that every now and then checks the installed game’s code and the disk it’s on and communicates this info with an external server, so it unnecessarily adds workload to your PC when this extra power should be used to run the game properly. Add to this a game that is badly optimized to begin with and you get a game with a 5-to-6-year-old 3D engine, made exclusively of closed corridors, that can’t exceed 60 FPS at 4K with the most expensive and powerful GPU around!
If anyone wants to take this website seriously these days, he needs a 13900K with DDR5-7200 and to compare those results with his 9900K and DDR4 on the 4090. Only when he proves that he gets the same stutters or bad frametimes will anyone trust these articles ever again. I know the result, but some deluded people need side-by-side benches to be proven wrong, and this is all bad info.
If you see comments being deleted on a website like this, you know someone is in deluded damage-control mode and can’t cope with the truth.
Are you sure they are being deleted?
They might be just awaiting approval.
Nope. Me agreeing with someone that his setup makes results worse than other people’s, and then him leaving a comment calling that person out as a kid; I said he was the deluded kid and it was deleted, so yes, he is deleting comments, which tells you all you need to know. The 9900K defence deluded force en masse. What makes me laugh is he has all kinds of comments that are racist, homophobic etc. and they are fine, but call his results out when it’s obvious and it’s delete-F’ing-central.
Name calling is allowed here. It’s part of what the majority voted for here in October.
Also John is right. You clicking on this article just helps to pay the bills on this site.
They were awaiting approval because he used “specific” words that were filtered. He would have known that if he was regular here. Ironically, he thinks that by insulting me he’ll get something? It will make him feel better? He fails to realize that by doing that, he increases our pageviews which is a win-win scenario for us (he even read other stories 😛 )
I always come here for news every day. I only started commenting because your game perf articles with that CPU are becoming a joke and I feel the need to comment on such daft things.
5950X and RTX 3090, 4K, all max settings and DLSS Balanced; the game runs perfectly smooth at 90-100 fps.
The obstinacy of testing games without DLSS makes no sense. I don’t notice a loss of quality between having it active and not having it.
https://imgur.com/a/EjlHOZP
DLSS was invented as an excuse for game developers not to put effort into optimizing their games.
F*k this game… if a 4090 can’t give you 60+ fps at NATIVE 4K, it means this sh*t is optimized with horse piss. DLSS’s main GPU target is the mid-range and lower; I’m not paying 1500+ bucks to use an upscaling technique but to have the most powerful raw performance on the market!
The RTX 4xxx series is a disaster. Nvidia will probably release an RTX 4060 Ti (which is really a 4050 Ti) with an MSRP of $550.
Nvidia has dug their own hole and fallen into it.
I don’t know what you’re running for CPU, mobo, RAM and NVMe but I’m achieving over 80-90 frames w Ultra settings and DLSS2 at 4k resolution w MSI Gaming Trio RTX 4090
This is why you don’t pay $1,700 for a graphics card. People have built this up like it’s a god among men, but the reality is it’s just an overpriced graphics card. Just buy something cheaper and call it a day so you won’t nerd-rage when you realize you can’t play everything at 4K 60 fps. This will be the first of many games where that’s true, I can guarantee it.
Dead Space runs fantastic at 1440p on my 3070 completely maxed out and it looks absolutely gorgeous; even my Steam Deck can run the game with its TDP limited for zero fan noise and long battery times, so something’s definitely wrong on your end, i.e. not the expected performance profile. Don’t get this article, it just sounds like you’re mad about Callisto getting sh*t and Dead Space getting praised xd. If people like Dead Space more and think it’s a visually more appealing game both technically and artistically, then why do you have an issue with that?
Game does indeed run very well, and also looks great. 1440p maxed and DLSS on quality , I get a constant 60fps on my laptop 3080ti. I did notice some very slight frame drops when there are multiple enemies on screen and you are slaughtering them.
This information is inaccurate. Getting perfect 60 frames on a 3080 TI paired with a 10900k and 64gb ddr4 running off a normal SSD
Digital Foundry used to use old CPUs with modern GPUs and mentioned huge stutters which people with modern top-end CPUs didn’t have. Now that they have better CPUs and still test with some of the older CPUs they have, they’ve proven that older CPUs have much larger stutters and issues. No one pairs a 4090 with this crap-a*s CPU. It’s been proven, so again, stop being so deluded and upgrade; you are just getting me irritated now.
Problem IMO is the insane level of “diminishing returns.” None of these recent demanding games look anywhere near good enough to justify their performance. Does Forspoken look twice as good as Horizon? Heck no. Does Callisto look twice as good as RE2/8? Heck no.
Obviously it’s different engines and teams but from the consumer perspective of facing down a $1k GPU purchase for such minor visual upgrades it’s annoying and feels dumb.
Sh*t take on stutter protocol. This game being hard to run does not excuse that pile of sh*t
Garbage game and garbage remake, it doesn’t even look good. It should run no problem on a 3070… Devs are so incompetent it’s not even funny anymore; they just force us to use DLSS 3 or suffer abysmal FPS no matter what hardware we have.
I fired up the DS remake for the first time last night and I must disagree. The lighting in DS is much better than Callisto, which has a washed-out look in comparison. I too am using a 4090, and a 48″ CX as my monitor for the last 2+ years. Using an OLED, the black levels in DS are clearly defined and I find I could see detail much better in DS than in Callisto. With maxed settings and RT lighting turned on I was getting the same fps you mentioned, but with DLSS turned on and set to Quality I see 80+ fps. The quality of the graphics is not reduced by turning on DLSS in any way that I could see. The fact that DS has endless dark environments yet the lighting and visibility are still sharp and clear is really quite good. The camera is a bit narrow and too close at times, but that’s the horror game recipe. Maybe Widescreen Fixer will release an update for this game and give us an easy fix. Cheers.
A GPU like the 4090, with 83 teraflops and 9x more powerful than a PS5: if a GPU like this can’t do 4K60 on a f*king corridor game like this, it doesn’t mean it’s a bad GPU.
This is a SH*T game with a sh*t engine; the Frostbite engine is already very well known for being an engine that is really hard to work with and optimize for.
This is a sh*t engine, and that is all. As an owner of a 4080 and a 4090, I will NEVER buy games like this, with no optimization at all.
I was planning to buy this game on day one, but now my money will go to Atomic Heart. F*ck the Frostbite engine and f*ck sh*t optimization.
What are the other specs of the PC? I run the game just fine at native 4k ultra. 70-90 fps on my 4090.
I’m literally running the game at 4K Ultra quality, 120Hz, on my LG OLED G2.
This is kind of misleading. I have a RTX3080 and I’m playing the game at 4K, with everything on ultra, except RT, DLSS on balanced and I’m getting 70+ fps, never going below 60fps on my 5 hours played so far. The game doesn’t feel unoptimized at all.
Because you use DLSS bro
Huh, been playing on a 4080 and it seems to be running fine 99% of the time for me on 4k Ultra, even TAA on high. Occasional stutters, maybe like once every couple hours.
No one has said anything about Denuvo hurting the performance…
I can’t quite believe it, it’s odd. I’ll help:
Denuvo bad!
Bad bad bad!
Erm. Doesn’t everyone use DLSS or similar anyways? I have a 4090, on Ultra settings and RT it was literally getting capped by my monitor as the FPS were 120-144?
Bear in mind the PS5 runs this game at 30FPS (upscaled/not native, with lower quality settings overall), so I’m unsure of the point of this, as I haven’t seen any system churn out high FPS at NATIVE 4K.
one word : D E N U V O !
Lol, but I think it will cost 4999.99.
The question is, who tf besides Elon and Bill Gates plays at 4K??? Normal people play FHD or QHD (1K or 2K).
The Steam Hardware Survey says about 2.5% of gamers are using 4K and it’s ranged between 2% and 2.5 % for years and years. So in answer to your question only a very, very few are on 4K. It’s not the cost of the 4K monitor. Those have come way down in price. It’s the cost of a GPU to run 4K that holds it back from anything close to mainstream.
But even then, the whole of these companies’ marketing AND the reviewers are geared towards it.
It’s… interesting let’s say.
$2000 so you can’t run Dead Space remake at 4K 60 fps
Dead Space has better: physics, particle effects, lighting and more world details.
Callisto Protocol has: absolutely no physics, half rate particle effects, worse lighting and less world details.
But overall both games are great!