Focus Entertainment has just released Atomic Heart on PC. The game uses Unreal Engine 4 and, unfortunately, it launched without its advertised Ray Tracing effects. However, the game can at least run smoothly on modern-day PC configurations. At native 4K and Max Settings, our NVIDIA GeForce RTX 4090 was able to push over 85fps at all times.
In order to capture the following gameplay footage, we used an Intel i9 9900K, 16GB of DDR4 at 3800MHz, and NVIDIA’s RTX 4090. We also used Windows 10 64-bit and the GeForce 528.24 driver.
Atomic Heart allows PC gamers to compile its shaders before beginning the single-player campaign. To avoid any shader compilation stutters, we suggest letting the game compile all of its shaders before playing.
The game also supports DLSS 2/3, as well as FSR 2.0. Furthermore, contrary to reports, the game doesn’t appear to have any mouse acceleration or smoothing issues.
For the most part, the NVIDIA GeForce RTX 4090 can run Atomic Heart at around 100fps at native 4K with Max Settings. However, our framerate dropped to 85fps while using the Scanner. During that scene, the framerate fell from 120fps to 85fps (with the GPU used to its fullest).
Our PC Performance Analysis for Atomic Heart will go live later this week. Since the game does not have any built-in benchmark tool, we’ll be using the Scanner scene. After all, that’s one of the most demanding scenarios early in the game. For those wondering, we also did not experience any major stutters.
Stay tuned for more!

John is the founder and Editor-in-Chief at DSOGaming. He is a PC gaming fan and strongly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit era, and considers the SNES to be one of the best consoles. Still, the PC platform won him over. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC Graphics Cards.”



I’ll wait to see if they add RT back before playing it. They were one of the initial studios making a big deal about RT when the Nvidia 20 series GPUs first launched. Kinda sucks to see them pull a complete 180 on that, especially because all of the reflective surfaces in some parts of the game would look way better without SSR’s awful occlusion artifacts.
I’m sure they are just optimizing. Probably nothing out there can run their vision correctly yet. In two years, we will all be able to enjoy this game the way it’s meant to be played, as NVIDIA says.
The 3080 already ran the old unoptimized benchmark pretty well. As long as they’ve supposedly worked on this game with RT in mind from the beginning, and with Nvidia’s modern advancements in boosting performance, it’s WEIRD that they all of a sudden dropped RT. Wccftech said the review guide even mentioned the game having RT. Sooo, I dunno. Hopefully it does eventually make it into the game. Guess I’ll just start Hogwarts after I finish the Metroid remake.
They were busy adding Denuvo
Devs will always slack as much as possible and pass the cost to the consumer. This is even without RT on ffs…
Why optimize the game if you can just brute force it with an RTX card 🤡
Brilliant!
https://media3.giphy.com/media/wzYEqktr9NETC/giphy-downsized-medium.gif
Funny stuff. It’s Robo-Turd.
The only good thing about fernando-bot is all the quality gifs he gets memed with as replies to all his banal comments.
I’d say the best thing about him is that he triggers snowflake sand rats like you with no effort.
Who is this bot?
That is Robo-Fernando
That’s literally not impressive LMFAO
Yea, I thought so too. Imagine with RTX on… from 85fps to what? 35-45? Meh. Not impressive for sure.
Ever hear of DLSS, and now FG with DLSS 3, Mr. IDIOT!
Take a damn look at how friKKing awesome it looks. Then grow some braincells, moron.
Bla bla bla, rTARD. Fake frames, matey boy. Ignored btw, for being such a (again) rtard.
Nah, I just want real performance. Not every game I play has DLSS, so that won’t work. I know the 4090 delivers it for the most part, but it’s also not super impressive. Impressive is 120-160fps WITH RT. 35-55fps is not!
Watch the benchmarks with raytracing (you know, the tech Nvidia promotes with those new cards). It’s not looking good. It looks even worse if you go down to a 4080/70. Also, who would spend 2300 dollars for a single graphics card? ’Cause that’s the real in-store price. Next year that card will be dirt cheap, just like the 3080 Ti is now. Sure, you can go feed Nvidia’s greed; us smart people will know what to do. Google how many of those are sold if you don’t trust me. The GPU market is in a BAD state, so go help Nvidia and buy some more 4090s, and be a good ?
Can I use the new features in old games? No? Ok, then I’m right. I want raw performance. The net also needs fewer people like you, so yeah, ignored.
Before I go, lemme just say this: I happen to have a good, cheap 8K TV that I hope to use someday for older games. Those old games don’t even have FSR, let alone DLSS. Heck, not every new game supports them either, so REAL RAW performance is key to me. Even at 4K, the current cards are not enough without DLSS. It is a fact, especially if you add RT to the mix. What do you think? PC gaming only exists after 2022? What about 2017? 2018? I’ve got plenty of games that I want to run on max graphics at 4K or 8K. BB now, you won’t be missed on my internet.
Actually, the GPU market for Nvidia is looking really good right now, just not for gaming but for AI applications like ChatGPT and Google’s and Bing’s AI-enhanced search, which will require over 1 million H100 units to build out a worldwide system. At $33,000 a pop, that’s a lot of money, and it’s essentially the same chip a 4090 uses, only 8 of them per board, but with a sh*tload of HBM3 memory onboard and the ability to string several of them together into clusters.
All a 4090 really is now is something Nvidia makes from dies that just can’t make the cut for the H100. Apparently the yields are good enough that there aren’t really that many such dies, and they aren’t going to waste dies that could be used in an H100 on a 4090, hence the price is intentionally high.
Thanks for the thumbs down. I’ll be laughing about it all the way to the bank when I sell the 1000 shares of Nvidia I bought in October for $120/share… I could sell them tomorrow and net a profit of $86,000 ($206/share).
It went up to $236 today for the very reasons I told you all about in the first post… That’s $30,000 profit in just one day… and a lot of places are forecasting prices to go to $275 – $320…
I started this run back in 2009 by buying 1003 shares of AMD for about $2750 and held them until November 2021, when I sold for north of $144,000 during the tech bubble. I put that money in the bank and sat on it until the market fully bottomed out last October, then used it to buy into Nvidia when everyone else was predicting a bad 2023 for Nvidia. Why? Because I’m a senior electronics engineer with a deep understanding of technology, and I read a lot of trade journals. Two things made me realize Nvidia was going to be a good buy: the Nvidia/Mercedes-Benz deal, and the information coming out about some big breakthroughs in AI at AWS, Google, and Microsoft, coupled with the news about the H100 coming out in Q1 2023, which doubled the performance of the last-generation A100 – a part that already had good sales despite the economy.
Unlike last generation – where, despite having the same basic specs (CU count, tensor core count, etc.), the 3090 was on Samsung 8nm while the A100 chips were on TSMC 7nm – this time the 4090 chips and the H100 chips are both on TSMC’s 4N line, meaning many dies that can’t make the cut for the H100 can be used in the 4090. But judging from the current stock of the 4090, Nvidia is having really high yields, and they aren’t going to waste H100-quality dies on 4090s – eight of those go for $12,800 total – when the same eight dies can go into H100s at $33,000 – $38,000 apiece.
Don’t expect 4090 prices to drop anytime soon; the card will likely go up in price due to scarcity, because there aren’t enough chips left over from the H100. There may not even be a 4090 Ti line, since it would have to use a fully enabled chip like the H100 uses, and if there is one, it’s going to have to go for $3000.
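The arithmetic in this thread is easy to sanity-check. Below is a minimal sketch that only re-runs the figures the commenter quotes above – claimed prices, unit counts, and share numbers, none of them verified market data:

```python
# Quick sanity check of the figures quoted in the comments above.
# All numbers are the commenter's own claims, not verified market data.

h100_price = 33_000            # claimed low-end price per H100, in USD
h100_units = 1_000_000         # claimed worldwide AI build-out
print(f"AI build-out: ${h100_price * h100_units / 1e9:.0f}B")  # -> $33B

rtx4090_price = 1_600          # 4090 MSRP, in USD
print(f"Eight dies sold as 4090s: ${8 * rtx4090_price:,}")     # -> $12,800

shares, buy, now = 1_000, 120, 206    # the NVDA position described above
print(f"Paper profit: ${shares * (now - buy):,}")              # -> $86,000
```

The claimed totals are at least internally consistent; whether the underlying prices and yields are accurate is another matter.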
All frames are essentially fake…
They love their fake frames. If you speak against it, it will hurt their feelings.
I noticed. God forbid someone thinks differently, LOL. And I do get it: it’s not a horrible thing to have, but the fake frames don’t work for old games that aren’t supported. Heck, many new games don’t have those features either.
Also, 8K is starting to become a thing for older games, so higher raw performance is needed more than ever.
Ever hear of DLSS, and now FG with DLSS 3, Mr. IDIOT!
Take a damn look at how friKKing awesome it looks. Then grow some braincells, moron.
Hey, chill or GTFO!
F*ck of ni**er!!
Can’t even racist properly. You are a disgrace. Go back to your momma and learn how to swear!
LMAO…
Finally a game that is optimized for PC. I tried it myself and I could not believe how smoothly it runs! Almost non-existent stutters. Truly amazing performance considering how good it looks.
Finally, a game that doesn’t run like crap on its day-1 release.
I can run it at ~95fps at 4K native, max settings, on a 3090 Ti + 5950X… and at an almost stable 120fps in most areas with DLSS (Ultra Quality). This is on the leaked build with no Denuvo. The graphics settings are a little different there, though, and a thing to note is that RTX does not work even when set to ON in that build… no wonder it’s not even available in the final release version.
This game has great visuals and a good art style. In my test run of the leaked build, I really enjoyed the first ~45 minutes of the game. I’m likely going to wait for Denuvo to be removed before I play it.
You’re talking about the Quality mode of DLSS; that’s 1440p upscaled to 4K. Of course you got a good framerate…
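For reference, here is the resolution math behind that claim, as a minimal sketch. It assumes the commonly published DLSS 2 per-axis scale factors; the exact presets a given game exposes (such as the “Ultra Quality” mode mentioned above) may differ:

```python
# Internal render resolutions DLSS 2 typically uses for a 4K output.
# Scale factors are the commonly published per-axis ratios; individual
# games may deviate slightly.

OUTPUT_W, OUTPUT_H = 3840, 2160  # 4K output resolution

DLSS_SCALES = {
    "Quality":           2 / 3,   # ~66.7% per axis
    "Balanced":          0.58,
    "Performance":       1 / 2,   # 50% per axis
    "Ultra Performance": 1 / 3,   # ~33.3% per axis
}

for mode, s in DLSS_SCALES.items():
    print(f"{mode:>17}: {round(OUTPUT_W * s)}x{round(OUTPUT_H * s)}")

# Quality works out to 2560x1440 -- i.e. "1440p to 4K", as stated above.
```

So both commenters can be right: a ~2560×1440 internal resolution with DLSS Quality will naturally run much faster than native 3840×2160.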
My 4090 does 100-115fps at native 4K.
Btw, I’m very sad about the release without RT.
Hope it will come with frame gen so I can play without upscaling, like Hogwarts Legacy.
I get ~95fps at native 4K… yeah, RTX would’ve been nice.
Lol, I thought I was the only one playing Hogwarts at 4K native, ultra settings, RT on, with Nvidia DLAA and frame generation. Everything else seems like a compromise. Surprisingly, frame generation works pretty decently in that title at native res, though, imo.
Men of culture are more rare than ever.
Just play the dev build. No reason to support these games.
I’m confused. Is the 4090 a brand new card launched recently? Because all these articles specifically focusing on 4090 performance make it sound like the 4090 is a brand new card which still requires testing.
This is about the game, not the card. How can this be so difficult to comprehend?
This is not the first article like this. It’s almost like they’re bragging about the fact that they have a 4090, and they have to show it off every chance they get. A new game comes out? Better show off the 4090 again. If this was about the game, you’d benchmark it with multiple cards, or at least test the most commonly used GPU, like the 1060 – not a card that less than 1% of gamers actually have.
I understand your point about a review, but this is not a review article, and it’s not from someone’s personal blog. This is a publication, and I believe the reason to show off the 4090’s performance is to give people an idea of what the absolute best card out there can get out of this game.
It does not feel like the best card when I’m getting more FPS with the same settings on a RX 6950 XT…
Always told folks AMD had the best value for money. Retarded fanbois always drown the sense out of whatever someone says. The 6950 XT is a good card and on par with the 3090 Ti, but I am not sure it’s as good as a 4090. Perhaps you misspelled something there.
Would be very hard to misspell my own GPU’s name. I’m getting over 100fps on my RX 6950 XT at maximum settings in 4K, so this Nvidia benchmark is extremely disappointing.
Ahh, so AMD performs better than NV in this game. Lol, awesome. So much for a game that Nvidia and the dev paraded as an RTX showcase for years, only for it to perform better on AMD.
Their latest article just confirms what I said, go check it out.
There is a huge difference in fps depending on the location in this game. Maybe you tested in a lightweight location on the map.
As you can see from their latest article, AMD is performing better than Nvidia: https://www.dsogaming.com/pc-performance-analyses/atomic-heart-pc-performance-analysis/
If we’re talking only about 4K, though, the RTX 4090 is noticeably faster than both the 6900XT and 7900XTX. The 7900XTX can drop to 69fps when using the Scanner.
I’m playing in 4K with a RX 6950 XT, getting way better performance than the RTX 4090 in this video.
I run it at 100+ fps on a 6900XT at 3440×1440, no FSR, fully maxed out. A 4090 at 4K sounds weak as f*ck.
Same here but with a RX 6950 XT, so I can confirm AMD is being way faster than NVidia in this game.
I hope max settings means with Ray Tracing enabled, cause I’m getting over 100FPS at 4K max settings with a RX 6950 XT.
You are limited by the 9900K CPU.
denuvo
infested with bugs
awful optimisation
71% on Metacritic
5/10 GamesRadar
trash
Yes, all those Russian and Chinese Steam reviews, which include trash-talking the US and Europe, are surely an objective review of this game. Communist clown.
Is this your fantasy? I bet it is. You’re one of those who has a hardon for Putin, aren’t you?
Time to move to Moscow and get the hell out of the West, communist f*g.
https://uploads.disquscdn.com/images/46484b1e314d91dafe0516ad46978241a58b87c2157c70f35e62cccbc35d870c.jpg
Get out of the West. Go buy a communist flag, stick it up your butt and take the first Tupolev to Moscow or Beijing. No one wants communist traitors here.
This is now the most popular thread of this Russian game on Steam. It has become a US trash-talking competition, with Russians and Chinese competing over who can trash-talk the US best in reviews.
https://uploads.disquscdn.com/images/d6d6e8e36517618804d12a2b03e67f6be6cf56107a9826e8b62de8b2677fe973.jpg
It runs at like 80fps with my 3080 at 4K DLSS Quality. Why the hell wouldn’t you use DLSS?
It’s amazing, I run this game with DLSS performance and I can hardly see a difference compared to native
It does 4K 60fps with GeForce Now, no DLSS. There’s some packet loss, however; I probably should be using DLSS. Haven’t bothered to go higher since my TV only does 60Hz.
A $1600 card should be able to do traditional rendering at 4K with at least a locked 120fps.
This is a disappointment for Nvidia
Better quit your trolling job and get out of Putinstan before they mobilize you and send you to Bakhmut.
Okay, vatnik. Stay delusional! :)))))
That four-generation-old processor is bottlenecking performance, even at 4K. As for real-life performance, I’m getting 180-280fps depending on the scene, with all settings maxed at 5K and DLSS 3 enabled, without any visible loss of detail or other issues. I have a 4090 paired with a 13700K and RAM @ 6600MHz.