
NVIDIA RTX 4090 runs Atomic Heart with over 85fps at Native 4K/Max Settings

Focus Entertainment has just released Atomic Heart on PC. The game uses Unreal Engine 4 and, unfortunately, it launched without its advertised Ray Tracing effects. However, the game can at least run smoothly on modern-day PC configurations. At Native 4K and Max Settings, our NVIDIA GeForce RTX 4090 was able to push over 85fps at all times.

In order to capture the following gameplay footage, we used an Intel i9 9900K, 16GB of DDR4 at 3800MHz, and NVIDIA’s RTX 4090. We also used Windows 10 64-bit and the GeForce 528.24 driver.

Atomic Heart allows PC gamers to compile its shaders before beginning the single-player campaign. To avoid any shader compilation stutters, we suggest letting the game compile all of its shaders first.


The game also supports DLSS 2/3, as well as FSR 2.0. Moreover, and contrary to reports, the game doesn’t appear to have any mouse acceleration or smoothing issues.


For the most part, the NVIDIA GeForce RTX 4090 is able to run Atomic Heart at over 100fps at Native 4K with Max Settings. However, the Scanner proved more demanding: during that scene, our framerate dropped from 120fps to 85fps (with the GPU being used to its fullest).

Atomic Heart - First 10 Minutes - Max Settings - Native 4K - NVIDIA RTX 4090

Our PC Performance Analysis for Atomic Heart will go live later this week. Since the game does not have any built-in benchmark tool, we’ll be using the Scanner scene. After all, that’s one of the most demanding scenarios early in the game. For those wondering, we also did not experience any major stutters.

Stay tuned for more!

68 thoughts on “NVIDIA RTX 4090 runs Atomic Heart with over 85fps at Native 4K/Max Settings”

  1. I’ll wait to see if they add RT back before playing it. They were one of the initial studios making a big deal about RT when the Nvidia 20 series GPUs first launched, so it kinda sucks to see them pull a complete 180 on that, especially because all of the reflective surfaces in some parts of the game would look way better without SSR’s awful occlusion artifacts.

    1. I’m sure they are just optimizing. Nothing can probably play their vision correctly. In two years we will all be able to enjoy this game the way it’s meant to be played, as NVIDIA says.

      1. The 3080 already ran the old unoptimized benchmark pretty well. Given that they’ve supposedly worked on this game with RT in mind from the beginning, and with Nvidia’s modern advancements in boosting performance, it’s WEIRD that they all of a sudden dropped RT. Wccftech said the review guide even mentioned the game having RT. Sooo, I dunno. Hopefully it does eventually make it into the game. Guess I’ll just start Hogwarts after I finish the Metroid remake.

      1. The only good thing about fernando-bot is all the quality gifs he gets memed with as replies to all his banal comments.

      1. Ever hear of DLSS, and now FG on DLSS 3, Mr. IDIOT!

        Take a damn look at how friKKIng awesome it looks. Then grow some braincells, moron.

          1. Nah, I just want real performance. Not every game I play has DLSS, so that won’t work. I know the 4090 delivers it for the most part, but it’s also not super impressive. Impressive is 120-160 fps WITH RT; 35-55 fps is not!

            Watch the benchmarks with raytracing (you know, the tech Nvidia promotes with those new cards). It’s not looking good, and it looks even worse if you go down to the 4080/70. Also, who would spend 2300 dollars on a single graphics card? ’Cause that’s the real in-store price. Next year that card will be dirt cheap, just like the 3080 Ti is now. Sure, you can go feed Nvidia’s greed; us smart people will know what to do. Google how many of those are sold if you don’t trust me. The GPU market is in a BAD state, go help Nvidia and buy some more 4090s.

          2. Can I use the new features in old games? No? Ok, then I’m right. I want raw performance. The net also needs fewer people like you, so yeah, ignored.
            Before I go, lemme just say this: I happen to have a good, cheap 8K TV that I wish I could use someday for older games. Those old games do not even have FSR, let alone DLSS. Heck, not every single new game supports them either, so REAL RAW performance is key to me. Even at 4K, the current cards are not enough without DLSS. That is a fact, especially if you add RT to the mix. What do you think, PC gaming only exists after 2022? What about 2017? 2018? I have plenty of games that I want to run at max graphics, 4 or 8K. BB now, you won’t be missed on my internet.

          3. Actually, the GPU market for Nvidia is looking really good right now, just not for gaming but for AI applications like ChatGPT and Google’s and Bing’s AI-enhanced search, which will require over 1 million H100 units to build out a worldwide system. At $33,000 a pop, that’s a lot of money, and it’s essentially the same chip a 4090 uses, only 8 of them per board, with a sh*tload of HBM3 memory onboard and the ability to string several of them together into clusters.

            All a 4090 really is now is something Nvidia makes from H100 dies that just can’t make the cut. Apparently the yields are good enough that there aren’t really that many such units they can make, and they aren’t going to waste dies that could be used in an H100 on a 4090, hence the price is intentionally high.

          4. Thanks for the thumbs down, I’ll be laughing about it all the way to the bank when I sell the 1000 shares of Nvidia I bought in October for $120/share …… I could sell it tomorrow and net a profit of $86,000 ($206/share)

          5. Went up to $236 today for the very reasons I told you all about in the first post ….. That’s $30,000 profit in just one day …. and a lot of places are forecasting prices to go to $275 – $320 ……

            I started this run back in 2009 by buying 1003 shares of AMD for about $2,750 and held it until November 2021, when I sold it for north of $144,000 during the tech bubble. I put that money in the bank and sat on it until the market fully bottomed out last October, then used it to buy into Nvidia when everyone else was predicting a bad 2023 for Nvidia. Why? Because I’m a senior Electronics Engineer with a deep understanding of technology, and I read a lot of trade journals. Two things that made me realize Nvidia was going to be a good buy were the Nvidia/Mercedes-Benz deal and the information coming out about some big breakthroughs in AI at AWS, Google, and Microsoft, coupled with the news about the H100 coming out in Q1 2023, which doubled the performance of the last-generation A100, which already had good sales despite the economy.

            Unlike last generation, where despite having the same basic specs (CU count, tensor core count, etc.) the 3090 was on Samsung’s 8nm node while the A100 chips were on TSMC 7nm, this time the 4090 chips and the H100 chips are both on TSMC’s 4nm-class line, meaning that many dies that can’t make the cut for the H100 can be used in the 4090. But judging from the current stock of the 4090, Nvidia is having really high yields, and they aren’t going to waste H100-quality dies on eight 4090s that go for $12,800 when they can use eight of them in an H100 system that goes for $33,000 – $38,000.

            Don’t expect 4090 prices to drop anytime soon; it will likely go up in price due to scarcity, because there aren’t enough chips left over from the H100. There may not even be a 4090 Ti line, since it would have to use a fully enabled chip like the H100 uses, and if there is, it’s going to have to go for $3,000.

          6. I noticed, god forbid someone thinks differently LOL, and I do get it. It’s not a horrible thing to have, but the fake frames don’t work for old games that are not supported. Heck, many new games don’t have those features either.

            Also, 8K is starting to become a thing for older games, so higher raw performance is more needed.


  2. Finally a game that is optimized for PC. I tried it myself and I could not believe how smoothly it runs! Almost non-existent stutters. Truly amazing performance considering how good it looks.

  3. I can run it at ~95FPS, 4K native, max settings on a 3090 Ti + 5950X… and at an almost stable 120fps in most areas with DLSS (Ultra Quality). This is on the leaked build with no Denuvo. But the graphics settings are a little different, and a thing to note is that RTX does not work even when set to ON on that build… no wonder it’s not even available in the final release version.
    This game has great visuals and a good art style. In my test run of the leaked build, I really enjoyed the first ~45 minutes of the game. I’m likely going to wait for Denuvo to be removed before I play it.

    1. You’re talking about the quality mode of DLSS; that’s 1440p upscaled to 4K. Ofc you got a good framerate…
      My 4090 does 100-115fps at native 4K.
      Btw, I’m very sad about the release without RT.
      Hope it will come with frame gen, to play without upscaling, like Hogwarts Legacy.

      1. Lol, I thought I was the only one playing Hogwarts at 4K native, ultra settings, RT on, Nvidia DLAA with frame generation. Everything else seems like a compromise. Surprisingly, frame generation works pretty decently in that title at native res though, imo.

  4. I’m confused. Is the 4090 a brand new card launched recently? Because all these articles specifically focusing on 4090 performance make it sound like the 4090 is a brand new card which still requires testing.

      1. This is not the first article like this. It’s almost like they’re bragging about the fact that they have a 4090 and have to show it off every chance they get. New game comes out? Better show off the 4090 again. If this was about the game, you’d benchmark it with multiple cards, or at least test the most commonly used GPU, like the 1060, not a card that less than 1% of gamers actually have.

        1. I understand your point about a review, but this is not a review article, and it’s not someone’s personal blog. This is a publication, and I believe the reason to show off the 4090’s performance is to show how the game performs on the best card out there, so that people can get an idea of the absolute best this game can offer.

          1. It does not feel like the best card, when I’m getting more FPS with the same settings with a RX 6950 XT…

          2. Always told folks AMD had the best value for money; fanbois always drown the sense out of whatever someone says. The 6950 XT is a good card and on par with the 3090 Ti, but I am not sure it’s as good as a 4090. Perhaps you misspelled there or something.

          3. It would be very hard to misspell my own GPU’s name. I’m getting over 100FPS on my RX 6950 XT at maximum settings, 4K, so this Nvidia benchmark is extremely disappointing.

          4. Ahh, so AMD performs better than Nvidia in this game. Lol, awesome. So much for a game that Nvidia and the dev paraded as an RTX showcase for years, only for it to perform better on AMD.

          5. There is a huge difference in fps depending on the location in this game. Maybe you tested in a lightweight location on the map.

          6. If we’re talking only about 4K, though, the RTX 4090 is noticeably faster than both the 6900 XT and 7900 XTX. The 7900 XTX can drop to 69fps when using the Scanner.

          7. I’m playing in 4K with a RX 6950 XT, getting way better performance than the RTX 4090 in this video.

  5. I hope max settings means with Ray Tracing enabled, cause I’m getting over 100FPS at 4K max settings with a RX 6950 XT.

  6. Yes, all those Russian and Chinese Steam reviews, which include trash-talking the US and Europe, are surely an objective review of this game. Communist clown.

      1. Get out of the West. Go buy a communist flag, stick it up your butt and take the first Tupolev to Moscow or Beijing. No one wants communist traitors here.

  7. It does 4K 60fps with GeForce Now, no DLSS. There’s some packet loss, however, so I probably should be using DLSS. I haven’t bothered to go higher since my TV only does 60fps.

  8. A $1600 card should be able to do traditional rendering at 4k with at least 120fps locked
    This is a disappointment for Nvidia

  9. That four-generation-old processor is bottlenecking the performance even at 4K. As for real-life performance, I’m getting 180-280fps depending on the scene with all settings maxed at 5K and DLSS 3 enabled, without any visible loss of detail or issues. I have a 4090 paired with a 13700K and RAM @ 6600MHz.
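As an aside on the stock-market sub-thread above: the dollar figures quoted there are at least internally consistent. A minimal Python sketch that checks the arithmetic, using only the numbers the commenter claimed (share counts, prices, and the $1,600/4090 and 8-dies-per-H100-board figures are all that commenter's assertions, not verified market data):

```python
# Sanity-check the dollar figures claimed in the comments above.
# All inputs are the commenter's own claims, not verified data.

shares = 1000
buy = 120          # claimed October purchase price, $/share
now = 206          # claimed current price, $/share
later = 236        # claimed price after the one-day jump, $/share

paper_profit = shares * (now - buy)
print(paper_profit)        # 86000 -> matches the "$86,000" figure

one_day_gain = shares * (later - now)
print(one_day_gain)        # 30000 -> matches the "$30,000 in one day" figure

# The die-allocation argument: eight dies sold as $1,600 4090s
# versus the same eight dies going into one $33,000+ H100 system.
print(8 * 1600)            # 12800 -> the quoted "$12,800"
```

Whether the underlying investment story holds up is another matter, but the numbers quoted in the thread do add up.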
