As we all know, Metro Exodus will be among the first titles to support real-time ray tracing. In addition, 4A Games' upcoming open-world first-person shooter will also support NVIDIA's HairWorks and Advanced PhysX effects.
PCGamesHardware was able to capture the graphics settings menu from the Gamescom 2018 build of Metro Exodus, which revealed these new options.
Obviously this does not surprise us, as 4A Games worked closely with NVIDIA on the previous Metro games. As such, it appears that this partnership will continue in Metro Exodus, with the development team implementing numerous NVIDIA features.
It will be interesting to see whether the NVIDIA GeForce RTX 2080 Ti will be able to run the game with real-time ray tracing and these NVIDIA features enabled, or whether PC gamers will have to choose some features over others in order to maintain a smooth 60fps experience.

John is the founder and Editor-in-Chief at DSOGaming. He is a PC gaming fan and strongly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best. Still, the PC platform won him over from consoles. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on "The Evolution of PC graphics cards."

This is great news for those planning to buy the RTX 4080 Ti in 5 years, I guess.
Bro, optimistic, right?! 🙂
Maybe in SLI?
Maybe. SLI is not getting the support it used to, but maybe now there are more reasons for better support.
Every Metro game has SLI support. 4A is a competent developer.
Competency and priorities are two very different things.
They are ONE AND THE SAME, as not all developers are skilled in coding multi-GPU rendering.
Actually no. The point I was trying to make was that just because a developer is capable of doing something, that doesn’t mean it is a concern for them. It’s all about maximizing your greatest bang for your buck with time and resources (profit).
That same logic has caused a shift in the gaming industry for the worse, with a bunch of minimal-effort, maximum-bang-for-the-buck devs. Fortunately, we still have CDPR and 4A Games (among a few others) who are not too lazy to incorporate multi-GPU support into their games. Microprocessor fabrication is gonna slow to a crawl very soon, which is why investment in multi-GPU again will be very important.
It is certainly a problem, and it is one they’ll have to sort out if they eventually want to stay competitive. That, or they’ll simply switch to an engine that has already done all the work for them, but there is still a cost to that too.
I was talking about general support for SLI.
5 years?! Clown, try two.
It took them 10 years to barely get RTX running at playable framerates, and you think they're going to master it and throw PhysX and HW on top in another two? Unlikely…
Dude… stfu.
You need your woobie?
No one is going to be running Exodus with RTX effects. The new cards are already a failure.
While Metro 2033 had a really intensive DX11 DoF, the Redux versions of both 2033 and Last Light, maxed out in DX11 with PhysX, are superbly optimized.
But the Redux versions look so much inferior to the originals graphically! No sh** they run so much better!
Besides a few missing volumetric lights, everything else looks better: textures, world and character details. Volumetric lights in the original Metro 2033 were very unoptimized (with volumetric light on screen, the FPS was like 3x lower than without it), so no wonder they tweaked that aspect for consoles, because console games, unlike PC games, have to be optimized.
Brace yourselves, fellow PC Noblemen. Filthy low framerates are incoming.
Or you could just disable them. Nah, that's too simple, right? You would rather complain.
Why can’t we do both?
Because it's stupid; you're supposed to complain about something you don't want or can't disable, that's the point of a complaint.
I think it's more to do with paying for a feature (GPU RTX cores) that is basically useless. You can turn it off, but you don't get your money back.
Then people should use their brains properly and consider whether it's for them or not, like any other product with features they will or won't use.
What about the AMD GPUs that didn't and couldn't use async compute for years? Loads of people bought a GPU part of which couldn't be used, and probably never used it. How about I sell you something part of which can't be used for years? LOL
You really think general consumers will have any idea what ray tracing is, or care about "better shadows and reflections in some games", when making their purchase decision?
The difference between this and async is that async didn't add anything to the cost. RTX seems to have added several hundred bucks to it.
Well, you're only going to get better shadows and reflections in games due to ray tracing, which requires extra processing power, which in turn costs money to research and make. I wonder why you can't even grasp these simple principles.
Why do you have to fall back to baseless insults/assumptions when you don’t have a real argument to make?
Shoving these costs down the throats of consumers is never something you should try to defend. Let the rich tech companies absorb some of that instead.
It wasn't an insult; you're not making sense with your framing of this as tech that you claim people don't want or don't want to pay for, you're just getting ideological. You also seem to assume this applies to everything. People buy things whose features they don't always fully use; if that bothers you, don't buy it, buy the lower-end models that don't have the features, or keep your older card.
"You can't even grasp these simple principles" seems like an insult to intelligence to me.
Just because you don't agree with a way of thinking doesn't mean it's wrong or fueled by a lack of intelligence. I wouldn't say I'm ideological, I'm just pro-consumer. In a better system we would have multiple GPU vendors and this tech would be subsidized by the companies, not the end user, just to fuel competition. In a monopolistic system like we have now, they move the expense to the user and there isn't a thing we can do about it, except support the opposition or go without (as you mentioned).
Saying "you can't even grasp these simple principles" is a criticism of your argument. Also, it's not a monopoly; NVIDIA has competition, they don't control the industry of discrete GPUs. Owning a high percentage of an industry doesn't make you a monopoly; a monopoly is where a business has full control of that industry with no competition, or destroys competitors getting a foothold, for example, Microsoft Windows.
Noblemen is a code word for laptop users?
We can disable them and get a better framerate. I think the main problem is that NVIDIA's next-gen GPUs simply suck at ray tracing no matter what, and should not be bought expecting to use ray tracing with max graphics at 4K and 60fps. Damn, even full HD at 60fps is not possible in all games (so why buy a high-end next-gen card?). Ray tracing is a big step in the right direction, but the upcoming RTX 2080 Ti is simply not enough.
Yeah, in the next three years or more we can play this at 4K60 with all these settings. I'm not upgrading my GTX 1080 Ti until 2021.
1080p/30FPS PC master race…
Advanced PhysX? OK, I like it, but like ray tracing, is the implementation worth the hit?
What makes you think it would be? Even the old Arkham games still take a huge hit with PhysX.
That's true, but that was not the case with the Metro games. Also, the remastered Metro games used a new PhysX SDK with significant performance improvements. I assume Exodus will use this new PhysX version too. I hope for SDK 3.4.
Never tried the remastered ones (felt too soon at the time). I’ll go back and check those out.
I feel like the game will be dogsh*t
Typically the market that would buy a 2080 Ti would be someone running a 4K monitor or someone running a lower resolution 120 Hz or 144 Hz monitor and wanting extremely high FPS. I will be surprised if either of those groups will be getting the GPU power to warrant a $1,200 purchase but we’ll see.
Might have to wait 2 years or so for a 3080 Ti and leave ray tracing off for now in some demanding games.
I am going to play this peasant style:
3440×1440
Ultra
No Nvidia Stuff
THAT’S THE SPIRIT!
Who would like to play games with conslow graphics at native 4K?
2019 will be the year of 1080p 30-60fps
So boys, don't throw away your 1080p monitors yet 😀
Not in my house.
I play CS:GO at 800x778 @ 144Hz, so a 1080p monitor is overkill; only peasants play at 4K, 1080p and 720p.
If they optimize the RTX effects, the 2080 Ti will run RTX games at 60fps even at higher resolutions. Just watch the Digital Foundry video regarding RTX, because they explain that it's possible to render the RTX effects at 1920×1080 or even lower resolutions and upscale them to the native 1440p or 4K resolution. So as long as the GPU can run the higher resolution without RTX at 60fps, it should also run it with RTX effects.
For me, 2K with RTX is good enough, but 60fps... oh god, I can't go down to 60.
What about the dumb AI?
There have been AI issues?
Holy hell, please tell me that's not the final list of graphics settings. If so, these devs are legit braindead; they already released two games with barebones graphics settings, and now they're gonna release a third. Reminds me of FromSoftware releasing Dark Souls 1-3 without keyboard prompts.
PhysX actually affects gameplay; HairWorks, on the other hand… not so much…
I’d still prefer the hair if I had a choice.
Many new games are using PhysX, but because most developers run PhysX on the CPU, not on the GPU, you don't even hear about it.
Because only GPU PhysX was advertised everywhere. On the other hand, CPU PhysX is a native part of Unreal Engine 4 and Unity 5. Many games based on these engines are using this API, but it is not explicitly presented. CPU PhysX is part of CryEngine, too. You have little knowledge about this, yet you are talking BS. That's ridiculous.
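For context, the CPU/GPU distinction being argued about here mostly comes down to which dispatcher the engine hands the PhysX scene. The following is a minimal sketch against the public PhysX 3.4 SDK, not code from 4A Games or any particular engine; names like the thread count are illustrative assumptions.

#include <PxPhysicsAPI.h>
using namespace physx;

static PxDefaultAllocator     gAllocator;
static PxDefaultErrorCallback gErrorCallback;

int main()
{
    // Core SDK objects (the same whether simulation later runs on CPU or GPU).
    PxFoundation* foundation = PxCreateFoundation(PX_FOUNDATION_VERSION, gAllocator, gErrorCallback);
    PxPhysics*    physics    = PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    // "CPU PhysX": the scene is given a CPU dispatcher, so rigid-body work runs
    // on ordinary worker threads. Thread count (4) is an arbitrary example.
    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(4);
    sceneDesc.filterShader  = PxDefaultSimulationFilterShader;
    // "GPU PhysX" would additionally need a CUDA context manager and GPU scene
    // flags; nothing in this setup involves the GPU at all.
    PxScene* scene = physics->createScene(sceneDesc);

    // Step the simulation for one 60 Hz frame.
    scene->simulate(1.0f / 60.0f);
    scene->fetchResults(true);

    scene->release();
    physics->release();
    foundation->release();
    return 0;
}

Which is roughly why "many games use PhysX but you never hear about it": a CPU-only setup like this has no vendor-specific toggle to surface in a graphics menu.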
Gimp power
So much obsession over RTT, while in every article only the Ti is ever mentioned.
Hint: you're gonna need the Ti to enable RTT; otherwise, just keep dreaming and hope for the next gen.
The Witcher 3, Killing Floor 2 & Fallout 4 have PhysX.
I'd rather have super smart and well-coded AI than better and better gfx. Eye candy is great, but it gets boring eventually.
I'd spend 70% of my GPU on AI, I mean real AI, not this brain-dead follow/hide/go-after-you/hide-behind-the-wall AI. I want really shocking stuff with good animations.
Give me Far Cry 3's gfx (not the best, but not the worst) with amazing AI and I'd take it over BF5 RTX GTX FTW BBT ULTRA GFX reflection/polish/shadows and lighting BS.
Yeah, I know I'm the only one. So go go 4K ultra settings, I guess… for 50 more years.
Except that none of those numbers were taken at actual 4K resolution, but from upscaled 1080p with ATAA "technology".
Really? So the GTX 1080 Ti gets barely 60 FPS in the mentioned games at 1080p? 🙂 I play many games with a GTX 1080 (without the Ti) in 4K, so you can believe me that the GTX 1080 Ti is much better than that.
It's native 4K, and there is no such thing as ATAA technology. However, there is DLSS, but it will not work in older games, only in a select few games.
All these games use PhysX:
Arizona Sunshine
Batman Arkham Knight
Borderlands The Pre-Sequel
Chronos
Dirty Bomb
The Witcher 3
Hatred
Killing Floor 2
Fallout 4
Lords of the Fallen
Warframe
Metro 2033
Metro Last Light
Life Is Strange
Project CARS
Everspace
Warhammer 40,000 Eternal Crusade