Bethesda has just lifted the review embargo for Indiana Jones and the Great Circle. The game is powered by the latest version of idTech, so it’s time to benchmark it and examine its performance on the PC.
For our benchmarks, we used an AMD Ryzen 9 7950X3D, 32GB of DDR5 at 6000MHz, AMD’s Radeon RX 6900XT and RX 7900XTX, as well as NVIDIA’s RTX 2080Ti, RTX 3080 and RTX 4090. We also used Windows 10 64-bit, the GeForce 566.14 and the Radeon Adrenalin Edition 24.10.1 drivers. Moreover, we’ve disabled the second CCD on our 7950X3D.
Indiana Jones and the Great Circle requires a GPU that supports hardware Ray Tracing. As such, the game could not launch at all on the AMD Radeon RX580, Vega 64 and NVIDIA GTX980Ti.
MachineGames has added a lot of graphics settings to tweak. PC gamers can adjust the quality of Texture Pool Size, Shadows, Global Illumination, Water, Hair, Volumetrics and more. There are also options to disable Chromatic Aberration, Film Grain, Motion Blur and DoF. The game also supports DLSS 3 with Frame Generation. However, there is no support for AMD FSR 3.0 or Intel XeSS. That’s a bummer, as it shouldn’t be that hard to support those technologies (after all, there is already support for DLSS 3).
I should also note that the game does not currently support Full Ray Tracing/Path Tracing. NVIDIA told us that the devs will add support for Path Tracing on December 9th. And although there is no Path Tracing yet, the game uses Ray Traced Global Illumination by default. That’s why it requires a GPU that supports hardware Ray Tracing. Plus, the final version will not have Denuvo (even though the review build we’re benchmarking has it).
MachineGames has not included any built-in benchmark tool, so for our tests we used the “Castel Sant’ Angelo” mission. This mission appeared to be more demanding than all the previous areas, so it should give us a pretty good idea of how the rest of the game runs.
During the first two hours, there wasn’t any scene that could tax the CPU. As such, we were GPU-limited the entire time, even at 1080p/Supreme. This is why we won’t have any CPU benchmarks (I may add them at a later date if I find a scene that can stress the CPU). Just take a look at the CPU core usage in the following screenshots.
All of our GPUs were able to push framerates higher than 80FPS at both 1080p/Supreme and 1440p/Supreme. Yes, even the NVIDIA RTX2080Ti can provide a smooth gaming experience at Native 1440p. This is great news as the game will run great on a wide range of GPUs.
As for Native 4K/Supreme, the only GPUs that were able to provide a smooth gaming experience were the AMD Radeon RX 7900XTX and the NVIDIA RTX 4090. The NVIDIA RTX 4090 is obviously in a league of its own, but the AMD Radeon RX 7900XTX was also able to push framerates over 69FPS at all times.
Graphics-wise, Indiana Jones and the Great Circle looks absolutely stunning in close-ups. The character models and the environments look amazing. RTGI also makes the game look incredible. Just look at some of the indoor screenshots below (which showcase what RTGI can do). This is why more and more games should be using RTGI from the get-go. At times, Indiana Jones and the Great Circle pushes some of the best visuals I’ve seen.
However, the game suffers from A LOT of pop-ins. Right now, there are lighting, shadow and object pop-ins. Lighting and shadows form right in front of you. This is something you can clearly see in the following video. Due to these pop-ins, the game may put off some PC gamers. I can’t stress enough how aggressive the LODs are in this game. This is something I really hope Path Tracing will address when it becomes available.
It’s also worth noting that some objects have really low-resolution textures. From the looks of it, a bug is preventing some objects from using higher-quality textures. Here’s an example. Take a look at those awful plants. This bug occurs on both AMD and NVIDIA GPUs. Bethesda is aware of it, and is working on a fix.
All in all, Indiana Jones and the Great Circle runs incredibly well on the PC. A lot of GPUs will be able to run it with over 60FPS at both Native 1080p and 1440p. And, thanks to DLSS 3, NVIDIA RTX40 series owners can enjoy it with over 140FPS at 4K/Supreme with DLAA and Frame Generation. Oh, and the game does not suffer from any shader compilation or traversal stutters. This is a silky smooth game.
The only downside is the awful pop-in that occurs right in front of you. I don’t know why MachineGames did not provide higher LOD options for the Supreme settings. It’s a shame really, because in the big outdoor environments the constant pop-in can become really annoying.
Enjoy and stay tuned for more!

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers SNES to be one of the best consoles. Still, the PC platform won him over consoles. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC graphics cards.”
Contact: Email

-not a game but a CGI movie
-overpriced
1/10
next
Free for Game Pass subscribers, at least.
Doesn't have DENUVO, so is free for all!
Weird, cause 6 hours in it's been 90% gameplay. Not sure how that's a movie…
Not even close pal.
Nice to see the 10GB 3080 still doing well.
It would be cool if there were a way to just lift the movie footage and have it interpreted as a level, instead of remaking the scene in a graphics engine. Hard to explain what I mean: like using the sets from the film etc. as an input to a generated game set, idk. This is just the opening scene of Raiders remade in UE5 as a level.
Vulkan FTW
This is awesome, means it's super optimized @ those FPS.
It's a walking simulator full of cutscenes with input prompts, similar to quicktime events. It has some fighting later on, but it's so clunky. This really should have been a movie instead; it doesn't work as a game, since the game constantly interrupts the storytelling. I'm surprised this got good reviews. It would have been a good movie, but it's not a good game at all.
I'm sure someone will upload everything on Youtube if you want to know the story, instead of spending money on a game that is barely a game.
Can't wait for Monday! Looks really good and thanks for your tests as always
there's no game here. paid 9/10 reviews. gaming is dying.
Can always tell the people who haven't played it. 6 hours in and it's almost all gameplay.
I was wrong. I’m also 6 hours in. I guess they didn’t want to spoil the hub areas and exploration this game has. As a huge Indiana Jones fan I am really satisfied.
So how much did you get paid to say that? After all you implied that all the good reviews were paid for …….
The problem with knee jerk reactions is eventually you give yourself a black eye ……
Did any IGN reviewer ever say "I was wrong, this game is not a 9, it actually sucks"?
These days when I see some big outlet giving a game a 9, there's good reason to be worried, because they are getting paid. That's why I do not trust them. There are so many examples in recent years where a game has an overall great critic score but is a mediocre title.
I only trust myself when it comes to quality of games I play.
My hopes for this game were really low. Watching the trailers, I thought it was going to be a 'modernized' version of Indiana Jones: a linear cinematic adventure with some puzzle areas and DEI LGBTQ BS. Thank God it's not, because when some woke individuals get their hands on cult classic franchises, they tend to ruin them. I guess the sane people at Machine Games and Bethesda guided the development down the right path.
It's a good game. So far combat and stealth are way too easy even on the hardest difficulty, but the exploration, level design and puzzles are great.
What is this garbage? Why does the video look like they tried to remake one of the movies in a video game engine?
Also, there's no "off" option for anti-aliasing, and they've renamed it "upscaling" instead of anti-aliasing. This trend needs to stop.
Because that is exactly the intent. It is the opening scene of Raiders of the Lost Ark. They are using it as the tutorial section for the game but it turns out Indy is only dreaming it. They did a really good job of recreating that scene from the movie.
Lock your frame rate to something like 70 or whatever frame spike you get and use lossless scaling. It's like a hack for games that have frame drops or stutters.
This game looked like it was going to flop but now I think I'll give it a try, combat looks fun.
Dagoat likes a "modern game" …… A clear sign of the apocalypse ……
It's the first comment I have seen from "him" that didn't make me want to throw up in my mouth.
That's only because you take him too seriously. I think the dude is funny as hell ……
Well, modern games are trash for the most part but I do like a lot of them.
Being weird, I'm buying in just for the RT.
70fps native, or 200+fps with UI glitches and smearing artifacts all over the place on top of the increased input lag… hmmm… I guess not. At least not now.
RT always kills performance. Why force it into almost every game?
I remember text adventures. People said we don't need pixels.
I remember silence. People said we don't need sound.
I remember CGA. People said too much colour.
….
That's great. All of those things brought drastic improvements.
Raytracing doesn't look any better and tanks your performance.
Raytracing doesn’t look any better? Maybe you need to get your eyes checked. I have no problem seeing the difference.
I’ve got my 4090 capped at 90FPS even though my 1440p panel does 180Hz. I’ll keep the 90FPS cap when I install my 5090. Does RT tank my performance? Certainly not.
Raytracing is an outdated technology. CGI has long switched to photon mapping because raytracing can't do caustics and can't do volume scattering. Two essential things that are required to make lighting look realistic.
Raytracing is actually a retarded lighting model, since it can't do scattering without turning into a performance nightmare. The only thing raytracing is good at is reflections. The moment light is absorbed, scatters, refracts, it is useless.
This is genuinely fascinating. I wondered what comes after real-time path tracing; photon mapping may be the answer. Going to read up on this this weekend. Thanks!
Photon mapping (got to love the naming e.g. Bounding Volume Hierarchy or just bucketing as we used to call it) is just a quick approximation for GI. Ideal for static scenes, as it relies heavily on caching for performance but then you could just pre-render.
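For the curious, the two passes of photon mapping can be sketched in a few lines of toy Python (purely illustrative and my own simplification: one point light, one floor plane, and a brute-force sort for the gather step where a real renderer would use a kd-tree):

```python
import math
import random

def emit_photons(light_y, n, seed=1):
    """Pass 1: shoot photons from a point light at (0, light_y, 0) in
    uniformly sampled downward directions and record where they land on
    the floor plane y=0. Each record is (x, z, power): the photon map."""
    rng = random.Random(seed)
    photon_map = []
    power = 1.0 / n  # split unit light power evenly among the photons
    while len(photon_map) < n:
        dx = rng.uniform(-1.0, 1.0)
        dy = rng.uniform(-1.0, 1.0)
        dz = rng.uniform(-1.0, 1.0)
        n2 = dx * dx + dy * dy + dz * dz
        if not (1e-6 < n2 <= 1.0) or dy >= -1e-3:
            continue  # rejection-sample a downward direction in the unit ball
        norm = math.sqrt(n2)
        dx, dy, dz = dx / norm, dy / norm, dz / norm
        t = light_y / -dy  # ray/plane intersection with y=0
        photon_map.append((t * dx, t * dz, power))
    return photon_map

def estimate_irradiance(photon_map, x, z, k=50):
    """Pass 2: density estimation. Gather the k nearest stored photons
    around (x, z) and divide their summed power by the disc they cover."""
    nearest = sorted(photon_map,
                     key=lambda p: (p[0] - x) ** 2 + (p[1] - z) ** 2)[:k]
    r2 = max((p[0] - x) ** 2 + (p[1] - z) ** 2 for p in nearest)
    return sum(p[2] for p in nearest) / (math.pi * r2)
```

Near the light's footprint the density estimate comes out much higher than far away, which is the whole trick: the expensive light transport happens once up front, and shading afterwards just queries the cache.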
What definition of retarded are you using ‘dude’?
Yes, RT takes a lot of work. This is why more and more of the specific calculations used are being grown in silicon. Of course you can render caustics using RT. I’m guessing you have yet to finish chapter one of the big yellow book.
RT obviously looks better a large majority of the time, however it doesn't usually look "better enough" to justify the insane performance cost. Most people insisting on using it have FOMO .
Assuming we are still talking about gaming here, gaming where immersion is the goal, then “insane performance cost” would suggest you do not have the correct hardware setup. I can only assume “insane cost of hardware” is what you meant which is not the discussion.
It has an insane performance cost even on a 4090. You could run a game at a much higher framerate without RT. You choose to use RT; I would rather have 120fps or whatever. It's not worth it to me. But people are different.
Well, you can run at 120FPS with a 4090. I often wonder, though, when people say they would rather have 120, what they actually mean. Do they mean they can tell the difference between 120 and 119, or is it that they spend their time watching a counter in the corner of the screen?
People are sheep, looking for an excuse to hate on what they can’t have rather than improve their situation and earn it.
I didn’t spend time searching but here’s a 4080 hitting 120FPS with CP2077. https://www.youtube.com/watch?v=5oOqt0PCSJg
Agreed, it seems people hate RT simply because they cannot run it on their current PC without a slideshow. From that perspective, it's better to pretend that RT makes no difference.
As for the RTX4080's performance in Cyberpunk with Path Tracing, those settings are twice as demanding as normal RT, so it's necessary to use AI features if you want a smooth experience. Personally, I don't mind using DLSS / NVIDIA FG given how well this technology works.
I get around 70 fps with PT at 1440p DLSSQuality, 100-120fps with FG. At these settings image quality looks amazing and the game is very responsive. At 4K however I need to use dynamic DLSS resolution (with the lowest res set to 1080p / aka DLSSPerformance) to get locked 60fps without FG.
I prefer playing Cyberpunk with standard RT (psycho RT), because PT has noise in certain situations (locations lit only by indirect lighting). Psycho RT still looks wayyyyy better than raster and runs 110-120fps at 1440p DLSSQuality, 140-170fps with FG. 4K DLSS balance is also very playable 70-80fps, and around 100-110fps with FG.
I’ve gone back and forth between the two modes in cyberpunk. Ray reconstruction needs some tweaks. Path tracing looks phenomenal but ray reconstruction bugs out heavily and smears. If you disable it, not only do you get lower performance, but also lower update rate on the rays as well as loss of some approximated bounces. Psycho RT runs faster and has no bugs, but in some scenes is lacking quite a bit compared to path tracing. So it’s really hard to decide.
Gotta wait and see what new features they’ll be launching with the 5000 series. Or at least hoping the higher refresh rate with the 5000 series also means faster updates and more accurate ray reconstruction as well. I’ve held off playing the expansion for it until then.
Yes, you are absolutely correct @socius:disqus , I noticed right away RR issues you mention and always played without it, even though certain PT effects looked better.
I’ll watch the video when I can. Certainly doesn’t line up with benches I’ve seen, Indiana Jones was just reported as barely hitting 60 on a 4090, but if you’re right then you’re right.
I don’t even really use RT at all though, because I want as high an fps as I can get.
There's a night and day difference between raster and RT, especially in sandbox games where you just can't prebake lighting for a dynamic TOD and changing lighting conditions. IMO the RT performance hit is justified, at least if you aren't playing on a low-end PC or an AMD GPU.
https://i.ibb.co/KrmJL3V/5b.jpg
https://i.ibb.co/SRQ0F6p/5a.jpg
https://i.ibb.co/yYvryT8/1b.jpg
https://i.ibb.co/5Mmv7pN/1a.jpg
https://i.ibb.co/C2TFj6F/2b.jpg
https://i.ibb.co/MCcQDYq/2a.jpg
https://i.ibb.co/Myzrvtn/3b.jpg
https://i.ibb.co/p4GC8Nf/3a.jpg
https://i.ibb.co/FXf28s3/7b.jpg
https://i.ibb.co/1Mmgp6z/7a.jpg
cyberpunk raster, the water surface and shadows
https://i.ibb.co/BfCfkb6/raster.jpg
https://i.ibb.co/8BVP1mr/raster.jpg
and now with Path Tracing
https://i.ibb.co/RcYRjMr/PT.jpg
https://i.ibb.co/wRZCCmw/PT.jpg
looking at the framerate counter you may say the performance hit with PT is too big, but the thing is, even RT at low settings still makes a massive difference, and then the performance hit is no longer so massive.
https://i.ibb.co/4sQjCdT/RT-reflections-shadows.jpg
https://i.ibb.co/VM8dR2p/RT-shadows-reflections.jpg
of course RT doesn't make a huge difference in all games, but even a light implementation still makes a noticeable difference. Without RT there would be no character reflections in this scene.
https://i.ibb.co/XLzbqLH/4.jpg
The reflections of the characters in the TV may be small details for some people, but I definitely noticed them. With 120-190fps at 4K native, I see no reason to turn off RT in this game, especially when RT adds to the experience at times.
https://i.ibb.co/y81z8S7/6.jpg
As for RT performance in general, many games from my Steam library run at over 60fps even at 4K native (for example, Metro Exodus at 85fps), and well over 120fps with DLSS. The most demanding ones like Alan Wake 2 or Cyberpunk (mainly games that use PT) require both DLSS and FG to offer 80-90fps, but this AI technology works so well I don't mind using it. BTW, there's only a 22% performance hit in Black Myth between PT and Lumen.
https://i.ibb.co/Db10v4Q/4-K-DLSSP-Very-high-FULLRT.jpg
photo mode
https://i.ibb.co/N1vNYyc/b1-Win64-Shipping-2024-09-01-00-06-20-759.jpg
https://i.ibb.co/BLMw5Ns/b1-Win64-Shipping-2024-09-01-00-07-52-582.jpg
gameplay
https://i.ibb.co/Bj7m69V/b1-Win64-Shipping-2024-09-01-00-25-08-987.jpg
as for UE5 games that only use Lumen:
https://i.ibb.co/1mZnTym/Hellblade2-Win64-Shipping-2024-09-02-03-03-45-853.jpg
https://i.ibb.co/Rj4SQ4K/Hellblade2-Win64-Shipping-2024-09-02-03-23-56-366.jpg
Frame gen absolutely does not work well at 30-40fps or whatever real frames. Maybe you don't notice or care about the latency but many of us do. It feels awful.
I personally would also argue that upscaling from a base resolution below 1440p-ish looks worse than any RT benefit most of the time, though way more people disagree with me there (which is fine, different strokes).
In the end RT is off the table for me because I want at least 90 real frames at a resolution above 1080p. Maybe on the 5090…
I don't have to use DLSS FG, but I see no point in not using it, since this technology works so well.
Nvidia recommends 40fps as a minimum for DLSS FG, and based on my testing that's accurate. If the game runs below 40fps, DLSS FG starts skipping frames and the game becomes unplayable. I tried playing Cyberpunk at extremely demanding settings with PT and without DLSS upscaling, and I had around 40fps. I could play like this on my VRR display, but my aiming wasn't as precise and smooth as I would like. As soon as I turned on DLSS FG, my aiming improved DRAMATICALLY because my eyes could track moving objects much more easily.
As for input lag, I played games like Quake and Unreal Tournament for over two decades, and I'm extremely sensitive to input latency. I can even tell the difference between 170fps with vsync and 170fps without vsync on my monitor. With vsync there's a feeling of weight to mouse movement even at such high fps. With DLSS FG, mouse movement isn't affected nearly as much. Sure, I can still feel the difference at base 40fps in Cyberpunk with FG on, but the difference is very subtle, and because FG generates so many frames, it actually improves my aiming, so there's no reason to play without DLSS FG even at such low fps. Most of the time I use FG when my average framerate is well above 60fps. Many games can run at 100fps on average but dip below 60fps from time to time, and it's better to use FG than to lower graphics settings. Thanks to DLSS FG I no longer notice sub-60fps dips.
Cyberpunk with "psycho RT" at 1440p DLSSQ 3.8 (the same image quality as native TAA) + FG. Look at input latency, just 27ms. I could play this game competitively with these settings and I would still have an advantage over other gamers.
https://uploads.disquscdn.com/images/d747012ad6aa35ed59d633b5ad9f8a42f65fe74721e6a11fce71c17abc360ad5.jpg
Cyberpunk with Path Tracing at 1440p DLSSQuality, FG and still very very responsive experience 48ms (for comparison 60fps games on the PS5 have around 80ms).
https://uploads.disquscdn.com/images/e7023b905228dc7205e3365223e589f97741beb471c1d2270ac3474ff00c06b8.jpg
What's interesting is that some games even have reduced input latency with DLSS FG; that's the case in Black Myth Wukong. I get 48ms input latency with FG because the Nvidia drivers activate Nvidia Reflex, and 60ms without FG because Nvidia Reflex doesn't work in this game without FG. There's no reason to play this game without Nvidia FG, even if someone buys the RTX5090 in the future.
Again I’m glad you like it, but it’s not like I haven’t tried it myself. I don’t even like native 60 anymore let alone 60 with fg on top let alone using fg to get 60. We’re all different.
I have a lot of backup on that score though, as most tech channels tend to agree it shouldn’t be used that low. I do think it’s rather good at making 90 or so fill out a 144hz monitor.
I do not blindly listen to the tech channels, I prefer to test everything myself and make my own opinion.
On my previous PC I tested AMD FSR FG and Lossless Scaling FG and I did not like the experience. I could definitely feel the increased latency even at a base 100fps. What's more, I also saw artifacts and judder (motion wasn't perfectly smooth).
Based on that experience, I did not expect to like DLSS FG, but I was pleasantly surprised. DLSS FG adds very little latency even at low base fps, as long as I play on a VRR monitor and turn off VSYNC (both in game and in the NV control panel). There are also games like Black Myth Wukong where DLSS FG actually reduces latency thanks to NVIDIA Reflex; this game is definitely more responsive with DLSS FG than without. When a game runs at around 100fps with FG, I'm 100% happy with the input latency. If a game runs at 80fps it's still a very good experience (especially on a gamepad), way better than console games running at a real 60fps with 80ms latency on average.
People like you who do not want to use DLSS FG can always buy the upcoming RTX5090.
It’s not like I didn’t test it and “make my own opinion” too my good man. Again I’m glad you like it. To me it feels like crap below 90ish real frames. Everyone’s different.
Ofc its worth it if you own rtx 4090
Metro Exodus also shows how big a difference RT can make. I will show screenshots from the original version (running at 4K TAA native), not the Enhanced Edition, because this "improved" version has washed-out blacks and generally looks much worse IMO.
https://i.ibb.co/CndkqLM/Metro-Exodus-2024-12-08-07-13-14-541.jpg
https://i.ibb.co/L0XwHgS/Metro-Exodus-2024-12-08-07-13-37-492.jpg
https://i.ibb.co/ngDCgBH/Metro-Exodus-2024-12-08-07-14-18-906.jpg
https://i.ibb.co/6vsgr0N/Metro-Exodus-2024-12-08-07-14-08-225.jpg
https://i.ibb.co/4ZW4XGh/Metro-Exodus-2024-12-08-05-26-03-740.jpg
RT GI vs raster
https://i.ibb.co/f0t6brx/Metro-Exodus-2024-12-08-07-37-45-541.jpg
https://i.ibb.co/WHq47KD/Metro-Exodus-2024-12-08-07-37-16-409.jpg
https://i.ibb.co/BNYr4WD/Metro-Exodus-2024-12-08-08-36-43-492.jpg
https://i.ibb.co/ykQVtBt/Metro-Exodus-2024-12-08-08-36-53-914.jpg
https://i.ibb.co/yFvSRr2/Metro-Exodus-2024-12-08-08-42-18-238.jpg
https://i.ibb.co/gr6bH9q/Metro-Exodus-2024-12-08-08-42-31-250.jpg
https://i.ibb.co/3FPMxh0/Metro-Exodus-2024-12-08-08-44-29-030.jpg
https://i.ibb.co/HDkCWn3/Metro-Exodus-2024-12-08-08-44-41-667.jpg
Thanks to RT GI, objects and characters in Metro Exodus are well grounded in the scene.
so far all ray tracing in games was tacked on; now this game and Avatar are designed with ray-traced lighting in mind, and it looks good.
I can agree with you. Adding it later seems less appealing than creating the game with it. Although, done correctly in an optimized game, both ways can look great.
To shove +800$ gpus down our throat.
Because some of us have 4090s.
That’s okay, but you should have the option to turn it off.
Easier on development when you can flip a switch instead of creating all the assets for each light source and object affected by it.
Exactly ….. It is easier to optimize using only RTGI and not having to also support SSGI at the same time. When you try to support both you have to make compromises and those compromises mean a game is less optimized.
On my PC I see a 22% performance difference between PT and software Lumen in Black Myth Wukong. IMO that's not a lot. I can also run most RT games from my Steam library at 4K native with over 60fps, not to mention with DLSS and FG (120fps+). RE3 Remake in particular runs quite well; I get 120-190fps at 4K native, and RT reflections look a lot better than SSR.
RTX 5090?
OC'ed RTX4080S. I think the RTX5090 will be twice as fast.
Works great on my 4080 at Max settings (steady 120fps) and on my son's 2080 Super on High/Ultra (60-80 fps)
At what resolution for your 2 PCs?
Because it is easier and more efficient to have just RTGI alone and not have to also support SSGI which cuts the efficiency and makes the game less optimized. By only using RTGI the game is better optimized than it would be if you tried to support both RTGI and SSGI
It's also a big reason why the game runs so well at 60 FPS on a Series X. At 4K it only lowers the resolution to 1800p, instead of the usual 1500p or 1200p seen with high-quality settings that normally only get you 30 FPS. There is no 30 FPS mode; it's 60 FPS all the way on a Series X.
I have observed this as well. The game is not really doing anything new. Is it getting good scores because Microsoft managed to make a game that wasn't a total joke for once?
doesn't look as good as the trailers, which explains the performance. It looks like something that could have come out 5 years ago.
It's one of those games that feels better when you're playing vs looking at a video. And RT path tracing is coming.
Good old LOD issues for another linear first-person walking simulator. Will these developers learn anything at all? They fail at making interesting gameplay, no f*kin doubt, and they still love to disappoint folks with their graphics presentation after all these years and all these hardware cycles. It's becoming very pathetic at this point.
"and the game does not suffer from any shader compilation or traversal stutters. This is a silky smooth game."
see how 90% of your problems are gone just by ditching UE5. now go ahead and defend UE5 (utter trash engine 5) some more…
It's not using ue5.
Who is upvoting all the ignorant posts?
That's what he's saying, he's happy the game isn't full of stutters because it's not using UE5.
Stop using 100% of your brain, you're opening a rift to the 5th dimension.
Why do they ask for 32GB of RAM? All I see is 13GB usage at max in your screenshots and YouTube videos.
i missed you dude post more often.
haha 😀 i will and thanks, missed you too buddy
Because background tasks from your operating system and other applications you might have running on the side could end up taking more than 3 gigs. If you had 16GB memory, it would likely crash because of total system RAM usage exceeding that. For most people, the next step up from 16 would be 32 as I don't believe 12GB DDR5 RAM modules exist yet (24GB exists, but running 1 stick of RAM is often ill-advised due to performance loss).
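The arithmetic behind that is simple enough to sketch (assumed figures: the ~13GB game usage mentioned above, plus a 3-4GB OS/background load that I'm taking as a rough estimate):

```python
def fits_in_ram(game_gb, overhead_gb, total_gb):
    """True if the game's working set plus OS/background usage fits in
    physical RAM without spilling into the page file."""
    return game_gb + overhead_gb <= total_gb

# ~13GB game usage, assume 4GB OS/background load
assert not fits_in_ram(13, 4, 16)  # 16GB total: would spill over
assert fits_in_ram(13, 4, 32)      # 32GB total: plenty of headroom
```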
It wouldn't crash, it would just push more and more to the Page File and then if it needs it back it would hit off the Page File slowing everything down. Windows doesn't waste memory meaning even if it shows it is only using 50% of available RAM it is using the free RAM to create things like disk buffers which makes things like Direct Storage work better which in turn it can stream high quality textures and meshes from storage to VRAM faster.
That doesn't make any sense. Most games nowadays run fine with 16GB while using 12GB+ of RAM.
Yeah, I forgot to take into account the pagefile, which most people probably have enabled on their systems as it's on by default in Windows. I personally disable it, so I've experienced crashes in the past when I used to have 16 gigs in my machine. What can happen when the pagefile is enabled is that stutters can occur, sometimes heavy stutters, when RAM spillover happens.
Better to enable it even if you have more than 64GB. Windows still writes files onto it; you might get a BSOD if it's disabled.
Machine GODS can't stop winning.
Great reviews all around.
Natshits in shambles.
Wokeness wins, YET AGAIN!
Freedom wins. Mainstream Woke is religion hidden behind civil arguments. They are exposed now. Can't unsee it.
Ignore him, he's just the resident bundle of sticks troll that tries to get under people's skin over here because he's got a bad case of the TDS.
So were the movies woke because they all had female sidekicks and an Egyptian best friend?
Anti woke mob are the biggest joke.
This game is getting decent review scores but the same people giving it good reviews gave Veilguard perfect scores so I think it could just be more shilling for the current thing.
So I won't be able to run this game on a 5700XT? This card performs on the level of a 2070 or 3060, but just because it doesn't support HW RT, I can't play the game at all? Damn. Are you sure?
This game surprised everyone, it is actually not woke. Even though it was made by woke people. The girl from the trailers didn't end up being a girl boss and does not steal the spotlight from Indy. The game is actually a pretty damn good continuation of the movies paying a lot of respects from the original 3 movies.
What an odd thing to say.
Seems like the 4090 is getting better with age. Like fine wine. ☺️❤️💯🇺🇲👍
1600$
that's weird, I have tons of CPU-induced stuttering (on a 7950X3D)
Other reports indicated FPS absolutely tanks on any GPU with only 8GB VRAM, struggling to get 30 FPS on 1080P low settings.
I don't know how the RTX 3080 tests were conducted; Supreme settings at 4K, even with DLSS, are a slideshow due to insufficient VRAM.
Someone must have a great imagination, or the RTX 3080 12GB version was used in the test? Supreme details for this card at 4K are a slideshow because there is simply not enough VRAM. In 1440p, however, already in the first stage of the game the framerate oscillates between 65-75fps, with drops below 50fps in certain scenes.