Final Fantasy 16 is coming to the PC tomorrow. This new FF game is using an engine developed by Creative Business Unit III. So, as we usually do, we’ve decided to test the game first on the NVIDIA RTX 4090. Can NVIDIA’s most powerful GPU push a constant 60fps experience at Native 4K/Max? Let’s find out.
For these first benchmarks, we used an AMD Ryzen 9 7950X3D, 32GB of DDR5 at 6000MHz, and NVIDIA’s RTX 4090. We also used Windows 10 64-bit and the GeForce 561.09 driver. Moreover, we’ve disabled the second CCD on our 7950X3D.
Final Fantasy XVI does not have a built-in benchmark tool. So, for our tests, we used the first Titan fight and the garden/palace area. These areas appear to be among the most demanding locations. Thus, these are the areas we’ll also use for our upcoming PC Performance Analysis.
At Native 4K/Max, the game cannot maintain 60fps at all times on the NVIDIA RTX 4090. There are areas in which NVIDIA’s GPU can push over 70fps. However, in the most demanding areas, the framerate can drop to the 50s.
Now the good news here is that FF16 supports NVIDIA DLSS 3. By simply enabling DLSS 3 Frame Generation with DLAA, you can get framerates over 80fps at all times. And, in my opinion, this is the best way to experience this new FF game on this high-end GPU.
Thanks to DLAA, you’ll get a better image quality. Then, with DLSS 3 FG, you’ll get a smooth gaming experience. I also did not notice any major input latency issues or visual artifacts. Controls and camera movement were responsive. So that’s another big plus for DLSS 3 FG.
As I’ve already reported, Square Enix has made some improvements to the PC demo of FF16 that were carried over to the final version. This means that the game won’t stutter as much as it did when the PC demo came out. There are still some stutters. However, most of you won’t even notice them (unless you’re constantly looking at the frametime graphs).
Our PC Performance Analysis for Final Fantasy 16 will go live later this week. Until then, here’s a video from the latest version of the PC demo I captured last week.
For those wondering, the final PC version looks and runs similarly to the PC demo. This means that the demo is representative of the game’s performance. So, make sure to download it if you want to see how it runs on your PC.
Finally, be sure to check out our Native 4K vs DLSS 3 vs FSR 3.0 benchmarks.

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best consoles. Still, the PC platform won him over. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC graphics cards.”




The retardation of recent years is becoming suffocating, really. It should push 200-300 with those settings.
Nope. The game looks amazing. There are some rough spots but you must be delusional if you believe that it should run with 200fps at Native 4K.
Amazing hehe, nice try little bro.
Freaking Witcher 3 still looks better, not to mention FF16 is average action game and terrible RPG.
witcher 3 boring.
Witcher 3 is one of the greatest games ever made if you’re actually into RPGs. Its graphics are dated now. But the game itself. The story. The choices. The depth. The white and not ethnic Triss. Just draw you in and let you live in that world for a while. One of the few games I’ve replayed more than twice. Kotor and suikoden 2 and ff7/8/9/10 are the only other RPGs I’ve replayed as many times. Hell I’ve spent more time playing Gwent than I have many other full games.
buy a better pc, game looks f*king awesome
Witcher 3 looks like sh*t compared to this game lol
You might be physically blind if you think Witcher 3 looks better.
https://uploads.disquscdn.com/images/3ec9409621ae8ac50767765cfeef9911c560369839c08d8e600b3c99185d8eb6.jpg https://uploads.disquscdn.com/images/661bdf0ce4ea48a31c17426a5a750be95209ff2052cc5125798ac0a2cb32c47b.jpg
The facial expressions, the character quality… it isn't even close. Most notably, the facial expressions in Witcher 3 look far worse. FFXVI has some of the best facial expressions I've seen in a game to date. Don't be confused by whatever resolution that second screenshot above was running at; I suspect something was hurting resolution, as the image isn't normally that blurry. Maybe a badly configured DLSS setting (or the bug some had with it in the demo before it was fixed). The lighting and material interaction on the character's clothes and skin blow Witcher 3 away.
https://uploads.disquscdn.com/images/23bbb9c960d1780f0381805a9d49e7e25388097c2dbf004066a85822a2565de0.jpg https://uploads.disquscdn.com/images/3782df226ac652431ee2295461c4d287e6a20e938b84d75a4c12e075f0d2ca99.jpg
Yeah, Witcher 3 loses by miles here and it really isn't close.
FFXVI has more detailed character models, and textures. The witcher 3 has much better lighting (dynamic GI, RT shadows, RT reflections), and draw distance.
https://uploads.disquscdn.com/images/55c7bb7fbdbeea5340dfa15cefd4dbc406491db20b52875ce5d1585ee28dbd18.jpg
You're showing primarily a sunset, which is part of a skybox or sphere and is not actual lighting, just a texture… The actual lighting on ground-based structures, the character, plants, etc. is very flat overall due to limited material interactions and a low-quality shadow implementation. The screenshots above show that better. In fact, the second Witcher 3 screenshot I posted above, on the stone street, shows just how bad the lighting is even with ray tracing.
I'd post more, but the demo is quite limited and the PS5 versions are frankly too blurry to bother with. Its sunsets look much better than Witcher 3's overall, though the strong color fade on the skybox is aesthetically neater in Witcher 3.
Witcher 3 has very poor shadows and has essentially no reflections for the most part, aside from some extremely low quality ones in mud that look almost like shadows.
There is no way Witcher 3 competes in draw distance and LoD. FFXVI is designed around going from small-scale to large-scale sequences and battles while maintaining detail. Even on the PS5 it beats Witcher 3 PC. The PC version is even more detailed in the few scenes we've seen from the demo, such as when Shiva fought Titan.
Despite the graphics downgrade Witcher 3 suffered, it was great looking in its time, but it looks very aged now. If I were to make any complaint about FFXVI's visuals, it is probably how some of the textures look really bad with the mixed realism/illustration hybrid art style and could have been reworked to better interact with the engine's lighting.
Final Fantasy XVI (FF16) is a linear game, while The Witcher 3 (W3) is a real sandbox game. If you see something in the distance in The Witcher 3, you can go there, unlike in FF16.
https://uploads.disquscdn.com/images/75e1e8835ac7971398e324759a1d187779e5ecd2d7c1113f5ea3309e2e4b5329.jpg
Grass draw distance is very impressive in the W3, and even a single blade of grass has a ray-traced, pixel-perfect shadow.
https://uploads.disquscdn.com/images/493fa2fbb15e1a03e700b972963bc6aa37afaec5bff28ffde56163d82e284150.jpg
https://uploads.disquscdn.com/images/ea0ac0249469e1842f5880435b134bb406c99b23f2c6b3862e64c37ccf2625bd.jpg
https://uploads.disquscdn.com/images/0f22f4e71466da34ec9cb03b670105bfd8a46a1ee86fef82275c9f8ec8503dde.jpg
https://uploads.disquscdn.com/images/a6a605ddf983c9fb3f9e57261267141e04870f3506800987aa1c7ed36c60b2af.jpg
https://uploads.disquscdn.com/images/2680c69965077262646e399c0cc741badaca7fb94adb1773942c2eba64a1e6ef.jpg
The W3 uses dynamic GI and AO, which create beautiful and very realistic indirect shadows (notice the darkening of shadowed areas).
https://uploads.disquscdn.com/images/510c1596f6906c76ac555448c4d60194a638a298c9154e2248a140409b82c8ca.jpg
RT also creates realistic color bounce, notice the green color of the grass on the stone.
https://uploads.disquscdn.com/images/56c09658c8d417992a550a77b972618881ebc6fe83311c921e1337a8e21836f2.jpg
The Witcher 3 also has RT reflections, so water does not have occlusion artifacts and reflections do not fade when you move the camera (as is the case in FFXVI).
FFXVI does not have any of these RT effects, because the lighting is prebaked and therefore worse. The W3's assets have aged quite a bit, but the lighting in this game is at least dynamic and follows the rules of light. The lighting in the FFXVI demo looked flat to me in many places (missing indirect shadows), and I cannot understand why this game is so demanding even though it does not use RT.
https://uploads.disquscdn.com/images/07568bf0629e7627858e6ef25233b3a2f97ab8f8a084ca98d45e9e6b34266065.jpg https://uploads.disquscdn.com/images/6b34278f30ceffd0e4122fac163d1f1e2dc6aec57ea508408529f22ba4782683.jpg https://uploads.disquscdn.com/images/5f0ef6d562c40122302a127018d376ebc50a23fed3196c13d02c893a2cdaca52.jpg https://uploads.disquscdn.com/images/1e127be740d843daa9923d620e9e906db018e0d538fd88362206bcdd646120b6.jpg
I wish every sandbox game had such "flat" lighting like The Witcher 3.
It is true that Witcher 3, being an open-world game (the type with near-total freedom from the start), necessitates lower-quality repetitive assets in bulk, unlike the extremely asset-rich, more linear games like Final Fantasy XVI. It is a balance that must be struck, understandably. FFXVI does have open areas that are fairly large, mind you, but most of the game being linear, as you say, lets them focus on asset quality and push visuals harder than most open-world games. AI will improve this with time, but yeah… realistically I can't fault Witcher 3 for it (especially given its release time and hardware).
I imagine grass has weaker ray tracing accuracy, and as it gets further out stops using RT entirely after a point to save resources (it may even use nothing, if not falling back to a non-RT solution at that distance). But yes, it is a bit expensive, so W3's RT is kind of heavy. Witcher 3 generally has great LoD at distance, but you will find FFXVI does as well later on in some of its larger sandbox-like areas. Granted, in those areas FFXVI, for obvious reasons, also falls back to more redundant, repetitive assets, just like W3.
Some parts of W3 have good shadows, some not so much. Lighting and shadows are a mixed bag with FFXVI sometimes being better and sometimes being notably flat similar to W3. Alas, W3 (RT aside) is on a much older engine so I can't fault it too much.
I don't see any green indirect lighting color bleed from RT going on here. That might be because your monitor doesn't represent the colors accurately enough, so maybe the colors look too similar. Not to say Witcher 3's RT implementation doesn't support it (idk if it does, haven't looked into it), but I'm not seeing it in this example tbh.
The water has an extremely low sample quality for its RT reflections though, so, imo, aside from distant large objects like the ship or trees, it looks relatively bad most of the time. Not a huge issue, since there typically isn't much of a body of water to care about, but yeah.
FFXVI's lighting isn't all pre-baked; quite a bit of it is dynamic (most of it, from what I've seen), such as its day/night cycle system and clouds, indoor lighting, particle light emissions, etc. I haven't seen any source of light that was clearly pre-baked, in fact. Not saying there won't be, but I haven't noticed it. Sadly, FFXVI doesn't have RT, I agree. It might have helped the scenes it feels flat in, where the illustration art style really holds the current lighting systems back at times.
I'd love to see witcher 3 reworked with AI improved assets one day because the game isn't a bad looker, overall, but just aged.
The reason it is demanding is that the asset quality is technically bloated, but I went into detail in the other post I mentioned to you, so I won't repeat it here; just read that one. It is a similar issue to Alan Wake 2 and Wukong, though, where they use upscaling to push higher-quality results than current hardware supports (but scaling is inefficient at those levels of quality, so it's bloated for less gain).
amazing?!!! What a joke. go enjoy your dlss
Relax Susan. DLSS is the future.
LoL
With a 4090? You betya I'm delusional.
A Ferrari costs 200k, has a top speed of 350km/h, looks and drives amazing, but still has no wings and can't fly. How dare they, with how much it costs?
comparing an Overpriced GPU to a luxury car lol.
Who said you can't know if someone is retärded over the internet?!
I dunno dude, this game's visuals look like a mixed bag to me. There are some parts that are downright stunning, but then you get other parts where it looks like a PS4 game. Sometimes even worse (e.g. just look at the ground textures in the beginning area of the demo, looks like a PS3 game almost). Alan Wake 2 and Cyberpunk 2077 without ray tracing run better than this while looking better too.
While expecting 200-300 FPS with DLAA at native 4K is a bit much, the fact that you're getting 70-90 FPS after applying frame generation isn't very good either. Those are the kinds of numbers you should be seeing PRIOR to applying FG, given the visuals on display. And it even dips into the 40s during cutscenes AFTER FG. That's utterly insane and frankly inexcusable.
You see 40fps during cutscenes because all cutscenes are locked at 30fps (even when you use the 30fps removal mod for the in-engine cutscenes, MSI Afterburner won’t report the right framerate of those cutscenes).
Man, can't wait for the benchmarks. I might need to upgrade my CPU/RAM for this game (Ryzen 3700X/16GB), but can't wait for a second run with better graphics. Got the platinum on PS5; the game already looked jaw-dropping in some places there.
And frame generation, using the most overpowered and power hungry connector-melting gpu on the planet.
DLAA is more demanding than simply native 4K due to the extra AA. Somewhere around 10-12% typically, I believe.
By the way, even an RTX 4090 does not get 200-300 FPS at native, or often even with DLSS + Frame Gen, in modern games. You would almost always be CPU bound before hitting anywhere near 300 FPS in most games that aren't especially resource light, and many mid-range to more demanding games won't even hit 200 FPS.
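The ~10-12% DLAA figure above is an estimate, but the arithmetic behind it is easy to sketch: a fixed per-frame cost multiplies frame time, so it divides framerate. A minimal illustration (the input numbers are assumptions for the example, not benchmarks):

```python
def fps_with_overhead(base_fps: float, overhead_pct: float) -> float:
    """Framerate after adding a per-frame cost of overhead_pct percent.

    The overhead multiplies frame *time*, so fps divides by (1 + overhead).
    """
    return base_fps / (1.0 + overhead_pct / 100.0)

# Illustrative only: ~70 fps at native 4K, with DLAA costing 10-12% extra.
print(round(fps_with_overhead(70, 10), 1))  # 63.6
print(round(fps_with_overhead(70, 12), 1))  # 62.5
```

In other words, a roughly 10% heavier frame turns a 70 fps scene into a mid-60s one, which matches the kind of gap commenters describe between native and DLAA.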
One thing the article writer did fail to cover, though, is that the DLSS Quality result in this game is so good, even with frame gen active, without artifacts, obvious blurring, ghosting, or any other issues, that there is quite simply no reason at all to use native or even DLAA.
DLAA isn’t supersampling. Technically you might be able to argue that DLDSR 2.25x at DLSS Quality, which still outputs at native resolution, could be considered supersampling.
F*k I'm bad with names and worse with acronyms. Thanks. I was going to respond confused at first but then realized I was typing DL… AA, aka anti-aliasing, f*k me.
Still more demanding than native, but not quite what I pitched for that first paragraph. :/
so practically an undercooked mess.
Idk man, it works really well for me. I have a 4080 and 7800X3D, and on my 1440p QD-OLED with max settings and DLSS Quality I get over 100fps the whole time; the lowest I've seen it is in the 120s, and the game looks amazing even with DLSS.
Ty for these, John. I think people seem ignorant of the fact that a lot of these new technologies, which they call "fake frames" or "fake pixels", are just part of the development and graphical toolset now that most people have access to them. It makes no sense for a developer who knows that every modern platform has access to a native-AA upscaling mode not to design with that in mind, and you can think the same for other features as the technologies advance and develop. Past that, it's really about how the developers implement them as tools to ensure we have the best experiences possible.
But they are fake frames, fake pixels. Their usefulness is debatable, but they are not, and never will be, better than native. We live in a dark age of gaming, where games are woke, blurry (courtesy of TAA), poorly developed, and outright unfinished at times.
lol bro you forgot your /s.
I avoided all of the scalers simply because the first game I played that NEEDED some framerate 'help' was Hogwarts Legacy.
the analogue nature of the world (Hogsmeade village wouldn't pass any civic structural legalities) means 'lots of grunt needed' to render the world.
Grass and rounded stair edges just didn't survive 'scaling' and so native resolution was preferred.
FluidMotionFrames2, using vectors and intelligent frame data, seems to have very few image issues, and so it is a game changer in three of my software titles (only three that were not 4k native 60)
I figured I would use (framedoubling) in FFXVI, but the normal motion blur written into the animation engine seems to make the double framerate technique not as analogue/'nice' as vanilla 4k which I'd be happy to lock to 30 frames per second and enjoy..
Having just played the PC demo on locked lower framerate- game was amazing.
going back to console 30fps felt horrible. even without the latencies that some 30 fps games have- something just didn't seem correct in a 'few' consoles games playing at that framerate.
Most PC games I wouldn't go 'so low', but if the gameplay suits existing in 'a prettier world' then that MIGHT be the way I'd go.
Tech: framerate doubling- can be amazing, and certainly allows middle of the road enthusiasts like myself to tweak aged and outdated hardware and get an incredible experience on modern titles.
The framedoubling, in Hogwarts/Tokyo Ghostwire/Witcher 3 is akin to buying a GPU upgrade two years from now, but having it today and 'for free'.
not all games suit, but when the game animation engines and framepacing lines up for a 'flawless experience' (some games), with minimised graphical issues (Raytracing looks better at doubled framerate, due to 'temporal resolution') really shine.
FFXVI in ultra with no scaling or tweaks and tricks is one of the more impressive 'pc tech demos' one can find.
BlackMythWukong free demo benchmark is consistent (excellent)
but
open world games like Hogwarts Legacy can make great hardware tests- pushing the rig hard.
developers having 'more tricks' and end users having 'more options' means more likely to find that perfect 'middle ground' where everyone can be happy..
Square Enix make a passable PC port challenge
(impossible)
Remember when devs optimized? Me neither. It's not surprising that the video game industry is a shtshow. They want you to spend $70+ on unfinished, broken games that need NASA computers to work, and they want you to beta test for them while the servers keep going offline and you can't play, like Test Drive Unlimited. Play old games; don't buy any AAA games until things improve.
John loves those sh*ty broken games. More often than not,he says they look amazing like he said here! I bet there are a lot more low standard gamers like him who don't know what the true meaning of amazing is .they think dlss is a fix not a feature oh god.
Dude, the industry came with this RT bullshït so they could sell graphic cards that cost almost half your house.
Moröns flocked the market to buy them so it's natural that devs will push such agenda and keep making unoptmized shït. We will buy it anyway, so why bother?
Played through the demo on the Steam Deck and decided that this game is definitely not my cup of tea.
Anyway, here's how anyone else interested in doing so can get the best experience out of FF16 on the Deck:
Use DLSS Enabler to activate frame-generation, which will translate DLSS3 to FSR 3.1 (the built-in FSR3 implementation is useless).
You can look up how to install it in various tutorials, just make sure to use the DLL injection method and additionally select these 3 options:
– tickbox for AMD & Intel GPUs
– tickbox for Linux/SteamOS
– tickbox to disable NVIDIA signature checks
On top of that, I would recommend the FFXVIFix John linked to on here; just note that he directly linked to version 0.7.1, whereas the newest version was 0.7.4 last I checked.
In order for the above to work, add the following launch argument under the game's properties:
WINEDLLOVERRIDES="dinput8,dxgi,version=n,b" %command%
If done correctly, you should be able to select DLSS3 in the settings, including frame generation.
Finally, make sure to set the manual GPU clock to its highest value of 1600 MHz and disable the built-in framerate limiter of SteamOS, which will ensure that input lag is kept to an absolute minimum.
This way, you can get a surprisingly fluid gaming experience on the Steam Deck with this demanding game, even on the current (or rather old) SteamOS 3.5 stable branch.
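As a quick sanity check for the DLL-injection step above, a small script like this can confirm that the three DLLs named in the WINEDLLOVERRIDES launch argument actually sit next to the game executable. This is just an illustrative sketch, not part of any official tool, and the example install path is an assumption; adjust it to your library location:

```python
from pathlib import Path

# The three DLLs named in the WINEDLLOVERRIDES launch argument above.
OVERRIDE_DLLS = ("dinput8.dll", "dxgi.dll", "version.dll")

def check_override_dlls(game_dir: str) -> dict[str, bool]:
    """Report which of the injected override DLLs exist in game_dir."""
    root = Path(game_dir)
    return {dll: (root / dll).is_file() for dll in OVERRIDE_DLLS}

# Example call (the path is an assumption -- the default Steam library):
# status = check_override_dlls(
#     "~/.local/share/Steam/steamapps/common/FINAL FANTASY XVI DEMO")
# Any False entry means the DLSS Enabler / FFXVIFix files are missing.
```

If any entry comes back False, re-run the DLSS Enabler installer with the DLL injection method before troubleshooting anything else.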
Excellent post (lots of tricks to hack stuff onto low powered hardware, "excellent")
Users of hardware with AMD APUs (CPU and GPU), which includes ALL the PC portables if I have read the market 'right'…
I'd encourage users to 'sign up' for the AMD Vanguard Program.
Beta driver access will grant the latest versions of FluidMotionFrames2 etc..
There was a public test released… but that was many months ago now, and I see the weekly progress made on drivers for the Phoenix/mobile APUs, so if I ran one of those portables, I'd definitely run AMD beta drivers to get the latest features onboard ASAP.
About three or four driver packages ago, it seemed like I would be able to put my AMD GPU in an external enclosure, hook it up to the portable PCs (e.g. the Ayaneo Flip has OCuLink etc.), and have the internal APU/'CPU and GPU' work WITH an external GPU, having the frame-doubling frames handled by the internal graphics 'card'..
settings to control that (optimise frames for quality and speed etc), seemed like 'a fun project'.
(I do not have a portable, or an external GPU enclosure,.. but I'd love to toy with one!)
anyhow , just a shout out to AMD Beta drivers (Vanguard Program)- they would be my 'go to' trick/cheat if playing on any of these portable.
The title makes it sound as if the game is so heavy that even a 4090 can't run it at a stable framerate. This is not poor optimization; this is technology and the use of a different game engine. Combating these kinds of challenges is why DLSS technology comes in. If you can play it at a stable 120FPS, then don't complain about playing it on low settings. Besides, the low settings already look good.
Except the game doesn't look that extraordinarily better to justify the performance cost.
A company's bootlicker?! Nah, must be the wind.
"Hey, your graphics card that cost as much as a vehicle isn't enough to run our geh simulator, but the fault is all yours and now you need to play on a console-like quality because our team is sooooo good and geh"
But…but…but Denuvo does not hurt performance.
It's funny reading the comments here.
One of the most visually stunning games of the PS5 generation, and people expect it to run at 4K60 native when it drops to 1080p40 on the PS5 (and to 720p60 in fights).
Considering the 4090 is around 3.3x the power of the PS5, they pretty much scaled the game perfectly to PC (they scaled it to 4x the performance of the PS5; the difference is probably VRS & CPU).
At least this is how it is native vs native, and people here are commenting it should push 300 frames xD
some people have short minded
"some people have short minded"
And some people can't spell or form a sentence. o.o
I think the RTX 4090 is even faster than 3x compared to the PS5. I get 282fps (4.7x 60fps) in Black Myth: Wukong with PS5-like settings, and my 4080S is 30-50% slower than the RTX 4090.
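Taking the figures in these comments at face value (they are commenters' estimates, not benchmarks), the scaling arithmetic can be sketched like this:

```python
def implied_gpu_multiplier(pc_fps: float, console_fps: float) -> float:
    """How many times faster the PC result is than the console baseline."""
    return pc_fps / console_fps

# Wukong example from the comment above: 282 fps vs a 60 fps PS5 target.
print(round(implied_gpu_multiplier(282, 60), 1))  # 4.7

def expected_pc_fps(console_fps: float, gpu_multiplier: float) -> float:
    """Naive projection: assumes perfectly GPU-bound scaling, no CPU limit."""
    return console_fps * gpu_multiplier

# FF16 example: the PS5 reportedly drops to ~40 fps; a '3.3x' GPU would
# naively project to ~132, which is why CPU limits and settings
# differences matter in practice.
print(round(expected_pc_fps(40, 3.3)))  # 132
```

The naive projection overshoots real results precisely because games are never perfectly GPU-bound, which is the point both commenters are circling around.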
You do know FF16 isn't a PS5 game, lol. It's a multiplatform game that they paid to make a timed exclusive.
Also, the game needed a few patches to improve performance and graphics on PS5 when it released. The big issue with Xbox and PS5 is the CPUs. The same goes for the PS5 Pro: while it may have a better GPU now, the CPU is the same as the PS5's, just slightly boosted, and this is why they added PSSR. They want something like NVIDIA has, so they can upscale the graphics and the console can then hit better fps.
Since you didn't understand, I meant that the game is made for the PS5 and its capabilities.
Horizon Forbidden West was made for PS4 and updated for PS5; FFXVI was made only for PS5, and it was heavy on the PS5.
RDNA 2 user, with approx 'twice' the teraflops of the console(s);
FFXVI (demo) runs 4k Native, but with FluidMotionFrames(2) needed to take it to 'V-Sync'
with everything set to max/ultra- the game blows away the console version (demo) that I also trialed.
This was using reference picture and sound (both calibrated)- the PC version was amazing, and I kept saying to my partner "I cannot believe this is the same game!"
the pc rig that the approx four year old GPU sits in?
ten years old. (with a modern 'firecuda' drive to help)
Game is OPTIMISED
When I first turned on the PC demo, I set everything to max, and turned off the framerate graphs and just enjoyed it.
Was 'too quick' (I couldn't believe what I was seeing), and the next day I was more than happy to inspect my framerates..
Game uses graphics card WELL
Skyrim soaks 130watts
Ziggurat ~180watts
The Outer Worlds ~220watts
Ghostwire Tokyo ~250watts
Hogwarts and Witcher 3 ~280watts
all maxed and with raytracing etc
Final Fantasy XVI peaks above 320watts on my GPU
it looks truly amazing and, again, it hardly resembled the console version.
sometimes just having much much more resolution can make a game seem different,.. but the ultra settings 'effects' bring a lot to the table.
Witcher 3 is impressive for loading a modern GPU (I still use framedoubling to make it 'fluid', when all Lightpathing is on..
Final Fantasy XVI, certainly on my 'ancient rig', showcases WHY PC gaming is still 'a thing'.
It isn't always about 'double the framerate'.
I actually think DLSS Quality is superior here. There really is no reason to use DLAA or native in this game when you have DLSS Quality because it is already so good that you aren't sacrificing quality and you can hit up to 140 FPS on a RTX 4090 with it.
Personally, I would prefer smoother image motion and more consistent framerate.
DLSS and similar technologies from other GPU manufacturers have become a convenient excuse for poor optimization in modern games. There's no reason an RTX 4090 shouldn’t consistently achieve 60+ FPS on max settings in Final Fantasy XVI—a game that, visually, doesn't even compare to Final Fantasy VII Remake, where I achieved over 120 FPS at 4K with max settings. The disparity in performance between these titles highlights how optimization issues are being masked by reliance on upscaling technologies instead of addressing the underlying inefficiencies.
Sometimes developers do use upscaling as an excuse for poor optimization, but if that were the case here, the game would run poorly with upscaling just like without it, with stuttering and frame-pacing problems, but it does not. In fact, it runs fine at native, as long as you don't expect some crazy high FPS, or you lower the settings a tad.
The issue is that some games are pitching higher-end visuals that current hardware simply cannot reach. It is similar to the 120 FPS vs 60 FPS vs 30 FPS issue consoles have been plagued by for generations. Alan Wake 2 is a fine example of this. So is the recent Wukong game, which has a Cinematic preset of the kind normally hidden because it's intended for CG use, not normal gameplay; if not for upscaling, the max setting would have just been Very High.
We're seeing this situation with Final Fantasy XVI, so it isn't an optimization issue going on here. The game scales extremely well from lower-end to high-end GPUs. It just pushes the visual envelope harder at the high-end settings, though its visual style can seem misleading due to its mix of realistic and illustration art rather than photorealism. The game actually uses a ton of high-quality assets, too, due to its strong reliance on extremely fast IO, so it often has much more demanding assets on screen than most games. It should be noted this means you can actually turn down the visuals some and still have close to the best, because as asset quality improves there are diminishing returns.
I have seen FFXVI gameplay on the RTX 3060. Even indoors (with nothing on the screen but a wall) it was around 35fps at 1080p :P. On my RTX 4080S I get 45fps at 4K native. Luckily, with the help of AI (DLSS Balanced + FG) I can get 120fps, so the game is perfectly playable, but it's strange that I need to use these features in a game without RT (it's the first game without RT that pushes my GPU so hard).
You can run it a bit over 30 FPS at 4K on an RTX 3060 if you use DLSS Quality, or better yet Balanced to get a bit more FPS. Much better image quality at the slight trade-off of higher settings, since most of the settings don't scale much past Medium (and some not even past Low).
The reason you need to use upscaling in this game is that they pushed visuals more than usual, so what might normally have been a medium setting is now a low setting, with medium being the equivalent of what would usually be very high, etc. We can see this because the change from medium to the highest settings is nearly non-existent for most settings, while low to high is different but often not significant. This is because as quality improves it becomes less efficient to scale up. A good artist can make a 10,000-poly sculpted character look nearly as good as a 20,000,000-poly ZBrush version.
In addition to the above, I suspect they didn't care to create truly lower-quality assets for such an asset-heavy (150+ GB) game and chose to simply save the money/time and call it a day, so to say. Otherwise we would likely have seen something closer to the scaling we usually do, and thus less reliance on upscaling until the higher settings (which are built for more than current high-end GPUs and would require it regardless, due to the insanely high-quality assets and increased visual effects past an efficiency point).
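As a rough sketch of why DLSS plus frame generation multiplies framerate the way the replies above describe: I'm assuming an upscaling render speedup of roughly 1.4x for a Balanced-style preset and a frame-generation factor a bit under 2x. Both factors are assumptions for illustration, not measured values:

```python
def projected_fps(native_fps: float, upscale_gain: float,
                  fg_factor: float) -> float:
    """Combine an upscaling speedup with frame generation's doubling.

    upscale_gain: render-fps multiplier from lowering internal resolution
                  (assumed ~1.4-1.6 for a 'Balanced'-style preset).
    fg_factor:    presented-frame multiplier from frame generation; in
                  practice a bit under 2.0 due to its own overhead.
    """
    return native_fps * upscale_gain * fg_factor

# The 45 fps native 4K figure mentioned above, with the assumed gains,
# lands near the reported ~120 fps:
print(round(projected_fps(45, 1.4, 1.9)))  # 120
```

The two multipliers stack, which is why a sub-60 native result can end up well past 100 fps once both features are enabled.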
I was pleased with the performance in the demo. I get around 120fps at 4K DLSS Balanced + FG and I did not notice any stuttering. This is my first Final Fantasy game and I'm looking forward to playing the full game.
I wonder how many folks will notice that, at the same upscaling factor, FSR3 looks much crisper than DLSS3 in this game by default. Some recent DLSS versions seem to smooth the image out too much, leaving an FXAA-like film over it.
Correct me if I'm wrong, but isn't DLSS3 not DLAA, but one of the quality settings (Ultra Performance, Performance, Balanced, Quality) plus FG? DLAA can't be toggled when one of the quality settings is selected, can it?
DLAA + FG isn't DLSS3.
If I'm wrong, please correct me.
When people talk about DLSS3 they usually mean Frame Generation only, since that's the feature exclusive to it; anything else is just DLSS2. You're also correct that DLAA can't be toggled along with any other DLSS2 setting: it renders the game at the selected output resolution and uses AI to remove the aliasing from it.
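To make the preset distinction above concrete, here's a rough sketch mapping each mode to the internal resolution the game actually renders at before upscaling. The scale factors are the commonly cited per-axis ratios for DLSS presets; actual values can vary per game, and the function name is my own invention:

```python
# Approximate per-axis render-scale ratios for DLSS modes.
# DLAA renders at full output resolution (no upscaling at all),
# which is why it can't be combined with a quality preset.
DLSS_SCALE = {
    "DLAA": 1.0,
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def dlss_render_resolution(output_w, output_h, preset):
    """Return the approximate internal resolution DLSS renders at."""
    s = DLSS_SCALE[preset]
    return round(output_w * s), round(output_h * s)

if __name__ == "__main__":
    # At a 4K output, Quality mode renders internally at 1440p,
    # Performance at 1080p, and DLAA at full 2160p.
    for preset in DLSS_SCALE:
        print(preset, dlss_render_resolution(3840, 2160, preset))
```

Frame Generation sits on top of whichever of these you pick, which is why "DLAA + FG" is a perfectly valid combination even though DLAA itself does no upscaling.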
Idk, I've been playing it and it's been fine. 1440p on a QD-OLED with a 4080 and 7800X3D, max settings, DLSS Quality, and I'm getting anywhere from 120-140fps. I haven't had any stutters in actual gameplay, just when transitioning to a cutscene. I haven't tried the new updates on the demo, which I think fixed the stutter, but I haven't confirmed it.
There is no justification for a game like this not achieving 4K/60 native on a 450-watt beast of a card like the 4090. This is a warning (one of many), and should be a wake-up call for gamers: optimization is not something for "peasants" or "poor people", it's a common cause that everyone should rally behind and demand, because no matter where you sit in the PC gaming strata, you're bound to benefit from well-developed, efficient games. As examples like this and Cities: Skylines (which ran at 20fps on a 4090) prove, there's no shortage of developers competent at being incompetent, and no hardware they can't waste. But PC gamers are too obsessed with a false sense of elitism (one tour of the Steam hardware survey gives you a healthy dose of reality and myth-busting) and end up enabling these inept devs with a sense that players should just get "better gear" (like Bethesda said for Starfield, even though their game had nothing to justify it) instead of demanding they make their games RUN. Anything else won't be accepted.
I'll wait for the movie, but thanks for sharing.
Nuts, cause just looking at it, seems like a $1,600 4090 would do just fine and dandy. Maybe they should, oh I dunno, optimize for a minute or two… or maybe such is the power of the PS5. Imagine a PS5 PRO?! It'll run at infinity fps.
Why so many comments for such an overrated series? 🤔
Final Fantasy series stopped in 1994 with FF6.
Every FF game after that was abysmal.
The pixel remasters are great though.
This article is FALSE! With my 4090, turning off DLSS, DLAA, and Frame Generation (native 4K), I'm getting an average of 75 FPS. With the Nvidia tech on, I'm averaging 130 FPS.
4K Ultrawide / 4090 FE / i9-13900K / 32GB RAM @ 7600 / 2x 980 Pro SSD / 2x 990 Pro SSD / 1500W PSU / Z790-E
Lotta name-calling and crap-talking in these comments. Bottom line: it's an unoptimized game engine with a lot of performance problems. No ray tracing, no massive polygon counts, baked lighting, not open world, etc. DLSS and Frame Gen are made to get good performance from games with ray tracing or even path tracing features. FF16 doesn't look that much better than FF14.