As we’ve already reported, Star Wars Jedi: Survivor is a really CPU-heavy game due to its optimization issues. And, as we suggested, NVIDIA’s DLSS 3 could potentially do wonders for it. It appears we were right, as PureDark has shared a video showcasing a WIP version of a DLSS 3 Frame Generation Mod for the game.
As we can see in the video, without DLSS 3 Frame Generation the game ran at 40-45fps on PureDark’s system. With the mod enabled, however, the modder was able to double performance and get over 90fps.
Unfortunately, the current version of this DLSS 3 Frame Generation Mod has some visual artifacts. Furthermore, the mod doesn’t have access to camera info, such as the camera position and all the matrices.
What this ultimately means is that it isn’t as good as a native DLSS 3 implementation. And that’s precisely why we’ve criticized Respawn for not using it. Yes, DLSS 3 is currently exclusive to NVIDIA’s RTX 40 series GPUs. However, it would have been a godsend.
PureDark claimed that while he can add DLSSG, he can’t replace FSR 2 with DLSS due to the game’s Denuvo implementation. And, for now, the modder has no intention of further polishing the mod and eliminating its visual artifacts.
Lastly, PureDark may one day release this mod to the public. Naturally, we’ll be sure to let you know when that happens. Until then, enjoy the following video!

John is the founder and Editor-in-Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best consoles. Still, the PC platform won him over consoles. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC graphics cards.”
Contact: Email
Gamers fixing the mistakes of developers again.
Indeed, and this seems to be a trend, and not a good one.
So, it COULD be implemented in a cracked version?
Yep, and it’s why some reviews are even publicly stating that: buy the game, but play the “cr**ed” version.
First it was generating pixels. Now it is generating frames. Tomorrow it will generate games…
How to cheat at patching a poorly optimised game with technology that most of us are not able to afford.
They say crime does not pay; well, it may do for a 40 series graphics card.
Indeed. The word on the street is that prices for the 3xxx series, and by association the 4xxx series, are about to crash.
This game isn’t designed to run at a constant 60 FPS. It doesn’t do that on the consoles, even with awful upscaling and TAA. Sh@t in one hand and wait for a patch to fix CPU performance in the other; see which one fills up first. The only “fix” for these ambitious games is going to be frame generation, which is why AMD is now paying to hold the technology back from you in these ports. Their GPUs would become worthless overnight, just like Nvidia’s low-VRAM GPUs did, and their frame generation isn’t going to work; it will be even worse than their artifact-ridden upscaling.
The patches might fix Win 11 CPUs running on Win 10 (Intel). That just means you go from 35 FPS lows on a 4090 to 45.
Now if you want to argue that devs are morons for targeting visuals that can’t come close to holding 60 FPS on a console, and for going back to 30 FPS design? I would agree with you. They don’t care though. The artists there are either incompetent, don’t game, or are trying to put out a resume to work for a stupid company like Disney. They care nothing about FPS. They aren’t gonna optimize anything.
Enjoy gaming at 40-50 FPS on AMD Denuvo titles and supporting bribes to developers. Back to Witcher 3 running smooth as glass on a controller with RT and frame gen. Most incompetent diversity hires at current CDPR ever and it still works. Thanks Nvidia!
You’re wrong. I’m never running lower than 80fps on my 4090 on full EPIC settings + RTX enabled.
3360×1440 21/9 screen.
Sure thing AMD/EA astroturfing. Hope the FTC starts actually fining for all these false testimonials and fake reviews. If you aren’t astroturfing no one gives a f@^# about your FPS in scripted scenes, no combat and in corridors.
You’re still wrong.
Today’s update. now locked 100 fps perma.
Sorry if you have a crappy pc.
Do you mean 3440×1440..? And a lot of players are, I think, playing at 4K, hence the very low performance.
4k players are not “a lot”.
even 1440p is not that common.
1440p is becoming more common; it’s UW 1440p that’s not that common.
Bruce, you can’t be serious here?
“This game isn’t designed to run at a constant 60 FPS.”
“The only “fix” for these ambitious games is going to be frame generation, which is why AMD is now paying to hold technology back from you in these ports.”
Bruce, there is nothing ambitious about this garbage. You’re not the first person on here that has said this, so what am I missing here? A game not designed to run at 60? Frame generation is now the godsend “fix”? Is anyone else reading this? How can you blame a card manufacturer for what this game is lacking? DLSS or any other fake-frames tech isn’t designed to be a game’s savior. It’s a placebo effect to sell cards that Nvidia purposely built to underperform, so that they don’t cut into their 80% profit margin. Then stupid AMD decides to follow suit in order to compete with nonsense. Bruce, you can’t believe anything you just typed; I know you’re way more logical than this type of FOOLISHNESS.
Go look at the performance on a PS5 in the 60 FPS “performance mode” without malware Denuvo running and get back to me. It’s bad. Really bad.
You aren’t getting optimization on these types of games because they are already optimized on the consoles and the game designers simply didn’t care about reaching 60 FPS. This is a 30 FPS game on console that runs better in some corridors with nothing going on.
PC CPUs are of course much stronger, but they have Denuvo and more driver overhead.
Unless you have played a game with a console controller and frame generation, you don’t know how good it is. Nvidia Reflex gets a lot of the latency back, and you get a softer image that sharpens right back to looking like native with something like ReShade’s CAS.fx.
Frame generation is NOT something I would use in a shooter. Single player games on console controllers? It’s brilliant. There is a reason people put it on even with their 4090’s in a game like Hogwarts. It’s simply smoother and a better experience.
Brucie, you’re all over the place here. You of all people; I didn’t expect any of this from you.
“You aren’t getting optimization on these types of games because they are already optimized on the consoles and the game designers simply didn’t care about reaching 60 FPS.”
THEN DON’T BUY THESE GAMES ON PC. Stop supporting this crap. Don’t hope for placebo-effect fake-frames upscaling to save the day. What you are doing is apologist behavior, excusing bad behavior. I’m reading a lot of what you’ve said, and it sounds like three different people typing at once, all making excuses for bad behavior. By the way, they are saying the PS5 version is dog crap as well.
C’mon man, you know better. Stop buying these games if you know for a fact how they do things, because other companies don’t do this. This game is a terrible excuse for a game and is everything wrong with PC gaming. But I’m not surprised; Nier: Automata was atrocious, and some idiot modded it to work for every version except the pirated one, with some holier-than-thou attitude. Instead of NOT supporting this crap, YOU’RE on here cursing AMD or whatever companies you can, in hopes they fall in line with the fake-frames movement.
“Fake frames” is a bad take, and it’s what people who have never used it repeat.
Frame gen is very good on a console controller in a single-player game. Even at something like 75 FPS with “fake frames,” Witcher 3 feels slightly better than 60 FPS, probably due to Nvidia Reflex.
It’s a technology that saves people a ton of money on CPU and GPU to get a good experience or allows them to run RT and decent frames on better hardware.
It’s one of the most pro-consumer things any GPU company ever did, and AMD is paying to keep it from you in the most broken age of ports I can remember, outside of the OpenGL era, when games half the time wouldn’t work because OpenGL at that time was a fragmented sh@^ show, and you had Glide in there as well with 3DFX.
Consoles were also highly proprietary tech back then and something like a N64 was stupidly better for the price.
If you think AMD frame gen is going to work well without hardware to leverage it? You are also delusional. Frame gen has had growing pains on Nvidia and they have specialized hardware.
AMD simply bet on the wrong horse and if the consoles abandon them they will be right back where they were pre Zen. Consoles NEED AI upscaling and frame gen and AMD sucks at it.
Denuvo as malware is right; it’s why any and every game I play does NOT get access to the internet. If it needs to talk, it can do it via Steam/GOG.
AMD will, I imagine, counter, just like they did with their RT and FSR; they just need to catch up.
I’m seeing 55-70+fps running a 7800X3D/7900XTX rig. Turning on FSR gets me in the 90-100+fps range.
Cherry-picked “lows,” and even if 55 was the lowest low you will experience in the game (it’s not), 55 FPS is an awful experience. There are 4090 vs 7900XTX comparison videos you can watch, and the 4090 has better lows and smokes it overall. Still a bad experience.
Not cherry-picked, just stating what I’m experiencing in the time I’ve played. Nor was I comparing to a 4090, but since you bring it up, yeah, my setup seems to be performing better than a 4090 right now in this game. Yes, the game is not working correctly for anyone, but for now it runs better on AMD. Yes, I know there are videos on YouTube, like this one
https://youtu.be/HjbdFcUC00A
Which reflects the performance I’m getting as well. I can also turn on FSR, and then I’m seeing 80-100fps.
Considering a 4090 is at bare minimum $550 more than my graphics card, I’d hope that it could outperform cheaper cards.
That’s interesting and good to hear.
What quality settings did you have the game set to?
I am on the fence about my new GPU (7900XTX or 4080) and whether I want to upgrade from my 5900 to the 7800. So it’s good to see some positive FPS numbers.
Settings are 4K with maximum settings across the board, but ray tracing off. Numbers are lower, of course, with ray tracing: 45-50fps, and with FSR + ray tracing around 70fps.
I actually had an Asus TUF Gaming 4080 and ended up returning it for the Red Devil 7900XTX I have now. The 7900XTX was around $350 cheaper, so the 4080 just wasn’t worth it to me.
Well, in the end I repaired some old tech I was given, so I traded that along with my old 3080 12GB and only paid £200 for a 4080. I went with that over the 7900XTX due to power usage; with undervolting, my 4080 uses 200-250W in most games. And I got around the PCIe connector issue with one of der8auer’s WireViews (https://thermal-grizzly.com/en/products/601-wireview-en), as it allowed me to use the existing 8-pin PCIe connectors from my perfectly usable Fractal Ion 760W PSU.
For all those Denuvo bootlickers: this means a paying customer gets locked out of what they paid for for 24 hours. I guess that’s the price to pay for keeping those pesky pirates away, huh?.. What’s that y’all say again? Oh, “it’s to protect the first 2 weeks of launch sales.” Yes, protect the first 2 weeks of a garbage product that can’t even FUQIN LAUNCH. That sounds just about right, You Dumb M*tha Fuqas!
https://uploads.disquscdn.com/images/c22989e599277ace0ec9ab882767470dd801319bc64609353d499a9a407cb067.png
Fella, watch your tone.
But you’re not wrong about Denuvo; it’s why you wait to buy a game until after a few patches have been released, or better, until Denuvo has been removed.
DLSS 3.0 is a poor technology. It doesn’t help responsiveness and actually makes it worse.
The main problems of gaming below 60 FPS in action games are twofold:
1) the lack of timely visual stimuli, you basically can’t see what is going on in time to react
2) the delay in response from input, your input is linked to each render pass and is delayed with lower FPS
DLSS 3.0 solves neither one of those issues, and actually worsens them. DLSS 3.0 delays the most recent frame, interpolates it with the previous one to fabricate the interpolated frame, and only then shows the actual frame.
That’s half a frame of extra input lag on each frame with DLSS 3.0, especially at low FPS, that makes the situation much worse.
The extra input lag DLSS 3.0 introduces at higher FPS is less of an issue, but then you don’t need DLSS 3.0 at higher FPS to begin with. There’s no need to smooth out movement when your machine is natively rendering 100 FPS.
DLSS 3.0 smooths out jerky movement, but it doesn’t increase the amount of render passes that happen, and adds half a frame of input lag, so you don’t solve anything.
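The half-frame penalty described above is easy to put into numbers. A quick back-of-the-envelope sketch (this illustrates the commenter's reasoning with simple arithmetic; the function names are made up for the example, and these are not NVIDIA's published latency figures):

```python
# Rough arithmetic for the latency cost of frame interpolation:
# the newest real frame must be held back ~half a frame interval
# so an interpolated frame can be shown first.

def frame_time_ms(fps: float) -> float:
    """Time to render one real frame, in milliseconds."""
    return 1000.0 / fps

def added_interpolation_lag_ms(base_fps: float) -> float:
    """Approximate extra input lag: half of one real frame interval."""
    return frame_time_ms(base_fps) / 2

for fps in (30, 40, 60, 100):
    print(f"{fps:>3} fps base -> frame time {frame_time_ms(fps):.1f} ms, "
          f"~{added_interpolation_lag_ms(fps):.1f} ms extra lag")
```

At a 40fps base this gives ~12.5ms of added lag, which is why the penalty hurts most exactly where frame generation is most tempting: at low base frame rates the half-frame delay is largest, while at 100fps it shrinks to ~5ms.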
It’s a great tech for single player games that can get around 70fps without it.
“DLSS 3.0 is a poor technology. It doesn’t help responsiveness” – correct, and it’s been shown (see Hardware Unboxed) that it only really works as intended when FPS is natively above 100.
DLSS 3 on Cyberpunk and A Plague Tale are flawless for me tho.
If I am running a game at 40fps, half a frame is 12.5ms… almost a full frame at 60fps.
Still, I would choose a laggier, even one-frame-delayed experience at 60fps with DLSS 3 over playing more responsively at a choppier 30-40fps. Especially in non-first-person shooters.
Do I wish DLSS 3 didn’t introduce lag? Of course, but it still serves me well.
Forget the performance. The game doesn’t even offer PC-specific features such as DLSS, XeSS, and/or a decent RT implementation. At least over time they can fix the performance, but AMD sponsorship has held this one back, like others before it, to the point it really is just a console clone. I’m guessing it was rushed out to be given away with AMD’s hardware.
Ruined by AMD™
AMD-sponsored titles often block and tank competitors’ performance, but people are OK with it.
F*kin AMD, holding back the only feature that would have made the game truly enjoyable. I own an AMD CPU, but I would never ever skip Nvidia; DLSS 3 is great.
Are you serious? You think AMD is the original here. You think AMD or any card manufacturer is what’s holding this game back? What the Fuq are you smoking?
I gotta make that B/S game look like Minecraft just to play it with my budget 3060 12GB, i9-11900K, 32GB RAM, NVMe. SUX.