Nixxes Software has just revealed the official PC system requirements for Ratchet & Clank: Rift Apart. These specs cover all common resolutions (1080p, 1440p and 4K), along with the framerates and graphics settings they target.
These are among the most detailed PC requirements we’ve seen, so kudos to Nixxes for providing them. And yes, Nixxes has also included the PC specs required for running the game’s Ray Tracing effects.
For gaming at 1440p/High Settings with 60fps, Nixxes recommends an NVIDIA GeForce RTX 3070 or an AMD Radeon RX 6800. The developers also recommend using an SSD, and the game will need 16GB of total RAM for this preset.
For gaming at 4K/High Settings/Very High Ray Tracing with 60fps, Nixxes suggests using an NVIDIA GeForce RTX 4080 or an AMD Radeon RX 7900 XTX. Do note that for its RT settings, Nixxes suggests using DLSS 2 or FSR 2.0.
From the looks of it, these PC system requirements for Ratchet & Clank: Rift Apart seem reasonable. Thus, it will be interesting to see how the game performs on our platform. And yes, we’ll also be sure to test it on our Intel Core i9 9900K. My guess is that this CPU will be able to maintain 60fps with DLSS 3. And, if it does, it will show how transformative DLSS 3 can actually be for older PC systems.
Ratchet & Clank: Rift Apart will support NVIDIA DLSS 3, AMD FSR 2.0 and Intel XeSS at launch. Furthermore, the game will support Ray Tracing in order to enhance its reflections and exterior shadows.
Sony will release this game on PC on July 26th. For those wondering, we haven’t received any PC review code for it, which means we likely won’t have a day-1 PC Performance Analysis. However, as with most titles, we’ve already purchased it, and we’ll share our initial performance impressions the moment it becomes available.
Enjoy and stay tuned for more!

John is the founder and Editor-in-Chief at DSOGaming. He is a PC gaming fan and strongly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES one of the best consoles ever. Still, the PC platform won him over from consoles. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a postgraduate thesis on “The Evolution of PC Graphics Cards.”

LOL
https://uploads.disquscdn.com/images/de7d6354ecb0867cb0555800fc5521fb016fd647b554b02af5ae8771ef698f98.png
Marketing bs
It was a lie.
Yeah, the game doesn’t even have a lot of that world jumping anyway. I’m sure an SSD helps, but it doesn’t require a “PS5 SSD”.
They are supposedly using DirectStorage, which does the same thing minus the compression.
What he said is actually true, because it allows them to stream assets from storage directly to the GPU for decompression and then straight into VRAM, whereas on a PC the CPU does the decompression, puts the data in system RAM, and then transfers it from system RAM to VRAM. That makes real-time texture and asset streaming too slow and results in texture pop-in even after a short loading screen. The PS5 also has an advantage because its CPU uses GDDR6 for RAM, which is considerably faster than the best DDR4 or DDR5 RAM a PC uses.
A CPU doesn’t care about bandwidth. The only thing that is relevant for a CPU is latency, and in that field GDDR is slower than DDR4/5.
Sure it does; if what you said were true, we’d all be running 2133 MHz memory… But you miss the point completely: it takes a lot of CPU cycles to decompress the data from the storage drive, more cycles to move it into system RAM, and then even more cycles to move it from system RAM to VRAM across the PCIe bus.
DirectStorage has the CPU move the raw data directly from the storage drive to the PCIe bus, where the GPU takes over, does the decompression, and puts it into VRAM. For the purpose of decompression, one CUDA core is equal to one CPU core, so you can use, say, 32 CUDA cores and it’s like having a 32-core CPU doing the work while barely putting any load on the GPU.
This is why GPUs are used for scientific, cloud-server and AI applications, where you need a large number of parallel threads working at the same time.
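For reference, the PC-side flow he’s describing is what Microsoft’s DirectStorage API exposes. Here’s a minimal C++ sketch of a single GPU-decompressed read, assuming an existing D3D12 device, destination buffer and fence; the file path and sizes are made-up placeholders, not anything from Nixxes’ actual code:

    // Minimal DirectStorage sketch: read a GDeflate-compressed asset from
    // disk and let the runtime decompress it on the GPU, straight into a
    // GPU buffer. Path and sizes are hypothetical placeholders.
    #include <dstorage.h>
    #include <wrl/client.h>
    using Microsoft::WRL::ComPtr;

    void StreamAsset(ID3D12Device* device, ID3D12Resource* gpuBuffer,
                     ID3D12Fence* fence, UINT64 fenceValue)
    {
        ComPtr<IDStorageFactory> factory;
        DStorageGetFactory(IID_PPV_ARGS(&factory));

        // One queue per source type; requests on it complete asynchronously.
        DSTORAGE_QUEUE_DESC queueDesc{};
        queueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
        queueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
        queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
        queueDesc.Device     = device;
        ComPtr<IDStorageQueue> queue;
        factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));

        ComPtr<IDStorageFile> file;
        factory->OpenFile(L"assets/level.bin", IID_PPV_ARGS(&file)); // placeholder

        // The CPU only fills in and enqueues this request; the decompression
        // itself runs on the GPU, with the result landing directly in VRAM.
        DSTORAGE_REQUEST request{};
        request.Options.SourceType        = DSTORAGE_REQUEST_SOURCE_FILE;
        request.Options.DestinationType   = DSTORAGE_REQUEST_DESTINATION_BUFFER;
        request.Options.CompressionFormat = DSTORAGE_COMPRESSION_FORMAT_GDEFLATE;
        request.Source.File.Source = file.Get();
        request.Source.File.Offset = 0;
        request.Source.File.Size   = 4u * 1024 * 1024;  // compressed size (placeholder)
        request.UncompressedSize   = 16u * 1024 * 1024; // uncompressed size (placeholder)
        request.Destination.Buffer.Resource = gpuBuffer;
        request.Destination.Buffer.Offset   = 0;
        request.Destination.Buffer.Size     = request.UncompressedSize;

        queue->EnqueueRequest(&request);
        queue->EnqueueSignal(fence, fenceValue); // signals once the data is in VRAM
        queue->Submit();
    }

To be clear, whether the actual port takes the GPU-decompression path or keeps decompression on the CPU is Nixxes’ call; this just illustrates the mechanism being argued about.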
Who cares. Ghost of Tsushima when?
Me if I didn’t boycott soyny for all their moonbattery.
Not enough diversity, they can’t raise their ESG score with that.
They have rainbow diversity
Racially it’s lacking in diversity, but according to a post on the biggest pro-LGBT website (i.e. reddit), Ghost of Tsushima has MORE non-straight characters than The Last of Us 2. Just some food for thought.
“For those wondering, we haven’t received any PC review code for it.” Do you ever? lol. Jokes aside, John and team always do great work with performance analysis. But you know how I’m gonna benchmark this? On an FX 8350 with a 1060 6GB and an HDD. If it runs 30 fps at medium, good port. Otherwise, “added to future games when I get old enough”.
Treat yourself and at least get a sata ssd, lol.
I second this. They’ve come a LONG way in pricing, so even a cheapo SATA SSD will make a night and day difference.
Got an nvme on sale. Gotta go beast now!!!
That’s going to be a massive difference.
“Nixxes suggests using DLSS 2 or FSR 2.0.” – This is what happens when AMD is not involved.
I bet VRAM optimization is better too, because they are developing on Nvidia GPUs using Nvidia software. That’s safe to assume since they are including all 3 upscaling technologies, which means they are likely using Nvidia Streamline.
Diablo IV is an Nvidia game, yet it eats up over 20 gigs of VRAM at 1080p…
For 1440p 60 FPS you need a better PC than a PS5, which in its 60 FPS mode runs the game without problems at 1800p.
The console also runs the game in its 4K 30 FPS RT mode (with drops to ~1800p); in this mode, again, you need a more powerful PC than a PS5.
In addition, Nixxes highly recommends using DLSS and FSR, so instead of playing at 1440p you will be playing at 1080p.
PS. Nixxes still hasn’t fixed HT and SMT in the Web Head games, and those games are on the same engine as Ratchet.
PS. The PS5 is running the game with lower settings.
You are always going to need higher requirements on a PC to equal a console, because of the different architectures and the lower overhead on a console.
Consoles are a “one trick pony” and thus can be better optimized to do a single specific task, in this case playing games, whereas a PC has to do dozens of things unrelated to gaming.
That’s just basic Computer Engineering 101
Utterly made-up requirements. CPU usage decreases with increasing resolution, yet here they show higher-end CPUs needed for higher resolutions. You see the same nonsense every time one of these requirements tables is released.
It’s not nonsense, it’s for ultimate ray tracing.
OK fair enough
They also have higher quality settings. Things like LOD affect CPU performance greatly.
CPU usage doesn’t “decrease” with higher resolutions, lmao; people are still repeating this BS in 2023. It only decreases IF your GPU is maxed out while the CPU has enough headroom, and this happens sooner the higher your resolution is, because resolution scales linearly with GPU load while it usually doesn’t with the CPU (it does to a small degree in very few games).
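To put rough numbers on that, think of each frame as taking max(CPU time, GPU time), where only the GPU side grows with pixel count. A back-of-the-envelope C++ sketch with invented frame times (not measurements from this or any game):

    // Toy frame-time model: fps = 1000 / max(cpu_ms, gpu_ms).
    // CPU time per frame is held constant across resolutions, while GPU
    // time scales linearly with pixel count. All numbers are invented.
    #include <algorithm>
    #include <cstdio>

    int main() {
        const double cpuMs        = 8.0; // hypothetical CPU frame time
        const double gpuMsAt1080p = 6.0; // hypothetical GPU frame time at 1080p
        const double pixels1080p  = 1920.0 * 1080.0;

        const struct { const char* name; double w, h; } resolutions[] = {
            {"1080p", 1920, 1080}, {"1440p", 2560, 1440}, {"4K", 3840, 2160}};

        for (const auto& r : resolutions) {
            const double gpuMs   = gpuMsAt1080p * (r.w * r.h) / pixels1080p;
            const double frameMs = std::max(cpuMs, gpuMs); // slower side sets the pace
            std::printf("%-6s cpu=%4.1fms gpu=%4.1fms -> %3.0f fps (%s-bound)\n",
                        r.name, cpuMs, gpuMs, 1000.0 / frameMs,
                        gpuMs > cpuMs ? "GPU" : "CPU");
        }
        return 0;
    }

With these invented numbers, 1080p is CPU-bound at 125 fps while 4K is GPU-bound at about 42 fps; the CPU does the same work per frame in both cases, it just stops being the bottleneck. Which is exactly the “IF” above.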
For some reason, games always list strange CPU requirements, and they are unrealistic in most cases. If you are playing at higher resolutions, a fast 6- or 8-core CPU will be enough, as your GPU is what matters the most here. The 4K CPU requirements are strange, as an Intel Core i7 12700K is a lot faster than the Ryzen 9 5900X. They are not even on the same generational level (it should be a Ryzen 6000 series part).