AMD has shared the 4K and 1440p PC system requirements for Bethesda’s highly anticipated new game, Starfield. These more detailed PC requirements will give you an idea of the PC system you’ll need in order to enjoy the game at either 1440p or 4K.
AMD recommends using an AMD Ryzen 7 7700X CPU with an AMD Radeon RX 6800 Series graphics card for 1440p. According to the red team, this combo will be “ideal for hardcore 1440p or widescreen gamers to deliver max visual settings and high fps.”
For 4K gaming, AMD suggests using an AMD Ryzen 7 7800X3D with an AMD Radeon RX 7900 XT graphics card. AMD states that this PC system will provide the “absolute best Starfield experience in 4K.”
Since these requirements are part of AMD’s promotion, they do not feature any NVIDIA or Intel GPUs. Thus, let’s hope that Bethesda will reveal more detailed PC specs for this title before it comes out. After all, it will be interesting to compare the NVIDIA and AMD 1440p and 4K requirements for Starfield.
For what it’s worth, Bethesda has not clarified yet whether or not the game will support DLSS 2 or XeSS. For the time being, we know that it will only support AMD’s FSR 2.0.
Modder PureDark has already stated that he will add support for NVIDIA DLSS 3 via a mod. Unfortunately, though, this DLSS 3 Mod will most likely be behind a Patreon wall. And, in case you’re wondering, the DLSS 3 Mods for The Last of Us Part I, Jedi Survivor, Elden Ring and Skyrim are still behind a Patreon wall.
Bethesda will release Starfield on September 6th. You can also find the game’s “limited” PC requirements here.
Stay tuned for more!

John is the founder and Editor-in-Chief at DSOGaming. He is a PC gaming fan and strongly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best consoles of all time. Still, the PC platform won him over. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC Graphics Cards.”
Contact: Email

This isn’t system requirements, this is an advertisement
daily reminder that AMD sucks. they messed up even the system reqs: every normal company posts them from left to right, not AMD tho. such a dumpsterfire of a company. I hope they go bankrupt and a proper competition to Nvidia emerges somewhere. with AMD, Nvidia can do whatever it wants. YOU SUCK AMD!
AMD?
“Is DLSS Really “Better Than Native”? – 24 Game Comparison, DLSS 2 vs FSR 2 vs Native” – https://youtu.be/O5B_dqi_Syc
“Nvidia’s DLSS 2 vs. AMD’s FSR 2 in 26 Games, Which Looks Better? – The Ultimate Analysis” – https://youtu.be/1WM_w7TBbj0
“Can AMD Match Nvidia Yet? – FSR 2.2 vs DLSS 2.4 Analysis” – https://youtu.be/w85M3KxUtJk
At 1440p CP2077 Overdrive using performance upscaling achieves –
4090 – 100 FPS
4080 – 70
4070Ti – 62
3090Ti – 60
4070 – 51
3080 – 50
3070 – 35
2080Ti – 28
7900XTX – 27
7900XT – 23
3060 – 23
2070S – 20
6950XT – 17
2060 – 15
6800XT – 14
A770 – 11
2060 6GB – 11
6800 – 10
A750 – 10
6700XT – 8
6700 – 7
6650XT – 1
That’s right, a 4.5-year-old Nvidia GPU is beating AMD’s latest and greatest in RT. Hence AMD keeps RT usage in AMD-sponsored titles to the bare minimum. And remember, DLSS gives a far better image than FSR.
https://www.youtube.com/watch?v=X51DB4bIT68
AMD you don’t get popular by doing this. It only instills disdain towards the company.
“Unfortunately, though, this DLSS 3 Mod will most likely be behind a Patreon wall.”
Hahaha, thanks for that one John.
It likely won’t be good enough to bother with, so you might as well use FSR, especially if the implementation is done right.
Bet your butt the mod will be free within minutes of release. Never underestimate the Bethesda fanbase.
FSR sucks at 1440p and 1080p in most cases. I tested it in many games. Leaving the other two out shows how much AMD doesn’t love gamers, despite what they have been claiming for the past few years. And yes, they influenced devs to leave out DLSS and XeSS.
DLSS was out before an “ok” version of FSR, so saying some games have only DLSS is meaningless. Plus, NVIDIA supports their tech in full right out of the gate.
It’s convenient for AMD to mention playing games will require their highest end hardware. Go get the worm little fishes.
Even the 1080p requirements are stupid. 1080p is an old resolution. If a game was made for PC, there would be none of these silly requirements in most if not all cases.
You need an X670 Mobo for 4K, lol, that’s a new one.
Better buy the best
or suffer like the rest
These aren’t requirements, it’s an advertisement, lol.
TLDR tbh when you could have just said what I said in one sentence, lol.
If you can’t comprehend my reply to you then you calling me a goldfish is meaningless. 🤷♂️
Lol, he blocked me, how fickle some people are.
I paid $92 (365 PLN in local currency; brand new in shops here it’s 1400) for an ASRock X670E Steel Legend board – it was listed as restarting the PC under 3D load. I flashed the newest BIOS and removed some debris from a RAM slot, and now it’s working fine for me 🙂
Not to mention the board was so new that I still needed to peel off the protective foil ];)
Nice sounds like you got a deal due to some tech illiterate, lol.
That was my second AM5 board this month. The first was an MSI PRO B650-P WIFI from England with a damaged socket, for $60 (39.05 GBP with shipping + 35 PLN customs taxes).
After bending the pins back into proper shape (1 hour under a microscope), I found 7 pins broken: 2 Vcore and 4 VSS ground (both types redundant), plus a single memory pin, but to my surprise dual channel is working and I only lost the 4th slot 😉
THIS GAME IS RUINED BY AMD !!!!
Why? My 3080 will play this like a charm. Are you afraid?
With “just” 10gb VRAM? Everybody knows that AMD sponsored titles require more than 12gb. Welcome to the stutterfest
Can’t wait for 16GB vram requirements to suffer through worse looking textures than we had back in 2012.
Meanwhile console players are petitioning for more than 30fps lmao
Console players are AMD users. They have asked for more RT also.
I suspect most PC players could end up playing at 30fps, given how expensive the higher-end hardware needed for 60fps is, unless you cut the resolution and other visual settings a lot.
After all, a game that’s targeting 30fps on the Xbox is going to be quite demanding on the PC.
I guess, but console is more like a minimum spec pc than a representation of what we actually have
It’s not going to be that demanding, the series X could likely hit 60 at 1440 but they are aiming for “consistency” across platforms for whatever reason. Personally, I’d be pretty pissed if I forked out for the dearer system and was still getting 30 fps.
Wait. RT is a gimmick and not important. AMD fangirls have been saying this since RTX came out.
Ahh yes perfect, really good sh*t, especially since I have 0 amd garbage.
In other words Bethesda can’t optimize for sh*t
To be fair, a game that targets 30fps with no 60fps option on the Xbox console was always going to need some beefy PC hardware to run at 60fps, in fact, looking at the gpu needed, it’s kinda low if that is 60fps at max settings at those resolutions.
It has no 60 fps option on the console it was designed for because Bethesda dropped the ball on optimization.
Gotta give it to the AMD boys, they really made the green team fans angry af lol.
Very Bad News for Nvidia’s Slaves !!!!!! hhhhhhhhh Just buy the lousy 4060 4 Gb VRAM to enjoy a sub-720p gaming experience ………….another thing the game’s engine isn’t the stupid UE4 !!! so expect a massive AMD win !! https://media4.giphy.com/media/h3nyuXub5k9jsRfmi7/giphy-downsized-small.mp4
Isn’t the 7600 equivalent to the 4060 Ti? Also, the 4060 is 8GB.
The 7600 is equivalent to the 4060. And it also has only 8gb. After all the marketing shaming they did of NVIDIA 8gb GPUs. Lol.
Who is “they” though? Not AMD reviewers, and yes, the 7600’s price is much better. All of this lives and dies in the benchmarks when the game is out.
We have already seen in game video. It’s console class garbage.
Can someone translate to intel & nvidia please.
I have I7 8700K & RTX 4080.
Yikeeeees bruv, you really bottlenecking that 4080 with a 8700k!!!
I play at 4K so most of the usage is coming from the GPU at that resolution. 32GB DDR4 3200 mhz & m.2 SSD.
Even at 4K your minimum frames could be pretty greatly improved by upgrading your CPU. I upgraded from an OCd i9 9900K to an i7 13700K and performance is noticeably smoother. Went from averaging mid 60s in the Cyberpunk benchmark to high 70s. (4K everything maxed with DLSS 2) Mainly from minimum frames being much better. And better minimum frames means better overall consistency. So yeah I’m sure stuff is playable for you but you could definitely benefit from a CPU upgrade
Did you really see that big an improvement going from the 9900K to the 13700K? I’m using a 9900K now and had planned to upgrade to the 13th-gen series, but now I’m waiting for the 14th-gen series, since I didn’t expect to gain much from upgrading.
Dude, if you were a frequent visitor to this site, you’d know the site’s owner, Papadopoylos, owned a 9900K along with a 4090 and stated in many of his benchmarks that he was getting bottlenecked.
In other words, yes, the 9900K will bottleneck you with newer cards, simple as that.
He stated he was getting bottlenecked at 1080p, not 4k
Yeap, the i9 9900K bottlenecks the 4090 at all resolutions. HOWEVER, if you target 60fps (and not 100-120fps), you can still use this combo (4090 + i9 9900K). Future triple-A games may also work great with it. DLSS 3 can also bring new life to the i9 9900K in games that lack proper multi-thread support, or in games that rely heavily on one CPU core/thread. A Plague Tale: Requiem, a game that a lot consider as a great current-gen game, has no trouble at all running on the i9 9900K (around 120fps with 4K DLSS 3 Quality) -> https://www.dsogaming.com/articles/a-plague-tale-requiem-microsoft-flight-f1-22-cyberpunk-2077-dlss-3-benchmarks-impressions/
Here are also some benchies in a number of CPU-bound PC games (it has comparisons between the i9 9900K and the AMD Ryzen 9 7950X3D) -> https://www.dsogaming.com/articles/amd-ryzen-9-7950x3d-benchmarked-in-10-recent-cpu-heavy-pc-games/
I have a 9900k in one of my systems and all the games run perfectly fine. Yeah you can look at FPS charts but there’s not going to be that big of a difference to warrant spending lots of money to upgrade right now. Especially if you’re playing at 4K which I do.
And I have an RTX 2080 TI, 3090 and a 4090. The 9900k was able to do well with all those gpus. Yes there was some FPS difference but it wasn’t like super crazy of a difference.
The 9900k is a great chip. In fact AMD ryzen didn’t overtake it until I think the third generation. That’s pretty crazy.
Should upgrade, but then again I believe it’s fine still. I kind of regret upgrading from 8086k to 13600kf. Yes the performance upgrade was huge in some cases but not in most cases. General usage is the same and most games I play are the same apart from some needing that strong single core
Wait for cheap Zen 4 or Arrow Lake
Early benchmarks of the RTX 4090 were done on Intel 12th gen CPU’s (Core i9 12900K I think), and some were done on the Ryzen 7 5800X3D which was the best gaming CPU at the time. Once Intel 13th gen and AMD’s Zen 4 (Ryzen 7000’s) came out tech reviewers realized that the RTX 4090 was being bottlenecked by the CPU’s they were testing on. I think even 4K benchmarks suffered.
The 8700K isn’t even an 8-core CPU, and with my RTX 3070 Ti playing at 1440p Cyberpunk 2077 uses most of my 8-core Ryzen 7 3800X. I would almost certainly be CPU bottlenecked on a 6-core CPU from the same generation (so a 3600X or 3600), and since Intel 8th gen was older and lower performance per core than AMD’s Zen 2 CPU’s like the 3800X and 3600X you’re almost certainly seeing a CPU bottleneck even at 4K with an RTX 4080.
Not everything revolves around Cyberpunk. There are like 10,000+ other games out there.
I was replying to a post about FPS in Cyberpunk…
I was replying to an article about Starfield.
You weren’t replying to an article, you were replying to a comment. Unfortunately any semblance of context seems to have been lost at this point.
I think the RX 6800 was roughly equivalent to the RTX 3070 Ti, but I haven’t looked at benchmarks recently. It’s also worth noting that while most of the driver bugs and glitches for AMD GPU’s seem to have been resolved, the drivers seem to be a performance bottleneck when a new AMD GPU is released and so AMD GPU’s will perform faster over time as AMD improves their drivers (which has caused some to claim that AMD drivers “age like fine wine”).
As for the CPU, I don’t know what Intel could have that would be the equivalent of the 7800X3D in a game like Starfield. Maybe the Core i9 13900K? In this case it isn’t a matter of core count, as the 7800X3D is an 8-core CPU, but rather a matter of obscene amounts of L3 cache being glued to the CPU cores (what AMD calls “3D V-Cache”) which games seem to really like and helps improve FPS. Some games will still favor raw CPU power and/or core count, and in those cases either the 7950X or the 13900K will dominate in those games, but for many others the 7800X3D will be the top pick due to the 3D V-Cache and I have to assume that Starfield will be in that camp.
What resolution do you want to play at? I don’t see why 8700k would have a problem at 1440p or even 4K. It’s still a pretty good chip. But remember these games are made on AMD hardware. And that’s where all these problems stem from. Console sheet.
I don’t buy consoles anymore simply because AMD is the only hardware in it. I want a choice.
Ty, ty..
First AMD bribes developers to block Nvidia’s DLSS. Now they are releasing AMD-only system requirements.
Assuming it was a bribe, after all, think of it from a developers point of view, if they support FSR, it works on AMD, Nvidia, Intel hardware and also works on Xbox consoles, DLSS on the other hand only works on Nvidia hardware.
If I was a developer, I could see the merit of supporting FSR only and forgetting about the rest, especially if it means doing a good implementation of it.
Total BS, Nvidia have even provided a simple dll method to include DLSS in your game. It takes mere minutes, all they need to do is add DLSS to the graphics menu, there’s no excuse for it. AMD games don’t have DLSS because it makes their tech look like sh*t, it’s as simple as that.
This is all assuming that DLSS would be absent. Due to the “marketing” it would be reasonable to leave out all mentioning of DLSS until release date, even if it will be supported.
Yes, it’s a bribe. They’re bundling the game with their GPUs, so they had to buy a whole bunch of copies. The money is passed along not as cash, but as technical support and in other ways.
Nvidia clearly stated that they do not discourage developers from putting any other technologies in their games. And then went on to say that they have tools that make it easier to put all technologies into the games.
What was amd’s response? There’s dlss only in some games. Huh?
Nice, I got an AMD B650E paired with my R7 7800X3D and will be playing at 1440p on a 240Hz monitor with my RTX 2080 Ti (I’m thinking of upgrading to an RTX 4080 or an AMD RX 7900 XTX).
And… In… Proper? 🤣
this year some many quality games has been released
– Resident evil 4 remake – 60 billion has spent by capcom
– Star Wars – 50 billion has spent by ea
– assassisn creed mirage coming – 66 billion has spent by ubisoft
– starfield – 120 billion has spent by bethesda
– avatar coming – 134 has billion spent by ubisoft
so people buy all these games at 70 dollars, these are next level quaility of games, that will be remembered for 100 – 200 years
are you retarded?
I think his mother was a **tard that got f*ked by a r**ard, that makes this f*ker a next gen r**ard or double r**ard, whichever you prefer.
English must not be your first language because you’re off by a factor of one thousand with those numbers. I think you meant to say “million,” not “billion.”
Ty, ty..
I don’t even know wtf those requirements mean. It’s absurd as f### to not post the equivalent Nvidia card. Thanks AMD for wasting 2 mins of my time to look up what the equivalent specs cards are
Blame your own lack of common sense. The specs were published by AMD, not Bethesda.
I literally said AMD. I have no idea wtf you are going on about
Ty, ty..
AMD needs to stop ruining games. They don’t even try to implement technologies that will make use of modern/future hardware.
They don’t help devs implement ray tracing and block 84% of the market from using the best upscaling tech out there.
Hey AMD, no one cares about traditional, non-ray-traced rendering in 2023. If the game is going to be CPU-bottlenecked like we think it is, you may as well add RTGI, RT shadows, etc. It’s essentially free (on an Nvidia GPU).
My RTX 3080 10GB will probably struggle with this one because it’s an AMD-sponsored title; I expect 16GB VRAM usage at 1440p for some asinine reason.