
AMD announced that it will be Starfield’s exclusive PC partner

AMD has just announced that it will be Starfield’s exclusive PC partner, and shared an announcement video. In this video, Todd Howard confirms that Starfield will be using Creation Engine 2, an enhanced version of the engine used in Bethesda’s previous games.

AMD claims that the game will be able to utilize multi-core CPUs. Moreover, Todd Howard confirmed that the game will be using FSR 2.0.

As you may have guessed, the game does not appear to have any specific PC-only Ray Tracing effects. This is a missed opportunity as Starfield could really benefit from RTGI, RTAO and RT Shadows.

What’s also disappointing here is the fact that Starfield won’t support NVIDIA’s DLSS 2 or DLSS 3. As we reported a few days ago, AMD appears to be preventing developers from using NVIDIA’s AI upscaling tech. This is another bummer as Starfield seems likely to be a CPU-heavy game, so DLSS 3 would have been ideal for it.

Ironically, we might see modders implementing support for DLSS 3. PureDark has already done this in previous Bethesda games like Fallout 4 and Skyrim.

Starfield will release in September. You can also find the game’s official PC system requirements here.

Stay tuned for more!


123 thoughts on “AMD announced that it will be Starfield’s exclusive PC partner”

  1. well that settles it, since this is the only AAA game i care about, my next gpu will be amd, nvidia’s 4000 series is laughable.

    1. What a clown lmao. Gonna get suckered into inferior garbage because AMD waste their money gatekeeping actually good tech instead of coming up with alternatives.

      1. This guy is confusing AMD with NVidia for sure, AMD does not waste a penny on exclusive deals.

        Not to mention their tech is open source and works on every GPU, which is why the developers go with it.

        John is just making accusations without any sort of evidence, and it’s just sad to see him reduced to an NVidia shill.

          1. dude i got a gtx 1060, it was the best card nvidia ever released, you nvidia fanboys defending the 4000 series is laughable i hope amd or intel beats nvidia so we can have some actual competition.

      2. Hey moron, he is actually right. Nvidia’s 4000 series are nothing special you nv shill moron.

        1. notice how not one of them is replying with arguments on how i am wrong, they just suck off nvidia harder.

  2. Apparently AMD supplies chips for the current-generation Xbox consoles. Crazy conspiracy theory: AMD is holding its technical knowledge hostage, demanding exclusivity in exchange for helping Bethesda make Starfield run decently on consoles and avoid a Cyberpunk-on-consoles debacle.

    1. Nice theory, but all the technical knowledge about AMD GPUs is freely available from AMD themselves, since they are actively developing a Linux driver for their own GPUs, which has to be open-source.

      BTW, the open-source Linux driver from NVIDIA for their Tegra products is also the reason why Nintendo Switch emulators could be developed so quickly, since all the technical knowledge was already out there in the open-source code.

  3. It makes no sense for AMD to force no DLSS since FSR runs on both. I mean if I am to buy AMD I will do it for other reasons.

    1. AMD does not force anything, developers simply like to rely on FSR because it’s open source and works on every GPU.

      John has no evidence for his claims, and it’s just disappointing to see DSOGaming reduced to NVidia shills.

        1. Because it’s open source and works on every GPU, it’s that simple. DLSS is harder to implement than FSR as well.

          So instead of making baseless accusations like John, think to yourself, if you were a developer what would you do?

          Go the extra mile to implement DLSS that will only work on a specific GPU brand while being exclusive to PC.

          Or just use FSR on every build and call it a day.

          Don’t forget that, worst case, you’ll need support for implementing DLSS, which NVidia charges for, while with FSR being open source you can get free support and code samples from the community (and even some AMD developers).

          1. DLSS is free to implement by any dev as of 2 years ago. And DLSS can’t be THAT hard to implement. The mod community has added not only DLSS 2 to Jedi Survivor but also DLSS FG.

            “Go the extra mile to implement DLSS that will only work a specific GPU brand”, you mean the dominant brand by far in the PC market?

          2. It’s not that easy. When you implement two techniques for the same problem, the maintenance cost doubles, because you need to support both techniques. Modding is a third-party solution, and it’s easy, because the modders don’t have to guarantee that the implemented technology will work every time, on every config. But as developers, we have to guarantee it, and if something goes wrong for the customers, we have to fix it while the title is under official support.

            Normally, I would only implement DLSS if NVIDIA paid the maintenance costs associated with it. And DLSS is much harder to implement than FSR.
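
            To make the maintenance point concrete, here is a minimal, purely hypothetical sketch of the kind of abstraction a renderer ends up carrying once it ships two upscalers. The names and interface below are invented for illustration; they are not the real FSR 2 or NGX/DLSS APIs.

                // Hypothetical upscaler abstraction -- every backend added here is
                // another code path the studio has to QA on every GPU/driver combo
                // for the supported lifetime of the title.
                #include <memory>

                struct UpscaleInput {
                    const void* color;          // low-resolution color buffer
                    const void* depth;          // depth buffer
                    const void* motionVectors;  // per-pixel motion vectors
                    float jitterX, jitterY;     // camera jitter applied this frame
                };

                class Upscaler {
                public:
                    virtual ~Upscaler() = default;
                    virtual void evaluate(const UpscaleInput& in, void* hiResOutput) = 0;
                };

                // Wraps the FSR 2 context (open source, runs on any GPU).
                class Fsr2Upscaler : public Upscaler {
                public:
                    void evaluate(const UpscaleInput&, void*) override { /* dispatch FSR 2 here */ }
                };

                // Wraps the DLSS feature (RTX-only, closed library).
                class DlssUpscaler : public Upscaler {
                public:
                    void evaluate(const UpscaleInput&, void*) override { /* dispatch DLSS here */ }
                };

                std::unique_ptr<Upscaler> makeUpscaler(bool userWantsDlss, bool hasRtxGpu) {
                    if (userWantsDlss && hasRtxGpu)
                        return std::make_unique<DlssUpscaler>();
                    return std::make_unique<Fsr2Upscaler>();  // safe default on any GPU
                }

            Even in this toy version, every feature that touches the upscaled frame (dynamic resolution, sharpening, HUD masking) has to be validated twice, which is exactly the maintenance cost described above.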

          3. Why would devs implement a feature that only benefits a portion of the general population of PC gamers, rather than implement a tech that will definitely benefit all the players?
            Developers only implement DLSS when Nvidia is partnering with said studio, giving assets or money to said studio.

          4. How exactly does FSR benefit all the players? I’d rather use a lower native resolution than use FSR.

          5. It allows players to gain extra performance by sacrificing some rendering resolution without turning down graphics settings, and it benefits them because it’s open-sourced tech that can run on any GPU.
            When a developer implements DLSS they’re not only giving SOME players the benefit of said tech, they’re also giving their own game as an asset for Nvidia to sell their own proprietary GPUs. DLSS is a form of business in the end.

          6. NVIDIA owns 85% of the PC market in dedicated GPUs. That’s the reason why they should do it.

          7. Depends on what you will use from UE5. It is easy to allow it, but some surfaces might not work well with DLSS. That’s why UE5 has its own upscaler that is compatible with every surface.

            This is also a problem with FSR 2, though AMD has an extra layer to feed their algorithm with some extra data. But you need to control it manually. This is not possible with DLSS and XeSS.

          8. The MOD community hasn’t added anything, besides replacing the DLL with specific versions and bypassing the signature check, with the issues that brings to the table.

            You cannot replace specific versions either, because they need to be implemented properly by developers.

            And DLSS is much harder to implement than FSR in general.

            Not to mention I never stated anywhere you need to pay for DLSS, you need to pay for NVidia support.

          9. Seems to me you’re way off the mark; being implemented in engines does not mean it’s plug and play, and even FSR has plugins for both engines for that matter.

            The SDK is useless unless you’re trying to implement it in your own engine; you cannot do anything with that code when it comes to DLSS inner workings.

            Plus, John’s baseless accusations are about AMD, not about how hard DLSS is to implement. Not only that, but Nixxes developers have never implemented DLSS 3 in any of their games.

            What they said is also an insult to every developer on earth, as DLSS is NOT easy to implement, unless you go the lazy way and don’t actually take advantage of it; we’re talking about thousands of lines of code just for it.

            Not all of us copy-paste code and have Sony Studios as backup.

    2. Yes because only a dummy spends this much money for a GPU just for fake frames. These people are unbelievable. Now they are looking for problems.

        1. I’m telling you man, ever since Don Mattrick and his always-online Xbox debut, we’ve been stuck with Xbox expats. They have diluted the PC game space like no other. I’m telling you man, multi-GPU was the way of the future. We are paying way more for one GPU than we ever did for two, and I don’t think one GPU will ever efficiently handle what we are demanding.

          Now we get one giant GPU that takes up 3 PCIe slots in our cases. Why can’t people see what’s going on here and what we’ve lost. Takes up 3 slots and not 3x the performance. But don’t worry, AI to the rescue. What the Fuq man… All this fanboying has gotten us nowhere. But 2 players that are ripping us the hell off. AMD is now doing the same thing Nvidia is doing with the 6700, 6700 XT, 6750 XT, 6800, 6800 XT, 6900 XT, 6950 XT. <---- What The Actual Fuq!..

          1. i agree. i remember my sad ending days of multi-GPU. morons believe that one gpu is enough. lazy devs love this bullshit idea more than anyone, the same morons who give us pathetic-looking, poorly-running games, expecting dlss and fsr as the damn solution. what a joke. using upscaling on poor old hardware is very understandable as an option, like when we have to turn down graphical options like shadows on weak hardware, but this bullshit that pc gamers are now fighting over is just pathetic. pc’s high-end hardware in need of dlss/fsr instead of asking for clean-looking supersampling. arrgh. i remember when i had multi-GPU, i used it for nvidia DSR on a 1080p monitor. man, that 4k downsample on full hd looked lovely. nowadays we have a one-card-is-enough-bullshit solution with dlss or whatever else NEEDED on top of it to perform at 4k! what a sh*t show.
            i have not upgraded for a long time because of such low-performing gpus/average-looking games. there was a time you really needed the best hardware to have the best-looking and SHARPEST image. now dlss and fsr are the ideal thing!

    1. Instead it’s built on an engine that’s arguably worse. They’re basically stretching an aging engine to its limits when they really should’ve built a new one from scratch.

      1. Especially since it’s a DX11 game engine, and DX11 game engines are notoriously bad to convert to DX12. You really need to rewrite them from the ground up to work properly and efficiently with DX12 (or Vulkan).

        1. Since it’s being labelled as the “Creation Engine 2”, I’m guessing Bethesda already did a complete rewrite of their rendering code in DX12.

          And most likely that DX12 code will be a port of the Xbox Series one to the PC, therefore naturally performing better on AMD hardware.

          Same patterns already happened with Call of Duty: Modern Warfare 2 (2022) & Resident Evil 4 Remake, where they noticeably performed better on AMD.

          Plus the Vulkan renderer of DOOM (2016), which also made use of low-level AMD extensions to take advantage of the optimized shaders ported over from the console versions.

          1. UE4 is too old, too dumb, not optimized for DXR and DX12, and doesn’t benefit from multi-threaded CPUs!
            Only a few developers can master it, like The Coalition [GOW 4 & 5].

      2. every open world unreal engine game runs like crap, it’s not made for it, bethesda’s engine is made for saving interactions in large maps such as dropping stuff on the ground.

        1. But not having more than 5 characters on screen lest everything falls apart and sets itself on fire.

    1. hahahhaaah lol, no, in fact fallout 3 and new vegas are stuck at 3.5gb because they’re 32-bit. If only there was an OpenMW for new vegas.

      1. all fallout 4 stutters are related to sh**ty vsync implementation dating back to morrowind – same engine with some sugar on top added recently

  4. Most likely this will also mean it wastes a lot of VRAM because it isn’t properly optimized, since AMD doesn’t have the software tools to do that.

    1. We have profilers and analysers, contrary to your accusations; it’s clear you never bothered to develop on AMD hardware.

      I suggest you do your research on GPUOpen and the tools we have.

    1. Since it’s a bethesda game, I expect modders to patch in DLSS like skyrim and fallout, but it might take some time. Probably not going to play it until then

  5. They boasted the same way for Callisto Protocol. That game still has poor CPU utilization. Also, DLSS mods might be behind a paywall. So it’s not exactly an assured alternative.

  6. Most likely NVIDIA will still be involved in the development process, as something similar happened with Crysis 3:

    AMD became the exclusive partner of that game, and yet the NV code-path was the better one, to the point that DXVK reports all GPUs as being from NV to Crysis 3, even the AMD ones.

    Here’s the reason from DXVK’s lead developer:

    Crysis 3:
    All GPUs are now reported as Nvidia GPUs by default. This enables the game to use a fast path with considerably lower CPU overhead, but may cause a small performance hit on certain Nvidia GPUs in GPU-bound scenarios.
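
    For what it’s worth, DXVK also exposes the same spoof as a user-facing setting, so you can apply it manually to any title; a minimal sketch, assuming the documented dxgi.customVendorId option in dxvk.conf (10de is NVIDIA’s PCI vendor ID):

        # dxvk.conf placed next to the game executable
        # make the game see an NVIDIA GPU so it takes the faster code path
        dxgi.customVendorId = 10de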

    1. Crysis 3 was initially an nvidia-sponsored game. the game was announced, and a few days later re-announced at an nvidia-specific event. then the alpha/beta keys for MP were actually distributed by nvidia. but somehow AMD was able to snatch the deal near the very end of the game’s development.

      1. Didn’t know that, thanks!

        Then it makes sense why the NVIDIA code-path is superior to AMD’s.

  7. Great that POS worthless trash AMD are doing this!
    Now more ppl will see who they Really are, and all the nVidia hate will shift to this POS company!

    Guessing nVidia will be smiling and thinking the same 😁

    1. Y’all can’t be serious? Your memory can’t be that short. NVIDIA has been doing the same thing; the only difference is when AMD is at the helm, Nvidia customers don’t have a problem playing the game. Game$works would love to have a word with you. C’mon man, y’all can’t be serious here. NVIDIA? Yes, AMD is becoming just like Nvidia, but don’t act otherwise. Don’t act as if nvidia isn’t scum.

      1. Yep buddy, they both are scum. Every corporation is doing the same sh*t!

        Devil and the money you know!

    2. You forgot gameworks? BTW is there any evidence of AMD actually preventing the devs from implementing dlss?

  8. I’m afraid this is going to be another Star Wars Jedi Survivor case here, a beautiful game with terrible performance… Damn you AMD, you used to be the hero of the PC community, now you’re just another villain.

  9. pathetic AMD can only stay relevant by bribing studios. must be pure suffering to be eternal bottom of the barrel in gaming

    1. So, what happened when Nvidia did it? There are more Nvidia/gamedontworks titles out there. So cut the crap and don’t be biased. Both companies indulge in this foolishness. The only difference is, when Nvidia does it, we all get to enjoy the game and not have some proprietary rubbish shoved down our throats. You know better, dude.

      1. anon. I wish Nvidia had proper competition. but AMD is pure dogs*hit. an utter failure. they have been unable to fix their abysmal driver issues in the past decade while innovating almost nothing, always panting to catch up with Nvidia. AMD’s only offering is the old “muchas VRAM” gimmick, and shills, cultists and complete idiots gobble it up. what a sorry excuse for competition. AMD should just end itself.

  10. PCMR turned into fighting over upscaling… That’s console peon BS. We have GPUs with MSRPs at $999-$1599 and we are fighting over fake res / fake frames. It’s pathetic.

    1. Don’t tell them that it’s fake, man. Don’t you dare speak about “upscaling”. They destroyed the true power of PC and want to sell you AI rubbish. They tried it with the cloud and it didn’t work, now it’s cloud by proxy via AI. It’s the same BS solution, just a different turn. I find it very suspicious that as soon as we needed multi-GPU support the most, it died. Just wait for the idiot to respond with “only a minority of people used dual GPU”, which is straight BS. All the benchmarks on YouTube were always pushing multi-GPU; because they can’t afford it or never had it, all of a sudden it’s niche.

      Now more than ever we need multi-GPU support, that is our best way forward. The PC is modular for a reason and to act as if it’s not is a crime unto itself. If the option is there, people will use it. So yes, it’s all pathetic indeed. The PC platform is getting held back and most PC folks can’t even FUQIN see this. THEY all run around talking about 4k@120hz. We will never achieve that in modern titles because of the glass ceiling. Every release is more graphically intense, paired with unoptimization as the new Day One DLC.

      1. This guy clearly has never mixed DLDSR with DLSS to get an incredibly antialiased image with far superior sharpness and a good framerate.

        1. This guy is extremely sensitive to blur and all that you just named is blur technology. Anything with blur I want no part of. I don’t do enthusiasts builds in order to blur my life. I don’t like fake t!ts or fake anything. DLSS is Fake T!ts for GPU.

          1. Do you know the basics of computer graphics? Everything you see is fake and an approximation. The “DLSS is fake” BS is just an ignorant argument in a discussion like this.

        2. As someone who has used both, I would rather play at a native image than use either FSR2 or DLSS & DLDSR combined. Unless you have definitive proof that upscalers can provide better image quality than native, native is still the way to go.

        3. I have DLDSR enabled for everything, it’s probably Nvidia’s most underrated feature and can be used on ALL games. I paid for those Tensor Cores it’s stupid not to use them especially since DLDSR only drops performance by 2-3% on the games I have tested it on (AC Odyssey, AC Valhalla, HZD and Witcher 3 1.32)

    2. Well said, now wait for the morons to come down on you. This fight over upscaling is truly pathetic for the PC community, man.

    3. All computer graphics in games are fake. Everything is artificially computed. It is just approximation. Upscaling is only another way to achieve better graphics and move forward. It will be even more significant when higher resolutions like 8K or 16K come to us.

      1. true, most of graphics is a hack/trick because you can’t compute the actual thing.

        From fog in PSX games because the game can’t render that far, to prebaked shadows that are like graffiti on the walls and floor because they can’t do real-time dynamic shadows, to Umbra hiding areas of the environment you are not looking at, to the “cinematic” scripted linear games with doorways destroyed behind you so the game can unload that area from memory and load the new one. It’s all fake; even the reflections are fake, so now they are like “ah, but we have real reflections now and most people can’t tell the difference”.

        Honestly, running any game past high is pointless.

    4. Meanwhile Final Fantasy 16 on PS5 is rendered at 720p (i.e. a resolution from 2006 + the standard from two console gens ago) and still can’t maintain a constant 60fps. Men in glass houses shouldn’t cast stones, clown.

      1. So what? I am not buying FF16 on PS5. I am buying it on my PC with my 7900 XTX inside. Don’t need no upscaling BS 😀

  11. since my comment got downvoted, i’ll just repeat myself: i got a 1060, it’s great, people say it’s the best card ever. Maybe, but i am not upgrading to nvidia, the 4000 series is a joke and the 3000 series is the same in price and has the same performance. Nvidia is nuts, 300 for a card that functions like a 4050? That’s nuts. My next upgrade will be amd, since the only AAA game i care about is starfield, hopefully their new series can run starfield well so i don’t waste money on AAA trash. I only care about starfield. Maybe intel can improve their game and beat nvidia, until then, i am not buying. The 4060 is a 4050 but it costs 300, not buying unless it’s 250 or lower.

    1. curious as to what the people who downvoted me took issue with. This is why i am doing this, to find out wtf is your major malfunction. No arguments, nothing, no “actually you are wrong”, just “f you i love sucking off nvidia”.

      1. Maybe they are like me and just think you are an azzhole and a whiny little b*tch….. because you are

  12. Big mistake AMD, roll it back, you will regret it if you don’t.
    AMD can claim whatever it wants, but having Tiny Todd do the “confirmation” and speak about it is dooming yourself.

  13. Great, another game with terrible image quality, high VRAM usage, and garbage or no RT support.
    Thank you AMD.

  14. PC users will have to mod in DLSS or suffer FSR garbage. Shame on AMD for this anti-consumer BS. Remember to downvote the video…

    1. Fake Frames is what you care about? Dude, you’ve lost the Fuqin plot. Bro, you can’t be serious. That’s what you are worried about? When has any AMD title stopped NVIDIA owners from enjoying the game? I’ll wait… bet you can’t say the same in reverse.

  15. AMD-sponsored title, so expect at least 16GB of bloated VRAM usage at medium textures and no DLSS or XeSS.

  16. FSR 2 – works on every GPU from any vendor on all devices: PC, consoles, handhelds
    DLSS 2 – works only on some models from a single vendor, only on PC

    FSR 2 is a much better option than DLSS 2. More people can play.

    1. FSR is garbage. We have better alternatives on PC known as DLSS and XeSS. Go back to your console garbage.

      1. You can’t expect game developers to work on three different solutions. This costs time and money. It is better to focus on a library that everyone can use. FSR 2 works on any hardware.

        FSR is also open source. It is always better to use open-source code that you can control than some closed library that nobody can fix without direct Nvidia support.

    2. No, FSR 2 is not and has never been better than DLSS 2 or a better option. The image quality with FSR 2 is much worse than the AI-generated image quality of DLSS 2. FSR 2 is compatible with a wider range of hardware, but DLSS 2 (or DLSS 3) is much better than FSR 2, so the right thing is to give the option to use both (FSR 2 and DLSS 2) instead of just one of them, so everyone is satisfied. If Bethesda only deploys FSR 2 in the game and it has performance problems, that will be a shot in their own foot, because we will have to depend on a mod for the game to have DLSS 2 with more performance and better image quality!!!

  17. still waiting for arguments from the downvoters on why i am wrong and how nvidia prices are not super expensive. You morons are fighting over upscaling and ray tracing on a game with global illumination that is cpu bound.

  18. “This is another bummer as Starfield seems likely to be a CPU-heavy game, so DLSS 3 would have been ideal for it.”

    I lived long enough to see “PC gamers” become just as much peasants as console gamers.
    Mfs pay 8 billion pounds for a GPU only to say some stüpid shït like this.
    “I nEeD a FaKe ReS sO mY pP cAn GeT hArD sEeInG a BiG nUmBeR”

    I would say pathetic, but that would be a compliment to these moröns.

  19. The performance analysis of Starfield is going to be fun.
    Imagine having FSR and still not being able to hit 60 FPS on a Series X.
    I mean sure it’s a dogshit console, but we all know they haven’t optimized for performance in any of their games for the last 25 years.

    In another parallel dimension they would have DLSS3 in Starfield and you would be able to hit 60 FPS on a 4090.

  20. – Gamebryo engine 2023
    – AMD anti-consumer policy
    – Bethesda’s presumed latest plans to monetize mods
    – Bethesda’s recent track record (launch of Fallout 76)
    – new owner Microsoft’s meddling and interference

    What could possibly go wrong?

  21. FSR/DLSS are garbage anyway. I guess it’s great if you are running a potato. I’ll take native, full crisp graphics/resolution any day.

  22. this is a stupid move. also, Creation Engine 1 and 2 are garbage, especially in texture streaming; they can thank the modding communities that continually save their a**es every time. Todd Howard made another bad move here; he doesn’t, or AMD doesn’t, want their consoles to be humiliated by Nvidia’s GPUs with their superior DLSS upscaling technology, because 4K upscaled with FSR 2.0 on the consoles at 30 FPS will obviously look worse than 4K with DLSS on quality at higher than 60 on PC, i can guarantee you this. Bethesda and AMD doing another anti-consumer move like always!
    and to the morons that don’t understand why DLSS doesn’t work on their AMD GPU: it requires AI Tensor Cores, physically present only on RTX cards, that’s why. it works on both a hardware and software level; FSR works only on a software level, that’s why it can run on everything and that’s also why it’s worse and less efficient than Nvidia DLSS!

  23. F*k AMD. the last time I bought AMD was an FX 6200, and even then 3ds Max wouldn’t work on the damn thing.
