
AMD hints that 4GB gaming GPUs are not that relevant these days, VRAM demands continue to increase for AAA games

According to a blog post written by Adit Bhutani, Product Marketing Specialist for Radeon and Gaming at AMD, current 4GB VRAM cards are no longer sufficient for gaming these days.

He emphasizes the importance of VRAM these days, and as we already know, most AAA PC games, if not all, eat up a lot of VRAM for their textures, and some games push the limits of the hardware as well. Even though 4GB should suffice for some games, having extra VRAM never hurts.

AMD is now stressing that the future lies “beyond 4GB” GPUs, and recommends going for a 6GB or 8GB GPU, like the current RX 570, RX 580, RX 590, and RX 5000 series. This makes sense given the high VRAM requirements of today’s PC games, though it is not a “mandatory” criterion for playing them.

But as we approach the release of the next-gen consoles, more VRAM might slowly become a necessity. Both the “PS5” and the “Xbox Series X” might feature 16GB of combined memory, which will make 4GB cards insufficient by those standards.

AMD is stressing its “Game Beyond 4GB” motto. The company tested select RX 5500 XT models with 4GB and 8GB of VRAM to check how much of a performance difference the GPU’s frame buffer, aka VRAM, can make at 1080p, and according to its report, the 8GB card delivers an average performance improvement of up to 19% across the games tested.

As the blog reads:

“Recent releases have shown marked performance increases when switching from a Radeon™ 5500 XT 4GB to a Radeon™ 5500 XT 8GB. In DOOM Eternal, the 8GB card runs the game at Ultra Nightmare settings at 75FPS (1080p), while the 4GB card can’t apply the graphics settings with that level of VRAM [1]. Looking at titles such as Borderlands 3, Call of Duty Modern Warfare, Forza Horizon 4, Ghost Recon Breakpoint, and Wolfenstein 2: The New Colossus, there is a performance improvement on average of up to 19% across these games when using the same card and increasing the amount of VRAM from 4GB to 8GB [2].”

GPUs with low VRAM might suffer more stuttering during gaming, as well as increased texture pop-in.

“When gaming with insufficient levels of Graphics Memory, even at 1080p, gamers might expect several issues:

  1. Error Messages and Warning Limits
  2. Lower Framerates
  3. Gameplay Stutter and Texture Pop-in Issues”.

AMD has been offering 8GB GPUs for the “mainstream” market since 2016, with the 8GB RX 470 and RX 480 models. But as we approach the launch of the next-gen consoles, VRAM requirements are going to rise, as more and more game developers build their games around those larger memory pools.

Today, 3GB and 4GB GPUs struggle with most AAA PC games, and even though 4GB is the recommended “baseline” these days, running out of VRAM can cause several issues.
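
For readers wondering how a game or tool actually detects this situation on Windows: DXGI exposes a per-process budget for the GPU’s local memory segment, and exceeding that budget is what triggers the stuttering and pop-in described above. Below is a minimal sketch of the query (error handling omitted, and the first enumerated adapter is assumed to be the gaming GPU).

```cpp
// Minimal Windows/DXGI sketch: watch how much of the GPU-local memory
// budget this process is currently using.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    ComPtr<IDXGIAdapter1> adapter;
    factory->EnumAdapters1(0, &adapter);          // first (primary) GPU assumed

    ComPtr<IDXGIAdapter3> adapter3;
    adapter.As(&adapter3);

    DXGI_QUERY_VIDEO_MEMORY_INFO info{};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    // Budget is what the OS currently allows this process to use of the
    // GPU-local segment; staying under it avoids demotion to system RAM.
    printf("VRAM budget: %llu MB, in use: %llu MB\n",
           info.Budget / (1024 * 1024),
           info.CurrentUsage / (1024 * 1024));
}
```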

According to the blog post, “AMD is leading the industry at providing gamers with high VRAM graphics solutions across the entire product offering. Competitive products at a similar entry level price-point are offering up to a maximum of 4GB of VRAM, which is evidently not enough for today’s games. Go Beyond 4GB of Video Memory to Crank Up your settings. Play on Radeon™ RX Series GPUs with 6GB or 8GB of VRAM and enjoy gaming at Max settings”.


Thanks, OC3D.

80 thoughts on “AMD hints that 4GB gaming GPUs are not that relevant these days, VRAM demands continue to increase for AAA games”

  1. I’d go much further than that: it’s long past time 8GB cards became budget tier. We’ve had 8GB cards in the midrange since 2015, and they’ve been present in the consumer market since 2012 or 2013, IIRC.

    1. I 100% agree but unfortunately there are a few massive issues holding us back.

      1st, almost all PC games on Steam still have a minimum requirement of 2GB for the GPU and support potatoes. As long as this continues, most companies are never going to bother utilizing anything past 4GB, because all that work went into the low end.

      2nd, you have the Switch still on the market with its insane 3.5GB memory pool, and a lot of companies will make sure their game works on that config, which boils down to a 2GB GPU budget or less. Again, we lose out, and the extra GPU memory is not utilized or prioritized on the PC scene with our ports.

      3rd, the two main consoles, PS4 and XB1, both have basically a 7GB memory pool if you take away what is reserved for the OS. This boils down to around a 4GB GPU budget, so again we can’t take advantage of all the modern GPUs, or even older-gen GPUs that have been out for a decade.

      I am really thankful the new consoles are coming out this year with their 16GB memory pools; it will make a huge impact for the PC. My thoughts are this will finally end the 2GB PC minimum requirements and finally start to take advantage of all the GPU and system memory we have had for a decade.

      Good thing the Steam survey shows most of us are sitting at around 6GB, and I am sure this September that number will only go up with all the upgrades that will happen. Really, it is a shame, since any game could offer PC-specific Very High and Ultra settings that take advantage of our PC GPUs. Problem is, most will never bother with the effort and will aim for the mainstream instead, where all the money is.

      So, in short, I don’t think what AMD is saying is true, even though I 100% wish it was, and I say this as a game dev and gamer. We really need to move forward by now and dump archaic 2GB GPUs.

      1. As a game dev in training (Unity, Godot), agreed, but we should also consider the indie point of view, which the mainstream keeps an eye on and which usually means barebones spec requirements that HD iGPUs can run. Of course, indies for the most part can’t really choke the mainstream with hardware requirements, due to their often low funding and low-poly art styles. When something like Dead Cells or Disco Elysium turns out successful, many other devs pay attention to sales rather than to eight-year-old minimum spec requirements. AAA/AA will be the main horsepower for years to come; certainly some indie or fan remakes will also hop onto the “advanced specs” bandwagon, but not enough of them to be relevant tech-wise.

  2. I’d really like to know what their test bench was for the 5500 XT. Since that card has a PCIe 4.0 x8 interface, the performance hit when you run out of VRAM is much worse if you’re not utilizing PCIe 4.0. The same is true for the 8GB version, but you’re much less likely to run out of VRAM, so it isn’t really an issue for today’s games.

    1. Here:

      “Testing done by AMD performance labs 04/5/2020 on Radeon RX 5500 XT (4GB vs. 8GB) (Driver: 20.2.2), Intel Core i9 9900K (3.6 GHz), 16GB DDR4-3200MHz, Gigabyte Z390 Aorus Elite, F7 BIOS, Win10 Pro x64 18362. Performance may vary.

      Testing done by AMD performance labs 11/29/2019 on Ryzen 5 3600X, 16GB DDR4-3200MHz, ASROCK X570 TAICHI, P1.70A, Win10 Pro x64 18362.175, AMD Driver 19.50. Using Borderlands 3 @ DX12 High, Call of Duty: Modern Warfare @ DX12 Ultra, Forza Horizon 4 @ DX12 Ultra, Tom Clancy’s Breakpoint @ DX11 Very High, Wolfenstein: The New Colossus @ Vulkan, Ultra.”

      PC manufacturers may vary configurations yielding different results. Performance may vary.

      1. So they were done on the 9900K? That should show a roughly worst-case scenario if the 4GB card was indeed hitting VRAM limits. It’s a fair result to show, though, since very few people running a $170 5500 XT have an X570 motherboard.

      2. If you are interested in GPU memory, go to the Twitter of James Stanard. He works as “Graphics Optimization R&D and Engine Architect” on the DirectX team and co-invented DXR, but now he mainly talks about GPU memory with PC and Xbox developers:

        twitter.com/JamesStanard/with_replies

          1. He talks a lot with a former Valve developer (currently at SpaceX) about Sampler Feedback streaming, BCPack texture compression, extended address mapping between GDDR and the new DirectStorage (low-level access to the SSD controller). I read a lot of your comments here and I know this is something you will like.

          2. If you want to go deeper into memory management, check the ‘Microsoft DirectX 12 and Graphics Education’ channel from the latest GDC. Claire Andrews explains how Sampler Feedback works and why it was needed to better control memory usage:

            Youtube
            “DirectX 12 Sampler Feedback | Claire Andrews | DirectX Developer Day”

            DirectX Developer Blog
            “Coming to DirectX 12— Sampler Feedback: some useful once-hidden data, unlocked”
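
            For anyone who wants the gist without watching the talk: sampler feedback lets the GPU record, per texture region, the most detailed mip level that shaders actually sampled, and the streaming system then only keeps that much resident. Below is a rough, API-agnostic sketch of the consumer side of that loop; QueueTileUpload and QueueTileEvict are hypothetical placeholders, not D3D12 functions.

```cpp
// Conceptual sketch of feedback-driven texture streaming (not the D3D12 API).
// feedback[i] = most detailed mip the GPU sampled for tile i last frame,
// or 255 if the tile was not touched at all.
#include <cstdint>
#include <vector>

struct TileState {
    uint8_t residentMip  = 255; // 255 = nothing resident for this tile yet
    uint8_t requestedMip = 255; // filled in from the feedback-map readback
};

void UpdateStreaming(std::vector<TileState>& tiles,
                     const std::vector<uint8_t>& feedback)
{
    for (size_t i = 0; i < tiles.size(); ++i) {
        tiles[i].requestedMip = feedback[i];

        if (tiles[i].requestedMip < tiles[i].residentMip) {
            // The GPU wanted more detail than is resident: load it.
            // QueueTileUpload(i, tiles[i].requestedMip);   // hypothetical
        } else if (tiles[i].requestedMip > tiles[i].residentMip) {
            // Resident detail was not needed this frame: eviction candidate,
            // which is exactly how VRAM gets freed up for other textures.
            // QueueTileEvict(i, tiles[i].residentMip);     // hypothetical
        }
    }
}
```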

            There are also 2 patents describing how this hardware streaming works on DX12 Ultimate cards with Partially Resident Textures (sampler feedback is a required feature of dx12_2). You can find them on Google Patents when you search for Ivan Nevraev. The newest one, filed in 2018, was just published last month (2020-05-14):

            US20200151093A1 – Flexible sizing for data buffers

            “A method of handling data buffer resources in graphics processor, the method comprising: establishing a pool of available memory pages tracked by memory pointers for use in a data structure”
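
            The core idea in that abstract, stripped of patent language, is a free-list of fixed-size pages that buffers grab from and return to instead of allocating fresh memory each time. Here is a minimal sketch of such a pool; the 64 KiB page size is assumed purely for illustration, and none of this is code from the patent.

```cpp
// Minimal free-list page pool: buffers acquire pages from the pool and
// release them back, so the same physical pages get reused over and over.
#include <cstddef>
#include <cstdint>
#include <vector>

class PagePool {
public:
    explicit PagePool(size_t pageCount)
        : backing_(pageCount * kPageSize)
    {
        free_.reserve(pageCount);
        for (size_t i = 0; i < pageCount; ++i)
            free_.push_back(backing_.data() + i * kPageSize);
    }

    // Hand out one page, or nullptr if the pool is exhausted
    // (a real allocator would then evict something or grow the pool).
    uint8_t* Acquire() {
        if (free_.empty()) return nullptr;
        uint8_t* page = free_.back();
        free_.pop_back();
        return page;
    }

    // Return a page so a later buffer can reuse it.
    void Release(uint8_t* page) { free_.push_back(page); }

private:
    static constexpr size_t kPageSize = 64 * 1024; // illustrative page size
    std::vector<uint8_t>  backing_;  // stands in for GPU-visible memory
    std::vector<uint8_t*> free_;     // pointers to currently unused pages
};
```

            A real implementation would of course sit on top of GPU heaps and handle residency, but the acquire/release pattern is the part the abstract is describing.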

            I hope this will be helpful to you

  3. 8GB of VRAM will remain the minimum standard for a while as 16GB becomes the recommended standard. The same will go for system memory as well.

  4. That’s right, and even at 1080p VRAM demands will not decrease in the coming years. “Optimization” only takes you so far, and today’s devs already go to great lengths to carefully optimize their content for low-VRAM cards.

    My point is that modern games are VRAM hungry for a good reason. It’s not devs being “lazy,” as many ignorant gamers sometimes believe, because it’s more convenient to think that than to come to terms with how tech and development work.

    1. Yes and no. New rendering techniques require more VRAM, that is true. At the same time, we’ve had games that looked amazing, such as Assassin’s Creed Black Flag, The Witcher 3, or Batman: Arkham Knight, that got by on 2 GB of VRAM just fine.

      1. Arkham Knight actually had stuttering issues on 2GB cards. My old 760 4GB didn’t run it at 60fps, but unlike the 770 2GB there was no stutter and it was quite playable. (Too bad it sucked.)

  5. Wrong. 4 GB HBM = 12 GB GDDR5

    reddit.com/r/Amd/comments/3xn0zf/fury_xs_4gb_hbm_exceeds_capabilities_of_8gb_or/

    1. 4GB is still 4GB no matter how fast it is; that’s why the 980 Ti aged better and runs games better than the Fury X.

  6. 8GB should be the minimum and 16GB should be the standard. I never understood why they would cap their mid- to low-end GPUs at 4GB of VRAM or even less. Let’s hope AMD makes this a priority for future cards.

    1. Bear in mind that even if a next gen console has 16 GB RAM, not all of that is available for graphics. Some of it is used for the OS, some of it is used for the engine and some of it is used for the game, level maps etc.

      For PC, 8GB of VRAM should be plenty for a few years unless you are gaming at 4K max settings, but then the sheer GPU performance needed puts you in the high-end category anyway, and more than 8GB of VRAM comes along with that.

  7. Future VRAM demands are unknown, to be certain, but given that the consoles will have double the RAM AND will use the SSD as virtual RAM as well, you can easily expect 8 GB to not be enough to max out texture settings at 1080p by the end of next year.

    1. A 12-16GB VRAM GPU and a good NVMe drive will be a must if you want to run games alongside the newest consoles at equivalent settings.

    1. Hopefully, no one gets their hardware requirements advice from the marketing team of a hardware maker.

    1. Do you really want to pay so much more for VRAM that you can’t possibly ever need for gaming during the lifetime of your card even at 4K max settings?

          1. People who bought 1080 Tis 3 years ago… I can see many people using those cards for a handful of years down the road. Great card!!

  8. Some games use more than 4 GB of VRAM simply because it’s there; the engine loads new textures without removing the unneeded textures from a previous level. But we are getting to a point where the extra VRAM really is needed, and it’s going to become more and more necessary. I would say don’t buy a card today without 8 GB of VRAM if you plan to keep it for a while.

    If you don’t have enough VRAM, the engine starts to use your system RAM, which is much slower than VRAM (rough sketch of the budget-and-eviction idea below).

    There’s one constant in PC gaming and console gaming: upgrading is never ending.
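
    To make that eviction point concrete, here is a toy, engine-agnostic sketch of a budget-aware texture cache: when a new texture would push usage past the VRAM budget, the least-recently-used entry is dropped first instead of letting allocations spill into much slower system RAM. Names and sizes are illustrative only.

```cpp
// Toy LRU texture cache with a VRAM budget: evicts least-recently-used
// textures instead of overflowing into system memory.
#include <cstddef>
#include <list>
#include <string>
#include <unordered_map>
#include <utility>

class TextureCache {
public:
    explicit TextureCache(size_t budgetBytes) : budget_(budgetBytes) {}

    // Call whenever a texture is needed this frame.
    void Touch(const std::string& name, size_t sizeBytes) {
        auto it = index_.find(name);
        if (it != index_.end()) {
            // Already resident: just mark it as most recently used.
            lru_.splice(lru_.begin(), lru_, it->second);
            return;
        }
        // Make room by evicting the least-recently-used entries first.
        while (used_ + sizeBytes > budget_ && !lru_.empty()) {
            used_ -= lru_.back().second;
            index_.erase(lru_.back().first);
            lru_.pop_back();
        }
        // "Upload" the new texture and track it as most recently used.
        lru_.emplace_front(name, sizeBytes);
        index_[name] = lru_.begin();
        used_ += sizeBytes;
    }

private:
    using Entry = std::pair<std::string, size_t>;
    size_t budget_;
    size_t used_ = 0;
    std::list<Entry> lru_;  // front = most recently used
    std::unordered_map<std::string, std::list<Entry>::iterator> index_;
};
```

    An actual engine tracks GPU resources rather than names and byte counts, but the budget check plus LRU eviction is the core of why a bigger frame buffer directly reduces streaming churn.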

    1. We’ve needed 6+ GB since at least 2016, with titles such as Rise of the Tomb Raider, Quantum Break, Deus Ex: Mankind Divided, and Dishonored 2 easily going over 4 GB at max settings at 1080p.

    2. “If you don’t have enough VRAM then the engine starts to use your system RAM which is much slower than VRAM.”

      That somehow reminds me of the GTX 970’s 3.5GB VRAM fiasco, lol.

      It started as a joke post on the GeForce Forums, in which a user showed that once VRAM usage crossed the 3.5GB mark, the card would ditch the last 0.5GB section/partition of VRAM and use system memory over PCIe instead.

      But the fact is games will STILL use the last 0.5GB portion of VRAM; only when the whole VRAM gets filled does system RAM come into play.

      1. True, and even though the last 0.5 GB of VRAM was 7 times slower than the other 3.5 GB, it was still faster than using system RAM.

        btw I bought a 970 on release and I can tell you that the performance benchmarks didn’t change after the news of the debacle with the segmentation of VRAM. It was still a very nice GPU. I did support the class action lawsuit though.

        The plain and simple truth is that Nvidia lied/misled. That can’t be tolerated.

      1. I thought about it and didn’t want to take things too far off topic.
        Besides, the 970 debacle has been discussed to death. They eventually settled the class action lawsuit out of court and paid $20 to each 970 owner, and it’s over and done with; what remains is that Nvidia will lie/mislead.

    3. And I would simplify that advice further: Don’t buy a graphics card today.

      There’s no point in buying anything before “big Navi” and the 3000 series come out. Big Navi will tell us what the real performance level of the consoles is, and then you’ll have to go bigger on PC to account for performance lost to Windows abstraction.

        1. Oh, that’s strange. Who removed all your posts? The Disqus ADMIN?

          I thought you got fed up with that “Jackyies” guy who used to stalk you day and night, lol, so that’s why you nuked your previous account!!

          1. I think it is an algorithm too, because my comments were being removed within seconds of me making them here, and there was nothing harmful or spam-like in them. I don’t make comments like that, except the comment I made on a PCGamer article on their site saying that PCGamer had really gone downhill. Right after that I got flagged as a “Low Quality” poster and my comments began to disappear almost immediately here.

      2. That happened to me for a while. I think it had to do with being rated a “Low Quality” poster by Disqus. Somehow John fixed it. I got rated “Low Quality” right after I made a negative comment about PCGamer on one of their comment sections. I don’t comment on PCGamer anymore if the mods there are that sensitive.

  9. The next-gen consoles are 100% going to raise the minimum GPU requirements on PC for new games. With a 16GB pool, usage will increase for sure; you can bet on that.

    Also, if you buy into all the UE5 stuff and the PS5 I/O, and that everyone is going to use 8K and 4K textures, then that alone will drive usage up, along with the millions and billions of polygons streaming in 🙂. We’ll see, but I would like to think it will phase out 2GB and hopefully even 4GB at the very least. Just keep in mind there is a massive number of users on PC with solid 4GB cards, so will shops want to exclude them? I doubt it.

    It will be pretty cool to see what happens, but a change will come for sure, and for the good of the PC scene.

    1. Hey, I’m still stuck with the good ol’ RX 480 4GB, haha!

      But I mostly play OLD first-person shooters, with the exception of some new AAA games, so I don’t have to worry much about VRAM getting filled. To be honest, I’m quite happy, since my entire game collection consists of old-school FPS and other indie shooters.

      I created a thread a long time back on Tom’s Hardware forums, and you can check the list of FPS games I usually play there. I have deactivated my Tom’s Hardware account though, since that forum makes me sick.

      Very unfriendly and “hostile” atmosphere. Overly strict as well.

      https://forums.tomshardware.com/threads/whats-your-most-favorite-gaming-genre-currently-playing-game-on-your-pc.3508398/

      1. Well, this is just it: what right do I have to exclude you from my game? I would say none at this point. Your GPU is basically an XB1X, which is pretty much a 580, which is pretty close to your 480. I don’t think most 4GB and XB1X users would appreciate it if we all forced them to upgrade.

        The thing is, I believe devs need to focus more on those High, Very High and Ultra settings and still allow you guys to play; that is the key. To me, the problem right now is most pick a target mainstream config where they can get the most sales and call it a day. Since I am also a gamer, it really makes me upset that most don’t bother with the high end, or they aim for the really low end (there is no range). PC ports should cover everyone, for that is really a PC port’s responsibility. Anyways, I see everything really differently and so many things drive me insane.

        Yeah, you’re right, you can easily play everything, especially old-school games. Heck, you’re fine with even Doom Eternal; just dump AA and you’re G2G. Very cool, I will check it out; I bet you love Shadow Warrior 1 and 2 (the new ones) and Amid Evil m/ 🙂 m/. Man, GOG pretty much has everything you need now for FPS! Anyways, I will check out the list in your link, thanks for sharing! Dude, don’t get me started on other forums, it’s a sh|t storm lately and everywhere. Got to say it’s been really solid here, cool that we can just talk about games and tech.

      2. I stopped posting there as well. Tom’s isn’t what it used to be. It used to be my go-to site to discuss hardware, but not for years now.

        Another site that I stopped using is ArsTechnica. They put up an article about the new Dr Who a couple of years ago. I have been a fan of Dr Who for decades and I posted my opinion that I didn’t see why the new Dr Who had to be female after 5 decades of the Doctor being male.

        I was immediately ganged up on by the Politically Correct there that patrol the forums looking for opportunities to show how enlightened they are and a mod even deleted my post.

        1. The moderators over at Tom’s Hardware are really sick and demented. That forum is very unpredictable, and some of the mods just abuse/misuse their power as well.

          It’s a shame that a tech/hardware site is filled with egomaniacs and bigoted individuals.

    2. For sure, the tech is amazing and so is that video, there is no question about it! Well, this is just it, and you can really say that about most of the tech we have had for a decade: most barely use any of it, or only a tiny amount, and most of our games are still static objects, etc.

      It would be awesome but a crazy undertaking to have a full game look like that demo, like a Witcher 4!!!!!!! Can you imagine!!!!????

      Huh, no way, I did not know it was rumoured to be that size. Well, I am sure you could use 4K textures and easily lower that size, and smaller objects could easily use 2K, same with all the poly counts, and use normals etc., still.

      Yeah, I agree, and like I said, you could optimize the heck out of that demo and make it look really close and be a fraction of the size. I think the “read in any texture size and 3D model” promise is still a ways out in terms of console hardware for full games; PC could do it on the high end, but that is another story. Part of me still wonders why they did not opt for 64GB of memory instead of a really fast NVMe drive. I still think that would have been a better combo with a regular SSD (SATA) for the next-gen consoles. Anyways, exciting times ahead, I can’t wait!

  10. Sure, 4GB is not enough depending on the level of graphics one wants, but the truth is that many games eat VRAM and offer nothing worthy in return. I hope we do end up needing 16GB of VRAM; why not, if it is something worth the amount of memory.

    1. Lol, it happened recently though. It was John’s decision to include me in DSOG’s team, since I was kind of reluctant and hesitant to accept this post.

      Actually, I have been submitting a LOT of articles as a guest user/reader of the DSOG website, but most of my topics were “news stories” in general rather than actual user articles. Such articles fall under the “News” category section, so John decided to make a new author/contributor account for me. That’s it!

      You can check the full list of articles submitted by all “users” over here! 😀

      https://www.dsogaming.com/category/users-articles/

      1. I gave this site a break for a while because the commenters were toxic and I didn’t like the writers much. You seem like a really good addition, though. Smart writing and mature comments. Glad you are here.

  11. Eh… at the moment, no. But I think next year 4GB definitely won’t be enough. I have an RTX 2060 6GB and it does the job, but I am not upgrading till the 3000 series. I ran Red Dead Redemption 2 at 1440p with almost everything at Ultra at 50fps, which I was fine with given the great graphics.

  12. One thing: AMD has almost always given us more VRAM, which is one of the reasons why their cards age better.

    YouTube: “The Mighty HD 7970 Returns, GTX 1050 Ti & GTX 680 Battle!” from Hardware Unboxed

  13. AMD has a point; Nvidia is the one with deceiving eyes if they release another 4GB abomination like the 1650 (the Super was good, but the RX 580 exists).

  14. There used to be a time when PC games were made just for PC and, years later, downgraded for kids’ consoles; now PCs have to brute-force crappy console builds ported to PC by C-teams.

  15. I have an RX 570 with 4 GB of VRAM, which I bought used simply because I was on an extreme budget and did not have much choice. I have felt that 4 GB would stop being enough ever since the R9 290 days; I recall that when the R9 390 was released with 8 GB, the increased amount of memory was the most important reason for those cards to exist.

  16. 2021: 6GB will be the minimum, 8GB the sweet spot, 10-12GB recommended, and 16GB might be released by Nvidia for a consumer GPU (for the first time?).
