AMD has released a new video for Godfall in which Counterplay Games details its key tech features. And, according to the developers, Godfall will require 12GB of VRAM at 4K/Ultra in order to run smoothly.
Now, as you all know, the NVIDIA GeForce RTX 3080 comes with 10GB of VRAM. NVIDIA has stated that this amount of VRAM is enough for both current-gen and next-gen games. Thus, it will be interesting to see whether the RTX 3080 will hit any VRAM limitations in Godfall at 4K/Ultra.
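For context, the amount of VRAM a game has to work with is something any Direct3D application can query at runtime through DXGI, including the OS-managed memory budget that games are expected to stay under. Below is a minimal C++ sketch of that query (primary adapter only, error handling omitted); it is an illustration of the standard API, not code from Godfall itself.

```cpp
// A minimal sketch (not Godfall code) of how a Direct3D application can check
// how much dedicated VRAM the primary GPU exposes and how much of it the OS
// currently budgets for the process, via DXGI. Error handling is omitted.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    ComPtr<IDXGIAdapter1> adapter;
    factory->EnumAdapters1(0, &adapter); // adapter 0 = primary GPU

    DXGI_ADAPTER_DESC1 desc;
    adapter->GetDesc1(&desc);
    printf("Dedicated VRAM: %llu MB\n",
           (unsigned long long)(desc.DedicatedVideoMemory >> 20));

    // QueryVideoMemoryInfo reports the OS-managed budget and current usage
    // for local (on-board) video memory.
    ComPtr<IDXGIAdapter3> adapter3;
    adapter.As(&adapter3);
    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);
    printf("Budget: %llu MB, in use: %llu MB\n",
           (unsigned long long)(info.Budget >> 20),
           (unsigned long long)(info.CurrentUsage >> 20));
}
```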
Counterplay Games will be using DXR – via DirectX 12 Ultimate – in order to add support for Ray Tracing. The team will be using Ray Tracing to enhance the game’s shadows; shadows will be the only effect improved via RT, so don’t expect better Reflections, Ambient Occlusion or Global Illumination effects.
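For the curious, DXR support is something a game can detect at runtime before enabling any ray-traced effects. The C++ sketch below shows the standard D3D12 feature check; it is illustrative only, not taken from Counterplay’s code.

```cpp
// A minimal sketch (illustrative, not Counterplay's code) of the standard
// D3D12 feature check a game performs before enabling DXR effects such as
// ray-traced shadows. Tier 1.0 is sufficient for basic ray-traced shadows.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    // nullptr selects the default adapter; error handling omitted.
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0, IID_PPV_ARGS(&device));

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                &opts5, sizeof(opts5));

    if (opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
        printf("DXR supported (tier %d): ray-traced shadows can be enabled\n",
               (int)opts5.RaytracingTier);
    else
        printf("No DXR support: fall back to conventional shadow maps\n");
}
```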
Godfall will also support AMD’s Infinity Cache, as well as Variable Rate Shading. In addition, it will support FidelityFX Sharpening on all GPUs. PC gamers can use this technique to sharpen the game’s graphics (which may otherwise look blurry due to the TAA tech that Godfall will be using).
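AMD publishes FidelityFX CAS (Contrast Adaptive Sharpening) as open-source HLSL, so the real shader is publicly available; the simplified C++ sketch below only illustrates the core idea behind it, i.e. sharpening each pixel against its neighbors while scaling the effect down in high-contrast areas to avoid halos. The function name and the grayscale-image setup here are hypothetical, not AMD’s.

```cpp
// Not AMD's actual FidelityFX CAS shader (that ships as HLSL in the
// FidelityFX SDK); a simplified, hypothetical scalar sketch of the
// contrast-adaptive idea: apply an unsharp-mask-style kernel, but weaken
// it where local contrast is already high, to avoid halos and ringing.
#include <algorithm>
#include <vector>

// 'img' is assumed to be a grayscale image with values in [0,1],
// stored row-major as w*h floats.
std::vector<float> cas_like_sharpen(const std::vector<float>& img,
                                    int w, int h, float strength = 0.5f) {
    std::vector<float> out(img);
    for (int y = 1; y < h - 1; ++y) {
        for (int x = 1; x < w - 1; ++x) {
            float c  = img[y * w + x];
            float n  = img[(y - 1) * w + x], s = img[(y + 1) * w + x];
            float e  = img[y * w + x + 1],  wv = img[y * w + x - 1];
            float mn = std::min({c, n, s, e, wv});
            float mx = std::max({c, n, s, e, wv});
            // Adaptivity: less sharpening where the neighborhood already
            // spans a wide range (high contrast), more in flat/blurry areas.
            float contrast = mx - mn;
            float amount   = strength * (1.0f - contrast);
            float sharp    = c + amount * (4.0f * c - (n + s + e + wv)) * 0.25f;
            // Clamp to the local range so no new extremes are introduced.
            out[y * w + x] = std::clamp(sharp, mn, mx);
        }
    }
    return out;
}
```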
Godfall releases on November 12th and will be a timed exclusive on the Epic Games Store. The game will not have an offline mode and will require an internet connection to play. Lastly, Gearbox revealed the game’s official PC system requirements yesterday.

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best consoles ever made. Still, the PC platform won him over, mainly thanks to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC Graphics Cards.”
What it requires is a refund.
Let us all hope that the AMD 6800 XT with 16GB of VRAM will truly be competitive with the 3080 in third-party reviewers’ benchmark tests. Only time will tell…
Too many requirements, always online, a timed Epic exclusive: more than enough reasons to skip this crap.
“For Godfall, the bulk of the game was built with 30 people. We’ve had a 36-month development cycle. It’ll be about three years by launch. In the first two years, two-and-a-half years, it was about 30 people.”
That’s from their CEO.
How are you comparing a studio of this size to Rockstar and CDPR, who have hundreds of developers?
Do you really expect the same level of optimization?
For a studio of this size and the amount of stuff on screen here this all seems about right.
How do you know it can’t run at 4K/60fps because of video RAM and not because of a lack of GPU power? Playing a modern game at 4K/60fps can be very demanding.
Microsoft Flight Simulator, for example, can eat ~10GB at 4K max settings, but even an RTX 3090 with 24GB can’t do 60fps (or even close) in that game. So video RAM is not the limitation here, because you will never run this game at any setting that consumes a lot of memory anyway.
Serves Nvidia right for being stingy with the amount of VRAM.
Seriously, the idea that they could ship less VRAM than the 1080 Ti had is idiotic. They just assumed that AMD wouldn’t catch up. Now they will surely have to put out a 3080 Ti with 16 gigs for cheaper if they want to compete.
Nah, idiots will buy Daddy Jensen’s products no matter what.
In Russia it’s compounded by the fact that the local currency has been depreciating like crazy for the last year (losing around 30% of its value against the USD), so people want to buy something tangible due to the real risk of what little savings they have turning into frog skins.
And prices on AIB models are inflated by around 20%. And you see people in the comments furious about the Nvidia GPU shortages but basically elbowing each other in hopes of getting one of these VRAM-deficient GPUs. Witnessing it sucks big time.
Funny, AMD fanboys were butthurt when Nvidia pushed Mirror’s Edge Catalyst’s Hyper settings while the Fury X lagged with 4GB of VRAM.
To all you idiots on here who were jumping down my throat for saying that 8-11GB of VRAM won’t be enough for next gen: you can go pound sand. For years I’ve watched VRAM get eaten by two factors, resolution and textures. It’s just basic knowledge that once you bump those up, 8GB cards will become obsolete; even at 1080p, in a lot of open-world titles they just aren’t enough. But it became clear why so many of you were defending this garbage: it’s because Nvidia was providing small VRAM servings. Y’all love to worship 4K and ray tracing without realizing that they will be the major factors in VRAM usage for years to come. I wish people would stop lying to themselves in order to defend their favorite brand.
Honestly, man, it’s like the whole world is upside down in hardware right now. Intel CPUs have vulnerability backdoors, AMD CPUs are expensive and powerful, and now this. I am not buying any hardware until things stabilize.
And you are right. To anyone who claims that consoles don’t affect PCs: think again. This is all happening because it’s a console transition year. In about 1-2 years, everyone on PC should know where they stand, because all the devs are doing is VRAM dumping, just like they did in the last gen’s transition. But even without that, 4K takes a lot of juice, and that’s just that. Anyone who thinks they are going to sit on an 8GB VRAM card and cruise through the generation has quite the wake-up call coming.
Your argument has merit, I will start there. The more texture RAM you have, the more textures you can cram into it. This game, however, looks like crap, and it is obviously AMD exploitation: if this crap-looking game really needs more than 10GB of texture RAM, it is clearly intentional. I could see it in an open-world game like GTA, but with texture memory so fast and so many techniques to render only what is in immediate view, I don’t see a reason why 10GB isn’t more than sufficient for most 4K games.
This is just bad programming on a bad game; at least when Nvidia was abusing PhysX, the games were actually good. I don’t really see this being too much of an issue going forward with next-gen titles, especially with tech like DLSS 2.0, DirectML and tessellation designed to mitigate this. Time will tell who is right: Nvidia opted for faster RAM, while AMD opted for size, i.e. slower RAM with a fast cache.
My god, you are a damn moron. The typical person buying a 3080 will most certainly be playing at 4K. FFS, I play many of my games at 4K using DSR or in-game resolution scaling on a 2080 Super. 10GB will hold a 3080 back in some upcoming games from using settings it could otherwise run if it had more VRAM.
“The typical person buying a 3080 will most certainly be playing at 4K.”
I wouldn’t be so sure about that. Many people buying 3080s have high refresh rate monitors, not necessarily 4K.
I have a 3080, and I play at 1080p. I do have a 4K monitor and a 4K TV as well, rarely used for gaming. For work/content consumption they’re fine; for gaming, very meh.
1. I’d rather have a high refresh rate. 4K monitors at 144+Hz are still expensive AF.
2. Most games look worse in 4K, as all I currently see (since monitors are up close) is how low the texture resolution is in many places within a game. The main character looks fine, but when you stand next to a wall and it’s a stretched 128×128 texture, I’d rather still play at 1080p. And I don’t really like playing on a TV, as latency is worse than on a high-refresh monitor.
But good luck playing at 4K if you like to. It’s just far from the norm: most people on PC are on 1080p (66%), then 768p (9%) and 1440p (7%), and 4K usage is very low, at under 1%.
I won’t guess what everyone here does. I play games in 4K, prefer it over my 1440p 144Hz display, and it will stay my preference going forward with next-gen cards and ray tracing.
AMD CPUs have backdoors too, but AMD’s market share was small, so hackers didn’t care much about looking for them. But things can change: popular hardware gets banged first. AMD cultists falsely claimed superiority; just watch.
Some games are already using 8GB+ at 2K on my GTX 1080 Ti, let alone next-gen titles.
10GB is more than enough for 4K. This is an AMD-sponsored title, and AMD are pushing 16GB of cheap VRAM on all but their bottom cards.
This just looks like an unoptimised mess again, on the Epic fail store.
cope
Spotted one of those who sold his 2080 Ti.
Lol, more than enough? It is getting pretty close to the edge right now in a few games, and that’s without any mods. It will NOT be enough down the road, and clowns like you will look foolish sometime next year as games get more VRAM-demanding. That said, turning down textures a notch and maybe one other setting will likely be all that’s needed in those few games.
Exactly how I was burned with my GTX 680 at the start of the last gen due to its lack of VRAM, even though it is theoretically a faster GPU than the one in the base PS4. I was saying the same things would happen and warned people about the 3080’s 10GB of VRAM being a limiting factor; and unlike the GTX 680, which came out about two years before the PS4/Xbox One, the 3080 is already obsolete in terms of VRAM capacity. And exactly like you, I was jumped on with stupid arguments like “Nvidia said you won’t need more than 10GB of VRAM”,
“Most of the VRAM is used only for caching”,
or the most idiotic one:
“GDDR6X is so fast it will be able to cope.”
How the f*ck does VRAM speed matter when it’s filled up?!
Hahaha, I got the “GDDR6X is so fast you don’t need size” line too when I tried to point out that 10GB might not be enough for next-gen 4K max settings.
“How the f*ck does VRAM speed matter when it’s filled up?!”
That’s just the gripe these days; people would rather argue than see the logic and the clear and present danger. I remember when Nvidia said 10GB of VRAM was enough; I said BULLSH*T, and EVERYONE said, “you’re a fool, it’s enough”. Now the new argument is “it’s an AMD-sponsored title”; never before have I heard such buffoonery, and I don’t know where to begin with that one. VRAM is VRAM, so what’s the point in market manipulation? Man, they will come up with anything to defend the garbage.
For the game’s graphics, that allocated amount of VRAM seems very high. I can fully understand why it looks like bloated VRAM usage for what it delivers, and why some see it as baloney.
You only need 4K if you’re playing on a big-screen TV, so consoles need 4K; PC doesn’t if you’re playing on a 27-inch or smaller monitor.
The GTX 680 came out in 2012, so, no.
Yeah, but it’s still faster than the base PS4’s GPU.
Nowadays its low VRAM is not the only issue: developers and Nvidia don’t optimize for the Kepler architecture anymore, so even the 700 series is left in the dust.
Ok, but there are critical differences here. The PS4 uses 8GB of GDDR5 memory that is shared between the OS and VRAM. Your 2GB card was never going to fare well against that.
The Xbox Series X has 10GB of faster memory that will be used as VRAM and another 6GB for system tasks. It’s possible PC games could get some insane texture packs, but in general 10GB of VRAM is going to be perfectly fine for gaming.
4GB is still enough in 2020 at 1080p and moderate settings. 10GB will be more than enough if you actually tweak games. The visual difference will be so minimal that it’s silly seeing people get so angry about this.
You think I play on PC to play at moderate settings? Do you want to be yelled at, or are you just here to aggravate people? You are just moving the goalposts with nonsense. C’mon man, don’t waste our time here. Ain’t nobody here talking about moderate settings when we speak of PC; we are talking maximum, top-tier, and that’s what will happen when you max sh*t out. Just stop, man.
Lol, play at moderate settings with a top-end card at 4K?
Even modding Skyrim could easily eat up all the VRAM with them crispy 4k textures.
We all know that 10GB of VRAM is no longer enough for 4K; coming this next gen, it will be barely enough even for 1080p. But see, I’m not in the business of arguing with people anymore over things that are right in your face. I have an 8GB card and I know my days are numbered. AMD finally stepped up and gave me the 16GB I wanted. Times are changing, and that’s just that.
It isn’t enough. Again I will bring up the undeniable fact that the RE Engine, for example, asks for well over 12GB of VRAM just at 1080p to run at super-ultra graphics. That’s without any ray tracing, at all, at 1080p. In fact, I’ll install RE2 right now and crank it at 1440p just for screenshots of the settings. Not that it will stop those people. I’ve been saying it since at least 2018, because it’s obvious unless you haven’t PLAYED any modern games. Games from 2019 and before were already putting the existing cards to shame, like this:
https://steamuserimages-a.akamaihd.net/ugc/1676989078610251422/5622327AF6EA4C382C0CF4218AFE1D0D74B5D904/
I have played both RE2 and RE3 totally maxed out, well above the VRAM usage shown in the settings, on a 2070 without any issues and with stable fps during my playthroughs. The VRAM figures in the settings menu are not as factual as people think. And I did both characters’ playthroughs in RE2 as well.
As always, Nvidia, the geniuses they are, give you just enough VRAM for the year so you are forced to upgrade again the next year with the next card.
It has been one of their strategies from way back with the 8800 GT.
Well, they reserve “Ultra” for the top-end card; the 3080 is just a high-end card :p
For $700 it is just… huh.
This seems an awful lot like settings designed to cripple one of the manufacturers, exactly what people hated Nvidia for doing for years.
It is the same with AMD settings being forced always ON in AMD-sponsored games like Horizon Zero Dawn, just to cripple Nvidia performance.
Fascinating, I will refund the game if it doesn’t run well on my RTX 3080.
Buying an expensive GPU with just 10GB of VRAM was a big mistake if you intended to game at 4K.
It will be interesting to see this game benchmarked on a 3080, though Nvidia cards are generally able to handle VRAM pressure better than AMD cards. Remember what happened with Horizon Zero Dawn? TPU just tested Watch Dogs Legion, and a card like the 1060 3GB ended up performing better than AMD’s 4GB GPUs.
It must really suck to have funded Nvidia’s prototypes (the 20xx series). Huge price cuts, performance boosts, and now they’re not even next-gen ready one gen later. Quit betting on Ngreedia, guys. They do not care about their customers.
The 2080 Super, 2080 Ti, RTX 3080 and 3070 are perfectly capable cards that will soon be short-lived because Nvidia decided 8 and 10GB of VRAM is enough.
An unoptimized pile of crap AND an EGS timed exclusive; they can shove it deep.
Yes, 6/8/10GB cards are being gimped, and no, this game -should- work with 8GB of VRAM, because it looks like a giant massive turd.
Turdfall
The 3080 shouldn’t have been released with 10 GB VRAM whether it needs it or not. The Big Navi will have 16 GB VRAM and you better believe that will be a selling point to the masses.
For its graphics, that amount of allocated VRAM is shockingly high. WTF have they done!?
The 3080 is not strong enough for next-gen gaming. Xbox and PS5 are the way to go for glorious 4K gaming!
What even is the point of this game? It just looks like a battle-arena slasher.
A very unoptimized Unreal 4 engine game… hard pass.
Did you expect ANYTHING less from the people involved in Borderlands?
If it does, it’s because the game is trash, and even so, it’s a 3080. The only real card in each series is the 80 Ti, and the 90 Ti if one exists, or in this case just the 90. We can see now why they axed the alleged 20GB models: because they are profiteering. People wouldn’t upgrade to the 2021-22 models if their current cards already had tons of VRAM.
I see AMD spent their dollars well, buying dev shills to promote and cater to their dogsh*t video cards. Literally, big VRAM is the only bells and whistles AMD has/had. No one cared that the “huge” VRAM on AMD cards was bottlenecked by all the other dogsh*t components on their faux cards.