EA and DICE have revealed the official PC system requirements for Battlefield 2042’s upcoming Open Beta. This Open Beta will launch for everyone on October 8th, and pre-load will be available on October 5th.
According to the devs, PC gamers will need at least an AMD Ryzen 5 3600 or an Intel Core i5-6600K, with 8GB of RAM and an AMD Radeon RX 560 or an NVIDIA GeForce GTX 1050 Ti.
DICE recommends an AMD Ryzen 7 2700X or an Intel Core i7-4790 with 16GB of RAM, paired with an NVIDIA GeForce RTX 3060 or an AMD Radeon RX 6600 XT.
The Battlefield 2042 Open Beta will use DX12 and will require 100GB of free hard-disk space.
Below you can find its full PC system requirements.
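MINIMUM
CPU: AMD Ryzen 5 3600 / Intel Core i5-6600K
RAM: 8GB
GPU: AMD Radeon RX 560 / NVIDIA GeForce GTX 1050 Ti
API: DirectX 12
Storage: 100GB

RECOMMENDED
CPU: AMD Ryzen 7 2700X / Intel Core i7-4790
RAM: 16GB
GPU: NVIDIA GeForce RTX 3060 / AMD Radeon RX 6600 XT
API: DirectX 12
Storage: 100GB

If you want a quick sanity check against those numbers before pre-loading, a minimal sketch along these lines will do. It assumes the third-party psutil package for the RAM reading (the disk check uses the standard library's shutil), and the drive path is a placeholder for wherever you plan to install:

```python
import shutil

import psutil  # third-party: pip install psutil

REQUIRED_DISK_GB = 100           # free space the beta asks for
MIN_RAM_GB, REC_RAM_GB = 8, 16   # minimum / recommended specs

# Free space on the install drive ("C:\\" here; use "/" on Linux).
free_gb = shutil.disk_usage("C:\\").free / 1024**3
ram_gb = psutil.virtual_memory().total / 1024**3

print(f"Free disk: {free_gb:.0f} GB ->",
      "OK" if free_gb >= REQUIRED_DISK_GB else "not enough")
if ram_gb >= REC_RAM_GB:
    print(f"RAM: {ram_gb:.0f} GB -> meets recommended")
elif ram_gb >= MIN_RAM_GB:
    print(f"RAM: {ram_gb:.0f} GB -> minimum only")
else:
    print(f"RAM: {ram_gb:.0f} GB -> below minimum")
```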

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and strongly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best. Still, the PC platform won him over. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher-degree thesis on “The Evolution of PC Graphics Cards.”

Doesn’t make any sense that they require a Ryzen 3600 but, for Intel, an old 6th-gen i5.
In PUBG they get almost the same performance… the 3600 averages 103 fps and the 6600K 96 fps at 1080p with a 1080 Ti…
PUBG is an unoptimised mess that heavily favours Intel-Nvidia hardware (GTA V is the same). Battlefield V runs very nicely on AMD hardware even though it’s an Nvidia sponsored title.
just going to have to see how it runs.
https://uploads.disquscdn.com/images/b7df0053c5ab80062c471d5ef296ce2a66441be70e62e5fcb315a9c623bab2ac.jpg
♪ du du dun du dun dun … ♪
Battlefield 5.5!
Only the new Xbox Series X is close to 3060 Ti performance…
“Xbox Series X is close to 3060 Ti performance…”
It is better to compare AMD to AMD:
Xbox Series X – 52 RDNA 2 CUs at a 1825 MHz game clock
Radeon 6800 – 60 RDNA 2 CUs at a 1815 MHz game clock
With 8 more CUs, the Radeon 6800 should be about 10% faster than the Xbox. Of course, the final result can be different because the Radeon uses a 256-bit memory interface and the Xbox a 320-bit one.
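For what it’s worth, the raw FP32 math at those game clocks is easy to sketch (assuming the standard RDNA 2 layout of 64 shader ALUs per CU and 2 FLOPs per clock via FMA); on paper it comes out closer to ~15%, and the memory-interface difference decides how much of that shows up in games:

```python
# Back-of-the-envelope FP32 throughput at the quoted game clocks.
# Assumes 64 shader ALUs per RDNA 2 CU and 2 FLOPs/clock (FMA).
def tflops(cus: int, clock_mhz: int, alus_per_cu: int = 64) -> float:
    return cus * alus_per_cu * 2 * clock_mhz * 1e6 / 1e12

rx6800 = tflops(60, 1815)  # ~13.9 TFLOPS
xsx = tflops(52, 1825)     # ~12.1 TFLOPS
print(f"RX 6800: {rx6800:.1f} TFLOPS, Series X: {xsx:.1f} TFLOPS, "
      f"ratio {rx6800 / xsx:.2f}")  # ratio ~1.15
```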
Radeon 6800:
– 60 CU RDNA 2, game clock 1815 MHz
– 256-bit GDDR6 memory
Xbox Series X:
– 52 CU RDNA 2, game clock 1825 MHz
– 320-bit GDDR6 memory
The Radeon 6800 has 8 more CUs but a lower game clock and slower memory than the Xbox GPU.
mental illness personified
Lol, imagine thinking that’s the sole reason GPU prices have skyrocketed in the last 2 years. Yes, crypto has played a part in it, mainly due to ETH (which has nothing to do with either of these guys, btw, since they pump BTC, not ETH), but the bigger reasons are the chip/raw-material shortage plus inflation, higher MSRPs and scalper prices.
ETH/BTC aren’t even at their ATHs right now, yet GPU prices are still pretty high due to those other factors I just mentioned, and demand hasn’t gone down either.
https://videocardz.com/newz/amd-ceo-lisa-su-expects-chip-shortages-to-ease-in-the-second-half-of-2022
You’re pretty dumb, aren’t you? So what if they’re made out of sand? Do you think the sand just transforms instantly into these chips? There’s a whole sophisticated and complex process behind it; go search for it if you’re clueless about it.
Nah man, don’t you know? This is why AMD and NVIDIA both use Heatsinks on their GPUs. It’s not to actually cool the card, but to simply hold all the sand together in the exact right way so the card functions.
Not only that, but the recommended i7 to go up against the 2700X is a 4790. Whoever came up with those requirements is clueless.
Do you still not understand that at higher resolutions, with graphics set to ultra, the CPU requirements decrease instead of increasing?
So it makes sense to ALSO see lower CPU requirements in the recommended specs compared to the minimum!
At high resolutions and ultra graphics, the GPU does most of the work!
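A toy frame-time model makes the point concrete: each frame waits on whichever of the CPU or GPU takes longer, so once GPU time balloons at ultra settings and high resolutions, the CPU stops being the limiter. The millisecond figures below are illustrative, not measurements:

```python
# Toy model: frame rate is set by the slower of CPU and GPU per frame.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000 / max(cpu_ms, gpu_ms)

# 1080p / low: GPU work is light, so the CPU sets the frame rate.
print(fps(cpu_ms=8.0, gpu_ms=5.0))   # 125.0 -> CPU-bound
# 4K / ultra: GPU time balloons; the same CPU no longer matters.
print(fps(cpu_ms=8.0, gpu_ms=16.0))  # 62.5 -> GPU-bound
```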
Wut??? You have no idea what you’re talking about, but even if you were correct, the point I was making was how it does NOT make any sense to pair a 4790K with a 2700X, as if they were even remotely close to the same level of performance.
The 4790K is a 4c/8t part from 2014; the 2700X is an 8c/16t part from 2018. The age difference alone should be a tell, before even considering the massive gap in cores/threads. Hell, I doubt the 4790K even “wins” against the 3600, given the massive IPC gains of 3rd-gen Ryzen.
https://www.youtube.com/watch?v=BVpfvqHBu6o
I really don’t see all this difference between the 4790K and the 3600!
And the 4790K overclocks much better and easily reaches the Ryzen 3600 once overclocked to 4.8 GHz, like the one I’ve got…
Oh yes, because let’s use a “benchmark” that does not even show CPU usage, let alone frametimes, between wildly different CPUs, and say it’s ALL THE SAME. How dense can you be…
Oh yes, because Ryzen is also such a bad chip to OC, my god, be real. Most 3rd-gen Ryzens will turbo themselves to 4 GHz even with no changes from the user, assuming a decent enough cooler and case ventilation.
PS: The fact that you said “reaches the Ryzen 3600” is my point. Despite the IPC gains, the 3600 is only a 6c/12t CPU and the 2700X is an 8c/16t CPU. If it takes your CPU a massive OC to even reach the 3600, which I assume is at its base clock, it sure isn’t reaching the 2700X in any CPU-intensive modern title at base clock, let alone overclocked.
“6 cores + SMT must be plenty for anything now, up until the next 5 years”
6 cores is fewer than the slowest console – the Xbox Series S ($299) uses 8 cores / 16 threads at a constant 3.6 GHz clock. Developers always use the slowest console as the main platform.
The CPU on Xbox has a constant sustained clock of 3.6 GHz on the Series S and 3.8 GHz on the Series X. Memory access is 320-bit with 560 GB/s of bandwidth, which is more than 10x faster than on a PC with dual-channel DDR4 memory.
If you want to learn more, watch the Hot Chips conference talk on YouTube:
“HC32-S4: GPUs and Gaming Architectures”
https://uploads.disquscdn.com/images/ff2cf1636b13766c0b0bc5c15aaabbf36fd11a10f0d143f03627b30ab4e4555f.jpg
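The bandwidth gap is easy to sanity-check. A quick sketch, assuming the 14 Gbps-per-pin GDDR6 behind the Series X’s 560 GB/s figure and a typical dual-channel DDR4-3200 desktop:

```python
# Rough peak-bandwidth comparison in GB/s.
def gddr6(bus_bits: int, gbps_per_pin: float = 14.0) -> float:
    return bus_bits * gbps_per_pin / 8     # bits -> bytes

def ddr4(channels: int = 2, mts: int = 3200) -> float:
    return channels * 64 * mts / 8 / 1000  # 64-bit channels, MT/s -> GB/s

xsx, pc = gddr6(320), ddr4()  # 560 GB/s vs 51.2 GB/s
print(f"Series X: {xsx:.0f} GB/s, desktop DDR4: {pc:.1f} GB/s, "
      f"ratio {xsx / pc:.1f}x")  # ~10.9x
```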
100GB of space for a glorified demo? What a joke.
1050 Ti MINIMUM spec
unoptimized dumpster fire confirmed. next.
You mean the opposite? Because the 1050 Ti is weak trash that runs RDR2 at low settings, 900p, 30 fps?
100GB of disk space for one map?!
Wtf?!