By now, most of you are aware of the underwhelming hardware of current-gen consoles. And while PC gaming hardware has surpassed anything consoles could ever dream of, DigitalFoundry decided to experiment with a new budget CPU in order to find out whether it could deliver a ‘console’ experience. To everyone’s surprise, a system with a £45 CPU coupled with a £110 GPU is able to hold an almost constant 30fps at 1080p on High settings. Hell, if you pair this budget CPU with a high-end GPU, you are looking at performance greater than that of current-gen consoles, something that – in our opinion – says a lot.
DigitalFoundry used an overclocked (4.5GHz) Pentium G3258 Anniversary Edition with an Nvidia GeForce GTX 750 Ti. In addition, DigitalFoundry enabled adaptive v-sync in its half-refresh mode. This means that the framerate is locked at 30fps (with v-sync enabled) and, whenever the framerate drops below that number, v-sync is disabled and there is minor tearing on screen. This is essentially what consoles currently output in most cases.
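The half-refresh adaptive v-sync behaviour described above boils down to a simple per-frame decision. A minimal sketch of that rule (an illustration only – not NVIDIA's actual driver logic, and the function name is our own):

```python
# Sketch of adaptive v-sync in half-refresh mode on a 60Hz display.
# Illustrates the decision rule described in the article, not driver code.

REFRESH_RATE = 60
HALF_REFRESH_CAP = REFRESH_RATE // 2  # sync to every other refresh -> 30fps

def present_mode(achievable_fps: float) -> str:
    """Return how a frame is presented under half-refresh adaptive v-sync."""
    if achievable_fps >= HALF_REFRESH_CAP:
        # GPU keeps up: frames sync to every second refresh, locking 30fps.
        return "vsync_on_30fps"
    # GPU falls behind: v-sync is dropped, so the frame is shown
    # immediately with minor tearing instead of stuttering down to 20fps.
    return "vsync_off_tearing"

print(present_mode(45))  # fast enough -> locked 30fps
print(present_mode(24))  # too slow -> tearing, but no hard stutter
```

The point of the rule is that with plain v-sync on a 60Hz screen, missing the 30fps target would force frames to wait for the next sync interval and drop output to 20fps; disabling v-sync at that moment trades a little tearing for smoother pacing.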
Enjoy the video and kudos to our reader ‘Sean’ for informing us!

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and strongly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best consoles. Still, the PC platform won him over. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC Graphics Cards.”
http://www.youtube.com/watch?v=GdHKvznxLII
http://www.youtube.com/watch?v=9_b7FE3cYgg
http://www.youtube.com/watch?v=PPf6jOvij-4
Consoles = trashboxes that play indie games and have a few AAA games, of which – as Yoshida from Sony confirmed – only 4 out of 10 actually make a profit. So only a very small part of the gaming community cares about them, and the PS4 will have fewer exclusives than the PS3. I wonder if MS will up their game, since that’s all the Xbone has got going for it; they say they have 1 billion invested in exclusives.
Anyway, both consoles are weak, so they have to bring out the exclusives, or 3 years from now a $400 PC will run all games at superior settings compared to consoles.
I’ll say it again and again: this gen should have waited until Q1 2015, because the CPU isn’t better than the last-gen consoles’ and the GPU is only decent. I don’t care if it has 8GB or 100GB – there is no power in it.
It shows a number of things: how well Crysis 3 was optimised on PC, and how well budget PC equipment can play games nowadays. Granted, a joypad is hardly the best experience and locking the frame-rate to 30fps is pretty bad when using a mouse, but it is still pretty amazing that such cheap hardware can play at those settings and a solid 30fps.
Note as well that the Xbox 360 and PS3 have frame-rates from 16fps to 30fps at sub-720p, low settings (I know they’re old, but their CPUs are strong – the Xbox 360 has a 3-core, 6-thread, 3.2GHz CPU).
It runs nicely on High but as soon as you set it to Very High, performance drops. And I don’t particularly know why. It doesn’t look significantly better yet there’s a huge performance hit. The only thing I’ve noticed is slightly better lighting. And I get below 60 FPS in some areas with 2x R9 280X (on Very High).
There are some settings you can turn off via console commands for Very High, like r_silhouettePOM = 0 and r_fogshadows = 0.
Very High enables features like SSDO, SSR, volumetric fog shadows, water caustics and higher-quality grass rendering; you can tweak those too.
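For anyone who wants to try this, CryEngine games read CVar overrides from a user config file, so the two commands mentioned above could go in one. A sketch of such a fragment (the file name and placement are an assumption based on the usual CryEngine convention; only the two CVars named in the comment are used):

```
-- autoexec.cfg (assumed location: the Crysis 3 install folder)
-- disables the two heaviest Very High features named above
r_silhouettePOM = 0
r_fogshadows = 0
```

These can also be entered at runtime in the in-game console, which makes it easy to toggle each one and compare the framerate cost directly.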
I have a 4-year-old system (Phenom 965, 4GB RAM – I know, tough times). I had a GTS 250 all this time, but that old lady left for kingdom come 2 months ago. I got a 750 Ti because it was cheap, and I’m amazed by its performance. It gave new air to my PC. So yeah, with just 150 euros I now have a system that can produce FHD 60fps in most games, easily.
Yeah it’s pretty crazy how extremely underpowered those new consoles really are.
It’s a joke and pathetic. It should at least have been twice as powerful to even be called next-gen!
If it were twice as powerful, then it likely would have been nearly twice as expensive. It costs a lot of money to mass-produce/assemble gaming consoles. Remember when Microsoft spent a billion dollars just to double the RAM? Imagine what it would cost to use an even more powerful CPU and GPU.
Dude, you have to remember that a 1:1 specs comparison isn’t fair. Consoles do not need to run a full-fledged demanding OS in the background, plus the game can be optimized for a closed box, allowing the console to perform much better than an equivalent PC. Finally, most console gamers sit far away from their TVs, so console games do not need their FOVs to be super wide, which reduces the strain on resources. And believe it or not there are still a lot of us stuck with old 720p TVs so devs can get away with sub-1080p games. I’m one of them – my TV is an ancient 32″ 720p/1080i though my PC has a 1080p monitor. While I could hook up my console to my monitor, I often play split screen with friends so the physically larger screen is better for that.
PC is still my favourite platform (I spend maybe 75% of my game time on it) but as long as there are good console exclusives I will continue to buy consoles (after price drops and mid-cycle refreshes, of course – I didn’t buy my first 360 until the slim one released at a fraction of the launch cost). Of course, I would prefer to play those games on PC, but the fact is I won’t ever get to, and if I truly enjoy those games (e.g. Red Dead Redemption), then the expense on the inferior hardware is worth it. I plan to buy the Xbox One after it receives the inevitable mid-cycle refresh, just for Halo 5!
You’re stuck in the past, my friend.
“Consoles do not need to run a full-fledged demanding OS in the background”
Yes, they do. Actually the current consoles waste more memory on the OS than Windows does. They’re supposed to be multimedia hubs for casual/TV users so they have more shit running in the background than most PCs.
And yeah, image quality doesn’t need to be as good when you sit further from the TV, and the smaller FOV does save some resources. But these are minor details. Even at 900p console games have yet to produce anything remarkable this generation. The fact remains that last-gen started out at the level of a high end PC with too little memory. Now they’ve flipped that table completely shipping a low end PC with a lot of memory. This generation is very different from the last and it will not last more than 5 years. By that time not only tablets but PHONES will deliver equivalent graphics.
He has got a point, but so do you. Just removing the Kinect sensor – which may not do anything while gaming – gets you an additional 10% of power.
Yup, that’s probably the best example of OS bloatware. The Sony example would be that the PS4 is always recording in the background in case you retroactively decide you want to archive what just happened in the game. A useful feature for some people in some situations, surely, but that must put some strain on the system. Since no one has managed to turn the feature off yet, we can’t attach a percentage of performance gain to it, but there’s no doubt games could be better without this feature.
On that note, and since Tom mentioned Halo: Halo 3 introduced theater mode, which works by recording *game events* rather than a video feed. This way a long list of entire matches is stored until you overwrite them, and you can pick which ones to save long after the fact. The whole system is definitely less of a strain on the hardware, or else they wouldn’t have gone through the trouble of doing it this way. So I would be in favor of leaving recording features up to developers. If they want to deliver that aspect, they can put the effort in to do it right.
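The cost difference between the two approaches can be shown with a toy sketch (the event format and per-frame sizes are made up for illustration – this is not Halo's actual replay system): logging discrete game events and re-simulating them later is vastly cheaper than encoding a continuous video stream.

```python
import json

# Toy comparison: event-log replay (the Halo 3 theater-mode idea)
# vs. always-on video capture. All event names and sizes are invented.

def record_event_log(events):
    """Store only the game events; a replay re-simulates them later."""
    return json.dumps(events).encode()

def record_video(frame_count, bytes_per_frame=100_000):
    """Video capture pays a fixed encoding cost for every single frame."""
    return frame_count * bytes_per_frame

match_events = [
    {"tick": 120, "event": "spawn", "player": 1},
    {"tick": 450, "event": "fire",  "player": 1},
    {"tick": 455, "event": "hit",   "target": 2},
]
frame_count = 30 * 60 * 10  # ten minutes at 30fps

log_size = len(record_event_log(match_events))    # a few hundred bytes
video_size = record_video(frame_count)            # well over a gigabyte
print(log_size < video_size)
```

The trade-off is that event-based replay only works if the engine can deterministically re-simulate a match from its inputs, which is engineering effort per game – exactly why it makes sense as a developer choice rather than a system-wide feature.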
To force all users to waste resources on recording VIDEO at all times is imo ridiculous.
Indeed…
I really think the current-gen consoles are a tunnel to the 4K consoles that will arrive sometime in 2018, and those will just be upgraded systems (hardware-wise). But by that time PCs will run 4K @ 120fps easily.
If they can only manage half of 1080p/60fps now, then they sure can’t run 4K in 2018. New hardware, of course, but still. I would imagine the next gen being 1080p 60fps across the board with really good graphics; for a TV-screen experience that’s really good.
As someone who has no higher preference in games than high framerate I’m afraid you’re wrong. The mass market will never demand 120 fps, therefore games will never be optimized for it. You will have to do some serious overclocking in CPU bound games. I can’t get 120 fps in Crysis 3 on my R9 280 (low settings!) because it’s CPU bound and an i7 920 at 3.5 GHz doesn’t cut it. And then I have to listen to people on this site saying Crysis 3 is well optimized lol. It’s only optimized for GPUs, CPUs should have no trouble with it since it’s a last-gen game with small-medium sized levels and very little complexity! I mean the original game had better physics and more AI per scene / area and yet I can crank more frames out of the dual threaded Crysis than the allegedly quadcore optimized Crysis 3.
As long as developers don’t bother optimizing CPU usage beyond “it’s playable”, even enthusiasts will have difficulty running graphically intense titles at more than 60 fps. This statement may seem confusing (why graphically intense if we’re talking about low settings and CPU limitation?) but here’s what I mean: I believe Crytek didn’t bother optimizing the CPU side of things because they assumed that the game would be GPU bound for the vast majority of users anyway. They assumed that most users would shoot for the best graphics they can get at 30-40 fps, therefore making CPU limitations irrelevant for the bulk of the audience. This is what I mean by the mainstream not DEMANDING high framerate. Mainstream developers assume that players are okay with low framerates and make decisions based on that premise.
Got to say though, GTX 480 will probably burn your house down. :p
Crysis is a CPU-dependent game. When I read that the processor was running at nearly 5GHz, that explained it all.
It is like one-sixth of the price of a 4770K and meant for overclocking.