Valve has just shared Steam’s GPU and CPU Hardware Survey May 2023 Results. These results are from a monthly survey that collects data about what kinds of computer hardware and software Valve’s customers are using.
As we can see, Intel remains the king of CPUs. Although most Ryzen CPUs have been reviewed favorably, the red team has not managed to gain much CPU market share. From December 2021 until May 2023, the red team has only increased its share by 2.5%.
When it comes to GPUs, NVIDIA is simply unstoppable. The green team has 76% of the GPU market share, making it the number-one choice for most PC gamers. And yes, even after the ridiculous launch of some of its RTX 40 series GPUs, the green team has still managed to retain its market share. Similarly, AMD has managed to retain its own GPU market share, and Intel is still in third place (with only 8.5%).
The most common GPU appears to be the NVIDIA GeForce GTX 1650. Thankfully, the majority of survey takers are using CPUs with at least six cores. Moreover, most of them run their games at 1080p, and the majority of the GPUs they use are equipped with 8GB of VRAM.
All in all, it becomes clear that NVIDIA has no reason to worry about the reception of its GPUs. I mean, its market share remained the same, even after the launch of the RTX 4060. And, let’s be realistic here. The gap between the green and the red team is HUGE. So, unfortunately, I don’t see AMD putting any pressure on NVIDIA in the foreseeable future. As for Intel… well… I won’t be surprised if the blue team abandons its GPU plans after the release of Battlemage.

John is the founder and Editor-in-Chief at DSOGaming. He is a PC gaming fan and strongly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best consoles ever. Still, the PC platform eventually won him over. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC Graphics Cards.”


No one is buying GPUs to begin with.
Overpriced and no one cares about raytracing.
https://uploads.disquscdn.com/images/91bb82537b7cb60648ca6e45360daf3501a3a9527ecc2e4519ce2a265ef172e2.jpg
That’s because most people are buying pre-built OEM PCs, and the majority of those are still Intel + NVidia combos, which more often than not provide better value than all AMD systems, at least in the part of the world where I live.
Going to do an article on CoD?
Lack of interesting games on PC and ridiculously inflated prices on what are essentially luxury items in a time when overall inflation is hurting the general public will do that.
Exactly. If only games were, on average, a bit better than just uninspired, weren’t half broken, and ran reasonably…
They just don’t care, man. I noticed a trend. I went back and finished some of Crapcom’s titles. Play any Resident Evil on day one or in year one and the GPU is loud AF. Play any of them now and it’s as quiet as a thief in the night. So what is that telling us? That these games might be purposely broken and hardware-heavy to make you think your hardware isn’t good enough upon release. So many people fall for this, buying a new GPU for said game. Just look at The Witcher 3: it runs quietly. The Witcher 3 with the mods that the remaster is supposed to include runs quietly. The Witcher 3 remaster RUNS LOUD AF!… Something just ain’t right here.
Just as an FYI, Witcher 3 Remastered has more than just a few mods. The whole LOD system was expanded and improved. Lighting is also improved, not to mention ray tracing and HDR. Plenty of reasons Remastered is more demanding.
On another note, I do agree a lot of games play better a year or so out from launch. I don’t think this is intentional aside from publishers just being pushy and not allowing devs more time to polish/finish before launch.
Reality check: it looks barely any different.
Thank you, I couldn’t even bother to respond, man. Nothing against the guy, but really now?.. Something is severely wrong with the remaster.
Yeah, I tried it myself when it first launched, but to be honest I can’t see much of a difference except for worse performance.
Old gen with 84 mods
https://uploads.disquscdn.com/images/9f21eafd5c959984a2e4e0d51efa3da12ff8a1f8bc3493bca8dc4ba9765a4209.jpg
New Gen – No RT
https://uploads.disquscdn.com/images/d583ac6ab4d7da344e51f47949b9a673dd84c1decb25501932674843b30986d6.jpg
Overall lighting has changed significantly, and the trees and Vesemir’s hair have more detail and better overall textures, even though the old version uses Halk Hogan’s textures.
The difference in the game engine rendered cutscenes is even more dramatic
Old
https://uploads.disquscdn.com/images/7b0347628938db72ef21cc10d3ca86e5f635bea8c05576a5ca0f67d359daa3b3.jpg
New
https://uploads.disquscdn.com/images/ba6179e93762835d86cd706ef6f23e309d751b26fe1c1a495a3a8523a3794b71.jpg
Not taking sides, but to be very honest: if companies didn’t push devs to ship games, they would keep developing the same game for over a decade. You have no idea what kinds of games devs have sitting in their studios.
The trees, grass, and overall environment were greatly improved. However, they seem to have broken grass physics with the 4.03 update, and it’s still CPU-bound because it’s not multithreading properly.
I didn’t have any problems with noise levels in Witcher 3 Remastered, and while I’m not using RT because of the CPU threading issues, I do use a mod that boosts the graphics to even higher levels (the old version had a similar mod), plus 23 other mods; it’s actually closer to 33 mods because one of them is multiple mods in a single package. I’m also locking the FPS to 60 because that’s plenty for a game like this, and that keeps the main thread from banging off 100% as much, which results in smoother gameplay.
I hear everyone say this, yet my RX 7600 hasn’t had an issue in Jedi Survivor and didn’t have any in The Last of Us, playing on the highest settings on a 1080p monitor.
It’s a bad time for Nvidia to sell overpriced GPUs. They should have been greedy 5-6 years ago. Now is just not the time, with all that inflation and sht.
Who knows? Maybe the next article will say “Sales are down 70%.” Perhaps Nvidia will finally wake up then? This BS can’t continue. Nobody here is buying cards; we’ve got two generations of cards just sitting on the shelves, it’s insane. If they were to release 5060-5090 cards, nobody would buy those either if the prices don’t change BIG time.
They’ve been getting increasingly greedy ever since the launch of the 1000 series cards. If you watched their keynote at Computex you’d realize that they’ve just been raking in too much money from their other departments (a big chunk of it being AI, of course), so they’ve been caring less and less about the gaming market.
I believe that the beginning of the GTX 1000 generation is when they started the Founders Edition bullcrap. Their own FE cards had MSRPs that were at least $100 higher than the OEM ones.
For the 2000 series they bumped up the xx70 class card by $120 ($150 for FE), the xx80 class by $100 for both non-FE and FE, and the xx80 Ti class card by $300 (by $500 for the FE!).
For the 3000 series at the beginning they at least kept the xx70 and xx80 class cards about the same as the previous generation, but introduced the xx90 series that was basically a cash grab due to being $1500. This could’ve been a great card for GPGPU users had nvidia not locked certain features that could’ve been easily enabled on their side, which they obviously did so that their workstation cards would sell more. This was also when the mining boom happened again so they were selling cards to miners through back channels while lying to us about it. A year later they also launched the 3080 ti and 3090 ti which were more high end cards designed to rake in more shekels.
Now we are in the current gen, where the general public has been slowly conditioned to accept these stupid prices, so of course that leather-jacket-wearing CEO took the opportunity to bump up costs even more. The xx90 got a “slight” bump of $100, while the already overpriced xx80 series got a ridiculous bump of $500! They were even going to launch a 12GB 4080 for the absurd price of $900, but a decent amount of backlash (plus some pressure from AMD’s cards) got them to rebrand it as a slightly lower class of 4070 Ti with a very generous price cut of $100. The 4070 got a price bump of $100 to $600. They were also so kind as to release a 4060 Ti for the same price as the 3060 Ti, which traded blows with the previous gen and consumed somewhat less power.
tl;dr: Nvidia’s been greedy for a while and has only gotten greedier over time, as the latest generation offers smaller performance jumps while costing more. THE MORE YOU BUY, THE MORE THEY SAVE! Thank you lord Jensen, you’re so caring.
The reality is Nvidia can’t afford to make more consumer GPUs right now because they need the time they pre-booked at TSMC over a year ago to make their high demand H100 and A100 commercial products
Don’t expect to see Nvidia drop GPU prices until next year after they fulfill the demand for the H100 and A100 and open up production time on TSMC
Nvidia is no different than you, they are going to take the job that pays the best EXACTLY like you do. You aren’t going to take a $20/hr job when there is a $40/hr job available. Why would you think Nvidia would act any differently than you yourself would?
Demand for the H100 and A100 is NUTS; companies are patiently waiting 6+ months for an order, and the queue is humongous, without the slightest competition xD
It’s a pure monopoly; good for Nvidia, sitting on mountains of gold.
Of course, all this insanity is due to the AI trend. Everybody is sprinting in the AI race.
I have a different take on the matter. Lots and lots of games on PC, plus emulation, mean that if you’re not a 15-year-old who only cares about graphics and thinks that a game from 2018 is ‘old’, an average to good GPU will last you AT LEAST 5 years.
Thinking of buying a second 4090, for RT, for an AMD build.
But these posers would have you believe that almost everyone is playing 4K@120Hz with RT maxed. Haha, once you change the resolution to 4K that GPU will be sounding like a Boeing 747 on a one-way ticket to hell. Loud AF!
It will be hell cus it will make ur room hot too lol!
You have posted 10 comments about how loud your GPU is. I think it’s time you repaste it my friend
Oh boy…
You facepalm, but he’s got a point. Even at a high gaming load your card shouldn’t sound like a jet plane unless there’s something wrong. It could be airflow issues in your case, dust build-up, thermal paste drying out, poorly configured fan curves, GPU voltages being too high, your room just being too hot to begin with, or a poor/faulty cooling solution on your card. I highly recommend you look into all these things.
Sure… Keep talking and don’t ask what I’m talking about. You guys are all experts, and the rest of us that have been doing this for 30 years don’t know nothing. “He’s right”, ok man.
We don’t need to ask what you’re talking about because we already know what your main point is, and there’s not much to add to it. We’re just trying to help you with the loudness issue because your GPU ideally shouldn’t be ear-splittingly loud. No need to get all bent out of shape about it lol
I’m not bent out of shape; it’s just frustrating and disrespectful at the same time that you think I don’t understand fan curves & undervolting. To add insult to injury, everyone here should know that once a GPU is at heavy load you better have those fans spinning at 80-100% if you want to keep that core under control and from going past 92-100°. But see, I know that some of y’all have your GPUs running at a constant 97° with the fans quiet and think it’s ok. That’s the huge problem here. Because if you are on air and you want to keep that GPU healthy, the fans are gonna blast on unoptimized games ESPECIALLY, and that’s a fact, no ifs or buts. But I’m “bent out of shape” because I finally had to type this. Something that I shouldn’t have to explain. But y’all wanna be difficult, TALKING about Fuqing thermal paste and god knows what when my GPUs are new and my case is huge, equipped with over 10 fans in a push/pull config.
So here I am, wasting my time explaining the obvious. My GPU stays between 54-67° with well-optimized games because of my fan curve; everything else is going to run closer to 77-82° with fans at 100%. And there is nothing you can do about that if you want to play on ultra in a lot of modern crappy games or games that are GPU-heavy to begin with. You think you’re gonna be able to turn on 4K, and undervolting along with a magical/musical fan curve will keep your GPU quiet while keeping core temps at 57-67°? NOT GONNA HAPPEN! So, please stop wasting my God Damn Time. And let’s get back to whatever it is we like to do.
Something is wrong with your system, my friend. My 2080 never went above 74° and its fans never above 50%. And I am gaming at 4K, mind you. All this in a case with just 4 case fans, and my GPU is a stock MSI Ventus, nothing special. I never have to modify the fan curves.
Maybe you create too much negative pressure? Is your GPU vertically mounted? Does it have enough space in front of its fans to breathe?
Nah, no vertically mounted GPU; all my GPUs do this. I have some in cases and some out of cases. I’ve had cases open and closed. It doesn’t do this for all games, generally games that are heavy to run. This is normal, man. I wish it wasn’t so, and some GPUs are particularly loud under load; I don’t understand why this is so surprising to anyone. If you use the factory settings on the fan curve, you’ll never hear a GPU, and that’s because they do that to make the cards seem quiet while the core/junction temp is off the charts. But once you do your own curves to keep the junction from hitting ridiculous temps, fans are going to go to 100%, and that’s loud in any language.
That’s the price you pay for keeping temps low in order to preserve the longevity of your card. Now what you guys should’ve asked is what type of card(s) I have, because some run louder than others. I just picked up an XFX 6900 XT and that is the loudest card I’ve ever used, and everyone says so as well. Even though XFX/AMD says it’s ok to run the card at 92-97° constantly, I ain’t doing that FOOLISHNESS. My Asus TUF 6800 XT doesn’t run as hot as quickly, but junction/core temps get up there as well, so you gotta cool them down before they reach the temps that cause the fans to hit 100%. My wife’s 5600 XT does the same. Back when I had a GTX 690 and a Titan X, they did the same. That’s air for ya. I have my cards set to a 72°/100% fan curve. So it’s gonna get loud regardless, because most modern games are crap and they will hit 72° right as you hit the open world on Ultra. Like I said, I don’t know why this is so surprising. But maybe someone knows something I don’t. It’s what I’ve always seen when it comes to cooling cards on air.
Oh you have a 6900xt, I should have guessed. High temps are kind of expected for this GPU within spec. Don’t worry about it. Leave it to the default curve and enjoy. The manufacturers have designed the fan curves to keep the GPU temps inside safe zones with as low rpms as possible.
By the way man you might be proud of this I got the 6900 XT last week for $605 on Amazon. I couldn’t pass that price up even though it’s not much of an upgrade over my 6800 XT. I got the 6900 xt for less than half of what I paid for the 6800 XT. Figured I’ll just throw the 6800 XT in my wife’s computer in the living room so we can have some family game nights on her machine instead of hauling up to my man cave.
Yeah, as a long-time PC builder myself, playing at max refresh rate and max resolution is bad for our GPUs. I learnt that hard lesson back in 2013 with my HD 6850, which reached 100° when playing Tomb Raider with vsync off on a 1080p60 monitor and needed to be repasted every month before it finally gave up. Nowadays I usually set it to half of my 144Hz monitor’s max refresh rate and set my own fan curve to keep my GPU at around 60°C during play sessions. Default fan curves are always the worst for me because they tend to prioritize noise reduction over temps.
Yep, that is exactly what the default fan curve does. The card cannot last/survive that way; it only makes it seem as if it’s quiet when really and truly it’s throttling its butt off. Most people don’t even know this because they never monitor their temps while playing. They think “oh my card is a beast under heavy load”, haha, it’s simple physics, man. Then you check their temps and the card is at 87-92°.
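As an aside, if you want to check this for yourself, here is a minimal sketch of a temp/fan logger you could leave running during a play session. It assumes an NVIDIA card and the nvidia-ml-py (pynvml) package, which are my own assumptions rather than anything mentioned in the thread; AMD owners would need something like the Adrenalin overlay or HWiNFO instead.

# Minimal sketch: log GPU core temperature and fan speed while a game runs.
# Assumes an NVIDIA GPU and the nvidia-ml-py package (pip install nvidia-ml-py).
import time

import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        # Core temperature in degrees Celsius and fan speed as a percentage of max
        temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
        fan = pynvml.nvmlDeviceGetFanSpeed(gpu)
        print(f"GPU temp: {temp} C, fan: {fan}%")
        time.sleep(5)  # sample every few seconds during the play session
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()

Tools like MSI Afterburner/RTSS or GPU-Z show the same numbers in an in-game overlay, which is usually more convenient; the point is simply that these values are easy to watch.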
Amazing price! Can you feel the difference between the two?
>You think you’re gonna be able to turn on 4K, and undervolting along with a magical/musical fan curve will keep your GPU quiet while keeping core temps at 57-67°? NOT GONNA HAPPEN!
No one here is saying this. We know that air-cooled graphics cards will get audible under heavy loads, but good ones that are configured well will not get stupidly loud. Look, just because you have been doing something for a long time doesn’t mean you’ve been doing it perfectly. It’s possible you’ve had poor luck in your experience, but that isn’t the case for others. You’re just flat out stating that it’s impossible to maintain good noise and temperature levels at heavy loads when it’s simply not true. There are plenty of cards with beefy air coolers, good thermal paste, and properly applied thermal pads that are capable of achieving reasonable noise levels without thermally throttling the GPU. Also, while it’s good that you’re trying to keep your cards far away from thermal throttling, GPUs are designed to handle high temps, so it’s not “FOOLISH” if some people run them in their mid-90s for the sake of lower fan noise. They’re engineered to handle high temps, so it’s fine. Now, if some of those people are having their cards thermally throttle, they should definitely look into making some improvements, but that’s a whole other story.
If you still don’t believe me or others here, you can check out reviews of graphics cards from pretty much any competent PC hardware media outlet. In those reviews you’ll also see how the same card from different manufacturers will have varying degrees of max temps and noise levels. Also, not all of them are going to sound like jet planes. For example, take a look at TechPowerUp’s review of the MSI RTX 4090 Gaming X Trio and go to the cooler performance comparison page. You’ll see a graph that shows the temps of several graphics cards vs. GPU power load, and a bar chart that shows cooler performance of several 4090s with noise normalized to 35 dBA. You’ll notice that there are differences, some as much as 11°C.
Tl;dr: you’re using your experience to make blanket statements about GPU noise and temps even though there is plenty of evidence telling you otherwise.
Remember when I told you that you don’t understand the many factors because you never asked? You just jumped the gun. Well, you just said something very important: “check the reviews”. The problem with this is that everyone has a different tolerance. Years ago I learned that I don’t have much tolerance for the fan noise on GPUs, because I’m just very sensitive to sound. Some people think it’s ok because they are just used to it. But loud is loud regardless; some just have a higher tolerance and some simply don’t care. My GPUs have always been watercooled, so I don’t have much tolerance for the “Airbus effect”. It’s just a fact that if you want your GPUs to last longer, you’ve gotta manage that core and keep them cool. In order to do that on air I have the curve set to 72-78°/100% fan speed. That’s over 3k RPM. That’s loud in any language.
I just think there are so many variables here that this argument or back-and-forth didn’t have to happen. That’s what frustrates me. I was trying to make a point and that just went right over everyone’s head because y’all were too focused on the fan noise, which was supposed to signify GPUs under heavy load. Like I said, variables… No one asked what type of GPU, which is the XFX 6900 XT, which is known to be loud AF!.. My Asus TUF 6800 XT is a little better, but a game like the Witcher 3 remaster on max will just destroy any GPU. Resident Evil 7 and Village were the same on release; now they are VERY QUIET, years later. The new culprit is RE4: Remake. I reckon that will be quiet next year. I was just trying to make a point of how badly optimized these games are.
I see, so basically what sounds audible and tolerable to others could sound awfully loud to you. If you had just said that first, we could’ve saved all this time going back and forth. It’s just that when you say something in such a matter-of-fact manner, others will take note.
Also, like I said before, we all got your point about games not being optimized and unnecessarily being heavy on hardware. We’ve all been there and we can all relate. We were just concerned about your fan noise issues since you kept saying your GPU sounds like a jet plane under heavy load. Remember man, we at DSO are a small-ish gaming community. We look out for each other.
In other words…
>I’m just very sensitive to sound
https://uploads.disquscdn.com/images/e69ed1f7f0ea1c908ec44a1bc776b1513760e72bb3bee1966a8aced9e7941058.gif
You may want to look into a better airflow case… My 3070 Ti dropped the middle fan, so I unplugged it, ran Precision X1, and set the remaining 2 fans at 70%, and I can barely even hear it, and the temperatures max out in the mid-60s. But I have a solid-side, well-insulated/sound-dampened case with 3 intake fans, 3 fans exhausting through a 360 AIO, and another exhausting out the back, and no RGB nonsense… I need a solid-side case to keep the RFI generated by the PC from getting into my test equipment and the wireless (RF) remote sensing devices I’m designing or profiling. Besides, I’ve been building computers on the x86 platform since 1988 and long ago ceased to be impressed by what the inside of a computer looks like, and since LEDs do absolutely nothing for performance, they are just a waste of money to me.
In fact it is running so well on 2 fans that the replacement has been sitting on my desk since Monday, but I decided to wait until later today to change it out and do a full deep clean of my system, which I do every 6-8 months.
Indeed, this is exactly why market share has barely budged.
This happens every so often, especially after major events: like Y2K, where everyone blew their wads in 1998 and 1999 so no one was buying in 2000 and 2001… again after the Great Republican Recession of 2008… and now because of Covid.
This is how you make money if you are smart enough… I made a ton of money in 2000 because techs were overinflated and I knew no one was going to be buying in 2000, so I sold out at the peak… In 2009, because the market was so depressed, I bought 1003 shares of AMD at $2.56 and sold them at the peak in November 2021 for $144+/share. Then I waited for the bubble to deflate (it didn’t burst this time), and last October I bought 1000 shares of Nvidia at $144 and change per share, and today it’s worth $387.70 per share.
So basically I took about $2750 in 2009 and turned it into over $387,000 today.
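For reference, the rough arithmetic behind that claim, using only the share counts and prices quoted above (and ignoring fees, taxes, and the exact AMD-to-Nvidia rollover):

$1003 \times \$2.56 \approx \$2{,}568$ (the 2009 AMD buy-in)
$1003 \times \$144 \approx \$144{,}432$ (the November 2021 AMD sale)
$1000 \times \$387.70 = \$387{,}700$ (the Nvidia position today)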
“No one is buying GPUs to begin with”
First off, false. Maybe not many are, but I’ve bought two in the last 2 months. So saying no one is, is simply false.
Next, the main reason most aren’t buying the new cards right now is that current cards already produce such good graphics that there is very little gain to justify buying a new one. The days of 60% upgrades between generations are gone. And that doesn’t even touch the fact that a dollar buys less now, while Nvidia keeps charging more and more for its cards while cutting the specs down.
The only real reason to buy a card this generation would be for the 4090 or the 7900 XTX, and very few people can afford to throw $1500-2000 at a card. So there is no real reason for the consumer to upgrade their 3060 right now.
The companies need to GTFOver themselves and bring those prices back down to earth or only new builders like myself are going to upgrade anytime soon.
The 1650 is a GTX, not RTX.
Yeah, silly typo. Fixed.
Upgrading my PC isn’t a priority when so many so-called AAA new games are little more than soy-infested, woke, Leftist propaganda masquerading as entertainment.
Well, not only that: we have some of the most powerful GPUs right now and it doesn’t matter. The games are a total mess. I went back to playing one of the most optimized games, Mad Max. The GPU doesn’t make a damn sound. Then you play any Crapcom game: on release it sounds like a Boeing 747 getting ready for a crash landing; play it now and the GPU is as silent as a thief in the night. When I hear my GPU rev at 1080p max, I know I’m about to have some problems. So my question is, who’s playing in 4K for the noise fest?
Yep, Mad Max was a supremely well optimised game from 8 years ago.
I just tried it for the first time, and what a great game! I sure missed out on it.
I wonder why we don’t see games like that anymore.
Time to undervolt.
I only upgrade to not feel left out, and to prevent my GPU temp from going over 60°.
My average GPU temps are around 60-65° (I live in a tropical area) while gaming, and that’s only after limiting the FPS to half of my monitor’s max refresh rate; if not limited, it will shoot into 70° territory.
I usually lock the frame rate to 90 and it never goes above 60 degrees. The only game that managed to push it above that was the RE4 remake.
Big hardware is on its way out for the average dude. Cloud gaming and efficient handhelds will be the future. Why download hundreds of gigs for broken games when you can just play something at 1080p/4K/60 if you have decent internet?
The end of actually owning anything is near.
Trying to buy a house since last year and can’t find a damn thing to buy. All the houses are built out of paper now and extremely expensive. You are getting brick prices for paper. One windstorm comes through and you’re FUQ’d! People can’t even find anything to rent either. Because why rent to people when you can just do Airbnb and make 3x the money a month.
owning something digitally is not as impressive as physical. own your land, house, car, bike, wife.
You know what the real energy sucker is that we can’t afford, that no one is talking about… THE CLOUD! In 10 years there will be too many data centers, and that’s what’s been sucking up most of the grid. Now we know why they’ve been pushing for the cloud: most people won’t know that’s the real problem, so they’ll just say it’s your car or the AC in your house, basically blaming it on the public while they carbon tax you to death. Meanwhile the real culprit is data centers.
We don’t have enough power to sustain this cloud garbage, so they’ll be restricting consumers/everyday people in order to take from you to meet data center demand. I was shocked when I found out what the real culprit was.
There’s plenty of untapped energy from solar, offshore wind, and water, depending on the country. France is building nuclear power plants as we speak. Nuclear fusion is “coming”. The place I work has a 2MW wind turbine on its doorstep, usually spinning.
All that is politics talk and not sustainable. Solar is straight garbage; its cost-to-sustainability ratio is horrible. Wind and water are only good in 5 places in the world, but very viable there. That stuff sounds good at a summit, but anyone with some salt knows that it’s all BS, and once you brought up wind, I’m out. Those things are the worst invention and terrible for bird life. But I ain’t gonna lie, nuclear seems to be the future, though it’s dangerous when something goes wrong. Even though the chances of something going wrong are low, when it does… there go the people of Earth. And we must not forget why we are doing this: “THE PEOPLE OF EARTH”.
I’ve worked in the energy sector for over a decade; it’s a tricky thing to assess/implement. I just know there is a lot of dreaming and political BS in order to bleed taxpayers. The taxpayers pay for the infrastructure and then they continually get billed up the hole for the service, while these fat cats get fatter.
The problem isn’t even the nuclear technology itself, because it is extremely safe when built, maintained, and operated correctly. What I worry about is the damn diversity hires that are going to be doing all that building, maintaining, and operating.
Yes, and then there is that. The problem with diversity hires goes deeper than race, s3x, or class. I know people from different races, sexes & classes that are well qualified and can’t get the job. This thing is more than just skin deep, and that’s because they claim they are trying to “rectify mistakes of the past”. The more they meddle, the worse it gets. The world isn’t fair and it never will be. On another but similar note: as soon as a black person is seen in a video game, it’s “diversity this and diversity that”. Last time I checked, as a black man I’ve been playing white characters for 30+ years in video games and I’ve never complained, but these people see one black character or something else and it’s game over. I don’t understand the mentality and I don’t like it. Because a bad game is a bad game regardless of the skin color, s3x, or “economic situation” of the character. The ones calling it out are now just as bad as the ones implementing it. But yes, I don’t want a diversity hire watching over a reactor core.
You do realize he and his friends are full-blown angry ®@c1sts, probably Z10n1sts, s©umb@gs who hate everything in life that is not ‘MURICA, don’t you?
Recently I found out about the sheer hatred they have toward anything that is not ‘MURICAN. Hell, ironically some of them are displaying the Confederacy flag lol. They are so offensive and triggered all the time; we all know deep down they are that guy
https://uploads.disquscdn.com/images/031597cd25b97d97514934d8d4be1e8a750bc1902e257feecceabe7b31a5f299.jpg
Well in regards to video games, I’ve never cared whether a character was black or female or whatever back in the day. But nowadays it’s clear when a character is what they are purely because of diversity reasons and then you get all this virtue signaling about how they put a black character in the game or whatever. That’s what rubs me the wrong way and I’ll call it out every time I see it.
Agreed. No matter how much we love renewable energy sources, the reality is that the cost to build and maintain them will be much more than traditional energy sources.
When I tried streaming games it was about 1 GB per hour, so downloading isn’t much different from streaming in total GB used.
CPU upgrades happen incrementally. A lot of people are still on older-gen Intel CPUs. If we look at year-by-year sales, we might actually see AMD selling as much annually as Intel, or even more.
That’s because of Intel. For so many years Intel was doing these sideways upgrades. They had no competition. Then AMD started to compete, and now Intel is back to moving forward again.
Fascinating, John.
So sad to see most people still gaming at 1080p. And the average gaming PC is less powerful than a console lol.
What’s wrong with playing 1080p? I’ll take frames any day over higher resolution.
Indeed. Diminishing returns on visual fidelity beyond 1080. I bought a monitor that does 1440 with a 240hz refresh and I’m perfectly happy with it.
24″ 1080p is the most common monitor resolution; a 42″ 4K isn’t that tempting when you sit 15 cm in front of your PC.
So I have an Arc A750 and an RX 7600. Everyone trashed me for buying the A750, even suggesting Battlemage wouldn’t release. Now the Arc is considered a best buy, and you people (the media) are saying they will abandon their GPU division after Battlemage.
I bought an Arc. I’ll buy a Battlemage. I may not even use it, but I’ll buy it solely to support a third party in the market.
Personally I’m sick to death of all you Nvidia fanboys.