NVIDIA GeForce RTX 4090 GPU render

NVIDIA GeForce RTX 4090 was one of the best purchases you could make in 2022

Earlier this month, a report surfaced suggesting that NVIDIA would be skipping 2024 for its graphics cards. As such, the only high-end GPU for the rest of this year (and for next year) will be the NVIDIA GeForce RTX 4090. That is, of course, unless the green team releases an RTX 4090 Ti model. Still, and surprisingly enough, the NVIDIA RTX 4090 was one of the best purchases you could make in 2022.

Now, before you pick up your pitchforks, let me explain. This is an article I've wanted to write for a while. In fact, I highly recommended the NVIDIA RTX 4090 when it launched in 2022. And, since this is the weekend, this is better than more Unreal Engine 5 videos, right?

So, let's start with the obvious. Even in 2023, this GPU is a beast. And when I say a beast, I mean a BEAST. Especially in 4K. Because, despite NVIDIA's claims, the RTX 3080 and the RTX 2080 Ti were never 4K GPUs. So, it's not like we're sugarcoating things. We always considered the RTX 2080 Ti and the RTX 3080 to be "1440p GPUs". In fact, we were a bit underwhelmed by the performance of the RTX 3080 over the RTX 2080 Ti. However, for the time being, the RTX 4090 IS a 4K GPU. Hell, even without DLSS 2, this GPU can maintain 60fps in the most demanding triple-A rasterized games, like The Last of Us Part I, Hogwarts Legacy, A Plague Tale: Requiem or Star Wars Jedi: Survivor.

What's also crucial to note is that you'll never encounter any VRAM issues with this graphics card. After all, there is no game right now that can fill 24GB of VRAM. This is one of the main reasons why going with any of the other RTX 40 series GPUs wasn't as wise a choice.

Let's also not forget DLSS 2. Since DLSS 2 is SO GOOD these days, you'd be mad not to use it in the games that support it. In fact, you can get a better image (with better framerates) than native resolution, especially if you game at 4K. Add to this DLSS 3's Frame Generation, and you get yourself a GPU that can last you for a long time.

So, for those gaming at 4K, the RTX 4090 was a blessing. Sure, you had to pay $1,600 (you could get it at that price at launch, and right now you can find it at 1,800 euros). But hey, this was the first time you could enjoy native 4K gaming at 60fps on Ultra settings. Let's also not forget that the RTX 4090 offers double the performance of the RTX 3080. The RTX 3080 launched at $699 and was universally praised for its price. So, given the performance gains, the RTX 4090 came close to that price/performance ratio.
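
To put that "came close" claim into numbers, here's a minimal back-of-envelope sketch (in Python) using only the figures quoted above: the RTX 3080's $699 launch price as the baseline, and this article's rough "double the performance" assumption for the $1,600 RTX 4090. These are illustrative assumptions, not benchmark data.

```python
# Rough price/performance sketch using the article's own figures.
# "relative_perf" is normalized to the RTX 3080 at launch; the 2.0 for the
# RTX 4090 is the article's "roughly double" assumption, not a measurement.
cards = {
    "RTX 3080 ($699, 2020)": {"price": 699, "relative_perf": 1.0},
    "RTX 4090 ($1600, 2022)": {"price": 1600, "relative_perf": 2.0},
}

for name, card in cards.items():
    perf_per_1000 = card["relative_perf"] / card["price"] * 1000
    print(f"{name}: {perf_per_1000:.2f} performance units per $1000")

# Approximate output:
#   RTX 3080 ($699, 2020): 1.43 performance units per $1000
#   RTX 4090 ($1600, 2022): 1.25 performance units per $1000
# i.e. under these assumptions the RTX 4090 lands within roughly 15% of the
# RTX 3080's launch price/performance ratio.
```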

But what about 1440p owners? Why should they pay $1,600-$1,800 when they could get cheaper GPUs? Well, now here's where things get interesting.

At 1440p/Ultra, the RTX 4090 remains the undisputed champion, allowing you to game at really high framerates. And that's without using any upscaling techniques. Since we won't be getting any new high-end GPUs in 2024, this means that for three whole years, you'll be gaming at blazingly high framerates. And even then, the RTX 4090 won't become obsolete when the RTX 5090 comes out. Given the power of the current-gen consoles, we can safely assume that the RTX 4090 will last you FOR A LONG TIME, especially if you game at 1440p/60fps.

But wait, I hear you. Why shouldn't I get an RTX 3080 now for $500 and then, in 2025, get an RTX 5070 Ti for, let's say, $700 (I'm speculating on the price here)? That's $1,200 vs $1,600.

Well, here is the thing. If you get an RTX 3080, you'll be getting less performance in all games. Yes, you only paid $500. But if you add to that the price of your next GPU, you're looking at a total cost that is close to the RTX 4090's. Right now, we also don't expect the RTX 5070 Ti to match the performance of the RTX 4090. Hell, even the RTX 5080 may not be able to beat it. The gap between the RTX 4080 and the RTX 4090 is too big. And if you think that NVIDIA will offer you a huge performance jump, you're delusional. That, or you haven't been paying attention to the rest of the RTX 40 series GPUs.

What this ultimately means is that an NVIDIA RTX 4090 will remain more powerful than an RTX 5070 Ti. So, instead of paying the price of the RTX 4090 upfront, you'll end up paying almost the same amount of money for GPUs that cannot match the RTX 4090's performance. Even if you go for the RTX 5080 (let's say it's $1,000 and not higher, though you know it will be higher), you'll end up with a GPU that may merely match its performance. For almost the exact same total spend. And the downside is that, in all the previous years, you'll have missed out on the games that could not run well on the RTX 3080. And, let's be realistic here. When you get an RTX 5080, you won't be playing CP2077 Overdrive or the path tracing mods of older games. Not when The Witcher 4 and other triple-A games are right around the corner.
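
For transparency, here's the same back-of-envelope arithmetic for the upgrade paths discussed above. The RTX 5070 Ti and RTX 5080 prices are the speculative placeholders used in this article, nothing more.

```python
# Total-spend comparison of the upgrade paths discussed above.
# Future prices are the article's speculative placeholders, not real MSRPs.
upgrade_paths = {
    "RTX 4090 in 2022": [1600],
    "RTX 3080 now, RTX 5070 Ti in 2025 (speculative $700)": [500, 700],
    "RTX 3080 now, RTX 5080 in 2025 (speculative $1000)": [500, 1000],
}

for path, purchases in upgrade_paths.items():
    print(f"{path}: total spend ~${sum(purchases)}")

# Approximate output:
#   RTX 4090 in 2022: total spend ~$1600
#   RTX 3080 now, RTX 5070 Ti in 2025 (speculative $700): total spend ~$1200
#   RTX 3080 now, RTX 5080 in 2025 (speculative $1000): total spend ~$1500
```

The totals land in the same ballpark; the difference, as argued above, is what you were actually playing on during the years in between.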

And I know, this is speculation (regarding RTX 5070Ti’s price). However, one of the reasons I’m writing this article is so that I can re-visit it when that GPU arrives. Think of it as a bet.

There’s also something else you should keep in mind. Although the RTX 4090 is overkill right now in modern-day games at 1440p, it WON’T be in games that will release in two years. PC requirements are constantly going higher and higher. So, if you’ve invested in a powerful GPU, you’re basically future-proofing yourself for the “more advanced” games. Common sense, I know. Who would have thought that?

Now, the ONLY reason someone would be willing to upgrade from an RTX 4090 (when the RTX 50 series releases) is FOMO. FOMO stands for "Fear Of Missing Out". And this is one of the most ridiculous reasons I've ever seen. Performance-wise, you won't have to.

So basically, you can skip the entire RTX 50 series and still have amazing performance in all triple-A games. Hell, you might be able to skip the entire RTX 60 series if most games utilize DLSS 2/3. That is obviously at 1440p and not at 4K. Ultimately, you’re looking at a GPU that can last you 8 years. This is crazy. NVIDIA produced a beast of a GPU in 2022, and that was a really rare thing!

230 thoughts on “NVIDIA GeForce RTX 4090 was one of the best purchases you could make in 2022”

  1. Best GPU, which most gamers will never need 😁 especially when AAA game quality is in free fall!

  2. What in the actual f*ck is this article John?

    Yes, the rest of the Ada stack blows chunks because Nvidia released GPUs as a higher class than they really were in order to overcharge the naive consumers. I’m looking at you 4060 and 4070.

    But the 4090 is still a stupidly expensive GPU and it isn’t worth the money for anyone not gaming on 4K with really high FPS expectations. How many gamers is that? Very, very few.

    1. How about future-proofing yourself instead of constantly upgrading? You do know that in two years PC requirements will get higher (so the RTX 4090 will no longer be overkill). So you get the benefit of high FPS now, and in two to three years you get 60fps in newer titles (for the same amount of money you'd spend if you were constantly upgrading).

      1. That's also not how it works, John; the highest jump in requirements comes when a new console generation releases.

        I'll be totally fine with my RX 6950 XT this entire generation, and the same goes for my CPU.

        People who are actually intelligent about future-proofing know their hardware and don't buy PC parts every two years.

      2. “Future proofing”, chortle. My 3080 has lasted close to three years, but it isn't future proof; nothing is: even diamonds can burn. Also, until Nvidia tried this ridiculous pricing structure, it was always (MUCH) more cost effective to upgrade to mid-range regularly than to go for less frequent high-end purchases. This is simply because the rate of devaluation of expensive tech products is much higher than that of lower-priced products. As an example, a £250 i5-13600K outperforms the previous year's £550 i9-12900K, the £200 i5-12400 outperforms the previous year's £550 i9-11900K, and so on.

        1. If you already have an RTX 3080, there is NO POINT AT ALL in going to an RTX 4080 or RTX 4090. In a way, you did future-proof yourself when you got it (as you can skip the entire 40 series generation).

          For owners of these GPUs, the best strategy is to stick with them until the RTX 50 series comes out. After all, the RTX 3080 can still play most games at 60fps at 1440p.

          This article is mainly aimed at those who wanted to upgrade.

          1. You're missing the context of my previous upgrade history: GTX 970 (£330 in 2015) to a used GTX 1070 (£310 in 2016) to a used GTX 1080 Ti (£400 in 2018) to a new RTX 3080 (£650 in 2020).

            Each of those steps nearly doubled my performance every 2 years for close to the same spend, adjusted for inflation. Continuing that trend, I should have been able to buy a new RTX 4090 for £725 in 2022. At the present rate of “progress”, my RTX 3080 will be 6-7 years old before I upgrade.

          2. Each of those upgrades represents maybe a 50-60% perf boost at best, not "near doubled". Following your upgrade timescale and path, that would make the 4080 the next option for you. Pricing of a used 4080 next year, leading up to or after the 50-series release, will be in your range.

            Personally, I would not upgrade these days unless I'm getting a 100-150% perf boost.

          3. Yeah, OK. The 1070 from the 970 was about 50%, but only one year later for me (the equivalent of doubling in 2 years). The 1070 to 1080 Ti was in the same generation, but still a 65% leap. The 1080 Ti to 3080 is about 65-80% depending on resolution. The doubling in performance was more like every 2.5-3 years. 3 years would have been 970 to 1080 Ti (a 120% jump). It is now 2.5-3 years since my 3080 purchase, so where is my 4090 at £770? The 4080 would have needed to be £725 (adjusted for inflation) at launch in 2022, not £1269. I might have considered it at that price, though probably still declined.

          4. “4080 would have needed to be £725 (adjusted for inflation) at launch in 2022, not £1269.” – And therein lies my point above about competition pricing. While the 7900 XTX did come in around £100-200 cheaper depending on the model, it has fewer features, lower RT performance and uses a lot more power, while performing around the same level as a 4080 in raster.

            If it had come in at £799, it would have affected the whole market's pricing; no one in that price bracket would even have considered Nvidia as an option at that point.

            The 7900 XTX can be had for £970 new today; that's the closest you're gonna get. We won't see 4090-level performance at £770 anytime soon, as it remains an uncontested halo product. The problem with AMD is that they are the underdog while pricing like they are an equal.

          5. You've still got DLSS to help, assuming AMD does not continue to block it in their sponsored titles. I personally used it a lot when I had a 2080 Super; I skipped the 30-series and the mining boom entirely, and DLSS gave my GPU extra life.

      3. The RTX 3090 was only 10-15% faster than the RTX 3080 (for double the price). You’d have to be an idiot to get it. That’s not the case with the RTX 4090. The RTX 4090 is around 30-40% faster than the RTX 3080.

        I seriously don’t expect the RTX 5070Ti to match the performance of the RTX 4090. Again, compared to previous gens, the performance gap this gen is too big.

        1. The RTX 4090 is between 30-40% faster than the RTX 4080 actually. In my country, the lowest RTX 4080 is at 1200 euros and the lowest RTX 4090 is at 1600 euros. So basically, the price difference is 33%. You’d have to be crazy to go with the RTX 4080 with these prices.

          PS: If the RTX 4090 is available for 1600 euros in Greece, I’m certain you can find it at even lower prices in other places.

          1. Exactly. The problem is Nvidia is overcharging on all GPUs. That doesn't make the 4090 a good buy. The 4080 shouldn't be $1,200. It's just Nvidia being Nvidia. Of course they push you to a higher tier and make it look like a good buy. What's wrong with the 7900 XTX at $1,000? Even that is overpriced.

        2. I mean, yeah, it's faster, but that's because both the 3090 and 3080 use GA102 dies, while the 4080 uses AD103 and the 4090 uses AD102. They're literally pulling the same sh*t they did back with the 10 & 20 series, because they're buddies with TSMC and can pull in better profit margins.

      5. The only future-proofing someone could do, John, would be to get a 4090 to play at 1080p60 for years to come (maxed out with RT and DLSS 2 Quality, lol)

    2. He’s gone man. John is out to lunch. No bringing that guy back. I can’t believe what I just read.

      1. Right at the moment when I thought nothing could be worse than bloating your site with UE5 articles, here he comes with the most horrible article this site has ever known. And it comes in the worst era, where all AAA games are average at best and ported to PC with a pipe wrench. Fcking embarrassing

        1. Bro, I couldn't have said it better. I had no words. But you just said what needed to be said, all in one. Because this article is pure trash. This needs to be recalled. Where's that corrupt FDA when we need them?

        2. You're still here, commenting and driving his site's metrics. Keep whining, it helps John get money. All while you're still miserable and without a job.

          1. Thanks for letting everyone know that not only do you not have a job, but you're also John's b*tch.

          2. Either you don't know how to read or you're just dumb; you're the b*tch here, not me. And considering how you reply to everyone with this wannabe sarcastic tone, you have a hell of a lot of time and must be the one who doesn't have a job. But anyway, I like to give attention to people like you so they don't kill themselves; that's my good move for today. Next time you will just be blocked

          3. You aren’t what you are, everyone else around you is! Nice projection buddy. An even better move for your day would be for you to get a job and stop disappointing your parents.

          4. There is no hope for you i guess, that’s not your fault dude, that’s your father’s fault, he should know that one doesn’t marry his sister and make mongoloid babies with her, but that’s too late, now the mongoloid baby has a phone and started spreading sh*t on the internet…

          5. Slow down on the “I looked them up on the internet!” Insults. You need to space them out. Oh yeah, and you’re also a total low IQ goat fekker.

          6. Exactly my thoughts lmao. Poors and parasites in this shithole.

            Anyway, back to gaming at 240hz in 4k on my 4090.

        3. “all AAA games are average at best and ported to PC with a pipe wrench, fcking embarrassing” – that's not his fault, and not Nvidia's either; separate issue.

          1. Recommending the most expensive GPU to brute-force sh*t ports with raw performance is something a reasonable person wouldn't do. With such artificially high prices and a context of countless bad ports in a row, this is not a good time to recommend a $1,500 GPU at all. It's like complaining about a traffic jam and searching for a proper, definitive solution, and some dude out of nowhere comes and tells people, "hey, have you even tried helicopters??"

        4. Tbh it's not even that surprising; the number of times John has praised the optimization of some ports when said ports are widely regarded as bad by the community is crazy. Hell, even the TLoU port, which the devs came out and acknowledged as a problem, John said could be better but was otherwise fine.

    3. I think it is all a matter of perspective. The price of the 4090 may seem expensive to some people, but many people can afford to spend $2,000 from a single salary, not to mention rich people.

      IMO, if you're an adult and you can't afford to spend $2,000 on your hobby once in a while, then it's a good idea to really rethink your life, because we aren't talking here about buying a house or an expensive car, but just a GPU at a fraction of that price. If someone buys a 4090 and uses it for the next 5 years, it's like paying $33 a month.

      1. I’m sorry, but anyone buying this GPU doesn’t have responsible spending habits.

        This card's price is not worth it whatsoever for a gaming PC; if anything, you can build a whole new gaming PC with that amount of money.

        1. People have different hobbies, so I’m not just talking about the GPU prices. 2000$ isn’t exactly pocket money, but it’s also not extremely expensive, as some people try to make it out to be.

          The RTX 4090 is overpriced compared to other GPUs, but if you like to play games at higher resolutions, then buying this GPU makes perfect sense. The RTX 4090 will easily run games for the next 5 years (because it has an insane amount of VRAM and it's also overpowered). IMO, every adult with a decent job should be able to afford to spend $33 a month on their hobby. If people cannot even afford that ($33, the equivalent of two large pizzas 🍕), they should be motivated to do something with their lives.

          I can remember dreaming about my first PC because, with the job I had, I would have to work all year just to save up enough money for it. Luckily, instead of crying and blaming everyone but myself for my miserable life, I was ready to do something about it. I rethought my life, learnt the new skills needed for my new job and soon bought my dream PC.

          1. The fact that you believe you need an RTX 4090 to play at higher resolutions is just sad, and shows how far into NVidia marketing you've fallen.

            Unfortunately for you, NVidia has a track record of dropping GPUs the moment a new generation releases. I expect my RX 6950 XT to surpass the RTX 4090 in the coming years, going by that track record.

          2. You're free to believe what you want; I've only worked in the sector for the past two decades.

            It's not my money on the table, that's for sure. My computer will last this entire generation running games maxed out, and I didn't need to spend €2,000 like you all claim.

          3. Ah yes, the old "I've worked in xyz for years" excuse in the comments section. Great work.
            Nobody has yet claimed you NEED to buy a 4090 to play high-res games. All of this discussion is about being able to leverage technology like DLDSR to get superb picture quality in the latest games with max settings if you don't have a 4K display.

            As an ultrawide fan, I'll note there is currently no ultrawide OLED monitor with 2160p of vertical height, so being able to use DLDSR on our QD-OLED panels at 5160×2160 is amazing, all whilst having the immersion and image that only OLED on ultrawide gives for gaming.

            For those who have a bit of disposable income, it's a nice way to secure the next 5 years or more of gaming performance, and at minimum the next 2 years, since the RTX 50 series isn't coming out until 2025 if rumours hold true.

            People buy high-end cards because they can, not because they have to. It sounds like you can't, so you're making excuses as to why, and then insulting those online who do. That's a you problem, and no one else is to blame.

          4. Kobi you have zero clue what you’re talking about. Stubborn pride in your ignorance just makes people laugh at you more.

          5. This is not even a matter of expense. Most people have enough common sense to see it’s a poor value proposition and will wisely allocate that money elsewhere.

        2. I make over 70 grand a year. I bought a 4090 and still have over 13k in savings. Get a real job, and then you can join me in laughing at you, as if you know what a “responsible” purchase for any individual would be.

        3. “anyone buying this GPU doesn’t have responsible spending habits”

          Based on what? Your opinion. Yes, it’s a lot of money for one component, but it still blows every other card out of the water for content creation, which is a very valid reason to buy it.

        4. Not everyone's use case is always gaming. I bought a second-hand workstation with a Threadripper inside for cheap, and could have gone the route of buying 2 Quadro cards, which is what's supported for that Lenovo system (and which would have been way more expensive, easily 4 thousand bucks if I went with something like a 4500 or 6000 series just for the cards!), but went with the 4090 because it's way cheaper relative to the Quadro cards, will be relevant for way longer, and is more practical for my use case, which is teaching 3D modeling in Blender. Yes, I was looking at a 3080, but I buy components once a decade. So in the end I got a system that I spent like $2,200 on, which brand new would have cost at least $4-6k with native Quadro cards.

          1. That argument is outside the scope of this topic; John is clearly aiming solely at gaming here.

            So I remain accurate in what I said; going off-topic to try and find valid arguments against it is a waste of my time.

          2. I don't think it's outside the scope of the article, because people who develop games often use gaming hardware because it's cheaper. The title of the article never mentioned gaming, and I acknowledge that, while of course the primary use case of the 4090 is gaming, it's not its only use case, which is why professional software also supports it; there is some cross-pollination, because gaming cards are so much cheaper for creatives in visdev or concept art who don't want to spend money on Quadro cards. I agree the cards are wildly expensive for the high turnover of gaming, but for other edge cases they make more sense. Sure, it's outside the scope of gaming, but like, we're talking about spending money on expensive graphics cards here, not games.

        5. My guy, anyone who is middle class or higher (read: a lot of people) can easily spend 2,000 dollars or a lot more on their main hobby. Whether or not it is "worth" it, the market outcome will decide. Nvidia set this price because market research showed them enough people had the disposable income to think it "worth" buying. They were right.

          1. The only people buying this crap GPU are people with low tech knowledge and more money than they need, and the Steam statistics just prove it.

            Pretty much the only intelligent people getting the RTX 4090 are those buying it for professional workloads, not games.

          2. Idk man, I think you're just broke asf and keep whining about other people getting a card you can only dream of…

          3. You're free to keep assuming; I could buy this crap GPU for my entire household if I wanted to.

            Not to mention my GPU has been performing better in most 2023 games so far, and it only cost me half the price.

            NVidia has been struggling hard with all the AAA releases of 2023; good luck in Dead Space, Hogwarts Legacy and Star Wars Jedi: Survivor.

            So try harder.

      2. High end cards were never upwards of $1100/1200 until the pandemic hit and Nvidia sought an opportunity to artificially inflate costs.

        Expecting this pricing trend to continue in a post pandemic market with looming recessions is foolish.

        1. I guarantee you this pricing continues for the remainder of this decade. In fact, the 90 card is going to go up $100-200 every gen, and the nvidia 7060 ti will be $700 after sales tax. There are a myriad of reasons, but in short nvidia doesn’t need mass sales for the indefinite future.

          The only way prices fall is if a third company, be it Intel or a wild card, truly enter the market and compete and I think one will in 2028-2030 as the margins will be too juicy to pass up.

          1. Uh, you don't follow much tech, I can tell. The 7900 XTX performs very well vs this card and there is far too much evidence of this, and atm you are paying $919 USD for the card, so $650-900 less than the 4090. If you want RT, fine, go get a 4090, but to this day RT is as useless as a rock is to a starving man. I challenge anyone to show me their gaming list: how many games on it have RT, how many don't, which games they play most, and whether they even turn it on. Most will say no and have maybe 2 or 3. It's a huge hit even on the 4090. Y'all lost it.

        2. the plandemic pozzed logistics and manufacturing costs you absolute r*tard. everyone, even your prostitutes raised prices. chip manufacturing became 5x more expensive AMD smoothbrain

      3. $2,000+ for a whole PC is OK, but for a single GPU, big nope. Even though I can afford it, I just don't feel it's right to pay that much for something like that.

      4. Your happiness should be your only goal. If you worry too much and don't want to spend money to make yourself happy today, your body can get sick very easily and then you will have to spend your savings on expensive treatments, etc.

        We live in the LOA Universe (all rich people know this), so if you put your wellbeing first (and do everything that makes you feel better) you will attract good circumstances (good health and of course money 💵). If you do what everyone else is doing (work hard and ignore your happiness), then you will end up like everyone else (poor, unhappy, and sick).

        Yes, I'm arrogant because I like to tell people the truth and not what they want to hear.

        1. Yep better to just spend your savings on frivolous toys and let the taxpayer worry about keeping your sorry a*s alive. /s

      5. Enjoy life while you are somewhat young. It's the boomer gen who wanted to save and only spend at the age of 80, when life might as well be over

        1. Yep better to just expect the working-class taxpayer to take care of you because you didn’t bother saving any money to take care of yourself. Typical zoomer take.

        3. Better economies for sure, and being straight-up lucky to be born in those times. My grandparents owned a home and two cars at the age of 20.

          And they never went to any schools or "worked" harder.

          If they had been born in the times we were, they would live in a small cube apartment without those cars, and most likely not together.

          Millennials and Zoomers will suffer the consequences of the boomers and Gen Xers destroying the economy.

      6. You do realize that in every country the currencies are different and the economies are super different, yet the prices are basically the same? Not to mention $2k on a video card, LMFAO. It's just stupid no matter what

    4. What if you are one of the people looking to do that exact thing and play 4K 120Hz? Is it still stupid? I put it to you that for such high-end use cases the 4090 is an excellent product, especially when compared to the ridiculous $2,000 3090 Ti released less than a year prior. I think what we are actually mad about is the disappointing value of the rest of the 40 series.

    5. Yea, but there isn't going to be another choice for 2 more years. The point is, even though it was stupidly expensive, it was still better to buy it new and have a top-of-the-line GPU for 3 years. Buying it next year would be even worse, and there aren't going to be any other options till 2025.

    6. My monitor is a QD-OLED 3440×1440, but I game using DLDSR at 2.25x, so 5160×2160. This alone makes a 4090 a compelling choice for gaming even if you have a 1440p display; this way you get crisp, naturally anti-aliased rendering with amazing performance, while retaining your 1440p display's refresh rate, in my case up to 175Hz.

      The price is high, but it's justified. The high prices of the 2080 Ti/3090 were not justified given their relative performance, which the lower models were only a few % shy of. Even now, nothing else comes close to a 4090.

      A legitimate halo product comes at a price.

          1. What does this have to do with the price? The price shouldn't be based on the over-the-top, unnecessary and specific way you chose to play games. You all foolishly fall for Nvidia's trick of massively overcharging for the 4080, so you can say "with a bit more I can buy the 4090".

    7. I think the article might seem a tad naive now with the 3000 series still alive, but it is going to age very well.

      IMO, the 5000 series nvidia gpus are going to be even worse than the 4000s in performance gain, the 6000 likely too. As a result the 4090 in 2022 would have been a very smart buy come 2027-2028, assuming you’re willing to lower some settings later on.

      1. LOL I got a good laugh out of this one. I don’t know how you keep finding these hilarious robo-GIFS. Keep up the good work!

  3. Shame we see AMD sponsoring/bribing developers to hold back RT and skip DLSS/XeSS when we have such powerful GPUs from Nvidia. Anti-consumer behavior at its best. AMD would be better spending their money developing PC GPUs rather than push console/mobile trash to the ignorant.

      1. The proof so far is the multiple AMD sponsored titles and AMD taking the no comment stance. Where have you been the past two years?

  4. 3080/3090 (12GB or Ti) = 4070 Ti for 1440p
    The 4090 has 2x the performance, hence it would be preferred for 4K, as 4K is 2x 1440p

  5. Pretty happy to have got one at launch, and for £1,699 for a base-level 450W card too. Getting 3 full years of 4K gaming out of 1 card thanks to new DLSS modes definitely gives a smug feeling and helps validate the purchase.

    1. I’m talking mostly about its release date window. It’s still a great purchase, but right now you’ve lost one year of smooth gaming (and its price remains the same, it hasn’t dropped below MSRP).

  6. Future-proofing myself by avoiding crap games with expensive requirements. Also, what's the power supply requirement for an RTX 4090? The same as the mid-tier cards? If not, better add more money to this purchase so you can enjoy Shaniquas at glorious 4K RTX

    1. Lovelace is the most efficient arch Nvidia has made. 4090 only needs 310w to hit 2700-2800mhz depending on game…

      1. Tbh, even with all of it maxed out I can hit 3045mhz on core at 450-480w in CP2077 with PT, but the fps gain is minimal, its just not worth it.

  7. John, you’ve lost the plot man. You are way off the reservation at this point. You won’t be invited to any pow wow at this point.

  8. The same thing is said every generation and every time it ages like milk. You yourself claim that 2080 Ti and 3080/3090 aren’t 4K cards now although owners of such cards have been shouting that they’re playing 4K Max Settings all day every day for years. Nvidia wants you to buy the RTX 5090 and they’ll do everything to make sure that happens.

    1. The same way the RTX 4090 is an "8K GPU" (because it runs some games at 30fps)? We have the benchmarks and the numbers; right from the start we were stating that the RTX 2080 Ti and the RTX 3080 are "1440p GPUs". Now, if some people wanted to make these GPUs seem like "4K GPUs", well… that's their problem.

      As I've said already, if someone owns an RTX 3080 (and isn't idiotic enough to demand that it run games at 4K/Ultra/60fps), then he can EASILY skip the entire RTX 40 series. Similarly, if someone gets a 4090, he can easily skip the RTX 50 series. Even if the RTX 5090 is double the performance of the RTX 4090 (it won't be, but let's assume such an extreme scenario), the RTX 4090 will still be able to run all games at over 60fps at 1440p until the 60 series comes out.

        1. It was “advertised” as a 4K GPU, but it wasn’t. RTX 3090 was also advertised as an 8K GPU, remember that? 😛

  9. So you're telling me that the 50XX series, which we know NOTHING about, will under-deliver because of "reasons", even though Nvidia is allegedly skipping a whole year between generations? Do you not see the contradiction in this? Wouldn't that make the performance jump from the 40XX to the 50XX even bigger, let alone the technology jump?

    1. Someone will be really disappointed if they expect that the delay is so that Nvidia can offer a bigger performance jump. Bookmark this story so we can see what happens 😛

      1. I'm not saying that the reason for the skip is just a bigger performance jump, but you're naive if you think a year's difference won't result in a bigger performance difference, even if it's not intentional; that is just basic logic.

        1. They never said a year's difference, only a different year… It will likely only be a 4- or 5-month delay

      2. It's got more to do with market conditions than anything else… Just because they delay the release and manufacturing doesn't mean it's going to be improved over a release a few months earlier; it more likely means they are expecting better market conditions, and the design will be exactly the same whether they released it in October 2024 or February 2025

  10. John, a 4090 costs 1,670 euros and the 4080 1,160, and I am thinking of getting the 4080. Is this a mistake?

    1. No, the man is an NVidia shill; no one in their right mind thinks the RTX 4090 is a good GPU, due to the price alone.

      1. The question mainly is which is worth more in terms of years-until-you-need-to-upgrade per money spent. If the 4070, for example, can last 2 years but the 4090 6 years, you can say the 4090 is better value

        1. Who decides how many years a GPU will last, you?

          I guarantee you my RX 6950 XT is going to last this entire generation.

          And it only cost me half the price of an RTX 4090.

      2. For gaming only, probably not, but for professional applications it's a steal

        A $1600 4090 will smoke a $10,000 A100 in every aspect except memory speed and bandwidth, making it very desirable for small startups developing AI, and for engineering companies to use for design and simulation applications where you don't need to string several of them in parallel and a single unit will do.

    3. Don't get it, it's that simple. That price difference is 44%. The RTX 4090 is 30-40% faster than the RTX 4080. So if you believe that at that price the RTX 4080 is a great deal, so is the RTX 4090. However, if you could get the RTX 4080 at 1,000 euros… well… that would be a completely different story.

      1. That's a fairly dishonest claim. The 3080 & the 3090 were using GA102 dies. The 4080 is using AD103, whereas the 4090 is using AD102. Completely different dies, and that's why you see a performance difference.

        1. That has more to do with the node change from the Samsung 8+nm node to the TSMC 4+nm node.

          Last gen, the pro line and the consumer lines were on 2 different nodes. Last gen, the 3080 and 3090 used the same dies, and the A100 and other professional cards used a different die from TSMC.

          This generation they are all on the same node, and the 4090 is really the same die as the H100 uses. The 4090 was really a way to perfect the manufacturing process and yields for the H100, and a way to actually get something back by selling them in the consumer market. The 4080 is a completely different die; it's not a "cut down" die like the 3080 was, which is why the numbering scheme is different.

          Your argument is based on ignorance of how IC manufacturing and the numbering schemes work.

          1. Another thing most reviewers fail to take into consideration with the price changes this generation is that it costs about 50% more per die on the TSMC 4+nm node than it did on the inferior Samsung 8+nm. Add in an inflation rate that hasn't been seen since before many of you were even born, and there is no way to get 20-30% performance gains at the same price as last gen.

            Are GPUs overpriced? … A little, but not nearly as much as most of you believe they are, because you don't understand the actual price increases in the manufacturing process and are used to 4% generational (2-year) inflation instead of the 13%-15% we saw this generation. Throw in another 50% increase in die manufacturing cost because of a more expensive node, and prices are going to be higher, and nothing will change that.

          2. That still doesn’t excuse the lack of performance of the 4080. AD103 is closer to 104 but you’re still paying closer to flagship price. You will perform this mental gymnastics over the idea of it being okay to pay flagship prices for midrange dies and thus Jensen will forever Buck Break you hand over fist.

          3. Ok define the difference between a 103 die and a 104 die

            Hint: It changes with every generation and node change

          4. I know the differences, just like how I know the 4080 is just the 4090 Mobile. Do you have the 4080 or 4090? The amount of d*ck sucking and excuses to pay 1200 bucks for a mobile chip is quite fascinating.

      2. With the 4090 I would definitely need a new PSU and probably a whole new PC to take advantage of it, which is why I am thinking of the 4080. Also, there are 0 reports of 4080 cables melting, which is reassuring.

    4. Thanks for your extended reply. I agree with your points; surely a 4090 would require a full PC upgrade in my case (I have the 3700X), even if I am gaming at 4K.

      BTW, currently I am using a 2080, which with "optimized" settings and DLSS Performance can run games at 4K 60FPS (barely) in pretty much any game.
      The reason I am looking for an upgrade is that I don't think my GPU will last, performance-wise, until the 50XX series.
      And if I am going to spend on a new 40XX series GPU, it is better to do it sooner rather than later, to enjoy it more, since there are usually no significant discounts as time passes.

  11. Got my new PC last month with RTX 4090. And I agree, that card is definitely a BEAST. Should last me for quite some time in 1440p.

  12. No GPU at that price is a “best purchase”. Certainly not for gaming. Perhaps worth it for non-gaming use if you use a GPU for that. Otherwise, this article is out of touch, and I say that as someone in the top 5% of earners in the world. I don’t upgrade GPUs until I get double the performance of what I have for the same price I paid for the card being replaced (adjusted for inflation). I went from a new GTX 970 (£330 in 2015) to used GTX 1070 (£310 in 2016) to a used GTX 1080Ti (£400 in 2018) to a new RTX 3080 (£650 in 2020). At the time, I felt the 3080 was an excessive purchase and felt guilty spending so much on something just to play games on. Now I am waiting for 4090-level performance for £770 before I upgrade my £650 RTX 3080. I am anticipating that I will be waiting until 2026/2027 until that happens and my GPU will be 6-7 years old by then. As shown above, my previous upgrade cycle was once every 18 months to 2 years. True stagnation of performance/price.

    1. It's not out of touch; it's relevant in comparison to all other GPUs on the market based on price-to-perf/features. It's expensive, no doubt, but comparatively it's better value than the rest; that's how bad the GPU market as a whole is today.

      1. You've lost your mind even further than John; no one talks price/performance when it comes to NVidia, they lost that battle a decade ago.

        1. “that's how bad the GPU market as a whole is today”
          So you think the 1080ti launch price was bad?
          You think the 7900xtx launch price is good?

          1. You bet. The RX 7900 XTX retails for €900 while the RTX 4080 retails for €1,200; adding insult to injury, the RX 7900 XTX smokes the RTX 4080 by over 20FPS.

            So if you're asking me whether AMD still has the best price/performance ratio in the industry, my answer is an absolute yes.

          2. Yeah, so the overheating stock version was released at 900, while the AIB models with decent heatsinks came in at 1,099+. A flagship that matches the 80-class card from NV in raster only.

            Also, I know a lot of people hate this list, but here are the things that come with 40-series cards (regardless of whether you like them or not):

            Stable Drivers
            4th gen tensor cores
            DLSS2
            Frame generation DLSS3
            DLDSR
            DLAA
            3rd gen RT cores (inc. OMM Engine + DMM Engine)
            Shader Execution Reordering
            AV1 Encoding
            Reflex
            RTX Remix
            NV Freestyle
            GDDR6X (with huge OC potential)
            Better efficiency, Lower wattage + temps
            Chrome Super Resolution video upscaling

            If AMD had released the XTX at 799 (with the savings made from moving to MCM), the market would be very different right now. Just because it's cheaper does not make it better value.

            In the end they are both badly priced, but this is what happens when NV's top tier is left uncontested; it sets the pricing for everything beneath it. NV shifted their entire product stack up by one due to lack of competition; add to that, NV is still sitting on the full AD102 die, with no incentive to release it.

            Honestly, with Intel and Battlemage coming into the mix next gen, it will be for the better. Also, I think RDNA4 is going to be a lot better than RDNA3 (a beta-test arch); there's not enough competition atm.

          3. It's clear you're an NVidia fanboy; you don't need to waste my time with nonsense. There's no overheating to speak of, and stable drivers? I assume you've been sleeping under a rock and haven't noticed the CPU overhead issues NVidia has been having, where the RTX 4090 falls behind the RX 7900 XTX by over 40FPS.

            Then, all the features you mentioned are present on AMD, be it with alternatives or otherwise.

            The fact that you believe NVidia has better thermals and efficiency makes it very clear at this point that you have no knowledge whatsoever in this area; one of the strong suits of AMD hardware is undervolting.

          4. Not a fanboy, I just see the reality of the situation is all. Because the 4090 is uncontested, the 7900 XTX/XT, 4080, 4070 and 4060/Ti are all trash for what they are priced at.

            "Then, all the features you mentioned are present on AMD, be it with alternatives or otherwise." – Oh yeah? List them.

            Um, the 4090 demolishes the 7900 XTX when power-limited to 250-300W; the entire 40-series lineup is far more power-efficient.

            “RTX 4090 falls behind the RX 7900 XTX by over 40FPS” – pls send the links on this one, I am genuinely intrigued.

          5. Just stop, OK? You've made it very clear you don't know what you're talking about, and I suggest you look around DSOGaming itself; I'm not doing your homework.

            To close this ridiculous crusade of yours, I can confirm right now that the RTX 4090 runs at 415W under load and the RX 7900 XTX runs at 340W; even the RTX 4080 has higher power consumption, despite running slower than the RX 7900 XTX.

          6. You didn't refute anything I said, which is funny. You're uninformed and cannot read, it seems.

            Firstly, you're completely wrong about power and perf; this is from an AMD-sponsored title, lol:
            https://uploads.disquscdn.com/images/9eb69fb9cfbdb627b4b7a4af930d2bda80db328caf8585449e64fc2162667fd6.jpg

            And that's at 100% PL. If the 4080 were at 70% PL it would be closer to 220W with a few fps lost, maybe none if you're not totally tech-illiterate and know how to frequency-curve tune in Afterburner; hell, even the auto-tune button does it for you.

            You do not seem to know what "power limiting" or "efficiency" means. Here's a recent example of my 4090 running Hogwarts maxed out:
            https://uploads.disquscdn.com/images/23e0d5fb8d2539bcb21f2390eb3c99b6ba25d58113d4fea3a0d7d8f0e50799f9.jpg
            I've power-limited it to 70% = 310W max in games and curve-tuned it so that, depending on load, it hits 2700-2800MHz+, which is better than stock, and it still decimates the 7900 XTX at 340W, or 500W+ OC, in perf per watt by a big margin.
            (The specs of my build are in the CPU-Z in my display name)

            If I wanted to, I could push the 4090 to a near-locked 3045MHz on the core, but that would be around 420W+ with the PL at 133%; not worth it for the smaller fps gains.

          7. You can keep trying to lie and bring nonsense; no one is taking your crap over the reputable sources that are available all over the Internet.

            As stated, I'm not doing your homework for you; you already proved, 3 comments ago, that you have no idea what you're talking about.

          8. Lol, I already did your homework for you; you're just not able to refute any of it. Not one point refuted. What a joke.

          9. You haven't done anything besides embarrass yourself; as stated, I won't be doing your homework.

            Anyone with knowledge of hardware knows you're wrong, and everyone else is a click away on Google from seeing that everything you stated is nonsense.

            I'm not going to waste my time with a fanboy; it's as simple as that.

          10. You don’t know what power limit or efficiency means even when broken down with examples lol. This conversation is pointless. Keep coping though.

          11. Super Resolution now works in VLC and makes a huge difference in my library of archived TV shows, which were recorded from a dual TV tuner and saved in 720p. They looked pretty good on a 1080p display, but when I moved up to 1440p they weren't looking so good anymore. Playback in VLC with Super Resolution brings them back to life, which also means my expensive graphics card is being used for more than just gaming, and that raises its value to me

          12. I’ll have to give it a go. Tbh I personally have always used MPC + MadVR to upscale my 1080p content.

          13. You would just be better off sourcing some higher quality versions of those shows, when possible.

      2. If people continue to purchase at ridiculous prices, then this will perpetuate the problem. This is how capitalist economics/supply and demand works. This is what Apple did, and why £/$1,000+ smartphones are now "normal", whereas flagships used to cost about a third of that.

        1. If there is no competition in that tier, pricing will continue to go up. AMD could have disrupted the market with the MCM design but decided to follow suit in terms of pricing, whilst still being slower and having fewer features.
          The 4090 is an uncontested halo product; that by itself will demand a high cost.

          1. AMD f*cked itself this gen. It had a golden opportunity here. Instead, it decided to follow NVIDIA’s example. And then people wonder why nobody buys AMD’s GPUs.

          2. Nvidia has made the 4090 uncontested by downgrading the XX80 and below tiers (badging them a higher tier than they are AND increasing prices). Previously, the XX60 and XX70 tiers offered the bang per buck that no longer exists.

          3. They didn't downgrade anything; they shifted the whole product stack up by one because of a lack of competition. They are still sitting on a full AD102 die (4090 Ti) with no reason to release it.
            If AMD had the 7900 XTX come in at around $799 (around what it is worth, imo), the market would be different.

          4. Shifted the product stack up by one AND increased prices a tier too. That’s the equivalent of being two tiers out of step with what they did previously in terms of performance/price.

          5. I am not arguing in favor of their pricing model, I am simply telling you why they did it. Again, if the 7900 XTX, which has fewer features and less performance than the rival's top-tier card, had come in at $799 (due to MCM savings), it would have affected the market as a whole; instead, they copied Nvidia.
            It's no surprise GPUs are gathering dust on shelves today.

    2. If you're a top-5% earner in the world, I'm not sure why you wouldn't have a top build basically yearly, lol.

      1. You're getting ripped off, because the tech will never mature properly if you spend like an idiot. At that point you're just there for the clout.

      2. I'm in the top 5% because a) I live in a wealthy country and b) I am not dumb. Because I'm not dumb, I don't squander my money, and I don't support horrible consumer tactics. Why would I spend double what I did on the 3080 (i.e. 200%) to get a 40-50% performance boost (4080), or 300% to get a 100% performance boost? And why should I be expected to do this three years after I purchased my last card? Too little a gain for far too much.

        1. I don't think the article suggests you should upgrade. It talks about people who want a new gaming PC.

    3. I think John has lost his Fuqin mind. Like you said, "no GPU is worth that". Because next year there will be one with double the performance for a better price.

      1. I think the point is that Nvidia are “skipping next year”. In contrast, I am skipping 6 or 7 years…

  13. No John, it's not. It's because you bought one and use one, and now you're trying to hype yourself up. Some news for you: in a couple of months the RTX 4090 Ti and the RTX 4080 Ti are coming.

    1. Then he'll own the 4090 Ti, with 10% more power at 50% more cost. Then he'll be trying to convince you of how great this buy is, while crapping on the remaining 4090 owners. Can't wait to hear how the 4090 Ti is a better buy than the 4090. Smh

  14. In my country, Argentina, the 4090 is around 2,200 USD. Our average wage is like 400 USD. So yeah, if you don't eat for 6 months you can buy one 🙂

  15. It's worth it if you're a 4K high-refresh-rate gamer, or you play a bunch of RT games and you've got money to burn. It delivers what you're looking for and then some. I was hesitant to purchase it (upgraded from a 3080 Ti) but I don't regret it. It leaves me in awe every time I use it. But if it's not in your budget, don't bother. Just lower some settings or play at a lower resolution and leave the FOMO behind. 1080p60 (on a 1080p screen) is still a fine experience.

  16. Why didn't you mention the 3090? I am still using my 3090 and I have no problems at 4K. Most of my games I play maxed out, or at least at high settings, in 4K. I skipped the 4090 because, with my 3090, I could not justify the upgrade. Also, weren't people complaining about the 4090 starting fires at launch? Oh wait, I did have a 3rd reason I did not upgrade, and for me it was the most important one: no EVGA 4090s.

    1. No bro, you are old news. If you're not using the latest and greatest, you're not in Johnny's club. As soon as new hardware is released, John is on top of it like flies on a very seductive piece of manure. I can't believe you came here with your 3090 and thought you'd get respect, huh 🙄😀

    2. Anyone who has a 3080 or 3090 shouldn't even consider upgrading. You did well skipping the entire 40 series. The RTX 3090 was overpriced for what it offered, though. This doesn't mean that if you have one you won't be able to enjoy the latest games.

      1. Overpriced, definitely, when you can currently get better performance for $800 with a 4070 Ti, use about half the electricity, and not have to worry about spiky transient power draw of almost 3 times the rated power. The transient power behavior of the 3080 and 3090 makes them problematic, forcing you into even more inefficiency because you have to oversize your power supply, which means it runs outside its efficiency curve unless you are pushing it hard. Running outside your peak efficiency curve means your 90+% efficient power supply is only 80% efficient, and you are just burning money.

  17. Lol, buying the xx90 card (of whatever series it happened to be; in this case the 4000s) at the peak of the GPU price-gouging period is categorically the worst purchase you could have possibly made in at least the last 20 years

    1. It's crazy, but again we see Nvidia called out over price while we only see AMD called out over garbage upscaling and poor-performing RT. I guess you get what you pay for… can afford.

    2. Oh, by all means, since the 5700 XT is the best GPU per dollar, go ahead and get it. Or get the GTX 1060 with its 6GB of VRAM. That second chart is priceless, it truly is. Just don't come here complaining about their VRAM issues or the fact that they can't run the latest games at 60fps. If you can't see the problem with this graph, then you deserve the sub-par performance you'll be getting.

      As for the second chart, the RTX 4090 is CPU-bottlenecked at 1440p. Those CPU bottlenecks will go away in two to three years, when games have higher GPU requirements. So, instead of buying a new GPU, you'll keep the same one with great performance. Blasphemy, blasphemy I tell you. How can this be? This is madness 😛

      1. Lol, nice try with the cherry-picking. Given how poor a value proposition the 4090 is, practically any other GPU in those charts would be a better option. Worst case, one can get the RTX 4070 for $1,000 less and get an identical experience to the 4090 using feature sets like DLSS 2 and 3 and lowering a couple of settings here and there, while also saving a ton on energy costs, getting far lower heat output, and worrying less about the GPU fitting in a case.

  18. I bought a 3080 Ti at MSRP from the EVGA queue in late 2021 during the shortage, and in hindsight it was one of the worst purchases you could make. I could comfortably afford it, but comparing it now to the 4090 at nearly the same price, it stings.

    1. Early 2025, most likely in February when their new fiscal year begins. With the depressed consumer markets, it's better not to have them "on the books" at the end of a fiscal period like they normally have been doing. Plus, they probably have most of 2024 booked at TSMC for the production of H100 units and don't want to book time in Q3 and Q4 for a new consumer line.

      They may also be switching over to an Intel foundry in AZ, and are probably going to take a wait-and-see approach on whether that will be feasible by then or whether they'll have to stick with TSMC and wait another generation

  19. “PC requirements are constantly going higher and higher.”

    True that, John.

    Meanwhile, where are the games that have better physics than GTA IV (2008), better AI than FEAR (2005) or were as groundbreaking in their technology as Half Life 2 (2004) or Crysis (2007)?

    We all know why system requirements keep going up, and it’s not because of better games. It’s because more and more hackjobs get hired, people who haven’t a clue what they are doing.

    1. You nailed it. Higher requirements have always been a crutch for the less talented among developers. And it’s made worse by constantly tacking on useless graphical bells and whistles to sell the game to the mindless sheep who are mesmerized by visuals, at the expense of advancements in technology that would actually improve the gameplay.

      I remember a time when the developers who mattered were salivating at the prospect of advancements in tech that would allow for impressive AI, only to see that opportunity squandered just to shoehorn more unnecessary and performance-taxing visual garbage into games. It’s the same battle sound engineers have been fighting for years, trying to get a little slice of those CPU cycles to better the game’s audio design, only to see priority given to whatever whiz-bang graphical fad of the year the hardware gets choked down with instead.

      Same sh*t, different year.

  20. The 4090 is a gong show of a product. No doubt about it. It’s comically overpriced, and rather underwhelming in terms of quality control.

    That being said, there are still people who will invest in it. Why? Who knows? That’s up to the individual. To some people out there who spend 10+ hours a day on their computer, whether gaming, streaming, creating videos, or photoshopping, the price may be worth it.

    Is it worth it to Joe Blow gamer? Not even remotely. The 4090 is a grotesque monstrosity, both in price-to-performance and in simple physical appearance. I’ve seen toaster ovens smaller than these f**kin things.

    In any case, I just think that at the core of the entire article, John is saying that whoever bought a 4090 is set for a few years to come. They clearly shouldn’t even consider upgrading to a 50 series card.

    Four years before 60 series cards are hitting us. Four years for $1600? $400 a year for a top-performing card with headroom? It’s a heavy price, but it’s not exactly as crazy as some people in here may think.

    I’m sticking with my 2080 and just praying that Nvidia, AMD, or Intel do it right next generation. As far as I’m concerned, f**k the entire 40 series.

    1. You, sir, are someone that read the article for what it was. I’m one of those people that spends hours and hours a night editing, so spending 4090 money gets me to bed earlier. I didn’t buy it to game at all, and I do plan on running this card for 4-8 years. It renders out fully edited, 60-second-long 4K videos in 18-22 seconds. A 5000 series or 6000 series card would have to be sub-10 seconds for me to consider changing. At that point I will have paid a few hundred dollars a year to use a 4090, spread out over the next 4-6 years. Look at it this way: would you pay $1 a day to have a 4090 in your PC for the next 5 years? (The rough math is sketched after this thread.) Absolutely worth every penny for certain people like me. I think creators and editors ARE the people mostly buying these. Time is money. I edit 20-60 videos a night and the savings are palpable.

      1. Ya, tbh, this is what I was getting from what he was saying.

        At first, when the 4090 came out, I was kinda the same as everyone else, thinking, “Man, if anyone buys this thing, they are stupid…”, and so on. But after thinking about it, I can see why some people would see the value in it. Namely, people like yourself, into productivity. Render time is very, very valuable, no doubt, and I can see you getting your money’s worth out of a 4090 over the next few years.
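
      As promised above, a back-of-the-envelope sketch of that “$1 a day” framing. The figures are the commenter’s own estimates ($1,600 card, roughly 5 years of use, 20-60 renders a night); the time saved per render is a placeholder assumption, not a benchmark:

      ```python
      # Amortizing the card price with the commenter's own numbers; the
      # per-render time saving is a placeholder assumption, not a measurement.
      card_price_usd = 1600
      years_of_use = 5
      cost_per_day = card_price_usd / (years_of_use * 365)
      print(f"~${cost_per_day:.2f} per day over {years_of_use} years")  # roughly $0.88/day

      videos_per_night = 40              # midpoint of the claimed 20-60 videos
      seconds_saved_per_video = 60       # assumed saving versus an older card
      minutes_saved_per_night = videos_per_night * seconds_saved_per_video / 60
      print(f"~{minutes_saved_per_night:.0f} minutes saved per night under these assumptions")
      ```

      Whether that per-day cost is worth it obviously depends on how much your editing time is worth, which is exactly the commenter’s point.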

  21. LMAO… I had a hard time just scanning through all the BS. Just WTF, man 🤣
    Johnny boy has clearly lost all sense by now!

    Mental institutions are calling him!

  23. If you have the money and the power supply and are willing to pay the electricity bills, sure, the 4090 beats everything.
    But for the length of this console generation, a 6800 XT or 6950 would be your best buy in terms of price, performance, and VRAM buffer.

    1. Hands down the best buy. I just got a 6900 XT for $550 on an Amazon sale out of nowhere. I didn’t need it, because I already had a 6800 XT that I paid 2.5x the damn price for. Still, I couldn’t pass up the 6900 XT. Undervolted, it’s running like Usain Bolt on crack. I just threw the 6800 XT in my wife’s PC for the living room.

  24. I have a 3080 Ti, but only because my PC randomly died and I had to get a new one. I’d still be using my 980 Ti otherwise.

  25. F*ck VRAM, John, it’s overrated, morbidly overrated, mostly by the AMD cultist crowd, smoothbrain journos and NPCs.

    12GB is more than enough, but 8GB is fine too if you’re not a 4K fetishist virgin.
    But only on Nvidia cards. AMD cards have abysmal memory lanes and they don’t even pass over some of the load to the CPU, so naturally, dated AMD hardware can only function with absurdly high VRAM.

  26. Who in their right mind pays more than $1000/€1000 for a single PC component to be used solely for gaming?!

  27. Da fck is this? This site has descended into madness: a $1500 GPU to play the weakest AAA entries of the decade, while 90% of them are bad ports.
    The UE5 articles were stupid, but at the end of the day they’re harmless, unlike this article, which is just inciting people to make dumb purchases and show devs that YES!! We’re here to buy the most expensive hardware so you can keep doing sh*t ports that stutter every 3 seconds.
    The most demanding games are also among the worst ones. If you don’t have sh*t taste and you’re not a modern AAA junkie, YOU DON’T NEED THIS SH*T OVERPRICED CARD.

  28. A $1599 starting MSRP is not what I would call the best deal. Nvidia could have sold it for $1199.

    If people keep buying GPUs at the prices they were during COVID, Nvidia and AMD will keep raising prices more and more… What’s next? In 2025 people will say the 5090 at $1799 was a great deal… come on, that’s more than the average PC gamer’s entire PC is worth.

    1. This is what Juan wants, so he can make a commission from Nvidia. Juan is a clown 🤡😂

  29. A damn graphics card has nothing to do with moral or cultural decline (decadence), only with a decline of some money in your bank account.

    Or, if you were smart like me and invested in Nvidia last October when the price bottomed out, you could just sell 4 shares (in my case, those cost me $488), take the $1720 they would net you, buy a 4090, and still have money left over.
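
    Spelling out the arithmetic in that comment, using only the commenter’s own figures (no market data of mine):

    ```python
    # Restating the commenter's figures only; nothing here is market data.
    shares_sold = 4
    purchase_cost_usd = 488     # what the 4 shares reportedly cost last October
    sale_proceeds_usd = 1720    # what selling them reportedly nets now
    gpu_price_usd = 1600        # the 4090's launch MSRP

    cost_per_share = purchase_cost_usd / shares_sold    # = $122 per share
    leftover = sale_proceeds_usd - gpu_price_usd        # = $120 left over
    print(f"Bought at ~${cost_per_share:.0f}/share; selling nets ${sale_proceeds_usd}, "
          f"leaving ${leftover} after a ${gpu_price_usd} 4090 (before any taxes or fees).")
    ```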

  30. As an early RTX 4090 owner, this article brings both a smile and some doubts.

    I feel the RTX 4090, despite its price, was the only real option. I was originally going for the RTX 4080, but the lack of performance and the price difference made me change my mind. It was either AMD or the RTX 4090, and since I use CUDA acceleration at times, the RTX 4090 it was.

    However, I agree it’s fast, and even my Ryzen 9 5950X has trouble feeding it at times, becoming a CPU bottleneck, especially at 1440p.

    However, 8 years of lifespan? I will have to say no to that. Maybe the 5000 series will not be that much faster, so it won’t be worth replacing the 4090, but I will not keep my card longer than the 6000 series at the longest.

    1. In gaming with RT, the 5800X3D with CL14 B-die memory bottlenecks the 4090 at 1440p and 4K in my tests…

  31. Clickbaiting much? Trying to get some interactions by posting stupid posts, because this is not even considered an article by any metric. How is anything Nvidiots has released worth the amount of money they want? Why buy a card that consumes so much power that even the engineering group responsible for the plug had to revisit and completely redesign it? Huh, the “you’re plugging it in wrong” lame excuse, Apple-style, to cover design flaws. Who the hell is happy with frame-manipulation software running directly in the drivers, with games actively supporting benchmark manipulation? You stupid people are the reason the market has been sh*t since the 1080 and Titan. Don’t forget the “4070 8GB is a steal for the price” article, or how the RX 6500 was so revolutionary at 1080p and the 128-bit bus is perfect on modern cards in the $300-400 price range.

  32. I get the whole point of the article; however, it’s valid only for the small fraction of people who really need that kind of power. The price is just out of reach for middle-income folks who game 5 to 10 hours a week.

    I picked up an RTX 3080 for $500 on Amazon and couldn’t be happier playing all my single-player FPS games on a 4K/120Hz 75″ TV. It’s a profound difference over the R9 390X I was using. RTX 3080 prices are wildly all over the place, between $500 and $1100 USD, but at $500 it was the best thing I could get for the money.

    If I had splurged on a 4090, my wife would have thrown me out of the house.

  33. People without good PCs will always trash the latest technology. Nothing new, simple-minded people as usual.

  34. I got a 3070 Ti and it’s actually ENOUGH for all sorts of games, lol. Heck, to buy a 40 series card you gotta be a dumb f*k; it’s so overpriced that it’s just dumb.

  35. I’m with you. I got my 4090 and my son got one; by god did we have to put in some overtime to get them, and the price hurt a lot. But no regrets now, it’s a fantastic bit of technology, truly mind-blowing, like when I got my old Voodoo 3 all over again. I haven’t met a single person with one who wasn’t extremely happy with it.

    1. Going from a lack of arguments to petty insults; your attempt at trolling also failed. /blocked

  36. The 4090 is too power-hungry. Sure, at least you get your power’s worth in performance, but I would never feel comfortable with my GPU alone sucking over 400W from my power supply; that just seems so wrong and disturbing to me.

  37. I was reading the comments and they just kept going and going and going; it’s like I was reading a giant Harry Potter book as a kid.
