NVIDIA RTX 5090 rumored to require 600W & have 32GB GDDR7

Well-known leaker ‘kopite7kimi’ has shared the specs for the next high-end NVIDIA GPU, the NVIDIA GeForce RTX 5090. According to the leaker, the NVIDIA RTX 5090 will require 600W and will have 32GB of GDDR7 on a 512-bit memory bus.

kopite7kimi has been a reliable source for GPU leaks, so these specs do appear to be legit. However, as with all rumors, we suggest taking everything you are about to read with a grain of salt.

The NVIDIA GeForce RTX 5090 will have a 2-slot cooler, and it will be based on the PG144/145-SKU30 board. It will also have 21,760 FP32 CUDA cores and 170 SMs. Below you can find all the specs that kopite7kimi has shared.

  • GeForce RTX 5090
  • PG144/145-SKU30
  • GB202-300-A1
  • 21760FP32
  • 512-bit GDDR7 32G
  • 600W

Now, as you may have guessed, these aren’t the full GPU specs. For instance, we still don’t know the clock frequencies. We also don’t know the number of ROPs or Ray Tracing cores. Nevertheless, these will give you a vague idea of what you can expect from this new GPU.

For comparison purposes, the NVIDIA GeForce RTX 4090 has 24GB of GDDR6X on a 384-bit memory bus, and it requires 450W.
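
For a rough sense of what the wider bus could mean, here is a minimal sketch that computes theoretical peak memory bandwidth from bus width and per-pin data rate. The GDDR7 data rate is not part of the leak, so the 28 Gbps figure below is purely an illustrative assumption; the 21 Gbps figure for the RTX 4090's GDDR6X is its known rate.

```python
# Peak memory bandwidth = (bus width in bits / 8 bits per byte) * per-pin rate.
# Inputs are bits and Gbps, result is in GB/s.

def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

# RTX 4090: 384-bit GDDR6X at 21 Gbps (known figures)
print(peak_bandwidth_gb_s(384, 21.0))  # 1008.0 GB/s

# Rumored RTX 5090: 512-bit GDDR7; 28 Gbps is an assumed, illustrative rate
print(peak_bandwidth_gb_s(512, 28.0))  # 1792.0 GB/s
```

Even at the same per-pin rate as the RTX 4090's GDDR6X, the jump from a 384-bit to a 512-bit bus alone would be a ~33% increase in peak bandwidth.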

Since the NVIDIA RTX 5090 will require 600W, I’m curious to see what NVIDIA will do with its power cable. By now, we all know what happened with the NVIDIA RTX 4090 and those burned power cables. Also, will the NVIDIA RTX 5090 have two 16-pin power connectors? And if so, will it require brand new PSUs?

Rumors suggest that the NVIDIA RTX 5090 will come out in early 2025. Given the timing of this leak, those rumors appear to be accurate. In other words, don’t expect to see an announcement for this GPU in 2024.

Stay tuned for more!

63 thoughts on “NVIDIA RTX 5090 rumored to require 600W & have 32GB GDDR7”

  1. >And if so, will it require brand new PSUs?
    Pretty sure we're reaching the point of requiring a separate psu for the gpu soon.

  2. If nvidia was the last manufacturer of GPUs on earth, I would stop gaming. Nvidia is like the Denuvo of GPUs. Whenever I think of them I get the shakes. Just, Ewwwwwww.

    1. A few more generations and integrated GPUs will be good enough for most gamers.
      Probably getting close now if you don't game over 1080p. If they ever hit my 4080 levels in the next decade, I'd give up discrete GPUs. Not interested in 4K; I dropped back down to 1440p, so I'm no longer trying to keep up.

      1. Atm a 780M performs around a GTX 580, which is totally crap unless you are OK playing OLD games at 720p, lowest. That's around 15 years behind in performance. It's a huge gap.

  3. Nvidia's cards are amazing from a technological standpoint, but the price has gotten totally out of hand due to crypto and AI, and frankly there are practically zero games that are actually fun enough to justify a top-tier GPU.

    1. I was looking over my most played games, and I was thinking to myself that almost all of them are either 2D games or 3D games with cel shading.

      I could easily play like 95% of my favorite games on a very low-end GPU if necessary. Just by avoiding some AAA titles, I would suddenly only need a PC half as expensive.

      Aren't the most-sold PCs notebooks anyway? I assume that's what most PC devs actually target: an average notebook, which usually doesn't have high-end specs.

      In a world with inflation where people are trying to save costs, cutting down on unnecessary PC hardware is a great way to keep your budget in check. There are so many good indie and AA games that you really can do without that handful of AAA "Unreal 5" titles.

      1. This inflationary period was nothing compared to the late 70's and early 80's, which had inflation twice as high for twice as long as this 3-year stretch ….

        1. Inflation isn't measured the same way as it was in the 70's & 80's. The actual price increase is much higher than the fake CPI, just like the 818K jobs were fake and the 81 million votes were fake.

          1. It's because food and fuel aren't considered in the govt released figures on inflation. They say it's because those are too volatile to track regularly but in reality we all eat and buy gas even if we don't have a car because fuel is factored into the cost of public transportation, shipping of goods etc. so the govt released numbers aren't very useful imo. Anyone who buys groceries knows what the real world inflation has been the last few years and it's definitely higher than officially reported.

          2. Not true at all ….. I was there and it was much worse in the late 70's than it is today. When I started driving in 1975 gas was 54 cents a gallon and by 1981 it was $1.30

            In 1980 the Prime Rate was 19.5% compared to a peak of only 8.5% this time around

            In 1980 the Inflation rate was 13.5% while this time around it peaked at 8%

            We had recessions in 1975, 1980 and 1981, where this time we had 1 recession and it was under Trump. The last Democrat President to have a recession was in 1980 (44 years ago)

    2. Recent big graphically demanding games interest me so little, I'm now gaming on a Switch.

      I had £3K set aside for a new build recently, but couldn't justify it as there are zero games worth dropping that on.

      1. The Switch is the most boring handheld I have ever used; I enjoyed the PS Vita more than the Switch. The interface, joystick, everything is crap.

  4. The RTX 5090 will probably use a 12V-2x6 power connector (like the RTX 40 Super cards). The standard 12VHPWR cable and a PCIe 5.0 power supply will work, but in order to take full advantage of the 12V-2x6 safety features on both ends, a new power supply will be required (at least that's what the guy who tested this 12V-2x6 power connector told me).

    1. I buy them to play older games at higher frame rates and higher refresh rates.
      Teenagers and people in third-world countries aren't profitable to Nvidia. They will keep increasing their prices because their cards are worth a lot more.

      1. People like me pay more in third-world countries due to import duties. The GPU that costs you $500 will cost us $580 or sometimes even $600. Few people like myself buy gaming hardware here due to the increased prices. But that doesn't mean there is no market here; the majority of gamers in third-world countries are PC gamers, not console gamers. Also, due to Steam's regional pricing model, games in third-world countries cost a lot less.

        1. Third-world countries don't have the buying power, which leads most companies to not focus on them, which is the right thing to do.

        2. While you have to pay somewhat higher duties on goods, here in the USA we have to pay on average $23,968 per year for healthcare insurance for a family of four. Then add to that cost all the deductibles and co-pays, and that price increases even more

    2. Smart people like myself invested in Nvidia before the stock price took off like a rocket because of AI ….. the 500 shares I bought at $122/share 2 years ago next month are now 5000 shares at $124/share

      In other words, my $61,000 investment is now worth $620,000

    3. Actually with modern games a 5090 will get you 40-60 fps unless you lower the resolution or settings, and even then you'll still have intolerable stuttering.

    4. > hundreds of fun indie and AA games to choose from
      > AAAA games come out with poor performance, DEI, microtransactions, forced launchers, etc

      Gamers: "Yeah, I really think wasting $600-$2,000 on a GPU just to play those 2 AAA games for a few hours, was a good investment. Muh raytracing 'n stuff."

    5. A blurry upscaled 1080p image barely hitting 60 fps, running on a fire hazard that takes 600W over a faulty 12-pin connector

    6. Say it loud brother. I'll drive you to the highest mountain. So you can broadcast this gem of a comment. If I can't get off work that day to transport you, then I quit. 😀 Beans and rice it is for the family. Seriously man, I couldn't have said it better even if I had a year to brainstorm. 👍

  5. We're entering a world where even basic hobbies will be reserved for the richest people.
    We poor bastards will stay stuck in the sewers, eating ratburgers like in Demolition Man.
    I know some people who don't even have cars anymore.
    They own nothing and they're… no, they're not happy, actually.

  6. If so, then it's getting into uncoolable-with-a-traditional-air-cooler territory. I need a board view for this, but my guess is at least 16 power stages (VRM modules) with high-side and low-side MOSFET packages, a bigger GPU die, at least 2 Vmem stages, and double-sided memory modules. The board is gonna flex under the weight, causing many A1/A0 memory dislocation and solder joint issues. PEX and 1.8V aren't gonna change, but multiple 5V power inductors are gonna supply the voltage to the GPU. It's gonna be absurd no matter what. Getting ready for lots of reballing and board repair on 50 series cards.
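
    As a rough sanity check on that 16-power-stage guess, here is a minimal sketch of the per-phase current such a VRM would have to deliver. Neither the core rail voltage nor the core/memory power split is in the leak, so the ~1.1V and 500W figures below are purely illustrative assumptions.

    ```python
    # Rough per-phase VRM current estimate (illustrative numbers only).

    core_power_w = 500.0  # assumed share of the rumored 600W fed by the core VRM
    core_v = 1.1          # assumed GPU core voltage under load
    phases = 16           # the comment's guess at the number of power stages

    total_current_a = core_power_w / core_v   # ~455 A total
    per_phase_a = total_current_a / phases    # ~28 A per phase

    print(f"Total core current: {total_current_a:.0f} A")
    print(f"Per-phase current:  {per_phase_a:.1f} A")
    ```

    Smart power stages are commonly rated in the 50-70A range, so under these assumptions 16 phases would leave comfortable headroom rather than being excessive.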

  7. At some point, just make the 80/90 series a whole standalone console with its own PSU and OS for those who need 4K+, 240Hz, maxed-out fps, $2,500.

  8. Good luck selling that in Europe.

    Electricity is not just expensive, but you also get punished for peak use and many people have dynamic electricity contracts. The bigger the peaks you create, the more you are charged for network costs.

    This means you can save a lot of money on your electric bill by spreading electricity use and avoiding creating peaks.

    Nvidia might claim these GPUs don't consume a lot when idling, but they create massive peaks when gaming, which costs a lot of money.

    Electricity is no longer just a flat fee in most of Europe. To counter the intermittency of renewables, people are charged more when they create demand peaks during hours when the sun doesn't shine or the wind doesn't blow.

    Calculations some gamers make by just multiplying hours gamed by € per kWh don't make any sense. High-end GPUs are way more expensive to run today due to the demand peaks they create (see the sketch after this thread).

    1. I pay £50 per month for electric. Yes, it has gone up significantly, but it's trivial amounts to anyone in the market for a 5090.

      1. It all happened when OVO took over Southern Energy… It used to be profits… now it's all record profits…

    2. "Electricity is no longer just a flat fee in most of Europe, to counter the intermittency of renewables, people are charged when they create peaks in off-peak hours when the sun doesn't shine or wind doesn't blow."

      They're saving the planet by making it more difficult for people to use disgusting carbon-based energy at ridiculous rates for something as trivial as playing video games. I think it's great what Europe is doing.

      I mean, I personally use as much electricity as I want for playing video games, but I'm glad that other people are sacrificing for the planet. For the children. For the future.
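
    To make the parent comment's point concrete, here is a minimal sketch contrasting the naive flat-rate estimate with a simple peak-tariff model. Every price and multiplier below is a made-up illustrative number, not a real tariff.

    ```python
    # Naive flat-rate estimate vs. a simple peak-tariff model (made-up numbers).

    draw_kw = 1.0           # whole-system draw while gaming (600W GPU + the rest)
    hours_per_month = 60.0  # gaming hours per month
    flat_eur_kwh = 0.30     # assumed flat price per kWh
    peak_multiplier = 1.5   # assumed surcharge for peaks in scarce hours
    peak_share = 0.4        # assumed fraction of gaming landing in peak hours

    naive = draw_kw * hours_per_month * flat_eur_kwh

    peak_kwh = draw_kw * hours_per_month * peak_share
    offpeak_kwh = draw_kw * hours_per_month * (1 - peak_share)
    with_tariff = (peak_kwh * flat_eur_kwh * peak_multiplier
                   + offpeak_kwh * flat_eur_kwh)

    print(f"Naive flat-rate estimate: EUR {naive:.2f}/month")
    print(f"With peak surcharge:      EUR {with_tariff:.2f}/month")
    ```

    The gap between the two figures is exactly the error in the "hours gamed × € per kWh" back-of-the-envelope calculation the comment criticizes.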

  9. When you add the CPU, these PCs can pull 1000W when playing games.

    That's similar to a small induction stove. Have you seen the cables hanging off a decent induction stove? They're thick cables to accommodate all the amps flowing through, which would otherwise create way too much heat. And there are tons of warnings about not opening or touching any of that stuff.

    Hardware manufacturers are creating 600W GPUs and letting gamer kids play around with 1000W+ PSUs at 220V. That will knock a kid out cold if he comes into contact with an internal PSU cable. It's just an accident waiting to happen.

    1. There was never a problem with the thickness of the 12VHPWR cable, at least according to tests (temps remain perfect even under full load). As Gamers Nexus proved, it was the connector and user error that caused these connector-melting problems. The redesigned 12V-2x6 power connector fixed this problem.

      https://youtu.be/SZPn_Jby1YQ?si=LOk6lRatLwWA2mFT

      1. The connector itself is still too small for a 600 Watt load, and that shouldn't be any surprise since it was mainly an Intel design and part of Intel's ATX standards. There seems to be a lot of magical/wishful thinking going on at Intel when it comes to voltage, current and power limits.

        Electrons flow across the surface of a conductor, so pin size and contact area are even more important than wire size, which is fine here since the load is distributed across 12 wires, making them function like one very large wire. 18 ga wire will safely conduct 10 amps (rated for 14 amps if I remember right, but you have to de-rate to compensate for things like heat, wire length, etc.), or 120 amps total at 12V = 1,440 W.

        It's the tiny pins and the lack of surface area in the plug itself that are the problem. They really need to limit a single 12V-2x6 to 300 W and use two of them for anything above that ….. or redesign the plug and make the pins and connectors larger.

        1. It's hard to believe that Nvidia engineers would use the 12VHPWR connector if it were really too small for a 600W load like you say. The Gamers Nexus investigation proved the 12VHPWR connector can only melt when it's not fully inserted.

          There have been no reports of connectors melting with the 4070 / 4080 Super cards, so it looks like the new sensing pins have fixed that problem for good (the card will not work if the connector is not fully inserted).

          1. A 4070 Ti maxes out at 285 Watts, which is below the 300 W max I was suggesting, and the 4080 maxes out at 320 Watts, so just barely more than my suggestion …. OK, then let's say 350 W max, but you still need two for a 4090-class GPU.

            Also, you might start seeing more 4080s have problems as they age and resistance builds up in the connectors because of the small male pins (GPU side) and not enough contact area on the female side. Proper engineering means taking into account aging and degradation over time, and from heat.

            Go look at a picture of the female side (cable side) of the plug. The contacts are square, meaning they only touch that puny little pin on the GPU in 4 places, so less than 50% of the pin's contact area is used. If you made the pins on the GPU side twice the diameter, you could use round rather than square contacts on the cable side, getting near 100% contact, lower resistance, and thus less heat.

            When a connector (or a wire, for that matter) heats up, it means its resistance is too high for the current flowing through it. The GPU may only be using 450W, but the power supply is sending 500W, because 50W is getting eaten by the connector as heat. That makes the connection degrade and increases the resistance even more, until you get a thermal runaway condition and the plug melts. (A rough sketch of that heating math follows this thread.)
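
    As a minimal sketch of the heating argument above: the power burned in a connector is I²R per contact, so it grows quadratically with current and linearly with contact resistance. The per-contact resistance values below are made-up illustrative figures, not measurements of a real 12VHPWR/12V-2x6 connector.

    ```python
    # I^2 * R heating in a 12V GPU connector (illustrative resistances only).

    load_w = 600.0    # rumored RTX 5090 board power
    rail_v = 12.0     # supply rail voltage
    power_pins = 6    # 12V-2x6 carries current over 6 supply pins

    total_current_a = load_w / rail_v             # 50 A total
    per_pin_a = total_current_a / power_pins      # ~8.3 A per pin

    for contact_mohm in (2.0, 100.0):  # assumed: healthy vs. badly degraded contact
        r_ohm = contact_mohm / 1000.0
        heat_per_pin_w = per_pin_a**2 * r_ohm
        print(f"{contact_mohm:>5.0f} mOhm/contact -> "
              f"{heat_per_pin_w:.2f} W per pin, "
              f"{heat_per_pin_w * power_pins:.1f} W total in the connector")
    ```

    Under these assumed numbers, a healthy connector dissipates well under a watt, while a badly degraded one approaches the tens of watts the comment above describes, which is the thermal-runaway point being made.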

  10. I fully expect NVIDIA to price the 5090 at $1,999.

    Then again, the reality is that the price of the so-called "halo product" doesn't matter that much, since the number of people willing to pay for it is rather small anyway, as can be seen from the Steam numbers for the 4090.

  12. "Leakers" made the exact same claim about the 4090 a few months before it released ….. In the end it turned out to be pure bullsh*t …….

    I went and looked it up …. the same joker made this exact same prediction for the 4090 ….. kopite7kimi misses on his predictions all the time, but people always forget the times he is wrong. He'll change it again in a couple of months to 450W or 500W and then falsely claim he was right all along, completely forgetting he said 600W earlier.

  12. I remember Digital Foundry and other YT slop merchants saying "8-10GB of VRAM will be PLENTY for future-proofing" in early 2022… I went with a 3080 Ti to get the extra 2GB (12GB). I know folks that listened to DF and got f*cked @ 4K. I'm playing HL2 VR and maxing OUT my 12GB. Now I see 32GB is to be the new standard in the near future. Idiots incapable of critical thought.

  13. What I don't get is why, just like with the 4090/4080, there is such a massive gap between the top tier and the next one down. Also, if nVidia pull the same greedy $hit with the pricing, then I will simply upgrade to a used 4090, as prices there will be much more consumer friendly.
