
NVIDIA claims that the GeForce RTX 2080 is more powerful than a next-gen console GPU

During his GTC China 2019 keynote presentation, NVIDIA’s CEO Jensen Huang claimed that the GeForce RTX 2080 is more powerful than a next-gen console GPU. In short, Huang claims that both the RTX 2080 and the RTX 2080 Ti will be more powerful than either the PS5 or the next Xbox console.

Now there are some interesting details to note here, so hold on to your butts.

NVIDIA’s slide compares the RTX 2080 with only one, unnamed, next-gen console. As such, we don’t know whether NVIDIA is comparing it with the PS5 or the next-gen Xbox. We also don’t know whether the green team is comparing the RTX 2080 with Lockhart (the less powerful next-gen Xbox console).

Let’s also not forget that the RTX 2080 has been out for one and a half years, whereas both next-gen consoles will come out a year from now. This obviously means that both MS and Sony have enough time to further enhance and tweak the frequencies of their GPUs. Not only that, but console GPUs are more efficient when it comes to overall performance and optimization.

Since NVIDIA did not clarify in its slide what this next-gen console actually is, we can only speculate. As we said, it could very well be Lockhart and not the high-end model of the next-gen Xbox console. Still, it may give you an idea of how powerful these new consoles will actually be.

Lastly, it will be interesting to see whether the RTX 2080 will be able to handle the first next-gen games. It will also be interesting to see whether NVIDIA’s new GPUs will be significantly faster than the RTX 2080 Ti, a GPU that NVIDIA hints is way more powerful than both next-gen consoles.

Below you can find a screenshot from that slide.

198 thoughts on “NVIDIA claims that the GeForce RTX 2080 is more powerful than a next-gen console GPU”

    1. 2080 Super? LMAO. That’s 2080 Ti class. So a $1200 GPU in an APU form factor, plus a Ryzen 8C part (only god knows about SMT), with a single VC and 1 fan and sub-400W power consumption, in 6 months? LOL. What are you smoking, huh.

      AMD’s APU being fabbed by alien tech or what, to get that level of efficiency? TSMC 7nm has 2 variants; one is EUV, which only Kirin was fabbed on — nothing else, not even the A13, is on 7nm+, which is EUV, and its only rival is Samsung’s 8nm EUV, which Nvidia is fabbing Ampere on and which will decimate RDNA2.

      AMD’s 5700XT rarely trades blows with the 1080Ti, which is how many years old? Forget RTX. And you’re spouting that RDNA2 will catch up to a 2080Ti TU102 silicon? In a mid-range conslow? Man, this is the best joke of the year, thanks.

      Stop BS right there.

    1. Microsoft said twice that of an X1X. That’s 12 TF. And 12 Navi+ TF is more than 12 Polaris TF. The 2060 isn’t even close to that.

      1. Teraflop is a made-up marketing term used by console manufacturers to try and convey to kids how powerful these consoles are. In reality it carries no real meaning, due to many other factors.
        It’s like gauging a car’s lap time based on its exhaust/body kit; it’s silly.

        1. Yes, but it’s more complex than that. Are the stated figures FLOPS at half, single, or double precision? The figures are usually cherry-picked when used in marketing, and are misleading and meaningless.

          1. Obviously, but what figures are being stated in marketing materials? I’d hazard that they use half precision for the bigger numbers. The method of calculating FLOPS is never stated either; the FLOPS figures used for marketing purposes are BS.

          2. So it’s BS as it doesn’t take into account architectural differences and assumes that all cores are equal. No need for the attempt at being an obnoxious smart ar*e BTW. Has being around too long made you bitter or something?

          3. I understand, thanks for the heads-up on how they calculate the figure. My reply said that it was a poor indicator of performance since it assumes that all cores are equal. I didn’t know whether there was an industry-standard test for measuring GPU FLOPS — like a Linpack equivalent — but it turns out they just use the simple algorithm you stated… which is a bit cack, really.

        2. Vega 64 is at 12 TFs vs the 5700XT at 9+ TFs. We know which card destroys the other in efficiency / performance and everything. On top of that, these conslow BS TFs include the CPU too.

          Both are from the same company and we still see these discrepancies — so forget comparing that figure between Nvidia and AMD.

          It’s flat-out Bullshlt. AMD RDNA is not efficient at all, despite being on TSMC’s 7nm node vs Nvidia’s old 12nm node. This year Nvidia is going 7nm EUV from Samsung; Ampere is going to burn RDNA2 to ashes.

          And the Ryzen 3000 in the conslow will be 8C, yes, but we don’t know about SMT, and ofc 2C will be dedicated to the OS and base firmware. Add the clocks — it won’t clock high, as the SoC is GPU + CPU, so a PSU and a 1-VC cooling system ain’t gonna cut more than a 2070. PERIOD.

      2. Nvidia was always behind AMD in TF numbers, and yet their cards outperformed them in games.
        Examples:
        GTX 980 Ti: 6.0 TF
        AMD R9 Fury X: 8.5 TF

        The 980 Ti beats the AMD card most of the time, and in some cases by a large margin
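The TF figures quoted in this thread come from the simple peak-throughput formula discussed above: shader cores × clock × 2 FLOPs per core per cycle (a fused multiply-add counts as two operations). A quick sketch, using the commonly published core counts and boost clocks for the two cards mentioned (treat the exact clocks as approximate):

```python
def theoretical_tflops(cores, clock_mhz, flops_per_cycle=2):
    """Peak single-precision TFLOPS: cores x clock x FLOPs per cycle (FMA = 2)."""
    return cores * clock_mhz * 1e6 * flops_per_cycle / 1e12

# GTX 980 Ti: 2816 CUDA cores at ~1075 MHz boost
print(round(theoretical_tflops(2816, 1075), 1))  # 6.1
# R9 Fury X: 4096 stream processors at 1050 MHz
print(round(theoretical_tflops(4096, 1050), 1))  # 8.6
```

As the thread notes, this number ignores architecture entirely, which is exactly why an 8.6 TF card can lose to a 6 TF one in real games.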

  1. “Let’s also not forget that the RTX2080 has been out for one and a half years.”

    One and a quarter. September 2018. 15 months.

  2. Well, regardless of what anyone pushing a narrative says, GPUs, CPUs, and consoles are behind what’s expected today.

    Especially in the PC market each gain seems so tiny and expensive.

  3. This is all such nonsense. AMD has yet to make a desktop GPU better than the old 1080 Ti, let alone a 12 TF APU to fit inside a small console.

    Yet again Sony and MS are full of it.

  4. Considering that the price of a 2080, not to mention 2080ti, is about the same as the rumored price of BOTH consoles, this discussion is meaningless.

    It’s like comparing hamburgers with caviar, ffs.

    1. I like how someone downvoted you, but that person apparently didn’t realize the 360 GPU was better than anything available on PC at the time, for 1 simple reason.

      It was the first GPU that didn’t have a separate pipeline for vertex and pixel shaders. It took PC 6+ months to come out with a unified-pipeline GPU, and every GPU since has been unified.

      It was incredibly efficient and could do significantly more with significantly less. The 360 had a total of half a gig of RAM… half a gig. Yet until the last 2 years of its 7-year life cycle it was competitive with $1000 PCs.

      1. It was downvoted, and you were downvoted, because you’re simply wrong. There were absolutely better GPUs available for PC in 2005. The 360 had an 8-year lifecycle, not 7.

        You don’t have any idea what you’re talking about. That’s why you get downvoted.

    2. Yep, and the Xbox One X can’t even beat a GTX 1070…

      A GTX 1070 using better settings than Xbox One X…
      https://uploads.disquscdn.com/images/628e0c70714d693bd69fef7431dc49b6d65b0526fcc92ebc75eb3171d4f03055.png

      Circa RDR2,
      https://uploads.disquscdn.com/images/956a8c2375d33aeb3a1cf6c455149d0c73f60a4ea85925018c6164f88dc810d6.png

      An RX 580 is capable of running it at 25fps on average, with some settings still better than those used on the One X…

      Let it run with exact console settings and we can easily guess that the RX 580 could run it at a locked 30fps.

      Not really sp4ctacular or sp4cial…

          1. X performance is between a 1060 and a 1070. By today’s standards that’s quite low. My 5-year-old GPU has the same performance. When the new box comes out, it will already be far behind.

    3. Man, are you crazy? The Xbox 360 GPU as fast as 2x GTX 6800 Ultra? I just re-checked the specs and I can say “no”!

      1. Have you even played a UE3 game on 360? Lmao, you’re bad.

        BioShock had basically no graphical fidelity and ran like crap compared to the PC version. Gears 1? 2? 3? Hilarious.

  5. “Not only that, but console GPUs are more efficient when it comes to overall performance and optimization.”
    No John, they aren’t. Console optimization = lower graphics settings.

    1. He’s talking about how, when you are working with a fixed system (everybody is on the same hardware), you can optimize for it specifically, in a way that you can’t with the myriad of different PC configs. He’s right on that, so what exactly is your point?

      1. My point is, how many games run on console with equal or better graphics settings than PC?
        The PS4 has a GPU a bit faster than an R9 270X, plus 8 GB VRAM. How many games can a PS4 run better than a PC equipped with a good CPU and at least a GTX 980?
        Have you ever watched any of Digital Foundry’s comparison videos?

    2. This is completely false.

      Console hardware is way closer to the metal.
      Devs get to squeeze every last bit of performance out of console hardware because they can work with the hardware on a very low level with very little OS abstraction.

      On PC, the closest thing to this are the new explicit APIs like DX12 and Vulkan.

      1. > Devs get to squeeze every last bit of performance out of console hardware because they can work with the hardware on a very low level with very little OS abstraction.

        That’s not what they’re doing, dingaling. Third-party devs aren’t wasting time cOdiNg tO MeTaL. They’re using SDKs, and all the hardware is abstracted away. It’s not Nintendo working on first-party titles and optimizing with ASM for years until it’s perfect. Get a grip.

        1. Nobody inlines ASM anymore unless they’re engine programmers.

          What do you think the SDK offers? The SDK (XDK in my experience) has a very low-level API and instructions that are tightly bound to the hardware.

          Do some research.

      2. A gaming PC has a chance to brute-force its way past any small optimization issues — a console, however, just kinda… well, buckles under any added pressure.

    3. What he is trying to say is that consoles have less software overhead like Windows running in the background which allows the devs to squeeze a bit more juice out of the same GPU compared to PC. Plus designing games for one GPU for 6 to 10 years straight gives you enough experience to really push the boundaries of that GPU. But I digress, the difference is only felt in a small handful of games near the end of a console generation.

        1. That used to be the case with the bizarre GPU solutions used in older-gen consoles; nowadays it’s closer to PC, so the devs can push much more of the processing potential from early in the console’s life cycle.

        Which is a good thing for the gamers as well

          1. I wonder which OS is lighter, the one on the PS4 or the Xbox One?

            We may never know for sure. I wonder how many titles are truly optimized at the metal, however; something tells me all that really does is lower CPU usage, not improve visuals or GPU usage.

    4. Wait, what? PC gamers these days are all about 1080p at 200+ FPS. They don’t give a f*k about graphical fidelity as long as they can twitch.

      1. I doubt that. Maybe in terms of just graphics power, but a lot more goes into it than that. That’s why the PS5, for example, will be able to do ray tracing, 8K, extremely fast load times, etc. The 2070 struggles with just ray tracing alone at higher resolutions, so I don’t know how it’s going to “beat” them in any real-world scenario.

        1. Yes, forget about native 4K on both consoles and PC into the next gen. Both systems now have similar tricks like upscaling, variable rate shading, etc., but it’s still impossible to raise graphical fidelity and ensure 4K 60fps — you can do that with older titles.

        2. BHAHAHAHAHAHAHAHAHA.

          These absolute melons.

          PS5. 8k. Raytracing.

          Do you even see how ignorant and stupid you look right now? PCs can’t even do 8K without a significant investment, multiple GPUs, a lot of overclocking, etc.

          You think a trash little plastic box that’s 3 inches tall is going to do this.

          You don’t even know what raytracing is. Please, spare us.

          1. Lol, considering I build PCs, I do know what I’m talking about. And I’m quoting Sony here. Sure, there may be several shortcuts taken to achieve these things, but considering they’ll likely be cheaper than a mid-level GPU alone, the masses won’t care. Until you can buy, or even build, a decent PC for less than $1000, people will continue to buy consoles. Especially since they also come with half the headache.

    1. With Jensen owning 80 percent of the GPU market, his cards are beginning to cost a lot more than a next-gen console. How much does an RTX 2080 go for these days again?

  6. OK, but it reminds me of the same claims NVIDIA made about the GTX 680. It was true until 2014-2015 (apart from the lack of VRAM — the stock GTX 680 had only 2GB compared to the consoles, which have access to more), but nowadays many multiplatform games perform very badly on a GTX 680. I’m pretty sure it’s mostly because the drivers are not giving this series further optimization and game developers don’t optimize for Kepler anymore.

    And the fact most developers are optimizing the best they can on consoles while pc is an afterthought.

    1. Not really.

      On Star Wars Jedi: Fallen Order, the GTX 750 Ti (4GB version) delivers an average framerate of 22fps @1080p maxed out (GameGPU benchmark).
      The PS4 version uses a dynamic resolution to maintain its 30fps (1280×720 to 1600×900)…

      On The Outer Worlds, the GTX 750 Ti (4GB version) delivers an average framerate of 20fps @1080p maxed out (GameGPU benchmark).
      The PS4 version uses a dynamic resolution to maintain its 30fps (1280×720 to 1920×1080)…

      In both cases, let the GTX 750 Ti run a dynamic resolution (or run with console settings) and you have a better experience than on PS4.

      Circa RDR2,
      https://uploads.disquscdn.com/images/956a8c2375d33aeb3a1cf6c455149d0c73f60a4ea85925018c6164f88dc810d6.png
      An RX 580 is capable of running it at 25fps on average, with some settings still better than those used on the One X…

      Let it run with exact console settings and we can easily guess that the RX 580 could run it at a locked 30fps.

      And this is what the current most powerful console is able to do against a GTX 1070
      https://uploads.disquscdn.com/images/8207928fbd10bb7fd8eda92a25b559e76ebf992cd7f8e332c6f4ab6d9a77a287.png

      There was also an interesting discussion into the FH4 performance analysis thread
      http://disq.us/p/1w40iy3
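The dynamic-resolution behavior described above is, in principle, just a feedback loop. Here is a toy sketch — not code from any console SDK — whose budget and bounds mirror the 30fps / 720p-900p figures quoted for the PS4:

```python
TARGET_MS = 1000 / 30          # 30fps frame budget (~33.3 ms)
MIN_H, MAX_H = 720, 900        # dynamic-resolution window (render height)

def next_height(current_h, last_frame_ms):
    """Lower the render height when over budget, raise it when under."""
    # GPU cost scales roughly with pixel count, i.e. with height squared
    # at a fixed aspect ratio, hence the square root.
    scale = (TARGET_MS / last_frame_ms) ** 0.5
    return max(MIN_H, min(MAX_H, int(current_h * scale)))
```

A frame that takes 40 ms at 900 lines would drop the next frame to roughly 821 lines; sustained overruns bottom out at the 720 floor.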

    2. Kepler is not a driver issue; it’s a direct architecture issue that causes the architecture to perform worse in newer games. This is a direct effect of AMD having all the major consoles using their hardware. No amount of optimization will help Kepler increase its performance against GCN.

  7. JOHN PAPADOPOULOS is a full Retard…..this is old news, and you don’t even know what that video presentation actually conveys, LMFAO….watch it again, you moron…

    as usual, typical garbage DSOG article…..Hilarious…..Now, go ahead and down-vote me, DSOG fanboys and Fangirls…

    1. He read it on ResetEra, then made an article here. No research, no nothing.

      To have ResetEra, the most demented, unhinged and retarded forum on the internet, as your news source — forum posts, basically… anything else would be better.

      1. Did you read the article? Are you even capable of reading?

        John literally cited and linked to the official Nvidia YouTube channel with the video in which Jensen made the claims and then provided a screenshot from the presentation.

        You should do more research before you call out others in the future.

        1. This video and screencap were posted on ResetEra about an hour or two before he posted them here, with people talking about this exact point.

          It’s not the first time. There are multiple occasions where he posted articles because he saw threads on ResetEra.

          1. Whenever he’s gotten something from Resetera in the past, he has cited them specifically.

            I’m not gonna hold it against him if another thread coincidentally discusses the same topic.

            Btw, do you have a link to the Resetera thread?

          2. The image in that thread is a *cropped* version of the image John posted. This makes it highly improbable that John took the info straight from Resetera because the image John posted is a full screenshot from the YT presentation.

            If it was the other way around (John posted a cropped version of an image from a resetera thread), that would point to John having taken info straight from that thread.

    2. Hell, if I thought that someone was a full retard or a moron, I wouldn’t give anything he had to say on any platform (or subject) the time of day — and yet here you are, on a website where the guy you feel this strongly about is the editor-in-chief.

  8. That’s the reason you don’t hear Sony or Microsoft going on about teraflops anymore. They know the GPU isn’t going to be as big a jump in performance as the processors.

    1. Of course they were limited. Their hardware dates back to the 2012-13 time period, complete with ancient AMD Jaguar CPUs.

      PC gaming will always lead in brute power, but with Nvidia totally owning the majority of the GPU market, GPUs are insanely overpriced. It’s reaching the point where it’s threatening the future of the PC gaming market.

  9. It doesn’t really matter. Consoles can never keep up with Nvidia GPUs (AMD has become useless).

    Nvidia can pump out more and more powerful GPUs so fast that even the 2080 will be considered an ancient relic by 2021, for example.

    1. So just because Nvidia has the fastest GPU, with an over-1000€ price tag, AMD becomes useless, even though most people buy GPUs in the 300-400€ price range? You, boy, are a r**ard of a rare kind.

  10. People need to stop and think about who designs the GPUs that go in these consoles. It’s usually AMD or Nvidia, who are in the business of selling graphics cards and chips. Do you really think that AMD or Nvidia has a secret low-cost, super high-end GPU in their back pocket, just waiting for Microsoft or Sony, that is more powerful than their high-end $600+ GPUs? Unless there was some huge breakthrough in chip technology funded by a console maker (which we would hear about in the news), the likelihood of consoles beating PC in graphics is nearly 0%.

    1. It’s not always about raw power. Consoles have 1 massive edge over PC, and that’s actual to-the-metal coding. It doesn’t matter how close to the metal DirectX or Vulkan get; they will never allow direct access to the hardware, because it would be a massive security risk.

      He didn’t randomly say this to make himself feel good. He said it because he knows consoles punch waaaay above their weight and always have.

      Look at the visuals the consoles produce today on absolutely ancient garbage hardware. Take that same hardware and put it in a PC and you won’t come anywhere near the visual quality or a playable frame rate and this has arguably been the worst hardware generation for consoles ever.

        1. Well, it’s more than just Windows. Technically even a console could use abstraction layers, and some games do — games that don’t need to maximize resources, or just poorly optimized titles.

          There are 2 ways to make a game on a console. Let’s use Xbox as an example. A developer could use a common API like DX12 or Vulkan (which is an abstraction layer), or they could spend the extra time and code to the metal.

          It really depends on their end goals, time and budget.

      1. Actually, it’s OpenGL (and others) on PlayStation and DirectX on Xbox. Devs interface with APIs when developing for console, just like they do on PC/Mac and mobile. In fact, they can and do use the same game engines (Unreal and Unity, for example), except for those bigger companies who develop in-house engines.

        1. The Xbox One hardware predates Pascal-based cards by years. Not to mention it’s seriously held back by its weak Jaguar CPU.

        2. Nah, my rtx 2080 plays red dead just as good as my one x…. 4k at 30 fps lmao, they both do that, and look just as good.

          1. “Nah, my rtx 2080 at ultra settings plays red dead just as good as my one x at low/medium settings”

            Fixed 🙂

      2. Windows is the major difference here. But this is fixed (and way exceeded) by having way more powerful GPU(s) and more RAM…

    2. Thank you. Consoles can’t beat PC. Unless they take a massive loss on the console. Which Microsoft and Sony aren’t going to do anymore and rightly so

      1. Consoles don’t need to beat PCs. Most console users couldn’t care less about PCs, because all they care about is playing games and not dealing with driver updates, viruses, malware, and all the other crap associated with PCs.

          1. Then one would really come out on top choosing Stadia over either of the upcoming consoles, if we are talking about ease of use. PC is unique in that everything is custom/chosen by the customer to their liking. You don’t get any personal customization with a console or Stadia at all. Worst of all, you have to play all your games in a vanilla format, so no customization there either. It’s no wonder Sony and Microsoft are now scared of Google Stadia — they’re basically the same.

            1. Eh, not really. We’ve started seeing games released that do let you customize settings (albeit nowhere near the level of PC) on the higher-end versions of consoles — for example, choosing 4K@30 / 1080@60 on the Pro/One X. I imagine the next gen will take this to a much higher level, with hardware powerful enough to allow such customization.

            Also, what makes most people choose consoles, and what is going to be Stadia’s biggest hurdle, is the social aspect. As someone who’s been a console player almost exclusively my entire life, and having just built my first gaming PC two months ago, I can’t tell you how frustrating it is not being able to play with friends. Then there’s having to buy games across multiple platforms (I’ve bought Rocket League on PS4, Switch and PC). Then you don’t have access to your inventories from the other platforms. I know cross-platform is beginning to be a thing, but it’s still nowhere close to where it needs to be in terms of everyone playing on their choice of platform and being able to play with friends. Stadia doesn’t offer it at all.

            Then there’s the connection issues with Stadia, and we’re still years away from the entire country/world having the necessary infrastructure for everyone. Also, those customization features you mention can be overwhelming. The PC I built isn’t high-end, but it’s no pushover (3600X, 16GB 3200, 1660 Ti, 1TB 970 Evo), yet some games I still prefer on my PlayStation because it’s just easier (looking at Odyssey and Exodus specifically). And I’m a pretty tech-savvy person; for the majority of people, that makes the decision even easier.

            So, while I think eventually consoles will be gone, I’d say we’re still at least 2 generations away.

            2. First off, the level of game “customization” on a console is laughable, and is nothing to mention when considering what you can do with PC games in terms of graphics and performance customization. I would hesitate to even call the console level of it customization at all.

            Thinking that it is going to get any better is just insane, considering that it is the developer’s choice to implement such things, and developers don’t think of console games in terms of in-game customization — which is why there are precious few that allow it at all.

            Games aren’t going to just stay as they are in terms of the graphical implementations on offer. New consoles mean newer graphical techniques, techniques that will push gaming hardware that much harder, so you may or may not be playing a game at full 4K, depending on the game and the developer. Depending on the dev, you may be playing their games at 1440p 30 or 60fps.

            Remember, the goal is to get as close to photo-realism as possible, so with new console hardware comes the continuous push toward that goal. You can’t even assume that a game will run better on the new hardware — as with games on PS4 Pro and Xbox One X, older games could perform just the same, and most console gamers will be praying for developers to patch older games for better graphics and performance. Just part of the woes of console gaming.

            P.S. I don’t think that 5G will take two console generations to proliferate the market, not if China and other countries have anything to say about it, and the world isn’t going to wait for the U.S. to get its act together when it comes to 5G. America will simply be left behind.

          3. I think the two ecosystems have brought benefits to each other — for example, controllers offer a more convenient and rewarding experience than a mouse-and-keyboard control scheme. I use them in multiplayer FPS too; the mouse players wipe me out, obviously, but I don’t get stressed or tired while playing and enjoy it more eventually. The vibration is a big small advantage, and a wireless controller can save you from growing a hunch.

          4. Hunch, huh? I don’t know how you play on M&K, but I can play for hours and only get stiff from sitting too long.

            Vibration is a big small advantage? I assume you mean you can feel when getting shot, which would indeed be advantageous, except for the fact that you see faster than you feel — and even if it were the other way around, consoles run at only 60fps, meaning the reaction time to feeling or seeing would be the same for turning around.

            I think first-person shooters are the last genre you want to use when arguing in favor of controllers. Though IMO third-person action games, racing games and such are nearly impossible on M&K, and the DS4 is one of my favorites in that regard.

        2. “driver updates and viruses, Malware, and all the other crap associated with PC’s.”
          Please stop. It is not 1995. Drivers can update automatically if you choose; malware and viruses get blocked by Windows Defender. What other crap? Choices in options? Control options? Or are you gonna point at Batman: Arkham Knight, a game from 4 years ago, as proof that all PC games have problems?

          Consoles have legit advantages but what you said there was just incorrect.

          1. Arkham Knight was just too heavy for a 780 Ti, which everybody mentioned as the card that should have run it without a drop of sweat.
            With the introduction of the 970 generation, the game’s performance improved much, probably because of the increased onboard video memory — if I’m not wrong, the 780 was 3 GB only, and PCs stream textures worse than consoles because there is no unified memory pool.

          2. Let’s be honest here. Arkham Knight was widely lauded as one of the most terribly optimized ports in years. You could have thrown a $900 GPU at it and still had massive frame drops and stutters. There is a case to be made about PCs getting sh**ty ports, but let’s not pretend the hardware isn’t more than capable.

          3. I may agree, because unless they downgrade it, Alyx is totally console-unbound and it already looks like how next gen will probably look. But maybe Arkham Knight was built for the different architecture of the consoles, which was more efficient at streaming; now, with truckloads of memory, the problems with the game are totally gone.

        3. If you purchase your apps and don’t crack them, and you keep Windows updated, you will experience zero security issues — at least the major ones. Most drivers update in the background now, and if you own an SSD, installing large drivers like NVIDIA’s takes only a few minutes.

          1. Wow, great rebuttal there. He is right, though. Windows updates itself anyway (and so do consoles, for that matter) and Defender stops malware and viruses.

            People act like PC gaming is rocket science. It is not. Even PC building is easy — it’s like LEGO. And if your argument is that the benefits of PC are not worth the “less convenient” method, then you have no idea what you are talking about. At least offer benefits of consoles themselves, for example: low entry price, very quick to set up, nice exclusives.

            Also, stop upvoting your own comments. It’s really sad.

        4. I never said they have to beat PCs. I keep seeing articles saying how the next consoles are going to be so much better than PCs, do 4K 60fps ray tracing, etc. It’s bullshit.

          Both next-gen consoles are going to have what was top mid-tier 3-4 years ago — maybe top high-end. They can use all the bullshit PR words they want; that’s what it’s going to be.

        5. The main annoyances with PC gaming (and I love PC gaming) are forced Windows 10 updates, Windows 10 updates that constantly break things in the operating system, flaky driver updates, massive patch downloads, the 5 or 6 different online game distribution services I need to join just to own my games, etc. I prefer PC gaming over consoles, but there is something to be said for the simplicity and convenience of modern consoles, which are essentially PCs with games highly optimized for the hardware. It’s nice to kick back in the recliner and jump right into a game in front of my 65″ 4K TV and bigass surround sound system.

      1. So funny…
        Meanwhile, list some PC games that can use 100% of the available 8, 10… 14… 16… 18 CPU cores while using 128 GB of system memory and 100% of the GPU and video memory…

        Oh… can’t find any?

        We’re close to 2020; a standard normal/high-end gaming PC can have 6 to 16 cores, and yet most triple-A titles barely use 1 CPU core at 50%, let alone all cores at 100%…

        If studios could optimize their code to run on all CPU cores and all available RAM, gamers could be playing the latest games at 4K all-ultra at 90 or 120 fps, instead of 1440p, mid-high and 40 fps…
        But no. Why? Because all the big companies think: “OK, if the gamer can’t play at ultra, it will be his graphics card that’s to blame, instead of our poorly optimized game engine.”

        “Hey, gamer, you can only play our game at 1440p, 40 fps and mid-high settings? It’s not our fault at all! The problem is 100% the graphics card you have. What, you just paid 1200 bucks for it? Oh, that card isn’t good enough — you need to wait for the next chip to be released, so you can buy a more powerful card!”

        “If our game runs poorly, it’s also because our game is phenomenal! It has the best graphics ever! Of course, that requires a great graphics card!”

        By 2020, many gamers will have 16+ core computers. But the next big title will still be using one single thread, on a single core.

        If console devs manage to utilize 100% of the available [8] CPU cores, why can’t PC devs do it for PC?

        They simply don’t want to. Actually, they don’t need to… because the graphics card will be doing all the work.

        Next year, if PC games were fully optimized to use all CPU cores, gamers could be getting 100 or 150% more power / better graphics.
        But no. They will have all those 12… 16… 20 cores doing absolutely nothing…
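For what it’s worth, spreading a CPU-bound workload across every core is not exotic — here is a minimal sketch using only Python’s standard library (the per-entity workload is made up purely for illustration):

```python
import os
from concurrent.futures import ProcessPoolExecutor

def simulate_chunk(entity_ids):
    """Stand-in for a CPU-bound per-frame job (e.g. AI/physics for a slice of entities)."""
    return sum(i * i for i in entity_ids)

def simulate_all(n_entities, workers=None):
    """Split the work into one chunk per core and run the chunks in parallel."""
    workers = workers or os.cpu_count()
    chunks = [range(i, n_entities, workers) for i in range(workers)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(simulate_chunk, chunks))

if __name__ == "__main__":
    # Same result as a single-threaded loop, but using every available core.
    assert simulate_all(100_000) == sum(i * i for i in range(100_000))
```

Real game loops are harder (frame deadlines, shared state, process-spawn overhead), which is part of why engines don’t scale this trivially — but the basic fan-out is this simple.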

    3. Consoles are designed to play games only; PCs are built to run hundreds if not thousands of programs. No PC can compete in cost/performance with a game console. Comparing the 2 is like comparing an apple to a watermelon.

      1. Must be why they have 3 OSes, apps, online functionality, DVD/BD players, and all that.

        If you still believe this ridiculous, pathetic pro-console shilling, you’re stuck in the 90s and have nothing relevant to say.

    4. Consoles have come to a point where PC-level graphics are on consoles; at least for the Xbox One X and the Series X, we are seeing 120 Hz, FreeSync, 4K and high-end graphics (as well as optimisations).

      PCs will always remain unpopular compared to consoles, and it’s only getting worse.

      1. What in the world are you talking about? I’ve been a console player for most of my life, but having just built my first PC, you’re just wrong. Even mid-tier PCs beat out the One X. What game can the One X play at 120 fps in 4K, or even above 30 fps for that matter? And the customization options don’t compare to PC either. Then there are loading times, etc. that people might not notice without seeing the two next to each other. I don’t think consoles are in trouble any time soon, but they still don’t come close to PC.

        1. Nah pal, they do… in most multiplats the PC beats it in higher res and fps. In terms of textures, poly counts and actual graphics fidelity, they are matched, at least on the X.

          Sure, PC can do a lot more, but next-gen consoles are what the devs focus on, aka the larger audience, and when the Series X releases, the PC vs consoles gap will close even further, since the Series X promises RT, 120 fps and 8K (most likely 4K games).

          1. “when X series releases, the PC vs consoles gap will close even further”

            Ahahahaha! You make it sound like the gap between consoles and PC was small to begin with. The gap has been huge for a long time, and whenever a new generation of consoles comes around, PC performance is already leaps and bounds ahead; history just keeps repeating itself. Let me tell you, when these new consoles finally arrive, the 3000-series GeForce cards will be out sh*tting all over those little AMD APUs.

            Oh, and by the way, you can forget about 8K and 120 fps. It will most likely be 4K with and without reconstruction (game dependent) and 30/60 fps as the norm.

          2. It really isn’t that big at all anymore.. 120fps is coming to the X series. We already have native 4k games and freesync

          3. Tell that to the team that made Red Dead. Many of the console settings according to digital foundry are “lower than the lowest” pc setting.

          4. What are you talking about? The texture quality is set to the ultra PC equivalent. There are only a few settings on low, such as reflection quality… you know it’s native 4K on the X, right?

          5. Actually the majority of the settings were low/medium, as well as the draw distance sliders. There’s also graphical features completely missing.

          6. Remind me what the texture quality is set to? Ultra? You forgot to mention that, since you’re desperately trying to downplay the X.

        2. Gotta love all the driver updates, patches, oddball performance-robbing bugs, etc. that come with PC gaming. I once had a random framerate hitching issue that took weeks to identify. It turned out to be a totally unrelated ASIO driver that would cause multiple games to stutter on my OC’d 2500K. Wonderful things to deal with. Way better than being able to casually kick back on the couch, throw a game into a console and play on my 65″ 4K TV, I guess.

          1. You’re playing games with a 9-year-old processor and you’re upset about having issues. Well, I hope you’re playing games from that era on it; otherwise console gaming is exactly where you belong.

          2. Firstly, you didn’t say a few years ago, you merely said once. Once could be anytime in the span from 2011 to 2 weeks ago since the processor is that old. Secondly, what’s your point. I bought a console back in Christmas 2005 only to have it completely crap out on me by mid 2007, but unlike a Windows PC I could not fix the issue because it was not a PC. Using a PC for gaming at least gives the person a chance to fix issues themselves.

    5. The more interesting statement Nvidia could have made was about performance per dollar. The next-gen consoles appear to be offering high-quality 4K/8K graphics for a fraction of what the price-gouging GeForce RTX line is going for these days. Jensen Huang went out of his way to try to downplay the next-gen consoles because he is suffocating the PC gaming market with absurdly overpriced RTX cards, and a reasonably powerful next-gen console serves as a threat on multiple fronts. One: price. Two: the ongoing trend of PC games that are essentially console ports, which will mean that those game engines are optimized for RDNA when they hit the PC. For as vocal as Jensen has been that Navi is “underwhelming”, he seems oddly threatened by its presence.

      1. Exactly. As an RTX 2080 owner I can tell Jensen he’d better be scared. I’m not planning to buy a new card now; I have been buying top-tier cards for nearly a decade and I am so done with terrible performance at high cost. I’m planning on selling my PC as a whole. He’d better be damn scared: if next gen delivers 4K at 60, then I’m sold.

      2. The vast majority of games are not going to be in 8K on consoles. That is marketing BS. The specs that have been leaked make it very clear that this is not possible beyond basic 2D games or really old low-poly 3D games. They are going to be targeting 4K 30fps with Low/Medium settings in 90% of games. If they stepped it back to 2K (1440p) they could reach 60fps in most games with High settings, but most of the time they are more focused on visual fidelity than good gameplay. If they do choose to target 1440p 60fps, then they’ll just tell you it’s 4K and most people will be too dumb to notice it’s not really 4K.

        1. Navi has no problem hitting 60 FPS at 4k with medium to high quality. Especially when a game is optimized for the hardware. The switch from weak Jaguar CPU cores to Zen 2 cores should also help performance significantly. I absolutely agree with you about 8k though.

    6. Of course not. First off, consoles are sold at a loss. Secondly, Windows is a terrible operating system; both macOS and Linux beat it in every area except gaming. Pretty much all websites run on Linux because, well, it’s just that much better.

      Either way, my RTX 2080 has only slightly better performance than my Xbox One X. I can play Red Dead on my X at 30 fps 4K and I can do the same on my PC; it will never hit 60 fps, not even with everything on low. Besides, the Xbox looks way better than all-low on PC. Either way, in other titles my PC outperforms the Xbox, but only marginally. However, I have a 2500 EUR setup, and the One X was only 266 EUR, lol.

      We can talk numbers all day, but my only question is: does it do 4K 60 fps? I know a few games do on my X, so I have faith it will. If that’s the case, I and many others including my friends will ditch PC. Their 1080 Ti performs worse than an RTX 2060 in Red Dead because, why not? Nvidia screws over their customers. Also, consoles have always had custom AMD chips; they are indeed custom, and specially made for consoles.

      1. Sounds like you need to check your settings. Tons of games run at 2K and 4K on my RTX 2070 at 60fps. You can definitely get 4k 60 on a RTX2080. You may have to bring some settings down to Low/Medium but it can definitely do it. Watch Digital Foundry’s Red Dead PC performance video where they go through all the settings, it may help. They found that for Xbox One X to achieve just 30fps it is using mostly Low settings across the board with a few less taxing settings turned up to Medium or High.

        Also, on a side note, these custom console chips we’re talking about are not built from the ground up for consoles. Most of the time they are simply custom variants of existing chips with slightly altered clock speeds.

      1. Yep :D. Most people just ramp up settings without knowing what they do and then cry that the game is poorly optimized (in general, not specifically RDR2)

  11. And the 1080 Ti is more powerful than the 2080. It’s got 11 GB of VRAM; the 2080 only has 8.
    Many AAA games need over 8 GB of VRAM to run at 4K ultra. Only the 2080 Ti and 1080 Ti can do that.
    How old is the 1080 Ti?

  12. Well, I would hope so, Nvidia.

    Guys, take a look online at HD 7850 results in 2019.
    I like to look at games that push hardware; Red Dead is a good one, I feel.

    At 1080p low settings one can keep it playing above 30 fps; a console with equivalent power, the PS4, runs it at around the same settings.

    The advantages the PS4 and Xbox One have over the Radeon HD 7850 (7790 for Xbox) are 8 GB of shared memory, which can improve performance since it’s a unified memory controller, and more VRAM than the HD 7850.

    Basically, what I’m saying is “console optimization” isn’t as big a factor as people think anymore.

  13. Personally I will buy RTX 3080 Ti whenever it comes out, AMD cards are too much behind in terms of performance per watt.

    1. But it is also about price. Nvidia have been taking the p… lately with their prices. I have always had Ti cards, but the RTX ones were a crazy price, so I went with the 2080 for £799, which is around the max I’ll pay for my GPU.

      1. Sold my 1080 Ti, so my 2080 ended up costing me about 150. Supports ray tracing etc. and better frame rates, so all good. I’ll probably get the 3080 as long as it’s within my max.

        1. Not sure about what exactly?

          You mean you’d argue that the 2080 Ti, 2080 etc. aren’t overpriced? Really, you’d be the only one, I think.

          1. It’s overpriced because they have no competition in the high end, simple as that. Again, most people buy $200 video cards, and in that market AMD is losing to Nvidia. Then again, you can find a used RX 570 for $100, and that’s a good deal.

            The 5700 XT is plagued with driver issues like no other on Reddit, while I don’t see much being posted on Nvidia’s subreddit. It seems way worse at launch than usual for AMD. At least their CPU division is kicking butt; I can’t even see a reason to buy Intel unless you need an extra 20 fps and you’re gaming at 240+ Hz, or playing some very exclusive old games like StarCraft 2.

    2. Performance per watt…
      “Oh, this GPU has bad performance per watt, I guess I won’t buy it…”
      How r*tarded are you today?

  14. The damage control starts early from Huang.

    Conveniently, he forgot that the current-gen consoles have a weak, overclocked-to-hell AMD APU, and that they can drive games at mid/high graphics, for which in many cases you need a high-end card from his company in order to achieve the same fidelity.

    He also conveniently forgot that “parity” is the norm for multi-platform games, and that exclusive games always push the consoles to a level for which he would need to make yet another high-end card to be able to compete.

    I am not fanboying for consoles; it’s just that Huang’s BS this time is not even funny. It’s literally pure envy, since the console makers (except Nintendo with the Switch, which IMO he conned into the Tegra) have been choosing AMD; even Google with their cloud service went AMD too.

    Keep pivoting into machine learning, Huang, because this BS will only work on blind fanboys.

      1. Millions also use Instagram and such tools. There are millions of sheep and i’m not talking about the 4 legged ones

    1. That kind of thinking is exactly what hardware vendors want from consumers. To keep buying the newest overpriced hardware instead of putting pressure on publishers to release better optimised games.

    2. Console hardware has always been old already when it is introduced, and it’s a good way for manufacturers to push old stock out of their warehouses.

  15. Well, that’s about right; the next-gen consoles will be somewhere around an RTX 2080 Max-Q, gimped, yeah. Comparable to a 1070 (Ti).

  16. Well, according to the limited specs of the Series X, it is supposed to be able to do 12 TFLOPS off its Navi GPU, which is supposed to have ray tracing too; it will be interesting to see. I think performance will be right around that of the RTX 2080, and Nvidia is probably talking about its ray-tracing performance in some abstract way that no one is really using.
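For context, a quoted TFLOPS figure is just shader count × 2 ops per clock (one fused multiply-add) × clock speed. The shader count and clock below are illustrative guesses that happen to land near the quoted 12 TFLOPS figure, not confirmed Series X specs:

```python
def tflops(shader_cores, clock_ghz, ops_per_clock=2):
    # FP32 FMA counts as 2 floating-point operations per shader per clock.
    # clock in GHz gives GFLOPS; divide by 1000 for TFLOPS.
    return shader_cores * ops_per_clock * clock_ghz / 1000.0

# Hypothetical example: 3328 shaders at 1.825 GHz
print(round(tflops(3328, 1.825), 2))  # prints 12.15
```

Note that TFLOPS is a peak theoretical number; it says nothing about architecture efficiency, which is why cross-vendor TFLOPS comparisons (like Nvidia’s slide) are shaky.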

  17. That’s what I call a smart late PR move, to sell unsold 2080s, because he knows that at the launch of the PS5 he will have no chance of selling them at full price.

  18. The declaration stated that laptops with the RTX 2080 Max-Q are more powerful than the next-gen consoles.

    And the 2080 Max-Q is less powerful than the desktop RTX 2080 by at least 30%.

  19. According to Digital Foundry, an RTX 2070 Super is like a 2080.

    An RTX 2060 Super is like a 2070.

    Maybe a 2060 Super will beat the consoles.

  20. No sh*t, given the price point. People who look at hardware specs on consoles know that they aren’t going to be the most powerful beasts. There are economies of scale.

  21. Is this an out-of-season April Fool’s joke? Since when does a console come close to a mid-range computer video card (much less a high-end one)? For the PS4, I remember getting excited when they said it would have as much power as a GTX 680, or half as much, which turned out false. Late-gen games did look much better than you would think possible, like Death Stranding, but still, few games are able to hit 60 fps at 1080p. We’ll be lucky to get actual 4K at 30 fps; most likely 2K resolutions at a variable 30-60 fps. Which would be fine, but those claims about 4K 60 fps are just claims, not actual performance to expect.

  22. As on every console launch, the salty and jealous Nvidia party crashers need to come out and say that the next CPU or GPU that xyz console will be using will be S H I T compared to the ultra-expensive Nvidia chips, with huge amounts of processing power purposely locked away behind special components or hardware [just look at what, e.g., HP does to the graphics chips/cards they sell, which don’t accept “standard” Nvidia or AMD drivers].

    Just try to install, e.g., the standard graphics drivers for a more recent OS than the one that came with that HP Pro laptop…
    E.g., you buy a second-hand HP laptop from the company you work for; it came with Windows Vista or Seven… when you start installing a newer OS, all the components will be fine, there will be drivers for everything… except… the graphics card… and without those drivers, you must use your laptop at 800×600, 16 colors…

    ON PURPOSE, HP puts a little useless component in the PC and turns that card into a “proprietary” card… that way, they can prevent people from upgrading their laptop to the latest OS…

    Also, don’t you find it strange that a $1200 or $1500 graphics card IS ALWAYS JUST POWERFUL ENOUGH to play one-year-old games at ultra everything…

    No matter if a gamer pays $5,000 or $10,000 for the latest Nvidia graphics card, he can be sure ALL the recent games won’t be able to run at ultra on that $10,000 card!
    Nvidia always makes sure gamers WILL HAVE TO UPGRADE AND BUY THE UPCOMING $1,500 card in order to be able to play at ultra.

    Want 1024*768 ? Need a new card
    Need 1600*1200 ? Need a new card
    Need 32bit graphics ? Buy a new card
    Want full HD 30fps ? New card
    1080p 60fps ? New card
    4k 30 fps ? New card
    4k60fps ? A new card

    And when the current cards started being powerful enough to run almost all games at ultra, 4K, SAY WELCOME TO ray tracing: a new way to restart the graphics card market. Ray tracing will destroy the card and won’t allow ultra settings and high frame rates.
    Nvidia, why not just make a $5,000 card for gamers, with 3 or 4 GPUs and 32 GB of memory, so the gamer can be happy for the next 2 or 3 years without having to upgrade…?
    No, you don’t want that, right… you want gamers to feel the need of HAVING TO BUY A NEW CARD every 6 or 9 months, just to be able to play the latest games at ultra…

    Nvidia, if your chips are so advanced, so much better, so great, why have the 3 console manufacturers been buying AMD chips for their consoles for over 10 years, and for the 6-7 years to come…

    We see how LTT struggles to try to make several Nvidia cards work together… oh, you need a special sync card to be able to use 4 or more Nvidia cards at once, right…

    WHY CAN’T A SIMPLE GAMER HAVE ACCESS TO SUCH A “PRO” SYNC CARD, in order to use 4, 5… or even 10 Nvidia graphics cards…

    Yeah, everything is crippled, underpowered, on purpose.

    That’s why console game makers manage to make incredible games run on super slow, poor-spec consoles such as a PS4… if we took the hardware from a PS4 or Xbox One and put it in a PC, the same games would be running at 640×480, low details, 10 fps.

    IT IS OBVIOUS THESE NVIDIA CARDS, even at 1500 bucks, aren’t allowed to perform as they should.

    Stupid theory, right… so then, just look at a pro graphics card… with its incredible specs, memory, graphics processors, etc., ONLY BECAUSE OF THE DRIVERS, those $8,000 graphics cards run games like a $60 crappy graphics card.
    Simply via the drivers and 1 or 2 hidden components, Nvidia locks their performance and makes them run like crap in games.

    Why wouldn’t they do the same to the gaming products, so gamers ALWAYS need to upgrade every x months…

    Yes, salty & jealous nvidia 🙂

  23. Man some people here are seriously retarded. They think consoles have some sort of secret alien technology that can do everything at max at 100+ frames for 500 bucks.

    News flash. A console is basically a PC in a box. High performance costs money. Either these consoles have to cost more or are already being over hyped on specs. But then again, in the console space, nothing is ever over hyped right?

  24. Doesn’t matter; in the end the consoles get the better-looking exclusives overall, and more of them. Simply because publishers can put millions into high-budget games and really learn how to optimise and use the GPU to its potential; new graphical software features also develop over time, bells and whistles, and the budget allows for it, since they will sell millions on consoles; on PC it’s a different story. Take a lower GPU, minus the baggage of PC, plus a much higher budget and dev time spent learning the ins and outs of the one GPU/CPU, with development tech advancements over time that allow for more free resources and better optimisation, and it’s a win. Multiplats win out on PCs, but only by a margin and some better frame rates, and they miss the exclusives. Look at what the PS3 achieved with its CPU/GPU and limited RAM and RAM cache. It kept getting better over time, right till the end. Amazing how much you can squeeze out of an outdated GPU/CPU.
