AMD vs NVIDIA by MAHSPOONIS2BIG

NVIDIA’s CEO on AMD’s Radeon VII GPU: “It’s underwhelming”

Yesterday, AMD officially announced its first 7nm graphics card, based on Vega 20: the AMD Radeon VII. The Radeon VII will compete with the NVIDIA GeForce RTX 2080, and from the looks of it, NVIDIA’s CEO was not impressed by what AMD has achieved.

When asked about the new AMD GPU, Jensen Huang told PCWorld that he found this new GPU underwhelming.

“It’s underwhelming. The performance is lousy and there’s nothing new. [There’s] no ray tracing, no AI. It’s 7nm with HBM memory that barely keeps up with a 2080. And if we turn on DLSS we’ll crush it. And if we turn on ray tracing we’ll crush it.”

The AMD Radeon VII will come out in February and will be priced at $699; however, most gamers can already find NVIDIA GeForce RTX 2080s at similar prices.

AMD also showed some benchmarks comparing the Radeon VII and the GeForce RTX 2080. In Battlefield 5 and Far Cry 5, AMD’s latest GPU was as fast as NVIDIA’s card, whereas in Strange Brigade the Radeon VII was noticeably faster than the GeForce RTX 2080.

Now, since a lot of people criticized NVIDIA for pricing the RTX 2080 at $699, I’m pretty sure the same people will criticize AMD for releasing a new overpriced product that does not bring anything new to the table. Yes, the Radeon VII can match the performance of the GeForce RTX 2080, and while competition is great for us consumers, we do have to wonder why gamers should bother with it when they can already get an RTX 2080: a GPU that performs identically, costs the same, and comes with more (and newer) features.

122 thoughts on “NVIDIA’s CEO on AMD’s Radeon VII GPU: “It’s underwhelming””

  1. Of course it is. But the bulk of the sales will be mid-range cards. Wait and see what they come up with to compete with the 2060, 2070, etc.

  2. Sounds like Jensen is having a little tantrum. AMD is first to market (again) with a consumer 7nm card. Looking forward to seeing the benchmarks.

    1. AMD’s own comparison during the presentation shows performance is identical to a 2080 in DX12 and DX11…
      Almost everyone is underwhelmed with AMD’s showing.

      1. Almost everyone is a genius… wait… wut?

        Problem is that not everyone is considering the number that will be produced (these look to be failed pro chips) and their use in the pro markets.

        VII is just a stopgap until Navi next year; technically it’s amazing that they shrunk Vega so quickly.

    2. nah, that is just the drama he puts out for the public lol. the reality is that both companies are working together to get better profits. why is nvidia pushing the price higher with RTX? why did AMD not take this advantage to offer better value for the consumer? they need to put on an act to show they are “competing” or else… they will be accused of price fixing like what happened in the late 2000s XD

  3. I think the biggest hurdle this GPU has is its price. If it were just 100-200 dollars cheaper, it could put a serious dent in NVIDIA’s market share.
    Considering 1080 Tis were about 600 dollars at one point, it’s simply not competing on price, and that is where AMD was usually best.
    The same GPU with 8 GB of HBM for 600 would be a way better deal.

  4. Witcher 3 isn’t a good point of comparison. Witcher 3 is really, really efficient with VRAM usage. At 1080p, maxed out, it uses less than 2 GB VRAM, lol. Doesn’t really cross 2 GB even at 1440p. Maybe gets to 2.5 … maybe … at 4K.

      1. It saves me money, so that’s cool. I think Nvidia had to put something out and the 2000 series isn’t meant to compete; it’s meant to keep them moving forward, even if that forward movement is sluggish.

    1. For people that don’t have a 10-series card and are looking to upgrade, the 20 series makes way more sense: it is the latest hardware and is the same price as the 10 series as of now. It’s honestly useless to upgrade to each new series; I don’t see the value in the price-for-performance increase from one series to the next. The best thing is always to jump a series or two before upgrading.

      1. I just don’t see any reason to upgrade any time soon. Then again, I mostly use my 1080 Ti for VR, and my next VR purchase will be an Oculus Quest, which is standalone.

        Maybe in 2020 for a new GPU but that’s if price/performance ratio is sensible.

        1. oculus quest ahahahah
          pimax 5k, it’s the only decent vr headset and i do need two 1080 Tis for it
          after having a high-FOV headset the view from anything else is a joke

    2. I’m both a 1080 Ti and 2080 Ti owner, and I mostly agree. I have a 3440 x 1440 120 Hz monitor though and I prefer to keep closer to 120 FPS because I’ve gotten accustomed to it. The 1080 Ti still doesn’t manage that in many new titles unfortunately, but the 2080 Ti has just about gotten me there.

      If all the biggest games right now were using DLSS, then I don’t think it would be so underwhelming.

      1. >If all the biggest games right now were using DLSS, then I don’t think it would be so underwhelming.

        DLSS will not work well at target resolutions below 4K; that is why so far it’s limited to 4K. And at 4K you can get the same performance boost and quality by making a custom 1800p resolution.

          1. You can make a custom 3200×1800 resolution in your graphics control panel or by using DSR/VSR, it has the same/better image quality as 4K DLSS (which renders at 2560×1440), as well as the same performance.

        1. i don’t think DLSS is strictly limited to 4K only. the BFV demo that they ran during the CES presentation used an RTX 2060 with RTX and DLSS at 1440p.

          1. I didn’t say it was limited, but there is a reason why they still have not made lower res modes available.

            DLSS renders the game at half of the target resolution, so 1440p would be under 1080p, and 1080p would be 720p. The lower you go, the worse the results, because there is too little information to infer the extra pixels.
            4K DLSS has already been shown to have the same or lower quality compared to simply running a custom 3200×1800 resolution.
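To put rough numbers on the resolution comparison in this thread: the 2560×1440 internal render resolution for 4K DLSS and the 3200×1800 custom resolution are the figures quoted above (actual DLSS internal resolutions can vary per title), and a quick pixel-count sketch shows why both are such large savings over native 4K:

```python
# Pixel-count comparison of the resolutions discussed above.
# The specific resolutions are the ones quoted in the thread;
# actual DLSS internal render resolutions may differ per game.

def megapixels(w, h):
    """Total pixels in millions for a w x h resolution."""
    return w * h / 1e6

target = megapixels(3840, 2160)         # native 4K: ~8.29 MP
dlss_internal = megapixels(2560, 1440)  # quoted 4K DLSS render res: ~3.69 MP
custom_1800p = megapixels(3200, 1800)   # quoted custom res: ~5.76 MP

print(f"4K DLSS renders {dlss_internal / target:.0%} of native 4K pixels")
print(f"1800p renders {custom_1800p / target:.0%} of native 4K pixels")
```

So the quoted DLSS internal resolution is only about 44% of native 4K's pixels, while the 3200×1800 custom resolution is about 69%, which is the arithmetic behind the "same performance boost" claim above.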

  5. Bold and sad but true (post Metallica pic lol). Considering they had to use a node shrink and they barely match NVIDIA, at the same price, probably consuming more power, and with no additional features (not that I find RTX particularly useful, but still). So yeah, I’d say underwhelming.

  6. Yea ok. You know what’s also underwhelming ?

    THE FAIL RATE OF THE 20 SERIES.

    I don’t think you’re in such an almighty position to be making those claims. True, the 2080 Ti is a beast (not looking at the $), but as it stands many customers are just PISSED at Nvidia, and that is reflecting on your STOCK.

    Holy… how could he say that lmao. He should’ve been humble, like “good to see AMD back in action, let’s see who wins” or whatever.

    1. imo Nvidia has grown arrogant over the years. I am so much hoping that Intel gives Nvidia a swift kick in the a*s next year : )

      1. When asked what he thought of Intel’s new push into graphics, Huang commented that Intel’s team is basically someone else’s.

        “Intel’s graphics team is basically AMD, right?” Huang asked. “I’m trying to figure out AMD’s graphics team.” =))

      2. You’re going to be disappointed in this case. Nvidia fans would sell their mothers and firstborn child before they commit the sacrilege of apostasy. It’s a religion for them. Anyway, sarcasm aside, Intel’s first foray into the GPU market after a number of failures would probably be in the low/midrange and datacenter space. They want market share, not the glory of being number one. They’ll probably start nibbling at the feet of Nvidia, and start chomping at AMD’s midriff with low prices and mediocre performance. After all, Intel already has some GPU tech from their iGPUs and is probably using all their previous unsuccessful products, such as Larrabee, to come up with a suitable product to go against Nvidia and AMD. Also, there is a lot of cross-licensing going on between the three; I’m guessing that, performance-wise, their first product would probably fall in the range of where the AMD Radeon VII is right now. AMD is already moving on to a new architecture, and Navi is work derived from when Raja was still at AMD, so some of that tech will probably be in Intel’s product, together with some technology from Nvidia pre-2012.

        Nvidia would probably lose the low end market and would probably have to lower price on the midrange to compete. AMD would be the one most affected as Intel will be going for their market share rather than Nvidia. I hope that AMD can pull a GPU Ryzen out of the bag and compete with both Nvidia and Intel when the time comes.

        PS> This is all conjecture on my part. Don’t quote me on it.

  7. 100% agree. Other than the 16 GB of HBM memory there is absolutely no benefit for gamers in getting it over a GTX 1080 Ti or RTX 2080. And what game uses that much VRAM anyway?

    1. Totally agree with you. Not to mention you don’t get any of the GameWorks, Physx, their drivers + features and any of their other tech/features.

      Sort of sucks, for I really wish they could have done the same turnaround as their CPUs. That would have been something.

        1. Yeah, but to leverage that for AMD GPUs you need to do a lot of work.

          All the old games that have GPU PhysX support, including new ones like Metro Exodus that drop in 2019+, will never work on AMD GPUs (they would have to be patched with a modded PhysX for AMD; no one will bother).

          It will probably take years before a shop makes a custom PhysX build for AMD GPUs, and even then, will they even bother? If the market share is so massively in favor of Nvidia GPUs, what is the incentive? Want PhysX? It’s already ready for Nvidia GPUs out of the box.

          Sucks that Nvidia themselves did not just make a new PhysX build that ran on any GPU in accelerated mode and then just update that going forward, without even bothering to go open source. I fear practically no one will bother with it for AMD 🙁

          1. because even AMD did not care about it. ten years back nvidia did open the door to AMD (of course, it was not free), but AMD rejected the idea and worked on Bullet instead. Bullet was supposed to be the PhysX killer in the GPU-accelerated department, but in a decade of its existence no developer actually used it.

          2. Yeah 🙁 feels bad, man. I always wonder what if AMD had just given in and paid the license fee. I personally love the idea of the PhysX tech (what it is) and was crazy about it when it first came out (and still love it when used well). Man, if only every game had adopted it and it worked on everything; that would have been amazing.

            Huh, never heard of Bullet, only the TressFX stuff, which I thought they were sort of converting into a PhysX-like framework feature set beyond just the hair and grass/fur tech. Thanks for the heads-up on Bullet; gonna look it up.

          3. well, because after announcing the project they never did anything to promote Bullet or help it get adopted by game developers. this is pure speculation on my part, but AMD is primarily a CPU company and they think like one. since Bullet is open source (and free), they probably thought developers would pick Bullet over PhysX automatically. sadly, in the world of software it does not work like that. hence, with TressFX, AMD changed their stance a bit on third-party middleware: instead of letting developers use it of their own free will, they promoted the feature in their Gaming Evolved titles. but that is just the easiest part. the more difficult part is spending the effort and resources to make sure developers implement it right. Crystal Dynamics’ implementation in the Tomb Raider series was very good, but part of that was because TressFX is a tech they co-developed with AMD (and they were proud of it). but others? take, for example, Lichdom: Battlemage. they outright said the implementation of TressFX in that game was just to fulfill their marketing agreement with AMD. TressFX had a very big performance hit on nvidia cards in that game; instead of fixing the performance together with nvidia, like what Crystal Dynamics did with nvidia on TR 2013, they simply disabled the feature altogether for nvidia GPUs. i think AMD took extra care with Deus Ex: Mankind Divided so something similar did not happen (and end up tarnishing their open GPU initiatives).

        1. If history had panned out a little differently then one likely answer would have been Crysis 4. Possibly also Cyberpunk 2077 if CD Projekt Red hadn’t started catering so heavily to the console crowd.

    2. Games won’t use 16 GB for years; AMD needed more memory stacks to increase memory throughput, and we get to pay for it. But I will say 8 GB is not enough for 4K gaming long term; Nvidia cheaped out.
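For context on the throughput point: peak memory bandwidth is roughly the bus width (in bytes) times the per-pin data rate, which is why adding HBM2 stacks widens the bus and raises bandwidth even if the capacity goes unused. A quick sketch using the commonly cited specs for these two cards (treat the exact figures as illustrative):

```python
# Peak memory bandwidth ≈ (bus width in bits / 8) * per-pin data rate (Gb/s).
# Specs below are the commonly cited figures for each card.

def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    """Theoretical peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

radeon_vii = bandwidth_gb_s(4096, 2.0)  # 4 HBM2 stacks, 1024 bits each
rtx_2080 = bandwidth_gb_s(256, 14.0)    # GDDR6 on a 256-bit bus

print(f"Radeon VII: {radeon_vii:.0f} GB/s")  # ~1024 GB/s
print(f"RTX 2080:   {rtx_2080:.0f} GB/s")    # ~448 GB/s
```

Each HBM2 stack contributes a 1024-bit slice of the bus, so four 4 GB stacks give both the 16 GB capacity and the roughly 1 TB/s figure: the capacity is a side effect of chasing bandwidth.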

      1. That’s the thing. Most people seem to like to upgrade every other generation, so most likely a 4-year lifespan. 8 GB may not be enough for 4K during that lifespan.

    3. Current-gen games (mostly at 4K, though) reach 7+ GB of VRAM. We’ll have next-gen consoles with higher system specs arriving less than 2-3 years from now. It’s going to help then.

      Also, technically, ray tracing is a DX12 spec. Nothing is stopping AMD from simply releasing an update to support it on any of its DX12 GPUs, unless there’s a specific requirement we aren’t aware of. Contrary to what many people think, dedicated RT cores a la Nvidia are not a required part of the spec for ray tracing; they’re just Nvidia’s way of implementing Microsoft’s DXR (DirectX Raytracing) spec.

      1. the RT core is not mandatory to run ray tracing on a GPU, but its existence “accelerates” RT performance. remember the Star Wars demo? there were youtube videos comparing cards like the GTX 1080/1080 Ti vs. the 2080 and 2080 Ti.

      2. 2-3 years from now the Radeon VII will not be important, as both NVidia and AMD will have released far more powerful GPUs. And for years now AMD has been fighting last-gen GeForces. An AMD flagship that can go up against the non-flagship RTX 2080 in raw performance is certainly not impressive, especially seeing as how NVidia constantly adds quality-of-life improvements to every new GPU, whereas AMD’s offering is very barebones.

        Add to that NVidia completely ruling over the laptop market, and it’s a sad picture.

        1. People that got a 1 GB 8800 GT or 1 GB 3870 could play all of the games released on X360/PS3 at close to max settings.
          People that got a 2-4 GB version of the GTX 660/Ti or 7850/7870/270X could play all the games released since 2013 up until the end of this console generation in ~2021-2022 just fine.
          People who will get this Radeon VII will be able to play all next-gen games just fine for the entirety of the next 8-10 years.

          1. You are wrong. Towards the end of life for the Xbox 360/PS3, the 8800 GT (and the like) was barely adequate for anything higher than medium settings in those games. Lack of optimization (and new drivers) for 6-year-old video cards, coupled with ever-arriving new features (like tessellation, which would obliterate FPS on any Radeon at the time), contributed heavily to that. The GTX 660 Ti and 7870 struggled to run any modern game even back in 2014 at anything higher than medium settings at 1080p (seeing as how those are more or less equal to the PS4 GPU). The Radeon VII would probably do better when the new consoles are out, but that’s because it’s not a midrange video card by far. Still, it will struggle with stuff like ray tracing, which will be in every major title 3 years down the road.

          2. Both the 660 Ti and 7870 (the 4 GB versions) run *all* current-gen console games at console quality settings at same or better FPS and/or visuals than the base PS4 for example.

            The PS4 is slower than a GTX660 non-Ti or HD 7850.

            AMD’s tessellation “deficiency” was addressed back in 2011, after the Crysis 2 tessellation debacle, with the Tessellation Override setting in the driver, which made AMD GPUs as fast as or faster than equivalent Nvidia GPUs with zero or close to zero loss in image quality ever since.

          3. GTA5 on low at 720p looked better than the console versions, and the 8800 GTX could actually play that game at that.

  8. Harsh, but true. So… I take it they won’t bother releasing the 1100s, or will that be the final death blow to AMD’s GPU lineup for this year? So it looks like there will be no 2000-series price cut any time soon 🙁

    Ah well, I still love my 1950X, and the new 3000s look insane (I want the new Threadripper). It is a shame the price could not have been lower for the new GPU; maybe their midrange will be better. I wonder now if they had gone with GDDR6 whether it would have cut the price by a large margin or made a difference. Who knows.

      1. You’ve clearly got a bias that is not based upon evidence, but instead an emotionally fueled hatred for one side that drives all of your opinions and remarks.

    1. I’m assuming you’re being facetious. Certainly you know that handling the PCI connector in this manner isn’t at all harmful…

  9. Though Nvidia’s offerings with the 20 series isn’t anything too impressive, I have to agree with him. Why would anyone buy this over a 2080 when the Nvidia card at least offers some tech the AMD does not with the same base performance?

  10. How did you miss LISA’s reaction/response ?:

    [ When asked to comment on Huang’s critique of the card, AMD CEO Lisa Su responded:

    “What I would say is that we’re very excited about Radeon VII, and I would probably suggest that he hasn’t seen it yet.” ]

    1. I hate when people start out sentences with “what I would say”…. Of COURSE it’s what you would say, as that’s exactly what you’re plainly about to do. Just say it.

    2. In all honesty though, what has she seen of this card that Jensen hasn’t yet seen? The performance numbers already appear to be out, and as they stand, they are precisely as Jensen described them in his remarks. What other trick up the sleeve does this card have that we or Jensen have yet to see if not benchmarks?

    3. She should have retorted with,

      “Maybe Jen-Hsun is seeking to deflect attention from the G-Sync embarrassment, the problems with DLSS as shown by Digital Foundry and the aggrieved GTX1070Ti owners courtesy of the timing of RTX2060’s announcement. How’s that GeForce Partner Program working out for him?”

      ?

      1. There are no DLSS problems, and Digital Foundry loves it; they just tested the RTX 2060 and found it has the same performance at 4K as the 2080 with TAA.
        Everything else you listed is nothing but cheapo pauper trolling; no one cares what people with last-gen cards think. There is always something new that comes out; maybe you should sit home with a Voodoo and wait for THE NEXT BIG THING just around the corner, see where it gets you.

  11. [There’s] no ray tracing… really, Nvidia? No one cares about ray tracing either !!!
    And if we turn on DLSS = and if we turn the resolution down !!! haha

  12. Jensen Huang’s not wrong; it is underwhelming that a new, similarly priced GPU is not even able to keep up with Nvidia’s highest-end card. Where’s Fermi already?

    1. How so? Everything he said seems valid to me. The AMD card merely matches NVIDIA’s performance despite being released months later, and the price isn’t at all competitive, not to mention it lacks all of the RTX and AI features of the 20 series. What isn’t valid about that?

      You’re blind.

        1. If we were VRAM-starved in current titles that might actually matter, but we aren’t. Doubling VRAM means nothing if current games aren’t even using it. The more intensive titles even at 4K come in around 7.5 GB at the highest.

          Non-FE 2080s are now easily available at the $699 MSRP, matching the new AMD GPU.

          Just because we differ in perspective doesn’t make one of us retarded, but the speed with which you presume that others are definitely makes you an @$$hole.

    2. Two cards, same price, same performance.

      One card has RTX and AI features that at least SOME games are already using and that future games can use. The other one does not.

      The other card was also released much later, knowing exactly what competition it was up against and it still failed to improve upon anything.

        1. I could agree about RTX at the moment, but DLSS is beneficial, even if it isn’t standard in every game yet. And it’s still something that the AMD card lacks completely. You would think that releasing later than NVIDIA would give AMD more of an edge in competing with their offerings, but apparently not.

          1. No point explaining expensive hardware to the paupers; he is too poor to afford it.
            RTX is awesome even in BF5; RE2 comes in 2 weeks, Metro in a month.

        2. You are a poor hobo and pauper that can’t afford an RTX card; that’s the only reason “none of those features are worth a damn”.
          I enjoy my 2080 Ti and all the new features.
          Keep on living in 2001, HOBO

  13. LOL, do you expect this butcher to say that the competition’s meat is pretty good?
    F/ck ray tracing, another gimmick like PhysX that nobody uses and that only serves to drop your framerate.

  14. “$699, however most gamers can already find… RTX2080s at similar prices”

    AMD are typically competitive on price relative to Nvidia so why not this time?

    The cynic in me is wondering if this card has been designed to be extremely capable at crypto-currency mining so AMD are seeking to get ahead of the scalpers who resell such cards at huge mark-ups.

  15. 7nm loaded with 16 GB of HBM2, barely competing with a 12nm card.
    Both the 2080 & Vega 7 are terrible cards; we’ve already had the 1080 Ti at the same price since 2017.
    That’s not how a new GPU generation works; we should’ve gotten the 2080 under $500, and the same for Vega 7.
    The GPU market is getting more depressing by the day, so much so that the overpriced RTX 2060 is starting to look like a good-value card.

    1. To be fair, if Nvidia didn’t bother with ray tracing they could have used the extra die space for more CUDA cores instead of RT cores, and they would have made way better GPUs for today.

      1. problem is, if they made such a GPU, there is no CPU in the world that would be able to keep up with it… unless you play at 8K, maybe. that’s why they decided to play with RT.

  16. lol no. they will be talking about what drama to put out next and new ways to increase the net profit of both companies.

  17. He kind of has to say it’s crap, just to try and boost investor confidence after their share price went down when the crypto craze died off.
    What sort of CEO would say, “yeah, our competition’s product is fantastic and it’s really going to hurt our sales”?

    1. Both Nvidia and AMD have taken a hit but Nvidia stock really did take a nose dive. In the last 3 months it’s down 50%. Over the same period AMD fell 30%. Looks like nervous investors are running for the exit signs but that’s what nervous investors do.

      They generally buy in when stock prices are already too high because they think it’s going to just keep going up and they panic and sell when it’s too low. Trillions of dollars were lost that way in the stock market crash by average investors during the Great Recession in the late 2000s.

      Warren Buffett is quoted as saying, “I will tell you how to become rich. Be fearful when others are greedy. Be greedy when others are fearful.”

      Considering he’s worth around 80 billion dollars, I think he knows what he’s talking about with his contrarian views.

  18. As a gamer (owner of both, and also Intel), everything is underwhelming at the new prices they put up in 2018/19.

    GPUs? Fail. CPUs? Fail. Laptops that overheat and cost 5 times more than a desktop? (Yeah, fail.) Monitors that cost over 2k? Huge fail.

    It’s all overpriced; nothing looks interesting, nothing is exciting. How can you get excited when the performance they are trying to sell you already existed 2 years ago for the same price? Actually, even cheaper. How can you get excited when the real upgrades over the last gen cost $1200 for 1 single PART? The whole lineup sucks; this new card is just a new Titan card. Wake me up when they actually release NEW cards at REASONABLE prices.

    The Nvidia CEO is 500% right, but he could also add that their own lineup is pretty dull too. Ray tracing? 1 game supports it, a competitive FPS. Nobody cares about eye candy if you get owned by people with 144 fps/Hz monitors. Metro could be good, but it’s not out yet. Promises about stuff that doesn’t exist yet or isn’t out yet = EMPTY promises too, mister Nvidia CEO guy. Those are not exactly fun either.

    DLSS? Meh. How many games already fully support it? (Not many.) Also, from the comparisons I’ve seen… it’s not exactly 100% better either. It’s a trade-off.

    Intel with their crazy new pricing? Removing Hyper-Threading and changing i7 to i9 just for the name’s sake, and making i7 into i3/i5?

    I swear, there is nothing exciting about anything anymore. That rolling-wallpaper TV? Useless. A phone with 5 tiny meh sensor cameras? MEH. Okay, I’ll stop here. Nvidia and AMD both suck. At least AMD makes good CPUs that are well priced. The new ones seem alright; we will see once the tests are out.

    1. This is just making me mad, grrrrrr. Does the greed ever end? Price fixing, money-making schemes, etc… grrrrrrrrrrrrrrrrrrrrr. F it all. I had an 8800 GTX for 7 years and then a 760 for another 5-7. I can hold on to my 1080 for probably 10 more years if they keep being greedy and evil. I guess I’ll keep my cash, even if I want to give it to Nvidia/AMD. I won’t give sh*t until I see something worthwhile.
