AMD officially announces its next-generation high performance gaming GPUs, Radeon VII

AMD has officially announced its next-generation high performance gaming graphics card, the AMD Radeon VII. The AMD Radeon VII will target high-end gamers, will be the first 7nm graphics card, and will come with 16GB of VRAM.

The AMD Radeon VII comes with 60 compute units running at up to 1.8GHz, promises 25% more performance at the same power, and offers 1TB/sec of memory bandwidth.
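For context, the announced figures imply a peak FP32 throughput in the neighborhood of 13.8 TFLOPS. This is a rough sketch assuming the standard GCN layout of 64 stream processors per compute unit and counting a fused multiply-add as two FLOPs; the 1.8GHz clock is the "up to" figure from the announcement, not a sustained clock.

```python
# Rough FP32 throughput estimate for the announced Radeon VII specs.
# Assumptions: 64 stream processors per CU (standard for GCN) and
# 2 FLOPs per clock per SP (one fused multiply-add).
compute_units = 60
sp_per_cu = 64
clock_ghz = 1.8        # announced "up to" boost clock
flops_per_clock = 2

tflops = compute_units * sp_per_cu * flops_per_clock * clock_ghz / 1000
print(f"Peak FP32: ~{tflops:.1f} TFLOPS")  # ~13.8 TFLOPS
```

Real-world game performance depends on far more than peak FLOPS, but the number lines up with the card's positioning against the RTX 2080.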

AMD has also revealed some initial benchmarks in which the AMD Radeon VII is 35% faster than Radeon RX Vega 64 in Battlefield 5, 25% faster in Fortnite and 42% faster in Strange Brigade.

Naturally, AMD Radeon VII will support FreeSync 2, Async Compute, Rapid Packed Math and Shader Intrinsics.

The model that AMD showcased at CES 2019 featured three fans, was a dual-slot GPU, lacked any DVI output and appeared to feature DisplayPort and HDMI outputs.

The red team initially did not reveal availability or pricing, but has since announced that the Radeon VII will be available on February 7th, priced at $699!

85 thoughts on “AMD officially announces its next-generation high performance gaming GPUs, Radeon VII”

  1. Okay, I see that it’s between 25-45% faster than the Vega 64, but how does the Vega 64 do against the 2080 Ti? Their price point is going to be the selling point, not the performance itself.

    1. If it goes for $500-550 I think it should be OK, but I also consider power consumption and how hot it gets. I assume that’s the reference model; since it has 3 fans I think it gets quite toasty. It will probably be HBM, which is also costly. She said High Bandwidth Memory a few times.

        1. Indeed, AMD failed to take advantage of Nvidia’s mistake. Now we are left with 2 epic failures. INTEL SAVE US.

          1. lol what mistake? If anything, what Nvidia is doing right now is something AMD has been hoping for since a long time ago. For several generations Nvidia has been the one really putting pressure on AMD to gain more profit. Just look at the recently launched RX 590. Why $275? The RX 480, with roughly the same performance (slightly slower), launched at $250 two years back. After two years this class of performance should already have dropped below the $200 mark.

    2. “Their pricepoint is going to be the selling point not the performance itself.”

      As always pretty much.

  2. UPDATE: Just announced to come out Feb 7 at $699

    I’m glad we’re getting something, but a node shrunk OC’d Vega just isn’t going to cut it at kicking Nvidia in the teeth. We really need them to do better, and create a brand new architecture that doesn’t rely on GCN. Or Intel if they fail.

    1. Actually, if the Radeon VII can achieve up to 2GHz+ on the core clock, it will be very disruptive for Nvidia…and I welcome it.

        1. I haven’t purchased a Radeon card since 2006, so I hope it does. If the Radeon VII lacks sufficient overclocking headroom, it will be dead on arrival.

    2. Nvidia probably already knew something like this would happen. I mean, when we saw the RX 590 we could already suspect what AMD would do with their pricing.

  3. AMD is supposed to be the “people’s champ”, yet they come up with a $700 card that will go toe to toe with the 2080 without packing the future tech that might enhance gaming. We are effectively living in a monopoly, because AMD does nothing competitive.

    1. Fanboys dubbed them the poor man’s champion. AMD has been trying to ditch that market since Ryzen’s launch.

  4. So $100 less than an RTX 2080 for a card that performs the same as the 2080, but has NONE of the AI or RTX features at all?? (Not to mention that you can get non-FE 2080s for the same price now.)

    This isn’t REMOTELY the kind of earth shattering competition that AMD fanboys have been heralding the next release to be. Yet again, they’re just catching up to NVIDIA. When is AMD going to release a card that OUTPERFORMS the equivalent NVIDIA GPU instead of falling short or just barely matching the competition?

    Why does AMD think they can even bring this nonsense to market when it brings nothing new or better to the table and shows up to the party half a year late?

    And what about the 2080 Ti?? Is AMD just going to ignore that tier of GPU completely from now on?? I just don’t understand what’s going on in their minds…

    This card needed to be $499 to get the respect that AMD desperately needs right now.

    1. Dude, AMD has been ignoring the high-end GPU market for years now; that’s probably not gonna change for a while.

      At least AMD is highly competitive with Intel. AMD is a CPU company at heart; I’d much rather have them make beast CPUs than GPUs.

  5. LOL you just know some modders will come up with texture packs that require 13-14 gigs VRAM now. Apart from the amount of memory, nothing to write home about. Would still go for this over 2080 though, at current prices at least.

        1. Any standard high-quality PSU should be sufficient. I suppose any 650/700+ watt PSU should do the job, leaving some headroom, but the exact power draw from the wall will still vary from system to system.

          The recommended PSU rating is usually on the higher side, just in case someone wants to overclock their rig or add more PC components.

          On another slight off topic note:

          A lot of peeps actually SKIMP on the PSU. And most importantly, many aren’t even fully aware that WATTAGE number alone means nothing when it comes to any power supply.

          The main concern is the “quality” of the power: the quality of the components and caps used, the total amperage available on the +12V rail, the efficiency under load, and ripple suppression, among other factors. The total wattage of any PSU is not always the most important deciding factor; the primary concern is the quality of the power it produces and the total capacity of the 12V source.

          Though how the rails are laid out does not matter that much, i.e. single vs. multiple +12V rail PSUs.

          A multi-rail PSU can be mildly better, especially in a high-wattage unit, but it won’t have any impact on performance. It can, however, provide an extra layer of safety: a multi-rail power supply has OCP on every +12V rail, helping ensure that your PC components survive if a problem like a short circuit occurs.

          A cheap generic low-end PSU is far more prone to early failure than units made by reputable brands like Seasonic, Corsair, be quiet!, EVGA (SuperNOVA), PC Power & Cooling, Antec (except the *EarthWatts* series), XFX, Super Flower, and OCZ, just to name a few.

          The OEM also matters a lot, often more than the brand on the label. I’ve seen PSUs labelled as 1K watts that, in real-world scenarios, can hardly deliver 400 watts even under full load.

          I’ve always given the PSU top priority when building any rig, and I mostly go for Tier 1 and/or Tier 2 units, because any Titanium/Platinum/Gold PSU is going to be much more efficient at 50% load than a Bronze/Silver or generic 80 Plus certified PSU.

          Though the exact Wattage requirement still varies from system to system, and if we plan to Overclock the GPU/CPU, then the PSU should have some headroom as well.
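The sizing advice above can be turned into a back-of-the-envelope estimate. All wattage figures below are illustrative assumptions, not measured values for any specific card or CPU:

```python
# Back-of-the-envelope PSU sizing sketch.
# All inputs are illustrative assumptions, not measured figures.
gpu_board_power = 300   # assumed for a high-end card
cpu_tdp = 95            # typical desktop CPU
rest_of_system = 75     # drives, fans, RAM, motherboard (rough guess)
oc_margin = 1.2         # ~20% headroom for overclocking

load_watts = (gpu_board_power + cpu_tdp + rest_of_system) * oc_margin
print(f"Estimated load: {load_watts:.0f} W")      # 564 W

# Approximate current on the +12V output, which supplies CPU and GPU.
amps_12v = load_watts / 12
print(f"Approx. +12V draw: {amps_12v:.0f} A")     # ~47 A
```

Running a quality 80 Plus Gold/Platinum unit at roughly half its rating keeps it near peak efficiency, which is why a 650-750W PSU is the usual recommendation for a build like this.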

          1. My problem was I bought an HP 6700 non-K with a 500W PSU, swapped the GTX 960 for a GTX 1080, and it was great. Then, upgrading to my RTX 2080, I had to buy a 650W PSU; it scared me, but it worked great on first boot. If a new PSU is required for a new GPU, it raises the cost for all consumers who didn’t buy a gaming PC with an adequate power supply.

          2. Agreed. Don’t ever cheap out on the PSU, people. PC components are expensive, especially when you start to look at a $700 GPU alone in your rig.

            I’ve known a few PC repair people in my time and they have told me some horror stories of burnt motherboards and GPUs when a cheap PSU went out.

            On JonnyGuru they actually tested one sh*tty PSU that exploded under full load.

  6. Yup, that’s a 7 nm “Vega 20” silicon with 60 compute units (3,840 stream processors), and a 4096-bit 16GB HBM2 memory interface.

    This card could be a respin/cut-down variant of the Instinct MI60/MI50 chip, using a different binning process, with 4 CUs disabled. Just my guess, but it seems probable.

    But it seems to be only slightly faster than the RTX 2080 graphics card. Not really worth it, more like a bummer.

    https://uploads.disquscdn.com/images/9fc4417938c73db034156d100942ebd2a0ae5c27dd95637bfe57f9317fd2f085.jpg

    1. Really not exciting, unfortunately. It’ll trade blows with the 2080 depending on the game, while consuming significantly more power despite being on a smaller node and using the much more power efficient HBM2. I want AMD to win, but this does nothing to combat Nvidia’s dominance in the market.

    2. We have to wait for reviews but one thing about it, if Radeon VII does outperform the 2080 by a little for $100 less then at least that’s something and the new card makes Nvidia’s 2080 Ti look even more ridiculous. From Radeon VII to 2080 Ti you pay 70% more for a 25% increase in performance.
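A quick sanity check of the price/performance claim in the comment above. The 2080 Ti price here is the Founders Edition MSRP, and the 25% performance gap is the commenter's estimate, not a benchmarked figure:

```python
# Checking the "70% more money for 25% more performance" claim.
# Prices are launch MSRPs; the performance gap is the commenter's estimate.
radeon_vii_price = 699
rtx_2080_ti_price = 1199     # Founders Edition MSRP

price_premium = rtx_2080_ti_price / radeon_vii_price - 1
print(f"Price premium: {price_premium:.0%}")      # ~72%

perf_gain = 0.25             # assumed 2080 Ti advantage over Radeon VII
perf_per_dollar = (1 + perf_gain) / (1 + price_premium)
print(f"2080 Ti perf-per-dollar vs Radeon VII: {perf_per_dollar:.2f}x")
```

So the commenter's "70% more" is in the right ballpark, and by this estimate the 2080 Ti delivers only about three-quarters of the Radeon VII's performance per dollar.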

        1. Not remotely. What baffles me is how they can’t seem to break the cycle of releasing cards that do not outperform their competition when they’re releasing them half a year later, knowing full well what they’re going up against. If they were releasing these products at half the price of the competition, it would be different, but they’re not.

          1. There’s nothing baffling about it. It’s economics. AMD is a small company next to Intel and Nvidia, both of whom they have to compete with as best they can with what little money they have.

            For the last 4 years AMD has averaged around 5 billion dollars in revenue (including both their CPU and GPU businesses) and for the first time in a long time showed a profit of 43 million dollars for the year 2017. The previous 3 years they averaged a loss of 500 million dollars each year. They have been using money that could have gone to R&D to pay down huge debts. They damn near went bankrupt a few years back.

            For the last 4 years Intel has averaged a revenue of 58 billion dollars with an average profit of around 10 billion dollars each year.

            For the last 4 years Nvidia has averaged revenue of 6 billion dollars and averaged a profit of 2 billion dollars each year.

            AMD decided years ago to put most of their R&D money into Ryzen and let the GPU side mostly stagnate. No matter what people wish for AMD to be they cannot fully compete with Intel and Nvidia at the same time. They don’t have anywhere near enough money.

          2. AMD needs to make money. Sure, Ryzen is doing well right now and generating profit for the company, but to keep Ryzen on its edge they need to spend more on R&D for future Ryzen development (Intel is not sitting still). So the talk about AMD making money on Ryzen, and therefore having more R&D for GPU development, might not be that simple; the GPU division will need to make its own money. Luckily for AMD, Nvidia is going crazy with RTX pricing, so there is no pressure to “officially” cut the prices of their existing products. The RX 590 launching at $275 is already proof that they are taking advantage of what Nvidia has been doing.

        2. People are going to start thinking I’m an AMD fan even though for the last 12 years I’ve bought only Nvidia GPUs, but it amazes me just what AMD can deliver with such a small income and profit. Given the difference in budget sizes, they make Intel and Nvidia look lazy and wasteful with resources and R&D by comparison.

        3. Well, I remember when Bulldozer was around, AMD did compete in the high end; the 7970 GHz Edition did a good job of fighting off the 680.

          Now it’s the opposite: Ryzen is easily competitive with Intel, but they can’t seem to compete with Nvidia in the high end.

          1. With Bulldozer, AMD was counting on core count to make up for the lack of IPC improvement, but in reality you cannot ignore IPC no matter what. Ryzen was a good balance between core count and IPC, and thanks to Intel dragging their feet on IPC improvement since Sandy Bridge, Ryzen caught up very nicely with Intel.

            To compete with Nvidia, AMD really needs a brand new architecture instead of another GCN with minor improvements.

  7. @John, this is the price: $699! Expensive, but it makes sense since it is using 4 stacks of HBM2 memory and is being built on 7nm.

    Edit: I think this card comes with 3 new games to choose from as well. 😀 They even showcased a demo of Devil May Cry 5 running in 4K.

    1. Do you think it would be feasible to have an 8GB version at a lower cost…like $599?

      I mean the average gamer doesn’t need 16GB of VRAM… right?

      1. Yes, but that depends on how they actually design the chip. Or they could’ve at least used three HBM stacks and dropped the memory to something like 12GB; maybe that would have resulted in a $550 MSRP. Just my guess.

  8. So it’s basically on par with a GTX 1080 Ti but has more VRAM than any game will use for a few more years.

    1. Lmao, not at all. The VRAM they are using is HBM. HBM provides performance gains far beyond the “raw amount” of VRAM added; the VRAM itself is capable of MUCH higher throughput (1 TB/sec is mind-boggling).

      1. While I can see more and more apps using upwards of 8 GB of VRAM, I’ve never seen anything get close to that kind of VRAM throughput. The speed of the GPU and the capacity of the VRAM will be obsolete before games require that much memory bandwidth.

        1. There are actually many applications TODAY that would benefit from that kind of insane throughput.

          For example, open-world games with extreme amounts of detail need to stream in a large amount of data per frame, and gaming at 4K60 requires a huge amount of bandwidth since there’s so much pixel data in each frame.

          Not to mention the applications outside of gaming like video editing at high resolutions etc.

          And lastly, at the most basic level HBM reduces the time it takes to fetch things from VRAM which provides a direct boost to frames per second since frames can be made faster.
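To put the 4K60 point above in perspective, here is a rough calculation of just the final framebuffer traffic, assuming an 8-bit RGBA color buffer. Real rendering touches far more data per frame (G-buffers, shadow maps, textures, intermediate render targets):

```python
# Rough traffic for writing only the final 4K color buffer at 60 fps.
# Assumption: 4 bytes per pixel (8-bit RGBA); real frames move much more data.
width, height = 3840, 2160
bytes_per_pixel = 4
fps = 60

gbytes_per_sec = width * height * bytes_per_pixel * fps / 1e9
print(f"Final framebuffer alone: ~{gbytes_per_sec:.1f} GB/s")  # ~2.0 GB/s
```

Two GB/s is a tiny fraction of 1 TB/s, which illustrates where the bandwidth actually goes: repeated texture fetches, overdraw, and the many intermediate buffers behind each displayed frame.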

    1. I’m not sure they could have done this with three 4 GB stacks of HBM memory, but in any case one more stack obviously didn’t add $200 to the price; otherwise just the 16 GB of HBM memory alone would have cost $800.

      1. I don’t know, man, but even 8GB would be enough. It would reduce the cost, though I don’t know by how much, maybe $100-200. You can’t compete with Nvidia at that $699 price.

        1. There are ignorant comments everywhere about the VRAM on this card. Much of the performance increase is from the increased memory bandwidth. Each stack can only do so much bandwidth, so if you remove capacity you lower the bandwidth; 8GB at the same speed would only have half the bandwidth of the 16GB version.

          1. Well, I don’t know if AMD can do some compression on HBM; if not, GDDR is usually much easier and cheaper to manufacture than HBM. One good thing I can think of about HBM is that it reduces peak power draw. So, short story, AMD could’ve gone a more efficient route with hardly any performance impact.
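The per-stack bandwidth argument in this thread can be sketched numerically. Each HBM2 stack contributes a 1024-bit slice of the card's 4096-bit bus; the per-pin data rate below is an assumption backed out of the announced 1 TB/s figure, not a confirmed spec:

```python
# Why halving HBM2 capacity halves bandwidth on this design:
# removing stacks removes their 1024-bit bus slices too.
stacks = 4
bus_bits_per_stack = 1024    # standard HBM2 stack interface width
data_rate_gbps = 2.0         # assumed per-pin rate, inferred from 1 TB/s

bandwidth_tb_s = stacks * bus_bits_per_stack * data_rate_gbps / 8 / 1000
print(f"{stacks} stacks: {bandwidth_tb_s:.2f} TB/s")       # ~1.02 TB/s
print(f"{stacks // 2} stacks: {bandwidth_tb_s / 2:.2f} TB/s")
```

So an 8GB two-stack variant at the same per-pin rate would indeed land around 512 GB/s, which is the trade-off the comment above is pointing at.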

  9. Ya, but will it still act as a secondary heater, like every other AMD card? AMD is just god awful with thermals.

    Great for the winter countries during those cold months I suppose, but otherwise……

  10. Waiting for those 7nm hype people who said it would kill anything Nvidia and Intel produce on 12nm and 14nm.

    Still, my 1070 Ti is itching for this one.

  11. Hello John,

    More news from AMD. I think all hope is not lost. It seems AMD will indeed launch entry-level and mid-range products built on the 7nm process, as told by AMD CTO Mark Papermaster. It is also possible that these new products will utilize the Navi architecture and may use cheaper memory technologies such as GDDR6.

    Please give this a read.

    https://www.thestreet.com/video/amd-cto-14830482

    https://videocardz.com/newz/amd-cto-mark-papermaster-promises-more-radeon-products-this-year

    Thanks…..

  12. @JOHN,

    Give this a read. It seems AMD only has limited stock of this card. This is again a huge bummer for the card and the company. This wasn’t meant to be a gaming-oriented card, to be very honest.

    TweakTown has reported that AMD will launch with “less than 5,000” Radeon VII graphics cards, with sources claiming that AMD will lose money on every graphics card sold. The Radeon VII was originally designed as the Radeon Instinct MI50, a graphics chip that wasn’t expected to receive a consumer release.

    https://www.overclock3d.net/news/gpu_displays/amd_rumoured_to_have_less_than_5_000_radeon_vii_graphics_cards/1
