AMD Radeon Vega Architecture – First information surfaces via leaked word cloud image

AMD will officially preview its Radeon Vega architecture this Thursday; however, we just got the first information about it via a leaked image. The image comes from AMD’s official Vega website, meaning it is as legit as it can get.

As we can see, VEGA will feature a high bandwidth cache, will be 4X more efficient (our guess is compared to Fiji’s HBM1, as VEGA will take advantage of HBM2), will sport a high bandwidth cache controller, will feature a draw stream binning rasterizer, will support the Next-Generation Compute Engine, and will offer 2X peak throughput per clock.
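
For context, “draw stream binning rasterizer” is broadly understood to describe a form of tile-based binning, where triangles are sorted into screen-space bins so each bin’s work stays in on-chip cache. The Python sketch below is a purely conceptual illustration of binning, not AMD’s actual hardware algorithm; the tile size and data layout are our own assumptions.

    # Conceptual sketch of screen-space triangle binning. This is NOT
    # AMD's hardware algorithm; tile size and layout are illustrative.
    from collections import defaultdict

    TILE = 32  # assumed bin size in pixels

    def bin_triangles(triangles, width, height):
        """Assign each triangle to every tile its bounding box touches, so a
        rasterizer can then process one tile at a time with good locality."""
        bins = defaultdict(list)
        for tri in triangles:
            xs = [v[0] for v in tri]
            ys = [v[1] for v in tri]
            # Clamp the triangle's bounding box to the screen, then walk tiles.
            x0, x1 = max(min(xs), 0), min(max(xs), width - 1)
            y0, y1 = max(min(ys), 0), min(max(ys), height - 1)
            for ty in range(int(y0) // TILE, int(y1) // TILE + 1):
                for tx in range(int(x0) // TILE, int(x1) // TILE + 1):
                    bins[(tx, ty)].append(tri)
        return bins

    # Two triangles on a 1920x1080 render target:
    tris = [((0, 0), (100, 0), (0, 100)), ((500, 500), (600, 500), (500, 600))]
    print(sorted(bin_triangles(tris, 1920, 1080)))  # the occupied tile coordinates

The appeal of the idea is that pixels covered by a bin can be shaded tile by tile instead of triangle by triangle across the whole screen, which is where the bandwidth and cache-locality win would come from.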

As mentioned, AMD will preview its VEGA architecture this Thursday, so stay tuned for more!

94 thoughts on “AMD Radeon Vega Architecture – First information surfaces via leaked word cloud image”

    1. Well, in terms of actual performance, not really, but it is smaller and produces less heat, so in terms of efficiency it was good. Now, about HBM2, we’ll just have to wait and see.

    2. Lower power draw, smaller, cooled with the GPU die, faster, larger bus width.

      The Fury X has no struggle with its VRAM because the bandwidth is so large; caching textures to system RAM actually had no impact, at least not until you played at 11K resolution.

      Having HBM actually put to shame the benefit Nvidia’s delta compression had in terms of bandwidth efficiency; FPS gains were the only advantage of delta compression.

      Not truly confirmed, but delta compression can sometimes affect image quality, as some people report, and I can agree with some of the proofs provided; then again, that has to be expected when you compress data in such a fashion. It’s not something to lose our minds over, but I’m glad that AMD tries not to go overkill with those features. The reason Nvidia can skimp so much on RAM costs is delta compression; it’s pretty much why the Titan XP and 1080 aren’t bandwidth starved just yet, though they’re close enough.

      See, though, Nvidia is saving by not investing too much in HBM on their newer cards because of that, yet, ironically enough, sells them just as expensive.
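
      For readers wondering what the delta compression discussed above actually does: the general idea (vendor-specific details aren’t public) is to store one anchor value per block of pixels plus small differences, which saves memory bandwidth whenever neighboring pixels are similar. Below is a minimal, hypothetical Python sketch of that idea only; real GPU delta color compression is lossless, block-based, and proprietary, and the format here is invented for illustration.

          # Minimal sketch of delta encoding for a block of pixel values.
          # The format is invented for illustration; real GPU delta color
          # compression is lossless, block-based, and proprietary.

          def delta_encode(pixels):
              """Store the first value as an anchor, then neighbor differences."""
              anchor = pixels[0]
              return anchor, [b - a for a, b in zip(pixels, pixels[1:])]

          def delta_decode(anchor, deltas):
              out = [anchor]
              for d in deltas:
                  out.append(out[-1] + d)
              return out

          block = [118, 119, 119, 121, 120, 120, 122, 123]  # smooth gradient
          anchor, deltas = delta_encode(block)
          print(anchor, deltas)                          # 118 [1, 0, 2, -1, 0, 2, 1]
          assert delta_decode(anchor, deltas) == block   # lossless round trip

      Small deltas like these can be packed into fewer bits than the raw values, which is where the bandwidth saving would come from on compressible content.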

  1. This is AMD PR propaganda all over again. They said the same BS for their 290X 4GB, “4x more efficient”… and then the 780Ti 3GB blew it away.

    1. This isn’t true though; maybe in the beginning the 780Ti was a bit better, but it never blew it away. Plus, the 290X is much faster now, since AMD seem to care about old customers.

      1. If they cared about old customers, they would still be supporting the 5000 and 6000 series in the main driver now. AMD is still optimizing GCN because they still haven’t moved away from the architecture. Also, what helped GCN much later was consoles; that’s why the 290X started winning against the 780Ti. But in some PC-exclusive titles like Civ 6, Nvidia’s 700 series competes just fine with AMD’s R 200 series (DX11).

        1. Pretty sure the 5xxx series received a little boost with the latest drivers too. Anyway, GCN is pretty good; they’d better stick with it and only switch if something really reliable comes to mind. “Some” PC exclusives don’t really matter, the majority does, and the majority says that, as of now, a 290X is clearly faster than a 780Ti in many titles. Of course, this happened quite some time after release, but still…

          1. But people make it out like Nvidia completely abandoned Kepler, which is not the case. Kepler usually fares better in PC-exclusive titles where the game engine is not tailored to GCN’s hardware strengths. And AMD themselves never denied the advantage they have from their console wins: back in 2012 they said Radeon was the right way to go because consoles were all using their hardware, and when asked why Fiji did not support FL 12_1, they said it doesn’t matter because console hardware doesn’t have it either (even their current Polaris does not support FL 12_1).

          2. Consoles don’t matter in anything but market share; how well or badly AMD does on the PC platform isn’t influenced by the console market at all.

          3. You can’t connect the dots? Game and engine optimizations are the key words here. GCN was underutilized all along, and now it is receiving the goodies. Why can’t you see that?

        1. Your version of “handling” must involve pretty low standards if you honestly believe your single 290X is doing a great job at 4K resolution.

          1. It can do 30 fps average just fine, and two in CrossFire can provide 60 fps average as well. People underestimate those GPUs way too much.

          2. Let’s see: 50-60 fps on high settings, AA off, post-processing off. Yeah, doing just fine, thanks.

        2. The 280X and 290X are great cards – I still have my HD 7970, a 280X variant, and it’s still running modern games on high. I can only imagine how powerful two 280Xs would be, but I’m waiting for the 490X, since its percentage increase in power will trump everything else and, of course, be the most cost-effective investment.

          1. Ikr. AMD has delayed it to ensure it’s the newest flagship – since the 390s were bad reskins. AMD wants to save their rep and make the 490 beat the 290X in every way possible, and it starts with VEGA. Stoked – but my wallet ain’t ready, even if it’s $499-$599.

      2. I don’t know why people still believe the lack of support is Nvidia’s fault, when in truth the 290X is looking better because game engines are making better use of the GCN architecture, since games are being optimized for consoles that use GCN too.

        1. No, it’s not that. The thing is, AMD/ATi have stuck with GCN for quite some time now, so they’re capable of squeezing the best out of it, but that can take years for the hardware to shine. I’m not saying it’s Nvidia’s fault; Nvidia does a different thing. They prefer to release something already quite mature and optimized, so there’s little to no margin of improvement for them. The result is delivering faster products within a certain window after release (and often it stays like that; in fact, AMD has only lately started to catch up over time, whereas previously they really didn’t).

          1. Man, do you really think AMD would try to harm itself by launching underperforming hardware?
            That does not make any sense at all. AMD needs developers to get in touch with its hardware “intrinsics”; that’s why they pushed Mantle, DX12, and Vulkan, because they were limited. For some reason I don’t know exactly, most game engines are better optimized for Nvidia hardware; I’m not sure if Nvidia’s drivers are simply better, or if Nvidia’s hardware is used better or is easier to develop for. But right now I think we’re going to see a turnaround, and it’s because developers are getting in touch with AMD hardware; that path was paved by the consoles.

          2. They just release it unpolished, not unfinished or anything; it’s not like it’ll improve 30%. The card that probably improved the most is the 290/X, and it’s not even close to 30%.

          3. AMD is not making miracles. It’s easy to see: just compare results from past drivers with newer drivers. The majority of FPS boosts come from initially bad-performing drivers on AMD’s side or poor developer optimization for AMD hardware. If what you are saying were true, all games should have gained substantially, which is simply not the case, while under my hypothesis some new games run much better on AMD cards than on Nvidia counterparts like the GTX 780Ti. Games are making better use of the GCN architecture: with shader intrinsics and other dev programs AMD is pushing, they are creating a library of better-optimized code for their hardware and, in essence, making it free and accessible to developers, who then don’t need to develop from zero; it wastes less of their time and gives better performance.
            You can believe what you want, but you can’t deny the facts.
            “Unpolished” puts the fault entirely on AMD, which is not true either; they certainly could have extracted more from their cards since the beginning, but nothing to brag about later.
            There are countless reviews of the Nvidia 780’s performance degradation or lack of optimization, all debunked on YouTube; Hardware Unboxed made one. You can search for it.

          4. The improvement is real, and one reason is that AMD knows how to work with GCN, since it’s pretty old; they know it well enough to grant a marginal improvement, especially on the latest architectures. The other reason is that AMD really is releasing drivers that don’t perform at 100%, so again there’s a marginal improvement that can be pulled out by working on them. Nvidia seems to be more involved with developers than AMD, so that’s why your point doesn’t hold.

          5. Nah, GCN isn’t meant for DX11; that’s mostly why they couldn’t get much out of the architecture and still can’t fully master it, though at least they’re on par with their competitor. If you compare Nvidia to AMD in the first days of that generation, AMD went for a multithread-oriented design with GCN that required much more CPU power, while Nvidia knew that just going single-threaded would yield the best performance and thus attract all the consumers, regardless of the consequences. That’s mostly why AMD pushed Mantle in the first place and then made it a standard to showcase their improvement over DX11; it also allows for more resource allocation, which is what games needed to evolve to the next level.

      3. A little bit. Also, few know how undervolting could help the blower-style cooling in temperatures and then power draw, plus the 290X was actually overclocked from its standard specifications. Underclocking has surprisingly little effect on the card, so you can push undervolting further, but there’s only so low you can go before the memory bus starts acting up, since it was tied to the core voltage itself; that would require some more memory underclocking, but unfortunately the lack of compression was how they justified the massive 512-bit bus. It was easy to know where to stop with those tweaks once the VRAM started getting errors.

        1. The 780Ti had 3 GB, which is not quite enough to do that; 4 GB is another thing completely, even being “only” 1 GB more.

          1. I don’t doubt that. But I never said the 780Ti didn’t age well ONLY because it had 3 GB vs the 4 GB of the 290/X; I said MAINLY, and that’s basically true.

          2. The handicap is having 1 GB less than your counterpart, which mattered quite a lot back then. The 1060 is another thing completely.

        2. That is stupid and you know it. Everyone knows that Nvidia no longer supports their older cards the way that AMD does.

          1. That’s because they can’t: the cards differ too greatly, and it would take a lot of manpower to keep two generations up. That may well change, since Maxwell and Pascal are similar, but if they drop Maxwell that quickly, it would be a clear indication that they want people to upgrade as quickly as possible for the money.

          2. No, it’s exactly the same tactic Apple has used. They purposely slow down or stop updating older cards to force users into buying upgrades.

        1. It is faster in most games, especially newer ones. But it still isn’t much of a thing; we’re talking several years later, so it doesn’t matter anymore.

    2. You even dare to make a half-4ss3d argument. Check new benchmarks between the two cards. I can link some sites that show how the 290X is above the 780Ti, but then you’ll say those sites aren’t accurate, so check your sites of preference.

  2. So what does all this jargon even mean? Give me a benchmark of games like The Witcher 3, Rise of the Tomb Raider, Dishonored 2, Batman: Arkham Knight, and Crysis 3, to see if it can hit 4K 60fps. The reason I mentioned bad ports? Simple: to see whether the brute force of a “beastly” GPU can break the barrier of a bad port.

    1. Ha, “jargon”, even though they have yet to lie on the improvement part.

      It doesn’t matter how beastly the GPU can be on one pipeline; it’s about having multiple pipelines now, and only one game you’ve mentioned is even up to par with the newer API features, and that only as reduced overhead from better CPU processing.

    2. AMD is not relying on any old tricks.

      You think what they put out will sit between a 1060 and a 1070? Are you nuts? It’ll be much more powerful. The Fury X already is; get your facts straight.

    3. It’s too easy to spot you as some butthurt Nvidia fanboy.

      The leaked benchmarks of it running Doom already showed that it excels and beats the 1080’s numbers. Who knows if those leaked benchmarks are even up to date with the current build of the 490, but anyone who isn’t a shrill halfwit can tell the 490 is going to be on the same level as a 1080, and it will no doubt be better in ALL DX12 benchmarks.

      Grow up and get out of your mom’s basement.

      1. It’s almost like you sound like a butthurt AMD fanboy.
        How anyone can be a fanboy of one piece of hardware is beyond my understanding, but I’ll say this:
        AMD has neither the skills nor the resources to compete in anything but price. It’s a budget hardware company. Get over it and play your games instead.
        It will be the same story as every single time: one cherry-picked benchmark where it will be better by 2 fps, and that’s it.

        1. I don’t see where I’m butthurt, when all I state is how eager I am for the AMD 490, which, based on stats and evidence, I know beats the 1080. The only ones shown to be butthurt are the delusional and uneducated Nvidia fanboys who can’t face the fact that a ‘rival company’ can release a superior card to the company they faithfully buy from. This is also called buyer’s remorse and reflection; look it up some time.

          How can anyone be a fanboy of a piece of circuitry, plastic, and metal? If you asked yourself while looking in a mirror, or if you ask QuickshooterMk2, you might get an answer. As for me, I just ‘happen’ to buy AMD products because they give me more bang for my buck. This is a statistical fact: I get more power PER dollar spent on AMD cards, so, as an adult and a person who understands my own finances and economics, I know it’s the more worthwhile investment. Perhaps when you graduate from high school and head into college, you might be taught these simple things too 😛

          “AMD has neither the skills nor the resources to compete in anything but price. It’s a budget hardware company. Get over it and play your games instead.”

          You tell me to get over it, yet you’re the one mulling over how a company, in your eyes, makes ‘budget cards’ and debating the same topic. I’m not even technically stressing over this enough to need to get over it; I simply know that AMD is better than how you perceive them. Perhaps you should take your own advice and ‘get over it’, since your delusional perception seems to have you in conflict with yourself and with others? Maybe if you reflect and meditate on why you attack others and why you hold a flawed opinion on a field you’re ignorant about, you might find some inner peace and come to get over it – and play some games.

          Hard for it to be cherry-picked when, again, the one example I used was the 490 running Doom at 4K with a higher benchmark rating than the 1080. It’s not my fault that Nvidia fanboys can’t handle the fact that a recent title that emphasizes Vulkan and DX12 runs better on AMD. It’s not MY fault that kids like you can’t be bothered to do your research before you open your mouth. It’s not MY fault that your ignorance compels you to vomit out groundless and uneducated opinions. I only talk about things that I know about and that I’ve covered and researched; I don’t get why children love to be ‘FIRST’ to say anything and everything, even if what they’re saying is completely incomprehensible and moronic.

          The AMD 400 series RUNS better with DX12 – this is a fact.
          The AMD 400 series is also very good in power consumption, unlike their previous cards.
          The AMD 490 uses the new Vega chip and has been shown in numerous leaked benchmarks to surpass the 1080 and be direct competition to it, and it’s meant to be 4K-ready and VR-ready. The 400 series is also meant to be cost-effective, unlike Nvidia, who charge you an arm and a leg through brand-name price hikes, which Nvidia are well known for – like Beats by Dr. Dre being headphones that are no doubt worth $50-$100 but cost $300+. Nvidia does the same: “we have the largest market share, we are worth the money, you will give us MORE money than we really deserve – and none of you sheep will complain”, and rightfully so; their arrogance shines in their fanboys.

          Do some research before you make a fool of yourself; the same goes for QuickshooterMk2 – your ignorance makes me cringe.

          The AMD 490 will directly compete with, and do better than, the 1080 in various modern games, and it will cost MUCH less. Just as the 290X was the flagship card that has lasted ever since its launch back in September and October of 2013, the 490 will be the same thing. And if I am wrong in this? Which I doubt, but IF I am? Then it means AMD is a lost cause – but I am confident AMD would not screw up like they did with the 300 series.

          The Nvidia fanboys vs their rival ‘AMD fanboys’ conflict is comparable to 360 fanboys vs PS3 fanboys, with Xbox and Nvidia both being Green Team-related. It’s ironic and very comparable, historically speaking; both ‘Green Teams’ at the time were always portrayed as going full-retard and being completely emotional.

      2. I can call you an Nvidia fanboy because you feverishly defend Nvidia as if the company itself were your boyfriend, or as if you’re an employee there trying to do damage control with misinformation, anecdotes, and fallacies. Your poor attempt at deflecting with ‘u call me fanboy but i call u fanboy too’ is so poor and moronic that I’d have expected such a response from a damp rag.

        I ‘buy’ AMD products; I don’t defend them like they’re a cult or a religion, like some fanatic loser with nothing better to do with their time than defend a COMPANY they don’t even work for.

        I need evidence? Use Google! lmfao. Are you that incompetent that you can’t google ‘DOOM RUNNING ON 490’? Are Nvidia fanboys that dense? The answer? Yes.

        Call me a ‘child’ all you want, but the reality is you present yourself as far more immature than I am, and your inability to debate like an adult reflects your low intellect. If you were somehow older than me, it’d be an insult to you, because it’d mean you’re over the age of 30 and act like a 12-year-old. If you’re younger than me, then you’re a child calling others a child, which kids do a lot these days.

        Grow up and stop cucking for a company that you don’t work for and that doesn’t care about you. You’re insignificant, and the time you spend ‘making fun of people who don’t buy Nvidia products, like me’ is laughable and pathetic. Grow up, kiddo.

        1. I don’t recall saying anywhere in my post that AMD cares, but it is clear that they listen a lot more than Nvidia does. Sorry if stating this hurts your feelings, boyo. Since you’re jumping to conclusions, you may as well also just jump on my d*** while you’re at it; you could at least do me that much, since I have to keep reading your rambunctious stupidity.

          Aye, so stop sucking Nvidia off and defending a company, since, as you said yourself, they don’t care about you. So take your own advice and stop doing damage control for them while bashing the rival company for no discernible reason other than emotional bias rather than logic and facts. I’ve seen your post history; every time AMD fans or even non-fanboys dispute with you and completely destroy you with logic, you just shrink back under the rock from whence you came.

          Make like Patrick and go back under your rock.

    4. Benchmarks you can find on the internet are useful only to fanboys who can’t afford to buy a real GPU.
      I will tell you a secret: the real world is slightly different from a GPU review.
      If you do not understand the “jargon”, go play with Mr. Potato Head.

      1. Based on how defensive you get when people don’t agree with you, it’s easy to tell who’s the fanboy. Is all you can do being the pot that calls the kettle black? You’re a blatant Nvidia fanboy, and the only thing you can do when someone has an opposing opinion, or facts that deflate your stance, is call them a fanboy, as if that magically invalidates their post?

        Stop going full-retard.

    5. Long story short: 380 vs 480 will be mirrored by 480 vs 580. These are all the parts that held GCN back, archaic and whatnot, and they’re essentially replacing them with better things, on paper at least. Third-party benchmarks will tell us the truth.

  3. What is your problem? The image was leaked; it’s also on other websites. DSOG doesn’t spread bullshit.

  4. I’m not really sure what all that means, but I’m so amp’d!!!! I don’t even know why!!!… until you tell me they need 300 watts and there’s no damn way it’ll ever go into a laptop that doesn’t weigh as much as a small child… still though, AMPED!

    1. The PS4 Pro is the superior platform at the moment. Watch The Last of Us 2 trailer and you’ll know how powerful the PS4 Pro is; Sony said PC literally isn’t capable of handling Uncharted 4’s graphics, let alone The Last of Us 2.

      1. Hahahaha, a PS4 Pro is weaker than an RX 470. Sony just does not want their customers jumping ship to PC.

        1. That’s funny, since the last time I checked, PC does not have a single game on par with Uncharted 4 graphically.

          1. 4K Rise of the Tomb Raider is better than Uncharted 4’s graphics. Check out Skyrim with an ENB and mods.

  5. “512 TB virtual address space” is the most interesting thing.
    Finally some real new tech and not just a slight improvement.
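
    For the arithmetic-minded: 512 TB works out to a 49-bit virtual address space, since 2^49 bytes is 512 binary terabytes. A quick sanity check of that arithmetic in Python:

        address_bits = 49
        print(2 ** address_bits)            # 562949953421312 bytes
        print(2 ** address_bits / 2 ** 40)  # 512.0, i.e. 512 TB (binary)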

  6. All the numbers being thrown around are couched in relative terms, and we don’t know relative to what. So they don’t mean a lot without some educated guesswork about what those reference points are.

  7. Checked the 16.12.2 drivers? Performance increased by another 10% in DX11 titles. Furthermore, this is a design that is NOT Polaris. This is a brand-new design using GCN as a foundation, focused on performance with the benefit of power efficiency. Polaris was a mere stepping stone to get the efficiency portion down, R&D if you will. This is why it wasn’t released as a “big Polaris” part: they knew it couldn’t scale up. Vega builds on it and will span from the highest-performance piece down to the budget cards.

    1. From ~2004 to 2010 I used solely Nvidia. From 2010 to today I’ve used AMD GPUs. When I started using AMD cards, the drivers were stable, but they were underperforming in a variety of ways: I had good FPS but stutter and the like. In the last 3 years, beyond the aesthetics of the new manager, I can attest that performance has increased ~35% and stability has remained pretty decent, aside from some of the major releases that required hotfixes, which I expected. Even with the older GPU (HD 5850), with drivers that were not optimized at all, I was seeing 40+ FPS in most games, except for the most demanding, 5 years later.

      There are still some things the AMD driver lacks that Nvidia users might see as deal breakers. AMD even has ReLive now, which is similar to ShadowPlay, albeit in its infancy. But if you just like jumping into games and playing, I can’t recall a game I’ve played in the last year, old or new, that required tweaks due to AMD GPUs/drivers. The CPU… that’s a totally different story…

      1. Your whole reply is opinion-based. Both companies provided exactly what their products were capable of until AMD shoved out GCN on DX11, which innovated for the future of APIs, and in that respect the complexity of the architecture was much harder to optimize on older interfaces, unlike Nvidia, who just went balls deep for the cash and rolled out cards that quickly went outdated, though slightly ahead of the competition and at a premium price that disrespects consumers once again. People seem okay with this Titan trend; it’s not worth a thousand dollars, though they make you believe it is.

        Other things like ReLive already existed for as long as AMD has provided H.264 SDKs; it’s not a valid excuse to pay extra.

        Saying that Nvidia never had trouble with their drivers is fairly wrong, and it’s ill-founded to then hold AMD up against that; they both had hardware-breaking issues. Granted, AMD shipped far fewer driver updates, while Nvidia constantly updates alongside GeForce Experience to automatically tweak games for people who have no clue what they’re doing. Raptr did the same, but only by user choice; again, something nobody bothers to look at before saying Nvidia is better.

        The 5000 series dominated the GTX 400 series: it was cheaper, just as powerful, and power efficient, and most Nvidia cards were released AFTER AMD’s flagship, which was the better consumer choice.

        And yet what do people still buy? Nvidia.

        Finally, the tweaks: it’s PC gaming, tweaking is in its nature. Everything from both companies works fine OUT of the box; AMD just has a lot more headroom for tweaking to improve their hardware’s use, which compensates for the much more affordable price they give you.

        1. “Your whole reply is opinion-based. Both companies provided exactly what their products were capable of until AMD shoved out GCN on DX11, which innovated for the future of APIs, and in that respect the complexity of the architecture was much harder to optimize on older interfaces”

          Negative. Even my HD 5850 started seeing good performance boosts and more quality-of-life improvements fixing the issues I mentioned, right up until its replacement in 2015; however, it wasn’t strong enough to run the titles I was eyeing, so I upgraded to my R9 390. Although AMD may have known what their cards were “fully capable” of, that can’t be realized with a software bottleneck, which, as I’ve mentioned, they put a huge focus on cleaning up thanks to the acquisition of HiAlgo.

          “Other things like ReLive already existed for as long as AMD has provided H.264 SDKs; it’s not a valid excuse to pay extra.”

          Open source and Raptr. Raptr was garbage, and open source is good for those who don’t mind doing it themselves; however, the general consumer base wants to just plug in, install drivers, and go. ReLive is beneficial because of that very basic need: the majority do not wish to tinker around.

          “Saying that Nvidia never had trouble with their drivers is fairly wrong, and it’s ill-founded to then hold AMD up against that”

          Quote where I mentioned anything about Nvidia drivers. I did not; I merely mentioned my experience with AMD drivers and how they’ve improved beyond the lingering stigma.

          “The 5000 series dominated the GTX 400 series: it was cheaper, just as powerful, and power efficient, and most Nvidia cards were released AFTER AMD’s flagship, which was the better consumer choice.”

          And it still had issues with the games it was playing. Their drivers were not as optimized back then as they are now, which is why they’re really pushing the narrative that their drivers inhibit performance less, even though stability was rarely an issue.

          1. Both Raptr and GeForce Experience are garbage; that’s the point I’ve made.

            I don’t have to quote you, because that part doesn’t address you directly; it’s what the average person says. It’s a reminder of the public perception, not of your own individual experience, because whenever I post anything, I take care to leave that out; I’ve heard it too often.

            If drivers still had issues, and I don’t remember any coming from the 4000 series onward, they were only with newer games of today, like Fallout 4, whose trash engine gave render issues to the older generations, which is likely to be expected anyway.

            Define “optimized”, when the majority of users had it just fine and paid no more; just because an architecture matured up to now doesn’t mean it was terrible and not worth considering back then. I’ll just take that as you describing a downside that is really just common across hardware.

            Point being: if AMD put out something new and unoptimized that ran just as well as the fully optimized competition, and today AMD is fully optimized and runs better, which implies the other side had no considerable benefits, then your point isn’t really valuable; it’s just a complaint that, in this short-lived GPU competition, one side’s hardware couldn’t see its full light of day, at least for those who got rid of it for something more recent.

            All GPUs from all generations, in my experience and that of those whose opinions I’ve heard, were functional with no particular frametime issues, whether Nvidia or AMD.

          2. “Point being: if AMD put out something new and unoptimized that ran just as well as the fully optimized competition, and today AMD is fully optimized and runs better…”

            There is no “fully optimized”; however, if you compare driver performance, especially within DX11, against the latest compatible drivers on the HD 5850/69XX, you’ll see a better experience overall with the titles that were released yesteryear.

            “Point being: if AMD put out something new and unoptimized that ran just as well as the fully optimized competition, and today AMD is fully optimized and runs better, which implies the other side had no considerable benefits, then your point isn’t really valuable; it’s just a complaint that, in this short-lived GPU competition, one side’s hardware couldn’t see its full light of day, at least for those who got rid of it for something more recent.”

            So you really just took my follow-up comment, aimed at the person who claimed “Fury X sucks major a*s in comparison to the GTX 980Ti, 10 FPS less in almost any game besides SWBF, which is the only game with a tie to the 980Ti”, where I shared my anecdotal experience, and applied it to this? I don’t think you really understand what my comments were even getting at, and at this point the labor isn’t worth it. You’ve made up your mind.

          3. No, that was an example. I don’t know if TeraScale is fully updated, nor do I care anymore since I have GCN. I was pointing out that if hardware is fully functional and competitive but still has more to be exploited, what does the point come to then? That doesn’t mean it’s bad in any way.

            Also, just to point out: just because drivers aren’t WHQL-certified doesn’t mean they aren’t fully reliable. Validity leaves something to be desired either way, beta or not. Honestly, I’ve always updated to the most recent drivers, even betas, and haven’t once had anything hardware-breaking.

            Forget about the Fury X and 980Ti; this isn’t what I’m talking about right now, only the driver issue you’ve mentioned.

          4. Regarding WHQL, I don’t care about it. But others do, including AMD fans, so it is worth noting, so that someone who uses Nvidia doesn’t pick up an AMD card using my comment as a reference and then see that the vast majority of drivers aren’t certified, whether because they’re ignorant of what WHQL represents or because they’re simply set in their ways. I didn’t say one way or the other whether it was good or bad.

            “I don’t know if TeraScale is fully updated, nor do I care anymore since I have GCN.”

            There was a patch just after Crimson, the last to unofficially support TeraScale, and it helped wonders. The only reason I upgraded was DX12, wanting a backup dGPU, and because some games I began playing were finally taxing it. The drivers were in beta at the time, and I’m pretty sure I could still get 30-40 FPS with the 5850 at that point; regardless, I felt it was worth sharing to show how far AMD’s horsepower stretches.

            “Forget about the Fury X and 980Ti; this isn’t what I’m talking about right now.”

            So you’ve hijacked the discussion. What was the driver issue I mentioned to you previously?

          5. That the early releases of the cards weren’t mature enough, though they did handle the competition very well, no?

          6. Considering the disparity between Nvidia and AMD GPUs started around the time of the HD 5850’s release, I’d have to say “no”. Quality of life, including a lack of stuttering, is just as important as simply looking at FPS metrics.

          7. I don’t remember those having frametime issues or “stutters”, as you call them.

            Are those issues you speak of from your own experience, or from actual, consistently reported benchmarks? A lack of proper drivers or performance wasn’t something I noticed at all across the systems they were used in.

            As far as my knowledge goes before I had any direct experience: AGP HD 3650 and 4650, HD 6850, HD 7850, and lastly CrossFire 290Xs up to now. Most of these GPUs got several lives across Intel and AMD systems. Functional as intended.

            I’d just be really interested in having those issues pointed out to me, because perhaps I’m just overlooking something.

          8. It was my own experience; however, it was experienced by others who owned the card, in a variety of different scenarios. It was primarily in titles like BF3, Bad Company 2, and a few others I don’t recall, as it has been a few years. The last time I played Bad Company 2 was a few years later, when they started cleaning up the drivers, and it ran fine. I do remember my frames being fine but the game running choppy as hell.

            My point isn’t that Radeon hardware is bad, merely that the drivers needed TLC, and that’s exactly what they got in recent years. The HD 5850, as I mentioned previously, was my first Radeon card, and I was thoroughly impressed by how long it kept me at medium settings or above compared to the Nvidia cards I’d had in the past.

  8. https://uploads.disquscdn.com/images/8194aa73fadf202ffa865c14d9c0c8387544f1ac7fc961b9f6f4ce8b67ff8b92.png

    I can’t post links here, so I’m putting them in a picture; look at them if you don’t believe me.

    There are more, and most are from before the newest DX11 improvements.

    The joke’s on Nvidia, since the 980Ti is always overclocked to 1200MHz and never compared at its stock 1000MHz, because it would lose, while the Fury X is at stock. The Fury X was reported to overclock up to 1200MHz, while of course the 980Ti can OC to 1500MHz and reach 1080 performance; I’m fairly sure the Fury X can do just as much, we just need people to test it.

    Also, the Fury X is said to be power hungry, yet nobody mentions that the 980Ti ALWAYS has to be overclocked to be good, therefore consuming more, while the Fury X can be undervolted severely at stock clocks, without a performance loss, down to GTX 980 power consumption. The 980Ti can’t.

    What else do you want? Those are the facts.

    You have a gaming monster (980Ti) versus a computing monster (Fury X); I wonder which will come out on top? Neither: they were meant to compete against each other. AMD was the only one to take the risk with GCN, which is hard to tame in DX11 and is now shining greatly in new, properly coded games, while unfortunately Maxwell and Pascal are still behind in newer applications and perform at the same IPC per FLOPS.

    @MisterWU To address the flaming, I have two 290Xs with a 4.4GHz X5675.

  9. You’re ignorant if you think it’ll catch on fire; people have unlocked Fury shaders on air cooling, and they run just as hot as the 980Ti under load. Even the Nano runs fine and is more power efficient than the rest.

    Those are outdated benchmarks. Don’t you understand that you have to follow architectures as they mature, with the most recent benchmarks possible? It doesn’t matter what the sources are when the OSD showing the graphics cards’ stats is plainly in your face, especially with the Nvidia ones overclocked beyond their factory specification. Everything besides the DOOM and Dead Rising 4 benchmarks was at 4K; look at 1080p and watch the margin change. They are all good cards, except Nvidia is still expensive as heck.

    You don’t know how stupidly hot they get under load. The Fury, Fury X, 980Ti, and Titan X are all >250W cards. They all get hot.

    Then you have properly coded Vulkan and DX12 games, where any AMD card is used almost perfectly and provides better performance than its rival tier, yet is cheaper. We’re just talking about old games using outdated APIs right now, where GCN was never properly used, and yet it stands strong.

    Fun fact for you: the Fury X AiO cooler is just as good as any other stock cooler; the only benefit is that the heatsink is not stuck on the card stacking heat, and it’s a single fan.

    Overclocking works just fine on any Fury card too, just like undervolting.

    1. Overclock to similar levels? What? Ever heard of IPC? You probably didn’t even look at the sources I gave you, considering you overlooked the fact that the 980Ti needs to be overclocked to outperform the Fury X in the first place.

      Wow, 3 extra FPS is a huge deal in your book? Ever heard of frametimes? Those are much more important, believe me. Also, the 1070 is weaker than the 980Ti, which remains better in IPC; at maximum overclocks it reaches stock 1080 performance, as the 1070 can also do, but the 1070 is much more prone to the VRAM issues that were reported.

      Call that fanboyism if you want. I do know that it’s possible to pick up a Fury right now for 260USD, which is much cheaper than the 390-420USD you suggest, and if I’m lucky enough I can unlock all its shader cores and undervolt it to the efficiency of a GTX 980, which is fairly reasonable for 260USD.

      Update: all the Furys that were 260USD on Amazon got bought, unfortunately, with only weird offers left; that’s a shame. Newegg still has its sale, but as of today it’ll be over in 5 days.

      https://uploads.disquscdn.com/images/7f9668006e1bedf038d5e916fb5fb5626c4d7e59077cf2d3971432f4fac2152c.jpg

      I’m looking at recent benchmarks from Hardware Unboxed, and I see both cards tied most of the time, when it’s not the 980Ti winning by 3 FPS again with its overclocked specifications vs the Fury X at stock, or the Fury X getting ahead by 3 FPS instead. They are the same and consume the same.
