YouTube’s ‘DudeRandom84’ got his hands on an AMD Radeon Vega Frontier Edition and decided to put Crysis 3 to the test. DudeRandom84 used the game’s Very High settings at 1440p with 2xMSAA, and the AMD Radeon Vega Frontier Edition was simply unable to offer an optimal gaming experience.
DudeRandom84 got an average of 46fps, and in some scenes the framerate dropped below 35fps. Of course we do have to note that AMD needs to further optimize its drivers, as GPU usage was dropping in some scenes, resulting in underwhelming performance. Still, do not expect miracles: in other scenes the GPU was fully utilized and was running the game at 38fps.
Naturally, NVIDIA’s GTX1080Ti offers a better gaming experience, and we don’t expect the consumer version of the Vega graphics cards, the Radeon RX Vega, to be able to compete with it. As we’ve already said, AMD’s Radeon RX Vega will most likely be somewhere between the GTX1080Ti and the GTX1080.
It’s also worth noting that DudeRandom84 used an Intel Core i7-7700K (Kaby Lake), so most probably there weren’t any CPU limitations during these tests.
AMD will release the consumer/gaming variant of Vega at the end of July, so stay tuned for more!

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and highly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers SNES to be one of the best consoles. Still, the PC platform won him over consoles. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, Voodoo 2. John has also written a higher degree thesis on the “The Evolution of PC graphics cards.”
Well it’s a good looking game for sure.
Are you hitting on Crysis 3?
Yes sir. Certainly not hitting on that Vega.
That’s racist.
Did you just assume my gender ?
Assuming genders is my fetish.
Did you fetish my gender ?
That’s rape! Somebody help! Oh god he’s internet raping me!
LOL
People should stop comparing the card in gaming scenarios. The Vega Frontier Edition was never marketed as a gaming card; they just said that it can also game. The proper gaming Vega isn’t out yet, so let’s just chill.
It won’t make a difference, fanboy. The Frontier Edition is the highest-end Vega card.
The Quadro is the highest-end Nvidia card which means all of them are overpriced like it is.
Yeah, good logic there.
I guess you’ve forgotten to think about Vega FE outperforming an Nvidia card of equivalent pricing in pro apps, as much as 50+% in one, even. Selective presentation of facts.
nice damage control. you get paid to shill?
Hopefully John and Yelt will apologize for being autists once RX Vega comes out.
If you look at the leaked RX Vega benchmarks out there on the net, you will see that clock speeds are how it is getting faster. Its top-end variants are faster than a 1080, but to do that it’s having to use a lot of juice.
If they can launch cheap enough, it will sell well, but it’s a far from elegant card it seems.
I don’t think it’s going to sell at a lower cost than the 1080, or even the same cost, just taking the die size into account: Vega is 484 mm² while the 1080 is 314 mm², making it more expensive to produce. Honestly, you would expect this GPU to go up against the 1080 Ti on die size alone, and even the 1080 Ti, at 471 mm², is smaller than Vega. On top of that it has HBM2, which is already having yield problems and is expensive to produce compared to GDDR5X.
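For what it’s worth, the die-size argument above can be sketched with the standard dies-per-wafer approximation. The die areas are the ones quoted in the comment; the 300 mm wafer is an assumption, and this ignores defect yield, which hurts big dies even more:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300.0):
    """Classic dies-per-wafer approximation: usable wafer area divided by
    die area, minus an edge-loss term proportional to the die's linear size."""
    r = wafer_diameter_mm / 2.0
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2))

# Die sizes quoted in the thread (mm^2)
for name, area in [("Vega", 484), ("GTX 1080", 314), ("GTX 1080 Ti", 471)]:
    print(f"{name}: ~{dies_per_wafer(area)} candidate dies per 300 mm wafer")
```

Roughly 115 Vega dies fit on a wafer versus about 187 GTX 1080 dies, so per-die silicon cost alone is substantially higher before HBM2 even enters the picture.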
I know mate, but if it isn’t cheaper to sell (obviously mining may mess this up further), then I don’t really see its selling point. It’s not the fastest card, and if it’s not cheaper than a 1080 whilst only being a little faster, what’s the point?
I am really starting to fear for Vega at this point.
That they had to produce a total power hog of a GPU with such high clock speeds to compete against a 1080 launched over a year ago seems to illustrate that AMD really is light years behind currently.
Shut the f`ck up, why do you morons keep parroting this line? Nvidia’s Titan X, which is the equivalent of the Vega FE, is one of the fastest GPUs for gaming. Even Nvidia’s Quadro series performs virtually the same as the “gaming” variants, and they are even capable of using GeForce drivers.
typical AMD fan doing damage control
so what exactly is this card designed for? bitcoin mining?
Content creation?
you can do that with any high end card and a strong CPU
Yes. However, the market started segregating consumer and pro features a while back, so both Nvidia and AMD saw it as an option to charge more money for identical GPUs whose only main difference is in drivers.
Both companies have made pro GPUs with radically different hardware compared to consumer GPUs, of course, but a lot of pro GPUs are identical to consumer variants in hardware, with the only difference being in software, which can accelerate workloads in content creation applications.
This is usually where AMD GPUs surpass Nvidia’s, because of higher compute performance and other features… which also makes their hardware more power hungry.
AMD doesn’t have the money Nvidia does, so they just turn a consumer GPU into a pro one with VGA BIOS alterations and different drivers.
Recently, though, AMD did release Polaris-based pro GPUs with SSDs on them to act as memory, so in this regard they do separate themselves from the consumer line.
Content creators who do things on a freelance basis, for whom it’s a source of revenue, might be enticed to go for the Vega FE.
Otherwise, most consumers doing their own content creation will just gravitate towards the cheaper consumer variants.
Again, you seem to be enjoying this. Would you like AMD to leave the graphics card division, so there’s no competition at all for nVidia and the green team can charge a lot more for their VGAs?
You do realize that competition is what drives technological innovation and prices down, don’t you?
Grab any quadro and it will match its GTX equivalent in gaming.
At this point we should stop using Vega FE against anything else, because it’s honestly not doing much at all with other games out there and it’s not really meant for gaming.
That’s why AMD never supplied samples for reviews. Their mistake was to release the card at this point in time anyway. There’s always sites/people that grab these cards to test and post tests/reviews… because it’s good clickbait to tag along with the final gaming Vega hype. As soon as proper Vega is out these will be irrelevant… but until then, let the clicks and the pageviews pile up.
Well in this particular case it sure is doing much worse than a GTX 1080, and I am betting that will be the trend, when ppl test it, it will indeed be below the normal GTX 1080 performances.
Well, there are games that work better with either Nvidia or AMD cards. This isn’t anything new. If the 1080 trounces Vega in those Nvidia-favored games, that wouldn’t be a surprise.
Yeah, I guess people just have to wait for Vega to really come out and then test it. But let us not forget, comparing it to the 1-year-old GTX 1080 is not really impressive, unless it costs a third of the price tbh.
First we should all wait for the VEGA card to come out so we can have some real comparisons. But if this is true, comparing the VEGA to the 1-year-old GTX 1080, which can now cost less than half of the VEGA’s $1200 price, really isn’t impressive at all; it is really poor.
“AMD needs to further optimize its drivers…” Crysis 3 is over THREE YEARS OLD!
Well, the architecture is new, so yeah. But yeah, it’s underwhelming even so.
Vega is a Fury XT die shrink. There is no “new architecture” here.
Is that right? Oh, I was mistaken. Then why would they sell that product for $999 USD? I mean, I thought the high price was to pay back a shyteload of new architecture R&D. Plus, why did it take sooo much time?
My goddd amd..
Because they are milking.
hence the $600 extra for their AIO water cooled version.
Oh well. Nvidia will pwn its customers again.
God damn it…
Or should i say ?
Jensen Huang DAMN IT.
Nice Won xD
Well actually, while the stream processor and compute unit configuration matches the Fury X, Vega has a lot of new tech in its architecture (next-gen compute units, primitive shaders, a new programmable geometry pipeline, and tile-based rendering).
So I guess it could be that the drivers are a bit ‘raw’, so to speak.
Vega at the same MHz as a Fury performs exactly the same.
AKA same IPC as a Fury X.
Sure, but Vega is clocked 55% faster. Also, it could be that the new tech under the hood isn’t fully implemented at the driver level just yet. I remember when AMD released the GCN performance driver months after those cards were out. Some cards, like the HD 7870, gained ~20% extra performance.
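Just to put numbers on the clock-scaling point: if IPC really were identical, performance would scale roughly linearly with clock. A quick sketch, where 1050 MHz and 1600 MHz are the commonly quoted Fury X and Vega FE clocks and the fps figure is purely hypothetical:

```python
# If IPC is unchanged, performance scales roughly linearly with clock.
fury_x_mhz = 1050    # Fury X stock clock
vega_fe_mhz = 1600   # Vega FE boost clock (commonly quoted)

clock_gain = vega_fe_mhz / fury_x_mhz - 1.0
print(f"Clock uplift: {clock_gain:.0%}")  # ~52%, close to the 55% quoted above

# Hypothetical: what a 30 fps Fury X result would scale to at Vega clocks
fury_x_fps = 30.0
print(f"Naive expected Vega fps: {fury_x_fps * (1.0 + clock_gain):.1f}")
```

So if Vega benchmarks land near a Fury X result multiplied by its clock uplift, that is consistent with the "same IPC" claim.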
55% faster clocks are irrelevant when pointing out that it has the same IPC and does not have HBC active in the drivers. Ask yourself: if a 16GB HBM2 card needs something like HBC to run a 3-year-old game at high (not ultra) settings, then it truly does deserve the name “Fury X Rebadge HBC Edition”.
16 GB is irrelevant unless you have data that shows there is actually a need for the card to have 16 GB to perform as it does in the benchmarks you’re looking at.
Rumor has it that 8 GB is much cheaper to make so the gaming card will ship with 8 and possibly 4 for the lower-end harvested parts. This makes the VRAM compression tech in Vega useful.
The rules changes alone to shrink from 28nm bulk to 14nm FinFET are enough to make just a basic shrink unattractive to a hardware developer. Vega is not just a die shrink of Fiji. Look at the slides AMD has produced that detail the new features and design changes.
We are not just talking about 16 GB here, in case you failed to notice. We are talking about 16GB of HBM2! Remember HBM2? The HBM2 much touted by AMD fanboys here as something that would destroooy anything else?
And now Vega cannot even surpass a 1080 because HBC is not used… Yeah, how are those excuses working out for you?
This quote of yours is hilarious tho
“Rumor has it that 8 GB is much cheaper”
Need I point out why? Really? :*)
What are you babbling about?
16 GB of HBM2 is reportedly a lot more costly to implement on a card than 8 GB at this time. If that’s true then it makes sense that AMD will use 8 GB in the gaming cards, and 4 GB for harvested parts. That makes the HBCC or whatever it’s called, that reduces VRAM needs, particularly relevant.
“reportedly”
You really like that word, huh? Did you learn it just yesterday? How interesting do you feel it makes you? You want to know the truth?
Fascinating..
You misunderstand; whether on purpose or because you’re too slow, it’s not known. The driver is still using fallback methods from Fiji to render graphics. Moreover, we also know the HBC, or high bandwidth cache, is not enabled in the drivers.
What are you expecting once all that ‘magic’ happens? Raja shoots laser beams out his eyes and because of his glasses they focus and he’s transmogrified into a US missile shield?
It’s almost like you didn’t read my comment.
I did read it, and I quote: “Blah blah I’m an AMD shill, blah blah blah, Raja is my g@y lover, blah blah blah”…
‘The driver is still using fallback methods from Fiji to render graphics’
No. I read it. It’s just that I know this to be drivel. An AMD dev has even come out and said so. See Adored’s latest upload for more.
There is no magic. Raja isn’t Harry Potter and AMD HQ ain’t Hogwarts. Relax. Have a breath mint or something.
You can expect up to 35-40% increase in some cases.
While it might help, I do not think it will bridge the humongous gap between the 1080 Ti and Vega.
Why would HBC help when this card is already running on SIXTEEN GIGABYTES OF HBM2 ?
My bad i thought i had read hbm. Forget that comment it’s out of place.
Aaiight m8, noted 😉
Who said it was supposed to?
Since you guys keep bringing it up as an excuse for its piss-poor performance.
My bad i thought i had read hbm. Forget that comment it’s out of place…
So what? 16GB of HBM TWO is not enough to run a 3 year old game without the added HBC ????
lol..
People really can’t read can they? Vega is using Fiji methods of rendering, ie it is not optimized for Vega.
No, it’s not. The Fiji excuse does not fly anymore on a released Vega. It stems from almost a year ago, when AMD claimed they did not have any Vega-specific drivers yet and were using tweaked stock Fiji drivers.
You are telling yourself fairytales. These are the Vega drivers they launched with, so your revised Fiji driver excuse that you guys are now trying to turn into “Fiji method drivers” is simply laughable :*)
This Fiji method myth has been outright denied by AMD themselves. RX Vega benchmarks seem to indicate up to a 15% increase in synthetic benchmarks with the latest available driver. This seems hit and miss though, with variable scores. At best it is just behind an overclocked 1080; at worst it’s behind the stock one.
Apparently it’s still pants at the tessellation part of the benchmark, with only minor increases there, but where no tessellation is used, it’s 15%.
You’re correct, I just saw that today. I stand corrected; however, we do see a big improvement in driver performance. Adored did a great video on this today.
“People really can’t read can they? Vega is using Fiji methods of rendering, ie it is not optimized for Vega.”
So the only one not able to read OR think rational thoughts is you. Selling that fangirl PR BS here and thinking it will fly…
“Fiji Method drivers”.. TOOL lol
I just admitted that I was wrong. Welcome to the block list.
You can block anyone that exposes you for what you are all you like. It won’t change a thing in reality. HF living in your bubble.
Honestly, I don’t think stuff like HBCC being turned off will affect Vega FE gaming performance. The card has 16GB of VRAM; no game will use more than 16GB even at 4K right now.
It’s not about the amount, it’s about the speed and bandwidth. Underclock your video memory to about 1/3 of the max speed and watch your fps.
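The capacity-versus-bandwidth distinction is easy to make concrete: peak bandwidth is just bus width times per-pin data rate. A quick sketch using the widely published specs for these two cards (treat the exact per-pin rates as approximate):

```python
def peak_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Theoretical peak memory bandwidth in GB/s:
    bus width (bits) * per-pin data rate (Gbps) / 8 bits per byte."""
    return bus_width_bits * data_rate_gbps / 8.0

# Vega FE: 2048-bit HBM2 interface at ~1.89 Gbps per pin
print(peak_bandwidth_gbs(2048, 1.89))  # ~484 GB/s
# GTX 1080: 256-bit GDDR5X at 10 Gbps per pin
print(peak_bandwidth_gbs(256, 10.0))   # 320.0 GB/s
```

This is why HBM2’s wide bus matters far more than the 16 GB capacity: cutting the effective data rate by two-thirds cuts peak bandwidth by the same factor, regardless of how much VRAM is fitted.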
AFAIK there is no such limitation on Vega HBM2 memory.
Ok, then HBM2 is not enough to make Vega perform well.
He doesn’t mean game specific optimization but arch specific.
Run the Titan Z on the older drivers that were mature for the GTX 970, before Pascal support was added. I’d like to see how that does.
MSAA X2 (low)
not even 60 FPS
lol.
You seem to be enjoying this. Would you like AMD to leave the graphics cards division, so there’s no competition at all for nVidia and green team can charge a lot more for their VGAs?
You do realize that competition is what drives technological innovation and prices down do you?
That simply does not take away from the abysmal performance. AMD is releasing this in mid-2017 (juuuust about), and Volta has already been shown up and running.
Another GPU they released in 2017, the RX 580, can be matched by an OC’ed GTX 970 from 3 years ago.
Yeah, I’m all for competition; it keeps prices down and motivates companies to bring out their most innovative products. That’s why we can blame AMD for failing to compete, not Nvidia.
Nothing is trying to take away the ‘abysmal performance’ (as far as I know, RX Vega scores higher in 3DMark 11 than a GTX 1080; if that’s abysmal for you…). I said that he looks happy in this scenario. Care to read again? Also, what did I blame nVidia for??
And BTW, I’m sure that, cherry-picking some games, an OCed GTX 970 can match the RX 580 (which probably isn’t OCed). I’d say the RX 480 is a better option; it performs nicely under DX12 and Vulkan and has an extra 4.5GB of full-bandwidth VRAM.
It is abysmal performance for a GPU that comes onto the market to compete with a product as old as the Pascal arch and already fails to match the top GPU from that old arch, while Nvidia already has Volta up, running and demoed.
You did not say “I blame Nvidia” (but you do). Care to read again?? I said we have nothing to blame Nvidia for; if anyone is to blame for AMD not competing, it is AMD. Yet the only thing you can do is defend them for it instead. THAT is what I said.
This has nothing to do with cherry picking. When a MID-RANGE, 3-year-old Nvidia card is able to match a 2017 top AMD card in games like Prey and Mass Effect, and a whole list of other games, then cherry picking has nothing to do with it. It simply shows how weak AMD really is.
Well, that’s abysmal performance for you then. If RX Vega performs between the 1080 and 1080ti and is priced right, I’d say it’s not abysmal at all.
And no, I do not blame nVidia (care to read again??). After all, last year I went for the GTX 1070 even though I loooove the underdog, and I’ve been really happy with its price/performance 😉
The 1070 is the sweet spot card, as usual for the x70 series, true. Great for 2K gaming, nice price, good perf, and efficient. I don’t see the need to rush into 4K myself either, tbh.
1000 euros for this? With half the money we can buy a 1080 with the same (or better) results. Big disappointment.
You don’t buy this card for gaming. It’s a productivity card. The only one delusional is you, with your lack of understanding of what this card even is.
And yet NVIDIAs equivalent card completely crushes it. Productivity card or not, this is a bit sad. Nobody is asking for 200 frames. But it can’t even hit 60 on a 3 year old game on high?
Neither can the GTX 1070/980 Ti hit 60 FPS at 1440p. Crysis 3 is a very demanding game when it’s maxed out.
This was on high settings not max.
No, it wasn’t. “Very High” is the highest setting.
Why do you compare it to the 1070/980 Ti? It’s AMD’s answer to the Titan, so it should be compared to the Titan or 1080 Ti, not to much cheaper cards.
I wasn’t directly comparing it to Vega. I was pointing out the fact that not even the 1070 or 980 Ti can run a “3 year old game” maxed out.
It’s just 2x MSAA though. I don’t have numbers, but I believe I had better fps with my GTX 1070.
No, Nvidia’s equivalently-priced card does not crush it in professional apps.
“Bbbut it’s not for gaming!”
stfu and go spam your AMD propaganda somewhere else
the Vega was advertised as a gaming card and it was delayed for months for obvious reasons
“the Vega was advertised as a gaming card”
The Vega FE was not advertised or sold as a recreational gaming card. It was advertised and sold as an affordable option for small-medium game development studios who will take advantage of its dual mode driver.
Do you even know what a dual mode driver means?
Then, compare the performance of this card in pro apps with a similarly-priced Nvidia.
The whole ‘productivity card’ line is almost complete nonsense. GPUs are utilised in much the same way regardless of whether for gaming or 3D work. Nvidia’s Quadro line are just full-fat variants of their mainstream cards with minor firmware and driver tweaks. Ditto with Radeon Pro. Both have near identical performance characteristics to their counterparts in spite of the huge gap in pricing.
This kind of performance deficiency simply means that AMD either *really* screwed up with their DirectX drivers, or that Vega just isn’t as good as people hoped.
Look up the performance of this card versus Nvidia’s 5000 card in pro apps. Then you will see the point here.
The other thing this card offers is a dual mode driver — something useful for small/medium game development studios.
I am amazed people can’t even comprehend simple stuff like this..
so many idiots considering this a gaming card even though it’s not and it’s not advertised as such, morons PERIOD
So why, if this is not a gaming card, has it been tested with games? If you make a test with software, you will get comments on the software; if you make a test with games, you WILL GET comments on the games. Easy.
By your logic we should test Quadros, FirePros, and Teslas with games.
But nevermind youre too dumb to understand the difference between cobsumer markets and business ones in terms of prices.
In fact, the Nvidia Quadro is not tested with games on this channel, little immature AMD fan.
Because they are not tested by ANY big reputable reviewer in games because they are not intended for the gamer market. DUH, logic
@YourWifesLover you resort to attacking me based on an avatar? You are so mature, can’t you stick to actual arguments in the here and now? Apparently you can’t, you’re not capable of such feats.
What did I expect…from plebs such as you….cesspool
Don’t mind Cobac, he likes AMD.
You watch so much anime I can only imagine how sexually frustrated you must be, poor kid; anime is garbage. Calling others stupid and then saying “Cobsumers” in your fit of rage, you’re on the autistic side for sure. You can’t get anything right the first time; you and AMD have that in common.
“AMD needs to further optimize its drivers”
The more things change the more they stay the same.
Wait, are we really saying that 46 fps in Crysis 3 at max settings at 1440p with MSAA is a bad result?
high setting is not ultra setting
it is very high settings which are the highest they go in Crysis 3.
with only 2X MSAA that’s not the highest
You really are an idiot. I was talking about the actual graphics settings, not the level of AA. All you did was say it was on high settings, which is not true. It is on very high settings, you dolt.
You are beyond ignorant for not paying attention to context. If you had the least bit of common sense, then you would know what performance should be expected at those settings. For example, a 1080 Ti averages nearly 90 fps in the same spot, so it is getting around TWICE the performance of the Vega card. So yes, 46 fps is terrible for Vega.
As if running Crysis 3 at these settings and getting 60 fps was easy.
Not surprised tbh, but have no interest in that side.
Don’t think miners will be interested either at that price.
The 1080 Ti seems unaffected by mining. I got mine (MSI Gaming X) for £650.
The 1080 Ti is not really a good miner unless you’re mining Zcash with it (750 Sol/s with mine). That, coupled with its high price and low Ether hash rate, is what keeps most miners from buying it.
And so this vega is a good option for miners ?? LEL
Well, we don’t really know until miners can optimize their miners around Vega, but so far, from what I’m seeing, it wouldn’t be worth mining with, especially if the RX Vega pulls more power than the FE Vega. But then you have to account for undervolting and BIOS mods to the card, so eh.
$1000 GPU.
Think…
No, miner will not be buying this , not even at half price.
I was talking about RX Vega, of which we don’t know the price, and I still think it’s going to be above the 1080 price point. I literally said that, from what we’re seeing from the FE Vega, RX Vega wouldn’t be worth it to miners. Did you not read that? All I’m saying is that so far RX Vega doesn’t look like it would be worth it, but wait and see.
You know, to the people that refuse to support both Nvidia and Intel, the Vega and Ryzen lines of products are a way to get out of the mid-range PC game. It’s been a while since AMD had anything close to the enthusiast products of Intel and Nvidia, and I think this is good for them. Also, IMO, AMD competing with both is a good thing for the gaming industry.
*uses a graphics card built for 3D applications*
*dusts off an old game nobody else is playing, and plays it with that card*
*is aware the graphics card isn’t even 10% optimized for gaming*
*shoots a big load with joy, because the card performs badly*
*rejoices because his favorite nvidia brand performs 10 to 15% better, for a price 30 to 50% higher*
*makes millions of nvidia fans shed tears of joy*
Who even plays Crysis 3 now lol..?
Why is DSOGaming making an article out of this? Did you compare this test to other new GPUs? No? Well, let me do it for you, because gamegpu did a retro GPU test just a few days ago, at 1440p with an Intel Core i7-5960X @ 4.6 GHz:
– GeForce GTX 1080 Ti 11 GB – 48 fps minimum, 69 average.
– GeForce GTX 1080 8 GB – 35 fps minimum, 47 average.
– GeForce GTX 1070 8 GB – 29 fps minimum, 39 average.
– Radeon R9 Fury X 4 GB – 24 fps minimum, 39 average.
So Vega Frontier Edition here has the same performance as the GTX 1080.
https://uploads.disquscdn.com/images/92ec653d61d65a72b4917b646a7c693d7a13d95ec3dbd53e3188951003584ecf.png
with Twice the price
Gods, AMD wanted $1000 for this? And people (fanboys) are saying Nvidia rips people off??
I always thought AMD fans behave like console peasants, and I was completely right.
They ask $600 more for the AIO water cooled Vega version.
Buy a quadro and game on it, Vega FE pricing is not aimed at gamers.
RX Vega will supposedly be cheaper and have better performance in games.
That Quadro will actually give you the gaming performance of its GTX equivalent.
I have read that Vega can outperform a 5000 by quite a lot in pro apps. One was over 50% faster on Vega FE, despite its obvious driver rawness.
There is more to life than playing recreational video games.
Yeah, you have read. And we have seen benches where a P4000 surpasses it.
So, there is conflicting information or some benches favor Nvidia and some favor AMD.
Titan, despite its pricing, is reportedly quite a lot slower in some pro apps than Vega FE.
“reportedly” as in a lil birdy told you.
gg..
What MSAA setting did they use? The person in the video had 2x, which is considered low in the Crysis 3 options menu. That, and all of those GPUs are getting lower frame rates than expected, or than what I got on my 1080 Ti.
EDIT
They f*cking used 8xMSAA at 1440p; of course they’re going to get lower frame rates than the video when it’s not the proper settings.
They used 2x
Where did they say that though? I looked over the article and they have no mention of what type of anti-aliasing it is, except for https://uploads.disquscdn.com/images/a0ac489fe1f4fbd2d2660996d372c22c3db7b7cca5977016a62070b68db82f92.jpg I would really like to know, because I tried Ctrl+F to search for MSAA mentions in the article and can’t find any mention of how much they’re using, except in the comments and their anti-aliasing section.
Because that is the default “high” setting that they used @4K.
So you are arguing they had it set to 8x because that is what you see there. And you did not notice that in that very same screenshot you can see that they ran it with 8x MSAA
…in 1080P.
No, I’m arguing that I can’t see what settings they are using. I don’t know gamegpu as a site, because I don’t use them for my benchmarks or know the presets they use, so I expect the settings to be in the article, so that newcomers to the site like me can know the exact settings used for each benchmark at each resolution. Just expecting the reader to know the settings because it is a retro test is a terrible argument. I don’t see what is wrong with that.
First we should all wait for the card to come out so we can have some real comparisons. But if this is true, comparing Vega to the 1-year-old GTX 1080, which can now cost less than half of Vega’s $1200 price, really isn’t impressive at all.
They won’t ever gain the >35% performance needed to get near the Ti with drivers.
HBC will also not do a thing if things cannot already be helped by the 16GB of HBM2 it uses right now.
Vega is DOA.
For only a thousand bucks for the air cooled version. $1400 for their AIO water cooled version.
It’s the powersupply
It’s the drivers
It’s the OS
It’s not a gaming card
It’s not competing with Quadro
It’s just getting warmed up
It’s not really cool enough
It’s cause the stars didn’t align
It’s insert excuse here…
AMD this is shameful.
My 980 Ti gets a better average fps at 1440p than this card; the only difference is that I use SMAA and not MSAA.
Not 1 single miner will ever buy it for mining at such a price..
Neither will any gamer lol.
That scene is definitely CPU-heavy; we need to know if it was limited in this test. Only a processor overclocked to 4.2GHz or higher will ease the CPU limitation.
What a crazy year, when AMD’s CPUs are now competitive and their GPUs aren’t. My, how things have flipped.
It outperformed the 5000 in pro apps as well.
Some but not all. As I said, those were early drivers (even for pro user standards) and would need further optimizations.
I wonder just how much AMD is pushing Vega’s clock speeds, resulting in a higher TDP.
Large compute hardware certainly increases power consumption on a GPU, but I can’t help thinking they might be needlessly pushing Vega’s clocks too high, which could result in a negligible performance increase vs a much higher power draw… That, and I wonder if we might be seeing a repeat of Polaris (higher stock voltage to increase yields; in which case, it might be fun to undervolt and increase efficiency).
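The undervolting intuition follows from the usual dynamic-power rule, P ≈ C·V²·f: at a fixed clock, a modest voltage drop cuts power quadratically. A toy sketch (both voltage figures are hypothetical round numbers, not measured Vega values):

```python
def dynamic_power(volts, freq_ghz, cap=1.0):
    """Dynamic power scales as C * V^2 * f (capacitance-normalized units)."""
    return cap * volts * volts * freq_ghz

stock = dynamic_power(volts=1.20, freq_ghz=1.60)        # hypothetical stock setting
undervolted = dynamic_power(volts=1.05, freq_ghz=1.60)  # same clock, 0.15 V lower
print(f"Dynamic power saved: {1.0 - undervolted / stock:.0%}")
```

In this toy case a 12.5% voltage reduction at the same clock saves roughly 23% of dynamic power, which is why conservatively-binned cards with generous stock voltage can often be undervolted for a large efficiency gain.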
I hope I don’t need to refresh people’s memory, but games that feature tessellation will always be unoptimized for AMD GPUs… Nvidia more or less sabotaged such things so Crysis would run shat on AMD’s hardware.
This is old news that I guess has been forgotten. Nvidia uses ‘features’ that screw over AMD users, whether that’s HairWorks, tessellation, or PhysX, which was a big one back in the past. Every feature that used to be open and free, or is made by them and is proprietary, will screw with AMD consumers, especially if it has GameWorks involvement.
The Witcher 3 suffered for AMD users because of this, Crysis as well, and all the others.
AMD is for poor people, end of story and no excuses.
Poor people have a right to live too, tho.
Who says they do besides you, the poor person. AMD is there for you, I’m not. Nvidia leads the way yet again.
I have a 1060 6GB in my rig; am I rich now? It’s just that I always buy the best price/performance option; I couldn’t care less if it’s made by AMD or NVIDIA, Intel or AMD. These petty arguments don’t get to me :)..
You owning a 1060 makes you an idiot. The 1070 is the best bang-for-your-buck option. You just don’t know any better; that’s the problem.
It’s the SC version, much cheaper and almost toe to toe with the 1070. But it looks like you are trolling now.
The memory bandwidth is lower, there’s less memory, and lower speeds. The 1060 is not in the same league as the 1070, and is priced as such, for poor people. Enjoy making excuses for your garbage card.
Why are so many sites testing this with games and not professional apps? It isn’t a gaming card. Not just DSO … but others as well. Doesn’t really make sense … wait for the regular RX Vega which should be coming out very soon.
“2xMSAA”
Benchmark discarded as meaningless.