DudeRandom84 has benchmarked the AMD Radeon Vega Frontier Edition in PREY and Rise of the Tomb Raider, and shared his results (which are really interesting). PREY is a title that AMD chose to showcase its RX Vega GPU at Computex 2017; however, it appears that AMD’s latest graphics card runs this game noticeably slower than NVIDIA’s reference GTX1080.
Now we’ve said it before and we’ll say it again: the AMD Radeon Vega Frontier Edition is primarily meant for workstation users, not for gamers/enthusiasts. Still, these results will give you an idea of what you can expect from the RX Vega. Because let’s be realistic here: while the RX Vega will be faster in games than the Vega Frontier Edition, it won’t offer a 40% performance boost (and if you are expecting something like that, we strongly suggest lowering your expectations).
As we can see, at 1440p and with all graphics settings maxed out, the AMD Radeon Vega Frontier Edition runs PREY almost 10fps slower than the NVIDIA GTX1080, and almost 40fps slower than the NVIDIA GTX1080Ti. There are some scenes in which the AMD Radeon Vega Frontier Edition comes a bit closer to the NVIDIA GTX1080 (a performance difference of only 4-5fps), and there was only one scene in which the AMD Radeon Vega Frontier Edition could top the NVIDIA GTX1080.
Rise of the Tomb Raider behaved similarly to PREY. At 1440p on Very High settings (but without PureHair), the game ran faster on NVIDIA’s hardware. For the most part, the AMD Radeon Vega Frontier Edition performed around 10fps slower than the NVIDIA GTX1080, and almost 30fps slower than the NVIDIA GTX1080Ti. As with PREY, there were some scenes in which the AMD Radeon Vega Frontier Edition was able to come close to the NVIDIA GTX1080.
AMD will officially release its RX Vega graphics cards at the end of July, so it will be interesting to see the performance difference between the workstation and the gaming variants of the Vega architecture. As we’ve already said, our guess is that the RX Vega will perform somewhere between the NVIDIA GTX1080 and the NVIDIA GTX1080Ti, and from the looks of it this is what AMD fans should – realistically – expect.
Enjoy!

John is the founder and Editor in Chief at DSOGaming. He is a PC gaming fan and strongly supports the modding and indie communities. Before creating DSOGaming, John worked on numerous gaming websites. While he is a die-hard PC gamer, his gaming roots can be found on consoles. John loved – and still does – the 16-bit consoles, and considers the SNES to be one of the best consoles. Still, the PC platform won him over. That was mainly due to 3DFX and its iconic dedicated 3D accelerator graphics card, the Voodoo 2. John has also written a higher degree thesis on “The Evolution of PC graphics cards.”
AMD will always be the better choice if you’re poor and don’t care about lower power consumption and quality.
Weird, my R5 1600 is the best CPU in the $200-220 range, bar none.
That’s not weird at all. With higher power usage and lower performance, you get what you pay for. Intel may have stagnated, but that’s only because until Ryzen they haven’t had any need to make big moves. Once you get a job that pays more than 7 dollars an hour, you’ll be able to run Intel parts. One day Russell, one day!
Ryzen has the best power usage in the business at the moment. You need to read some reviews. You’re looking quite dumb atm:
https://uploads.disquscdn.com/images/f8300c95c2ce6f483b0e02ecabb7b30a7533573a66ca6f3b8cc2638b9b4b4127.jpg
Incredible! You must feel better about your poor man’s purchase. Serious computer users still prefer Intel for good reason.
I was using a 3570k oc’d to 4.5ghz before this. It was the most logical progression for me. No need to be defensive. It’s okay to be wrong sometimes.
Do you see a significant performance boost with the R5? I have a 3770K OC’d to 4.5 GHz and I’m thinking about changing to an 8-core Ryzen. But maybe I’ll wait for Zen 2.
AMD has too many problems, unless you’re poor stick to intel.
“AMD has too many problems”
That’s why I am going to wait until Zen 2 arrives. And do not worry about me. I have money for Intel too. 🙂
Yeah you should just waste money for no reason. One thing I really enjoy about these sites is knowing most nvidia and Intel fanboys could never afford the higher end hardware that they like to fanboy about.
I did. I thought I’d see a slight decline in fps in some games, but that’s only been the case for some older DX9 titles. Windows feels zippier, that’s for sure. Productivity is through the roof and benchmarking is a lot more fun.
Don’t wait for Zen 2. I am currently running the i7 3770 and went up against my buddy’s new R5 1600, both stock. He beat me by 50% in physics scores on Fire Strike and other benches… I was blown away by this. I am an Intel guy. Always have been. Ryzen is truly showing its face as a disruptor.
Buying a CPU to lead in benchmarks, which are known to be absolutely useless versus real-life scenarios, isn’t exactly what you wanna do. Right?
Because if you dig deeper you quickly realize that Ryzen is already nearly hitting its overclock ceiling at stock clocks, while any recent Intel blasts it away with its OC. And then you end up with the hard fact that in games a modern quad-core i7 is still faster than a top-of-the-line 8-core Ryzen.
What really happens here is AMD pulling the DX12 thing again. They are the first to adopt some new tech at the consumer-grade level (DX12 then, 8 cores now) and make the most of it while the competition has nothing to offer yet. But see, nobody is joking about NVidia and DX12 anymore; the very next generation of GPUs completely obliterated AMD’s offerings for the next 3 years. Intel isn’t stupid either; thanks to AMD they stagnated and were sitting on new tech for years, doing mere sidegrades of their boring i5/i7 lineups – but the very next line of their CPUs may do to Ryzen what NVidia’s Pascal did to the Radeons.
elaborate on these reasons plzz
For multi core performance and bang per buck, ryzen has Intel soundly beaten currently and actually runs pretty efficiently.
GPU-wise though, it’s a different story. I think the 480 and 580 are decent enough mid-range GPUs, but the highest-end segment is dominated by Nvidia.
I really wonder what your computer specs are (with a photo for authenticity reasons), because if you think AMD is for the poor, you rich boy should have an i7-6950X with dual Titans and a 4K 32″ GSync monitor (or two)…
Unless you’re not that rich…
?
this article is really blowing it out of proportion
:::Fanboy alert:::
Really isn’t.
Everything coming out does seem to indicate Vega being a disappointment, including reviews showing a water-cooled FE at 1600MHz struggling to match the 1080 in many games, sometimes being closer to a 1070, whilst in some games it does exceed the 1080.
Then we find things on GAF such as this (referring to the Gamers Nexus review):
“Where they don’t have Vega at 1600 MHz but at stock frequency, which is a range of roughly 1300-1500 MHz, mostly settling down at 1440 MHz.
PCGH did comparisons at 1050 MHz for Fiji and Vega, and for Vega at 1600 MHz, where they increased the power target and fan speed to make sure it stays at that clock speed.
One part of the synthetics makes it quite clear that Vega is bandwidth starved.
C&P:
– Effective Texture Bandwidth
Now here things are getting really interesting.
That’s a bandwidth test where two different types of textures are tested.
One black texture and one with random colors.
Since recent GPUs use Delta Color Compression techniques, you see a big difference between a black texture, where no color deltas are found and the compression can be optimal, and a randomly colored texture, where the compression effectively doesn’t work.
Nvidia is quite the king here; the difference between the best case and worst case is about 105-130% in bandwidth.
GCN Gen 3 only manages 17%, GCN Gen 4 47%.
One possible speculation was that Nvidia’s color compression might not be that much better than AMD’s, but that the tile-based renderer is helping the color compression technique in addition.
But the DSBR seems currently to be inactive as the triangle bin test doesn’t indicate any tiling.
Without the DSBR the results are 52-60% better for GCN Gen 5 with the black texture.
The range is maybe too small to call it a clear improvement over GCN Gen 4, maybe a few percent.
What’s more interesting are the results with the random coloured texture, where the achieved bandwidth is actually 24% lower than the Fury X’s; you would expect around 6% (484 GB/s vs. 512 GB/s), but not 24%.
The bandwidth utilization is miserable and explains the limited scaling seen with Vega.
You also see 20% higher memory copy throughput in Aida (336 GB/s vs. 303 GB/s) with the Fury X in comparison to Vega.”
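To get an intuition for why a black texture and a randomly coloured one behave so differently in that test, here is a minimal Python sketch (my own illustration, not anything from the PCGH or Gamers Nexus testing). It uses zlib as a stand-in for the GPU’s delta colour compression – real DCC works very differently, on small pixel blocks and in fixed ratios – but the underlying principle is the same: data with zero deltas compresses almost perfectly, random data hardly at all.

import os
import zlib

SIZE = 1024 * 1024  # 1 MiB of raw "texture" data

black_texture = bytes(SIZE)        # all-zero bytes: every pixel identical, all deltas are zero
random_texture = os.urandom(SIZE)  # random bytes: neighbouring pixels share nothing

for name, tex in (("black", black_texture), ("random", random_texture)):
    compressed = zlib.compress(tex)
    print(f"{name:6s} texture: {len(compressed):>9,d} bytes after compression "
          f"(~{len(tex) / len(compressed):.1f}x smaller)")

The black texture shrinks by orders of magnitude while the random one stays essentially the same size, which is why a GPU whose colour compression (and tiling) is working well shows such a large spread between the two cases, and why Vega’s small spread looks suspicious. Note also the arithmetic in the quote: 484 GB/s vs. 512 GB/s of raw bandwidth would only explain a deficit of (512 − 484) / 512 ≈ 5.5%, so a 24% gap really does point at poor bandwidth utilisation rather than at the slightly narrower memory bandwidth.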
“The card throttles heavily with higher clocks, seemingly from insufficient power supply and HBM2 temperatures.”
So we are seeing that, for RX Vega to even compete with a stock 1080, AMD will have to up the power draw heavily to get higher clock speeds (it’s already pretty power hungry) and somehow price it low enough to compete with the 1080 despite having HBM2, while potentially still underperforming in non-DX12/Vulkan titles.
I can’t help but think Vega is a bit of an unmitigated disaster personally.
We can’t say for sure until the actual gaming graphics card is released. Then if it sucks everyone can dog pile on them.
This article is mind-blowing…
Prey is AMD sponsored and Rise of the Tomb Raider is a DX12 game. 😀
“Prey is AMD sponsored and Rise Of the Tomb Raider is nVIDIA sponsored”
FTFY (and stop with the double standards already)
?
PCper just did a review… It’s not beating the 1080 in anything.
I never said anything about performance, did I?
Nope but I just told you.
I know where hardware stands, thank you.
nope
what are you smoking ? their CPU’s are dogshit gaming wise compared to their Intel counterparts. L O L
*in 1080p.
in 1440p+ the difference is too small to care about.
and gaming is not the only purpose of a CPU. Ryzen is pretty much the CPU of choice at AnandTech, TechSpot, Tom’s Hardware and many more.
It’s just like Rx 480 vs GTX 1060 all over again. In some games the former wins while in other games the latter wins. So just get whichever card has better driver optimisation and lower TDP. Most importantly whichever is cheaper.
“It’s just like Rx 480 vs GTX 1060 all over again”
Only Vega FE is not beating the 1080 in anything…
The gaming variant of Vega might. I was talking about that. When the gaming edition of Vega releases later this month, I’m sure it will be GTX 1080 vs RX Vega, just like it happened with the RX 480 vs GTX 1060.
Vega will win in some benchmarks while the GTX 1080 will win in others.
Pretty much. Vega FE is “not a gaming card”. 13 days 17 hrs 01 min 12 sec till we get to see what the RX Vega gaming GPUs can do.
And after that the excuses are DONE.
AMD cards are not gaming cards…
…
…
…
…
…they’re mining cards
But but the drivers are gimped.
But but the games are paid by nvidia to make AMD run worse.
But the games will run better when vulkan, aka the second coming of jesus, becomes the dominant API.
But the reviewers are nvidia shills out to get AMD.
What excuses?
It’s only 2 more weeks. They will release card, I will check which card for $500 offers most, and buy that one. All the other stuff is fanboism.
The mid-range 8GB RX Vega scored 32k on 3DMark, 5k above the 1080. The 1080Ti got 38k mind you, so if you’ve got the 250 euro extra then go for it. Got a feeling that if they bring out the 16GB one on the same day as the others, it’s gonna be priced around the same as the 1080Ti with likely similar performance. For the sake of the CUDA cores needed for my work I’ll still be running a 1080Ti, but other than that I’d say Vega looks excellent.
There is no 16GB RX planned.
I really hope RX Vega will be at least on par with the GTX 1080, because without competition from AMD’s side, Nvidia will push prices even higher. But damn, that 1080Ti is a beast; it’s much faster than my GTX 1070 and I’m considering buying that card. The GTX 1070 is good at 1440p, but more and more games require a faster GPU if you don’t like turning down details and still want to keep 60fps.
So far that’s only true for the flagship cards (x80 and x80Ti), but even with the flagship, Nvidia still has not crossed the $700 mark for their fastest card since 2013. The x70 and below have had pretty much the same price for several generations. But Nvidia is not the only one that wants to price their GPUs much higher; AMD is the same. The Fury X ending up at the same price as the 980Ti was mostly because of performance, and a few months after releasing the Fury X, AMD released the much slower Nano at the same price as the Fury X. Yeah, I know some people said AMD charged a premium for its efficiency and size, but what do you think the public reaction would be if Nvidia decided to sell a mini-ITX 970 for the price of a GTX980? And recently AMD themselves have been talking about the bigger profit margin on more expensive products, hence they released their own “Titan” with Vega FE.
In my country prices are a little bit different. For me the GTX 1070 (a mid-range GPU) was more expensive than my previous card, the GTX 680 (high end).
This is how prices look in my country now (that’s the final price you have to pay in shops, so taxes are included):
GTX 1070 – 520 euro
GTX 1080 – 660 euro
GTX 1080Ti – 850 euro
And the thing is, the 1080Ti is not the fastest GPU from Nvidia; their best GPU is available only in the Titan series for 1230 euro. Around 10 years ago I bought an 8800 Ultra (Nvidia’s ultra high end) for half of that price.
Could it be because of the exchange rate? In my country it was like that. Now the US dollar is more expensive than it was before, so the 1060’s current price is about the same as the 970’s was when it first came out. But in a case like that I don’t take it as Nvidia pushing prices higher.
Yes, it’s partly because of the exchange rate, but there are also taxes in my country that the final price has to include. From my perspective Nvidia pushed the prices of their GPUs much higher, but the thing is, at the same time they also renamed their cards, so most people don’t even realize what they have done. Years ago their best GPU in each generation was sold in the x800 GTX series (for example the 8800GTX) for around 600 euro. Now in the x80 GTX series Nvidia sells just their second-best GPU of each generation, while their best GPUs are sold only in the Titan series (and those are 1000-1200 euro cards). Of course, one year later you can buy a cut-down version of it (the x80 “Ti” series) at around 700 euro, but it was still cheaper before and you didn’t have to wait a whole year.
Well, Nvidia is a money-making company; they need to think about how to maximize profit. Before, they could push double the performance when moving to a new generation, and they relied a lot on node shrinks for that. But with how things are right now that’s no longer true; we are pretty much nearing the limit on how small the process node can go. Making a new architecture is not cheap either: Nvidia spent 3 billion just to make Volta happen. AMD is still sticking to GCN, and we already see how, even with a node shrink, it did not give them the efficiency that they want (440W @ 1600MHz for Vega?). So whether it’s Nvidia or AMD, releasing cards the way it was done a decade ago is no longer a valid strategy for either company.
10 years ago there hadn’t been ten years of inflation
New tech never comes cheap. You neglect to mention Fury’s production costs were partially higher due to limited supply & the high cost of HBM!
Yes, HBM is expensive, but general consumers mostly only care about the end performance. It doesn’t matter if the card is very expensive to make; if the card has X performance then the price should reflect its performance, not the cost to make it. That’s why AMD developed HBCC for Vega and only uses two stacks of HBM instead of four like the Fury X; it is most likely to keep costs under control.
AMD is probably still targeting the 1080 (non-Ti) with Vega, but it’s possible they may release a full-blown water-cooled monster that could challenge the 1080 Ti. It’s going to take a lot to dethrone the 1080 Ti though, because it truly is a beast, as you said.
And yes, Nvidia badly needs some competition for their 1070, 1080 and 1080 Ti.
Not with this generation of Vega. They will have to push clock speeds waaaaay up, and the power draw would be beyond a joke (it isn’t far off already).
Is the horse called Vega ?
yes, and it’s a lame stallion not fit for racing
More like a pony. A sad pony, looking at how the Fury X is still the fastest AMD card and it’s been 3 years.
HA xD
But teh magix drivers have not been released yet to increase vega’s performance by 80%, this is no fair 🙁
I expected more.
I really don’t think it’s a good idea to draw too many firm conclusions about how RX Vega may perform using the Frontier as a comparison. The RX Vega XTX is supposed to come with a superior water cooler and as such may have higher clocks than the Frontier. AMD will also have had more time to tweak the drivers for some additional gain in gaming performance. Then again, maybe big Vega is already near its limits, and even with a higher power draw there may not be much improvement over the Frontier. We’ll see soon when AMD lifts the NDA and thorough reviews come out.
People have already used a water-cooling mod on the FE and pushed clock speeds, and the results weren’t outstanding tbh.
Ah yes, but AMD scores well in the rise of the phoenix game.
Those core and memory clocks seem really low on the Vega FE…
Is this normal?