AMD Radeon RX Vega 56 8GB Review

Status
Not open for further replies.

kjurden

Prominent
Aug 28, 2017
1
0
510
What a crock! I didn't realize Tom's Hardware pandered to the iNvidiots. AMD Vega GPUs have rightfully taken the performance crown!
 
Thanks for the hard work and in-depth review -- any word on Vega Nano?

Some 'Other Guys' (Namer Gexus?) were experimenting with under-volting and clock-boosting, with interesting results. It's not like you guys don't have enough to do already, but an under-volt-off smackdown between AMD and Nvidia might be fun for readers ...

 

pavel.mateja

Reputable
Aug 28, 2017
6
0
4,510
No undervolting tests?
https://translate.google.de/translate?sl=de&tl=en&js=y&prev=_t&hl=de&ie=UTF-8&u=https://www.hardwareluxx.de/index.php/artikel/hardware/grafikkarten/44084-amd-radeon-rx-vega-56-und-vega-64-im-undervolting-test.html&edit-text=
 


Yeah, Tom's Hardware does objective reviewing. If there are faults with something, they will call them out, like the inferior VR performance versus the 1070. This is not the National Enquirer of tech review sites like WCCFTech. There are more things to consider than raw FPS performance, and that's what we expect to see in an honest, objective review.

Guru3D's conclusion with caveats:

"For PC gaming I can certainly recommend Radeon RX Vega 56. It is a proper and good performance level that it offers, priced right. It's a bit above average wattage [consumption] compared to the competitions product in the same performance bracket. However much more decent compared to Vega 64."

Tom's conclusion with caveats:

"Even when we compare it to EVGA’s overclocked GeForce GTX 1070 SC Gaming 8GB (there are no Founders Edition cards left to buy), Vega 56 consistently matches or beats it. [snip] But until we see some of those forward-looking features exposed for gamers to enjoy, Vega 56’s success will largely depend on its price relative to GeForce GTX 1070."

^^And that's the truth. If prices of the incoming AIB cards are closer to the GTX 1080's, then it can't be considered a better value. This is not AMD's fault, of course, but that's just the reality of the situation. You can't sugar-coat it, you can't hide it, and you can't spin it. Real money is real money. We've already seen this with RX 64 prices getting close to GTX 1080 Ti territory.

With that said, I am glad to see Nvidia getting direct competition from AMD in the high-end segment for the first time since Fury, even though it's a year and four months late to the party. In this case, the reference RX 56 even bests an AIB Strix GTX 1070 variant in most non-VR games. That's promising for what's to come with the AIB variants. The question now is what's looming on the horizon in an Nvidia response with Volta. We'll find out in the coming months.
 
We've seen what they can do in a factory blower configuration. Are board manufacturers allowed to take 64 and 56 and do their own designs and cooling solutions, where they can potentially coax more out of it (power usage aside)? Or are they stuck with this configuration as Fury X and Fury Nano were stuck?
 
No, there will be card vendors like ASUS, Gigabyte, and MSI who will have their own cooling. Here's a review of an ASUS RX 64 Strix Gaming:

http://hexus.net/tech/reviews/graphics/109078-asus-radeon-rx-vega-64-strix-gaming/
 

pepar0

Distinguished
Jul 21, 2016
27
1
18,535

Will any gamers buy this card ... will any gamers GET to buy this card? Hot, hungry, noisy, and expensive due to the cryptocurrency mining craze was not what this happy R9 290 owner had in mind.
 

filipcristianstroe

Reputable
Dec 1, 2015
94
0
4,640
LOL... Vega 56 > 1070, but that's not what I'm amped about. AMD needs to get their Mod Edit Language together. Don't you guys see the Vega 56 beats the Vega 64 in Witcher 3 at 1440p? LOL, what in the world?

 

FormatC

Distinguished
Apr 4, 2011
981
1
18,990
The linked review was done with an unstable third-party tool, and the results are mostly not plausible. I tried to reproduce those values a few times, and it won't work for me. It is very difficult to change Vega's voltage and get really stable results. It is simply not my style to publish click-bait reviews instead of reproducible, serious results. Sorry for that ;)

BTW:
You can undervolt it a little bit, but you also have to analyze the frame times! FPS figures alone say simply nothing about how smooth the result looks. With all those ups and downs you get a horrible, micro-stuttering result. I have written about this effect a few times.
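The frame-time point above can be made concrete: two runs with nearly identical average fps can feel completely different once frame-time variance is considered. A minimal sketch, using hypothetical frame-time data (not measurements from this review):

```python
# Two hypothetical 10-frame runs with (almost) the same average fps,
# illustrating why average fps hides micro-stutter.
smooth = [16.7] * 10            # steady ~60 fps pacing
stutter = [10.0, 23.4] * 5      # alternating fast/slow frames, same total time

def avg_fps(frame_times_ms):
    """Average fps over the run: frame count divided by total time."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def p99_ms(frame_times_ms):
    """Approximate 99th-percentile frame time; a common stutter metric."""
    return sorted(frame_times_ms)[int(0.99 * (len(frame_times_ms) - 1))]

print(round(avg_fps(smooth), 1), round(p99_ms(smooth), 1))    # ~59.9 fps, 16.7 ms
print(round(avg_fps(stutter), 1), round(p99_ms(stutter), 1))  # ~59.9 fps, 23.4 ms
```

Both runs average ~60 fps, but the second one spends every other frame at 23.4 ms, which is exactly the kind of micro-stutter an fps-only chart never shows.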

 


You're ignoring the Titan Xp because it isn't really a consumer gaming card, right? And the 1080 Ti, because... reasons? There's an outside chance of taking the value crown. I'd go with that, assuming everyone in this thread who wants one is able to buy one today for <$400.
 
@ 10Tacle... "Yeah Tom's Hardware does objective reviewing" Just no... they are not.

1080p CPU benches without 1440p and 2160p counterparts, just for example. This is manipulation that can drive sales.

Guy number 1 checks benches on Tom's: "Oh, Ryzen sux, I am not buying it for my next 1440p system."

Guy number 2 checks benches on KitGuru: "Oh, Ryzen offers the same gaming experience at 4K as Intel... and they kick Intel's butt all over the place in multi-threaded applications... I am buying it for my next 1440p system."

See, I just proved you wrong.
 


This card is the strongest miner at the cheapest MSRP. It will sell really well... unfortunately for gamers.
 


I was wondering when you'd show up complaining. I guess you missed the Guru3D article on this GPU earlier this month and its generally SAME conclusions. You assert that Tom's is biased against AMD every time there's a single negative comment in a review of an AMD product; provide evidence to back that up. You cannot, and you know you cannot.

And your 1080p benchmark argument is a fail, because EVERY major tech review website uses 1080p as a CPU gaming benchmark. This is not new, either. You can go back TEN YEARS on Tom's Hardware, Guru3D, and others, who ran a 1280x1024 CPU gaming benchmark resolution when the high-end resolutions were 1600x1200 (the 2K of its day) and 1920x1080 (the 4K of its day). The other reason your 1080p argument is a fail is that there are a lot of gamers out there with 144Hz 1080p monitors. The rest of your comment is just hypothetical nonsense with no statistical data to back it up, and you know it.
 

P1nky

Distinguished
May 26, 2009
61
27
18,560
Your sweet spot graph is wrong. The right vertical axis numbers are for Watt/FPS (what?!), not FPS/Watt.

Isn't there a better way to measure the sweet spot than a "bulge"?
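The axis-label complaint matters because FPS/Watt and Watt/FPS are reciprocals, so mislabeling one as the other inverts how the curve should be read. A small sketch with hypothetical (power, fps) points along a voltage/clock sweep (illustrative numbers only, not from the review):

```python
# FPS/Watt (efficiency: higher is better) vs. Watt/FPS (cost per frame:
# lower is better). Hypothetical (power_draw_w, fps) samples along a sweep
# from undervolted to maximum clocks.
samples = [(180, 55.0), (210, 70.0), (290, 76.0)]

for watts, fps in samples:
    fps_per_watt = fps / watts   # what the axis label claims to show
    watt_per_fps = watts / fps   # the reciprocal the numbers actually fit
    print(f"{watts:>3} W: {fps_per_watt:.3f} fps/W, {watt_per_fps:.2f} W/fps")

# The "sweet spot" is where fps/W peaks, here the middle sample:
best = max(samples, key=lambda s: s[1] / s[0])
print("sweet spot:", best)
```

With these made-up numbers the middle point wins on efficiency even though the last point has the highest raw fps, which is the "bulge" the graph is trying to show.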
 

caustin582

Distinguished
Oct 30, 2009
95
3
18,635
Just increase the clock rate and pump enough power into a card until it edges out the competition in raw performance. What an elegant strategy, AMD.

I'd be interested in seeing benchmark comparisons between a 1070 and a Vega 56, both OC'd to their max stable frequencies with the same temperature caps. Something tells me the 1070 would win by a long shot every time. I honestly wish AMD would put out something to get really excited about, but it looks like they just gave up and went with the brute-force approach.
 

artk2219

Distinguished
Honestly, the most interesting part about Vega is that it can be an efficient architecture if you're not chasing absolute performance. This bodes well for the mobile and APU implementations of it. That power bleed when you start chasing the rabbit is interesting, though; you also see it with Ryzen when it starts doing the same thing on the same process, so it seems like there are definitely some tweaks that need to happen to both the process and the architecture. I'm not sure if we will see a part akin to the Radeon HD 4890, which was basically a cleaned-up and tweaked 4870, but I'm hoping we see something like Thuban on Ryzen's side: a cleaned-up and tweaked Phenom II with more cores added. Honestly, though, we may not see any fixes until 7nm, and this may just be a placeholder for Navi (there's always next year; it's like a recurring joke almost :-/).
 


Well, in all fairness to AMD, they don't have the resources to back development of both a new CPU and a new GPU simultaneously. Also remember that Nvidia is not making CPUs; they can focus solely on GPUs. Not so with AMD. AMD put most of their resources into Ryzen, which was a good move. It is earning them much-needed revenue.

But lackadaisical overclocking has been the general pattern of AMD GPUs for quite some time. They are nowhere near as overclock-friendly as Nvidia GPUs. When the Fury X came out, it was on par with a reference GTX 980 Ti. However, it had very little overclocking headroom, as it was already near maximum clocks out of the factory. An AIB vendor-overclocked 980 Ti beat it, and overclocked on top of that, it destroyed the Fury X.

But if you want to get a hint of things to come, check out my Hexus link above in a previous post showing a pre-production ASUS Strix "ROG" Gaming RX 64 and comparisons with the reference RX 64:

(Reference vs. Strix FPS at 2560x1440)
Battlefield 1 - 71 vs. 75
Deus Ex - 55 vs. 60
Fallout 4 - 66 vs. 70
Warhammer - 100 vs. 103
Witcher 3 - 75 vs. 79

Now, the last ASUS GTX 1080 Strix tested at Guru3D got this in the only shared game, Deus Ex (reference vs. Strix vs. overclocked Strix at 2560x1440): 69 vs. 75 vs. 81.

Hexus tested on a 7700K overclocked to 4.6GHz and Guru3D on an i7-5960X overclocked to 4.4GHz, so really there's no difference there at 1440p. My guess is the lower numbers of the pre-production RX 64 Strix were due to early drivers. But it does give you a comparative snapshot of reference vs. AIB aftermarket GPUs between Vega and Pascal: pretty close. The key, as you said, will be what the higher-clocked AIB Vegas can do overclocked beyond out of box.
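For a quick sanity check, the reference-vs-Strix numbers quoted above work out to modest single-digit gains:

```python
# Percent uplift of the pre-production RX Vega 64 Strix over the reference
# card, from the Hexus 2560x1440 numbers quoted in the post above.
results = {
    "Battlefield 1": (71, 75),
    "Deus Ex":       (55, 60),
    "Fallout 4":     (66, 70),
    "Warhammer":     (100, 103),
    "Witcher 3":     (75, 79),
}

for game, (ref, strix) in results.items():
    uplift = 100.0 * (strix - ref) / ref
    print(f"{game}: +{uplift:.1f}%")   # roughly +3% to +9% per title
```

So the aftermarket cooler and clocks buy around 3-9% over reference, in line with the kind of gap Guru3D saw between the reference 1080 and the 1080 Strix.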
 

king3pj

Distinguished
This seriously looks like a great card. If it had been available with third-party coolers last July, I would have bought one with a 1440p 144Hz FreeSync monitor instead of my 1070 and 1440p 144Hz G-Sync monitor.

It wasn't a surprise that it beat the 1070 in DX12 and Vulkan games like Battlefield 1 and Doom, but what was surprising to me is that it is essentially equal to the 1070 in DX11 games like The Witcher 3. Unfortunately, it was just a year too late for me, and now that I've invested in a G-Sync monitor, I don't see how AMD can win me back in the foreseeable future.
 

artk2219

Distinguished


The hardest part is finding it for a decent price; we've put up with worse in the past. Or does no one remember the GTX 480, the Nvidia FX 5000 series, and the Radeon 2900 series, or the blowers on the X800 and X1800/X1900 cards?
 