AMD Radeon RX Vega 56 8GB Review


michael_732

Prominent
Jul 30, 2017
I'll buy this card in a heartbeat if I see it at $425.00 or less. I'll slap a water block on it and put it in my loop; heat and noise solved. I don't love the power draw, but I do some OpenCL, and it'll whoop up on my old desktop's RX 480.
 

caustin582

Distinguished
Oct 30, 2009


Sure, I can think of a host of reasons why it would be understandable that AMD couldn't produce a superior product, some beyond their control and some not, but all that matters to the consumer in the end is that they didn't produce a superior product. At the Newegg checkout, a rational and informed consumer isn't going to go, "Well, AMD really did their best, and they're also busy making CPUs to compete with Intel, so I'll just go ahead and buy their graphics card because they deserve it."

I honestly can't think of a reason to get this card outside of fanboyism, or maybe severe overpricing on the 1070 due to miners buying them up (which is almost certainly going to happen with the 56 as well). I see people saying you can just put a water cooler on it, but you can do the same exact thing with a 1070 and overclock it much higher. Hell, water blocks are $100+; just buy a 1080 at that point.
 

none12345

Distinguished
Apr 27, 2013
At $400 in the current GPU market, this card is a steal. At $500, not so much; you might as well spend a bit more and get a 1080 at that point.

With 1070s going for $450, Vega 56 should easily be able to command the same price. It's a faster card.

If I can find one of these near $400 with an aftermarket cooler in the next few months, I'll probably buy one. But I don't want a reference blower cooler; they suck.

However, I highly doubt that will happen. The cards will likely be $460 to $500 with an aftermarket cooler, and then I just won't buy anything. I really don't wish to spend that much on a 1080p card. Every GPU on the market, at any price, is too slow for 4K in my opinion, and overkill for 1080p, so I'm not buying those cards either. And I really don't want to take the half step to 1440p, where a 1080 Ti is in its prime.

So... if I can't find a cheap Vega 56, I guess screw buying a card anytime soon. (A $320 1070 would also work, but they're going for $450 right now, and there's no way in hell I'm paying that for a 1070 over a year after its release.)

I would also entertain the thought of buying a 580 if I could find one at $220, but that isn't as big an upgrade as I really want. I'd rather get a 580 at $220 than a Vega 56 at $420, though. But I'm not touching a 580 at current market prices ($350-550 is stupid).
 

Kunra Zether

Reputable
Jun 25, 2016
Why are we even comparing it to the 1070? When its price settles and the AIB cards come out, it will be over $500 and at the same price point as the 1080, so that will be its real competition. I just don't understand the point of releasing a card to compete with cards that have been out for well over a year. It's fine to have mid-range cards, but if I were going to release a card to go after the high end, it would have been one that beats the 1080 for a lower price and one that beats the 1080 Ti for a lower price. Also, if prices weren't gouged at the moment due to mining, the 1070 would still win out in price-to-performance, since its AIB cards were selling as low as $350, and I've seen some lower than that. I got my OC Strix 1070 for $380 back in February.
 

bit_user

Polypheme
Ambassador

Even when the mining craze settles down, I think it'll be a popular choice for deep learning. Vega was pretty much designed for that.

Sure, the V100 is better, but it's so expensive that Vega still beats it on price/performance. Beyond that, the V100 only makes sense if you can afford one.
 

pepar0

Distinguished
Jul 21, 2016


Deep learning might have an ROI, but can it be anywhere near that of mining?
 
As a self-imposed policy, I don't upgrade my GPU until I can get something twice as fast for the same amount as I spent on the last GPU; this has usually meant an upgrade once every 2-3 years. My current GPU is an R9 390, and only the 1080 Ti is twice as fast, but it also costs 2.5x as much as my 390 did. I guess it'll be another two years, maybe, before 1080 Ti performance can be had for US$ 400(?).
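A minimal sketch of that rule of thumb, purely for illustration; the function and the numbers below are hypothetical placeholders, not figures from the review:

```python
# Hypothetical illustration of the "2x performance at the same price" rule.
# All figures are made-up placeholders, not benchmark data.

def worth_upgrading(current_perf, paid, candidate_perf, candidate_price):
    """True if the candidate is at least twice as fast as the current card
    and costs no more than what the current card cost."""
    return candidate_perf >= 2 * current_perf and candidate_price <= paid

# e.g. an R9 390 bought for ~$400 vs. a ~2x-faster card priced at $700:
print(worth_upgrading(current_perf=1.0, paid=400,
                      candidate_perf=2.0, candidate_price=700))  # False
```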
 


I was thinking the same thing when I read that. It's not particularly strange that their "sweet spot" didn't land directly on any of AMD's presets, when they only tested power requirements for one game with one hardware setup on a single sample of a card, and using in-game settings that relatively few people would actually use.

Perhaps the optimum efficiency level, and power use in general, would differ in a less demanding game running at a higher refresh rate. How many people are actually going to buy a $400+ graphics card to play at settings where they average just 35fps? The vast majority of people buying a card like this will undoubtedly be targeting framerates above 60fps. Even the review points out that it's more of a 1440p card than a 4K card, so gaming at 4K won't likely be the most common usage scenario. And of course, their data set is limited to one game, and other games or settings may affect the card's power use differently depending on how the GPU or its memory are utilized.

Also, it wouldn't be bad for them to include 1080p performance data for this level of card. At the maxed settings they used, many of these games dropped well below 60fps at 1440p. I'm sure plenty of those considering a Vega 56 would be using it with a 1080p screen, probably far more than the number getting the card for 4K. High refresh rate screens have become more common too, so it's worth looking into how these cards perform at lower resolutions and higher refresh rates. Perhaps the CPU will become more of a limiting factor in some games, but showing that could be considered a good thing, since it would indicate to a potential buyer that there might not always be much advantage in spending hundreds of dollars more on a higher-end card at lower resolutions.
 

bit_user

Polypheme
Ambassador

Depends on what you're using it for. Right now, the biggest prize at Kaggle is $1.5M.

https://www.kaggle.com/competitions

Early last year, I helped a friend build a machine for these competitions. At the time, it was still much more cost-effective than renting time on GPUs in the cloud.

Plus, lots of startups & established businesses are embracing it. No doubt it has sold far more $$$ worth of computing hardware than VR has.
 

FormatC

Distinguished
Apr 4, 2011
To find the sweet spot, you need the application/game with the highest power consumption, maximum workload, and memory stress. I tried a lot of different games beforehand, but things like Doom, or other games at lower resolutions (Full HD) with CPU bottlenecks, will deliver you only garbage. If we talk about a sweet spot, we must exclude as many external factors and influences as possible. I need reproducible values, nothing else. For each wattage I made a handful of runs (a minimum of five) to get a stable average for fps and wattage. I sat for over 8 hours on this part alone, and doing this with 5 games would mean 40 hours. That would be absurd. If you prepare your runs with a plan, the result is more reliable than any "undervolting sensation" based on a single run and power consumption measured at the wall.
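For what it's worth, a minimal sketch of that kind of averaging, with entirely hypothetical numbers standing in for real measurements:

```python
# Sketch of the methodology described above: multiple runs per power target,
# averaged, then compared as fps per watt. All values are hypothetical
# placeholders, not measured data from the review.

def average(values):
    return sum(values) / len(values)

# power target (W) -> list of (fps, measured watts), five runs each
runs = {
    165: [(52.1, 168.0), (51.8, 167.2), (52.3, 168.5), (51.9, 167.8), (52.0, 168.1)],
    210: [(57.0, 214.0), (56.8, 213.1), (57.2, 214.6), (56.9, 213.8), (57.1, 214.2)],
}

for target, samples in runs.items():
    fps = average([f for f, _ in samples])
    watts = average([w for _, w in samples])
    print(f"{target} W target: {fps:.1f} fps avg, {fps / watts:.3f} fps/W")
```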
 

bit_user

Polypheme
Ambassador

Depends on how much you care about power-efficiency. I got a new EVGA 980 Ti FTW for $450 (after MIR), when the GTX 1070 launched.

That said, we won't see quite such steep discounting after the Volta consumer cards launch, since Pascal -> Volta probably won't be as big a jump as Maxwell -> Pascal.

I'd say upgrade whenever you need/want to do something your current hardware isn't fast enough for, and there's something in your price range that is.
 

bit_user

Polypheme
Ambassador

Ah, but the relative stress on compute vs. memory will shift, depending on the game.

Look, I'm just saying it's not really fair to complain that AMD didn't have a preset exactly at the sweet spot you found for that game, at those settings. They got right in the ballpark, with plenty of headroom for users to tweak. IMO, that's what counts.

BTW, thanks for your diligent testing. My comment wasn't meant to imply that you should've tested other games or settings, just that the conclusion you can draw from the testing you did is slightly limited.
 

FormatC

Distinguished
Apr 4, 2011
It is a translation, but the point stands: we have too many (redundant) profiles for too little effect. Maybe I was a bit too emotional during the tests (I got no sample in the first round, made a lot of phone calls, and had to wait nearly two weeks to get at least one sample). If AMD prefers to use influencers and some funny YouTube cheerleaders to get more "positive reviews", that is not my decision. But when you have to keep a schedule under time pressure, with only a few days left to finish all of this, this stupid Game of Samples (GOS) with this kind of pre-selection simply sucks. :D
 

CaptainTom

Honorable
May 3, 2012
Alright seriously - how are you guys getting such terrible hashrates? LOL


I am getting 42 MH/s ETH + 1400 MH/s SC dual mining. I have to mention this so everyone understands that miners are indeed buying up Vega because they are the best mining cards...
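For context, a rough way to think about mining efficiency, using the hashrates quoted above but an assumed power figure (the power draw below is a placeholder, not a measurement):

```python
# Rough mining-efficiency arithmetic. Hashrates are the ones quoted above;
# the board power is an assumed placeholder, not a measured value.

eth_hashrate_mhs = 42.0        # quoted ETH hashrate, MH/s
sc_hashrate_mhs = 1400.0       # quoted Siacoin hashrate (dual mining), MH/s
assumed_board_power_w = 200.0  # hypothetical power draw under mining load

print(f"ETH: {eth_hashrate_mhs / assumed_board_power_w:.3f} MH/s per watt")
print(f"SC:  {sc_hashrate_mhs / assumed_board_power_w:.2f} MH/s per watt")
```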
 

Kewlx25

Distinguished
Almost double the power consumption for one-fifth more gaming performance is what we'd call catastrophic.

Better than my Nvidia 1070, where the last 3% of performance costs about a 100% increase in power. Reducing my stock frequency from 1950 MHz to 1830 MHz nearly cut my power draw in half. I say "stock" because it runs at turbo all the time. I did no modifications; it's just an out-of-the-box MSI Gamer 1070, the cheapest one.
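A minimal sketch of why a small clock drop can cut power so sharply: dynamic power scales roughly with frequency times voltage squared, and the top of the clock curve demands disproportionately more voltage. The voltages below are assumed examples, not readings from that card:

```python
# Dynamic power scales roughly as P ~ f * V^2, so the last few percent of
# clock speed, which needs extra voltage, is disproportionately expensive.
# The voltages here are assumed illustrations, not measured values.

def relative_dynamic_power(freq_mhz, volts, base_freq_mhz, base_volts):
    return (freq_mhz / base_freq_mhz) * (volts / base_volts) ** 2

# Hypothetical: 1950 MHz at 1.05 V vs. 1830 MHz at 0.85 V.
ratio = relative_dynamic_power(1950, 1.05, 1830, 0.85)
print(f"1950 MHz draws ~{ratio:.2f}x the dynamic power of 1830 MHz")  # ~1.63x
```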
 

Nintendork

Distinguished
Dec 22, 2008
AMD should focus more on marketing the efficiency: when you're not running at max frequencies, Vega really pushes the fps/W.

A -25% power target (160 W) for nearly 90% of the full-power performance is really great.
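Taking those figures at face value, the perf/W math works out like this (the full-power wattage is inferred from the quoted -25% / 160 W figure, not measured):

```python
# Quick check of the claim above: ~75% of the power for ~90% of the
# performance. Full power is back-calculated from the -25% / 160 W figure.

reduced_power_w = 160.0
full_power_w = reduced_power_w / 0.75  # ~213 W implied
relative_perf = 0.90

gain = relative_perf / (reduced_power_w / full_power_w)
print(f"~{gain:.2f}x perf/W at the reduced power target")  # ~1.20x
```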
 

80-watt Hamster

Honorable
Oct 9, 2014


True, but at that point it'll be slightly slower on average than a 1070 (if I'm interpreting the charts correctly), and still isn't better in perf/watt. "Almost as fast for the same money" is a tough proposition to sell.
 


I would assume that, since this looks to be the Fury replacement, it should be able to be built with custom cooling. I don't see why AMD would block that for this card, or for any card, really, as I'm sure AIB water coolers could easily be better.

That said, I imagine one limit will be the process. It has already been shown that the 14nm node they're using is not as good power-wise as TSMC's 16nm node that Nvidia is using.



Or because both of those GPUs are in higher-end markets that AMD is not challenging yet, if they will even have anything to compete there.

My biggest question is how long these GPUs will be viable at these prices, mining aside, considering Nvidia is probably planning a Volta launch soon. Rumors say early 2018, and if they're correct, Volta should be a decent bump over Pascal. I do find it sad that AMD's GPUs seem to be screwed price-wise thanks to mining.

Guess we shall see.
 

linford585

Distinguished
Oct 7, 2008


Hmm? Is this supposed to be a troll post? I tend to lean AMD for my own purchases because I don't like many of Nvidia's business practices (I actually just bought a Vega 56), but I don't see any reason to say 'AMD ... [has] rightfully taken the performance crown!'.

1080 Ti, Titan X (Pascal), and Titan Xp have no real competitor at the moment.
1080 and Vega 64 are in close competition overall if you don't care or care little about power usage, otherwise the 1080 is the clear winner.
Vega 56 is mostly a win against the 1070, so long as it holds at or near MSRP. Potentially a loss for those who care a great deal about power usage.

Combine that with good professional performance, and I'd say Vega is a good entry into the target markets, but it certainly hasn't taken any kind of crown. And it's late.
 

Bohefus

Prominent
May 19, 2017
Yay!! A new AMD video card!! Too bad you can't find them anywhere... Too bad it consumes vast amounts of power... Too bad they don't compete with the 1080 Ti...
The only things going for Radeon these days are FreeSync and mining. Not really a viable option for gamers when the cards are out of stock in 5 minutes. I'll just go for a 1080 Ti and forgo variable refresh rate.
 


Yes. It tends to be a pattern around here: someone creates a new account on Tom's Hardware to complain about its bias against AMD whenever Tom's has the audacity to actually mention any drawbacks or flaws in an AMD product it reviews. Oh, the humanity! And it's good odds that the individual(s) is a Tom's regular who didn't want to post his true feelings under his regular account.
 