AMD Radeon RX Vega 56 8GB Review


king3pj

Distinguished


This card isn't meant to compete with the $750 1080 Ti. That's why it has an MSRP of $400. At that price it's decent competition for the 1070. Until mining dies down, though, neither the 1070 nor the Vega 56 is going to be available at a decent price.
 

pepar0

Distinguished
Expensive nVidia cards cause me to be satisfied with my R9 290. Unfortunately, so do hot, hungry, noisy, and expensive new AMDs. I'm really sorry about the latter, as I was hoping this Vega release would be my upgrade point to match my relatively new 1440p display.
 

bit_user

Polypheme
Ambassador

One thing that might be a deciding factor for someone on the fence between Vega 56 and the GTX 1070 is that Vega could improve with better drivers and more optimized games (e.g., by using fp16/half-precision and primitive shaders), whereas if you go with Nvidia's Pascal cards, it's unlikely you'll encounter any such pleasant surprises at this point.
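For anyone wondering why fp16 matters, here's a minimal NumPy sketch (illustrative only; real rapid packed math lives in shader code, not Python) of the size/precision tradeoff that lets fp16-capable hardware double its throughput:

```python
# Half-precision values take half the bytes of fp32, so hardware with packed
# math can process two per 32-bit lane, trading away some precision that is
# usually acceptable for 0..1 color math. Illustrative sketch only.
import numpy as np

colors32 = np.linspace(0.0, 1.0, 5, dtype=np.float32)  # typical shader values
colors16 = colors32.astype(np.float16)                  # same values in half precision

print(colors32.itemsize, colors16.itemsize)             # 4 bytes vs. 2 bytes per value
print(np.abs(colors32 - colors16.astype(np.float32)).max())  # worst-case rounding error
```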


As @king3pj said, why are you even considering Vega 56 (or 64), if you can afford a 1080 Ti? That's a very easy decision, if all you're using it for is gaming.
 

rwinches

Distinguished
Including the 1080 Ti in the testing and graphs really hurts the value of this review. You show results where only the Ti is playable; how is that actionable? If the Ti were not included, the reader could see how the settings need to be adjusted at 1440p or 4K to achieve smooth gameplay, providing a real-world takeaway. I want to know how to get the most out of a product I would consider buying, and for me a Ti would not be on the list. I would of course make different choices building for someone else or for a business workstation-class system.

Tom's, as well as many other reviewers, seems obsessed with using max settings and with adjectives demeaning the use of any lower or alternative settings. I buy what I can afford, and like everyone else I make do with the system I have. I build my own systems to maximize my dollars, which might include used parts or waiting until the 'next new thing' is introduced and getting the previous 'next new thing'. Does this mean I don't qualify as an enthusiast?

Finally, buyers have a choice, and they would not be disappointed with the products AMD offers now. Real competition has forced Intel to drop prices and add cores and a mesh interconnect, though still at a premium, and even new LGA 1151 offerings now need to be 1151v2, but they're still solid. nVidia is not sitting still either; they too panicked and chopped ~$350 off their Titan Xp to maintain their lead. AMD has done an excellent job with its comparatively limited resources and managed to bring everyone a better gaming experience, spearheading closer-to-the-metal offerings like Mantle, Vulkan, and DX12, plus FreeSync/FreeSync 2 monitors, all with no direct financial gain. Console gaming has improved greatly and continues to improve using AMD parts.

I still fire up my DFI dual P4 550MHz system once in a while and marvel at its coolness, and playing earlier games is still fun.
 

bit_user

Polypheme
Ambassador

I consider Polaris to be an actual gaming card, don't you?

And we don't know what Vega 11 will look like (do we?), although it's safe to assume it'll be no more efficient than Vega 56.

As for the high-end, I agree that it'll take time for RTG (Radeon Technology Group) to course-correct and focus more on power efficiency.
 
I expect Vega 11 will come in the form of a 590 and/or 590X, though AMD has streamlined the product stack so far with no "X" models. I'm not exactly sure where in the product stack it would sit, considering the 580 matches or beats the 1060, so if Vega 11 comes in between the 1060 and the 1070, there is no competition for it. It would dominate the $300-$350 price range.

Considering it's a smaller die, I have to assume it will be somewhat more power efficient, though there is no telling by how much.
 

To be fair, it's been difficult to find them at anywhere close to retail price in recent months due to the mining mess, so they might as well not exist as far as gaming cards are concerned. Lately, they could be considered mining cards more than anything.


Nvidia's cards around that price range have been out for over a year now, so they're due to be replaced by something new before too long. Even if it were just a refresh of the 10-series hardware, I would suspect that the 1060's successor might be updated to outperform the RX 580 at least.

Of course, even if AMD launches something with performance between a 1060 and a 1070, if it's anything like their other cards it should perform well at cryptocurrency mining, so if that's still a thing at that point, its price might get driven up to 1070 levels or higher, where it wouldn't be dominating much at all from a gaming perspective.

This graphics card shortage could hurt AMD's install base more among gaming systems in the long run. Just looking at the first graph on the latest Steam Hardware Survey, AMD's share of the market appears to be shrinking quite a bit lately, at least on systems used for gaming. Between AMD, Nvidia and Intel, AMD's graphics hardware was in close to 25% of these systems a year ago, but they are now down to just 18.6%. Over the same period, Nvidia's share of the market increased from 57.6% to 67.6%. When you factor in that some of those AMD chipsets are likely integrated graphics on their APUs, Nvidia has probably been outselling them on dedicated hardware at least four to one lately, and I'm sure those numbers have been even worse the last few months.
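To put rough numbers on that estimate (a quick sketch; the APU fraction is a pure assumption, not survey data):

```python
# Back-of-the-envelope check of the "four to one" estimate, using the Steam
# survey figures cited above. The APU fraction is assumed for illustration.
amd_then, amd_now = 25.0, 18.6   # AMD share, a year ago vs. now (percent)
nv_then,  nv_now  = 57.6, 67.6   # Nvidia share over the same period

print(f"AMD:    {amd_then}% -> {amd_now}%  ({amd_now - amd_then:+.1f} pts)")
print(f"Nvidia: {nv_then}% -> {nv_now}%  ({nv_now - nv_then:+.1f} pts)")

# If roughly a quarter of AMD's current share were APU integrated graphics
# (assumed), the dedicated-hardware ratio comes out well over four to one:
amd_dedicated = amd_now * 0.75
print(f"Nvidia : AMD dedicated ~= {nv_now / amd_dedicated:.1f} : 1")  # ~4.8 : 1
```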

AMD might not have trouble selling their cards to miners, but miners don't exactly care about brand loyalty and won't help further AMD's ecosystem. FreeSync might be a great alternative to G-Sync, but if AMD doesn't have graphics cards available at reasonable prices, their potential customers will be more likely to go with Nvidia, and in turn a G-Sync monitor, and once they've bought into one of those, they'll be more prone to sticking with them in the future. A smaller install base also means developers will care less about optimizing for AMD's cards. Considering how volatile the mining market can be, it's probably better to focus on maintaining a dedicated userbase more than anything.
 
Sadly, AMD can't completely control the prices and there really isn't much they can do to stop the miners from eating up the supply.

True, it's not a good thing that they're losing gaming market share, but they are turning a healthy profit from the miners. The mining sales are a double-edged sword: when things settle down, the miners will flood the used market with these cards, and that will likely hurt sales for both AMD and nVidia. AMD makes great money now but will pay for it later.

The G-Sync and FreeSync issue you mention is a real problem. This is why I refuse to go with either technology. I don't like vendor lock-in, and I will run with regular monitors until there is a single, open standard... which won't happen until nVidia's G-Sync sales drop.

AMD could release the cards in the $300-$350 range and own that price segment, but there's nothing they can do if vendors increase the prices. Even at those prices, miners would likely buy them, as they continue to do with the 580s, which will lead to the issue discussed above.

AMD is in a tight spot: they need the cash from all the sales, regardless of who buys them, but they need non-miners to be buying them. If they find a way to gimp mining on their cards, the sales will just go to nVidia, and even after prices settle down for AMD, there likely wouldn't be enough regular sales to offset the lost miner sales.

I have been saving for a new system, but I'm going to wait for the Ethereum craze to settle and for Ryzen 2, to see what I can afford. The GPU market is insane right now.
 


According to Fudzilla, AMD could be losing as much as $100 per Vega 64 that they sell at $499, and the 56 shares most of the same components. Most analysts are estimating the cost of 8 GB of HBM2 alone to be ~$150, 3x the price of GDDR5. It doesn't look like AMD has any leeway at all in pricing, except to rely on somewhat questionable bundling deals to try to turn a profit.
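To put the cited figures together (a back-of-the-envelope sketch; all numbers are the estimates quoted above, not confirmed BOM data):

```python
# Rough cost sketch from Fudzilla's claimed loss and the analyst HBM2 pricing.
msrp_vega64   = 499
loss_per_card = 100               # Fudzilla's claimed loss at MSRP
hbm2_8gb      = 150               # analyst estimate for 8 GB of HBM2
gddr5_8gb     = hbm2_8gb / 3      # "3x the price of GDDR5"

implied_cost   = msrp_vega64 + loss_per_card
memory_premium = hbm2_8gb - gddr5_8gb

print(f"Implied cost per card:   ~${implied_cost}")
print(f"HBM2 premium over GDDR5: ~${memory_premium:.0f}")
```

Notably, under those estimates the HBM2 premium over GDDR5 alone accounts for roughly the entire claimed per-card loss.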
 

bit_user

Polypheme
Ambassador

Ha! FreeSync is an open standard. Monitors & GPUs can support it royalty-free. Intel has signaled their intent to support it, but I'm not sure of their current status.

So, FreeSync is only pseudo-proprietary, in the sense that Nvidia has so far refused to support it, limiting GPU selection to a single vendor.

BTW, Xbox One X is slated to support FreeSync over HDMI.
 


While I don't have information that lets me say whether what Fudzilla is reporting is true or false, I tend to be very suspicious of reports that come from only a single place. All the other related articles are sourcing Fudzilla, so there is no confirmation at this point.

I do hope this isn't true. However, Microsoft (IIRC) took a loss on the XBone at the beginning to gain market share; once manufacturing processes matured, they were able to turn a profit. So this isn't unheard of. As the other poster pointed out, maybe AMD is trading that $100 for market share and gaining some vendor lock-in with FreeSync purchases.
 


Adaptive-Sync is the open VESA standard that FreeSync is based on; FreeSync is AMD's implementation of it. But yes, nVidia could easily support it. They just don't want to gut their G-Sync profits, which is exactly what it would do. If nVidia supported FreeSync or Adaptive-Sync, then I would look at the technology again. Until then... not a chance.

Xbox and PlayStation support for FreeSync is a no-brainer, as they both use AMD parts. Though I haven't heard whether PlayStation will ever support it.
 

bit_user

Polypheme
Ambassador

I thought of pointing that out, but what really matters is that it's helping the feature go mainstream. If it's literally ubiquitous, then Nvidia might eventually have to add some measure of support for it.
 


GamersNexus did an independent article about HBM2 vs. GDDR5 with cost estimates for the memory system and interposer:

http://www.gamersnexus.net/guides/3032-vega-56-cost-of-hbm2-and-necessity-to-use-it

Buying market share doesn't make a lot of sense if miners are snapping up the cards, as they are unlikely to be buying FreeSync monitors, bundle or no. Consoles are a little different, since they can rely on exclusive titles to recoup losses. And the Xbox write-downs were made a lot worse by the infamous RRoD (Red Ring of Death), which wound up costing them over $1 billion, though it didn't really hurt them since they had money to burn.
 
Really?

Overclocking ability of Vega 56 = 0%
Overclocking ability of 1070 = 17% or more

With the OC factored in, does Vega 56 still consistently match or beat it?
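To put numbers on that question (a quick sketch using the figures above plus a hypothetical 10% stock lead for Vega):

```python
# The 0% / 17% figures are the claims above, not measured data, and the
# stock lead is assumed purely for illustration.
vega56_scale  = 1.00   # claimed: no overclocking headroom
gtx1070_scale = 1.17   # claimed: ~17% headroom

stock_lead = 1.10      # hypothetical: Vega 56 ahead by 10% at stock
print(f"Vega 56 vs. OC'd 1070: {stock_lead * vega56_scale / gtx1070_scale:.2f}x")
# ~0.94x, i.e. under these assumptions the overclocked 1070 pulls ahead
```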

And please, in the future, could we not use an nVidia card for comparison that's essentially a reference PCB with a nice cooler?
 


If you're comfortable with potentially igniting your card, there's already a workaround for that:

http://www.gamersnexus.net/guides/3040-amd-vega-56-hybrid-results-1742mhz-400w-power

I'm mostly joking about the igniting, but technically you are disabling safety features that would otherwise prevent you from sending 200% power through your board with a simple typo.
 

bit_user

Polypheme
Ambassador

What's wrong with comparing a reference board with a reference board?

Another way of looking at it would be to compare products at the same price, except that pricing is so distorted right now...
 