VisionTek’s Radeon RX Vega 64 Graphics Card Is Now Available

Status
Not open for further replies.

zippyzion

Distinguished
Jan 23, 2012
Overclocking, overclocking, overclocking... that's all I hear anymore. I was overclocking 486s and starting minor case fires with original Pentiums before most people knew what overclocking even was. These days the vast majority of people are much happier with stability and warranties. The fact is that in the configuration most people will actually run it in, the Vega 64 matches the 1080 in gaming performance and exceeds it in compute and other productivity work. If the 64 is a pig, then it has been slaughtered and BBQ'd into a delicious sandwich we can all enjoy.

As it is, the 56 exceeds the 1070 in almost all metrics. The 56 REALLY IS a much better proposition and doesn't use much more power. Most people could pop a 1070 out of their system, pop a Vega 56 in, gain a few frames, and not notice a difference, even on their monthly power bill; you'd have to look at annual usage for it to even register. No power supply upgrades, no cooling upgrades, just card and drivers. On top of that, the reference 56 runs cooler than the 1070 DESPITE drawing more power, which to my mind means one inescapable thing, as mandated by the laws of thermodynamics: it is doing more work (after all, energy in must equal energy out, and if it isn't making heat then it is making frames), all while using fewer cycles to do it!
 


The biggest metric the 1070 currently wins at is pricing and availability, despite the mining markup. While it would be great if Vega 56 launches at MSRP in quantity, I'd estimate the odds of that occurring at 0%. If it follows the pattern of the Vega 64, the comparisons won't be favorable.

And I think you're mistaken about the amount of heat being produced. When you say Vega runs cooler, that's a temperature measurement, not an energy measurement. Even though its recorded temperature may be lower, Vega 56 is a 484 mm² chip, whereas the 1070 is a 314 mm² chip. Vega may run cooler, but there's over 50% more of it sitting at that lower temperature. Think of a gallon of water at 50 °C vs. 1.5 gallons of water at 40 °C: the one gallon is hotter, but the cooler 1.5 gallons contains more total energy.
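A quick back-of-envelope sketch of that analogy, using Q = m·c·ΔT. The gallon mass, specific heat constant, and the 0 °C reference are my own illustrative choices, not figures from the thread:

```python
# Rough thermal-energy comparison for the water analogy above.
# All constants are illustrative; energy is measured from a 0 C reference.
GALLON_KG = 3.785   # approximate mass of one US gallon of water, kg
C_WATER = 4186      # specific heat of water, J/(kg*K)

def stored_heat(gallons, temp_c, ref_c=0.0):
    """Thermal energy (J) held above the reference temperature: Q = m*c*dT."""
    return gallons * GALLON_KG * C_WATER * (temp_c - ref_c)

q_hot = stored_heat(1.0, 50)   # one gallon at 50 C: hotter, smaller volume
q_cool = stored_heat(1.5, 40)  # 1.5 gallons at 40 C: cooler, larger volume
print(q_cool > q_hot)          # the cooler, larger volume stores more energy
```

With these numbers the cooler 1.5 gallons holds 1.2x the energy of the hotter single gallon (1.5 × 40 vs. 1.0 × 50 gallon-degrees), which is the point of the analogy.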
 


Well, you are hearing about it for a reason. The fact is a lot of PC gamers like to tinker with their hardware and squeeze out every bit of performance to maximize their dollars spent. Hardware enthusiast websites like Tom's exist because PC enthusiasts are tinkerers. Anyone can flip through some benchmarks, go out and buy a video card based on what they see, throw it in their OEM PC as an upgrade, and forget about it.

And again, a factory overclocked 1080 is faster than the RX 64 and still consumes less power. That's a high price to pay. Funny how all the bragging about Ryzen being superior to Intel in per-core power efficiency suddenly doesn't matter when it comes to GPUs (and let's keep in mind a video card uses more power than a CPU and motherboard combined). Techpowerup pretty much summed it up:

https://www.techpowerup.com/reviews/AMD/Radeon_RX_Vega_64/32.html
https://www.techpowerup.com/reviews/AMD/Radeon_RX_Vega_64/33.html

And based on this leaked test of an ASUS Strix variant of the RX 64, there's not much improvement in performance over the reference:

https://www.computerbase.de/2017-08/asus-radeon-rx-vega-64-vorserie-test/2/

I do not recommend the RX 64 for people looking to upgrade in GTX 1080 territory, and I will not back down from calling AMD out on a disappointing answer to the GTX 1080, especially after a year's wait. Yeah, that's another matter that keeps getting overlooked: the GTX 1080 has been out for over a year now.

At least the Fury X answer to the 980 Ti came out only two months later. The only time an RX 64 makes more sense than a GTX 1080 (or a closely priced 1080 Ti) is if a buyer does not already own a Freesync monitor and wants one. That's it. Having to buy a monitor to justify buying the RX 64 over the GTX 1080 is a tough sell.



That's a mighty expensive BBQ sandwich. In more ways than one.



I never argued against the RX 56 on anything other than power consumption. It's a faster GPU across the board than a reference GTX 1070. But it was also released at a higher retail price point than the 1070 (of course none of that matters now). I will say, though, that your definition of "not much more" power consumption is different from mine. I consider a 56% higher power draw pretty significant:

https://tpucdn.com/reviews/AMD/Radeon_RX_Vega_56/images/power_peak.png

This article is about the RX 64, which should have been released at a lower price point than the 1080. Actual selling prices are another matter entirely and with the 1080 vs RX 64 actual selling prices right now, it's not even close. Two direct comparisons of the same vendor selling each GPU:

RX 56 - https://www.newegg.com/Product/Product.aspx?Item=N82E16814137226
GTX 1080 - https://www.newegg.com/Product/Product.aspx?Item=9SIA0ZX5RK8375
 

TJ Hooker

Titan
Ambassador

Umm, no, for a couple reasons.
First off, you seem to think that GPUs convert some of their consumed power to heat and some to "work". That is incorrect, they convert all their power to heat, and work is performed during this process.
Secondly, the GPU temp depends on both power and cooling solution. A 1070 and Vega 56 obviously use different coolers, so you can't make any direct comparisons of power consumption just based on temperature.
 


Unfortunately, I have to agree. The 1070 will more than likely win by default due to availability and demand. Even Nvidia can and has suffered from that. Also, as you noted, mining operators will be fond of it, enough to increase demand and cause a bump in price: your mining markup.


(This last part isn't specifically aimed at you, 10tacle)

How hot a chip runs depends on several factors: internal design, surface area, and operational use, on top of cooling. We see it even within the same chip families, where one GPU with cooler A may run at 68 °C while another with cooler B (a less efficient cooler) may run at 90 °C. Same power draw, same operating speed and conditions, other than the cooling solution. The real test of which runs hotter would be to run a 1070 with the exact same cooling solution as a Vega 56, so the cooling system removes heat from both equally. That would tell us which one really runs hotter. Period.

The point is, a super hot chip with the right cooling can run cooler than a moderately hot chip with inferior cooling. A real-world extreme example: a 10 MHz 8088 CPU vs. an i9 under liquid nitrogen. The i9 runs cooler, in this case, than the 8088. Remove that liquid nitrogen cooling from the i9 and disable its ability to throttle and shut down, and the i9 will go up in smoke while the 8088 chugs along without active or passive cooling (outside of its own packaging).

This isn't to knock the water-volume-vs-temperature analogy, but an attempt to enhance it while providing another angle.
 

zippyzion

Distinguished
Jan 23, 2012
AnandTech disagrees with your 56% more power draw comment... in fact they disagree with a lot of what has been said. In a direct comparison in the same tests, the 56 doesn't use anywhere near 56% more. It isn't even 30% more. It's around 27%, which isn't anything to sneeze at but is a far cry from 56%. Just what "games" were they testing?

http://www.anandtech.com/show/11717/the-amd-radeon-rx-vega-64-and-56-review/19

As for the Ryzen comment, they aren't even related, for the most part. Until they got Raven Ridge onto the same die, they were parallel projects with no overlap, so that comment is neither here nor there. Besides, the Vega-based APU parts will be hard pressed to compare to the 56 or 64 in any meaningful way. I'm going to guess they won't have on-chip HBM2 memory either. I expect them to have much more in common with Polaris graphics.

As for the pricing, you and I both know that the market is out to lunch right now, especially where AMD cards are concerned. People have it in their minds that AMD cards are where it's at for mining, and despite the reported low hash rates, I imagine miners see the potential in the Vega cards and have gobbled them up. Mostly I base that on not seeing a lot of gamers talking about their brand new Vega cards in forums. So demand is high and prices are higher. Otherwise an RX 480 wouldn't even be in the same pricing category as Vega.

https://www.newegg.com/Product/Product.aspx?Item=9SIA6V66483925&cm_re=RX_580-_-14-137-117-_-Product

Also, your links are kinda wonky. You labeled the overpriced 64 as the 56 and linked to an out-of-stock 1080, so I can't see the price.

And overclocking: the vast majority of people still won't be doing it. Heck, I've done it for years, have a self-built gaming rig, and I don't even overclock. Most of the gamers I know don't overclock. It's just a hardware elitist thing, the need to feel that people are below them. Stock settings with reasonable hardware will run things just fine for just about everyone. Unless you are trying to push a 240 Hz display, there really isn't a point.
 

TJ Hooker

Titan
Ambassador

Eh, sort of. Yes, a larger die has a larger thermal mass, but it's still quite small, and probably negligible compared to the thermal mass of the cooler. Also, if you're in steady state (i.e. sustained load), the thermal mass of the chip is irrelevant. The benefit of a larger die with respect to cooling is not that the chip can hold more heat, but that the chip has a larger surface area in contact with the cooler, which allows for greater heat flow.
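That contact-area point can be sketched with a toy steady-state model, ΔT = P / (h·A). The transfer coefficient h and the 150 W load are made-up illustrative numbers; only the die areas come from the thread:

```python
# Toy steady-state cooling model: temperature rise above coolant is
# dT = P / (h * A), where h is a made-up effective transfer coefficient.
def temp_rise(power_w, area_mm2, h=0.01):
    """Temperature rise (K) for power_w watts through area_mm2 of die
    contact area, with h in W/(mm^2*K). Illustrative only."""
    return power_w / (h * area_mm2)

# Same hypothetical 150 W load through the two die areas quoted earlier:
small_die = temp_rise(150, 314)  # GP104-sized die
large_die = temp_rise(150, 484)  # Vega 10-sized die
print(small_die > large_die)     # larger contact area -> lower rise
```

Under equal power and an equal cooler, the larger die sits at a lower temperature purely because the same heat flows through more area.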
 

TJ Hooker

Titan
Ambassador

Anandtech measures total system power consumption, which includes some fixed power draw irrespective of the graphics card, and that skews any relative power consumption calculations. E.g., if you assume the rest of the system draws ~100 W for both the 1070 and Vega 56 rigs, then the 56 would be drawing 41% more than the 1070 based on Anandtech's numbers.
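The arithmetic behind that baseline adjustment, with hypothetical wall-socket readings chosen only to show how a system-level delta understates the GPU-only delta (the 100 W baseline is the assumption from the post above, not a measured figure):

```python
# Why total-system power readings compress GPU-vs-GPU percentage deltas.
# The 300 W / 382 W wall readings are hypothetical illustration values.
BASELINE_W = 100  # assumed non-GPU draw, identical in both test rigs

def gpu_only_ratio(system_a_w, system_b_w, baseline_w=BASELINE_W):
    """Relative GPU draw of card B vs card A after removing the baseline."""
    return (system_b_w - baseline_w) / (system_a_w - baseline_w)

naive_delta = (382 - 300) / 300           # system-level delta: ~27% "more"
gpu_delta = gpu_only_ratio(300, 382) - 1  # GPU-only delta: ~41% more
print(naive_delta < gpu_delta)
```

The same 82 W gap looks like ~27% at the wall but ~41% once the shared 100 W baseline is stripped out, which is why GPU-only measurements and whole-system measurements disagree on the percentage.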
 


Thank you. That's exactly why I chose Techpowerup's numbers: they report only the GPU, not the entire system.
 


I'm merely making the observation that it's amazing watching concern over power draw on an Intel CPU vs. an AMD CPU suddenly stop being an issue when it becomes Nvidia vs. AMD on a GPU.



Yep. Which is why I brought it up. Twice. But the point is that there are 1080s in stock in the $520-$550 price range; neither stock nor price is the case with the RX 64. Will that change any time soon? Stock levels might rise, but I doubt prices will drop. Then there's the leak that AMD's own price structure for the RX 64 may have been "introductory" only.



Then between my posting and your reading, someone snapped up that MSI 1080. It was in stock for $619. Newegg is consistent with maintaining their pricing, whether in or out of stock, which is exactly why I used them as an example. The prices listed are what the cards last sold for before going out of stock.



I'm sure that's true, but most car owners don't tinker with their cars either. That doesn't mean there's no market for aftermarket performance parts, right? It's a niche market, but that doesn't take away from the fact that you can get more performance for the dollar by hopping something up.



That's a rather broad (and unfounded) assumption about the mentality of people who overclock. I couldn't disagree more. If I'm running a 60 Hz 4K monitor with a stock 1080 hitting 54 FPS in Witcher 3, I want that overclock to get that 60 FPS/60 Hz marriage. Just to put it into perspective, using a Guru3D test of an ASUS Strix OC GTX 1080 with Hitman at 1440p:

Reference GTX 1080 - 103 FPS
Factory overclocked Strix - 114 FPS
Manually overclocked Strix - 124 FPS

That's a big jump, especially if running a 144 Hz G-Sync monitor, and especially when Guru3D's RX 64 sample got 107.

http://www.guru3d.com/articles_pages/asus_geforce_gtx_1080_strix_oc_11_gbps_review,38.html
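For perspective, here are the percentage uplifts implied by those Guru3D numbers (the frame rates are from the review quoted above; the helper function is just illustration):

```python
# Percentage uplifts from the Hitman 1440p numbers quoted above (FPS).
reference_1080, factory_oc, manual_oc, vega_64 = 103, 114, 124, 107

def uplift_pct(base_fps, oc_fps):
    """Percent gain of oc_fps over base_fps."""
    return (oc_fps - base_fps) / base_fps * 100

print(round(uplift_pct(reference_1080, factory_oc), 1))  # factory OC gain
print(round(uplift_pct(reference_1080, manual_oc), 1))   # manual OC gain
print(round(uplift_pct(vega_64, manual_oc), 1))          # OC'd 1080 vs Vega 64
```

A manual overclock is worth roughly a fifth more frames over the reference card here, and the overclocked 1080 leads the Vega 64 sample by about 16%.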

 


You and shrapnel_indie are both right. And I muddled two explanations together without thinking them through. Let me try again.

The original point (now conceded) was the idea that because Vega 56's reported temperatures were lower than a stock 1070's, it therefore produced less heat. I tried to point out that the given temperature was not indicative of the total heat being produced, and of course mistakenly conflated that with temperature times area without factoring in heat dissipation via the cooler. Obviously, a superior cooling solution will also result in lower temperatures, while the actual amount of heat being produced is unchanged.

The next thing I overlooked (but I'm in good company here) is that the chip itself only accounts for a portion of the total power being consumed. As shown in the thermal imagery, VRMs, MOSFETs, RAM, etc. are all using their share of power and producing heat. As William Shatner kindly pointed out, the amount of power dictates the amount of heat, whether that heat arrives via thermal transfer, photons (which themselves carry energy, and thus heat), kinetic energy, or otherwise.

Consider the horse flogged, but, couldn't help myself.

Anyway, if you're committed to the Freesync ecosystem, you don't have any other options; of course, the same can be said of G-Sync. And if price is the main determining factor, any markup on Vega will eat into the Freesync savings. Vega 56 plus a Freesync monitor might wind up being the overall value champion here, but only if AMD can meet demand at the announced price. The best news for AMD is that the G-Sync premium hasn't budged so far, and miners don't care about FPS.
 

DerekA_C

Prominent
Mar 1, 2017
I'm sorry, but an EVGA 1080 Ti Hybrid is $795 currently on Amazon, and they have the FTW3 for $850 with a guaranteed 2 GHz overclock. At stock it already runs cooler than anything AMD offers and gives you a solid 60 FPS at 4K, with the ability to grow into future games. Nvidia is set for at least the next year with all these cards that can push to 2 GHz: Asus has one, MSI has one, Gigabyte has one. Are all these companies building in that much extra headroom wrong? I think they knew, just like Nvidia knew, that AMD wouldn't be able to compete with the 1080 Ti, which is why they released it; they knew AMD could only keep up with the 1080. Someone is on the inside, clearly. Now Nvidia can relax and profit as much as possible from the current generation, with several price drops along the way, before releasing Volta in a year. They literally could release it today; it has been finished for over two months. They just have to cut it down and optimize its drivers, which they usually do a few months after releasing the enterprise boards. But why should they, when they are still 35% faster than the fastest Vega?
 