AMD Radeon VII 16GB Review: A Surprise Attack on GeForce RTX 2080


nobspls

Reputable
Mar 14, 2018


From Tom's:
https://www.tomshardware.com/reviews/amd-radeon-vii-vega-20-7nm,5977-8.html
"The company says it controls throttling and fan control on Vega 20 through a network of 64 sensors that create a junction temperature, rather than the edge temperature reported by a single sensor that was used previously. As a result of this more informed reading, the company can be extra aggressive about extracting maximum performance from Vega 20 without exceeding the GPU’s limits. In thermally-constrained scenarios, Radeon VII should run a couple of percent faster with its clock rates modulated according to junction temperature. But because AMD is pushing this GPU hard at the expense of power and heat, there’s no real way to tame the unrefined cooler without giving up peak performance."

What overclocking headroom do you think this abomination can even really have? It is already too hot and too loud. Get a custom $300 water cooling kit for this thing and see if it still has room to OC? You'd be lucky to squeeze out another 5%.
 


I hate how overclocking has gone from an option to an expectation. It wasn't long ago that most people, even computer people, had no idea what overclocking was. You ran the hardware the way it was designed... and it performed as advertised. That was all there was to it. Now everyone expects to just be able to get 20% more out of a CPU or GPU. People are even paying more for hardware that has been tested and proven to overclock better. This is getting absurd.

In many cases overclocking just isn't worth it... and manufacturers know it. Factory-overclocked models are only ever a little better. What is the point? A few more FPS? A higher number in a benchmark? Do you even SEE, and I mean REALLY SEE, a difference in your games? Very rarely... especially on the hardware people buy to overclock. On really low-end hardware it can sometimes be the difference between an unplayable 25 FPS and a more playable 35 FPS... but on a high-end card that will give you well over 60 FPS on literally anything, it's utterly pointless.

Usually the people talking about overclocking don't even overclock. I used to overclock myself, but I gave it up; the performance gain was never enough to warrant the potential damage to the hardware. Sure, it is cool to see people push their stuff to the bleeding edge... I enjoy watching that, but I'm CERTAINLY not trying it. I don't have the cash to replace a $700 graphics card if I somehow kill it. Most people don't. Most people won't even risk it. Overclocking is not a feature most people look for, so it is rather unfair to judge a product on whether or not it overclocks well.

Man... people are going to hate me for that opinion

Vega's performance is fine as it is. It is an option in a space where we didn't have one, and in some titles it is better, in others it isn't... but in none of them is it unplayable. You get a versatile card with a LOT of VRAM for gaming. Also, the joy of GCN architectures is that AMD can keep refining them through drivers and software. I'd be interested in seeing a comparison in a year to see where the VII stands. Heck, I'd like to see a comparison from launch day to now on the Vega 56 and 64.
 

King_V

Illustrious
Ambassador
I'm with you on that. I see a lot of posts from people asking how to overclock, or how to keep their CPU or GPU running at its highest speed all the time, with no explanation as to why they're doing it.

I sometimes want to scream when I see posts like that: "What problem are you trying to solve? Why do you want to run your CPU at its boost speed even when you're idling on the desktop? Why are you overclocking your GPU when you haven't mentioned anything about poor performance?"
 

Jim90

Distinguished
Radeon VII = simple die shrink of the EXISTING Vega 64.
Radeon VII = DEMO ONLY of the 7nm shrink process... which COMPETES with Nvidia's second most powerful gaming graphics card.

...DEMO!!!
...NOT, and never was, intended for mass-market sales.

Navi = NEW architecture. Expect significant competition for Nvidia in the low-to-mid range, but EXPECT significant competition at the HIGH END as well.

AMD intended Radeon VII as a working demo ONLY.
 
Feb 9, 2019
Try to undervolt that card. See the chart below:

[Attached chart: spotreba-1.jpg]
That's the power consumption after undervolting the Radeon VII.
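For what it's worth, there's a simple reason undervolting pays off so well on a chip pushed this hard: the dynamic power of CMOS logic scales roughly with the square of the supply voltage. A back-of-the-envelope illustration, using made-up voltages rather than anything measured on this card:

```latex
P \approx C V^{2} f
\quad\Rightarrow\quad
\frac{P_{\text{new}}}{P_{\text{old}}}
  \approx \left(\frac{V_{\text{new}}}{V_{\text{old}}}\right)^{2}
  = \left(\frac{1.00\,\text{V}}{1.10\,\text{V}}\right)^{2}
  \approx 0.83
```

So a roughly 9% undervolt at the same clock could trim dynamic power by around 17%, before even counting the reduced leakage at lower temperatures.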
 

cangelini

Contributing Editor
Editor
Jul 4, 2008


Yup, thanks!
 

I wouldn't say they do "nothing". Air is still going to be entering or exiting the case through them due to pressure, which should help reduce buildup of warm air around the card, even if the card itself isn't actively forcing air through them. For a card that produces as much heat as this one, any additional air circulating around the card undoubtedly helps.


It's probably at least somewhat a reflection of your math being off. : P The 2080 draws around 220 watts under load, which would place the Radeon VII's power consumption at around 35% higher. Aside from that, it's probably due to the fact that this GPU actually only has a core count that's in between Vega 56 and 64, and AMD overclocked it to its limits to get that extra performance. This is ultimately just Vega on a smaller process, or in other words, repurposed professional compute hardware.

Some of its features, particularly the 16GB of HBM2, don't really benefit gaming performance much, but are very expensive, so they can't sell the card for less than the competition. With a similar-performing chip built to utilize 8GB of GDDR6, they could have likely sold it for around the price of a 2070, where its performance could be a lot more compelling, and they wouldn't have had to push the card to its limits, keeping the heat and noise more reasonable. Vega was built around expensive HBM, though, so they're stuck with that, at least until their next generation of high-end hardware is ready. Navi will likely fare better, but that will probably only cover the mid-range, while a Radeon VII successor is likely at least a year away.
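As a quick sanity check on that 35% figure, take the roughly 300 W board power commonly cited for the Radeon VII (both numbers are round approximations, not this review's exact measurements):

```latex
\frac{P_{\text{Radeon VII}}}{P_{\text{RTX 2080}}}
  \approx \frac{300\,\text{W}}{220\,\text{W}}
  \approx 1.36
```

That is, about 35% more power for broadly similar performance.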


Unless the thing fails shortly thereafter, isn't covered by warranty, and you're out $500. : P I would personally want a good 3-year warranty on a card in that price range. Even a new 2070 for $500 would arguably be preferable, even if it doesn't perform quite as well.


I suspect that information was likely reasonably accurate, though it's possible that some things could have changed since then. Maybe the card was originally slated for release somewhat earlier, when it might have cost a bit more to produce, or maybe they originally planned to give it a liquid cooler, which would have added to the price. I'm sure their margins must be quite low, though, due to that huge amount of HBM2. You can be pretty sure that if AMD could have sold this card for less than the 2080, they most likely would have, as it would have been better received. And the whole point of that article was the reasons why AMD supposedly got rid of that executive after less than a year with the company. Also worth pointing out: that particular executive used to work for Nvidia, so make of that what you will. >_>


In some cases it can be a decent option. The Ryzen 2600 or 2700, for example, can be overclocked to a similar performance level as their higher-clocked "X" counterparts, and have usually been priced low enough that even if you replace the stock cooler with something better and quieter, you can end up paying less overall. So overclocking there can be a way to save a bit of money to put elsewhere in a build without sacrificing performance, particularly if you wanted to replace the cooler anyway. That seems to be a trend with AMD's lower-priced versions of chips. Intel locking overclocking to higher-end hardware is a bit more questionable, but I can see why they do it. As for graphics cards, factory overclocks have been the norm for a long time, and in general, cards with moderate overclocks don't really tend to cost more.


Wow, that chart you posted shows the Radeon VII's 1% Lows performing better than even a 2080 Ti, and 0.1% Lows coming close behind, with both well ahead of an overclocked 2080. Most impressive. You must really love Radeon VII to show it outperforming the competition where it really matters. : 3
 

nobspls

Reputable
Mar 14, 2018


Good job distorting the truth again. The 0.1% lows are what really matter? Yeah, right. If you understood anything about statistics and looked at the data provided by GamersNexus, you would have seen the one spike in the data that negatively impacted all the cards and basically invalidated any real value of the 0.1% numbers. Yep, the desperation and grasping at straws: you literally just want to show how 1 out of 1,000 frames favors the AMD card because of some anomalous game behavior.
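To make the statistical point concrete: in a 1,000-frame capture, the 0.1% low is effectively decided by a single frame, so one stutter anywhere in the run moves that number for every card. Here's a quick sketch of one common way these metrics are computed; outlets differ on the exact definition, so treat this as an illustration rather than GamersNexus's actual method:

```python
def percentile_lows_fps(frametimes_ms, fraction):
    """Average FPS over the slowest `fraction` of frames.

    One common definition of '1% lows' (fraction=0.01) and
    '0.1% lows' (fraction=0.001); review outlets vary on details.
    """
    worst = sorted(frametimes_ms, reverse=True)      # slowest frames first
    n = max(1, int(len(frametimes_ms) * fraction))   # how many frames count
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms                           # mean frametime -> FPS

# 999 smooth 16.7 ms frames (~60 FPS) plus ONE 100 ms stutter:
frames = [16.7] * 999 + [100.0]
print(round(percentile_lows_fps(frames, 0.01), 1))   # ~40.0: spike averaged with 9 normal frames
print(round(percentile_lows_fps(frames, 0.001), 1))  # 10.0: the single spike IS the metric
```

One frame out of a thousand drags the 0.1% low from a steady 60 FPS down to 10 FPS, which is exactly the kind of single-spike anomaly being described.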
 

Out of curiosity, were you this active when the RTX 2070 and 2080 came out offering similar performance to their two-year-old predecessors at higher prices?

FWIW, as someone who intends to buy a high-end GPU this year, I agree that the VII is disappointing, just like the 2070 and 2080 are. The 2080 Ti is a phenomenal piece of hardware, but the stratospheric pricing puts it out of reach of the vast majority of gamers (myself included). So again, pretty disappointing.

I'm wondering why you're singling out the VII for particular scorn when, as far as I can tell, it's just another in a series of disappointing high-end graphics card launches.

The above is a genuine question, by the way. I'm not trying to bait or provoke. You seem thoroughly unimpressed by this launch (with good reason) and I'm interested whether you see the RTX launch in a different light.
 

nobspls

Reputable
Mar 14, 2018


It used to be the same price for the same tier while you got increased performance, but the RTX launch is a disaster of epic proportions. It is reinforcing the precedent of increasing prices for increasing performance, with only old stuff for the low-to-mid range options. Who knows if and when gamers can look forward to getting more for their money again; not soon, is my guess. AMD's Vega launch basically demonstrated to everyone (and I'm pretty sure Nvidia took notes) that you can just raise prices. We got RTX, and now the VII release isn't helping either.
 


Well, to be fair to Nvidia, the RTX cards have kind of been separated from the GTX product line. That said, I get where you are coming from. However, there is a bit of a silver lining here. People who have mid-range or high-end graphics from the last generation, or high-end from the gen before, don't HAVE to upgrade. If I'm happy with the performance of my RX 470 or my GTX 1060, then there is no market pressure to buy the next generation of card, because it won't be THAT much better. Heck, Nvidia probably knew this and priced their cards high to recoup development costs.

AMD is in the boat where it had a handful of cards sitting in a warehouse somewhere that they never planned on releasing, but the delay on Navi (which was supposed to be shown off at CES, according to rumors) pushed AMD to have to release SOMETHING... so they launched the VII. An older architecture that benefited from a die shrink. An example of what 7nm can do. You can see it in their launch; it feels rushed because it was. There is short supply because they weren't going to release it at all. It is priced high because it is identical to massively expensive cards at the silicon level. 7nm Vega was a test, and its release was a mistake, but it was a mistake AMD had to make to attempt to remain relevant. Nvidia has little excuse for their super high pricing. AMD had thrown all their eggs in the Navi basket... and someone dropped it.

So... here we are. I mean, we can get angry at them, but what does that actually do? Nvidia has one GPU that exceeds performance from their last generation; AMD has one GPU that exceeds performance from their last generation. Those two GPUs are still a similar amount apart in price and performance. We are right back where we were with Vega 64 and the 1080 Ti... they are just more expensive, because they perform better. With the exception of RTX and DLSS, this is nothing but an extension of the last generation. If you think about it like that, your head won't hurt as much.