AMD Radeon RX Vega 64: Bundles, Specs, And Aug. 14 Availability


bit_user

Polypheme
Ambassador

Oh, it's nothing new. Fury and Polaris also had better specs on paper than the Nvidia models with similar game performance.

I don't believe it's simply a matter of drivers. Even Nvidia-specific game optimizations wouldn't fully explain the discrepancies.
 

bit_user

Polypheme
Ambassador

Please direct your attention to these two pages (make your browser window full-screen, first):

https://en.wikipedia.org/wiki/List_of_Nvidia_graphics_processing_units#GeForce_10_series
https://en.wikipedia.org/wiki/List_of_AMD_graphics_processing_units#Radeon_RX_500_Series

Both companies engage in selling the same base GPU die with some compute units disabled. It's likely that some of these down-spec'd dies have defects in some of the disabled units. This is long-established practice for both CPUs and GPUs.

To your other point, you can see exactly which products share the same die - typically not more than 2 or 3 models.
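To make the harvesting idea concrete, here is a minimal sketch (hypothetical CU counts and bin sizes, not AMD's or Nvidia's actual binning rules) of how a die with a few defective compute units can still be sold as a lower-spec SKU:

```python
# Toy model of die harvesting: a die with defective compute units (CUs)
# can still be sold as a cut-down SKU by disabling the bad units (plus a
# few good ones) to hit the lower CU count. Bin sizes here are made up.

TOTAL_CUS = 64
BINS = [64, 56, 48]  # hypothetical SKUs, largest first

def bin_die(defective_cus):
    """Return the largest SKU this die can be sold as, or None if scrapped."""
    working = TOTAL_CUS - len(defective_cus)
    for sku in BINS:
        if working >= sku:
            return sku   # disable the defects plus extras down to 'sku' CUs
    return None          # too many defects: scrap, or hold for a future SKU

print(bin_die([]))                            # 64 -> full part
print(bin_die([3, 17]))                       # 56 -> same die, two bad CUs
print(bin_die([1, 2, 3, 4, 5, 6, 7, 8, 9]))   # 48
```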
 
So basically Vega is 10% stronger than a GTX 1080 FE. It's not bad, but obviously too little, too late.

Anyway, it will sell, but the real thing is the Vega core in APUs for laptops coming around next year. 1080p at 60 fps on a laptop without MXM looks really nice.
 
So what about async compute, which still hasn't been used in Radeon cards? And it's only now, with Vega, that tile-based rendering is being implemented, if I'm not wrong. I don't know the entire list, but I read about this particular subject and it made sense at the time.
Similarly, you see tensor units in the Tesla card, but those won't feature in the consumer Volta cards, since tensor units are for deep learning.
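For anyone wondering what "tile-based" means in practice, here is a minimal illustrative sketch of the binning step only (not how Vega's actual rasterizer is implemented): the screen is divided into tiles and each triangle is assigned to the tiles its bounding box touches, so rasterization and shading can later run tile by tile out of fast on-chip storage.

```python
# Illustrative tile binning: assign each triangle (by bounding box) to the
# screen tiles it may cover. Purely a concept sketch, not AMD's design.

TILE = 32  # tile size in pixels (arbitrary)

def bin_triangles(triangles, width, height):
    """triangles: list of ((x0, y0), (x1, y1), (x2, y2)) in pixel coords."""
    bins = {}  # (tile_x, tile_y) -> list of triangle indices
    for i, tri in enumerate(triangles):
        xs = [p[0] for p in tri]
        ys = [p[1] for p in tri]
        # Clamp the bounding box to the screen, then find touched tiles.
        x_min, x_max = max(min(xs), 0), min(max(xs), width - 1)
        y_min, y_max = max(min(ys), 0), min(max(ys), height - 1)
        for ty in range(int(y_min) // TILE, int(y_max) // TILE + 1):
            for tx in range(int(x_min) // TILE, int(x_max) // TILE + 1):
                bins.setdefault((tx, ty), []).append(i)
    return bins

tris = [((5, 5), (40, 10), (20, 60)), ((100, 100), (120, 110), (110, 130))]
print(bin_triangles(tris, 1920, 1080))
```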
 

colemar

Prominent
RX Vega 64 vs. GTX 1080
----------------------------
Performance: Vega wins. That is, if you care about consistency and minimum fps, which is what all gamers should care about.
Price: About the same, though I can't find a GTX 1080 for less than $509 on Newegg.
Power consumption: in the average game at 4K, the GTX 1080 draws 180W with peaks up to 310W. Vega is rated at 295W, but that should be the Thermal Design Power, so the average draw should be lower (rough math below). Must wait for benchmarks.

All in all Vega seems a fair value.

AMD vs Nvidia: sorry AMD, you are late, try again.
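To put those wattage figures in perspective, here is a rough back-of-envelope calculation; the 295W and 180W numbers are the ones quoted above, while the hours of gaming per day and the electricity price are assumptions.

```python
# Rough annual electricity-cost difference between the two cards, using the
# power figures quoted above and assumed usage. Vega's real average draw
# should sit below its 295W rating, so treat this as a worst-case ceiling.

VEGA_W, GTX1080_W = 295, 180   # quoted figures (Vega's is a board/TDP rating)
HOURS_PER_DAY = 3              # assumption
PRICE_PER_KWH = 0.12           # assumption, USD

def yearly_cost(watts):
    kwh = watts / 1000 * HOURS_PER_DAY * 365
    return kwh * PRICE_PER_KWH

delta = yearly_cost(VEGA_W) - yearly_cost(GTX1080_W)
print(f"Worst-case extra cost per year: ${delta:.2f}")  # roughly $15/year
```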
 
For me, this is a complete slap in the face. I *already have* a Ryzen system. I *already have* a FreeSync monitor (which I've been sitting on for TWO YEARS, waiting on a card to push it). I have no interest in either of the games in the third bundle; I *just* want the liquid-cooled full Vega.... so AMD is effectively asking me to pay an additional $100 for no benefit to myself. If they really go through with this, then ---Mod edit: keep it clean--- you, AMD. They will have rewarded loyalty with shit. Pardon my French.
 


The problem is that not all decisions are technical. There's a reason you get evil marketing slides twisting truths and caveating them to the ground.

Where you might say "this is hands down technically better", a manager will come and say "OK, but it's not using our IP and it costs 3X to implement: denied". Then you get rekt and whine all the way to your desk, and then some more to your peers. Not that I have experienced that much in my life! :cries:

Cheers!
 

SteveRNG

Distinguished


You do realize that AMD doesn't owe you anything, right? They can sell their cards for whatever the market will bear. They can set an MSRP and provide no further incentives. And then retailers can sell those items for over MSRP. And then the market takes over: miners buy up everything, and the retailers and AMD make a ton of money by continuing to raise the price. And they will sell you a liquid-cooled full Vega and nothing else. You just may have to buy it at the price that everyone else in the world is willing to pay. Nobody is going to sell you anything at MSRP if there are others willing to pay more for it. That's what everybody has been doing with the mining craze.

I know it sucks to be ready to upgrade and for the market to be crazy. I'm still playing on a !*#(@!*&# GeForce GTX 570 and I'm tired of playing 3-5 year old games. I get it. But you can't be mad at AMD or retailers for selling their product at a price the market will bear. Because if they sell it for less, they are STILL going to sell out and you won't get one anyway.



 

aldaia

Distinguished

Hmmm, nope, there are actually several VLIW uarchs that are alive and well. Everybody knows about Itanium while ignoring the many successful VLIWs. It is estimated that only 200 to 300 thousand Itaniums have been sold since its inception. Itanium's numbers pale in comparison to the successful (and mostly unknown) VLIWs:

    ■ Texas Instruments and Analog Devices have several VLIW DSPs that are widely used in the embedded market; they have been sold in volumes probably surpassing a billion units.
    ■ STM has sold about 1 billion ST200 VLIW cores in its many video SoCs. Anyone owning a budget digital TV, DVD or Blu-ray player, multimedia disk, or set-top box probably has several ST200 VLIW cores at home.
    ■ CEVA offers the VLIW CEVA-XC4000 as IP. It is estimated that around 1 billion units of it have also been shipped.
    ■ And finally, the rock star of the VLIW world: the Qualcomm Hexagon. Qualcomm ships well over a billion (maybe even 2 billion) Hexagon units every year, which makes Hexagon the second most successful uarch in history (in number of units) after ARM, relegating x86 to third place.
 

dusty13

Honorable
"Both figures exceed Nvidia’s 250W rating on GeForce GTX 1080 Ti, which isn’t even in Vega’s crosshairs."

Not sure that will remain true... from a compute-power perspective the water-cooled version competes with the Titan Xp quite nicely.

The question will more likely be whether AMD's (seemingly perpetually drunk) driver team can slap something together that actually makes use of that power soon after release, and not, as usual, about half a year later, when the reviews have already written in stone that the card is slower than the competition from Nvidia.

I am afraid AMD will botch it once again and end up with a card that actually can (at least closely) compete with the Titan Xp, but it will take them too much time to get there and all the reviews will have been done by then.

In any case, the most interesting version for my taste is the 56-NCU model with its 210W TDP. It should get very, very close to, or even slightly beat, a stock 1080 after real optimization has been done... for a much lower price.
 

I said that knowing it was incorrect, just because "Itanium" is a name most people recognize easily and know how it failed, though not because it was VLIW.

Much as I would like to further the argument in favour of VLIW, I can't, because I don't have access to information for any of those products. Care to help illustrate the point, then? If you agree, that is.

Cheers!
 

Mestas

Prominent
Most likely DOA. Why would you invest in Vega? It's older and uses more power. Nvidia will probably release Volta in 2018, which would make Vega obsolete. People are playing down the power consumption, but I think it's pretty huge. Your new card can "just" compete with year-plus-old tech at a higher power draw. Even for the fanboys: take the name stickers off each card; in a blind taste test, which would you choose?
 

InvalidError

Titan
Moderator

His point is simple enough: VLIW instruction sets are still alive and well, just not in products where the buyers care what the underlying architecture is. Many once promising but now mostly forgotten architectures are still alive and well in embedded applications. Your own PC may have over a dozen MIPS, ARM or other ISA processors embedded in its various components.
 


And you don't think that AMD is already working on their 2018 releases? Working on driver improvements, etc.?
 


Yes, I'd also love to compare a knife to a bazooka, but that is not the point I'm trying to make, nor the one I'm asking for help with. Thanks for pointing out the obvious, though!

Cheers!
 


I long ago accepted my character weaknesses... when it comes to "brand loyalty", I accept my "true self" as being a "hardware whore". We were an Asus shop (mobos) for 10 years, but they started to lose our favor with Z87... now we rarely use an Asus board, using Gigabyte and MSI almost exclusively. Though, if Asus were to return to form and eliminate the RoG tax, we would have no qualms abandoning MSI and Gigabyte.

With graphics cards, it's been quite a while since AMD had a competitor at the top end. We would often use AMD cards when budget restricted the user's options. AMD was competitive in the 2nd and 3rd tiers, and basically owned all the budget tiers. The problem with that approach is that there's much more profit at the high end than the low end, and this led to shrinking financial resources as Nvidia's market share climbed to 80% once its stranglehold extended down to the x70 series.

And once the 10xx series dropped.... AMD didn't really have a horse in the race from the 1060 on up. These new Vega cards can compete, but not against the cards they are purported to be competing with, especially when we ignore "out of the box" early published results and focus on how the cards are actually used... pushed as far as they can go with Afterburner.

As for the FreeSync thing, cost is really only a factor when two items offer the same thing, and FreeSync / G-Sync are far from the same thing.

FreeSync has the greatest impact between 40 fps and 70 or so.... after that the impact falls off considerably.

G-Sync has the greatest impact between 30 fps and 70 or so.... after that the impact falls off considerably.

So why does G-Sync cost more? Because G-Sync monitors, aside from doing what FreeSync does, also include a hardware module which provides motion blur reduction (MBR) technology. When you get up above 70 fps, users have the option to turn off G-Sync and use ULMB. FreeSync has no corresponding hardware module and therefore cannot provide this function.

There are FreeSync monitors that do provide MBR technology, but there the monitor manufacturer has supplied the necessary hardware/technology to implement it and has included that cost in the price of the monitor.
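A crude way to picture the trade-off described above, as a sketch of my own (the thresholds simply mirror the rough 40-70 / 30-70 fps ranges mentioned, and "MBR hardware" stands in for the G-Sync module or a vendor-supplied equivalent):

```python
# Simplified mode picker reflecting the post above: variable refresh rate
# (VRR) helps most inside the monitor's effective range; above that range,
# monitors with motion-blur-reduction hardware can switch to ULMB/MBR.
# Thresholds and logic are illustrative only.

def pick_mode(fps, vrr_low, vrr_high=70, has_mbr_hardware=False):
    if vrr_low <= fps <= vrr_high:
        return "VRR (G-Sync / FreeSync)"       # biggest benefit in this window
    if fps > vrr_high and has_mbr_hardware:
        return "ULMB / motion blur reduction"  # needs the extra hardware
    return "VRR off / plain V-Sync choice"

print(pick_mode(55, vrr_low=40))                           # FreeSync range
print(pick_mode(110, vrr_low=30, has_mbr_hardware=True))   # G-Sync monitor
print(pick_mode(110, vrr_low=40))                          # FreeSync w/o MBR
```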

I'm hoping that the AIB cards find a way to "cut the chains" and get the overclocking headroom into double digits. For me, being limited to single-digit % fps increases when overclocking has been the major weakness of everything AMD has put out since the 290. It is of no concern to me that the cards are competitive "out of the box" when one side is giving me 4%, 8%, or even 12% while the other side is hitting 18%, 25%, or even 30+%...
 

bit_user

Polypheme
Ambassador

Yeah, it works fine for embedded - where binary compatibility between different CPU models isn't typically a requirement and latencies are more easily tamed. That still doesn't mean it's the best option for GPUs, however. AMD and Nvidia have both done VLIW in the past. I trust they know what they're doing - better than we do, at least.

Funny thing is, Itanium/IA-64 isn't even true VLIW. It actually allows runtime scheduling, but they make it easier by explicitly encoding the dependency graph (hence the term EPIC: Explicitly Parallel Instruction Computing). That was done as a nod to forward compatibility with implementations that have different execution widths and latencies. Arguably, it wasn't enough to make VLIW work in a general-purpose platform.
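As a toy illustration of the idea (nothing to do with real IA-64 encodings or any shipping compiler): a pure VLIW compiler packs independent operations into fixed-width bundles ahead of time, which is exactly the scheduling work an out-of-order CPU would otherwise do in hardware.

```python
# Toy VLIW-style bundler: greedily pack operations into fixed-width bundles
# so that no op is issued before its dependencies have completed in an
# earlier bundle. A real compiler also models latencies and functional
# units; IA-64 instead marks dependence boundaries ("stop bits") rather
# than relying purely on a fixed bundle width. Assumes an acyclic graph.

BUNDLE_WIDTH = 3

def bundle(ops, deps):
    """ops: ordered list of op names; deps: dict op -> set of ops it needs."""
    bundles, done = [], set()
    remaining = list(ops)
    while remaining:
        current = [op for op in remaining
                   if deps.get(op, set()) <= done][:BUNDLE_WIDTH]
        remaining = [op for op in remaining if op not in current]
        done |= set(current)
        bundles.append(current)
    return bundles

ops = ["load r1", "load r2", "add r3", "mul r4", "store r5"]
deps = {"add r3": {"load r1", "load r2"},
        "mul r4": {"add r3"},
        "store r5": {"mul r4"}}
print(bundle(ops, deps))
# [['load r1', 'load r2'], ['add r3'], ['mul r4'], ['store r5']]
```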
 

blppt

Distinguished
I'm still going to buy one, mainly because I've been waiting forever for a top-end AMD card to replace my 290X, and the Fury with its 4GB framebuffer wasn't doing it for me. But all this hype leading to a card that's likely not as fast in gaming as my year-old, factory-overclocked EVGA SC 1080 is somewhat of a letdown.

Probably go for the water-cooled one, as the last early-adopter air-cooled AMD card I bought (the 290X) was a nightmare of noise.
 

bit_user

Polypheme
Ambassador
IMO, the DeepBench results make a pretty good case that Vega was primarily built for machine learning:

http://wccftech.com/amd-radeon-vega-nvidia-pascal-gp100-performance-benchmarks/


Similarly, I think Ryzen was primarily designed as merely a stepping stone towards Epyc. AMD is going where the big money is - and that's the cloud.
 

RedFIveStandingBy

Prominent
They hyped this so much, both the "red team" and AMD together; not sure what they were expecting, though. AMD is still AMD: another space heater that happens to also be a graphics card.
 
