AMD Radeon RX Vega 64: Bundles, Specs, And Aug. 14 Availability


Bob_8_

Honorable
Nov 18, 2015
You know what this feels like? Like opening your main big Christmas present and finding out it's not what you dreamed it would be =(
 

pacdrum_88

Distinguished
Sep 16, 2014
Man, I'm stoked. Everyone else seems down, but between the monitor I'm getting (the Acer XR342CK, which uses FreeSync), the crazy prices of all decent GPUs right now, and the $399 Vega 56, this sounds like a steal to me. I'll be getting one as soon as it launches. Video editing, gaming, this thing will be doing it all for me!
 

none12345

Distinguished
Apr 27, 2013
Hrm, I'm torn. I was thinking I wasn't going to get a Vega, but the Vega 56 at $400 seems like a decent deal.

I need to see benchmarks. Still, with the mining craze inflating GPU prices right now, that's not a bad price, if you can actually get it for that.

You can't even get an RX 580 at the moment; they're sold out, and even if you found one it would be near $300 if not more. Vega 56 should be something like 70% faster, so for $100 more it looks pretty good.

I guess it depends on where the board-partner Vegas land price-wise. It doesn't look as good at $450, which is probably what the non-reference ones will cost.

At $400 for a non-reference Vega 56 with a good cooler... I think it would be hard for me to pass it up.

It all depends on benchmarks, though. I'm expecting Vega 56 to land about halfway between a 1070 and a 1080.
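Rough napkin math on the price-to-performance angle (the prices and the 70% figure are just my assumptions, not benchmark results):

Code:
# Back-of-the-envelope perf-per-dollar comparison; all numbers are assumptions
rx580_price, rx580_perf = 300.0, 1.00    # RX 580 at today's inflated street price, baseline performance
vega56_price, vega56_perf = 400.0, 1.70  # Vega 56 at MSRP, assuming ~70% faster than the RX 580

for name, price, perf in [("RX 580", rx580_price, rx580_perf),
                          ("Vega 56", vega56_price, vega56_perf)]:
    print(f"{name}: {perf / price * 100:.3f} performance points per $100")

Under those assumptions the Vega 56 actually comes out ahead on perf per dollar, which is why the benchmarks and the real street price are the only things that matter here.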
 
AMD should release its own game engine to make it easier for devs to exploit all of Vega's features. The same goes for Nvidia.
All these new features and performance gains just go to waste without proper support :(
 

mpampis84

Commendable
Apr 26, 2016
I don't see this going well for AMD. You can get a water-cooled EVGA GTX 1080 for less money than the water-cooled RX Vega, and it's less power hungry, too.
In my case, if I decided to upgrade from my RX 480, I'd also have to buy a new PSU.
Why choose the RX Vega?
 

Deleted member 217926

Guest
It's a shame the 980 Ti successor isn't the 1080 as the AMD marketing team would have you believe in that first slide. It's the 1080 Ti.

980 Ti to 1080 Ti is a 65% gain. Fury X to Vega is a 25% gain. :spamafote:

I wanted this to be a win. Competition is great for my wallet. This is a fail though, and a big one: a $699, 350W liquid-cooled card competing with a vanilla 180W GTX 1080 Founder's Edition from 2015. Give the AMD driver team a year and it might compete with a factory-overclocked 1080. Of course Nvidia will have a new generation out by then. You don't have to tout things like "platform cost (that includes the monitor)" and "higher minimum frame rates", or offer games and coupons on monitors, when you have a winning product.

Ryzen is an epic win and it obviously ate all the R&D budget. Hopefully they will reinvest some of that profit in the graphics card division and have something to compete in the high end in a few years.

I think this will mostly be a product for the "AMD can do no wrong" crowd and people with disposable income.

Who knows, real reviews might prove me wrong. But when I saw the 16GB Frontier Edition competing with my 2 year old 980 Ti in benchmarks I knew this wasn't a world beater.
 

hannibal

Distinguished
The $500 air-cooled version competes with the $500 1080... I see no problem there at all! Vega will be somewhat better in Vulkan and DX12 and will lose in DX11 titles, so it's a close call depending on what games you play and how much it matters to you whether future games are mainly DX12 or Vulkan.
Vega will eat more power, but FreeSync is cheaper than G-Sync, so it all depends on what you want. The competition is back above the Nvidia 1060 level too, and that is good!
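Just to put rough numbers on the platform-cost point (every price here is an assumption for illustration, not a quote):

Code:
# Card + monitor platform cost sketch; all prices are assumptions for illustration
vega64_air       = 499   # RX Vega 64, air-cooled, MSRP
gtx1080          = 499   # GTX 1080, assumed comparable street price
freesync_monitor = 400   # hypothetical 1440p FreeSync display
gsync_premium    = 150   # assumed typical G-Sync price premium for a comparable panel

amd_platform    = vega64_air + freesync_monitor
nvidia_platform = gtx1080 + freesync_monitor + gsync_premium  # FreeSync panel price + G-Sync tax

print(f"AMD platform:    ${amd_platform}")
print(f"Nvidia platform: ${nvidia_platform} (${nvidia_platform - amd_platform} more)")

So if you're buying the monitor too, the G-Sync premium can offset Vega's extra power draw and then some; if you already own a G-Sync display, the math flips.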
 

madPav3L

Prominent
Jul 31, 2017
Not a good choice by AMD to bundle it with the Samsung CF791... that monitor is known to have flickering issues with FreeSync.
Just google "Samsung CF791 FreeSync flickering".
 

bit_user

Titan
Ambassador
Poor AMD. With Volta lurking around the corner, they needed Vega to retake the crown.

It's good that they can still offer something competitive, but I'm sure they had to take a big hit on their margins. Nvidia could drop prices in response, and they'd probably still have bigger margins than AMD @ Vega's launch prices.

Don't believe me? Just check out the relative chip sizes, going by transistor count:

Code:
Nvidia GP104: 7.2 billion transistors
AMD Vega 10: 12.5 billion transistors

As the article points out, that's even bigger than the GP102, from the 1080 Ti and Titan Xp.
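To put rough numbers on the margin point: the die areas below are the commonly cited approximate figures, and the wafer cost is a pure placeholder, so treat this as a sketch rather than real cost data.

Code:
from math import pi, sqrt

# Rough dies-per-wafer estimate on a 300 mm wafer (ignores yield, scribe lines, edge exclusion)
def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    radius = wafer_diameter_mm / 2
    return pi * radius**2 / die_area_mm2 - pi * wafer_diameter_mm / sqrt(2 * die_area_mm2)

wafer_cost = 6000  # placeholder cost per 14/16 nm wafer in dollars; purely an assumption
for name, area in [("GP104, ~314 mm^2", 314), ("Vega 10, ~486 mm^2", 486)]:
    dpw = dies_per_wafer(area)
    print(f"{name}: ~{dpw:.0f} candidate dies per wafer, ~${wafer_cost / dpw:.0f} per die before yield")

Whatever the real wafer price is, the bigger die gets you far fewer chips per wafer and worse yields on top of that, so AMD's silicon cost per card is meaningfully higher at the same selling price.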
 


I have to agree with the bold part 100%. I don't think it will be a major disappointment: they realized Vega won't compete at all in the higher tier, so they're practically giving the cards away, and pricing-wise it will be "there". Taking into account the PSU upgrade and the FreeSync savings, it will be more or less on par with the 1080 as an all-around package, at least.

They seem to have put all the "pro" features in the wrong card. Polaris would have been a million times better off with this fancy new way of addressing memory and with the video-editing features, since most of the time those tasks don't require pixel pushing. Vega should have focused on performance first and efficiency second.

Jeez, AMD. Get rid of GCN soon and give us a VLIW derived uArch for the mid range.

Cheers!
 

Shotta06

Honorable
May 4, 2017
Well, I waited.

1080 Ti, here I come.

Now, hopefully Threadripper beats the i9 in multitasking like the early benchmarks suggest.

Then my 1950X will pair nicely with the Ti.
 

bit_user

Titan
Ambassador

Wrong comparison. The main matchup is their air-cooled card vs. the normal GTX 1080. And that came out one year ago, in 2016.


Given that this beats Fury X, which was pretty equal with the 980 Ti, I'd say the issue with Frontier was early drivers. And, as the article said:
it sounds like the pixel engine’s Draw Stream Binning Rasterizer, which we introduced back in January, is currently disabled on Radeon Vega Frontier Edition cards. However, AMD says it’ll be turned on for Radeon Vega RX’s impending launch.

Still, as it doesn't seem like it's going to be cost-competitive, they made the right decision to target it towards professional applications. Hopefully, it's at least good at machine learning.
 

bit_user

Titan
Ambassador

When you're talking about GPUs, the two are basically one and the same. To make a GPU fast, you need to focus on making it efficient.


Why are you so sure that's the right answer? GCN was a pretty big leap ahead of their old VLIW4 when it was introduced, and it's basically the same thing Nvidia is doing. I'm not sure there's really a better way.
 


Not quite. I do realize they are closely related, but you optimize for them differently.



Yes, there is a better way. They have tied themselves to GCN through HSA and need to bundle in stuff they really don't need for mid-range cards. All the compute stuff could be added to a VLIW-derived uArch. Yes, VLIW4 was bad at compute, but that's because it wasn't *meant* for it when Nvidia made the first compute jump.

This is not something I can argue confidently, since the last VLIW uArch that was still alive died with Itanium. That is to say, I have no current-day examples of how a VLIW uArch fares against regular "compute" uArchs in GPUs.
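Just to illustrate the ILP point with a toy sketch (this is nothing like a real shader compiler, and the two made-up kernels below are only stand-ins): a 4-wide VLIW bundle only gets filled when the compiler can find four independent operations, which vec4 pixel math hands you almost for free and a dependent compute chain does not.

Code:
# Toy VLIW4-style bundle packer: ops are {name: [dependencies]}, and a greedy
# scheduler packs up to 4 ready (dependency-free) ops into each bundle.
def schedule(ops, width=4):
    done, bundles, remaining = set(), [], dict(ops)
    while remaining:
        ready = [op for op, deps in remaining.items() if all(d in done for d in deps)]
        bundle = ready[:width]
        bundles.append(bundle)
        done.update(bundle)
        for op in bundle:
            del remaining[op]
    used_slots = sum(len(b) for b in bundles)
    return bundles, used_slots / (len(bundles) * width)

# Shader-like work: four independent per-component operations -> plenty of ILP
shader_ops = {"r.x": [], "r.y": [], "r.z": [], "r.w": []}
# Compute-like work: a serial dependency chain, e.g. one lane of a scalar reduction
chain_ops = {"a": [], "b": ["a"], "c": ["b"], "d": ["c"]}

for label, ops in [("shader-like", shader_ops), ("chain-like", chain_ops)]:
    bundles, utilization = schedule(ops)
    print(f"{label}: {len(bundles)} bundle(s), slot utilization {utilization:.0%}")

The shader-like case fills one bundle completely, while the dependency chain burns four bundles at 25% utilization, which is roughly why VLIW4 looked great for graphics and fell over on a lot of GPGPU code.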

Cheers!
 

Shotta06

Honorable
May 4, 2017
With all the so-called knowledge some people post with, it's a surprise they're not top engineers at AMD/Intel making 6-7 figures.

It's obviously not as easy as you (no one specific) make it seem. Why else would R&D budgets be so high?
 

bit_user

Titan
Ambassador

Hey, if you don't like arm-chair quarterbacking, then why are you reading these forums?

I wish we could get more of the inside dirt from the chip designers, themselves. But, they generally like keeping their 6-figure jobs, so we don't. I used to work with some ASIC designers and try to read more in-depth analysis than you'll find on this site, but my expertise is really on the software end of things.
 

Shotta06

Honorable
May 4, 2017

Oh, I love it. It wasn't meant as a diss. My degree is in Computer Science and I'm also on the software end. I just find it interesting when people know that much about the hardware side. Software development and hardware development are apples to oranges.
 
Want to know where the true meaning of TFLOPS lies? The Radeon cards were the first to be gobbled up in the mining craze.
Nvidia usually cuts down its top-of-the-line chip and offers it in various flavors, i.e. the 1080, 1070, 1060, but I don't think AMD does that. That's why Radeons have higher power draw and are more prepared for future tech like DX12 and Vulkan.
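For what it's worth, the paper TFLOPS number is just shader count x 2 (one fused multiply-add per clock) x clock speed, which is why the Radeons look so strong to miners; the clocks below are the advertised boost figures:

Code:
# Peak FP32 throughput = shaders * 2 ops per clock (fused multiply-add) * clock speed
def fp32_tflops(shaders, boost_mhz):
    return shaders * 2 * boost_mhz * 1e6 / 1e12

print(f"RX Vega 64: {fp32_tflops(4096, 1546):.1f} TFLOPS")  # 4096 stream processors, 1546 MHz boost
print(f"GTX 1080:   {fp32_tflops(2560, 1733):.1f} TFLOPS")  # 2560 CUDA cores, 1733 MHz boost

Of course peak TFLOPS only tells you how wide the machine is, not how well games actually feed it, which is a big part of why the higher-TFLOPS card doesn't automatically win in games.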
 

Radmeister

Prominent
Jul 31, 2017
I'd be scared to buy one; something doesn't add up. After my experience with the R9 Nano, which was like having a cat battle a rattlesnake in my case, I have a feeling they cheaped out on a $0.10 component that is gimping these cards. There could be some thermal throttling going on that lowers average performance; perhaps their paper specs only hold for a short burst. Gotta wait for a full review.
 