AMD Vega MegaThread! FAQ and Resources



It's even stranger than that. The 480 had a much greater clock speed deficit against the 1060, roughly 400 MHz at stock. Vega FE has closed the gap to within ~200 MHz of the 1080, and ~100 MHz of the 1080 Ti.
 
AMD Vega Details Leak Ahead Of Official Launch
by Fritz Nelson July 13, 2017 at 12:10 PM
"The long-awaited unveiling of AMD's Vega is scheduled for July 30 at the company's Capsaicin shindig, just prior to SIGGRAPH, but a few details are leaking out ahead of time. We have a few of our own sources who are telling us that there will be four different reference design models, including two versions at the top end (the XTX), a presumably lesser-resourced XT model, and then the XL at the lower end.

We're told the XTX will come as either an air-cooled design with a blower fan or liquid-cooled, with the latter at 375W TDP and the former at 285W TDP. The XT and XL will both be air cooled and operate at 285W TDP. The already-shipping Vega Frontier Edition also comes in air- or liquid-cooled variants, with the air cooled version hitting 300W TDP."
http://www.tomshardware.com/news/amd-vega-gpu-leak,35000.html
 


From this article, I'm encouraged about Vega's performance. They don't say which version of Vega they matched up against the 1080. Considering they say the Ryzen+Vega system was $300 cheaper, this could have been the Vega XT, which would lead me to think that the Vega XL is the competition for the 1070 and the Vega XTXs are up against the 1080 Ti/Titan XP.

As for the "hiccup" they mention: they claim it couldn't have been the 1080 since they have benchmarks showing its performance in BF1. But nVidia has been having driver issues for several months now, and performance then does not necessarily mean performance now. Plus, I've been hearing of driver issues with G-Sync (among other things), so I wouldn't discount the possibility that it was the 1080 system.
 


I don't think it's any better on the FreeSync side. With every AMD driver release, there's always a mention of issues with FreeSync.

 

You can't. Once you're in, you're in. There's no escaping Vega, the gravitational pull is too strong. Fear not, though, for the future is brighter than the star! ;-)
 


Nice! Now let's hope it is $100 cheaper than the 1080.
 
I found some news but not sure where to post it.
Fujitsu releases details of new AI processor
by Hilbert Hagedoorn on: 07/17/2017 11:09 AM
"They are targeting 10x the performance of "the competetion", and since its release is schedueled for FY2018, that means the latest Nvidia and Intel deep learning chips.
Given their history with K Computer, and the fact that its still at the #1 spot in HPCG and Graph500, I am inclined to believe that they will hit 10x their competitors performance."
http://www.guru3d.com/news-story/fujitsu-releases-details-of-new-ai-processor.html

Japan is going to enter the A.I. race.
 
Well I couldn't hold on anymore... ordered a 1080.

The delays, 300W+ power draw, and finally the very weird "blind test and we won't tell you which is which" stuff made me switch to Nvidia. Been an all-AMD GPU guy since they were still ATI 🙁
 


Neither tech is perfect, but the article made it seem like it couldn't have been the 1080 system because they know the 1080 has good performance in that game. I was just pointing out that that is a ridiculous assumption on the author's part.

It just annoyed me that they assumed the issue had to be with the AMD system, because they had some old information and assumed that nVidia hadn't changed anything. Someone needs to inform that author that the days of nVidia's driver superiority are gone.
 


I can't say I blame you- the RX 570 / RX 580 are good cards but can't be had for the price due to the miners. I think we know all we need to about RX Vega based on OC results and such on Vega FE (it's going to be the same silicon after all- and I don't believe there is a magic driver that will just fix performance- it may well improve over time, but it will take a while).

AMD are killing it on the CPU side, but graphics-wise they're not in great shape right now- such a shame as well, because it wasn't that long ago they were really doing well (the HD 7970 and R9 290X were both superb GPUs when they came out). The only thing that makes me hopeful is that Navi is rumoured to use some of the same tricks AMD have been showing with the Threadripper / Epyc CPUs (i.e. multiple dies on one chip- which should hopefully avoid all the issues of dual discrete GPUs).
 


That might be the case on the driver side, but you have to remember nVidia has much stricter control over G-Sync. With G-Sync, as long as it isn't a panel quality issue, the experience should be the same, and G-Sync-related issues are all addressable by driver update, since there is some form of communication between the G-Sync module and the GPU. That is not the case with FreeSync. In the majority of cases AMD has no control over what monitor makers do with their monitors. Hence there are cases where monitors need to be sent back to the manufacturer for a firmware update because the problem cannot be solved via driver update, or where the VRR window is limited with FreeSync enabled despite the monitor being capable of much higher refresh rates.
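To illustrate the limited-window point, here's a minimal Python sketch of how a VRR window behaves, assuming a made-up 48-144 Hz FreeSync range (real windows vary per monitor). It also shows the frame-repeating trick (LFC), which only works when the window is wide enough:

# Minimal sketch of a variable-refresh-rate (VRR) window. The 48-144 Hz
# range is illustrative; real FreeSync windows vary per monitor.
import math

def effective_refresh(fps, vrr_min=48, vrr_max=144):
    """Refresh rate the panel would actually run at for a given frame rate."""
    if vrr_min <= fps <= vrr_max:
        return fps                     # in-window: panel refresh tracks the GPU
    if fps < vrr_min:
        # Low Framerate Compensation: repeat each frame enough times to land
        # back inside the window. This needs vrr_max >= 2 * vrr_min, which is
        # why a narrow window (e.g. 48-75 Hz) cannot stay smooth at low fps.
        return fps * math.ceil(vrr_min / fps)
    return vrr_max                     # above the window: capped (or tearing)

for fps in (25, 40, 90, 160):
    print(f"{fps} fps -> panel refreshes at {effective_refresh(fps)} Hz")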
 
Being a FreeSync user, I can say I've had zero problems with it. I have read about problems with other panels, but I'm pretty sure AMD carries little blame there.

Like renz said, nVidia has better control over G-Sync, since it's *their* technology, and FreeSync is just a certification based on an open DP standard. So the panel manufacturers don't need to implement it fully and can still slap on a "FreeSync" sticker; kinda like "DX12 compatible", haha.

Cheers!

EDIT: Typo.
 
The AMD FreeSync Review
by Jarred Walton on March 19, 2015 12:00 PM EST
FreeSync vs. G-SYNC Performance

"Except for a glitch with testing Alien Isolation using a custom resolution, our results basically don’t show much of a difference between enabling/disabling G-SYNC/FreeSync – and that’s what we want to see. While NVIDIA showed a performance drop with Alien Isolation using G-SYNC, we weren’t able to reproduce that in our testing; in fact, we even showed a measurable 2.5% performance increase with G-SYNC and Tomb Raider. But again let’s be clear: 2.5% is not something you’ll notice in practice. FreeSync meanwhile shows results that are well within the margin of error.

"What about that custom resolution problem on G-SYNC? We used the ASUS ROG Swift with the GTX 970, and we thought it might be useful to run the same resolution as the LG 34UM67 (2560x1080). Unfortunately, that didn’t work so well with Alien Isolation – the frame rates plummeted with G-SYNC enabled for some reason. Tomb Raider had a similar issue at first, but when we created additional custom resolutions with multiple refresh rates (60/85/100/120/144 Hz) the problem went away; we couldn't ever get Alien Isolation to run well with G-SYNC using our custom resolution, however. We’ve notified NVIDIA of the glitch, but note that when we tested Alien Isolation at the native WQHD setting the performance was virtually identical so this only seems to affect performance with custom resolutions and it is also game specific.

For those interested in a more detailed graph of the frame rates of the three runs (six total per game and setting, three with and three without G-SYNC/FreeSync), we’ve created a gallery of the frame rates over time. There’s so much overlap that mostly the top line is visible, but that just proves the point: there’s little difference other than the usual minor variations between benchmark runs. And in one of the games, Tomb Raider, even using the same settings shows a fair amount of variation between runs, though the average FPS is pretty consistent."
Closing Thoughts
"It took a while to get here, but if the proof is in the eating of the pudding, FreeSync tastes just as good as G-SYNC when it comes to adaptive refresh rates. Within the supported refresh rate range, I found nothing to complain about."
http://www.anandtech.com/show/9097/the-amd-freesync-review/4
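Worth noting on methodology: the 2.5% figure and the "margin of error" talk above are just relative deltas across repeated runs. A quick Python sketch of that comparison, with made-up FPS numbers:

# Sketch of the review's method: average three runs with sync on and three
# with it off, then report the relative delta. FPS values here are made up.
from statistics import mean, stdev

sync_on  = [72.1, 72.4, 72.2]
sync_off = [70.4, 70.6, 70.5]

delta_pct = (mean(sync_on) - mean(sync_off)) / mean(sync_off) * 100
print(f"sync on vs off: {delta_pct:+.1f}%")          # ~ +2.5%
print(f"run-to-run spread: +/-{stdev(sync_off):.2f} FPS")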
 


Yes and no. Yes, nVidia has much tighter control, but that module adds a lot to the price, and while the issues should be fixable via drivers, they seem to be dragging their feet fixing them.

IIRC, initially AMD had little to no control over FreeSync, but that has since changed. AMD owns the FreeSync branding, while Adaptive-Sync is the open standard that FreeSync is based on. To carry the FreeSync name, monitor makers must qualify the panel with AMD, though I doubt the restrictions are as tight as G-Sync's. The initial FreeSync panels had a lot of issues with ghosting, and AMD made the certification change to fix that before FreeSync got a bad rep due to cheap panel manufacturers. This is probably why the initial FreeSync monitors had little to no extra cost for the consumer, but now there is a premium for it.

Either way, I don't like vendor lock-in and haven't gone for either tech.

Can't wait for the benchmarks on Vega; I just hope it wasn't the XTX in that comparison machine against the 1080. That would be disappointing.
 


I don't really understand why everyone is acting surprised at these results- if you want a review of RX Vega, just look up the Vega FE- it's the full die.

If clocked high enough, Vega is up there with the GTX 1080 (faster than the reference card but not most of the factory-overclocked models). The issue is that it's a much bigger die, has an interposer and HBM stacks (so much higher assembly / manufacturing costs), and will consume considerably more power.

I think the issue is that Vega is a dual-purpose design, covering both professional markets (where it looks quite strong) and gaming. I'm guessing it hasn't got quite the optimal balance of resources to push very high frame rates, as it's the same (sub-optimal, imo) layout as Fiji: very compute-heavy (Fiji has the same back end in terms of ROPs and so on as Hawaii but almost double the shaders; if you look at nVidia's top gaming cards, they feature a heavier focus on the back end, for example). The thing that is really disappointing from a gaming standpoint is that the advanced new techniques AMD included in Vega appear to have done *absolutely nothing* for its gaming performance, which is a shame. It's got a discard accelerator, supports tile-based rendering, and has a stronger geometry engine with supposedly up to 2x throughput per clock; however, the performance-per-clock numbers show it has barely improved vs Fiji.
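To be clear on what I mean by "performance per clock": it's just a benchmark score divided by the sustained core clock. A quick Python sketch, with illustrative numbers rather than real measurements:

# Per-clock comparison as described above: benchmark FPS divided by the
# sustained core clock. All numbers below are illustrative, not measured.
cards = {
    "Fury X (Fiji)": {"fps": 60.0, "clock_mhz": 1050},
    "Vega FE":       {"fps": 88.0, "clock_mhz": 1550},
}

per_clock = {name: c["fps"] / c["clock_mhz"] for name, c in cards.items()}
for name, v in per_clock.items():
    print(f"{name}: {v:.4f} fps/MHz")

ratio = per_clock["Vega FE"] / per_clock["Fury X (Fiji)"]
print(f"Vega per-clock vs Fiji: {ratio:.2f}x")   # ~0.99x, i.e. barely any gain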
 


It's probably because everyone was hoping AMD could compete again in the top GPU gaming segment, which would bring prices of high-end cards a little lower. NVidia will have no reason to lower their prices or hurry with Volta cards if AMD can't even compete with today's NVidia architecture. Bad for both NVidia and AMD fans...
 
Yeah, it is disappointing to say the least. More and more it seems Vega is simply the stepping stone to Navi. Vega has pieces they needed for Navi that don't have any positive impact for Vega itself. Once they make Vega on 7 nm and "glue" two or three dies together, I think they will have realized their full vision, but for today Vega is a letdown.
 
Where exactly can AMD still compete? The RX 550/560 is no match for the GTX 1050/1050 Ti, the RX 570/580 is getting "eaten" by crypto miners so the GTX 1060 wins... RX Vega will probably compete with the GTX 1070/1080, but why would you buy it over the NVidia counterparts, which have much lower power consumption? If they price RX Vega under the 1070, miners will probably love it, but gamers will stay empty-handed again. Good thing AMD still has Ryzen to eat Intel's share of the market.
 


It's worth remembering that Vega *is* a very good compute GPU- and looks strong in the professional market. That is potentially quite a profitable market for them as well.

The issue is specific to gaming. Also, the low end isn't as much of an nVidia wash as you'd think; from the reviews I've seen, the performance hierarchy looks like this:
GT 1030 -> RX 550 -> GTX 1050 -> RX 560 -> GTX 1050 Ti -> RX 570 (although I agree the crypto miners have screwed up 570 / 580 prices- it's worth noting the price hike is due to resellers selling out of these cards, so AMD's sales numbers should be healthy enough on those parts).
 


Not everyone has to pay high prices for electricity, or cares, so power draw won't be a factor for some people. The lack of any other available cards, plus price, could be driving factors to buy one of the four variants.
 


From all the benches I've run and seen, the RX 560 is neck and neck with the 1050 (and the 1050 Ti is 30% more expensive for 15% more performance), the crypto-mining fad is at an end (so the RX 580 will be back on the market in the next few weeks), and the only benches we have on Vega are either leaked ones with non-final drivers (that still put it very close to the 1080) or ones in pro apps (where it eats the Titan XP and twice-as-expensive Quadro cards for breakfast).
So AMD can compete at entry level, in mining, in midrange, and in big compute; the one area where it doesn't compete is 4K gaming... which is also less than 4% of the market, according to Steam. Not bad for a small company going against two juggernauts.
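On the "30% more expensive for 15% more performance" point: that works out to roughly 12% worse performance per dollar. A quick Python check (the baseline price and score are hypothetical; only the ratios matter):

# Perf-per-dollar check on the "30% more expensive, 15% faster" claim.
# Baseline price/score are hypothetical placeholders; only the ratios matter.
base_price, base_perf = 110.0, 100.0    # hypothetical GTX 1050-class baseline
ti_price,   ti_perf   = base_price * 1.30, base_perf * 1.15

print(f"baseline: {base_perf / base_price:.3f} perf per dollar")  # 0.909
print(f"Ti card:  {ti_perf / ti_price:.3f} perf per dollar")      # 0.804, ~12% worse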
 