AMD Vega MegaThread! FAQ and Resources


jaymc

They announced that the GCN chip code-named Vega will actually be sold under the Vega name!! So I can only presume that the 500 series is the Polaris refresh, and that Vega will use its codename as its brand name.

@gonfreecss Polaris covers low to mid range, Vega the high end, yes.
 

vvacenovski

Prominent
Mar 3, 2017
33
0
530
How do you guys expect Vega to perform compared to the current Pascal cards? I'm doing a build next month and can afford a 1080 Ti. Would Vega be worth waiting a month more for?
 


Vega might beat the 1080 Ti in Doom and other id Tech 6 titles (of which there are none at the moment) and play in the regular 1080's ballpark otherwise. It's all speculation though, and driver optimization often lags behind a new chip's launch at AMD.
 


As of now, more is known about AMD's next range: the 580 will be a slightly higher-clocked 480 (it's still Polaris), while Vega will be a wholly different beast. You could get a factory-overclocked 480 now if its performance satisfies you (mind the cooler design though: Tom's France recently tested several 480 cards and found that some don't actually cool the chip well, and that the reference design isn't as lousy as its reputation suggests).
 

jaymc

AMD Vega-powered LiquidSky streaming servers go live...
This looks animal, and it's now in open beta... check it out. A nice design win for Vega as well...

https://youtu.be/KAn_oVBiBYY

http://hexus.net/gaming/news/industry/103957-amd-vega-powered-liquidsky-streaming-servers-go-live/
 


It's an April Fools' joke, just posted early.

I think Vega will be a solid product, but I wish the hype train would lay off it. They always build things up so much that when the product comes out and it's just a good, solid product, everyone acts disappointed. Vega will be competitive, but we have to stick to reality.

Besides, looking at those "slides", some things don't make any sense, like HBM2 having a 2048-bit bus when HBM1 had 4096-bit. Plus it says "up to 16GB", yet even the dual Vega card in that fictitious lineup only has 8GB.
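To be fair, the narrower bus isn't automatically a downgrade, because bandwidth is bus width times per-pin data rate. A quick back-of-the-envelope in Python (the Fury X figures are the published ones; the 2 Gbps/pin HBM2 rate is the spec number, so treat this as a sketch):

```python
# Memory bandwidth = (bus width in bits / 8) * per-pin data rate in Gbps
def bandwidth_gb_s(bus_width_bits, gbps_per_pin):
    return bus_width_bits / 8 * gbps_per_pin

# Fury X: 4 HBM1 stacks x 1024-bit = 4096-bit bus at 1 Gbps/pin
print(bandwidth_gb_s(4096, 1.0))  # 512.0 GB/s

# Two HBM2 stacks: 2048-bit bus at the spec'd 2 Gbps/pin
print(bandwidth_gb_s(2048, 2.0))  # 512.0 GB/s -- half the bus, same bandwidth
```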

Even though the Fury was roughly equal in performance to the 980 Ti despite having less memory (the bandwidth made up for the limited VRAM), people saw the 4GB it was rocking as a gimp factor despite the facts to the contrary. I doubt AMD would do a 4GB Vega card, since most people don't understand much beyond the big "X GB" of memory on the shiny box. Hence the multiple people who come here on the forums asking why they aren't getting 200fps on their GT 740 4GB. Something along the lines of:

"Why can't I get 200FPS on XXXXX game with my 740 4gb?"
Answer: "The GPU is too weak."
"But it has 4gb DDR5?!?"
Answer: "Yeah, it's about as useful as lipstick on a pig..."

I am patiently waiting for Vega, but I doubt I'll be upgrading my GPU anytime soon. My CPU is in line for an upgrade first; I'm waiting to see what Cannon Lake and Ryzen+ bring to the table. I'll probably end up with a fully new system at that point. It's been a while since I had an AMD CPU.
 

jaymc

Just realized that and deleted it... completely forgot about April Fools... they had me going... way too good to be true, I guess...

After seeing Crossfire running on two RX 480s in AdoredTV's new video, I would actually love to see a dual-GPU card with Vega... they seem to have Crossfire running perfectly. Was it always this smooth? I don't know; is it because the PCI Express lanes go directly to the CPU on Ryzen? Same as the mouse response he mentions in the game, with USB 3.1 going straight to the CPU as well...

Other reviewers are talking about this "silky smooth" gaming experience with Ryzen as well... just in case you haven't seen the video, here it is. Very different results in DX12 with AMD GPUs; everyone else tested with Nvidia GPUs: https://www.youtube.com/watch?v=0tfTZjugDeg

But judging by that AdoredTV video, Vega is going to be very competitive in DX12, especially with more CPU cores, e.g. Ryzen, or I suspect the 6900K... unless Nvidia does something about their DX12 drivers, and fast...

I can see myself buying Vega and Ryzen... don't know when, but I will eventually, I reckon. Ryzen's getting better all the time... and Vega looks like it's going to be a demon in DX12 games.
 
DX12 has helped a lot and AMD has been really good with their drivers lately. Crossfire is a great feature... when it's supported and has a proper game profile. I'm sure that for every game that runs great, there is one with massive stuttering and poor performance. Just the nature of the beast.

From what I've been reading, Crossfire has been fairly stable, while SLI has been getting worse and worse. The fact that AMD allows Crossfire on all their GPUs is pretty cool. They seem to understand that people want a decent card but don't always have much money, then later want more performance without breaking the bank and might want to try Crossfire. That is what happened with me: I bought a 6870, then a few years later wanted more, so I bought a used one from eBay. That Crossfire setup ran nicely for me until I had the money to upgrade to my current R9 390.

A dual Vega card is not out of the question, but I have to wonder whether it would be bottlenecked by a single PCIe 3.0 slot. A card that powerful (assuming it lands somewhere near the 1080 Ti/Titan Xp) would need a lot of bandwidth, but it won't have the convenience of using two PCIe slots.
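For scale, here's the rough PCIe math (standard PCIe 3.0 figures: 8 GT/s per lane with 128b/130b encoding; the even x8/x8 split for a dual-GPU card is my assumption, since bridge chips can actually share the link dynamically):

```python
# PCIe 3.0: 8 GT/s per lane with 128b/130b encoding, per direction
def pcie3_gb_s(lanes):
    usable_gbps = lanes * 8.0 * (128 / 130)  # usable bits per second
    return usable_gbps / 8                   # bits -> bytes

print(pcie3_gb_s(16))  # ~15.75 GB/s for a full x16 slot
print(pcie3_gb_s(8))   # ~7.88 GB/s per GPU if a dual-GPU card split it evenly
```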
 

jaymc



They did release an awful lot of updates to Crossfire in the recent past... I couldn't help thinking they were considering a new dual-GPU card with Vega... It's a very good point you make though: would one PCIe x16 slot handle the traffic? Good question, I have no idea... Could one slot handle the traffic of two 1080s, even?
 


No, it is not. If a game needs more than 4GB then the Fury will not be able to cope with it; if it could, AMD would not have needed to create the HBCC for Vega. In the Fury's case, AMD has to play around with their drivers to lower VRAM usage in games. They probably do it the same way Nvidia handled VRAM management on their cards with odd memory configs.
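The HBCC pitch is basically VRAM-as-a-cache: pages get pulled in on demand and the least-recently-used ones get evicted. A toy illustration of the idea (purely conceptual, nothing like AMD's actual implementation):

```python
from collections import OrderedDict

# Toy "VRAM as cache": pages page in on first touch, LRU pages get evicted.
class ToyHBCC:
    def __init__(self, vram_pages):
        self.vram_pages = vram_pages
        self.resident = OrderedDict()  # page id -> resident, ordered by recency
        self.faults = 0

    def access(self, page):
        if page in self.resident:
            self.resident.move_to_end(page)        # hit: mark recently used
            return
        self.faults += 1                           # miss: "page in" over PCIe
        if len(self.resident) >= self.vram_pages:
            self.resident.popitem(last=False)      # evict least-recently-used
        self.resident[page] = True

# 4 pages of "VRAM" serving a 6-page game: it still runs, it just pays
# page-fault traffic for the cold pages instead of failing outright.
vram = ToyHBCC(vram_pages=4)
for page in [0, 1, 2, 3, 0, 1, 0, 1, 4, 0, 1, 5]:
    vram.access(page)
print(vram.faults)  # 6 of 12 accesses fault; the hot pages stay resident
```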
 

Th3pwn3r

The pricing of Vega will make or break the card. If it's anywhere near the price of a 1080 Ti, I doubt I'll be buying it. If it's a couple hundred dollars less, then I'll buy one or two of them :) My machines need new cards pretty badly.
 


So you wouldn't buy Vega if it was the same price as the 1080 Ti but had 10% more performance? I don't know how it will perform, but it seems you are assuming it will deliver less for the same price.
 


What I was referring to was that when the VRAM gets full and the GPU needs to swap data out, the bandwidth somewhat compensates and lowers the latency. It's not perfect, but the Fury X seems to manage. I'm sure there is more to it than just that, but I believe that is part of the equation.
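Rough numbers on what that swap actually costs (the 256MB asset size is an assumed example; the bandwidth figures are the usual published ones):

```python
# Time to fetch a chunk of texture data: local VRAM vs over the PCIe bus.
texture_gb = 256 / 1024   # assumed 256MB asset

vram_gb_s = 512.0         # Fury X HBM1 bandwidth
pcie_gb_s = 15.75         # usable PCIe 3.0 x16 bandwidth

print(f"from VRAM: {texture_gb / vram_gb_s * 1000:.2f} ms")  # ~0.49 ms
print(f"over PCIe: {texture_gb / pcie_gb_s * 1000:.2f} ms")  # ~15.87 ms
# The PCIe trip is the painful part; fast VRAM mostly hides the shuffling
# of whatever is already resident.
```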
 

Th3pwn3r



Well, I'll have to factor in power consumption as well, but the 1080/1080 Ti are tried and true. We have real-world data from real people telling us how they perform, people who aren't working at Nvidia. What do we have for Vega? Just what AMD tells us. I was sick of AMD playing the waiting/delay game over a decade ago. They still stall things out far too much in this day and age.
 


Because it has been released. Vega is an unknown to everyone but AMD. Every vendor picks benchmarks that make their product look as good as possible; that's just marketing.

I seriously doubt AMD is delaying for no reason. They had a delay waiting for HBM2 to be ready and available in mass quantities for Vega, which means millions of chips. Nvidia wasn't willing to wait and went with GDDR5(X) for their consumer products. I am not aware of any other delays, but I would think that at this point they are in production and testing, as making millions of GPUs is not a short process.

AMD might be able to get fairly close to Nvidia on power consumption this time, as they have the 14nm process plus HBM2 and architecture refinements. This is all speculation as there is nothing concrete yet, but I really doubt AMD is going to release a 400W monster of a card that can't even keep up with the 1080; they can't afford that. This generation of GPUs and CPUs is make-or-break for AMD. Ryzen has been a fair success and performs well, but is really just getting off the ground. Vega needs to be competitive in price, performance and power consumption.

The Nvidia praise for power consumption is interesting, since Nvidia fanboys kept screaming that it didn't matter back in the Fermi days, but now tout it as proof of their superiority. I think for most people price/performance is the main concern, and AMD tends to win that battle... like they are with the 470, 480 4GB and 480 8GB at the moment.
 

Th3pwn3r

The real problem is that we're all spoiled. Most people want the best for the least, the most for the least. Some people don't care though and will pay any price for the absolute best they can get, like with the Extreme Edition processors years ago that were $1,300 versus the processor right under them for $400.

Anyhow, for me power consumption is a big deal because the less power consumed, the less heat created as a by-product. I don't even have hard drives in my machines anymore due to power consumption. Plus, I learned that storing things is kind of a waste of space considering how well I can stream everything these days :)

As of right now I have two machines running older AMD cards (R9 290s, I believe) and they still serve me pretty well for what I play most of the time, but my third machine is running a Gigabyte 1080 Xtreme edition, which can handle the 4K gaming I use it for. So... depending on what happens with Vega, I may be investing the $1,400-1,600 in 1080 Ti cards, or whatever the price for two Vega cards will be. Some people have posted numbers ranging from $300 to $600 for Vega depending on which variant you choose. Not sure where they came up with those numbers, but... they were there.
 




Nah, that goes both ways. Back in 2010, many on this forum were saying that using an Nvidia GTX 480 would make your power bill skyrocket by the end of the month, but when AMD started losing their power-efficiency competitiveness, no one mentioned those "power bill issues" anymore.
 


And that's the thing that's killing AMD...
 

jaymc

Distinguished
Dec 7, 2007
614
9
18,985
Found this on the Nvidia forum; it's about AMD's GPUs working far better with DX12 than Nvidia's...

The original poster keeps pushing for someone from Nvidia to answer, but to no avail... he gets some interesting info along the way though...

User supermanjoe states "In the architecture side, the main difference of the IHV implementations is the concurrency and interleaving support. In the practice GCN can work very well with any codepath. Pascal is better than Maxwell, but still not work well with mixed graphics-compute warp interleaving. But it won’t get negative scaling like Maxwell. Probably the secret of GCN is the ability to execute compute pipelines along with any graphics pipeline on the same CU with any state. But there are not much data on how they are doing this. GCN1 is not as good as GCN2/3/4 at this, but it is still the second best design."

This looks like a hardware problem for sure, if true that is, I guess... anyhow, the whole thread can be read here: https://forums.geforce.com/default/topic/964990/-iquest-the-future-of-directx-12-nvidia-/?offset=20

So according to this poster, Nvidia doesn't know how AMD is doing it so well... and Maxwell got negative scaling in DX12... It appears they have a real problem here...

Here's another quote from the same thread "In theory the best scenario for the game engines is to offload the long running compute jobs to a compute queue, so these can run asynchronously with the graphics pipelines, and this is the best case scenario for AMD too. But this is also the worst case scenario for Nvidia, even for Pascal. Now most games are designed for this, and Nvidia can’t handle it well."
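A toy frame-time model shows why that cuts both ways (the numbers and the switch penalty are assumptions for illustration, not measured):

```python
# Frame cost of graphics work g plus a long-running compute job c, in ms.
g, c = 10.0, 6.0
switch_penalty = 1.5   # assumed cost of draining/switching between workloads

concurrent = max(g, c)               # idealized GCN-style overlap
serialized = g + c                   # no overlap at all
penalized  = g + c + switch_penalty  # why "async on" can score worse than off

print(concurrent, serialized, penalized)  # 10.0 16.0 17.5 -- negative scaling
```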

Well there ya have it..
Jay
 