That's cool. I'm glad that ATi managed to develop Vega further for AMD's APUs.
When I said "disappointment" I meant that after all of the hype surrounding it as "The Next Best Thing", it couldn't even outperform what nVidia already had on the market. Remember how Vega was launched in more than one stage? The first stage was for workstations and the second was for gaming. It didn't really stand up well against nVidia's offerings and the pricing was out to lunch. I can't forget how frustrated I was because I wanted to upgrade to Vega but it just wasn't worth it.
This is why Radeon VII came out not long after, and it also turned out to be underwhelming compared to what nVidia had, plus it was way overpriced. This is the reason that I'm still rocking my R9 Furies instead of something newer and better. Well, that and the fact that AMD has, for some unknown reason, adopted nVidia's pricing strategy, which really doesn't make sense to me.
Yeah, I have a Vega GPU as well in my R5-3500U craptop (Vega 8, I think), but I just use craptops for farting around when I'm not at home with no access to "The Monster that Lives in the Black Tower" (corny as hell but I like it 😛). For some reason my craptop also came with a free mobile GTX 1050. Now, I despise the Green Goblin, but if it's free, I'll take it. I leave it disabled until I actually need it for something (encoding video with CUDA has been my biggest use of it so far).
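For the curious, the "encoding with CUDA" part is really just ffmpeg handing the encode to NVENC. A minimal sketch of that kind of command, wrapped in Python (the file names and bitrate are placeholders, and it assumes an ffmpeg build with NVENC support):

```python
# Minimal sketch: offload an H.264 encode to the 1050's NVENC block via ffmpeg.
# File names and bitrate are placeholders; requires ffmpeg built with NVENC.
import subprocess

subprocess.run([
    "ffmpeg",
    "-hwaccel", "cuda",      # let the GPU handle decode where it can
    "-i", "input.mkv",       # placeholder source file
    "-c:v", "h264_nvenc",    # hand the encode off to NVENC
    "-preset", "slow",       # slower preset = better quality per bitrate
    "-b:v", "8M",            # target bitrate, adjust to taste
    "-c:a", "copy",          # pass the audio through untouched
    "output.mp4",
], check=True)
```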
I agree with you there but this was the problem with AMD's marketing department. They were hyping the crap out of Vega which made people think that maybe ATi had finally put out something that could match or exceed Pascal. Remember that an R9 Fury is capable of competing with a GTX 1070 at 1440p (YouTuber Greg Salazar has a video about it).
My personal disappointment was AMD abandoning Tahiti, a VERY potent architecture, to produce Polaris which, while quite efficient, was a serious step backwards in performance. Since I flatly refuse to buy anything from nVidia (it makes my keyboard and mouse feel slimy), I was still stuck with my R9 Fury. I say stuck because I want to game at 1440p, and while the Fury technically CAN do that, 1440p at medium settings looks worse to me than 1080p at high settings.
Newegg had an insane sale on some refurbished ones for $200CAD so I bought a second one. Crossfire does still work sometimes, so I thought "What the hell, I have a 1000W PSU, might as well use it!" because these cards draw 700W under load by themselves. I've only ever actually used them both in benchmarks because, fortunately, a single R9 Fury is still more than powerful enough to run any game happily at 1080p (which is what I mostly play at, though I'd love to jump to 1440p), just not necessarily at max settings.
Well, a funny thing about that. As I was typing all of this out, I found an XFX RX 5700 XT for only $480CAD. So I bought the thing. LOL
I can understand that AMD may have said Vega would be the 'next big thing' but to be fair, they weren't ENTIRELY wrong.
Vega has massive compute throughput (far more than its NV counterparts), so while it was pretty good at gaming, it was also amazing for professional software.
Trouble is, not many games made use of AMD's compute or open source features (such as TressFX, etc.), which run on ANY hw (and do so pretty well - far better than NV's proprietary features)... but most developers tend to go for the money, so when games are developed on PC, they optimize the software for GPUs from companies that can pay them to do so (aka, usually NV - since it has deep pockets).
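To put rough numbers on that compute gap, here's a quick back-of-the-envelope FP32 calculation using the usual shaders × 2 ops × clock formula (reference boost clocks, so treat it as ballpark only):

```python
# Rough FP32 throughput: shader count * 2 ops per clock (FMA) * boost clock.
# Reference boost clocks; sustained clocks in real workloads will differ.
def fp32_tflops(shaders: int, boost_ghz: float) -> float:
    return shaders * 2 * boost_ghz / 1000.0

cards = {
    "Vega 64 (4096 SPs @ ~1.55 GHz)":       fp32_tflops(4096, 1.55),  # ~12.7
    "GTX 1080 (2560 cores @ ~1.73 GHz)":    fp32_tflops(2560, 1.73),  # ~8.9
    "GTX 1080 Ti (3584 cores @ ~1.58 GHz)": fp32_tflops(3584, 1.58),  # ~11.3
}

for name, tflops in cards.items():
    print(f"{name}: {tflops:.1f} TFLOPS")
```

That raw throughput is exactly why it was so strong in professional software even when games never exploited it.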
That, and I also think some people simply had expectations that were too big.
AMD was a much smaller company than NV with far fewer resources (the fact that they released Vega and ended up just one step below competing with NV at the absolute top end was actually amazing when you consider the size of NV and its resources vs AMD's).
In fairness, Radeon VII WAS indeed what they said it was... the first 7nm gaming GPU, fit for pretty much high-end gaming (with unmatched compute performance, I might add).
It basically did bring competition to Nvidia... perhaps not in the absolute high end, but rather a step down from that.
One of the things that increased Vega's power consumption was its compute power (that, and the fact that AMD tended to release GPUs with unoptimized voltages in order to increase the number of functional dies).
With Navi, AMD basically took Vega and lowered compute whilst focusing more on gaming-relevant hw and performance... this improved power consumption, but voltages were still quite high (manually lowering the voltage tends to drop power consumption by about 20% while increasing performance by about 5%, because the boost clocks can be sustained).
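The rough math behind that, if anyone's curious: dynamic power scales roughly with frequency × voltage², so a modest voltage drop at the same clock cuts power disproportionately. The voltages below are purely illustrative, not actual Navi values:

```python
# Dynamic power scales roughly with f * V^2, so at a fixed clock the
# power ratio is just (V_tuned / V_stock)^2.  Illustrative voltages only.
stock_v = 1.20   # hypothetical stock voltage (V)
tuned_v = 1.07   # hypothetical undervolted value (V)

ratio = (tuned_v / stock_v) ** 2
print(f"Dynamic power at the same clock: ~{ratio:.0%} of stock")  # ~80%
print(f"Savings: ~{1 - ratio:.0%}")                               # ~20%
```

The small performance bump then comes from the card holding its boost clock instead of throttling against the power limit.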
With enhanced Vega (which they used in the Zen 2-based Renoir APUs), they focused on improving the architecture itself (and with this, I think AMD made the right choice - focus on improving the uArch and making new ones, getting IPC and efficiency gains from uArch-level modifications rather than relying on frequency increases and node enhancements alone).
RDNA 2 is the next step up from that, as it affords 50% greater performance per watt compared to Navi (and enhanced Vega), so it will be very interesting to see how it performs (there's potential for large performance gains - but obviously, the GPU die would also need to be larger if AMD introduces more CUs, and that could cause some yield issues with viable dies... at least until AMD decides to start using chiplets for GPUs, which may happen with RDNA 3 next year).
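Just to spell out what that +50% perf-per-watt claim translates to, using the 5700 XT's ~225W board power as the Navi baseline (purely illustrative - real products will land somewhere between the two extremes):

```python
# What '+50% performance per watt' buys you, relative to a ~225 W Navi card.
navi_power = 225.0        # W, RX 5700 XT reference board power
perf_per_watt_gain = 1.5  # the quoted +50%

same_power_perf = 1.0 * perf_per_watt_gain         # hold power, gain performance
same_perf_power = navi_power / perf_per_watt_gain  # hold performance, cut power

print(f"Same 225 W budget -> ~{same_power_perf:.1f}x Navi performance")
print(f"Same performance  -> ~{same_perf_power:.0f} W board power")  # ~150 W
```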
Anyway, for me, I think focusing on the 'absolute high end' is not the correct strategy.
Given how fast things progress and how frequently new GPUs come out, if I were on a desktop, I'd hold off on upgrading a mid-range GPU for about 3 years... maybe 4 (depending on the games and resolution I'm playing at - and of course the software I'm using in general).
At any rate (and as I said before), the high-end GPUs of the current generation will 'trickle down' to the mid-range of the next generation (at more appropriate prices), with potentially better features and maybe even better performance at lower power draw.
So, for me, even if I'm mainly focusing on content creation, mid-range to entry-level high end (like Vega 56 was at the time) is more than enough to last a while until I'm ready for a replacement (but in fairness, I have no plans to replace my PH517-61 monster of a laptop anytime soon - maybe in 3 years if it becomes necessary as I advance in my studies, though I think I'm covered for about 4).