News Vega Reloaded: AMD Ryzen 5000 Cezanne iGPU Exposed In New Benchmarks

I feel it is a little too early to say if this is the graphics config for the top-end Ryzen 5xxx APU. I believe I read somewhere that AMD confirmed they will stick with Vega for the APU for another generation before moving on to RDNA2. But I am sure AMD will not sit idle and retain Vega 8 for their top-end APU when Intel Xe is threatening to dethrone their graphics edge. I suspect that AMD may introduce Vega 10 or 11 for their next-gen top-end APU.
 
As interesting as this is, I find most desktop-based IGPs to be more or less irrelevant for the vast majority of us. True, there are those for whom IGPs in desktops are a godsend but their needs tend to be so pedestrian that even Intel graphics would be enough for them. Attaching these GPUs to low-powered CPUs makes the most sense as even the current Athlons make for great HTPCs.

Vega was a disappointment, no doubt, but that was for high-end gaming. However, for an IGP, it is spectacular and if it's less expensive for ATi to produce Vega than RDNA, so much the better for AMD's pricing on these APUs. I really think that AMD made the right choice here. Save your more expensive bullets for your bigger guns.
 
Without power figures it's pointless... ~6% more performance for 10% more power -> bad.
~6% with identical power or less -> nice, especially for laptop parts.

Since Zen 3 will be manufactured on N7P, which allows either 10% improved efficiency or up to 7% greater performance (compared to the existing 7nm node), I'd say it's 6% more performance at the same power consumption.
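To put rough numbers on that trade-off, here's a quick napkin calculation (Python, with a made-up baseline score and a 15W U-series power target assumed purely for illustration):

# Napkin math on the perf-per-watt trade-off - baseline figures are made up.
baseline_perf = 100.0   # arbitrary performance score
baseline_power = 15.0   # watts, assumed U-series APU target

# Scenario A: +6% performance for +10% power -> perf/W actually gets worse
print(baseline_perf * 1.06 / (baseline_power * 1.10))  # ~6.42
print(baseline_perf / baseline_power)                  # ~6.67 (stock)

# Scenario B: +6% performance at the same power -> perf/W improves by the same 6%
print(baseline_perf * 1.06 / baseline_power)           # ~7.07

Which is why the same ~6% figure can be either bad news or good news for laptop parts, depending on the power number attached to it.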

To be fair, it would have been better if AMD had incorporated, say, 2 more CUs instead of just raising clocks.
 
Wouldn't the APU GFX performance benefit hugely from a quad-channel RAM controller?

Well, going from dual to quad channel would nearly double the available bandwidth, but in laptops you don't usually get more than 2 RAM slots (unless you're getting a high-powered machine that could be described as a desktop replacement, which comes with 4 RAM slots - and those are usually more expensive than the rest), so dual channel is often seen as 'enough'.
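For a rough idea of the numbers involved (assuming plain DDR4-3200 here, which is just an example configuration, not something confirmed for these APUs):

# Theoretical peak bandwidth: transfer rate (MT/s) x 8 bytes per 64-bit channel
mt_per_s = 3200                          # DDR4-3200, assumed for illustration
per_channel_gb_s = mt_per_s * 8 / 1000   # 25.6 GB/s per channel

print(2 * per_channel_gb_s)              # dual channel:  51.2 GB/s
print(4 * per_channel_gb_s)              # quad channel: 102.4 GB/s

And that 51.2 GB/s is shared between the CPU cores and the iGPU, which is why bandwidth tends to be the first wall these APUs hit.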

I do think AMD is a bit late with incorporating quad-channel, but in all honesty, apart from the desktop iGP, the whole system wouldn't really benefit from it (and laptops with APUs don't usually get 4 RAM slots).
 
As interesting as this is, I find most desktop-based IGPs to be more or less irrelevant for the vast majority of us. True, there are those for whom IGPs in desktops are a godsend but their needs tend to be so pedestrian that even Intel graphics would be enough for them. Attaching these GPUs to low-powered CPUs makes the most sense as even the current Athlons make for great HTPCs.

Vega was a disappointment, no doubt, but that was for high-end gaming. However, for an IGP, it is spectacular and if it's less expensive for ATi to produce Vega than RDNA, so much the better for AMD's pricing on these APUs. I really think that AMD made the right choice here. Save your more expensive bullets for your bigger guns.

Vega was revamped for Renoir (they call it 'enhanced Vega').
AMD managed to increase each Vega core's performance by 56%. About 15% of this performance came from clock increases (afforded by the 7nm node); the other 41% came from pure uArch enhancements.

In essence, AMD made 'enhanced Vega' to resemble Navi in performance and efficiency.
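As a quick sanity check on how those two contributions stack up (purely illustrative arithmetic, not AMD's own breakdown):

# Per-core gain split into clock vs uArch contributions - illustrative only.
clock_gain = 0.15
uarch_gain = 0.41

print(clock_gain + uarch_gain)              # 0.56 if the figures are simply additive
print((1 + clock_gain) * (1 + uarch_gain))  # ~1.62x if they compounded instead

Either way, the bulk of the uplift clearly isn't coming from clocks alone.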

Vega wasn't really a disappointment... I am rocking a Vega 56 in my Acer Predator Helios 500 laptop (PH517-61). It's running fast, cool and quiet when fully stressed, and it's hard-limited to 120W (which isn't even fully used - I can easily OC it to 1450 MHz on the core and 900/950 MHz on the HBM without exceeding 120W, and I'm just a hair from reaching GTX 1080/Vega 64).

The reason I went with Vega in that laptop is that it comes with a large amount of compute performance (which is useful for my content creation)... and the laptop is a desktop replacement.

I can easily run any game maxed out... and I'm barely getting to 65 degrees Celsius when gaming (with the laptop being generally barely audible in the process).

Vega needs proper voltage/frequency tuning to make it tick appropriately.
AMD overvolted the heck out of Vega on desktop (which is why it was sucking a lot of power), but it could be undervolted to bring that well under control and improve thermal efficiency as well as performance.
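A rough illustration of why that undervolting headroom matters so much - using the classic dynamic-power approximation (P roughly proportional to V^2 x f) with made-up voltages, so treat it as a sketch only:

# Dynamic power scales roughly with V^2 * f, so a modest undervolt pays off quickly.
v_stock, v_undervolt = 1.20, 1.05   # volts - example figures, not official Vega specs
frequency_factor = 1.0              # hold clocks constant for the comparison

relative_power = (v_undervolt / v_stock) ** 2 * frequency_factor
print(relative_power)  # ~0.77 -> roughly 23% less dynamic power at the same clock

In practice the card then also boosts higher and longer, because it stops bumping into its power limit.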

Not sure what the problem was... the fact it didn't compete with the 1080 Ti?
Well, I'm not bothered by that because most of the money is in the mid-range anyway.
Besides, getting too high-powered a GPU can be expensive... so why bother, when I can easily get the same or better performance with better power consumption in the next generation of GPUs?
 
I feel it is a little too early to say if this is the graphics config for the top-end Ryzen 5xxx APU. I believe I read somewhere that AMD confirmed they will stick with Vega for the APU for another generation before moving on to RDNA2. But I am sure AMD will not sit idle and retain Vega 8 for their top-end APU when Intel Xe is threatening to dethrone their graphics edge. I suspect that AMD may introduce Vega 10 or 11 for their next-gen top-end APU.

This is what I'm thinking as well.
Enhanced Vega is great, but one of its problems is that it's lacking in CUs.
AMD would do well to introduce an APU with 10, or possibly 12, Vega CUs for the iGP.
It would increase performance by quite a bit.
 
Yep, but there would be almost no demand for those, and you need a heck of a lot of memory bandwidth to feed those GPUs, which means very expensive memory. Let's see in 5 years, when DDR5 is no longer ridiculously expensive. Maybe then we'll get 12 CU APUs!
 
Wouldn't the APU GFX performance benefit hugely from a quad-channel RAM controller?
I believe it may benefit iGPUs, but at the expense of cost. Quad-channel memory will increase the cost of the processor as well as the motherboard, if I am not mistaken. Which runs counter to people buying APUs for gaming because they are on a budget. Considering a low-end GTX 1650 runs a lot faster than the fastest iGPU and doesn't cost that much, I'd rather get a dedicated GPU if the cost of APU + motherboard + RAM becomes too expensive.
 
I believe it may benefit iGPUs, but at the expense of cost. Quad-channel memory will increase the cost of the processor as well as the motherboard, if I am not mistaken. Which runs counter to people buying APUs for gaming because they are on a budget. Considering a low-end GTX 1650 runs a lot faster than the fastest iGPU and doesn't cost that much, I'd rather get a dedicated GPU if the cost of APU + motherboard + RAM becomes too expensive.
Yes, I can see that. However, a GTX 1650 has a different energy budget. The thinking is more: why add CUs or increase iGPU frequency for better GFX performance when the current bottleneck is bandwidth? And with 8-core/16-thread CPUs, I imagine even the CPU performance could benefit. Now, I am ignorant of CPU design, but I still can't escape the feeling that it might be cheaper overall.
 
Vega was revamped for Renoir (they call it 'enhanced Vega').
AMD managed to increase each Vega core's performance by 56%. About 15% of this performance came from clock increases (afforded by the 7nm node); the other 41% came from pure uArch enhancements.

In essence, AMD made 'enhanced Vega' to resemble Navi in performance and efficiency.
That's cool. I'm glad that ATi managed to develop Vega further for AMD's APUs.
Vega wasn't really a disappointment... I am rocking a Vega 56 in my Acer Predator Helios 500 laptop (PH517-61). It's running fast, cool and quiet when fully stressed, and it's hard-limited to 120W (which isn't even fully used - I can easily OC it to 1450 MHz on the core and 900/950 MHz on the HBM without exceeding 120W, and I'm just a hair from reaching GTX 1080/Vega 64).
When I said "disappointment" I meant that after all of the hype surrounding it as "The Next Best Thing", it couldn't even outperform what nVidia already had on the market. Remember how Vega was launched in more than one stage? The first stage was for workstations and the second was for gaming. It didn't really stand up well against nVidia's offerings and the pricing was out to lunch. I can't forget how frustrated I was because I wanted to upgrade to Vega but it just wasn't worth it.

This is why Radeon VII came out not too long after which also turned out to be underwhelming compared to what nVidia had and it was also way overpriced. This is the reason that I'm still rocking my R9 Furies instead of something newer and better. Well, that and the fact that AMD has, for some unknown reason, adopted nVidia's pricing strategy which really doesn't make sense to me.
The reason I went with Vega in that laptop is that it comes with a large amount of compute performance (which is useful for my content creation)... and the laptop is a desktop replacement.

I can easily run any game maxed out... and I'm barely getting to 65 degrees Celsius when gaming (with the laptop being generally barely audible in the process).
Yeah, I have a Vega GPU as well in my R5-3500U craptop (Vega 8 I think) but I just use craptops for farting around when I'm not at home and have no access to "The Monster that Lives in the Black Tower" (corny as hell but I like it 😛) and for some reason my craptop came with a free mobile GTX 1050. Now, I despise the Green Goblin but if it's free, I'll take it. I actually leave it disabled until I actually need it for something (encoding video with CUDA has been my biggest use of it so far).
Not sure what the problem was... the fact it didn't compete with the 1080 Ti?
Well, I'm not bothered by that because most of the money is in the mid-range anyway.
I agree with you there but this was the problem with AMD's marketing department. They were hyping the crap out of Vega which made people think that maybe ATi had finally put out something that could match or exceed Pascal. Remember that an R9 Fury is capable of competing with a GTX 1070 at 1440p (YouTuber Greg Salazar has a video about it).

My personal disappointment was AMD's abandonment of Fiji, an architecture that was VERY potent, in order to produce Polaris, which, while quite efficient, was a serious step backwards in performance. Since I flatly refuse to buy anything from nVidia (it makes my keyboard and mouse feel slimy), I was still stuck with my R9 Fury. I say stuck because I want to game at 1440p, and while the Fury technically CAN do that, 1440p at medium settings looks worse to me than 1080p at high settings.

Newegg had some insane sale on some refurbished ones for $200CAD so I bought a second one. Crossfire does still work sometimes so I thought "What the hell, I have a 1000W PSU, might as well use it!" because these cards draw 700W under load by themselves. I've only ever actually used them both in benchmarks because fortunately, for gaming at 1080p (which is what I mostly do but I'd love to jump to 1440p), the R9 Fury is still more than powerful enough to run any game happily at 1080p but not necessarily at max settings.
Besides, getting too high-powered a GPU can be expensive... so why bother, when I can easily get the same or better performance with better power consumption in the next generation of GPUs?
Well, a funny thing about that. As I was typing all of this out, I found an XFX RX 5700 XT for only $480CAD. So I bought the thing. LOL
 
That's cool. I'm glad that ATi managed to develop Vega further for AMD's APUs.

When I said "disappointment" I meant that after all of the hype surrounding it as "The Next Best Thing", it couldn't even outperform what nVidia already had on the market. Remember how Vega was launched in more than one stage? The first stage was for workstations and the second was for gaming. It didn't really stand up well against nVidia's offerings and the pricing was out to lunch. I can't forget how frustrated I was because I wanted to upgrade to Vega but it just wasn't worth it.

This is why Radeon VII came out not too long after which also turned out to be underwhelming compared to what nVidia had and it was also way overpriced. This is the reason that I'm still rocking my R9 Furies instead of something newer and better. Well, that and the fact that AMD has, for some unknown reason, adopted nVidia's pricing strategy which really doesn't make sense to me.

Yeah, I have a Vega GPU as well in my R5-3500U craptop (Vega 8 I think) but I just use craptops for farting around when I'm not at home and have no access to "The Monster that Lives in the Black Tower" (corny as hell but I like it 😛) and for some reason my craptop came with a free mobile GTX 1050. Now, I despise the Green Goblin but if it's free, I'll take it. I actually leave it disabled until I actually need it for something (encoding video with CUDA has been my biggest use of it so far).

I agree with you there but this was the problem with AMD's marketing department. They were hyping the crap out of Vega which made people think that maybe ATi had finally put out something that could match or exceed Pascal. Remember that an R9 Fury is capable of competing with a GTX 1070 at 1440p (YouTuber Greg Salazar has a video about it).

My personal disappointment was AMD's abandonment of Fiji, an architecture that was VERY potent, in order to produce Polaris, which, while quite efficient, was a serious step backwards in performance. Since I flatly refuse to buy anything from nVidia (it makes my keyboard and mouse feel slimy), I was still stuck with my R9 Fury. I say stuck because I want to game at 1440p, and while the Fury technically CAN do that, 1440p at medium settings looks worse to me than 1080p at high settings.

Newegg had some insane sale on some refurbished ones for $200CAD so I bought a second one. Crossfire does still work sometimes so I thought "What the hell, I have a 1000W PSU, might as well use it!" because these cards draw 700W under load by themselves. I've only ever actually used them both in benchmarks because fortunately, for gaming at 1080p (which is what I mostly do but I'd love to jump to 1440p), the R9 Fury is still more than powerful enough to run any game happily at 1080p but not necessarily at max settings.

Well, a funny thing about that. As I was typing all of this out, I found an XFX RX 5700 XT for only $480CAD. So I bought the thing. LOL


I can understand that AMD may have said Vega would be the 'next big thing' but to be fair, they weren't ENTIRELY wrong.
Vega has massive compute (far more than any NV counterparts do), so it was pretty good at gaming, but it was also amazing for professional software.

Trouble is, not many games made use of AMD's compute or open-source features (such as TressFX, etc.), which run on ANY hw (and do so pretty well - far better than NV's proprietary features)... but most developers tend to go for the money, so when games are developed on PC, they will optimize the software for GPUs from companies that can pay them to do so (aka usually NV, since it has deep pockets).

That, and I also think some people may have had expectations that were too high.
AMD was a much smaller company compared to NV with far fewer resources (the fact they released Vega and were a mere step down from competing with NV at the absolute top end was actually amazing when you consider the size of NV and its resources vs AMD).

In fairness, Radeon VII WAS indeed what they said it was... the first 7nm gaming GPU, fit for pretty much high-end gaming (with unmatched compute performance, I might add).
It basically did bring competition to Nvidia... perhaps not in the absolute high end, but rather a step down from that.

One of the things that increased Vega's power consumption was its compute power (that, and the fact that AMD tended to release GPUs with unoptimized voltages in order to increase the number of functional dies).
With Navi, AMD basically took Vega and lowered compute whilst focusing more on gaming relevant hw and performance... this improved power consumption, but voltages were still quite high (but manually modifying voltages tends to drop power consumption by about 20% while increasing performance by 5% because the boost clocks can be maintained).
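Put together, those two figures are a bigger deal than they look individually (illustrative arithmetic only, based on the rough numbers above):

# Combined effect of ~20% lower power and ~5% higher performance (illustrative).
power_factor = 0.80
perf_factor = 1.05

print(perf_factor / power_factor)  # ~1.31 -> roughly 30% better perf-per-watt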

With enhanced Vega (which they used in Zen 2 Renoir), they focused on improving the architecture (and with this, I think AMD made the right choice - focus on improving the uArch and making new ones - IPC and efficiency gains from uArch level modifications rather than relying on frequency increases and node enhancements alone).

RDNA 2 is the next step up from that, as it affords 50% greater performance per watt compared to Navi (and enhanced Vega), so it will be very interesting to see how it performs (there's potential there for large performance gains - but obviously the GPU die would also need to be larger if AMD introduces more CUs, which could cause some issues with viable dies... at least until AMD decides to start using chiplets for GPUs, which may happen with RDNA 3 next year).
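For a back-of-the-envelope feel for what a 50% perf/watt uplift can buy (same caveat as above - illustrative, not a prediction of actual products):

# Two ways to 'spend' a 1.5x perf-per-watt improvement (illustrative).
ppw_gain = 1.5

print(ppw_gain)                # 1.5x performance at the same power budget, or...
print(round(1 / ppw_gain, 2))  # ~0.67x power (about a third less) for the same performance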

Anyway, for me, I think focusing on the 'absolute high end' is not the correct strategy.
Given how fast things progress and how frequently new GPU's come out, if I was on a desktop, I'd abstain from upgrading the mid-range GPU for about 3 years... maybe 4 (depending on the games and resolution I'm playing at - and of course the software I'm using in general).

At any rate (and as I said before), the high-end GPUs of the current generation will 'trickle down' to the mid-range of the new generation (at appropriate prices), with potentially better features, maybe even better performance, and lower power draw as well.

So, for me, even if I'm mainly focusing on content creation, mid-range to entry-level high-end (like Vega 56 was at the time) is more than enough to last a while until I'm ready for a replacement (but I have no plans on replacing my PH517-61 monster of a laptop anytime soon - maybe in 3 years if it becomes necessary as I advance in my studies - though in fairness, I think I'm covered for about 4 years).
 
I can understand that AMD may have said Vega would be the 'next big thing' but to be fair, they weren't ENTIRELY wrong.
Vega has massive compute (far more than any NV counterparts do), so it was pretty good at gaming, but it was also amazing for professional software.

Trouble is, not many games made use of AMD's compute or open-source features (such as TressFX, etc.), which run on ANY hw (and do so pretty well - far better than NV's proprietary features)... but most developers tend to go for the money, so when games are developed on PC, they will optimize the software for GPUs from companies that can pay them to do so (aka usually NV, since it has deep pockets).

That, and I also think some people may have had expectations that were too high.
AMD was a much smaller company compared to NV with far fewer resources (the fact they released Vega and were a mere step down from competing with NV at the absolute top end was actually amazing when you consider the size of NV and its resources vs AMD).

In fairness, Radeon VII WAS indeed what they said it was... the first 7nm gaming GPU, fit for pretty much high-end gaming (with unmatched compute performance, I might add).
It basically did bring competition to Nvidia... perhaps not in the absolute high end, but rather a step down from that.

One of the things that increased Vega's power consumption was its compute power (that, and the fact that AMD tended to release GPUs with unoptimized voltages in order to increase the number of functional dies).
With Navi, AMD basically took Vega and lowered compute whilst focusing more on gaming relevant hw and performance... this improved power consumption, but voltages were still quite high (but manually modifying voltages tends to drop power consumption by about 20% while increasing performance by 5% because the boost clocks can be maintained).

With enhanced Vega (which they used in Zen 2 Renoir), they focused on improving the architecture (and with this, I think AMD made the right choice - focus on improving the uArch and making new ones - IPC and efficiency gains from uArch level modifications rather than relying on frequency increases and node enhancements alone).

RDNA 2 is the next step up from that, as it affords 50% greater performance per watt compared to Navi (and enhanced Vega), so it will be very interesting to see how it performs (there's potential there for large performance gains - but obviously the GPU die would also need to be larger if AMD introduces more CUs, which could cause some issues with viable dies... at least until AMD decides to start using chiplets for GPUs, which may happen with RDNA 3 next year).

Anyway, for me, I think focusing on the 'absolute high end' is not the correct strategy.
Given how fast things progress and how frequently new GPU's come out, if I was on a desktop, I'd abstain from upgrading the mid-range GPU for about 3 years... maybe 4 (depending on the games and resolution I'm playing at - and of course the software I'm using in general).

At any rate (and as I said before), the high-end GPUs of the current generation will 'trickle down' to the mid-range of the new generation (at appropriate prices), with potentially better features, maybe even better performance, and lower power draw as well.

So, for me, even if I'm mainly focusing on content creation, mid-range to entry-level high-end (like Vega 56 was at the time) is more than enough to last a while until I'm ready for a replacement (but I have no plans on replacing my PH517-61 monster of a laptop anytime soon - maybe in 3 years if it becomes necessary as I advance in my studies - though in fairness, I think I'm covered for about 4 years).
Honestly, I've NEVER had a flagship card before I ordered my XFX Triple Dissipation RX 5700 XT yesterday, unless you consider the HD 7970 to have still been a flagship card when it was renamed the R9 280X. So I totally agree with you there but Amazon was selling it for $90 less than Canada Computers. I got it for $480CAD (That's $326USD for a card with an MSRP of $400USD, I couldn't say no.)

My card history actually starts with an ATi EGA Wonder in 1988 and goes on through companies like Cirrus Logic, Oak, an nVidia RIVA TNT2 VANTA, an Albatron GeForce FX-5400, an XFX GeForce 6200 and a Palit 8500GT 1GB, and then I was pretty much all ATi from there (with a PNY 8400GS PCI purchased as a diagnostic tool): twin XFX HD 4870s, twin Gigabyte Windforce HD 7970s (I traded one of the 7970s to a co-worker for an actual reference HD 5870), then one of those Sapphire R9 Fury Nitro OC+ cards that newegg was practically giving away for about $350CAD. A year later, I got another one that was a refurb for just under $200CAD, also from the egg.

As you can see, I tended more towards middling and good value cards (The HD 4870 may have been the greatest value in GPU history). I'm not exactly a noob when it comes to tech. Hell, my first build was a 286-16 at the age of twelve (and my father DIDN'T help me build it). I even remember the original Hercules Monochrome video card in my family's original IBM PC (yes, the model 5150). Damn, I'm just... OLD! LOL

I actually hadn't signed on to Tom's in about three years and my account still has 20 decorations (I have no idea what any of them even mean because when I was on before, it was a totally different system).

I think you'd enjoy something I wrote over ten years ago when I was in university that got immortalised in Tom's Guide (One of my greatest online achievements ever, and I laugh at the attitude I had back then):
 
Honestly, I've NEVER had a flagship card before I ordered my XFX Triple Dissipation RX 5700 XT yesterday, unless you consider the HD 7970 to have still been a flagship card when it was renamed the R9 280X. So I totally agree with you there but Amazon was selling it for $90 less than Canada Computers. I got it for $480CAD (That's $326USD for a card with an MSRP of $400USD, I couldn't say no.)

My card history actually starts with an ATi EGA Wonder in 1988 and goes on through companies like Cirrus Logic, Oak, an nVidia RIVA TNT2 VANTA, an Albatron GeForce FX-5400, an XFX GeForce 6200 and a Palit 8500GT 1GB, and then I was pretty much all ATi from there (with a PNY 8400GS PCI purchased as a diagnostic tool): twin XFX HD 4870s, twin Gigabyte Windforce HD 7970s (I traded one of the 7970s to a co-worker for an actual reference HD 5870), then one of those Sapphire R9 Fury Nitro OC+ cards that newegg was practically giving away for about $350CAD. A year later, I got another one that was a refurb for just under $200CAD, also from the egg.

As you can see, I tended more towards middling and good value cards (The HD 4870 may have been the greatest value in GPU history). I'm not exactly a noob when it comes to tech. Hell, my first build was a 286-16 at the age of twelve (and my father DIDN'T help me build it). I even remember the original Hercules Monochrome video card in my family's original IBM PC (yes, the model 5150). Damn, I'm just... OLD! LOL

I actually hadn't signed on to Tom's in about three years and my account still has 20 decorations (I have no idea what any of them even mean because when I was on before, it was a totally different system).

I think you'd enjoy something I wrote over ten years ago when I was in university that got immortalised in Tom's Guide (One of my greatest online achievements ever, and I laugh at the attitude I had back then):

Since you mentioned refurbished hw... I don't know if you recall back when Polaris was highly sought after because it provided exceptional bitcoin mining performance (far above what NV had to offer).
Those Polaris GPUs were frequently both overclocked and undervolted at the same time, and it was subsequently discovered that the second-hand (and cheap) Polaris mining GPUs you could buy from eBay, for example, tended to be really good at gaming... no degradation whatsoever (and it was noted in reviews that the likely reason for this was the heavy manual voltage and frequency optimization, which very much improved both the performance and efficiency of those GPUs).

For comparison's sake: before my current PH517-61 laptop, I was using an Acer 5930G with a Core 2 Duo and a 9600M GT (NV GPU).
I actually undervolted both the CPU and GPU, which resulted in at least 10 degrees Celsius lower temperatures when both were fully stressed, and I was able to push my GPU clocks even above stock.

By regularly maintaining the laptop once every 2 years (cleaning out the dust and replacing the thermal paste on both CPU and GPU), that laptop lasted me 9 years before I gave it to my nephew to use for school - so the laptop is now 12 years old and still functional, with the RAM upgraded to 8GB, an SSD as the main drive, and the DVD drive ripped out and replaced with a caddy holding a 1TB HDD.


Point is, refurbished and previously used hw can easily be a great purchase if it was maintained properly (unfortunately, too many people don't do this).