News AMD Ryzen 4000 Renoir Desktop Benchmarks Show Up To 90 Percent Uplift Over Last-Gen Flagship


alextheblue

Distinguished
For some stupid reason AMD is still blocking us from onboard powerful APU graphics like the ones you find in the upcoming Xbox or the existing Xbox One X...
The consoles you mentioned are tightly integrated custom gaming machines that exclusively use GDDR. The desktop APUs are all-rounders and they drop into an AM4 DDR4 platform. They couldn't feed a large iGPU with enough bandwidth using two DDR4 channels, even with very expensive fast DDR4. One day maybe when prices are more reasonable for HBM and interposers, they'll add more CUs and HBM, especially for special mobile variants. If the next APU lands on AM4 it will probably be more bandwidth efficient, but we probably won't see another really big leap until AM5 and DDR5.
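The bandwidth gap being described is easy to ballpark. A quick sketch (the DDR4 figures are standard JEDEC-style peak numbers; the Xbox One X figure matches Microsoft's published 326 GB/s spec):

```python
# Peak memory bandwidth = transfer rate (MT/s) x bus width (bytes) x channels
def peak_bw_gbs(mt_per_s, bus_width_bits, channels=1):
    return mt_per_s * (bus_width_bits / 8) * channels / 1000  # GB/s

# Dual-channel DDR4-3200 on a desktop AM4 board (64-bit per channel)
ddr4 = peak_bw_gbs(3200, 64, channels=2)       # ~51.2 GB/s

# Very fast (and very expensive) dual-channel DDR4-4400
ddr4_fast = peak_bw_gbs(4400, 64, channels=2)  # ~70.4 GB/s

# Xbox One X: GDDR5 on a 384-bit bus at 6.8 Gbps per pin
gddr5 = peak_bw_gbs(6800, 384)                 # ~326.4 GB/s

print(ddr4, ddr4_fast, gddr5)
```

Even the priciest dual-channel DDR4 delivers roughly a fifth of the console's bandwidth, which is the point being made about feeding a large iGPU.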

On the desktop side, it isn't necessary to produce special, expensive APUs, because the overwhelming majority of users don't need more graphics. Those who really do need more powerful graphics can get a graphics card.
 

beers

Distinguished
BANNED
Oct 4, 2012
261
53
18,790
Kind of curious how these will fare on DDR5; it'd be really advantageous to release with a quad-channel memory controller, but I don't see that in anything but Epyc/Threadripper.
I don't want or need an APU chip, but am bursting to see what a 4900X, for example, will bring to the table.
That's cool but it has nothing to do with the article.
 

nofanneeded

Respectable
Sep 29, 2019
1,541
251
2,090
The consoles you mentioned are tightly integrated custom gaming machines that exclusively use GDDR. The desktop APUs are all-rounders and they drop into an AM4 DDR4 platform. They couldn't feed a large iGPU with enough bandwidth using two DDR4 channels, even with very expensive fast DDR4. One day maybe when prices are more reasonable for HBM and interposers, they'll add more CUs and HBM, especially for special mobile variants. If the next APU lands on AM4 it will probably be more bandwidth efficient, but we probably won't see another really big leap until AM5 and DDR5.

On the desktop side, it isn't necessary to produce special, expensive APUs, because the overwhelming majority of users don't need more graphics. Those who really do need more powerful graphics can get a graphics card.

You just said it: they could add HBM2 memory for the onboard APU and not make it shared... I didn't say make it 100% like the console, I said they're not giving us the APU power.

And 4GB of HBM2 won't cost a lot, and people will pay for it if they can have a tiny PC they can game on.
 
You just said it: they could add HBM2 memory for the onboard APU and not make it shared... I didn't say make it 100% like the console, I said they're not giving us the APU power.

And 4GB of HBM2 won't cost a lot, and people will pay for it if they can have a tiny PC they can game on.
The raw cost for 4GB HBM2 is quite small, you're right. Maybe $25? I don't know, but somewhere in the $25 range for large orders might be a reasonable guess. The problem is you're thinking in base costs, not final product costs. What's more, you now have to have a silicon interposer to link the HBM2 with the rest of the GPU, or Intel's EMIB. That adds quite a bit more cost and complexity.

If we're looking at something like Renoir, that's a 156 mm2 chip. Rough estimate is that AMD can get about 340 such chips per 300mm wafer, which gives a cost of about $45 per chip (just for the chip, mind you -- packaging and other factors mean it's going to cost quite a bit more than $45 for AMD).

Now, on Renoir the GPU is only about 20% of the die, but it's also a relatively slow GPU. Something like the PS5 GPU would be about five times as large. Put a GPU like that into a Renoir design and instead of 156mm2 you'd be looking at something closer to 300mm2. Now instead of 340 usable chips per wafer, AMD would be getting perhaps 160 usable chips. And the per-chip cost is closer to $95.
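Those per-chip numbers can be sanity-checked with the standard dies-per-wafer approximation. In the sketch below, the wafer cost and yield are my own assumptions, not figures from the post:

```python
import math

def gross_dies(wafer_diameter_mm, die_area_mm2):
    """Standard dies-per-wafer approximation: wafer area over die area,
    minus a correction for partial dies lost around the wafer edge."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r * r / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

WAFER_COST = 15_000  # assumed price of a leading-edge 7nm wafer, USD
YIELD = 0.85         # assumed fraction of good dies (bigger dies yield worse in reality)

for area in (156, 300):  # Renoir-sized vs. a hypothetical console-sized APU
    good = gross_dies(300, area) * YIELD
    print(f"{area} mm2: ~{good:.0f} good dies, ~${WAFER_COST / good:.0f} per chip")
```

Under those assumptions you get roughly 340 good 156 mm2 dies at about $45 each, and under 170 good 300 mm2 dies at about $90 each, which lines up with the ballpark above.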

Add in the additional cost of the HBM2 and interposer, and really 4GB RAM isn't enough so let's do 8GB! Now we're looking at a chip design that has a base cost of probably $150-$200. Package everything up and it's going to be more. Regardless, that's three or four times as much as the current Renoir design! And unlike consoles where Microsoft or Sony foot a big chunk of the bill for the design and other elements, AMD is paying for all of this, and AMD needs to make a profit.

Option 1: $50 chip with weak integrated graphics, which can be paired with a dedicated GPU that will provide much higher performance. AMD sells this for maybe $150 and makes a nice profit.

Option 2: $175 chip with higher performance graphics, similar to a console. AMD would need to sell these at a price of around $525 to end up with similar margins to the Renoir design.
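The pricing logic behind the two options is just a constant markup over silicon cost; the ~3x multiplier is implied by the post's own Option 1 numbers, not an official AMD figure:

```python
# Implied markup from Option 1: ~$150 retail on a ~$50 chip
markup = 150 / 50           # 3.0x

# Applying the same multiplier to Option 2's ~$175 chip
option2_price = 175 * markup
print(option2_price)        # 525.0
```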

And unlike a console where users don't upgrade the hardware, this is supposed to be a PC chip. The current design might be very fast, but five years from now there will be substantially faster GPUs available. At which point if someone upgrades, they've just 'wasted' $400 worth of silicon. They could have just bought a $150 CPU and a $350 GPU originally, and potentially ended up with a better overall experience.

I'm not saying it can't be done, but there are all sorts of factors that go into the final design. Look at Intel's Kaby Lake G. Intel killed it one year after launch, because it was so great. That was a chip with RX 570 level graphics, and the price was quite high to get there.
 

nofanneeded

Respectable
Sep 29, 2019
1,541
251
2,090
The raw cost for 4GB HBM2 is quite small, you're right. Maybe $25? I don't know, but somewhere in the $25 range for large orders might be a reasonable guess. The problem is you're thinking in base costs, not final product costs. What's more, you now have to have a silicon interposer to link the HBM2 with the rest of the GPU, or Intel's EMIB. That adds quite a bit more cost and complexity.

If we're looking at something like Renoir, that's a 156 mm2 chip. Rough estimate is that AMD can get about 340 such chips per 300mm wafer, which gives a cost of about $45 per chip (just for the chip, mind you -- packaging and other factors mean it's going to cost quite a bit more than $45 for AMD).

Now, on Renoir the GPU is only about 20% of the die, but it's also a relatively slow GPU. Something like the PS5 GPU would be about five times as large. Put a GPU like that into a Renoir design and instead of 156mm2 you'd be looking at something closer to 300mm2. Now instead of 340 usable chips per wafer, AMD would be getting perhaps 160 usable chips. And the per-chip cost is closer to $95.

Add in the additional cost of the HBM2 and interposer, and really 4GB RAM isn't enough so let's do 8GB! Now we're looking at a chip design that has a base cost of probably $150-$200. Package everything up and it's going to be more. Regardless, that's three or four times as much as the current Renoir design! And unlike consoles where Microsoft or Sony foot a big chunk of the bill for the design and other elements, AMD is paying for all of this, and AMD needs to make a profit.

Option 1: $50 chip with weak integrated graphics, which can be paired with a dedicated GPU that will provide much higher performance. AMD sells this for maybe $150 and makes a nice profit.

Option 2: $175 chip with higher performance graphics, similar to a console. AMD would need to sell these at a price of around $525 to end up with similar margins to the Renoir design.

And unlike a console where users don't upgrade the hardware, this is supposed to be a PC chip. The current design might be very fast, but five years from now there will be substantially faster GPUs available. At which point if someone upgrades, they've just 'wasted' $400 worth of silicon. They could have just bought a $150 CPU and a $350 GPU originally, and potentially ended up with a better overall experience.

I'm not saying it can't be done, but there are all sorts of factors that go into the final design. Look at Intel's Kaby Lake G. Intel killed it one year after launch, because it was so great. That was a chip with RX 570 level graphics, and the price was quite high to get there.

The final cost for 4GB of HBM2 is around $50, 8GB around $100, and that's not a bad price if you can give me 12 TFLOPS integrated like the upcoming Xbox console this year...

AMD just made a deal with console makers not to release this to the PC market, to save the console market and sell them millions of chips in return. That's the ugly truth.
 
The final cost for 4GB of HBM2 is around $50, 8GB around $100, and that's not a bad price if you can give me 12 TFLOPS integrated like the upcoming Xbox console this year...

AMD just made a deal with console makers not to release this to the PC market, to save the console market and sell them millions of chips in return. That's the ugly truth.
Any links to this "truth" you're claiming?