AMD's Mysterious Fenghuang SoC Spotted in Chinese Gaming Console


g-unit1111

Titan
Moderator


They actually didn't. It was manufactured by Patriot Memory with AMD branding as a one-time line. It is no longer being manufactured.
 

bit_user

Polypheme
Ambassador

Hmm... If they stick with 128-bit DDR4, then you'd get a benefit even from a short stack @ ~120 GB/sec. In fact, a much greater benefit than Intel is currently getting from its tiny 128 MB chunk of eDRAM in its Iris Pro CPUs (I think the transfer bandwidth of that is only about 50 GB/sec, bidirectional - https://en.wikichip.org/wiki/intel/crystal_well ).
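For a rough sanity check: peak memory bandwidth is just bus width times transfer rate. A back-of-envelope sketch (the speed grades here are my own assumptions for illustration, not figures from the article):

```python
# Peak theoretical bandwidth = (bus width in bytes) x (transfer rate).
# Speed grades below are illustrative assumptions.
def peak_bandwidth_gbs(bus_width_bits: int, mega_transfers_per_s: int) -> float:
    """Theoretical peak bandwidth in GB/s."""
    return bus_width_bits / 8 * mega_transfers_per_s / 1000

print(peak_bandwidth_gbs(128, 3200))   # 128-bit DDR4-3200: 51.2 GB/s
print(peak_bandwidth_gbs(512, 2000))   # 2-hi HBM2 stack (512 bits @ 2 GT/s): 128.0 GB/s
```

Either way, even a single short stack would roughly double a 128-bit DDR4 setup, and it clears Crystal Well's ~50 GB/s with room to spare.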


Are you sure about that? I thought the bandwidth was limited by the number of dies. At 2 GB, HBM2 should only be 2 dies, meaning a quarter of what you get from Vega 10's 8-die (2x 4-hi stacks) config.


Not that I'd mind, but size and power considerations make it seem very unlikely.
 

bit_user

Polypheme
Ambassador

$58.4M ("including buying the chips from AMD"), according to this:

https://www.anandtech.com/show/13163/more-details-about-the-zhongshan-subor-z-console-with-custom-amd-ryzen-soc

the ‘equivalent’ cost in typical USD is around the $625 mark.
So, it'll launch well above all of the current-gen consoles. ...or maybe the console version's price will be subsidized.
 
Yes, stacked die count affects overall bandwidth. HBM2 retains HBM1's 1024-bit interface at double the throughput and 2x 128-bit memory channels per die, and supports configurations of 2, 4, or 8 dies per stack. A 4-hi or 8-hi stack yields 256 GB/s, whereas a 2-hi stack should only yield 128 GB/s, as it doesn't have enough dies to populate all 8 memory channels. I don't actually see any limitation on the capacity of each die, however, so I would think that in theory you could make a 2 GB stack with 4 dies to preserve bandwidth. Still, 128 GB/s would be a great step up as a buffer for on-board graphics.
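A minimal sketch of that arithmetic, assuming HBM2's launch speed of 2 Gb/s per pin and 2 channels per die (my reading of the spec, so treat the numbers as illustrative):

```python
# HBM2 stack bandwidth as a function of die count -- illustrative sketch.
CHANNELS_PER_DIE = 2       # 2x 128-bit channels per DRAM die
CHANNEL_WIDTH_BITS = 128
PIN_RATE_GBPS = 2.0        # assumed HBM2 launch speed, 2 Gb/s per pin

def stack_bandwidth_gbs(dies: int) -> float:
    """Peak bandwidth of one HBM2 stack in GB/s."""
    # The stack interface is 1024 bits (8 channels); a short stack
    # can't populate all of them.
    channels = min(8, dies * CHANNELS_PER_DIE)
    return channels * CHANNEL_WIDTH_BITS * PIN_RATE_GBPS / 8

for dies in (2, 4, 8):
    print(f"{dies}-hi: {stack_bandwidth_gbs(dies):.0f} GB/s")
# 2-hi: 128 GB/s, 4-hi: 256 GB/s, 8-hi: 256 GB/s
```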
 

InvalidError

Titan
Moderator

It is one step closer regardless of actual implementation: first you have the usual off-chip DDR4, then you have a spin-off of the same chip using GDDR5, and yet another spin-off with a small amount of on-package HBM for the IGP/L4$ (Broadwell-like) plus DDR4 DIMMs is only one more step from there. Nothing really groundbreaking; it has already been done (by Intel) in a slightly different form.
 

bit_user

Polypheme
Ambassador

sigh

This is no different from PS4 or Xbox One X. It's just using newer CPU and GPU cores. They're all big APUs with unified GDDR5 memory. Deal with it.

And what's the source on this HBM2 version people are mentioning? Is it at all credible? Or just an outgrowth of the initial misreporting that the chip had in-package memory?
 

InvalidError

Titan
Moderator

What the story doesn't say is whether there will be any differences between the console and "PC" uses of the chip. Any modern PC gamer can tell you that 8GB shared between the IGP and a conventional OS would be far too little for comfort. If Subor plans to have a fully fledged gaming PC alongside its consoles, then I'd expect the PC variant to support add-on DDR4. I'd count that as a significant difference.

The 2GB HBM stories are based on 3DMark database entries showing a mystery AMD APU with 2GB of HBM2. Could be a stats reporting error, could be something else. I wouldn't be too surprised if Subor got the 8GB GDDR5 version for its consoles and a 2GB HBM2 variant for desktops.
 

bit_user

Polypheme
Ambassador

They showed a board and announced a release date for the PC version sometime this month. You can even see the die:
https://www.anandtech.com/show/13163/more-details-about-the-zhongshan-subor-z-console-with-custom-amd-ryzen-soc

So, it sounds like it will limp along with 8 GB. I agree - that doesn't sound like much for a full PC. Now, a console (with a trimmed-down OS and custom-tuned software) can manage, as we've seen with PS4. Maybe typical gaming PC specs are low enough in China that popular games will fit in 8 GB adequately.


Interesting. Links are always appreciated.

(I don't see anything wrong with posting links to other sites, as long as the information is not provided in any article on Tom's. My first choice is always Tom's.)
 
It was spotted in SiSoftware Sandra last year:

https://www.fudzilla.com/news/processors/45194-mystery-amd-fenghuang-raven-apu-spotted-in-sisoft-sandra
https://wccftech.com/amd-fenghuang-desktop-apu-with-15ff-28-cu-graphics/

https://www.tweaktown.com/news/62107/amd-fenghuang-15ff-faster-rx-vega-gh-rocks-2gb-hbm2/index.html

And as InvalidError mentioned, it's also shown up in 3DMark:

https://www.reddit.com/r/Amd/comments/8nxyx9/amd_fenghuang_3dmark_11/
https://wccftech.com/amd-fenghuang-apu-3dmark-specs-performance-leak/

And a video:
https://www.youtube.com/watch?v=5deDsJ8l8SA
 


1- You can make GDDR5 DIMMs... they just don't make them because it's more expensive than soldering the chips onto the PCB, GDDR5 is only used on GPU cards, and sockets would make the card thicker and block the cooling block...

2- Making quad-channel DIMMs to get twice the performance out of an onboard APU would also justify the cost... and the extra cost is not much.

3- And I am not speaking about what AMD already did; I was speaking about their plans for the APU platform. They decided to make AM4 serve both APUs and CPUs, and they already have this technology. I would personally prefer they make the APU platform separate if they decide to stay dual channel for AM4...

4- They already made a PC with onboard GDDR5 for the whole system, and it is a success, so where is your point that they never did it? First PS4 Pro, then Xbox One X, and now this Chinese company. All you need with the Xbox One X is to allow Windows to be installed on it... and the Chinese did that by contacting AMD and asking for the same technology.


Having said 1, 2, 3, 4... I know what is done is done, so:

I demand that AMD release this platform for the PC market... with optional 16GB / 8GB of GDDR5, in ITX format.


 
There is no market for these. If there were, they would either already exist or be in some phase of production. GDDR5 may have wonderful amounts of speed, but it doesn't come at zero cost.

Power consumption is significantly higher in GDDR5 than in other memory types. This was a major factor in Vega add-in boards using the much more expensive HBM2: the power GDDR5 would need to attain similar bandwidth would have pushed the cards' power envelope through the roof. While this wouldn't prevent using GDDR5 in the desktop market, raising energy consumption is not exactly a selling point.

You could at least make a minor use case for a GDDR DIMM on graphics cards, yet you don't even see them there.

Quad-channel memory barely helps CPUs in ordinary, everyday tasks. This has been demonstrated before; we don't need to drone on about it. CPUs are not hitting severe bandwidth constraints on the desktop. Adding more channels to your CPU to realize zero gain in CPU performance is wasteful, and you're looking at costs in the neighborhood of $300 for a quad-channel motherboard, so your dream of not adding much to the cost is out the window.
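To put rough numbers on that (DDR4-2666 assumed here, purely illustrative): quad channel doubles the theoretical ceiling, but since desktop workloads rarely saturate even two channels, the measured gain is close to nil.

```python
# Theoretical peak vs. channel count for DDR4-2666 (assumed speed grade).
# Doubling channels doubles the ceiling; it doesn't double what desktop
# workloads actually consume, which is why CPU benchmarks barely move.
CHANNEL_WIDTH_BYTES = 8    # one 64-bit DDR channel
MTPS = 2666                # DDR4-2666 transfer rate

for channels in (2, 4):
    peak = channels * CHANNEL_WIDTH_BYTES * MTPS / 1000
    print(f"{channels} channels: {peak:.1f} GB/s peak")
# 2 channels: 42.7 GB/s
# 4 channels: 85.3 GB/s
```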

You're not going to realize 1:1 scaling of the GPU portion of an APU if you're sharing the memory controller between a graphics device and a CPU. Even if you gave all of the bandwidth to the GPU, APUs still dedicate a smaller area to graphics than is found in a discrete, mid-tier-or-better add-in graphics card. There simply isn't enough room on the silicon yet to add as much graphics hardware as would be necessary to compete with add-in boards. We might see this at 7nm, but that's at the very least another year away.

Power dissipation for such a package also has to be considered. Combining two devices that can easily burn through 65+ watts on their own does not make for an easier device to cool. Expect the cost of managing heat to go up.

The chip in this article exists because somebody contracted AMD to make it. It's clear that AMD doesn't mind putting a GDDR5 memory controller instead of a DDR4 controller into their CPUs; it's just business. You are welcome to contract them to do the same for you, but they aren't going to do it just because somebody says they should.

They? Who's they? At this point in time, AMD is a fabless semiconductor company that designed the SoC. AMD did not make an entire platform. You're going to get just as far demanding that Sony or Microsoft make, or even design, you a platform.

Demand all you like. Standard-DIMM quad-channel memory doesn't fit in the ITX form factor, although quad-channel SO-DIMMs have been done on ITX. What form factor are your GDDR DIMMs going to come in?

For all this extra money you propose both manufacturers and consumers throw at your perceived problem, what exact problem are you hoping to mitigate? I can only assume you're saying all this should be done to improve the graphics performance of APUs. Even if your wishes were granted, you would still end up with a system that performs worse overall and costs more than a reasonably equipped dual-channel CPU and discrete graphics card combination built from off-the-shelf components.

You can already build better-performing ITX form factor machines sporting twice the 16 GB of memory you are calling optional. If the problem is price, your solution isn't going to address that aspect in a positive fashion, and it's certainly going to perform worse than already-available options.
 

Rogue Leader

It's a trap!
Moderator




You really just do not understand how this all works.

First read this about why GDDR and DDR are different

https://www.gamersnexus.net/guides/2826-differences-between-ddr4-and-gddr5

Then to answer your points
1. They don't make them because they are not useful as SYSTEM RAM
2. Quad channel doesn't give "twice the performance", ever, period. This has been explained already better than I even could.
3. Financially a stupid thing, because there is no reason to use GDDR the way you think, and having CPUs and APUs on the same platform gives users more options while slimming the product line, making it more profitable.
4. They made a game console with vague specs. Given that it has 24 Vega CUs, it's conceivable all of that memory is dedicated to the GPU and the system still needs its own memory. Since we don't have specs on the whole system, we don't know for sure.

If you're going to demand things of a major corporation, it would be useful for you to understand what you are asking for. Maybe try researching how all this works first, then come here, ask questions, and talk to people; we would be happy to help.

 

mossberg

Distinguished


(image: "Impressive, your level of fail is" meme)
 

bit_user

Polypheme
Ambassador
Some good replies already, so I'll keep this short & sweet.


According to whom?


Are you saying that PS4 and Xbox One X are examples of PCs? The problem is that you cannot install a standard PC OS on them, so they can't be used as PCs, and we can't get benchmarks to see what impact the GDDR5 actually has on CPU performance.

This new company is the exception, and I will be interested in seeing some benchmarks on the PC version. My expectations are low, but I'm keeping an open mind and will let the data speak for itself. I suggest you do the same.

Further, I suggest harnessing your clear enthusiasm for the subject and using it as motivation to dig deeper and learn more about this tech. Perhaps learning more about RAM would be a great place to start.
 

TJ Hooker

Titan
Ambassador

Well, you can actually hack a PS4 to run Linux. How it performs, I have no idea.
 

bit_user

Polypheme
Ambassador

Wow! I hadn't heard that.

That said, the slow CPU cores and not-exactly-cheap price of the PS4 Pro have me a bit less interested than I might've been. I didn't even bother to run Linux on my PS3, back when that was still possible.

I'd rather sit back and wait for the PC version of this new machine to get benchmarked. That would be more interesting and potentially relevant.
 
That's completely in line with the standard response we get when somebody calls people on their BS because they are talking out their left earhole. It might be different if anything you were saying made sense. As bit_user indicated, you might want to actually learn about the tech before talking about the tech.
 

mossberg

Distinguished


So you are taking your ball and going home, because everyone knows you are clueless and you cannot handle the truth?
 
