AMD Elaborates on PS4's Custom ''Jaguar'' APU

Status
Not open for further replies.
[citation][nom]greghome[/nom]If the CPU is a Kabini-like CPU, then either Kabini will be really powerful... or the PS4 will have pretty weak CPUs compared to even an AMD A4 or Intel Celeron B830 system[/citation]Jaguar is 15% better IPC than Bobcat per module, so you can figure out the rest with 4 modules = 8 cores. Not really fast, lol.
 
Sounds fantastic: an 8-core Jaguar APU with a supercharged Radeon 7870-class integrated GPU on die. Man, I hope we see some type of APU like this on the PC market. Hopefully with Kaveri!
 
[citation][nom]leon2006[/nom]It's an internal GPU, so it's clearly less capable compared to a dedicated GPU[/citation]One of the biggest limitations of current desktop APUs is memory bandwidth. At best, they're able to get around 30GB/s of available memory bandwidth from the system's DDR3 memory. Things will be different for the PS4, since this time the APU will have direct access to GDDR5 as its system memory, allowing 176GB/s of bandwidth. This eliminates the bottleneck.
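For reference, peak memory bandwidth falls straight out of transfer rate times bus width. A quick sketch; the 256-bit bus and 5.5GT/s figures below are my assumption, chosen only because they reproduce the quoted 176GB/s, since the actual memory configuration hasn't been detailed:

```python
# Theoretical peak bandwidth = effective transfer rate * bus width in bytes.
# The 256-bit bus at 5.5GT/s is an assumed configuration that happens to
# reproduce the quoted 176GB/s figure; it is not a confirmed spec.

def peak_bandwidth_gbs(transfer_gt_s: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s."""
    return transfer_gt_s * bus_width_bits / 8

print(peak_bandwidth_gbs(5.5, 256))  # assumed GDDR5 config -> 176.0
```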
 
[citation][nom]tomfreak[/nom]Jaguar is 15% better IPC than Bobcat per module, so you can figure out the rest with 4 modules = 8 cores. Not really fast, lol.[/citation]
Jaguar doesn't feature a modular architecture; from what I've heard, it's a monolithic quad core with shared L2. From the sounds of things, the PS4's chip is two Jaguar quad cores "glued" together, much like the Core 2 Quad being two Core 2 Duos paired up.

The extent of the customisations isn't yet known, which is frustrating; I really want to know how far they went. For example, Jaguar doesn't seem to support boosting one core without putting the rest to sleep, and single-threaded performance still has some importance, so you'd expect some sort of turbo mode.
 
[citation][nom]jrharbort[/nom]One of the biggest limitations of current desktop APUs is memory bandwidth. At best, they're able to get around 30GB/s of available memory bandwidth from the system's DDR3 memory. Things will be different for the PS4, since this time the APU will have direct access to GDDR5 as its system memory, allowing 176GB/s of bandwidth. This eliminates the bottleneck.[/citation]

Common memory frequencies for APUs are DDR3-1333 to DDR3-1866. DDR3-1600 is 25.6GB/s in dual-channel; DDR3-1866 is about 17% higher (~29.9GB/s) and DDR3-2133 is about 33% higher (~34.1GB/s). So a best of around 30GB/s is about right for DDR3-1866, with DDR3-2133 pushing a little past it. Otherwise, you're right, so long as we remember that all of these numbers are theoretical, not real-world. Just like USB 2.0 never really delivers its full 480Mb/s, you're not going to get the full speed out of the memory system.
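Those dual-channel figures are easy to check: each DDR3 channel is 64 bits (8 bytes) wide, so peak bandwidth is just the transfer rate times 8 bytes times the channel count. A minimal sketch:

```python
# Peak DDR3 bandwidth: transfer rate (MT/s) x 8 bytes per transfer
# (64-bit channel) x number of channels, expressed in GB/s.

def ddr3_bandwidth_gbs(mt_per_s: int, channels: int = 2) -> float:
    return mt_per_s * 8 * channels / 1000

for speed in (1333, 1600, 1866, 2133):
    print(f"DDR3-{speed}: {ddr3_bandwidth_gbs(speed):.1f} GB/s")
# DDR3-1600 -> 25.6, DDR3-1866 -> 29.9, DDR3-2133 -> 34.1
```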
 
[citation][nom]A Bad Day[/nom]EDIT: And if you still think integrated GPUs suck, how about the GT 610M? Sure, it's a discrete GPU with its own memory system, but a worthless one, because an Intel HD 4000 matches it in performance. And yes, there was an i7 laptop that had the GT 610M paired with it. Seriously.[/citation]
All else being equal, I would choose the GT 610M over the HD 4000 for both driver maturity and feature support.
[citation][nom]tomfreak[/nom]Jaguar is 15% better IPC than Bobcat per module, so you can figure out the rest with 4 modules = 8 cores. Not really fast, lol.[/citation]
Who said anything about modules? Jaguar is not a module; it's a complete core, sharing only the L2 cache with other cores.
 
[citation][nom]DjEaZy[/nom]... AMD's ATi buy starts to pay off...[/citation]

"Starts"? Their GPU/APU business has been their main moneymaker for years now. They haven't been able to compete with Intel in the CPU market for ages and I'm surprised they haven't exited the x86 desktop CPU market altogether by this point. They are so far behind Intel it's silly.
 
[citation][nom]magicandy[/nom]"Starts"? Their GPU/APU business has been their main moneymaker for years now. They haven't been able to compete with Intel in the CPU market for ages and I'm surprised they haven't exited the x86 desktop CPU market altogether by this point. They are so far behind Intel it's silly.[/citation]

They're behind in performance per core and power efficiency at load, but that's no reason to leave the market, especially since they're now set to start catching up. Besides that, AMD is ahead in well-threaded performance at every price point that they're in right now.
 
Let's hope AMD didn't undercut Nvidia on price so much, or sign such a poor contract, that they end up cranking out processors night and day for years on end and losing money hand over fist. Nvidia led the first round(s); I wonder what happens when the consoles get extended from 3- to 5-year life spans and they're still contracted to produce. Maybe Nvidia got burned and wasn't so thrilled this time?
 
[citation][nom]abbadon_34[/nom]Let's hope AMD didn't undercut Nvidia on price so much, or sign such a poor contract, that they end up cranking out processors night and day for years on end and losing money hand over fist. Nvidia led the first round(s); I wonder what happens when the consoles get extended from 3- to 5-year life spans and they're still contracted to produce. Maybe Nvidia got burned and wasn't so thrilled this time?[/citation]

AMD's use was probably more related to the concept of the APU (something that AMD, unlike Nvidia, has been working on for several years and in which AMD has made great advancements) than to AMD being cheaper than Nvidia on purely graphics-related costs.
 
I think the 1.8 teraflops is the APU's combined performance, x86 and GCN cores, so don't get your hopes up for a 7850 inside. A 7870 can do 2.5 teraflops on its own, so I don't know why people are throwing that around either; even the AMD article points to the 1.8 as the combined APU compute performance.
 
[citation][nom]mauller07[/nom]I think the 1.8 teraflops is the APU's combined performance, x86 and GCN cores, so don't get your hopes up for a 7850 inside. A 7870 can do 2.5 teraflops on its own, so I don't know why people are throwing that around either; even the AMD article points to the 1.8 as the combined APU compute performance.[/citation]

The GPU is supposedly an 18-CU GCN GPU. That's right between the Radeon 7850's 16 and the Radeon 7870's 20. The Radeon 7870 also has a significantly higher frequency. The Radeon 7850 has a 20% lower core count and a 14% lower GPU frequency than the Radeon 7870, giving it 1.76TFLOPS. The CPU cores aren't going to contribute much in FLOPS (they're eight low-frequency Jaguar cores, a very light architecture that is an upgrade to Bobcat, AMD's netbook-class Atom competitor from the Brazos platform), so it's safe to assume that even if that number is combined for the CPU and GPU, the GPU will still be around a Radeon 7850 in performance.
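The FLOPS figures being compared here follow from GCN's layout: 64 stream processors per compute unit, each doing 2 FLOPs per clock via multiply-add. A quick sketch; the 800MHz PS4 clock is my assumption, chosen because it reproduces the quoted 1.84TFLOPS, not a confirmed spec:

```python
# Theoretical single-precision throughput of a GCN GPU:
# 64 shaders per CU x 2 FLOPs per clock (multiply-add) x clock in GHz,
# expressed in TFLOPS. The 800MHz PS4 clock is an assumption chosen to
# reproduce the quoted 1.84TFLOPS figure.

def gcn_tflops(compute_units: int, clock_ghz: float) -> float:
    return compute_units * 64 * 2 * clock_ghz / 1000

print(gcn_tflops(16, 0.86))  # Radeon 7850 -> ~1.76
print(gcn_tflops(20, 1.0))   # Radeon 7870 -> 2.56
print(gcn_tflops(18, 0.8))   # rumored PS4 GPU -> ~1.84
```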
 
[citation][nom]silverblue[/nom]Jaguar doesn't feature a modular architecture; from what I've heard, it's a monolithic quad core with shared L2. From the sounds of things, the PS4's chip is two Jaguar quad cores "glued" together, much like the Core 2 Quad being two Core 2 Duos paired up.[/citation]
Jaguar is the name of the core design and does not denote how many cores are present. A multiple-module design, such as Core 2 Quad, doesn't make much sense here. Which module would you attach the GCN cores to? Plus, you'd add chip-to-chip latency and give up heterogeneous computing by essentially farming the GCN cores off die for one or more modules, but that was the point of having the GPU cores there in the first place. You might as well move the GCN cores off module entirely and regain some thermal ceiling at that point.

On the other hand, it sounds like the Jaguar cores are going to have some sort of access to GDDR5, which should remove any immediate bandwidth concerns. I find that the most interesting part in all of this.

Jaguar is a very small core, by design, so fitting 8 together does not sound unreasonable at all. It's only speculation on my part, but perhaps some of the reasoning behind choosing small, energy-efficient x86 cores was to save more of the thermal and die budget for the GPU cores? I really can't think of anybody who is concerned about the amount of electricity consumed by their gaming console.
 
[citation][nom]blazorthon[/nom]Common memory frequencies for APUs are DDR3-1333 to DDR3-1866. DDR3-1600 is 25.6GB/s in dual-channel; DDR3-1866 is about 17% higher (~29.9GB/s) and DDR3-2133 is about 33% higher (~34.1GB/s). So a best of around 30GB/s is about right for DDR3-1866, with DDR3-2133 pushing a little past it. Otherwise, you're right, so long as we remember that all of these numbers are theoretical, not real-world. Just like USB 2.0 never really delivers its full 480Mb/s, you're not going to get the full speed out of the memory system.[/citation]

It's not DDR3, though; it's GDDR5.
 
[citation][nom]SkateROck[/nom]It's not DDR3, though; it's GDDR5.[/citation]

All FM1/FM2 APUs use DDR3. If you're referring to the PS4's memory, then yes, it's GDDR5. However, I never said the PS4 uses DDR3; I only said that the FM1/FM2 APUs use DDR3.
 
This is not the first time AMD/ATI has been used in a console, fellas! Neither Intel nor Nvidia has tech that would benefit consoles more; if they did, their tech would be used instead of AMD's.
 
[citation][nom]mauller07[/nom]I think the 1.8 teraflops is the APU's combined performance, x86 and GCN cores, so don't get your hopes up for a 7850 inside. A 7870 can do 2.5 teraflops on its own, so I don't know why people are throwing that around either; even the AMD article points to the 1.8 as the combined APU compute performance.[/citation]

Or the GTX Titan's 4.5 teraflops (2.5 times faster!). Both are brand new, and already the PC part is 2.5 times faster. How will it look 6-7 years from now when the new PS5 comes? Geez!
 
[citation][nom]rantoc[/nom]Or the GTX Titan's 4.5 teraflops (2.5 times faster!). Both are brand new, and already the PC part is 2.5 times faster. How will it look 6-7 years from now when the new PS5 comes? Geez![/citation]

Titan is also $1000, more than twice as expensive as the whole console, and would be put in a system with something like another $600-800 of components, if not more. When you can't build a PC with comparable performance to the console for a similar price, the console has a win over the PC.
 
I'm well chuffed it's using x86. As everyone has pointed out, it should benefit PC gamers too when games get ported over to PC; that's got to be a bonus, I reckon. Would I like a PS4? Oh yes, although I doubt I'll have the money spare with the way things are going financially. Maybe next year for me :/
 
"As stated on Wednesday, the APU will consist of eight x64 AMD "Jaguar" cores and a next-generation Radeon GPU comprised of 18 "compute units" capable of cranking out 1.84 teraflops."
Since it has x64 cores, will it be able to run Windows, or will Windows have problems using GDDR5 as system memory?
 
[citation][nom]shafe88[/nom]"As stated on Wednesday, the APU will consist of eight x64 AMD "Jaguar" cores and a next-generation Radeon GPU comprised of 18 "compute units" capable of cranking out 1.84 teraflops." Since it has x64 cores, will it be able to run Windows, or will Windows have problems using GDDR5 as system memory?[/citation]

Windows doesn't care what type of memory you have; the memory type doesn't affect the operating system. Windows would probably not work because it lacks drivers for the PS4's hardware, not because of anything to do with the memory.
 