AMD Elaborates on PS4's Custom "Jaguar" APU

[citation][nom]A Bad Day[/nom]From what I researched, there were no mentions of a separate GPU. Besides, it would make sense to have a single GPU rather than a more cumbersome dual GPU system (think micro-stuttering).[/citation]Well, I also haven't seen any mention of an external GPU, but that doesn't mean it doesn't have two iGPUs. After all, I haven't seen any 8-core Jaguar designs announced. It's probably two custom Jaguar+GCN APUs together.

With that being said, your assumption that having two GPUs in a console would mean microstuttering is way off base. The reason we have microstuttering has to do with how we have to utilize multiple GPUs to render on PC, for compatibility purposes. On a console you're not going to be using traditional SLI or Crossfire, except maybe as a fallback for lazy devs. Even then, there's a lot they could do to minimize or eliminate microstuttering on fixed hardware like a console. You could even change how you render entirely - the two GPUs could render independently.
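
To make the "fixed hardware" point concrete, here's a minimal frame-pacing sketch, purely illustrative and not any real PS4 or AMD API, of how a console title could present frames on an even cadence even when two GPUs hand frames back at uneven times. All the timings are made up.
[code]
// Hypothetical sketch: smoothing alternate-frame rendering (AFR) on two GPUs
// with frame pacing. The sleep times stand in for GPU work; nothing here is
// a real console API.
#include <chrono>
#include <thread>

using Clock = std::chrono::steady_clock;

struct Frame { int id; };  // stand-in for a finished GPU frame

// Pretend each GPU finishes at slightly uneven times -- the source of
// micro-stutter when frames are shown as soon as they complete.
Frame render_on_gpu(int gpu_index, int frame_id) {
    std::this_thread::sleep_for(std::chrono::milliseconds(gpu_index == 0 ? 14 : 19));
    return Frame{frame_id};
}

int main() {
    const auto target = std::chrono::milliseconds(16);  // ~60 Hz pacing target
    auto next_present = Clock::now();

    for (int frame = 0; frame < 8; ++frame) {
        Frame f = render_on_gpu(frame % 2, frame);  // alternate GPUs (AFR)

        // Frame pacing: never present earlier than the fixed cadence allows,
        // so the gaps between displayed frames stay even instead of bunching.
        std::this_thread::sleep_until(next_present);
        next_present += target;
        // present(f);  // placeholder for the real swap/flip call
        (void)f;
    }
}
[/code]
On a fixed console target that pacing interval can be tuned once and left alone, which is exactly the kind of thing that's much harder to do across the whole range of PC configurations.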

Basically, having two GPUs on a console means more bang for the buck, without the drawbacks of having SLI or Crossfire. Especially if you're putting two APUs together, for twice the CPU and GPU punch, while still maintaining a low cost and thermal envelope! I think this is the way to go to keep costs down this generation, and I wouldn't be surprised if MS did something very similar. Jaguar CPUs have better IPC than their predecessors, and the consoles get carefully tuned and tailored software, with lower overhead. So performance should be comparable to a reasonably high-end system.

The lower costs could also mean a more reasonable lifespan (4-5 years?), and using more-or-less standard AMD tech means the successor to these consoles likely won't have backwards compatibility woes.
 
[citation][nom]sumgye[/nom]All it looks like is that, processor-wise, the numbers have been shuffled around a bit, the architecture has changed (*cough* you do realize, Sony, that moving to x86 means we're likely to see a PS4 emulator before a PS3 emulator, right? *cough*), and there's a lot more bandwidth available to move information around (why isn't marketing pushing THIS as a huge point?). Anyway, just my 2 cents.[/citation]
Perhaps Sony is trying to be forward-looking in the marketplace.

The PS4 is using an x64-based architecture, so if I might ask, why would you need an emulator? The other bits don't seem to be widely known yet, so they could also be non-proprietary.

When you release a console, your new software market is limited to that console. If Sony made the right decisions here, that market also includes the currently installed PC base. Imagine the return on developers' investment then? Almost a guarantee for the PS4 to take a higher role as lead platform.

A question that should probably be asked: Is being able to license software to both the PS4 console and PC markets likely to impact your console sales significantly?

How about the software licenses sold, if they're usable on a PC?

I'm speculating that this time around, the console is not being made for a "fat" loss. I'm also speculating that the console crowd is not going to gravitate to the PC market all of a sudden just because they can, nor will the PC market jump ship for any mysterious reason. However, with other players like Steam trying to sneak into the console business, this move would make some strategic sense.

Just as there is a Windows-compatible XBOX 360 controller, this adds the possibility of a Windows-compatible DualShock controller.

See where I'm going here?

On the flip side, AMD seems a bit like an eager kid, a week before his birthday, with their announcements of "wait and see what we got..."

Imagine what is going to happen to software performance on a PC running AMD GCN hardware when developers are specifically optimizing their code for it, knowing they've a dedicated user base in the PS4, and maybe even in the new XBOX console.

All just a pipe dream, however, until more is known. Sony may very well come out with yet another closed system, and I would be disappointed. I don't play games enough to want a console, although once in a while I see a game that piques my interest. If Sony, or even Microsoft could close the software licensing gap with the PC, that would make my day.
 
[citation][nom]blazorthon[/nom]The GPU is supposedly an 18 unit GCN GPU. That's right between the Radeon 7850's 16 and the Radeon 7870's 20. The Radeon 7870 also has a significantly higher frequency. The Radeon 7850 has a 20% lower core count and a 14% lower GPU frequency than the Radeon 7870, giving it 1.72 TFLOPS. The CPU cores aren't going to do much in FLOPS (they're eight low-frequency Jaguar cores, a very light architecture that is an upgrade to AMD's netbook Atom competitor, Brazos), so it's safe to assume that even if that number is combined for the CPU and GPU, the GPU will still be around a Radeon 7850 in performance.[/citation]Jaguar is getting a significantly upgraded 128-bit FPU, much better than Brazos. Lots of new instructions too. So while it is still a lightweight CPU, it is an improvement, and should be able to handle anything the GPU can't.
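
For anyone who wants to sanity-check those numbers, the usual back-of-the-envelope GCN math is compute units × 64 shaders × 2 FLOPs per clock (an FMA counts as two) × clock. The 7850/7870 clocks below are the retail ones; the 800 MHz figure for an 18-CU console part is purely my assumption.
[code]
// Back-of-the-envelope GCN theoretical throughput. The 18-CU clock is an
// assumption, not a confirmed PS4 spec.
#include <cstdio>

double gcn_tflops(int compute_units, double clock_ghz) {
    const int shaders_per_cu = 64;
    const int flops_per_clock = 2;  // one fused multiply-add = 2 FLOPs
    return compute_units * shaders_per_cu * flops_per_clock * clock_ghz / 1000.0;
}

int main() {
    std::printf("Radeon 7850 (16 CU @ 0.86 GHz): %.2f TFLOPS\n", gcn_tflops(16, 0.86));
    std::printf("Radeon 7870 (20 CU @ 1.00 GHz): %.2f TFLOPS\n", gcn_tflops(20, 1.00));
    std::printf("18-CU part  (assumed 0.80 GHz): %.2f TFLOPS\n", gcn_tflops(18, 0.80));
}
[/code]
That puts an 18-CU part roughly between the two retail cards, which is consistent with the "around a Radeon 7850" estimate above.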
 
[citation][nom]alextheblue[/nom]Well, I also haven't seen any mention of an external GPU, but that doesn't mean it doesn't have two iGPUs. After all, I haven't seen any 8-core Jaguar designs announced. It's probably two custom Jaguar+GCN APUs together. With that being said, your assumption that having two GPUs in a console would mean microstuttering is way off base. The reason we have microstuttering has to do with how we have to utilize multiple GPUs to render on PC, for compatibility purposes. On a console you're not going to be using traditional SLI or Crossfire, except maybe as a fallback for lazy devs. Even then, there's a lot they could do to minimize or eliminate microstuttering on fixed hardware like a console. You could even change how you render entirely - the two GPUs could render independently. Basically, having two GPUs on a console means more bang for the buck, without the drawbacks of having SLI or Crossfire.[/citation]
Is AMD planning to sell any 8-core Jaguar designs in the consumer channel? Why would they do that? It may be counterintuitive, but how many people buy budget processors with enthusiast core counts?

It's already been stated that the APU is custom-built for Sony. We have been told what it is. If it were a pair of modules, or an APU with a supplemental GPU, why not just tell us that instead? As if Sony really needs more bad press when somebody tears into a PS4 for the first time and claims they lied to us?

As for micro-stuttering being non-existent on a multi-chip module just because you're calling the device a console instead of a PC, I disagree. A console is generally just a PC with proprietary hardware and software. The same reality applies. If AMD or even NVIDIA could fix the micro-stuttering as easily as you have suggested, I doubt it would still be a topic of conversation.

[citation][nom]alextheblue[/nom]... Basically, having two GPUs on a console means more bang for the buck, without the drawbacks of having SLI or Crossfire... [/citation]
SLI and Crossfire are brandings that have been trademarked for marketing purposes. Do you really think there would be a different chip-to-chip interconnect just because it's a console? The benefits of going with a custom-designed APU coupled to GDDR5, like lower cost and a single device to cool, seem to fade.

On the other hand, how much more would it cost for AMD to finish integrating the rest of the bits into their custom APU, I mean, if it's not already done? They were integrating the FCH into their chips already, so how much more do you need, component-wise?
 
[citation][nom]alextheblue[/nom] Especially if you're putting two APUs together, for twice the CPU and GPU punch, while still maintaining a low cost and thermal envelope!The lower costs could also mean a more reasonable lifespan (4-5 years?), and using more-or-less standard AMD tech means the successor to these consoles likely won't have backwards compatibility woes.[/citation]

The problem with dual APUs is like the problem with dual-socket motherboards.

They're more expensive because you've got an additional CPU/APU that needs its own interconnect, and has to be connected to the other CPU/APU, which drives up motherboard costs.

It would be cheaper to just get a more powerful APU.

Plus, coding for dual GPUs would be more difficult than for a single one. Microstuttering is a result of an inherent flaw in the current method of rendering frames with more than one GPU. Other methods that reduce microstuttering are susceptible to screen tearing.
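
To put a rough number on that inherent flaw: with alternate-frame rendering, two GPUs that each take about 33 ms can still deliver frames in uneven pairs if their dispatch times are offset. The figures in this toy simulation are invented purely to illustrate the pattern.
[code]
// Toy illustration of AFR micro-stutter: two GPUs, ~33 ms per frame each,
// with GPU 1 starting ~8 ms behind GPU 0. Frames arrive in 8 ms / 25 ms
// pairs even though the average looks like a "smooth" 60 fps. Numbers made up.
#include <cstdio>
#include <vector>

int main() {
    const double frame_cost_ms = 33.0;   // per-GPU render time
    const double gpu1_offset_ms = 8.0;   // how far behind GPU 1 dispatches

    std::vector<double> completion;      // times at which frames become ready
    for (int i = 0; i < 6; ++i) {
        completion.push_back(i * frame_cost_ms + frame_cost_ms);                   // GPU 0
        completion.push_back(i * frame_cost_ms + gpu1_offset_ms + frame_cost_ms);  // GPU 1
    }
    for (size_t i = 1; i < completion.size(); ++i)
        std::printf("frame %zu -> %zu gap: %.1f ms\n",
                    i - 1, i, completion[i] - completion[i - 1]);
}
[/code]
Evening out those gaps means either delaying frames or presenting them mid-refresh, which is where the trade-off mentioned above comes from.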
 
[citation][nom]A Bad Day[/nom]The problem with dual APUs is like the problem with dual-socket motherboards. They're more expensive because you've got an additional CPU/APU that needs its own interconnect, and has to be connected to the other CPU/APU, which drives up motherboard costs. It would be cheaper to just get a more powerful APU. Plus, coding for dual GPUs would be more difficult than for a single one. Microstuttering is a result of an inherent flaw in the current method of rendering frames with more than one GPU. Other methods that reduce microstuttering are susceptible to screen tearing.[/citation]

They could be on the same package instead of on different chips in separate slots. If that were the case, then it's also possible to eliminate micro-stuttering by syncing the GPUs. It's also possible to use the older methods of dual-GPU rendering without their drawbacks, because the two could be treated as a single GPU if they're on the same package. This would also alleviate the yield problems that a single large die could create, so it's possible that it would actually be cheaper.
 


MCMs can in fact fix micro-stuttering. They wouldn't need a new interconnect or anything like that either. They'd simply need another clock generator to keep the GPUs in sync or they could treat them as a single GPU.
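
As a rough sketch of what I mean by treating them as a single GPU: split each frame's work across both chips so they contribute to the same frame and finish together, instead of alternating whole frames. The two threads below just stand in for the two GPU halves; it's an illustration, not how any real driver does it.
[code]
// Sketch of split-frame-style work sharing: both "GPUs" (threads here) fill
// different scanline ranges of the SAME frame, so there is no frame-to-frame
// alternation to stutter. Purely illustrative.
#include <cstdio>
#include <functional>
#include <thread>
#include <vector>

void render_rows(std::vector<int>& frame, int begin, int end, int gpu_id) {
    for (int row = begin; row < end; ++row)
        frame[row] = gpu_id;  // pretend this is the rendered contents of the row
}

int main() {
    const int rows = 1080;
    std::vector<int> frame(rows, 0);

    // Each "GPU" takes half the rows of the same frame.
    std::thread gpu0(render_rows, std::ref(frame), 0, rows / 2, 1);
    std::thread gpu1(render_rows, std::ref(frame), rows / 2, rows, 2);
    gpu0.join();
    gpu1.join();  // both halves are done before the frame would be presented

    std::printf("row 0 rendered by GPU %d, row %d by GPU %d\n",
                frame[0], rows - 1, frame[rows - 1]);
}
[/code]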

This is not practical for regular graphics cards because the GPUs typically put on dual-GPU cards run too hot for it to be easily done. AMD and Nvidia also don't have to worry about yield issues as much, since they can sell inferior chips in lower-range cards, whereas a console can't do that, so yields can be a serious issue (the PS3's CPU, for example, was limited by yields to the point where they disabled one of its processing units so that chips with up to one faulty unit would still be usable).

Furthermore, I doubt there would be a horrendous uproar if something like this was done. Who's going to care how many dies are in the main system chip?
 
[citation][nom]bigpinkdragon286[/nom]Jaguar is the name of the core design, and does not denote how many cores are present. Multiple module design, such as Core 2 Quad, doesn't make much sense. Which module are you going to attach the GCN cores to? Plus, you just sacrificed some chip to chip latency and heterogeneous computing, by essentially farming the GCN cores off die for one or more modules, but that was the point of having the GPU cores there in the first place. You might as well move the GCN cores off module if they're not directly attached and regain some thermal ceiling at this point. On the other hand, it sounds like the Jaguar cores are going to have some sort of access to GDDR5, which should remove any immediate bandwidth concerns. I find that to be the most interesting part in all of this. Jaguar is a very small core, by design, so fitting 8 together does not sound unreasonable at all. It's only speculation on my part but perhaps some reasoning behind the decision for small and energy efficient x86 cores was to save more resources for the GPU cores? I really can't think of anybody who is concerned about the amount of electricity being consumed by their gaming console.[/citation]
Sorry, you're right. I should've put Jaguar-based.
 
[citation][nom]blazorthon[/nom]MCMs can in fact fix micro-stuttering. They wouldn't need a new interconnect or anything like that either. They'd simply need another clock generator to keep the GPUs in sync or they could treat them as a single GPU...[/citation]
Do you have any real-world data to back up your assertion? I'm always interested in some good tech reading, and I haven't yet heard of the issue having a solution this simple. I will agree that the micro-stuttering issue is possible to eliminate, but I disagree that putting the two pieces of GPU silicon on the same module is the solution.

On that note, consumers have been living with micro-stuttering this long; if they went with an MCM, would they actually spend much energy trying to fix the problem? I suspect the PS4 would simply inherit the unfortunate stutter as a byproduct of going MCM, if the custom APU were put together in this way.

I see it as a diminishing-returns problem, as much as anything. Why spend the R&D for such a small return? I think the problem will work itself out practically enough, when an engineer working in the graphics department has a eureka moment, or the sharing of the rendering workload is handled in a different fashion.
[citation][nom]blazorthon[/nom]... I doubt there would be a horrendous uproar if something like this was done. Who's going to care how many dies are in the main system chip?[/citation]
Anybody affected by the additional latency inherent in MCM design, but you're spot on as to it being transparent to the end user. I wonder how many clock cycles such a design adds to the trip to the GPU registers? It totally shoots HSA in the foot when you put those transport latencies back in, since you end up questioning whether to do your floating point in-house on the CPU where it started, or take the hit of going out to the GPU. If you can't guarantee which GPU half of your APU you're going to end up on, performance could end up unpredictable.
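
To put that CPU-versus-GPU question in concrete terms, here's a toy break-even model: offloading only pays once the job is big enough to amortize the round-trip latency, and a higher latency (extra hops across an MCM, say) pushes that break-even point up. Every number in it is invented for illustration.
[code]
// Toy offload heuristic: ship work to the GPU only if the estimated GPU time,
// including the round-trip latency, beats doing it on the CPU. All costs are
// made-up illustrative figures, not measurements of any real hardware.
#include <cstdio>

bool offload_to_gpu(double elements, double cpu_ns_per_elem,
                    double gpu_ns_per_elem, double round_trip_latency_ns) {
    const double cpu_time = elements * cpu_ns_per_elem;
    const double gpu_time = round_trip_latency_ns + elements * gpu_ns_per_elem;
    return gpu_time < cpu_time;
}

int main() {
    // A larger round-trip cost raises the break-even point, so more of the
    // small jobs stay "in house" on the CPU.
    for (double latency_ns : {2000.0, 20000.0}) {
        for (double n : {1e3, 1e5, 1e7}) {
            std::printf("latency %6.0f ns, %.0e elements -> %s\n", latency_ns, n,
                        offload_to_gpu(n, 4.0, 0.5, latency_ns) ? "GPU" : "CPU");
        }
    }
}
[/code]
If the scheduling can't even guarantee which GPU half of the package the work lands on, that latency term stops being a constant, which is exactly the unpredictability I'm describing.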

I'm still betting against MCM.
 
Am I the only one who is a little bit worried about this? Forget the 17 GB/s little mistake, we're talking about a chip that will encompass eight "Jaguar" cores.

Is that a joke?! Sony agreed to a console that would run on eight x86 cores that are a successor to AMD's C- and E-series CPUs? CPUs that are used in netbooks and entry-level notebooks?! What a disappointment.

Don't get me wrong, I get the whole "low power consumption" thing, but it's supposed to be a gaming console that has to last for at least four years (if memory serves me right, Sony indicated they'll be releasing a new PS every four years)!

I'm staying with my PC, skipping SONY this time!
 
[citation][nom]bigpinkdragon286[/nom]OT: What's with all the down votes for everybody, all of a sudden? Someone in a bad mood? 🙂 I remember the cost advantage you're talking about very well, but the last few years I've seen PC releases retail at the same price point as console releases. Now that EA is pushing Origin, prices don't even come down anymore for a lot of titles. I've given up buying games upon release due to the price hikes, wait for Steam sales, and for titles to lose their exclusivity to the Origin service, such as Crysis 2.[/citation]

I'd prefer BitComet, it is cheaper :)
 


Cell was mostly a floating point processor IIRC, kinda like a GPU with only a few, fast cores. It's possible that much of that work will be shifted to the GPU, hence the weaker CPU.
 
I'm hoping that the Xbox One and PS4 are too similar to determine a clear winner in the graphics department. Hopefully, it will create better games and exclusives for either system.
 