Report: PlayStation 4 Rumored to Use AMD Fusion CPU/GPU

According to benchmarks on Trinity, the CPU is slower than an i7, and the graphics are only 48% faster at most in a real-world game test vs. Ivy Bridge. 48% does not equal 7x; seeing that Haswell is 5x faster than Ivy Bridge, that is still much faster than Trinity.
Also, power usage is 20x less on Haswell vs. Ivy Bridge, which again beats Trinity and means it runs much cooler than Trinity as well.
The top-end Haswell is faster than the current i7, which means that in CPU-intensive games it will blow away Trinity.
 
The point is moot. Firstly, I doubt very much that Trinity will power the PS4. Secondly, for Haswell to be that fast means you're going to be forking out a LOT for it.
 
As far as pricing goes, Sony says they understand a high price was detrimental to sales of the PS3. But do they realize that purchasing power has gone backwards, and $500-600 back then is effectively like $700 now? So the mainstream cannot afford $400 now, even though it's 7 years later. Obviously early adopters buy all the consoles and PCs, but we are a minuscule portion of sales.
 
Haswell will use the same production node as Ivy, so even if Haswell is more efficient than Ivy, it will not have a 5x faster GPU. The die would be too big, or the CPU part would have to be much, much smaller (not gonna happen). Kaveri will have a better GPU and a worse CPU than Haswell... nothing new there... The next production node shrink after Haswell could give Intel the edge, and it could devote more die space to the GPU if it wants to, but that is far, far away.
So far we only have rumours, and if Sony were to use a modified Kaveri, then 2013 would be too early to release the console and even 2014 would be quite difficult to achieve... So Trinity would be more like it, but Trinity does not have a next-gen graphics engine, so something does not match...
 
[citation][nom]techguy911[/nom]According to benchmarks on Trinity, the CPU is slower than an i7, and the graphics are only 48% faster at most in a real-world game test vs. Ivy Bridge. 48% does not equal 7x; seeing that Haswell is 5x faster than Ivy Bridge, that is still much faster than Trinity. Also, power usage is 20x less on Haswell vs. Ivy Bridge, which again beats Trinity and means it runs much cooler than Trinity as well. The top-end Haswell is faster than the current i7, which means that in CPU-intensive games it will blow away Trinity.[/citation]

Haswell can't be twenty times more power efficient than Ivy, let alone use twenty times less power while being faster. They will use the same process node, and beyond that, Haswell is only one new architecture... Twenty times faster would push it far beyond even silicon's limits on current production technology, unless it got that much more power efficient by having a huge number of dies at a very, very low clock frequency, and even then it would only apply in extremely parallel situations. It would basically be an x86 GPU. Furthermore, you're comparing HD 4000, where the cheapest CPU that has it costs over $249, to much cheaper APUs. I was comparing HD 2500 to AMD's APUs because that's the Intel IGP in the Intel CPUs that are in AMD's price range. Intel could have an IGP on the same performance level as Tahiti and it wouldn't matter, because it would only be on far more expensive CPUs at this time.
 
[citation][nom]traumadisaster[/nom]As far as pricing goes, Sony says they understand a high price was detrimental to sales of the PS3. But do they realize that purchasing power has gone backwards, and $500-600 back then is effectively like $700 now? So the mainstream cannot afford $400 now, even though it's 7 years later. Obviously early adopters buy all the consoles and PCs, but we are a minuscule portion of sales.[/citation]

I have to admit, if the next Xbox/PS is more than £250/$400, then I'm out. I just can't afford to pay that much for what is, in effect, a toy. A nice toy, but still a toy. I could have 6 years ago, but not now.
 
An APU and a discrete GPU is not so bad. An APU that can handle some OpenCL compute alongside general computing is still better than the old Cell processor alone. Not to mention the flexibility of diverting some of that power to run CrossFire with the discrete GPU when needed.
From a developer's perspective, it is much easier to port games to these AMD processors than to the Cell.
What can go wrong?
 
[citation][nom]mikeccuk2005[/nom]An APU and a discrete GPU is not so bad. An APU that can handle some OpenCL compute alongside general computing is still better than the old Cell processor alone. Not to mention the flexibility of diverting some of that power to run CrossFire with the discrete GPU when needed. From a developer's perspective, it is much easier to port games to these AMD processors than to the Cell. What can go wrong?[/citation]

What can go wrong? Oh, there's probably an infinite number of ways for this to go wrong, and probably just as many ways for it to go right. For example, the APU and discrete GPU might have poor CrossFire scaling that causes micro-stutter (can you imagine the arguments that would bring out between people who can see it and people who can't?), the system might use too much power and not have sufficient cooling, and there could be many other problems. Whether or not it works remains to be seen, and honestly, unless someone screws something up badly, it will probably work out very well. However, let's not pretend that it must go well.
 
[citation][nom]A Bad Day[/nom]I hope AMD doesn't get rejected like the Apple situation due to insufficient manufacturing capability.[/citation]
It doesn't work like that. AMD licenses out the chip design, and then Sony can have the chips made wherever they want.
 
@ hasten,

The GPU in the Xbox 360 was based on a unified shader design from the Radeon R500 generation. While it was not a high-end unified shader GPU like the X1950XTX was, it was one full generation ahead of what AMD had on the desktop at the time. While the Xbox 360 launched in November 2005, AMD's first unified shader GPU for PCs launched in May 2007 as the 2900XT.

Thus, it's not true at all that you can't have next-generation parts in a console. The main constraints are power consumption and costs (how much is Sony willing to subsidize the console's MSRP).

The other part about the Fusion APU: AMD allows different SKUs to be used in CrossFire, for example the HD7970 and HD7950. It's not inconceivable to have a Fusion APU with low-end HD7000 series graphics and a slightly downclocked HD7850 dedicated GPU in the same console. On the power consumption front, a Pitcairn-based HD7950M (1280 shaders) with a full 256-bit bus and 2GB of VRAM consumes just 50W in total in laptops. By the end of 2013, this GPU will be one generation old, as it will be replaced by the HD8950M.

If you look at the specs, a 725MHz, 1280 SP HD7950M works out to 1.856 TFLOPS. That fits nicely into the sub-50W power envelope, the cost structure and the performance sweet spot. It's not crazy expensive like an HD8970M would be, and it doesn't consume 180-200W of power like a GTX680/7970 would. Using an HD8970 or GTX780 would be completely out of the question due to cost and power consumption limitations.
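For reference, here's where that 1.856 TFLOPS figure comes from; a minimal sketch in Python, assuming the standard GCN counting of 2 FLOPs per shader per clock (a fused multiply-add):

[code]
# Theoretical single-precision throughput = shaders * 2 FLOPs/clock (FMA) * clock rate
def peak_tflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz * 1e6 / 1e12

# The hypothetical downclocked HD7950M discussed above: 1280 SPs at 725 MHz
print(round(peak_tflops(1280, 725), 3))  # -> 1.856
[/code]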

However, forgoing the inefficient and slow Cell / PowerPC architecture and having dual HD7000-series graphics would actually give the PS4 a chance to surpass the Xbox 720 in terms of graphics. The Cell was too expensive and very power inefficient. Using the APU approach provides an added GPU boost and superior CPU performance at the same time. Sony also can't afford to use a modern Core i5 CPU from Intel, since those are too expensive. Most of us here would much prefer an AMD Fusion CPU over PowerPC or Cell 2.0 designs, which are vastly inferior to a modern off-the-shelf x86 processor. I don't think Cell is even possible, since Sony abandoned development of future Cell architectures in 2007.
 
What I meant to say is that the GPU in the Xbox 360 was a unified shader design, but not a high-end SKU in that unified shader HD2000 product stack. While on the PC we had the high-end X1950XTX, it was a fixed pixel/vertex shader design. So MS took a gamble and used a next-generation AMD unified shader mid-range GPU, roughly a 2600XT, and it paid off: the PS3 with RSX, launched a year later, could not really outperform the 360 overall in graphics, other than in a few exclusives such as Uncharted 3, Killzone 3, etc.

@ techguy911,

It is also because of the emphasis on the GPU that under no circumstances should Sony even think about Haswell. They'd get much more GPU power by pairing the 384-512 shader graphics in the Fusion processor with some HD7000/8000 series GPU. A Haswell CPU is a dead end since you won't be able to CrossFire it, and Intel is lucky to sell you a dual-core i3 for $130. If AMD won the GPU design in the PS4, they could deliver a double discount on the GPU and CPU to Sony, which would make Haswell even less competitive on price.

By the end of next year, AMD will have an APU with graphics superior to Haswell's, no question about it. They have a 512-shader Kaveri APU design slated for 2013.

"APU Kaveri will sport eight GCN units for a total of 512 Radeon cores - like Radeon HD 7750."
http://www.nordichardware.com/news/71-graphics/46017-kodnamn-foer-amd-radeon-hd-8000-serien-synas-i-senaste-drivrutinerna.html
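That shader count is just GCN layout arithmetic: each GCN compute unit holds 64 stream processors. A quick sketch of the numbers (the HD 7750 comparison is from the quote above; the per-CU count is standard GCN spec):

[code]
# Each GCN compute unit (CU) contains 64 stream processors ("Radeon cores")
SP_PER_GCN_CU = 64

kaveri_cus = 8          # per the NordicHardware report quoted above
hd7750_shaders = 512    # Cape Verde-based Radeon HD 7750

print(kaveri_cus * SP_PER_GCN_CU)                    # -> 512
print(kaveri_cus * SP_PER_GCN_CU == hd7750_shaders)  # -> True
[/code]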

Pair that with a mid-range HD7000/8000 GCN mobile discrete GPU and the PS4 will be pretty damn good vs. the early rumors of an HD6670/7660-class GPU.
 