AMD Will Sell a Modified Version of the PlayStation 4's APU

Status
Not open for further replies.
[citation][nom]gilgamex[/nom]When it comes to APUs, there's no limitation to the die size, only the inherent disadvantages (such as power consumption and heat) that force you to streamline and optimize the CPU and GPU architecture. So while Intel dedicates more die area to the CPU than the GPU, that only means you have an i5 at its inherent size, and the size of their GPU makes up the rest of the die, right? So all AMD did was take the Athlon II architecture and originally make Llano with 5000-series Radeon cores, and then Piledriver with Trinity and 6000-series Radeon cores. So it doesn't matter what the size of those CPU dies are; they are put in at their full size. There isn't a limit, and the same goes for Intel.[/citation]

Um, what? There are tons of inherent limitations to die sizes. Intel took their time releasing Sandy Bridge-E and still hasn't released Ivy Bridge-E because the yield at that die size was too low at the time and remains low on Intel's 22nm process. Nvidia's GK110 chip is massive and suffers a high defect rate as a result; to increase product yield they have to disable part of the chip.
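The die-size/yield relationship above can be illustrated with a first-order Poisson defect model (a standard textbook approximation; the defect density used here is a made-up illustrative number, since real process figures are proprietary):

```python
import math

def poisson_yield(die_area_mm2, defect_density_per_mm2):
    """First-order Poisson yield model: Y = exp(-D * A)."""
    return math.exp(-defect_density_per_mm2 * die_area_mm2)

# Illustrative defect density only -- real process numbers are proprietary.
d0 = 0.002  # defects per mm^2
print(f"~160 mm^2 die: {poisson_yield(160, d0):.0%} good")  # roughly 73%
print(f"~550 mm^2 die: {poisson_yield(550, d0):.0%} good")  # roughly 33%
```

Even with the same defect density, a GK110-sized die loses far more candidates than a mainstream-sized one, which is why vendors salvage partially defective chips by disabling units.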
 

alextheblue

Distinguished
[citation][nom]gm0n3y[/nom]Well it compares to a mid-high range system today. By the time the console is released it will be in the mid-range. This is about the same as the xb360/ps3 were when they were released.[/citation]Actually, when the 360 was released it was pretty strong. Xenon and Xenos compared well to high-end PC components of the day, such as an Athlon 64 X2 4800+ and an X1800- or 7800-based graphics card. On the CPU end (Xenon), it was an in-order chip, but it had lots of FP performance, custom VMX128 instructions, three cores, and SMT. The GPU (Xenos) was an advanced custom design that was ahead of the curve in many ways, such as unified shaders (which didn't show up on the PC until R600) and a lightning-fast custom eDRAM daughter die.

So at the time, the Xbox 360 was actually quite impressive in terms of raw performance. However, they are not headed in that direction this time; now they're aiming for mid-high performance. This will help them keep costs down from the very beginning, instead of taking years and years to recoup their losses. Frankly, they don't need to be high-end, as long as they raise the bar enough to allow for some heavily multithreaded PC titles. :)
 

alextheblue

Distinguished
[citation][nom]A Bad Day[/nom]Label it as A10+, and make sure to mention that it has high performance in Crysis 3 because of C3's heavy multithread support. For the engineering, reduce cores to six, give it quad-channel 2400 MHz DDR3 RAM support (preferably DDR4), and OC it over 2 GHz. A future $600 Tom's Hardware gaming desktop will feature that kind of an APU.[/citation]Even if you could get adequate CPU performance compared to upcoming Steamroller chips, it would require an entirely new socket. So now you've got AM3+, FM2, and some new socket that will probably see almost no support, because it requires a top-of-the-line quad-channel memory setup to get the most out of the iGPU (all while being completely wasted on the CPU side). Also, you're very unlikely to find any laptop manufacturers willing to support quad-channel, and laptops are outselling desktops now.

Plus, on desktops, those of us who still want a desktop with good graphics are perfectly fine with discrete graphics. I don't see a huge market for this desktop-only mega-APU using expensive memory and a new socket. In fact, the whole idea strangely reminds me of Quad FX. I think AMD's current approach of increasing iGPU performance steadily while still offering plenty of discrete graphics solutions is more flexible and will remain the better, safer option for the foreseeable future.
 

InvalidError

Titan
Moderator
[citation][nom]alextheblue[/nom]Also, you're very unlikely to find any laptop manufacturers willing to support quad-channel[/citation]
Quad-channel on laptops would make sense with a BGA CPU and RAM integrated on the motherboard, much like it would be for a discrete GPU. It's much easier to accommodate a 256-bit-wide bus when you can place the ICs around the CPU instead of having to line them up neatly to accommodate SODIMM slots.

Another possibility would be to integrate the first 2-4GB on the APU's MCP itself and make two off-package channels available for extra RAM and bandwidth. The memory controller could be asymmetrical, with on-package DRAM running at 3.2 GT/s DDR4L or a low-power GDDR5 variant, and off-package DRAM at 1600 MT/s DDR3L.

There are countless possibilities once you ditch traditional connectors, sockets and design.
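The bandwidth arithmetic behind the asymmetric-controller idea above is simple peak-rate math (a rough sketch using the transfer rates and widths mentioned in the post; actual effective bandwidth would be lower):

```python
def peak_bandwidth_gb_s(transfer_rate_mt_s, bus_width_bits):
    """Peak theoretical bandwidth = transfers/s * bytes per transfer."""
    return transfer_rate_mt_s * 1e6 * (bus_width_bits / 8) / 1e9

# On-package, 256-bit-wide memory at 3.2 GT/s (the DDR4L / GDDR5-variant idea)
print(peak_bandwidth_gb_s(3200, 256))  # 102.4 GB/s
# Off-package dual-channel (128-bit) DDR3L at 1600 MT/s
print(peak_bandwidth_gb_s(1600, 128))  # 25.6 GB/s
```

The on-package pool would offer roughly 4x the bandwidth of the off-package channels, which is the point of treating the two pools asymmetrically.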
 

demonhorde665

Distinguished
Jul 13, 2008
[citation][nom]bunz_of_steel[/nom]So AMD builds a customized APU for the PS4 but can't put out an APU with some hair on it? Don't get me wrong the apu's I have experienced work great and have no issues playing HD or doing office work. However when it comes to a powerhouse workhorse... I guess APU isn't the way to go but rather the FX line. Problem there is AMD can only compete on price points not performance as the i5 eats their lunch.[/citation]




i5 eats the FX's lunch... can you scream fanboy any louder?

Ignoring power consumption, the 6- and 8-core FX chips perform JUST as well as an i7 or i5 in games. Put both platforms on work programs like 3ds Max and AutoCAD and the FX chips quickly trounce the i5 (not the i7s, though), due to having real physical cores as opposed to just Hyper-Threading. NO, the i5 does not eat the FX series' lunch, unless you are talking about the quad-core FXs.

Now, for the article at hand: I see no reason why AMD should bother releasing a watered-down version of this chip. Seriously? If you ask me, they should wait a year after the PS4 launches and release a full version of it for the PC (minus the Sony-specific tech, for obvious reasons). That aside, their current APU offerings are fair enough; no need for another budget-minded APU.
 

demonhorde665

Distinguished
Jul 13, 2008
[citation][nom]NoMyFriend[/nom]Time for benchmarks:
http://www.anandtech.com/show/6726 [...] -laptops/2
http://www.anandtech.com/show/6508 [...] -tested/14
http://semiaccurate.com/2012/10/01 [...] vy-bridge/
I would post more benchmarks, but just go to AnandTech and see the latest benchmarks. In no way, shape, or form does AMD 'wipe the floor with Intel' in, well, ANYTHING! The only place the benchmarks are even CLOSE is when you measure an 8 core AMD CPU vs. a 4 core Intel CPU like an i3 that doesn't do hyperthreading or is painfully locked in core speed binning. Atrocious for people to try and mislead with information like that.[/citation]


Dumbest Intel fanboy post I've ever read...

First off, the i3 is NOT a quad core; it's a dual core.

2. The FX line of CPUs beats all but the i7s in work applications.

3. As I said before, i5, AMD FX (6- or 8-core), or i7, it doesn't matter when it comes to gaming; they all perform within the same 2-5 FPS margin. See Tom's benchmarks for that.

4. As someone else pointed out, the benchmark tests you list are all fubar, given they test $150 CPUs (AMD) against $500+ Intel chips.

5. AMD has beaten Intel in the past, so to say they never beat Intel at anything is BS. Here's why:

A) They had the first 1 GHz-and-beyond CPUs back in 2000, so they beat Intel in that race.

B) They had the first backward-compatible 64-bit CPU.

C) They had the first dual-core CPU.

D) They had the first native quad-core CPU.

E) They had the first native 8-core (Intel still hasn't released an 8-core desktop CPU).

F) AMD did have the top-performing CPU for a year there with their original 64-bit single-core FX.

G) AMD has delivered very solid graphics chips, while Intel has been unable to produce a video chip that is even as good as two-year-old tech, and they have deeper pockets than AMD and Nvidia combined to throw at it. I'd say this constitutes AMD totally stomping the crap out of Intel in the graphics field.

So yes, they have beaten Intel at several points. Granted, their current architecture is not as good as Intel's, but you can't take away what they have accomplished simply on that fact. Consider also that they made those hallmarks in technology on a research and development budget about 1/4 the size of Intel's.

So if you ask me, I'll go AMD, because those guys can pull some pretty amazing s-- off on a much tighter budget than the Intel guys can. Makes me wonder where Intel would be right now if they had to work on AMD's budget.

Don't get me wrong, I'll get Intel when I can afford their higher-end chips. But currently I see no reason to drop AMD on the low-to-mid end; Intel's i5s are just not that much more compelling for what I do with my computer (3ds Max work and gaming).
 

demonhorde665

Distinguished
Jul 13, 2008
[citation][nom]downhill911[/nom]Exactly! Forget about low profit and almost non-existing segment and go full forward towards high-end where money is made.[/citation]

LOL, the idiots that thumbed you down obviously missed the sarcasm in your post LOL
 

ojas

Distinguished
Feb 25, 2011
Um, I see the excitement, though Jaguar is their sub-20W architecture (next-gen Bobcat). I'm not really sure you can buy this thing off the shelf.

Plus, Jaguar-based SoCs will likely be quad-core... and a part of me suspects that we're still at least a year away from a unified address space, or from seeing GDDR memory used by the CPU as well.
 

kinggraves

Distinguished
May 14, 2010
Pardon me if I missed it skimming past all the AMD vs. Intel jerkoffery, but how much of the chip Sony is getting is "Sony technology," and how easy would it be to build our own PS4s with the non-Sony version? Since it is based on an x86 processor, how easy is it going to be to either rip or emulate Sony's OS and have the PS4 experience? What is Sony going to have in place to prevent piracy on the PS4, and how difficult is it going to be to get around? These are interesting questions. Sony building the console on an x86 platform opens up a lot of possibilities.

Then again, Vita is "based off Android" too, so there may be things we're missing.
 
[citation][nom]ojas[/nom]Um, i see the excitement, though Jaguar is their sub-20w architecture (next gen Bobcat). I'm not really sure you can buy this thing off-the-shelf.Plus Jaguar based SoCs will likely be quad core...and a part of me suspects that we're still at least an entire year away from a unified address space, or seeing GDDR memory being used by the CPU as well.[/citation]

There's no difficulty in making a CPU use GDDR5 memory, at least no more than making a GPU use it.

A unified memory space is also not really that difficult. It's been done for years, decades even, in various products. I don't see any reason why AMD couldn't have it with this APU. Maybe they don't, but they certainly could.
 

gallidorn

Distinguished
Jan 23, 2009
[citation][nom]tipoo[/nom]A cut down version? So basically the article means AMD will not suddenly cancel its plans for APUs with the next gen cores in them? "Cut down" could mean anything, and surely they won't try to put the 7850-like chip in there while limiting it with DDR3.[/citation]

It just shows that AMD can make an APU that is capable of high-end gaming, but they would never release it for desktop/laptop computers, because it would demolish sales of graphics cards. Why would anyone want to buy a video card if they could get the same performance from an APU?
 

InvalidError

Titan
Moderator
[citation][nom]kinggraves[/nom]Since it is based on an x86 processor, how easy is it going to be to either rip or emulate Sony's OS and have the PS4 experience?[/citation]
The PC version would likely have dual-channel DDR3 on socket FM2 rather than quad-channel GDDR5, so with less than 1/5th the RAM bandwidth, the PC version would be unable to reach anywhere near the same graphics performance as the PS4 version.

Also, PS4 software would likely be written to take maximum advantage of shared memory architecture to eliminate unnecessary copying and communications between the CPU and GPU, which could make efficient emulation difficult with discrete GPUs.

Emulation of x86-based consoles may not be as easy as it sounds.
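The "less than 1/5th" figure above checks out with straightforward peak-bandwidth arithmetic (a rough sketch; the PS4 rate is the 176 GB/s GDDR5 configuration Sony announced, and the DDR3-1600 dual-channel figure is the common FM2 setup):

```python
def peak_bandwidth_gb_s(transfer_rate_mt_s, bus_width_bits):
    """Peak theoretical bandwidth = transfers/s * bytes per transfer."""
    return transfer_rate_mt_s * 1e6 * (bus_width_bits / 8) / 1e9

ps4 = peak_bandwidth_gb_s(5500, 256)   # 176.0 GB/s -- GDDR5 as Sony announced
fm2 = peak_bandwidth_gb_s(1600, 128)   # 25.6 GB/s -- dual-channel DDR3-1600
print(fm2 / ps4)                       # roughly 0.145, i.e. under 1/5th
```

So a socketed DDR3 variant starts with roughly one-seventh of the console's memory bandwidth before any other differences are considered.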
 

A Bad Day

Distinguished
Nov 25, 2011
[citation][nom]JacekRing[/nom]"This ability to take one architecture and customize it for various clients (AKA consumers, Sony, etc) is part of the company’s "flexible system on chip strategy"."Let me correct his statment for everyone."The ability to take one APU, downgrade it and resell it to various clients (AKA consumers, Song, etc) is part of the company's "flexible laziness on a chip strategy"."[/citation]

What is AMD supposed to do with APUs that failed QA for the PS4? Throw them out?

And saying "Why doesn't AMD fix the manufacturing flaws?" is equally stupid. Intel also bins its own CPUs, because even their state-of-the-art fabs screw up sometimes.
 

InvalidError

Titan
Moderator
[citation][nom]A Bad Day[/nom]What is AMD suppose to do with APUs that failed QA for the PS4s? Throw them out?[/citation]
The PS4's custom APU has a quad-channel GDDR5 memory controller, which does not leave many options for reusing sub-par chips.

Since this is a custom design created exclusively for Sony, Sony probably buys the chips by the wafer, which means AMD has no such thing as non-qualifying chips to worry about, as long as yield per wafer meets the contract requirements. Another way to sell custom ASIC designs is to license a mask set for whatever foundry the client chooses and let the client sort out yield problems with its foundry directly.
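Why buying by the wafer shifts yield risk onto the buyer can be sketched with a quick effective-cost calculation (all numbers hypothetical, for illustration only):

```python
def cost_per_good_die(wafer_price, gross_dies, yield_fraction):
    """Buying by the wafer: the buyer pays for every die, good or bad,
    so falling yield raises the buyer's effective per-chip cost."""
    return wafer_price / (gross_dies * yield_fraction)

# Hypothetical numbers for illustration only.
print(round(cost_per_good_die(5000, 150, 0.80), 2))  # 41.67 per good die
print(round(cost_per_good_die(5000, 150, 0.50), 2))  # 66.67 per good die
```

The wafer price is fixed, so every drop in yield raises the buyer's cost per usable chip rather than the seller's; this is why such contracts specify a minimum yield.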
 

jurassic1024

Distinguished
Jul 1, 2008
So AMD has to sell their console APU/SoC for PC use to break even. Nvidia was right to pass on this and develop Tegra, which has a far greater chance of bringing in more revenue than some console contract built around low-budget hardware.
 

jalek

Distinguished
Jan 29, 2007
If AMD is going to produce an SX version of the APU, will they later sell a co-processor to boost it back up, with other technologies in place of Sony's? I don't know that it would be the cash cow Intel had in the 386/486 days, but telling people that you're selling them a cut-down version of a processor isn't great marketing.
 


Most graphics cards use cut-down processors and they're the cards that tend to sell the most. Heck, even most CPUs are cut down in some way if they're not a top-end model. I highly doubt that the marketing implications will be an issue.
 
