[citation][nom]nforce4max[/nom]Who would have thought back in 2005 that the G70 would be alive long enough to go from 110nm down all the way to 40nm. Then again 24 shader, 8 rop, and only 128 bit MCM package. If any thing besides ram that holds the PS3 back its the RSX while it would be nice to have all the SPEs enabled instead of one electrically dead and another reserved for OS[/citation]
Actually, the SPEs it has available for games are already more than enough; no game can really make use of all that math power for its own core work, at least not with as little instruction-issue capability as the Cell has: it's a VERY SIMD-heavy chip, with the SPEs designed primarily for streaming media (read: Blu-ray and other high-def video) rather than gaming. Uncharted 2 apparently managed to jimmy some use out of them to run a bit of its graphics effects, but that sort of thing is a pain to program.
The one thing the PS3 has in spades is raw SIMD math power: if you have tons of data and need to process it in bulk, the Cell will chew through it at a rate that keeps pace pretty admirably even with modern CPUs (and blazes past what was around in 2006). However, the Cell is short on instruction power: only the PPE can fetch instructions, so the SPEs have to wait for work to be handed down from the PPE to "prime" them before they can get going. A further bottleneck comes from the fact that the SPEs can't directly address main memory; data basically has to be shuffled over into their small local stores, under the PPE's direction, before they can touch it. In essence, the PPE is the chip's bottleneck at every turn, because the SPEs keep having to wait for it to hand them what they need before they can do anything.
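To make that concrete, here's roughly what the SPE-side half of that dance looks like. This is a sketch from memory of the Cell SDK's MFC calls in spu_mfcio.h, so treat the exact names and signatures as approximate: the point is simply that nothing happens until a chunk of data has been DMA'd into the SPE's 256 KB local store, and the SPE sits there waiting while that happens.

[code]
/* Rough sketch of the classic SPE work loop (Cell SDK style, from memory --
   treat names and signatures as approximate). The SPE can't just dereference
   main memory; a chunk has to be DMA'd into its 256 KB local store, crunched
   there, and DMA'd back out. */
#include <stdint.h>
#include <spu_intrinsics.h>
#include <spu_mfcio.h>

#define CHUNK 16384   /* a single DMA transfer tops out at 16 KB */

static float ls_buf[CHUNK / sizeof(float)] __attribute__((aligned(128)));

void process_chunk(uint64_t ea_in, uint64_t ea_out)
{
    unsigned int tag = 1;

    /* Pull a chunk of main memory into local store... */
    mfc_get(ls_buf, ea_in, CHUNK, tag, 0, 0);
    mfc_write_tag_mask(1 << tag);
    mfc_read_tag_status_all();   /* ...and stall until it actually arrives. */

    /* Crunch it: SIMD math on local store is where the SPE earns its keep. */
    vec_float4 *v = (vec_float4 *)ls_buf;
    for (unsigned int i = 0; i < CHUNK / sizeof(vec_float4); i++)
        v[i] = spu_mul(v[i], spu_splats(2.0f));

    /* Push the results back out to main memory. */
    mfc_put(ls_buf, ea_out, CHUNK, tag, 0, 0);
    mfc_write_tag_mask(1 << tag);
    mfc_read_tag_status_all();
}
[/code]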
And yes, RAM is another limitation. RAM quantity is a perennial gripe with consoles: it's always expensive as hell when the console is being designed, so makers tend to go conservative on it (Nintendo more so than others). Bandwidth is also a concern; cost kept the memory interface at 128 bits, since anything wider would've meant potentially hundreds more pins on the CPU/GPU packages, that many more traces through the motherboard (possibly requiring more layers), and more RAM chips on the board, all of which would've driven the price up in ways that wouldn't have come down as fast.
All told, the loss of one or two SPEs isn't really much to worry about. The one locked out in hardware was disabled for the very valid reason of raising yields: originally, the 90nm version had a yield rate that, while nowhere near as bad as, say, Fermi's, was still pretty unacceptable when you're trying to build a console.
[citation][nom]Ramar[/nom]That's mostly an issue with people don't know how to code for the PS3.[/citation]
No, it's not quite that; the PS3 isn't a magical black box full of fairy dust with arbitrary processing capabilities. It's a computer that uses a very PC-like architectural design at pretty much every level, so its strengths and weaknesses are both very real.
Aside from the limitations I described to nforce4max, the whole bit about "coding right" to "get all of the power out of the PS3" really comes down to working with the rather skewed arrangement of the PS3's abilities. In short, the Cell has only so much instruction capacity to go around, since it can only fetch and begin execution of a single instruction per clock cycle. Further, a lot of that gets eaten up moving data in and out of the CPU for the SPEs to use; thankfully, each single move can transfer up to 16,384 bytes, so one instruction is enough to keep an SPE fed for a while in that regard.
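Here's a sketch of how that plays out in practice, building on the snippet in my earlier post (same headers and CHUNK; crunch() is just a hypothetical stand-in for whatever per-chunk math you're doing): with two buffers, the SPE can kick off the DMA for the next 16 KB chunk before it starts chewing on the current one, so the transfer latency mostly hides behind the math.

[code]
/* Double-buffered streaming (sketch): fetch chunk i+1 while crunching chunk i.
   Same headers and CHUNK as the earlier snippet; crunch() is a hypothetical
   stand-in for the actual per-chunk work. */
void crunch(float *data, unsigned int count);   /* hypothetical compute step */

static float buf[2][CHUNK / sizeof(float)] __attribute__((aligned(128)));

void process_stream(uint64_t ea, int nchunks)
{
    int cur = 0;
    mfc_get(buf[cur], ea, CHUNK, cur, 0, 0);   /* prime the first chunk */

    for (int i = 0; i < nchunks; i++) {
        int next = cur ^ 1;
        if (i + 1 < nchunks)   /* start pulling chunk i+1 right away */
            mfc_get(buf[next], ea + (uint64_t)(i + 1) * CHUNK, CHUNK, next, 0, 0);

        mfc_write_tag_mask(1 << cur);          /* wait only on chunk i */
        mfc_read_tag_status_all();
        crunch(buf[cur], CHUNK / sizeof(float));

        cur = next;
    }
}
[/code]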
The main limit is dealing with the fact that while the chip's full SPE array (as configured in the PS3) can handle an impressive 48 operations per clock cycle, you get all of one instruction per cycle to do it with. If your program doesn't have constant use for that sort of arrangement, you aren't going to keep the CPU fully busy, and the fact of the matter is that at some point a game is GOING to need plain single-data instructions.
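For anyone wondering where that 48 comes from (my back-of-the-envelope math, so sanity-check it): the PS3 leaves 6 SPEs for games, each one can retire a 4-wide fused multiply-add every cycle, and a multiply-add counts as two operations, so 6 x 4 x 2 = 48. The catch is that those operations only show up if your data is actually packed 4-wide; something like the scalar loop below runs at a quarter (or worse) of the rate of the packed one.

[code]
/* Scalar vs. packed on an SPE: same math, very different throughput.
   (Sketch; vec_float4 and spu_madd as I remember them from spu_intrinsics.h.) */
#include <spu_intrinsics.h>

/* One result per instruction -- three of the four SIMD lanes sit idle. */
void scale_bias_scalar(float *x, int n, float a, float b)
{
    for (int i = 0; i < n; i++)
        x[i] = a * x[i] + b;
}

/* Four results per instruction -- the shape the Cell actually wants. */
void scale_bias_packed(vec_float4 *x, int n_vec, vec_float4 a, vec_float4 b)
{
    for (int i = 0; i < n_vec; i++)
        x[i] = spu_madd(a, x[i], b);   /* fused multiply-add across 4 lanes */
}
[/code]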