> Would it have been possible to run any standardized benchmark on it?

In basic terms, it had a unique processor and supported Linux out of the box. Console hardware also tends to be rather efficient at launch compared to other consumer CPUs, and it was pretty inexpensive, all things considered. It included networking, storage, etc.
Similar projects were done with the PS2 as well.
Not sure what measurement you could use for performance outside the listed specs. It really would have depended on the application, whether they used the GPU, and so on.
> Are there any cross-platform benchmarks that would work on the PS3? I would like to test and compare my devices, if possible. By cross-platform, I mean something that includes support for ChromeOS or can run in a browser.

Well, the whole idea for supercomputer usage was clustered processing, so it would scale with the number of PS3s.
Only one I know of that might work is LINPACK for a straight FLOPS benchmark.
"Our 16 PS3 Gravity Grid generates a total performance of 40 GFLOP/s (40 billion calculations per second). It should be noted that this benchmark was run in double-precision and because of the limited RAM on each PS3 we were only able to fit a matrix of size 10K on the entire cluster."
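LINPACK measures sustained floating-point throughput by timing the solution of a dense double-precision system Ax = b. As a rough, hypothetical sketch of what that measurement looks like (NumPy standing in for a tuned LINPACK build; the 2/3·n³ operation count is the standard estimate for LU factorization, and the function name here is made up for illustration):

```python
import time
import numpy as np

def linpack_gflops(n=2000, seed=0):
    # LINPACK solves a dense double-precision system Ax = b;
    # LU factorization costs roughly (2/3) * n^3 floating-point ops.
    rng = np.random.default_rng(seed)
    a = rng.standard_normal((n, n))
    b = rng.standard_normal(n)

    start = time.perf_counter()
    x = np.linalg.solve(a, b)  # LU-based dense solve
    elapsed = time.perf_counter() - start

    # Sanity-check the residual before trusting the timing.
    assert np.allclose(a @ x, b)

    flops = (2.0 / 3.0) * n ** 3
    return flops / elapsed / 1e9  # GFLOP/s

print(f"{linpack_gflops():.1f} GFLOP/s")
```

The Gravity Grid note about "a matrix of size 10K" is exactly this `n`: each PS3 had only 256 MB of RAM, which caps how large a problem the cluster can hold, and LINPACK numbers improve with larger matrices.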
> 1,760 PS3's.

Why was the PS3 used in supercomputers, and how powerful truly was it?
The key was that they were cheap and available. They could be networked and could run standard Linux. That made them attractive for clusters.
> Why was the PS3 used in supercomputers, and how powerful truly was it?

The Cell processor's design could be thought of as an earlier version of how modern GPUs are designed: use a more general-purpose processor to manage a bunch of simpler processors that do all the number crunching. I recall one of the use cases Sony said you could do with the Cell was assisting the RSX GPU with graphics tasks. And indeed, some games used the Cell to perform what would be done in compute shaders today. Compute shaders are often used today to work on screen-space buffer effects like SSAO.
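That control-core/worker-core split can be sketched loosely in plain Python: one general-purpose process (standing in for the Cell's PPE) partitions the data and dispatches chunks to simple workers (standing in for the SPEs) that do the raw number crunching. This is only an analogy for the pattern, not Cell SDK code, and all names here are made up:

```python
from multiprocessing import Pool

def crunch(chunk):
    # Each "SPE" worker does the raw number crunching on its local chunk,
    # much like an SPE working on data DMA'd into its local store.
    return sum(x * x for x in chunk)

def sum_of_squares(data, workers=4):
    # The "PPE" side: partition the data and dispatch chunks to workers.
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with Pool(workers) as pool:
        return sum(pool.map(crunch, chunks))  # gather partial results

if __name__ == "__main__":
    print(sum_of_squares(list(range(10))))  # 285
```

The same shape shows up in a modern compute shader: a host program dispatches a grid of workgroups, each of which crunches its own slice of a buffer.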
> The key was that they were cheap and available. They could be networked and could run standard Linux. That made them attractive for clusters.

Exactly.
> The Cell processor's design could be thought of as an earlier version of how modern GPUs are designed: use a more general-purpose processor to manage a bunch of simpler processors that do all the number crunching.

Wasn't the Cell really difficult to develop for?
I'd argue the Cell was the progenitor of APUs.
> And apparently use a lot less power.

> Exactly.

I wanted to see where the grain of truth in this is, but going by the TOP500 list in 2008, the IBM Roadrunner (which used Cell processors) was the only one on the list to do better than 0.25 TFLOPS per kW (it got ~0.44 TFLOPS per kW).
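That ~0.44 figure is a straightforward back-of-the-envelope division using the commonly cited June 2008 numbers for Roadrunner (roughly 1.026 PFLOPS sustained LINPACK at about 2.35 MW; treat these as approximate, not exact list entries):

```python
# Approximate June 2008 TOP500 figures for IBM Roadrunner (hedged values):
rmax_tflops = 1026.0  # ~1.026 PFLOPS sustained LINPACK (Rmax)
power_kw = 2345.0     # ~2.35 MW reported power draw

efficiency = rmax_tflops / power_kw  # TFLOPS per kW
print(round(efficiency, 2))  # 0.44
```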
> Wasn't the Cell really difficult to develop for?

It was only difficult in the sense that:
> More background on the topic: https://en.wikipedia.org/wiki/PlayStation_3_cluster
>
> Note the change that led to the demise (I had one of these. Dang shame when it happened).

Right, the removal of Linux. I remember now. Security concerns were cited as the reason, I believe.