IBM Still Developing Cell Processor; Loves Games

Status
Not open for further replies.


With multiplayer capacity, I've seen one game that really proved this: MAG. Terrible game though, laggy as hell, graphically inferior, and a low frame rate. For an exclusive, they really brought the PS3 to its knees, or just did a shitty job making it. Maybe if they had done 40 players instead, with slightly smaller, more detailed maps, it would have been better, although there has to be a community that loves the game.
 
[citation][nom]joe gamer[/nom]I'm pretty sure PC exclusives and even some ports would top those lists. That being said, the PS3 is definitely a much more versatile machine than the 360: standard replaceable HDD, integrated Wi-Fi, free multiplayer, Blu-ray, and extra processing power. If only they had kept costs/prices down and not let the 360 beat the shit out of them in sales, then more games would be better optimized and not built for the 360's paltry specs. AMD says they will be ready once 3D Blu-ray becomes more common; we'll see.[/citation]

At first that was the case, but now the tables are turning. NA is the only territory where the PS3 trails by any significant margin. I think the hardware sales gap is less than 4 million now, last I read. Worldwide software sales in the last year have favored the PS3 over the 360 as well.

[citation][nom]dalta centauri[/nom]With multiplayer capacity, I've seen one game that really proved this: MAG. Terrible game though, laggy as hell, graphically inferior, and a low frame rate. For an exclusive, they really brought the PS3 to its knees, or just did a shitty job making it. Maybe if they had done 40 players instead, with slightly smaller, more detailed maps, it would have been better, although there has to be a community that loves the game.[/citation]

I've never played that, but the general consensus from reviews is along the lines of "it works," though yeah, it could've been designed better. From a technical standpoint I still don't see that kind of developer ambition on the 360, as far as going above and beyond what's been done already. Sure, Halo has a massive online structure, but looking at the game itself in action, even Reach looks fairly mundane and simple next to games like Resistance 2 (60 players), Warhawk, and Killzone 2 (both 32 players), which have proven to hold up well technically.

It'll be interesting to see how the rest of this console generation plays out, but chances are the PS3 will continue to prove it has more long-term value than the Xbox 360, and especially the Wii; ironically enough, for many of the same reasons most people criticized as "unnecessary" at launch, or that Microsoft, with its overpriced proprietary add-ons, has charged its 360 users for ever since to avoid falling behind.
Proof that consumers don't know what they want until it's commonplace enough to be obvious.
 
I don't think any of the current consoles have 'long-term value' anymore, seeing as they all use outdated hardware that's being extended with new controller-type hardware and software that only works as a separate unit. I say next year we need a new generation of console releases, even if they cost twice as much as the ones already out. I would like to see what the typical console gamer thinks about a starting price of $350-500.
Hell, if Sony makes a PS4 that actually does 1080p at 45+ fps, I would think about buying it.

Also, no matter what can be proved or said, I still think Sony and Microsoft adopted Nintendo's original motion-control idea and improved on it. Sadly, Nintendo went about releasing a 'better' motion-control idea at the same time the other companies did, as if they had all arranged a meeting to release the same type of motion control at once.
 
Why did the comments have to turn into a 360 vs. PS3 flamefest? Hardly any of you know much of anything about hardware engineering, so you're mostly just going on hearsay. Worse yet, it seems none of you understand the concept of "engineering": making use of limited resources to design a solution to fit a SPECIFIC need. I guess most of you have been spoiled by the (false) idea that you can have a "one-size-fits-all" piece of hardware, that you can have your cake and eat it, too.

The biggest thing is that the CPU has little to do with the graphics compared to the GPU. PC enthusiasts here have known this for years, though I can understand if most of you console-only gamers don't know this. As it happens, both the 360 and PS3 actually have relatively close GPUs; the PS3 has a slightly cut-down G71, while the 360 has a custom ATI/AMD GPU that sort of resembles a halfway point between an X1k and an HD 2k series card. The main differences are as follows:

- The PS3 has more memory bandwidth; it has 22.4 GB/sec exclusive to the GPU, while the 360 has to share its GDDR3 bandwidth with the CPU. On the flipside, the 360's GPU gets direct access to all 512MB of memory, while the PS3 has to indirectly access the CPU's memory if it needs more than 256MB.
- The PS3's GPU cannot do AA and HDR at the same time. As a result, most games use HDR but do not touch AA. (FF XIII is an exception: it has no HDR, but uses AA instead.)
- Because the PS3's GPU can't do both, it's also not spending power on both; as a result, it puts that headroom toward making all its major games run at 720p, while on the 360 comparable games run at 576p, 600p, or 640p.
- Some sports games run higher than 720p natively on BOTH consoles. However, among "top-shelf" major games (be it GTA 4, MGS 4, Halo 3/ODST/Reach, Uncharted, Fallout: New Vegas, etc.), none run higher than 720p. Playing on a 1080i/p TV will result in the game being "stretched" to fit.
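To put those render targets in perspective, here's a quick sketch in Python of how much fill work each resolution represents relative to 720p. The widths assume roughly 16:9 framebuffers; actual render targets vary per game, so treat the numbers as illustrative:

```python
# Compare pixel counts of common console render targets.
# Widths are assumed 16:9-ish values; exact framebuffers vary per game.
def pixels(w, h):
    return w * h

targets = {
    "1080p": (1920, 1080),
    "720p": (1280, 720),
    "600p": (1024, 600),
    "576p": (1024, 576),
}

base = pixels(*targets["720p"])
for name, (w, h) in targets.items():
    p = pixels(w, h)
    print(f"{name}: {p:,} pixels, {p / base:.2f}x the fill work of 720p")
```

Dropping from 720p to 600p cuts the pixel count by about a third, which is exactly the kind of fill-rate headroom a game can spend elsewhere.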

Overall, the PS3 and 360 are rather close for consoles; no two consoles have been this close in capability in quite a long while. Perhaps the SNES and Sega Genesis were as close, IF you ignore the SNES's huge advantage in audio.

The idea that something can "use a console to its max" is really a bit misleading: thanks to decades of engineering expertise applied to consoles, MOST top-shelf games already "use every bit of a console's power." For the most part, if a game looks better, it's because the developers found a better way to rebalance where resources are spent: typically by cutting quality in the parts you won't notice to boost it where you will. The machine is doing about the same work for about the same results; the results are just arranged in a way you'll notice more.

[citation][nom]Flameout[/nom]i've seen games on xbox360 and they look the same as my ps3. what's so great about cell?[/citation]
Well, when's the last time you watched a Blu-ray movie on your Xbox 360? Most games don't even use most of the SPEs on the Cell. However, they ARE used when you're playing a Blu-ray movie. The Cell in the PS3 was an application-specific chip, designed for the PS3 and nothing more. The PPE (the main core) is used heavily for games, with light usage of the SPEs (the 7 sub-cores). For movies, though, the main core is overkill and is largely spun down, while the SPEs provide the math power a movie decode actually needs, and do so more efficiently.

[citation][nom]joe gamer[/nom]Asymmetrical processing will never be able to compete with symmetrical processing. When you know that all of the CPU's cores are identical programming a multi-threaded app is not difficult, you simply balance the load equally. But an architecture with an asymmetrical processing load?[/citation]
Actually, we've been seeing asymmetrical CPU architectures in use for years and years, and being programmed for pretty darn well. They're called video cards.
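For a rough idea of why asymmetric hardware is still programmable, here's a minimal Python sketch of static load balancing: instead of splitting work equally, you divide it in proportion to each core's throughput. The throughput numbers are made up purely for illustration (think one fast general-purpose core plus seven simpler math cores):

```python
# Static load balancing across asymmetric "cores" by weighting chunk
# sizes to each core's relative throughput -- the same basic idea a
# scheduler uses for a fast control core plus many simpler math cores.
def split_by_throughput(n_items, throughputs):
    """Divide n_items among workers in proportion to their throughput."""
    total = sum(throughputs)
    shares = [n_items * t // total for t in throughputs]
    shares[0] += n_items - sum(shares)  # hand any rounding remainder to worker 0
    return shares

# Illustrative: one fast core (weight 4) plus seven slower cores (weight 1 each).
print(split_by_throughput(1000, [4, 1, 1, 1, 1, 1, 1, 1]))
```

The fast core simply gets a proportionally bigger slice; the total workload is unchanged, which is why "asymmetric" doesn't mean "unbalanceable."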

[citation][nom]joe gamer[/nom]If the Cell processor was really worth a damn don't you think that someone somewhere outside of the PS3 would be using it for something/anything?[/citation]
Funny you mention that... the same sort of design (though modified for native double precision, of course) was used in IBM's Roadrunner, the world's most powerful supercomputer from 2008 to 2009.

The Cell design DOES have its definite uses; in a previous post around here I detailed how it's actually perhaps the most efficient design for supercomputing in terms of performance per watt. It manages to be a more math-heavy (and hence less instruction-heavy) design than conventional symmetrical CPUs like x86 designs, while still maintaining close to its theoretical peak in real-world applications: a PowerXCell 8i reaches about 75-80% of its theoretical maximum in real-world tests, which is on par with major x86 CPUs like Opteron and Xeon. This is FAR better than the 30-35% level achieved by GPUs. Hence, in spite of the Cell's lower THEORETICAL power, it manages to be better suited for general-purpose computing applications than a GPGPU.
 

As for price, is the Cell processor cheaper or more expensive to make and sell than an Opteron or a Xeon?
The Cell processor isn't new, but given how rarely it shows up in hardware, I would assume it costs more because of its scarcity?
 