PS4 and Xbox 720 Graphics Specs Toe-to-Toe, Says Insider

Status
Not open for further replies.
[citation][nom]upgrade_1977[/nom]LOL[/citation]
[img]http://1.bp.blogspot.com/-pMK1K_PwzrU/Tmb2aJPEDoI/AAAAAAAAcOA/vLya3LpljYY/s320/u%2Bmad%2Bbro.jpg[/img]
 
Funny how some people think their PS3 is more powerful than a gaming PC, LOL!
 
[citation][nom]ceeblueyonder[/nom]look above,another stupid car-analogy...[/citation]
Fine. Look at Itanium vs. Xeon. It's the same concept.

You don't own a Sony product, yet you KNOW that the Cell is the best APU/CPU on the market? What are you basing this on? Are you a programmer? Have you worked for a game development company? How closely do you follow the tech world as far as CPU architecture development goes?

I've worked for a game development company before, witnessed development for multiple platforms, have read countless stories about how terrible the Cell is as far as coding goes, etc.

So why do you want the Cell in there vs. an AMD APU? Give us the technical specifics, since you've been impassioned enough to make a dozen or so posts on this topic about why Sony should stick with their existing architecture. If you don't know, then please stop making uninformed posts about how superior the architecture is when you have no clue.
 
[citation][nom]c4v3man[/nom]You know what games are, right? Code. Do you know what programmers use to write game code? Windows (and Linux, when discussing PS2/3). Just because it doesn't have to run Windows doesn't mean that x86 isn't more efficient, or easier to program. The custom Silicon Graphics (SGI) chips used back in the day in rendering workstations, etc. are still exotic, even if they're a decade old. Just because something has been around for a while doesn't mean it's not exotic. And the Cell isn't designed for gaming, it's designed for highly parallel code. Which gaming COULD use, but rarely does, beyond a few threads. The most parallel section of code is the graphics subset, which is already handled by the GPU. Just because I developed my own engine for a car doesn't mean it's not cheaper for me to buy a similar or higher performance engine from another manufacturer, especially if they're making 10-100x the number of engines. Volume reduces costs significantly. AMD can make x86 processors for a heck of a lot less than IBM/Sony/Toshiba.[/citation]


The actual console doesn't run Windows. For example, the PS3 doesn't run Windows; it runs its own proprietary OS, managed by one of the SPEs. Game developers use an OS to run their development software, be it Windows or Linux.

It's general knowledge that a specialized microprocessor such as the Cell is more "efficient" than x86-based microprocessors at certain tasks, such as gaming. Guess what the PS3 does?

OMG! Is it that hard to refrain from car analogies? What's up with that?



 
[citation][nom]c4v3man[/nom]Fine. Look at Itanium vs. Xeon. It's the same concept. You don't own a Sony product, yet you KNOW that the Cell is the best APU/CPU on the market? What are you basing this on? Are you a programmer? Have you worked for a game development company? How closely do you follow the tech world as far as CPU architecture development goes? I've worked for a game development company before, witnessed development for multiple platforms, have read countless stories about how terrible the Cell is as far as coding goes, etc. So why do you want the Cell in there vs. an AMD APU? Give us the technical specifics, since you've been impassioned enough to make a dozen or so posts on this topic about why Sony should stick with their existing architecture. If you don't know, then please stop making uninformed posts about how superior the architecture is when you have no clue.[/citation]

I am willing to bet $50 (I'll make a pay account if I have to) that you don't work for a game development company and never will. Ever.
 
[citation][nom]ceeblueyonder[/nom]The actual console doesn't run Windows. For example, the PS3 runs its own proprietary OS, managed by one of the SPEs. Game developers use an OS to run their development software, be it Windows or Linux. It's general knowledge that a specialized microprocessor such as the Cell is more "efficient" than x86-based microprocessors at certain tasks, such as gaming. Guess what the PS3 does? OMG! Is it that hard to refrain from car analogies? What's up with that?[/citation]

Sure, it may be more efficient clock for clock. But when the Cell is clocked significantly lower than a comparable x86 CPU, that means squat. Obviously none of the consoles run Windows. But they are developed on x86-based machines, and initial testing is done on an x86 target. With games often developed for Windows, PS3, and X360 at once, having them ALL coded for x86 is more efficient, wouldn't you agree?

Just because you now have the knowledge to code for the Cell doesn't mean it's efficient to do so. It's still easier to write one piece of code than it is to write the same code twice, or three times.
 
[citation][nom]ceeblueyonder[/nom]I am willing to bet $50 (I'll make a pay account if I have to) that you don't work for a game development company and never will. Ever.[/citation]
I wasn't employed directly by the development company; I was employed by an IT consultancy, and I was over there about 20 hours a week managing their infrastructure. So, as I stated, I worked for a game developer, talking to developers and watching the build process for a couple of years. Unfortunately, the game they were developing while I was there got cancelled (it was based on a movie that flopped). It was cheaper to cancel the game, probably 90% of the way through its development, than to publish it.

Where's my $50? I'm done... you're just a troll, and nothing is gained by continuing to prove it. The fact that you're arguing for something you don't even own just proves it.
 
[citation][nom]c4v3man[/nom]Sure, it may be more efficient clock for clock. But when the Cell is clocked significantly lower than a comparable x86 CPU, that means squat. Obviously none of the consoles run Windows. But they are developed on x86-based machines, and initial testing is done on an x86 target. With games often developed for Windows, PS3, and X360 at once, having them ALL coded for x86 is more efficient, wouldn't you agree? Just because you now have the knowledge to code for the Cell doesn't mean it's efficient to do so. It's still easier to write one piece of code than it is to write the same code twice, or three times.[/citation]
Those game developers using Linux are using virtual machines to run/test the PS3 games. Games developed for Windows have to be ported over to consoles, and vice versa.

It's not about coding for one specific processor anymore. Whether Cell or AMD, all of this might be for naught if game developers have enough "tech" to work with that whatever code they port to these machines becomes irrelevant, since a game made with such-and-such API or engine can run on a PC or a console. Code and APIs, to me, are becoming heterogeneous, as BF3, Skyrim, and MW3 have shown.
 
"If anything, all this AMD talk could mean that both consoles will be more PC-like than ever before."

Dumbest statement I have ever heard on Tom's.

Consoles have been "like" PCs since the last generation. The Dreamcast, Xbox, and GameCube all used PC-based chip components; only the PS2 really had a unique and different chip design, and that's not something the system got much credit for.

That said, I doubt people will give two flying asses what hardware the next gen uses if it also comes with anti-used-game schemes and other rumored built-in DRM.

Seriously, I'll never get the mentality of the game industry when it comes to property rights and such.

Books have been around for ages, and no one ever tried turning them into "service right" purchases instead of owning your physical book; the same can be said of music and movies. But the game industry (and now the music industry) out of the blue gets it in its head that its products deserve a different set of rules.

I say screw 'em. If they keep getting their way in the legal system, I'll just stop playing games. As horrible as that may seem, I can remember a time in my life when my life didn't involve them and their "services". If they want to make games for themselves, so be it; let them. I'll just read more books again.
 
[citation][nom]c4v3man[/nom]I wasn't employed directly by the development company; I was employed by an IT consultancy, and I was over there about 20 hours a week managing their infrastructure. So, as I stated, I worked for a game developer, talking to developers and watching the build process for a couple of years. Unfortunately, the game they were developing while I was there got cancelled (it was based on a movie that flopped). It was cheaper to cancel the game, probably 90% of the way through its development, than to publish it. Where's my $50? I'm done... you're just a troll, and nothing is gained by continuing to prove it. The fact that you're arguing for something you don't even own just proves it.[/citation]

I think I won this bet, if what you say is true, since you confessed you weren't employed directly by the company.

Just because you fixed their phones and talked to a couple of people there doesn't mean you worked for them or know anything about game development. I mean, even if you became good friends with one of their employees, how much could that employee have told you that would suddenly make you know anything about developing games?
 
[citation][nom]ceeblueyonder[/nom]I think I won this bet, if what you say is true, since you confessed you weren't employed directly by the company. Just because you fixed their phones and talked to a couple of people there doesn't mean you worked for them or know anything about game development. I mean, even if you became good friends with one of their employees, how much could that employee have told you that would suddenly make you know anything about developing games?[/citation]
They told me to do something; I did it. I got a paycheck for doing so. They employed my company, which in turn employed me. I was effectively working for them as a subcontractor.

What game development company do you work for? What technical reasons do you have for using the Cell (specifics)? What function of gaming needs the high-performance floating-point calculations the Cell is optimized for? And why haven't you given a single technical reason why it would be even remotely intelligent to stick 3-4 Cell processors in there (a ridiculous idea anyway, since Sony would simply increase the SPU count rather than go through the difficulty of interconnecting multiple Cell processors)?

Post direct answers to these relevant questions, as opposed to continuing to question my employment history, which has been made clear on my end.
 
[citation][nom]hellfire24[/nom]a $700-$800 pc would kill both of them![/citation]

I don't think anyone can argue with that, but we have to remember these aren't going to launch at $700-800.

More than likely these consoles will launch in the $300-400 range. Don't forget that both of these systems will be competing against the Wii U, which will have a one-year head start on them. So they'll have to be price-competitive not just against one another, but against Nintendo's system, which will have a year's worth of manufacturing behind it; Nintendo will have already streamlined its processes and will be able to cut its console pricing by the time MS and Sony get to market.
 
[citation][nom]c4v3man[/nom]Sure, it may be more efficient clock for clock. When the cell is clocked significantly lower than a comparable X86 CPU, then that means squat. Obviously none of the consoles run windows. But they are developed on x86 based machines. Initial testing is done on an x86 target. With games being oftentimes developed for Windows, PS3, and X360, having them ALL coded for X86 is more efficient, wouldn't you agree? Just because you now have the knowledge to code for the Cell, doesn't mean it's efficient to do so. It's still easier to write one piece of code, than it is to write the same code twice, or three times.[/citation]

Sony needs to get rid of the Cell. I remember reading a story about how Microsoft contracted IBM at the same time IBM was developing the Cell. A good part of the Cell development work got used in the Xbox 360, but not all that SPE nonsense.
 
[citation][nom]kawininjazx[/nom]To build a new console for under $400, it is going to have to use older hardware, not to mention it takes a long time to develop a console. You can't just put in the newest GPU at the last second. Think about this: would you be able to run a game like Gears of War 3 on a PC with a Celeron dual-core, 512MB RAM, and an Nvidia 7600 512MB? No, but on a console you can, and that's pretty much what a 360 is.[/citation]

It's all about the bus speed.

People look at the chips and go "ugh" because, yes, the chips themselves are moderate or low-end hardware, but what empowers a console in leaps and bounds is the motherboard bus speed.

You see, on PC your video card can have an internal bus width of 128-bit, up to the monster cards with 512-bit buses. However, any time data is transferred between system memory and the video card, it gets bottlenecked to 64-bit, because the CPU, the RAM, and the onboard chipsets all run at 64-bit (or 32-bit on Win XP). A console does NOT have that limitation (oh, the shame of calling a PC feature a limitation), but it's true.

The motherboard, CPU(s), and GPU(s) on a console ALL have matching bus widths; in the case of the 360 and PS3, that width was either 128-bit or 256-bit (can't remember which). The point is, on a console the components get MUCH closer to their theoretical speeds because there are no data bus bottlenecks like you see on a PC, so the same components run MUCH faster and more efficiently in the console than if you were to build a PC with them, because on PC everything is tied to the motherboard/CPU bus width, which is 64-bit right now. Keep in mind also that the speed and amount of your system RAM affect how close to theoretical limits your computer operates, hence why more RAM is always good.

Again, on a console they can go with moderate chips, throw everything on a really fast bus, drop in a smaller amount of much faster RAM, and voilà, they've got a cost-efficient console that outperforms a PC with the same specs.

That is also why console ports require more PC hardware; it's all about bus speed.

But don't get me wrong, I'm not saying the PC isn't more advanced than consoles in any way. PCs are drastically more advanced if you are talking about throwing down some serious bank. But the fact remains, PC bus changes occur very, very slowly; it took us nearly 20 years to go from 32-bit CPUs to 64-bit, not because of tech limits but simply because the market can't rewrite architecture that fast, since nothing would ever have compatibility that way. The fact that AMD, and then Intel, figured out how to make x86 64-bit while remaining backward compatible with 32-bit code was a small miracle, one I don't expect to be repeated for 20+ years, when they finally make a 128-bit PC platform.

Now, by all means thumb me down, like I know a lot of PC fanboys and haters will do, but remember while you do this that I'm a much bigger fan of PC gaming myself, and I'm only stating facts here. I'm not trying to take any one side or say that one platform is better than the other (because it is clear PCs are more advanced in general). Again, just stating facts about bus speeds, because a high bus speed is what makes consoles possible as a cheaper gaming platform.
 
The XBOX 720 will be a GREAT CONSOLE!!!

The performance of games on the console will be FAR greater than on a PC with similar hardware.

The mistake people are making is comparing the XBox720/PS4 to a PC. What they SHOULD be doing is comparing them to the 1st generation of the XBox 360/PS3.

If we assume that they will use roughly 180 watts, as with first-generation hardware, then the new systems will be better because of:

a) better power efficiency (smaller die size)
b) a more efficient architecture (more processing for the same power consumption)
c) tessellation (which may eventually use 25% of the power for the same rendering job)
d) a more efficient anti-aliasing algorithm
e) improvements in game engine design

So let's be clear: we will have EIGHT YEARS of improvements in hardware and software by the time the next Xbox is released. If we have 180 watts to work with, then these games will look far, far better than 1st-gen Xbox 360 games, and they'll continue to get better. Tessellation in particular won't be fully optimized at first, so future games will become much more efficient, and anti-aliasing done properly won't have nearly the performance hit it does on current PCs.

Bottom line...
People keep complaining that we aren't using high-end gaming cards. Consoles have a power limit. They did a great job on the XBox 360 and the next version will be far better.

Seriously, what do you expect them to do, create 400-watt consoles?

The XBOX 720 and Sony PS4 are going to be really great, successful gaming consoles.
 
I hope they're both based on AMD silicon, since that might get developers to use OpenCL or DirectComputer for physics acceleration, etc, as opposed to using proprietary code like Physx.
 
[citation][nom]c4v3man[/nom]I hope they're both based on AMD silicon, since that might get developers to use OpenCL or DirectComputer for physics acceleration, etc, as opposed to using proprietary code like Physx.[/citation]
DirectCompute not directcomputer. Stupid typo.
 
[citation][nom]demonhorde665[/nom]It's all about the bus speed. People look at the chips and go "ugh" because, yes, the chips themselves are moderate or low-end hardware, but what empowers a console in leaps and bounds is the motherboard bus speed. You see, on PC your video card can have an internal bus width of 128-bit, up to the monster cards with 512-bit buses. However, any time data is transferred between system memory and the video card, it gets bottlenecked to 64-bit, because the CPU, the RAM, and the onboard chipsets all run at 64-bit (or 32-bit on Win XP). A console does NOT have that limitation (oh, the shame of calling a PC feature a limitation), but it's true. The motherboard, CPU(s), and GPU(s) on a console ALL have matching bus widths; in the case of the 360 and PS3, that width was either 128-bit or 256-bit (can't remember which). The point is, on a console the components get MUCH closer to their theoretical speeds because there are no data bus bottlenecks like you see on a PC, so the same components run MUCH faster and more efficiently in the console than if you were to build a PC with them, because on PC everything is tied to the motherboard/CPU bus width, which is 64-bit right now. Keep in mind also that the speed and amount of your system RAM affect how close to theoretical limits your computer operates, hence why more RAM is always good. Again, on a console they can go with moderate chips, throw everything on a really fast bus, drop in a smaller amount of much faster RAM, and voilà, they've got a cost-efficient console that outperforms a PC with the same specs. That is also why console ports require more PC hardware; it's all about bus speed. But don't get me wrong, I'm not saying the PC isn't more advanced than consoles in any way. PCs are drastically more advanced if you are talking about throwing down some serious bank.
But the fact remains, PC bus changes occur very, very slowly; it took us nearly 20 years to go from 32-bit CPUs to 64-bit, not because of tech limits but simply because the market can't rewrite architecture that fast, since nothing would ever have compatibility that way. The fact that AMD, and then Intel, figured out how to make x86 64-bit while remaining backward compatible with 32-bit code was a small miracle, one I don't expect to be repeated for 20+ years, when they finally make a 128-bit PC platform. Now, by all means thumb me down, like I know a lot of PC fanboys and haters will do, but remember while you do this that I'm a much bigger fan of PC gaming myself, and I'm only stating facts here. I'm not trying to take any one side or say that one platform is better than the other (because it is clear PCs are more advanced in general). Again, just stating facts about bus speeds, because a high bus speed is what makes consoles possible as a cheaper gaming platform.[/citation]

Good stuff, dude.
I will keep this for reference.
Pound for pound, consoles are more efficient.
Best bang for the buck.

 
[citation][nom]gray_fox_98[/nom]Good stuff, dude. I will keep this for reference. Pound for pound, consoles are more efficient. Best bang for the buck.[/citation]
Except that it's mostly wrong; yeah, great stuff. Memory controllers, various interconnects, etc. have nothing to do with the ADDRESS SPACE that code can address; bus width is about how much data can fit through per clock. So 32-bit Windows (whatever version you're talking about: XP, 2000, Vista, 7) doesn't prevent you from using a 128-bit bus between the CPU and the GPU, or the CPU and the northbridge, or the northbridge and the RAM. It has nothing to do with it. All the consoles have less than 4GB of memory, so they're not using a 64-bit address space anyway. They do use 64-bit, 128-bit, etc. for floating-point calculations, but the memory address space is still 32-bit.
 
The most immediate question that comes to mind is: if all the graphics features seen in the Xbox 360 are so good, why aren't we seeing them in the PC space yet?

Xenos's particular range of features is going into a closed-box environment, hence the API can be tailored to expose all of the features of the chip; in the PC space, however, graphics processors really need to be tailored to the capabilities of the current DirectX release. This is where Xenos has an issue: its features and capabilities are clearly beyond the current Shader Model 3.0 DirectX 9 specification, while it lacks features that are expected to be a requirement for WGF2.0.

WGF2.0 has requirements for virtualisation, and whilst Xenos has the luxury of being able to access the entire system memory, this is by virtue of the fact that it is the system memory controller. The virtualisation requirements of WGF2.0 appear to include unlimited-length shaders; Xenos has some hard-coded limits here which, whilst large and defeatable through a couple of methods, probably wouldn't meet the WGF2.0 requirements. When we looked into WGF2.0 in our DirectX Next article there was, at that point, a suggested requirement for the graphics pipeline to have a fully integer instruction set as well as the floating-point pipelines; however, Xenos's ALUs are purely float in operation.

The shader processing design is clearly very different from today's graphics processors, but PCs also have to cater to a greater range of feature utilisation, as there is a quicker evolution cycle where graphics are concerned - some titles being released even now are very limited in their shader use, whilst others utilise shaders extensively. Xenos's design is likely to be most beneficial when the majority of the processing requirements go towards shaders as opposed to the more fixed-function elements of the pipeline. Arguably, though, that balance is already shifting, and if Xenos is as good at shader processing as it purports to be, it still begs the question of why ATI is looking towards a more traditional shader pipeline over the next 12-18 months instead of using this design, even though it has slightly greater capabilities than current PC APIs allow. Perhaps the answer lies in the fact that this is such a big change that trialling it in a closed-box environment makes sense: developers will have more time to tailor specifically to the hardware, since it will stay the same for the next 3-5 years, and the experience gained can assist in the development of a PC architecture based on a similar processing methodology.
 
[citation][nom]c4v3man[/nom]Except that it's mostly wrong; yeah, great stuff. Memory controllers, various interconnects, etc. have nothing to do with the ADDRESS SPACE that code can address; bus width is about how much data can fit through per clock. So 32-bit Windows (whatever version you're talking about: XP, 2000, Vista, 7) doesn't prevent you from using a 128-bit bus between the CPU and the GPU, or the CPU and the northbridge, or the northbridge and the RAM. It has nothing to do with it. All the consoles have less than 4GB of memory, so they're not using a 64-bit address space anyway. They do use 64-bit, 128-bit, etc. for floating-point calculations, but the memory address space is still 32-bit.[/citation]

Couldn't you understand the sarcastic tone of my comment?
LMAO
Cool story, Charlie; I'll make sure I remember it.
 
All you people ragging on these specs obviously have no idea how console system specs work. There are immense inefficiencies in DirectX, and they cost you hundreds as you upgrade your GPU every 12-24 months. If PCs were as efficient as the current gen of consoles, you'd be amazed at the quality they'd deliver from 4-year-old GPUs.

BUT the PC industry has no interest in you keeping your GPU for 5 years, unlike the console market, which BEGS you to keep your console for 5 years. They do this by actually improving the system software and letting developers get more out of what they have, unlike PCs, where no one worries because they know new tech is on the horizon.

It's very much a case of Apples and Oranges.
 