PS4 and Xbox 720 Graphics Specs Toe-to-Toe, Says Insider

Status
Not open for further replies.
[citation][nom]dark_lord69[/nom]Also, another positive is that the 7670 in the PS4 is capable of DirectX 11 and tessellation. xBox 720: If the xBox 720 is running a 6670 and does not have dual cards or an APU to help, it may be a slightly slower machine than the other two. DX11 and tessellation will help the appearance of any games using those features, which will give it a visual advantage over the Wii U. But if a game was created using DX9...[/citation]

1) The Xbox 360 can run games developed in DirectX 9, but it has its own customized version of the API (DirectX 9.0c); some of its features come directly from DirectX 10.
I doubt any of these consoles' DirectX APIs will be a direct copy of the PC counterpart.
2) If there are issues with the Wii U running DX10 and the PS4/Xbox 720 running DX11, I guess most devs will go for the lowest common denominator and port the games to the PS4/720, unless the Wii U is a catastrophic failure.
 
[citation][nom]dragonsqrrl[/nom]I'm aware that it had a unified architecture, but what do you mean by "2 tiers down"? So you're saying that because Xenos had a unified architecture, it therefore performed better and had more in common with an X1950 XTX as opposed to the X1900 XT? First of all, I hope I'm just misinterpreting your run-on sentence, because I don't think that's accurate at all. While unified architectures are technically more advanced because they allow more freedom to allocate resources where needed, that doesn't necessarily translate to greater performance, especially in the pre-DX10 era. In any case, when you're getting down to that level of specificity, I think it becomes pretty difficult to prove your argument; I was speaking in far more general terms. What I'm saying is that in terms of published theoretical specs, such as clock frequency, FP performance, interface bandwidth, etc., Xenos isn't quite up there with the highest end of the X1900 series. However, it also came out before the X1900 series debuted, and at the time it was amongst the most powerful GPUs available.[/citation]




No, no, no. I was quoting stingstang (04/06/2012 7:41 PM), who said: "I remember that the xbox 360 had a graphics chip in it that was about 2 tiers less than the highest desktop card out at the time."

But the fact is the Xenos GPU was far more advanced than anything on the market at the time, and if you could have had that chip on the desktop it would have been considered high end, having 512MB of RAM plus a 10MB daughter die with 256GB/s of bandwidth to handle AA and most post-processing effects with little to no performance drop. The Xenos is what was considered DX9+ capable, as it had some custom shaders software-wise and could do a form of hardware tessellation in DX9 mode, though nothing I know of among 360 games used it.

 
So we're told that developers have been "begging" MS and Sony for new console hardware, and in response they get hardware that would be suited to a mid-range laptop? I know they want to keep the prices down, but this is insane. They could at least go with something like the GTX 460.

For backwards compatibility, the issue is whether the GPU has changed, because there are royalty issues. The 360 is not backwards compatible with the original Xbox because the former has an ATi GPU and the latter had an nVidia one.
 
[citation][nom]upgrade_1977[/nom]With multi-monitor gaming all the rage now, I wonder if the consoles will have multiple video outputs?[/citation]

I don't see how that's possible given the seriously under-powered GPUs. Unless these rumors are false.
 
[citation][nom]soldier37[/nom]lol took the words out of my mouth.[/citation]
It's far cheaper to use standardized x86 hardware than the custom design that is the Cell. Besides, putting in more Cell processors wouldn't yield a massive performance increase in most situations. It's like the Itanium processor... great at a few things, but average or below average in most, all while being excessively expensive.

Do you own anything made by a company other than Sony? The results are what matter, not how you get there. If developers could spend more time making the game look good, rather than wasting a ton of time trying to get something to run on an exotic architecture just because Sony wanted to be different, everybody would win.
 
[citation][nom]gray_fox_98[/nom]UMAD BRO? MS and Sony are businesses; they are out there to make money, and if they think they can get away with doing this, they will. Looks like the massive growth in gaming graphics+physics has stalled (it was unstoppable since the 90s). I am mad too, but what can we do?[/citation]
Do you know why it took them 6 years to "tap out" the architecture? Because it's a PITA to program for, and inefficient with industry-standard coding methods. It's not "so powerful that it took 6 years to fully use it". Games look better today than they did 6 years ago, just like on the Xbox 360; developers make advances in rendering paths and techniques all the time.
 
[citation][nom]dark_lord69[/nom]Wii U - 4870 based
xBox - 6670 based
PS4 - 7670 based
But look at this... In terms of performance:
4870 = 5770 = 6770 = 7770
6770 > 6670 (xBox)
7770 > 7670 (PS4)
So believe it or not, the Wii U is actually based on a chip that has higher performance than the other two. Before you start sending hate mail, read on...
PS4: The PS4 has the GPU clocked at a whopping 1GHz, which may be faster than the standard clock for that GPU (the PC version of the video card), but we don't know for sure because the 7670 is not sold in stores (yet). Also, the PS4 is going to combine GPU power with the APU, thus acting like a hybrid CrossFire (dual video card) setup. This may be enough to push it into a category that allows for performance closer to a 7770 (or beyond). Also, another positive is that the 7670 in the PS4 is capable of DirectX 11 and tessellation.
xBox 720: If the xBox 720 is running a 6670 and does not have dual cards or an APU to help, it may be a slightly slower machine than the other two. DX11 and tessellation will help the appearance of any games using those features, which will give it a visual advantage over the Wii U. But if a game was created using DX9... my money would still be on the Wii U... Why?
Wii U:
1. The 4870 is actually slightly faster than the 5770, 6770 and 7770 (not by much; it's really in the same class of performance).
2. The 4870's GPU and memory clocks in the Wii U actually bring the performance up to the level of a 4890.
What I see...
- The Wii U will be capable of 1080p with plenty of AA and other graphical enhancements as well.
- The PS4's games may look nicer when they use DX11 and tessellation features, but may not use/be capable of as much AA and other graphical enhancements as the Wii U, due to DX11 being more taxing than DX10 (4870).
- The xBox 720 (like the PS4) may look nicer than the Wii U when running games that use DX11 and tessellation features, but without a dual card setup (or hybrid setup) it may not use much AA or many graphical enhancements.
Overall I like the way the PS4 is stacking up against the competition, but they aren't going to get me to pay an insane amount like they initially charged for the PS3 ($600). I like the Wii U's graphics potential, but it doesn't use DX11 and I don't know what to think about that controller. I don't know much about the new xBox 720, so I can't really comment...
Bottom line, they will very likely all be similar in terms of graphical performance, and my money is actually going to the console that puts in (or has the capability of adding) a TV tuner for recording purposes. And it would be really cool if you could plug in a hard drive and use it as a NAS server.[/citation]

You should Google it; this article has everything backwards. It's the NextBox that has been announced to have some sort of dual-GPU setup, while all that's known about the PS4 is that it will have a CPU+GPU made by AMD. If the CPU is an APU, a hybrid CrossFire setup is a possibility, but nothing is confirmed by Sony.

Piss-poor article.
 
Will future games be brought down in graphics quality to meet console needs? I hope not.

If a console can't handle decent graphics now, how will it handle the graphics of the future (imagine trying to run a game from 2018-2020 on these proposed specs)? Not sure when the PS5 is coming out, but my guess is probably within the next 10 years.
 
[citation][nom]ceeblueyonder[/nom]get away with doing what, exactly?and to answer your question, i am not mad. just sad.[/citation]


Not dropping a cutting-edge gaming machine on the market, just like the Xbox 360 was.
The Xenos GPU was as good as the high-end graphics cards back in 2005.
Y U SAD?
 
[citation][nom]c4v3man[/nom]Do you know why it took them 6 years to "tap out" the architecture? Because it's a PITA to program for, and inefficient with industry-standard coding methods. It's not "so powerful that it took 6 years to fully use it". Games look better today than they did 6 years ago, just like on the Xbox 360; developers make advances in rendering paths and techniques all the time.[/citation]

No s#it, Sherlock. It's been like this pretty much since the 32-bit console wars. Devs use set APIs in the beginning to make a good-looking game fast and easy, and then towards the end of the console life-cycle they switch to customized solutions/programming directly to the metal.
Nothing new really.


 


LOL


 
[citation][nom]c4v3man[/nom]It's far cheaper to use standardized x86 hardware than the custom design that is the Cell. Besides, putting in more Cell processors wouldn't yield a massive performance increase in most situations. It's like the Itanium processor... great at a few things, but average or below average in most, all while being excessively expensive. Do you own anything made by a company other than Sony? The results are what matter, not how you get there. If developers could spend more time making the game look good, rather than wasting a ton of time trying to get something to run on an exotic architecture just because Sony wanted to be different, everybody would win.[/citation]

I'm no expert on CPU technology, but a gaming console doesn't have to use x86 hardware because it doesn't have to run a Windows OS.

As for cheaper: an AMD A8-3850 APU is $130. If this speculative PS4 uses a similar lineage plus a dedicated GPU, as the article states, then that is another, say... $100. So that is $230 right there, gone on a pretty low-end gaming rig by PC standards.

Give me a modern Cell processor (or something of that ilk) and an nVidia 680M-like GPU with double the memory the current PS3 has, and give it to me for $400-$600; that is like a "cheap PC gaming rig" price with "high-end stuff" inside. I say high end because an nVidia 680M is obviously a high-end mobile GPU (I can dream, right?), and a Cell processor, however exotic you say it is, is not as exotic as you say, because it's been around for 7 years. And it is high-end in terms of its specialization for gaming, as opposed to an x86 microprocessor.

As for price, Sony will probably save money from leveraging the Cell processor on the hardware front (or make it back like they did with the PS3, since they are co-creators), and since it is Cell-based, Sony will save money on the game development/API front from the 7 years of developing PS3 games...

The results are what matter, and so is how you get there. You said it yourself; it's not one or the other. How you get there will matter because if Sony uses the Cell processor, Sony will be able to leverage cost over time and game development experience, and the unique, non-redundant qualities inherent in a specialized microprocessor such as the Cell will be utilized closer to 100% of its potential, as opposed to a generalized CPU like the x86 microprocessor you mention.

I don't own anything by Sony.

But I wish I owned Sony, because I would still try to use a Cell processor and an nVidia 680M for a reasonable price.

Also, I am sure Sony didn't co-produce the Cell microprocessor just to be different. They don't even market it. Do you see a "Cell processor inside" sticker on PS3s? No.

Your argument is lame, and I am lamer for even answering it.
 
[citation][nom]gray_fox_98[/nom]No s#it, Sherlock. It's been like this pretty much since the 32-bit console wars. Devs use set APIs in the beginning to make a good-looking game fast and easy, and then towards the end of the console life-cycle they switch to customized solutions/programming directly to the metal. Nothing new, really.[/citation]
Exactly. Getting rid of the Cell would be the best thing for gamers, bar none. Then we can get great-looking games much earlier in the development curve.
 
[citation][nom]sweatlaserxp[/nom]The 360 is not backwards compatible with the original XBox because the former has ATi and the latter had nVidia.[/citation]

Errrghh.... Wrong. The Xbox 360 can emulate many old Xbox games; I played Panzer Dragoon Orta, Shenmue 2 and Halo on my Xbox 360 with no problems.
Where the hell did you get your info from?
 
[citation][nom]gray_fox_98[/nom]Errrghh.... Wrong. The Xbox 360 can emulate many old Xbox games; I played Panzer Dragoon Orta, Shenmue 2 and Halo on my Xbox 360 with no problems. Where the hell did you get your info from?[/citation]


I would say the biggest emulation issue between the original Xbox and the 360 would have been the switch to a PowerPC architecture for the 360 versus the 733MHz x86 Celeron that was used in the original Xbox. Even then, like you said, the 360 can still run the old games in an emulator.
 
[citation][nom]c4v3man[/nom]Exactly. Getting rid of the Cell would be the best thing for gamers, bar none. Then we can get great looking games much faster in the development curve.[/citation]

Game developers don't have to learn anything new if they have developed PS3 games in the past; only new game developers do, and 7 years of APIs/game development experience are already available to them.
 
Back in the 90s, we had console exclusive games. If you wanted to play something like Mario Bros, you had to buy a console. They should start doing that again. PC gaming should not be changed to suit the limitations of outdated console systems.

I keep telling myself there are certain game genres that are better suited to consoles (e.g. racing, sports, etc.), while others are really meant to be played on PC.

So if people want to play something like Skyrim, stick to PC for a better experience. It wasn't made for consoles. It was just built to comply with console limitations to sell more copies.
 
[citation][nom]ceeblueyonder[/nom]I'm no expert on CPU technology, but a gaming console doesn't have to use x86 hardware because it doesn't have to run a Windows OS.

I say high end because an nVidia 680M is obviously a high-end mobile GPU (I can dream, right?), and a Cell processor, however exotic you say it is, is not as exotic as you say, because it's been around for 7 years. And it is high-end in terms of its specialization for gaming, as opposed to an x86 microprocessor.

As for price, Sony will probably save money from leveraging the Cell processor on the hardware front (or make it back like they did with the PS3, since they are co-creators), and since it is Cell-based, Sony will save money on the game development/API front from the 7 years of developing PS3 games...

And the unique, non-redundant qualities inherent in a specialized microprocessor such as the Cell will be utilized closer to 100% of its potential, as opposed to a generalized CPU like the x86 microprocessor you mention.[/citation]
You know what games are, right? Code. Do you know what programmers use to write game code? Windows (and Linux, when discussing the PS2/PS3). Just because it doesn't have to run Windows doesn't mean that x86 isn't more efficient or easier to program for.

The custom Silicon Graphics (SGI) chips used back in the day in rendering workstations, etc. are still exotic, even if they're a decade old. Just because something has been around for a while doesn't mean it's not exotic. And the Cell isn't designed for gaming; it's designed for highly parallel code, which gaming COULD use, but rarely does beyond a few threads. The most parallel section of code is the graphics subset, which is already handled by the GPU.

Just because I developed my own engine for a car doesn't mean it's not cheaper for me to buy a similar or higher-performance engine from another manufacturer, especially if they're making 10-100x the number of engines. Volume reduces costs significantly. AMD can make x86 processors for a heck of a lot less than IBM/Sony/Toshiba.
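To make the "highly parallel code" point concrete, here's a minimal sketch (my own hypothetical example in Python, not from any console SDK): the kind of bulk, data-parallel math the Cell's SPEs or a GPU chew through splits cleanly across workers, while most game logic does not.

```python
# Sketch of an embarrassingly parallel workload: split a data set into
# chunks and transform each chunk on its own worker thread.
from concurrent.futures import ThreadPoolExecutor

def transform_chunk(chunk):
    # Stand-in for per-element math (e.g. per-vertex work) that has no
    # dependency between elements, so chunks can run concurrently.
    return [2 * x + 1 for x in chunk]

def parallel_transform(data, workers=4):
    # One chunk per worker (roughly); order is preserved by pool.map.
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(transform_chunk, chunks)
    # Flatten the per-chunk results back into a single list.
    return [x for chunk in results for x in chunk]

print(parallel_transform([0, 1, 2, 3]))  # [1, 3, 5, 7]
```

Serial game logic (AI decisions, input handling, a physics step that depends on the previous one) has none of this independence between work items, which is why a few fast general-purpose cores tend to serve games better than many specialized ones.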
 
[citation][nom]c4v3man[/nom]Exactly. Getting rid of the Cell would be the best thing for gamers, bar none. Then we can get great-looking games much earlier in the development curve.[/citation]

The Cell was a bad idea from the beginning; I never really liked the thing, and most devs shake in fear when they have to code for it.
It was meant to be a media processor: really good at decoding video and helping cure cancer, but a freaking nightmare to code games for.
Sony was betting on Blu-ray to take over the world, and therefore the Cell made perfect sense as a CPU.
It turned out differently: the 2x Cell setup would have failed against the Xbox's Xenos GPU, and Sony had to throw in an nVidia GPU at the last minute. (Blu-ray hadn't overtaken DVD yet.)
Because of this (and Sony's own arrogance), the dev support and tools were very poor compared to the Wii and Xbox 360.
The whole architecture is a bloody mess; the only thing that saved Sony was the credibility from their past glories, the PSX and PS2.
I hope they get it right this time, especially when they are currently facing a $2 billion loss for this quarter.
The games that made Sony special are now multi-platform (e.g. Final Fantasy, Tekken, MGS).
Only Gran Turismo saves the day, but Forza is equally good.
Hmm... let's wait and see...
 
It would be nice if both the PS4 and Xbox 720 supported mouse and keyboard; that would kill one of the plagues of console ports on PC: clunky controls. Also, these specs promise the performance of a PC with a 7850 or similar GPU, and thus will raise the quality of graphics in ported games.

People, you are forgetting two facts. First, the PS3 and Xbox 360 were on par with high-end PCs when they launched. Second, console games don't have to compete for resources with other software and the OS, or manage hardware through clunky drivers. Multicore CPUs didn't help with this at all; today, games compete with other software at the level of threads instead of processes. For developers it's much easier to pull the maximum out of consoles, since they don't have to deal with a gazillion execution-prevention and similar security systems at both the software and hardware level. With a PC that's impossible; feel free to take any game and compare its performance on Vista and Win7, for example.

These new consoles (and forget the Wii U; it's the same generation as the PS3 and Xbox 360) will actually improve the quality of video games for the 2 years after they go on sale, though granted, the quality of console ports will get stuck at that level for the following 4-6 years. So kiddos, stop whining; we PC-exclusive gamers who know how consoles work are expecting 2 years of great PC gaming.
 
