Xbox 720: 6x Performance Increase, Kinect 2, 3D, 1080p

[citation][nom]guardianangel42[/nom]Every single time and I mean EVERY SINGLE TIME consoles are even MENTIONED in an article discussing performance, a PC gamer chimes in with their PC-centric perspective on hardware requirements while SIMULTANEOUSLY deriding and IGNORING console hardware.If you're even mildly proficient at building computers here's a very relevant thought exercise for you: build a computer with 512MB of TOTAL RAM that can play Mass Effect 2 at medium settings and 720p resolution.You CAN'T. It isn't POSSIBLE to DO with any modern version of Windows. You cannot build a computer that runs even Windows XP (since ME2 runs only DX9) and Mass Effect 2 simultaneously PERIOD, let alone on Medium settings and HD resolution. The system requirements for ME2 alone are twice what's available in a modern console, and that's the bare minimum.Yet somehow consoles do it. My Xbox runs ME2 perfectly fine and, while the textures aren't anything to write home about, it still looks decent.Therefore, anyone who decides they want to compare PC hardware to consoles, anyone that want's to say that the MINIMUM a console needs is a flagship GPU, and anyone that thinks they know enough about game and game console design to speak intelligently on the subject needs to remember that unless you've personally made a game for a game CONSOLE, your frame of reference is irrelevant. PC and console hardware CANNOT be compared.[/citation]

Console OS, drivers, and firmware are all written specifically for the few tasks the console can do and are heavily optimized for them. The OS is extremely basic, so it doesn't need much memory (even XP needs a sizable chunk of that 512MB). Games on Windows use so much memory because of their extremely poor ports and because they run at higher quality than the same game on a console. The drivers don't need to be big either (they're probably just part of the OS and firmware, at least for the most part), so they don't take much. Consoles can do what they do not only because that's what they're designed for, but also because they're ancient, and the software and games they run make that obvious just by looking at them. Compare them to decent PCs of the time and a little after, and suddenly this all changes, and not in the consoles' favor. Using older games that are closer to the console games in picture quality and behavior lets you do exactly what you asked for: gaming on half a GB of memory.

The consoles manage to run games because the console versions are not only natively developed but also more cut-down than the PC versions. Using PC-developed (or at least better-optimized) games lets a PC make far better use of its more powerful hardware than most of the crap ports do.

Furthermore, as other people have probably mentioned, a lot of console hardware is just modified PC hardware and is directly comparable. You don't seem to understand the subject that you're ranting about.
 

buddhabelly34

Honorable
Mar 19, 2012
61
0
10,630
[citation][nom]blazorthon[/nom]The 670, with at least 8GB/s of PCIe throughput, is not PCIe bottle-necked. With 4GB/s, it shows weakness. With anything less, it is now bottle-necked. You'd need at least 4 PCIe 3.0 lanes for a GTX 670. Considering the level of optimization would probably allow the console 670 to play 5760x1200 better than the regular 670 can do 2560x1600 in a desktop, it would need 8GB/s of PCIe throughput, so either 8 PCIe 3.0 lanes or 16 PCIe 2.0 lanes to avoid PCIe bottle-necking well enough for code to be able to account for this issue.[/citation]
I'm sorry, are you telling me that the GTX 670 needs 8 PCIe 3.0 lanes to not have a bottleneck?

http://www.tomshardware.com/reviews/geforce-gtx-680-review-benchmark,3161-5.html

That shows 11.7 GB/s bandwidth transfer on the lane. Pretty sure that is way less than 8 PCIe 3.0 lanes capability (16GB/s x 8 = 128GB/s). Though, admittedly, it is a fair bit more than 2.0 is capable of.



[citation][nom]blazorthon[/nom]The 7670 has VLIW5 and the 7770 has GCN... The two have radically different architectures in their GPUs.[/citation]
That is fine and dandy (I don't keep track of which cards use an old architecture but get a new-generation moniker), but I was simply citing the 7670/7770 because they are mid-to-low-range cards that have been part of the rumors about the new Xbox hardware.


Not trying to start another back and forth with you (like the Minecraft article) so I will leave it at this. Feel free to reply, I'll read it, but I won't pipe up.

Also,
[citation][nom]blazorthon[/nom]What Xbox 360 plays today's games? None. They might play games that were made today but use ancient hardware, but that doesn't make them today's games. Furthermore, I have several computers from 2004-2006 that can play some games today, if you count WoW and such as today's games (I don't, but you seem like you would). No, these aren't high end computers that have been upgraded over the years... A P4 3.0GHz computer with a Radeon x200 or some crap like that can play WoW and can do it above the minimum settings too.[/citation]
couldn't agree more.
 
[citation][nom]buddhabelly34[/nom]That is fine and dandy (I don't familiarize myself with which cards they use old architecture on but give a new gen moniker to), but I was simply stating 7670/7770 because they are mid-to-low range cards that have been part of rumors about the new Xbox hardware.Not trying to start another back and forth with you (like the Minecraft article) so I will leave it at this. Feel free to reply, I'll read it, but I won't pipe up.Also, [citation][nom]blazorthon[/nom]What Xbox 360 plays today's games? None. They might play games that were made today but use ancient hardware, but that doesn't make them today's games. Furthermore, I have several computers from 2004-2006 that can play some games today, if you count WoW and such as today's games (I don't, but you seem like you would). No, these aren't high end computers that have been upgraded over the years... A P4 3.0GHz computer with a Radeon x200 or some crap like that can play WoW and can do it above the minimum settings too.[/citation]couldn't agree more.[/citation]

PCIe 3.0 is 1GB/s per lane. Eight lanes is 8GB/s. Sixteen PCIe 2.0 lanes is also 8GB/s. Sixteen PCIe 3.0 lanes is 16GB/s. The 7770 is about twice as fast as the 7670, maybe more. I'm not trying to start a fight, but you were wrong about one thing, and you treated two very differently designed and differently performing cards as if they were similar.
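
For anyone who wants to check the math, here's a quick sketch of the per-lane arithmetic in Python, using the usual approximate usable rates (roughly 500MB/s per PCIe 2.0 lane and roughly 1GB/s per PCIe 3.0 lane after encoding overhead):

[code]
# Approximate usable PCIe bandwidth per lane, one direction,
# after 8b/10b (2.0) or 128b/130b (3.0) encoding overhead.
PER_LANE_GBS = {"2.0": 0.5, "3.0": 1.0}  # GB/s per lane

def pcie_bandwidth_gbs(generation, lanes):
    """Rough one-directional bandwidth in GB/s for a PCIe link."""
    return PER_LANE_GBS[generation] * lanes

print(pcie_bandwidth_gbs("3.0", 8))   # 8.0  -> eight 3.0 lanes ~ 8GB/s
print(pcie_bandwidth_gbs("2.0", 16))  # 8.0  -> sixteen 2.0 lanes ~ 8GB/s
print(pcie_bandwidth_gbs("3.0", 16))  # 16.0 -> a full 3.0 x16 slot ~ 16GB/s
[/code]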
 

aggroboy

Distinguished
Sep 17, 2010
197
0
18,680
[citation][nom]buddhabelly34[/nom]what xbox can play at 1080p or even over 30fps at 720p? none. what is your point?console gamers need some perspective. your idea of "360 is good enough" screws over those that want progress. seems selfish if you ask me.[/citation]
Can a 2005 PC even play Max Payne 3 at 30fps at 720p?

Or you can check the cutting-edge PC specs at every major console launch: PS1, PS2, Xbox 360.
 

JOSHSKORN

Distinguished
Oct 26, 2009
2,395
19
19,795
At this point, I'd say use an 8-core CPU, 16 GB of RAM, and a GTX 690. Otherwise, forget it. It has to last for the next 10 years, right? That, or allow upgrades.
 
[citation][nom]JOSHSKORN[/nom]At this point, I'd say use an 8-core CPU processor, 16 GB RAM and a GTX 690. Otherwise, forget it. It has to do for the next 10 years, right? That, or allow upgrades.[/citation]

GTX 690? Now you're just being completely unreasonable. Even the 670 is overkill. Something like the 7850-7870 would be ideal if the refresh cycle is kept to a more reasonable time frame. That would play games at 1080p and above for years to come on consoles without being ridiculously expensive or having needlessly high power consumption. 16GB of RAM is also unnecessary; 4GB-8GB should be more than enough. Give it something like a sufficiently high-frequency 256-bit XDR2 interface shared between the CPU and GPU, instead of two separate controllers with DDR3/4 and GDDR5. It could easily fit in a $300-500 price range across different models without sacrificing performance.
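
To put a rough number on the shared-memory idea: peak bandwidth for any memory interface is roughly bus width times per-pin data rate. A minimal sketch, where the 8Gbit/s per-pin rate is purely a hypothetical example (no shipping console XDR2 figure exists to quote):

[code]
# Peak memory bandwidth ~= (bus width in bits / 8) * per-pin data rate (Gbit/s).
# The 8 Gbit/s rate below is a hypothetical example, not a real XDR2 spec.
def peak_bandwidth_gbs(bus_width_bits, gbits_per_pin):
    """Rough peak bandwidth in GB/s for a memory interface."""
    return bus_width_bits / 8 * gbits_per_pin

print(peak_bandwidth_gbs(256, 8))  # 256.0 GB/s for a 256-bit bus at 8 Gbit/s per pin
[/code]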
 

bigdog44

Honorable
Apr 6, 2012
167
0
10,680
6x power across the board could mean:
... Custom HD6970 derivative core at similar clk spd, but with 16 SIMD units, 16-20 TUs, dual geom/tess engines, modded vid engine, but gutted of the back-end.
... 6MB L2 ( 256KB/core + 4 MB shared ).
... A beefy 40-60MB eDRAM (depending on clk spd)
... 32 ROPs
... A variety of powerful cpu options.
... 3GB of GDDR5 at a decent clk spd.
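
As a rough sanity check on that 40-60MB eDRAM figure, a 1080p render target comes out to around 16MB for color plus depth/stencil, and multisampling scales it up quickly. The byte counts below assume common 32-bit color and depth/stencil formats, not anything confirmed about the console:

[code]
# Rough framebuffer size: pixels * bytes per sample * MSAA samples.
# Assumes RGBA8 color (4 bytes) + D24S8 depth/stencil (4 bytes) per sample.
def framebuffer_mb(width, height, msaa_samples=1):
    bytes_per_sample = 4 + 4
    return width * height * bytes_per_sample * msaa_samples / (1024 * 1024)

print(round(framebuffer_mb(1920, 1080), 1))     # ~15.8 MB, no AA
print(round(framebuffer_mb(1920, 1080, 2), 1))  # ~31.6 MB with 2x MSAA
print(round(framebuffer_mb(1920, 1080, 4), 1))  # ~63.3 MB with 4x MSAA
[/code]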
 

bigdog44

Honorable
Apr 6, 2012
167
0
10,680
@ Blazorthon...
I generally agree with most of your posts on this topic and others, but the comment about console CPUs not using GDDR memory is incorrect with regard to the XB360, which has a Unified Memory Architecture.
 
[citation][nom]bigdog44[/nom]@ Blazorthon...I generally agree with most of your posts on this topic and others, but the comment about consoles cpu's not using GDDR memory is incorrect with regards to the XB360 which has a Unified Memory Architecture.[/citation]

My bad; the Xbox 360 does this, but the PS3 doesn't. Regardless, it is not feasible at this time. Graphics memory has very high latency compared to system RAM, and that would likely be unworkable with a processor much faster than the Xbox 360's Xenon. XDR/XDR2 could undoubtedly do it, but not regular graphics RAM.
 

Bloob

Distinguished
Feb 8, 2012
632
0
18,980
If the specs still hold, then the next Xbox would be about 16-20% faster than the Wii U, meaning basically that the Wii U should get most of the next-gen cross-platform games.
 

bigdog44

Honorable
Apr 6, 2012
167
0
10,680
[citation][nom]blazorthon[/nom]My bad, Xbox 360 does this, but PS3 doesn't. Regardless, it is not feasible at this time. Graphics memory has very high latency compared to system RAM and that would likely be unworkable with a much faster processor than the Xbox 360's Xenon. XDR/XDR2 could undoubtedly do it, but not regular graphics RAM anymore.[/citation]

Is Rambus doing OK these days? Last I heard they were in trouble, but that was months ago. Also, unless MS is heavily invested in them, it may be too prohibitive to be an option. With a sufficiently large on-chip eDRAM, latency may be less of an issue for both the CPU and GPU. What do you think?
 
[citation][nom]bigdog44[/nom]Is Rambus doing ok these days? Last I heard they were in trouble, but that was months ago. Also, unless MS is heavily invested in them, it may be too prohibitive to be an option. With a sufficiently large on chip eDRAM, latency may be less of an issue for both the cpu and gpu. What do you think?[/citation]

From what I can tell, Rambus is doing alright, just hurting a little from losing some of their patents that they were using to try to strangle Nvidia and maybe a few others. In their current condition, I'd say that they're probably reasonably willing to play fair for now.

As for the eDRAM cache, well, it depends. The cache would help, but it can't hold everything. At best, such a cache would be about 64MB (48MB or 54MB being more likely). If the system can intelligently manage the cache so that the relevant data is cached right before it is needed, then theoretically it could be done... However, this would be a less power-efficient and much more complex solution than simply using XDR2. There's also always the chance of a cache miss, forcing the CPU to wait out the huge latency of the main memory. It's at least theoretically doable, but it's still the less ideal method.
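
To make the cache-miss point concrete, the usual back-of-the-envelope formula is average access time = hit time + miss rate x miss penalty. The latencies below are made-up illustrative values, not measurements of any real console:

[code]
# Average memory access time (AMAT) = hit_time + miss_rate * miss_penalty.
# All latencies are illustrative placeholders, not real hardware figures.
def amat_ns(hit_time_ns, miss_rate, miss_penalty_ns):
    return hit_time_ns + miss_rate * miss_penalty_ns

# Hypothetical eDRAM cache in front of high-latency graphics RAM:
print(amat_ns(10, 0.02, 300))  # 16.0 ns average at a 2% miss rate
print(amat_ns(10, 0.20, 300))  # 70.0 ns average at a 20% miss rate
[/code]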
 

hakesterman

Distinguished
Oct 6, 2008
563
0
18,980
I think a 6X performance increase is pretty good. The "always on" state I don't like; it had better be a feature you can turn off. Backward compatibility is a plus for those who want it. I wish they would produce a base model and let you customize it to your liking, like you do a PC: you choose your HD size, your memory, your graphics card, etc. I know I'm dreaming, but hey, someone has to stretch the minds of these developers. Go Xbox 1080 Gold.........
 



TheBigTroll

Dude, you probably only had it for one or two years. The Xbox 360 was six years old and it could still play BF3 at an equivalent of medium settings. Back then, people were probably running an 8800 GT.
 

upgrade_1977

Distinguished
May 5, 2011
665
0
18,990
Last year they said the top PC graphics cards were 10x more powerful than the current consoles. Since then they've released new GPUs that are almost 3 times as powerful.

So..... With that logic, we can deduce that if the Xbox 720 were released TODAY,

Xbox 360 or PS3 = 1x
Xbox 720 = 6x
PC = 30x

So with quad SLI and today's fastest graphics cards, theoretically, if SLI or Crossfire were 100% scalable:

PC Quad SLI or Quad Crossfire = 120x

Keep in mind, by the time the Xbox 720 is released, even more powerful cards, possibly double or triple the performance of today's, will be out, so 120x is not a high mark to hit with a PC.
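
Spelling that chain out as plain arithmetic, taking the quoted 10x and 3x figures at face value and treating the 100% scaling purely as the stated hypothetical:

[code]
# The chain of assumptions above, taken at face value (not measured data).
console = 1
next_xbox = console * 6               # the rumored 6x figure
single_pc_gpu = console * 10 * 3      # "10x last year" times "almost 3x" newer GPUs
quad_ideal = single_pc_gpu * 4        # hypothetical 100% quad-GPU scaling

print(next_xbox, single_pc_gpu, quad_ideal)  # 6 30 120
[/code]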

So, I think I'll stick with PC gaming.
 
[citation][nom]upgrade_1977[/nom]Last year they said the top PC graphics cards where 10x more powerful then the current consoles. Since then they released new GPU's that are almost 3 times as powerful.. So..... With that logic, we can deduce that if XBOX 720 was released TODAY, Xbox 360 or PS3 = 1xXbox 720 = 6xPC = 30xSo with Quad SLI and today's fastest Graphics cards theoretically, if SLI or Crossfire was 100% scalable: PC Quad SLI or Quad Crossfire = 120xKeep in mind, by the time Xbox720 is released, even more powerful, possibly double or triple the performance of today's cards, will be out, so 120x is not a high mark to hit with a PC.So, I think i'll stick with PC gaming.[/citation]

With quad Crossfire or SLI, you'd be lucky to get much more than triple a single GPU's performance, if even that. Also, as much as I dislike the impact the now-ancient consoles have had on the industry, at least I'm not so biased as to compare multi-thousand-dollar graphics setups to currently very cheap consoles in order to prove my point (and fail at doing so anyway). You should have at least used proper information.
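
For what it's worth, the gap between ideal and realistic multi-GPU scaling looks like this; the ~80% efficiency per extra card is just an assumed ballpark, not a benchmark result:

[code]
# Ideal (100%) multi-GPU scaling versus an assumed ~80% per extra card.
def scaled_performance(num_gpus, efficiency_per_extra_card=1.0):
    """Relative performance versus a single GPU."""
    return 1 + (num_gpus - 1) * efficiency_per_extra_card

print(scaled_performance(4, 1.0))  # 4.0x with perfect scaling
print(scaled_performance(4, 0.8))  # 3.4x with ~80% scaling per extra card
[/code]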

[citation][nom]TheBigTroll[/nom]Dude. you probably only had it for one or 2 years. the xbox 360 was six years old and it could still play bf3 at an equivalent setting of medium. back then, people were probably running 8800gt[/citation]

Actually, the consoles play it at 720p (or lower) with no AA and texture quality that is lower than medium (they have neither the performance nor the memory capacity for more than that in games such as BF3). Furthermore, they run it at DX9, not the significantly more intensive DX11, and also have other things cut out (probably including tessellation). Also, the 8800 GT is substantially faster than the graphics in the PS3 and Xbox 360.
 

upgrade_1977

Distinguished
May 5, 2011
665
0
18,990
[citation][nom]blazorthon[/nom]With quad Crossfire or SLI, you'd be lucky to get much more than triple a single GPU's performance, if even that. Also, as much as I dislike the industry's impact of now ancient consoles, at least I'm not so biased as to compare multi-thousand dollar graphics setups to currently very cheap consoles in order to prove my point (and fail at doing so anyway). You should have at least used proper information.Actually, the consoles play it at 720p (or lower) with no AA and texture quality that is lower than medium (they have neither the performance nor the memory capacity for more than this in games such as BF3). Furthermore, they play it at DX9, not the significantly more intensive DX11 and also have other things cut-out (probably such as tessellation). Also, the 8800 GT is substantially faster than the graphics in the PS3 and Xbox 360.[/citation]

You should learn to read. Again, I was not giving any improper information. I said
"if SLI or Crossfire were 100% scalable,"
and
"Keep in mind, by the time the Xbox 720 is released, even more powerful cards, possibly double or triple the performance of today's, will be out, so 120x is not a high mark to hit with a PC."

I was not being biased, and last time I checked, you can build a low-end gaming machine that is faster than a console at the same price. Having better graphics is an option. I actually own an Xbox 360; I bought one a few months ago so I could play with my console friends, but I've only played it about three times, because I don't like watching the graphics draw in front of me, I hate jagged edges, adding a grainy effect doesn't equate to better graphics, and I can't stand it when games randomly drop FPS and freeze for a few microseconds at a time. Obviously, if you're used to consoles it might not bother you, but it bothers me. Computers aren't just about having better graphics; the whole overall experience is better: smoother, bigger worlds, more people can play online, etc.
Besides, what's the purpose of spending money on games and being promised backwards compatibility if, when the new console comes out, they don't work? Is that not a waste of money?

PCs aren't a waste of money. You get the performance you want, depending on what your budget allows, and if you want more, you just pay more and upgrade. All my old games still work, I don't have to pay such-and-such a year for an Xbox subscription, and I don't have to sacrifice graphics for gameplay. I really don't get why everyone keeps falling for proprietary hardware, just to get screwed. PCs are popular for a reason: they are modular, and you can build them to suit any need. PCs could easily replace consoles in the living room for the same price, and you'd get an even better experience, but people keep buying consoles....WHY?!?!?!

Another point: you can do so much more on a PC. They can be used for Photoshop, streaming, running servers, creating websites, running a business, watching movies, editing music, etc. Plus it's fun to build them and customize them; sort of the geek's hot rod, if you will. I think the only things consoles had going for them were the price versus a PC and people's lack of education about PC gaming. If they spent billions on PC gaming advertising like they do for consoles, I can only imagine what the PC market would look like today.

Honestly, I think PCs, OnLive, Gaikai, and Facebook gaming are going to take over, and the console market is going to crash...... soon. Probably when the next consoles come out and they cost $800.

 
[citation][nom]upgrade_1977[/nom]You should learn to read, Again, I was not giving any improper information, I said And I was not being biased, and last time I checked, you can build a faster low end gaming machine that will be faster then a console, at the same price. Having better graphics is an option. I actually own an xbox 360, bought one a few months ago so I could play with my console friends, but I only played it like 3 times, because I don't like watching the graphics draw in front of me, I hate jagged edges, and adding a grainy effect doesn't equate to better graphics, and I can't stand it when games randomly drop FPS and freeze for a couple of microseconds at a time. Obviously, if your used to consoles it might not bother you, but it bothers me. Computers aren't just about having better graphics, the whole overall experience is better. Smoother, bigger worlds, more people can play online, ect. ect. ect.Besides, whats the purpose of spending money on games, being promised backwards compatibility, and then when the new console comes out, they don't work... Is that not a waste of money?PC's aren't a waste of money, you get the performance you want, depending on what your budget allows you to, and if you want more, you just pay more and upgrade... and all my old games still work, and I don't have to pay so so a year to get an xbox subscription, and I don't have to sacrifice graphics for gameplay. I really don't get why everyone keeps falling for proprietary hardware, just to get screwed. PC's are popular for a reason, because they are modular, and you can build them to suit any need. PC's could easily replace console's in the living room for the same price, and you'd get an even better experience, but people keep buying consoles....WHY?!?!?!Another point, you can do so much more on the PC, they can be used for doing photoshop, streaming, running servers, creating websites, running a business, watching movies, editing music, ect. ect. ect. Plus it's fun to build them, and customize them. Sorta the geeks hot rods if you will. I think the only thing consoles had going for them was the price vs. a pc, and peoples lack of education about pc gaming. If only they spend billions on PC gaming advertising like they do for consoles, I could only imagine what the PC market would be like today. Honestly, I think PC's, onlive, gakai, and facebook gaming is gonna take over, and the console market is going to crash ...... soon. Probably when the next consoles come out and they are $800 bucks.[/citation]

I know the differences between the capabilities of PCs and consoles. The way that you presented them was very biased and wrong. There is no reason at all to bring up a what-if about better scaling with ridiculously high-end setups, because it doesn't matter; you were using graphics configurations that are at least an order of magnitude more expensive than most consoles are right now, making it a pointless what-if built on hardware that is irrelevant to this subject. I don't like our consoles, and I think my reply to you made that very clear, yet your reply implies that I was acting like some sort of console fanboy by pointing out how poorly you were arguing. I don't even play consoles anymore, strictly because I don't want to put up with outdated graphics and such, so don't pretend that I'm some sort of console fanatic who doesn't know better.
 

upgrade_1977

Distinguished
May 5, 2011
665
0
18,990
Wrong, there is nothing wrong with the way I presented my post. You stated that I used improper information. I didn't. If you read what I wrote, you would see that. I clearly stated that SLI and Crossfire aren't 100% scalable, and that if they were, it would be 120x. I also stated that when the new consoles come out, more powerful cards SHOULD be out, and that 120x isn't that high of a target to hit. Since Moore's law dictates that processing power doubles every 18 months, I think that is a perfectly fair assumption based on the expected release date of the next Xbox.

As far as making you look like a console fanboy, sorry about that, bud; I was just defending my position based on your last post. Yeah, I'm a PC guy, but I really believe PCs are superior in every way. Sorry if that upsets people.
 
[citation][nom]upgrade_1977[/nom]Wrong, there is nothing wrong with the way I presented my post. You stated that I used improper information. I didn't. If you read what I wrote you would see that. I clearly stated that sli and crossfire aren't 100% scalable, and if it was it would be 120x, also, I stated that "when the consoles come out, more powerful cards "SHOULD" be out and that 120x isn't that high of a target to hit. Since obviously moore's law dictates processing power double's every 18 months, i think that is a perfectly fair assumption based on the expected release date of the xbox 360. As far as making you look like a console fanboy, sorry bout that bud, I was just defending my position based on your last post. Yeah, I'm a PC guy, but I really believe PC's are superior in every way. Sorry if that upsets people.[/citation]

Moore's law has absolutely nothing to do with performance. Moore's law (which is really just an observation, not any sort of law) refers to the doubling of transistor density in affordable chips, originally about every 12 months. That pace has demonstrably slowed and is more like 24 months nowadays, but either way it says nothing about performance and isn't every 18 months anymore either.
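
Putting numbers on the doubling-period point; the two periods below are just the commonly quoted figures, treated here as assumptions:

[code]
# Transistor count growth under a given doubling period:
# factor = 2 ** (years / doubling_period_years)
def growth_factor(years, doubling_period_years):
    return 2 ** (years / doubling_period_years)

print(round(growth_factor(6, 1.5), 1))  # ~16x over 6 years with 18-month doubling
print(round(growth_factor(6, 2.0), 1))  # ~8x over 6 years with 24-month doubling
[/code]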

My point was that your what-if statement was pointless because it doesn't matter; it's a what-if, and on top of that, the information you based it on is wrong. I could also say that if the NetBurst architecture had hit its intended 10GHz mark when it was supposed to, the consoles could have been outperformed in CPU performance a long time ago, but that doesn't make it relevant or even correct. Whether or not you explained that it isn't true doesn't matter, because you made the comparison anyway.

I'm not upset by your views of consoles versus PCs and I more or less agree with them.
 