Next Xbox Rumored to get Two GPUs, Hexacore CPU

Page 4 - Seeking answers? Join the Tom's Hardware community: where nearly two million members share solutions and discuss the latest tech.
Status
Not open for further replies.
"The VG24/7 article also mentions that an always-on Internet connection will be a requirement for the Xbox for piracy protection purposes."

-Power or internet goes out-

No problem, I can simply play offline mode.

OH, WAIT.
 
[citation][nom]A Bad Day[/nom]"The VG24/7 article also mentions that an always-on Internet connection will be a requirement for the Xbox for piracy protection purposes.-Power or internet goes out-No problem, I can simply play offline mode.OH, WAIT.[/citation]
How are you gonna play with no power anyway? lol. Internet outages are a real concern, though.
 
CF solution? I don't think so... When did a console ever have good graphics? I swear, man, the rumors are just ridiculous... and why would an Xbox need six cores just to power everything? Four is more than enough for consoles...
 
[citation][nom]JonnyDough[/nom]With that justification I can steal a car I wouldn't have purchased, but then buy one down the road when I have money. I think you're just trying to justify stealing. Here's a better idea, download the demo for the game - or rent the game. Or wait for the game to come down to a reasonable price. Or realize its a luxury item and do without it.[/citation]

You're wrong and here's why. When you download a song, movie, or game to try it out, the creator loses nothing. If you don't like it, you delete it. The creator didn't lose anything and didn't gain anything. If you do like it and you buy the whole album/movie/TV series/game, then not only did the creator not lose anything from you trying it out, they also gained from you buying.

If you steal a car, then the owner obviously lost something. Even if you buy one the next day, they still lost something. Also, not all games have demos, and why should I pay to find out whether I want to buy a game by renting it? That's ridiculous. So I need to pay just to learn whether I like a game? If I don't like it, I've wasted money finding out; even if I do like it, I end up spending more than if I'd just bought it outright.

Waiting for a game to become cheaper? Yeah, sure, let's wait three to five years for a game to become cheap and never play anything new. That's truly fair.

[citation][nom]buddhabelly34[/nom]So integrate driver management into Steam. A simple dialogue box "Hey we noticed there is a driver update for your graphics card that will make this game run better. Want us to install it for you?"Similar to what they do with the multitude of Direct X versions already.People who want it to "just work" shouldn't define a market as varied as this. They have had control for too long. I personally hope like hell that the always-on requirement and no used games idea takes hold. Many will switch to PCs and I'll finally get properly optimized games. My GTX460 should be plenty powerful enough for 100% of the games on the market for the next two years at 1080p full graphics if the Xbox 360 is able to play it at 1080p with low-med settings. Alas, this is not the case because no company cares about the PC market. They assume we all pirate their games (when in reality Gamestop is the biggest pirate of all).You heard me, Gamestop is the pirate, not the guys on TPB. At least most of them don't condone keeping the game without buying it at some point. Gamestop just takes all the companies profits after the first sale of the game.Wow what a tangent.[/citation]

Your GTX 460 doesn't come close to playing 1080p at maxed out settings in several games today. It would be one hell of a GTX 460 if it played Metro 2033 or BF3 at 1080p with the settings maxed out and some AA. You can't even do that on a GTX 560, so there's no way that the slower 460 can do it.

I certainly hope that this always-on DRM does not catch on. It's a ridiculous thing to require and won't stop pirates any more than current DRM does (which isn't a whole lot, excluding the idiot pirates). We already see pirates finding work-arounds and hacks so that games don't need the always-on internet connection.

[citation][nom]nhat11[/nom]It's a console, there's always DRM in some form or shape. You can't upgrade parts on a console, it's a static piece of hardware besides a couple of proprietary add-ons. This hasn't changed since consoles came out.[/citation]

That there is DRM hasn't changed, but the DRM has changed. What fraction of Xbox 360s and PS3s don't even have an internet connection? I bet it's pretty high. Forcing people to always have an internet connection to their console is a bad move. I know that several of my friends and family members don't have an internet connection to their console(s).
 
[citation][nom]Kelvinty[/nom]CF solution? I don't think so... When did console ever have good graphics? I swear man, the rumors are just ridiculous... and why would a xbox need 6 cores just to power everything? 4 is more than enough for consoles...[/citation]

The links to the actual news releases about this clearly state that there are supposed to be two GPUs, but NOT in the current form of CF, where the two GPUs render frames in an interleaved, one-after-the-other manner. They are supposed to use a different, yet-to-be-disclosed method of dual-GPU cooperation (assuming that the console even has two GPUs).
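For anyone unfamiliar with the interleaved mode being contrasted here (commonly called alternate-frame rendering, or AFR), a minimal sketch of the idea follows. The function name and frame counts are purely illustrative, not any real driver API:

```python
# Illustrative sketch of alternate-frame rendering (AFR), the usual
# CrossFire/SLI mode: consecutive frames are handed out to the
# available GPUs in an interleaved, one-after-the-other manner.
def afr_assign(frame_index, gpu_count=2):
    """Return the index of the GPU that renders a given frame under AFR."""
    return frame_index % gpu_count

# With two GPUs, frames simply alternate between them:
assignments = [afr_assign(i) for i in range(6)]
print(assignments)  # [0, 1, 0, 1, 0, 1]
```

This frame-by-frame hand-off is also part of why AFR is prone to micro-stutter: the two GPUs can finish their frames at uneven intervals.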

Also, core count tells us NOTHING about the CPU's performance. For all we know, its cores are as slow as Athlon II or FX cores, in which case it takes six of them to match four Sandy Bridge cores. To know the performance of a CPU, you must know both the core count and the performance per core. Sometimes it's even more complicated than that, because of Turbo, or because shared resources get more bogged down when multiple cores are in use than when a single core is active. The latter is part of why, on an FX processor, a single active core in a module performs slightly better than half the throughput of both cores being active, even with Turbo disabled.

You don't know exactly how fast a CPU is until it has been benchmarked. For all we know, it's six highly clocked ARM Cortex-A15 cores, and thus still not much faster than some low-end desktop CPUs.
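The core-count-versus-per-core-speed point can be shown with some back-of-the-envelope arithmetic; the scores below are made up purely for illustration, not real benchmark results:

```python
# Hypothetical per-core scores, purely illustrative (not benchmark data).
def relative_throughput(cores, per_core_score, scaling=1.0):
    """Aggregate multi-threaded throughput: cores * per-core speed * scaling.

    'scaling' models losses from shared resources (e.g. an FX module's
    shared front end) when all cores are loaded; 1.0 is perfect scaling.
    """
    return cores * per_core_score * scaling

# Six slow cores can match four fast ones in aggregate...
print(relative_throughput(6, 100))  # 600.0
print(relative_throughput(4, 150))  # 600.0
# ...and shared-resource contention can erode that on top:
print(relative_throughput(6, 100, scaling=0.75))  # 450.0
```

None of which tells you how the chip feels in a lightly threaded game, which is why benchmarks are the only real answer.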
 
[citation][nom]confish21[/nom]I hope the new lines of consoles are as powerful as possible.[/citation]
They are always as powerful as possible, but that still means they will be weaker than my now one-year-old PC by a million miles, simply because they have to keep their prices massively lower. We may spend $500+ on just a graphics card for a PC, but they have to fit the graphics card + CPU + mobo + controllers + Kinect + case + memory + PSU + HDD, etc., all for around only three or four hundred bucks. So the graphics chips are extremely weak in comparison to PC graphics.
 
[citation][nom]rmpumper[/nom]X720 - always-on Internet connectionPS4 - no used game supportMy guess is that a lot of people will migrate to PC's in coming years.[/citation]Yeah! Then they can play all those physical media offline-only PC titles, with no DRM, that they bought used! Oh wait...
 
You know what? Ubisoft is probably the only company whose always-on games I'd pirate heavily. If the Xbox is going to be like this, I bet many will join the pirates for sure. Good luck to the developers.
 
[citation][nom]blazorthon[/nom]Your GTX 460 doesn't come close to playing 1080p at maxed out settings in several games today. It would be one hell of a GTX 460 if it played Metro 2033 or BF3 at 1080p with the settings maxed out and some AA. You can't even do that on a GTX 560, so there's no way that the slower 460 can do it.[/citation]
I'm talking more about console ports. Yeah, I accept the fact that something as taxing as Metro isn't gonna happen (but I don't care, because the game itself isn't very good), since it's geared toward enthusiast-level hardware. But that doesn't mean I should see only marginal gains over six-year-old hardware just because companies take their time to develop properly for consoles.

And I think you missed the point: the Xbox 360's X1800 is showing performance way too similar to my GTX 460's. Though in the past year that has proven less true, specifically in Skyrim's case, even prior to the patch. It just doesn't seem right that six-year-old hardware is performing so close to two-year-old hardware.

compare: http://www.gamespot.com/forums/topic/28877581/gtx-460-and-bf3
with
www.lensoftruth.com/head2head-battlefield-3-analysis/

Apparently the difference four-ish years of tech makes is only a slider at 1920x1080, from low (what I guess is low) to about medium/high. That is what I have a problem with. If that game was developed for Nvidia cards specifically, then I would see maxed-out settings on my PC, considering the 360's hardware and its performance...



Note: it's hard to be incredibly accurate when benchmarkers are concerned with new tech and not my old GTX 460. I don't feel like benchmarking it myself either, but I can assure you that 1920x1080 on medium/high is about 30 FPS.
 
[citation][nom]blazorthon[/nom]Micro-stutter is pretty much eliminated completely by activating V-Sync if you have more than 60FPS, so if two GPUs are a better option for some reason, there is little reason to not do it just because of the problems we currently associate with multi GPU setups since those problems are a thing of the past if you have the right setup. If MS wants two GPUs, then that means it may have better multi GPU support on the PC and if MS does it properly, then there won't be micro-stutter, variable FPS, and the like (properly meaning simply keep FPS above 60 and activating V-Sync; very simple things to accomplish).PCs can play older games. Some very old games don't run properly with Vista, 7, and newer, but just use XP mode or an XP virtual machine and there's no problem. Games so old that they don't run in the new OSs will run in XP and if they are that old, then a virtual machine's performance is probably enough for them anyway (unless you have an already slow computer).[/citation]

I was thinking more along the lines of: PC gaming is already thought to be expensive as hell. It isn't, but it's thought to be that way, and two GPUs would just make that perception worse.

Add to that, a port will always be a port... you know how badly games play now even when they're ported somewhat competently... imagine how badly they'll play when they're built around a multi-GPU setup...

The thing is, companies that get more resources use them... they will brute-force problems if they can. The reason we have engines that run VERY well right now (the Modern Warfare one) is that they had to trim all the useless crap out of the game to get it to play that well, whereas earlier they were bloat-heavy engines.
 
[citation][nom]EzioAs[/nom]This is good news right? If they integrate 2 GPU into the Xbox, all game developers will optimize their games and SLI or crossfire will scale better for the PC once it's being ported or am I wrong?[/citation]
The GPUs will not be running in SLI, according to this rumor. Instead, they will run independently.
 
"The VG24/7 article also mentions that an always-on Internet connection will be a requirement for the Xbox for piracy protection purposes."
Just one more reason why pirating games is better - you don't get that BS (at least on PCs anyway).
 
[citation][nom]alidan[/nom]i was thinking more allong the lines of pc gameing is already thought to be expensive as hell, it isnt, but is thought to be that way. 2 gpus would just make it that much worse. add to that, a port will always be a port... you know how bad games play now when they are ported even somewhat ok... imagine how bad they will play when they require a multi gpu setup...the think is companies who get more resources, use them... they will brute force problems if they can... the reason we have engines that run VERY well right now (the modern warfare one) is because they had to trim all the useless crap out of the game to get ti to play that good, where as eairlier, they were bloat heavy engines.[/citation]

It wouldn't require two GPUs; the Xbox would simply have two GPUs anyway. A single GPU with the power of those two could play it too. The difference is that two lower performance GPUs are more energy efficient than a single high performance GPU of the same generation even if their aggregate performance is the same. For example, two 6770s are on par with a single 6970 in performance despite using significantly less power than a single 6970.

Having two GPUs also allows one GPU to do the graphical work while another is doing Physx or something else at the same time, not just letting them work in tandem.

[citation][nom]buddhabelly34[/nom]I'm talking more about console ports. Yeah, I except the fact that something as taxing as Metro isn't gonna happen (but I don't care because the game itself isn't very good) because it's geared to enthusiast level hardware. But that doesn't mean I should see marginal performance over 6 year old hardware just because companies take their time to develop properly for consoles.And I think you missed the point, the Xbox 360's X1800 is showing way too similar of performance to my GTX460. Though in the past year that has shown to be less true. Specifically in Skyrim's case, even prior to the patch. Just doesn't seem right that 6 year old hardware is performing so relative to 2 year old hardware.compare: http://www.gamespot.com/forums/top [...] 60-and-bf3withwww.lensoftruth.com/head2head-battlefield-3-analysis/appparently the difference in 4ish years of tech is only a slider on 1920x1080 from low (what i guess is low) to about medium/high. That is what I have a problem with. If that game was developed for nvidia cards specifically then I would see maxed out settings on my PC, considering the hardware of the 360 and it's performance...Note: hard to be incredibly accurate when actually benchmarkers being concerned with new tech and not my old gtx460. i dont feel like benchmarking it myself either, but i can assure you that 1920x1080 on medium/high is about 30fps.[/citation]

You both have some points about the console ports, but answer me this: just how bad were console ports three, four, and five years ago, back when DX9 graphics cards like Radeon 3000/4000 cards and Geforce 8000/9000/GTX 200 cards were more common? I could be wrong, but I don't think they were as bad in comparison to the difference between console ports and native PC games.

Unless I'm wrong, it will be several years after the next-generation consoles come out before console ports are again truly inferior to games designed specifically for the desktop, as they tend to be now.
 
[citation][nom]Bloob[/nom]I wouldn't want to pay over a 100$ for every boxed game I buy...But if it's always online, then I won't be getting it anyways...[/citation]
There is also a rumor that the price of games could double because of the 'huge' development costs of designing games and characters with more complex polygons. I think that rumor, along with most of these other rumors that change every week or so, is false. MS would be really stupid to force constant-online and no used games.

As a PC gamer, I like the idea of a dual-GPU, multicore-CPU design, which in turn should result in better PC ports.
 
[citation][nom]nhat11[/nom]It's a console, there's always DRM in some form or shape. You can't upgrade parts on a console, it's a static piece of hardware besides a couple of proprietary add-ons. This hasn't changed since consoles came out.[/citation]
There's a difference between fixed hardware on a console and DRM. Forcing an always-on internet connection only acts as an INCONVENIENCE to loyal customers. The whole combating-piracy issue is a joke; pirates will always find ways around it. (This also raises the question of why they need protection from piracy if they're using physical media as speculated. To me it sounds like a way to kill the used-game market.)

If Microsoft allowed free 'Silver' memberships to play basic multiplayer online, then perhaps a constant online connection wouldn't be a big deal for many. It's still an inconvenience for those with poor internet connections, or those who still enjoy the single-player experience.
 
[citation][nom]Hydroc10[/nom]So in about 2 years pc gamers will start getting some decent ports?[/citation]

Are you saying PCs have less capable versions of the same game? I've found the opposite to be true, but I haven't actively played a console in years. Every time I watch people play games on a console, I don't envy them.
 
[citation][nom]zooted[/nom]That is ridiculous.[/citation]

Yes, necessitating an always-on internet connection in order to play games is ridiculous in the extreme. It's about time that gamers in America and around the world stood tall, told these companies "NO MORE FARKING DRM! N O N E!", and started pushing our elected officials to make using DRM illegal, with jail time if you do.
 
[citation][nom]blazorthon[/nom]It wouldn't require two GPUs; the Xbox would simply have two GPUs anyway. A single GPU with the power of those two could play it too. The difference is that two lower performance GPUs are more energy efficient than a single high performance GPU of the same generation even if their aggregate performance is the same. For example, two 6770s are on par with a single 6970 in performance despite using significantly less power than a single 6970.Having two GPUs also allows one GPU to do the graphical work while another is doing Physx or something else at the same time, not just letting them work in tandem.You both have some points about the console ports, but answer me this: just how bad were console ports three, four, and five years ago, back when DX9 graphics cards like Radeon 3000/4000 cards and Geforce 8000/9000/GTX 200 cards were more common? I could be wrong, but I don't think they were as bad in comparison to the difference between console ports and native PC games.Unless I'm wrong, it will be several years after the next generation consoles come out before the next console ports are truly inferior to the games designed specifically for the desktop like they tend to be now.[/citation]

Take a look at most games that are ported to the PC: even when done well, they require more power to play than the equivalent on a console.

What I'm afraid of is that the ports that come to the PC will no longer work great with one GPU. Take a look at Gotham City Impostors and all the problems it has, at least from what I've read.

Don't think of a console as a computer, where two GPUs will benefit it. Think of it as two GPUs that will be programmed for exclusively: on the console the engines will run fine, but on the PC they will run poorly.

You have to think of most companies as money-hungry, not as developers who care, because even if the developers care, they need money, and the people with the money do not care.
 