
Rumor: Microsoft Revealing Xbox Infinity (720) in April

I only have two problems with the PS3 compared to the 360: I can't stand the controller on the PS3 (personal opinion, obviously), and the chat system absolutely sucks on the PS3. Friends and I actually quit playing DCUO because we had so many problems with it. The chat feature just works on the 360. I've been told that it's because MS has their own chat servers, so game devs don't have to bother developing for it and making a half-@ product.
 
[citation][nom]cercuitspark[/nom]The processor the PS4 is based on isn't even out yet, and you're calling it outdated?And adding the optimization you can get from programming for a spesific set of components, you will get much more graphically advanced games for the next gen console than for the PC in it's current from.- A PC enthusiast and PC gamer.[/citation]

Sony is using an AMD CPU. Clock for clock it's not as good as an i7, but once you start multi-threading tasks it's capable.

If the PlayStation 4 forces games to use more than one core, the AMD architecture would be more beneficial to it than an i7, especially when you look at the cost of an i7 and the fact that in multi-core workloads an AMD CPU sometimes surpasses the i7.

Calling it outdated is stupid, but it's not entirely wrong.
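To put some rough numbers on that, here's a quick Amdahl's-law sketch. The core counts, per-core speeds, and parallel fractions are made-up illustrative figures, not benchmarks of any real chip:

[code]
# Rough Amdahl's-law sketch: when does a wider, slower CPU beat a
# narrower, faster one? All figures here are illustrative assumptions.

def speedup(parallel_fraction, cores):
    """Amdahl's law: overall speedup given the fraction of work that scales across cores."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

FAST_CORE = 1.4   # assume the "i7-style" core is 40% faster per core
SLOW_CORE = 1.0   # baseline for the cheaper, wider CPU

for frac in (0.5, 0.8, 0.95):
    four_fast  = FAST_CORE * speedup(frac, 4)
    eight_slow = SLOW_CORE * speedup(frac, 8)
    print(f"parallel {frac:.0%}:  4 fast cores = {four_fast:.2f}   8 slow cores = {eight_slow:.2f}")
[/code]

The wider, slower chip only pulls ahead once most of the frame's work actually spreads across cores, which is exactly what a fixed console platform can push developers toward.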

[citation][nom]downhill911[/nom]I pray that Xbox infinity will be on par with PS4 or better, then all those new games are going to look epic and PC ones even better.So far, I am really impressed with killzone, although it was only 720p but you could see textures where much better.Really hope Microsoft will try to out-performance PS4, but I somehow doubt that.[/citation]

Killzone was honestly the most unimpressive demo up there, next to the racing game. In the Killzone demo I could see every way they faked it to look better than it actually is. What I mean is the little techniques you use to make it seem like you're doing more than you actually are.

The most impressive game on there was Watch Dogs, and that was the PC version, not the PlayStation 4 version.

Other than that there were tech demos from Square Enix and Capcom, what might have been a cutscene from Infamous 3 (again, more of a tech demo than an actual game), and a racing game that had severe pop-in issues.

Sure, the texture resolution was higher, but beyond that I saw nothing great.

[citation][nom]tokencode[/nom]'It usually takes them a few years to squeeze all of the potential out of a new hardware platform. The new generation of consoles may finally bring some software that can push modern PCs, but I'm not so sure PS4 or the new Xbox will take a 3-way SLI setup. Remember, some computer gamers are playing at 3 times the resolution of your 1080p TV today. With things like Occulus Rift on the horizon, I'm not sure consoles will ever catch again at this point.[/citation]

With the Oculus Rift you take a huge burden off the graphics card. I believe the current developer kit for the Oculus Rift is 720p split in two, and the consumer version should be 1080p split in two.

Current 3D on the PC takes a 120 Hz monitor and renders true 1080p images; 3D through the Oculus Rift requires only 60 Hz and a single 1080p image total.
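Back-of-the-envelope, treating those figures as rough assumptions rather than official specs:

[code]
# Back-of-the-envelope pixel rates, using the figures above as rough
# assumptions (not official specs for either setup).
WIDTH, HEIGHT = 1920, 1080

shutter_3d = WIDTH * HEIGHT * 120   # 120 Hz: a full 1080p frame per eye, alternating
rift_3d    = WIDTH * HEIGHT * 60    # 60 Hz: one 1080p frame, split between the eyes

print(f"120 Hz shutter 3D: {shutter_3d / 1e6:.0f} Mpix/s")
print(f"Rift-style 3D:     {rift_3d / 1e6:.0f} Mpix/s")
print(f"Ratio:             {shutter_3d / rift_3d:.1f}x")
[/code]

Roughly half the pixels per second, which is where that huge burden comes off the graphics card.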

And even though it's rendering less, the thing feels like you're seeing so much more and that you're actually there in the game.

But here's the thing: if the Oculus Rift doesn't actually come to the consoles, if head-mounted displays don't come to the consoles, will they ever take off? I don't care which console it is, the next Xbox or the next PlayStation; whichever one of them brings a head-mounted display from day one and actually supports it will be a godsend to everybody. But if it doesn't come out on the consoles, I have a fear that they'll fail.


[citation][nom]p05esto[/nom]MS better not F up again like Win8. I don't have a lot of confidence at this point to be honest. Everything I've so far has been depressing about the xBox.[/citation]

Before the launch of the Nintendo Wii U, word was that Microsoft was looking to just make its console 20% faster than it. And for at least the last two E3s, Microsoft has shown that they don't give a damn about the gamer.

[citation][nom]theLiminator[/nom]Actually, at the time of it's announcement, the PS3 was much more powerful than any standard PC that was available to consumers at that time. But the PS4 doesn't seem to be that powerful already.[/citation]

Most people who have a computer are using at most an i5 and maybe integrated graphics; it wasn't even until recently that more than 4 GB of RAM became standard on 64-bit systems.

Hell, let me look up a laptop for you:

Lenovo ThinkPad L520 Laptop,
15.6" LED HD Backlit,
Intel Core i5-2520M Dual-Core 2.5GHz,
4GB DDR3,
320GB Hard Drive,
802.11n

Costs $1,200 list price, $650 deal price.
Pre-built (what most people get) is PATHETIC too, but I also don't see pre-built towers much anymore. The last one I saw was over $900 and had a sub-mid-range GPU in it.

[citation][nom]blazorthon[/nom]PC hardware of the time was also using much less power than high-end PC hardware today. PC hardware power consumption limits have gone far up whereas consoles have had to go down. It's ridiculous for anyone to expect performance differences to not follow those lines.Also, the PS4's graphics performance are supposedly comparable to something like a Radeon 7850. With their great optimization, it's possible that nothing far short of two Radeon 7950s or two GTX 670s will be able to give a better graphics quality experience. You won't get better performance within a console's price range, that much is almost certain.[/citation]

Yea... no... not a chance in hell. They will be able to pull some more power out of it, but this is based on x86, something we already know and know how to pull power from well. You may get at most a 20% boost from having one hardware scheme to program for, but you aren't making a 7850 perform up to CrossFired 7950s.

[citation][nom]kinggraves[/nom]Irrelevant. The average consumer isn't willing to drop over $1k to play games no matter what the quality is. Enthusiast PC gamers make up such a small fraction of the consumer base you might as well not exist to them. Reality bites.MS has hinted it's going to take an approach closer to Nintendo than PS this time around, so I'd say ps4 will have the best hardware. This is likely why they're holding back until April letting Sony parade itself around for awhile. MS is in the most advantageous position right now, they know what Nintendo can do and what Sony is promising it can do. They've got a couple months to figure out how to make theirs sound better.I do however think MS might have something to show come April. A lot of people are saying Sony is being coy with the PS4, I don't really think they have anything to show right now. They were originally planning PS4 for 2014, but it would be a bad idea to let the competition get such a head start on them. They were forced to get it together for Holiday 2013. I'm predicting even more than the usual Sony launch failure by the end of this year.[/citation]

Look at the consoles:
Nintendo > Super Nintendo > Nintendo 64 > GameCube > Wii > Wii U
Each one was a leap in hardware performance (GameCube to Wii was double the power, not much, but a leap).
If Microsoft really just goes for 20% better than the Wii U, which at the time was rumored to be 360-quality graphics but at 1080p, would people jump on the next Xbox?
With the games that Sony showed off, it may not look like a major leap, but it is. Would people really go for the Xbox if all it did was go to 1080p and have Kinect for everything?

[citation][nom]cobra5000[/nom]If you can't show the console, why waste the effort? It is a joke! Do you hear me Sony, Microsoft?[/citation]

They didn't show it because when Microsoft has their next-Xbox blowout, Sony will have something fairly major to announce... it's not hard to see why they didn't show it...

[citation][nom]shikamaru31789[/nom]Agreed. The level of graphics I was seeing in those PS4 demos was equal to what a mid-high end PC can do now. For instance, the Watch Dogs PS4 demo seemed to match the E3 demo which was running on a high end PC. Buy a $500 PC now, and I highly doubt it'll meet the minimum requirements of multi-platform games in 6 years, while the $400 Xbox Infinity and PS4 will still be playing those same games.[/citation]

Because the Watch Dogs demo was from a PC source, not the PS4.

 


LOL, the CPU's architecture is absolutely irrelevant to how much performance they can get out of the graphics cards. Furthermore, we don't get full power whatsoever from any of our PC hardware, at least for gaming, because optimization is crap in order to allow for wide compatibility. Driver updates alone can improve performance by far more than 20% compared to the original drivers for a given generation, so a much more than 20% gain is easily realistic for consoles that won't have any of these issues whatsoever.

Furthermore, we have no idea about any modifications to the GPU design coming from the seemingly similar Pitcairn Pro. For all we know, some major improvements that don't show up in the FLOPs performance benchmark may have been done. Even without such improvements, what I said is entirely possible, if not outright likely.
 


Wasn't everyone complaining that the PS3's processor was too advanced for gaming nowadays? You people just won't be happy unless they have three processors in it (x86, x64, Cell). >.> The processor won't matter if it's built recently. It could be a PCIe 3.0 compatible x86 CPU, which is rare to see. Everything will still be custom, and you'll be getting that optimization that made the PS3 last this long with half a GB of RAM.
 


While the PS3 has a video card that's comparable to a 7800 GT, it is able to compete with a GTX 560 and give it a run for its money. This is due to the wonderful world of optimization: making a game work with only one specific setup so that it runs as smoothly as possible. If game developers started making different versions of games for each custom PC out there, then we'd all have the best graphics possible.

Of course, that won't happen because everyone uses different video cards. If one of the brands ended up going belly up then we would have a huge increase in optimization in the next 6 years.

Ex: ATI goes bankrupt and stops making video cards, leaving Nvidia as the only one designing them. Video games would run smoother on all newer systems since they would be made specifically for Nvidia cards, without any ATI support.

It's kind of due to these two companies competing that games need a lot of raw power. If every computer were the same, with a GTX 660 in it, and you could boot straight into a game instead of your OS, then there would be no problem with the PC, and even Crysis 3 would be playable at Ultra settings with the highest anti-aliasing possible.
 


Since I don't know the actual hardware, I'm going to use the "leaked" hardware of the 720 as an example.

The 720 will have more RAM (DDR3), but the PS4 will have faster RAM (GDDR5). This means that you'll be able to do more at once on the 720, but everything will be rendered faster on the PS4. This isn't where the consoles will get all of their power anyway. If the PS3 is any example, 500 MB of RAM lasted a long time and only ran into difficulty with things like browsing the internet. Gaming was not affected by it that much.
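For a rough sense of what "faster RAM" means, peak bandwidth is basically transfer rate times bus width. The transfer rates and bus widths below are assumed/rumored figures, not confirmed specs for either box:

[code]
# Peak memory bandwidth = transfers per second * bytes moved per transfer.
# Transfer rates and bus widths are assumed/rumored, not confirmed specs.

def bandwidth_gb_s(transfer_rate_mt_s, bus_width_bits):
    return transfer_rate_mt_s * 1e6 * (bus_width_bits / 8) / 1e9

ddr3_pool  = bandwidth_gb_s(2133, 256)   # e.g. DDR3-2133 on a 256-bit bus
gddr5_pool = bandwidth_gb_s(5500, 256)   # e.g. 5.5 GT/s GDDR5 on a 256-bit bus

print(f"DDR3 pool:  ~{ddr3_pool:.0f} GB/s")    # ~68 GB/s
print(f"GDDR5 pool: ~{gddr5_pool:.0f} GB/s")   # ~176 GB/s
[/code]

So even at the same capacity, the two memory setups could behave very differently if the gap is anything like that.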

The 720 will have eight cores running at 1.6 GHz in its CPU, so it will be able to compute a lot at once. The PS4 will be using an easy-to-code x86 quad-core CPU. While more cores will help performance in some tasks, it will not be as powerful as the rumored CPU in the PS4. The PS4 will supposedly be able to do 1.5 times as many processes as the 720 in the same time frame. However, fewer cores means that it could run into some problems in the future (unlikely, since dual-core hasn't run into major problems yet).

As for the GPU, I don't know how many GHz the PS4's is supposed to have or how much VRAM the 720's is supposed to have, so I can't comment on it.

The two consoles will both be able to play games for the next 5-6 years, so just pick the one whose exclusives you like. (I loved the PS3 exclusives, so I'm getting a PS4.)

 


Nit-picking, but x64 isn't an architecture like x86 is; x64 is just a 64-bit extension of x86. The architecture is still x86. For example, almost all modern CPUs from AMD and Intel are 64-bit, but we call their ISA (instruction set architecture) x86. It's named after the Intel 8086 CPU, the first CPU or at least one of the first to use an early version of x86, IIRC.

Also, Intel has many PCIe 3.0 compatible CPUs and AMD's CPUs are technically PCIe 3.0 compatible because their PCIe controller is on the northbridge on their motherboards, not on the CPU die, so all it should take for PCIe 3.0 compatibility is a new northbridge chip with PCIe 3.0 compatibility. Besides, I fail to see how PCIe version is important (assuming that PCIe is even used) in the PS4.

You make a good point about how consoles seemingly can't please everyone.
 


The PS4 supposedly has an eight-core Jaguar CPU, not quad core IIRC.

The PS3 had 256MiB XDR for OS and such and 256MiB GDDR3 for the graphics. The Xbox 360 had a single 512MiB GDDR3 pool of memory. Gaming was greatly affected by this and game devs have had to deal with it being a big hurdle. Just because they dealt with it doesn't mean that it wasn't an issue. I don't see the supposed 8GiB of the PS4 being an issue any time soon since it's much more extreme for the time than what the old consoles had for memory for their time IMO, but still.

Dual-core CPUs have run into a lot of trouble lately. Several games can struggle even on the fastest dual-core CPUs unless they have Hyper-Threading (if even then).

Also, I don't see why you're putting so much emphasis on GHz. We can't tell the performance of processors strictly by frequency, nor even with both frequency and core count.
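A crude way to see why: a rough throughput proxy is cores × clock × IPC, and IPC (instructions per clock) varies hugely between architectures. The figures here are purely illustrative, not measurements of any real CPU:

[code]
# Crude throughput proxy: cores * clock * IPC. The IPC figures are
# purely illustrative, not measurements of any real CPU.

def rough_throughput(cores, clock_ghz, ipc):
    return cores * clock_ghz * ipc

older_high_clock = rough_throughput(cores=2, clock_ghz=3.8, ipc=0.7)  # older, narrow core
newer_low_clock  = rough_throughput(cores=4, clock_ghz=1.6, ipc=1.5)  # newer, wider core

print(f"{older_high_clock:.1f} vs {newer_low_clock:.1f}")  # the "slower" 1.6 GHz chip comes out well ahead
[/code]

Frequency only means something once you already know how much work each core does per clock.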
 