New PS4 Specs Leak Based on January's Development Kit

It's all about driving costs down, since Sony is getting slaughtered as a company. Their valuation, margins, revenues, profits, etc. are all considerably down over the past 7 years.

They can no longer take as steep a hit on a console, so just as Nintendo went to an extreme, Sony is following suit: a less subsidized console to help generate more profit and keep them alive.
 
Looks like half the people here don't realize why an 8-core CPU is in the dev kit; instead they sing AMD's praises and talk up the future(!) prospects of multithreaded(!!) gaming led(!!!) by AMD (in reality by Nintendo, Sony and MS).
The new Sony/MS consoles will (rumor has it) have 8-core CPUs based on Jaguar cores, which are low-clocked, power-efficient, easy-and-cheap-to-produce-and-sell parts. Right now, the closest available thing to an 8-core Jaguar is a cheap BD CPU. The stronger the CPU, the better it will do in multiplayer games and in games that need multiple simultaneous user inputs, which is a LOT of console games.
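
To put the multithreading point in concrete terms: a pile of slow cores only pays off if per-player or per-subsystem work can actually be split across threads. A rough, hypothetical sketch (updatePlayer() is made up, not from any console SDK):

```cpp
// A minimal sketch (not from any real console SDK): spreading per-player work
// across many slow cores instead of relying on one fast one. updatePlayer()
// is a made-up stand-in for one player's input/simulation slice of a frame.
#include <cstdio>
#include <functional>
#include <thread>
#include <vector>

struct PlayerState { int id; float x, y; };

void updatePlayer(PlayerState& p) {
    // Poll this player's input and advance their part of the simulation.
    p.x += 0.1f;
    p.y += 0.1f;
}

int main() {
    std::vector<PlayerState> players;
    for (int i = 0; i < 8; ++i) players.push_back({i, 0.0f, 0.0f});

    // One worker per player: on an 8-core Jaguar each low-clocked core takes
    // one slice of the frame instead of a single core doing all eight.
    std::vector<std::thread> workers;
    for (auto& p : players) workers.emplace_back(updatePlayer, std::ref(p));
    for (auto& t : workers) t.join();

    std::printf("frame done for %zu players\n", players.size());
    return 0;
}
```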
Lastly, it is a dev kit, not final specs. Take rumors with a grain of Native Client.

On another note... so this is where all the unsold 8-core BD CPUs ended up. 😗
 
[citation][nom]mobrocket[/nom]why 2 ethernet ports?[/citation]

One is probably connected to a PC for diagnostics and updating, and the other to the network for gaming against other consoles. It eliminates constantly plugging and unplugging to swap between the two.

Or the second port might not be Ethernet at all; it could be a header that simply uses an Ethernet jack for programming the mainboard's firmware.
 
This may be another nail in the coffin for the PC as a desktop (or as we tend to think of it). I just wish Xbox/Sony would allow CPU upgrades and allow external GPUs to be used. Imagine having a 7850-ish GPU in the console that you could CrossFire with an external GPU box...
 
[citation][nom]BestJinjo[/nom]LukeCWM, The most CPU limited titles on the PC are MMOs, RTS games and shoddy console ported 2-core mess that Bethesda games are. MMOs and RTS are not found on consoles. The most popular console games are generally very GPU limited titles. Consider that current titles that run on PS360 do not even run at 1920x1080. Actually recent games run well below 1280x720 (Black Ops 2 880x720, Uncharted 3 well below 1024x768, Dark Souls 1024x720). Now next gen consoles will have DX11 graphics (w/ very demanding graphical effects like tessellation, bokeh depth of field, POM, dynamic area lights, higher resolution textures, etc.) and will supposedly target 1920x1080 resolution with some AA. The minute you are talking about gaming at 1920x1080 2-4xAA in DX11 and next generation DX11 games, you are going to be GPU limited in 90% of games, if not more. People keep ripping Bulldozer/Vishera apart but consoles aren't going to have HD7970 Ghz CFX or GTX680 SLI. Most likely consoles won't even have a GPU as powerful as a GTX670. When FX8120-8350 are paired with a GTX670 at 1920x1080 AA in non-MMO/non-RTS titles, the gaming system is almost entirely GPU limited, regardless if an AMD FX8000 series or a Core i5/i7 CPU is used: http://pctuning.tyden.cz/hardware/ [...] 7?start=16 Having an AMD FX8000 series processor is actually the 2nd best option long-term assuming MS/Sony cannot afford to fork out more $ for Intel's Core i5/i7 CPUs. Think about this, would you rather have a Bulldozer/Vishera or some underpowered crappy IBM PowerPC architecture or a tablet 8-core Jaguar CPU?[/citation]

I don't think they'll have something like a 670 for a LONG time; it would make the console MUCH more expensive. That's a $300+ card right there (sorry if that's incorrect). Current consoles have weak graphics with 512 MB of VRAM, but hopefully they can change that with what I've heard to be something around the 6670 range, so console gamers can come closer to PC quality and performance.
 
People, the 8-core CPU is for a dev kit. This is not the PS4... this is what developers use to develop games for the PS4. Obviously it has to be much more powerful than the PS4. My guess is the PS4 will have a quad-core APU at around 3.0 GHz with a separate low-end video card (like the 6670) with 1 GB of GDDR5 video RAM and 2 GB of system RAM. Pretty much a very low-end PC. Developers will have to optimize their code to get anything close to 40 FPS at 1920x1080.
 
This is a dev kit model, so the headphone jack is likely there for easier sound testing. Keep in mind, since people once again seem to be treating speculation as reality, that this comes from an email tipster, much like the Xbox spec hoax, although a bit more credible. Even if it's true, dev kits are subject to change.

If it is true, this is a fairly powerful system. Keep in mind that consoles get more mileage out of hardware than PCs do with the same parts, since they're single-task oriented and don't have to worry about compatibility. This build suggests to me that Sony is planning to push toward 4K resolutions to help sell their other 4K products; it's really more than is needed for 1080p. For those who are going to point out that this wouldn't be enough for real 4K, remember that Sony promised 1080p with the PS3. How many 1080p games have there really been?
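
For a bit of perspective on the resolution talk, here's a quick back-of-the-envelope comparison of the pixel counts mentioned in this thread (nothing console-specific, just arithmetic):

```cpp
// Back-of-the-envelope pixel math for the resolutions mentioned in this thread.
#include <cstdio>

int main() {
    struct Mode { const char* name; int w, h; };
    const Mode modes[] = {
        {"Black Ops 2 (PS360)", 880, 720},
        {"720p", 1280, 720},
        {"1080p", 1920, 1080},
        {"4K UHD", 3840, 2160},
    };
    const double base = 1920.0 * 1080.0;  // 1080p as the reference point

    for (const Mode& m : modes) {
        const double px = static_cast<double>(m.w) * m.h;
        std::printf("%-20s %10.0f pixels  (%.2fx of 1080p)\n", m.name, px, px / base);
    }
    // 4K UHD is exactly 4x the pixels of 1080p, so fill rate and pixel shading
    // cost scale up roughly 4x before any new effects are even considered.
    return 0;
}
```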

That leaves the question of whether a company desperate to get its numbers back in the black is going to take a loss on the hardware, or sell it at cost and price it far higher than the Wii U.
 
Looks like all consoles are shifting to x86... I wonder if devs made enough fuss about porting between architectures that everyone finally decided it would be easier to just use PCs?

I wish it were Piledriver and not Bulldozer, though. I can't really decide if it's bad for the PC... will games get better optimizations, or will they suddenly use fewer floating-point operations? CAN they reduce float in favor of int ops? I doubt it...

But interesting year ahead.
 
Dear Sony, please release Gran Turismo 6 as the premier launch title with the PS4. I will pay you $1,000 for that bundle, no problem.

I will only buy the PS4 when GT6 is ready for it, and so will over 5 million other people I know.
 
Think about this, would you rather have a Bulldozer/Vishera or some underpowered crappy IBM PowerPC architecture or a tablet 8-core Jaguar CPU?

Erm, the POWER7+ IIRC is a good deal more powerful than the fastest Xeon available to the general public. The PowerPC architecture is certainly not underpowered. The Xbox 360 launched in what... 2005? It has a tri-core PPC, and in 2013 you rarely hear even the top-flight developers complain about a lack of CPU power. Not to mention the Cell CPU in the PS3 (PowerPC ISA), which for a while after launch was a good deal more powerful for games than anything you could buy for your gaming PC.
 
Dev kits often have twice the RAM, so 4 GB in the final console seems likely.

Edit: I'd also like to point out that this might push devs more toward integer-based code instead of float-heavy code. Floating-point calculations belong on the GPU.
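
For anyone wondering what "moving to integer" could even look like, here's a minimal 16.16 fixed-point sketch, purely illustrative (the helper names are made up), of one way to trade float ops for int ops on the CPU:

```cpp
// Minimal 16.16 fixed-point sketch: one hypothetical way a game could trade
// floating-point ops for integer ops on the CPU, as suggested above.
#include <cstdint>
#include <cstdio>

using fixed = std::int32_t;               // 16 integer bits, 16 fractional bits
constexpr fixed to_fixed(double v)  { return static_cast<fixed>(v * 65536.0); }
constexpr double to_double(fixed v) { return v / 65536.0; }

fixed fx_mul(fixed a, fixed b) {
    // Widen to 64-bit so the intermediate product doesn't overflow.
    return static_cast<fixed>((static_cast<std::int64_t>(a) * b) >> 16);
}

int main() {
    fixed speed = to_fixed(2.5);          // units per second
    fixed dt    = to_fixed(1.0 / 60.0);   // one 60 Hz frame
    fixed step  = fx_mul(speed, dt);      // pure integer multiply + shift

    std::printf("step per frame = %f units\n", to_double(step));
    return 0;
}
```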
 
Hmm... let's see how "good" this leak is compared to the 720 leak that was totally made up. But if this is true, it may finally mean that multicore support will sneak into game development!

 
[citation][nom]Vorador2[/nom]With that CPU retrocompatibility is ruled out :-/ damn it.But it is good for AMD.[/citation]
You sure about that? The Cell is basically an 8-core design, with one SPE disabled for yields and another reserved for the OS. If Sony works to come up with an official interpreter, this system might be able to emulate PS3 code - though it would no doubt require tweaks to run as well. Still, considering Sony gave the finger to backwards compatibility with the PS3, I doubt they'll bother at all this time around, save for the odd emulated PS1 game (though they'd be smart to offer emulation for PS2 games as well).
 
Using components like these, I can foresee that neither the production cost nor the selling price would be high. It really is not going to cost more than $400 to produce one, compared to $700+ for the first-gen PS3.

The baseline PS4 model would likely not cost more than $399 at launch if Sony wants it to sell like crazy, or $499 if they choose not to lose too much money.
 
And my current-gen GTX 690 and i7 @ 4.8 GHz already outclass it, and it's not even close to being released yet. Mmm, makes me want to go out and buy one. I hate to think what PC hardware I'll have by the time it does come out. But I guess if you're 12-17, this is good news.
 
OK, these rumored specs look better than an AMD APU with just four cores like the A10's. What is an AMD R10xx GPU, though?

I still would have preferred an updated RISC CPU like the Cell processor, though, since from what I have read they are just more efficient at getting close to the metal as far as coding is concerned.

The PS3 was, indeed, difficult to program for in the beginning, but that is just the nature of technology. The PS3 was the first time people had to deal with six or more cores and coding in parallel (again, from what I have read). It isn't surprising that programmers had to get used to it.

As long as the PS4 continues that trend of heavy parallelism, I am not too concerned, even if the parts are run-of-the-mill AMD CPUs.

Maybe Sony or AMD will bump it up to Steamroller, since the PS4 is not going to be out until the holidays. And I hope they stick an HD 7970 in there, since consoles are supposed to last 5 or more years down the road, instead of just, say, 2 years like it is in the PC world.
 
If Sony works to come up with an official interpreter, this system might be able to emulate PS3 code - though it would no doubt require tweaks to run as well.

Could be wrong, but there's probably no single multicore x86 CPU on the planet capable of emulating the PS3's Cell at full speed, never mind one that also needs to be economically feasible for the PS4. Hell, we still don't have great PS2 emulators. It's not a question of the CPU (AMD BD/PD) being overall much more powerful than the PS3's Cell ("much" is debatable, but it should be more powerful, assuming they leverage the integrated GPU as well as the x86 cores); emulating a completely foreign multicore CPU requires a LOT of overhead.
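
To make the overhead point concrete, here's a toy interpreter loop for a made-up instruction set (nothing Cell-specific): every emulated instruction pays for a fetch, a decode and a dispatch branch on the host before any real work gets done, and SPE vector code plus timing quirks only make it worse in practice:

```cpp
// Toy interpreter for a made-up guest ISA (not PS3/Cell specific). It only
// illustrates why each emulated instruction costs a fetch, a decode, a
// dispatch branch and bookkeeping on the host, on top of the actual work.
#include <cstdint>
#include <cstdio>
#include <vector>

enum class Op : std::uint8_t { Add, Mul, Halt };
struct Instr { Op op; std::uint8_t dst, a, b; };

int main() {
    std::uint32_t reg[8] = {0, 3, 4, 5, 0, 0, 0, 0};
    std::vector<Instr> program = {
        {Op::Add, 0, 1, 2},   // r0 = r1 + r2
        {Op::Mul, 4, 0, 3},   // r4 = r0 * r3
        {Op::Halt, 0, 0, 0},
    };

    for (std::size_t pc = 0; pc < program.size(); ++pc) {
        const Instr& i = program[pc];        // fetch
        switch (i.op) {                      // decode + dispatch
            case Op::Add: reg[i.dst] = reg[i.a] + reg[i.b]; break;
            case Op::Mul: reg[i.dst] = reg[i.a] * reg[i.b]; break;
            case Op::Halt: pc = program.size(); break;
        }
    }
    std::printf("r4 = %u\n", static_cast<unsigned>(reg[4]));  // (3 + 4) * 5 = 35
    return 0;
}
```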
 