Sony: Developers Requested x86 Architecture for PS4


IndignantSkeptic

Distinguished
Apr 19, 2011
507
0
18,980
[citation][nom]shikamaru31789[/nom]If the rumors/leaks are true, the Next Xbox is very similar to the PS4. x86 8 core AMD APU, but with a slightly less powerful GPU and slower DDR3 RAM. The leaked prices suggest it will be a good bit cheaper than the PS4 though. With both the PS4 and the Next Xbox on x86, I can see the PowerPC based WiiU getting passed over for multi-platform titles alot of the time. Nintendo just can't seem to get things right.[/citation]

Watch how the PS4 games will get nerfed so they run on the Xbox 720 as well. Hopefully the WiiU is so stupidly far behind that game developers won't try to nerf the games even further to make them run on the WiiU too.
 

IndignantSkeptic

Distinguished
Apr 19, 2011
507
0
18,980
[citation][nom]brythespy[/nom]I may be wrong, or misreading something here, but I thought x86 Only allowed for up to 4GB of RAM? Yet the PS4 is said to have 8GB.[/citation]

I believe you are thinking of 32-bit. A 32-bit memory address space, on any architecture, is what causes the 4GB RAM limit; x86-64 uses 64-bit addresses and can go far beyond that.
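To put numbers on it, here's a minimal C sketch (note: the 48-bit figure is what current x86-64 chips actually expose for virtual addresses, not the full 64 the pointers could hold):

[code]
#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* A 32-bit pointer can name 2^32 distinct bytes: */
    uint64_t bytes_32 = (uint64_t)1 << 32;
    printf("32-bit address space: %llu bytes = %llu GB\n",
           (unsigned long long)bytes_32,
           (unsigned long long)(bytes_32 >> 30));   /* 4 GB */

    /* x86-64 pointers are 64-bit; current chips wire up 48 bits: */
    uint64_t bytes_48 = (uint64_t)1 << 48;
    printf("48-bit address space: %llu TB\n",
           (unsigned long long)(bytes_48 >> 40));   /* 256 TB */
    return 0;
}
[/code]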
 

ojas

Distinguished
Feb 25, 2011
2,924
0
20,810
[citation][nom]JOSHSKORN[/nom]Way to keep us living in the past. x86? C'mon. FAIL![/citation]
Hahahahaha yeah right. Let's see now:

Where are PCs going? NOWHERE! They're going to be on x86 till AMD and/or Intel manage to create a more efficient architecture. AMD might do that first because of their APU push, though it might not necessarily be as efficient as Intel64 for traditional CPU-oriented tasks.

Then, where are consoles going? x86, because it's actually very efficient, and AMD has the whole APU thing going on, which cuts costs for console makers; and since it's the same platform as PCs, it means less cost for developers too. Plus, this is also great for Intel/AMD on the CPU side, since quad-threaded processors in PCs will become the minimum for gaming going forward. So all the 2C/2T CPU users and older quad-core users will upgrade their machines.

This is also brilliant for Intel and AMD from the APU point of view, since these games will be designed around APUs (though I actually don't think this will be the case, since the PS4's bandwidth is that of a discrete GPU, so this might be more beneficial to Nvidia/AMD; and Nvidia's stuck with PhysX there too).

Now where is mobile going? x86! ARM has already hit the roof as far as efficiency is concerned: the Nexus 10 spikes up to 8 W with its four A15s and GPU, while Ivy Bridge/Haswell and AMD's Richland go as low or lower. ARM can't implement A15s without A7s alongside without killing battery life (big.LITTLE exists for this reason, imo).
It's really Qualcomm vs Intel in the mobile space; Samsung and Nvidia will compete with AMD going forward.

So neither PCs nor x86 is dead. It's the only ISA that's good enough, and that scales well enough, to suit all purposes.

Oh, and I forgot: servers, workstations, HTPCs, business PCs and others all use x86.
 

teodoreh

Distinguished
Sep 23, 2007
315
13
18,785
So nowadays, PCs are x86, Macs are x86, and the major consoles are all x86.
CPUs are 40 times slower than GPUs, and we still care about the x86 architecture.

Which visionary will create the next Amiga that will be a quantum leap? Anyone?
 

aggroboy

Distinguished
Sep 17, 2010
197
0
18,680
[citation][nom]teodoreh[/nom]So nowadays, PCs are x86. Macs are x86 and major consoles are all x86.Processors are x40 times slower than GPUs, and we still care about x86 architecture.Which visionary will create the next Amiga that will be a quantum leap? Anyone?[/citation]
I don't think the Amiga was the quantum leap, at least not more so than the C64 or PS1.
 


palladin9479

Amiga was WAY ahead of its time, though its price prevented it from ever catching on. The Amiga had custom chips for nearly every function (CPU, video, audio and I/O) along with multiple memory buses. It was an amazing system for its era, and its design allowed it to do things nobody else could. Of course, all that customization made it hard to code for and too expensive to really be competitive with the IBM clones that started to surface. You can go out and get Amiga Forever and play around with it; kinda amazing for its time.

Now to the commenter mentioning "x40 slower than the GPU": no, they're not. GPUs are vector processors, meaning they're designed to compute vector math. CPUs are integer processors (called scalar), meaning they're designed to compute single integer / logic instructions. Through internal SIMD coprocessors a generic x86 CPU can compute SIMD vector instructions (SSE/AVX/FMA), which also handle the old 80-bit FPU instructions. Vector processors are really REALLY good at doing large amounts of simultaneous array math; they absolutely suck at doing single integer calculations or logic comparisons (IF / OR / XOR / AND). Vector CPUs have been around since the '70s, and they are used for different things than scalar CPUs are. What we've found is that both types of processors work best when paired with each other. x86, for as clunky as it is, has proven a very efficient architecture for desktop and commodity computing. And by efficient I mean cheap: its performance per USD is higher than most other design types, which is why it's so prominent in applications where hardware costs dominate.

https://en.wikipedia.org/wiki/Vector_processor
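To make the scalar-vs-vector difference concrete, here's a minimal C sketch using SSE intrinsics (the _mm_* calls are the real SSE API; the example itself is just an illustration):

[code]
#include <stdio.h>
#include <xmmintrin.h>   /* SSE intrinsics */

int main(void) {
    float a[4] = { 1.0f,  2.0f,  3.0f,  4.0f};
    float b[4] = {10.0f, 20.0f, 30.0f, 40.0f};
    float out_scalar[4], out_simd[4];

    /* Scalar: four separate adds, one element at a time. */
    for (int i = 0; i < 4; i++)
        out_scalar[i] = a[i] + b[i];

    /* Vector (SIMD): one addps instruction adds all four lanes at once. */
    __m128 va = _mm_loadu_ps(a);
    __m128 vb = _mm_loadu_ps(b);
    _mm_storeu_ps(out_simd, _mm_add_ps(va, vb));

    for (int i = 0; i < 4; i++)
        printf("%5.1f %5.1f\n", out_scalar[i], out_simd[i]);
    return 0;
}
[/code]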

 

blppt

Distinguished
Jun 6, 2008
576
92
19,060
[citation][nom]ojas[/nom]Hahahahaha yeah right. Let's see now:Where are PCs going? NO WHERE!. They're going to be on x86 till AMD and/or Intel manage to create a more efficient architecture. AMD might do that first because of their APU thing, though it might not necessarily be efficient as Intel64 for traditional CPU-oriented tasks.Then, where are consoles going? x86, because it's actually very efficient, and AMD has the whole APU thing going on, leading to cut costs for console makers and because it's the same platform as PCs, less cost for developers too. Plus, this is also great for Intel/AMD on the CPU side, since i quad-threaded processors in PCs will become a minimum for gaming going forward. So all the 2C/2T CPU users or the older quad-core users will upgrade their machines.This is also brilliant for Intel and AMD from the APU point of view, since these games will be designed around APUs (though i actually don't think this will be the case, since the PS4's bandwidth is that of a discrete GPU. So this might be more beneficial to Nvidia/AMD, and Nvidia's stuck in PhysX there too).Now where is mobile going? x86! ARM has already hit the roof as far as efficiency is concerned, the Nexus 10 spikes up to 8w with its 4 A15s and GPU, Ivy Bridge/Haswell and AMD's Richland go as low or lower. ARM can't implement A15s without A7s without killing battery life (big.LITTLE exists for this reason, imo).It's really Qualcomm vs Intel in the mobile space, Samsung and Nvidia will compete with AMD going forward.So neither PCs, nor x86 is dead. It's the only ISA that's good enough and that scales well enough to suite all purposes.Oh and, forgot servers, workstations, HTPCs, business PCs and others use x86.[/citation]

Intel tried to get away from x86 and failed miserably, thanks to some clever engineers at AMD. If they'd had their way, we'd all be using some form of Itanium right now.
 


palladin9479

Itanium ([strike]Intel64[/strike] IA-64) had its purposes, but it absolutely sucked for general computing. It relied heavily on the compiler doing the branch prediction at compile time to factor in the additional instructions needed for its VLIW architecture. Compilers cannot predict the future, so many of the compiled operations ended up being useless and were discarded during execution. On the other hand, if the code you're compiling doesn't contain variable jumps, then you can get a lot of performance out of a VLIW setup. It's why VLIW was a good architecture for GPUs.
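Rough illustration in C of why (hypothetical code, not anything Itanium-specific): a compiler can statically bundle independent straight-line work, but it can't schedule around a branch whose direction depends on runtime data.

[code]
#include <stdio.h>

/* Branch-free, straight-line: a VLIW compiler can see these four
 * multiplies are independent and pack them into wide bundles at
 * compile time. */
float dot4(const float *a, const float *b) {
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2] + a[3]*b[3];
}

/* Data-dependent branch: which path runs is unknown until runtime,
 * so the compiler must guess, or emit both sides predicated and
 * throw one away -- the "compilers cannot predict the future" problem. */
long count_over(const int *v, long n, int threshold) {
    long count = 0;
    for (long i = 0; i < n; i++)
        if (v[i] > threshold)   /* depends on the data itself */
            count++;
    return count;
}

int main(void) {
    float a[4] = {1, 2, 3, 4}, b[4] = {4, 3, 2, 1};
    int   v[5] = {5, -2, 9, 0, 7};
    printf("dot4 = %.1f, count_over = %ld\n", dot4(a, b), count_over(v, 5, 1));
    return 0;
}
[/code]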
 

blppt

Distinguished
Jun 6, 2008
576
92
19,060
Agree with Palladin above... the Amiga was so far ahead of its time graphics- and sound-wise, it's amazing. Basically, consider that it was dishing out games with better-than-Genesis/Megadrive graphics in 1985. There was the rare Sharp X68000 in that time frame (never sold in the US, IIRC), but pretty much nothing could touch the Amiga until the SNES in 1990/91 and VGA PCs (though without custom chips like the Amiga blitter, you'd have to wait till the 486 for overall game performance to surpass Amiga OCS. And even then, the Amiga was still better at certain things, like smooth parallax scrolling with multiple sprites).
 

Shin-san

Distinguished
Nov 11, 2006
618
0
18,980
[citation][nom]darkavenger123[/nom]Err...the Atari Jaguar was 64-bits, so was N64. Dreamcast was the first 128-bit console, and so is PS2....but number of bits in CPU does not means better performance. It simply means extra registers to play around in the CPU. XBOX was 32-bit Pentium 3...but it ownz both PS2 and GameCube. Even PC's 64-bit games doesn't means it's better than their 32-bits counterpart. It simply means it can access > 4GB RAM....And no consoles to date have that much memory.[/citation]

It's even weirder than that. They'll say "64-bit graphics" but in reality, it doesn't mean that the CPU itself is 64-bit. The Dreamcast's CPU was 32-bit, for example. Usually it means that some piece of hardware is 64-bit: sometimes the graphics bus, sometimes the CPU, sometimes the GPU.
 

Shin-san

Distinguished
Nov 11, 2006
618
0
18,980
[citation][nom]teodoreh[/nom]So nowadays, PCs are x86. Macs are x86 and major consoles are all x86.Processors are x40 times slower than GPUs, and we still care about x86 architecture.Which visionary will create the next Amiga that will be a quantum leap? Anyone?[/citation]

Apparently Commodore wants to bring back the Amiga in a Mac mini-like form factor.

 

ojas

Distinguished
Feb 25, 2011
2,924
0
20,810
[citation][nom]palladin9479[/nom]Itanium (Intel64) had it's purposes but it absolutely sucked for general computing. It relied heavily on the compiler doing the branch prediction during compile time to factor in the additional instructions needed for it's VLIW architecture. Compilers can not predict the future, so many of the compiled operations ended up being useless are are discarded during execution. On the other hand, if the code your compiling doesn't contain variable jumps then you can get a lot of performance out of a VLIW setup. It's why VLIW was a good architecture for GPUs.[/citation]
Wait, AMD64 and Intel64 are the two companies' respective names for their implementations of x86-64.

Itanium was IA-64, not Intel64.

If I'm not wrong, wasn't Itanium supposed to be for servers/enterprise work, not for general consumers?
 

blppt

Distinguished
Jun 6, 2008
576
92
19,060
[citation][nom]ojas[/nom]Wait, AMD64 and Intel64 are both names of the two's respective implementation of x86-64.Itanium was IA64, not Intel64.If i'm not wrong, wasn't Itanium supposed to be for servers/enterprise work, and not for general consumers?[/citation]

Yeah, but Intel had plans to replace x86 with a variant of Itanium on the desktop too. It never got the chance, as AMD64 came closer in performance to IA-64 than anybody expected, and was naturally far superior when running legacy x86 code.
 

ojas

Distinguished
Feb 25, 2011
2,924
0
20,810
[citation][nom]blppt[/nom]Yeah, but Intel had plans on replacing X86 with a variant of Itanium on the desktop too. Never got the chance as AMD64 was closer on performance to Itanium IA-64 than anybody expected, and was naturally far superior when running legacy x86 code.[/citation]
Ah, interesting. Didn't know that (too young back then).

And the forums sort of suck. I posted something that didn't post, and now I'm in no mood to type it all over again.

But in short, I think the fact that even Intel couldn't come up with something viable to replace its own ISA probably says more about x86 than about the replacement.

x86 is like QWERTY: it has its flaws, but it's probably the best we have, and hard to replace. :lol:
 

ojas

Distinguished
Feb 25, 2011
2,924
0
20,810
[citation][nom]mikenygmail[/nom]"Sony actually co-developed the PlayStation 4's APU with AMD"Actually, AMD co-developed the Playstation 4's APU with Sony.[/citation]
Um, no, AMD makes APUs, not Sony. I'm not sure what you're getting at; the original sentence in the article seems fine to me.
 


palladin9479

Yeah, you are correct; Intel changed it around several times. Itanium is IA-64, not Intel64. AMD created the 64-bit extension of the x86 instruction set and eventually licensed it to Intel under a cross-licensing agreement. Intel originally called its implementation EM64T before changing the name to Intel64, which confused the hell out of me sometimes because lots of documentation still refers to it as EM64T.

IA-64 was Intel's play to cut AMD out of the CPU business; the two have a very long history of fighting over rights to x86. Intel owns x86, but IBM made it a condition of their original contract (for the original IBM PC) that Intel supply AMD with a license so that IBM wouldn't have a single source for their parts. AMD started making Intel x86 clones, and eventually the third-party IBM PC clones started coming out. Intel saw how lucrative the commodity computing market was and tried to claim that AMD's license only covered the first few Intel CPUs; lots of legal battles ensued, and eventually AMD was granted a permanent non-transferable license to the x86 ISA. After that, Intel wanted to switch off x86 and pushed manufacturers toward their *new* 64-bit ISA, Itanium. Intel promised they would make a super compiler that would make all your software magically run on the Itanium; that didn't pan out, and developers stayed with x86. MS made Itanium versions of Windows up to Windows Server 2008, but they've said they're not making any more OSes for it.

For the most part HP is the only one still using Itanium, with HP-UX; it has a small niche market in simulations and physics calculations. Those workloads tend to work well with VLIW designs, though GPGPU processing is rapidly making Itanium designs obsolete.
 

mikenygmail

Distinguished
Aug 29, 2009
362
0
18,780
[citation][nom]ojas[/nom]Um, no, AMD makes APUs not Sony. I'm not sure what you're getting at, the original sentence in the article seems fine to me.[/citation]

Uh, actually I was quoting the article, as indicated by these quotation marks: " "
My point was that AMD makes APUs, not Sony.

Article states:
"Sony actually co-developed the PlayStation 4's APU with AMD"

My point and post:
Actually, AMD co-developed the PlayStation 4's APU with Sony.
 

mikenygmail

Distinguished
Aug 29, 2009
362
0
18,780
The way the article stated it made it sound like AMD and Sony were each 50% responsible for the APU, which is ridiculous because AMD alone makes APUs. AMD made the APU and customized it for the PS4, with some input and minor tech from Sony.
 


palladin9479

That was more Intel trying to be crafty, though remember VLIW designs dominate GPUs today. Intel basically tried to use a vector processor with static encoding as a generic CPU; it worked well in supercomputers (early Crays) and other extremely high-throughput systems. Of course, they didn't realize that most instructions processed by general-purpose CPUs are simple integer operations and logical compares, not calculations of the gravity well of a black hole. So essentially they misunderstood their target market. Itanium would have worked great as an add-on coprocessor, though that would have meant sharing the market with AMD, and they absolutely didn't want competitors.

Me personally, I'm an UltraSPARC kind of guy. I believe the SPARC ISA and architecture is wonderfully beautiful in its simplicity and expandability. It was designed to be an open ISA that could scale from a pocket calculator to a 64-CPU mainframe or even a supercomputer, which makes it very easy to implement and expand.
 

blppt

Distinguished
Jun 6, 2008
576
92
19,060
I hate the idea of everything being based on a 30-year-old ISA too, but since all of our x86 legacy apps would need to run through a software emulator on a "pure" new architecture anyway, just think of the modern x86 CPU as a very efficient "hardware emulator": they all have RISC cores with an x86 interpreter sitting on top.
 


palladin9479

Honestly, most of the "x86" we use now isn't even the original x86. SSE / AVX and all that are actually their own ISAs that operate on their own RISC register sets. x86 is just the binary language we use to represent logic, memory accesses and integer math; it all gets recoded as it passes through the front end of the CPU anyway. This has the interesting side effect of letting hardware manufacturers design their own internal optimization schemes.

Take a single core of an Intel SB CPU. It has three integer logic units inside, though the x86 ISA only allows for one. The decoder unit rips each instruction apart into several small micro-instructions and attempts to order them in the most efficient way possible so it can process multiple instructions simultaneously. That ability has been the primary reason the x86 ISA has kept up with the others.
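You can actually see those multiple integer units from plain C. A hypothetical microbenchmark sketch (my own example, nothing vendor-specific): one accumulator forms a serial dependency chain, while independent accumulators let the core keep several integer units busy each cycle.

[code]
#include <stdio.h>
#include <stddef.h>

/* One accumulator: each add depends on the previous result, so the
 * loop serializes on a single integer unit. */
long sum_serial(const long *v, size_t n) {
    long s = 0;
    for (size_t i = 0; i < n; i++)
        s += v[i];
    return s;
}

/* Four independent accumulators: the decoder/scheduler can issue the
 * four adds per iteration to different integer units in parallel. */
long sum_parallel(const long *v, size_t n) {
    long s0 = 0, s1 = 0, s2 = 0, s3 = 0;
    size_t i = 0;
    for (; i + 4 <= n; i += 4) {
        s0 += v[i];
        s1 += v[i + 1];
        s2 += v[i + 2];
        s3 += v[i + 3];
    }
    for (; i < n; i++)   /* leftovers */
        s0 += v[i];
    return s0 + s1 + s2 + s3;
}

int main(void) {
    long v[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    printf("%ld %ld\n", sum_serial(v, 8), sum_parallel(v, 8));
    return 0;
}
[/code]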
 

catfishtx

Honorable
May 15, 2012
145
0
10,690
I am not a software developer, so someone correct me if I am wrong, but now that the consoles are going x86, wouldn't it be easiest to code a game for the PC, then branch off to the PS4 and Xbox 720 sides and add in their console-specific features? So in essence, future console games are really ports of PC games. You've got to love the irony!
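Something like that already happens at the source level. Here's a toy sketch of one codebase branching per platform (the BUILD_* macros are made up for illustration; real engines define their own build flags):

[code]
#include <stdio.h>

/* Hypothetical per-platform build flags, e.g. -DBUILD_PS4 at compile time. */
#if defined(BUILD_PS4)
  #define PLATFORM "PS4"
#elif defined(BUILD_XBOX)
  #define PLATFORM "Xbox"
#else
  #define PLATFORM "PC"
#endif

int main(void) {
    /* Shared game code: the same x86 logic runs on every target... */
    printf("running shared game logic on %s\n", PLATFORM);

#if defined(BUILD_PS4)
    /* ...with console-specific features added per platform. */
    printf("enabling PS4-only features\n");
#endif
    return 0;
}
[/code]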
 