[citation][nom]ojas[/nom]Ah interesting. Didn't know that (too young back then). And the forums sort of suck. I posted something that didn't post, and now I'm in no mood to type it over again. But in short, I think the fact that even Intel couldn't come up with something viable to replace its own ISA probably says more about x86 than about the replacement. x86 is like QWERTY: it has its flaws, but it's probably the best we have, and hard to replace.[/citation]
x86 is the ultimate survivor, as long as we're interested in more computing power (and we are). It's not that there couldn't have been competition. There was another growable CISC around for a time: the 68k. This was Motorola's family, which powered the Amiga, the Mac, the Atari... basically everything but the PC. Motorola's board eventually concluded, wrongly, that the future belonged to RISC, and committed CPU suicide by going with IBM's POWER instead.
There are a number of things going for CISC:
The ISA is formal. That means the inner workings of the CPU can differ wildly without the software having to take it into account, so many generations, tiers, technologies and manufacturers can coexist on the same software binaries, creating a mass-market advantage for development efforts on both the CPU side and the software side. It also means the software is reliable, and that any crazy way of doing things internally is fine: software doesn't care, the compiler doesn't care, the CPU is a black box.

This is how CISC beat RISC. All the supercomputing technologies could and did move into the x86 CPU, and then some: prefetch, pipelining, superscalar execution, out-of-order execution, instruction fission, vector processing, instruction fusion. And amid all this came the day when the CPU grew so powerful that one of the main bottlenecks was moving work in and out of it. That is where all the compiler-centric architectures were truly f**ed.
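To make the "black box" point concrete, here's a toy sketch (in Python, and deliberately not real x86 behavior) of a front end translating architectural instructions into internal micro-ops. The mnemonics, the `decode` function, and the fission/fusion rules are all invented for illustration; the only point is that the binary stays the same while the internal representation is whatever the CPU designers want.

```python
def decode(instructions):
    """Toy CISC front end: architectural instructions in, micro-ops out.

    Fission: an ALU instruction with a memory operand splits into a
    separate load plus a register-only ALU op.
    Fusion: a compare followed by a conditional branch merges into a
    single micro-op. The running binary never sees either transformation.
    """
    uops = []
    i = 0
    while i < len(instructions):
        ins = instructions[i]
        nxt = instructions[i + 1] if i + 1 < len(instructions) else None
        if ins.startswith("add ") and "[" in ins:
            # fission: "add eax, [mem]" becomes load + add
            uops += ["load tmp, [mem]", "add eax, tmp"]
        elif ins.startswith("cmp") and nxt and nxt.startswith("jne"):
            # fusion: compare + branch become one internal op
            uops.append(f"cmp+branch ({ins} ; {nxt})")
            i += 1  # consume the branch too
        else:
            uops.append(ins)
        i += 1
    return uops

program = ["mov eax, 1", "add eax, [mem]", "cmp eax, 2", "jne loop"]
print(decode(program))
```

Four architectural instructions come out as four micro-ops here, but not the same four: one was split in two and two were merged into one. Swap in a completely different `decode` and the same `program` still runs, which is the whole mass-market advantage.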
But there is another reason: a mathematical insight about complex systems, which says that the most efficient way, and sometimes the only workable way, of handling complexity is to handle it at the lowest possible level, i.e. inside the CPU, not in the compiler. Of course, EPIC's magnificent failure had already demonstrated this principle in practice.