[citation][nom]h2o_skiman[/nom]What is really sad about this announcement is how Intel paid off Microsoft to drop Power/SGI/Alpha architectures from Windows. NT supported all of them and was designed to be CPU independent using HAL. I ran NT and Windows 2000 beta on Alpha and it blew away anything that Intel could offer at the time. Windows 2000 fixed many of the original compatibility issues suffered in NT. Instead of consumers having real choice in CPUs, we are left with only AMD and Intel, which are the same CISC 32/X64 architecture and it is not the best for many applications. If we had the choice today, you might see gamers or HTPCs running on RISC chips and office productivity on CISC. But instead, Intel made the choice for us.[/citation]
You're way off. Itanium killed Alpha and all the other RISC chips except for POWER-based ones.
It's common knowledge now that the Alpha was the greatest chip and architecture ever, with the best instruction set. We all know this. Except it's completely untrue. They ran hotter than Hell, were obscenely expensive, and traded wins with POWER-based processors depending upon the timing and the application. DEC originally built them as very high clock speed, low-IPC devices, then slowly realized this wasn't the best approach, and increased IPC while focusing less on clock speed. They were so expensive to manufacture that they could never have been mass produced, and their allegedly superior instruction set wasn't what made them fast. Their implementation was just very, very expensive, with a lot of custom-designed silicon in each processor.
Anyway, Alpha is gone. Good riddance. Those machines were real junk and I'm glad never to have to work with one again.
You're also wrong about Intel forcing CISC on everyone. They wanted to move to VLIW, but AMD and the idiot consumers wanted an extension to x86, and Intel had to add the extensions. Don't blame them for this.
Also, x86 isn't CISC under the covers. The complex x86 instructions are decoded into simple, RISC-like micro-ops inside the chip, so the external instruction set is decoupled from the internal one and the RISC/CISC lines are blurred.
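To picture what that decode step does, here's a toy sketch of a CISC-style read-modify-write instruction being cracked into simple micro-ops. This is an illustration only, not Intel's actual front end; the instruction format and op names here are made up:

```python
# Toy illustration: a CISC-style "add [addr], reg" instruction being
# cracked into RISC-like micro-ops, roughly the way a modern x86
# front end decodes complex instructions internally.
# (Instruction tuples and op names are invented for this sketch.)

def decode(instruction):
    op, dest, src = instruction
    if op == "add_mem_reg":          # complex: read-modify-write on memory
        return [
            ("load", "tmp", dest),   # micro-op 1: load operand from memory
            ("add", "tmp", src),     # micro-op 2: ALU add in a temp register
            ("store", dest, "tmp"),  # micro-op 3: store result back to memory
        ]
    return [instruction]             # simple ops pass through unchanged

uops = decode(("add_mem_reg", "[0x1000]", "eax"))
print(len(uops))  # one complex instruction became three simple micro-ops
```

The back end then schedules those simple micro-ops like any RISC core would, which is why the "CISC vs. RISC" label stopped meaning much at the silicon level.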
That you think CISC is better for office productivity is strange. Complex instruction sets weren't designed with particular kinds of software in mind (beyond some being well suited to certain languages); they were shaped by the hardware limitations of their time.
For example, people bitched and moaned about the 8086/186/286 segmented memory addressing. It existed because registers were only 16 bits wide back when memory was scarce and expensive, and segments let the chip reach more memory than 16 bits alone could address. That's generally the era CISC comes from: very powerful instructions kept code density high, allowing much more sophisticated applications in the same amount of memory.
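The segment arithmetic people complained about is actually simple. A quick sketch of how the 8086 built a 20-bit physical address out of two 16-bit values:

```python
# 8086 real-mode address formation: physical = (segment << 4) + offset.
# Two 16-bit values combine into a 20-bit address, letting a chip with
# only 16-bit registers address a full megabyte.

def physical_address(segment: int, offset: int) -> int:
    assert 0 <= segment <= 0xFFFF and 0 <= offset <= 0xFFFF
    # The & 0xFFFFF mimics the 20 address pins: the sum wraps at 1 MB.
    return ((segment << 4) + offset) & 0xFFFFF

print(hex(physical_address(0xB800, 0x0000)))  # 0xb8000, the CGA text buffer
```

Many different segment:offset pairs map to the same physical byte, which is one of the things programmers griped about, but it bought a megabyte of address space on 16-bit hardware.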
In some cases, weird addressing modes worked around limitations in the processor hardware itself. For example, DEC's early 12-bit minicomputers (the PDP-8 family) had 12-bit instruction words. Five bits went to the opcode and mode flags, leaving only 7 bits for the address, or 128 words that could be addressed directly. Kind of small, even for then. So they added an indirect mode: the 7-bit field pointed at a memory word, and the full 12-bit contents of that word supplied the actual address (so in a sense, the instruction carried the action to be performed, followed by a pointer to the word holding the real memory address).
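A tiny simulator makes the trick concrete. This is a loose sketch of PDP-8-style indirect addressing, not a faithful emulator:

```python
# Loose sketch of PDP-8-style indirect addressing (not a faithful
# emulator). A 7-bit address field can only reach 128 words directly,
# so the indirect bit says: "the word at this address holds the real
# 12-bit address."

MEMORY = [0] * 4096              # 4K words of 12 bits each

def effective_address(addr_field: int, indirect: bool) -> int:
    assert 0 <= addr_field < 128           # only 7 bits in the instruction
    if indirect:
        return MEMORY[addr_field] & 0o7777  # full 12-bit pointer from memory
    return addr_field

MEMORY[5] = 0o4000               # word 5 holds a pointer to word 2048
print(effective_address(5, indirect=False))  # direct: limited to words 0..127
print(effective_address(5, indirect=True))   # indirect: reaches all 4K words
```

One extra memory read per access, but suddenly 7 bits of instruction can reach the entire address space. That's the kind of trade-off that produced "weird" CISC addressing modes.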
CISC instruction sets were products of a certain time, and when the technology changed (e.g., memory became less of an issue), instruction sets better suited to the new technology were born. Instruction sets for certain eras, not for certain apps.