This is definitely wrong. According to the link below, the 64-bit MIPS R4000 launched in 1991.
http://processortimeline.info/proc1990.htm
I personally remember the 64-bit race among RISC CPUs of the early 1990s, with MIPS among them. In fact, the Nintendo 64 launched in 1996 with a genuinely 64-bit MIPS CPU.
kinggremlin :
AMD was dead last in the race to 64-bit among major architectures, which is why they could only come up with the band-aid 64-bit extensions for x86, which ended up screwing us all and handicapping us with all the outdated garbage that was in x86.
This definitely makes you sound like an AMD-hater. Let's not forget that Intel had plenty of time to extend x86 to 64-bit however they wanted. Instead, they chose to use the transition as an opportunity to force their new IA64 architecture on the mainstream computing world.
And if you want to talk about band-aid solutions, how can you overlook PAE - the hack Intel bestowed upon the x86 ecosystem that lets 32-bit x86 address 36 bits of physical memory? That was purely to breathe a bit more life into x86 until their Itanium CPUs could take hold (but IA64 was short on more than just time). Had AMD not launched x86-64, maybe Intel would've just bumped it to 40 bits, and so on.
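Just to put those address widths in perspective, here's a quick back-of-the-envelope sketch (nothing x86-specific, just the arithmetic; the 40-bit case is the hypothetical next bump, not anything Intel shipped):

# Addressable physical memory for a few address widths:
# 32-bit (classic x86), 36-bit (PAE), and a hypothetical 40-bit extension.
for bits in (32, 36, 40):
    print(f"{bits}-bit physical addresses -> {2**bits // 2**30} GiB")

So PAE bought 16x the physical headroom on paper, while each individual 32-bit process was still stuck with a 4 GiB virtual address space - which is exactly why it always felt like a stopgap.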
You might nitpick a few decisions AMD made in x86-64, but I didn't hear much criticism at the time. IMO, it was eminently pragmatic, and it included a number of worthwhile improvements beyond just 64-bit addressing, such as doubling the general-purpose register count and making SSE2 a baseline.
As for the rest of your CPU history lesson, I'd suggest anyone truly interested in the timeline of microprocessor developments would be much better served by reading this:
http://processortimeline.info
From my perspective, discovering it was the best thing to come of your post.
kinggremlin :
I'm not sure what it is about AMD fans that they think AMD invented everything when they've not really innovated much of anything. You don't see that with fanboys of other brands.
I dunno which is worse: fanboys or haters.
A lot of the innovators of the computing world are no longer with us, or are shadows of their former selves. It turns out that market timing & execution matter just as much as an idea's genius and originality. The fact is that virtually all of the ideas underpinning today's computing products had their origins decades earlier. I knew a computer engineer who worked at Data General & Stratus in the 1980s and 1990s. He said he thought they were innovating, only to discover later that pretty much all of the original ideas they thought they had were already done by IBM and others in the 1960s and 1970s.
Credit should go where it's due, but I don't really worry too much about who invented what. The things that really matter are bringing good products to market, pricing them reasonably, and supporting them properly. How much of their IP is truly original is fairly moot, IMO.
If you want to see real innovation in computing hardware, I think you'll need to look beyond semiconductors. For example, I find quantum computers fascinating - like cutting-edge physics experiments, except you can use them to run algorithms!