AMD gave almost no tech specs for the 939/Opteron launch or the AM2 launch, and as for the Phenom, they made performance comparisons against their own existing chips. To be fair, I do recall a few overzealous assumptions, though they were stated as such, along with a refusal to produce benchmarks. But Intel is the one who likes to get creative with benchmarks...
It has been stated for years that part of AMD's downfall is the lack of any strong marketing campaign. Intel has ads running constantly in every branch of the media, whether they have a product that justifies it or not. AMD is limited largely to the internet, and even that was scarce up until the past couple of years.
Yes, I've looked at SPEC.org. I've also read numerous accounts of Intel "mixing up" systems, doing things like running double the CPU cores, double the disk RPM along with double the number of disks in RAID 0 compared to the AMD rig, and running benches in 32-bit while the AMD system was running 64-bit. That didn't happen too long ago, actually, and I'm fully aware that it merits links citing these occurrences. As it's 3am, I'm just going to respond now and will make sure to post links ASAP. I'm just loath to take any internet benches at face value, especially since my Phenom 9850 BE is benching 15%-20% above reported benches.
When Intel is in court in numerous countries across the globe for unfair business practices, I hardly think their sales are simply a matter of excess stock.
The engineers from the Alpha team that AMD inherited are responsible for the Socket A chips as well as the Opteron and, in turn, the 939 desktop chips, which have an IMC. Consider that when the $1500 Core 2 Extreme QX6850 launched, AMD's fastest chip was the $180 6000+ X2, which at a 750MHz memory speed (6.25% slower than the QX6850's 800MHz) ended up only 7.5% behind in the PCMark05 memory test. Bumping the memory up to 800MHz cut the difference to less than 2% in PCMark05. However, in Sandra's memory integer and floating point benches, the 6000+ X2 @ 750MHz had a 34% lead despite the 6.25% speed handicap (the arithmetic is sketched below). That's pretty much a clear-cut win for the IMC, which is part of the reason the Opteron/939 earned AMD the crown in June of 2004, when the 939 launched with an improved IMC and support for both server and desktop chips.
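A quick sanity check of those percentages (a minimal sketch in Python; the 800MHz baseline and the deltas are just the figures quoted above):

```python
# How far 'value' sits below 'baseline', as a percentage.
def pct_slower(value, baseline):
    return (baseline - value) / baseline * 100

print(pct_slower(750, 800))  # 6.25 -- the memory-clock handicap quoted above
```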
Intel should have revisited the IMC long before considering pushing 8- and 16-core chips out to a market that has almost no software supporting anything beyond a dual core. They simply did not have the ability to implement it, despite working on it for the past 3.5 years.
Beautiful engineering solution? Engineering shortcut would be more accurate. Intel's initial "multi-core" chips are little more than a shrunk-down version of a multi-socket board: two dies that not only share cache, which creates a bottleneck, but have to communicate between CPU dies, to the northbridge, to the memory, and back down the path to the shared cache. Nothing pretty about it; it's dirty and limited, balanced only by ridiculous cache sizes. You act like the Phenom X3 is the first example of AMD using imperfect silicon. Going back to the 939 chips, the Manchester dual core was a Toledo with half the L2 cache disabled. I'm also fairly certain that some of the Semprons were X2s with a damaged core.
The DEC 2000 AXP ran NT, not the EV7; the EV7 was the first implementation of an IMC, though. Those workstations also introduced the PCI bus and the VGA standard. In fact, I'm actually typing this on a 22-inch Compaq QVision monitor that was the standard for one of the Alpha workstation revisions, but I'll explain how I know that, and how I came to get the monitor, a bit later.
An AMD chip that can run a 64-bit OS? Hmm, well, let's see. First off, while your comment on the IA-64 may be technically accurate (referring to the original, anyway), there is a reason x86-64 won out for the desktop. It's the same hardware jump that was made when going from 16-bit to 32-bit computing. Running 32-bit apps on the first release of IA-64 required that they be emulated or handled by a dedicated processor just for 32-bit code, which translated into ssssssllllllllooooooowwwwwwww performance. It was horrible: an exclusively 64-bit CPU... that didn't even get to use its 64-bit support; it was a 64-bit chip emulating 32-bit code. Talk about progress... but even in 64-bit, the arch was terrible, which is why Intel had to license x86-64 from AMD. But you say nothing else is a 64-bit CPU... well, you're wrong and you're right. But first, the history lesson.
Most people would say that 32-bit computers weren't around until Windows NT 3.1 and 95. Sadly, that's true in the sense that there was no 32-bit software support until then, but the reality is the first x86 32-bit CPU was... the 386. Yes, the first x86 32-bit CPU was the 386, launched in 1986. I was 3, ffs. The 386 ran 16-bit software at full speed in hardware alongside 32-bit applications, with apps ported to 32-bit when available.
This is the same thing with the A64 chips, and almost identical with Intel's new x86-64 (EM64T) chips, though Intel did change a few things, which of course run a bit slower than on the A64. The differences are well documented if you actually look for them; the easiest find is probably Wikipedia.
The Intel-based Mac OS X machines do almost the same thing. 64-bit GUI apps are supported using OpenGL, X11, Quartz, and something else I can't remember. Non-GUI 64-bit frameworks are supported as well, and 64-bit POSIX and math libraries are supported from the command line, though it's a 32-bit kernel. Hmm, just like the 386 was implemented.
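A tiny way to see that split for yourself (a sketch; it assumes a Python interpreter built for the 64-bit userland, nothing Mac-specific):

```python
import struct

# Pointer width of the running process, in bits: prints 64 in a
# 64-bit userland even when the kernel underneath is 32-bit.
print(struct.calcsize("P") * 8)
```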
AMD64 addresses 48 bits of the available 64-bit address space, while Intel's implementation only uses 32 bits (I know the new server chips are upping it to 44 bits, but I haven't caught anything regarding Nehalem).
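For scale, here's what those address widths work out to (just the arithmetic on the bit counts quoted above):

```python
# Address space implied by each width: 2**bits bytes, shown in GiB.
for label, bits in [("32-bit", 32), ("44-bit", 44), ("48-bit AMD64", 48)]:
    print(f"{label}: 2^{bits} = {2**bits / 2**30:,.0f} GiB")
```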
So, AMD chips that have run 64-bit OSes: my 144 Venus Opteron, 165 X2 Toledo-core Opteron, and 4400+ X2 Toledo-core Athlon have all run, or are currently running, 64-bit XP, 64-bit Server 2003, 64-bit Vista Ultimate, and 64-bit Linux. The 165 Opty is currently running 64-bit Server 2008, though it ran fine on the 144 Opteron as well. I'm currently running 64-bit Vista Ultimate on my Phenom 9850 BE on a DFI LP UT 790FX with 2x2GB 1066MHz Geil and a 4870, with a Windows Experience Index rating of 5.8 (the 5.8 being my hard disks atm; CPU, GPU, and RAM are rated at 5.9). Those are as 64-bit as you get, short of running a long-mode 64-bit server environment, which doesn't do me much good, though I'll try long-mode 64-bit Linux just for S&Gs.
Funny thing is, when I built a PC back in 2002/2003 (just before the release of the Athlon 64), I built a system with a Pentium 4. My reasoning? Well, at the time the Athlon XP ran very hot, wasn't as fast as the Pentium 4 (performance-wise), and my friends who had them had the worst time getting them to just run at the stock speeds they were supposed to run at, due to horrible chipsets and mobos. Heck, I had to overclock a friend's so it would be seen as a 2700+ (think it ran at like 1.4GHz), but it would only be seen at 700MHz if you left the BIOS as is.
Wow, jimmy... just wow. Are you serious? Because there is no part of that paragraph that isn't horribly wrong, albeit painfully funny. It feels like bait... but oh well. The 700MHz boot speed I actually saw, though I knew why it happened.
It was a chip with an unlocked multiplier on a board that didn't support changing multipliers manually at all (which probably could have been fixed with a BIOS update), but the board did support Cool'n'Quiet. If the microcode identifying the chip wasn't detected and C&Q was left on, it would default to the lowest multiplier and FSB, usually 100MHz x 7 or 133 x 6; the former is looking pretty likely here. The advanced features menu where the multiplier option lived in the BIOS was only visible after pressing F1 (true of Gigabyte and Abit boards). If it happened to be a Gigabyte GA-700NA Pro NF2 board, there were five little DIP switches that let you hard-set the multiplier before even powering up the board; having all of them on/off set it to either Auto or x7. Or maybe the chip decided 700MHz was all you two deserved.
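The arithmetic behind that fallback, for the record (a quick sketch; the FSB/multiplier pairs are just the usual Athlon XP-era defaults mentioned above):

```python
# Effective CPU clock = FSB (MHz) x multiplier.
def effective_clock(fsb_mhz, multiplier):
    return fsb_mhz * multiplier

print(effective_clock(100, 7))   # 700 MHz -- the fallback boot speed described
print(effective_clock(133, 6))   # 798 MHz -- the other common low default
print(effective_clock(166, 13))  # 2158 MHz -- roughly a 2700+'s stock clock
```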
Yes, actually, I do realize that Intel has a hand in creating I/O standards and peripherals. But those aren't what put them in the spotlight, nor are they the things Intel misrepresents. I loved Nvidia chipsets long after I disliked their GPUs. It's kind of like saying everyone hates the Nazis without realizing that if it weren't for them we wouldn't have the jet engine: doing good in one area doesn't mean they do everything right.