Anyway, in my opinion upgradeability is overrated. I mean, why would you buy a new CPU only to have it crippled by an obsolete chipset and RAM?
At least in the last few years, when I’ve paid attention, most innovation has been in both the CPU and the things surrounding it, especially RAM. For instance, I have a moderately old P4 Northwood A (2 GHz) — when or why would I ever upgrade the CPU without upgrading my motherboard and RAM as well? Sure, I could have dropped in a 3.06 GHz, but if I had been willing to spend that much at any point before it became obsolete (when the Northwood C came out), I might as well have bought a 2.8 GHz when I first built my system. That wouldn’t have set me back any more, and it would have been almost as fast.
In my opinion, the time to upgrade is when something genuinely new arrives — I considered it when the i870 came out. But that was as much about the motherboard and the memory subsystem as the CPU, and at that point the upgradeability of my existing system would mean nada.
The CPU costs almost as much as the motherboard and RAM combined (it varies, of course), so why not back your investment by replacing those as well?
The only situation where upgrading just the CPU makes sense is if you’re such a power user that you simply must take your 2.6 GHz Northwood C (and in that case you’d have bought a 2.8 in the first place) to something like a 3.2 — and frankly, that’s silly in my opinion.
One other exception: if you had an early Socket 478 board with RDRAM. As far as I know, the 850/RDRAM combination didn’t evolve much, but it stayed competitive, so upgrading something Willamette-ish or Northwood A-ish to, say, a Northwood B would make some sense, I figure.
But the fact remains: we may talk a lot about upgradeability, but when does it really make sense? It hasn’t in my time…