Answering the rhetorical question the article asked: to me, it means little. I love staying updated in the news department, but I'm a bit more conservative when it actually comes to buying new gear.
Some people will think I'm mad, but last year, at the beginning of November, I decided to do one last upgrade to my Socket 775 system: a P45 Gigabyte motherboard with a healthy 4GB of Corsair DDR2-800 and an original copy of Windows 7 64-bit Home Premium OEM edition.
The upgrade covered the CPU and the GPU. Over two weeks in November I made one good decision and one bad one: I bought a brand new Core 2 Quad Q9550 to replace my Core 2 Duo E8400, and a brand new, fresh-out-of-the-factory Gigabyte Radeon HD 6850 to replace my factory-overclocked 8800GT.
Ironically, I guess most people would think that, from a "Sandy Bridge is around the corner" perspective, I was making a bad decision on the CPU (even more so considering the E8400 I was coming from) and a good one on the very efficient and affordable HD 6850.
And if you look at CPU buying recommendations, the most you'll find is a Q9400 recommendation from a few months back; I must clearly have been out of my mind, doing something nobody else was doing.
But I did the math: I have an E8400, revision E0, that I could still sell while it was worth something (which I did), and buying a quad core is just a drop-in replacement.
Going Nehalem in November wouldn't have made any sense, so waiting for Sandy Bridge was the only alternative. But here again you can see that "right around the corner" is not an entirely valid argument; when giving out advice, sites like Tom's Hardware should always tell everybody to take the recommendations with a grain of salt. How badly do you need the new technology? Everybody knows that, even without this unfortunate SATA problem, new chipsets always bring new problems that only get solved after a couple of BIOS updates, etc. So, in reality, you should wait at least three months before buying a brand new piece of kit.
That is what I did with the Q9550 and DIDN'T with the HD 6850. Of course, each personal experience is just that, but in my case I got a CPU with a pack date of 27 September 2010, built on a very refined 45nm process; I can easily overclock it to 3.2GHz on stock voltage, get Core i5 750 performance, and wait another year for Sandy Bridge (once its problems are solved).
Besides, getting another platform would mean buying new DDR3 memory. Sure, that would be covered by the price difference between a Core i5 750 and the Q9550 I bought, but I would also have had to buy a new motherboard, take on the added risk of a faulty RAM stick or motherboard, and accept all the wasted time that would imply. Just upgrading the CPU is much easier and faster, and less risky too: no motherboard going in and out (and possibly back out) of the case, no tools lying all around. And then there's the Windows 7 OEM license problem that a brand new motherboard would bring (not to mention a probable reinstallation of every piece of software, OS included, even if I got past the license issue). Just upgrading the CPU solves that problem too.
Now to the HD 6850: lots of driver problems and even hardware problems. The first batch of cards came with information on the box that tricked me and others into believing the card had two dual-link DVI-I ports, when in fact it had only one dual-link DVI-I and one single-link DVI-D (meaning only one monitor could be connected through a D-sub adapter instead of two, or, for those with big DVI monitors, only one could run at 2560x1600 while the other tops out at 1080p). The BIOS it shipped with idled the twin fans at 73% speed (a BIOS fix came online just five days later), and it hard-crashed the PC in games like Crysis, with a possible fix, the F3c VGA BIOS update to improve compatibility with some games, only coming online two months later. So I took that card back to the store and came home with six-month-old technology, a GTX 460 1GB. It's slightly slower and not as power efficient under load, but it has a much bigger choice of drivers that actually work, and its hardware manufacturing has had time to mature.
I'm happy with the CPU I chose, and I'm glad to have learned from the mistake I made when I jumped on the "buy it as soon as it comes out" GPU enthusiast bandwagon.
All of us get to be guinea pigs from time to time, and even get laughed at by some idiots for doing the work for them; I had my first time, and hopefully there won't be many more, LOL.