I'm very glad I picked up my Q6600 all those years ago.
I'm finally reaching the point where the "three tiers higher" rule (coupled with other architectural advantages) is starting to make an upgrade look very compelling. I made the choice back then (over 3 years ago now) to invest in extra cores specifically for future-proofing, and because I wanted the flexibility and utility of having free cores to absorb an AV app starting up, or to keep the system responsive enough that I could run other apps simultaneously without my game slowing to a crawl.
I remember the single-core days, when even task-switching out of a game would sometimes hang the game or even the PC, and was at the very least a likely cause of errant, erratic behavior.
Similar to the argument in the recent TH article about SSDs freeing up a system when AV kicks in mid-game (or even more so when you run FRAPS or other apps to capture video in-game), the multi-core idea makes sense by the same logic: more system resources, and one less choke point in the system.
So even though most games and apps can't fully utilize four cores at once, as long as the OS is smart enough to schedule additional apps onto free or under-utilized cores, having more cores (up to a point of diminishing returns) just makes sense.
Now, it's a very disappointing fact that the majority of game developers primarily build their titles around stunted consoles, but seeing companies like Bethesda release the graphics enhancement package for Skyrim's PC audience gives me at least a little hope that more companies might consider moving in that direction (albeit working backward from a console design, rather than designing for the PC from the start).
Frankly, I'm surprised it took this long for developers to start taking advantage of multi-core tech, but I suppose once again we can place most of the blame on the "designed for consoles" mindset (PCs are just an afterthought).