Increasing the frequency drives power consumption up superlinearly: dynamic power scales with voltage squared times frequency, and voltage has to rise along with frequency, so power grows roughly with the cube of the clock speed.
Increasing the number of cores, by contrast, increases power consumption only linearly.
Therefore, if you want to save power, chasing single-core speed is fundamentally flawed. In theory, you could build a multi-core processor that did the same amount of work in the same amount of time, but at a lower frequency and thus using less power (down to some cutoff below which the individual cores are too slow to be useful).
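As a rough sketch of that argument (assuming the standard dynamic-power model P = C·V²·f, with the extra assumption, not stated in the thread, that voltage scales linearly with frequency, so power grows roughly with the cube of f):

```python
# Rough CMOS dynamic-power model: P = C * V^2 * f.
# Assumption: supply voltage scales linearly with frequency,
# so power grows roughly with the cube of the clock speed.

def dynamic_power(freq_ghz, capacitance=1.0, volts_per_ghz=1.0):
    """Relative dynamic power of one core at the given frequency."""
    voltage = volts_per_ghz * freq_ghz
    return capacitance * voltage**2 * freq_ghz

# One core at 3 GHz vs. two cores at 1.5 GHz -- same total throughput,
# assuming the workload parallelizes perfectly across both cores.
single = dynamic_power(3.0)
dual = 2 * dynamic_power(1.5)

print(single)  # 27.0
print(dual)    # 6.75 -> same work done, about 1/4 the power
```

The exact numbers depend on the capacitance and voltage constants, but the shape of the result does not: halving frequency while doubling cores wins under any cubic (or even quadratic) frequency-power relationship.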
I disagree.
In fact, we could rename this thread IBM vs. Intel+AMD. Go ahead, Google IBM's POWER6 chip: over twice the clock speed of the current generation, no extra cores required, no exponential power increase. Now I know it's not a PC chip, but it goes to show there are ways to greatly increase the speed of a processor without simply adding more cores. Several technologies in the works have the potential to push CPU speed further still (think clockless CPUs and quantum CPUs). All in all, the ceiling on single-core speed is a long way off.
I'd also like to point out the multithreading/multitasking irony: one of the main selling points of multiple cores is being able to multitask efficiently. Yet if 2 or 4 cores become the standard, all software will start to be written with 2 or 4 active threads, saturating every core. At that point you're about as incapable of multitasking as the guy running a single core that's twice as fast as one of your duals.
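That irony boils down to simple arithmetic (the workload here is hypothetical and assumed to parallelize perfectly):

```python
# Hypothetical workload: W units of work, perfectly parallelizable.
W = 100.0

# Two cores at speed 1, software written with exactly 2 threads:
time_dual = W / (2 * 1.0)

# One core at speed 2, software written with a single thread:
time_single = W / (1 * 2.0)

print(time_dual, time_single)  # 50.0 50.0
# Identical wall time -- and once software uses every core,
# neither machine has spare capacity left for multitasking.
```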
But wait! You say, "Maybe they won't write all software with multiple threads." Then congratulations, you have now paid for an extra core that sits idle much of the time. What a waste.
I want my high-speed single-core dream back. If I really need another core, I'll wait until they come as an add-in card.
You seem to know your stuff, so I'll assume you also know that the technologies mentioned above carry exorbitant costs, far beyond what a typical consumer is willing to pay (or what companies are willing to implement, even if the technology is already available).
I would think dual-core is the way to go, given that current silicon technology is nearing the end of its life (too much crosstalk and current leakage as feature sizes shrink to smaller nanometer nodes), at least until the newer technologies are cheap enough to implement (or even exist).