I apologize for the long rant, but I felt compelled to say something. If nothing else, just to get it off my chest…
I think home desktop computing is in the toilet -- at least from a growth perspective. Full and utter commoditization is beckoning.
I upgrade my computer every 2-3 years, usually when I can see a significant improvement in performance moving from an existing system to a new one at a reasonable price point. For example, in the past I've moved from a 486 DX4-100 to a P-75, then to a P-233, a Celeron-500, a PIII-1GHz, and finally to a P4-3GHz. Each move was 2-3 years apart, and each delivered solid performance gains and let me run new applications, making the move worth it for me. Now I've had a P4-3GHz chip for over a year, and there isn't a single compelling reason to perform a significant (mobo/memory subsystem/CPU) upgrade anywhere on the near horizon, even a year out. Intel seems to have all but given up on improving raw CPU performance at anything like the rate of the last decade. Instead, they seem mired in a struggle to re-brand their CPU cores often enough that the folks who don't read Tom's Hardware (the vast majority of regular folks) will think there are all kinds of new and exciting products available now versus last year.
This reeks of commoditization. Take the low-end boombox market, for example. Do you think Sony, Panasonic, Technics, Aiwa, or a host of other throw-away brands have done anything to improve FM reception, sound quality, or CD playback in the last 20 years? Barely a thing. Many cheap amplifiers from the 70s can go head-to-head with your typical boombox. But I bet you can't even find the boombox you bought 6 months ago in any store today. Why? Because when companies realize their product is a commodity, they start focusing on glitz and glamour, marketing campaigns and fluff, to distract you from the fact that the product hasn't improved in years; only the appearance is more up-to-date. I sure don't know what I would do if my stereo didn't have a little animated spaceship LED landing pad…
So, back to CPUs and desktop computing as a whole: this is where I think things are headed -- toward a future in which we're all stuck at 4-5GHz, but who cares, because my case doubles as a neon dance club for my pet gerbil.
What gives? Have we run up against physical limits, or the limits of the x86 architecture? There are 3.4GHz chips on the market and 3.8GHz chips on the roadmap, representing roughly 13% and 27% clock gains over what I already have. Those are barely noticeable gains in day-to-day use, and certainly not worth the price tags they're putting on them. What happened to the days of the 486-33 and 486-50, when the performance difference between two successive chips was over 50%? Intel now charges nearly $100 more for the imperceptible 0.2GHz gain of a P4-550 over a P4-540 (http://www.sharkyextreme.com/guides/WCPG/article.php/10705_3402941__3). And these aren't even the high-end chips. Have you seen AMD's Opteron pricing schedule? It starts out reasonable, but at the high end? What a joke. Yes, I know pricing based on supply and demand has been going on for a while, but it stinks. The new marketing/numbering schemes just add to the confusion, so that no one who doesn't frequent sites like Tom's has any chance of figuring out what they're getting. Brilliant marketing by the CPU makers, but consumers lose in the end.
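Just to put rough numbers on that 540-vs-550 gap: the 540 is rated at 3.2GHz and the 550 at 3.4GHz, and I'm assuming the ~$100 premium from the SharkyExtreme guide above (street prices will vary). A quick back-of-the-envelope, in C for the heck of it:

#include <stdio.h>

int main(void)
{
    /* Rated clocks behind Intel's model numbers: P4-540 = 3.2GHz, P4-550 = 3.4GHz. */
    double ghz_540 = 3.2, ghz_550 = 3.4;
    /* Assumed price premium of the 550 over the 540: roughly $100, per the guide above. */
    double premium = 100.0;

    double clock_gain_pct = (ghz_550 - ghz_540) / ghz_540 * 100.0;  /* about 6.3% */
    printf("Clock gain: %.1f%%\n", clock_gain_pct);
    printf("Cost per 1%% of extra clock: $%.2f\n", premium / clock_gain_pct);
    return 0;
}

That works out to about $16 for each 1% of extra clock, and that's generously assuming performance scales 1:1 with clock speed, which it doesn't.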
So, will 64-bit chips be our performance savior? Hah! We don't have an OS or applications that remotely take advantage of them, and besides, does a proper implementation even improve performance that much over existing 32-bit architectures? I think it's a marginal improvement at best for most applications. 16-bit to 32-bit was a big jump that led to much-improved system stability, thanks to the better memory architecture. 32-bit to 64-bit is barely even necessary except for large simulations and high-end multiprocessing environments.
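For what it's worth, here's a quick way to see why the jump is marginal for ordinary code. This is just a sketch, assuming a typical LP64 64-bit system versus a 32-bit one: pointers double to 8 bytes while plain ints stay at 4, so pointer-heavy programs mostly just eat more cache, and the real win only shows up in code doing math on values wider than 32 bits, which runs natively instead of being pieced together from 32-bit halves.

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    /* On a typical 32-bit build these print 4 and 4; on an LP64 64-bit build,
       4 and 8. Bigger pointers mean bigger structs and more cache pressure. */
    printf("sizeof(int)   = %u\n", (unsigned)sizeof(int));
    printf("sizeof(void*) = %u\n", (unsigned)sizeof(void *));

    /* Where 64-bit actually pays off: 64-bit arithmetic in one native operation
       instead of being synthesized from two 32-bit halves. */
    uint64_t sum = 0;
    for (uint64_t i = 1; i <= 1000000ULL; i++)
        sum += i * i;
    printf("sum = %llu\n", (unsigned long long)sum);
    return 0;
}

Compile the same file as 32-bit and 64-bit and time the loop: unless your code is full of 64-bit math like that, the difference is small, which is exactly my point.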
By the looks of it, even cutting-edge CPUs are going to become cheap commodities within a decade, and the focus will shift to custom chips for custom needs. We're already seeing some of that with GPUs for graphics, DSPs for sound, network accelerators, MPEG compressors, etc. -- all designed to offload work from the new bottleneck at the core: the CPU.
Gordo

Futile is resistance, assimilate you we will.