Well, the average cubicle Joe would be just fine with a 700 MHz proc running Windows 95 - but then where's the money in that??? MS, Intel, etc. need to push the envelope to keep people upgrading.
Get a killer game out that's 64-bit Vista DX10 only and it will pull a couple million easy - then all these companies sitting on their ass will step up, because they'll have a lot more people asking for a 64-bit driver or demanding a 64-bit version of their program.
By your theory there was no need to ever go past 16-bit - I bet if we look back to 1996 we can find a million "who will ever need 1 GB of RAM or a 10 GB hard drive" posts. Once 64-bit is mainstream, the software and hardware to take huge advantage of it will be made.
I think you have a hard time understanding my comments. I never said that there will NEVER be a need for 64-bit. I only said that FOR ME now is not the time yet. My current PC has 2 GB, and I'm looking at going to 3 GB. All current programs (including resource-intensive 3D apps like Maya) run well on Vista with 2-3 GB of RAM, and more RAM will only very marginally improve performance (at the expense of compatibility). The current market doesn't demand 64-bit any more than it demands 32-core processors. It sounds cool, but is it really necessary (there is a big show-off factor there)? No apps support it yet, and no developer wants to spend extra time and resources to program something that makes things more complex, without a real need.
My theory (if I have one...) is that 64-bit will make sense in due time, when the real need arises. Pushing for a new standard when the benefits are still marginal, and the needs are more psychological than technical, isn't the way to go.
My point is, if you are a real PC enthusiast, you change PCs at such a fast rate that you have the flexibility to not have to look ahead 2 years and guess what you think (or hope) the standard will be then. 2 years from now, the PC I'm using now will be demoted to file server or media center, and my next high-end PC will take over as a 3D workstation/gaming platform. And since I will be using both, I will need an extra license anyway, so if it needs to be 64-bit by then, 64-bit it will be.
Similar discussions can be had about AV standards like HDTV. When the first HDTVs came on the market, some people were saying that there was no way around it and you might as well buy one now, since it was going to be the future. Those people often find themselves now with obsolete 1080i TVs, without HDMI, without support for HDCP, and they spent 4 times as much money on a TV at a time when HD content wasn't even available. They have to spend money again on a TV that will actually play HD-DVD/Blu-ray. If they had been patient enough to wait and see what the market was going to do, they could have saved themselves a lot of money, and a lot of headaches.
And 'because it's the future!' isn't a good enough reason for me to change to a new standard... 'because it will improve your PC's performance by 25+%', 'this program will run sooooo much better with 4 GB+ of RAM, thus needs 64-bit', '6 GB of RAM is only € 200 now', or 'every program is written in 64-bit anyway nowadays', on the other hand, are good enough reasons.
AMD has pushed 64-bit on us with the promise of enormous performance gains (or, as I see it, to differentiate themselves and put Intel in a trend-following, rather than trend-setting, position). But with the Core architecture, Intel has proved once again that 64-bit isn't the performance multiplier we needed so badly - multithreading and good processor design are. And if 64-bit can help improve that even more, then that's even better... But 64 bits on its own won't do anything until the limits of 32 bits are reached (which may be within the next 18-24 months... or not). I've been following this industry long enough to know Moore's Law still applies, and with 1-2 GB of RAM being the sweet spot for most users now (with the OS being the main decider on what amount of RAM makes sense), it will be at least another 18 months before we reach that magic barrier of 4 GB...