slvr_phoenix
Splendid
<font color=blue>"Are you still so sure there is no need for 64 bit computers any time soon?"</font color=blue>
P4Man, you're still not even trying to give the people who disagree with you any credit whatsoever. I can't think of a single person who said that 64-bit is <i>never</i> needed or that it is <i>just</i> marketing. What your typical opponent will say, however, is that 64-bit computing is not needed for the <i>typical</i> consumer, and likely will not be for another two to three years. You still haven't once managed to refute <i>that</i>. Your examples always involve highly atypical usage that might at best apply to 0.03% of all computer users out there.

<font color=blue>"But hey, people will keep claiming 64 bit is only for marketing and not needed...."</font color=blue>
And this only proves your own lack of ability to predict the future. Major GPU upgrades are on roughly a one-year cycle, and until just this last year the CPU MHz race was also fast paced. So anyone who bought a computer 18 months ago would be <i>very</i> likely to be sacrificing eye candy and framerates on the absolute newest games if they refused to update their hardware.

<font color=blue>"Don't tell me that if you bought such a combo in August 2002, you'd expect to have to upgrade/replace it again today?"</font color=blue>
<font color=blue>"Similarly, if I'd buy a truly high end gaming rig today, I would expect it to last two years without major upgrades and without sacrificing much, if anything, on eye candy and framerates."</font color=blue>
Heck, just compare an ATI Radeon 9700 Pro to an ATI Radeon X800 or an nVidia GeForce 6800. Besides the obvious framerate differences that you'd see, the pure 3D programming advancements are quite noticeable as well. For that matter, just the anti-aliasing algorithms have come a long way in 18 months.
You'd have to be unimaginably stupid to <i>not</i> expect to have to upgrade after 18 months to get great image quality and framerates in the absolute latest and greatest games. You either upgrade or settle for less than perfect. Less than perfect, by the way, really isn't that bad a choice either.
Besides, putting your faith in the future in the hands of Tim Sweeney is certainly no proof of your own ability to predict the future. "<font color=blue>Well, we are aiming at the kind of PC that we think will be <b>mainstream</b> in 2006.</font color=blue>" (The bold is my emphasis.) He <i>really</i> thinks that <i>most</i> households will have 1024MB of video RAM in just two years? Tim Sweeney may be a 'programming god' (the words of BeyondUnreal contributing editor Twrecks, not me) but he's certainly no psychic, or even good at predicting market trends for that matter.
For a perfect example of this, consider the statement "<font color=blue>The normal maps are typically 2k by 2k.</font color=blue>" Yeah. That makes sense. As it is, most people are limited today by their MONITOR. Unless we either see monitor prices change sometime soon (which isn't likely) or we see monitor dots-per-inch numbers change drastically (which is even less likely), then the coolest video card in the world with even a gig of RAM isn't going to mean squat when the game is run on a cheap-arsed 17" CRT monitor, or <i>worse</i>, an <b>LCD</b>. So making normal maps 2K by 2K is going to be incredibly wasted on pretty much everyone.
So then, which makes more sense: 2Kx2K maps that won't mean squat to anyone or 1Kx1K maps that will load 4 times faster (and at the same time use 1/4 of the memory) and look just as good on everyone's monitors? Let me ask you how many times you think <i>anyone</i> will be moving so close to an object that it takes up 100% of their screen and they can actually use the direct 1 to 1 pixel translation from a normal/texture map <i>that</i> large? Anyone? Anyone? Bueller?
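The arithmetic behind that 4x claim is easy to check. Here's a minimal sketch, assuming uncompressed 32-bit RGBA maps (real engines use compressed texture formats, but the ratio between the two sizes is the same either way):

```python
# Rough arithmetic behind the 2Kx2K vs. 1Kx1K normal map comparison.
# Assumption: uncompressed 8-bit RGBA, i.e. 4 bytes per texel.

BYTES_PER_TEXEL = 4

def map_size_mib(side):
    """Memory footprint of a square side x side map, in MiB."""
    return side * side * BYTES_PER_TEXEL / (1024 * 1024)

for side in (1024, 2048):
    print(f"{side}x{side} map: {map_size_mib(side):.0f} MiB")
# 1024x1024 map: 4 MiB
# 2048x2048 map: 16 MiB

# Four times the memory and four times the data to load, yet on a
# 1280x1024 screen an object would have to fill more than the entire
# display before a 2Kx2K map's extra texels could ever reach a
# 1:1 texel-to-pixel mapping.
print(f"ratio: {map_size_mib(2048) / map_size_mib(1024):.0f}x")
# ratio: 4x
```

Doubling the side length quadruples the texel count, which is why every step up in map resolution is so expensive relative to what a fixed-resolution monitor can actually show.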
<pre><b><font color=red>"Build a man a fire and he's warm for the rest of the evening.
Set a man on fire and he's warm for the rest of his life." - Steve Taylor</font color=red></b></pre><p>