Tim Sweeney on Unreal Engine 3

Are you still so sure there is no need for 64-bit computers any time soon?
But hey, people will keep claiming 64-bit is only for marketing and not needed....
P4Man, you're still not even trying to give the people who disagree with you any credit whatsoever. I can't think of a single person who said that 64-bit is <i>never</i> needed or that it is <i>just</i> marketing. What your common opponent will however say is that 64-bit computing is not needed for the <i>typical</i> consumer, and likely will not be for yet another two to three years. You still haven't once managed to refute <i>that</i>. Your examples always include highly atypical usage that might at best apply to 0.03% of all computer users out there.

Don't tell me that if you bought such a combo in August 2002, you'd expect to have to upgrade/replace it again today?

Similarly, if I'd buy a truly high end gaming rig today, I would expect it to last two years without major upgrades and without sacrificing much, if anything, on eyecandy and framerates.
And this only proves your own lack of an ability to predict the future. Major GPU upgrades are on about a 1 year cycle. Until just this last year the CPU MHz race was also fast paced. So anyone who bought a computer 18 months ago would be <i>very</i> likely to be sacrificing eyecandy and framerates on the absolute newest games if they refused to update their hardware.

Heck, just compare an ATI Radeon 9700 Pro to an ATI Radeon X800 or an nVidia GeForce 6800. Besides the obvious framerate differences that you'd see, the pure 3D programming advancements are quite noticeable as well. For that matter just the anti-aliasing algorithms have come a long way in just 18 months.

You'd have to be unimaginably stupid <i>not</i> to expect to have to upgrade to get great image quality and framerates in the absolute latest and greatest games after 18 months. You either upgrade or settle for less than perfect. Less than perfect, by the way, is really not that bad of a choice either.

Besides, putting your faith in the future in the hands of Tim Sweeney is certainly no proof of your own ability to predict the future. "<font color=blue>Well, we are aiming at the kind of PC that we think will be <b>mainstream</b> in 2006.</font color=blue>" (The bold is my emphasis.) He <i>really</i> thinks that <i>most</i> households will have 1024MB of video RAM in just two years? Tim Sweeney may be a 'programming god' (the words of BeyondUnreal contributing editor Twrecks, not me) but he's certainly no psychic, or even good at predicting market trends for that matter.

For a perfect example of this, consider the statement "<font color=blue>The normal maps are typically 2k by 2k.</font color=blue>" Yeah. That makes sense. As it is most people are being limited today by their MONITOR. Unless we either see the prices of monitors change sometime soon (which isn't likely) or we see monitor dots-per-inch numbers change drastically (which is even less likely) then the coolest video card in the world with even a gig of RAM isn't going to mean squat when the game is run on a cheap-arsed 17" CRT monitor, or <i>worse</i>, an <b>LCD</b>. So making normal maps 2K by 2K is going to be incredibly wasted on pretty much everyone.

So then, which makes more sense: 2Kx2K maps that won't mean squat to anyone or 1Kx1K maps that will load 4 times faster (and at the same time use 1/4 of the memory) and look just as good on everyone's monitors? Let me ask you how many times you think <i>anyone</i> will be moving so close to an object that it takes up 100% of their screen and they can actually use the direct 1 to 1 pixel translation from a normal/texture map <i>that</i> large? Anyone? Anyone? Bueller?
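
To put rough numbers on that 4-to-1 claim, here's a back-of-the-envelope sketch in Python. The 4-bytes-per-texel figure and the mip chain overhead are my own assumptions, not anything from Epic:

<pre>
# Back-of-the-envelope texture memory math. Assumes an uncompressed
# RGBA map at 4 bytes per texel and a full mip chain adding roughly
# one third on top -- both assumptions, not UE3 specifics.

def map_megabytes(side, bytes_per_texel=4, with_mips=True):
    total = side * side * bytes_per_texel
    if with_mips:
        total += total // 3  # mip levels sum to ~1/3 of the base size
    return total / (1024 * 1024)

for side in (1024, 2048):
    print(f"{side}x{side}: {map_megabytes(side):.1f} MB")

# 1024x1024: ~5.3 MB, 2048x2048: ~21.3 MB. Doubling the side
# quadruples the storage, hence "load 4 times faster and use
# 1/4 of the memory" for the smaller map.
</pre>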

<pre><b><font color=red>"Build a man a fire and he's warm for the rest of the evening.
Set a man on fire and he's warm for the rest of his life." - Steve Taylor</font color=red></b></pre><p>
 
> I can't think of a single person who said that 64-bit is
>never needed or that it is just marketing.

I know plenty that still claim even today it's purely marketing.

>What your common opponent will however say is that 64-bit
>computing is not needed for the typical consumer, and
>likely will not be for yet another two to three years. You
>still haven't once managed to refute that.

I don't need to refute that, because my point has <b>always</b> been that people buying <b>high end</b> hardware in the first place should seriously consider a 64 bit capable platform, whatever their reason for buying high end:
-because they need it (for gaming, rendering, mathematical simulation, video/audio, ...). Those are the people that will benefit from it first

-because they don't like upgrading their machine every 2 years, and want to hold on to it as long as possible. All the more reason to get a future proof machine.

But I never claimed such a thing for moms and dads running Word and Excel and some email and stuff on Celerons or Durons. Those people don't need Athlon 64's or high end P4(EE's) with GeForce 6800's either! Can't you get that into your thick skull? I never said everyone needs it; I said it makes sense for pretty much everyone buying high end hardware.

(That being said, we'd all be better off if moms and dads started buying 64 bit capable machines as well, since it would free developers much sooner from having to choose between 32 or 64 bit versions, or worse, having to make two versions. Having a near infinite virtual address space instead of a cramped one also leads to better and safer code (see Linus' comments on the subject I linked earlier). Therefore, I hope moms and dads will switch ASAP as well, even though it will benefit us all only in the long run, and maybe me, and the industry as a whole, rather than them. Pretty much like I wanted to see people move to Windows 2000/XP even if '98 was working fine for them.)

>And this only proves your own lack of an ability to predict
>the future

LOL. Be my guest slvr, search through the archives, find some predictions from me and let's see how terribly wrong I was each time. Good hunting! I'll give you a cookie if you find a SINGLE prediction that was way off the mark. I've been posting here for over 3 years, and over 2500 posts, so you should be able to dig something up. Let's see what you can do.

>Major GPU upgrades are on about a 1 year cycle

You claim I can't predict the future, but you can't even predict the past! nVidia and ATI have been more like on an 18-24 month cycle, and this is increasing further. The NV30 (FX) was released in November 2002, the R300 (9x00) in July 2002, almost two years ago, and the X800 is basically a respin of the same chip.

>Besides, putting your faith in the future in the hands of
>Tim Sweeney is certainly no proof of your own ability to
>predict the future. "Well, we are aiming at the kind of PC
>that we think will be mainstream in 2006." (The bold is my
>emphasis.) He really thinks that most households will have
>1024MB of video RAM in just two years?

Of course not. If you apply a tiny bit of common sense you'd understand what he is saying: 1 GB cards will be common as high end gaming options by then, just like 256 MB cards are common now even if not everyone has one.

>Tim Sweeney may be a 'programming god' (the words of
>BeyondUnreal contributing editor Twrecks, not me) but he's
>certainly no psychic, or even good at predicting market
>trends for that matter.

You don't have to be Nostradamus to see the trend; the amount of video RAM almost doubles each year, and has done so since 1998 (maybe earlier, I have no data on those years). From 16 MB in '98 to 256 MB in 2003. This summer we will likely see 512 MB cards. So, are you claiming there won't be 1 GB gaming cards by 2006?
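
In fact you can extrapolate the trend with grade-school math. A minimal sketch, using only the two data points I just gave, so treat it as an illustration, not a forecast model:

<pre>
import math

# Growth rate implied by the two figures above: 16 MB in 1998 and
# 256 MB in 2003 (both numbers from the post, not mine).
doublings = math.log2(256 / 16)            # 4 doublings
per_doubling = (2003 - 1998) / doublings   # ~1.25 years per doubling

# Extrapolate to 1 GB (1024 MB) at the same pace.
years_to_1gb = math.log2(1024 / 256) * per_doubling
print(f"one doubling every {per_doubling:.2f} years")
print(f"1 GB high-end cards around {2003 + years_to_1gb:.1f}")

# A doubling every ~1.25 years puts 1 GB at roughly 2005.5, which
# is why "1 GB by 2006" follows straight from the trend.
</pre>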

>For a perfect example of this, consider the statement "The
>normal maps are typically 2k by 2k." Yeah. That makes
>sense. As it is most people are being limited today by
>their MONITOR.

I'm sure he doesn't have a clue what he is talking about; maybe you should educate him? After all, that demo of his engine looked like total crap to me. I'm sure you could do better.

FYI, 1024 maps are pretty common today, but maybe it hasn't occurred to you that you could actually get closer to an object (like a wall, or a person) so that you don't see the entire map?

= The views stated herein are my personal views, and not necessarily the views of my wife. =
 
I know plenty that still claim even today it's purely marketing.
Prove it.

I don't need to refute that, because my point has always been that people buying high end hardware in the first place should seriously consider a 64 bit capable platform
Then why do you push A64 on people looking for mid-to-low end systems if they don't need it?

All the more reason to get a future proof machine.
All the more reason to get hardware that has seen a few revisions.

Those people don't need Athlon 64's or high end P4(EE's) with GeForce 6800's either! Can't you get that into your thick skull? I never said everyone needs it; I said it makes sense for pretty much everyone buying high end hardware.
Uh huh. You say that <i>here and now</i>, but what about in the past?

(That being said, we'd all be better off if moms and dads started buying 64 bit capable machines as well, since it would free developers much sooner from having to choose between 32 or 64 bit versions, or worse, having to make two versions. Having a near infinite virtual address space instead of a cramped one also leads to better and safer code (see Linus' comments on the subject I linked earlier). Therefore, I hope moms and dads will switch ASAP as well, even though it will benefit us all only in the long run, and maybe me, and the industry as a whole, rather than them. Pretty much like I wanted to see people move to Windows 2000/XP even if '98 was working fine for them.)
Do you ever get tired of being wrong and parroting other people's supposed wisdom? First of all there's the issue of it being completely worthless without an OS for those moms and dads. Second off it wouldn't free a single developer so long as 32-bit systems exist as the majority. (Or likely even as a minority for that matter.) Third a lot of developers make more than two versions of their programs already. Fourth, near infinite virtual address space means nothing to people who don't even max out a 2GB limitation. Fifth, making people upgrade from something that works and meets their needs simply because you see things differently than they do is just plain wrong.

LOL. Be my guest slvr, search through the archives, find some predictions from me and let's see how terribly wrong I was each time. Good hunting! I'll give you a cookie if you find a SINGLE prediction that was way off the mark. I've been posting here for over 3 years, and over 2500 posts, so you should be able to dig something up. Let's see what you can do.
For one thing if you believe that we'll actually have 1GB cards as <b>mainstream</b> by '06 that in and of itself is proof enough for me. However if you want to prove the validity of your other predictions to people then <i>you</i> do it. I'm calling you on <i>this</i> one.

You claim I can't predict the future, but you can't even predict the past! nVidia and ATI have been more like on an 18-24 month cycle, and this is increasing further. The NV30 (FX) was released in November 2002, the R300 (9x00) in July 2002, almost two years ago, and the X800 is basically a respin of the same chip.
Point of view. I consider actual chip retooling to be major. You apparently would rather have the sky and the moon. And they're on 12 month product cycles.

Of course not. If you apply a tiny bit of common sense you'd understand what he is saying: 1 GB cards will be common as high end gaming options by then, just like 256 MB cards are common now even if not everyone has one.
So now that you're challenged you're suddenly saying that you don't take his quote as literal, but as something that has to be interpreted with unexpressed conditions? Yeah. Who's working with common sense now? Of course I should have realized that I had to misinterpret the quote in order to understand what he was <i>really</i> saying! How foolish of me.

You don't have to be Nostradamus to see the trend; the amount of video RAM almost doubles each year, and has done so since 1998 (maybe earlier, I have no data on those years). From 16 MB in '98 to 256 MB in 2003. This summer we will likely see 512 MB cards. So, are you claiming there won't be 1 GB gaming cards by 2006?
I never said that. I said that they won't be <i>mainstream</i> by 2006. Further, I'm saying that because of the limitation of monitors, there really won't be an advantage to continuing a rapid expansion of memory.

I'm sure he doesn't have a clue what he is talking about; maybe you should educate him? After all, that demo of his engine looked like total crap to me. I'm sure you could do better.
I believe that his requirement of 1GB video RAM to play at decent quality settings will be education enough for him. The question is not whether his demo looks like crap, but whether the exact same quality of image could have been obtained without such waste.

FYI, 1024 maps are pretty common today, but maybe it hasn't occurred to you that you could actually get closer to an object (like a wall, or a person) so that you don't see the entire map?
Oh god, get a clue. If the map is 2Kx2K and the screen resolution is 2K, and the human eye can't even pinpoint a difference down to 4 pixels, then you're talking about an object having to fill more than eight times the screen size before anyone can even notice pixelation in the maps. And <i>most</i> maps won't even fill 10% of the screen if you shove your POV right into the object. 1Kx1K maps are more for bragging rights than for actual benefit. 2Kx2K is complete and total waste as no monitor in the world has that kind of image quality to even render it perfectly <i>and</i> even if one existed no human eye could tell the difference between that and a 1Kx1K map.
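
If you want to sanity-check the scale of that argument, here's a rough texel-density sketch. The 1600-pixel-wide display is my assumption; the exact numbers shift with resolution, but the shape of the argument doesn't:

<pre>
# Rough texel-density check -- my own illustration of the argument
# above, assuming a 1600x1200 display (a high-end CRT of the day).

screen_w = 1600  # horizontal pixels; assumption

for map_side in (1024, 2048):
    frac = map_side / screen_w
    print(f"{map_side}x{map_side} map: better than 1:1 texel-to-pixel "
          f"only once the object spans {map_side} px "
          f"({frac:.2f}x screen width)")

# A 2048 map only has unique texels left to show when a single
# object is wider than the whole 1600-px screen; a 1024 map already
# covers two thirds of it. Anything closer is magnification anyway.
</pre>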

I write molecule (electron density, atomic, and thermal ellipsoid maps) and crystal reciprocal space including orientation matrix rendering software for a living. Believe me, I know the limitations of monitors, and especially of LCD flatscreens. I work on a high end 21" CRT just to see things that most of my users never will, and I can honestly say that in the vast majority of cases 1Kx1K maps are severely overdoing it for most objects. Sure, in some extreme cases (mostly walls) it would be useful. Objects however will never need it. Human eyes just aren't that good.

 
Well, I think everyone is getting their panties in a twist over nothing, lol.

I think it's possible that both nVidia and ATI will have mainstream 1GB cards out. Notice I didn't say they'd be in all mainstream homes, but I think it's perfectly alright to think that in 2 years there will be cards in the mainstream segment equipped with 1GB of RAM. I know it's hard to see now, but there are 256MB cards in the mainstream now; heck, even some low-end models have 256MB, though that doesn't mean performance. So hey, if you can get 256MB in the low end now, I don't see why 1GB is such a far-off thing for the mainstream segment in 2 years.

Consider this as well: with the advent of PCI-E, video cards will start doing more than just playing games; they will also allow for offloading video work from the CPU to the GPU, especially in the video editing corner. So this could very well spawn cards with 1GB of RAM on them, where it could be very useful beyond just gaming. It's not so far-fetched to think ATI or nVidia would offer a mainstream ($150-300, depending on your definition of mainstream) card by 2006. No one said early 2006, so it might be the end of the year, but I could see it.

If there is potential to draw in more users with the additional bandwidth provided by PCI-E, then you know the companies will take advantage of it.

I also wanted to comment on the talk about how no one that isn't PC savvy should need to buy an Athlon 64 chip. While I would agree they don't need 64-bit, there are reasons for having it. I know many older people that use their PCs to make home movies from camcorder footage and do a lot of picture editing. Those things can be helped by having a more powerful CPU, which the Athlon 64 is compared to, say, an Athlon XP or Celeron. Heck, in these situations even 1GB of RAM isn't out of the question for people. I do agree that if a person is only surfing the net and emailing, then there is no need to invest extra in an Athlon 64, or a P4E, whatever. But more and more PCs are getting used for more things; more people are becoming PC literate, lol.
 
>Then why do you push A64 on people looking for mid-to-low
>end systems if they don't need it?

1. There are no low end A64 systems.
2. It's the best platform available right now for most things, even regardless of 64 bit

>All the more reason to get hardware that has seen a few
>revisions.

Pfff... like what? nForce 3? A64? Or maybe you meant BX and Pentium 2?

>Uh huh. You say that here and now, but what about in the past?

I've never said anything else if you'd care to listen instead of just picking fights.

>Do you ever get tired of being wrong and parroting other
>people's supposed wisdom? First of all there's the issue of
>it being completely worthless without an OS for those moms
>and dads.

Read it again; you've got extremely poor reading comprehension skills. "since it would free developers <b>much sooner</b>". I will spell it out for you: the sooner 64 bit systems go mainstream, the sooner developers will be able to ignore old 32 bit systems. I am not claiming that will happen this or next year, maybe not even this decade, but every year it takes is a year that could have been avoided. Capiche? Or do I need to make you a drawing?

>Second off it wouldn't free a single developer so long as
>32-bit systems exist as the majority.

Hey, guess what, my point exactly.

>Third a lot of developers make more than two versions of
>their programs already.

So? What the hell does that prove?

>Fourth, near infinite virtual address space means nothing
>to people who don't even max out a 2GB limitation.

Bogus. Read Linus' post again; it seems you don't understand it. If there is a particular paragraph that you fail to grasp, I'll be happy to explain.

>Fifth, making people upgrade from something that works and
>meets their needs simply because you see things differently
>than they do is just plain wrong.

If you get the choice between good and better at the same price, I don't see the issue. I'm not pushing anyone to upgrade, btw. A friggin' ridiculous statement. But hey, you can count to five; I'm impressed.

>For one thing if you believe that we'll actually have 1GB
>cards as mainstream by '06 that in and of itself is proof
>enough for me. However if you want to prove the validity of
>your other predictions to people then you do it. I'm
>calling you on this one.

God almighty, you are tiresome. It was Sweeney's statement, not mine. Call Sweeney on it if you like. I just added some perspective so you might understand what he most likely meant. It's *one* phrase from an interview. Sweeney is no idiot; he doesn't think everyone will have a 1 GB card in 18 months, otherwise why would he even bother to make the engine downscale to work on older hardware as well? Can't believe you want to argue over such a trivial, obvious, moronic "point".

>So now that you're challenged you're suddenly saying that
>you don't take his quote as literal, but as something that
>has to be interpreted with unexpressed conditions

I thought only complete idiots would take it literally the way you interpreted it. Either that, or you think Sweeney is delusional. Did you even read the interview? Which part of this quote didn't you understand: "Basically DirectX 9 cards will be minimum spec,<b> so any DirectX 9 shipping today will be capable of running our game, </b>but probably at reduced detail. If you only have a 256 meg video card you will be running the game one step down, whereas if you have a video card with a gig of memory then you'll be able to see the game at full detail."
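
And what "one step down" plausibly means in VRAM terms is easy to sketch: dropping the top mip level of every texture halves each dimension and quarters the memory. The map count and bytes-per-texel below are my own illustrative figures, not Epic's budget:

<pre>
# Dropping the top mip level of every texture halves each dimension
# and quarters the memory -- a plausible reading of "one step down".
# The resident map count and 4 bytes/texel are illustrative only.

def budget_mb(side, count, bytes_per_texel=4):
    return count * side * side * bytes_per_texel / 2**20

resident_maps = 60  # hypothetical number of maps resident at once
for step, side in enumerate((2048, 1024, 512)):
    print(f"detail step {step}: {side}x{side} -> "
          f"{budget_mb(side, resident_maps):,.0f} MB")

# step 0: ~960 MB, step 1: ~240 MB, step 2: ~60 MB. The same content
# can want a 1 GB card at full detail and still fit a 256 MB card
# one step down.
</pre>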

> How foolish of me.

Indeed.

>The question is not whether his demo looks like crap, but
>whether the exact same quality of image could have been
>obtained without such waste.

*Yawn*. That's a nice question... very interesting. The answer will also be very helpful to anyone playing Unreal 3-based games on a 64 or 128 MB card.

 
Well, I think everyone is getting their panties in a twist over nothing, lol.
**ROFL** That's pretty much the only source of entertainment here on THGC. The computer industry itself certainly hasn't been interesting lately. So what else is there? The thing to keep in mind though is that it's all in fun. Anyone taking it seriously should probably re-examine their life. :)

I think it's possible that both nVidia and ATI will have mainstream 1GB cards out.
You're welcome to think what you like of course. I just don't see it. High end? Maybe. Mainstream? Not very likely. Utterly useless? Definitely.

What disturbs me is not that 1GB could be mainstream, but how utterly wasteful it would be to design around that. You wouldn't even gain anything visually for it. It'd be like nVidia's polygon counts: purely theoretical but never fully utilized because of limitations elsewhere.

Consider this as well: with the advent of PCI-E, video cards will start doing more than just playing games; they will also allow for offloading video work from the CPU to the GPU, especially in the video editing corner.
I'm not sure where you get this idea. Encoding/decoding hardware has existed for a long time as standard PCI cards. Matrox, as useless as they are for most things, has quite a nice line of professional framegrabber cards. Sure, these things could be added to modern graphics cards. But the point is that they <i>could</i> have been added a long time ago. There just aren't enough people who want such a product to justify the cost of development, manufacturing, marketing, and sales. Ask Matrox. They'll tell you. The market is pretty slim and you can already do this with a PCI card and use your favorite graphics card.

If there is potential to draw in more users with the additional bandwidth provided by PCI-E, then you know the companies will take advantage of it.
An interesting opinion, but I have to say that I strongly disagree. They'll just push the theoretical bandwidth numbers in marketing and let products continue to barely use a fraction of that in reality. It's cheaper and just as effective.

While I would agree they don't need 64-bit, there are reasons for having it.
Of course there are reasons. The question is do they need those reasons? My experience has been that <i>most</i> of my friends and family members don't even need a 32-bit PC yet. If the software was still written for 16-bits then a 286 would meet their needs.

But more and more PCs are getting used for more things; more people are becoming PC literate, lol.
I would argue the contrary. From my experience people seem to be growing less and less PC literate as time passes because software is making PCs so easy to use that they simply don't have to understand anything.

 
OK, um, maybe I need to define what PC literate means. By that I mean they use it to do more things, such as multimedia things. I don't know about where you are, but that is a fact I've seen.

As far as the mainstream 1GB cards: hey, I never said they would be useful or not wasteful. Since when do you need 256MB of RAM on a card like the ATI 9200? There is an example of a mainstream, or I'd consider low end, card that comes with 256MB of RAM. It's wasteful and probably laughable, but hey, it's all about marketing.

Sadly it's not a question of whether everyone can take advantage of it; it's a question of whether companies can convince users it's needed. I know you want to think nVidia and ATI only release things that will be super useful and not waste the potential of parts, but I'm afraid that would be wrong. I just don't get why you think it's so ludicrous to think they might up it to 1GB, if anything, just for the marketing.

As far as reasons for having 1GB: it's true that for most jobs you won't see any help. But let me try to clear up what I was trying to point out. Yeah, PCI cards do a fine job for video capture and editing; look at the great ATI All-In-Wonder line. But I do think with PCI-E and the duplex bandwidth, you could see a performance boost for high end video work like that, streaming HD content. Now look, I said high end, let's get that straight, lol. Hey, even today HD cards are scarce but starting to show up; even ATI will be releasing an HD All-In-Wonder. But just consider the possibilities of the extra bandwidth, just more you could offload to the GPU instead of the CPU, if anything helping in multitasking. Hey, we can argue till we are blue in the face, but neither of us really knows what will happen; I just think it's possible, not impossible.

Beyond video work, obviously you can see the potential for 3D rendering, offloading work to the GPU. Another high end application. Now I don't know if 1GB is the answer there, but you don't know that it isn't; we will just have to wait and see. Personally, I don't care. I know for gaming it won't matter much at all, that's been proven in the past, but there are applications, areas I would use it for, that interest me, and I wait to see what happens.
 
I've never said anything else if you'd care to listen instead of just picking fights.
Maybe if you keep telling yourself that one day it will be true.

1. There are no low end A64 systems.
While I cannot refute that statement (because there <i>are</i> no low end A64 systems) I <i>do</i> have to wonder why you said it as though it meant something.

2. It's the best platform available right now for most things, even regardless of 64 bit
That's entirely debatable, and that's the point.

Pfff... like what? nForce 3? A64? Or maybe you meant BX and Pentium 2?
You don't have to make yourself look so ignorant, you know. You know perfectly well that I meant AXP + nForce2 (or god forbid, VIA), or NWC + i865. They meet or exceed most people's needs. Their price is good. They've been through a few update cycles. For the typical SOHO user you can't ask for a better match to their needs.

Read it again, you've got extremely poor reading comprehension skills. "since it would free developpers much sooner". I will spell it out for you: the sooner 64 bit systems go mainstream, the sooner developpers will be able to ignore old 32 bit systems.
Look in a mirror some time. If you'd actually comprehended what I said you would see that my point was that sooner was a moot argument, as nothing will make existing systems that meet their owners' needs magically change. Those systems will still need support. There is no such thing as much sooner in this case. But then obviously you can't comprehend that.

I am not claiming that will happen this or next year, maybe not even this decade, but every year it takes is a year that could have been avoided. Capiche? Or do I need to make you a drawing?
It's funny that you argue "much sooner" and timescales over a decade in the same breath and you think <i>I'm</i> the one having problems seeing things clearly. Could it slice a year or two off of the timescale? Sure. Is that even remotely close to "much sooner"? Hell no. Is that a justification to make people pay more for things that they won't even need? Again, hell no. Just because you have a hard on for 64-bit memory access doesn't mean that everyone needs it or will even benefit from it any time within the lifetime of their PC.

>Third a lot of developers make more than two versions of
>their programs already.

So? What the hell does that prove?
Again with your comprehension problems. Maybe you need to work on that. For your edification it <i>means</i> that many software developers are already quite used to developing for multiple platforms and therefore proves that saving them the work of supporting the 'old' 32-bit architecture doesn't actually save them from anything.

Bogus. Read Linus' post again; it seems you don't understand it. If there is a particular paragraph that you fail to grasp, I'll be happy to explain.
Read it again? If you'd actually link to it maybe I'd read it the first time. 😛 I've read plenty of Linus Q&A in the past though and am quite certain that his intelligence is matched perhaps only by his inability to drag his head out of the clouds and see past theory into reality. He's 100% pure idealism.

<i>Most</i> (and I mean well over 99%) programmers access memory through higher-level functions because <i>most</i> don't use assembly. For them there is no difference between accessing memory on a 286 or an A64. So what possible benefit is there for them? Further, <i>most</i> SOHO users (and again by most I mean well over 99%) don't even come close to Windows' kernel limitations. So for this vast majority, what benefit is there, hmm?
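
To illustrate that abstraction point with a minimal sketch (the specifics are mine, purely for illustration):

<pre>
import sys

# High-level allocation code never mentions pointer width, so it is
# byte-for-byte identical on a 32-bit and a 64-bit interpreter.

buf = bytearray(64 * 1024 * 1024)  # "give me 64 MB" -- no pointers in sight
print(f"allocated {len(buf) // 2**20} MB")
print(f"interpreter pointer width: {sys.maxsize.bit_length() + 1} bits")

# Only the last line can even tell the two apart, and nothing above
# it depends on the answer. Address-space layout tricks like the
# sparse mappings Linus describes live below this abstraction,
# which is exactly where the typical user's code never goes.
</pre>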

If you get the choice between good and better at the same price, I don't see the issue. I'm not pushing anyone to upgrade, btw. A friggin' ridiculous statement.
If it were at the same price there would be no argument. And if you honestly believe that you've <i>never once</i> pushed anyone to upgrade then you should seriously consider being tested to see if you're schizophrenic.

But hey, you can count to five; I'm impressed.
You're impressed far too easily then.

God almighty, you are tiresome. It was Sweeney's statement, not mine.
But you pushed it, supported it in fact, and not once clarified the stipulations that you are now suddenly pulling out of your arse. You <i>made</i> it your statement. If you didn't want to be caught on it then you should have made your distinctions clear at some point in time <i>before</i> now. And thank you for providing yet more evidence of his fallibility.

Basically DirectX 9 cards will be minimum spec, so any DirectX 9 shipping today will be capable of running our game
So will he personally refund ATI Radeon 9600 owners when it won't run at playable framerates on those cards? Or is it possible that he quite clearly has his head up there with Linus and simply can't see reality from that far away? It has been my experience that intelligence and common sense are not in any way directly related, except quite possibly in an inverse relationship.

Can't believe you want to argue over such a trivial, obvious, moronic "point".
And yet you continue to debate back. What does that say about you, hmm?

*Yawn*. That's a nice question... very interesting. The answer will also be very helpful to anyone playing Unreal 3-based games on a 64 or 128 MB card.
The answer is already out there. Game developers have never been limited from using 2Kx2K textures and maps. The visual tests have been run. There is no point. They're only good for bragging rights. They serve no actual purpose. I do work on crystallographic software that works with up to 4Kx4K images. Most of our customers collect 1Kx1K images, and with good reason. Theory and reality are two entirely different places and while support for theory is always nice, a solid base in reality is quite necessary.

 
OK, um, maybe I need to define what PC literate means. By that I mean they use it to do more things, such as multimedia things.
Ah, but you cannot give a contrary definition to a term which has already been defined long ago. (Well, okay, sometimes you <i>can</i>, but then try explaining to a scientist that 'chaos' was defined millennia before and they scoff because that's just their nature.) PC literacy is the knowledge of how to do something, not the ability to do it.

It seems like a little thing, but it's the finer points in life that define the universe. For example a person can do many great things on a computer and still be completely PC illiterate so long as someone who was PC literate made tools for them that were easy to use. Without those tools however their literacy would be their limitation and they would either have to learn or have to settle for not doing as much.

I don't know about where you are, but that is a fact I've seen.
Actually, from my experience the number of people who are using their computer to do more is actually far outpaced by the people who are using their computer to do less. In the past people who wanted to use a computer understood that to do so they had to learn. This made them capable of a great many things because they were PC literate. Now though most people are capable of only doing what software makes it easy for them to do. And they don't bother to learn how to do much of anything because the software makes it so that they don't have to. This means that while they appear to do more major things, in truth the quantity of things that they are capable of doing is greatly diminished because of their dependence upon easy to use tools.

Sadly it's not a question of whether everyone can take advantage of it; it's a question of whether companies can convince users it's needed. I know you want to think nVidia and ATI only release things that will be super useful and not waste the potential of parts, but I'm afraid that would be wrong. I just don't get why you think it's so ludicrous to think they might up it to 1GB, if anything, just for the marketing.
I never said that they wouldn't make such cards for the high end. Mainstream by 2006 though is highly unlikely. And yes, I know how sad marketing is. I fight marketing almost on a daily basis at work. 'Tis the toil of the engineer.

But specifically the graphics industry has nothing to gain by pushing memory expansion that quickly. If they do they are only forcing themselves into a corner. There are far too many other features to push first and the memory expansion has already burst too far in too short of a time for them to let that pace continue unchecked. Hardware limitations alone will pose a considerable problem, as will cost and justifiability.

But I do think with PCI-E and the duplex bandwidth, you could see a performance boost for high end video work like that, streaming HD content.
And as I said, there's no demand for it. They could have done this years ago with even just PCI or AGP. Matrox even tried it once. There was no market to justify the expense of the product. And as it is, most professionals don't even touch anything like an AIW card. Those cards don't deliver an exact frame rate. For that you use frame grabbers, and they're in a whole different class.

But just consider the possibilities of the extra bandwidth, just more you could offload to the GPU instead of the CPU, if anything helping in multitasking.
What you're not understanding is that bandwidth has not been the limitation for this. There has been far more than enough bandwidth available. At any point in time card manufacturers could have offered cards that do just that. They haven't simply because there has been no monetary advantage for them to do so. This feature alone won't sell a product. It never has and likely never will.

Personally, I don't care. I know for gaming it won't matter much at all, that's been proven in the past, but there are applications, areas I would use it for, that interest me, and I wait to see what happens.
The way that game developers are pushing things right now, 1GB will be completely useless. That is not however to say that it won't matter to games. I know that seems contradictory, but it isn't, simply because game developers are overlooking one really cool concept: morphable textures.

Right now, for example, a horde of orcs is drawn using the same model with the same texture. They all look alike. It's boring. It's unrealistic. But that's what we use because of memory limitations.

However, especially with DX9, it would be simple to devise a system where each orc in the horde had a minor random variance from a standard model, and each used a standard texture and normal map and morphed it using that variance into one that fit their new, slightly changed model. You wouldn't be using gigantic textures that the game's visuals wouldn't benefit from. You would just be using a plethora of smaller textures to add more variation to the game.
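
A minimal sketch of the idea, assuming one shared base map and a per-instance seed; the function name and blend rule are my invention, purely to illustrate:

<pre>
import numpy as np

def instance_texture(base, seed, strength=0.15):
    """Tint and perturb one shared base map for a single instance."""
    rng = np.random.default_rng(seed)
    tint = 1.0 + strength * (rng.random(3) - 0.5)        # per-channel tint
    noise = rng.normal(0.0, strength * 0.2, base.shape)  # small detail noise
    return np.clip(base * tint + noise, 0.0, 1.0)

base = np.random.default_rng(0).random((256, 256, 3))    # stand-in base map
horde = [instance_texture(base, seed=i) for i in range(8)]

# Eight visibly different orcs from one authored texture. The catch,
# as noted above, is that each morphed copy still has to live in
# memory -- which is exactly the "waste" developers balk at.
</pre>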

I've been trying to push that concept to game developers since DX8 came out. I have yet to see any takers though. No one wants to waste that much memory, and frankly I can't blame them. Right now the best use would be in a game with a limited number of models, such as The Sims, simply because of the memory limitations in today's cards.

But basically, until game developers consider radical new concepts like that, there won't be much use for 1GB cards in the next two years.

 
>Maybe if you keep telling yourself that one day it will be
>true.

Maybe if you want to make a point instead of hollow accusations you could provide a link where I said otherwise. Put up or shut up.

> I do have to wonder why you said it as though it meant
>something.

Yes it means something. The only x86 64 bit options today are high end, so by definition if you want to argue the usefulness of that feature you are limited to high end. When an AMD64-compatible VIA C3 becomes available, you could argue it's not much more than marketing, and I wouldn't disagree.

>You don't have to make yourself look so ignorant, you know.
>You know perfectly well that I meant AXP + nForce2 (or god
>forbid, VIA), or NWC + i865. They meet or exceed most
>people's needs

they do not represent high end (at least, not the AXP), so they cannot meet the needs of people buying high end to avoid obsolescence as long as possible. Furthermore, the A64 has been on the market for close to a year now, has seen several new steppings, half a dozen chipsets and chipset revisions, while I have not heard of *ONE* serious chipset-related issue (only some BIOS issues regarding specific DIMM support, not unlike Springdale); hardly a brand-new, unproven platform. Springdale is barely 3 months older; does that make the difference? If you think it's wiser to recommend a 3-month-older platform that will be limited to 32 bit apps forever, and think it will prove to be a better bet against obsolescence, I don't share that vision. I'm also wary of first-generation technologies (like PCI-E and DDR2), but by now the A64 has proven itself more than enough to be recommended without second thoughts, IMO.

> Those systems will still need support. There is no such
>thing as much sooner in this case. But then obviously you
>can't comprehend that.

Because it's a simplification. Do I really need to explain everything to you like to a 5 year old? There will be a point in the future where, for instance, game developers will say 'screw the old 32 bit systems, it's not worth our effort supporting them'. Just like they are going to say that about DX7 class hardware or <1GHz CPUs. This will apply to pretty much any software, just like in the 16->32 bit transition. You think Minesweeper requires 32 bits? Well, good luck finding any recent 16 bit software these days; supporting it is just not an issue anymore (and has not been for quite some time, thank GOD), and the same will happen sooner or later with 32 bit. With some classes of software this may happen sooner rather than later (like games); for others it will most likely last until at least the next decade. Is that so hard to understand?

My point is (and has always been) that for this reason alone, the move to 64 bit is already overdue, and definitely not premature like many claim (including Intel and its fanboys). We should have started moving 5 years ago, just like we started moving to 32 bit back in 1985, and so in 5 years from here, 32 bit systems would be a non-issue. Now it's going to bite us, big time, like DOS limitations kept biting us up to ten years after the first 386 was introduced.

> Is that a justification to make people pay more for things
>that they won't even need? Again, hell no.

A 64 bit capable machine isn't more expensive than a comparable 32 bit one. An Athlon 64 2800+ beats an AXP3200+ on just about any benchmark even running 32 bit software, and costs exactly as much on Pricewatch ($2 cheaper even, last time I checked). This is very much unlike the 16->32 bit transition. So no, I have no trouble at all recommending a 64 bit capable machine for this performance class, quite the contrary.

> Just because you have a hard on for 64-bit memory access
>doesn't mean that everyone needs it or will even benefit
>from it any time within the lifetime of their PC.

No, it's because people buy Intel's marketing BS unchallenged that software development will be held back for the next decade or so. That pisses me off, yes.

> For your edification it means that many software
>developers are all ready quite used to developing for
>multiple platforms and therefore proves that saving them
>the work of supporting the 'old' 32-bit architecture
>doesn't actually save them from anything.

Oh, give me a break... You do software development for a living, and you claim supporting extra architectures doesn't cost you anything? That is laughable.

Also, there is a big difference between code you can port with just a recompile, and code you cannot. You cannot just recompile an OS to make it 64 bit. And read Linus' post: you cannot go from dense mapping to sparse mapping with just a recompile, you'd have to rewrite the app. Therefore, it's quite likely a lot of software that has to support both 32 and 64 bit systems will not use sparse mapping.
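
For those who haven't followed the link, here are rough numbers behind that dense-vs-sparse point; this is my own illustration of the argument, not a quote from Linus' post:

<pre>
# Why sparse address-space layouts want 64-bit, in rough numbers.

GB = 2**30
user_va_32 = 2 * GB   # typical 32-bit Windows user address space
user_va_64 = 2**47    # x86-64 user half: ~128 TB of virtual space

# A 'sparse' design might reserve a private 4 GB window per file or
# subsystem so nothing ever needs packing or relocating.
window = 4 * GB
print(f"32-bit: {user_va_32 // window} such windows fit")    # 0
print(f"64-bit: {user_va_64 // window:,} such windows fit")  # 32,768

# Under 32-bit everything must be packed densely into one cramped
# space; moving to a sparse layout later means redesigning the
# application's addressing, not just recompiling it.
</pre>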

>Read it again? If you'd actually link to it maybe I'd read
>it the first time. 😛

Wow, I'm impressed - again. You can actually shoot down an argument without even having read it! Here it is: <A HREF="http://www.realworldtech.com/forums/index.cfm?action=detail&PostNum=2229&Thread=9&entryID=30176&roomID=11" target="_new">http://www.realworldtech.com/forums/index.cfm?action=detail&PostNum=2229&Thread=9&entryID=30176&roomID=11</A>

> am quite certain that his intelligence is matched perhaps
>only by his inability to drag his head out of the clouds
>and see past theory into reality. He's 100% pure idealism.

Well, if you can't argue his points (especially if you haven't read them yet), shoot the messenger. Besides, and this is getting way off topic, I've not read his biography, nor do I collect his autographs, but from what I've read, he is far more practical than idealist. Just two simple examples:
-his recent decision to let Linux contributors sign their code and declare it's their own (to protect Linux from future SCO-like lawsuits). Simple, practical, but definitely not ideological (in fact, highly controversial in the OS community).
-his initial decision to make Linux a monolithic kernel, not a microkernel (which is less elegant, less "pure", but a lot more practical). If he was more academic, and less practical, rest assured it would have been a microkernel-based Minix clone.

> And if you honestly believe that you've never once pushed
>anyone to upgrade then you should seriously consider being
>tested to see if you're schizophrenic

I don't own a computer shop, nor do I have any stocks, so what the hell would I win by having someone upgrade who doesn't need it? Again, you insinuate an accusation that is not only baseless, but also ridiculous. Either prove it, or refrain from it. FWIW, in the last 12 months I've built or configured over a dozen systems; just one of them was an A64, and ten or so Bartons. But then, none of those people were hardcore users or 3D gamers, did any sort of 3D rendering or encoding, and most had constrained budgets.

>And yet you continue to debate back. What does that say
>about you, hmm?

Don't worry. This is my last reply unless you want to argue something relevant. I've made my points, I provided the links, I tried to help and inform some posters in this thread, I've made my contribution. If all you're interested in is questioning my motivation, my mental sanity, the authority in their field of people like Tim Sweeney or Linus Torvalds, and misinterpreting quoted literal statements from them, then be my guest, but I'm not interested anymore.

 
Maybe if you want to make a point instead of hollow accusations you could provide a link where I said otherwise. Put up or shut up.
Funny how I ask you to prove yourself and you don't, but then you expect me to because you're just special? Sorry, but it doesn't work that way.

Yes it means something. The only x86 64 bit options today are high end, so by definition if you want to argue the usefulness of that feature you are limited to high end.
How the hell do you figure? That's the damn worst logic I've ever seen. Well, okay, not <i>the</i> worst, but it's still really bad. That's like saying that you can only compare DX9 video cards to other DX9 video cards because you're only running DX8 games. It just makes absolutely no sense whatsoever. <i>You</i> make absolutely no sense whatsoever. If I wasn't so frustrated with developing a report generation standard I'd have given up on your sorry arse yesterday.

they do not represent high end (at least, not the AXP), so they cannot meet the needs of people buying high end to avoid obsolescence as long as possible.
Since when has avoiding obsolescence become an innate requirement? Besides the fact that the industry itself is constantly changing and you're going to need a new mobo every two years to avoid obsolescence anyway, there's also the simple fact that you <i>can't</i> keep changing your words around by adding more and more stipulations without completely losing all credibility in the debate. (Which, by the way, you have just for that reason alone.)

There will be a point in the future where, for instance, game developers will say 'screw the old 32 bit systems, it's not worth our effort supporting them'.
Well duh. Way to state the obvious. What you fail to explain however is how that far distant date when that actually happens is "much sooner". Stop repeating yourself and start debating or just stop talking all together.

My point is (and has always been) that for this reason alone, the move to 64 bit is already overdue, and definitely not premature like many claim (including Intel and its fanboys).
Uh huh. That's why the variety of x86 64-bit OSes and software, as well as the plethora of x86 64-bit hardware vendors, is so grandiose. It's because the move is <b>overdue</b> and has nothing to do with the move actually being premature. Yeah.

We should have started moving 5 years ago
Hey, now we're even supposed to have started moving to x86-64 before <b>anyone</b> even had a single hardware product for it. Yes! It all makes sense now.

Face it. You have a 64-bit hard on the size of Texas and California combined and your objectivity is shot.

A 64 bit capable machine isn't more expensive than a comparable 32 bit one. An Athlon 64 2800+ beats an AXP3200+
Wow. Can you whip out complete nonsensical BS or what? First of all comparing the AXP3200+ to the A64 2800+ is in itself debatably fair. Even if we overlook that however, there's also the fact that you're still choosing high end systems to justify people saving money by buying a middle to low end 32-bit system. Yeah. You go girl.

No, it's because people buy Intel's marketing BS unchallenged that software development will be held back for the next decade or so. That pisses me off, yes.
Right. And you haven't bought AMD's marketing BS hook, line, sinker, rod, reel, and fisherman. No, of course not. It's so clear that you're being as objective as humanly possible here. Why of course it's because Intel told everyone that they don't need 64-bit hardware yet that people aren't buying 64-bit hardware. It has nothing to do with the unavailability of 64-bit software to justify the high end system that they don't need the performance of. It's completely Intel's marketing and not the needs of the consumer. Yeah.

Oh, give me a break... You do software development for a living, and you claim supporting extra architectures doesn't cost you anything? That is laughable.
Only because:
1) You're too ignorant to know the advantages of PyQt.
2) A lot of ANSI C++ + Qt developers already support multiple architectures just through different compilers. What's one more? Maybe an extra 2-5% of development time for testing. Oh darn.

Heck, before my company dropped support for the DEC Alpha, I used to support that and x86. Other than the little extra time to swivel my chair to face my Alpha PC and start the compile there just after I started it on my x86 box there was no extra time needed. And the Alpha is NOTHING like an x86, even with that crappy x86 emulation.

But of course I obviously know absolutely nothing on the subject and am just laughable.

Also, there is a big difference between code you can port with just a recompile, and code you cannot. You cannot just recompile an OS to make it 64 bit.
Who ever said that you could? Just because Linus wrote Linux (or at least the first kernel) doesn't mean that the only software that ever existed is OSes. I don't know what you're smoking, but please put it out before you respond next time, <i>if</i> you even reply. You're just not even coming close to making sense.

And read Linus' post: you cannot go from dense mapping to sparse mapping with just a recompile, you'd have to rewrite the app
If you use a low level language. If you use a higher level language then you have no problems because you're abstracted from that kind of a concern. But of course you won't take my word for that if for no other reason than because it's my word.

Wow, I'm impressed - again. You can actually shoot down an argument without even having read it!
When you read enough about Linus it really doesn't take much. The guy is simply clueless sometimes. But you're right. Thanks for the read. It was a complete waste of time. Maybe <i>you</i> should read it though, as it has absolutely <i>nothing</i> to do with my argument that people who aren't even using 2GB yet, physical, virtual, or otherwise, will gain absolutely nothing from an A64's better memory handling. But thanks for wasting my time, because as it turns out your poorly thought out rebuttal actually helped my case.

Well, if you can't argue his points (especially if you haven't read them yet), shoot the messenger.
Actually, I don't argue his points at all. I argue your use of his words to back up your nonsense.

I don't own a computer shop, nor do I have any stocks, so what the hell would I win by having someone upgrade who doesn't need it?
So says Mr. Champion Of The A64. Every single suggestion that you give people lately is A64 this and A64 that. Pushing doesn't just mean physical, you know. Your <i>advice</i> counts too, even the advice you give on this forum.

Don't worry. This is my last reply unless you want to argue something relevant.
Hey, it's not my fault that you keep avoiding my points, forcing me to stretch this thing back into place. Your nonsensical rebuttals are the non-relevance. So please, by all means, make that your last reply if that's the best that you can do.

I tried to help and inform some posters in this thread, I've made my contribution.
I find it sad that you count misinformation as a contribution.

misinterpreting quoted literal statements from them
You still haven't explained how taking a quoted statement literally is misinterpreting it. But then that's still only one of the now countless things that you refuse to answer intelligibly.

but I'm not interested anymore
Then either stop replying or stop lying when you say that you're not interested anymore. It's either one or the other. You can't have it both ways. 😛

 
I should know better than to reply but I will nevertheless.

>Funny how I ask you to prove yourself and you don't, but
>then you expect me to because you're just special? Sorry,
>but it doesn't work that way.

You made the accusation, the burden of proof is upon you.

>How the hell do you figure? That's the damn worst logic
>I've ever seen. Well, okay, not the worst, but it's still
>really bad

If you keep replying to short quotes, you can't have a decent discussion that way; in fact, I don't think one can have a decent discussion with you anyway. Let's see:

You said:"Then why do you push A64 on people looking for mid-to-low end systems if they don't need it?" Which:
1) I don't (btw, I asked proof, and somehow *I* should prove that ?)
2) I have alwas said it makes sense for *high end*, I gave you reasons why
3) Since there are no low end 64 bit computers, I could not even have possibly recommended one.

Then you quote one line out of my reply " There are no low end A64 systems." and reply: " I do have to wonder why you said it as though it meant something."

???? WTF ?

So I repeat myself "Yes it means something. The only x86 64 bit options today are high end, so by definition if you want to argue the usefulness of that feature you are limited to high end." to which you reply "That's the damn worst logic I've ever seen."

Seriously, are you retarded?

I'm sorry, I gave up reading the rest of your post. Too bad if there was something interesting in there. You're not debating points; you're debating words and semantics for the heck of it. I suggest you join the fight now between Crashman and Darko in that other thread. Have fun. I've got better things to do with my time.

 
Lol, OK, we can jump around the issue by arguing what literate means, which you said is the knowledge to do something, not the ability to do something. I believe in reference to PCs that's not true. PC literate to me means someone can use a PC to get what they need done, no matter if they know what an MPEG-2 is or not; if they know how to run a program, and know about programs to use to accomplish their goals, they know what they are doing.

I grant you that there are many people that don't know much of anything when it comes to specifics, but that it's getting worse I'd disagree on. I know many younger people that amaze me, a lot more than I knew when I was a kid at the time. Anyway, this was useless to talk about; I hate talking semantics, lol.


Well, OK, your opinion is there will be no mainstream 1GB cards; my opinion is there will. Since we have no hard info to support either conclusion, we will have to leave it at that till we see what happens.

Yeah, I know what you mean about fighting with marketing; us engineers get no love, lol. Have to make it pretty for the consumers, no matter the cost to performance, lol.

I know you say there is no market for the HD content I'm talking about, but you're wrong; there is, and it's slowly growing, so don't tell me it doesn't exist. Heck, I'm part of it, since I do a lot of high end 3D rendering/video work with HD content. And I realize an AIW is not a pro platform, I don't even use one, but I was just pointing out how mainstream video work has become in general. You can even buy AIWs at Walmart now; now that's mainstream, lol.

I totally agree that the tech to offload to the GPU and the bandwidth have been around for years, but I don't see why you think PCI-E won't help in any area at all, allowing simultaneous up and down traffic at the same speed in both directions. Maybe not at this moment, but down the road, that's what I'm saying. Right now, no, because there is nothing to take advantage of the hardware yet. Sorry if I got you confused on that point; I should have made myself clear.

Frankly, I think it's too early to speculate about what will happen in 2006; so much is going on so fast, you may be surprised what comes out. I'm sure you don't discount the possibility completely, stranger things have happened.
 
I know they are talking about high-end servers for the most part, but they do expect 64-bit to catch on quickly.

<b>Microsoft Senior Vice President Bob Muglia, Windows Server Senior Director Jeff Price, and Paul Thurrott of the SuperSite for Windows talk about Windows Server and 64-bit: where they expect it to go and why.</b>


<A HREF="http://www.winsupersite.com/showcase/muglia_winserver.asp" target="_new">http://www.winsupersite.com/showcase/muglia_winserver.asp</A>


Some of the highlights:


<b>Windows Server 2003 64-Bit Extended Systems release</b>


BM: So we think 64-bit is a big deal.

Paul: I haven't kept up on the Itanium stuff as much as I could have, I guess, but I did go to the first 64-bit Windows reviewer's workshop that you had in Mountain View, probably four years ago or so [LINK], and I recall that the big problem, aside from performance, was that the versions of Windows you were coming out with for Itanium weren't complete, didn't include all of the features from the 32-bit versions. And then of course the performance of 32-bit applications was garbage.

JP: Both of those problems are fixed with the new 64-bit platform. If you look at the numbers today on AMD64, and then flash forward 12 months...

Paul: Sure. I mean, you can buy laptops today with an Athlon-64.

BM: You can, you can. It's not Opteron, but it's based on the same technology.

JP: It's only a matter of time before we get to the point where it doesn't make sense for a server vendor to ship anything but 64-bit machines based on AMD64 or Itanium.

Paul: So you think this platform will be bigger on the server at first?

BM: I think we'll start on the server, but it's going to move to the client very quickly.

Paul: So what are these machines looking like today? How many processors can you get in a single box on AMD64?

BM: Four-ways look great. And we'll see a lot of that over the next 12 to 15 months. It will take longer for 8-way to come out. But I think AMD64/Intel EM64T is going to be the volume platform of the future.

Paul: Oh I think so too. I think it's going to happen very quickly. Before Christmas 2005, all [mainstream] PC systems will be 64-bits.

BM: They'll all be enabled by next year. I think it will be huge on the client.

Paul: Absolutely.

BM: One thing we've found is that 32-bit applications run better on the 64-bit OS than they do on 32-bits. Just adding a 64-bit processor and the 64-bit OS changes everything.

Paul: Now what are you comparing there? Are these machines running the same clock speed...

BM: Same everything. Same chips, same everything. We run apps on 32-bit Windows, and then take those same apps and run them on 64-bit Windows, and you'll get about an 8 percent performance improvement on average.


<b>Intel EM64T vs. AMD64</b>


Paul: Are you seeing any difference between AMD's [64-bit] stuff and Intel's stuff?

BM: Yes. [Smiles]

Paul: Would you care to clarify that? [Laughs]

BM: Well, AMD has done a good job ...

[Laughter]

Paul: OK, I realize these companies are both important partners...

BM: I think both have invested very heavily... and I'm sure that customers will be happy with either solution.

Paul: All righty.

[Laughter]

BM: Are there differences? Yes, there are differences.

Paul: OK, so how do these companies differentiate their 64-bit products?

BM: So there are some things that AMD's done that Intel hasn't done, and I'm sure Intel will continue to invest here, and will do a really good job. AMD led the way on this one. There's no doubt they led the way on this one.

Paul: Right, I thought [AMD64] was going to be the orphaned [microprocessor] of the decade, the next Alpha...

BM: Oh I didn't think so. But do you know why I knew? Because of Dave.

Paul: Dave Cutler.

BM: Yeah, Dave's been all over this. Dave worked really closely with AMD to design the chip. He was trying to get something that was really compatible, and the problem that we have is that we want to support all of our applications totally. And these chips are just fantastic for that.

Paul: It's almost like applying the Microsoft model to [chip design]. The Itanium, for all its advantages, just couldn't run the installed base very well.

BM: No, not very well.

Paul: And it never will.

BM: No.

Paul: So back to the core OS benefits, again, where do these figures come from?

BM: This is our own internal testing. It's pretty remarkable what we're seeing, actually.

JP: There are a bunch of address space limitations to 32-bit, and for certain functions, you just can't get enough memory. And with a certain amount of memory, all of those limitations go away.

BM: We tested a whole series of workloads. Some workloads just don't benefit that much from 64-bits, but having a 64-bit OS on there gives you certain advantages. Other workloads--even if the app is 32-bit--you get a huge benefit by running on a 64-bit OS. The most extreme example of that is Terminal Services, because it's limited by the amount of physical memory in the box, in terms of capacity. So even though it's a 32-bit application, you can now run a lot more users simultaneously on the same computer. And these four-ways are blazingly fast.

Paul: These machines we're talking about. Are they out now, or are they coming out next year?

BM: They're out now. They're AMD Opteron systems.

Paul: Physically, what is the limit on RAM in today's Opteron machines?

BM: It's a physical limit based on the number of slots in the machine. I'm not sure what that number is. I'm sure you're going to see 32 GB systems today.

Paul: Compared to 4 GB on 32-bit.

BM: Well, three really. Though we can do more with address extensions. It's funky. Kind of like the old school memory extender stuff.

Paul: Ah yes, the good old days. But wow, 32 GB of RAM this year.

BM: Sure. I mean, we've actually built Itanium systems [at Microsoft], these really big systems, with a terabyte of RAM in them.
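
To make the address-space arithmetic above concrete, here's a minimal C sketch (my own illustration, not anything from the interview) of the wall a 32-bit machine runs into:

<pre>
#include <stdio.h>

int main(void)
{
    /* A pointer can only name as many bytes as its width allows. */
    int bits = (int)(sizeof(void *) * 8);
    printf("Pointer width: %d bits\n", bits);

    if (bits == 32) {
        /* 2^32 bytes = 4 GB of address space, full stop. Devices and
         * the OS reserve chunks of it, which is why the usable figure
         * is "three, really." PAE (the "address extensions" mentioned
         * above) lets the OS see more physical RAM, but each process
         * is still stuck behind the 32-bit wall, hence the old-school
         * memory-extender comparison. */
        printf("Max addressable: %llu bytes (4 GB)\n",
               (unsigned long long)1 << 32);
    } else {
        /* 2^64 bytes is 16 exabytes, so 32 GB or even a terabyte of
         * RAM fits with room to spare, no funky tricks needed. */
        printf("Max addressable: 2^64 bytes (16 EB)\n");
    }
    return 0;
}
</pre>

Compile it once as a 32-bit binary and once as a 64-bit binary and you get both halves of the comparison they're making.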

If I glanced at a spilled box of toothpicks on the floor, could I tell you how many are in the pile? Not a chance. But then again, I don't have to buy my underwear at Kmart.
 
Oww dude, don't stop now! Nothing quite as entertaining as a good boxing match! C'mon man, spank that silver sunburnt surfer silly!

---------------------------------------------------
I am severely limited in what my mind can perceive.
 
<i>I should know better than to reply but I will nevertheless.</i>
**ROFL** Who is more the fool? The fool, or the fool who keeps talking to him even though he knows better?

<i>If you keep replying to short quotes, you can't have a decent discussion that way</i>
You can't have a decent discussion with people who keep <i>conveniently</i> ignoring considerable portions of the conversation either, but that doesn't seem to stop you. Why should this be any different? 😛

<i>in fact I don't think you can with you anyway.</i>
Actually, I'm quite adept at conversing with myself. (Or at least with the guy that drives the body and the other voices in our head anyway.)

<i>So I repeat myself: "Yes it means something. The only x86 64 bit options today are high end, so by definition if you want to argue the usefulness of that feature you are limited to high end." To which you reply: "That's the damn worst logic I've ever seen."</i>

<i>Seriously, are you retarded?</i>
I'm sorry. I didn't realize that I was dealing with a developmentally disabled individual. I'm sorry that you just can't seem to grasp the concept that the very usefulness of a feature to people who don't need it is one of the many questions of life, liberty, and the pursuit of a PC that meets people's needs, whether that feature be x86-64, DX9, or even SATA.

<i>You're not debating points, your debating words and semantics for the heck of it.</i>
I would ask you how it is even possible to use "you're" correctly the first time and not the second time in the very same sentence, but then I already know the answer to that now. So instead I'll simply point out a quote from a previous post that I made to trooper11 in this very thread.
<font color=blue>In reply to:
------------------------------------------------------------
well I think everyone is getting their panties in a twist over nothing lol.
------------------------------------------------------------
**ROFL** That's pretty much the only source of entertainment here on THGC. The computer industry itself certainly hasn't been interesting lately. So what else is there? The thing to keep in mind though is that it's all in fun. Anyone taking it seriously should probably re-examine their life. :)</font color=blue>
I have the distinct impression that one of us has actually been taking this seriously. (And I know that it isn't me.)

Come on man. Wake up. I've only been pulling your leg to get a rise out of you because I'm bored. I mean let's face it, THGC is about as dead as the C=64. If you're taking this seriously then you can't possibly win. (And if you're not taking this seriously then you'll realise that there is no point to winning anyway.)

<pre><b><font color=red>"Build a man a fire and he's warm for the rest of the evening.
Set a man on fire and he's warm for the rest of his life." - Steve Taylor</font color=red></b></pre><p>