What's the actual difference between 64-bit and 32?



Since when? I'm running 64-bit Thunderbird right now.
 


Oh, ok, well I will go look on the site again. When I looked last week there was no mention of a separate 64-bit version, and the version I downloaded said it didn't support a 64-bit OS when I installed it.

Mactronix
 
Thanks for the heads up on the 64-bit Thunderbird, Mark, I now have it running also. Minus points for Mozilla for making it so difficult to find, though.
From their home page, no chance; I had to actually search specifically for 64-bit Thunderbird on the net to find it.

Come on guys (yeah, I know they won't be reading this), get your acts together.

Seriously though, it's no wonder people are cautious about switching if even the apps that are 64-bit are so hard to find. Or was I expecting too much to get a choice of 32 or 64 from the home page?

Mactronix
 

Think of it like the difference between a 6-digit calculator and a 12-digit calculator. A 12-digit calculator may use more power than a 6-digit one, but if you've bought a 12-digit calculator do you think it uses any less power when you only put 6-digit numbers into it? No, not really.


By far the biggest benefit of 64 bits is that a 64-bit application can use more memory. Any program that NEEDS lots of memory and is limited by the 2GB or 3GB available in 32-bit mode is going to have to do disk I/Os, and since it takes vastly longer to access data on disk than it does in RAM, this has a huge performance impact. 64-bit mode removes the memory ceiling, and (for the apps that need it) this is a much, much bigger benefit than anything else.
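To see where those ceilings come from, here's a quick back-of-envelope sketch in Python (the 2GB/3GB split described is the 32-bit Windows user/kernel split mentioned above; the helper name is just for illustration):

```python
# How many bytes can an N-bit pointer address?
def addressable_bytes(pointer_bits):
    return 2 ** pointer_bits

GIB = 2 ** 30  # one gibibyte

# A 32-bit pointer can address 4 GiB total, period.
print(addressable_bytes(32) // GIB)   # 4

# On 32-bit Windows that 4 GiB is split between the kernel and the
# process, leaving 2 GB for the app by default (3 GB with a boot switch).
print(addressable_bytes(32) // 2 // GIB)   # 2

# A 64-bit pointer can address 16 EiB -- effectively unlimited today.
print(addressable_bytes(64) // (2 ** 60))  # 16
```

So the 2GB/3GB figures aren't arbitrary: they fall straight out of the size of a 32-bit pointer.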

In terms of mere 64-bit calculations, they're few and far between, and since the performance penalty of splitting a 64-bit arithmetic operation into two 32-bit ones is a mere three-fold or so, it's really not worth thinking about. In a spreadsheet, for example, you'd be lucky if one in 100 instructions had to compute a 64-bit number. In 32-bit mode that translates to 102 instructions instead of 100 (that one instruction is replaced by three). So the overall effect is pretty negligible.
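The splitting itself is just add-with-carry. Here's a minimal Python sketch of the idea: add the low 32-bit halves, carry into the high halves, roughly the handful of operations the three-fold estimate above refers to (the function name and masking style are mine, not from any real compiler):

```python
MASK32 = 0xFFFFFFFF  # low 32 bits

def add64_via_32bit(a, b):
    """Add two 64-bit values using only 32-bit-wide operations."""
    lo = (a & MASK32) + (b & MASK32)                # 1: add low halves
    carry = lo >> 32                                 # carry out of the low add
    hi = ((a >> 32) + (b >> 32) + carry) & MASK32    # 2: add high halves + carry
    return (hi << 32) | (lo & MASK32)                # recombine, mod 2^64

# Matches a native 64-bit add (modulo 2^64):
a, b = 0xFFFFFFFF00000001, 0x00000001FFFFFFFF
assert add64_via_32bit(a, b) == (a + b) & 0xFFFFFFFFFFFFFFFF
```

On real 32-bit x86 the compiler emits something similar (an add followed by an add-with-carry), which is why the cost is a small constant factor rather than anything dramatic.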

There may be some very specialized apps that are very 64-bit intensive, but most run-of-the-mill stuff (including games, which offload most of the heavy crunching to the GPU) doesn't really benefit much at all from the 64-bit arithmetic capabilities of the CPU.