mactronix :
I'm assuming that there must be parts of the CPU that are dormant when running 32-bit? So would this mean it uses more power in 64-bit mode? And if so, is it enough to need more volts when running an OC?
Think of it like the difference between a 6-digit calculator and a 12-digit calculator. A 12-digit calculator may use more power than a 6-digit one, but if you've bought a 12-digit calculator do you think it uses any less power when you only put 6-digit numbers into it? No, not really.
mactronix :
How does it affect things as far as bandwidth goes? Again, assuming the app is coded for 64-bit, is it processing the data twice as fast, or just twice as much? And what effect, if any, does this have on the buses?
By far the biggest benefit of 64 bits is that a 64-bit application can use more memory. Any program that NEEDS lots of memory and is limited by the 2GB or 3GB available in 32-bit mode is going to have to do disk I/Os, and since it takes on the order of a million times longer to access data on disk than in RAM, this has a huge performance impact. 64 bits removes the memory ceiling, and for the apps that need it this is a much, much bigger benefit than anything else.
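To put some arithmetic behind that memory ceiling, here's a quick back-of-the-envelope sketch (the 2GB/3GB split is the familiar Windows user/kernel address-space layout; the exact numbers vary by OS and boot options):

```python
# With 32-bit pointers there are only 2**32 distinct byte addresses;
# with 64-bit pointers the ceiling is astronomically higher.
addressable_32 = 2 ** 32            # 4 GiB of address space, total
addressable_64 = 2 ** 64            # 16 EiB of address space, total

# On 32-bit Windows the OS reserves half of that for the kernel by
# default (or one quarter with the /3GB boot switch), which is where
# the 2 GB / 3 GB per-process limits come from.
user_space_default = addressable_32 // 2    # 2 GiB for the app
user_space_3gb = 3 * 2 ** 30                # 3 GiB for the app

print(addressable_32 // 2 ** 30, "GiB total in 32-bit mode")
print(user_space_default // 2 ** 30, "GiB usable by default")
```

So a 32-bit app simply has nowhere to put a working set bigger than a few GB, no matter how much physical RAM is installed, and it ends up paging to disk instead.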
In terms of mere 64-bit calculations, they're few and far between, and since the penalty for splitting a 64-bit arithmetic operation into 32-bit ones is a mere three-fold or so, it's really not worth worrying about. In a spreadsheet, for example, you'd be lucky if one instruction in 100 had to compute a 64-bit number. In 32-bit mode that would mean the difference between 100 instructions and 102 instructions (one of those 100 gets replaced by 3). So the overall effect is pretty negligible.
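If you're curious how a 64-bit add gets split across 32-bit operations, here's a toy Python model (not what the CPU literally executes, but the same idea): add the low 32-bit halves first, then add the high halves plus the carry — roughly two adds and some carry handling in place of one 64-bit add.

```python
MASK32 = (1 << 32) - 1  # low 32 bits

def add64_via_32(a, b):
    """Add two 64-bit values using only 32-bit-wide operations."""
    lo = (a & MASK32) + (b & MASK32)            # add the low halves
    carry = lo >> 32                            # did the low add overflow?
    hi = ((a >> 32) & MASK32) + ((b >> 32) & MASK32) + carry
    # Reassemble, wrapping at 64 bits like real hardware would
    return ((hi & MASK32) << 32) | (lo & MASK32)
```

A couple of spot checks: `add64_via_32(2**32 - 1, 1)` correctly carries into the high half, and `add64_via_32(2**64 - 1, 1)` wraps to 0 just as a 64-bit register would.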
There may be some very specialized apps that are 64-bit intensive, but most run-of-the-mill stuff (including games, which offload most of the heavy crunching to the GPU) doesn't really benefit much at all from the 64-bit arithmetic capabilities of the CPU.