Speaking as a former owner of a dual Opteron 270 setup (2 x dual-core CPUs at 2.0 GHz, 1 MB L2 cache per core, ccNUMA using two nodes of dual-channel registered ECC DDR1-400), I can offer some first-hand experience here.
The Opteron 200/800 series did not support Cool'n'Quiet because the memory controller is integrated into the CPU (which has its pros and cons) and registered DIMMs need a constant flow of power to work.
Since these new CPUs likely do support Cool'n'Quiet, I'd recommend simply disabling C'n'Q to stop the stuttering.
In addition to this, add the /usepmtimer switch to C:\boot.ini:
http://search.microsoft.com/results.aspx?q=boot.ini+%2Fusepmtimer&l=1&mkt=en-US&FORM=QBME1
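For reference, a boot.ini with the switch applied might look something like this (the OS description and partition numbers below are placeholders from a typical XP install; yours will differ):

```
[boot loader]
timeout=30
default=multi(0)disk(0)rdisk(0)partition(1)\WINDOWS

[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /noexecute=optin /fastdetect /usepmtimer
```

The switch goes at the end of the existing options line for your OS entry; don't add a new line for it.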
ccNUMA really wasn't designed with games in mind; if it is enabled and /usepmtimer is not used at the same time, you will likely see a fair bit of stuttering.
You can run with ccNUMA enabled or disabled in Windows XP, XP x64 Edition, Server 2003 (x64), and Vista (and Linux, etc., too). It depends on what gives you the best performance.
"Node Interleaving" disables ccNUMA: instead of the OS managing memory placement, the 'chipset' interleaves memory across the nodes. It will stutter less, but memory performance won't aggregate as well as it does under ccNUMA. (It's a latency vs. throughput argument.)
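To illustrate the difference, here's a rough Python sketch of the two placement policies. The 4 KB stripe granularity and two-node layout are illustrative assumptions only, not the actual chipset behaviour:

```python
PAGE = 4096   # assumed interleave granularity (illustrative)
NODES = 2     # two-socket Opteron system

def node_interleaved(addr):
    """Node Interleaving: pages alternate between nodes, so bandwidth
    spreads evenly but roughly half of all accesses land on the remote node."""
    return (addr // PAGE) % NODES

def node_local(addr, owning_node):
    """ccNUMA: the OS keeps a process's memory on the node whose CPU
    runs it, so accesses stay local (lowest latency)."""
    return owning_node

# A process pinned to node 0 touching four consecutive pages:
addrs = [i * PAGE for i in range(4)]
print([node_interleaved(a) for a in addrs])   # pages bounce: [0, 1, 0, 1]
print([node_local(a, 0) for a in addrs])      # all local:    [0, 0, 0, 0]
```

The bouncing in the interleaved case is exactly the latency cost you trade for the evenly spread bandwidth.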
The claim that ccNUMA vs. Node Interleaving is a matter of one OS versus another is total garbage. Every Windows kernel from NT 5.1 onwards supports it; it's just that games may not scale well (i.e. stutter) when ccNUMA is used.
Vista will not improve ccNUMA performance in gaming; the technology is years old, well established, and cannot be improved much at all (from an inter-node latency perspective within an OS's kernel).
Just Google "NUMA" and "ccNUMA"; the idea is perhaps a decade old by now.
http://en.wikipedia.org/wiki/Non-Uniform_Memory_Access
And yes, Valve's SMP tests showed greater than 100% scaling!
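For context, "greater than 100% scaling" (superlinear speedup) means going from one core to N cores speeds things up by more than a factor of N, usually because each core's cache now holds a larger share of the working set. A quick back-of-the-envelope check, using made-up frame times rather than Valve's actual numbers:

```python
# Hypothetical single-core vs. quad-core frame times (made-up numbers,
# not Valve's results) to show how >100% scaling is computed.
t_single = 100.0   # ms per frame on 1 core
t_quad = 23.0      # ms per frame on 4 cores

speedup = t_single / t_quad      # more than 4x on 4 cores
scaling = speedup / 4 * 100      # per-core efficiency in percent

print(f"speedup: {speedup:.2f}x, scaling: {scaling:.0f}%")
```

Anything over 100% per-core efficiency means the cores together outran four independent copies of the single-core run.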