CPU link width: 8-bit vs 16-bit?

rambo117

Distinguished
Jun 25, 2008
OK, so today I've been screwing around crashing and overclocking my computer, and I came across a setting for the CPU link width that can be set to either 8-bit or 16-bit. What is it? I'd assume 16-bit is better, but I'm not sure and would like some opinions first.

BTW, I finally got my HT speed to 2600MHz!! =]
 
Solution
16 bits will provide more bandwidth, so potentially more performance. However, the wider the bus, the greater the chance of data corruption (error checking and correction should keep it from crashing, but corrupted packets have to be resent, which costs performance), so you may have to lower the HT frequency to compensate. In some cases you may be better off leaving it at 8 bits, but I've not found any problems running at 16 bits (although mine runs at only 2000MHz because that's the maximum it supports).
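For a rough idea of what the width buys you: a HyperTransport link is double-pumped, so per-direction bandwidth works out to link clock × 2 transfers per clock × link width in bytes. Here is a minimal sketch of that arithmetic, assuming the figure shown in the BIOS is the link clock in MHz (the function name is just for illustration):

```python
# Rough HyperTransport per-direction bandwidth estimate.
# Assumes the BIOS "HT speed" value is the link clock in MHz and that the
# link transfers data on both clock edges (double data rate), as HT does.

def ht_bandwidth_mb_per_s(clock_mhz: float, width_bits: int) -> float:
    """Per-direction bandwidth in MB/s for a given HT clock and link width."""
    transfers_per_s = clock_mhz * 1_000_000 * 2   # two transfers per clock
    bytes_per_transfer = width_bits / 8
    return transfers_per_s * bytes_per_transfer / 1_000_000

for clock, width in [(2000, 16), (2600, 16), (2600, 8)]:
    mb_s = ht_bandwidth_mb_per_s(clock, width)
    print(f"{clock} MHz, {width}-bit: {mb_s:.0f} MB/s per direction")
```

By that math a 2600MHz 16-bit link comes to roughly 10.4GB/s each way, double what the same clock gives at 8 bits, which is why dropping to 8 bits only makes sense if the wider link forces you to run a lower HT frequency.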

Thanks, devastator! I'll give it a shot with 16-bit. If it's unstable with my 2600MHz HT OC, then I'll just leave it set to Auto.
 
