Thanks Matisaro!
Actually, there's something I'd like to know, and you probably have the answer. Is it correct to say:
a) the information runs at 266 MHz between memory (DDR) and chipset
b) the information runs at 266 MHz between chipset and CPU (AMD Tbird or XP)
c) BUT the CPU is clocked at 133 MHz times the multiplier.
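If I've got points a) through c) right, the arithmetic works out like this (a quick Python sketch; the multiplier of 9 is just an example value I picked to land near 1.2 GHz):

```python
# DDR ("double data rate") transfers data on both clock edges,
# so a 133 MHz bus has an effective transfer rate of 266 MT/s
# (usually marketed as "266 MHz").
fsb_clock_mhz = 133
ddr_effective_mhz = fsb_clock_mhz * 2

# The CPU core clock is the actual FSB clock times the multiplier.
multiplier = 9  # example value, not from the post
cpu_clock_mhz = fsb_clock_mhz * multiplier

print(ddr_effective_mhz)  # 266
print(cpu_clock_mhz)      # 1197 -- roughly the "1.2 GHz" Tbird
```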
If this is right, am I wrong in assuming the processor receives data at twice its bus clock, but because of the multiplier (making, let's say, 1.2 GHz) it can "absorb" this 266 MHz? So AMD calculated that the "optimum feed" for its processors is 266 MHz?
A little more deeply: if there is an "optimal ratio" between CPU MHz and FSB MHz (the one that feeds the CPU with exactly the data it can process, nothing more and nothing less), shouldn't it be different for every CPU? Since a 1.2 GHz CPU can't "assimilate" as much data as a 1.6 GHz CPU, its FSB should be lower than that of the higher-clocked CPU.
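The ratio I'm wondering about is easy to compute for the two example CPUs; here's a small sketch (the idea that this ratio is meaningful is my own assumption, not something I've read):

```python
# CPU core clock divided by effective FSB rate, for two example chips.
# If an "optimal" ratio existed, these two numbers should presumably
# differ, since the faster CPU can consume more data per second.
fsb_effective_mhz = 266

for cpu_mhz in (1200, 1600):
    ratio = cpu_mhz / fsb_effective_mhz
    print(cpu_mhz, round(ratio, 2))
# 1200 -> ratio about 4.51
# 1600 -> ratio about 6.02
```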
I assume this optimum is only theoretical and may vary between programs that require different operations, but it seems reasonable. What's more, what is the value of this ratio?
I hope I haven't bored you with all this thinking/meditation.
I have addressed the question to Matisaro, but obviously anyone is welcome to help me.
Thanks in advance!
DIY: read, buy, test, learn, reward yourself!