Archived from groups: comp.dcom.lans.ethernet
I wonder if anyone can give a definitive answer as to why there is a
minimum spacing specified on (some) Ethernet cable: the thick stuff
with markers every 2.5 m, for example, which corresponds to roughly a
tenth of a bit time at 10 Mb/s. There is some mention of it on various
web sites, but the reasons for it are not stated. Maximum lengths and
the like are simple enough to understand: you need to be sure that
collisions are never detected late.

The only reason I can think of for specifying a minimum distance is to
maximise the effect of a collision when two MAUs start transmitting at
the same time. Only I can't see that it would, because they won't
actually start together. If they're both waiting for the line to
become free, the last data going past them will ensure one starts
after the other. So the second will start up at the exact moment the
first one's data arrives, and will experience a zero time-difference
collision; the first will only see the collision one round-trip
propagation delay later. Even if there's an advantage in that (which I
don't understand), it assumes exactly one 2.5 m section of cable
between the two MAUs. But 2.5 m is only a minimum: the spec doesn't
require exact multiples of 2.5 m over hundreds of metres! So I'm
racking my brains as to why it was ever specified at all.
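For concreteness, here is a back-of-envelope sketch of the timing
involved. The velocity factor of 0.77 is an assumption (a typical
figure for thick coax, not something stated in the spec text I have):

```python
# Rough timing for 10BASE5 thick Ethernet.
# ASSUMPTION: velocity factor ~0.77, typical for thick 50-ohm coax.
C = 299_792_458.0          # speed of light in vacuum, m/s
VELOCITY_FACTOR = 0.77     # assumed propagation factor on the cable
BIT_RATE = 10e6            # 10 Mb/s
SPACING = 2.5              # minimum MAU spacing, metres

v = C * VELOCITY_FACTOR    # signal speed on the cable, m/s
bit_time = 1.0 / BIT_RATE  # 100 ns per bit at 10 Mb/s
delay = SPACING / v        # one-way propagation delay across 2.5 m
bits = delay / bit_time    # that delay expressed in bit times

print(f"one-way delay over 2.5 m: {delay * 1e9:.1f} ns")
print(f"equivalent to {bits:.2f} bit times at 10 Mb/s")
print(f"round trip seen by the first MAU: {2 * bits:.2f} bit times")
```

On these assumed numbers a 2.5 m hop is only about a tenth of a bit
time, and even the round trip is a fraction of a bit, so the minimum
spacing seems far too small to matter to collision timing at all,
which is part of what makes the question puzzling.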