I have a Cisco 3750 connected to a wireless radio on a tower. They connect via Cat5e, and the radio is powered via Power over Ethernet. I have a patch cable between the power injector and the switch, a patch from the power injector to an inline lightning arrestor, and then a cable from the lightning arrestor to the top of the tower where the radio is, all of which totals about 150 ft of Cat5e.
Now that you understand the setup, my problem is that anything over 10 Mbps full duplex starts having connection issues: packet loss or a complete connection drop. I have set both the switch and the radio to Auto/Auto, hard-coded 100 Mbps full duplex on both, and hard-coded 10 Mbps full duplex on both. The only combination that works with zero loss is 10 Mbps full duplex. When I run the cable test from the Cisco while hard-coded to 100 Mbps, the test fails and reports a problem at about 10 meters, which is roughly the distance to the lightning arrestor; however, when hard-coded to 10 Mbps, the test passes.
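For reference, the switch-side commands I'm using look roughly like this (the interface name is just an example; adjust for the actual port facing the injector):

    ! Hard-code speed/duplex on the port facing the PoE injector
    interface GigabitEthernet1/0/1
     speed 100
     duplex full
    !
    ! Run the TDR cable test, then check the results
    test cable-diagnostics tdr interface GigabitEthernet1/0/1
    show cable-diagnostics tdr interface GigabitEthernet1/0/1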
I understand that this points to the lightning arrestor as the problem, and that is always my first check, but my question is:
Why would the connection have a problem at 100 Mbps, but not at 10 Mbps?
Thanks