Hi, I was thinking about Wi-Fi signals, and there's something I don't quite understand. I thought maybe one of you could explain/clarify this for me. (It's also possible I'm just plain wrong about parts of how the whole Wi-Fi system works; if so, please correct me.)
The question:
Why does the Wi-Fi signal weaken over distance?
The reason I don't understand this might be easier to explain with a little example:
This is how I currently think Wi-Fi works (if anything I say here is incorrect, please correct me):
Device <----- Wi-Fi signal -----> wireless router
Now let's say the device, which is 10 feet away from the router, wants to browse a website, let's say google.com. It sends a signal to the router asking for [http://www.google.com], which only takes a few bytes over the Wi-Fi signal. The router receives this request and sends back the Google page, also as bytes (quite a few more this time, of course ^^).
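Just to make that picture concrete, here's roughly what I mean by "asking for the page in a few bytes". It's only a little sketch I wrote; it skips Wi-Fi completely and just opens an ordinary connection, and the exact byte counts are whatever comes back on the day:

    # Rough sketch: a tiny request goes out, a much bigger reply comes back.
    import socket

    request = b"GET / HTTP/1.1\r\nHost: www.google.com\r\nConnection: close\r\n\r\n"

    with socket.create_connection(("www.google.com", 80)) as sock:
        sock.sendall(request)            # the request really is just a handful of bytes
        response = b""
        while True:
            chunk = sock.recv(4096)      # the reply comes back as many more bytes
            if not chunk:
                break
            response += chunk

    print(len(request), "bytes sent,", len(response), "bytes received")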
Now let's say the same process happens, but over a distance of, say, 30 feet, and we also place a few walls between the device and the router.
Why will this process now take longer than in the first situation? I do understand that if we move the router 300 yards further away, the signal will simply not be strong enough to send any information to the router or receive any from it. But why will it be slower over a slightly longer distance? If the signal now fails to deliver every bit to the device (which to me would be the most obvious effect of the longer distance), the data sent would just be wrong, and the webpage would display incorrectly.
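(Just to show the kind of corruption I'm picturing: a single made-up bit flip is already enough to turn the request into nonsense.)

    # One flipped bit and the text is simply wrong.
    data = bytearray(b"GET http://www.google.com")
    data[4] ^= 0b00000100                          # flip one bit in the fifth byte
    print(data.decode("ascii", errors="replace"))  # prints "GET lttp://www.google.com"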
Now how does this work? (Does the router, like, ask for verification of the bits sent?)
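So people know what I mean by "verification", this is the kind of thing I'm imagining: the receiver checks a checksum over each chunk and the sender tries again if it doesn't match. It's only a toy model I made up to make the question concrete, not a claim about how Wi-Fi actually frames its data:

    import random
    import struct
    import zlib

    def transmit(frame: bytes, corrupt_chance: float) -> bytes:
        """Pretend radio link: sometimes a single bit gets flipped on the way."""
        noisy = bytearray(frame)
        if random.random() < corrupt_chance:
            i = random.randrange(len(noisy))
            noisy[i] ^= 1 << random.randrange(8)
        return bytes(noisy)

    payload = b"GET http://www.google.com"
    frame = payload + struct.pack(">I", zlib.crc32(payload))   # payload + checksum

    attempts = 0
    while True:
        attempts += 1
        received = transmit(frame, corrupt_chance=0.5)
        body, check = received[:-4], struct.unpack(">I", received[-4:])[0]
        if zlib.crc32(body) == check:   # receiver checks the bits arrived intact...
            break                       # ...and would acknowledge; otherwise we retry
    print("delivered intact after", attempts, "attempt(s)")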