Why does the Wi-Fi signal weaken over distance?

the lunchboxx

Hi, I was thinking about Wi-Fi signals and there's something I don't quite understand. I thought maybe one of you guys could explain/clarify it for me. (It's also possible I'm just plain wrong about parts of how the whole Wi-Fi system works; if so, please correct me.)

The question:
Why does the Wi-Fi signal weaken over distance?
The reason I don't understand this might be easier to explain with a little example:

This is how I currently think Wi-Fi works (if anything I say here is incorrect, please correct me):

Device <----- Wi-Fi signal -----> wireless router

Now let's say the device, which is 10 feet away from the router, wants to browse a website, let's say google.com. It sends a signal to the router, asking for [http://www.google.com] in a few bytes via the Wi-Fi signal. The router receives this signal and sends back the Google page, also in bytes (a few more this time, of course ^^).

Now let's say this same process happens, but over a distance of, say, 30 feet, and we'll also place a few walls between the device and the router.
Why will this process now take longer than in the first situation? I do understand that if we place the router 300 yards further away, the signal will simply not be strong enough to send any information to the device or receive any from it. But why will it be slower over a slightly longer distance? If the signal now fails to deliver every bit to the device (which to me would be the most obvious effect of the longer distance), the data received would just be wrong, and the webpage would display incorrectly.

Now how does this work? (Does the router, like, ask for verification of the bits sent?)
 
It's important to understand that the router doesn't actually send bits/bytes wirelessly. It sends an electromagnetic wave, which the Wi-Fi adapter then "decodes" into bits and bytes that the receiving computer can understand.

The reason the bit rate degrades over distance is that the electromagnetic wave sent by the router loses energy the further it travels. This is quite complicated to explain in full, but once the signal has lost energy, it becomes harder to transfer data over it.
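To put a rough number on that "loses energy" idea, here's a minimal Python sketch of the free-space inverse-square law (the Friis path-loss formula). The function name, transmit power (20 dBm) and distances are just illustrative values I picked, not anything from the posts above, and real indoor Wi-Fi loses signal even faster because walls absorb and reflect the wave:

```python
import math

def received_power_dbm(tx_power_dbm, distance_m, freq_hz=2.4e9):
    """Free-space path loss: received power falls with the square of distance."""
    c = 3e8  # speed of light, m/s
    wavelength = c / freq_hz
    # Friis free-space path loss in dB: 20*log10(4*pi*d / wavelength)
    path_loss_db = 20 * math.log10(4 * math.pi * distance_m / wavelength)
    return tx_power_dbm - path_loss_db

# Transmit at 20 dBm (100 mW) and see how much is left at various distances
for d in (3, 10, 30, 100):  # metres
    print(f"{d:>4} m: {received_power_dbm(20, d):6.1f} dBm received")
```

Every extra 10x in distance costs about 20 dB (a factor of 100 in power) even with nothing in the way, which is why the signal that was comfortable at 10 feet can be marginal at 100.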
 
Why does the Wi-Fi signal weaken over distance?

For exactly the same reason your voice can only be heard over a certain distance.
Talk normally, and you can only be heard in the same room. Leave that room, and I can't hear you.
Yell, and you can maybe be heard across the length of a football field.

If I can't hear you properly, I'll ask you to repeat yourself. Or in Wi-Fi terms: resend the packet. Obviously, resending takes more time, and thus the communication (data transfer) is 'slower'.
Wi-Fi is exactly the same as me speaking "1 0 0 1 0 1 1 0 0 0 1 .....", just at a different frequency.

Normal routers can only 'talk' at a certain volume (signal strength). Too far away, and the signal is incredibly weak. Eventually, you can't hear it at all.
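To show why "asking to repeat" makes the link feel slower rather than just broken, here's a toy Python sketch of that resend behaviour. It's my own illustration, not from the posts above: the corruption probabilities are made up, and a real Wi-Fi card uses a frame checksum rather than a coin flip to decide whether a packet arrived intact:

```python
import random

random.seed(1)

def transfer(num_packets, corruption_prob):
    """Total transmissions needed to deliver num_packets packets,
    resending any packet that arrives corrupted (simulated as a coin flip)."""
    sends = 0
    for _ in range(num_packets):
        while True:
            sends += 1
            if random.random() > corruption_prob:
                break  # packet arrived intact, move on to the next one
    return sends

# A weaker signal means a higher chance each packet arrives garbled
for prob in (0.01, 0.20, 0.50):
    sends = transfer(1000, prob)
    print(f"corruption {prob:.0%}: {sends} transmissions for 1000 packets "
          f"(~{1000 / sends:.0%} of best-case speed)")
```

The page still loads correctly in every case; it just takes more and more transmissions per packet as the signal gets weaker, which is exactly the "slower over a little longer distance" effect being asked about.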
 
(Edit: sorry for double posting, having some problems with the quoting system ^^)


So when the router 'says' "0 1 0 1 0 0 1 0" (and seven other bytes, making a total of 8 bytes) to the device, the device might receive "0 ... 0 1 0 0 1 0" (with each byte missing some bits, making a total of 7 bytes). But how does the device know this is not the signal it was supposed to get? How does it know the signal is actually just weak, and that it has to check what it received with the router (or, if the router simply sends the code multiple times, how many times does the router send it)?

And when it has received the signal and starts comparing the multiple copies it has received against one another, how does the device know which one is right?