Question Why is actual download speed like 10x slower than the one I get on a speed test?

veeljko23

Reputable
May 24, 2020
336
2
4,685
For example, if my internet speed is 20mbps, when I'm downloading something it's around 2.4mbps at peak, and the same goes for everything: 50mbps on speedtest, actual download speed is 7, or for 100 it's around 11 or 12... Can someone explain to me why?
 

USAFRet

Titan
Moderator
For example, if my internet speed is 20mbps, when I'm downloading something it's around 2.4mbps at peak, and the same goes for everything: 50mbps on speedtest, actual download speed is 7, or for 100 it's around 11 or 12... Can someone explain to me why?
"download", from where?

I suspect this is simply megabits vs. megabytes.

Speedtest and your ISP report in megabits/sec.
Most places you might download from report in megabytes/sec.

8:1 ratio.
2.4 * 8 = 19.2 (close enough to 20)
7 * 8 = 56
12 * 8 = 96.

No speed loss, just a difference in reporting units.
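The 8:1 conversion above can be sketched in a couple of lines of Python (the function name is just for illustration):

```python
# Convert an advertised line speed in megabits/s (what the ISP and
# Speedtest report) to the megabytes/s a download manager displays.
# There are 8 bits in a byte, so divide by 8.
def mbps_to_MBps(megabits_per_second):
    return megabits_per_second / 8

for line_speed in (20, 50, 100):
    print(f"{line_speed} Mbps = {mbps_to_MBps(line_speed):.1f} MB/s")
# 20 Mbps = 2.5 MB/s
# 50 Mbps = 6.2 MB/s
# 100 Mbps = 12.5 MB/s
```

Real downloads land slightly below these figures because of protocol overhead, which is why the thread's observed 2.4 / 7 / 11-12 MB/s are a touch under the ideal 2.5 / 6.25 / 12.5.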
 

Math Geek

Titan
Ambassador
the megabit vs megabyte difference is most likely the main thing you are seeing, as explained above.

the other common issue is the server you are downloading from. just because your connection is 100 Mbps does not mean the server you are downloading from can give you that speed. often they limit speed so everyone can get some; other times network congestion or other "peak time" problems can affect your overall experience.

speedtest checks the max you can do, but many times you won't see those speeds in day-to-day use except from certain sources.

for instance, i can get my full gigabit speed from steam most of the time (100 MB/s+), but MS servers rarely give me more than 15-20 MB/s.
 

veeljko23

Reputable
May 24, 2020
336
2
4,685
the megabit vs megabyte difference is most likely the main thing you are seeing, as explained above.

the other common issue is the server you are downloading from. just because your connection is 100 Mbps does not mean the server you are downloading from can give you that speed. often they limit speed so everyone can get some; other times network congestion or other "peak time" problems can affect your overall experience.

speedtest checks the max you can do, but many times you won't see those speeds in day-to-day use except from certain sources.

for instance, i can get my full gigabit speed from steam most of the time (100 MB/s+), but MS servers rarely give me more than 15-20 MB/s.
Oh, my bad, I just noticed that on the site the measurement is in megabits, while on my internet provider's site it's shown in mb/s. I never cared enough to see the difference, but I understand it completely now. Thanks for the explanation and help.
 

DSzymborski

Titan
Moderator
Oh, my bad, I just noticed that on the site the measurement is in megabits, while on my internet provider's site it's shown in mb/s. I never cared enough to see the difference, but I understand it completely now. Thanks for the explanation and help.

You mean Mb/s or Mbps. Again, if you use "mbps" or "mb/s" then nobody will know what number you're actually talking about, which makes conversations extremely confusing. Precision is extremely important in this area.
 
To a point it makes sense that they use different units, even though it is confusing, since it depends on what hat you put on :)

Network people tend to look at data signalling. This tends to always be some form of binary, so they prefer to use bits. They don't really care what type of data is actually being transmitted. It used to be mostly telephone calls when they first went from analog to digital phones, and this was years before the internet existed, so they didn't really think of it as data transfer. That said, you will also see network people talk about frames/packets per second when they are looking at firewall or switch throughput.

File transfers are mostly concerned with data storage. Things like file sizes are all stated in bytes, so it makes sense that they use this unit. All they really care about is how long it takes to get a file from one location to another. They tend to count only the actual payload part of the packet and ignore a lot of the overhead inside.
This means they do not count overhead used for things like IP addresses and other header information that is wrapped around the actual data. So even though you can generally convert 1 byte to 8 bits, the observed data rate in bytes/sec will be slightly lower than the bit rate divided by 8, because of this extra data they are ignoring that the network people count.
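A rough sketch of how much that header overhead costs a TCP download over Ethernet, assuming a 1500-byte MTU and the minimum (option-free) IP and TCP header sizes; real links and stacks vary:

```python
# Estimate goodput (payload bytes/s) for a TCP download over Ethernet.
MTU = 1500          # bytes of IP packet per Ethernet frame
IP_HEADER = 20      # minimum IPv4 header, no options
TCP_HEADER = 20     # minimum TCP header, no options
ETH_OVERHEAD = 14 + 4 + 8 + 12   # frame header + FCS + preamble + inter-frame gap

payload = MTU - IP_HEADER - TCP_HEADER   # 1460 bytes of actual file data
on_wire = MTU + ETH_OVERHEAD             # 1538 byte-times consumed per frame

def goodput_MBps(line_rate_mbps):
    """Megabytes/s of file data on a line of the given megabits/s."""
    return line_rate_mbps / 8 * payload / on_wire

print(f"{goodput_MBps(100):.2f} MB/s")   # ~11.87 MB/s, not the ideal 12.5
```

That ~5% haircut is roughly why a "100 Mbps" line shows 11-12 MB/s in a download manager rather than a clean 12.5.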
 
A couple of things here.
  • ISPs know about Speedtest, and their networks may deliberately give you the best routing to its servers to make the measured performance look better. The routing between your computer and the actual server you want to talk to won't ever be that ideal.
  • The reason why ISPs advertise in bits per second as opposed to bytes per second is because bits per second is the most accurate value.
    • Data is sent serially to your home, so it's only coming in a bit at a time
    • The transmitter and receiver can only send/receive so many bits in a second
    • When it comes to encoding data in the PHY layer (that is, turning those 0s and 1s into a physical thing), for the purposes of data reliability, extra bits may be used. For example, 8b/10b encoding, used in USB 2.0, SATA, and PCIe before 3.0, uses 10 bits to send 8 bits of data, hence the name. Transmissions with self-clocking capabilities use at least half the bits to embed the clock signal. And finally, some bits may be used to tell the receiver that a transmission has started and has ended.
    • There are also some encoding schemes where trying to translate a packet into 8-bit bytes doesn't really make sense, like those that use Hamming codes.
    • The ISP may not be using the same encoding schemes across the board. Like they may offer a 100Mbps package, but the usable data may vary if it's being sent over a cable vs. over RF.