Testing Server Speeds

kirklw

Reputable
Aug 22, 2015
I would like to test the speeds of some of the sites I visit most often to see if it makes sense to upgrade my internet plan. If those websites can't send data to an individual user any faster than my current connection can handle, there is no point in upgrading. How could I test whether those servers can do this? Or does anyone have a concrete answer that would make the whole test unnecessary? Thanks! (A rough idea of the test I had in mind is sketched below my specs.)

CURRENT INTERNET:
16 Mbps down
3 Mbps up

CURRENT ROUTER:
1300 Mbps capable

CURRENT WIRELESS CARD:
roughly 600 Mbps capable (USB 3.0)
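
In case it helps frame the question, this is roughly the test I was picturing: a small Python sketch (using the requests library) that times one large download from a given site and works out the rate. The URL is just a placeholder, not a real test file.

import time
import requests

# Placeholder: substitute a large file actually hosted by the site you care about
URL = "https://example.com/some-large-file.bin"

start = time.monotonic()
total_bytes = 0
with requests.get(URL, stream=True, timeout=30) as resp:
    resp.raise_for_status()
    for chunk in resp.iter_content(chunk_size=64 * 1024):  # read 64 KB at a time
        total_bytes += len(chunk)
elapsed = time.monotonic() - start

mbps = total_bytes * 8 / elapsed / 1_000_000
print(f"Received {total_bytes} bytes in {elapsed:.1f} s = {mbps:.1f} Mb/s")

If the number that prints is consistently at or above 16 Mb/s, the server isn't my bottleneck for that site; if it's well below, a faster plan presumably wouldn't help there.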
 

Kewlx25

Distinguished
Browsing doesn't require much bandwidth. Loading tomshardware hits something less than 1 Mb/s when sampled at one-second resolution. Streaming, on the other hand, can consume a lot, especially while buffering.

Load up your favorite bandwidth meter and watch it while you load whatever sites you care about. If you see your bandwidth usage reaching 80% (12 Mb/s) or more, then you can benefit from more bandwidth. If you're under 50% (6 Mb/s), you definitely won't see any benefit.
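
If you'd rather script it than watch a GUI meter, a rough sketch like this (Python with the psutil package, sampling once a second) does the same job; the 16 Mb/s constant is just your current downstream speed:

import time
import psutil

LINK_MBPS = 16  # your current downstream speed

prev = psutil.net_io_counters().bytes_recv
while True:
    time.sleep(1)
    now = psutil.net_io_counters().bytes_recv
    mbps = (now - prev) * 8 / 1_000_000  # bytes -> megabits over a 1 s window
    prev = now
    print(f"{mbps:5.1f} Mb/s down ({mbps / LINK_MBPS:4.0%} of link)")

Run it in one window while you browse or stream in another, and see how often you actually get near that 80% mark.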

The rule of thumb is that most websites can send you data as fast as you can receive or request it. In theory, most websites should load faster than your browser can render them, assuming a 10 Mb/s or faster connection. In practice, latency is the limiting factor more often than bandwidth. If you're streaming video, you need enough bandwidth to buffer quickly and sustain the stream, but even a 1080p stream is only 5-8 Mb/s; the extra headroom just means you don't wait several seconds for it to buffer.
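
To put numbers on the buffering point, here's a back-of-the-envelope sketch; the 6 Mb/s stream rate and 30-second buffer are just illustrative assumptions:

# Assumed: a 6 Mb/s 1080p stream and a 30-second client-side buffer
STREAM_MBPS = 6
BUFFER_SECONDS = 30
buffer_megabits = STREAM_MBPS * BUFFER_SECONDS  # 180 Mb to pre-fetch

for link_mbps in (16, 40, 100):
    print(f"{link_mbps:>3} Mb/s link: {buffer_megabits / link_mbps:4.1f} s to fill the buffer")

So a faster pipe mostly buys you a shorter wait before playback starts, not a better picture.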

Many ISPs have poor peering, and it can seem like YouTube or Netflix can't send the data fast enough, but that is rarely the case. No network is perfect, but 99% of the time YouTube or Netflix is trying to send me data at 1 Gb/s or faster.
 

sirstinky

Distinguished
Aug 17, 2012
+1^

The "theoretical" speed of routers is entirely ambiguous. There are so many variables, as was mentioned above, that limit your data rates. As far as needing more bandwidth- I've found that there's little difference between a 12 mbps and a 24 mbps connection. For a real difference, you'd have to go to a 40 mbps on up to 1 gbps. The big ISP's are now going fiber optic to transmit data to get the best data rates. Although the hardware is capable, the method used to transmit and receive that data isn't. If you've ever watched a YouTube video and it starts good, but begins to render very poorly-constant buffering, resolution changes (going from 1080p to 360, down to 240, and back to 1080) and get the message from Google, "Experiencing Delays?" click the box and see what they mean. ISP's actually do the best they can and most of the time it's just fine, but for heavy streaming like gaming or videos, it's a matter of asking for more cookies than are in the cookie jar.
 
Individual site loading:

I suspect this is just too random to measure accurately; in any case, I highly doubt increasing your ISP bandwidth will have a noticeable effect. If you're struggling with Netflix or another streaming service, that's a different story, but the bandwidth any individual site sends you should be LOWER than your current ISP bandwidth, and it's probably quite random depending on time of day and other factors.

For most individual sites about the ONLY thing I can think of is PAID sites that bump you up to a higher bandwidth allocation if you subscribe.
 
Your question actually is extremely complex even though most people think it is simple.

It greatly depends on how the application works and the distance between the server and you. Many times the distance (i.e., the latency) is the limiting factor well before the bandwidth.

An oversimplified example (you can look up TCP window size if you really want to know the details): say it takes 1 second to send a message to a server and get a reply, and assume the server adds no delay at all, so it's all network time.

Now the server sends 1 byte of data and waits until the client confirms it got it, so you can transfer 1 byte/sec. If it only takes 0.5 seconds to get messages between the devices, your speed goes up to 2 bytes/sec. Of course, the application can send, say, 100 bytes of data per message if it is written that way, so you would get 100 bytes/sec or 200 bytes/sec instead.
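
If you want to play with those numbers yourself, here is the same idea as a tiny Python sketch; the 64 KB window is a classic TCP default, and the rest are just the illustrative figures from above:

# Toy stop-and-wait model: throughput ~= bytes acknowledged per round trip / RTT
def throughput_bytes_per_sec(bytes_per_round_trip, rtt_seconds):
    return bytes_per_round_trip / rtt_seconds

for rtt in (1.0, 0.5):                # the 1 s and 0.5 s round trips above
    for window in (1, 100, 65535):    # 1 byte, 100 bytes, a 64 KB TCP window
        rate = throughput_bytes_per_sec(window, rtt)
        print(f"RTT {rtt:3.1f} s, {window:6d} bytes per trip -> {rate:9.1f} bytes/s")

Halve the round trip and every line doubles, which is why latency matters so much.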

Because you have variable latency, and the application controls how much data is sent in a single message, it tends to be impossible to guess how much bandwidth an application can use without some knowledge of how it transfers data. For example, web pages pull lots of tiny files from many locations, whereas file downloads generally use the largest buffer sizes they can.