News Modern web bloat means some entry-level phones can't run simple web pages, and load times are high for PCs — some sites run worse than PUBG

Status
Not open for further replies.

Neilbob

Distinguished
Mar 31, 2014
Back in my day I had a brief period (very late 90s/early 2000s) of being a website designer, and the overriding mantra was always to keep everything as small and compact as it could possibly be, because the assumption was that the majority of people would be using 56k dial-up connections (which usually equated to about 3.3 KB/sec), with a few lucky sods having 128k ISDN or maybe even 256k ADSL! Any element over 100 KB in size (maybe 250 KB for the very biggest, most important stuff) was out of the question. Perhaps it's time for companies to return to that mindset, even in this age of fibre and hundreds of Mbits, and design efficiently for the lowest common denominator.

We also had to jump through a million hoops to ensure sites worked properly with Internet Explorer 6, but that was a whole other issue. Fond memories they are not.

Those times are long past. I wouldn't have a clue now.
 
Apr 1, 2020
Much of what makes most sites slow and resource-intensive these days is the trackers and garbage that load alongside the page you actually want. For example, this tab uses 67.3 MB of RAM with only tomshardware, Twitter, and futurecdn enabled; with everything allowed, it uses 118 MB. None of those extra sources add pictures, tables, charts, or anything else to the page, they just eat resources. According to Solarwinds Pingdom's report ( https://tools.pingdom.com/ ) for the same page, with everything enabled there are 89(!) requests, and all but 35 of them come from domains other than futurecdn and tomshardware.

And it's not just a problem for low-end hardware or developing countries. It affects anyone who has used dial-up this millennium, or who has to get by on sub-5 Mbps satellite or cellular internet, as many people still do because they have no access to 5G or strong, unthrottled 4G.
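For anyone curious how a request count like that breaks down, here is a rough Python sketch (my addition, not from the post; it assumes the requests and beautifulsoup4 packages are installed, and the URL is only an example). It counts just the resources declared in the static HTML, so it will undercount compared with a browser-based tool like Pingdom, which also sees everything injected by JavaScript:

Code:
# Rough approximation of a Pingdom-style request count: parse a page's static
# HTML and tally how many referenced resources come from third-party domains.
# Resources injected later by JavaScript are invisible here, so the real number
# (Pingdom reported 89 for this article) will be higher.
from collections import Counter
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

PAGE = "https://www.tomshardware.com/"  # example URL; substitute the page you want to check

html = requests.get(PAGE, timeout=30).text
soup = BeautifulSoup(html, "html.parser")

# Crude "first party" check: compare against the last two labels of the hostname.
site = ".".join(urlparse(PAGE).hostname.split(".")[-2:])

domains = Counter()
for tag, attr in (("script", "src"), ("img", "src"), ("link", "href"), ("iframe", "src")):
    for el in soup.find_all(tag):
        url = el.get(attr)
        if not url:
            continue
        host = urlparse(urljoin(PAGE, url)).hostname
        if host:
            domains[host] += 1

total = sum(domains.values())
third_party = sum(n for host, n in domains.items() if not host.endswith(site))
print(f"{total} declared resources, {third_party} from third-party domains")
for host, n in domains.most_common(10):
    print(f"  {n:3d}  {host}")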
 
Feb 15, 2024
Neilbob said:
Back in my day I had a brief period (very late 90s/early 2000s) of being a website designer, and the overriding mantra was always to keep everything as small and compact as it could possibly be ... Those times are long past. I wouldn't have a clue now.
When I think of it I can hear the familiar beep-boop, pause, screech!

I wonder how much of the modern web's horsepower is directed squarely at cross-site tracking and all the greasy undercarriage of web design…
 
Neilbob said:
Back in my day I had a brief period (very late 90s/early 2000s) of being a website designer, and the overriding mantra was always to keep everything as small and compact as it could possibly be ... Those times are long past. I wouldn't have a clue now.
I'm sure a lot of the bloat is from ads on the webpage. Doesn't matter how efficiently you make the page if most of the size is from ads.
 

Sippincider

Reputable
Apr 21, 2020
Our corporate IT had the big idea to block Google on all their devices (security!). Then quickly relented after discovering how many pages, including a few of their own, use Google fonts...

It's very annoying when what should be a straightforward page needs to connect to half the Internet just to render. Why does a page that has nothing to do with Facebook sit and hang, waiting for Facebook?
 

bit_user

Polypheme
Ambassador
Yup. I keep thinking about this whenever I read people saying you don't need more than (insert some ancient, low-spec CPU) for mere web browsing. The web is increasing in complexity, just like any other software. It evolves with the client platforms that use it, and is optimized only to the point where it's usable on the mainstream machines most web developers are probably using.

Sadly, so much of the heft of the modern web is from spyware. Video ads can also bog down a low-spec device terribly. Firefox has an option to disable autoplaying videos, but it often doesn't seem to work.
 

bit_user

Polypheme
Ambassador
founder and former CEO of Discourse Jeff Atwood was quoted as saying Qualcomm is "terrible at their jobs. I hope they go out of business"...because Qualcomm CPUs were 15% behind Apple's.
Good point. I hope someone clapped back with how much slower his Discourse platform is than its competitors. I'll bet it's a lot worse than 15%!
 
On my router I've got the DNS server set to AdGuard DNS ... 94.140.14.14 and 94.140.14.15.
It's a good first line of defense, and it applies automatically to every device that connects to my Wi-Fi or wired network.
https://adguard-dns.io/en/public-dns.html

On most of my computers I also use a custom hosts file to block known malicious and browser-slowing sites.
https://winhelp2002.mvps.org/hosts.htm
It's a bit dated ... 2021, but it works!

I also use Firefox with NoScript and an ad blocker to stop any unwanted ads and trackers that get through the above.
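If you go this route, a quick sanity check is worth doing. Here is a small Python sketch (my addition, with purely illustrative domain names) for confirming that a blocking hosts file or a filtering DNS resolver such as AdGuard DNS is actually taking effect: blocked hostnames should resolve to a blackhole address or not resolve at all.

Code:
# Verify that hosts-file or DNS-level blocking is in effect: known ad/tracker
# hostnames should be blackholed (0.0.0.0 / 127.0.0.1) or fail to resolve,
# while a control domain resolves normally. The domains below are examples
# only, not a vetted blocklist.
import socket

TEST_DOMAINS = [
    "doubleclick.net",       # example ad domain
    "google-analytics.com",  # example analytics domain
    "example.com",           # control: should resolve normally
]

for domain in TEST_DOMAINS:
    try:
        addr = socket.gethostbyname(domain)
    except socket.gaierror:
        print(f"{domain:25s} -> blocked (does not resolve)")
        continue
    if addr in ("0.0.0.0", "127.0.0.1"):
        print(f"{domain:25s} -> blocked (blackholed to {addr})")
    else:
        print(f"{domain:25s} -> NOT blocked, resolves to {addr}")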
 

Neilbob

Distinguished
Mar 31, 2014
I'm sure a lot of the bloat is from ads on the webpage. Doesn't matter how efficiently you make the page if most of the size is from ads.
That's certainly true. It didn't really occur to me, but if there was an ad back then, it was usually a tiny little GIF image. And not more than a couple of them.

Ugh, simpler times. Now I want to sit on my rocking chair on the porch and observe the setting sun while sipping a warm beverage.
 

bit_user

Polypheme
Ambassador
That's certainly true. It didn't really occur to me, but if there was an ad back then, it was usually a tiny little GIF image. And not more than a couple of them.
I remember one of Google's early selling points was how unobtrusive and lightweight its ads were.

Ugh, simpler times. Now I want to sit on my rocking chair on the porch and observe the setting sun while sipping a warm beverage.
LOL, for sure. Just awaiting the coming AI-pocalypse.
 

Findecanor

Distinguished
Apr 7, 2015
Much of what makes most sites slow and resource intensive these days are the trackers and garbage that loads alongside the page you want.
I use the Privacy Badger plugin, which automatically detects and blocks trackers on web pages I visit. I can also block sources of my own choosing.
If I enable the source of the auto-playing videos*, Privacy Badger reports and blocks no fewer than 24 trackers on this article alone. Apparently the player in turn causes several more trackers to be added. If I disable it, the number of detected trackers drops to 16 or 17.

I should also say that I do not block any ad network that does not track me. But unfortunately practically all ad networks do. That is on them.

Also, there are trackers that are not ad networks, and there are a few very nice sites that do show me ads without tracking me.

*) Auto-playing videos on the web should be illegal, IMNSHO
 

nimbulan

Distinguished
Apr 12, 2016
Oh hey, are we finally taking notice of the decades of bloat that have been piling up on the internet? I occasionally run into web pages that basically lock up instantly even on a high-end gaming PC, let alone a phone.
 

PlutoDelic

Distinguished
May 31, 2005
Fifteen years ago, during a presentation at Cisco, there was this one guy who made some (very precise) predictions about bandwidth needs/growth and emerging trends.

He used to describe this phenomenon of bloat as "churn". To this day, there's no better word for it.
 

DougMcC

Commendable
Sep 16, 2021
Patreon load with no restrictions: 7 seconds.
Load with JavaScript enabled only for patreon.com: less than 1 second.
No obvious visual difference in the content.
Stop letting random sites run JavaScript on your dime.
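A rough way to put your own numbers on this is sketched below (my addition, not DougMcC's method; it assumes requests and beautifulsoup4 are installed, and patreon.com is just the example from the post). It measures only network transfer of the base document versus the external scripts it references, with no JavaScript execution or ad auctions, so a real browser will show an even bigger gap:

Code:
# Compare the transfer time of the bare HTML document with the time spent
# fetching every external script it references. Network transfer only: no JS
# parsing/execution, so this understates the difference a real browser sees.
import time
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

PAGE = "https://www.patreon.com/"  # example page

start = time.perf_counter()
resp = requests.get(PAGE, timeout=30)
base_time = time.perf_counter() - start

soup = BeautifulSoup(resp.text, "html.parser")
script_urls = [urljoin(PAGE, s["src"]) for s in soup.find_all("script", src=True)]

start = time.perf_counter()
fetched = 0
for url in script_urls:
    try:
        requests.get(url, timeout=30)
        fetched += 1
    except requests.RequestException:
        pass  # some scripts refuse to load outside a real browser
script_time = time.perf_counter() - start

print(f"Base document:        {base_time:5.2f} s")
print(f"{fetched} external scripts: {script_time:5.2f} s extra")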
 

Lord_Moonub

Commendable
Nov 25, 2021
I find Tom's Hardware crashes and reloads on mobile constantly. I'm sure the trackers and ad load are the reason. At times it is unreadable.
 

snemarch

Distinguished
Feb 2, 2010
Findecanor said:
I use the Privacy Badger plugin, which automatically detects and blocks trackers on web pages I visit ... Auto-playing videos on the web should be illegal, IMNSHO
Privacy Badger stopped "automatically detecting" trackers because it turned out to be a... tracking issue: https://www.eff.org/deeplinks/2020/10/privacy-badger-changing-protect-you-better .

These days, you really should be using uBlock Origin, and (ad-blocking-wise) ONLY uBlock Origin – mixing ad blockers can make the situation worse. (It's fine to throw in NoScript, which I definitely recommend; the initial per-site configuration is a tiny hassle compared to the security and speed you gain.)

Oh, and there's the issue of the Be Very Evil company restricting the APIs available to browser extensions, making ad blockers less effective. If you need a Chromium-based browser, Vivaldi is probably the best bet – they've stated they'll at least attempt to keep the WebRequest stuff available: https://vivaldi.com/blog/manifest-v3-webrequest-and-ad-blockers/
 
The best solution for web bloat is an ad blocker. Writing clean web code is best done with a text editor using pure HTML, CSS and JavaScript, plus whatever server-side language you use, kept up to date. I don't see many doing it this way any more.
True, but I can understand people looking for a no-hassle way to create a website. However, the platform itself should include by default some checks, wizards, etc. to push its users towards optimizing their websites: minifiers, coaching, modular libs and dependency management, code scanning to make sure strict mode is active and enforced... Cutting down 5 MB of JS to 150 KB is a net win performance-wise: smaller code, much faster parsing. Start there and you solve 40% of the problem. Define sane defaults for media management and you solve another 50%.
 
uBlock, and ditch anything Chromium-based. Seriously, people, it's high time to get off the Google train if you value anything resembling privacy. I'd also implore websites to screen their ads more closely, i.e. actually use the consumer-facing side. Most sites don't even realize (or don't care) what drivel they are feeding their readers. And that's not even mentioning the malicious stuff they are often inadvertently hosting...
 