r/dataisbeautiful OC: 2 13d ago

[OC] Despite faster broadband every year, web pages don't load any faster. Median load times have been stuck at 4 seconds for YEARS.

[Chart: median web page load times, 2015–present]
24.9k Upvotes

5.9k

u/VivianStansteel 13d ago

I'd love an extension that automatically accepted only the essential cookies and closed the pop-up. That would make my web pages load faster than anything else.

3.4k

u/PGnautz 13d ago

Consent-o-Matic. It doesn't work on all pages, but you can report unsupported cookie banners to get them fixed.

492

u/verhydndrum 13d ago

My guy, you deserve a fucking knighthood for that!

337

u/PGnautz 13d ago

Don't thank me, thank the fine people at Aarhus University instead!

111

u/javier_aeoa 13d ago

Tusen tak Aarhus Universitet then! :D

24

u/realiztik 13d ago

That’s a lotta Tak

→ More replies
→ More replies
→ More replies
→ More replies

36

u/Deciram 13d ago

But is there something for iPhones?? I do most of my web browsing on my phone, and the cookie pop-ups, the chat widgets, and the "join our mailing list" popups make me rage. So many things to close or in the way before I can even start looking.

30

u/oktoberpaard 13d ago

The same extension exists for Safari on iOS. You can find it in the App Store.

→ More replies

46

u/namtab00 13d ago

Sadly not available on Firefox mobile, the only mainstream mobile browser allowing extensions... 😔

11

u/nawangpalden 13d ago

Maybe check Firefox beta?

Otherwise try Iceraven, a fork of Firefox. It's available there.

Kiwi browser also supports Chrome extensions.

→ More replies
→ More replies

7

u/chux4w 13d ago

Bah. Doesn't work for Android Firefox. I'll definitely be getting it for desktop though, thanks.

→ More replies

203

u/LetterBeeLite 13d ago

87

u/PsychotherapistSam 13d ago

Don't forget the uBlock Annoyances list! The internet is so much better with these.

9

u/Similar_Tale_5876 13d ago

I have an option under filterlists/Ads for "EasyList." I don't see anything explicitly called "EasyList cookies." Is Ads/EasyList the thing I should make sure is checked? Or keep poking around?

10

u/Ankivangelist 13d ago

It's in the section called "Annoyances" (toggled closed by default)

→ More replies

21

u/running531 13d ago

What does easylist cookie do?

→ More replies
→ More replies

181

u/2cilinders 13d ago

There is, and it's not I Don't Care About Cookies! Consent-O-Matic is sorta like I Don't Care About Cookies, but instead of simply clicking "accept all" it selects the fewest cookies possible.

15

u/yensteel 13d ago

More companies should optimize their code. Images could run through TinyJPG or use WebP, for example; they're so heavy. I'm also guessing there's a latency aspect as well, and many things are loaded serially.

14

u/rwa2 13d ago

Uh, we do. We just optimize until we hit 4 second load times because customers get impatient and leave if it takes any longer than that.

493

u/SorenKnoxville 13d ago

I believe there is an extension called I Don’t Care About Cookies that serves this function?

577

u/GenuisPig 13d ago

Correct, but it's recently been acquired by Avast. I dropped it as soon as I heard the news.

32

u/steipilz 13d ago

There is a fork on GitHub from before the acquisition.

17

u/CoziestSheet 13d ago edited 11d ago

You’re the real beauty in this post.

219

u/Picksologic 13d ago edited 13d ago

Is Avast bad?

1.0k

u/SniperS150 13d ago

short answer- yes

long answer- yesssssssssssssssssssssss

244

u/FootLongBurger 13d ago

Not to challenge anyone, I'm genuinely curious: why is it bad?

863

u/SniperS150 13d ago

"When Google and Mozilla removed Avast’s web extension from their stores, a scandal broke out which revealed that Avast (who also owns AVG) had allegedly been spying on their users’ browsing data and selling it to corporations for millions of dollars in profit."

That, as well as an auto-installing browser that slows your computer down.

255

u/TheNormalOne8 13d ago

Avast and McAfee both auto-install their extensions. Both are shit.

116

u/solonit 13d ago

Someone link the How to uninstall McAfee antivirus video made by McAfee himself.

28

u/GarsaFwipSitOnMyFace 13d ago

McAfee didn’t uninstall himself

→ More replies

79

u/GoldenZWeegie 13d ago

This is rubbish to hear. Avast and AVG have been my go-tos for years.

Any recommendations on what to swap to?

216

u/mikeno1lufc 13d ago

Use Windows Defender. There is absolutely no need to use anything else.

Also download Malwarebytes. No need to pay for premium; just have the free version ready to go in case your machine gets infected with malware. Malwarebytes is by far the most effective tool for removing most malware.

I work in cybersecurity, and honestly everyone will give you this advice.

Don't even think about Norton, AVG, McAfee, Avast, or any other traditional anti-virus software. Windows Defender is better than all of them by quite a margin.

42

u/Axinitra 13d ago

Windows Defender and Malwarebytes are what I use. A few years ago I bought a one-off lifetime license for Malwarebytes and it's still rolling along and updating automatically, although in this era of recurring subscriptions it seems too good a deal to be true.

25

u/ComradeBrosefStylin 13d ago

Yep, Windows Defender with a healthy dose of common sense. Malwarebytes if you're feeling fancy. Don't download shady files, don't open attachments from senders you do not trust, and you should be fine.

→ More replies

5

u/RodneyRabbit 13d ago

This is what I use too, but if I'm going to sites I'm not sure about, or want to test a new bit of software, I either use a VM or a program called Sandboxie, which is now completely open source and rather good.

9

u/atomicwrites 13d ago

Right, the only situation where you should use third-party AV software is if you're an enterprise IT team that needs to control security across all your computers centrally. And in that case, still don't use McAfee.

→ More replies
→ More replies

171

u/intaminag 13d ago

You don’t really need an antivirus. Windows catches most things now; don’t go to shady sites to avoid the rest. Done.

46

u/tfs5454 13d ago

I run an adblocker and a script-blocking addon. Never had issues with viruses, and I go to SHADY sites.

→ More replies

106

u/MyOtherSide1984 13d ago

For antiviruses? Nothing. Windows Defender does a great job on its own, assuming you're not a complete nincompoop with what you download and such. If you really want, run Malwarebytes once a month. Ultimately, just be smart and you won't run into problems.

Side note: I download some SKETCHY shit on my secondary PC that hosts my Plex server. I see a program that might do something cool and I just go for it, bypassing all of Windows' warnings. Never had any issues. Just don't be stupid by downloading/viewing porn or free movies and shit.

18

u/61114311536123511 13d ago

virustotal.com is fantastic for checking suspicious links and files. It runs the file/link through about 30 different malware checkers and gives you a detailed, easy-to-understand report.

→ More replies

7

u/Leo-Hamza 13d ago

Common sense

→ More replies
→ More replies
→ More replies
→ More replies

21

u/girhen 13d ago

They used to be good.

But now... yeah. Bad.

→ More replies

11

u/Enchelion 13d ago

And Avast just merged with Norton.

11

u/SorenKnoxville 13d ago

I didn’t know that! Thanks for the info.

→ More replies

25

u/wozza365 13d ago

Doesn't that one accept though?

→ More replies

9

u/straightouttaireland 13d ago

Yeah, but that auto-accepts all cookies; I wish there was a way to auto-reject them.

→ More replies

21

u/moorepants 13d ago

consent-o-matic

36

u/lordsmertrius 13d ago

uBlock said it blocked over 1,500 scripts on YouTube last night. Over half of them were for ads, and the YouTube Premium ad still shows somehow. I can't imagine trying to watch without a blocker nowadays.

5

u/EternalStudent07 13d ago

uBlock Origin is what I use. Very configurable.

→ More replies

13

u/zoinkability 13d ago

I believe Ghostery can do that now

→ More replies
→ More replies

276

u/kirkbot 13d ago

what happened in 2019 to make it go up?

223

u/Firstearth 13d ago

Whilst everyone is arguing about latency and JavaScript, this is the thing I'm most interested in: whether the peaks in 2016 and 2019 can be attributed to anything.

130

u/f10101 13d ago

It looks like this is largely due to testing methodology and URL dataset changes.

The source is here I believe: https://httparchive.org/reports/loading-speed?start=2015_10_01&end=latest&view=list

Annotations are indicated on the graphs.

→ More replies
→ More replies
→ More replies

3.7k

u/uncannyinferno 13d ago

Why is it that ads must load before the actual page? Drives me crazy.

3.8k

u/Drach88 13d ago edited 13d ago

Reformed ad technologist here.

First off, many ads are served in something called iframes. An iframe is essentially a separate webpage embedded in the main page, running with its own resources on a separate execution thread from the main page, so even if the main page is bloated with a ton of resources, the content in the iframe will still load.

Secondly, there's typically a ton of JavaScript bloat -- both JavaScript used for page functionality and JavaScript used for ad/tracking functionality. Much JS runs asynchronously (non-blocking), but a lot of it runs synchronously (it blocks other stuff from loading until it's done executing).

Thirdly, the internal dynamics of the operational side of many web publications are torn between internal groups with differing motivations and incentives. Very rarely do those motivations line up to actually create a product that's best for the consumer. Dealing with expansive JavaScript bloat and site optimization is simply a nightmare to push through internally between different teams with different stakeholders.
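For illustration, the whole dance boils down to something like this -- a minimal sketch, not any real vendor's tag (all URLs hypothetical):

```js
// Step 1: the publisher drops in an async loader script.
// async = non-blocking: the page keeps rendering while it downloads.
const loader = document.createElement("script");
loader.src = "https://ads.example.com/loader.js"; // hypothetical ad server
loader.async = true;
document.head.appendChild(loader);

// Step 2: the loader injects an iframe -- effectively a separate embedded
// webpage with its own document and resources, so the ad can render even
// while the host page is still churning through its own bloat.
const frame = document.createElement("iframe");
frame.src = "https://ads.example.com/slot?size=728x90";
frame.width = "728";
frame.height = "90";
document.getElementById("ad-slot")?.appendChild(frame);
```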

1.1k

u/ashrise2050 13d ago

Excellent explanation. I run a site with lots of users and some pretty complex code, but no trackers or ads. Loads in about 1.2 sec

363

u/DesertEagleFiveOh 13d ago

Bless you.

131

u/Dislexeeya 13d ago edited 13d ago

I don't think they sneezed.

Edit: "/s" Can't believe I needed to add that it was a joke...

20

u/randomusername8472 13d ago

I assume that /s is what you type because you sneezed during your comment so... Bless you :)

→ More replies

60

u/ppontus 13d ago

So, how do you know how many users you have, if you have no tracking?

263

u/pennies4change 13d ago

He has one of those page counters from Geocities

22

u/CyborgKungFu 13d ago

I remember putting one of those on my page and refreshing my browser a bunch so my page looked more popular than it really was.

4

u/basafish 13d ago

Good times. Nowadays no one trusts those numbers anymore...

44

u/YaMamSucksMeToes 13d ago

You could easily check the logs; there's likely a tool to do it without tracking cookies.

→ More replies

233

u/Drach88 13d ago edited 13d ago

They probably mean no third-party client-side tracking.

Technically, every time someone loads an asset from your site, your webserver can log the request. This is how analytics were handled in the bad old days -- by parsing first-party server logs to estimate pageviews, unique visitors (i.e. unique IP addresses), etc.

Eventually, someone realized they could sell a server-log-parsing service to boil the raw data down into more usable metrics. Furthermore, they could give the website owner a tiny 1-pixel image hosted on the analytics provider's servers and ask the webmaster to put that dummy image on their site in an img tag, so every visitor's browser sends a request to the analytics provider's server. Instead of parsing the webmaster's server logs, they parse the server logs for that tiny 1-pixel image. This was the birth of third-party analytics. Fun fact -- this is how some marketing email tracking and noscript tracking is still done today.
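The same trick in its modern JS form looks roughly like this (sketch only; the endpoint is hypothetical):

```js
// Classic tracking pixel: requesting a 1x1 image makes the browser hit the
// analytics server, whose access log *is* the pageview record. The image
// itself is never meant to be seen.
const pixel = new Image(1, 1);
pixel.src =
  "https://analytics.example.com/pixel.gif" + // hypothetical endpoint
  "?page=" + encodeURIComponent(location.pathname) +
  "&ref=" + encodeURIComponent(document.referrer);
```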

22

u/Astrotoad21 13d ago

Most interesting thing I’m going to learn today. Thanks!

90

u/Drach88 13d ago

Oh dear God, please go learn something more interesting than adtech. It's a miserable, miserable field full of miserable miserable misery.

I'd recommend binging CGP Grey videos on more interesting topics like:

How to be a Pirate Quartermaster

How to be a Pirate Captain

The Trouble with Tumbleweeds

How Machines Learn

The Better Boarding Method Airlines Won't Use

The Simple Solution to Traffic

Watch even a minute of any of these videos, and I promise you'll learn something exponentially more interesting than my random musings on the history of web analytics.

13

u/MrPBandJ 13d ago

With the internet being a focal point in all of our lives, I think it's very important for people to learn what goes on while they're browsing! We teach people about the local climate, traffic laws, and cultural traditions. Learning "what" happens when you load up a new web page and "why" is very informative. Your brief description of "where" our digital ads/trackers come from was clear and interesting. Maybe working in the industry is miserable, but giving others a glimpse past the digital curtain is an awesome thing!

15

u/PressFforAlderaan 13d ago

fwiw you sound like a very cool person. Cheers!

→ More replies
→ More replies

4

u/kylegetsspam 13d ago

"Fun fact -- this is how some marketing email tracking and noscript tracking is still done today."

Indeed. And it's why your email client probably has a "don't load images by default" option; you should enable it.

→ More replies

41

u/Boniuz 13d ago

Resolve it in your infrastructure, like a normal person

→ More replies
→ More replies

7

u/L6009 13d ago

1.2 seconds...
It's like running the website offline to see the changes you made.

→ More replies
→ More replies

252

u/ShankThatSnitch 13d ago

As a former front end dev for a company's marketing website, I can confirm that speed problems are mostly due to all the JS that loads from the various metrics tools we had to embed. We did everything we could to get better speeds, but eventually hit a wall. Our speeds were amazing if we ran it without the chat bot, A/B testing, Google analytics, Marketo...etc.

91

u/wozza365 13d ago

Can second this. 3rd-party tools that we have no control over are about 3/4 of the total download on our site, including images etc. We've optimised the site to be lightweight and fast, and then these tools literally destroy the performance. The site is lightning fast even on bad connections when using adblock. Optimizely is our biggest pain point: it has a huge amount of JavaScript and takes fucking ages to run on load, adding a second to the load time, and for some A/B tests we have to wait for their shit to load as well, since they don't leave it async.

TL;DR for non-tech people: use an adblocker AND strict tracking protection in your browser (Firefox and Brave have this; not sure about the others). Not only will less of your data be tracked (already a big bonus), but websites will load way faster.

6

u/ShankThatSnitch 13d ago

We used VWO for our A/B testing, but had the same problems. I appreciate that you are a man of culture, going with Firefox.

7

u/Gnash_ 13d ago

How ironic that a service called Optimizely is causing most of your troubles.

5

u/ShankThatSnitch 13d ago

It is. But what the tool optimizes is the sales funnel, not the website: how optimized your path is from entering the site to filling out a form. In the end, that's the most important thing for a marketing site, much to my despair as a developer trying to make a good site.

→ More replies

133

u/zoinkability 13d ago edited 13d ago

Ironically when we were trying to meet Google’s published goals for page and site performance the biggest offender was all Google code. GA, YouTube, GTM, Google Optimize, etc.

54

u/Enchelion 13d ago

Google's web code has always been an absolute mess. It's mind boggling their search algorithm/system remains as good and fast as it does.

45

u/driftingfornow 13d ago

Man their search engine sucks these days.

→ More replies
→ More replies

13

u/ShankThatSnitch 13d ago

Exactly. It is a bunch of shit.

9

u/sir_ramen 13d ago

I use NoScript to block all that out, and the site usually still works. Why is it on the site if it's not needed? Is it simply for marketing and tracking?

7

u/uristmcderp 13d ago

User data is one of their most profitable products.

→ More replies
→ More replies

32

u/Drach88 13d ago

Marketo.... now there's a name I haven't heard for a while...

24

u/ShankThatSnitch 13d ago

Sorry to bring up bad memories.

→ More replies

44

u/Something_kool 13d ago

How can the average internet user avoid your work, respectfully?

67

u/Drach88 13d ago

uBlock Origin Chrome extension.

(Make sure it's uBlock Origin and not uBlock.)

→ More replies

21

u/Denastus 13d ago

If you want to disable JS just install NoScript (Firefox only). You will be surprised how broken a website can actually be.

Edit: running uBlock Origin also helps with page load times.

6

u/Drach88 13d ago

Chrome lets you blacklist/whitelist JS on different domains natively.

→ More replies
→ More replies

3

u/ouralarmclock 13d ago

Me too! Who did you work for? I worked for PointRoll.

3

u/savageronald 13d ago

Cousin! I worked for EyeWonder/MediaMind/Sizmek (before and for a brief time after the PointRoll acquisition).

→ More replies
→ More replies

80

u/robert_ritz OC: 2 13d ago

Gotta log those impressions.

70

u/Erur-Dan 13d ago

Web Developer specializing in marketing content here. We know how to do better, even with the ads, trackers, and other bloat. We just aren't given enough time to optimize. 4 seconds is deemed short enough to not be a problem, so the budget for efficiency just isn't there.

16

u/kentaki_cat 13d ago

4 seconds is insane! I worked for a 3rd-party A/B-testing SaaS company a few years ago, and we used to get shit from our customers when page speeds went above 3 seconds.

Tests usually revealed that there was a plethora of other tracking code that had to be loaded before everything else, while our plugin was loaded async.

But yes, it's never Google or cross-site tracking code, and always the 3rd-party tool where you have direct contact with someone who will listen to you complain.

But of course, if you think it's worth paying us more to implement server-side testing for a few wording A/B tests, I won't stop you.

I'm not in the business anymore, but server-side anything seems to be a lot easier and more common now.

I'm not in the business anymore but server-side anything seems to be a lot easier and more common now.

→ More replies

8

u/BocciaChoc OC: 1 13d ago

Odd, if a website takes me more than 2-3 seconds I generally just leave

→ More replies

30

u/cowlinator 13d ago edited 13d ago

No, sometimes it's much worse when ads load after the actual page. When an ad takes up zero space before loading, you start clicking, and then the ad finishes loading, suddenly takes up space, pushes other content down, and you click the wrong thing.

It's terrible. Don't ever do this, web devs. I will hate you. Everyone will.
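If you absolutely must late-load ads, at least reserve the slot's space up front so nothing jumps when the ad arrives -- a sketch, with made-up selector and sizes:

```js
// Reserve each ad slot's space before the ad loads, so the page doesn't
// reflow under the reader's finger (".ad-slot" and the sizes are made up).
for (const slot of document.querySelectorAll(".ad-slot")) {
  slot.style.minHeight = "250px"; // the slot's known creative height
  slot.style.minWidth = "300px";
}
```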

5

u/Deastrumquodvicis 13d ago

My dad gets irrationally angry when it happens on his phone browser. He’s trying to read something and I hear “QUIT FLIPPING AROUND”

→ More replies

3

u/GIAway 13d ago

Aside from the obvious financial reasons, ads are served based partially on location, so many people that access your closest server get fed the same ads as you; the server recognizes those ads as high-traffic and moves them to a faster cache.

→ More replies

986

u/Chaisz3r0 13d ago

Get an ad blocker and block 3rd party JS.

563

u/ItsDijital 13d ago

3rd party JS has absolutely exploded in the last few years. I don't think most people are even aware of it, but it's not uncommon for some sites to have upwards of 10 different companies loading their junk on each page.

150

u/wozza365 13d ago

3/4 of the total download for our website is 3rd party tools for analytics etc (and our site doesn't even have ads on it). Google, Microsoft, Facebook, Optimizely and others all make an appearance. So 1/4 is the actual website, the content on it, and the frameworks we actually use for development.

19

u/basafish 13d ago

It's almost like a bus with 3/4 of people on it being the crew and only 1/4 are actually passengers.

→ More replies
→ More replies

120

u/Hickersonia 13d ago

Yeah... I had to unblock facebook and google so that certain users in my warehouse could use UPS Campus Ship... wtf

23

u/SaggyFrontButt2 13d ago

What is JS?

61

u/dw444 13d ago

JavaScript: the language that all of the consumer-facing part of the internet, and a considerable amount of the behind-the-scenes part, is written in.

→ More replies

99

u/ar243 OC: 10 13d ago

A mistake

8

u/[deleted] 13d ago

[deleted]

→ More replies

8

u/FartingBob 13d ago

As someone who occasionally starts learning JS, why is it a mistake? Is it the resources it uses, the limitations of the language, or something else bad about it? What is the best replacement option to learn?

41

u/tomius 13d ago

JS === bad is mostly a joke. It has its quirks because it was created very quickly and keeps backward compatibility. But nowadays, modern JavaScript is great to work with.

There's also no other real option for coding on websites.

It's one of the most popular (if not the most) programming languages now, and it's not going anywhere.

It's a good language to learn.

8

u/Avaocado_32 13d ago

what about typescript

10

u/tomius 13d ago

Sure. But it's basically the same. It's actually a superset of JavaScript and transpiles to it.

→ More replies
→ More replies
→ More replies
→ More replies
→ More replies
→ More replies

147

u/Thorned_Rose 13d ago

Yep, it's disturbing to see my adblockers go from blocking a few things on a page to ten things and now it's not unusual to see hundreds. 😕

62

u/KivogtaR 13d ago

Mine tracks how much of the internet is ads (at least I think that's the purpose of that number?) Overall I'm at 34% for the websites I visit. Some web pages are considerably more.

God bless adblockers

46

u/SkavensWhiteRaven 13d ago

Ads are literally a global pollution problem at this point.

The amount of CO2 that could be saved if users were private by default is insane. Let alone the cost to store your data...

28

u/Thorned_Rose 13d ago

We've since installed Pi-hole on our home server and yep, like you, we're seeing about 25-35% of traffic blocked by that. And I still get ads seeping through for my browser adblockers to take care of. It's depressing how much internet bandwidth is wasted.

25

u/romple 13d ago

I thought something was wrong when I set up my pi-hole and literally thousands of requests were being blocked.

But nope... Just an insane amount of periodic requests from telemetry and tracking.

14

u/DiscombobulatedDust7 13d ago

It's worth noting that typically the number of periodic requests goes up when blocked by pi hole, as DNS failure will be seen as a temporary issue that can be retried

5

u/TheEightSea 13d ago

You still get the ads that come from the same domains that serve the real websites. YouTube, for example, doesn't use external domains, just its own. You can't block those using a DNS sinkhole; you need a DOM-level adblocker.

→ More replies

8

u/Tintin_Quarentino 13d ago

Is there a setting inside uBlock Origin we need to toggle to "block 3rd party JS"?

→ More replies

17

u/LillaMartin 13d ago

Can you target JS to block with an ad blocker? Do websites rarely use JS for more than ads?

Just asking in case I try it and suddenly menus disappear on some sites!

29

u/wozza365 13d ago

Firefox and Brave have it built in to block 3rd-party tracking. It's rare that it breaks functionality as a user: these trackers are usually served from a different domain than the one you're on, so the browsers block anything not coming from the same domain or its CDNs.

If you block JS entirely, most websites these days break or lose functionality; it's not recommended, although it does still get you through the paywall on many news sites.

13

u/lupuscapabilis 13d ago

"Do websites rarely use JS for more than ads?"

Yes, JS is used for a huge number of things on websites. Almost any time you click something on a site and it does something without reloading the page, that's JS. And that's just one example.

7

u/WarpingLasherNoob 13d ago

Without JS you wouldn't be able to use 99.99% of websites out there.

→ More replies
→ More replies

87

u/NickSheridanWrites 13d ago

Had an argument along these lines with my IT lecturer way back in 2000. T3 lines were on the horizon and my lecturer was proselytising that one day all loads and file transfers would be instantaneous, failing to account for the fact that we'd just use it to send bigger files and higher quality feeds.

Back then most MP3s were around 4 MB, you'd rarely see a JPEG reach 1,024 KB, most streaming media was RealPlayer, and I had an onion on my belt, which was the style at the time.

9

u/Medford_Lanes 13d ago

Ah yes, nineteen-dickety-two was a fine year, indeed.

8

u/dinobug77 13d ago

There's also the fact that if things happen instantaneously, people don't trust them. Insurance comparison sites are a prime example: users didn't believe they could return accurate quotes that quickly, so a built-in delay of up to 30 seconds was added, which users perceive as enough time for the quote to be accurate. (Can't remember the exact time, but they tested different-length delays.)

On a personal note, I designed a small website and worked with the developer to ensure load times were below 1 second across all devices. When finished, we tested it and it was 0.3 seconds per page. Users were clicking seemingly randomly through the menu items but not completing the form submission. Turns out, even though the hero image and copy changed, they didn't think the site was working properly, so they clicked about and left. We slowed it down to 2-3 seconds per page and people started using the site as expected and completing the form.

TL;DR: people don't trust machines.

9

u/CoderDispose 13d ago

My favorite story along these lines was an old job where we built a webpage for managing large amounts of data. It would save all changes as soon as you made them, but it was so fast people didn't trust it and were complaining there was no save button. I put one on the page. It doesn't do anything but pop up a little modal that says "Saving... complete!"

Complaints dropped off a cliff after that, hehe

→ More replies

546

u/RoastedRhino 13d ago

4 seconds is acceptable, so the more bandwidth there is, the more content sites will push through, up to a few seconds of waiting time.

An interesting analogy: historians have found that most people across history commuted roughly 30 minutes to work. In the very old days, it was a 30-minute walk. Then at some point it was 30 minutes on some slow city trolley. Now it may be 30 minutes on a faster local train, or 30 minutes on the highway. Faster means of transport did not yield shorter commuting times, but longer commutes covered in the same 30 minutes.

103

u/elilupe 13d ago

That is interesting, and reminds me of the induced demand issue with designing roadways.

If a two lane road is congested with traffic, city council decides to add two more lanes to make it a four lane. Suddenly all four lanes will be congested with traffic because when the max load of the roads increased, so did the amount of commuters deciding to take that road.

48

u/bit_pusher 13d ago

Which is why a lot of road designers look to second- and third-order benefits when improving a roadway. You increase highway capacity to improve flow on other, complementary roads.

35

u/gedankadank 13d ago

And despite this, once a region has even a modest population, it's impossible to build your way out of car traffic, due to the way cars scale in the space they require. Eventually, private car routes have to be closed in favour of more space-economic modes of transportation, but most cities stay car-centric for far, far longer than they should, because most people think they want the ability to drive everywhere, not realizing that "everywhere" is then packed with cars and unpleasant to be in.

24

u/tehflambo 13d ago

imo it's: "i don't want to stop driving; i want everyone else to stop driving"

8

u/shiner_bock 13d ago

Is that so unreasonable?

/s

6

u/goodsam2 13d ago

As long as they front the cost for it more directly, which they basically never do.

Roads are insanely expensive considering the amount we have. Such a waste.

→ More replies
→ More replies
→ More replies

7

u/Unfortunate_moron 13d ago

This is oversimplified. Sure, if you only improve one road, it becomes more popular. But if you improve a region's transportation network (improve multiple roads + public transport + walkable and bikeable solutions) then everything improves. Also don't forget that during off peak hours improvements to even a single road make it easier to get around.

Induced demand is real but only up to a point. There isn't some magical unlimited quantity of people just waiting to use a road. It's often the same people just looking for a better option than they had before.

Also don't forget that traffic lights are one of the biggest causes of congestion. Studies in my city predicted a 3x increase in traffic flow and a 95% drop in accident rates by replacing a light with a roundabout. The city has been replacing existing lights with roundabouts and the quarter mile long backups magically disappeared. Induced demand is surely occurring but nobody notices because the traffic problem is solved.

→ More replies
→ More replies

66

u/bobcatsalsa 13d ago

Also more people making those commutes

→ More replies

415

u/XPlutonium 13d ago edited 13d ago

I actually think the reasoning for this is backward.

When the net was slow, websites were light and didn't have much functionality per page, or even across pages. But as 3G and 4G came along, every Tom, Dick, and Harry started making end users download all of ReactJS for two hello-worlds.

So even in large organisations, while they have criteria for optimisation and all, they often don't keep the average user in mind, only the best case, or they have poor accounting methods or even subpar infrastructure, and yet want to pile in features.

(I'm not blaming any company per se, but I want to say this will always be a problem: even in a future with 25G, where some company makes you able to teleport to a new location, there will still be at least a 2-3 second load time.) In a sense, better speeds enable heavier tech, which then needs even more speed, and so on.

229

u/meep_42 13d ago

This is exactly right. We have found the optimal waiting-vs-functionality tradeoff for a webpage is ~4 seconds. Advances in computing or bandwidth don't change this, so functionality will increase to meet this threshold.

40

u/Section_Hiker 13d ago

It's a happier client base when response times are consistent.

9

u/TheFuzzball 13d ago

"The probability of bounce increases 32% as page load time goes from 1 second to 3 seconds." - Google/SOASTA Research, 2017.

Is this optimal 4-second time across the board, or is it a maximum target on a low-powered mobile device using 4G?

If it’s 4 seconds in the worst case, it’s probably quite reasonable (up to 2 seconds) on a desktop/laptop with a more reliable connection.

If it’s 4 seconds on desktop/laptop, the maximum on mobile could be many multiples of 4 seconds due to performance (e.g. you’re throwing all the same stuff that loaded on a fast dev machine at a 4 year old android phone), or network latency or bandwidth.

97

u/Sininenn 13d ago

Tolerable =/= optimal, fyi.

It would be optimal for the loading time to be below a second, so no time is wasted waiting for a website to load.

Just because people tolerate the 4 second wait does not mean it is the best case scenario...

And no, I am not complaining that 4 seconds is too long.

78

u/Fewerfewer 13d ago edited 13d ago

"It would be optimal for the loading time to be below a second"

That would be optimal for the user, but the company is evaluating "optimal" on more than one criterion (development cost, fanciness, UX, etc.). The comment above you is saying that 4s is the apparent break-even point between these "costs" for the company: any longer and the user won't care how cool the website is, they'll leave or be upset. But any faster, and the typical user won't care much and so there's no point in spending extra development time (money) or paring down the website features in order to hit <1s.

→ More replies
→ More replies
→ More replies

66

u/Skrachen 13d ago

13

u/privatetudor 13d ago

And yet I still had to wait 3s for out.reddit to redirect me. Modern web is painful bloat.

→ More replies

17

u/rogueqd 13d ago

The same thing exists for roads. Building wider roads to relieve traffic causes people to buy more cars and the traffic stays the same.

→ More replies

76

u/spiteful-vengeance 13d ago

When we wrote HTML back in the '90s and early 2000s, it was like writing a haiku. Over 100 KB was a mortal sin.

Website devs these days take a lot of liberties with how they technically build, and, for the majority, there's very little emphasis placed on load time discipline.

A badly configured JS framework (for example) can cost a business money, but devs are generally not in touch with the degree of impact it can have. They just think "this makes us more productive as a dev team".

Source: am a digital behaviour and performance analyst, and, if you are in your 20s, I was writing HTML while you were busy shitting your nappies.

24

u/Benbot2000 13d ago

I remember when we started designing for 800x600 monitors. It was a bright new day!

12

u/spiteful-vengeance 13d ago

I distinctly remember thinking frames were amazing. On a 640x480.

8

u/retirementdreams 13d ago

That was the size of the screen on my first Mac color laptop (PowerBook 180c) with the cool trackball, which I paid like $3,500 for, lol.

→ More replies
→ More replies

14

u/sudoku7 13d ago

And that's why a lot of modern changes are happening in the webpack and tree-shaking space. Get rid of the parts of the kitchen sink you don't need, and all that.
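E.g., with ES modules a bundler can drop whatever you don't import -- a sketch (lodash-es is the real ESM build of lodash):

```js
// Tree-shakeable: a named ESM import lets the bundler keep only debounce.
import { debounce } from "lodash-es";

// Not tree-shakeable: this drags the entire library into the bundle.
// const _ = require("lodash");

const onResize = debounce(() => console.log("resized"), 200);
window.addEventListener("resize", onResize);
```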

25

u/spiteful-vengeance 13d ago edited 13d ago

Yeah, it can be done right, but there's a distinct lack of business emphasis on why it's important and how important it is.

From a technical perspective this understanding is usually left to the devs, but their goals are very different in terms of the priority placed on load time.

They tend to take the approach that 5 seconds on their brand-new i7, 32 GB machine with super internet is good enough, but when I tell a business that every extra second costs money, and that people are using shitty mobile devices, there's generally a bit of a freak-out.

→ More replies

15

u/wozza365 13d ago

Personally I disagree. In my experience devs are brutally aware of bad performance but have limited power, because they either don't get the time investment to solve it, or it's out of their control due to 3rd-party tracking and ad tools being far more bloated than they should be.

If Google, Facebook and a few others all cut their tracking tools to be half the size, this line would drop literally overnight. They are on basically every single website now. They're tracking your porn searches, your dildo purchases, your favourite subreddits and they're A/B testing whether you prefer blue or green buttons.

Performance is a big thing in front-end frameworks right now too; they're all focusing on it, and some businesses are well disciplined. We don't have a strict KB limit, but we rarely use 3rd-party packages (outside of our main framework), and those we do use have to meet reasonable size requirements. But the impact is limited due to the 3rd-party tracking we have to have, with no option for alternatives, because the business people use those tools.

7

u/spiteful-vengeance 13d ago edited 13d ago

Yeah, I've seen some dev teams do it right, don't get me wrong, and they are an absolute joy to work with. It's more that a) they are greatly outnumbered by less attentive teams and b) they still generally don't have the measurement frameworks and business acumen to fully comprehend how important it is.

The good thing about letting the business know about that importance (and how it translates to a $ value) is that they will let/encourage/force these development teams to really focus on it, and understand that the marketing team adding a million 3rd-party tracking scripts actually costs more money than it generates.

→ More replies
→ More replies

10

u/XPlutonium 13d ago

I agree with this wholeheartedly :)

PS: also a relatively experienced dev here, been in it for 15 years now. Kids chasing React and the newest shiniest object every 2 months make me think "ah shit, here we go again". I guess some things don't change, from Drupal to jQuery to (whateverthelatestshitis).

→ More replies
→ More replies
→ More replies

147

u/ThePracticalDad 13d ago

I assume that's simply because more content and hi-res images are added, offsetting any speed gains.

83

u/IronCanTaco 13d ago

That and the fact that websites themselves are bloated to no end with over-engineered stacks which depend on what is popular at the moment.

26

u/czerilla 13d ago

I'm fairly sure it's just Wirth's law in action.

9

u/IronCanTaco 13d ago

Hm, didn't know about this. Thanks. Will use it in a meeting someday when they want more frameworks, more libraries, more pictures... sigh.

→ More replies
→ More replies
→ More replies

138

u/space_iio 13d ago edited 13d ago

Median load on which websites? The top 10 most popular? Just 10 random websites?

200

u/robert_ritz OC: 2 13d ago

httparchive uses the CrUX corpus, a total of 4.2 million URLs. Post.

Point taken and I think I'll update my blog post.

17

u/plg94 13d ago

If you're going to say "download speeds have gone up", it'd be nice to show that in the same graph (and latency, as someone else mentioned). Also, has the download speed of the servers that measured the loading times improved?

8

u/robert_ritz OC: 2 13d ago

I couldn't find global internet download speed data over the same time period. I tried several sources, but none were openly licensed, or they stopped in 2017.

I hoped it was clear that the internet has gotten faster.

→ More replies

10

u/[deleted] 13d ago edited 13d ago

[removed]

7

u/tomius 13d ago

Intimate ransom moment ❤️

→ More replies

12

u/jbar3640 13d ago

Web pages nowadays are just crap:

- tons of JavaScript for almost no purpose
- loads of analytics, ads and other spyware
- idiotic cookie management, annoying newsletter pop-ups, non-requested small videos running in random places, etc.

I would love navigating simple HTML pages with a small amount of useful CSS, and maybe a little JavaScript for small, useful use cases...

139

u/DowntownLizard 13d ago edited 13d ago

Latency doesn't change, though. Not to mention the server processing your request has nothing to do with your internet speed. There are multiple back-and-forth round trips before it even starts to load the page, like making sure it's a secure connection or that you are who you say you are. It's going to take even longer if it needs to hit a database or run calculations to serve some of the info. It's why a lot of websites use JavaScript and such, so you can refresh a portion of the page without loading an entire new page; it helps speed up load times when you let the browser do most of the work. Every time you load a page you are conversing with the server.

Edit: A good point was made that I was unintentionally misleading. There have been protocol optimizations that avoid a lot of the back and forth, and more bandwidth does help you send and process more packets at a time. There are a few potential bottlenecks that render extra bandwidth useless, however (server bandwidth, your router's max bandwidth, etc).

I was trying to speak to the unavoidable delay caused by the distance between you and the server more than anything. If I had to guess, on average there's at least 0.25 to 0.5 seconds of aggregate time spent waiting for responses.

Also, it's definitely the case that the more optimized load times are, the more complex you can make the page without anyone feeling like it's slow.
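You can see the split yourself with the standard Navigation Timing API (paste into the browser console after a page loads):

```js
// Where the load time went, via the standard Navigation Timing API.
const [nav] = performance.getEntriesByType("navigation");
console.log("DNS lookup:", nav.domainLookupEnd - nav.domainLookupStart, "ms");
console.log("TCP + TLS :", nav.connectEnd - nav.connectStart, "ms");
console.log("TTFB      :", nav.responseStart - nav.requestStart, "ms"); // latency + server work
console.log("Download  :", nav.responseEnd - nav.responseStart, "ms"); // the bandwidth-bound part
console.log("Total     :", nav.loadEventEnd - nav.startTime, "ms");
```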

9

u/bigots_hate_her 13d ago

That’s not fully correct.

While many hits to the server may be necessary, modern communication protocols try to mitigate latency by exploiting the wider bandwidth that we have access to and sending many packets in parallel to avoid the back and forth required by older protocols. Some protocols even keep a connection alive which means the initial handshake is avoided for subsequent requests.

Furthermore, higher overall bandwidth decreases the time packets spend in queues inside routers which results in further latency reduction.

→ More replies

72

u/CadmiumCopper 13d ago

ITT: people that really don't understand the difference between latency and bandwidth

16

u/Clitaurius 13d ago

but "faster" internet /$

→ More replies
→ More replies
→ More replies

82

u/redpaloverde 13d ago

It’s like adding lanes to a highway, traffic stays the same.

29

u/dgpx84 13d ago

Indeed, because the only true limit on it is humans' tolerance for misery, which is unsurprisingly quite constant. Page load times would actually increase if it wouldn't be too detrimental to viewership, because if they could make it even slower, they could hire even sloppier developers and double the number of ads.

4

u/DerJuppi 13d ago

Not just traffic, but the max speed of your car also doesn't change when expanding the highway. Your car usually has to make a few trips on the empty road before the page begins loading.

27

u/onedoor 13d ago

It's feature bloat. A good example is old reddit and new reddit. Old reddit takes a third to half the time to load. Sometimes much less than that.

→ More replies

8

u/aheadwarp9 13d ago

This is why I block trackers and ads as much as possible... Not only are they annoying, but they are hurting my productivity with their bloat.

7

u/PM_ME_LOSS_MEMES 13d ago

Websites have also gotten more and more full of bullshit JS nonsense and trackers

17

u/gw2master 13d ago

I interpret this as: users are willing to tolerate about 4 seconds of load time so as technology increases, webpages will increase in complexity until they hit this (rough) threshold.

→ More replies

23

u/ctnguy OC: 16 13d ago

I wonder if latency plays a part in this? It's probably as important to initial load times as bandwidth is. And latency hasn't changed that much in recent years.

9

u/SexWomble 13d ago

I imagine so. After all, a web page isn't the download of a single thing. It's lots of small downloads, each depending on the previous one, so latency is very important.

10

u/Garaleth 13d ago

Latency from one side of the world to the other can be as low as 100ms.

I wouldn't expect it to ever surpass 1s.

3

u/EmilyU1F984 13d ago

That's if the website were a single file. It is not. It's many different requests to different servers, and they happen consecutively.

And that adds up to 4 seconds on average.

Thing is: those websites are definitely not bandwidth-limited. You could download a 12.5 MB website in a second on 100 Mbit/s, from a bandwidth point of view.

But it's a couple of MB taking 4 seconds. So it's most definitely latency.
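A sketch of why consecutive requests dominate (URLs hypothetical): with ~100 ms per round trip, ten dependent requests cost ~1 s no matter how fat the pipe is.

```js
// Dependent requests pay the round-trip latency once each...
async function loadSerially(urls) {
  for (const url of urls) {
    await fetch(url); // the next request can't start until this one returns
  }
}

// ...while independent requests in parallel pay it roughly once in total.
async function loadInParallel(urls) {
  await Promise.all(urls.map((url) => fetch(url)));
}
```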

→ More replies
→ More replies

5

u/jtmonkey 13d ago

Yes, but look at the experience you're having now. You've got video headers and real-time updates, add-to-cart buttons under 8 images from all angles in hi-res. The load time stays the same because that's the target: we build pages to load in that time using the maximum amount of tools at our disposal.

5

u/Fluffigt 13d ago

Studies show that the average user disengages after a 3-second wait, meaning that after 3 seconds they lose interest and close the page if it hasn't loaded. This is of course just an average, but we use it as a benchmark when testing our apps. If the user ever waits more than 3 seconds, we need to improve performance. (Websites are apps.)

3

u/abrazilianinreddit 13d ago

Oh hey, my website average is below that! I see this as a major victory!

3

u/dislocated_dice 13d ago

It’s almost like corporations are using advances in technology to spam you with more ads.

I’m still mildly surprised that it’s been that long with “no progress” though. And what was the dramatic cause of the drop in 2017?

→ More replies

5

u/TheDevilsAdvokaat 13d ago

I'll bet this is because of ads and snoopers and trackers.

Some pages load more than 500 pieces of crap along with the page itself.

4

u/BizarroMax 13d ago

I'm assuming it's because back in 1994, when we wrote web pages, you opened vi, wrote some HTML, and the web page was 947 bytes long. Now I go to web pages and the source is a hopelessly obfuscated spiderweb of frameworks, advertising platforms, user-tracking cookies and pixels, and other garbage; it's not even remotely human-readable any more. Plus, modern website design is intended to present only very limited clickable whitespace for when you're trying to get focus back on the browser, and they serve background ads in that space to generate unintended clicks, which makes them money.

I love me some capitalism, but web design and for-profit journalism are a pox on its house.

→ More replies

3

u/boocap 13d ago

Part of it is that the speeds aren't really faster. Your latency hasn't changed. The pressure in the pipe is the same as it's been for a decade; the pipe itself is just a lot wider, so something tiny like HTML code really doesn't benefit. The number of devices and large-file download speeds are much higher, but a ping in 2012 vs 2022 really hasn't changed.

→ More replies

4

u/Cacachuli 13d ago

This graph should go back farther in time if the data are available. As an old timer I can tell you that current load times are just fine. I would love to compare them with load times in the 1990s when you would literally watch pictures slowly loading from top to bottom.

4

u/pab_guy 13d ago

Most web devs are not great at optimization, and it's not something that the business cares about as long as you are under some predefined metric (ideally based on user testing and bounce rates).

That said, there is huge room for improvement across the board. I used to optimize page loads and could take a 3 or 4 second page down to 250ms with the right techniques.

Use CDN and reference popular CDN hosted versions of libraries that are much more likely to already be in browser cache.

Optimize imagery. This means using vector SVGs where possible, but even then, if you crack open an SVG you will find all kinds of needless metadata that can bloat the file 2-3x.

Lazy load content, especially stuff that is hidden in menus or below the 'fold'.

Making a dozen API calls on page load? No. Consolidate them into a single payload. If using microservices you can create a "UX API" or "Experience API" abstraction layer that makes local microservice calls with much lower latency on behalf of the client and returns all the results into a single payload.

So much they could do, but don't because bandwidth is cheaper than really good developers.
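A sketch of that consolidation (endpoint names are made up, not any specific API):

```js
// Before: the client pays a full round trip per microservice.
// const [user, cart, recs] = await Promise.all([
//   fetch("/api/user"),
//   fetch("/api/cart"),
//   fetch("/api/recommendations"),
// ]);

// After: one call to a hypothetical "Experience API" that fans out to the
// microservices server-side, where latency is microseconds, not tens of ms.
const res = await fetch("/api/experience/home");
const { user, cart, recommendations } = await res.json();
```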

→ More replies

5

u/kemek 13d ago

As speed increases, devs build richer pages...

→ More replies

15

u/spicypants54 13d ago

Try it again with a PiHole enabled.

25

u/TheOneTrueTrench 13d ago

Virtually no one has gotten "faster" internet in the last decade. Hell, I just upgraded from 50 Mbps internet to 1 gigabit, and it's not "faster" at all. It's broader.

Let's say you're looking at a map, and you notice that there's a 2-lane road between two cities. And right next to it is a 10-lane highway. They both have a 65 MPH speed limit.

That freeway isn't 5 times faster, it's 5 times broader. You can fit 5 times as many cars on it, but obviously those cars are still going 65 MPH.

All of the upgrades to our internet connections are just adding the equivalent of lanes to a highway.

So, with that in mind, let's rewrite the title to match:

Despite broader highways every year, it still takes 15 minutes to go to the next city over. The average amount of time to get from Springfield to Shelbyville and back 8 times has been stuck at 4 hours for years.

When expressed this way, it becomes clear that there is simply no reason to expect that adding lanes to a highway would make a trip between two cities even a second faster.

→ More replies

28

u/Kiflaam 13d ago

Well you see, these days, not only does the traffic have to first route through FBI and CIA servers, but also Chinese, KGB, NSA, and others before the packets can finally be sent to the client/server.

→ More replies

20

u/robert_ritz OC: 2 13d ago

The data for this chart came from the wonderful httparchive.org. Tools used to make the chart: Python, Pandas, Matplotlib.

I also wrote a blog post about the topic on datafantic.

In addition, I built a simple Streamlit app to let you calculate how much time you have wasted (and will waste) on website loading. Lots of assumptions are built in, but it gives you a number. Personally, I've wasted over 30 days of my life waiting for web pages to load.

If webpage load times were around 1 second, I could save more than 16 days of my life over the next 46 years.
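The back-of-envelope version of that math (the pages-per-day figure is my assumption, not the app's exact input):

```js
// Rough version of the waiting-time math; pagesPerDay is an assumption.
const pagesPerDay = 30;
const secondsSavedPerPage = 4 - 1; // median 4 s down to 1 s
const years = 46;

const daysSaved =
  (pagesPerDay * secondsSavedPerPage * 365 * years) / 86400; // s per day
console.log(daysSaved.toFixed(1), "days saved"); // ≈ 17.5
```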

→ More replies

3

u/ShadowController 13d ago

You get a bloated framework! You get a bloated framework! You all get bloated frameworks!!!!

3

u/dgpx84 13d ago

Bloat, baby, bloat! We can add 50 more redundant front-end frameworks and UI libraries and 69 new adtech tracking snippets with almost no degradation in speed thanks to Apple's new CPUs and 5G! Yayyyyy!

→ More replies