The dawn of the Taft Test
January 1, 2016 7:46 AM

The Website Obesity Crisis: Maciej Cegłowski calls for downsizing web pages. And "I shouldn't need sled dogs and pemmican to navigate your visual design." (previously)
posted by doctornemo (71 comments total) 91 users marked this as a favorite
 
The hardware, software, and now webware industries have been ignoring this advice for many decades. Instead of honing our skills and revising our obese tech stacks, we hope that additional computing power will save our asses. Some of the blame should also go to management for managing projects poorly, rushing code, and getting seduced by the latest web dev trends.

The Verge's Apple watch review is hilariously poorly designed. It's like a web designer was told to make the piece "pop", capitulated to the futility of the situation and implemented every known anti-pattern.

Even this critique's presentation has its share of unneeded images that mostly add visual noise or micro entertainment, because apparently otherwise we can't read a single paragraph of text.
posted by Foci for Analysis at 8:10 AM on January 1, 2016 [4 favorites]


This Cegłowski character keeps knocking it out of the park, as far as I'm concerned.

The images that Foci for Analysis criticizes are forgivable, I think, because the piece originated (and retains the format of) a talk, where visual noise (accompaniment, maybe?) and micro entertainment are great ways to keep an audience engaged.
posted by col_pogo at 8:26 AM on January 1, 2016 [8 favorites]


In case you were wondering, the presentation page weighs about as much as Dostoyevsky's The Idiot.
posted by skymt at 8:27 AM on January 1, 2016 [8 favorites]


Hooray for Maciej for saying this so clearly. I particularly like how he touched on so many different themes, from the bloat of ads to the giant images to the over-engineered (and expensive) cloud hosting. The Taft Test is clever, too.

This post is a good incentive for me to make a New Year's Resolution to finally reboot my blog. The cool blogger kids these days are self-hosting using static site generators like Pelican. Fast and lean.
posted by Nelson at 8:30 AM on January 1, 2016 [11 favorites]


If you're wondering if it's worth your time to watch the video, the answer is yes. I saw this when it started making the rounds a few weeks ago, and Maciej's speaking style is every bit as clear and clever as his writing style.
posted by Banknote of the year at 8:35 AM on January 1, 2016 [2 favorites]


The page sizes are truly mindbloggling, especially considering how much of that isn't information.
posted by tommasz at 8:55 AM on January 1, 2016 [1 favorite]


We've got brand-new computers at work and even they tend to choke on rendering some clickbaity sites in Chrome or Firefox. A coworker had some Buzzfeed-alike that was so overrun with ads and AJAX and so on that she had to kill the Firefox process, and this on a quad-core computer with a bunch of RAM. Shit's shameful and there needs to be some accountability for it.
posted by Pope Guilty at 8:56 AM on January 1, 2016 [2 favorites]


Lots of great points here, even if the author should have taken his own advice and spent an extra half-hour or so editing his slideshow into a text article. I hate how so many sites these days are bloated in file size and download times, while simultaneously being dumbed-down to contain much less useful information. What he identifies as "chickenshit minimalism" and "interface sprawl" are particularly irritating.

The idea that every visual presentation on the web has to be tailored to a smart phone or tablet may make sense in some ways, but it definitely has reduced diversity in how things look, as well as the overall information density of a lot of sites.

Throw in all the various forms of advertising-related bloat, from auto-play videos to those blocks of "recommended articles" that are really paid ads, plus what seems like a trend toward producing useless talking-head videos for news stories that easily could be conveyed in a few sentences of text, and it's hard to see how things will improve anytime soon.
posted by Nat "King" Cole Porter Wagoner at 9:04 AM on January 1, 2016 [3 favorites]


Even this critique's presentation has its share of unneeded images that mostly add visual noise or micro entertainment, because apparently otherwise we can't read a single paragraph of text.

To be fair, it's the text of a talk he gave; the images are his slides.
posted by asterix at 9:05 AM on January 1, 2016 [2 favorites]


He actually snuck in a screenshot of his own website as an example of accidental bloat. :)
posted by R343L at 9:06 AM on January 1, 2016 [2 favorites]


Even this critique's presentation has its share of unneeded images that mostly add visual noise or micro entertainment...

Yet the text itself loaded up and made itself readable immediately, even on the crippled connection I seem to have today. That is a graceful sort of decay, at least, and miles better than what is becoming standard practice.
posted by Western Infidels at 9:06 AM on January 1, 2016 [1 favorite]


Oh I'm liking this. He made several good points. Especially the part about the web being less participatory, and more a product to be consumed, the more complicated it gets.
posted by Too-Ticky at 9:19 AM on January 1, 2016 [7 favorites]


One part of the page-bloat problem that doesn't get much attention is the impact on memory-constrained mobile devices. I own an original iPad Mini, released in 2012, which has 512 MB of RAM. (There are still low-end Android devices with as little.) Increasingly often, I've found that heavy pages cause the browser to run out of memory and crash back to the home screen. All those ad and analytics scripts trying to track and monetize my views kept me from actually viewing things.
posted by skymt at 9:24 AM on January 1, 2016 [13 favorites]


I build what the guy paying my paycheck tells me to build. He wants sites covered with every available faddish widget. He wants them built according to schedules and logistical situations which preclude proper requirements-gathering, testing, or maintenance (because those things cost money). He isn't interested in producing the best possible website; he's interested in finding the optimal cost:profit ratio. And he wouldn't understand any of these concerns even if I pointed them out (believe me; I've tried).

So, yeah. Web development (like everything else) is driven by a profit motive. Profit motives push projects to schedule aggressively, pick cheap tools over technically sound ones, under-resource, cut corners, and defer to the client's wishes even when the client's wishes are insane. Ergo, the websites I build probably suck ass on your phone, and in general. Sorry.
posted by escape from the potato planet at 9:24 AM on January 1, 2016 [34 favorites]


Complexity is like a bug light for smart people. We can't resist it, even though we know it's bad for us.

This is a perfect summary of exactly why I want to quit the IT business. (It’s not just web design, it’s the entire industry). I’m so sick of this bullshit. I’m so sick of explaining to people that it doesn’t have to be that complicated, proving to people that it doesn’t have to be that complicated, and then just being dismissed and ignored. People just embrace and latch onto the complexity and can’t even conceive of letting it go.
posted by 1970s Antihero at 9:25 AM on January 1, 2016 [28 favorites]


Back in the summer I ran across this blog post lamenting some parts of the same issue: The Verge's web sucks.

That story notes that a particular page at The Verge (this one, complaining about the increasingly poor experience mobile browsing presents) produces 12 MB of data transfer, 7 MB of which is actually source code: JavaScript.

7 MB is about the amount of disk space consumed by Windows 3.0.

7 MB of an ultra-high-level, Lisp-like language like JavaScript might be expected to produce far more end-user functionality than 7 MB of x86 object code, but of course the JavaScript is all about ads, stats, and tracking, with virtually no direct benefit for the person who downloads it and hosts it.
posted by Western Infidels at 9:28 AM on January 1, 2016 [9 favorites]


Saying Maciej's own article page is bloated is facile and betrays a lack of understanding of the article's argument.

The page is large because it contains thumbnails of the slides. This page is a static representation of a talk with slides; of course the slides should be there. The images pass the Taft test.

In detail: the entire page loads in 1017 kB. 4% of that is the article's English text. 3% is HTML markup. 0% is CSS or JavaScript. The other 93% of the page is the thumbnails of the slides. There are 103 images, each averaging roughly 10 kB. Each image has been carefully resized and then saved in JPG or PNG format, depending on which is appropriate. The PNG images have already been well compressed; optipng can't improve them. It's a remarkably tight page.

The page size is all content. What's not there are the true causes of web bloat that the article is talking about. Javascript ads. Javascript trackers. Javascript page rendering. Needless animations. Giant decorative background images. Complex GUI toolkits and controls. These are the abuses of Web technology that are leading to page bloat. Maciej's page has none of these.

What Maciej's page does have is some carefully written text and some carefully presented images from his talk. He publishes these a few times a year, and it's the best format I've ever seen for converting a conference talk to a static web page. I'm impressed that the technology he implements them with is as minimal and clean as the technology he argues we should all use.
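
(If you want to run the same kind of tally on any page, the browser's Resource Timing API gives a rough breakdown from the console. This is just a sketch; transferSize reads as 0 for cross-origin resources that don't send a Timing-Allow-Origin header, so treat the totals as a floor:)

    // Rough page-weight breakdown by resource type, run from the browser console
    // after the page finishes loading.
    const totals = {};
    for (const entry of performance.getEntriesByType('resource')) {
      const type = entry.initiatorType || 'other';   // img, script, link, css, xmlhttprequest...
      totals[type] = (totals[type] || 0) + (entry.transferSize || 0);
    }
    console.table(Object.entries(totals).map(
      ([type, bytes]) => ({ type, kB: Math.round(bytes / 1024) })
    ));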
posted by Nelson at 9:47 AM on January 1, 2016 [58 favorites]


For “fun”, open up ublock's logger on a popular site's page. Refresh it, and see the components trickling in. Gasp at the number of sites that load two (or more) very slightly different versions of popular frameworks like Angular or Node. And so many fucking webfonts (yeah, on the font front, MeFi, I'm lookin' right at you …).
posted by scruss at 9:53 AM on January 1, 2016 [3 favorites]


I notice more and more I visit the same handful of sites, which feels like a rut ... but whenever I dare to leave my rut I'm assaulted by 4700 trackers, popover ads, and pages that just don't load. I usually just nope out and don't go back. (I imagine this registers as a unique user and "hey, our social media is working!" and they don't worry that I stay 3 seconds, don't read the article, and never come back ... The ads got served!)

Also yeah I'm tired of stock photography at the top of every article. I can read things without pictures!

I am also at the point where I would like publishers to curate their ads and show me what HUMANS think the other humans who read their website might be interested in. Because all the algorithms serve me are ads for etsy and ModCloth, and I already shop both those places and I am starting to miss newspaper ads which, while random and not all relevant to me, at least showed me things and places I'd never heard of. How am I supposed to know I might like a thing if you only show me things I already like? I feel like discovery on the web is getting harder and harder ... Everything just shows me the same stuff I already like (Netflix, Amazon, ads in general) or presents more and more hyperspecific interests. If I want to read or watch or discover NEW things, it's shockingly hard. All these algorithms DO get better at showing me things I like, but I already know I like them. Like etsy no longer suggests completely insane things to me, just pages and pages of stuff that looks exactly like stuff I already bought and therefore do not need more of. GOOD WORK, algorithm guys, you have successfully created a not-insane algorithm that actually shows me shit I like. BUT I DON'T WANT TO BUY ANY OF IT. They've removed discovery (and also the curation they used to do), making etsy as boring and samey as everywhere else on the web.

Sorry, that was a big tangent. The point was, the same two damn ads chase me all over the web and it's very boring and there can't be any return on them at this point. I'm already brand-aware and I just go directly there when shopping there; I don't click through. Can't a human find me some ads that are different? Some artisanal, hand-curated ads like the old days when an ad dude who knew his market placed the ads, not an algorithm that REALLY REALLY wants me to buy shoes at ModCloth.
posted by Eyebrows McGee at 10:03 AM on January 1, 2016 [28 favorites]


I agree, Nelson, and I sort of regret that I wrote one of the first comments in that direction. The page is about as small as it can reasonably be. My point was more that comparing byte counts of web pages and literature doesn't really work when including an image or two can take a short essay into novella territory. Everything else is dead on, as usual for Maciej.

(Obligatory link to the @pinboard Twitter, featuring more good opinions about technology.)
posted by skymt at 10:05 AM on January 1, 2016 [3 favorites]


(My point being that they use 12 ad trackers/servers/targeters to invade my privacy and inflate page sizes, and all it does is show me the same two damn ads everywhere I go. It seems like a gigantic waste of technology! They could just glue a ModCloth post-it to my monitor and it'd be cheaper AND less intrusive AND have exactly the same effect advertising-wise AND give me faster load times.)
posted by Eyebrows McGee at 10:09 AM on January 1, 2016 [8 favorites]


I'll play devil's advocate, to a degree...

One: comparing the byte size of JavaScript to the byte size of natural-language prose is a bit silly. A JavaScript file isn't a novel. It's a JavaScript file. It should be as large as it needs to be to fulfill its function (and no larger). (Whether its function is really necessary is another matter.)

Two: as page sizes have increased, so has bandwidth and processing power. When I started my career, 60 KB was considered the maximum acceptable size for a page. But that was when most folks were still accessing the web via dialup. Bandwidth just isn't nearly as scarce a resource as it used to be.

(For the record: I think that much of the crap sites include isn't necessary, and clearly the prevalence of mobile means that bandwidth constraints still exist and deserve serious consideration.)

Most of the increase in page size, of course, can be blamed on JavaScript. Browsers now have fast, standards-compliant, feature-rich JavaScript engines and APIs. This makes it possible to do all kinds of things which weren't possible before. So, people are doing those things. And each one of those widgets requires not just JavaScript, but a slew of attendant markup, CSS, and images.

Most of this stuff is being done with JavaScript because it isn't (easily) doable with plain old HTML/CSS. Web technology is on an eternal treadmill. Developers and designers want to do things—layout things, UI things—that aren't quite possible with HTML/CSS. (Heck, the popularity of Angular and Underscore and module loaders and so forth demonstrates that developers want to do JavaScript things which aren't quite possible with (native) JavaScript.)

So folks find clever, sometimes hackish ways to achieve these things, by loading up their pages with additional JavaScript (and images and markup and CSS). Eventually (and very slowly), these techniques stabilize, get absorbed into the relevant specs, and become part of the browser's native technology stack—so it's no longer necessary to push that stuff down the wire. The browser already has it built-in. Stuff which used to be part of our application code is gradually becoming part of the runtime environment.

So, this absorption-into-the-browser helps to mitigate bloat, to a degree. But you can be sure that, as soon as any given feature gets absorbed, designers and developers will start reaching even further. Freeing developers from the complexity of hacking custom fonts and columnar layouts and whatnot into their pages only frees them up to focus on, I dunno, making the fonts animated and rendering the columns in raytraced 3D. Sites which conscientiously avoid pushing technical boundaries are plain vanilla, and plain vanilla is never good enough—not for developers (who want to play with the new shiny), and not for the marketers who generally call the shots (who want their site to have a "wow" factor that stands out from the pack).

re: pointless ads: yeah, pretty much. Ad targeting is accurate enough to make me hate the sleazoids who track me thusly, but not accurate enough to actually show me anything I'm interested in seeing. The thing is, ads are so cheap that it's profitable to harass 99,999 people with shit they don't care about, for the sake of the one person who does care about it. Mass cultural pollution is a viable business model.
posted by escape from the potato planet at 10:17 AM on January 1, 2016 [5 favorites]


This reminds me a bit of the Slack vs. IRC debate. Why do so many people pay for a heavyweight tool like Slack to do things that IRC clients have been doing for years, for free?

For the same reason that reading something is more pleasant on Medium than it is on stallman.org -- because the designers of these tools care a lot about their visual experience, at the cost of almost all other considerations.

And as long as people continue to buy better hardware, and networks can handle the load, there will be engineers who push into that expanding envelope, either to impress their bosses or just to make cool things. Or, more likely, because it makes laziness easier -- high performance machines are more forgiving of unoptimized code and bad testing, so it's no big deal to slap on another Facebook widget.

Every coat closet eventually fills to capacity. And every workforce -- whether it be organic or silicon -- performs more work as it becomes more efficient, not less.
posted by swift at 10:23 AM on January 1, 2016 [5 favorites]


Every coat closet eventually fills to capacity.

Yeah, that's a variant of Parkinson's law (work expands to fill the time available for its completion). But what's so interesting about today is that suddenly we're switching / have switched en masse to mobile browsers on phones that have the computational horsepower of desktops from ~5-10 years ago (but closing the gap much faster than expected!), and power constraints like never before.

What was acceptable and not so noticeable on a desktop with ethernet suddenly feels horrible on an LTE connection even with a modern iPhone. If you're on a 3G (or worse) connection with a 3-year old device, forget it.

And that's led to the surge in complaints, culminating in the panic/crowing over AdBlockers on iOS this summer. This talk was a really good follow-up.
posted by RedOrGreen at 10:56 AM on January 1, 2016 [6 favorites]


PREACH IT BROTHER MACIEJ
posted by lalochezia at 11:11 AM on January 1, 2016 [2 favorites]


Some of these justifications are getting awfully close to tech-determinism ("What we have today must be globally optimal, not a series of historical accidents.")
posted by kiltedtaco at 11:19 AM on January 1, 2016 [5 favorites]


Am I missing something? I did not think his page was bloated at all. I actually appreciated that his commentary was on a page that loaded fast and clean for me. I checked and it looks like 1.2 MB, which in these times is a freaking miracle.

I liked the slides showing the growth of ad groups over a small span of years. That was very illustrative.

It cuts to the bone a bit since my institution is in the process of re-doing its website. It has been an ongoing project for over 7 years. Sometimes that hero image and overlays are necessary because people want something "LARGE" to show the wait was worth it. Never let the marketing group run your website, only bad things can happen. Very bad things.
posted by jadepearl at 11:21 AM on January 1, 2016 [1 favorite]


I'm sure it's been said, but these giant websites with their bloat and bandwidth gluttony are further separating the web into "internet for people in the developed world who can get their crazy fast internet" and "internet for people in the developed world living in rural areas and people in the developing world accessing the internet on smart phones where they pay by the Byte." I can just barely get 3G coverage in rural Cote d'Ivoire (some of the time; often I can only get 2G access); MefiClassic loads for me, and gmail on the "slow" setting loads about one in every three times I try. The guys I work with are just starting to get smart phones and onto facebook; they often can't load or send any pictures on facebook because of size and speed. I really wish there was a priority on democratizing the web by making more sites quick to load, even if the users aren't in the middle of Silicon Valley or in Boston or Paris or something.
posted by ChuraChura at 11:23 AM on January 1, 2016 [33 favorites]


For the same reason that reading something is more pleasant on Medium than it is on stallman.org -- because the designers of these tools care a lot about their visual experience, at the cost of almost all other considerations.

That's a great point, and perhaps the only justification for bloat that I'm comfortable with. But for me the crux of the talk focused on the funding models, and how more often than not the majority of bloat is caused by ad networks and tracking mechanisms, which in general only make sites less enjoyable to use.
posted by sammann at 11:24 AM on January 1, 2016 [3 favorites]


it's funny how many people are responding to the bloated design critique and not the real heart of this talk, which is his take on the "give away the razor, sell the blade" model the web has followed since privatization, where the "blade" is visible and invisible marketing silos for consumers to live in, i.e. "adtech": this bubble is going to finally pop.
posted by ennui.bz at 11:44 AM on January 1, 2016 [2 favorites]


Love it.
I decided long ago that there is a good amount of the internet that I'm just not going to see anymore. I have so much blocking going on that I can't even see some sites. Oh well. The pages that start loading up all kinds of super flashy graphics I usually close before they finish loading. I'm not going to struggle to read someone else's content anymore.
posted by bongo_x at 11:45 AM on January 1, 2016 [4 favorites]


Medium would be fine if it just tried to format things nicely, and provide a good editor for authors.

But if you look at what all that javascript is doing (apart from gratuitous little animations), it's tracking exactly where you are in the article and reporting back to the server, which is pretty creepy.

There's really no reason a site like Medium should need to serve javascript.
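
(The mechanics aren't even exotic. A rough sketch of what a read-position beacon boils down to, with a made-up /track endpoint and payload, not Medium's actual code:)

    // Sketch of a read-position beacon (hypothetical /track endpoint).
    // On every scroll, note the furthest paragraph the reader has passed;
    // report it at most once a second with a fire-and-forget beacon.
    let furthest = 0, lastSent = 0;
    window.addEventListener('scroll', () => {
      document.querySelectorAll('article p').forEach((p, i) => {
        if (p.getBoundingClientRect().top < window.innerHeight) furthest = Math.max(furthest, i);
      });
      const now = Date.now();
      if (now - lastSent > 1000) {
        lastSent = now;
        navigator.sendBeacon('/track', JSON.stringify({ article: location.pathname, furthest }));
      }
    }, { passive: true });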
posted by idlewords at 11:46 AM on January 1, 2016 [11 favorites]


the point being that "bloated design" ultimately derives from an industry based on trying to make monopolistic plays for different segments of the internet: the modern web is full of "features" which really reflect someone's anti-competitive business plan...
posted by ennui.bz at 11:48 AM on January 1, 2016 [2 favorites]


There's really no reason a site like Medium should need to serve javascript.

Medium is a SPA, which is impossible to build without JavaScript. And, yeah—there's no reason that a site based on text content has to be a SPA, but SPAs are ultimately way faster and less bandwidth-hungry than conventional sites (assuming that you're staying on the site long enough to view multiple pages).
posted by escape from the potato planet at 11:49 AM on January 1, 2016


Part of Medium's product is tracking exactly where you are in the article, and what you read, and what you attended to. You may not like that, but it's deliberate. And it has a cost, which I'm pretty confident the Medium engineers and product people know exactly.

What I hate is all the bloat that comes from ignorance and greed. "Hey, load up another ad tracker Javascript because our bizdev asshole told us to". Or "let's just load the original 6MB image the artist gave us even though it's only being displayed 200x150". Or "let's have seven versions of the same Javascript library loaded". That kind of inefficiency is just wasteful and without meaningful value. Or negative utility to the reader, in the case of the tracker.
posted by Nelson at 11:53 AM on January 1, 2016 [3 favorites]


I'd be interested to see what the 'break-even' point is for a SPA (single-page application, which I had to look up) vs. a truly minimal traditional website.
posted by sammann at 11:54 AM on January 1, 2016 [2 favorites]


Part of Medium's product is tracking exactly where you are in the article, and what you read, and what you attended to.

Part? What else is in their business plan?
posted by ennui.bz at 11:56 AM on January 1, 2016 [2 favorites]


I'd be interested to see what the 'break-even' point is for a SPA (single-page application, which I had to look up) vs. a truly minimal traditional website.

Well, that's gonna depend entirely on the site. SPAs generally use some kind of framework; Angular is one of the big ones. The minified version of Angular 1.3.15 is 45.2 KB; that'll be reduced further by gzipping. (And if you load a copy from a CDN, the client may already have it.)

Once you add other stuff—the application code built on top of Angular, probably some kind of responsive CSS framework such as Bootstrap, and of course the actual content (text and images)—it's not unusual for an initial page load to be a few hundred KB (not counting any ads, trackers, etc. that the marketroids may insist on adding).

But once the browser has loaded all of that and done the initial setup (parsing everything, initializing the JavaScript framework, etc.), page transitions can be very, very fast and lightweight. As you navigate around the site, it only has to load the new content. It doesn't have to load a complete new version of the page, with a complete new copy of the entire page's header/footer/navigation/tracking/boilerplate, and then do all of that parsing/initialization/rendering anew. It's one HTTP request for a few KB, compared to potentially dozens of requests for hundreds of KB.

For many sites, I'd guesstimate that the break-even is probably somewhere between two and six page loads. Again, though, it all depends on the specific site.
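
(Stripped of the framework, an SPA page transition boils down to something like this sketch; the /api/pages endpoint and the #content element are made up:)

    // Sketch of an SPA-style navigation (hypothetical /api/pages endpoint and
    // #content element). Only the new content crosses the wire; the header,
    // footer, CSS, and JavaScript that already loaded stay put.
    async function render(path) {
      const res = await fetch('/api/pages' + path);   // a few KB of JSON
      const page = await res.json();
      document.title = page.title;
      document.querySelector('#content').innerHTML = page.html;
    }

    function navigate(path) {
      history.pushState({ path }, '', path);          // real URL, no full page load
      render(path);
    }

    // Back/forward buttons re-render without pushing a new history entry.
    window.addEventListener('popstate', (e) => {
      if (e.state && e.state.path) render(e.state.path);
    });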
posted by escape from the potato planet at 12:02 PM on January 1, 2016 [1 favorite]


Arguing that Medium has to serve javascript because it's a single-page application is begging the question.

As to bandwidth and speed, if you're serving cleanly-formatted text articles, there's minimal difference in bandwidth between pulling that text in through javascript in a single-page app, versus navigating to a different static page in your browser.

Except that the former approach breaks the Web.
posted by idlewords at 12:05 PM on January 1, 2016 [14 favorites]


Arguing that Medium has to serve javascript because it's a single-page application is begging the question.

Good thing I didn't argue that, then.
posted by escape from the potato planet at 12:06 PM on January 1, 2016


As to bandwidth and speed, if you're serving cleanly-formatted text articles, there's minimal difference in bandwidth between pulling that text in through javascript in a single-page app, versus navigating to a different static page in your browser.

What is your definition of a "cleanly formatted text article"? Does it exclude custom fonts, responsive layouts, and rich form controls? Because even if employers would allow me to build sites without those things, I'm not sure that I would want to. And all of that stuff has a performance cost: for transmitting the code over the wire, for parsing, for rendering. Doing that stuff once, instead of on every page view, is an enormous performance improvement. So it's simply not accurate to say that there's "minimal difference". Unless you're arguing that every web page should look and function like a Linux man page from 2003. (SPAs aren't the answer to every problem, of course, but they are a wonderful solution to many problems.)

Except that the former approach breaks the Web.

In what sense?
posted by escape from the potato planet at 12:13 PM on January 1, 2016


On SPAs, the most aggressive experiment I know of in client-side Javascript rendering was New Twitter in 2010. They switched back to server-side rendering in 2012. Mostly because they found their servers could do the page layout and rendering work faster and more reliably than the client. Not so much about bandwidth as time-to-visible-content. It was counterintuitive to me when I read it, but I believe it. That was all 3 years ago and I wonder what the thinking in the industry is now.
posted by Nelson at 12:23 PM on January 1, 2016 [4 favorites]


Ads and trackers and other bullshit have bloated the web so much that Ghostery can make a business tracking all that shit and displaying it to enterprise customers. "Here is what you're doing. This calls this, which calls this, which calls twenty-seven other things."

Now granted, it still takes people saying, "Gosh, maybe we don't want to do that," to make a difference. Probably why I still fantasize about hanging up my text editor and pouring coffee to pay the bills.
posted by fifteen schnitzengruben is my limit at 12:25 PM on January 1, 2016 [2 favorites]


The only problem I have with Maciej is that he doesn't publish talks more often. I could read them all day.
posted by kevinbelt at 12:39 PM on January 1, 2016


Having read Doris Goodwin's The Bully Pulpit, I have an abiding fondness for William Taft and his brilliant wife whose tragic stroke (blamed at the time on "hysteria") blighted his presidency, so I'm just glad the Taft Test is kinder to him than most of the jokes. It's true, what website wouldn't be better if its ads were swapped out for Taft, his mustache beaming sadly.
posted by thesmallmachine at 12:44 PM on January 1, 2016 [1 favorite]


On SPAs, the most aggressive experiment I know of in client-side Javascript rendering was New Twitter in 2010. They switched back to server-side rendering in 2012.

FWIW, Angular 2 (currently in Developer Preview) will support server-side rendering. I don't know the details, but it's Google's attempt to address this issue.

(Not trying to come across as an Angular/SPA evangelist. Just saying that, yeah—rendering time is a concern, but it's not necessarily an insurmountable one. And, granted, a feature that will be available in a forthcoming release is not an argument in favor of using a framework on a site today.)

(Also, for the record, the "rendering" I was talking about above is the actual pixel-by-pixel plotting of a page onto a bitmap—as opposed to the "rendering" that Twitter's move away from SPA was meant to address, which is more like assembling the page's HTML/DOM. Bitmap rendering, of course, can only be done client-side.)

Anyway, the debate about SPAs is secondary. The biggest problem, as already copiously noted, is ads and trackers. A cleanly built SPA generally functions at least as smoothly (even on a phone) as a conventional page loaded up with malware.
posted by escape from the potato planet at 12:53 PM on January 1, 2016


your 1 million ajax calls can go fuck themselves. also, just for fun, ask any web dev under 30 what 'chunky not chatty' means.

prediction: server side rendering is a dead end without server side caching and fucking fast rendering engines and really effective session isolation and very intuitive out-of-the-box client apis. signed - the geospatial community.
posted by j_curiouser at 1:20 PM on January 1, 2016 [1 favorite]


As someone with ADD, I find the majority of websites these days almost impossible to successfully read a full piece of "content" on. The great piece in the OP was honestly probably one of the longest non-Mefi chunks of text I have consciously read through on the internet in months.
posted by threeants at 2:21 PM on January 1, 2016 [6 favorites]


I'm really curious what the discourse is on interrupting readers so soon after starting to read a page, with popups, autoplay video, inter-paragraph links to other stories, and so on. Like, there is apparently a business case that does not involve readers finishing reading stories.
posted by rhizome at 2:49 PM on January 1, 2016 [4 favorites]


I notice more and more I visit the same handful of sites, which feels like a rut ... but whenever I dare to leave my rut I'm assaulted by 4700 trackers, popover ads, and pages that just don't load.
Huh, Eyebrows, I thought that was just me, since I use dialup at home, especially with the pages that don't load. I document some of the more amusing fails from time to time on 1-minute-modem.

One thing I notice is that aggregator/discussion sites are still fairly svelte. Metafilter, hacker news, and to some extent reddit (well, I disabled all the bloated subreddit themes). Even, of all things, 4chan loads fast (and I certainly didn't have that one cached); IIRC facebook is reasonably usable over dialup too, although it's well out of my cache as well. There seems to be a stronger and stronger division between sites designed to be returned to repeatedly and lingered over and used, and sites that just want to shove every tracker and ad down every momentary visitor's eyeballs.


(Also, tip of the hat to metafilter's comment notification longpolling javascript, which works stunningly well over bad connections, doing things most javascript crud developers apparently have no clue about, like dealing with timeouts and lost connections.)
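
(The pattern itself isn't hard, it just rarely gets bothered with. A sketch of a long-poll loop that expects timeouts and dropped connections, with a made-up /updates endpoint:)

    // Sketch of a defensive long-poll loop (hypothetical /updates endpoint).
    // The two things that matter: an explicit timeout on every request, and
    // backing off instead of giving up when the connection drops.
    async function pollForUpdates(handleUpdates, delay = 1000) {
      while (true) {
        const controller = new AbortController();
        const timer = setTimeout(() => controller.abort(), 30000);  // don't hang forever
        try {
          const res = await fetch('/updates', { signal: controller.signal });
          if (res.ok) {
            handleUpdates(await res.json());
            delay = 1000;                               // connection is healthy again
          }
        } catch (err) {
          delay = Math.min(delay * 2, 60000);           // timed out or offline: back off
        } finally {
          clearTimeout(timer);
        }
        await new Promise(resolve => setTimeout(resolve, delay));
      }
    }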

posted by joeyh at 3:30 PM on January 1, 2016 [14 favorites]


With caching/CDNs/etc. the actual amount of data transferred per page for those JS libraries is presumably not as much as it appears - of course that still only really works out if your connection is fast enough to load the initial version. What bugs me is all that code running, the resource use on your device, and what exactly is it doing anyway?

An aside about JS and client-side rendering and stuff - it's just interesting to think about how we keep going back and forth between "thin" and "fat" clients in the history of computing. And while I think the arguments for offloading certain things to the client right now make sense, all those theoretical efficiencies are basically made irrelevant real quick when you dump in a huge batch of crap loaded from all over the place.
posted by atoxyl at 3:40 PM on January 1, 2016 [1 favorite]


I wouldn't mind SPA fat-client sites much if their developers understood that their ajax calls can time out or fail, and had sensible error handling, retries, etc. But my experience is that, like other exception handling, most developers get that wrong much of the time. I wasn't able to load any twitter page during its SPA years without changing the url to mobile.twitter.com.
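
(Even something as simple as this sketch would go a long way; renderTimeline and showError are made-up placeholders for whatever the app actually does:)

    // Sketch: retry a failed content fetch a couple of times, then tell the
    // user instead of leaving a blank page. renderTimeline and showError are
    // hypothetical names.
    async function fetchWithRetry(url, attempts = 3) {
      for (let i = 0; i < attempts; i++) {
        try {
          const res = await fetch(url);
          if (res.ok) return res.json();
        } catch (err) {
          // network error: fall through and retry
        }
        if (i < attempts - 1) await new Promise(r => setTimeout(r, 1000 * (i + 1)));
      }
      throw new Error('Failed to load ' + url);
    }

    fetchWithRetry('/api/timeline')
      .then(renderTimeline)
      .catch(() => showError("Couldn't load the page. Tap to retry."));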
posted by joeyh at 3:57 PM on January 1, 2016 [4 favorites]


Twitter, and indeed mobile twitter, are very much single-page applications today. They use new features to preserve real URLs and other navigation features, but you're only loading the interface once; all the other navigation you do is just a server call rather than a pageload.
posted by reluctant early bird at 4:03 PM on January 1, 2016 [2 favorites]


Yeah, History.pushState() means that a lot of y'all are using fat-client SPA sites er'ryday without even realizing it.
posted by escape from the potato planet at 4:05 PM on January 1, 2016 [2 favorites]


I recently saw a small site with no fancy JS going on, but it was still unreadable for about 30 seconds thanks to black text on a black background. The black background was supposed to be a textured white paper image, but that image was a 2 MB PNG on a dirt-slow server, when it could have easily been a 10 kB JPG.

There have always been ways to cripple websites through crappy design, but as design in general gets more complex, there are increasingly more ways.
posted by p3t3 at 4:21 PM on January 1, 2016 [2 favorites]


When twitter serves a URL these days, the body is pre-populated with data, so it's visible more or less right away. Links on the page are hijacked by ajax, and fail in various and sundry ways, but it's a bit better than the prior 2 MB mass of javascript that had to all load to retrieve 140 chars of content.

It's still stupidly bloated for what it does.
posted by joeyh at 4:46 PM on January 1, 2016 [2 favorites]


The cool blogger kids these days are self-hosting using static site generators like Pelican. Fast and lean.

Not necessarily. Static site generators almost always rely on some third-party service like Disqus for comments. That's essentially the same as letting some ad partner inject who-knows-what on your page. Even if you only have a few comments on a post, using Disqus can easily amount to 800 KB to 1 MB of crap loaded in a cold-cache scenario.

And Disqus is absolutely atrocious for privacy. Their whole business model is predicated on the fact that they can track you across all the millions of blogs, newspapers, journals, etc. that have chosen to use their service. It's a great business model if you think about it — they get all the benefits of a slimy pernicious ad network without being blocked by any ad blockers. Just imagine what kind of profile they can build on you when they know not only every blog post and news article you passively view, but also what you say when you do choose to interact. It's a marketer's wet dream, and they are collecting all the data.

And because Disqus is so popular and shows up everywhere, it makes it that much harder to manage multiple identities if you need to segregate parts of your life. You would basically have to log in only to comment and then log out immediately afterwards, being very careful not to load any tabs in the background as you never know whether they might have Disqus powered comments on them while you're logged on.

If you do go the static site generator route, at least consider not loading Disqus on page load and making it require clicking something, as that way you won't automatically inflict all that evil on every user.
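
(Something like this sketch is all it takes. "example" is a placeholder shortname, the embed.js URL is Disqus's standard one, and it assumes a "Show comments" button and the usual disqus_thread div already in the markup:)

    // Sketch: load the Disqus embed only when the reader asks for it.
    // Assumes <button id="show-comments"> and <div id="disqus_thread"> in the page;
    // "example" is a placeholder shortname.
    document.querySelector('#show-comments').addEventListener('click', function () {
      const s = document.createElement('script');
      s.src = 'https://example.disqus.com/embed.js';
      s.setAttribute('data-timestamp', +new Date());
      document.body.appendChild(s);
      this.disabled = true;                             // only load it once
    });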
posted by Rhomboid at 5:13 PM on January 1, 2016 [4 favorites]


without being blocked by any ad blockers

Ghostery blocks Disqus by default. I haven't missed it.
posted by escape from the potato planet at 6:23 PM on January 1, 2016 [5 favorites]


I've been working remotely in a place that only has DSL, and for a long time I was dealing with crappy/intermittent service. Sites like Metafilter loaded just fine, but most links wouldn't work for all the reasons people have listed here. It was incredibly frustrating, but really just brought me down to the level of service that most of the world can only aspire to. I would have been ok being unable to access video and other high bandwidth media, but it was incredibly aggravating to have text-only sites be entirely unusable because of shitty design.
posted by Dip Flash at 7:14 PM on January 1, 2016 [4 favorites]


I'm sure it's been said, but these giant websites with their bloat and bandwidth gluttony are further separating the web into "internet for people in the developed world who can get their crazy fast internet"

....

It was incredibly frustrating, but really just brought me down to the level of service that most of the world can only aspire to.

Oh I think this has already happened and been happening for quite some time - an act of staggering ethnocentrism which is shutting many of these companies out of explosive growth markets.

You can see this "other" web where culture and network combine to require different types of pages. If you look at the Japanese internet, for example, you would probably be horrified. Many pages are essentially assembled in javascript; the html is just an empty frame serving up script after script. Even worse, tonnes if not most of the content is static images, the pages are fancy mosaics, and under 5% of the text on many pages is actually text, not jpeg or gif! The pages are also crammed with shit; they have so much stuff on them compared to "Western" pages.

The reasons for this are twofold. 1) Japanese fonts are a son of a bitch; to guarantee what's displayed, an image is sometimes the best way, and 2) Japanese pages were developed primarily to display on mobile devices, not desktops, and thus the development of the web over there was quite different, with desktop design often an afterthought. (I also think there are a lot of cultural aspects to it, as well, but lack the knowledge to speak confidently on it.)

Meanwhile, when I've been browsing pages from South Africa/Namibia and Kenya (aimed at a domestic audience, ie not fancy tourist lodges), they too are different. They often feel very old school. There's just a few fonts, usually Arial, Times and maybe Verdana. Black text on white background abounds. Lots of table based HTML and often only basic CSS, and a lot of "modular" style design choices we would associate with the heyday of table-based design. There might be javascript, but it's not the sophisticated stuff you see in the west, it's old school server-side stuff running a news ticker or something equally basic. And the pages load fast(er), and they have a lot of information on each page, with static, un-hamburgered menus that display the same on every page.

It's not necessarily the immaturity/budget/skillset that drives these design choices in either SA or Japan, I feel. It's also the environment these pages are served up in, and how they run. Flaky servers, flaky connections, flaky hardware; you need relatively robust pages that don't cost a tonne to build or maintain.

I presume India is the same. China certainly is (though the CCP blocking all and sundry has created a truly unique internet there, so it's a bit different).

It's interesting to me, to see which web giants get this, and which don't. Facebook, surprisingly, does all right in my experience. Twitter is a joke. Microsoft (Outlook/hotmail) and Google (gmail, jesus fucking christ, why do these web-based programs take as long to load on my cable internet today as they did when I was on a v90 dial up in the fucking nineties, WHY?) are terrible. Weirdly, Yahoo used to be pretty good, I wonder if it's cause they just never updated their shit for a long time. Ebay is pathetic, and it's damning that the mess that is Aliexpress beats it handily. Amazon? Faaaaark (actually, Amazon is fucking awful to access here in Australia, too. I read all these articles about their A/B testing and how they so carefully engineer what displays etc. It's shit. They are shit. Their search is crap and woefully slow, their emails are embarrassing, I read like one out of fifty, and the pages take forever to load).

Anyways, the internet has been dividing like this since access in the "developing" world exploded, imho. To be honest, I look forward to the day Western companies accept that they could learn a thing or two from their brothers and sisters in the global 'south'.
posted by smoke at 7:55 PM on January 1, 2016 [19 favorites]


The only way I've found to browse the web well on a very slow/metered connection is Opera Mini, which runs everything through a proxy that recompresses images (or removes them) and does other compression and selective stripping to cut pages down. Facebook's no-JavaScript mobile site piped through Opera with no images is probably usable-ish at dialup speeds.

It is much less likely to time out. I think Opera assembles the page and streams it down fully formed, text-first, so you aren't stuck waiting for some useless resource to load.

My New Years resolution is to turn NoScript back on, and purge cookies more.
posted by BungaDunga at 8:16 PM on January 1, 2016


Well that was quick. It's only January 2nd and already "chickenshit minimalism" has won the award for Phrase of the Year 2016.
posted by benito.strauss at 12:06 AM on January 2, 2016 [1 favorite]


I think a lot of designers and developers could curb their own code-bloat tendencies by using Chrome DevTools' bandwidth-throttling features while building/testing. It should get them in the habit of adding performance tweaks or re-thinking which assets they really need.

Facebook started a similar (opt-in) weekly bandwidth throttle for their devs last year called "2G Tuesdays".
posted by p3t3 at 12:10 AM on January 2, 2016 [2 favorites]


This phenomenon makes it nearly impossible to use much of the web in parts of the developing world where bandwidth is very limited. Thank God for metafilter's clean design that lets me get my fix almost anywhere!
posted by mkuhnell at 12:25 PM on January 2, 2016


parts of the developing world where bandwidth is very limited

Oh, you mean like rural America! (I just upgraded my house in the Sierra foothills from 1Mbps to 12Mbps and it makes all the difference.)
posted by Nelson at 1:13 PM on January 2, 2016 [3 favorites]


Ghostery blocks Disqus by default. I haven't missed it.

Just because comments at your local newspaper or Huffpo or whatever are bad doesn't mean that all comment sections are bad. Blocking all of Disqus is throwing the baby out with the bathwater.
posted by Rhomboid at 8:25 PM on January 2, 2016


Auto-blocking Disqus is a good thing. On the very occasional sites where conversation is interesting, Disqus can be enabled. Otherwise, it's blocked 99% of the time and unable to do whatever intrusive tracking it feels is necessary.
posted by honestcoyote at 2:54 AM on January 3, 2016 [6 favorites]


There was a joke post a few years back (the Onion I think?) about a 'slow-internet café' — akin to the slow-food movement, where things are more leisurely, artisanal, and retro at 2400 baud. Farce or not, I'd actually go there if it could refuse to load any site that was over [X] kB in size.

Can you imagine how delightful it would be to return to a time where sites loaded slowly because of the dearth of fast lanes rather than the size of the behemoths that attempted to traverse them?
posted by iamkimiam at 9:30 AM on January 4, 2016 [1 favorite]


hypertextual - a blogging framework to fight chickenshit minimalism
Chickenshit minimalism is defined as "technology that is aesthetically and functionally simple, but consumes an inordinate amount of clock cycles and memory to accomplish those simple tasks". Chickenshit minimalism is websites with flat UIs three colors and four sentences of text on a 1080p hero image and hamburger menus and 35 stylesheets and 45 JavaScripts that are downloaded from an ad network every time you fire off a request and 3 second load times.
posted by maudlin at 4:37 PM on January 4, 2016 [3 favorites]


benito.strauss: "Well that was quick. It's only January 2nd and already "chickenshit minimalism" has won the award for Phrase of the Year 2016."

Oh, I don't know, I thought the highlight was, "I shouldn't need sled dogs and pemmican to navigate your visual design."
posted by Chrysostom at 5:43 PM on January 5, 2016


a blogging framework to fight chickenshit minimalism

So to save clock cycles and memory, you design a system that requires you to rebuild and push your application server every time you want to fix a typo? Clever.
posted by effbot at 1:47 PM on January 9, 2016

