Exponential Hangover
July 21, 2015 12:47 AM   Subscribe

Web Design: The First 100 Years
So despite appearances, despite the feeling that things are accelerating and changing faster than ever, I want to make the shocking prediction that the Internet of 2060 is going to look recognizably the same as the Internet today.
Unless we screw it up.

If you think the Web is a way to CONNECT KNOWLEDGE, PEOPLE, AND CATS, then your job is to get the people and cats online, put a decent font on the knowledge, and then stand back and watch the magic happen.

If you think your job is to FIX THE WORLD WITH SOFTWARE, then the web is just the very beginning. There's a lot of work left to do. Really you're going to need sensors in every house, and it will help if everyone looks through special goggles, and if every refrigerator can talk to the Internet and confess its contents.

And if you think that the purpose of the Internet is to BECOME AS GODS, IMMORTAL CREATURES OF PURE ENERGY LIVING IN A CRYSTALLINE PARADISE OF OUR OWN INVENTION, then your goal is total and complete revolution. Everything must go.
posted by CrystalDave (42 comments total) 93 users marked this as a favorite
 
Holy shit that was brilliant. Who is this person??
posted by nevercalm at 1:20 AM on July 21, 2015 [8 favorites]


Oh the thing fails at Republicans.
posted by nevercalm at 1:21 AM on July 21, 2015


This guy is hilarious. "I like to think that the guy in the picture didn't have to put on the bunny suit, it was just what he liked to wear."

Assuming it's all by the same guy, his name is Maciej Cegłowski. He's been featured before, e.g. this brilliant story on how we forgot how to cure scurvy.

Oh, and I found it convincing too, as much as these things can be. Something unexpected can always happen, but we're already off the track to Accelerando.
posted by zompist at 2:11 AM on July 21, 2015 [7 favorites]


Right now there's a profound sense of irreality in the tech industry. All problems are to be solved with technology, especially the ones that have been caused by previous technology. The new technologies will fix it.

We see businesses that don't produce anything and run at an astonishing loss valued in the billions of dollars.

We see a whole ecosystem of startups and businesses that seem to exist only to serve one another, or the needs of very busy and very rich tech workers in a tiny sliver of our world.


or to put it another way... nobody's trying to "disrupt" the systems that are making life miserable for millions of people... because they're not the right people.
posted by oneswellfoop at 2:19 AM on July 21, 2015 [9 favorites]


Some refreshingly sane observations. Especially the one about Elon Musk talking absolute shit about a robot rebellion inside 10 years, and the actual state of AI, which is pretty much that we're as far away from it as we ever were.
posted by GallonOfAlan at 2:38 AM on July 21, 2015


TUMULTUOUS, SUSTAINED APPLAUSE
posted by frijole at 2:55 AM on July 21, 2015 [24 favorites]


Especially the one about Elon Musk talking absolute shit about a robot rebellion inside 10 years

I once knew a guy who sincerely referred to anyone who fell short of his own level of technological expertise as a "robot".

Just saying.
posted by RonButNotStupid at 3:34 AM on July 21, 2015


I suspect that I will hear the phrase "inside 10 years" applied to artificial intelligence every year until the day I die.
posted by IjonTichy at 4:24 AM on July 21, 2015 [1 favorite]


The writer is also the guy behind the Pinboard bookmark sharing site.
posted by hwestiii at 4:53 AM on July 21, 2015 [4 favorites]


Good talk. Ironically I was reading another news article, 300-500 words with one jpeg, but I didn't manage to finish reading before the rendering of all the non-topical ads crashed the browser frame. (Yes, I should get around to ad block, but I generally stay away from the 'worst' of the web.)

Not sure I agree with his hard division of the three objectives (cats, savior, savant). Tech is changing the world; it's not 'web tech', and it's occurring at a different rate and with very different dependencies than the classic 'web', but it's huge. Take medicine: that world does not use the web well at all, but MRIs, less invasive surgery, and such have been game changers.

What we need for space is not faster or smarter but raw materials. Once we have that, without needing to boost every last bolt and sippy cup, it'll take months to schedule regular flights to LEO. Why there isn't a major full-on asteroid capture project...

He's totally right that web design will not be that different in 50 years, kind of like books are a stable tech; it's real hard to predict what's coming out of a lab that isn't on anyone's radar. The funny DARPA video of robots falling over was everywhere, but few noticed that a couple of labs had close-to-working models; that's going to accelerate. Robot butlers, yea!!

No, not butlers (Elon may have one), but automation that can maneuver in human space might change firefighting. And war, sigh. Right now there's a small research market for the sensors needed for cars and robots; when that explodes (with perhaps a few other tech areas), robots that are useful will be underfoot, literally. No web interface. 20 years? 200? And that makes asteroid mining practical.

But yeah, web pages will max out on annoying layers and be about the same.
posted by sammyo at 5:28 AM on July 21, 2015


Sadly, I tend to believe that the Internet of 2060 will be underwater.
posted by delfin at 5:33 AM on July 21, 2015 [1 favorite]


A few of the hubs, maybe, but the internet will route around wet nodes.
posted by sammyo at 5:40 AM on July 21, 2015


And a fish farm in Silicon Valley might just be an improvement...
posted by sammyo at 5:42 AM on July 21, 2015 [1 favorite]


I'm not coming across a detailed account of Elon Musk's statements in the course of casual googling, so this may not be what he's saying, but IMO the imminently dangerous version of "AI" doesn't have anything to do with simulated nematodes or making conscious machines, it's autonomous weapons. So not "true" AI, but more like the sort of thing referred to in games as AI, except IRL.

The militaries of the world would really like something along the lines of The Sorcerer's Apprentice, except holding a gun instead of a bucket. Previously, the FPP on last year's UN conference called the "Meeting of Experts on Lethal Autonomous Weapons Systems".
posted by XMLicious at 5:44 AM on July 21, 2015 [1 favorite]


The Concorde entered commercial service and safely ferried douchebags across the Atlantic for 25 years.
posted by thelonius at 5:47 AM on July 21, 2015 [9 favorites]


I've used pinboard for ages. It's a breath of fresh air, a service that's useful and powerful and yet manages to avoid destroying itself through "progress" (bloat, misguided redesign) and is generally, wonderfully, modest. It makes a lot of sense that the guy behind pinboard is the person who wrote these words.
posted by tempythethird at 5:52 AM on July 21, 2015 [5 favorites]


The author is also a very good twitter follow.
posted by fricto at 6:05 AM on July 21, 2015 [4 favorites]


I once knew a guy who sincerely referred to anyone who fell short of his own level of technological expertise as a "robot".

Well, most of us refer to people arrogant enough to refer to other people as "robots" as assholes.
posted by aught at 6:27 AM on July 21, 2015 [1 favorite]


We have a space station in 2014, but it's too embarrassing to talk about. Sometimes we send Canadians up there.

Hilarious. This entire thing is gold.
posted by The Bellman at 7:14 AM on July 21, 2015 [2 favorites]


The author also wrote this article, which I'm sure must have made the MeFi rounds before as well: No Evidence of Disease
posted by Gordafarin at 7:20 AM on July 21, 2015 [1 favorite]


I love a lot of things about this, not least that instead of fancy content packaging the whole thing's presented between inconsistent <p> tags in a long table. That's all you really need, most times.
posted by postcommunism at 7:29 AM on July 21, 2015 [7 favorites]


Just 5 days ago we had another Maciej post: [T]he flaw at the heart of our country is not just geological. And if you want to invest in a future post, see the Kickstarter to send him to Antarctica.
posted by jjwiseman at 7:35 AM on July 21, 2015 [3 favorites]


I love a lot of things about this, not least that instead of fancy content packaging the whole thing's presented between inconsistent <p> tags in a long table. That's all you really need, most times.

I go further.

Try this:

Open up the linked site and drag the browser window over to the left until all the images are off-screen. Notice that the ideas are still there -- in the text -- still coherent, and still clearly presented. The author didn't really need even those limited 'visual aids' and neither do you.

Oh, Clippy, representing the true current threat level of 'AI', looming over Elon Musk is cute, I suppose, but superfluous.

Well done.

But just once I'd like to read something about the internet that doesn't assume that everyone fetishizes cats.
 
posted by Herodios at 8:57 AM on July 21, 2015


Great essay (as his always are). I have been thinking for a few months that the decline of the early web, and the rise of the commercialized social media web, has correlated with the decline of standards and the rise of services.

I'm not that techy a person so I hope someone can help me out if my history is wrong on this, but I perceive the early days of the web as based on shared standards. Like Cegłowski mentions, things like e-mail, internet protocols, HTML, or even UNIX. It seems like the Internet used to be more distributed. (It also had a more intimate connection with universities as sites of important servers). But the thing is, distributed systems don't work all that well when a single thing suddenly gets a huge audience—and a big part of today's web culture revolves around precisely that phenomenon. Back when only nerds used the web and the audience of any given node on the network was limited, this wasn't a problem.

In the Olden Dayes, if you wanted to share your cat video, you uploaded a .mov file (or a .rm file or a .avi file) to an FTP account on your web hosting provider and hoped that the person on the other end had a browser with the appropriate video plugins. And if your cat video became enormously popular, you hoped that the bandwidth surge wouldn't a) crash your site; or b) cause a huge overage charge on your hosting fees that month.
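For concreteness, that old workflow can be sketched with Python's standard ftplib (the host, credentials, and filenames below are placeholders, not a real service):

```python
# Sketch of the old-school workflow: push a video file to your web
# host over FTP and hope visitors' browsers have the right plugin.
# Host, credentials, and paths here are made up for illustration.
from ftplib import FTP

def upload_cat_video(host, user, password, local_path, remote_name):
    """Upload a file in binary mode to the host's public web directory."""
    with FTP(host) as ftp:
        ftp.login(user=user, passwd=password)
        ftp.cwd("public_html")          # typical shared-hosting web root
        with open(local_path, "rb") as f:
            ftp.storbinary(f"STOR {remote_name}", f)

# upload_cat_video("ftp.example.com", "me", "hunter2",
#                  "cat.mov", "cat.mov")
```

If the video got popular, every viewer pulled the file straight off that one server, which is exactly where the bandwidth-surge problem came from.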

Now, people use YouTube, Facebook, Twitter, Vine, Imgur et al to handle lots of things that were once distributed. If YouTube goes down for a day, that's Google's problem. If Gmail goes down, the world grinds to a halt, but does so for everyone. It's not like what happens if your personal e-mail server or web site crashes or gets hacked. The centralization of the web has emerged as the number of people being served has increased and as the potential audience of an item has expanded to the millions or hundreds of millions.

I don't think you can go back to a distributed model at this point. The only thing I can think of that could serve as an alternative to privatized, centralized, for-profit services like Facebook, would be if the web were suddenly turned into a public utility somehow. Given how much we already dislike the government having a view of our Internet activities, that's probably going to be a hard sell.
posted by overeducated_alligator at 9:04 AM on July 21, 2015


The author is also a very good twitter follow.

Confirmed:

Google+ Photos is shutting down in favor of Google Photos. Please be sure to remove the + from your photos by August 1
posted by straight at 11:17 AM on July 21, 2015 [2 favorites]


Confirmed:

Google+ Photos is shutting down in favor of Google Photos. Please be sure to remove the + from your photos by August 1


That one isn't even satire.

Goodbye Google+ Photos, hello Google Photos!

posted by Nonsteroidal Anti-Inflammatory Drug at 11:57 AM on July 21, 2015


I don't know. Helping people connect and share ideas used to seem like a good idea before Facebook.
posted by RobotVoodooPower at 12:43 PM on July 21, 2015 [2 favorites]


"When I talk about a hundred years of web design, I mean it as a challenge. There's no law that says that things are guaranteed to keep getting better."

This is just great. The nice thing about accepting a challenge like this is that success is in enjoying the ride, not trying to imagine what it will look like at the end.
posted by iamkimiam at 2:17 PM on July 21, 2015


So it's not just me who thinks computers have basically stopped getting faster, right? The only way computer speed affects me is through gaming, and although I don't play as much as I used to, I have the impression graphics in computer games haven't really improved much since about 2007-2008. I also have the impression if you bought a top-of-the-line gaming rig in 2007-2008, you would still be able to play games today at good graphics settings.
posted by pravit at 6:43 PM on July 21, 2015


Cegłowski seems to discount all of the improvements in avionics and safety that have taken place in the last 20-30 years as being insignificant. The current form and speed of commercial aircraft may not be significantly different than those of the 1960s, but I'm not sure that they can be said to be effectively the same aircraft (although that's obviously not without unintended consequences.)

I'm also not sure what that means if you continue the analogy to software, either - maybe that the significant innovations that can and need to be made are in stability and security now vs. raw, groundbreaking leaps in performance?
posted by ryanshepard at 9:24 PM on July 21, 2015 [2 favorites]


Fantastic article, and I'm following Pinboard on Twitter now. Thanks!

I'm also not sure what that means if you continue the analogy to software, either - maybe that the significant innovations that can and need to be made are in stability and security now vs. raw, groundbreaking leaps in performance?

Most of the work has gone into programmer productivity in the last decade or so (starting arbitrarily with, say, Rails), on the logic that processors are still getting faster, so you save money by making the same software for less effort and then externalizing the costs to the hardware. Even productivity eventually reaches a point where things are "good enough," though. I think we're mostly getting there in a lot of languages now, with the caveat that concurrency is still a bit hard. The Rails community and other framework-oriented spheres are now talking about letting the frameworks fall away, which regains some lost efficiency without sacrificing the productivity; they're breaking down the scaffolds, and what's left is clean, efficient, productive conventions based on software stacks with simple APIs. It's strangely come kind of full circle.

But I don't think we could have made it here on a straight-line path. There was too much work to be done that would have been cost-prohibitive in the meantime. Which is to say, we needed software to solve some problems, and sometimes the feline internet utopia of personal home pages and handy bookmark sites is just going to have to wait its turn. I say this as someone who uses MetaFilter more than any other site on the internet, but at the same time, I probably dedicate more bandwidth to cloud backup and automatically downloading podcasts than anything else. There's room for both! I have problems, and I will write code until they're all solved, or pay someone to do it for me. (And doing more today is usually more important than doing it more efficiently tomorrow.)

Stability? Software is generally more stable than it's ever been. Security? Ha! I wish! Maybe, if only, after the next disaster...

I have the impression graphics in computer games haven't really improved much since about 2007-2008. I also have the impression if you bought a top-of-the-line gaming rig in 2007-2008, you would still be able to play games today at good graphics settings.

2007 to now is the difference between The Witcher and The Witcher 3. It's pretty significant progress by just about every metric, but lighting is probably the most striking. The funny thing is, unlike CPUs, graphics cards have continued massively increasing their throughput, and there still is a noticeable jump between graphics cards a couple generations apart. The trick, of course, is that rasterization is pretty easy to parallelize, so it fits in really well with the trend toward a huge number of very tiny processors. This has also been a boon for physics simulation and scientific computing.
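A toy illustration of that last point, in plain Python: each pixel's color is a function of its own coordinates alone, never its neighbors', so a frame splits cleanly across any number of workers (the "shader" here is a made-up gradient, not anything a real GPU runs):

```python
# Per-pixel independence is what makes rasterization embarrassingly
# parallel: no pixel's result depends on any other pixel's result.
from multiprocessing import Pool

WIDTH, HEIGHT = 64, 48

def shade(pixel):
    """Toy 'fragment shader': an RGB gradient computed per pixel."""
    x, y = pixel
    return (x * 255 // (WIDTH - 1), y * 255 // (HEIGHT - 1), 128)

if __name__ == "__main__":
    pixels = [(x, y) for y in range(HEIGHT) for x in range(WIDTH)]
    with Pool() as pool:                 # one worker per CPU core
        framebuffer = pool.map(shade, pixels)
    assert len(framebuffer) == WIDTH * HEIGHT
```

A real GPU does the same thing with thousands of tiny cores instead of a handful of processes, which is why raw rasterization throughput kept scaling after single-thread CPU speed stalled.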
posted by WCWedin at 5:46 AM on July 22, 2015


Stability? Software is generally more stable than it's ever been.

I guess I meant network / procedural stability - in the last year, for example, American Airlines had a chunk of its fleet grounded by a failed iPad update, and United's was grounded worldwide after an unspecified "connectivity issue". Heading into the IoT, individual programs may be more reliable than ever, but critical infrastructure and networks as a whole seem unacceptably fragile.

I say this as someone with no coding experience, however, so I don't know what's actually feasible. To a novice, it appears that these systems have overshot our ability to forecast risk, our willingness to invest in prevention, and our ability to safely manage complexity. I hope I'm wrong.
posted by ryanshepard at 6:56 AM on July 22, 2015


> To a novice, it appears that these systems have overshot our ability to forecast risk, our willingness to invest in prevention, and our ability to safely manage complexity. I hope I'm wrong.

Well, you're wrong on the first bit - people have been screaming for years and years about the staggering risk associated with mission-critical software that's organically grown over the years around a core held together by chewing gum and string - or worse, Fortran (I kid, I kid).

But as for our willingness to invest (what? real money for no new features?) or to safely manage complexity, you're right on.
posted by RedOrGreen at 8:08 AM on July 22, 2015 [1 favorite]


We see businesses that don't produce anything and run at an astonishing loss valued in the billions of dollars.

I wish he had expanded this into more than an aside. ISTM that most of what people think of as "the internet" today is the so-called "services" that burn VC cash while trying to "grow" until profit magically starts happening at some point. Twitter is maybe the most visible of these, and as an example, I just looked up some numbers -- according to Crunchbase, Twitter consumed $1.2 billion in venture capital until its IPO in 2013 (founded in 2006, so $170 million or so a year); its shares are currently trading for around $36, for a market cap of $24 billion; and it lost $645 million in 2013, $578 million in 2014, and $162 million in 2015 Q1. To run a website that lets people post 140 character messages (and desperately, and so far unsuccessfully, tries to find a way to "monetize" them).
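A quick sanity check on those figures (all numbers are the ones quoted above, via Crunchbase; nothing here is independently verified):

```python
# Back-of-the-envelope check on the burn-rate figures quoted above.
vc_raised = 1.2e9                # total VC through the 2013 IPO
years = 2013 - 2006              # founded 2006, IPO 2013
per_year = vc_raised / years
print(f"~${per_year / 1e6:.0f} million per year")   # ~$171 million per year

losses = [645e6, 578e6, 162e6]   # 2013, 2014, 2015 Q1 losses
print(f"plus about ${sum(losses) / 1e9:.1f} billion lost since 2013")
```

So the "$170 million or so a year" figure is just total VC divided by seven years of operation, and the post-IPO losses add well over another billion on top.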

What happens to "the internet" when the delusional gamblers providing its financing finally give up?
posted by junco at 11:36 AM on July 22, 2015


Cegłowski seems to discount all of the improvements in avionics and safety that have taken place in the last 20-30 years as being insignificant.

As he says, "The point of my parable is this: imagine if you could travel back in time and offer to show one of those Boeing engineers what air travel would look like in 2014, fifty years on." And then he talks about how those engineers would probably be surprised to see that our passenger jets aren't routinely supersonic and that our manned space program has kinda stalled. It's not meant to be an analogy that is perfect in every way.
posted by jjwiseman at 12:06 PM on July 22, 2015 [3 favorites]


pravit: "So it's not just me who thinks computers have basically stopped getting faster, right? The only way computer speed affects me is through gaming, and although I don't play as much as I used to, I have the impression graphics in computer games haven't really improved much since about 2007-2008. I also have the impression if you bought a top-of-the-line gaming rig in 2007-2008, you would still be able to play games today at good graphics settings."

With the rise of VR, which requires two separately rendered, high-resolution viewpoints of an environment running at a rock-solid 90 fps, there's still a lot of room for GPU power to grow. Many of the debut VR titles are graphically minimalist to compensate, and it will be years yet until computers capable of running a VR scene at modern visual standards are affordable for most people.
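Some rough arithmetic on that claim, assuming a headset panel of 1080×1200 per eye at 90 Hz (an assumed spec; the comment doesn't name a particular device):

```python
# Pixels per second a GPU must shade: a desktop monitor at 1080p60
# versus two 1080x1200 eye buffers at a locked 90 fps (assumed specs).
desktop = 1920 * 1080 * 60            # ordinary 1080p monitor
vr = 2 * 1080 * 1200 * 90             # two eyes, higher refresh rate

print(f"desktop: {desktop / 1e6:.0f} Mpix/s")   # desktop: 124 Mpix/s
print(f"vr:      {vr / 1e6:.0f} Mpix/s")        # vr:      233 Mpix/s
print(f"ratio:   {vr / desktop:.1f}x")          # ratio:   1.9x
```

And that's before the supersampling VR runtimes typically request to offset lens distortion, which can push the gap well past 2x.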
posted by Rhaomi at 6:42 PM on July 22, 2015 [1 favorite]


Junco: What happens to "the internet" when the delusional gamblers providing its financing finally give up?

Well, I'd wager it's a bit similar to what happens in cities when large malls or skyscrapers run empty: nothing much. The internet (as in the underlying network infrastructure) will stay and people will continue to see and talk to each other. But they will do so a bit differently than before and in other places/spaces.
posted by Captain Fetid at 3:42 AM on July 23, 2015 [1 favorite]


That said, this bit annoyed me:

Consider the war Microsoft is waging against XP users. After years of patching, XP became a stable, beloved, and useful operating system. A quarter of desktops still run it.
This is considered a national crisis.
Rather than offer users persuasive reasons to upgrade software, vendors insist we look on upgrading as our moral duty. The idea that something might work fine the way it is has no place in tech culture.


XP is a poor choice for making that argument, because many persuasive reasons have been given for moving away from it - it's a 32-bit OS with a relatively archaic design, which in turn has resulted in a number of security issues. And when you consider how fundamental these problems are, it becomes clear that the solution requires rebuilding from the ground up.
posted by NoxAeternum at 12:27 PM on July 23, 2015 [2 favorites]


So it's not just me who thinks computers have basically stopped getting faster, right? The only way computer speed affects me is through gaming, and although I don't play as much as I used to, I have the impression graphics in computer games haven't really improved much since about 2007-2008. I also have the impression if you bought a top-of-the-line gaming rig in 2007-2008, you would still be able to play games today at good graphics settings.

This is basically true, yea. A lot of my friends are using 2008-2010 era GPUs, or laptops with GPUs that perform in that range. And if you bought a system in 2010 or 2011, you're basically still current. CPUs have gotten something silly like 15% faster since 2011, and even a mild bit of overclocking (on CPUs capable of such) brings them in line with or ahead of current offerings.

I have a GTX 580 sitting on my desk. It was $5 at a thrift store. A used one is maybe... $60 online? It was $500 in 2010, but it can still play most games on "high" now, and beats entry level stuff like the 750ti.

GPUs have gotten a LOT faster, but very, very few games really push them at all, and only at really high resolutions above 1080p, or with graphical settings turned up so high that the difference is extremely hard to notice if you turn them on, play the game for 5 minutes, and turn them off.

On my laptop, or my friends' budget rigs, you can play most games on medium or high and they still look fantastic. These are like, $200 thrift store and craigslist parts systems with scratched up monitors and peripherals of the same pedigree.

If you want to use a VR headset, or a 4k monitor, or something then yea you need more power. But you almost need more power than you can buy. And it gets hilariously, exponentially more expensive for seriously diminishing returns.

A $1000 computer is not twice as fast as a $500 computer. And a $2000 computer isn't 4x as fast. It's more like... 1.5-2x?

All i know is my $500 system, while a little loud at times, rattley (i STILL can't figure out what's rattling!), and janky, doesn't run into a "wall" with anything. And i don't think a single part in it is newer than 2013, and that's the graphics card. The rest of it is 2010-2011 era stuff.


Inversely, a smartphone from 2010 is an unusable piece of shit now because of software updates. That's where the real exponential growth was, but the fire seems to be gone. A 2012 or 2013 phone is basically indistinguishable performance-wise from a brand new one. I know people using Galaxy S3s and iPhone 5s and they're just like "why would i bother?". I hold those up to a newer phone, try the same thing on both, and it's like... oh look cool, that menu opens half a second faster.

I get the distinct feeling that if you bought a top of the line computer now, it would perform better in 2020 than a 2010 computer does in 2015. I think that point is just about here with phones, as well. Something you can put your own updates on after they lose interest like a nexus 6 will likely not be a hunk of junk in 5 years.
posted by emptythought at 5:02 PM on July 24, 2015


This guy's writing is just fantastic, and what a great piece. I do think he leaves out a fourth possibility though, the one that is actually happening: the web being overtaken by a small number of mega platforms and services like Facebook. Aided by more and more legislation to regulate the internet plus the law of conglomeration, their almighty algorithms will get to dictate what to many people becomes a complete online experience, enabling both the monetization and control of most content.
posted by blue shadows at 12:36 AM on July 25, 2015




That's an interesting article, but it's ultimately talking about just the ad-supported, for-profit internet. This is an important segment of the internet, but there's a lot of other internet out there too, and I don't think Cegłowski is especially concerned with the ad-supported internet in the same way. He's definitely not in his professional life; he runs a successful pay-upfront internet service and keeps a blog of long travelogues, which recently led to $38K in Kickstarter money for him to go to Antarctica and write about it for the site.

Vox says it's "the end of quirky, specialty sites". The Dissolve was great, but I'm not sure it was especially quirky, and I definitely don't think its commercial failure signals the end. It's deeply concerning if you work for an internet media company, and unfortunate for everyone who likes film reviews, but there is a world beyond the commercial internet, and that's going to continue to be the case as long as the internet works (technically) in more or less the same way it does today. If all the ad-supported sites in the world dropped dead and were subsumed into Facebook tomorrow, we would be worse for it, but the old internet would not die with them.
posted by vibratory manner of working at 10:49 AM on August 7, 2015 [1 favorite]




This thread has been archived and is closed to new comments