Arthur C. Clark- 1974, predicting the Internet
April 1, 2012 10:10 PM

Back in 1974, Arthur C. Clark imagined that we would be talking to each other via computers, and even purchasing theater reservations. How absurd!
posted by HuronBob (42 comments total) 11 users marked this as a favorite
 
This was filmed in 1974? My mother was 20 years old at the time.

Currently she runs all sorts of webinars and training programs from her home office when she can't travel to meet with clients directly. He is talking about exactly the job she has right now. I can't wait to show this to her.
posted by King Bee at 10:15 PM on April 1, 2012 [1 favorite]


I don't know, guys... it sounds pretty farfetched.
posted by sendai sleep master at 10:20 PM on April 1, 2012


Why would we want this when we can do it all over the telephone?

I think he's trying to propose a solution that needs a problem.
posted by mccarty.tim at 10:25 PM on April 1, 2012 [2 favorites]


He was describing technology that already existed at the time. Engelbart's demo was in 1968 (which actually showed all this stuff). The Apple II came out in 1977, and the Apple I a year before that. This video came out in the '60s (I think).

The Altair was available in 1975.
posted by delmoi at 10:28 PM on April 1, 2012


And, in 1964, he was saying this.
posted by HuronBob at 10:31 PM on April 1, 2012 [2 favorites]


The owner of the memex, let us say, is interested in the origin and properties of the bow and arrow. Specifically he is studying why the short Turkish bow was apparently superior to the English long bow in the skirmishes of the Crusades. He has dozens of possibly pertinent books and articles in his memex. First he runs through an encyclopedia, finds an interesting but sketchy article, leaves it projected. Next, in a history, he finds another pertinent item, and ties the two together. Thus he goes, building a trail of many items. Occasionally he inserts a comment of his own, either linking it into the main trail or joining it by a side trail to a particular item. When it becomes evident that the elastic properties of available materials had a great deal to do with the bow, he branches off on a side trail which takes him through textbooks on elasticity and tables of physical constants. He inserts a page of longhand analysis of his own. Thus he builds a trail of his interest through the maze of materials available to him.

And his trails do not fade. Several years later, his talk with a friend turns to the queer ways in which a people resist innovations, even of vital interest. He has an example, in the fact that the outraged Europeans still failed to adopt the Turkish bow. In fact he has a trail on it. A touch brings up the code book. Tapping a few keys projects the head of the trail. A lever runs through it at will, stopping at interesting items, going off on side excursions. It is an interesting trail, pertinent to the discussion. So he sets a reproducer in action, photographs the whole trail out, and passes it to his friend for insertion in his own memex, there to be linked into the more general trail.


-- Vannevar Bush, "As We May Think", The Atlantic, July 1945
posted by radwolf76 at 10:37 PM on April 1, 2012 [15 favorites]


Yeah, well, he totally got what was going to happen in space in 2001 wrong.
posted by oneswellfoop at 10:41 PM on April 1, 2012 [5 favorites]


I'm sorry, oneswellfoop. I'm afraid I can't do that.
posted by HuronBob at 10:44 PM on April 1, 2012 [3 favorites]


He had the form factor right for the iPad, was only off by a decade or so on the timescale, and didn't foresee that we'd be using them to play Angry Birds IN SPACE instead of watching BBC video clips.
posted by radwolf76 at 10:53 PM on April 1, 2012


It's Clarke.
posted by lumensimus at 10:54 PM on April 1, 2012 [5 favorites]


If you're at all conversant in the history of computing and the internet, this wasn't that surprising a prediction, of course. He isn't standing there suddenly coming up with it, after all; he's asked a leading question. In some ways it's also interesting to compare the two key ways in which he was wrong. He talked about having a console connected to a distant computer, the terminal-mainframe model that was the default then. Obviously we have gone through a generation of having home computers, which indeed probably exceed the raw computing power of the system in the background -- although surprising fractions of that power are devoted to simply the display of graphics. You could also point to ways in which computing power is re-centralizing with mobile devices and apps that are more like clients than PCs themselves. Then the second way in which he was wrong is also a way in which in the 90s the AT&T "You Will" commercials were wrong -- in other words, an enduring myth. Decentralization is something that a few can achieve, and mobile internet may yet make many of us less tied to location than ever before, but cities are only more necessary now than they were in Clarke's day.
posted by dhartung at 10:54 PM on April 1, 2012 [3 favorites]


I knew that "As We May Think" was influential in the eventual development of the web, but I never realized that Vannevar Bush was the first to describe The Problem with Wikipedia.
posted by grouse at 10:56 PM on April 1, 2012 [1 favorite]


The Internet had been around for years by 1974. The World Wide Web does not equal The Internet.
posted by GallonOfAlan at 11:18 PM on April 1, 2012 [4 favorites]


Only losers of the rankest sort would communicate via computer...

(And thanks, Mister Stross, for turning me on to the amazing memex!)
posted by Samizdata at 11:33 PM on April 1, 2012 [1 favorite]


The Australian Broadcasting Corporation didn't broadcast in color in 1974?
posted by birdherder at 11:35 PM on April 1, 2012


Except for test broadcasts, no colour until 1975.
posted by unliteral at 11:40 PM on April 1, 2012 [1 favorite]


...two key ways in which he was wrong. He talked about having a console connected to a distant computer, the terminal-mainframe model that was the default then.

Are Google/iTunes/YouTube/Amazon/Wikipedia/Facebook/Metafilter not just mainframes under a different name? What are most PCs or Macs or iPhones or Kindles really doing in 2012 other than displaying content downloaded from 'mainframes'? The notion of distributed systems is wonderful in theory, but in practice we observe that the gravitational pull is towards consolidated and centralized systems.
posted by rh at 12:02 AM on April 2, 2012 [5 favorites]


I was convinced that Arthur C. Clarke's prediction in "Childhood's End" (1953) that the children of the world would start connecting together in a hive mind was pure science fiction until I noticed these kids instant-messaging and twittering and using the interwebs and stuff.

Also, I predicted this comment years ago.
posted by twoleftfeet at 12:17 AM on April 2, 2012 [1 favorite]


Are Google/iTunes/YouTube/Amazon/Wikipedia/Facebook/Metafilter not just mainframes under a different name?

No. Real mainframes are still around. There is a key difference in the way they operate from "stream"-style I/O computers, which is essentially everything we use on a day-to-day basis.
posted by delmoi at 12:31 AM on April 2, 2012 [1 favorite]


Well, looking at the Wikipedia article on mainframes, I guess it's not really all that well defined. The big difference between mainframes and 'regular' computers used to be that 'regular' computers were much more 'stream' based. In UNIX (as well as DOS) you read a file as a stream of bytes. This makes networking easy: you read network sockets the same way you would a file. So from what I understand it was actually kind of a pain to hook mainframes directly to the internet.
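The "everything is a stream of bytes" point above is easy to see in code. A minimal sketch: a local socket pair stands in for a real network connection (no actual networking is assumed), and `makefile()` wraps the socket in the exact same file-object API used for ordinary files.

```python
import socket

# A connected socket pair: 'left' plays the remote sender,
# 'right' plays our end of the connection.
left, right = socket.socketpair()
left.sendall(b"hello, stream\n")
left.close()

# makefile() gives the socket the same stream interface as an
# open file, so the reading code is identical either way.
with right.makefile("rb") as stream:
    line = stream.readline()

print(line.decode().strip())  # hello, stream
```

The same `readline()` call would work unchanged on a file opened with `open(path, "rb")`, which is exactly the uniformity being described.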

Mainframe systems tended to work differently. You'd process data by 'screen': you'd use a terminal and enter data into a screen, then that data would be taken back and put into a database directly. So I guess at the core level it might have used something more like random access to file data? I don't really know. That may just be how IBM's OSes handled it, but IBM had most of the mainframe market (and still does).

Google/amazon/metafilter/etc are all built on clusters of regular general-purpose PC hardware. At a certain point, it doesn't really matter; you can run Linux on an actual IBM mainframe, and all this computer stuff is computationally equivalent anyway. But the big difference is that you have a situation where you have an actual computer in your hand, as opposed to a 'dumb' terminal. You can still play Angry Birds without a network connection.

That's a big difference from the way a lot of pre-personal-computer-era people imagined the future: they pictured everyone having a terminal and all work being done on central computers, whereas now you have a more symbiotic relationship.

A lot of the 'cloud' stuff we have today is simply a result of people trying to get vendor lock-in by keeping all your stuff on their servers. It has nothing to do with anything technical, it's all about money.
posted by delmoi at 1:14 AM on April 2, 2012 [1 favorite]


The owner of the memex, let us say, is interested in...

...sharing his opinion of a clip he has just watched on YouTube...
posted by colie at 1:24 AM on April 2, 2012 [3 favorites]


Are Google/iTunes/YouTube/Amazon/Wikipedia/Facebook/Metafilter not just mainframes under a different name? What are most PCs or Macs or iPhones or Kindles really doing in 2012 other than displaying content downloaded from 'mainframes'? The notion of distributed systems is wonderful in theory, but in practice we observe that the gravitational pull is towards consolidated and centralized systems.

Oh, agreed -- I was glossing over what I thought was obvious. The extent to which raw local computing power has exponentially grown is all the more incredible when you consider how little it is actually contributing to the media-consumption process. There was definitely an era not that long ago when this seemed far-fetched, too -- when having a kick-ass machine that could crank through Excel and "run your business" seemed the ne plus ultra of what the computer could do. But the web is nothing if not the very apotheosis of the client-server model, leaving aside technical questions like Javascript execution. The point is that until very recently, nobody took the idea of the web as computing all that seriously. The Chromebook/Google OS stuff hit the market like a tumbled brick. Then ... then Apple invented the smartphone.

Consider how important that was. The tablet? Tried, even by Apple (Newton). The phone? Market sewn up by experienced players (Motorola). The iPhone came from the iPod, which itself seemed like it was a frogdesign-cute rehash of a middling idea that had muddled around for a few years already (digital Walkman). Suddenly all these things turned the entire industry on its head.

Anyway, the key difference between the web model and the terminal-mainframe model is that the latter was essentially offloading expensive computational work to a remote server, while the former is simply providing content to wherever it needs to go. There's no question that your video plugin on your local device is capable of actually playing the movie, but that's not relevant to the equation anymore. Even products like Google Docs are sold not as a way to have access to tremendous computing power but simply so you can have access to your data wherever you go. While there are architectural similarities, the fundamental rationale is quite different.
posted by dhartung at 2:01 AM on April 2, 2012 [1 favorite]


Meh. The backward people of 1970 America also had enough foresight to pass the Clean Air Act to prevent acid rain from melting all the internet cables that are in use today.
posted by three blind mice at 2:24 AM on April 2, 2012


Sort of a double.
posted by Malor at 2:44 AM on April 2, 2012


"Businessman can live wherever he wants, do his business without living in the city" (paraphrase)

Who would have foreseen telecom cartels screwing us? This could be true if only there weren't massive profiteering going on throughout North America.
posted by Meatbomb at 3:24 AM on April 2, 2012 [1 favorite]


Except for test broadcasts, no colour until 1975.

How did you come up with that answer so quickly? Look it up on your Memex?
posted by fairmettle at 3:58 AM on April 2, 2012 [1 favorite]


Mainframe systems tended to work differently. You'd process data by 'screen': you'd use a terminal and enter data into a screen, then that data would be taken back and put into a database directly. So I guess at the core level it might have used something more like random access to file data? I don't really know. That may just be how IBM's OSes handled it, but IBM had most of the mainframe market (and still does).

Yes, that is just the IBM way of doing it, and that's mostly for front-end database work. (Which is all the computer really did "back in the day".) Load UNIX on it and it will work the UNIX way.

Deep down, it's the same as a regular computer. Comparing 1s and 0s.

What makes a mainframe different is mostly that they are designed for I/O throughput to a far greater degree than computational speed. The PC was a dumbing down and a cheapification of the mainframe by many orders of magnitude: narrow data buses, stupider processors, slower memory, and commoditization of parts.

What makes a computer a mainframe now? Mostly it is the same: lots of I/O, plus lots of redundancy and parallelism, and much more robust and "native" virtualization. If I remember right (and this might just be a feature of certain models), processing is done more carefully: each processing "job" is handed to three processors and the results are compared. If there is not a consensus, the machine disables the offending processor.
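The voting scheme described above (sometimes called triple modular redundancy) can be sketched in a few lines. This is purely illustrative, not how any particular mainframe implements it; real machines do this in lockstep hardware:

```python
from collections import Counter

def vote(results):
    """Run-the-job-three-times voter: return the majority result
    plus the indices of any units that disagreed with it."""
    majority, _ = Counter(results).most_common(1)[0]
    disagreeing = [i for i, r in enumerate(results) if r != majority]
    return majority, disagreeing

# Unit 1 returns a corrupted result; the voter outvotes it and
# flags it for disabling.
majority, disabled = vote([42, 41, 42])
print(majority, disabled)  # 42 [1]
```

The design point is that the fault is detected and isolated without any unit having to know it is faulty; the comparator does all the work.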

The hardware and operating systems are much more compartmentalized and specialized: like older PCs, there are separate cards for each individual task. Unlike a PC, however, those cards don't require the processor to be interrupted to do their jobs. If anything breaks or needs to be upgraded, it can be done online.

If you took a couple racks of x86 servers, concatenated them into one large virtual host and hooked them to a high speed SAN, you'd have a pretty close approximation of what a mainframe is.


What I love about these "the future of yesterday" pieces is the collision of such monumental forward thinking with contemporary pretenses. Like "imagine ordering theater tickets!" or "In the year 2000, robot physicians will be able to perform bloodlettings with unheard-of efficiency, with other robots tending the leech ponds, all while the human doctor smokes his pipe and reads up on the current trends in sport in the news-paper!"
posted by gjc at 6:34 AM on April 2, 2012 [3 favorites]


I love these anachronistic prognostications. The notion that people would still go to the theatre in this day and age! How droll.
posted by clvrmnky at 6:47 AM on April 2, 2012


And yet the terry cloth unitard never caught on -- where's the justice in this topsy-turvy world of 2012!?!
posted by fairmettle at 7:16 AM on April 2, 2012 [2 favorites]


I love the way (we) geeks react to news like this.

"Arthur C. Clark totally predicted the internet!"
"Yeah, but he somewhat mischaracterized the specific workings of some of its components."
"No he didn't!"
"Yes he did!"
"You misspelled his name."
"[meme]"

(You did misspell his name. And I'm on the "No he didn't!" side.)
posted by IAmBroom at 7:38 AM on April 2, 2012


So who's making today's predictions about what we'll see 50 years from now?
posted by ZenMasterThis at 9:16 AM on April 2, 2012 [1 favorite]


Mud, dust, and hunger.
posted by Meatbomb at 9:33 AM on April 2, 2012


What I love about these "the future of yesterday" pieces is the collision of such monumental forward thinking with contemporary pretenses. Like "imagine ordering theater tickets!" or "In the year 2000, robot physicians will be able to perform bloodlettings with unheard-of efficiency, with other robots tending the leech ponds, all while the human doctor smokes his pipe and reads up on the current trends in sport in the news-paper!"
Or take this 1969 clip, which tries to show the "home of the future", where a woman goes shopping online, buys clothes and watches the kids, while the husband gets the bill and pays it. They got the idea of having a home computer, internet banking, online shopping, email, and something like Quicken, but they still have rigid gender roles. Of course, if the video was actually made in '69, it's not like feminism hadn't made big progress by then; I wonder how much of it was sarcastic.
posted by delmoi at 4:35 PM on April 2, 2012


So who's making today's predictions about what we'll see 50 years from now?

I'll make some predictions:

* Humanity will of course completely ignore the three laws of robotics; lots of military robotics will be developed. We already see this kind of thing with the Predator drone. Why not slap some guns on a PETMAN (which was supposedly developed for the military to help test clothing. Yeah, right)?

* People will continue to use robots to creep out babies

* Basically a lot of robots, I think.

The real test will be if robots can be as dexterous as humans. Being able to answer trivia questions is impressive, but one important part of human intelligence is our ability to quickly figure out physical systems. We can see a blueprint of a house, and some wood and nails and stuff and figure out how to hammer in all the nails. We can see fabric and figure out how to sew it into clothes. If robots get to the point where they can do those things, then basically we'll have no need for human labor. Intellectual labor will also be automated away. So what then will people do with their time? It's an open question.

I think robotics probably will get to that point.
posted by delmoi at 4:54 PM on April 2, 2012


Nice clip. Does anyone know where Jonathan is today, and what he's doing?
posted by crazy_yeti at 4:55 PM on April 2, 2012


@delmoi: What about bioengineering?
posted by ZenMasterThis at 5:33 PM on April 2, 2012


So what then will people do with their time?

Die, hungry, in the mud and dust. Unless the revolution comes in time.
posted by Meatbomb at 6:40 PM on April 2, 2012


What is this "theater" you guys keep mentioning?
posted by LordSludge at 7:10 PM on April 2, 2012


He was describing technology that already existed at the time. Engelbart's demo was in 1968 (which actually showed all this stuff). The Apple II came out in 1977, and the Apple I a year before that. This video came out in the '60s (I think).

The Altair was available in 1975.
-- delmoi

Considering the Apple I was considered completely revolutionary at the time it came out, two years after Arthur C Clarke said this, I'd say that the technology definitely wasn't there yet (at least not in my house).

Being in high school at the time, learning programming on so-called mini-computers (small mainframes!) I was hearing predictions like Clarke's, and I COULDN'T WAIT!! WHERE IS IT??!!

Yes, I was very upset, and I would have been upset with you at the time if you had told me that it already existed.
posted by eye of newt at 8:20 PM on April 2, 2012 [1 favorite]


@delmoi: What about bioengineering?
Well, I'm a computer scientist not a biologist, so how would I know? I personally think robotics would have a bigger impact on day to day lives, but biotech may mean major medical breakthroughs.

That said, let's think about economics. I think one interesting aspect will be the cost to apply these medical breakthroughs. Right now, the major advances all entail massive costs. Even 50 years ago the technology to do a lot of medical procedures just wasn't there, and the difference between care for the wealthy and care for the poor was not as great, simply because the ceiling wasn't very high.

Already you see blue-collar people living shorter lives than the wealthy. That differential will increase, and the elderly will have more time for income to accrue through interest, so there will be a feedback effect: wealth means longer life, and longer life means more wealth. So, kind of whack.
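The feedback effect above is just compound interest doing its work. A toy calculation with entirely made-up numbers, assuming the same 5% return but ten extra years of compounding for the wealthier person:

```python
def final_wealth(start, rate, years):
    """Simple compound growth: start * (1 + rate) ** years."""
    return start * (1 + rate) ** years

# Same 5% annual return; the wealthier person also gets 10 more
# years for it to compound. (Illustrative numbers only.)
poor = final_wealth(100_000, 0.05, 40)
rich = final_wealth(1_000_000, 0.05, 50)

# The gap widens well beyond the original 10x head start.
print(round(rich / poor))  # 16
```

Ten extra compounding years alone multiply the original 10x gap by about 1.6x, before accounting for the higher returns wealth typically buys.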

There's also the question of how much of our GDP is going to be spent on healthcare. After all, what most people want more than anything is to stay alive. So, naturally you would expect most of their money to go into that, right? Would it be surprising if we ended up spending a majority of our GDP on healthcare? It's something that's totally new in human history, given the fact that prior to the 20th century there simply wasn't effective healthcare to be had. It just didn't exist.

So what will happen? That's an interesting question.
Considering the Apple I was considered completely revolutionary at the time it came out, two years after Arthur C Clarke said this, I'd say that the technology definitely wasn't there yet (at least not in my house).
Not in the home but definitely in the lab and in big business, academic settings, and so on.

Also, the Altair was put out just a year after that. You could have gotten one with a paper teletype console, or used toggle switches.
posted by delmoi at 9:01 PM on April 2, 2012


This clip was released as part of the ABC's 80 Days that changed our lives project.
posted by unliteral at 10:32 PM on April 2, 2012


There will be a seriously lucrative boom in personal intelligence services. These are like private investigators only they let you know when a creep is stalking you on Facebook before he has the chance to meet you at work.
posted by LogicalDash at 6:50 AM on April 3, 2012




This thread has been archived and is closed to new comments