

Wiring an Intelligent World
June 19, 2006 10:16 AM   Subscribe

What is ubiquitous computing or "ubicomp," other than a geeky buzz-phrase for smart objects, "things that think"? In his provocative new book Everyware (freely excerpted here and here), interface designer and MeFite Adam Greenfield provides a thoughtful meditation on one of the digital world's most resonant hopes for the future, encompassing everything from pervasive RFID-chipping, Orwellian surveillance, and a humbly practical magic wand to a "coming age of calm technology."
posted by digaman (29 comments total) 14 users marked this as a favorite

 
The founder of "ubicomp" as an approach to design was Mark Weiser, a brilliant researcher from Xerox PARC who, sadly, passed away in the nineties. Interested Mefites might want to read his foundational papers The Computer for the 21st Century and The Coming Age of Calm Technology [PDF] in which he lays out his vision of how devices and services ought to work.

Funnily enough, I'm writing a workshop paper for Ubicomp 2006 right now. Another good conference is Pervasive.
posted by xthlc at 11:00 AM on June 19, 2006


Great, thanks for the links xthlc.
posted by digaman at 11:08 AM on June 19, 2006


From Wesier's home page, this is a nice summation:

What Ubiquitous Computing Isn't

Ubiquitous computing is roughly the opposite of virtual reality. Where virtual reality puts people inside a computer-generated world, ubiquitous computing forces the computer to live out here in the world with people. Virtual reality is primarily a horse power problem; ubiquitous computing is a very difficult integration of human factors, computer science, engineering, and social sciences.

posted by digaman at 11:10 AM on June 19, 2006


sorry, *Weiser
posted by digaman at 11:11 AM on June 19, 2006


xthlc, since you're writing a paper on it (or Adam Greenfield, if you're around), what do you think are the most significant societal or technological barriers to realizing the benevolent goals of ubicomp at this point?
posted by digaman at 11:18 AM on June 19, 2006


Unless I'm mistaken, I believe I saw Weiser give a talk at Seybold in spring of '99, on Powerpoint as "wall-reading" -- many of the same points that Tufte makes in his critiques, but from a less moralistic place. (Not that I disagree with Tufte, it's just that the polemic gets a little polemical...but I digress....)

It was the second most memorable thing about the conference. The first was flying back into Rochester in the four hours that the airport was actually open, so I could spend the next few days digging my truck out of five and six foot snow drifts to avoid having it towed.
posted by lodurr at 11:51 AM on June 19, 2006


Ubisoft is SO going to sue them.
posted by zoogleplex at 1:02 PM on June 19, 2006


Hey, thanks for posting this, digaman!

I think one of the biggest hurdles we'll have to overcome if we ("big we") want humane everyware is that most of the people currently building ubiquitous systems probably don't know that's what they're doing.

They're developing a VoIP phone, in other words, or a luxury condo, or a new supply-chain management interface, or a snowboarding parka, and they haven't thought deeply about what happens when all those things start to talk to each other.

So even above and beyond the very real challenges to acceptable user experience that arise out of the nature of technological development as it's currently practiced, we now have to worry about the emergent behavior of systems composed of modular, networked components unimagined at design time. And I'm not super confident that we have either the methodological or the social tools to develop systems of this nature that do not compromise our privacy, complicate our lives, and drive us all crazy.

That's why I'm so glad to see this conversation get started here and elsewhere - there is truly no time like the present to start wrestling with the implications of this particularly insinuative class of technologies.
posted by adamgreenfield at 1:51 PM on June 19, 2006


... the methodological or the social tools ...

They'll evolve, of course, but we can count on them evolving in ways that are unanticipated by conventional ways of thinking about problems. One example: as we move further and further toward grafting mobile phones into our brainstems, we approach something that resembles a sort of practical telepathy. So we'll all understand one another better, right? Maybe, but maybe at the cost of another kind of understanding. I imagine that one consequence of that would be that people would tend to erect psychological barriers in their own consciousness. They might, for example, cripple their own self-awareness in order to preserve some degree of privacy.
posted by lodurr at 2:31 PM on June 19, 2006


what do you think are the most significant societal or technological barriers to realizing the benevolent goals of ubicomp at this point?

There are a few technological barriers right now, but they don't matter much. We'll overcome them, given time. The biggies are:
  1. Battery life, which is really almost solved with the advent of low-power wireless protocols like Bluetooth and ZigBee.
  2. Cheap and ubiquitous displays -- this need will most likely be met by passive reflection e-paper in about 10-20 years.
  3. Input and human factors issues. There's a wealth of ongoing research into how exactly we're going to get all these little doodads to do what we want them to do. Handwriting recognition and voice are both suboptimal, so researchers are trying all kinds of crazy stuff. This includes gestural interfaces, tangible UIs, portable GUIs that can pop up anywhere from your laptop to your mobile phone, and even some clever hacks to make handwriting / voice a reasonable UI. Me, I make flashy furniture.
Really, the biggest obstacle is something that might seem like a technical problem, but that is actually social in nature. That problem is building a common infrastructure that's adequate for everything to talk to everything else.

There are plenty of packet-level standards out there (e.g. 802.11) but very little that tells devices how to talk to each other at a SEMANTIC level. A device can see that something exists on the network, but it rarely has any idea whether that something is a toaster or an automobile or an LED.

Let's take an example: say I want to make the things I write down on my tablet PC appear on a large wall display. I'd better hope that the same company made both devices and wrote special software to help them find each other, connect and share data. Otherwise generally I'm SOL -- perhaps I can manually go in and do some special hacks to get them to talk to each other, but that's about ten million times more work than most users are willing to do.

(Bluetooth and Rendezvous both take a step in the right direction towards solving this problem, but both fail for different reasons. I can go into this more if you're interested.)
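The gap described above can be sketched in a few lines: packet-level discovery tells you that something exists on the network, but matching a tablet to a wall display requires a shared semantic vocabulary that mid-2000s protocols mostly lacked. The sketch below is purely illustrative; the device types and capability names are invented for the example and aren't drawn from any real discovery protocol.

```python
from dataclasses import dataclass, field

@dataclass
class Service:
    """A device as packet-level discovery sees it: little more than an address."""
    address: str
    # The semantic metadata below is exactly what the era's protocols rarely
    # standardized. These type/capability names are made up for illustration.
    device_type: str = "unknown"
    capabilities: set = field(default_factory=set)

def find_ink_displays(services):
    """Pick out devices that advertise themselves as displays accepting ink.

    Without the semantic fields, every entry is indistinguishable: a toaster,
    an automobile, and a wall display all look like bare network addresses.
    """
    return [s for s in services
            if s.device_type == "display" and "ink-strokes" in s.capabilities]

devices = [
    Service("10.0.0.7"),                                   # opaque endpoint
    Service("10.0.0.12", "toaster", {"heat", "timer"}),
    Service("10.0.0.31", "display", {"ink-strokes", "video"}),
]
matches = find_ink_displays(devices)  # only the wall display qualifies
```

The point of the toy: once devices publish typed capabilities in a vocabulary everyone agrees on, the tablet-to-wall-display scenario becomes a one-line query; without that agreement, it's per-vendor special-case software.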

The reason why I say this is a social problem is, in a word, money. Every corporation and research lab has their own one true vision (and several patents) for how everything in the future is going to work together. Nobody's ready to abandon their proprietary, narrow-minded, and usually broken "standard". When they do manage to agree on something, it's usually so watered-down as to be nearly useless (e.g. UPnP). This balkanization of technical standards is the problem I fear will never be properly solved, and we'll be doomed to a confusing mess of broken crap permeating every aspect of our digital lives.

I'm actually less worried about the social issues that Adam raises. I think the emergent behavior of such systems IS inherently unpredictable and disruptive, but that there are plenty of potential benefits as well as dangers. Who would've anticipated the transformative impact of mobile phones on developing countries? What about the profound changes wrought by the chaotic evolution of the Internet? If Weiser's vision ever comes about, I have faith that the benefits we realize will provide plenty of motivation to muddle through the problems it creates.
posted by xthlc at 3:34 PM on June 19, 2006


This balkanization of technical standards is the problem I fear will never be properly solved, and we'll be doomed to a confusing mess of broken crap permeating every aspect of our digital lives.

xthlc, very well put. They do love their walled gardens, and unless some basic direction is forced on them, whichever vendor you choose will take you captive to whatever extent they can.

As I think you hardly need me to point out, this is exemplified by the wireless operators -- cellphones being as nearly ubiquitous today as any of the technologies under discussion can reasonably be expected to become in the future. There are salutary lessons in this: one being that the free market alone does not produce an optimal outcome. Where a publicly designated ruling authority decreed the winning standard, adoption exploded, competition flourished and the technology achieved something like its full potential in short order.

Where the market alone ruled (the United States) even the simplest interoperation beyond that of the simple phone call (to wit SMS) lagged many years behind the rest of the world. Even the smallest victory for the public required legislative intervention resulting from public pressure: phone number portability. If we hadn't stood up and used extra-market means, we never would have got that much.
posted by George_Spiggott at 4:03 PM on June 19, 2006


Fascinating, xthlc -- thanks!

And Adam -- it's a brilliant book.
posted by digaman at 4:06 PM on June 19, 2006


You're way too kind, sir. ; . )

I know there are a great many people who share your faith that everything will just kind of turn out OK, xthlc, at least in technical circles. It's as we get further away from technologists and developers and engineers that I begin to see the annoyance, the anxiety, even (in not a few cases) outright panic over the specter of ubiquity.

Bruce Sterling will tell you that there's a (not-inconsiderable) constituency that even thinks of RFID as the literal Mark of the Beast. Now you *know* I can't endorse thinking like that...but I do have to wonder if, given that everyware is by definition something embedded in the contexts of everyday life, we haven't done enough to understand why these technologies we're so hungrily anticipating stir up so much angst and fear in everyday people.
posted by adamgreenfield at 4:42 PM on June 19, 2006


technologies we're so hungrily anticipating stir up so much angst and fear in everyday people.

Dude, books we're so hungrily anticipating stir up so much angst and fear in everyday people. I refer of course to Harry Potter 7. Angst and fear is 50% of the default response since we realised we could be eaten.
posted by Sparx at 5:07 PM on June 19, 2006


we haven't done enough to understand why these technologies we're so hungrily anticipating stir up so much angst and fear in everyday people.

Have you tried asking them? Personally I fall into both camps: I want the cheese but not the trap. I am far more wired than the average person, or even the average geek, but I know with a high degree of confidence that the technology I use is under my control. Non-geeks are unable to obtain this assurance by themselves.

Look at how people are reacting to their phone usage being handed over to the NSA. You may not share their concerns but you need to acknowledge them if you're going to be regarded as something other than an evangelist. If this technology is a tool for me to use to my own benefit, it's desirable. If it's an instrument of corporate and government prying and control, it is much less desirable.

It comes down to whether privacy and security concerns will be respectfully and honestly addressed, or disrespectfully and dishonestly waved off with terms like "paranoia" and "tinfoil hats". Alternatively, some people suggest that it is a trade-off, and insist not only that you should not expect privacy, but that it's not even reasonable to want it. The words "deal with it" spring to mind. Some people will not accept "deal with it" as an answer simply on some invested party's say-so. And why should they? "We have the technology to improve your life, but we are powerless to allay your fears; we refuse to try; rather, we say that you are a fool for having those fears." Seems rather a cop-out. I'm not saying that you're saying that, but others are.
posted by George_Spiggott at 5:13 PM on June 19, 2006


Huh? I'm the furthest thing from an evangelist.

OP: I see that you see this. And yes, I have "tried asking them." ; . )
posted by adamgreenfield at 5:28 PM on June 19, 2006


Groovy post and thread. Yay, team.
posted by dejah420 at 8:50 PM on June 19, 2006


previously
posted by shoepal at 11:38 PM on June 19, 2006


I know there are a great many people who share your faith that everything will just kind of turn out OK, xthlc, at least in technical circles. It's as we get further away from technologists and developers and engineers that I begin to see the annoyance, the anxiety, even (in not a few cases) outright panic over the specter of ubiquity.

Yes, of course. It's a natural (and perfectly legitimate) human response to be suspicious of things that we don't understand. I know very, very little about cars, and as a result I tend to be paranoid and overly cautious when problems arise.

However, now we're talking about two different issues. The first is the Frankenstein concern: are we building an information dystopia in which privacy will become a thing of the past? The second is human acceptance: how can we help people to better understand what these technologies can and can't do, so that they become more comfortable with their use?

I think you know my response to the former. Of course, these kinds of technologies may have explicit or implicit political philosophies embedded in them, to enforce certain kinds of power relationships between producer and consumer or between users. We (the ubicomp community of researchers, developers, manufacturers and users) simply have to be reasonably vigilant regarding the privacy and security implications of technology alternatives as they arise. Believe me, it's a concern that we're aware of. Almost every conference has at least some kind of panel or workshop on privacy issues. As the field matures, it may become useful to create a formal industry body charged with researching ethical concerns.

As for the latter concern, I would argue that that's a problem we're going to have to solve sooner or later, elitist attitudes or no. Ubicomp is a paradigm that simply doesn't work without a certain critical mass; that means we either have to make it so valuable that people are willing to trade away their fears for the benefits it provides, or (ideally) work to introduce some transparency into the systems we create. That last one is actually a fascinating and tricky design problem, one that's attracted a lot of research (including my own work). How can you make sure that people learn just enough to understand what's going on (and allay their fears), but not so much that they become overwhelmed with useless information?

we haven't done enough to understand why these technologies we're so hungrily anticipating stir up so much angst and fear in everyday people.

An interesting point (though irrelevant to the current discussion) is that this angst and fear you mention is a uniquely Western phenomenon. We have a cultural legacy of anti-statist political beliefs that make privacy and security concerns a major barrier to technology acceptance. In Asian and African markets, these concerns would be largely nonexistent (albeit replaced by other factors).
posted by xthlc at 6:05 AM on June 20, 2006


While I can't quite go all the way with you in characterizing discomfort with everyware as a uniquely Western thing, I do think you're right that these issues simply do not seem to be as germane elsewhere - notably in two of the cultures likely to be key players in the early development of the field, Korea and Japan.

And that's why I think it's crucial that we understand how cultural attitudes about things like privacy can get designed into systems (whether consciously or not) at the level of their architecture, and far too deeply to be easily countered by the human being eventually exposed to them.

And while I thoroughly agree that an independent body to certify a given ubiquitous system's compliance with ethical standards would be a useful thing, some have argued that this doesn't go nearly far enough - that the right to deploy countermeasures aimed at defeating such systems is absolute and should not be negotiated away.

These are obviously political questions, and they will have differing resolutions in every polity and culture where everyware appears. My concern is simply that we haven't even begun to raise the issue framed this way, and that a conversation only among technologists and those comfortable with technology is no kind of conversation at all.
posted by adamgreenfield at 8:13 AM on June 20, 2006


xthlc: Of course, these kinds of technologies may have explicit or implicit political philosophies embedded in them, to enforce certain kinds of power relationships between producer and consumer or between users.

The use of the word "may" here strikes me as a bit naive. A better approach in my mind is to just say that every human artefact is associated with certain socio-political practices. You really can't have an apolitical technology.

Another view of the internet btw is that it's simply become a new site for many of the same old power struggles and hierarchies. (Metafilter provides a great example of this in action, unfortunately.)
posted by KirkJobSluder at 8:42 AM on June 20, 2006


I should point out, too, that I believe that not all anxiety at the prospect of ubiquity is naive or uninformed.

I've spent long enough in the industry to see how the process of technological development works, in a very intimate way. And I've seen time and time again how issues of user experience are shunted aside, considered only as a hygienic step toward the end of the process, or even disregarded entirely. And so I make a direct connection between this cavalier "value engineering" and the manifest dissatisfaction I experience with so many digital products and services. (In this regard, I think of the three days my friend Molly recently had to spend wrestling with the cascading failures that affected nearly every application on her machine, simply because she had forgotten to add the "http://" to a proxy configuration setting - and she's no newbie, either.)

There's no reason to believe that the development process will suddenly discover a new regard for users' comfort and sanity as information technology moves outward from the desktop into architectural and public space. In fact, I tend to see the opposite happening. And if the prospect of the Blue Screen of Death is annoying in the PC context, it's well-nigh intolerable when considered in the context of cooking a meal, getting on the subway or opening a door.
posted by adamgreenfield at 8:46 AM on June 20, 2006


adamgreenfield: Perhaps one of the more frustrating things about the late 20th century is how "technological development" has been almost exclusively appropriated by computer systems development. Somehow, when we talk about "technology" we are not talking about the design of plastic sporks, or coffee creamer containers.

Which is perhaps part of the problem. People who design information technology frequently come with the attitude that "everything changed" in '94. If you are lucky, you get some who have read Bush and Ted Nelson. But very few who consider themselves as following in the footsteps of Frank Lloyd Wright, da Vinci, or the anonymous Clovis community.

There is certainly no lack of user-centered technological artefacts out there. We just don't see spoons, fast-food, and houses as "technology."
posted by KirkJobSluder at 9:14 AM on June 20, 2006


We just don't see spoons, fast-food, and houses as "technology."

Good point -- though I bet Greenfield does.

So does Geoff Manaugh of BLDGBLOG, one of my favorite places on the Internet.
posted by digaman at 10:17 AM on June 20, 2006


adamgreenfield:
And while I thoroughly agree that an independent body to certify a given ubiquitous system's compliance with ethical standards would be a useful thing, some have argued that this doesn't go nearly far enough - that the right to deploy countermeasures aimed at defeating such systems is absolute and should not be negotiated away.

Mmm. I would certainly agree with that last point. I simply bristle at the implication that ubicomp development must be regulated or controlled in a manner similar to, say, biotech.

There's no reason to believe that the development process will suddenly discover a new regard for users' comfort and sanity as information technology moves outward from the desktop into architectural and public space. In fact, I tend to see the opposite happening.

Interesting. I suppose I have a different perspective, since I have a background in HCI. In the past ten years I've seen quite a lot of improvement in the job market for designers and human factors specialists -- more and more companies are investing in user-centered design as a means of building better products and keeping their customers around. Furthermore, I've personally seen that as disciplines such as architecture and computer science mingle, a human-centered perspective inevitably becomes a part of the resulting culture. Now, whether the human in question is actually the user, rather than the designer, is a bit of a sticking point, but then that's been a problem with design since the 19th century.

Yes, a blindness to human issues in technology is still a problem. But to my mind, it's one that several very smart people are working to correct at a cultural level within the technology industry.

KirkJobSluder:
The use of the word "may" here strikes me as a bit naive. A better approach in my mind is to just say that every human artefact is associated with certain socio-political practices. You really can't have an apolitical technology.

Hmm. OK, you can say that, but for many technologies I bet you'd be stretching the association between artifact and practice to the point where the claim that the artifact is necessarily "political" due to its use becomes tenuous. For example, how is the metal fork a political technology? Certainly, you can make some claims about mass production, reusability, the family meal, etc. But you'll sound pretty silly.

I think a better way to think of it is that political implications result from the interaction of an artifact and a context. You can have very direct implications when a designer correctly anticipates all of the potential contexts in which an artifact might be used, but sometimes the results are unintentional and cannot quite be said to be "embedded" in the artifact in any meaningful way.

One of my favorite examples of this is the AK-47, a technology that sure as hell seems political on the surface. It was initially developed to be a simple, reliable, and cheap weapon for mass manufacture on a superpower scale.

The AK-47 proved to be a great weapon for the Soviet military. However, it has had the additional and unanticipated effect of fundamentally altering the power dynamics of dozens of developing countries. Most of the parts can be made or repaired by a village blacksmith. You can pound tent stakes with it and it still fires true. It didn't take long for it to propagate across the border. Kalashnikov was a patriot, but he invented the weapon that guaranteed his country's defeat in Afghanistan.

Are the politics of peasant uprising inherent in the design of the weapon itself? No, but the interaction of the AK-47 with the vagaries of the post-colonial world has imbued it with a set of implications that are far greater than its designer's intent. The AK-47, as a gun, is just a cheap way of killing people. The AK-47, as a sociotechnical artifact, can have a variety of political implications depending on where and how it is produced and used.

Langdon Winner wrote a great essay on this, entitled "Do Artifacts Have Politics?" (in summary, yes and no :).
posted by xthlc at 10:36 AM on June 20, 2006


Winner's essay is utterly fascinating! Thanks!
posted by digaman at 10:56 AM on June 20, 2006


xthlc: Thanks for posting the Winner article because I think it helps to clarify a bunch of stuff. I think there is a third possibility that he doesn't really address, which is that the use of technological artifacts becomes necessary for participation within certain communities.

And one of the reasons why I state my claims pretty strongly is as an antidote to naive technological determinism that is so often expressed.

Hmm. OK, you can say that, but for many technologies I bet you'd be stretching the association between artifact and practice to the point where the claim that the artifact is necessarily "political" due to its use becomes tenuous. For example, how is the metal fork a political technology? Certainly, you can make some claims about mass production, reusability, the family meal, etc. But you'll sound pretty silly.

Why would this be silly? You can look at the politics of this in multiple ways:
1) The economics of how something like a stainless steel fork gets in the silverware drawer. This would be an application of Winner's second theory, that some artifacts (like a stainless steel fork) can't exist without other social systems and artifacts in place. In addition, the fork has impact on some other technologies such as food recipes.
2) Knowing which fork to use and how to use it is required for participation within more powerful communities. And this is one of the key ways in which technologies can be used to reinforce traditional power structures.

You and I can take forks for granted because we live in a culture where they are ubiquitous. Here is a challenge: if you really believe a fork is not political, go to a nice restaurant, chow down on a big plate of linguine with your fingers, and watch the reactions.

Of course, the politics of an artifact may turn out to be quite different from what was intended by the designer. That's OK. Technologies, if they are going to do something other than sit on a shelf or persist as an idea in a patent application, must become part of socio-political systems. (Which is why I'm skeptical of claims to "disruptive" technologies and singularities.) Cultures and technologies dance together. Incompatible technologies are rejected if trivial, and subverted if inevitable.

And I suspect that our disagreements are largely semantic. But my opinion is that the critical position that talks about how technologies are political leads to better design than saying they might be political.
posted by KirkJobSluder at 12:20 PM on June 20, 2006


KirkJobSluder:
And I suspect that our disagreements are largely semantic. But my opinion is that the critical position that talks about how technologies are political leads to better design than saying they might be political.

OK, seems reasonable. I suppose I'm just wary of falling into the Marxist trap of seeing everything in terms of power relationships. :)

I suppose I see a spectrum of "inherency" for an artifact's political implications. Sometimes the politics are pretty close to the nature of the artifact -- the designer may have even intended them. Sometimes they are almost completely the result of the role that the artifact takes in society -- for your fork example, one may just as well have chosen chopsticks and found the same implications (although perhaps food presentation would be different).

In any event, this is getting rarefied, and I should really get back to work on my paper . . .
posted by xthlc at 1:33 PM on June 20, 2006


You say it's getting rarefied; I say it's just getting interesting.

You could probably use chopsticks v. forks as the jumping-off place for a long comparison of Asian and European society. Not just forks, but standardized utensils in general. How common are forks and dull-edged knives at various levels of society at various times, and how does that compare with manufacturing capability?

For example: Prior to the advent of mass-manufactured metal items, forks were probably not common below a certain social stratum. The principal culinary tools below that level were probably spoons (relatively easy to make) and sharp-edged knives, with appeal to fingers and "edible tools" like soft flatbreads. The advent of the mass-produced fork probably marks a significant socio-economic watershed.

Chopsticks are more like a spoon than they are like a fork, in that they're cheap and easy to manufacture and may be used all up and down the social ladder. And so the idea of not eating with your fingers can be pushed down to fairly low social strata. One place this plays out is the Chinese idea that to use chopsticks 'like a Japanese' (i.e., having your fingers low on the sticks) is crude, because they think of Japanese as barbarians who eat with their fingers. (And I suppose that little war back in the '30s-40s might also have something to do with it...)
posted by lodurr at 5:06 AM on June 23, 2006




This thread has been archived and is closed to new comments