But, but, is not Johnny 5 alive?
January 19, 2011 3:26 AM   Subscribe

Sherry Turkle of MIT once fell in love with a robot. And was promptly repulsed by her own reaction. Via the always great Arts and Letters.
posted by converge (102 comments total) 96 users marked this as a favorite
 
She also has a book out this month.
posted by gene_machine at 3:36 AM on January 19, 2011


Which the article does actually mention...
posted by gene_machine at 3:37 AM on January 19, 2011


She ought to thank her lucky stars that she never let the robot make it to home base.
posted by UbuRoivas at 3:43 AM on January 19, 2011 [1 favorite]


gene_machine: Have no worry. I'm hardly a publicist.
posted by converge at 3:49 AM on January 19, 2011


Have you guessed the name of Billy's planet? IT WAS EARTH! DON'T DATE ROBOTS!

More seriously, though, RX-6309 and I are very happy together.
posted by kyrademon at 3:56 AM on January 19, 2011 [13 favorites]


Filthy robosexuals.
posted by Rangeboy at 4:05 AM on January 19, 2011 [7 favorites]


One day during Turkle's study at MIT, Kismet malfunctioned. A 12-year-old subject named Estelle became convinced that the robot had clammed up because it didn't like her, and she became sullen and withdrew to load up on snacks provided by the researchers. The research team held an emergency meeting to discuss "the ethics of exposing a child to a sociable robot whose technical limitations make it seem uninterested in the child," as Turkle describes in Alone Together. "Can a broken robot break a child?" they asked. "We would not consider the ethics of having children play with a damaged copy of Microsoft Word or a torn Raggedy Ann doll. But sociable robots provoke enough emotion to make this ethical question feel very real."


I used to pity my sisters' naked Barbie dolls and would clothe and groom them from that pity. I had similar feelings towards canned cranberry sauce at Thanksgiving: it sat on a fine, cut-crystal dish, yet no one would eat it. "Poor cranberry sauce," I thought, and I'd make myself eat it--not for the waste of mere inanimate food, or some thought of a starving human without canned cranberry sauce, but for the living animistic functional spirit which the cranberry sauce possessed, its unreached potential being-toward--becoming.

"Can canned cranberry sauce break a child?" Well, she never made it to Harvard, and the world may never know.
posted by eegphalanges at 4:08 AM on January 19, 2011 [91 favorites]


Oh god, this reminded me of when my husband and I had a Shelby years ago, which is a shellfish version of a Furby. The video shows you just how cute it is; it has a cute little laugh and it's always making cute little sounds and saying it loves you and stuff.

Except Shelby doesn't stop talking unless it doesn't get any response for five minutes or something... and ignoring it is agonizing, because it's being cute, and you just feel so awful when it says it loves you, or it tries to tell you a knock knock joke, and you know you can't respond. He'll outright say things like, "I want to PLAY!" and you feel like the worst person in the world.

We finally had to turn it off and leave it off forever because it made us feel so bad. And it was absurd and crazy-making because we knew it was just a machine and it didn't know anything we did to it. That was the worst part, just knowing how irrational it was and how a machine could make us feel like horrible people. We still remember him, though, and sometimes we'll make the sounds he made. Every now and then, if one of us makes a Shelby sound, we have an exchange like this:

Husband, imitating Shelby: Knock knock!
Me: Oh god. No.
Husband: Hah hah hah! Knock knock!
Me: Oh no. Oh no.
Husband: Knock knock!
Me: ... who's there?
Husband: Shelby!
Me: Shelby who.
Husband: Shelby's going to stick his head in the oven because you're ignoring him! Hah hah hah!

We still feel horrible just remembering Shelby. When I came across Shelby later in a box, I literally felt sick with guilt, just leaving him in there. When my husband saw what I was holding, his face just drained and he let out a panicked, "FUCK!" and tried to leave the room, but I wouldn't let him. Instead, we just both stared uncomfortably for a moment. "What should we do with him?" I asked in a hushed voice, but neither of us wanted to deal with it. I think we just kind of shoved it somewhere; it was so rushed I can't even remember.

But that just makes it worse, because whenever I go through the closet or an old box now, I have this apprehensive feeling that I'm going to find Shelby in there.

And it's not just Shelby. We named our Roomba -- Ricky -- and all it did was vacuum the carpet and smash into things but it had these weird person-like qualities. When Ricky got stuck in a corner and started furiously backing up and rotating, backing up and rotating, we'd frown and stand watch over him, concerned: "What are you doing, Ricky?" When he couldn't get himself unstuck, we'd sigh, pick him up -- "Oh, calm down" we'd say when he whirred in the air -- and put him back down, like he was a toddler learning to walk. And when he finished cleaning the room and sang that -- er, emitted that triumphant little chime, his joy was our joy.

Except he doesn't feel joy. One day I hope to better internalize that because Ricky made us feel bad, too. For one, he couldn't hold a battery charge to save his life, you could say, because it felt like we were always saying, "Ricky's dead," "Ricky died," or my favorite, "Why you dead, Ricky?" And it was all the worse when he'd die in the middle of vacuuming, because it was like, wow, he loved us so much that he gave his life for our carpet. When we replaced Ricky with a soulless Dyson I felt awful -- until my husband took Ricky to his office at work permanently, so it's like Ricky got to go live on a farm and play with new people who will appreciate him.

Robots, man. They're nothing but heartbreak. Robots ain't shit.
posted by Nattie at 4:14 AM on January 19, 2011 [1176 favorites]


Brilliant comment, Nattie! Also, have I got a post for you...
posted by MonkeyToes at 4:26 AM on January 19, 2011 [4 favorites]


Well, the implication of her work is that robots can't have feelings. But why not? Can't a robot be programmed to love? What's the difference between electro-chemical signals in the brain and electrical signals in silicon?

I'm not saying it's a good idea or anything, but certainly it seems possible.

Also, these people are thinking way too hard about this stuff. A robot stopped working and a little kid got sad. But most kids get sad when their toys break.
posted by delmoi at 4:39 AM on January 19, 2011 [1 favorite]


"Can a broken robot break a child?" they asked. "We would not consider the ethics of having children play with a damaged copy of Microsoft Word or a torn Raggedy Ann doll. But sociable robots provoke enough emotion to make this ethical question feel very real."

"Have you felt yourself to be exploited in any way?"
posted by Blazecock Pileon at 4:46 AM on January 19, 2011 [2 favorites]


I used to pity my sisters' naked Barbie dolls and would clothe and groom them from that pity.

My wife can't get rid of stuffed animals "because they wouldn't be able to breathe in the plastic bag".

I had similar feelings towards canned cranberry sauce at Thanksgiving...

It is possible to get my wife to pet a crouton.
posted by DU at 4:47 AM on January 19, 2011 [210 favorites]


The Shelby story reminds me of my little sister's Furby, which used to babble on and on in Furby language as she became more and more bored with it. One day I got out the instruction leaflet it came with and looked up a couple of the sound sequences it kept coming up with; its preconfigured utterances were translated in a table which spanned a couple of pages. With horror, I realised that it was saying 'I'm so lonely' and 'Please play with me', over and over again. She hadn't even bothered to learn the Furby language so as to be able to understand what it meant. I still want to shrivel up with guilt and sadness just thinking about it.
posted by Acheman at 5:00 AM on January 19, 2011 [98 favorites]


I love that AskMeFi thread, thanks!

My wife can't get rid of stuffed animals "because they wouldn't be able to breathe in the plastic bag".

Oh man, normally I don't care much about stuffed animals, but when it comes time to get rid of them, I start feeling guilty. You pick one up, and then suddenly you're looking at its little face, and you're like, I'm just going to get rid of him? How do I even do that? I just... put him in this box? And then he's gone forever? Where will he go?

I had decided to confront the irrationality head on by doing the most macabre thing I could imagine: I was going to collect mine for years -- I never buy them because they're not very functional, but people give them as gifts often enough that I had a drawer full -- until I had enough that I could roughly stitch them together with twine and cover them in fake blood, either as a room divider or a wall hanging. I was completely serious about this. I've wanted a bizarre room for artistic activities for a long time, so that was just going to be part of it. Then if I got writer's block or something, I could look up and see the stuffed animals, then... something... then profit.

Look, I'll admit I didn't think this through very carefully. In my world, I like to think you just cover stuffed animals in fake blood and things work out somehow.

Fortunately or unfortunately, depending on how you look at it, it turns out my friend has a husky who really loves stuffed animals -- she carries them everywhere and sleeps with them. So now my stuffed animals get to be loved by someone who won't feel weird about it instead of being part of my freak show, and the world is saved from unspeakable horror. I feel like the villain of a Care Bears episode.
posted by Nattie at 5:09 AM on January 19, 2011 [57 favorites]


Sometimes I creep myself out by imagining that the power cord to my laptop is an IV drip of nutrients for the computer. Or that SF Muni cars are a trusty species of large silver caterpillars who don't mind if we hitch a ride.

Anyway, this article makes the book sound like the usual "dangers of technology" piece — saying that robots replacing some human roles is a bad idea, and observing that technology shortens people's attention spans. I hope the book is much better than that. I really enjoyed Life on the Screen when I read parts of it recently — I've found some of her observations about MUDs and MOOs still relevant to understanding anonymous and pseudonymous environments like 4chan and fandom. That book describes interesting examples and pulls from psychology and psychoanalysis to provide insight and context, with a background argument that there are important things happening online that the general public doesn't get yet (but with some explanation can understand).
posted by dreamyshade at 5:29 AM on January 19, 2011 [7 favorites]


TBH, I stopped reading A+L Daily years ago when it seemed to move in a weird direction: there appeared to be a distinct editorial slant to the articles picked & it didn't really appeal. Has it improved since?
posted by pharm at 5:29 AM on January 19, 2011 [2 favorites]


I think some people in this thread need to re-watch Battlestar Galactica. Robots and people do NOT mix!
posted by digitalprimate at 5:35 AM on January 19, 2011 [3 favorites]


She ought to thank her lucky stars that she never let the robot make it to home base.

Ja, I can't afford the upkeep in batteries alone...

c'mere bob
posted by infini at 5:40 AM on January 19, 2011


Case 202: Ikea Lamp
posted by Lord_Pall at 5:41 AM on January 19, 2011 [18 favorites]


And then there's Wall E who can still make me sniffle...
posted by infini at 5:45 AM on January 19, 2011 [2 favorites]


Sherry Turkle sounds like something my mother would make with Christmas leftovers. Never a pretty sight.
posted by le morte de bea arthur at 5:55 AM on January 19, 2011 [13 favorites]


Phalene:

MeFi, I must say, is a wonderful thing. I would like to slap it on toast first thing in the morning.

In one response you give me a phrase that, while I will probably never use it to its full potency, I will never forget -- "crouton petters."

Also, you make me recall a song I had all but forgotten about, Doll Hospital.
posted by timsteil at 5:58 AM on January 19, 2011 [3 favorites]


This is exactly what's dealt with in Ted Chiang's short (not all that short) story "The Lifecycle of Software Objects" (that I incidentally found from Rhaomi's excellent Ted Chiang post). In the story, people develop AI that mimics human learning and emotion. The rest of the story is about dealing with the effects of people coming to love them. Excellent read.
posted by This Guy at 6:00 AM on January 19, 2011 [6 favorites]


We are required to remind you that your crouton will never threaten to stab you and, in fact, cannot speak. In the event that your crouton does speak, we urge you to ignore its advice.
posted by Electric Dragon at 6:01 AM on January 19, 2011 [13 favorites]


I think I understand, a little more, why pets are so popular.

Also, isn't this another manifestation of the phenomenon explored by the still face experiment?
posted by amtho at 6:08 AM on January 19, 2011 [1 favorite]


pharm, have you tried The Front Section? It's like aldaily but without the reactionary slant.
posted by mediareport at 6:15 AM on January 19, 2011


Well, the first thing, which should be kind of obvious since farmers have known it for about 10,000 years, is that you don't name things you don't want to humanize.

I see an interesting convergence between this research with doe-eyed mechanical face robots, the trend toward photorealism and social interaction in virtual environments like WoW and Second Life, mechanical pets like Furbies and Nattie's Shelby, and, to jump a fence that is not all that high, RealDolls.

I recently mentioned V.S. Ramachandran's theory that art is the search for "super reactions" where various brain modules provide a stronger response than ever occurs in a natural situation. All of the things here strike me as being "social art" in that context. Humans have always anthropomorphized our animals and toys; we're very good at it and Turkle even mentions it. If we can fall in love with a totally inanimate object, how strong a reaction can we form to something that is not just human, but perhaps super-human in its manipulation of our perceptions? After all, real people don't have Anime-style doe eyes like robots and avatars.

This is why I have always thought that if humans ever get either Holodeck or Caprica-style virtual worlds that are fully immersive, or even better the Singularity moves us all into nerd heaven, people are going to turn seriously perverse.
posted by localroger at 6:17 AM on January 19, 2011 [10 favorites]


My wife can't get rid of stuffed animals "because they wouldn't be able to breathe in the plastic bag".

THIS. Not me, mind you, but my wife. If we buy a stuffed animal, its head has to be sticking out of the plastic bag or she'll freak out. She collects porcelain animals with hair glued to them -- those, she doesn't care about; it's specifically stuffed animals. She also names our cars and talks to them.

At the St Vincent De Paul here in town, for a very long time they had a small stuffed Mickey Mouse pinned to the bulletin board, one thumbtack through each ear, right behind the cash register. Checkout time was an emotional ordeal. The place was staffed entirely by old Norwegian ladies, and they only laughed when my wife pointed out the horror right behind them.
posted by AzraelBrown at 6:39 AM on January 19, 2011 [4 favorites]


"There are at least two ways of reading these case studies," she writes. "You can see seniors chatting with robots, telling their stories, and feel positive. Or you can see people speaking to chimeras, showering affection into thin air, and feel that something is amiss."

I like to think of it as childhood and old age being so close to the tomb/womb and nothingness, that we're more aware of our being spirit trapped in matter at those ages. And there are Buddhist tales of saints licking maggots from dying/dead dogs' backs and such to consider. We don't like to think of ourselves as fleshy contingencies, MIT scientist or not. We care, we matter.

Here's a bit of Rollo May from Love and Will, the chapter, "The Meaning of Care":

The new basis for care is shown by the interest of psychologists and philosophers in emphasizing feeling as the basis of human existence. We now need to establish feeling as a legitimate aspect of our way of relating to reality. When William James says, "Feeling is everything," he means not that there is nothing more than feeling, but that everything starts there. Feeling commits one, ties one to the object, and ensures action. But in the decades after James made this "existentialist" statement, feeling became demoted and disparaged as merely subjective. Reason or, more accurately, technical reason was the guide to the way issues were to be settled. We said, "I feel" as a synonym for "I vaguely believe," when we didn't know---little realizing that we cannot know except as we feel.

I don't think Turkle's concerns are any different than those of William James', Rollo May's or Aeschylus' time. We just play with different toys, and our spiritual actuaries are less patently animistic than ever before, but it's still Perky Pat to me. "There are at least two ways of reading these case studies." Again: At least two. She gives no upper limit to the ways of reading these case studies, so go apeshit, you know, marry that vacuum cleaner and we'll all ponder the morality of it. And this is why I chose the career of massage therapy.
posted by eegphalanges at 6:39 AM on January 19, 2011 [6 favorites]


Sometimes I creep myself out by imagining that the power cord to my laptop is an IV drip of nutrients for the computer.

When I plug in my iPhone to recharge, it is often with the tenderness a mother would feel settling her baby at the nipple. But you could attribute that to the reality distortion field.

She's concerned about robots that want to be buddies, implicitly promising an emotional connection they can never deliver.

Who's the one getting sucked in now? Robots don't "want" or "promise" anything.

Also: like love, emotional connection is in the mind of the person seeking it. [See: stalking] It is not "delivered". It is, if you like, manufactured.
posted by Joe Beese at 6:41 AM on January 19, 2011 [2 favorites]


Does a Rampant Rabbit count as a robot?
posted by PeterMcDermott at 6:43 AM on January 19, 2011


I read a short story by Asimov once, in an obsolete 1950s collection. It was about a cold man who works in a robot company and neglects his stay-at-home wife, while admiring his flashier coworker. His wife has very low self-esteem.
One day he brings home a household robot for her to test and the wife is repulsed, both by the robot and her own desire to connect with it. The robot, who quickly acquires the name Tony, true to its mission to help humans, proceeds to help her redecorate the house, gives her pep talks and finally kisses her in full view of the evil coworker.
The woman freaks out and the robot is sent back to be fine-tuned.

Anyway, the most memorable part is the last one by Dr. Susan Calvin, who chastises her co-scientist for saying they can't have a robot making love to its mistress. "You don't understand. Robots can't fall in love. Women can."

It was such a cheesy story and I still like it so much!
posted by Omnomnom at 7:02 AM on January 19, 2011 [3 favorites]


My wife can't get rid of stuffed animals "because they wouldn't be able to breathe in the plastic bag".


I am a full-grown man with a rational outlook and viking hair, but I still have Donkey Tiberius Donkey. I take him to every hotel room I go to, so he won't be lonely.

He is also my Legal Counsel.
posted by The Whelk at 7:03 AM on January 19, 2011 [88 favorites]


Can't a robot be programmed to love?

I am so sick and tired of all the robot stereotypes. Robots fill a number of wide and varied roles, including chess-playing, robo-cop, and, of course, termination.

But one company comes out with a line of 'Love-bots', and that is ALL YOU HUMANS TALK ABOUT.
posted by Comrade_robot at 7:25 AM on January 19, 2011 [15 favorites]


Some people are inclined to be dismissive of this, because people anthropomorphize shit all the time, it's nothing new, and therefore to point out that robots are particularly good at exploiting this is not insightful... But to me that seems like pointing out that humans are greedy bastards, and therefore pointing out that something like the 419 scam is particularly good at exploiting this is not insightful. But people have lost millions on the 419. We know they're not real, but we don't feel they're not real, and oh, the uses you could make of this fact...

Reading this makes me realize how key reciprocity is in tempering us, in making us healthy social beings... a commentator on the original piece argues that your cat doesn't care about you. I dunno if cats truly love their owners, but they sure won't come if you kick 'em. What would constantly interacting with pseudo-beings who don't stop loving you if you're cruel, or start comforting you if you suffer, do to a munchkin's mind? Makes me feel creepy to wonder...
posted by Diablevert at 7:27 AM on January 19, 2011 [4 favorites]


I used to pity my sisters' naked Barbie dolls and would clothe and groom them from that pity.

Awfully off-topic, but the word "groom" has now changed meaning for me to such a degree that this sentence seemed uncomfortably freaky to me. Weirdness.
posted by seanyboy at 7:28 AM on January 19, 2011


I'm really interested to see when a public intellectual who was on the internet thing so early (as Turkle was) suddenly becomes a skeptic. About-faces in intellectual positions are always fascinating to me. That said...

I think she suffers from the myopia of the specialist. For millennia we have been making artifacts that simulate human interaction, and the claim that these artifacts promise more than they deliver has been something of a constant drumbeat.

Here's a voice of reason from Critical Review, 1765:

From the usual strain of these compositions, one would be apt to conclude, that love is not only the principal, but almost the sole passion that actuates the human heart. The youth of both sexes are thereby rendered liable to the grossest illusions. They fondly imagine that every thing must yield to the irresistible influence of all conquering love: but upon mixing with the world, they find, to their cost, that they have been miserably deceived; that they have viewed human nature through a false medium.

And another from The Guardian, in 1820:

The argument used by many in favour of novel reading, is that novels display character, describe men and manners, depict the human heart, and make youth acquainted with the world. But is this correct? Is an accurate description of man, of the manners and customs of the world, usually given in such productions? Are not the scenes so highly wrought, and the characters drawn so perfect, that youthful expectation is so greatly raised, that the every day scenes of common life, in which he will probably be called to act, seem to him insipid, if not disgusting. He loves nothing that is common; with him it is vulgar.

And now here's the linked article, with in-line quote by Turkle:

Kismet can't reciprocate friendship, after all, or prepare kids for the more dynamic experience of interacting with people.

"What if we get used to relationships that are made to measure?" Turkle asks. "Is that teaching us that relationships can be just the way we want them?" After all, if a robotic partner were to become annoying, we could just switch it off.


The mistake Turkle makes, which I think is sadly a common one, is to believe that her object of study is fundamentally different from all human endeavors that preceded it. It's difficult, though, to take a long view, and to take jeremiads on the psychologically destructive force of the most recent interation of human/techne interaction too seriously.
posted by Pickman's Next Top Model at 7:31 AM on January 19, 2011 [25 favorites]


iteration, not interation, in the last line. :P
posted by Pickman's Next Top Model at 7:33 AM on January 19, 2011


She may have a point. I read this article while attempting to (halfheartedly) listen to my History lecture this morning.

Maybe we should put down the tech for at least a little while.
posted by ralenys at 8:02 AM on January 19, 2011


I've told this story before, but I fell in love with food twice in my life, and made love to it/made out with it. Once was a more "dirty" experience...

The Hostess cupcake. I had some fucking primo Northern Lights, I was hallucinating and seeing patterns in 3D on the back of playing cards, everything was sooooo detailed and in depth, I thought the shit was laced, I swore it was, but the person I was with insisted it was just really fucking good weed. Up in the boonies in Wisconsin, no less.

Anyways, I had a cupcake, and the creamy filling. Oh dear god. That sweet silky smooth sugar-high cream. And the velvety feeling as it glided around my tongue. Oh delicious and glorious. I made out with that cupcake as if my life depended on it. And I swear that I think it liked it, it knew it was being loved and enjoyed in a way only a stoned ape could enjoy it.

And another time, again, high on weed, a year or two later.

I had a plum, oh, and this plum, its flesh, so red, pulsating, writhing... it had these veins, throbbing. The nectar, again, so sweet, dripping and juicy. Not too tart... Just perfect, and as I watched its flesh I heard it calling for me to devour it. And I did, and it was divine.

But it was more pure. I was making love to it, in a way that I couldn't to a cupcake. The cupcake? That was dirty, filthy, pure make-out fuckery. But this plum. Oh my god, it was the most glorious love one could have with a fruit. It was a communion of the gift of mother earth to me.

So yes, it's possible to fall in love with inanimate objects.

Call me when you can eat robots.
posted by symbioid at 8:09 AM on January 19, 2011 [7 favorites]


So yes, it's possible to fall in love with inanimate objects.

*stares goofily at screen, muffles titter, strokes mouse button*

*starts playing with the keyboard again*
posted by infini at 8:43 AM on January 19, 2011 [1 favorite]


ignores hairy eyeball from other side of the room
posted by infini at 8:46 AM on January 19, 2011


My husband got me a Segatoy chick robot for Christmas. It peeps (in several different ways) and flutters its wings when you pet and hold it. How it knows--I have no idea. It is so agonizingly cute that I have refused to name it because I don't want to feel any more responsible for its well-being than I already do.
posted by heatvision at 8:56 AM on January 19, 2011 [3 favorites]


dreamyshade: Sometimes I creep myself out by imagining that the power cord to my laptop is an IV drip of nutrients for the computer.

Then you do not want to see this video. Warning: pulsing iPod charger looks really life-like. Designed by interactive media artist Mio I-zawa. Again, warning: creepy, pulsing pseudo-flesh.
posted by filthy light thief at 9:14 AM on January 19, 2011 [1 favorite]


And if you think cute robots are bad, just wait 'til you get babies. Those fuckers are insidious.
posted by filthy light thief at 9:15 AM on January 19, 2011 [6 favorites]


Robots tend not to barf, puke, pee and poop
posted by infini at 9:23 AM on January 19, 2011 [1 favorite]


Not yet, anyway.
posted by Zozo at 9:32 AM on January 19, 2011 [2 favorites]


This Guy: I immediately flashed to the same story. BTW, he came to the SciFi Fest at my local library last fall and read a story about an imaginary museum object that was a mechanical nurse. Awesome & bizarre. (I think it was part of a collection with other authors & artists, but I don't remember any more than that.)
posted by epersonae at 9:36 AM on January 19, 2011


The Japanese have introduced the roboseals mentioned in the article to babysit their elders. Personally, I think it's one thing if a patient has dementia and a baby doll helps. But if the elder is otherwise all there and is just starved for human attention...

The Japanese live so long (and aren't having enough children) that they're looking more and more into robocare as a solution to their population-implosion-time-bomb. The Current report talks about how because of Shinto, Japanese culture is very open to the idea of animism and inanimate objects having spirits. This seems to make them very open to robots in society (more so than immigrants). But anime & manga are littered with robots that have souls.

Unless this has been revealed to be some sort of elaborate art project, the guy who married his video game might just be the end point of this.

Pirates don't trust robots, no we don't!

Come the Butlerian Jihad, this monkey is gonna grab a shillelagh and swing for the sensors first.
posted by Pirate-Bartender-Zombie-Monkey at 10:15 AM on January 19, 2011 [2 favorites]


very strict pruning (always to thrift shops where I knew that they'd be loved)

...I buy stuffed toys at thrift shops, cut out plastic eyes and noses, and give them to dogs. Well, at least they do get well-loved.
posted by galadriel at 11:18 AM on January 19, 2011 [2 favorites]


She argues that robotics' growing trend toward creating machines that act as if they were alive could lead people to place machines in roles she thinks only humans should occupy.

Her prediction: Companies will soon sell robots designed to baby-sit children, replace workers in nursing homes, and serve as companions for people with disabilities. All of which to Turkle is demeaning, "transgressive," and damaging to our collective sense of humanity. It's not that she's against robots as helpers—building cars, vacuuming floors, and helping to bathe the sick are one thing. She's concerned about robots that want to be buddies, implicitly promising an emotional connection they can never deliver.


Circa 15,000 years ago, on the outskirts of a rudimentary settlement:

"Look, Thag! I found Dog! Dog eats scraps! Dog is warm! Dog loves me!"

"NO! Torga wrong! Dog not love! Dog not like us, not have soul! Dog implicitly promising an emotional connection it can never deliver!"

"...but Dog soft, Dog smart! Dog can help us!"

"Dog bad! Dog trick Torga, replace Man! No Dogs!" *storms off*

"...Torga love Dog anyway. Thag understand... someday."

tl;dr: our irrational tendency to love things that aren't human as if they were human has paid off before, big time. Check back after a few more centuries of cultural and literal human co-evolution with machines, and see if anybody's still eager to weep over "roles only humans should occupy."

p.s. helpers who build cars, vacuum floors, and help bathe the sick yet aren't loved -- those are called slaves. Your call as to whether that's more damaging to our collective sense of humanity than petting a goddamned crouton.
posted by vorfeed at 11:25 AM on January 19, 2011 [21 favorites]


Robots tend not to barf, puke, pee and poop

That's not a bug, it's a feature.
posted by localroger at 11:51 AM on January 19, 2011 [1 favorite]


The noise that Furbies make when they're running out of batteries is kind of maddening. One time my Furby ran out of batteries while I was having an IM conversation with my Dad.

Me: Oh no, my Furby just ran out of batteries.
Dad: You could put some money in his beak and send him out to the store to get more batteries.
Me: He can't go to the store. He's out of batteries!
posted by roll truck roll at 11:58 AM on January 19, 2011 [3 favorites]


Also: reading about the studies with children, I kept wanting them to put a robot in the room with more than one kid. In my experience, when you give a group of kids an unusual thing, they work together on figuring out that unusual thing. "Hey, what happens when you do X?" Which is a pretty healthy reaction. Being jealous of the robot's interaction seems to me like a more adult reaction.
posted by roll truck roll at 12:02 PM on January 19, 2011


This robot, it vibrates?
posted by klangklangston at 12:03 PM on January 19, 2011 [1 favorite]


Omnomnom, that was my first thought too: Asimov's "Satisfaction Guaranteed."
posted by MonkeyToes at 12:16 PM on January 19, 2011


Her prediction: Companies will soon sell robots designed to baby-sit children, replace workers in nursing homes, and serve as companions for people with disabilities. All of which to Turkle is demeaning, "transgressive," and damaging to our collective sense of humanity. It's not that she's against robots as helpers—building cars, vacuuming floors, and helping to bathe the sick are one thing. She's concerned about robots that want to be buddies, implicitly promising an emotional connection they can never deliver.

I agree with her prediction, but I guess I can't get on board with her judgement.

Similar to vorfeed's point, it seems like those are all roles that animals have held since ... well, forever.

The old lady with cats; the dog as companion for the kids ... is using animals in those roles "damaging to our collective sense of humanity"? Neither of them strike me as optimal, in that sure, it'd be better (perhaps for the cats) if the old lady had someone else to keep her company, or (again, possibly for the dog) if little Timmy had slightly more-engaged parents who didn't expect him and the dog to keep each other entertained every afternoon. But neither of them seem especially pernicious or toxic to our "humanity."

Regarding "the ethics of exposing a child to a sociable robot whose technical limitations make it seem uninterested in the child," would we be having a similar discussion over exposing a child to an animal whose biological limitations make it seem at times uninterested? Like, I don't know, goldfish. Or every cat ever. Of course not; in fact, we'd probably just tell the kid to stop pestering the damn cat, because the cat doesn't want to play with them right now. And the kid would learn a Valuable Lesson about not being the center of the universe all the time.

(Actually, although I can't remember for sure when I might have had that experience, I can remember when my younger brother did: we had an amazingly friendly and tolerant housecat who took far more abuse from children than he ought to have been expected to, and probably led to a certain misunderstanding on my brother's part about the difference between stuffed animals and the ones that moved under their own power. But that ended one time when he tried to pick up the cat while the cat was enjoying dinner. Lesson learned: don't mess with the cat while he's eating.)
Turkle visited several nursing homes where residents had been given robot dolls, including Paro, a seal-shaped stuffed animal programmed to purr and move when it is held or talked to. In many cases, the seniors bonded with the dolls and privately shared their life stories with them. [...] "you can see people speaking to chimeras, showering affection into thin air, and feel that something is amiss."

This bothers me, and not just because it's easy to imagine how, if Paro had been a cat rather than a robot, she probably never would have found it worth remarking on. What bothers me is the assumption that someone who's talking to a robot is somehow being tricked. It seems to remove agency from the person. People talk to inanimate objects, they talk to animals, they talk to themselves or to nothing in particular. People pray. In none of these circumstances do they expect a direct response. If I talked to my cat and my cat actually responded, I'd probably decide it was time to switch brands of gin.

Just because someone is talking to, and allowing themselves to bond with, a robot, doesn't mean that they're being fooled. They might be making a conscious decision to let themselves bond with it, in the same way that people let themselves bond with animals: because the act of talking to something can be cathartic, even if the thing in question doesn't ever respond.

I don't buy that we're giving up our humanity by building relationships with robots. The parallels with animals just seem too obvious. If robots ever become significantly more intelligent than animals it might force us to redefine "humanity" (probably in a way that excludes anything synthetic, in the same way that we constantly redefine humanity to exclude animals when new evidence of animal intelligence is found), but even that doesn't seem inherently toxic. However, I doubt that we'll ever design that many very intelligent robots, for the same reason that we haven't bred dogs solely for intelligence: they become hard to control and thus bad at their 'jobs' ... nobody wants Marvin, any more than they want a bomb-sniffing dog that's liable to join a union and go on strike.
posted by Kadin2048 at 12:22 PM on January 19, 2011 [18 favorites]


Reading the comments here has made me feel like a cold bastard — maybe it just comes from having an extended family (particularly a grandmother) prone to naked emotional manipulation — but robots that want to be my friend annoy me and bring out a cruelty in me that, well, I guess I'm just glad that it doesn't affect my relationships with humans much. I can cackle when a Furby cries about loneliness, and gleefully starve tamagotchi after tamagotchi.

It could also be a survival instinct raised on advertising and honed on canvassing — I get cruel towards manipulative sales-people too, and that's how these robots feel to me, insincere and malignant.
posted by klangklangston at 12:36 PM on January 19, 2011 [1 favorite]


Monkey Toes: Yes! That's what it was called!
posted by Omnomnom at 12:37 PM on January 19, 2011


Kadin —

You don't see any difference between a robot mimicking emotions and something alive? I tend to believe that pets actually feel pain and love, whereas robots only display simulacra.
posted by klangklangston at 12:38 PM on January 19, 2011 [5 favorites]


As an inveterate crouton-petter (has there ever been such a perfect phrase?!), I have to say I think this is my favorite MeFi thread of all time.
posted by Space Kitty at 12:43 PM on January 19, 2011 [6 favorites]


it turns out my friend has a husky who really loves stuffed animals -- she carries them everywhere and sleeps with them.

My sister-in-law has a Dalmatian who also loves stuffed toys; he rips them open and chews them up and gleefully spreads stuffing everywhere. So a couple of years ago my husband and I gathered up all the extra soft toys we didn't want, boxed them up, and sent them to him. That dog had fun for days.

Through very strict pruning (always to thrift shops where I knew that they'd be loved)

When George isn't getting boxes of toys from family members his owners buy him toys to destroy from the thrift store.
posted by shelleycat at 12:43 PM on January 19, 2011 [2 favorites]


My almost-three-year-old got a FurReal Friend and a Zhu Zhu hamster for Christmas. They're pretty simple but I think they're adorable. That sweet little hamster runs around the house chirping and the dog begs for some love.

But my son just doesn't care. They're no more real to him than the inanimate stuffed animals. Actually, they're no less real. And when I really think about it, he holds back when he plays with the pets-with-programmed-personalities. I think he just doesn't find them that fun.

I, on the other hand, will continue to make obstacle courses for the hamster.
posted by wallaby at 12:48 PM on January 19, 2011 [2 favorites]


klangklangston: Descartes thought that since animals lacked a soul, they didn't experience pain as such, but just exhibited the outward behaviors associated with pain.

Subsequent research has been unable to detect a soul in cats.

Okay that last one was a cheap shot. But you take my point - Descartes' definition of subjectivity (or consciousness, to be terminologically accurate) included humans, but excluded animals. At the moment our definition of subjectivity is one that's capacious enough to allow animals to be subjects, but not to allow robots to be subjects. But that dividing line starts to break down pretty quickly when you look at specifics...
posted by Pickman's Next Top Model at 12:52 PM on January 19, 2011 [1 favorite]


On the subject of whether robots or animals really experience emotion or not...

Animals do. Animals' love (and in the case of cats, disdain) is genuine and no different from our own. Just because they can't express themselves at high levels of abstraction by building atomic bombs and supercomputers doesn't make them unlike us in many fundamental ways.

In the case of robots, not so much; the makers of the robots that disquieted Turkle so much readily admit they are only "pretending" emotions, and even the makers of the Furby said as much when people started saying in public that their Furby had learned French or was advising them on Fantasy Football -- no, they readily admitted, Furbies cannot do those things; their reactions are preset, are unlocked by exercise, and are completely finite and unextendable in scope beyond what the creators devised. Just as you can figure out what the real house odds are at the casino if you really care, nobody is really pretending that these "animals" are really alive.

But I think one day, and probably not too far off, a researcher will say that he programmed a relatively simple set of reflexes that could be sharpened and enhanced through experience, and that through such experience their creation will learn, without being told how to pretend, how to be lonely, angry, sad, or to feel affection for a companion. And when that happens I think the open-minded will have to admit that what we have is an alien consciousness, one different in many ways from our own but also the same in that it lives for needs which are dynamic and self-defined and over which it has partial but not total control.

I think to be loved by such a thing would be fundamentally different from being loved by a Furby, although it might feel the same to the person at the other end of the relationship. On the other hand, while the self-developed love might be more genuine, such a creature is also a lot more likely to end up turing into Skynet or the Cylons if you piss it off.
posted by localroger at 1:24 PM on January 19, 2011 [3 favorites]


turing into Skynet or the Cylons

OK, best. typo. evar.
posted by localroger at 1:27 PM on January 19, 2011 [17 favorites]


You don't see any difference between a robot mimicking emotions and something alive?

I see the difference today, but that's because the robots really aren't that good.

But fundamentally, if you built a robot that walked like a cat, purred like a cat, and responded in every detectable way to stimuli like a cat ... I'd probably treat it like a cat. And if someone came in and kicked that 'cat,' I'd probably think a lot less of them.

The road you're going down looks a lot to me like Cartesian dualism, which I see scant or no evidence for, and absent any evidence I am not inclined to lend it much credence.
posted by Kadin2048 at 1:31 PM on January 19, 2011 [4 favorites]


Kadin: YES! You're dead on.

I haven't read anything of hers other than Life on the Screen, but I wonder whether Turkle, trained as a sociologist, hasn't had enough philosophy to recognize her own dualism as the elephant in the room here...
posted by Pickman's Next Top Model at 1:39 PM on January 19, 2011 [1 favorite]


My wife can't get rid of stuffed animals "because they wouldn't be able to breathe in the plastic bag".

Well, look, I feel guilty if I favor certain shoes over others. I feel neglectful if I leave a pair of shoes in the office - they are separated from their friends. I FREAK if I lose a glove. Poor glove out in the world alone! And, it's got a twin who's now alone!

I've never shared this; I feel safe here.
posted by thinkpiece at 1:48 PM on January 19, 2011 [16 favorites]


You don't see any difference between a robot mimicking emotions and something alive? I tend to believe that pets actually feel pain and love, whereas robots only display simulacra.

If you look at emotions objectively, as Antonio Damasio did in Descartes' Error, you'll find that they're a feedback mechanism which helps the body (including the brain) respond to the body's inner and outer state. A robot's emotions are "simulacra", but a lizard's or rat's emotions aren't -- why? Because the robot doesn't have that feedback loop. What a robot is is always disconnected from what it "feels". It's like an earthworm: cut it in half, and watch both halves try to go on without reacting. But lizards and rats come from simpler animals like earthworms, and their emotions come from simpler, stupider feedback loops, like the one in an earthworm's brain which causes it to react to light.

Like the ones in robots like FEELIX, for instance, which cause it to react to human behavior. As robots are given more and more effective feedback loops, with more and more body-involvement, their emotions will begin to seem to us less like simulacra, and more like something that's actually happening to them: something they're actually feeling, even if via robot-feelings. And it's worth noting that these better and better feedback loops will happen not because we necessarily want touchy-feely robots, but because we want robots which are better able to react to their environment, including human input. Emotions are necessary for higher-level interaction with the environment -- they're not just an add-on, they're at the heart of animal behavior, including our ability to reason and to make sense of the world. As this paper on emotional robots puts it: "it may turn out that embodied AI that is capable of dynamic survival or any of the other desirable biological attributes will end up having some subset of human-like emotions, regardless of whether the underlying intelligence is similar to a human's".
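
To make the loop idea concrete, here's a toy Python sketch -- mine, and emphatically not FEELIX's actual architecture -- of emotion as feedback: internal state driven by bodily input, decaying over time, and fed back into behavior selection.

# Toy emotion-as-feedback-loop, roughly earthworm-grade. "Emotion" here is
# just internal state that (a) tracks recent bodily input and (b) biases
# what the robot does next -- which changes what it senses, closing the loop.
class EmotionLoop:
    def __init__(self):
        self.distress = 0.0  # rises with aversive input, decays over time
        self.comfort = 0.0   # rises with friendly input, decays over time

    def sense(self, petted=False, struck=False):
        # body -> state
        if petted:
            self.comfort += 1.0
        if struck:
            self.distress += 1.0
        # decay, so the state reflects recent history rather than one event
        self.comfort *= 0.9
        self.distress *= 0.9

    def act(self):
        # state -> body: the same variables select the next behavior
        if self.distress > self.comfort:
            return "withdraw"
        if self.comfort > 1.0:
            return "purr"
        return "explore"

bot = EmotionLoop()
for event in [dict(petted=True), dict(petted=True), dict(struck=True)]:
    bot.sense(**event)
    print(bot.act())  # explore, then purr, then purr (comfort still dominates)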

In short: robots display simulacra of human emotions because that's what we've built them to do, and we've built them to do that because the field is still in its infancy. I don't personally believe that this is an inherent limitation, nor that the limbic system has magical properties which cannot be replicated. It's really only a matter of time before we have lizard and rat-like emotional robots rather than earthworm-bots, and only a matter of time after that before we're in the realm of cat and dog-bots, and then beyond.
posted by vorfeed at 1:50 PM on January 19, 2011 [4 favorites]


I've never shared this; I feel safe here.

You ARE safe here.

You weirdo.
posted by longsleeves at 2:18 PM on January 19, 2011 [4 favorites]


"The target of the Jihad was a machine-attitude as much as the machines," Leto said. "Humans had set those machines to usurp our sense of beauty, our necessary selfdom out of which we make living judgments.

"Naturally, the machines were destroyed."
-Emperor Leto Atreides II, on The Great Revolt against the thinking machines.

Those snuggle-robots, they're the first wave to dull our senses. Start smashing them now, before they get weapons retrofitted!
posted by Pirate-Bartender-Zombie-Monkey at 2:31 PM on January 19, 2011


"Okay that last one was a cheap shot. But you take my point - Descartes' definition of subjectivity (or consciousness, to be terminologically accurate) included humans, but excluded animals. At the moment our definition of subjectivity is one that's capacious enough to allow animals to be subjects, but not to allow robots to be subjects. But that dividing line starts to break down pretty quickly when you look at specifics..."

Except that your point is inane — there's a real reason to exclude robots from subjectivity, in that their "experiences" are empty and they don't have selves. It's a separate argument from whether or not robots that have emotions can be built (a claim I'm not skeptical of in general, but am skeptical of any time people start talking about the next 50 years or such).

"But fundamentally, if you built a robot that walked like a cat, purred like a cat, and responded in every detectable way to stimuli like a cat ... I'd probably treat it like a cat. And if someone came in and kicked that 'cat,' I'd probably think a lot less of them."

What I roll my eyes at are qualifiers like "every detectable way." If you built a robot that was a cat in every detectable way, it would be a cat — it would be meat with biological needs, etc. But a robot that pretended to be hungry or to need to play when it wasn't bound by those needs, or was significantly bounded in expression, like the robots that we are talking about (rather than some hypothetical philosophical robo-cat), well, frankly, I'd have no problem kicking it, knowing that I was doing it no harm and wasn't effecting any change in its behavior.

While .9999…=1, .9999 does not.
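
(Worked out, since I'm leaning on it -- this is just the standard geometric-series argument, nothing original:

0.\overline{9} = \sum_{n=1}^{\infty} \frac{9}{10^n} = 9 \cdot \frac{1/10}{1 - 1/10} = 1, \qquad \text{while } 0.9999 = 1 - 10^{-4}.

The finite string falls short by exactly 10^{-4}, and no amount of resemblance closes that gap. Which is the point.)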
posted by klangklangston at 2:41 PM on January 19, 2011 [2 favorites]


infini: Robots tend not to barf, puke, pee and poop

That's what makes babies so damn tricky. They're vile little money pits, they change your life in ways you can't fathom, but you keep on loving them. Well, many people do.
posted by filthy light thief at 2:53 PM on January 19, 2011 [1 favorite]


The equivalency of robots to animals bothers me too. Even if we did create that hypothetical philosophical robo-cat, the relationship between it and us would be different than the relationship between us and a real cat. Animals aren't furry aliens, they're very distant cousins that humanity has grown up and shared the planet with. Right now robots are merely shells, constructs. In the future they might become more than this, if we can determine what defines the fundamental qualities of life and create robots to possess them. But even then they'd be more like children to humanity, since their creation lies entirely with us.
posted by girih knot at 3:08 PM on January 19, 2011


But a robot that pretended to be hungry or to need to play when it wasn't bound by those needs, or was significantly bounded in expression, like the robots that we are talking about (rather than some hypothetical philosophical robo-cat), well, frankly, I'd have no problem kicking it, knowing that I was doing it no harm and wasn't effecting any change in its behavior.

What if the cat-robot was bound by those needs, though? What if its needs weren't just pretend, but were connected to its body and behavior in fundamental ways? What if kicking it did (seemingly) harm it, and did change its behavior? Would it be so easy, then, to say that its "experiences" are empty and that it doesn't have a self?

The robo-tortoise lays on its back, its belly baking in the hot sun, beating its legs trying to turn itself over, but it can't. Not without your help. But you're not helping. You're not helping! Why is that, klangklangston?
posted by vorfeed at 3:18 PM on January 19, 2011 [8 favorites]


thinkpiece, you're not alone. I feel awful if I lose a sock in the laundry. The laundry isn't complete until all the socks are back with their friends.

I accidentally stole a sock from my friend when I stayed at his place a few months ago, and it's still sitting on my desk waiting to go home and reunite with its partner. Poor lonely sock.
posted by nat at 3:19 PM on January 19, 2011 [2 favorites]


But even then they'd be more like children to humanity, since their creation lies entirely with us.

Kind of like we would be to God, if such a being existed, right?
posted by localroger at 3:19 PM on January 19, 2011 [1 favorite]


Lifelong crouton petter here. I got this far:

During her research, Turkle visited several nursing homes where residents had been given robot dolls, including Paro, a seal-shaped stuffed animal programmed to purr and move when it is held or talked to. In many cases, the seniors bonded with the dolls and privately shared their life stories with them.

and couldn't read another word. Destroyed. Utterly destroyed. Those poor, sad, old people. WLDJKFldkjfdkfjdkj!!!

A purring robot seal could probably talk me into killing my own family.
posted by Neofelis at 4:04 PM on January 19, 2011 [9 favorites]


there's a real reason to exclude robots from subjectivity, in that their "experiences" are empty and they don't have selves.

You say "selves," but it sounds suspiciously like "souls" to me. (Though to be fair, it's not just you; I tend to think all dualist arguments sound like Sunday School wrapped in a thin Enlightenment veneer.)

How can I determine that my (very biological) cat has this "self," and yet some hypothetical robo-cat doesn't? It wouldn't be particularly hard to give the robo-cat some degree of memory, so that if you kicked it, it would avoid you, or people it thought looked like you based on simple heuristics. If the key condition is that experiences feed into and are expressed as changes in future behavior, that would not be difficult to arrange. (As would the reverse -- something or someone biological with the inability to create permanent memories. Is it okay to kick a cat with anterograde amnesia, if it won't remember it tomorrow?)
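
A sketch of the sort of "simple heuristics" I mean -- entirely hypothetical, nobody's actual robo-cat, with features invented purely for illustration: aversive events get tagged with a crude feature vector, and future behavior is biased away from close matches.

from math import dist  # Euclidean distance, Python 3.8+

class RoboCatMemory:
    def __init__(self, wariness=1.5):
        self.bad_experiences = []  # feature vectors of people who kicked us
        self.wariness = wariness   # match radius; higher = more suspicious

    def remember_kick(self, person):
        # experiences feed into memory...
        self.bad_experiences.append(person)

    def reaction(self, person):
        # ...and are expressed as changes in future behavior
        if any(dist(bad, person) < self.wariness for bad in self.bad_experiences):
            return "avoid"
        return "approach"

# hypothetical features: whatever the robo-cat can cheaply observe
kicker = (180.0, 75.0)     # (height in cm, weight in kg) -- made up
lookalike = (179.0, 74.5)
stranger = (150.0, 60.0)

cat = RoboCatMemory()
cat.remember_kick(kicker)
print(cat.reaction(lookalike))  # "avoid" -- close enough to the kicker
print(cat.reaction(stranger))   # "approach"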

It seems like dualists go to a whole lot of intellectual effort in order to preserve their own specialness, whether that specialness is extended onto humans (or only some humans) or to select other biological creatures as well. It seems more defensible, not to mention easier, to not make assumptions about what goes on inside the minds of others, but rather take the behaviors presented at face value. E.g., if it acts like a cat, treat it like a cat, to the limit beyond which it ceases acting like a cat.
posted by Kadin2048 at 4:25 PM on January 19, 2011 [1 favorite]


Kind of like we would be to God, if such a being existed, right?

You could make that analogy, but I wouldn't. The dynamic between manufactured life and organic beings is a lot different than the dynamic between organic beings and all-powerful invisible entities.
posted by girih knot at 5:03 PM on January 19, 2011


This thread is such a relief. It all started with my grandma, who told my mother to "Drink up your 7-Up, because it LIKES YOU!" All the stories here are very familiar to me - I've been known to rescue stuffed animals thrown out with the trash on the curb, god help me.
posted by HopperFan at 5:15 PM on January 19, 2011 [3 favorites]


"You say "selves," but it sounds suspiciously like "souls" to me."

That's because they start with the same letter. But if you'd like to deny the existence of selves, go ahead, just have the decency to declare yourself a strict determinist-materialist and have at it. But as soon as you go that route, your access to normative statements pretty well flies out the window — physics is value neutral.

"How can I determine that my (very biological) cat has this "self," and yet some hypothetical robo-cat doesn't?"

How can you determine that a cat feels pain but a rock doesn't?

"It seems like dualists go to a whole lot of intellectual effort in order to preserve their own specialness, whether that specialness is extended onto to humans (or only some humans) or to select other biological creatures as well."

Except that, again, I find this an inane objection, in that it presumes that there are no differences between living objects and inanimate ones, that there are no differences within living objects that could shape the ethics of their treatment, and that any difference is "specialness" rather than real qualities.

That the arguments for treating these hypothetical robocats as real cats all start out by assuming spherical cows only highlights their fallacious nature — when we have a hypothetical robocat that displays every significant reaction of a cat, then we may accord it similar ethical status. As that HRC does not exist, arguing that there is nothing to differentiate it ethically from a real cat is sophistry.
posted by klangklangston at 5:24 PM on January 19, 2011 [2 favorites]


Why you dead, Ricky?
posted by exlotuseater at 5:55 PM on January 19, 2011 [4 favorites]


Yeah, the hypothetical "robot cat which is pretty much just like a real cat for certain definitions of pretty much" may be possible in the future, and if it is i suppose we can have a Turing Fest over it, but to me that's not the interesting ethical quandary which is raised by this article.

Because the robots we have now and are likely to have in the near future are robots which we know are quite limited in their repertoire of behaviors, about whom it seems reasonable to infer that they do not have consciousness of any sort, are not a being with will and self and feeling. Yet they are quite successful at evoking this emotional response ingrained in us toward beings, and unlike other inanimate objects which evoke such responses, they can be deliberately designed to exploit this as a weakness. They are pseudo-beings, who have capacities which would amount to near-superpowers in a being (precision, strength, indefatigableness, exact and near-eternal memory) and yet have weaknesses (e.g., an inability to feel or to reciprocate feeling) which are quite un-being-like.

I dunno, I suppose it would be simpler to say I tend to fall into the second camp mentioned in the article: the idea of creating something artificially designed to satisfy the human craving for contact but which is not true connection is depressing to me, creepy, a bit horrible. Like living on nutrient goop.
posted by Diablevert at 6:16 PM on January 19, 2011 [2 favorites]


when we have a hypothetical robocat that displays every significant reaction of a cat, then we may accord it similar ethical status

I'm content with this, and I think that if we begin by stipulating it then we're pretty much just haggling over the details.

Provided, of course, that along the way to "display[ing] every significant reaction of a cat," the robot-cat is afforded increasing 'ethical status' consistent with the reactions/behaviors it is capable of displaying. I.e., that if we have a continuum, with rocks at one end and cats at the other, with the obviously inanimate receiving no ethical status and the obviously animate clearly deserving and receiving it, then there is a more or less linear (or at least not discontinuous) progression, with ethical status increasing as a function of increasing apparent animation.

As long as we're not creating arbitrary categories untethered from observable behaviors, we may be vociferously agreeing with each other. I'm not arguing against assigning different levels of ethical status (e.g. rock versus cat); I just think they need to be a function of the object's behavior and the interactions it has with its environment and the observer. Thus, insofar as (and when) a robot cat acts like a cat, it should be treated like a cat; insofar as it acts like a bunch of metal and wires and silicon, it should be treated like metal and wires and silicon.

If it acts like a [cat|dog|human], it should be treated like a [cat|dog|human], to the extent that it is and when it's acting like a [cat|dog|human].
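
To make the shape of that claim concrete, here's a toy sketch in Python. Everything in it is hypothetical and of my own invention; the continuum argument above only constrains the mapping to be monotone and continuous, and says nothing about how you'd actually score behavior:

    # A toy illustration of the continuum idea, not a serious metric.
    # "apparent_animation" is a made-up score: rock = 0.0, cat = 1.0.

    def ethical_status(apparent_animation: float) -> float:
        """Map an observed-animation score in [0, 1] to an
        ethical-status weight in [0, 1]."""
        if not 0.0 <= apparent_animation <= 1.0:
            raise ValueError("score must lie in [0, 1]")
        # Identity is the simplest monotone, continuous choice; any
        # such map satisfies the "no arbitrary categories" constraint.
        # The point is only that there's no step function at the
        # metal/flesh boundary.
        return apparent_animation

    print(ethical_status(0.0))  # rock: nothing owed
    print(ethical_status(0.9))  # convincing robot cat: nearly a cat's due

On this picture, the whole disagreement reduces to how you measure apparent animation, not to where you draw a line.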
posted by Kadin2048 at 6:29 PM on January 19, 2011


At the moment our definition of subjectivity is one that's capacious enough to allow animals to be subjects, but not to allow robots to be subjects. But that dividing line starts to break down pretty quickly when you look at specifics...

This all reminds me of "The Seventh Sally OR How Trurl's Own Perfection Led to No Good" from Stanislaw Lem's The Cyberiad.

In the story, a great inventor - taking pity on a cruel but exiled king - creates for him, as a sort of toy, a miniature kingdom to rule. Because of his greatness, the imitation is essentially perfect. We are given to understand that the king will now satisfy his desire for cruelty by inflicting it on his miniature subjects.

Upon hearing this, the inventor's equally great friend and rival is aghast. In the course of forcing the inventor to understand what a monstrous thing he's done, the friend argues...

You say there's no way of knowing whether Excelsius' subjects groan, when beaten, purely because of the electrons hopping about inside—like wheels grinding out the mimicry of a voice—or whether they really groan, that is, because they honestly experience the pain? A pretty distinction, this! No, Trurl, a sufferer is not one who hands you his suffering, that you may touch it, weigh it, bite it like a coin; a sufferer is one who behaves like a sufferer!

Or to use a personal example...

More nights than not, my cat snuggles next to my chest when I go to bed. If I turn over, she'll crawl over me to regain her position to my front. Literally rests between my arms like a warm-blooded teddy bear. It's uncanny.

Now for all I know, she does this only because I create a warm spot for her. But even if she were merely a Lem-quality robot programmed to act in this way, for all practical purposes, she would "love" me because she behaves like she loves me.

And I would love her no less for not being a "real" cat.
posted by Joe Beese at 7:39 PM on January 19, 2011 [3 favorites]


Nattie: We named our Roomba "Ricky" and all it did was vacuum the carpet and smash into things, but it had these weird person-like qualities.

What the fuuuuuuuuck…
posted by paisley henosis at 7:45 PM on January 19, 2011


"You know of whom you are reminding me...you are reminding me of a rabbi who once was approached by two people from his parish who were involved in a quarrel. The one contended that the others cat had stolen and eaten up two pounds of butter, while the owner of the cat said 'that's not only not true, but sheerly impossible--my cat doesn't care for butter.'

Now, it was up to the rabbi to pose as a Solomon and come up with a final judgment in terms of justice. And he ordered the one, 'Bring me the cat.' He brought him the cat. And then he said, 'Bring me scales,' and he brought him scales. Then, he put the cat on one of the scales, and asked the one parishioner, 'How many pounds of butter had the cat eaten up?'

'Two pounds, rabbi!'

Then, he weighed the cat, and, believe it or not, it was exactly two pounds. Whereupon the rabbi said, 'Now I have the butter, where's the cat?'" (Viktor Frankl)
posted by eegphalanges at 8:28 PM on January 19, 2011 [3 favorites]


HopperFan: I've done that too. Once with my husband (then boyfriend) when we found a beat-up Beanie Baby knockoff that someone had stuffed into the branches of a tree. We saw it once winter came and all the leaves fell off the tree; I don't know how long the little bear had been up there, but it was a while. I took it back to his dorm room (against his mild protestations), researched how to clean a stuffed animal, and then I did my best. We still have it, of course. And I've done even crazier things since, but my husband still loves me. (Sucker.)
posted by Neofelis at 2:19 AM on January 20, 2011 [2 favorites]


"Dog bad! Dog trick Torga, replace Man! No Dogs!"

*Dog is a fake! Dog is a fake*

*storms off*

ftfy ;p
posted by infini at 3:35 AM on January 20, 2011


I've had a lifelong proclivity for personality pareidolia in objects that's almost on the level of my sleepwalking in explaining my curiously haunted life.

There's a key incident in my childhood that's fondly remembered by everyone in the family, and many of the people who have come to know me well understand it as a telling moment in where I come from. I was maybe nine or ten; it was the family meal, which we enjoyed around a table, all together, just like some kind of nostalgic propaganda.

I reached for a poppy seed roll, but it was tantalizingly just out of reach, in a basket over the seam where the leaf slotted into the old oak table with lion feet.

"Son," my father said, "do you want a roll?"

"Yeah."

"Which roll?"

"That one," I said, pointing with the prepubescent tension of a Diane Arbus model. My dad's hand hovered over the rolls, annoyingly close, and I hoped he wouldn't touch them all with his big hairy mitt.

"This one?" he asked. I nodded. "You want Jerome?"

"What?"

"That's Jerome Roll."

"It's a Jerome roll? What's a Jerome roll?" I asked. My mother, knowing, rolled her eyes and smiled a Mona Lisa smile.

"It's not a Jerome roll. It's Jerome Roll. That's it's name."

He picked up the roll, and handed it over. Of course, I couldn't eat the damn thing.

It had a name.

Jerome sat there, on my placemat, throughout the meal, and I fussily picked my way through the freshly steamed vegetables from our garden that were a kind of healthy torture for me, making sure nothing touched anything else on the plate in an inappropriate manner. At the end of the meal, I picked up Jerome and started up the stairs.

"You're not going to eat that?" my mother asked. I furrowed my brow and shook my head, because sometimes, grownups just didn't have a clue.

I carried Jerome around for a few weeks. He lost most of his poppy seeds, but otherwise survived my patronage in remarkable condition. I put him in the captain's chair in homemade spaceships, had him trekking through the brambles in the backyard or sitting guard as I broke the rules and climbed into our stone-lined well, and took him spelunking through the Chlordane-saturated dust of the crawlspace under the log section of our house. The dog was unusually interested in me for much of this time, but I kept her at bay.

At night, I'd tuck Jerome under the edge of my pillow and go to sleep, listening to the house creaking and groaning the way it would, punctuated by the occasional muffled scrabbling of a mouse running in the walls. It seemed like the noises of the mice were increasing, but I didn't think much of it.

One morning, though, Jerome was gone. There were crumbs and a few poppy seeds, but that was it.

"Jerome!" I screamed the way you scream when a pet's died or run away.

My family made a good faith effort of looking for him, but he was never seen again. My mother pointed out that mice probably came out and ate him in the night, which just added a new and more realistic fear to my terror that a Zuni fetish doll was going to cut up my ankles in the darkness. I took to keeping the broom next to my bed so I could use it to reach over and turn on the light from the bed before I'd step down to the floor.

It's always possible I ate him myself. I did do a lot of sleepwalking then.

We all still call a poppy seed roll a Jerome roll. When I was a contractor to the DEA, and the only one in my company who'd never so much as tried pot and therefore was the guy with the highest clearance available, I was instructed to play it safe and stay away from poppy seeds. When that contract ended, I had a toasted poppy seed bagel, drowning in butter, and relished the gritty greasy happy chewy experience of it without the slightest regret.

Of course, bagels don't have names. Who would name a bagel?
posted by sonascope at 9:00 AM on January 20, 2011 [27 favorites]


I once got frustrated by all the old chewed-up stuffies lying around and put a little beat-up Tweety Bird in the garbage disposal. And it killed the disposal.

Moral: don't fuck with stuffies.
posted by msalt at 6:21 PM on January 20, 2011


sonascope: Who would name a bagel?

You'd have to ask Tony directly.
posted by littleredspiders at 9:51 PM on January 20, 2011


One day my Roomba managed to shut itself in the bathroom. There's a perfectly good reason for that (I had left the door only partly open, so it happened to push it shut while moving around the room), but it really sounds like it just needed some alone time in there, doesn't it?
posted by A dead Quaker at 3:50 PM on January 21, 2011 [2 favorites]


The natural world is full of deception and misdirection. The human world is full of sublimation and anthropomorphism. I live with cats, not children.

the idea of creating something artificially designed to satisfy the human craving for contact but which is not true connection is depressing to me, creepy, a bit horrible. Like living on nutrient goop.

If I grow old enough, I'll end up living on nutrient goop, but don't take away my seal-doll unless you crank up the soma on my feed. (And give me a cigarette.)
posted by ovvl at 4:54 PM on January 21, 2011 [1 favorite]


Oh hey, I forgot about this: Cabbage Fever

The cabbage is in Heaven now, with Jerome.
posted by Neofelis at 2:14 AM on January 22, 2011


I miss my Roomba, Puck. Battery finally gave up and I couldn't afford to replace it, so it went on to a better home.

I do still have a Robosapien Media named Chives, due to the butler-like voice set, and also Creeper, my RoboQuad. The Quad gets most of the use as the battery life seems best. I use him to keep my cat in line, as well as allowing him autonomous roaming.
posted by Samizdata at 10:28 PM on January 22, 2011


I am currently single, but anyone want to buy me one of these, so one of the boys can hook up?

Yup.

Robobestiality.

I went there.
posted by Samizdata at 10:33 PM on January 22, 2011


I am now working on getting Chives to talk like Bender.
posted by Samizdata at 10:50 PM on January 22, 2011


I think some people in this thread need to re-watch Battlestar Galactica. Robots and people do NOT mix!

Along with the dozens of imaginary Tricia Helfers languidly strolling around in my brain right now, I would like to disagree with your theory. Strongly.
posted by rokusan at 12:44 PM on January 26, 2011


I anthropomorphize things constantly. When bored I draw a face on my big toe and refer to him as "Tosey." I've given my wife's little toe a distinct personality. I've named both our cars and our bathroom sink, which has a quite human face.

When my wife and I were moving across the country a few years ago, we stopped at my parents' house in Missouri. Going through boxes of old stuff, I came across Pookie, my childhood teddy bear, and I was struck by powerful and immediate guilt. He'd been in a box for twenty years. This would not do. I informed my wife that we were picking up a new roommate, and Pookie moved to the West Coast with us. I had to create a backstory for his last twenty years (world travel, mysterious training) to help sublimate my guilt over having abandoned him.
posted by Bookhouse at 6:25 AM on January 29, 2011 [3 favorites]



