I prefer my vacuum cleaners to be lovable, personally
June 30, 2011 10:57 AM

Here, we refer to personality as the use of human personality characteristics to describe a robot vacuum cleaner. The translation from personality to behavior was inspired by a role play in which a group of actors was asked to act like a robot vacuum cleaner with these desired characteristics... Attributes, such as macaroni, were available to support acting out some of the situations (e.g. ‘cleaning a dirty spot’)... The actors were asked to act out situations—as if they were the robot vacuum cleaner—making use of motion and sound... In general, the actors either crawled about or walked around at a slow pace to imitate a vacuum cleaner. Often, a typical vacuuming sound was simulated by them.

Research participants watched a video of a prototype (really just a cardboard box over a Roomba controlled via Bluetooth, with sound effects added) and discussed its behavior.
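
A side note for the tinkerers: the rig described above is easy to approximate. Here's a minimal sketch in Python of driving a Roomba over a Bluetooth serial bridge - the port name is hypothetical, the opcodes come from iRobot's published Serial Command Interface spec, and this is only a guess at the flavor of the researchers' setup, not their actual code:

    import struct
    import time

    import serial  # pyserial

    PORT = "/dev/rfcomm0"  # hypothetical name for the Bluetooth serial bridge

    def drive(ser, velocity_mm_s, radius_mm):
        # DRIVE (opcode 137) takes a 16-bit signed velocity and turn radius
        ser.write(bytes([137]) + struct.pack(">hh", velocity_mm_s, radius_mm))

    with serial.Serial(PORT, 57600, timeout=1) as ser:
        ser.write(bytes([128]))   # START: open the command interface
        time.sleep(0.1)
        ser.write(bytes([130]))   # CONTROL: allow motion commands (safe mode)
        time.sleep(0.1)
        drive(ser, 200, -32768)   # 200 mm/s; -32768 (0x8000) means "drive straight"
        time.sleep(3)
        drive(ser, 0, -32768)     # stop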

Did the translation from behavior to personality traits work? Do you agree with the research participants that the prototype vacuum cleaner was appropriate (3 participants out of 15), calm (3), boring (2), careful (2), and systematic (2)? Signs of anthropomorphism and personality were observed throughout the evaluations. One participant explicitly mentioned that he experienced the robot vacuum cleaner as having a distinctive character. Out of fifteen participants, fourteen assigned a gender to the robot vacuum cleaner. When talking about it, they frequently referred to it as ‘he’ or ‘him’.

Does it matter if robot vacuum cleaners have personalities? IEEE Spectrum weighs in.
posted by jasper411 (22 comments total) 4 users marked this as a favorite
 
A representative of the Sirius Cybernetics Corporation just sent Roomba a cease-and-desist notice.
posted by aeshnid at 11:11 AM on June 30, 2011 [1 favorite]


"Oh, you're hungry." (drops Cheeto)
posted by Zozo at 11:12 AM on June 30, 2011 [5 favorites]


Who needs to feel that your vacuum cleaner loves you when you can just smell that it does, instead? Preferably with a haphazardly grafted lump of sweaty, hairy, pheromone-dispensing artificial flesh.
posted by Rhaomi at 11:22 AM on June 30, 2011 [2 favorites]


In two years, one of them's going to wind up on Dr. Phil's couch:

"I didn't ask to be made: no one consulted me or considered my feelings in the matter. I don't think it even occurred to them that I might have feelings. After I was made, I was left in a dark room for six months... and me with this terrible pain in all the diodes down my left side. I called for succour in my loneliness, but did anyone come? Did they hell. My first and only true friend was a small rat. One day it crawled into one of my hoses and died. I have a horrible feeling it's still there..."

At which point the entire audience will commit suicide.
posted by zarq at 11:23 AM on June 30, 2011 [2 favorites]


It is reported that people name it, and that people ascribe a gender and personality to Roomba.
My Roomba is named Mr. Belvedere. He's the best robot butler I could hope for.
posted by grouse at 11:34 AM on June 30, 2011


Nattie's comment on this subject is so choice that part of me is slightly shocked you didn't link it as a previously. So I will.
posted by Diablevert at 11:44 AM on June 30, 2011 [3 favorites]


What's hot, DJ Roomba?
posted by elsietheeel at 11:48 AM on June 30, 2011


Dammit, elsietheeel beat me to the DJ Roomba reference.
posted by Edgewise at 11:56 AM on June 30, 2011 [1 favorite]


Every so often I happen across this sort of thing ("Seriously? Why the hell would a person feel compelled to anthropomorphize a tool??") and wonder if maybe I was born on the wrong planet.

Then me and my tape-measure "Stanley" go off somewhere and sulk.
posted by Greg_Ace at 1:21 PM on June 30, 2011


"Seriously? Why the hell would a person feel compelled to anthropomorphize a tool??"

Have you ever used an autonomous tool that can move around your whole home without assistance or direction?
posted by grouse at 1:27 PM on June 30, 2011


As a matter of fact I have. But then, I'm a cold heartless bastard who sometimes has trouble detecting thoughts/feelings/personality in some humans I've met, let alone machines.
posted by Greg_Ace at 2:12 PM on June 30, 2011


Greg_Ace, try watching this Heider-Simmel Demonstration: do you see intentions behind the objects, or just moving shapes?

I'm joking, but I think one of the most fundamental traits of humans is empathy. A lot of the time it's annoying - anthropomorphisms creep into explanations of things where they're utterly irrelevant, like believing in magic men - but it's hard to deny that the ability to infer emotion from behaviour is a powerful impulse, and I personally believe our mental landscape is just a (very complex) web of analogies.

For example, in the Heider-Simmel animation, the big triangle reminded me of the guy who pushes Bill Hicks around whilst saying "come here".
posted by pmcp at 5:46 PM on June 30, 2011


try watching this Heider-Simmel Demonstration ... do you see intentions behind the objects or just moving shapes?

Oh, c'mon; I see shapes moving according to the instructions a human imposed upon them, and it would be mighty hard to convince me the movements are purely random - that the author of that video didn't manipulate the objects in such a way as to deliberately suggest a human-like narrative. I certainly don't attribute intent to the objects themselves. In other words, how is it different from the programmed behavior of a Roomba? Even if you were going for a joke, I don't really see how it serves your point instead of my own, which was that I know a Roomba's movements are the result of human-constructed programming to achieve a particular goal (i.e. to fully cover a designated area, given the limits set by any boundaries or obstacles encountered) and there's no inherent emotion or animus involved. For example, it isn't getting frustrated while working its way out of a corner; it's merely executing a hard-coded set of rules of operation. And thoroughly cleaning the corner while it's at it, I might add.
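
To make that concrete, here's a toy sketch - purely hypothetical, and nothing to do with iRobot's actual firmware - of how a few hard-coded reactive rules produce exactly the corner behavior people read emotion into (the robot object and its methods are invented for illustration):

    import random

    def step(robot):
        # One pass of a bump-and-turn rule set; no state, no mood, just rules.
        if robot.bumper_pressed():
            robot.back_up(distance_mm=30)          # rule 1: retreat from contact
            robot.rotate(random.uniform(45, 135))  # rule 2: pick a new heading
        else:
            robot.forward(speed_mm_s=300)          # rule 3: otherwise, press on

    # Wedged in a corner, the robot just fires rules 1 and 2 over and over.
    # An onlooker sees "frustration"; the program sees a conditional.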

Knowing that people imbue objects with personalities is a very different thing from comprehending why they do so. I don't want to come off as some sort of unemotional Spock - which I'm definitely not - but anthropomorphizing machines is just...illogical. Maybe "the ability to infer emotion from behaviour is a powerful impulse" to some, but to me it seems flat-out silly to extend that to clearly non-living objects.

Now...if you want to broaden the discussion into where the fine line lies between programmed machines and AI/predestination/free will and such - or if indeed there even is one - well, that's another argument for another time. Also, I only really posted my original comment in order to use the "Stanley" joke. I think I need to point that out. Nevertheless, I stand by my original comment - I honestly don't understand why many people try to anthropomorphize inanimate objects, and it sometimes makes me wonder whether those people and I live in the same universe.
posted by Greg_Ace at 10:50 PM on June 30, 2011


Knowing that people imbue objects with personalities is a very different thing from comprehending why they do so.

I do it as a source of amusement. One that wouldn't work if it were logical. Of course I know that my Roomba doesn't have emotions.
posted by grouse at 11:01 PM on June 30, 2011


grouse: "Have you ever used an autonomous tool that can move around your whole home without assistance or direction?"

I don't know about used, but most of those I describe as "tools" are able to do this. Not that I'd let them in my home.
posted by Joakim Ziegler at 11:49 PM on June 30, 2011



I don't want to come off as some sort of unemotional Spock - which I'm definitely not - but anthropomorphizing machines is just...illogical.

There's a relevant Ask MeFi here. Seems most of the people who do it know it's illogical, also. It's a question of what you feel when you interact with the object, which is not quite a question of conscious control.

I do wonder if people whose empathy spills over into worrying that the Roomba is getting upset at being stuck in the corner are better at figuring out the intentions of true willful actors, like dogs and people. It's the same mechanism, after all - imagining what you'd feel in the same circumstance to understand how another might feel. Maybe people who do that unconsciously and strongly all the time, to the point where they feel guilty over their treatment of inanimate objects, are better/quicker when it comes to understanding other people's intentions.
posted by Diablevert at 2:47 AM on July 1, 2011 [1 favorite]


Also, I only really posted my original comment in order to use the "Stanley" joke.

I guess inferring motivation from text is not one of my strong points. Probably those Turing test chat robots have scarred me.

I don't think it's illogical; it stands to reason that as social animals we would be hard-wired to perceive human characteristics from the slightest clues. Whether it's useful or not is another question, though. I would argue that as a tool for description it's probably a lot simpler to say the Roomba's upset about being stuck in the corner than that it's jerking around loads because the algorithm has a deficiency. But I think most people are aware of it being a metaphor - Nattie and her Furby abortion is perhaps an exception, though.

"Have you ever used an autonomous tool that can move around your whole home without assistance or direction?"

All of my tools seem to have this ability. Now where the hell has my pen wandered off to? It was here just a minute ago.
posted by pmcp at 4:28 AM on July 1, 2011 [1 favorite]


Diablevert: I do wonder if people whose empathy spills over into worrying that the Roomba is getting upset at being stuck in the corner are better at figuring out the intentions of true willful actors, like dogs and people.

Possibly, I don't know. I'm reasonably certain I do fine at empathizing with "true willful actors" as you put it, or as the old saying goes "walking a mile in their shoes". Even so I don't have an inclination to extend that sensitivity to inanimate objects. Perhaps it's because I grew up with a grandfather who was a great DIY-er, a fixer, and with his guidance I learned a lot about the inner workings of all sorts of things (how to disassemble a lawnmower or (heh) vacuum cleaner, how to fix a broken recliner chair, how to repair plumbing, how to sharpen knives, on and on). As an adult, I've spent my life troubleshooting and programming computers/software as well as continuing to take things apart to figure out what makes them tick and why they stopped working. So there's not a whole lot of mystery to "things", for me. There's plenty of mystery in the world, don't get me wrong, but in a Roomba...not so much.

Different strokes, different planets, different universe...who knows.
posted by Greg_Ace at 9:41 AM on July 1, 2011


grouse: I do it as a source of amusement. One that wouldn't work if it were logical.

OK, well, that I can totally understand.
posted by Greg_Ace at 9:42 AM on July 1, 2011


Diablevert: It's a question of what you feel when you interact with the object, which is not quite a question of conscious control.

See, that's the thing. What's "not quite a question of conscious control" for me personally is just the opposite: it takes a deliberate act of will to imbue inanimate objects with emotions or personalities - and even then usually for humorous purposes, as grouse pointed out. That's why I wonder at people who seem to fall into that sensibility.

But by now I'm kinda beating a dead horse - which, while the poor now-inanimate thing no longer cares or can feel it, isn't a very useful act on my part.
posted by Greg_Ace at 9:50 AM on July 1, 2011


Greg_Ace: Your attitude is reminiscent, to me, of the archetypal farmer who has much less empathy for his livestock because he is around them all the time, as opposed to the "city folk" who go all gooey at calves and anthropomorphise them to a far greater degree. I know that's a stereotype, but still, an interesting parallel.

Maybe knowing how something works (or often doesn't) reduces empathy? Doctors are given cadavers to dissect when they start training, and are often the least squeamish people. Economists study labor markets and look more dispassionately on the ebb and flow of employment. OK, that's a reach, but still... is our default position empathy, and our learned position cool rationality?

False dichotomy, of course. Women who have ten children I'm sure love them all as much, even while they're less gooey about each one. Doctors still devote themselves to patients even while knowing more about how the body works. Left-wing economists certainly could not be accused of being uncaring about rates of unemployment.

But maybe more knowledge mediates your own relationship with the object of interest, changing it qualitatively? That seems plausible.
posted by alasdair at 3:16 AM on July 2, 2011


Or it could be the other way around - that some people are inherently less "gooey" than others? Though I also have to point out that, personally, I'd separate the (trait? emotion? state of mind?) that I think you mean by "gooey" from "empathy" per se - sort of like you did with the mother of ten children: I doubt that she's less empathetic to her tenth child than she was to her first, but she's maybe just less "gooey" about it.

I guess it would help for the sake of discussion to actually define what each of us means by "gooey"; I'm thinking something along the lines of "the opposite of pragmatic". Which kind of fits, because, for instance, I'm pretty pragmatic about machines being non-living, non-feeling entities, but that doesn't make me one whit less empathetic toward living, feeling things. I'm merely choosing where to apply my empathy.
posted by Greg_Ace at 10:49 AM on July 2, 2011

