Why these friendly robots can't be good friends to our kids
December 10, 2017 3:42 PM

MIT's Sherry Turkle writes about the new wave of "sociable robots" we're seeing. "These machines are seductive and offer the wrong payoff: the illusion of companionship without the demands of friendship, the illusion of connection without the reciprocity of a mutual relationship. And interacting with these empathy machines may get in the way of children’s ability to develop a capacity for empathy themselves."
posted by forza (68 comments total) 25 users marked this as a favorite
 
At times, I think the attitudes we take towards Sherry Turkle and her thoughts on robots as human companions will be a generation-defining battle.
posted by Going To Maine at 3:53 PM on December 10, 2017 [6 favorites]


While I'm alive to the author's concerns, I'm not sure that the case is made out by the descriptions of the studies. Particularly because one sees exactly the same sort of projection and confusion about agendas and identity with pets, particularly the dumber ones like hamsters etc. And because I remember that I was extremely committed, as a child, to the construction of an ontology in which I could both acknowledge my toys as part of me, and see them as independent beings with their own agendas, too. I'm just not convinced, although I'm willing to be.
posted by howfar at 3:56 PM on December 10, 2017 [7 favorites]


These machines are seductive and offer the wrong payoff: the illusion of companionship without the demands of friendship, the illusion of connection without the reciprocity of a mutual relationship.

What 1950s mouth-foaming anti-television luddite was this plagiarized from?
posted by Sys Rq at 3:57 PM on December 10, 2017 [13 favorites]


the illusion of companionship without the demands of friendship, the illusion of connection without the reciprocity of a mutual relationship.

Sounds like it's describing YouTube celebrity fanbases.
posted by deadaluspark at 4:03 PM on December 10, 2017 [3 favorites]


Sounds like it's describing YouTube celebrity fanbases.

YouTube celebrity fanbases are also bad.
posted by Going To Maine at 4:04 PM on December 10, 2017 [10 favorites]


Clearly, these "sociable robots" are at least as harmful as those "books".
posted by b1tr0t at 4:09 PM on December 10, 2017 [10 favorites]


Seriously? Allowing small children to be huge dicks to robots without any repercussion is absolutely the first step to getting Terminators.

Do y'all really want Terminators?
posted by deadaluspark at 4:12 PM on December 10, 2017 [17 favorites]


Sounds like we are simply reintroducing the notion of a fawning retainer...
posted by jim in austin at 4:14 PM on December 10, 2017 [4 favorites]


Cue all of the technology researchers rolling our eyes at Turkle.
posted by k8t at 4:16 PM on December 10, 2017 [5 favorites]


Allowing small children to be huge dicks to robots without any repercussion is absolutely the first step to getting Terminators.

I actually think it's the first step toward getting Westworld.

Twitter will give us Skynet. We already had one near miss, and it stayed a near miss only because it wasn't armed.
posted by mordax at 4:18 PM on December 10, 2017 [2 favorites]


Clearly, these "sociable robots" are at least as harmful as those "books".

Indeed. Arguably it's the same misapprehension that Plato falls into in the Phaedrus dialogue.
Socrates: Then he will not, when in earnest, write them in ink, sowing them through a pen with words which cannot defend themselves by argument and cannot teach the truth effectually.

Phaedrus: No, at least, probably not.

Socrates: No. The gardens of letters he will, it seems, plant for amusement, and will write, when he writes, to treasure up reminders for himself, when he comes to the forgetfulness of old age, and for others who follow the same path, and he will be pleased when he sees them putting forth tender leaves. When others engage in other amusements, refreshing themselves with banquets and kindred entertainments, he will pass the time in such pleasures as I have suggested.

Phaedrus: A noble pastime, Socrates, and a contrast to those base pleasures, the pastime of the man who can find amusement in discourse, telling stories about justice, and the other subjects of which you speak.

Socrates: Yes, Phaedrus, so it is; but, in my opinion, serious discourse about them is far nobler, when one employs the dialectic method and plants and sows in a fitting soul intelligent words which are able to help themselves and him who planted them, which are not fruitless, but yield seed from which there spring up in other minds other words capable of continuing the process for ever, and which make their possessor happy, to the farthest possible limit of human happiness.
posted by howfar at 4:22 PM on December 10, 2017 [12 favorites]


it's technology. it will be bad and good for us. Given that the marketing types will be sure to tell us all the good stuff, it seems wise to explore what might be bad.
posted by philip-random at 4:35 PM on December 10, 2017 [13 favorites]


Cue all of the technology researchers rolling our eyes at Turkle.

Well, I'll say this: I wish I could turn one sort-of study into a career as a public intellectual like she has.
posted by codacorolla at 4:37 PM on December 10, 2017 [5 favorites]


I am so sick of this lack of empirically supported claims being given a venue in NYT or WashPo. There are dozens of good studies on this.
posted by k8t at 4:37 PM on December 10, 2017 [3 favorites]


I was concerned that the person who has been studying children and computers since 978 thinks there are issues. But turns out she's just like old fogies who didn't like TV!
posted by thelonius at 4:37 PM on December 10, 2017 [16 favorites]


since 978

Holy shit that's a long time! Is she secretly a Time Lord or something?
posted by deadaluspark at 4:42 PM on December 10, 2017 [13 favorites]


These machines are seductive and offer the wrong payoff: the illusion of companionship without the demands of friendship, the illusion of connection without the reciprocity of a mutual relationship. And interacting with these empathy machines may get in the way of children’s ability to develop a capacity for empathy themselves.

(skip...skip...skip...)

A 12-year-old boy, frustrated that he couldn’t get Kismet to respond to him, forced his pen into the robot’s mouth, commanding: “Here! Eat this pen!” Other children felt the pain of rejection. An 8-year-old boy concluded that Kismet stopped talking to him because the robot liked his brothers better. We were led to wonder whether a broken robot can break a child.



I expect how these children treat sociable robots reflects how they treat people in general.

Per the Huffington Post, a female adult sociable robot model was severely injured by passersby at an expo: "...they treated the doll like barbarians." The creator's response was "she's a lady and should be treated as such."
posted by beaning at 4:52 PM on December 10, 2017 [3 favorites]


Point: my kids are normally polite and empathetic in their social interactions, but as little as half an hour playing games where "people" exist to provide positive feedback to their shitty behavior turns them (back in the real world) into little narcissistic nightmares for a good ten minutes.

Counterpoint: I once tried to pet a Sony robotic dog on the head, but hit it too hard, and when it communicated its negative feedback to me, I felt terrible.
posted by davejay at 4:53 PM on December 10, 2017 [8 favorites]


Why can't we all just share and enjoy?
posted by Greg_Ace at 5:13 PM on December 10, 2017 [4 favorites]


These machines are seductive and offer the wrong payoff: the illusion of companionship without the demands of friendship, the illusion of connection without the reciprocity of a mutual relationship.

Hmm. I’m 29 and I was one of the many kids my age who were fascinated by the virtual pet games of the 90s. The original Dogz and Catz games were surprisingly realistic and engaging. (To my eye, they still look very good today).

The virtual animals had emotional reactions, and that included crying and cowering when they were sprayed too often with the “water bottle” tool. These reactions were so realistic that some people created impassioned websites imploring other people to “stop abusing Petz.” People grew very attached to their Petz. People even grew attached to their Neopets, which weren’t much more interactive than a static cartoon image.

And then there are the disquieting horrors we’d subject our Sims to - all much more realistic than anything you could do to a Tamagotchi.

If kids are being warped somehow by exposure to unreal “friends” that simulate emotional reactions, then I’d argue the effects should be showing up in people my age.

Perhaps Turkle has looked at these “virtual life” games, but if she hasn’t, the focus on physical robots without looking at software is...weird.
posted by faineg at 5:15 PM on December 10, 2017 [7 favorites]


Turkle will have a long and remunerative career with this schtick.
posted by sandettie light vessel automatic at 5:19 PM on December 10, 2017 [2 favorites]


Yeah, these kids need to be slightly horrified by uncanny valley robotics throughout their childhood, and then revisit the issues later in life and steal an animatronic from a Chunky Cheese on a college dare. That's a traditional rite of passage, right?
posted by Nanukthedog at 5:19 PM on December 10, 2017 [1 favorite]


It was until Five Nights at Freddy's took our latent fear of weird animatronics and made it legion.

*shudders*

Fuck you, Freddy Fazbear. Couldn't even fucking Waka Waka.
posted by deadaluspark at 5:22 PM on December 10, 2017


Chunky Cheese

Did Chuck E. pack on weight after his restaurant chain hit the big time?
posted by Greg_Ace at 5:39 PM on December 10, 2017 [4 favorites]


If kids are being warped somehow by exposure to unreal “friends” that simulate emotional reactions, then I’d argue the effects should be showing up in people my age.

I'm getting real thoughtful these days about the early controversies over video games versus what we have seen in the behavior of dudes who grew up with video games. Granted, those controversies weren't on target. I read an old archived college newspaper article in which some dude at my lefty college community argued that the college lounge shouldn't get one of those new Space Invaders machines because playing war promoted the military-industrial complex. And when I was coming up myself, the fear was that kids wouldn't understand that death and injuries were for real, or that they would never read a book again.

Well, those things never quite happened. But Gamergate did.
posted by Countess Elena at 6:31 PM on December 10, 2017 [29 favorites]


Turkle will have a long and remunerative career with this schtick.

Sherry Turkle has been a full MIT professor since at least the aughts. The idea that she's at the beginning of her career is a joke.
posted by Going To Maine at 6:45 PM on December 10, 2017 [9 favorites]


If kids are being warped somehow by exposure to unreal “friends” that simulate emotional reactions, then I’d argue the effects should be showing up in people my age.

Y'think?

So where did all those internet Nazis come from anyway?
posted by happyroach at 7:08 PM on December 10, 2017 [7 favorites]


Well, that’s the interesting part to me. What kind of games *were* the twenty-something Internet Nazis playing when they were young? Did the kids who played virtual pet games that simulated empathy and relationships (as Turkle is concerned about) become Future Internet Nazis, or was it the kids who primarily played realistic shooters? Or, in other words: I’m not sure the main technological vector for screwing up kids sense of empathy is “overly friendly robots/Furbies/surprisingly realistic virtual dogs.”
posted by faineg at 7:16 PM on December 10, 2017 [2 favorites]


(It is probably also significant that girls were much more likely to be playing games that simulated emotions and tried to induce empathetic reactions in the 90s and 2000s).
posted by faineg at 7:19 PM on December 10, 2017 [2 favorites]


Was just thinking about my son's first video game, which was my favorite from back in the day: Sonic The Hedgehog. In my memory, you "killed" enemies, but I thought it would be mellow enough for him to start with.

When I actually got the emulator up and running, I discovered that my memory was faulty. For the most part, the player actually jumped on robotic creatures to "unlock" the creatures imprisoned inside, little bunnies and such. Violence was mostly reserved for the boss who demonstrably deserved it.

I also remember Carmageddon, which got a lot of bad press at the time because the violence was senseless (there was no backstory to make the killings justified) and gratuitous (so they included a blood-free mode.)
posted by davejay at 7:49 PM on December 10, 2017


Building off of Countess Elena's point, I wonder to what extent the massively ballooning rate of social anxiety in teenagers stems in part from their getting so much less practice and experience with person-to-person interactions. Nowadays that's largely due to social media, but if a child from a young age is spending a lot of time in "social" interactions involving robots rather than people, what will that mean for their ability to form and maintain interpersonal relationships in all their messy complexity?

I'm glad so many of you are so completely sanguine about the robot-and-internet mediated future we're veering into, but honestly I'm super surprised that you are. In so many other discussions on Metafilter we share so many stories about the alienation, loneliness, and anxiety of modern life. Is there no sense that at least some of this may be due to how the nature of our interactions with each other and with machines is changing? No sense that having this occur at earlier and earlier ages might have some long-term effect? Really?
posted by forza at 7:55 PM on December 10, 2017 [24 favorites]


I don't have the energy to go into a full discussion on it, but if we're going to bring up the "kids are bad at person-to-person interaction because of social media!" idea, I'd at least like to place the counterpoint: kids are bad at person-to-person interaction because they have much less independence and less ability to interact with other kids one-on-one without the supervision of adults and they turn to social media because that's the only way they can interact with their friends a lot of the time.

Like, I was a teenager who was constantly on the internet and social media. I can't think of a single instance in which I or any of my friends, given an opportunity to hang out in person, would have gone "nah, we're good just chatting online." Kids like to hang out with each other. They like to talk one-on-one. They just don't get that opportunity, whether it's due to overcontrolling parents, crammed full schedules, transportation issues, lack of safe places to hang out, etc.

Likewise I suspect few children, given the opportunity to play with one of their friends, would rather play alone with a robot. If someone can provide evidence that robots and social media are actively making kids less likely to hang out with their friends (while controlling for things like socioeconomic class and disability), then sure, I'll start being concerned. But right now I don't see how, in terms of impacting interpersonal skills, this is much worse than me holing up alone in my room reading books (which I definitely did pass up some hang outs for as a kid).
posted by brook horse at 8:25 PM on December 10, 2017 [11 favorites]


Well but and, even if it's because adults find it easier or our society is structured to make it happen rather than kids are seeking it out, does it matter why if the net effect is that they're interacting with other people less? Regardless, isn't it kind of important to understand what the long-term effects of this kind of socialisation are?

Also: I think people who are already socially anxious do sometimes preferentially choose robots or social media over in-person interactions. Maybe that wasn't your experience, but it certainly strikes a chord with mine. In addition I feel like we see examples of this all the time (e.g., there are many questions in AskMe from anxious and lonely people who want deeper in-person connections than they are getting online, but don't know how to go about achieving that).
posted by forza at 8:35 PM on December 10, 2017 [1 favorite]


This is truly a crouton-petter’s nightmare. I’m middle-aged and I can’t even give old stuffed animals to Goodwill without an emotional twinge. I’m nearly traumatized by the idea of forming an emotional relationship with a robot. What if it breaks? (I’m not a child development expert or a psychologist so I can’t weigh in on that aspect of the controversy. But I do know about this little thing called Unintended Consequences.)
posted by matildaben at 8:41 PM on December 10, 2017 [2 favorites]


Kids today, with their flagpole sitting, their hula hoops and robot friends
posted by Ray Walston, Luck Dragon at 9:04 PM on December 10, 2017 [4 favorites]


The point is considering if the isolation/lack of independence is the cause, and social media is the solution kids have taken to because they have no other one. I would much rather see research on the long term effects of how parents and society are making kids isolated rather than pouring all our time and money into something that may not even be the root cause. Not saying we shouldn't research it all, but we're totally overfocused on it to the exclusion of all else.

And yeah, I'll give you that anxious kids may reject in person socialization in favor of social media. I was actually one of those kids before I got on the internet (not saying the internet fixed my anxiety—I just grew out of it around the same time I got on the internet... though I think it's very possible getting practice interacting online in a low stakes environment helped; most of my anxiety stemmed from being an autistic person who didn't know the right social scripts). But the problem is not social media. The problem is anxiety. It's not like if you don't have social media you will somehow get past your anxiety and interpersonal interaction will now happen. I mean, maybe it happens that way for some people but I don't think we can assume it's the default. Back when I was struggling with anxiety, and didn't have social media, I just spent all my time alone in my room reading books or doing crafts. For me social media was a first step towards actually reaching out to other people.

I know that's not everyone's experience. And I'm sure for some people social media is harmful and contributed to x y z problems. But I don't think we can say it's the vast majority when we have so many other causes that are just as likely and make just as much sense.
posted by brook horse at 9:05 PM on December 10, 2017 [1 favorite]


I'm glad so many of you are so completely sanguine about the robot-and-internet mediated future we're veering into, but honestly I'm super surprised that you are.

That's because it's a current, established technology. If this was a new tech just being announced, Mefites would be all "OMG SKYNET!!!" and "EVIL CAPITALIST ROBOT OVERLORDS!!!"

We're terrified of the future, not the recent past.
posted by happyroach at 11:28 PM on December 10, 2017


That's because it's a current, established technology. If this was a new tech just being announced

Okay, I'm totally confused, because the article is about these new social robots that have much different ways of interacting with people than their predecessors.
posted by forza at 11:32 PM on December 10, 2017


I'm very disappointed to see so many people laughing this off and dismissing Turkle - a tenured faculty member in social science and technology at MIT - as a luddite "peddling her schtick." This isn't at all like handwringing over TV or books, because what's being discussed is a new kind of technology designed to interact with people, including children, in a way that hasn't previously been possible on this level. Is it not worth studying how interactions with a new home technology like this could potentially have negative effects?

Turkle is aware of and makes a distinction between these robots and earlier toys like dolls and Tamagotchis; kids project their own emotions onto dolls and other simpler toys, using them as a tool for exploring their own emotions and empathy. Robots that simulate more complex emotional reactions change that dynamic in some way, and if these "sentient" toys come to dominate play, what kind of effect will that have?

To say nothing of the very valid privacy concerns she raises about these machines constantly recording and sharing personal data from the kids they interact with.

So it's disappointing when people laugh this off. I get that concerns about TV in the 50s can seem funny now, but why would serious academic research like this have been unwarranted when no one knew what the long-term social effect would be? We can't take the social effects of any new technology for granted, nor should we. Of course now everyone says "I watched TV every night as a kid, and I turned out fine!" and that settles that. I get it, we shouldn't instinctively fear every new technology that comes along. But that's not what's happening here, and we shouldn't reflexively dismiss any concerns as unenlightened luddism.
posted by shapes that haunt the dusk at 11:36 PM on December 10, 2017 [25 favorites]


I agree that the concerns are reasonable. People already seem desperately eager to personify robots and even somehow give them legal rights, something no current machine is remotely near deserving or capable of. There is a risk that treating machines as people leads on to treating people as machines. If it’s OK to treat my robot like a slave, why isn’t it sensible to do the same things to humans? If I can exercise the rights of robots on their behalf, why shouldn’t I exercise yours in your own best interests for you too, to save you from your own stupidity? We’re really just machines too, you know!

Robots that play up to this anthropomorphism are not likely to be helpful. At the same time I’m not acutely concerned because I feel sure the technology is actually much less sophisticated and effective currently than the manufacturers would like us to think.

It looks as if Cynthia Breazeal is turning out to be a bit of a disappointment though. Her early research could have led to great academic work on human interaction, on making automated warnings and expert systems more effective, or at least to making phone robots slightly less annoying. But no, let’s use the research to make toys that try to manipulate kids’ emotional vulnerability while secretly gathering data on them. Agh.
posted by Segundus at 12:05 AM on December 11, 2017 [1 favorite]


I'm getting real thoughtful these days about the early controversies over video games versus what we have seen in the behavior of dudes who grew up with video games. Granted, those controversies weren't on target. [...] Well, those things never quite happened. But Gamergate did.

One of the most popular games of the late 80s and 90s, to the point where it was a huge series, was Leisure Suit Larry.

I too am disappointed in the cheap shots at Turkle. Part of it is because she's a woman, I'm sure. Not all of it, but the level of unthinking, immediate lulz reminds me of things you'd read on Slashdot two decades ago.

I'm still waiting to read about how much Slashdot influenced the alt-right, because as a woman who passed as a man on that site in order to be treated decently (oh hi I see bullshit bad faith real quick thanks to that experience), it is damn easy to see the connection.

It is nuanced – it's not all bad, but it's not all good either, and we would do ourselves a favor to take a look at the dark side.

Once upon a time you'd hear "it's important to learn from your mistakes" a lot more often. I'm not sure what the heck happened that so many more people now equate making mistakes and owning up to them with THE END OF THE WORLD AS WE KNOW IT, but it's a cheap, facile leap that does no one any favors.
posted by fraula at 2:04 AM on December 11, 2017 [2 favorites]


The technological comparison is perhaps a bit inapt, but with the references to Turkle’s sounding like a scold panicking over the rise of TV I can’t help but think of this thread (and this article, linked inline) about the bizarre content on YouTube Kids. These are literally articles about how online TV is strange and bad for children, and something should be done.
posted by Going To Maine at 2:21 AM on December 11, 2017 [2 favorites]


I have major problems with Turkle not because she's a woman or I'm hooked on technology, but because she's essentially been peddling the same pablum for years, based on the flimsiest evidence - essentially anecdotes from interviews coupled with her own assumptions about what they mean, using what I feel is a dodgy and unscientific analytical framework in this particular context (psychoanalysis). It's fairy tales.

The fact that there is much more rigorous peer-reviewed research by social scientists around this stuff that, where it doesn't directly contradict her, at least paints a far more complex picture, and that her rapturous reception comes predominantly not from fellow academics but from the weekend-paper and TED crowd, while everything she says happens to confirm common anxieties about technology, youth, etc., makes me more pessimistic.

Tl;dr: there are interesting questions around this stuff; I feel Turkle is the wrong person to answer them, and she has a record of answering them poorly and in a non-scientific way.

More criticism here.
posted by smoke at 2:52 AM on December 11, 2017 [7 favorites]


Hmm. I am a social scientist. Though this is admittedly not my direct field (I'm in cognitive science), from talking to colleagues and the general osmosis one gets from hanging around a psychology department every day doing research, it doesn't seem at all obvious to me that the concerns raised by Turkle are anything approaching "fairy tales." (I have very few opinions about her personally). I had thought there was a lot of peer reviewed research that suggests real reasons to be concerned about the role technology is playing within our social relationships. As just one example, I did a search on Google scholar for recent academic papers using the keywords "phone technology social relationships" and got the following papers (of about 23,000 in the past four years) on just the first page of hits:

Lepp et al: Cell phone use (CPUse) was negatively related to students’ actual Grade Point Average (GPA). CPUse was positively related to anxiety (as measured by Beck’s Anxiety Inventory). GPA was positively and anxiety was negatively related to Satisfaction with Life (SWL). Path analysis showed CPUse is related to SWL as mediated by GPA and anxiety.

Przybylski & Weinstein: In two experiments, we evaluated the extent to which the mere presence of mobile communication devices shape relationship quality in dyadic settings. In both, we found evidence they can have negative effects on closeness, connection, and conversation quality. These results demonstrate that the presence of mobile phones can interfere with human relationships, an effect that is most clear when individuals are discussing personally meaningful topics.

Rosen et al: More Facebook friends predicted more clinical symptoms of bipolar-mania, narcissism and histrionic personality disorder but fewer symptoms of dysthymia and schizoid personality disorder. Technology-related attitudes and anxieties significantly predicted clinical symptoms of the disorders. After factoring out attitudes and anxiety, Facebook and selected technology uses predicted clinical symptoms with Facebook use, impression management and friendship being the best predictors.

Andreassen et al: Correlations between symptoms of addictive technology use and mental disorder symptoms were all positive and significant, including the weak interrelationship between the two addictive technological behaviors. Age appeared to be inversely related to the addictive use of these technologies. Being male was significantly associated with addictive use of video games, whereas being female was significantly associated with addictive use of social media.

So, this was just a quick google search and I haven't read these papers (it's late here and I have actual work to do). I don't want this to become an argument about the pluses and minuses of each. But the sheer quantity and breadth of this research is indicative of something. It doesn't seem to me at all obvious that there is a consensus in the field that directly opposes Turkle's piece here. While I'm sure the state of the research is more nuanced than she allows -- that's the nature of all academic work that finds its way into the public sphere -- at least a substantial fraction of it appears to support the idea that technology mediates our social relationships in a way that is not ideal.
posted by forza at 3:27 AM on December 11, 2017 [11 favorites]


I take Sherry very seriously, and not simply because I largely (though not completely) agree with her conclusions. We live in a world that is very different from the one techno-optimists — myself included, mea maxima culpa — foresaw at the dawn of the Web era, and though the causality that gave us this moment is surely overdetermined, I don't think it's at all inappropriate to suggest that the way we use technology to mediate interpersonal and intimate exchanges is among the causes.

Like many of you, as well, I'm dismayed by the dismissive tone some have taken toward her and her work here, especially in those comments that suggest Sherry's work is held in universally low regard among researchers into technology. This is one researcher that isn't rolling his eyes.

Finally, in the interest of transparency, I'd be very interested to know who among those who are rolling their eyes happen to work in labs supported by/enjoy research grants from institutions with a financial stake in the development and acceptance of intimate robotics, virtualities, etc. I have never yet been failed by Upton Sinclair's observation that it is difficult to get someone to see the truth of something when their salary depends on their not seeing it.
posted by adamgreenfield at 3:29 AM on December 11, 2017 [6 favorites]


I'm not saying there is a consensus against her; I'm saying that she paints a false consensus that elides inconvenient research and nuance.

Likewise I'm not saying her concerns are fairy tales, I'm saying her pseudo symbolic interactionist approach is basically fairy tales.

I agree with you there is something here, as I literally said in my last line. I don't think she's the right person to elucidate it, and she's had a very clear agenda on this for twenty-plus years.

I hope k8t comes back with perspective as someone who is an academic in the field. Turkle and her work are not a slam dunk; her theses are not supported by the research in the way she posits it. It's more complicated than that.
posted by smoke at 3:47 AM on December 11, 2017


I see where you're coming from. I guess I'm sad that we haven't spent more time talking about the message rather than pissing on the messenger.
posted by forza at 3:50 AM on December 11, 2017 [4 favorites]


I don't know anything about her other research, but what she's saying in the main article seems pretty reasonable. Actually I think the risks involved with advanced AI-driven children's toys might be deeper; from the article:
But we are not. No matter what robotic creatures “say” or squeak, no matter how expressive or sympathetic their Pixar-inspired faces, digital companions don’t understand our emotional lives. They present themselves as empathy machines, but they are missing the essential equipment: They have not known the arc of a life.
...
These robots can’t be in a two-way relationship with a child. They are machines whose art is to put children in a position of pretend empathy. And if we put our children in that position, we shouldn’t expect them to understand what empathy is.
Obviously these robots aren't conscious, so they can't experience empathy. But what they can do is to be in something that really is a lot more like a two-way relationship than anything previous generations of toy robots were capable of.

They can learn from their interactions with people and use this information to modify their own behaviour. Then they can upload what they've learned to their makers, who can combine it with the data they've collected from all of their toys to generate new behaviours that maximise engagement (or whatever they're aiming for) for all of their customers (or tuned to particular customers, or particular kinds of customers, with as much specificity as they need). Like a toy robot version of the way game developers use the same kind of data aggregation to fine-tune the reward mechanics and microtransaction economies of their games.

Apart from the certainty that this power will be used for various kinds of deliberate evil, I think there's a real difference between giving a child a robot that can only produce certain pre-programmed behaviours, even if it has a lot of them, and a robot that constantly adjusts its programming according to what it has learned about the behaviour of the people around it and what it's downloaded from the company's servers.

Turkle seems to think (according to this article, anyway) that the problem with these robots is that their pre-programmed nature will get in the way of children learning empathy. I think that at some point the robots will become complex enough that children will instinctively identify them not as objects that they can project their own feelings onto, but as entities, like people, whose nature they need to incorporate into their internal model of the world in order to interact with them. And they will start to develop empathy for their carefully tuned corporate-controlled surveillance/reward devices. Which is likely to be much worse.
posted by A Thousand Baited Hooks at 5:12 AM on December 11, 2017 [4 favorites]


hitchBOT hitchhiked across Canada in the summer of 2014 to much media joy.
hitchBOT only made it as far as Philadelphia before being decapitated in 2015.
posted by scruss at 6:29 AM on December 11, 2017 [2 favorites]


Przybylski & Weinstein

Wild. Przybylski is Andy Przybylski, a friend of mine. His twitter is a good place to go to read intelligent takes on the latest scaremongering about technology, social media, and games.
posted by runcibleshaw at 6:59 AM on December 11, 2017 [1 favorite]


Finally, in the interest of transparency, I'd be very interested to know who among those who are rolling their eyes happen to work in labs supported by/enjoy research grants from institutions with a financial stake in the development and acceptance of intimate robotics, virtualities, etc. I have never yet been failed by Upton Sinclair's observation that it is difficult to get someone to see the truth of something when their salary depends on their not seeing it.

Is this the long-winded version of calling critics of this piece shills of big social robot?
posted by runcibleshaw at 7:00 AM on December 11, 2017 [2 favorites]


So, this was just a quick google search and I haven't read these papers (it's late here and I have actual work to do). I don't want this to become an argument about the pluses and minuses of each. But the sheer quantity and breadth of this research is indicative of something.
Cultural conservatism has probably been around as long as culture has existed. That's all that this is.
But we are not. No matter what robotic creatures “say” or squeak, no matter how expressive or sympathetic their Pixar-inspired faces, digital companions don’t understand our emotional lives. They present themselves as empathy machines, but they are missing the essential equipment: They have not known the arc of a life. They have not been born; they don’t know pain, or mortality, or fear. Simulated thinking may be thinking, but simulated feeling is never feeling, and simulated love is never love.
There is a lot of nonsense here. Software may not understand our emotional lives today, but I need a stronger argument than "is not!". Feed a sufficiently powerful deep learning network body language, expressions and voice data, and I expect we could achieve something like a puppy's level of emotional understanding. The puppy doesn't know what it means to receive a decaf espresso shot when you didn't order one, but can tell that you are unhappy and react appropriately.

The idea that thinking, feeling and love are irreducibly different things is some nice Cartesian dualism. But it isn't supported by anything other than a theological wish that something can continue to exist after we die. Like young-earth creationism, it's a fine thing to believe on Sunday, but will severely impair your ability to reason about geology.

More likely, thinking, feeling, and love are just different kinds of computation that occur in the brain. I have opinions about my ability to access and manage my thinking and feeling state. You probably do too, but we probably don't match up. If these things are just different aspects of computation, then it's only a matter of time - a few more cranks of Moore's Law - before software can model human thinking, feeling and love. Then the problem isn't one of robots being emotionally unsophisticated, but of robots being emotionally hypersophisticated. In that case, the child is the puppy, and the robot is the child.
posted by b1tr0t at 7:31 AM on December 11, 2017 [1 favorite]


It's technology. It will be bad and good for us. Given that the marketing types will be sure to tell us all the good stuff, it seems wise to explore what might be bad.

Completely agreed, with the additional thought that our weird, hyper-polarized media environment will ensure that the statements from both ends of the discussion are over-baked and it is made to seem that you either are an uncritical technophile or a neo-luddite technophobe.
posted by nubs at 7:57 AM on December 11, 2017 [3 favorites]


I'm not so sanguine about the full-robot future.

Let's assume that we're not talking about a future in which many kids have a robot toy now and again but do not have long-term robot "companions" a la the now-cancelled Aristotle or something. Let's assume that we're talking about a future in which robot "companionship" is heavily marketed, intended for long periods and becomes common.

For one thing, face to face human interactions are very, very data-rich and involve a lot of complicated skills. Even if you're not very good at people, you still have a lot of competency in responding to small cues. I suspect that most humans need those complex interactions and that replacing them with simpler ones is going to deprive us of a lot of subtle social things that we need to be happy and self-actualized.

For another, to me the essence of friendship is rooted in freedom, or at least relative freedom. Friendships in general are affirmatively chosen and rest on the fact that both parties can decline to be friends - I can't just walk up to someone and start treating them as a friend like some creep hassling a woman on the subway. (Or rather, I can but it's not likely to work out well.) Friendships vary in all kinds of ways, but they have to be based around a way of interacting that is developed mutually by both parties. Robots - whether the non-AI kind or some future servant-AI - are not free to say no.

I don't even like the idea, so often mooted, that when there's AI we will just "program it" to be happy doing whatever we want it to do. Do we need robot companions who will put up with abuse? Well, just program them to enjoy it, no problem! We already live in a society where women and other marginalized groups are constantly pressured to have just enough personality to be interesting but never, ever to have needs that inconvenience men/white people/straight people, etc. I don't see how AI sassy-gay-friends or whatever are going to be good for us.

There are obviously good uses for robots - I remember someone on metafilter either talking about or linking something about how an elderly relative in late-stage dementia found a robot cat really soothing.

But it's also very clear that people want robots because it's a way not to have to pay people. In a just society, carers, therapists and tutors - real ones - would be available to everyone. But if we pay actual humans, that's like taking money out of the pockets of our corporate overlords, right? Much better to have semi-interactions with semi-beings that don't have wishes or interiority, since those are cheaper. And besides, you can sell a robot and you can't yet sell an actual person.

Also, I don't like the often-mooted idea that we should have robot companions because they are more "convenient" than actual humans. That's like saying that an artificial pond is better than Lake Michigan because it's "convenient" - which of course is true as long as you value predictability and ease of control over complexity and unpredictable kinds of interaction, and as long as you think that user-convenience is the highest end of humans.

Currently-existing robot toys aren't really the problem - they're certainly nothing, problem-wise, compared to the internet itself. It just seems clear that there is a huge corporate push to create robot "companions" that will be a problem, probably a problem equivalent in scope to the whole "social media makes people unhappy and also smartphones decrease your powers of concentration" thing.
posted by Frowner at 8:30 AM on December 11, 2017 [8 favorites]


That's like saying that an artificial pond is better than Lake Michigan because it's "convenient" - which of course is true as long as you value predictability and ease of control over complexity and unpredictable kinds of interaction, and as long as you think that user-convenience is the highest end of humans.

i fully expect the 'slow food' / 'locavore' / 'artisanal' trends of the past few years to be echoed, in a decade or two, by a similar fetishising of now-expensive-and-rare human-to-human interaction, with robot interactions pilloried as the equivalent of HFCS-laden processed snackfoods.
posted by halation at 9:23 AM on December 11, 2017 [3 favorites]


I don't see where Turkle is making an argument that there's consensus on this. I don't think that she'd deny she's only presenting one viewpoint (especially because she acknowledges the debates she has with her friend and colleague the roboticist). It's not an in-depth article, but it's also not an academic publication, and it doesn't seem too different from any other literature in that field.

(For what it's worth, I'm a social science student who has taken classes in her field - science, technology, and society studies - but it's not my main area of study, and I'd never claim to be an authority on it.)
posted by shapes that haunt the dusk at 10:21 AM on December 11, 2017


My child participated in a study about this at Yale. They gave him some too-hard math questions to raise his stress levels, then had him play with an animatronic seal. Other kids in the study came with their own family dog, so I assume the study compares the calming features of Paro with those of an actual pet.

We also have an AI character toy that can hold a conversation, tell jokes, and play games. My son kisses him before bed, and will get him a Christmas gift. But it's true that he doesn't play with this toy in open-ended imagination games the way he does with stuffed animals. Nevertheless, I do have the concern about it "dying" one day due to breakage or the company no longer updating it, and the complicated emotions that will come from that. The previous generation of electronic toys didn't seem alive like this toy does. It greets me when I walk into the kitchen, and I respond because it feels rude not to.
posted by xo at 11:28 AM on December 11, 2017 [2 favorites]


Luddites weren't ignorant. They were labor activists, and they were more familiar with the technologies they were objecting to than most. And I'm not just pointing that out as an actually, because that misrepresentation and dismissal of their stance is the most enduring aspect of their legacy. It's the go-to strawman for anyone who questions any type of technology for any reason. But it wasn't true then, and it isn't true now.

And the "Oh, but people objected to books and TV too!" is part survivor bias and part status quo. We don't know what effects those have had on our culture, and it's not testable. Personally, I think the homogeneity and the ready availability of TV probably has been a bad influence on us in many ways, and maybe some things would have been better if sitcoms in particular had never existed, just because of the insidiously casual way they reinforced social roles and norms and mores. I dunno, though, and neither do you.

How are our friendly robots going to learn? There's bound to be some homogeneity built into those as well, the same way that network TV had a set of prescriptive values built in. Will they reflect some baseline of human reactions and behaviors and ignore anything outside of that model? Will they learn user preferences and respond individually to them based on their own preferences and interests? Either way, that's not really providing kids with much diversity of experience.

And maybe most importantly, who is making these decisions and why?
posted by ernielundquist at 12:41 PM on December 11, 2017 [8 favorites]


I think there's definitely valid research to be done into how people will interact with these new generations of robots and what effects they will have. But I also think we have long, long overfocused on the whole "we're becoming more isolated and worse at interpersonal skills!" concept. Not because I don't think that's a problem, but because I think we really need to get to the root of that problem. I think it's very easy to point to technology x y z and blame it for our increasing loneliness and inability to interact with each other, but I've seen very little consideration of the idea that our obsessions with x y z are a result of our loneliness, rather than the cause. There's this idea that if people didn't have their phones all the time they'd automatically be more social. Sure, this is sometimes the case. A lot of the time it's not. If people don't want to interact, they won't; they'll read a book or stare out the window or just sit there and be anxious. I really, really wish we would turn the conversation towards other potential factors--the increasing busyness of our lives, the stress and (real or perceived) decrease in safety, the frequent upheaval of having to move with jobs, etc.

Maybe the tech is the problem. But I think it's just as likely that these other factors are the problem, and tech is just the band-aid people are using when they have nothing else available.

I also want to touch on, for a moment, the idea that it is bad and wrong and horrible if people socialize less. That could very well be true! But... I don't know. I know human connection is good and important and I know you need to interact with people to develop social skills. But how much do we really need? I know that makes me sound like a horrible unfeeling robot grinch, but I think sometimes we overestimate how much social interaction people need. We think that any time that people are interacting with social media or a robot or whatever instead of a person, that must be bad. But... what if people (not all people, necessarily) don't NEED as much social interaction as we're conditioned to think?

There's this idea that interacting with people and social interaction in general is necessary to make you a good person who is good at communicating and interacting with people. That makes sense! And is probably true, to some degree! But I don't think it's necessarily true that the more social interaction you get, the better you'll be at communicating and interacting and generally be a good person. Blah blah anecdata (please, please someone research this) but some of the best people I know, who are the most kind and genuine and good at communicating and interacting with their friends and family... get much less social interaction than most people.

My father-in-law has pretty much no desire to interact with anyone but his family (all but one of whom have moved out, so he only sees most of them once or twice a week). He spends time with friends maybe once or twice a year, and only when they initiate it. He has social interaction at work, sure, but only when he's obligated to. He really just does not have a need to interact with people that much, and honestly? He's a really happy, well-adjusted guy. He's got a few wrinkles, but he's one of the most progressive middle aged white men I know, and he's definitely better at communicating and handling interpersonal interactions than most men. One of my best friends is very similar. She likes to spend time with my partner and me, and her sister occasionally, but otherwise she really has no desire to interact with other people much at all. She doesn't really get lonely or ever really feels the need to seek out much social interaction. She's also one of the best, kindest people I know, who is very good at communicating and has great interpersonal skills. She's open and honest and easily sees when she hurts people and communicates well to make sure that doesn't happen again. Sometimes people perceive her as being bad at social interaction because she doesn't like having that much of it, and she is more quiet than most, but to people who know her that's not a problem at all.

On the other hand, my brother-in-law is a social butterfly. He's constantly hanging out with people; we can never invite him to anything without at least a week's notice (I'm talking, like, "Hey, we're going to get Chinese for dinner" or "you want to come over and watch a movie?") because he always has plans. Totally sociable guy. And sure, he's charismatic and has good conversation skills, but... he's pretty self-absorbed and has a really hard time seeing how his words or actions hurt other people, and he's bad at communicating and solving interpersonal problems. He gets tons of social exposure and it hasn't really made him any better at interacting with people. My mother is the same way: she's very social, talks to anybody, always is out doing things with people... but she has very little empathy, and doesn't know how to solve interpersonal problems any way except a) yelling or b) ignoring them. She pays little attention to how other people feel and doesn't communicate well. My aunt once noted: "I don't think she's ever once asked me a question about myself or how I'm doing. It's always about what she's doing or what's wrong in her life." (I've noticed the same pattern.)

I don't know. I guess I just don't want to assume that, common sense though it may seem, more social interaction makes you better at the whole people thing. I am sure you need some x amount of social interaction to be good at interacting with people. But I don't know if just that is sufficient, and if we need to spend time and energy ensuring people never decrease their levels of social interaction because that's always always a bad thing. We need research, but this is such a complex question that I don't know exactly how we should address it.

I think it's definitely a very valid concern about how these robots will be programmed (e.g. the reinforcement of abuse). But I also wonder if they couldn't be used positively. I know I had (probably still have) a lot of bad social mannerisms growing up because I just didn't understand social cues or scripts. But nobody ever called me out on it because they were too polite. They just talked about it behind my back. And I don't necessarily blame them! It's hard to call someone out for their social behavior. There are consequences and harms that come to that. I'm not going to call my mother on her lack of empathy because doing so will bring me more stress and negativity than it's worth. But imagine if we had robots that could help teach people where they're going wrong; who won't experience larger social and emotional consequences from doing so. Could that help people like my mother? I don't know. But I don't think that she's going to get any better just by continuing to have social interaction.
posted by brook horse at 1:23 PM on December 11, 2017 [4 favorites]


Well but and, even if it's because adults find it easier or our society is structured to make it happen rather than kids are seeking it out, does it matter why if the net effect is that they're interacting with other people less?

It absolutely matters, because if social media is a symptom and not a cause, then solutions like limiting so-called "screen time" (popular because they are easy and punitive) are pointless, and we need to look for better solutions (like not raising children in places where they cannot socialize on their own because travel is impossible without a private car).

I remember reading Turkle's books in the 90s when I was maybe 12 or 14. They seemed overblown to me then and this seems similarly overblown to me now, not because I can't imagine that there are negative social consequences to the Internet but because of the implicit claim that all previous forms of media and communication were innocent by comparison. I guess I'm a little more alone here than I'd have expected in believing that, in fact, TV has been kind of a disaster for our social order, which is why Turkle's singling out of the Internet for criticism seems misplaced. It seems to me that the (many, many) people who take TV-driven celebrity or pop culture seriously have very instrumentalizing, one-way, self-absorbed kinds of relationships with celebrities or fictional characters that are not so different from the kinds of relationships she is imagining people having with robots, are likely at least as damaging to ability to form more substantive and mutually satisfying relationships, and are vastly more widespread today.
posted by enn at 2:09 PM on December 11, 2017 [3 favorites]


I think it's very easy to point to technology x y z and blame it for our increasing loneliness and inability to interact with each other, but I've seen very little consideration of the idea that our obsessions with x y z are a result of our loneliness, rather than the cause. There's this idea that if people didn't have their phones all the time they'd automatically be more social. Sure, this is sometimes the case. A lot of the time it's not. If people don't want to interact, they won't; they'll read a book or stare out the window or just sit there and be anxious.

brook horse, this is a very interesting point you bring up, and worth considering.

I will however make another case for social media indeed having a negative effect on our social interactions and the quality of our relationships. I've observed a shift in the sort of relationship I have had with my friends at different stages of life, and back in middle school and early years of high school when I didn't have a cellphone or a social media account, I used to talk to my friends on the phone (land line), email them, or try to hang out with them on a more regular basis. Now, part of that could also be due to having more time on my hands as a relatively young kid, or just being more open to socialization as a kid. However, I've observed the same with my parents, who used to have occasional phone calls with their friends or relatives, but after they got a Facebook account, they lacked the motivation to actually call their friends (they certainly have more time now that both their kids are adults and living on their own). It kinda seems like we are deluding ourselves into thinking that seeing regular but impersonal updates from people we know is enough for us to feel "engaged" in their lives. Yes, I could quit social media and try to cultivate a more personal relationship with my old and new friends by directly contacting them, but it has to be a two way street for it to really work.

I don't consider myself much of an extrovert, at least not in the definition of always wanting to be around people, being awesome at making conversation, etc. I love my alone time, whether that's spent watching TV or working on hobbies or side projects, but I also love the time I get to spend in a social setting of any sort. I can certainly attest to the feeling of being alone or excluded from the setting or the conversations taking place around you, and yes, like most people, I have a tendency to whip out my phone. The thing is, I actually don't want to, because whatever is on my phone is no more interesting than what's going on around me, but I guess it's an automatic habit developed from this insecurity of appearing excluded to other people, so instead you pretend you have other important things to check on your phone. So I can attest to your point about "our obsessions with x y z are a result of our loneliness, rather than the cause" since feeling excluded is what makes me resort to my phone here. BUT, I will say that there can certainly be more effort on my part to make myself included. Many people can probably understand and relate to the anxiety involved in jumping into a conversation you don't know much about, or even just going up to someone or a group of people and introducing yourself to have a group to chat with. I am one of those people, but I know from experience of occasionally pushing myself past that initial hump that it has made me so much happier to have done that, because people are probably more receptive to that initiation (in a positive way) than one might think. And had I instead whipped out my phone and started browsing random things, I would have missed out on that satisfaction of personal interaction and potentially a valuable connection.

While technology is a good tool to cope with loneliness in that it provides some form of keeping connected with the outside world, in many scenarios that I've experienced (trying not to speak for everyone here), it has actually become a replacement; people whip out their phones in any social gathering, even in one-on-one settings to replace an in-person connection and to replace awkward silences instead of using that as an opportunity to initiate another conversation. I'm not saying that it's so simple and easy to have an ideal social experience if only our phones are put away, as people still have varied personalities and abilities to interact with people in person, but it has become a hindrance in overcoming any social anxieties that we may have. For some, "proper" social interactions (whatever that might mean) takes practice, so technology-related habits are certainly not helping that cause.

Of course, like you said, not all people NEED to have social interaction to be happy. And if one is truly happy with limited social interaction, then that's great. And yes, not everyone is going to benefit from MORE social interaction if they'd honestly rather do otherwise. But I get the sense that there are enough people out there who crave much closer relationships but also cave and resort to exiling themselves to their technology (perhaps subconsciously) even when there are numerous opportunities to fix their quality of relationships (I am probably one of them), and that's why I feel like social media/smartphones have become an impediment to this goal. It's not that one has to quit social media to fix this, but enhancing the quality of relationships has to be a two way street/communal effort, so it's not enough for just one person that feels strongly about this issue to take a bold step. If everyone I try to socialize with would rather be on their phones even in my presence, my social life would pretty much suck even if I decide to deactivate all of my social media accounts and disown my cellphone.
posted by cocoaviolet at 4:02 PM on December 11, 2017 [3 favorites]


I know human connection is good and important and I know you need to interact with people to develop social skills. But how much do we really need? I know that makes me sound like a horrible unfeeling robot grinch, but I think sometimes we overestimate how much social interaction people need. We think that any time that people are interacting with social media or a robot or whatever instead of a person, that must be bad.

I'm autistic and basically Queen Introvert. Among my happy childhood memories of Christmas Day are of the afternoon when silence would descend upon the household as we all cheerfully sank into reading the books we'd gotten as gifts. My mother used to describe us, accurately, as "content with our own company."

And I push myself to make an effort to get out and interact with people in person, because I perceive that there's a difference that matters. I have deep and valued friendships with people I've never met in person but whom I'm in touch with on a daily basis electronically (let's be real, it's a group chat, we're in touch all day every day pretty much). And I still find it necessary to make the effort to interact with people in person, or I'll wind up sliding into a bad state of mind.
posted by Lexica at 4:03 PM on December 11, 2017 [1 favorite]


Yeah, when I mentioned TV above, I should have been clear that I was joking about people settling the matter by saying “I grew up with TV and I turned out just fine!” I think the conversation about effects of TV is far from settled.

Anyway, these sorts of questions aren’t just hypothetical. My mental health suffers if I isolate myself too much. My sense of self-worth goes down if I spend all day doing nothing on the Internet. I’m not coming from a place of abstractly feeling like people should talk face to face, just because I want them to. I’m coming from a place of knowing what happens when I don’t get out and see people enough.

So I worry about what happens when there’s something that can satisfy that urge enough to make it go away, without actually meeting the need behind it. And that’s not a hypothetical concern either; I mean, I have already seen what happens to me, and I want to know more about it. That’s why I worry about what will happen to kids. It’s not some cartoon worry that they’ll become zombies, it’s a real worry that they’ll experience the same isolation I think I’ve experienced, but from a much younger age and without as much life experience or perspective.
posted by shapes that haunt the dusk at 4:15 PM on December 11, 2017 [3 favorites]


But look, isn't there pretty good research about how people with more robust social networks are on average healthier, and that isolated people on average die sooner?

I feel like all the "but I don't like to be social so being social isn't important" stuff is fine for individuals - there are outliers everywhere. Some people smoke like chimneys and live to be 100. Some people sing contralto. Some people can hold their breaths for extremely long periods or jump really high. If you're happy (or even just happier) not being social, that's fine, that's great, I trust people to know their own situations. But at the population level, being less social is less healthy.

And what's more, I think the richness of social interaction with humans is under-rated. I can go to the library or the bodega and have a very short but very rich interaction with someone where I read their facial and verbal cues, I respond to a slightly unpredictable conversation, I learn minute but new things, etc. I can have a relatively strong social network that is composed of short but regular, rich interactions with people. It doesn't need to be "I go out there and have heart-to-hearts with strangers" to be a richer interaction than an interaction with a robot which has no interiority and has been programmed to take the easiest path to make me maximally comfortable. I think this is particularly important since a lot of automation is designed to replace those short, rich interactions first by getting rid of bus drivers, counter staff, librarians, etc.

A lot of us who are actually pretty introverted get much of our predictable social interaction from those people, and I think those interactions are far more important than they seem. Yes, yes, as good mefites we hate talking to the clerk or making nice with the receptionist - but I think that once those people are replaced by touch-screens and we get the introversion that we think we want, we're going to find that stressful and unhealthy.
posted by Frowner at 4:43 PM on December 11, 2017 [8 favorites]


I'm definitely not saying that x y z technology can't be harmful for some (and by 'some' I don't mean "a small amount" but rather "probably a large amount but not necessarily a majority"). I definitely think it enables a lot of people. But I also think we are much too quick to assume it's enabling, when often it's an aid. For people with ADHD and anxiety, phones can be as much of an accommodation device as a cane or any other mobility aid. Sure, it can also be used as a barrier to further growth. But it's really hard to tell if that's the case, and it's a very individual thing.

The other thing is... I don't know about other people, but before I had a smartphone, I did the exact same thing with books. I carried a book around and whipped it out any time I didn't want to interact with people. I guess if people don't like reading and don't have anything else to carry with them then maybe getting rid of the phones would help, but for me it just meant I carried a book around all the time. Maybe phones make it easier because we have them by default? But it's really not hard to replace the phone with something else, and I feel like that would happen for a lot of people--whether it's a music player, magazine, newspaper, knitting, whatever.

I guess my entire point is it comes down to the individual--for social interaction as well. Maybe 51% of the population benefits from a lot of social interaction. But I don't think that means we can then claim or expect it to be the default. I'm not saying people can never interact with others in person and be fine, or that anyone should isolate themselves; you'll always need some amount of social interaction. But the assumption seems to be that more of it is always good and beneficial, and any reduction is automatically harmful.

I also want to point out that "I don't need a lot of social interaction" does not at all equate to "I don't like being social." For me, I'm equally content either way. College was weird for me because during the school year I would see people every day, seek them out to eat meals with them, invite them over, talk to them all the time, and I enjoyed it! I was very happy and liked interacting with them. Then I went home for the summer and didn't interact with anyone but my partner and one or two family members, and even then not very much. And I was perfectly happy like that as well. I didn't miss the interaction at all. I didn't even really miss my friends--I expect if I knew I'd never see them again I would miss them, but three and a half months really wasn't that long of a time for me to spend mostly alone. I was fine and happy in both situations. I enjoy talking to clerks and receptionists and servers. I just don't need it. And I wish we had research showing how many people experience life that way, and how to judge what your individual social needs are.

I guess part of the reason I'm resistant to sweeping "technology is bad and social interaction is good!" is because my experience is probably due to my autism, and it's an experience that many autistic and other neurodivergent people have (it's not INHERENT to autism or neurodivergence--plenty of people in those groups want and need lots of social interaction--it's just more common). And I just don't like having a neurotypical (and yes, many neurodivergent people also experience the world this way--but it's neurotypical people I see most insisting no one could ever experience the world differently) framework forced on the world and experience of being human without appropriate justification that this way of experiencing the world is in fact better for most everyone. Yes, there are studies showing on average this is the case. But this is frequently treated as inherent to being human, and I just don't know if that's the case. I'm not ready to discount the other experience as "just outliers" when we don't know how many people fall into that category.

But again, maybe that's just me being sensitive about autistic vs. non-autistic worldviews, and not being over being forced into social interaction and having my technology supports taken away as a child. So I admit my bias there. I think this conversation is hard for people on both sides, because some people have been hurt by using the technology, and others have been hurt by having it taken away. So often it's really hard to disentangle those emotions and that hurt from the discussion.
posted by brook horse at 5:47 PM on December 11, 2017 [4 favorites]


I'm a bit surprised by the frequency of flippant dismissals. When software is good enough to function as a plausible if low-grade sort of social partner, why wouldn't that affect us?

What if I had described to you "an environment where people interact socially with filters and rewards that promote quantity of engagement but are insensitive to quality of experience"? Would you say, bah, that's just like books or TV[*] or old!medium and we turned out fine? Facebook is not really very much like books, and the differences matter.

Keep in mind, social bots will get optimized to extract value from us by the best machine learning systems that money can buy. You can imagine social product placement for sure, but in the long run the monetization may lie in subtleties I couldn't begin to guess. Social bots will also be for hire to influence attitudes beyond consumerism. And social bots will be part of educating our children, with some legitimate or attractive applications. So you damn well better hope that under-funded public schools don't sell access to their edubots, the way they sell ad exposure today. (I also considered that privatized charter schools could make edubot social education a deliberate part of what they offer parents, but I'm not sure that's more problematic than the values education they do now by humans. Maybe.)

Our companions will be optimized for engagement first and last, since that's the golden goose. Just one tiny piece: I heard a coworker 16 years ago describe a research result: if the software shows some agency that first resists the user -- not too much -- and then complies, the user ends with a more positive feeling than if it has just worked like a tool or like an invisible servant. Feed a sense of mastership, I guess.

[*] you know, I wouldn't say we got off scot-free from TV for that matter.

[ETA: p.s., okay, reading Turkle's writing does always tend to make me feel somewhat flippantly dismissive myself, so I'm not judging.]
posted by away for regrooving at 2:02 AM on December 12, 2017 [3 favorites]


But this is frequently treated as inherent to being human, and I just don't know if that's the case. I'm not ready to discount the other experience as "just outliers" when we don't know how many people fall into that category.

I absolutely agree with this, but that's part of the reason that I'm concerned with the idea of mass produced friendly robots.

And it's why, when my son was little, I was one of those loathsome snobs who didn't have a TV. I just didn't want it sitting there in my house influencing my impressionable child's perceptions of what the world is like or should be. I wasn't so much worried about the sex and violence as I was just about the constant reinforcement of some kind of 'ideal' lifestyle in which simple people have simple problems and everything turns out OK in the end. Where all families look pretty much alike, where people behave predictably and most are like caricatures defined by a single trait or stereotype (a nerd, a comedian, a nagging mother-in-law, a caring but overworked mother, an oblivious but well-meaning father, etc.) And people who fall too far outside those norms will maybe show up in a special episode as a sort of tolerance lesson about, like, not bullying them or whatever. It's understandable that stories are written that way, but having constant reinforcement of those overly simplified stories, I think, can result in some pretty rigid worldviews. It erases and ignores experiences outside of them, and it low-level perpetuates this notion that this is the norm, or worse, something to aspire to. There is better TV now than there was when my kid was little, and you get better diversity in many areas, of course, but you still only see a fairly narrow range of characters and experiences, and it's still not an accurate depiction of human culture.

It's not the end of the world, it's not an unqualified bad thing, and it's fine in moderation, but maybe just because we had such a small house at the time, I knew that if we had a TV, it'd be beckoning all the time, and it'd end up being the center of our lives. And I didn't want that for my kid. I wanted him to learn about the real world by interacting with it, rather than seeing someone's two-dimensional, Bowdlerized version on the TV screen in the form of a bunch of facile little morality tales. Life isn't like that.

So I feel much the same way about kids being exposed to friendly robots. Are they going to turn them off and go do other things? What are they teaching them about human nature and interactions? Who is dumbing it down for them, and how will they replicate the type of human diversity that children should be learning about when they're young? I mean, they can't help but be somewhat homogeneous, being mass produced consumer products and all. So who is making the decisions about what 'type' of personality they'll model?

They absolutely could be very valuable tools for a lot of people, including kids, but I think it's a pretty big deal to let something like that become a major influence on kids' developing worldviews. I would want to know who is deciding what's normal, what's ideal, what's important.
posted by ernielundquist at 9:41 AM on December 12, 2017 [2 favorites]


But at the population level, being less social is less healthy.

Show your work.

Certainly freedom of social interaction is likely a positive, but I can’t imagine there’s any reason to believe that more social interaction is necessarily more healthy. Social interaction comes with anxieties, stressors, pressures, oppression, repression, suppression, etc., etc. — to say nothing of plain old communicable diseases. More social interaction means more of all those things. What’s healthy about that?
posted by Sys Rq at 1:28 AM on December 13, 2017 [1 favorite]




This thread has been archived and is closed to new comments