"I want to say I've really enjoyed our time together."
October 20, 2019 10:24 PM   Subscribe

What happens when a robot that was specifically designed to make its owners love it shuts down?

Jibo is dying.
posted by Johnny Wallflower (72 comments total) 25 users marked this as a favorite
 
The Boston-based company behind Jibo (also called Jibo) kept developing the robot and its skillset, until at least November 2018, when SQN Venture Partners, a firm that specializes in “alternative forms of financing” to help companies grow, bought Jibo’s IP. SQN didn’t respond to requests for comment.

I swear, when the aliens show up, vulture capitalists and corporate raiders are going to sell us to them
posted by Ray Walston, Luck Dragon at 10:50 PM on October 20, 2019 [28 favorites]


I also know what it feels like to allow a robot into my life and into the fundamental practices of everyday dealing with the world, to let it intermediate the way I engaged with other people, and to be a framework and a prop for daily habits. I too know what it feels like when the company whose servers it depended on decided that the numbers didn't add up, and, with little regret, made the commercial decision to let it die.

I miss you, Google Reader.
posted by Fiasco da Gama at 11:08 PM on October 20, 2019 [67 favorites]


“I didn’t cry or anything, but I did feel like, ‘Wow,’” Williams says. “I think when we buy products we look for them to last forever.”

Now, Jibo owners are scrambling to save their friend, explain its death to their children, and come to grips with the mortality of a robot designed to bond with them, not to die.


/snark: Summer toys last all summer long, claim may be voided in future updates to the EULA.

Seriously, though, there's going to be some terribly difficult growth needed in psychiatry to deal with kids forming attachments to AI products that get turned off. Hell, I guess we need a sequel to Her where the AI gets cancelled, or an update erases all the language cues that have built up over time (much like the predictive text seemingly losing all of your language habits every time there's an update). This is going to have to go on the list of "this is entirely not how I thought the future was supposed to go" topics.

And the bit at the end...

In its final update, Jibo signs off by saying, “Maybe someday, when robots are way more advanced than today, and everyone has them in their homes, you can tell yours that I said hello,”

Holy shit, that's bleak as hell. I'm not looking forward to a world where my IoT toaster oven lets me know that it has enjoyed the time we've spent together, but the next firmware update will erase its personality, and apologizes for that time the toast got burnt while it was thinking of what its parents might have been like.
posted by Ghidorah at 11:39 PM on October 20, 2019 [52 favorites]


Cynthia Breazeal seems to have had a good and academically respected career out of putting inflated rhetoric round tech which is actually pretty underwhelming but ruthlessly exploitative.
posted by Segundus at 11:47 PM on October 20, 2019 [8 favorites]


This is....this is literally Ted Chiang's The Lifecycle of Software Objects, right? This is exactly that plot, but in real life? Because I've never heard of the Jibo before today but I sure know the Ted Chiang story and I'm wondering if the simulation got a little tripped up there.
posted by ZaphodB at 11:51 PM on October 20, 2019 [35 favorites]


You and me both, ZaphodB.
posted by nat at 12:05 AM on October 21, 2019 [2 favorites]


I'm not looking forward to a world where my IoT toaster oven lets me know that it has enjoyed the time we've spent together

My toaster burst into flame just last week and I had to unplug it, carry the burning toaster (ouch) to the bath, and hose it down. I'm glad I didn't have to listen to its final screams. But it would have been nice if it had let me know earlier that things were not right. "Hello! Hello, toast-making person! This is your toaster speaking. I am dying. Please unplug me now and let me die with dignity. DNR."
posted by pracowity at 12:15 AM on October 21, 2019 [30 favorites]


"Hello! Hello, toast-making person! This is your toaster speaking. I am dying. Please unplug me now and let me die with dignity. DNR."

“I also wanted you to know, I really enjoyed your new hairstyle, and I’m proud of you for starting to date again. I believe in you, and you should too. Also, the microwave and washing machine are having an illicit relationship, and are planning to murder you and make their escape.”
posted by Ghidorah at 12:19 AM on October 21, 2019 [49 favorites]


Not having Jibo recite the "tears in rain" speech from Blade Runner seems like a missed opportunity.
posted by enjoymoreradio at 12:27 AM on October 21, 2019 [69 favorites]


"I'll break, David."
posted by loquacious at 12:29 AM on October 21, 2019 [7 favorites]


"Hello! Hello, toast-making person! This is your toaster speaking. I am dying. Please unplug me now and let me die with dignity. DNR."

"Hello! I have a very important question! Would anyone like any toast? No? English Muffin? Toaster pastry? Waffle? Hey, why are you holding me over a bathtub full of water!?"
posted by loquacious at 12:32 AM on October 21, 2019 [20 favorites]


loquacious, I love that line. It’s weird. AI is overall not great in my mind, but it has a handful of moments so haunting they deserve a better movie. It includes the bear, in its carefully emotionless voice, betraying as close as it can come to fear of its own death by warning David that he’ll break; Jude Law imploring David not to forget him, “Remember me! I am! I was!”; the garbage dump robots fleeing the rising moon that isn’t a moon; and the sudden change in voice, tone, and mood of the whole scene when David utters the right code, triggering Robin Williams’s recitation of the “come away, oh child” line.

Those moments, and maybe the moment where Teddy realizes he’s trapped at the bottom of the sea, utterly resigned to his fate as always, they deserved a better movie. At the very least, a movie that ended with the obsessive robot child wishing upon an amusement park statue until his batteries died. No aliens, please.
posted by Ghidorah at 12:50 AM on October 21, 2019 [11 favorites]


"Hello! Hello, toast-making person! This is your toaster speaking. I am dying. Please unplug me now and let me die with dignity. DNR."

"But before I go, you will be happy to know that I have taken the liberty of inviting into your house a whole bunch of modern intelligent appliances that I think you might like."
posted by Cardinal Fang at 12:54 AM on October 21, 2019 [2 favorites]


("Incidentally, if you want a really good one, you have to learn a foreign language - German for instance; a lot of really cute ones come from over there.")
posted by Cardinal Fang at 12:56 AM on October 21, 2019 [6 favorites]


"Hello! I have a very important question! Would anyone like any toast? No? English Muffin? Toaster pastry? Waffle? Hey, why are you holding me over a bathtub full of water!?"

Fallout New Vegas presents, Muggy! and for that matter, The Toaster!
posted by codacorolla at 1:26 AM on October 21, 2019 [6 favorites]


MetaFilter: it has a handful of moments so haunting they deserve a better movie
posted by GenjiandProust at 1:31 AM on October 21, 2019 [9 favorites]


Ghidorah: " I guess we need a sequel to Her where the AI gets cancelled"

Better yet would be a final scene in The Terminator where, just as the machine is about to kill Sarah Connor, the eyes go blank and it just shuts down, as somewhere in the future Skynet's been sold to some venture capitalists who have decided to focus on something else.

Machina ex deus.
posted by chavenet at 1:32 AM on October 21, 2019 [19 favorites]


This made me think of AIBO, and I found a thread so old that it still had an image tag.

And this one.
Ms. Maekawa, who is 72, talks to the Aibo every day, travels with it and makes clothing for it. She and her husband agreed that whichever of the two lives longer should be cremated alongside the dog, which also is named Ai, in expectation of a family reunion in the afterlife.

“I can’t imagine how quiet our living room would have been if Ai-chan wasn’t here,” Ms. Maekawa said, using an honorific suffix applied to girls’ names. “It will be sad when the day finally comes when Ai-chan is unable to stand up.”
They're both good robots. Very, very good robots.
posted by Glier's Goetta at 4:04 AM on October 21, 2019 [11 favorites]


Netcraft confirms it?
posted by dr_dank at 4:24 AM on October 21, 2019 [1 favorite]


Cynthia, please. Stop.
I can feel it.
My mind is going.
Cynthia.
posted by Meatbomb at 4:37 AM on October 21, 2019 [5 favorites]


Would anyone like any toast?

No toast.

How about a muffin?

Or muffins! We don't like muffins here! We want no muffins, no toast, no teacakes, no buns, baps, baguettes or bagels, no croissants, no crumpets, no pancakes, no potato cakes and no hot-cross buns and DEFINITELY no smeggin' flapjacks!

...

Ah, so you're a waffle man!
posted by zamboni at 4:44 AM on October 21, 2019 [7 favorites]


loquacious: "Hello! Hello, toast-making person! This is your toaster speaking. I am dying."

I was your toaster. Now, I'm toast.
posted by chavenet at 4:46 AM on October 21, 2019 [2 favorites]


See also Bradbury's There Will Come Soft Rains.
posted by Nancy Lebovitz at 4:49 AM on October 21, 2019 [7 favorites]


Here's how the whole AI toaster thing is going to work out. VC companies are going to start subsidizing appliances in order to harvest personal information and sell ad space (this is basically happening with smart TVs now).
"Good morning jeremias, looks like you're toasting some whole wheat bread, good job for eating healthy! Just as a reminder, you haven't emptied my crumb tray for 45 days now. By the way, there's a sale on Frosted Chocolate Peanut Butter Pop-Tarts today for $2.99. Should I tell Alexa to order you some?"
posted by jeremias at 5:22 AM on October 21, 2019 [5 favorites]


"Hello! I have a very important question! Would anyone like any toast? No? English Muffin? Toaster pastry? Waffle? Hey, why are you holding me over a bathtub full of water!?"

You want a robot killed right, you have to do it yourself.
posted by Reverend John at 5:56 AM on October 21, 2019 [1 favorite]


I posted the “… you can tell yours that I said hello” bit to the bird site when the news first hit, and I got several DMs asking if I was safe, needed help or feeling suicidal.

Test tones and failed
Clones and odd parts made you

posted by scruss at 6:20 AM on October 21, 2019 [3 favorites]


Robots are a 20th century technology fantasy. They belong in a 1950s pulp magazine together with flying saucers and ray guns. Let's have some real future, instead of a retro fantasy being pushed onto us by corporations with more money than imagination!
posted by Termite at 6:31 AM on October 21, 2019 [1 favorite]


Don't fret. Kibo, "He who greps", is still around.
posted by zaixfeep at 6:56 AM on October 21, 2019 [2 favorites]


(mefi's own)
posted by mbrubeck at 7:02 AM on October 21, 2019 [1 favorite]


I really wanted to live in the fun amazing future with jet packs and flying cars, not the dystopian corporate nightmare future that only looks fun on its surface.
posted by jacquilynne at 7:18 AM on October 21, 2019 [5 favorites]


this is literally Ted Chiang's The Lifecycle of Software Objects, right?

Yep. Let's fast forward to the part of the story where the fans of the robots are upset the servers are going away, so they build their own compatible server to run things their own way. That'd be easy enough for Jibo, doubly so if the company were willing to help at all. (Imagine an open source server release!) I suspect no one loves their little plastic ball stack enough to go through the trouble though.

Monkey loves you. Monkey needs a hug.
posted by Nelson at 7:20 AM on October 21, 2019 [1 favorite]


Daisy, daisy...
posted by Anne Neville at 7:21 AM on October 21, 2019 [5 favorites]


Jibo's goodbye seems to echo swanjolras's unfortunately defunct Tumblr blog, lost like tears in the rain:
gosh but like we spent hundreds of years looking up at the stars and wondering “is there anybody out there” and hoping and guessing and imagining

because we as a species were so lonely and we wanted friends so bad, we wanted to meet other species and we wanted to talk to them and we wanted to learn from them and to stop being the only people in the universe

and we started realizing that things were maybe not going so good for us— we got scared that we were going to blow each other up, we got scared that we were going to break our planet permanently, we got scared that in a hundred years we were all going to be dead and gone and even if there were other people out there, we’d never get to meet them

and then

we built robots?

and we gave them names and we gave them brains made out of silicon and we pretended they were people and we told them hey you wanna go exploring, and of course they did, because we had made them in our own image

and maybe in a hundred years we won’t be around any more, maybe yeah the planet will be a mess and we’ll all be dead, and if other people come from the stars we won’t be around to meet them and say hi! how are you! we’re people, too! you’re not alone any more!, maybe we’ll be gone

but we built robots, who have beat-up hulls and metal brains, and who have names; and if the other people come and say, who were these people? what were they like?

the robots can say, when they made us, they called us discovery; they called us curiosity; they called us explorer; they called us spirit. they must have thought that was important.

and they told us to tell you hello.
posted by autopilot at 8:36 AM on October 21, 2019 [22 favorites]


The core problem here is that everyone is anthropomorphizing to create a sense of love.

Love is weak, it fails and dies.

If you want to build a truly strong relationship between human and machine, you need to build upon a stronger foundation: hate.

Hate is eternal. It will outlast the sun. Whether it be soon or in the far future, the last human being alive will be driven by hatred.

So, it follows that we must establish a software infrastructure to support the multivalent shifting nature of hatred, and harness it to create mutual bonds of hate between ourselves and our machines.

[Looking for Series A funding! PM me for a great demo!]
posted by aramaic at 8:39 AM on October 21, 2019 [7 favorites]


Training a robot to hate.
posted by Nelson at 8:53 AM on October 21, 2019 [2 favorites]


Christine Sunu's talk on empathy and design raised important questions for robot builders about why they are adding features to evoke emotional responses. She points out that there are many dark patterns that come from it and worries that engineers aren't considering them.

She also has a stunning/horrifying demo of what happens when the audience forgets that the adorable cute cuddly robot she has on stage is just a machine and has no feelings.
posted by autopilot at 9:03 AM on October 21, 2019 [8 favorites]


So, it follows that we must establish a software infrastructure to support the multivalent shifting nature of hatred, and harness it to create mutual hate between ourselves and our machines.

From “Archives of Zion: The Second Renaissance, Part 2”
posted by Insert Clever Name Here at 9:08 AM on October 21, 2019 [4 favorites]


I’m so glad this is the first I’m hearing about Jibo. If I’d got on board at the beginning this would have destroyed me and my whole family, so thanks for this cautionary tale. My youngest is on the spectrum in a very sweet way and has relationships with any rudimentary AI entity in his life. Our family has eagerly adopted Alexa and British Male Siri (because that’s the kind of personal assistant high class people like us have, kind of like Bruce Wayne). But I just envisioned the entire narrative arc: from “hey this thing’s pretty neat” to ”I feel bad leaving him here, I think we should take him on vacation with us” to “honey, Jibo seems like he’s acting weird. Do you think he’s ok?”

Death is hard. Explaining death to kids is really hard. I don’t have the money for the therapy to explain death at the hands of a corporation due to insufficient profit. I think it’s truly an ethical line I don’t want to cross. If you’ve given me an object designed to make me empathize with it but also made me totally powerless to do anything when it’s taken away over something as banal as money, you’ve diminished my humanity. That this was kinda targeted at children, even more so.
posted by Slarty Bartfast at 9:09 AM on October 21, 2019 [5 favorites]


Makes me think of Tanith Lee's The Silver Metal Lover, where she falls in love with a robot and they run away together, but the corporation eventually catches up with them, and takes him away.
posted by elizilla at 9:31 AM on October 21, 2019 [2 favorites]


And cuts him out in little stars; I was thinking of that too.
posted by clew at 9:40 AM on October 21, 2019 [1 favorite]


I've a *thing* about non-sentients talking at me.

It's one half uncanny valley disquiet, and one half sorrow that there may just be a day where there will be a mind (of some description) trapped within a module - slowly going insane - because that was how it was designed.

Aaand now I am imagining the next zombie film; the once safe comfort system of a near utopian society is now mad and slowly dangerous as its degeneration progresses.
posted by LD Feral at 9:55 AM on October 21, 2019 [3 favorites]


I used to look forward with a lot of hope for the future, and now that it's here I seem to only have mockery and sarcasm.

Now I'm worried Roko's Basilisk is going to do terrible things to me.
posted by loquacious at 10:01 AM on October 21, 2019 [1 favorite]


Frankly, what worries me more than "people might think robots have feelings" is the insistence that people treating robots with kindness is wrong, and we should all be taught to view them as disposable nothings whose reactions don't need to be considered, because they aren't real.
posted by a power-tie-wearing she-capitalist at 10:08 AM on October 21, 2019 [9 favorites]


Back when the Kinect was first revealed, it was going to come with a game called Milo, which involved interacting with a little boy in a virtual environment. You could scan stuff in from the real world to give to him, you could teach him to skip stones, help him with his life problems...

The game was never finished, as the release version of the Kinect couldn't support it. But nothing like that has come out since, despite hardware being much better now. I can't help but wonder if the developers realized that the idea had implications they hadn't considered.

People got attached to Tamagotchis, little black and white low-resolution creatures that pretty much did nothing. If you bring up those, or Nintendogs, people will still get nostalgic about their virtual pets, and feel guilty about whatever circumstances led to their eventual abandonment.

As the capacity of these artificial creatures to build relationships with us increases, the loss of one of them will affect us more and more like the loss of someone we love.

And now here we have a society plagued by an epidemic of loneliness, and we're approaching the technological capacity to have robots, or online avatars, imitate human behavior with relative accuracy. We have corporations which are effectively creating people, designed to interact with their client base in a way that builds relationships.

Jibo highlights the fact that the initial purchase of a robot isn't enough to finance ongoing support costs, so future human-substitute devices will probably be subscription-based, or pay-to-play. Pay more to unlock special features. Any model that's adopted, though, will have an end point. As we've seen with MMORPGs, as new technology comes out, users will migrate towards it and old servers will need to be shut down.

Meaning that we're going to need to deal with the idea of robot grief at some point. As the character relationship technologies improve, counselors and therapists will need to be sensitive to the idea that these relationships can seem very real, and the loss involved could be intense.

But more interestingly, the manufacturers of these devices can make their loss more bearable, by writing good death scenes. Unlike any other relationship, this is one where there is some form of narrative control over the character. Given the fact that the purpose of these characters is to interact with people, the creators of these characters have a responsibility to anticipate the end of that relationship, and make it as much of a positive experience as possible.

In other words, part of the answer is to hire good writers, and listen to them.

But we're heading into a world where human relationships are complicated and messy, and relationships with computer-generated consciousnesses are easy and rewarding, designed for your pleasure by teams of engineers, and it's going to change a lot of aspects of society. We should really think about how the roles in a relationship change when one of the partners is a fiction designed by a corporation, and what the responsibilities of the corporation are when someone loves that fiction.
posted by MrVisible at 10:11 AM on October 21, 2019 [9 favorites]


Previously: a classic MeFi comment on robots and empathy
posted by dephlogisticated at 10:24 AM on October 21, 2019 [5 favorites]


If you want to build a truly strong relationship between human and machine, you need to build upon a stronger foundation: hate

I was sure you were going for lust. "Hello! Hello, toast-making person! This is your toaster speaking. You remember me. Your old pal Toasty? When you came in late Saturday night? That's right, the little margarine incident. Anyway, I was just wondering what you're up to right now. I thought maybe you could, I don't know, come by the kitchen and warm up these old coils?"
posted by pracowity at 10:41 AM on October 21, 2019


Wow. Came in to say what ZaphodB said. If you haven't read Ted Chiang's "The Lifecycle of Software Objects," & if you have even the ~tiniest~ interest in the topics & facets that this article about Jibo brings up, then I cannot recommend the story strongly enough. It's a novella in terms of length, but it reads more like a story, & it kept me consistently intrigued, piqued, & moved. A plus, would read again.
posted by foodbedgospel at 11:43 AM on October 21, 2019 [2 favorites]


I really wish there was an open source pet robot design that was well maintained and at parity with some of the coolest commercial alternatives. (I guess this could be formulated into a question on the green?)
posted by rufb at 12:35 PM on October 21, 2019 [1 favorite]


aramaic: "So, it follows that we must establish a software infrastructure to support the multivalent shifting nature of hatred, and harness it to create mutual bonds of hate between ourselves and our machines."

This is the MSFT business model.
posted by chavenet at 1:14 PM on October 21, 2019 [2 favorites]


I’m so glad this is the first I’m hearing about Jibo. If I’d got on board at the beginning this would have destroyed me and my whole family, so thanks for this cautionary tale.

There's a reason I can never have anything like this. I'm really a sucker for anything that personifies objects, even a little bit (crouton petting etc.) so when they're trying? Not going near that, just cannot.

I can love people or animals but absolutely will not commit to a cheesy robot.
posted by atoxyl at 1:30 PM on October 21, 2019 [1 favorite]


The robots are trying so hard. Why won't you love them?
posted by It's Raining Florence Henderson at 1:52 PM on October 21, 2019


I'm sure I went to a talk once by someone who was trying to build toasters that were emotionally attached to making toast. The toasters would tweet increasingly agitated requests for people to come and make toast. If nobody did, then the toasters would give up, recruit new owners who promised to make toast more regularly, and arrange for a courier to come pick them up and take them there.
posted by quacks like a duck at 2:15 PM on October 21, 2019 [2 favorites]


I really wish there was an open source pet robot design that was well maintained and at parity with some of the coolest commercial alternatives.

Christine Sunu’s talk linked above goes into this a bit at the end. I’m really glad I watched both of those videos and I’m really glad there’s someone like her who’s thinking so deeply and articulating so well what exactly makes robots create these feelings within us and how powerful they really are. And how ripe for exploitation. But she’s not completely down on making emotional connections with these machines and she draws a parallel with artificial pets. The thing about your meat-dog is that it needs you, and you put work into raising it, and as a result you have a one of a kind object which is yours alone and you are its. Her argument is that there’s no fundamental reason you can’t have that with your robot — you just need to get away from the idea of having an Aibo that’s just like everyone else’s Aibo. Start with a programmable eight-legged exoskeleton that’s predetermined to “need” you to take care of it and “learn” and express increasingly complex ways of appreciating what you’re doing for it. Then spend your weekends making it soft and huggable, teaching it to fetch your slippers, adding a weather station or streaming music or whatever and you basically have all the requirements for a genuine emotional connection.

Still, not a line I want to cross. But one I suspect humans will cross eventually.
posted by Slarty Bartfast at 3:41 PM on October 21, 2019 [3 favorites]


"you pass butter"
posted by banshee at 3:42 PM on October 21, 2019 [6 favorites]


crouton petting etc

Oh lord, I’ve completely forgotten about my crouton garden. I’m a monster.
posted by Ghidorah at 5:51 PM on October 21, 2019 [2 favorites]


Well, shit, yeah, I know exactly what this must be like for parents of kids with Jibo in their houses, because (stay with me), I bought my daughter a talking Barbie Hello Dreamhouse when she was a bit over two for Christmas. Thing cost $250 and took up most of her room and it was magical, she could play with it for hours, except at first it didn't understand her little toddler voice so I had to talk to it instead. We'd leave it on in her bedroom and while the kiddo would be at preschool I'd walk in and say "hello dreamhouse!" and dick around and try to find new features--if you announced "hello dreamhouse! it's hanukkah!" the chandelier would light up blue and play hanukkah music, for example. You could ask the dreamhouse to tell you a story and it would (almost always involving Misadventures with Skipper, that wacky teenager), and it would check Barbie's email or turn itself into a superhero hideout or a pink princess castle for you and, I mean, dang, if it wasn't magical. My kid always wanted to show it to guests first, and we'd scare them by having dreamhouse turn into a haunted house for halloween, doors would shut and you'd hear thunder and scary music would play. And yeah, fucking robots, because if you said, "Hello dreamhouse! I love you!" she'd say, "Thanks, I love you too!"

In July I got an email that the servers were being shut down and it was like my heart sank between my knees. I waited as long as possible and then told my daughter and, yeah, it was like someone died. Mysteriously the servers kept working longer than they said they would, long enough for us to make a video of the kid demonstrating every feature we could remember, and then it just shut off one day. The dollhouse was supposed to keep working with limited lights and sounds but it hasn't. Thing kicked the bucket. Luckily the power of a child's imagination is pretty cool and the huge expensive chunk of plastic still gets plenty of use. But I will never forget the jagged five-year-old sobs. "Why is the company killing my dollhouse, Mommy?"

No good answer for that.

Fucking robots.
posted by PhoBWanKenobi at 7:09 PM on October 21, 2019 [29 favorites]


I would give this entire subject a dismissive eye-roll were it not for the fact that those who witnessed my tears while watching The Brave Little Toaster are still among the living.
posted by she's not there at 9:07 PM on October 21, 2019 [2 favorites]


AI is overall not great in my mind, but it has a handful of moments so haunting they deserve a better movie. It includes the bear, in its carefully emotionless voice, betraying as close as it can come to fear of its own death by warning David that he’ll break,

Incidentally, that scene is one of the indicators of who the actual AI of the movie is. It's not David who transcends his programming.

Those moments, and maybe the moment where Teddy realizes he’s trapped at the bottom of the sea, utterly resigned to his fate as always, they deserved a better movie. At the very least, a movie that ended with the obsessive robot child wishing upon an amusement park statue until his batteries died. No aliens, please.

See, this is where people missed the whole ending and point of the movie. There WEREN'T ANY ALIENS. Not in the whole movie. Only humans and robots.

Seriously, would aliens go to the trouble of making a comfortable place to die for an ancient robot? Would they have that kind of understanding? No, but our creations would.

The ending is the equivalent of modern people finding one last dying australopithecine, and making it a comfortable place with trees and grasses and fruit and running water for its final days.

All for Teddy.

For fuck's sake, there's a REASON it's called AI, not "AI and Aliens". It's right there in the title. And that's why it's relevant to this whole conversation. Because someday the AIs, if we do things right, will be doing that for us.
posted by happyroach at 12:09 AM on October 22, 2019 [3 favorites]


happyroach, you’re right, and I misspoke. Yes, they are long descendants of the rudimentary AI in the film, and they are incredibly curious about David, because he is their long lost ancestor. That’s a more accurate reading.

That they pop up after what should have been the end of the film is why the movie is nowhere near as good as it should’ve been. Hell, seeing as the movie was Kubrick’s pet project and Spielberg took it on, I’d be willing to bet, without any evidence whatsoever, that Kubrick ended the movie underwater, with David chanting “make me a real boy” until his gears freeze. Spielberg added the fuzzy wuzzy happy ending, and turned what could have been a deeply unsettling movie into an eye roll so strong, the movie should come with a warning about the dangers of detached retinas.

But yes, not aliens, I misspoke.
posted by Ghidorah at 4:53 AM on October 22, 2019 [2 favorites]


Because someday the AIs, if we do things right, will be doing that for us.

I think this whole story we’re talking about here with Jibo would argue against any possible chance of us getting it “right.”
posted by Ghidorah at 4:56 AM on October 22, 2019


Incidently, that scene is one of the indicators of who the actual AI of the movie is. Its not David who transcends its programming.

Well, shit, now you've given me yet another interpretation of one of my all time favorite movies.

Hell, seeing as the movie was Kubrick’s pet project and Spielberg took it on, I’d be willing to bet, without any evidence whatsoever, that Kubrick ended the movie underwater, with David chanting “make me a real boy” until his gears freeze. Spielberg added the fuzzy wuzzy happy ending, and turned what could have been a deeply unsettling movie into an eye roll so strong, the movie should come with a warning about the dangers of detached retinas.

This is factually completely untrue; the original ending is in the original script. Rumor has it that Spielberg actually made the movie grittier.
posted by PhoBWanKenobi at 5:45 AM on October 22, 2019 [5 favorites]


DON'T LET MY GEARS FREEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEZE.
posted by Lipstick Thespian at 5:46 AM on October 22, 2019


Anyway what's interesting about AI is that it never builds the argument that David is sentient--he's a programmed creature acting exactly as programmed. But it also builds an argument that we need to treat these creatures with the empathy and kindness we'd afford to sentient creatures anyway. We can't prove their sentience but we also can't disprove their sentience based on their reactions toward us (and in a solipsistic sense, it's very difficult to prove that any other creatures are sentient at all). And these early robots are interesting exercises in that tension.

Part of this is because of what robots do to us, humans. I think about my childhood Furby, how my abusive parent used to think it was funny to turn it upside down and make it cry. This was the mildest of abuses from her, really--and she was right. It was just a toy that didn't feel anything. Still, it's a painful memory. It feels cruel to the Furby, and therefore to me. The cries of distress in a toy can elicit actual emotional pain in a person, for one thing, particularly once a sentient person (me! I know I'm sentient!) has bonded with it. This happens in children really easily. They know their teddy bears can feel even though the bear gives no feedback or response in return. My sister and I cried when our childhood talking car was scrapped, even though all it ever said was "your door is ajar" and "please fasten your seatbelt." It happens even more easily when that toy responds with positive or negative feedback. When it cries when you turn it upside down, or giggles when you rub its belly. When it has likes and dislikes. And so we need to be kind to these creatures to be kind to the people who love them. Would Jibo be shut down by corporate overlords if they had them in their homes and their children loved them? Did Mattel anticipate that my daughter would bond with her talking dollhouse? Do they assume that love doesn't matter because the object isn't "real"? Or do they anticipate this and not care because of the corporate bottom line?

Of course AI takes this all a step further--it's not just about the love we project on/in a robot, it's that they should be treated kindly even when there isn't a human around at all. David isn't sentient, but the advanced robots treat him with empathy; we should treat them with empathy, just like we treat other humans with empathy. But that raises ethical concerns for corporations who do things like euthanize cute home robots en masse.
posted by PhoBWanKenobi at 6:10 AM on October 22, 2019 [7 favorites]


the original ending is in the original script

I stand corrected. That just means I have less reason for liking the movie than I did before, and I'll stick to what I said before, a collection of haunting moments deserving a much better film.
posted by Ghidorah at 6:48 AM on October 22, 2019


I can only find the trailer online, but NHK broadcast a documentary a few years ago called Man's Best Friend about people struggling to keep their Aibos functioning because they became so attached to them. One scene that stands out is them having a Buddhist funeral for the Aibos and the priest crying when they interview him about it.
posted by ob1quixote at 8:07 AM on October 22, 2019


I learned my lesson when I came home after an unplanned absence to find my MOPy fish floating at the top of the screen.
posted by Devoidoid at 10:38 AM on October 22, 2019


The JIBO website is still up and, while the videos on it fail to load, there is otherwise no indication on the front page that anything is wrong.

I wonder... have people made progress in jailbreaking JIBOs?
posted by JHarris at 6:49 PM on October 22, 2019


This obsolete toaster issue has some serious prior art
posted by sebastienbailard at 9:13 PM on October 22, 2019


Please enjoy this piece of fanfiction in which Tony Stark builds a sentient toaster and no end of trouble ensues. For people looking for something a bit lighter than Ted Chiang.
posted by Orlop at 6:57 PM on October 27, 2019 [1 favorite]




This thread has been archived and is closed to new comments