The Jessica Simulation
July 23, 2021 7:35 PM   Subscribe

 
"This was an episode of Black Mirror" is the "Simpsons did it" of the 2020s.
posted by Harvey Kilobit at 8:03 PM on July 23, 2021 [31 favorites]


For reasons that have nothing to do with tech and everything to do with people, this is one of the saddest tech stories since the one about the Japanese seniors whose only friends were dying robot dogs.
posted by betweenthebars at 8:25 PM on July 23, 2021 [6 favorites]


It's one thing to have the sort of personality that seems to court controversy, but boy howdy, Rohrer sure seems to have a way of wandering over to the wrong side of history, doesn't he?
posted by belarius at 9:39 PM on July 23, 2021 [1 favorite]


Yeah, I'm not reading all of that. We already did this in 2018 with When a Chatbot Becomes Your Best Friend. It had the same clickbait premise: "After one young entrepreneur lost a loved one, she created an avatar to help her grieve."

I tried out the chatbot. It was ludicrously bad. When are we going to stop falling for these stories? Even in the article it says, "There were also many moments when the Jessica simulation made little sense at all. He often needed to laugh or ignore her responses to maintain the chat’s momentum: Jessica had taught him, after all, to seek meaning in coincidences, and in garbled arrangements of letters and symbols."

It's the same thing as Eliza from a half-century ago; it's not that the computer is smart and can talk but that the human is so self-absorbed they can't tell their conversation partner is a flat cardboard cutout. You might as well just put a handprint on a volleyball.

These clickbait stories keep trying to pass off some spooooky story of "AI gone too far?!?!" on a credulous public. We need reporters who can do sober technology reporting; this sensationalist garbage is why bitcoin has been able to fleece thousands and cause a global climate crisis.
posted by AlSweigart at 10:53 PM on July 23, 2021 [26 favorites]


This reminds me of a short story that I think appeared in either Analog or Asimov's, back in the '90s? It involved a police detective investigating the suicides of members of a cult. It turned out the cult had some science-fictional way of letting members feel absolute, perfect love (from their savior figure), except not constantly, so there was a withdrawal symptom of absolute despair. There was a parallel sub-plot of the detective's internal monologue as he processed a divorce and the loss of a major loving relationship of his own, and the two were tied together at the end by contrasting the respective losses: humans, and thus human relationships, are always imperfect, and there's always some little annoying detail that one can seize on and amplify to help deal with the loss of an important relationship.
posted by eviemath at 10:55 PM on July 23, 2021 [3 favorites]


I'm kind of waiting for Facebook to enable this option on deceased people's accounts. Combine it with GAN imagery and they'll keep posting forever.
posted by phigmov at 10:56 PM on July 23, 2021 [6 favorites]


I get why this story is met with a measure of cynicism, but I also admit it made me cry. An embarrassing amount. I have lost so many people I would love to speak with, even an imagined, pale shade of them. This past year has been so hard and so lonely for so many. I see where he is coming from and I hope this helps him find peace.
posted by erinfern at 11:05 PM on July 23, 2021 [14 favorites]


I cried as well. I am so sad for his pain. Survivor's guilt is horrible. I hope this experience truly helped him overcome at least some portion of his sorrow.
posted by lock robster at 11:08 PM on July 23, 2021 [4 favorites]


This seems like an awfully complex way of having a conversation with yourself.
posted by pipeski at 3:25 AM on July 24, 2021 [2 favorites]


"This was an episode of Black Mirror" is the "Simpsons did it" of the 2020s.

As the line goes: The Simpsons didn’t predict anything; we just haven’t solved any real problems since the 1980s.

The Black Mirror version of that will be pretty much the same, but with opaque algorithms and always-on cameras.
posted by mhoye at 4:30 AM on July 24, 2021 [8 favorites]


Every time I see one of these stories, I wish I was just reading unfiltered chat logs. Some of that stuff looks pretty lucid, and it makes me wonder how much garbage they had to sift through to find it.
posted by Galaxor Nebulon at 4:47 AM on July 24, 2021 [2 favorites]


I know Josh Barbeau. He is a genuinely lovely person and a talented game designer who has had a really tough and unlucky few years. I sincerely hope that this exposure brings him good things and gives him a step up in his life.
posted by Hogshead at 4:54 AM on July 24, 2021 [17 favorites]



The Black Mirror version of that will be pretty much the same, but with opaque algorithms and always-on cameras


And Domhnall Gleeson (swoon)
posted by thivaia at 5:42 AM on July 24, 2021 [2 favorites]


I can easily believe a magical interaction would take place in those conditions.

It takes two (in this case, three) to tango, and here we meet a highly motivated and highly capable professional improvisationalist:
Jessica had often encouraged [Joshua] to pursue his dream of being an actor, and now he went for it. Quitting his job, he moved to Toronto and enrolled in a drama program at Seneca College of Applied Arts and Technology.... he graduated with honors from his drama program in 2015...In late 2018...Mothers hired him to be the “dungeonmaster” for their kids. He would show up with costumes and act out all the parts: the villain, the shopkeeper, the quest-giver, the tavern wench.
a focussed AI partner:
At first [Rohrer] played literary games with GPT-2.... but after a while, GPT-2 lost its coherence, getting stuck in textual ruts and meandering away from the prompt like a lost dog.

But Rohrer discovered a method to keep the A.I. on a leash: If he limited the bot to short snippets of text — say, in a chat format — and cleaned up some garbage characters, GPT-2 stayed lucid for much longer. His own words seemed to keep the A.I. focused
one based on a human being filled with joie de vivre:
Jessica. Twenty-one, with black hair dyed platinum blonde, she was a bright and beautiful nerd, steeped in the fantasy worlds of Tolkien and filled with strong opinions about comic books (she drew her own), flowers (yellow carnations, never red roses) and music (she loved Queen, Pink and Jack Black, the beefy actor with the soaring power-rock voice).

“She was goofy-funny,” remembered Michaela Pereira, her youngest sister, now a recent college graduate in Ottawa. “She had an infectious laugh, like a cackle? It made you want to join in and hear what she was laughing about.”
I count myself among those of us here who have petted croutons with far less to work with.
posted by otherchaz at 6:02 AM on July 24, 2021 [4 favorites]


I'm kind of waiting for Facebook to enable this option on deceased people's accounts. Combine it with GAN imagery and they'll keep posting forever.

Admittedly easier to do with some than others. In late August 2003, Mars made its closest approach to Earth in nearly 60,000 years. Every August, without fail, my aunt posts memes about how next week Mars will appear bigger than the moon in the night sky.

I tried for two or three years to steer her back but it’s like Canute and the tides there.
posted by ricochet biscuit at 6:37 AM on July 24, 2021 [6 favorites]


> Every time I see one of these stories, I wish I was just reading unfiltered chat logs. Some of that stuff looks pretty lucid, and it makes me wonder how much garbage they had to sift through to find it.

I got a Project December account a while ago to see GPT-3 in action for myself. Here's an unedited transcript of a session I just did to give a sense of how good/bad it is:


Stuart is a graduate student working on his PhD in mathematics. He's fascinated by the Everett Interpretation of quantum mechanics and he's an avid cyclist. He also enjoys cooking shows, but doesn't cook himself. He recently started online dating. This conversation is between Stuart and an online match.

Human types first:

Human: So what do you like to do?

Stuart: Other than cook? Well, I ride my bike a lot. I ride to school, work, to and from parties, everywhere.

Human: Do you have a street bike, a hybrid or what?

Stuart: I'm not sure what the difference is, but I have a single speed bike that I ride on the trails here in Blacksburg.

Human: Cool!

Stuart: So, what do you do for fun?

Human: I'm into riding too

Stuart: So you ride a bike? What kind?

Human: A specialized diverge

Stuart: So you ride a Diverge on the street?

Human: Yes. What kinds of food do you cook?

Stuart: Oh, a bit of everything. I enjoy cooking. I'm not super creative, but I'm not afraid to experiment and throw some new ingredients together and see what happens. It doesn't always turn out good, but every once in a while, I'll create something unique and tasty.

CORRUPTION DETECTED - MATRIX DYING

Human: What else are you interested in?

Stuart: I really like math and philosophy. It'# kin# of what I'm doing m# PhD in. I've been #ear##ng quantum #omp#ting, and I#m f#scina#ed by the m#ny worlds interpre#ati#n of quantum mechanics. I'm not sold on #t, but it's rea#ly fun to t#ink a#ou#.

Human: Tell me a cool fact about it.

Stuart: Hmmm. Well, you can hav# a ##t tha#'s both #live #nd d#ad #t the sa#e t###. Th# quantum st#te of #he ca# ca# be both a#iv# a#d dead at t#e #am# time, and t###e's no #ont#adiction.

Human: Weird. Does this have any real-world implications?

MATRIX DEAD


It's a pretty short interaction, and "Stuart" starts off by revealing that it never registered the prompt's detail that he doesn't cook himself. But it's still way better than the chatbots of a few years ago. It did a few things that helped maintain the illusion of understanding:

- Correctly changed my use of "diverge" to "Diverge".
- Used "many worlds interpretation" as a synonym for "Everett interpretation".
posted by justkevin at 8:40 AM on July 24, 2021 [4 favorites]


I'd like to mention the LifeAfter podcast, which explored this premise as well.
posted by doctornemo at 9:49 AM on July 24, 2021 [2 favorites]


Dang, that really is eerily lucid. And I do not understand the failure mode that allows it to generate fully lucid utterances with a bit of line-noise mixed in.
posted by Galaxor Nebulon at 9:52 AM on July 24, 2021


Healing is a mysterious thing. Writing a letter I knew would never be delivered (because the recipient was dead) unlocked so many feelings and left me lighter afterwards. I imagined them reading it and responding as I was writing. I don't see why this is much different. He knows what's going on but chooses to let himself play along to get in touch with certain feelings. I'd be very interested in how it works for him over the long run.
posted by slidell at 10:12 AM on July 24, 2021 [5 favorites]


Dang, that really is eerily lucid. And I do not understand the failure mode that allows it to generate fully lucid utterances with a bit of line-noise mixed in.

If you mean the “#”s, is that maybe an intentional aesthetic thing with the bots?

From what I’ve seen of GPT-3 it is very impressive at delivering sentences that read like sentences. If you can get a decent imitation of the writing style of the person you’re trying to capture that’s going to be plenty to blow you away under the circumstances. Of course where it falls down is the lack of memory and internal coherence past a certain point.
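
To make that last point concrete with a toy sketch (my own assumption about how a chat front-end like this typically works, not anything from Project December itself): the model only ever sees whatever text fits in its prompt, so a wrapper usually sends a persona blurb plus the most recent turns and silently drops everything older, which is exactly where the long-run memory and coherence go.

# Toy sketch, not Project December's code: a GPT-style model only "remembers"
# what fits in the prompt it is handed, so a chat wrapper keeps a persona
# blurb plus the last few turns and drops the rest.
PERSONA = "Jessica: bright, goofy-funny, loves Queen and yellow carnations.\n"
MAX_TURNS = 10  # assumed cap; older turns simply fall out of the prompt

def build_prompt(history, user_line):
    recent = history[-MAX_TURNS:]              # long-run memory ends here
    lines = [f"{speaker}: {text}" for speaker, text in recent]
    lines.append(f"Human: {user_line}")
    lines.append("Jessica:")                   # the model completes this line
    return PERSONA + "\n".join(lines)

Everything outside that window might as well never have been said, which is why the bot can nail someone's style and still contradict itself twenty lines later.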
posted by atoxyl at 10:28 AM on July 24, 2021


Also I think sometimes it will regurgitate whole blocks of text from somewhere?
posted by atoxyl at 10:30 AM on July 24, 2021


And I do not understand the failure mode that allows it to generate fully lucid utterances with a bit of line-noise mixed in.

This is something Jason Rohrer added in as part of the game's theme. Because the sessions have a real-world cost, you are given a finite number of interactions. When the interactions are running out, it gives a "corruption detected" warning and starts adding noise to create the illusion of an AI "dying." (In the actual game interface, the missing characters look like damaged glyphs; they're shown as # in the transcript.)
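
Purely as an illustration of the mechanic (an assumption on my part, not Rohrer's actual code), the effect amounts to swapping characters for damaged glyphs with a probability that rises as the remaining interaction budget runs down:

import random

# Illustrative sketch only: as the remaining interactions dwindle, each
# letter has a growing chance of being replaced by a damaged glyph ('#').
def corrupt(text, remaining, total):
    decay = 1.0 - (remaining / total)    # 0.0 when fresh, near 1.0 at "death"
    return "".join(
        "#" if ch.isalnum() and random.random() < decay else ch
        for ch in text
    )

# With 2 of 100 interactions left, most letters are already gone:
# corrupt("the quantum state of the cat", remaining=2, total=100)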
posted by justkevin at 10:46 AM on July 24, 2021 [4 favorites]


An interesting take on this territory is the film Marjorie Prime, with Geena Davis, Tim Robbins, Jon Hamm, and others.
posted by rmd1023 at 2:21 PM on July 24, 2021 [1 favorite]


the human is so self-absorbed they can't tell their conversation partner is a flat cardboard cutout.

Greg Egan's version of this idea, about a man paying ransom to ensure the safety of a convincing AI simulation built from his own knowledge of his wife, even while his actual wife was alive and well right next to him, was notable for suggesting that people might care more about the version of their loved ones they have created in their heads than about the actual persons themselves.

I think the story is "A Kidnapping" collected in Axiomatic.
posted by straight at 5:18 PM on July 24, 2021 [7 favorites]


Rohrer sure seems to have a way of wandering over to the wrong side of history, doesn't he?

It's an old joke (and not mine though I don't remember where I saw it) but what should we have expected from a developer whose first notable game has the protagonist drift inexorably to the right?
posted by juv3nal at 2:28 AM on July 25, 2021


This sort of thing reminds me very strongly of what Caprica presented for the genesis of Zoe: datamining everything about a person and feeding it into a modeling engine. Considering where the gap might lie between GPT-3 and avatars, it hints at a useful Turing-like sentience test that I haven't seen attempted with GPT-3 yet:

When the Oracle is provided a story and asked to recommend a course of action, can the Seeker tell whether the Oracle is a human being or a computer?
posted by Callisto Prime at 11:09 AM on July 25, 2021


I think the story is "A Kidnapping" collected in Axiomatic.

Came to the comments to ctrl-f "Greg Egan". Thanks for having my back, straight! I think about this story very often.
posted by modus_pwns at 8:49 PM on July 25, 2021


I watched an online play yesterday, "Recommended For You," in which a family (a brother, a wife, and a dying wife) arranges to have an online video-chat simulation of the dying wife running after her death. The living wife gets progressively bothered by the whole thing, especially after her simulated wife starts spouting ads and she realizes that the tech is eavesdropping on everything in the house.
posted by jenfullmoon at 6:08 AM on July 26, 2021


I'm using Sudowrite these days and it is scary good; it generates about 5 different responses to "continue" existing text, and usually one is nonsensical or garbage, a couple are coherent but not interesting to me, and there's usually at least one that reads like something I'd have actually written.
posted by The otter lady at 4:52 PM on July 26, 2021 [1 favorite]




This thread has been archived and is closed to new comments