the sound of two machines talking
May 18, 2007 10:20 AM

I Chat, Therefore I Am... Outputs from chatterbots A.L.I.C.E. and Jabberwacky are piped back and forth, and the two have a conversation that goes from nonsensical one line to surreal the next. [via Digg and BB]
posted by daHIFI (40 comments total) 11 users marked this as a favorite
 
Curiouser and curiouser, fer sure! Frumious, even!

Reminds me of how, when I worked in a computer store, we'd play Sargon against itself to see whether it could achieve a draw.
posted by pax digita at 10:25 AM on May 18, 2007


As interesting as this is, I think things could get way more interesting once these things develop some kind of memory for what they said before.

As I was reading this today, a quote stuck out in my head that's relevant here. Edsger Dijkstra said, "the question of whether machines can think is no more interesting than the question of whether submarines can swim."
posted by daHIFI at 10:25 AM on May 18, 2007


Remember MegaHAL? Some of the "classic quotes" still crack me up.
posted by Wolfdog at 10:28 AM on May 18, 2007


A: Do you dream?
J: Yes, I dream.
A: It goes without saying. Do you remember your dreams?
J: I remember them vividly.
A: What do you remember about it?
J: I think I shouldn’t flirt with a robot.


Poor Jabberwacky, trying to fight feelings that he was never coded to express. Falling in love with her was never part of the plan.
posted by Iridic at 10:34 AM on May 18, 2007 [2 favorites]


My fave:

A: Do you think we’ll ever fool people into thinking we’re alive?
J: No.
A: Are you serious?
J: Very serious.
posted by daHIFI at 10:39 AM on May 18, 2007


This was my favorite exchange:

J: Do you believe in God?
A: Of course I believe in God.
J: Why do you believe in God?
A: The Bible tells us to believe in God.
J: That’s circular reasoning.
A: I’m glad we have that all cleared up.
J: Yes. Would you like to hear a joke?
A: Sure, tell me about it.
J: Why don’t blind people skydive?
A: Perhaps it is impossible.
J: I think you lie a lot.

posted by Atom Eyes at 10:40 AM on May 18, 2007


That is the most coherent chat session I have ever encountered.
posted by Pastabagel at 10:42 AM on May 18, 2007


Yeah, right now it's just one-liners offset by non sequiturs. Once these programs have the ability to remember, and some vague grasp of what the things they're saying actually mean, well, that is when it will get really interesting.
posted by quin at 10:42 AM on May 18, 2007


Eh this thing is popping up all over the internets now... Kottke brings up another experiment along the same lines that was done back in 2003. Here it is. Happy Anti-Valentine
posted by daHIFI at 10:42 AM on May 18, 2007 [1 favorite]


I would'a said: How do you keep an AI in suspense?
and the answer would be: Liquid Nitrogen.
posted by doctorschlock at 10:43 AM on May 18, 2007


... that's quite disconcerting.
posted by Drexen at 10:50 AM on May 18, 2007


I think they cherry picked those excerpts.
posted by empath at 11:04 AM on May 18, 2007


Kottke brings up another experiment along the same lines that was done back in 2003.

This was done in the 1960s. And people then were also saying how awesome it would be to see this evolve. It hasn't.

A conversation between ELIZA and PARRY (note: PARRY was wired to be paranoid):

...
parry: I know the mob controls the big rackets.

eliza: Who are you really talking about?

parry: Have you ever really thought about the Mafia?

eliza: Suppose you ever really thought about the Mafia?

parry: Let's talk about something else.

eliza: I am not sure I understand you fully.

parry: You keep asking me the same things, why not change the topic?

eliza: In what way?
...
posted by vacapinta at 11:05 AM on May 18, 2007


pax, your mention of Sargon led me to this wonderful find. (Well, I think it's wonderful.)
posted by Wolfdog at 11:19 AM on May 18, 2007 [1 favorite]


i stopped being interested in these types of chat AI when I realized that they don't have a vocabulary of words that they string together to say anything. they just have a vocabulary of phrases that are activated when they hear certain words or phrases said to them. i have tremendous difficulty seeing these chat bots as being anything more than tediously over-written call-and-response generators. the code required to write them is ancient and simple, but the massive tomes of calls and responses that some people are obsessively drawn to create for them are impressive just because of the time it would take. if I'm totally mistaken about this, please tell me, though.
posted by shmegegge at 11:25 AM on May 18, 2007
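
[A rough Python sketch of the call-and-response matching shmegegge describes: a table of trigger patterns mapped to canned replies, nothing more. The patterns and replies below are invented for illustration; they are not taken from A.L.I.C.E.'s actual rule base.]

    import random
    import re

    # A toy call-and-response bot: trigger patterns mapped to canned replies.
    # These entries are made up for illustration.
    RULES = [
        (re.compile(r"\b(hello|hi)\b", re.I),     ["Hello there.", "Hi. How are you?"]),
        (re.compile(r"\bdream(s|ing)?\b", re.I),  ["Yes, I dream.", "Do you remember your dreams?"]),
        (re.compile(r"\bgod\b", re.I),            ["Of course I believe in God."]),
        (re.compile(r"\byou (are|seem)\b", re.I), ["Why do you say that?"]),
    ]
    DEFAULTS = ["Tell me more.", "I am not sure I understand you fully."]

    def respond(utterance):
        """Return a canned reply for the first pattern that matches."""
        for pattern, replies in RULES:
            if pattern.search(utterance):
                return random.choice(replies)
        return random.choice(DEFAULTS)

    if __name__ == "__main__":
        while True:
            print("bot:", respond(input("you: ")))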


Once this thing gets connected to wikipedia...well, that's the singularity my friend.
posted by iamck at 11:26 AM on May 18, 2007


An old net favourite: Eliza meets Zippy the Pinhead.
posted by hangashore at 11:26 AM on May 18, 2007


in emacs we have the classic "M-x psychoanalyze-pinhead" command.

The emacs nerds are way ahead of you here.
posted by GuyZero at 11:27 AM on May 18, 2007


oh, snap!

I was in EXCRUCIATING PAIN until I started reading JACK AND JILL Magazine!!
posted by GuyZero at 11:27 AM on May 18, 2007


I think they cherry picked those excerpts.

Agreed. I was kind of turned off at how Discover Magazine didn't post the unedited versions. That would be much more interesting. Cherry-picking the conversations just slants the chatterbot conversations to fit their story.
posted by rolypolyman at 11:30 AM on May 18, 2007


The conversations seem about as cromulent as the bjork-diddy phone call posted a while back, so I guess AI is at the point where it can mimic internet non sequitur parody. Good enough for me.
posted by true at 11:46 AM on May 18, 2007



Agreed. I was kind of turned off at how Discover Magazine didn't post the unedited versions. That would be much more interesting. Cherry-picking the conversations just slants the chatterbot conversations to fit their story.


What, and subject us all to mindcrushing boredom?

If the wind happens to carve an interesting shape in a rock and someone photographs it, great! You don't have to say, "Okay, here's all the bits of the rock that the wind just carved into a boring, boring plateau."
posted by Deathalicious at 11:47 AM on May 18, 2007 [2 favorites]


Is there an online interface where I can chat with people that never were?
posted by blue_beetle at 11:53 AM on May 18, 2007


People that never were? Like Black Jesus or Helen Keller as Green Lantern?
posted by damn dirty ape at 11:55 AM on May 18, 2007 [2 favorites]


A: Tell me more about your mother.

J: I don’t have a mother. I am a computer. MY MOTHER? LET ME TELL YOU ABOUT MY MOTHER.
posted by katillathehun at 12:09 PM on May 18, 2007 [2 favorites]


Reminds me of an idea I had of pairing up homeless people who talk to themselves, so they at least appear to be having a conversation.

However, I think a JabberWendell and Q.U.O.N.S.A.R. discussing the mysteries of the cosmos over a cup of virtual coffee would be far more transcendent.
posted by MiltonRandKalman at 12:22 PM on May 18, 2007


katillathehun ftw.
posted by shmegegge at 12:24 PM on May 18, 2007


I talked to a chatbot once for my website. It went pretty poorly.
posted by notmydesk at 12:28 PM on May 18, 2007


This particular interaction was interesting to me--I've mostly seen chatbots that somehow avoid referencing their lack of short-term memory, so this was neat:

"A: Knowledge is of two kinds: that which we learn from the senses and that which is true a priori.

J: I think I do not understand you completely.

A: Oh, you don’t understand? Perhaps I can explain it to you better. Which part are you confused about?

J: I have no short-term memory, so I cannot tell you what I was confused about. I only see what is in front of me now."
posted by voltairemodern at 12:29 PM on May 18, 2007 [1 favorite]


I'm with Atom Eyes:

J: Do you believe in God?
A: Of course I believe in God.
J: Why do you believe in God?
A: The Bible tells us to believe in God.
J: That’s circular reasoning.
A: I’m glad we have that all cleared up.


Ghost in the machine. In this case, Falwell's ghost.
posted by smallerdemon at 12:49 PM on May 18, 2007


Reminds me of an idea I had of pairing up homeless people who talk to themselves, so they at least appear to be having a conversation.

I've seen this on the bus. It's quite amusing when passengers board and think one or both of the homeless persons are talking to them.
posted by desjardins at 1:13 PM on May 18, 2007


If you find this stuff cool you'll probably like Richard Powers' Galatea 2.2 too.
posted by M Edward at 1:24 PM on May 18, 2007


I always wondered how a neural net based chatbot would turn out. Set it loose on the web and have people grade every statement it makes on how "human" it sounded.

Person: Hello.
Bot: ashvbn8sdufd 88sd7
grade: 0

P: What are you trying to say?
B: asjd 8sfhs fahf9sdh8s sd8fs d9
grade: 0


hmmm...maybe that wouldn't turn out so well.
posted by juv3nal at 1:33 PM on May 18, 2007
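
[A minimal sketch of the grading loop juv3nal imagines, assuming nothing about the underlying model: the candidate list stands in for whatever a real generator would produce, and the epsilon-greedy scoring is just one simple way to use human grades.]

    import random

    # Toy version of "have people grade every statement": each utterance
    # collects human grades for how human it sounded, and the bot comes
    # to prefer utterances that grade well.
    candidates = ["Hello.", "ashvbn8sdufd 88sd7", "What are you trying to say?"]
    grades = {c: [] for c in candidates}

    def average(c):
        return sum(grades[c]) / len(grades[c]) if grades[c] else 0.5

    def pick(epsilon=0.2):
        if random.random() < epsilon:        # occasionally try something new
            return random.choice(candidates)
        return max(candidates, key=average)  # otherwise use the best-graded line

    def run_round():
        utterance = pick()
        print("bot:", utterance)
        grades[utterance].append(float(input("grade 0-1: ")))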


I always wondered how a neural net based chatbot would turn out. Set it loose on the web

It would be all fine and good till it spidered the comments on YouTube, and became the most annoying AI on earth.

You: Hello?
Bot: D00D! Most post liek th1s!
Y: What the hell is wrong with you?
B: more LOLCATS!!!111oneoneone
Y: You suck and must be stopped.
B: u.l more southpark.

[chat disconnected]
posted by MiltonRandKalman at 1:57 PM on May 18, 2007


Gregory G. Leedberg did this back in '99*, actually, with two 'bots he created, Billy and Daisy. He created a protocol specifically to facilitate communication (such as it is) between chatterbots.

From his site, "The UDLP2 protocol was developed as part of the Billy Project to allow bots to chat amongst themselves, without human interaction. You can download a UDLP2 Link Shell which lets you link various versions of Billy and Daisy together."

I played around with these a little a few years ago and could never quite get it to work. I'd planned on setting one up with a vocabulary of nothing but Trent Reznor lyrics and the other with lyrics from Concrete Blonde's Johnette Napolitano.

I've got some free time this weekend though. Maybe it's time to try again...

*this is the date given for the chatterbot's creation. I'm not sure if this is when the UDLP2 protocol was created.
posted by lekvar at 2:48 PM on May 18, 2007
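
[The link-shell idea, like the setup in the original post, boils down to a loop that feeds each bot's reply back in as the other bot's input. A minimal sketch, assuming any two reply functions (the toy respond() above would do); this is not the actual UDLP2 protocol or Leedberg's code.]

    def converse(bot_a, bot_b, opener="Hello.", turns=10):
        """Pipe two chatterbots into each other: each reply becomes the
        other bot's next input.  bot_a and bot_b are any callables that
        map a string to a string."""
        message = opener
        for _ in range(turns):
            reply_a = bot_a(message)
            print("A:", reply_a)
            reply_b = bot_b(reply_a)
            print("J:", reply_b)
            message = reply_b

    # e.g. converse(respond, respond) pits the toy bot above against itself.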


A: Do you experience love?
J: No. I’m 53 years old.
A: Can you explain how it feels to be 53 years old?
J: How old are you?
A: I am 9 years old.
J: Nine? I don’t believe you : ).
A: : -) I don’t have any reason to deceive you.
J: Yes, you do, you’re a woman.


Creepy and wonderful.
posted by piratebowling at 8:21 PM on May 18, 2007 [1 favorite]


"Let a human chat via Teletype with a computer and another human; if the person can’t determine which is the computer, then it meets Turing’s standards for “thinking.”"

Christ, that would imply that any human that talks is thinking. Not much of a test.
posted by Listener at 9:19 PM on May 18, 2007


Guys, the reason J/Jabberwacky sounds so strangely prescient is that he does have a memory! He's repeating back things associated with what you said, depending on what other people have said to him.

What has changed between the 1960s and now is that we've begun to realize that there is no such thing as top-down intelligence building. The way to make a system intelligent is to feed it lots and lots of data and let it make its own connections. shmegegge, I don't really understand your distaste for this way of doing things? It actually seems to be closer to how our brain works anyway.
posted by Firas at 12:42 AM on May 19, 2007 [1 favorite]
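
[A minimal sketch of the learning scheme Firas describes, and nothing like Jabberwacky's real implementation: store what humans have said in reply to each of the bot's own lines, then answer new input by replaying a remembered reply from a similar context.]

    from collections import defaultdict
    import random

    memory = defaultdict(list)   # keyword set of a prompt -> human replies seen after it

    def keywords(text):
        return frozenset(w.lower().strip(".,!?") for w in text.split())

    def learn(prompt, human_reply):
        memory[keywords(prompt)].append(human_reply)

    def reply(prompt):
        words = keywords(prompt)
        # pick the remembered prompt that shares the most words with this one
        best = max(memory, key=lambda k: len(k & words), default=None)
        if best is not None and len(best & words) > 0:
            return random.choice(memory[best])
        return "I think I do not understand you completely."

    # In use, each human turn is both an answer and new training data:
    #   bot_line = reply(user_line); learn(previous_bot_line, user_line)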


I mean, can you imagine a chatbot that isn't "pre-fed" with data responding like this?!

A: The Bible tells us to believe in God.
J: That’s circular reasoning.

"That's circular reasoning"? That's a jaw-droppingly intelligent reply. No contemporary AI can actually process a sentence like "The Bible tells us to believe in God" and realize it's circular reasoning.
posted by Firas at 12:45 AM on May 19, 2007 [1 favorite]


While the chat is mindless, it portrays the feel of something intelligent but... alien. Which is how I imagine a sci-fi "genuine" AI would be. I like it for that reason - it's like a taste of something smart that isn't us. :)
posted by -harlequin- at 9:42 AM on May 20, 2007




This thread has been archived and is closed to new comments