Another year, another Chat.
September 20, 2002 6:34 AM

Another year, another Chat. This year's Loebner Prize competition will be held next week in Atlanta, GA (at SciTrek and GSU). The yearly contest is a modified "Turing test" (seminal paper here) where people try to guess whether they're chatting with computers or with people.

There are some resources for rolling your own AI bot, but before you begin, think about these two sentences and you'll see what a serious problem natural language is: "We gave the monkeys the bananas because they were hungry" and "We gave the monkeys the bananas because they were ripe" (nod to this guy for the example). You have to know a lot about the world and the things in it to disambiguate the "they" in those sentences.
posted by zpousman (15 comments total)
 
Any zoo visitor knows there's no ambiguity there... monkeys ARE pretty ripe. Giving 'em some lunch is the least we can do to make up for it.
posted by GhostintheMachine at 6:40 AM on September 20, 2002


So strange the way topics on Metafilter always appear when you were thinking about them the night before. Thanks, zpousman, this will come in handy for my new web-service/alife cross :)
posted by laukf at 7:32 AM on September 20, 2002


So.. any AI geeks here?

I don't know crap about AI.. so.. someone tell me if this is a stupid idea:

Create a dictionary "graph" of all words, what they are (subject/noun, verb, adjective) and words that typically describe them...

So you see two nouns (thus potential subjects) in each sentence, right? Monkeys and bananas. And you look for the closest link between the adjective and each noun to determine which one "they" refers to.

So the shortest links for each sentence are:

Monkeys [noun] -> Animals [category] -> Hungry [adjective]
Bananas [noun] -> Fruit [category] -> Ripe [adjective]
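
Something like this, maybe. A toy Python sketch with a made-up mini-graph, just to show the shortest-link idea (the words and links are invented for illustration):

```python
from collections import deque

# Toy "dictionary graph": each word links to its categories and to
# adjectives that typically describe things in that category.
# All of these entries are made up purely to illustrate the idea.
graph = {
    "monkeys": {"animals"},
    "bananas": {"fruit"},
    "animals": {"hungry", "alive"},
    "fruit":   {"ripe", "sweet"},
}

def link_distance(start, target):
    """Breadth-first search: how many hops from a noun to an adjective?"""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        word, dist = queue.popleft()
        if word == target:
            return dist
        for nxt in graph.get(word, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
    return float("inf")  # no link at all

def guess_referent(nouns, adjective):
    """Guess which noun 'they' points at: the one closest to the adjective."""
    return min(nouns, key=lambda n: link_distance(n, adjective))

print(guess_referent(["monkeys", "bananas"], "hungry"))  # -> monkeys
print(guess_referent(["monkeys", "bananas"], "ripe"))    # -> bananas
```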

Obviously it goes deeper than just that.. you need logic to parse all the sentence structure beyond just nouns/verbs/adjectives... but.. is this how these sorts of things work? Or am I totally off-base?
posted by twiggy at 8:01 AM on September 20, 2002


great example, zpousman. I had a prof once who used to work as a programmer for a speech-recognition software company, and he had an equally memorable example of the complexity of speech. Say these two sentences aloud:

It's hard to recognize speech.
It's hard to wreck a nice beach.
posted by LuxFX at 8:11 AM on September 20, 2002


So.. any AI geeks here?

Man, I hated that movie!

But the online advertising campaign included an ALICEbot. The AI chatbot most assuredly fails the Turing Test, but I must shame-facedly admit to having spent quite some time confirming this. Funfun.

More Alicebots here. Somewhere, there's an AynRandBot (trained on phrases from Rand's writings), but I can't find it.
posted by dilettanti at 8:27 AM on September 20, 2002


twiggy - conversational bots today operate at a surprisingly lower level than what you propose. Most are simply pattern matchers, with built-in replies to specific inputs. There hasn't really been a significant advance in conversational technology since Eliza.
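
To give a sense of how shallow that is, the whole trick boils down to something like this toy Python sketch (the patterns and replies here are invented; real bots just have a lot more of them):

```python
import re
import random

# Toy Eliza-style pattern matcher: find a matching pattern, return a canned
# reply, maybe echoing a chunk of the input back at the user.
RULES = [
    (re.compile(r"\bi am (.*)", re.I),   ["Why do you say you are {0}?",
                                          "How long have you been {0}?"]),
    (re.compile(r"\bi feel (.*)", re.I), ["What makes you feel {0}?"]),
    (re.compile(r"\bmy (\w+)", re.I),    ["Tell me more about your {0}."]),
    (re.compile(r".*"),                  ["I see.", "Please go on.",
                                          "Interesting. Why do you say that?"]),
]

def reply(user_input):
    for pattern, canned in RULES:
        match = pattern.search(user_input)
        if match:
            return random.choice(canned).format(*match.groups())

print(reply("I am worried about the Turing test"))
print(reply("my monkeys were hungry"))
print(reply("We gave the monkeys the bananas"))
```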

In conversational bots, there's rarely an attempt to get at the meaning or deep structure of a sentence, because where do you go from there? Language generation can be an even stickier problem than language parsing.
posted by mfbridges at 8:32 AM on September 20, 2002


Ah, linguistics!

Here's another commonly cited example of an ambiguous sentence: Visiting relatives can be boring.

Then there's the classic, "Colorless green ideas sleep furiously."
posted by tbc at 8:50 AM on September 20, 2002


It seems to me that those two sentences are overly ambiguous, even for human brains. I had to read them twice myself. I'm more in favor of emphasizing meaning over efficiency.

"We gave the bananas to the monkeys because the monkeys were hungry."

"We gave the bananas to the monkeys because the bananas were ripe."

Instead of programming computers to understand ambiguous human statements, let's program humans to be less ambiguous.
posted by mikrophon at 9:17 AM on September 20, 2002


I took a Semantics class long ago and was delighted by "We saw her duck", which has three meanings, the least obvious one involving a serrated blade. Hee.
posted by skryche at 9:20 AM on September 20, 2002


mikrophon: rrriiiiiiiiiight. Why should all of the beautiful nuance, wit, ambiguity and even contradiction be removed from the way people speak and write?

The world is ambiguous. Language just reflects that.

You can't make a list of necessary and sufficient conditions for something being a "chair", much less a bachelor. The world is full of people sitting on tables, or on tree stumps, or in hammocks, or on skateboards outside the 7-11. And it's full of guys who get married for green cards but never live with their "wives", guys who have been living with the same woman for 12 years and have two children (by her) and drive minivans, or guys who are priests or monks -- monks might be unmarried men but they're not bachelors. Why?

The reason why the Turing Test is interesting is because humans are ambiguous, sometimes on purpose. Getting computers to be like us is going to take a long time. Unless we take mike's suggestion and teach people to speak in unix commands. Unambiguous languages are boring.
posted by zpousman at 9:36 AM on September 20, 2002


zpousman -- right on. I would add that it seems to me that the world itself is completely unambiguous, though when we attempt to describe it and quantify it, we construct ideas that are at best loose representations of that reality. (not that I'm arguing that we should try to describe the world more fully... our descriptions still would never be complete and more importantly, no one would ever get anything done!)
posted by 4easypayments at 9:46 AM on September 20, 2002


twiggy, you're not entirely off base. Some of the most effective practical uses of AI are in expert systems, which create logic trees somewhat like your language example, and neural nets, which go up another level and draw conclusions based on disambiguation of existing relationships. The kind of shortest-path logic you're talking about is similar to the kind of tree-pruning done by chess computers like Deep Blue. This doesn't work as well for language as you'd think, though. Today, NLP (natural language processing) is a subset of artificial intelligence that is considered much more utilitarian than in the past, and builds on the logic tree basis of most other AI developments rather than being seen as their goal.
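
If it helps, here's roughly what the "logic tree" flavor of an expert system looks like in miniature. A toy Python sketch with invented rules and facts, not any real system:

```python
# Miniature forward-chaining expert system: keep firing rules until no new
# facts appear. The rules and facts are invented, purely for illustration.
RULES = [
    ({"is_fruit", "is_yellow"}, "might_be_banana"),
    ({"might_be_banana", "is_soft"}, "is_ripe"),
    ({"is_animal", "missed_lunch"}, "is_hungry"),
]

def infer(starting_facts):
    facts = set(starting_facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in RULES:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)   # rule fires, add its conclusion
                changed = True
    return facts

print(infer({"is_fruit", "is_yellow", "is_soft"}))
# -> includes 'might_be_banana' and 'is_ripe'
```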

The idea of AI consciousness, of course, is considered a pipe dream by almost anyone seriously working in the field.

A good layman's introduction to the current state of AI can be found via Jorn Barger.
posted by dhartung at 10:01 AM on September 20, 2002


Okay, okay. I was by no means suggesting removing the nuance from language. If anything, I am torn by my desire for clarity in language and my love for poetry. The point I was trying to make here is that those two sentences require context for clear meaning. If you want a computer (soft carbon or hard silicate) to really understand them, it's going to need to know which monkeys and what bananas.
posted by mikrophon at 12:43 PM on September 20, 2002


Another semantic snare for AIs would be "A Porsche is a Volkswagen built by a Porsche, who built a Volkswagen."

By the time one scripted the AI's processes to differentiate between "Ferdinand Porsche," "Porsche assembly plant," and "Porsche automobile," the issue of "autonomous action and the enabler" would come into play.
posted by Smart Dalek at 1:17 PM on September 20, 2002


Another classic ambiguous pair:

Time flies like an arrow
Fruit flies like a banana
posted by quarantine at 4:45 PM on September 20, 2002




This thread has been archived and is closed to new comments