Sunday, November 22, 2009

It's the Context, Stupid!

What ultimately makes for a logical, proper response to a question? What edge does a conversation agent have over a search engine? What makes a chat bot / chatter bot smart? It is the recognition of CONTEXT!

Here is an example conversation between a real human and CindyBot:

Human : How's the weather like ?
Cindy : Where do you live ?
Human : Kirkland, WA
Cindy : 42.6F / 5.9C, Scattered Clouds - 8:45 PM PST Nov. 22
Human : tomorrow?
Cindy : Cloudy with a chance of rain in the morning...then rain likely in the afternoon. Highs in the 40s to lower 50s. Southeast wind 10 to 15 mph.

Look at the human's inputs: without the context established by the first question, "tomorrow?" makes no sense. Imagine what Google's results would be for the keyword "tomorrow". Cindy was able to respond to "tomorrow" because it recognized the context: here, a follow-up question about the weather in Kirkland, WA. With this implicit information, Cindy can go retrieve tomorrow's weather for Kirkland and generate the proper response.
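
To make the idea concrete, here is a minimal sketch of how this kind of context tracking might look in code. It is an illustration only, not CindyBot's actual internals; the names (ConversationContext, respond, get_forecast) are made up for this example, and get_forecast is a stub standing in for a real weather lookup.

def get_forecast(location, day_offset=0):
    # Placeholder: a real agent would call a weather service here.
    return "Forecast for %s (+%d day)" % (location, day_offset)

class ConversationContext(object):
    """Implicit state carried from one turn to the next."""
    def __init__(self):
        self.topic = None      # e.g. "weather"
        self.location = None   # e.g. "Kirkland, WA"

def respond(user_input, ctx):
    text = user_input.strip().lower()

    # A bare fragment like "tomorrow?" is meaningful only inside a context.
    if text.rstrip("?").strip() == "tomorrow" and ctx.topic == "weather" and ctx.location:
        return get_forecast(ctx.location, day_offset=1)

    # A new weather question opens the weather topic.
    if "weather" in text:
        ctx.topic = "weather"
        if ctx.location is None:
            return "Where do you live?"
        return get_forecast(ctx.location)

    # An answer to "Where do you live?" while the weather topic is open.
    if ctx.topic == "weather" and ctx.location is None:
        ctx.location = user_input.strip()
        return get_forecast(ctx.location)

    return "Sorry, I did not understand that."

# Replaying the conversation above:
ctx = ConversationContext()
print(respond("How's the weather like ?", ctx))  # -> "Where do you live?"
print(respond("Kirkland, WA", ctx))              # -> today's forecast
print(respond("tomorrow?", ctx))                 # -> tomorrow's forecast

The point is simply that the second and third turns are only answerable because the agent kept the topic and location from the earlier turns.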

Context is not necessarily previous conversation. Context can be many things: the web page a customer is currently browsing, the items in the customer's cart, the current time, the location, or even the browser version the user is running. CML provides strong context support based on this information. Basically, the idea is that the next utterance is likely to be a follow-up to the previous one, or on the same topic, and a web page should be able to send "hints" such as the current page or the items in the shopping cart to the conversation agent to augment its response generation; a sketch of that idea follows below.
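
Here is one possible shape for those "hints", again just a sketch: the field names below are invented for illustration and are not CML's actual format. The page bundles its own state with each user utterance so the agent can fold it into the conversation context.

import json

def build_request(user_input, page_url, cart_items, user_time):
    # Hypothetical payload a web page might send alongside the utterance.
    hints = {
        "current_page": page_url,   # what the customer is looking at
        "cart_items": cart_items,   # what is already in the cart
        "local_time": user_time,    # time on the user's side
    }
    return json.dumps({"utterance": user_input, "hints": hints})

payload = build_request(
    "Do you have this in blue?",
    page_url="https://example.com/products/widget-42",
    cart_items=["widget-17"],
    user_time="2009-11-22T20:45:00-08:00",
)
print(payload)

With hints like these, a question such as "Do you have this in blue?" can be resolved against the product the customer is actually looking at, the same way "tomorrow?" was resolved against the earlier weather question.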
