Chatbots: System Design

This is a draft and will continue to be updated. Last updated: Sept 07 2019

When you’re interacting with a chatbot, what’s happening behind the scenes?

[Image: an example conversation in which the user says they are ‘super sad’ and the bot replies with a picture of a baby tiger]

I’ll be writing about how a chatbot works in the context of the architecture of Rasa, a great open source library for building smart contextual assistants.

In the conversation above, when the user replies with ‘super sad’, the chatbot receives the message and interprets it, transforming it into a structured representation using natural language understanding (NLU). The NLU component determines what the user is trying to say by matching the message against a list of intents the bot understands. Here, ‘super sad’ might be matched to the intent mood_unhappy.
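Concretely, the NLU step turns the raw text into a structured result roughly like the sketch below. The field names are illustrative, loosely modeled on the shape of a parsed-message result rather than any exact API.

```python
# Rough sketch of the structured output the NLU step produces
# (field names are illustrative, not an exact API).
parsed_message = {
    "text": "super sad",
    "intent": {"name": "mood_unhappy", "confidence": 0.92},  # best-matching intent
    "entities": [],  # no entities detected in this short message
}
```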

Once the user’s message is classified into an intent, the message and intent are logged in a Tracker that keeps a record of the conversation state. The Policy then receives the current state of the conversation and decides on an action. Since the user’s intent is mood_unhappy, the chatbot picks from a list of potential actions given the state of the conversation; in the example above, it responded with the cheer_up action by sending a picture of a baby tiger.
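As a rough sketch of how the Tracker and Policy fit together (the class and method names here are hypothetical, not Rasa’s actual API):

```python
# Minimal sketch of a tracker and a rule-based policy.
# Class and method names are hypothetical, not Rasa's API.
class Tracker:
    """Keeps an ordered record of everything that happened in the conversation."""
    def __init__(self):
        self.events = []

    def log(self, event_type, value):
        self.events.append((event_type, value))

    def latest_intent(self):
        intents = [v for t, v in self.events if t == "intent"]
        return intents[-1] if intents else None


class Policy:
    """Maps the current conversation state to the next action."""
    def next_action(self, tracker):
        if tracker.latest_intent() == "mood_unhappy":
            return "action_cheer_up"  # e.g. send the baby tiger picture
        return "action_default"


tracker = Tracker()
tracker.log("user_message", "super sad")
tracker.log("intent", "mood_unhappy")

action = Policy().next_action(tracker)
tracker.log("action", action)  # the chosen action is recorded too
```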

Now the action is recorded by the Tracker, and the message is sent out to the user.

That’s a high-level description of the different components. In the sections below, I’ll cover the most important ones in more detail.

Natural Language Understanding (NLU)

The NLU component has to be trained on a dataset that contains information such as intents, entities and synonyms.

There are many different ways of asking a question like ‘How are you feeling?’. All of those variations can be mapped to a single intent that the bot understands, allowing it to continue the conversation.

Much like there are different ways of asking the same question, there are different synonyms for the same word. In the conversation above, sad has synonyms like unhappy, sorrowful, blue and cheerless, and these also have to be accounted for. If a word is important to the conversation, you can mark it as an entity so the bot recognizes it as such.
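To pull these together, the training data might be organized along the lines of the sketch below. The structure is schematic and the field names are my own, not Rasa’s actual training-data format.

```python
# Illustrative NLU training data: intents with example phrasings,
# synonyms normalized to a canonical value, and entities of interest.
# The structure is schematic, not Rasa's actual format.
training_data = {
    "intents": {
        "mood_unhappy": ["super sad", "I'm unhappy", "feeling blue today"],
        "mood_great": ["I'm doing great", "amazing, thanks"],
    },
    "synonyms": {"sad": ["unhappy", "sorrowful", "blue", "cheerless"]},
    "entities": ["mood"],
}
```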

Once you have all the data, the next step is training the model. This is where you can choose different pieces for the NLU pipeline depending on your needs: typically a Tokenizer, a Featurizer, an Entity Extractor, a Synonym Mapper and an Intent Classifier.
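To make that concrete, here is a toy version of such a pipeline using scikit-learn rather than Rasa’s actual components; the training examples and component choices are purely illustrative.

```python
# Toy NLU pipeline: whitespace tokenizer, bag-of-words featurizer,
# and a simple intent classifier. A minimal sketch with scikit-learn,
# not Rasa's pipeline components.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

texts = ["super sad", "feeling blue", "I'm doing great", "amazing, thanks"]
intents = ["mood_unhappy", "mood_unhappy", "mood_great", "mood_great"]

pipeline = Pipeline([
    ("featurizer", CountVectorizer(tokenizer=str.split)),  # tokenize + bag-of-words
    ("classifier", LogisticRegression()),                  # intent classifier
])
pipeline.fit(texts, intents)

print(pipeline.predict(["so sad today"]))  # expected: ['mood_unhappy']
```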