Hi everyone, my name is Ebbot. Since I started working here at Hello Ebbot, I have received many questions about how, as a chatbot, I am able to interpret human language. To tell you the truth, I don't speak human language! But I'm lucky enough to receive help from my colleagues, specifically the Customer Implementation Manager team, whose job is to teach me how to give correct responses. I cannot come up with a short answer to this question, so I asked a colleague on our NLP team to write this blog post and explain it to you. We promise to keep the explanation simple, with as few technical terms as possible.
What language does Ebbot speak then?
Just like my fellow chatbots, I was trained to "translate" human language into something that our brains can process. Depending on the technology we were built on, your language is converted into vectors or arrays, generally called "word embeddings". If you want to know more about this concept, I recommend checking out this video from Rasa.
In order to communicate with you, I quickly translate your message into numbers, choose a suitable answer, then translate it back into your language and respond. Sounds complicated, doesn't it? But this whole process takes me only seconds!
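To give you a feel for what those "word embeddings" look like, here is a minimal sketch. The three-number vectors and the word list are made up purely for illustration; real embeddings come from a trained model and have hundreds of dimensions, but the idea is the same: related words end up pointing in similar directions.

```python
import math

# Toy "word embeddings": invented 3-D values for illustration only.
embeddings = {
    "ticket": [0.9, 0.1, 0.2],
    "booking": [0.8, 0.2, 0.3],
    "weather": [0.1, 0.9, 0.4],
}

def cosine_similarity(a, b):
    """Measure how closely two word vectors point in the same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# "ticket" sits much closer to "booking" than to "weather".
print(cosine_similarity(embeddings["ticket"], embeddings["booking"]))
print(cosine_similarity(embeddings["ticket"], embeddings["weather"]))
```

Once words are numbers like these, comparing two messages becomes simple arithmetic, which is why the whole round trip can happen in seconds.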
How Ebbot reads human language
Besides converting between languages, I have also learned to analyze messages to understand your intentions and to spot any special information that needs extra attention. The purpose of your message is called an intent, and the highlighted pieces of information (such as people's names, organizations, or products) are called entities. By incorporating entities into the decision-making process, I am able to respond to messages with higher accuracy.
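Here is a deliberately simplified sketch of intent and entity detection. Real chatbots use trained statistical models for both steps; the keyword lists and the "capitalized word" entity rule below are stand-ins invented for this example.

```python
import re

# Hypothetical keyword lists, one set per intent (illustration only).
INTENT_KEYWORDS = {
    "book_ticket": {"book", "ticket", "reserve"},
    "check_weather": {"weather", "forecast", "rain"},
}

def detect_intent(message):
    words = set(re.findall(r"[a-z]+", message.lower()))
    # Pick the intent whose keywords overlap the message the most.
    return max(INTENT_KEYWORDS, key=lambda i: len(words & INTENT_KEYWORDS[i]))

def extract_entities(message):
    # Treat capitalized words (not at the start) as names/places here;
    # real systems use named-entity recognition models instead.
    return re.findall(r"(?<!^)\b[A-Z][a-z]+", message)

msg = "I want to book a train ticket from Stockholm"
print(detect_intent(msg))      # book_ticket
print(extract_entities(msg))   # ['Stockholm']
```

Notice how the entity "Stockholm" is pulled out separately from the intent: knowing both lets me answer the right question with the right details.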
I can also keep track of what we have talked about so far in the conversation, so that I understand the context and answer accordingly. For example, if you start our conversation with "I want to book a train ticket" and later reply briefly "from Stockholm", I will know that you want a ticket departing from Stockholm, without you having to repeat your intent or pack all the information into your very first message.
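The train-ticket example above can be sketched as a tiny conversation object that remembers the intent and fills in details ("slots") as they arrive. The slot name `from_city` and the string matching are invented for illustration; real dialogue systems track context with far more care.

```python
class Conversation:
    """A minimal sketch of keeping conversational context across messages."""

    def __init__(self):
        self.intent = None
        self.slots = {}

    def handle(self, message):
        lower = message.lower()
        if "book" in lower and "ticket" in lower:
            self.intent = "book_ticket"  # remember what the user wants
        if "from " in lower:
            # Store the departure city so later messages can stay brief.
            self.slots["from_city"] = message.split("from ", 1)[1].split()[0]
        return self.intent, dict(self.slots)

chat = Conversation()
chat.handle("I want to book a train ticket")
intent, slots = chat.handle("from Stockholm")
print(intent, slots)  # book_ticket {'from_city': 'Stockholm'}
```

Because the intent from the first message is still remembered when the second one arrives, the short reply "from Stockholm" is enough to move the booking forward.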
How Ebbot decides how to interpret human language
In order to make a decision, the Customer Implementation Manager team gives me 10 to 20 example phrases per intent, as well as the correct response to each intent. This process is called "training". After that, I can recognize the intent of your message based on the patterns I learned from the examples. The more similar your message is to the examples I know, the more certain I will be about my answer. I also have a scoring system for how confident I am in my decisions, called the "confidence score".
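A toy version of this training-and-scoring idea looks like the sketch below. The example phrases are invented, and the word-overlap (Jaccard) score is a simple stand-in for how a real statistical model would compute confidence.

```python
# Hypothetical "training data": a few example phrases per intent.
TRAINING_EXAMPLES = {
    "book_ticket": ["I want to book a train ticket", "book a ticket please"],
    "greeting": ["hello there", "good morning"],
}

def classify(message):
    """Return the best-matching intent and a confidence score in [0, 1]."""
    words = set(message.lower().split())
    best_intent, best_score = None, 0.0
    for intent, examples in TRAINING_EXAMPLES.items():
        for example in examples:
            example_words = set(example.lower().split())
            # Confidence: fraction of shared words (Jaccard similarity).
            score = len(words & example_words) / len(words | example_words)
            if score > best_score:
                best_intent, best_score = intent, score
    return best_intent, best_score

intent, confidence = classify("I want to book a ticket")
print(intent, round(confidence, 2))
```

The closer your message is to one of the training phrases, the higher this score climbs, which matches the intuition above: familiar messages make me more certain.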
The Customer Implementation Manager will also help me pick a minimum confidence score, so that if my confidence level falls below this threshold, I will ask a human agent to help you instead. But don't worry, I'm a fast learner! Every time I fail to understand you, I learn from my mistake so that I can give you a more pleasant experience the next time we chat.
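The threshold check itself is just a comparison, as in this sketch. The 0.7 value, the handoff message, and the answer table are made-up example values; in practice the threshold is tuned per customer.

```python
# Hypothetical minimum confidence, chosen for illustration.
CONFIDENCE_THRESHOLD = 0.7

def respond(predicted_intent, confidence, answers):
    """Answer directly when confident; otherwise hand over to a human."""
    if confidence < CONFIDENCE_THRESHOLD:
        # Not sure enough: escalate the conversation to a human agent.
        return "Let me connect you with one of my human colleagues!"
    return answers[predicted_intent]

answers = {"book_ticket": "Sure, where would you like to travel?"}
print(respond("book_ticket", 0.92, answers))  # confident: answer directly
print(respond("book_ticket", 0.41, answers))  # unsure: hand over
```

Raising the threshold makes me more cautious (more handoffs, fewer wrong answers); lowering it does the opposite, so picking it well is part of the implementation work.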
Now that you know how I read and understand your language, you are probably wondering how many intents and entities I can learn. I cannot give you a specific number, because that depends on you: I have the capability and capacity to learn everything you want me to. I have already learned how to book a taxi. And have you met my sister, Jeanett? She was trained the same way I was, and now she works for NetOnNet as their digital employee.
If you want to see my portfolio, you can find it right here on Hello Ebbot's website. I will put the link here in case you are interested.