Comparing chatbots, virtual assistants and conversational agents
Is a conversational agent the same as a chatbot or a virtual assistant? Not exactly. IBM Watson VP and CTO Rob High explains the differences.
We've all seen instances where terms like chatbots, virtual assistants and conversational agents have been used interchangeably, but do those terms really describe the same thing? Not according to Rob High, vice president and CTO at IBM Watson and an IBM Fellow.
In this Q&A, High explains the subtle but distinct differences between those three conversation-based technology terms and the intent behind them. One rule of thumb: The extent to which these technologies engage the user is key to understanding their differences.
What are the differences between terms like chatbot, conversational agent, virtual assistant, etc.?
Rob High: All those terms are used kind of loosely. There are lots of examples in which the terms have been used interchangeably. At IBM, we tend to think of these things somewhat distinctly, and it largely has to do with the degree to which they engage the end user in solving the problem.
A simple example of this is that there are a lot of chatbots out there today that operate on what we call a single-turn exchange. Somebody says something like 'Alexa, turn on the lights' or 'OK, Google, what's the tallest mountain in the world?' Those are independent, single-turn exchanges. The end user expresses an utterance, the utterance is interpreted or recognized for its intent, and then that intent is mapped onto a specific task.
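The single-turn flow High describes -- utterance in, intent recognized, intent mapped to a task -- can be sketched in a few lines. The snippet below is a toy illustration only, not anything IBM ships: the intent names and handlers are hypothetical, and a real system would use a trained intent classifier rather than keyword matching.

```python
# Toy single-turn pipeline: utterance -> intent -> task (hypothetical names).

def classify_intent(utterance: str) -> str:
    """Map an utterance to an intent label; a real system would use a trained classifier."""
    text = utterance.lower()
    if "turn on the lights" in text:
        return "lights_on"
    if "tallest mountain" in text:
        return "factoid_lookup"
    return "unknown"

# Each intent maps directly onto one task -- no follow-up questions, no memory.
INTENT_HANDLERS = {
    "lights_on": lambda: "Turning on the lights.",
    "factoid_lookup": lambda: "Mount Everest is the tallest mountain in the world.",
    "unknown": lambda: "Sorry, I didn't catch that.",
}

def single_turn(utterance: str) -> str:
    return INTENT_HANDLERS[classify_intent(utterance)]()

print(single_turn("OK, Google, what's the tallest mountain in the world?"))
# -> Mount Everest is the tallest mountain in the world.
```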
That's all good. But when somebody asks, 'What's my account balance?' they may need to know what their account balance is, but that's really not their problem. Their problem is that they're getting ready to buy something, or they're trying to figure out how to save up for their kids' education, or they're trying to figure out how to pay their bills -- there's something behind the question.
In my mind, a conversational agent is one that engages the end user in really understanding the nature of the problem behind the question. Part of that includes determining when it's appropriate to dig in deeper, but also recognizing that, often, there is a bigger problem there. The conversational agent must be prepared to go to the next level and solicit end users to better understand the problem. Sometimes [conversational agents] have to help [end users] figure out for themselves what the problem is because, sometimes, we'll just go in with a question and not really know what we're after.
This is especially important when you're dealing with customer support or servicing a product. If you're having a problem with something you bought, the first thing you need to do is describe the problem, but that might just be describing the symptoms and not necessarily the real issue.
It's going to take more than that to figure out what is really going on with the product, what the issue actually is, and whether it's a problem with the product, a problem with the way it's being used or some transient situation. There are lots of different things that could be behind all that. A conversational agent has to be able to get to that.
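One way to picture the distinction High draws is a dialogue loop that answers the surface question and then probes for the goal behind it, carrying state across turns instead of treating each utterance in isolation. The sketch below is purely illustrative -- the intents, follow-up questions and dollar figure are invented for the example -- but it shows the multi-turn pattern he describes.

```python
# Illustrative multi-turn agent: answer the surface question, then probe for
# the goal behind it, carrying state across turns. All intents, follow-ups
# and figures here are invented for the example.

class ConversationalAgent:
    def __init__(self):
        self.state = {"last_intent": None, "user_goal": None}

    def respond(self, utterance: str) -> str:
        text = utterance.lower()
        if "account balance" in text:
            self.state["last_intent"] = "check_balance"
            # Answer, then dig into the problem behind the question.
            return ("Your balance is $1,240. Are you checking because you're "
                    "planning a purchase, paying bills or saving for something?")
        if self.state["last_intent"] == "check_balance" and ("save" in text or "saving" in text):
            self.state["user_goal"] = "savings_plan"
            return ("Got it -- would you like help setting up a recurring "
                    "transfer toward that goal?")
        return "Tell me a bit more about what you're trying to do."

agent = ConversationalAgent()
print(agent.respond("What's my account balance?"))
print(agent.respond("I'm trying to save up for my kids' education."))
```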
You use the term conversational agent, but a lot of people use the term virtual or personal assistant. Which of those terms should we be using, or are they distinct?
High: They're kind of two different sides of the same coin, in some sense. A conversational agent is more focused on what it takes in order to maintain a conversation. The terms virtual agent or personal assistant tend to be more relevant in cases where you're trying to create the sense that the conversational agent you're dealing with has its own personality and is somehow uniquely associated with you.
At least for me, the term virtual assistant sort of metaphorically conjures the idea of your own personal butler -- someone who is there with you all the time, knows you deeply, but is dedicated to just you and serving your needs. When a conversational agent is coupled with that kind of personalized knowledge and acts and behaves in a way that gives you the feeling that it's there only for you, I think there becomes an intersection between the two ideas.
For it to serve you on a personal level, any good personal assistant or virtual assistant needs to retain a great deal of context about you and then use that context in the way it interacts with you -- using the conversational agent technique not just to anticipate your need, but to respond to it and to get to know you better so it can respond to that need even better in the future.
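The personalization High describes comes down to a per-user context store that the assistant consults before responding and updates after each interaction. The sketch below is a hypothetical illustration of that idea; the profile fields and preference handling are assumptions for the example, not any particular vendor's API.

```python
# Hypothetical per-user context store: the assistant consults remembered
# preferences before responding and records each interaction for next time.

from dataclasses import dataclass, field

@dataclass
class UserContext:
    name: str
    preferences: dict = field(default_factory=dict)  # e.g. {"coffee": "black"}
    history: list = field(default_factory=list)      # prior requests

class VirtualAssistant:
    def __init__(self):
        self.contexts = {}  # user_id -> UserContext

    def respond(self, user_id: str, utterance: str) -> str:
        ctx = self.contexts.setdefault(user_id, UserContext(name=user_id))
        ctx.history.append(utterance)
        # Use what it already knows about you to anticipate the need,
        # not just answer the literal request.
        if "coffee" in utterance.lower() and "coffee" in ctx.preferences:
            return f"Ordering your usual {ctx.preferences['coffee']} coffee."
        return "Noted -- I'll remember that for next time."

assistant = VirtualAssistant()
assistant.contexts["ana"] = UserContext(name="Ana", preferences={"coffee": "black"})
print(assistant.respond("ana", "Can you get me a coffee?"))
# -> Ordering your usual black coffee.
```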
So personal assistants are good at natural language processing and can use machine learning to keep getting better. Do you see chatbots and the various kinds of conversational agents evolving side by side or do you see one overtaking the other?
High: I think both are useful for their own purposes and, to some extent, there's a continuum. But there's certainly a demarcation when it comes to the philosophy of what you're trying to do [and] the tools that you need to be able to do it with and the underlying technologies that are necessary to enable it.
I could imagine a world where chatbots are just chatbots: they do what they've done, they do it well, but they don't do much more than that. There may be a use for that, but [I could imagine] other places where there's a lot of utility in going beyond simply the chatbot to help people with their problems. A lot of that is driven by what kind of utility is called for.
We believe at IBM that the real purpose of AI is to augment human intelligence, not to replace human intelligence. When you think about that, you begin to realize that augmenting human cognition requires getting into a deeper level of understanding of a human and being able to recognize what problems they're trying to get to in a conversation space. [AI] must recognize that humans express themselves in sometimes very subtle ways, and that the intention behind that expression is something that requires a certain degree of reasoning.
The systems have to be trained [using machine learning]; you can't just program them to be able to do all these things. They have to learn. Ultimately, they have to interact with us like we're humans. They have to know something about the fact that, as humans, we have emotions, and our emotions can vary throughout the course of a conversation. [Conversational agents] have to know how to interact with somebody in order to amplify their thinking. There's more to it than just what you typically see today as a chatbot.
So I think both will continue to exist, but a demarcation will occur between those simple things that people can do quickly and easily without a whole lot of additional exploration, versus those situations in which there's a lot of economic value in amplifying human cognition.
How can technologies like chatbots and virtual assistants drive business value? Beyond handling conversational tasks, what's their potential in the enterprise?
High: I think chatbots may be an entry point for almost any enterprise. It's hard to operate an enterprise without having some kind of interface to your clients -- even the simplest of interfaces like those that might occur when you're carrying your smartphone around with you. Almost every institution out there is trying to engage their clients at a deeper level.
Part of that is about getting to know your clients better so that you can serve them better and part of it is about trying to create a higher degree of trust and loyalty. Some of it is about trying to deal with the burgeoning growth in call center expenses as more and more of these relationships drive more hand-holding or deep touch.
I think all of that is conspiring to suggest that, going into the digital age, enterprises can only be successful if they're thinking about employing these conversational agents as a way of augmenting their own staff. Even more so, it's about augmenting the intelligence of that staff and their relationship with clients, and augmenting the intelligence of the clients themselves to create a stronger relationship with the institution.