5 reasons NLP for chatbots improves performance
Experts say chatbots need some level of natural language processing capability in order to become truly conversational. Without language capabilities, bots are simple order takers.
Developments in natural language processing are improving chatbot capabilities across the enterprise. This can translate into increased language capabilities, improved accuracy, support for multiple languages and the ability to understand customer intent and sentiment.
"Better NLP algorithms are key for faster time to value for enterprise chatbots and a better experience for the end customers," said Saloni Potdar, technical lead and manager for the Watson Assistant algorithms at IBM. Better or improved NLP for chatbots capabilities go a long way in overcoming many challenges faced by enterprises, such as scarcity of labeled data, addressing drifts in customer needs and 24/7 availability.
Why chatbots need NLP
Early chatbots that relied on decision-tree flows to answer questions could help with basic and anticipated transactions, but they quickly needed to escalate to a human agent if customers made requests with any complexity, said Michelle Collins, director of marketing and product development at Nfinity Avatars, which creates avatar-driven chatbots.
This led to frustration for users, who felt they had wasted their time and had to repeat their inquiry, and it increased costs for the enterprise, which had to pay an agent to handle the same request.
More sophisticated NLP can allow chatbots to use intent and sentiment analysis to infer what the customer wants and gather the appropriate data, delivering more accurate responses. This can translate into higher levels of customer satisfaction and reduced cost.
Radhakrishnan Rajagopalan, senior vice president and global head of customer success, data and intelligence at Mindtree, a technology consulting and digital transformation company, said improvements in NLP are also making it possible to engage with customers in their language of choice. Tools like the Turing Natural Language Generation from Microsoft and the M2M-100 model from Facebook have made it much easier to embed translation into chatbots with less data. For example, the Facebook model was trained on 2,200 language directions and can translate directly between any pair of 100 languages without relying on English data.
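As a minimal sketch of how a team might embed that kind of translation in a chatbot backend, the snippet below loads M2M-100 through the Hugging Face transformers library. The checkpoint name and the example message are illustrative assumptions, not part of any vendor's documented setup.

```python
# Minimal translation sketch using Facebook's M2M-100 via Hugging Face transformers.
# The "facebook/m2m100_418M" checkpoint is an assumption; larger checkpoints work the same way.
from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

model = M2M100ForConditionalGeneration.from_pretrained("facebook/m2m100_418M")
tokenizer = M2M100Tokenizer.from_pretrained("facebook/m2m100_418M")

def translate(text: str, src_lang: str, tgt_lang: str) -> str:
    """Translate text directly between two of M2M-100's supported languages."""
    tokenizer.src_lang = src_lang
    encoded = tokenizer(text, return_tensors="pt")
    generated = model.generate(
        **encoded, forced_bos_token_id=tokenizer.get_lang_id(tgt_lang)
    )
    return tokenizer.batch_decode(generated, skip_special_tokens=True)[0]

# Example: translate a French customer message directly into Spanish,
# without pivoting through English.
print(translate("Où est ma commande ?", src_lang="fr", tgt_lang="es"))
```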
NLP is also making chatbots increasingly natural and conversational. "Thanks to NLP, chatbots have shifted from pre-crafted, button-based and impersonal, to be more conversational and, hence, more dynamic," Rajagopalan said.
Vasilis Vagias, senior AI architect at cnvrg.io, a machine learning operations platform, said four of the key benefits of improved NLP in chatbots are:
- Reduced response time to critical issues faced both internally and externally by customers.
- Enormous cost savings in call centers and social media/brand analytics departments.
- Higher percentages of bot resolution for questions and concerns, reducing the need for human interactions.
- Reduced manual input and dependence on large data sets.
Decreased latency
NLP can dramatically reduce the time it takes to resolve customer issues. Chatbots can integrate seamlessly with an e-commerce, financial or health application to offer quick answers to common questions, or even simple resolutions, said Judith Bishop, senior director of AI specialists at Appen, which provides tools for managing AI training data.
Better NLP enables the chatbot to understand what the customer wants to achieve, identify the keywords relating to the request -- a product name or the name of a location -- and even gauge how the customer is feeling about the service provided.
"Improving the NLP models is arguably the most impactful way to improve customers' engagement with a chatbot service," Bishop said.
For example, improving the chatbot's ability to understand the user's intent reduces the time and frustration a user might spend working out how to phrase a question so the chatbot will understand it. To achieve this, the chatbot must have seen many ways of phrasing the same query in its training data. Then it can recognize what the customer wants, however they choose to express it.
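As a rough illustration of that idea, the sketch below trains a tiny intent classifier on several phrasings of the same request. The intent labels, utterances and scikit-learn pipeline are assumptions for illustration, not a description of any production system.

```python
# Toy intent classifier: many paraphrases of the same request map to one intent label.
# Intent names and example utterances are made up for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

training_data = [
    ("where is my order", "track_order"),
    ("has my package shipped yet", "track_order"),
    ("I want to know when my delivery arrives", "track_order"),
    ("I need to send this back", "return_item"),
    ("how do I return a product", "return_item"),
    ("this item is wrong, I want a refund", "return_item"),
]
texts, labels = zip(*training_data)

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression(max_iter=1000))
clf.fit(texts, labels)

# The customer phrases the request differently from any training example.
print(clf.predict(["any idea when my stuff will get here?"])[0])  # -> track_order (ideally)
```

The more phrasings the training data covers, the less the customer has to adapt to the bot rather than the other way around.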
Reduced cost
Improvements in NLP components can lower the cost that teams need to invest in training and customizing chatbots. For example, some of these models, such as VaderSentiment, can detect sentiment in multiple languages and emojis, Vagias said. This reduces the need for complex training pipelines upfront as teams develop their baseline for bot interaction.
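As a minimal sketch, the vaderSentiment Python package exposes a rule-based analyzer that scores text, including emojis, without any model training; the example messages are invented.

```python
# Rule-based sentiment scoring with VADER: no training data or pipeline required.
from vaderSentiment.vaderSentiment import SentimentIntensityAnalyzer

analyzer = SentimentIntensityAnalyzer()

# polarity_scores returns negative, neutral, positive and compound scores;
# emojis contribute to the score alongside the words.
print(analyzer.polarity_scores("The new checkout flow is great 😍"))
print(analyzer.polarity_scores("My order is late again 😡"))
```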
Organizations often use these comprehensive NLP packages in combination with data sets they already have available to retrain the last layer of the NLP model. This lets bots be fine-tuned more closely to specific customers and businesses, and it saves teams the cost of training and updating models over time.
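One common way to do this kind of last-layer retraining is to freeze a pretrained transformer and train only its classification head. The sketch below shows the idea with Hugging Face transformers and an assumed DistilBERT checkpoint; it is not the specific stack any of the quoted companies use.

```python
# Retrain only the last layer: freeze the pretrained body, train the classification head.
# The checkpoint name and number of intents are assumptions for illustration.
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_name = "distilbert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=5)

# Freeze the pretrained encoder so only the small classification head is updated
# when fine-tuning on the organization's existing labeled data.
for param in model.distilbert.parameters():
    param.requires_grad = False

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(f"Training {trainable:,} of {total:,} parameters")
```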
Decrease in feedback loops
Improvements in NLP models can also allow teams to quickly deploy new chatbot capabilities, test out those abilities and then iteratively improve in response to feedback. Unlike traditional machine learning models, which required a large corpus of data to build a decent starting bot, modern NLP models can be trained incrementally with smaller data sets, Rajagopalan said.
This allows enterprises to spin up chatbots quickly and mature them over time. That, coupled with a lower cost per transaction, has significantly lowered the barrier to entry. As the chatbots grow, their ability to recognize queries close to existing intents acts as a feedback loop for incremental training. This increases accuracy and effectiveness with minimal effort, reducing time to ROI.
Smaller data sets
Large data requirements have traditionally been a problem for developing chatbots, according to IBM's Potdar. Teams can reduce these requirements using tools that help chatbot developers create and label data quickly and efficiently. One example is streamlining the workflow for mining human-to-human chat logs.
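One simple way to bootstrap that kind of mining is to cluster historical chat-log messages so a developer can label whole groups at once instead of one message at a time. The clustering approach and example messages below are illustrative assumptions.

```python
# Cluster human-to-human chat logs so developers can label groups of similar
# messages as intents, instead of labeling every message individually.
# The log messages and number of clusters are made up for illustration.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

chat_logs = [
    "hi, my package never arrived",
    "the delivery is a week late",
    "can I change my shipping address",
    "I need to update where my order ships to",
    "how do I reset my password",
    "I can't log into my account",
]

vectors = TfidfVectorizer().fit_transform(chat_logs)
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(vectors)

# A developer reviews each cluster and assigns it an intent label in one pass.
for cluster_id, message in sorted(zip(clusters, chat_logs)):
    print(cluster_id, message)
```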
Techniques like few-shot learning and transfer learning can also be applied to improve the performance of the underlying NLP model. "It is expensive for companies to continuously employ data-labelers to identify the shift in data distribution, so tools which make this process easier add a lot of value to chatbot developers," she said.
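As a small sketch of the few-shot idea, a chatbot can match new queries against a handful of labeled examples by embedding similarity instead of training a full classifier. The sentence-transformers model name and the example utterances are assumptions.

```python
# Few-shot intent matching: embed a handful of labeled examples and assign new
# queries to the closest one, with no full training run required.
# The model name and example utterances are assumptions for illustration.
from sentence_transformers import SentenceTransformer, util

encoder = SentenceTransformer("all-MiniLM-L6-v2")

few_shot_examples = {
    "cancel_subscription": "I want to cancel my subscription",
    "upgrade_plan": "How do I upgrade to the premium plan?",
    "billing_issue": "I was charged twice this month",
}
labels = list(few_shot_examples)
example_embeddings = encoder.encode(list(few_shot_examples.values()), convert_to_tensor=True)

query = "I'd like to end my subscription"
query_embedding = encoder.encode(query, convert_to_tensor=True)

scores = util.cos_sim(query_embedding, example_embeddings)[0]
print(labels[int(scores.argmax())])  # -> cancel_subscription (ideally)
```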
Cleaning noisy data
Improved NLP can also help chatbots stay resilient against spelling errors or shortfalls in speech recognition accuracy, Potdar said. These types of problems can often be addressed with tools that make the system more robust. For example, teams may weave spellcheck into the inputs. But she cautioned that teams need to be careful not to overcorrect, which could introduce errors if corrections are not validated by the end user.
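Below is a minimal sketch of weaving spellcheck into the input step using the pyspellchecker package; leaving words the checker doesn't recognize untouched is one simple way to limit the overcorrection Potdar warns about. The package choice and helper function are assumptions, not her recommended tooling.

```python
# Spellcheck user input before it reaches the intent classifier.
# Keeping words the checker can't improve is one guardrail against overcorrecting
# legitimate tokens such as product names that look like typos.
from spellchecker import SpellChecker  # pyspellchecker package

spell = SpellChecker()

def clean_input(text: str) -> str:
    corrected = []
    for word in text.split():
        fixed = spell.correction(word)
        corrected.append(fixed if fixed else word)  # keep the original if no suggestion
    return " ".join(corrected)

print(clean_input("I recieved the wrong order"))  # typically -> "I received the wrong order"
```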
It's also important for developers to think through processes for tagging sentences that might be irrelevant or out of domain. It helps to find ways to guide users with relevant responses and appropriate next steps, instead of leaving them stuck in "Sorry, I don't understand you" loops. Potdar recommended handing the query off to a search-based NLP engine when an irrelevant question is detected, so these scenarios are handled more gracefully.
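As a closing sketch of that fallback pattern, the routing logic below sends low-confidence or out-of-domain queries to a search backend instead of replying with a generic failure message. The classify_intent and search_knowledge_base helpers and the threshold value are hypothetical stand-ins for whatever intent model and search engine a team actually uses.

```python
# Route out-of-domain or low-confidence queries to a search backend instead of
# looping on "Sorry, I don't understand you".
CONFIDENCE_THRESHOLD = 0.6  # illustrative value; tune against real traffic

def classify_intent(query: str) -> tuple[str, float]:
    """Hypothetical intent model; a real one would return (intent, confidence)."""
    return "out_of_domain", 0.2  # stub result for illustration

def search_knowledge_base(query: str) -> str:
    """Hypothetical search over help-center or FAQ content."""
    return f"Here are articles related to: {query}"

def respond(query: str) -> str:
    intent, confidence = classify_intent(query)
    if intent == "out_of_domain" or confidence < CONFIDENCE_THRESHOLD:
        # Hand the raw query to search so the user still gets something useful.
        return search_knowledge_base(query)
    return f"Handling intent: {intent}"

print(respond("what's your refund policy for opened items?"))
```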