At Dashbot, we continue to see interest in customer service chatbots grow, with more and more enterprises adopting automated solutions – for both external and internal use cases.

We hosted a meetup featuring experts in the customer service chatbot space to discuss insights and best practices to optimize customer service chatbots.

The panelists quoted throughout this recap include Derek, Paul, and Upkar.

Why build a customer service chatbot?

As Natural Language Processing (“NLP”) and AI continue to improve, we see enterprises moving more of their customer service to chatbots. In our previous customer service survey, we found that 70% of enterprises that do not currently have a chatbot are looking to build one in the next year, with 43% looking to do so within three months.

Why, though, are they building chatbots?

Our panelists cited four key reasons why enterprises are building customer service chatbots: to reduce costs and to provide consistency, 24/7 availability, and a better overall experience.

As Derek pointed out, giving users information quickly and consistently, 24 hours a day, creates a better user experience. Paul added that users want the choice to either self-serve or talk to an agent when needed – companies need to embrace this and provide a good experience.

For some enterprises, like financial institutions that receive a tremendous number of questions a day, an automated solution is essential. Upkar has one banking client that receives 300,000 questions a day. A chatbot enables the enterprise to scale support at a lower cost than staffing live agents.

An interesting outcome of providing a chatbot is the consistency of its responses. One of Upkar’s clients found the chatbot helped resolve a compliance issue: the client had not been able to fully train its human agents to avoid saying the wrong thing, but by using NLP and a rule-based chatbot, they were able to handle the issue. It was not the original reason the client built the chatbot, but a nice added benefit.

What use cases work well for customer service chatbots?

Two of the most common industries that our panel sees building customer service chatbots are insurance and banking.

What our panelists found is that enterprises tend to start with the “low-hanging fruit.” The most common starting point is automating FAQs. While Derek pointed out this might not be the most interesting area for a chatbot, it is still quite helpful.

If you do start with FAQs, it is important to keep the content up to date. There are often internal issues with FAQs – the data is siloed, hard to find, or perhaps does not even exist. If no one owns and regularly updates the content, it will get stale.

The more interesting use cases involve transactional requests – for example, looking up account information. This is where the power of the technology can really shine. Paul sees the majority of request volume concentrated in roughly three transactional Intents, followed by about 20% FAQ-type questions, and then requests to be routed to a human agent.

The rise of internal, enterprise chatbots

While banking and insurance are quite common for external customer service chatbots, our panel found many enterprises are building internal support chatbots for human resources and knowledge management. These internal “agent assist” chatbots help automate employee tasks.

Internal chatbots have the added advantage of a captive audience – users who may need to be more forgiving of the experience. As Derek explains, employees have to work with the systems available to them, so they have to be patient with the chatbot.

IBM has an interesting approach for increasing adoption of its internal chatbot: the human agents use it too. Use of the chatbot is voluntary, and it generally does a good job of providing the information a user is looking for. If a user does ask for more help or to speak to a human, the chatbot lets them know the agent is going to use the chatbot as well – so they may as well try the chatbot first! IBM found this works quite well.

Getting started

Building effective chatbots is hard. It is difficult to know everything users may ask, or how they may ask it.

One of the challenges in getting started is identifying the use cases, or Intents, for the chatbot to handle. At IBM, they look at historical logs, especially call logs, to see the types of things users are interested in. They augment this through workshops with their clients and their clients’ customers to better understand the requirements. At Dashbot, we provide an automated solution for enterprises that have historical logs: our Phrase Clustering algorithm identifies the common Intents and use cases.
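Phrase Clustering is Dashbot’s own algorithm, but the general idea – grouping similar historical utterances so candidate Intents surface for human review – can be illustrated with off-the-shelf tools. The snippet below is a minimal sketch using TF-IDF features and k-means from scikit-learn, with made-up utterances; it is not Dashbot’s implementation.

```python
# Illustrative sketch only: group historical user utterances into candidate Intents.
# This is not Dashbot's Phrase Clustering algorithm – just a generic approach
# using TF-IDF features and k-means from scikit-learn.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Hypothetical utterances pulled from call or chat logs.
utterances = [
    "what is my account balance",
    "how much money is in my checking account",
    "I lost my credit card",
    "report a stolen card",
    "what are your branch hours",
    "when does the downtown branch open",
]

# Turn the text into numeric features.
vectorizer = TfidfVectorizer(stop_words="english")
features = vectorizer.fit_transform(utterances)

# Pick a small k to start; in practice this would be tuned against real data.
kmeans = KMeans(n_clusters=3, random_state=0, n_init=10).fit(features)

# Print each cluster as a candidate Intent for a human to review and name.
for cluster_id in range(kmeans.n_clusters):
    members = [u for u, label in zip(utterances, kmeans.labels_) if label == cluster_id]
    print(f"Candidate Intent {cluster_id}: {members}")
```

Each cluster is only a starting point – a person still reviews the groups, names the Intents, and decides which ones are worth building.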

Even with historical data, it is important to look at actual live user data. Upkar emphasized how important it is to use live customer data when creating Intents and training the NLP model. IBM’s workshops with clients help surface this. Dashbot analytics also provides insights into how users are interacting and where the NLP may need improvement, as well as tools to take action to improve the model and the overall user experience.
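One common way to act on live data – not a specific Dashbot or IBM feature, just a generic pattern – is to flag messages that hit a fallback Intent or scored below an NLP confidence threshold and treat them as candidates for new training phrases. A minimal sketch, assuming each log record carries the message text, the matched Intent, and a confidence score (the schema and threshold here are assumptions):

```python
# Minimal sketch: pull low-confidence and fallback messages out of live logs
# as candidates for retraining. Field names and the threshold are assumed,
# not a specific platform's format.
from collections import Counter

messages = [
    {"text": "freeze my card please", "intent": "Fallback", "confidence": 0.0},
    {"text": "whats my balance", "intent": "CheckBalance", "confidence": 0.92},
    {"text": "balance???", "intent": "CheckBalance", "confidence": 0.41},
    {"text": "talk to someone", "intent": "AgentHandoff", "confidence": 0.88},
]

CONFIDENCE_THRESHOLD = 0.6  # assumed cutoff; tune against real data

# Messages the model missed or was unsure about are retraining candidates.
retraining_candidates = [
    m for m in messages
    if m["intent"] == "Fallback" or m["confidence"] < CONFIDENCE_THRESHOLD
]

# Count which Intents are most often matched with low confidence,
# to prioritize where new training phrases are needed.
weak_intents = Counter(m["intent"] for m in retraining_candidates)
print(weak_intents.most_common())
print([m["text"] for m in retraining_candidates])
```

The low-confidence buckets show which Intents need more training phrases, and the fallback messages often reveal entirely new use cases worth adding.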

Gathering the content for the chatbot can also be a challenge. The content may exist, but it may be spread across a variety of siloed systems within an organization, so part of the process may be identifying and locating it. A related issue in some cases, particularly transactional use cases, as Derek points out, is that the chatbot may need to run on premises to have access to the underlying content.

As Paul recommends, it is helpful to start small and focused rather than try to “boil the ocean” – focus on the top three or four use cases instead of building hundreds of Intents at the start.

Measuring success

Tools like Dashbot can help measure user behavior and KPIs to build a better user experience and monitor the success of the chatbot. It is important to use analytics to see what is really happening so you can make improvements.

While our panelists consider numeric success metrics like containment and Net Promoter Score (“NPS”), they brought up qualitative aspects as well. As Derek pointed out, increasing people’s ability to automate the aspects of their job they would rather not do is a measure of success. Paul added that as companies move more toward an interaction-based approach, they do not necessarily need to ask users how they feel – they can infer it from the conversations themselves.
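Containment is typically defined as the share of sessions the chatbot resolves without escalating to a human agent, though the exact definition varies by team. A minimal sketch of the calculation, assuming a simple session log with an escalation flag (the field names are assumptions, not a specific platform’s schema):

```python
# Minimal sketch of a containment-rate calculation, assuming a session log
# where each session notes whether it escalated to a human agent.
sessions = [
    {"session_id": "a1", "escalated_to_agent": False},
    {"session_id": "a2", "escalated_to_agent": True},
    {"session_id": "a3", "escalated_to_agent": False},
    {"session_id": "a4", "escalated_to_agent": False},
]

contained = sum(1 for s in sessions if not s["escalated_to_agent"])
containment_rate = contained / len(sessions)

print(f"Containment: {containment_rate:.0%}")  # 75% in this toy example
```

Tracking this over time shows whether changes to the Intents and training data are actually keeping more conversations in the chatbot.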

The future of customer service chatbots

The technology behind customer service chatbots continues to grow and improve, leading to a bright future in this space. As Paul states, “the train has left the station” – it is a great time to be in this industry.

Derek predicts more enterprises will take advantage of the technology. There is a great deal of value today, but enterprises may not yet fully understand how to build or resource a chatbot. On a related note, Paul sees one area of improvement in breaking down silos within organizations – especially between sales and customer service – to transform the contact center from a cost center into a profit center.

Given the sheer number of chatbots, Upkar also sees a future in which chatbots communicate with each other to share functionality or use cases.

At Dashbot, we are excited about the conversational interface space and look forward to seeing the advancements in customer service chatbots.

Watch the full video

About Dashbot

Dashbot is an analytics platform for conversational interfaces that enables enterprises to increase satisfaction, engagement, and conversions through actionable insights and tools.

In addition to traditional analytics like engagement and retention, we provide chatbot-specific metrics including NLP response effectiveness, sentiment analysis, conversational analytics, and full chat session transcripts.

We also have tools to take action on the data, like our live-person takeover of chat sessions and push notifications for re-engagement.

We support DialogFlow, Alexa, Google Assistant, Facebook Messenger, Slack, Twitter, Kik, SMS, web chat, and any other conversational interface.

Contact us for a demo