If you take a look at Google Trends to see when the phrase “chatbot” was trending, you will quickly notice that interest started to grow in 2016, reaching its peak in September 2017.
Some people will say it’s a coincidence, but I will say that there are no coincidences! September 2017 was the month when our Support Heroes team decided to implement chatbots into their workflow for the first time.
It was the time when our R&D team created the prototype of our ChatBot product (we called it BotEngine back then). The idea behind the app was to make it easy to create a chatbot, even if you don’t have programming skills.
Soon, it turned out that the greatest challenge was not to set up a bot but to implement it into our daily work. Here’s how we built our customer service chatbots, what we’ve learned, and – yes – what went wrong.
Let me just say that when you decide to use your chatbot app while it’s still a prototype, you should be prepared for problems.
Our first problem was that ChatBot was still a rough app. It didn’t look the way it does now and still had a lot of undiscovered bugs. (But to be honest, we were quite happy about that. If you want your new product to reach the mature stage quickly, you need to test it, fix it, and test it once again!)
The second, more severe problem was that there was no simple ChatBot-LiveChat integration. Our Support Heroes had to rely on a workaround that, at first, didn’t offer all the functionality the application itself had.
It looked like, apart from implementing a chatbot, our support team would also be testing the app.
But don’t get me wrong, we were really excited to work with this app. You never know what you’ll learn, right? We also knew it would be a great opportunity to battle-test our bots.
But as much as we felt like customer service chatbot pioneers, we already knew what problems we wanted the bot to solve.
We wanted to implement a bot that would filter spam and chats from confused visitors (you wouldn’t believe how many people start a chat to find a new girlfriend or to simply troll you!). We wanted to set it up in a way so that only people with real problems and product questions are transferred to our Support Heroes, while the bot takes care of visitors who reached our website by mistake.
Another thing we wanted to achieve was to help our website visitors easily find information about LiveChat. Our bot was supposed to respond to common questions, send the links visitors often asked for, and help them start a trial.
Then, when we had our goals, we were ready to make the first footprints in the fresh snow.
Having the master plan in our heads, we started the chatbot training. None of our live chat agents had previous experience with chatbots, so it was a challenge for support leaders to introduce the tool in a fun, educational way.
To do that, we created a three-step plan for chatbot implementation. Every single Support Hero had to:
- Go through the ChatBot onboarding to get familiar with the application and bot building basics,
- Build their own chatbot based on the customer service decision tree,
- Implement the chatbot on a Facebook fan page and, once done, send a link to the fan page to their leader.
And all our Heroes followed this plan.
You might think that training the whole team takes a lot of time and effort. It might seem much more efficient to train two or three people and then ask them to keep an eye on your chatbots.
However, there were certain reasons (or, let’s call them what they were: benefits) behind this decision. We understood that when every person on your team is familiar with chatbot building:
- it makes them skilled enough to solve chatbot problems instead of passing them to someone else,
- it makes them more engaged in the whole project (and you get a lot of valuable feedback thanks to that),
- it makes them more motivated and willing to broaden their knowledge (because that’s the result of trusting people and giving them creative freedom at work).
Last but not least, let’s not forget that our support team was the first line of ChatBot app testing! They had to be able to spot, identify, and report bugs. For that reason alone, every minute spent on their training was worth it. We even have our own chatbot specialist now (hooray for Janek!).
After everyone on the support team had passed the test, the team started to gather the most common customer questions and responses. It took us almost three weeks, but that exercise gave us a list of 250 frequently asked questions from our customers.
When the list was ready, we uploaded all these questions and responses into a chatbot scenario, hoping it would solve most of our repeatable cases. Refining the existing scenario took us the next two months.
Currently, we have three chatbots:
- A lead generation bot, Richard: explains what LiveChat is, shows pricing and product features, and encourages visitors to start a free trial. Last month, Richard earned us 262 trial subscriptions.
- A spam-filtering Facebook chatbot: before we implemented this bot, 97% of our Facebook messages were spam. Now, the bot qualifies visitors and, when they do want to talk about LiveChat-related topics, transfers them to a real agent.
- Monica Bot, our in-app good spirit: an FAQ bot that responds to all questions regarding our application. Thanks to her, our customers receive instant help without having to leave the product.
Was it worth starting such a time-consuming project?
We’ve achieved awesome results. Here are some numbers for the last 90 days:
- Richard brought us 535 trial accounts,
- All our bots closed 36,886 cases without the need to transfer visitors to human agents,
- Bots took care of 28.6% of our live chat communication,
- All bot conversations together lasted over 5,258 hours (to handle all these chats, we’d need to hire 19 additional support agents for two months!)
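As a quick sanity check on the agent math in the list above, here is a sketch of the calculation. The post doesn’t state the assumed workload, so the 140 chat-hours per agent per month below is my assumption, chosen to show how such a figure could be derived:

```python
# Rough sanity check of the "19 additional agents for two months" claim.
# Assumption (not stated in the post): one agent spends roughly
# 140 hours per month actively chatting.
bot_chat_hours = 5258
chat_hours_per_agent_per_month = 140  # assumed figure
months = 2

agents_needed = bot_chat_hours / (chat_hours_per_agent_per_month * months)
print(round(agents_needed))  # ≈ 19 under this assumption
```

With that assumed workload, 5,258 hours spread over two months works out to roughly 19 agents, which matches the claim.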
We also discovered an interesting correlation.
If you remember our Customer Service Report 2018, you might recall that the longer a chat lasted, the happier the customer was. The reason for that is quite simple: people who get engaged in a chat and see their questions being answered stay longer on the website and give higher ratings.
When we checked the average duration of bot conversations, the same correlation appeared: while the average chat lasted 8.5 minutes, the ones that received good ratings lasted over 11 minutes.
Our overall chat satisfaction is 65%, which could be better. The main reason for this score is that our bots often didn’t understand the questions our customers were typing. Now that we suggest customer responses, a customer can simply click a quick reply and the bot immediately recognizes it. This should improve the score soon.
As we also discovered, bots turned out to be super-helpful when introducing new app sections or a refreshed design. In such cases, Monica Bot handled all the additional inquiries from customers wanting to learn more about the new features within the app.
If you asked me whether implementing a chatbot into our customer service workflow paid off, I’d say: yes. I think the results we achieved over the last year are amazing (especially if we keep in mind that we started as bot rookies).
We made mistakes, but at the same time, we learned a lot about chatbot configuration principles.
Here are some of the most important ones:
- Do: decide on the purpose of your chatbot. You need to know precisely what problem you want your bot to solve.
- Don’t: repeat our mistake of uploading too many questions and answers to your chatbot scenario.
- Do: start with small cases that have only a few possible interactions (e.g., gather data about your website visitors, or make the bot ask lead qualification questions).
- Do: add new scenarios from time to time and gradually improve them (analyze your archived chats and think about what could go better).
- Don’t: use open-ended questions; your bot does its best job with “yes”/“no” responses.
- Do: give your team time to get familiar with chatbots and the bot creation process; it will pay off.
- Do: make good use of rich messages. Add buttons with actions and product carousels to your scenarios, and suggest customer responses (because it’s easier for your customer to click than to type).
- Do: create a nice fallback. A fallback is the response a chatbot sends when it doesn’t understand what a customer is saying. For example, it can say: “Sorry, I didn’t get that. Can you rephrase it, or let me know if you want to be transferred to a human agent?” Monitoring how often it occurs will help you identify scenario gaps.
- Do: measure your bot’s effectiveness. Recently, the ChatBot team released a new feature that allows your bot to tag its conversations and complete goals. For example, if your bot solved a problem, it can tag the chat as “support.” If it was able to close a sale, it can mark the chat with a goal.
As you can see, the whole process wasn’t easy (I mentioned that we were pioneers, right?), but in the end, the whole experiment paid off. Now, we have three customer service chatbots that are far from perfect but already bring us a lot of benefits.
Thanks to help from our well-trained, experienced team of ChatBot Super Heroes, one day we’ll get to chatbot perfection. I’m sure of it.
And if you think your support team would also benefit from adding a chatbot to its workflow, try our ChatBot. You can test the app during a 14-day trial, and you’ll get to learn from tons of educational materials created thanks to our Support Heroes’ questions.
Do you like our posts? You might also like our product.
Give LiveChat a go during a free, 30-day trial.