Project Description

In 2017, I had the opportunity to help DiUS bring the chatbot “nibby” to life for leading Australian health insurer nib health funds. nibby lives on the nib website and is there to direct people to the right area of service before they speak to a consultant.

Defining the problem

The main goal for nibby was to reduce the time contact centre staff spent on routine enquiries by redirecting visitors to the right department. nib had never attempted a chatbot before, and how people would respond to interacting with one for the first time was unknown.

The main challenges

• Uncertainty about what was technically possible
• Redirecting people to the right destination
• An unknown conversational user experience

Design process

User Research

One of the first steps is understanding the experience of people who use online chat. In this case, we did not have the time or resources to speak directly with online chat users, so we approached the contact centre staff to better understand who those users tended to be. The staff were also able to share a sample of the kinds of conversations occurring online and the service lines they tended to fall into.

Service Research

We held a workshop with the contact centre staff to map the services people ask about online. This helps the development, design and content teams understand the kinds of conversations taking place and the contexts in which they occur. It also lets us start collecting more valuable information about users, such as demographics, devices used, existing pain points and immediate needs.

Contextual inquiry

Contextual inquiry is where we study users’ environments and their cognitive and emotional states. This helps the entire team understand the conversations taking place online, what people are doing and where these conversations tend to occur. The result is a shared understanding of conversation context, tone of voice and how to facilitate the best customer experience from the perspective of the contact centre staff.

Excellent results

In the first two weeks, we usability-tested a rudimentary chatbot (with the dubious working name “Nobby”) with a handful of people. We measured how many participants were able to complete their tasks and discovered that only 24% succeeded. Yet perceived ease of use measured 78%.

The surprisingly high ease-of-use score was due to a hand-over to a real human online whenever nibby was not able to help. This suggests that people regard conversations with chatbots and humans as part of the same experience.
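As a purely illustrative sketch, the hand-over behaviour described above can be expressed as intent routing with a confidence threshold: if the bot is not confident enough about what the visitor wants, it escalates to a human consultant rather than guessing. All names, keywords and thresholds below are hypothetical and are not drawn from the actual nibby implementation.

```python
# Illustrative sketch only: a confidence-threshold router with human hand-over.
# Every name here is a hypothetical stand-in, not the real nibby design.

HANDOVER_THRESHOLD = 0.6  # below this confidence, escalate to a human

# Toy intent "classifier": keyword matching standing in for a real NLU model.
INTENT_KEYWORDS = {
    "claims": ["claim", "reimburse", "receipt"],
    "membership": ["join", "policy", "cover", "premium"],
}

def classify(message: str) -> tuple[str, float]:
    """Return (intent, confidence) for a visitor message."""
    words = message.lower().split()
    best_intent, best_score = "unknown", 0.0
    for intent, keywords in INTENT_KEYWORDS.items():
        hits = sum(1 for k in keywords if k in words)
        score = hits / len(keywords)
        if score > best_score:
            best_intent, best_score = intent, score
    return best_intent, best_score

def route(message: str) -> str:
    """Route to a service line, or hand over to a human consultant."""
    intent, confidence = classify(message)
    if confidence < HANDOVER_THRESHOLD:
        return "human_consultant"  # seamless hand-over keeps the experience whole
    return intent

print(route("I want to claim a receipt"))    # claims
print(route("what is the meaning of life"))  # human_consultant
```

The design point is the fallback: because visitors experience bot and human as one conversation, a low-confidence escalation path matters more than squeezing out a slightly better guess.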

In hindsight, it might have been more useful to outline the desired user experience BEFORE deciding on the technology. This can be done with a simple “Wizard of Oz” technique, where a human plays the part of the chatbot behind the scenes before any technology is built.

nibby has saved many hours of the repetitive customer interactions that contact centre staff would rather not handle, routing people to the appropriate channels. With continued development, nibby is expected to redirect over 1,500 customer interactions per year away from contact centres.

Even when a conversation was successful, that alone did not tell us what the user experience was like, so we tested usability on a regular basis to improve the conversation design framework. We learned many lessons about how to design for artificial intelligence right from the beginning.

Mauricio Perez

One year on, nibby has handled more than 21,500 member interactions with a 70 per cent success rate, saving 535 hours of consultant handling time.

Customer Experience Award for nibby, Australia’s first Healthcare chatbot. DiUS took out the first win of the night in the Customer Experience category for the development of a chatbot, dubbed ‘nibby’, for NIB Health Insurance. This award recognises projects that help client organisations better service their own end user customers in a digital, mobile world.

CRN Impact Awards 2018: nibby wins Customer Experience Award
Task completion: 70%
Ease of use: 78%