Background

The City of Boroondara is a local council in Melbourne’s inner east, home to more than 170,000 residents and 25,000 local businesses.

The challenge

With COVID-19 affecting how we go about our lives, our customer service team was experiencing a significant rise in call volumes - over 45% higher than pre-COVID levels. The team was often answering repetitive questions and directing customers to third-party authorities for answers.

Opportunity

We set off to design a solution that could:

  • help our customers self-serve - find the answers they need, when they want them, on the device they prefer

  • take the strain off the customer service team so they can focus on providing higher-value service.

How we did it

Phase 1

Customer interviews

In our discovery phase, I worked with the Service Design team to interview 10 residents from different demographics and living situations.

The aim of these moderated discussions was to understand the core problems our users were experiencing with our services.

We gathered over 500 data points and grouped them. Six core themes emerged, with waste, planning and pets the top contenders.

Ideation sessions with staff

I collaborated with the Service Design team to plan ideation sessions around the six themes.

In these sessions, we looked at the core user problems/needs and workshopped possible solutions with subject matter experts from different business areas. Eleven concepts emerged, including a ‘Waste goes where?’ lookup, convenient recycling centre drop-off points, proactive issue reporting and more.

Concept playback

The Service Design team and I ran a follow-up workshop with our user group. We assessed the viability of each concept - plotting estimated desirability against feasibility on a graph. The ‘Waste goes where?’ solution scored highest for viability/feasibility and came within the top 3 for desirability.

This provided us with a framework to launch a chatbot that would allow users to self-serve.

[Image: customer feedback when presented with the chatbot idea]

Content audit and modelling

With only a month to deliver a ‘Waste goes where?’ lookup, I needed to design a content solution that didn’t carry an enormous content maintenance overhead.

I initially looked at the City of Boroondara’s A to Z Recycling and Waste Guide and found the user experience to be confusing. The static structure of the guide meant:

  • a user needed to know which term we’d classified an item under

  • we duplicated items to cater for different search terms

  • the digital content team had a massive content maintenance overhead

I built out the data and content model for a ‘Waste goes where?’ lookup to get a better understanding of how to structure the content (sketched in the example below). I found that:

  • an item may have 1 or more actions

  • an action can belong to 1 or more items.
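
As an illustration of that many-to-many relationship (the item and action names below are hypothetical, not the real guide content), the model can be sketched in Python:

    from dataclasses import dataclass, field

    @dataclass
    class Action:
        # A disposal action, for example which bin or drop-off service to use
        name: str
        instructions: str

    @dataclass
    class Item:
        # A thing a resident wants to dispose of; it can map to several actions
        name: str
        actions: list[Action] = field(default_factory=list)

    # Hypothetical entries: one action shared by many items,
    # and one item with more than one action
    recycling_bin = Action("Recycling bin", "Place loose in your recycling bin.")
    drop_off = Action("Recycling centre", "Take to a recycling centre drop-off point.")

    pizza_box = Item("Pizza box", [recycling_bin])
    batteries = Item("Household batteries", [drop_off])
    cardboard = Item("Large cardboard", [recycling_bin, drop_off])

Modelling actions as shared records meant an instruction only had to be maintained in one place, no matter how many items pointed to it.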

Another limitation was that our vendor didn’t support this structured content approach, but we could use an API to pull the structured data from a third-party platform.

I built out the structured content in a Google Sheet and used VLOOKUP to relate actions to items.
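
The join the VLOOKUP performed is equivalent to a simple keyed lookup. Here is a minimal sketch, assuming hypothetical CSV exports and column names rather than the actual spreadsheet:

    import csv

    # Hypothetical CSV exports of the two sheet tabs, joined on an action ID
    # (this mirrors the VLOOKUP):
    #   items.csv:   item,action_id
    #   actions.csv: action_id,instructions

    def load_actions(path="actions.csv"):
        with open(path, newline="") as f:
            return {row["action_id"]: row["instructions"] for row in csv.DictReader(f)}

    def load_items(path="items.csv"):
        items = {}
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                # An item may have 1 or more actions, so collect them in a list
                items.setdefault(row["item"].lower(), []).append(row["action_id"])
        return items

    def waste_goes_where(query, items, actions):
        # The lookup a chatbot answer performs when a resident asks about an item
        return [actions[a] for a in items.get(query.lower(), [])]

    actions = load_actions()
    items = load_items()
    print(waste_goes_where("Pizza box", items, actions))

The chatbot itself pulled the structured data from the third-party platform over the API mentioned above rather than from files, but the item-to-action relationship it relies on is the same.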

Phase 2

We set out to expand the solution, transitioning it from a ‘Waste goes where?’ lookup to a fully fledged chatbot.

Rollout plan

I developed and implemented a thorough rollout plan. This included:

  • a project delivery framework

  • test plan

  • transition plan and platform training

  • microcopy and style guides

  • stakeholder/leadership team presentations and reporting

Quantitative topic research

I wanted to understand the high-volume, low-complexity questions our users were asking.

I looked at:

  • topic and top question data from surrounding councils

  • top searches in Siteimprove and Google Search Console

  • customer feedback reports

  • data from our customer service team

I mapped the findings on a visual board in Miro to help me identify the top 50 questions to add to the chatbot in phase 2 of the project.

I wrote the answers for the top 50 questions and worked closely with subject matter experts to validate them. I also identified and edited existing passages to support the conversational flow of the new chatbot.

Transition to BAU

To ensure content in the chatbot continued to be relevant to our users, I implemented a continuous improvement process.

This included a:

  • workflow for changes or additions from business areas

  • weekly improvement process informed by data

  • style manual capturing the chatbot’s tone, voice and audience

I trained the digital content team on how to run the improvement process and ran an awareness campaign to inform the business of the launch.

Results

Within 6 months of the chatbot going live, we:

  • increased the number of frequently asked questions in the chatbot from 50 to more than 190

  • achieved a deflection rate of 38.21%

  • improved the accuracy rate from 38% (July 2021) to 82% (January 2022)

  • increased the number of questions answered a month from 411 (June 2021) to 2,927 (January 2022)

  • increased the number of users per month from 256 (June 2021) to 2,237 (January 2022)