How to Enhance Your Customer Experience with AI

Artificial intelligence (AI) already underpins many of our online customer interactions, and its use across organisations is growing. McKinsey estimates that around 30% of global organisations have adopted AI at some level; however, many still struggle to deploy this new species of technology successfully.

How can you use AI to enhance your customer experience? These 5 steps will help you get the right results for your organisation.

1. Define your customer strategy

Your customer strategy should help you navigate a rapidly changing environment. A good customer strategy will outline innovative customer revenue opportunities as well as new ways to leverage your internal capability. Your strategic positioning might involve pursuing a high-growth trajectory, out-competing rivals, or defining an entirely new path. A clear customer strategy dictates the next steps for the firm. Assess whether AI might be a useful tool to enable that strategy.

2. Understand your customer journey

Articulate your firm's current-state and target-state customer journeys. Understanding where you are now and where you need your organisation to be in the future will help guide which tools you need. Assess each stage of the journey in detail, then use your journey maps to identify areas that could be improved with AI.
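
One lightweight way to capture a journey map for this purpose is as a simple data structure that pairs each stage with its pain points and candidate AI tools. The sketch below is purely illustrative: the stages, pain points, and candidates are hypothetical placeholders, not recommendations.

```python
from dataclasses import dataclass, field

@dataclass
class JourneyStage:
    """One stage of a hypothetical customer journey map."""
    name: str
    pain_points: list = field(default_factory=list)
    ai_candidates: list = field(default_factory=list)  # tools worth assessing, not commitments

# Hypothetical example map; replace with your own research
journey = [
    JourneyStage("Discover", ["hard to find relevant products"], ["recommendation engine"]),
    JourneyStage("Purchase", ["checkout queries go unanswered"], ["FAQ chatbot with human handover"]),
    JourneyStage("Support", ["slow triage of complaints"], ["sentiment analysis on feedback"]),
]

# Flag stages where an AI candidate has been identified for deeper assessment
for stage in journey:
    if stage.ai_candidates:
        print(f"{stage.name}: assess {', '.join(stage.ai_candidates)}")
```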

3. Understand what AI solutions are available at each journey stage

Research which technologies are most useful at each stage of your customer journey. Then focus on applications that will genuinely improve your customer experience, rather than chasing a fad.

The use of AI technologies such as machine learning, natural-language understanding, and natural-language processing can help analyze customer sentiment and customer feedback at scale, precision and speed not achievable through humans.

Jessica Ekholm, Gartner.
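
As a minimal sketch of what analysing feedback at scale can look like in practice, the snippet below scores a batch of customer comments. It assumes the open-source Hugging Face transformers library and its default sentiment model, which are not mentioned in the article; any comparable sentiment-analysis tool would do.

```python
# Minimal sentiment-analysis sketch using the Hugging Face `transformers` library.
# Assumes: pip install transformers  (the default model is downloaded on first use).
from transformers import pipeline

classifier = pipeline("sentiment-analysis")

feedback = [
    "The new checkout flow is fantastic, so much faster.",
    "I waited 40 minutes and nobody answered my question.",
]

# Each result is a dict like {"label": "POSITIVE", "score": 0.99}
for text, result in zip(feedback, classifier(feedback)):
    print(f"{result['label']:<8} {result['score']:.2f}  {text}")
```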

Current research suggests that customers will only want to interact with an AI if it is smart enough. Have you ever reached the point with a customer service chatbot when it was clear the bot didn't know what you were talking about? Knowing where that frustration point occurs is important. When incorporating AI into the customer journey, identify the point at which a human needs to take over, and make sure the handover is seamless for the customer.
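
A minimal sketch of that handover logic, assuming a hypothetical bot that reports a confidence score for each reply (the names and threshold below are illustrative, not taken from any particular product):

```python
# Hypothetical escalation rule: hand the conversation to a human agent
# whenever the bot's confidence in its answer drops below a threshold.
CONFIDENCE_THRESHOLD = 0.6  # tune against real transcripts, not a guess

def next_step(bot_reply: str, confidence: float) -> str:
    """Decide whether to send the bot's reply or escalate to a person."""
    if confidence < CONFIDENCE_THRESHOLD:
        # Seamless handover: pass the full conversation context to the agent
        return "escalate_to_human"
    return "send_bot_reply"

print(next_step("Your order shipped yesterday.", confidence=0.92))  # send_bot_reply
print(next_step("I'm not sure what you mean.", confidence=0.31))    # escalate_to_human
```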

No matter how much you might want them to, some customers will not want to interact with your new AI. Doing some research around who is (and is not) likely to engage with your new AI-enabled experience is worthwhile.

You may have heard of MABA-HABA. It is (alas) not a song from The Lion King. It is a model that characterises the different strengths and weaknesses of humans and machines: 'Machines Are Better At – Humans Are Better At'. The model acknowledges that machines are better at some things (such as processing large amounts of data and performing highly detailed, repetitive tasks), while humans are better at others (such as applying creativity to a problem, using flexible procedures, and exercising judgment). Keep this in mind as you assess which AI to employ in your customer's journey. Also, highlight areas that should not be replaced by AI. In the healthcare sector, for example, AI has the potential to support patients, health workers, and hospital administrators; however, some patient interactions are critical to well-being outcomes and should not be lightly handed over to a machine.

GNS Healthcare applies machine-learning software to find overlooked relationships among data in patients’ health records and elsewhere. After identifying a relationship, the software churns out numerous hypotheses to explain it and then suggests which of those are the most likely. This approach enabled GNS to uncover a new drug interaction hidden in unstructured patient notes. CEO Colin Hill points out that this is not garden-variety data mining to find associations. “Our machine-learning platform is not just about seeing patterns and correlations in data,” he says. “It’s about actually discovering causal links.”

HBR 2018

4. Consider how your customer will interact with the AI

AI supported by machine learning (ML) and natural language processing (NLP) is a different beast from other species of digital technology. The technology most people are currently used to was predictable: it was determined by a programmer's intentions and coding (if this happens, then that happens). Artificial intelligence is different: we give the machine access to data and allow it to learn for itself. What the AI learns and how it behaves is no longer as perfectly predictable as when we coded for each response. A well-known example of machine-learning-driven AI gone awry was Microsoft's Tay, which was let loose to learn from Twitter interactions and became racist within hours.
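
To make the contrast concrete, here is a deliberately tiny, hypothetical comparison: a hand-coded rule behaves exactly as written, while a model trained with scikit-learn (an assumption of this sketch, not something the article specifies) derives its behaviour from whatever data it is given.

```python
from sklearn.linear_model import LogisticRegression

# 1. Traditional, deterministic rule: the programmer decides the behaviour.
def rule_based_refund(days_since_purchase: int) -> bool:
    return days_since_purchase <= 30  # if this happens, then that happens

# 2. Learned behaviour: the machine infers the boundary from (toy) historical data.
X = [[5], [12], [28], [35], [60], [90]]   # days since purchase
y = [1, 1, 1, 0, 0, 0]                    # 1 = refund granted, 0 = refused
model = LogisticRegression().fit(X, y)

print(rule_based_refund(40))      # False, always, by construction
print(model.predict([[40]])[0])   # depends on the data it learned from
```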

There are different representational types of AI to be aware of. It is useful to consider AI based on how it will appear to your customer (its 'representation').

AI powering an online interaction: This type of AI won't often be front of mind for your customers because it operates in the background to create a better experience. Google search, Facebook news feeds, and Amazon recommendations all rely on AI. HSBC, a banking institution, uses AI-enabled technology that operates in the background to make fast decisions that protect the business against fraud. It also benefits customers who might otherwise be upset by wrongly declined transactions.
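
As a rough illustration of this kind of background AI, the sketch below flags unusual transaction amounts with scikit-learn's IsolationForest on made-up data. It is only a toy: real fraud systems such as HSBC's are far more sophisticated and are not described in the article.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Toy transaction amounts: mostly routine spending plus a few outliers
amounts = np.array([[12.5], [48.0], [9.9], [15.2], [2300.0], [31.7], [7800.0]])

detector = IsolationForest(contamination=0.2, random_state=0).fit(amounts)

# predict() returns 1 for "looks normal" and -1 for "flag for review"
for amount, flag in zip(amounts.ravel(), detector.predict(amounts)):
    status = "flag for review" if flag == -1 else "ok"
    print(f"${amount:>8.2f}  {status}")
```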

AI inside the machine: People are beginning to rely on AI-enabled machines in day-to-day life. This creates more opportunities for trust to develop between humans and the AI-enabled machines they interact with.

For Mercedes-Benz executives, inflexible processes presented a growing challenge. Increasingly, the company’s most profitable customers had been demanding individualized S-class sedans, but the automaker’s assembly systems couldn’t deliver the customization people wanted.

Traditionally, car manufacturing has been a rigid process with automated steps executed by “dumb” robots. To improve flexibility, Mercedes replaced some of those robots with AI-enabled cobots and redesigned its processes around human-machine collaborations. At the company’s plant near Stuttgart, Germany, cobot arms guided by human workers pick up and place heavy parts, becoming an extension of the worker’s body. This system puts the worker in control of the build of each car, doing less manual labor and more of a “piloting” job with the robot.

HBR 2018

Automation has been used to fly commercial planes for a few decades. This year, DARPA's AlphaDogfight Trials pitted AI agents against a human fighter pilot in simulated F-16 dogfights. The AI won 5-0. For those who are interested, it is possible to watch the dogfight trials on the DARPA site. The technology is estimated to reach real planes by 2024; commercial aircraft use is still under development.

Alpha Dogfight Trials: advanced algorithms fly simulated F-16 dogfights against each other and an Air Force pilot.

Interactive AI: Machines that can interact intelligently with humans in a human-like manner are becoming more prevalent in customer-facing environments. Chatbots are a 'non-embodied' type of interactive AI; robots, in contrast, have a tangible physical presence, and people tend to interact with each in different ways.

In some contexts, people will trust a chatbot more than another person. Woebot is a mental health chatbot that has been shown to produce measurable reductions in anxiety and depression. One of the benefits to users is perceived anonymity: some people will share more with a non-human interface that appears to care. One lesson from the research on chatbot interfaces: don't pretend to be a human when you are not. Claiming false 'humanity' can lead to a serious breakdown in customer trust in your organisation. The finance sector, where trust is a fundamental pillar, was an early adopter of customer service AI technology.

SEB, a major Swedish bank, now uses a virtual assistant called Aida to interact with millions of customers. Able to handle natural-language conversations, Aida has access to vast stores of data and can answer many frequently asked questions, such as how to open an account or make cross-border payments. She can also ask callers follow-up questions to solve their problems, and she’s able to analyze a caller’s tone of voice (frustrated versus appreciative, for instance) and use that information to provide better service later. Whenever the system can’t resolve an issue—which happens in about 30% of cases—it turns the caller over to a human customer service representative and then monitors that interaction to learn how to resolve similar problems in the future. With Aida handling basic requests, human reps can concentrate on addressing more-complex issues, especially those from unhappy callers who might require extra hand-holding.

HBR 2018

Robots are increasingly deployed in public spaces such as malls, airports, hotels, and hospitals to act as concierges. Nearly half of the world's airlines and 32% of airports are investigating robotics and automated vehicles over the next three years, according to the 2018 Air Transport IT Insights survey. Done well, this is good news in a sector where delivering good customer experiences has long been fraught with problems.

Pepper, manufactured by SoftBank Robotics, is one of the most widely deployed humanoid customer service robots. Pepper can recognize faces and basic human emotions, and can engage with people through conversation and its touch screen. Despite the novelty, human-like robots such as Pepper can make some people uncomfortable. If you have ever been followed by a Pepper robot, and have invested too much time watching Dr. Who, it is a slightly unnerving experience. Discomfort around humanoid robots is well documented and remains something to consider in designing your experience.

According to the interviewer in the video below: “Pepper is far from perfect - if it didn’t know the answer to a question I asked, it would spit out a bunch of silly answers”.

5. Understand the return

Now that you have assessed the right types of AI to deliver your customer strategy, you need to understand the return on investment for your firm. Questions to ask include:

What are the anticipated productivity or financial gains for your firm? (See the rough sketch after this list.)

Is it better to build or to buy?

TIP: Start your build-or-buy research with providers such as Amazon Web Services, IBM Watson, Google AI, and Microsoft Azure.

Can you run a trial on a small part of your market to test its efficacy before rolling out to a larger audience?

Do you have the internal capability to harness the benefits of your AI enhancements?
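
As a rough, entirely hypothetical sketch of the first question above (the figures are invented placeholders, not benchmarks), even a back-of-the-envelope payback calculation forces the assumptions into the open:

```python
# Back-of-the-envelope payback sketch with invented placeholder figures.
build_cost = 250_000          # one-off cost to build or buy the AI capability
annual_running_cost = 60_000  # licences, cloud, support
annual_benefit = 180_000      # e.g. agent hours saved plus retained revenue

net_annual_gain = annual_benefit - annual_running_cost
payback_years = build_cost / net_annual_gain

print(f"Net annual gain: ${net_annual_gain:,.0f}")
print(f"Payback period:  {payback_years:.1f} years")
```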

Finally, be prepared to overcome some challenges to reap the rewards. Even after investing time, funds, and the best of intentions, AI projects can be tough to get off the ground. According to HBR, one of the most likely places problems will arise is in integrating the new AI into your company's overall technology architecture. Success often lies on the far side of such problems, and given that AI is still in its infancy in many organisations, expect some level of difficulty. This is not to warn you off; it is to prepare you for a potentially rough ride. If the return on your investment is valuable, it is certainly worth being a pioneer in this burgeoning field. Right now, there is space to achieve a significant competitive advantage through AI. This will not be the case forever.

Sarah Daly is undertaking a PhD at the Queensland University of Technology, investigating the role of trust in the adoption and diffusion of AI-based innovation, particularly in the healthcare sector. She is also the Operations Director of CapFeather, a customer strategy and innovation consulting firm.
