Product Design

Part One: Shortcuts to Chatbot Emotional Intelligence


Chatbots have been getting a lot of press recently with Slack, Facebook, and Microsoft embracing the technology. But many of the bots released on these platforms are like the Tin Man from The Wizard of Oz – they’ve got no heart. Here’s how a user-centered design process can lend a heart to your chatbot, and why that’s important.

User-Centered Design, huh?

If you’re not scratching your head here, go ahead and skip this section. Check back in next week, though, because I’ll share some tactical tips on breathing life into your chatbot in part two of this series. For those of you who don’t have a fully developed user-centered design practice, this section’s for you.

While a user-centered product strategy doesn’t sound all that groundbreaking, many chatbot creators seem to have neglected this phase of the creative process. It can be tempting to dive right into mapping out conversational user flows, but each flow is like a feature in your app, and adding features without a strategic vision is a recipe for disjointed user experiences and feature bloat. Egads! You’re building a Frankenbot!

[Image: chatbot emotional intelligence wireframe]

BFFs - UX Strategy and UX Design

I’ll briefly cover our UX Strategy process here at WillowTree, starting with a few ways to identify user pain points in the first phase (which we call the Explore phase), then moving on to considerations to keep in mind when creating proofs of concept in the Focus phase.

In the Explore phase, we start by gathering and organizing information in order to identify possible opportunities. We have many tools at our disposal, but a couple of our favs are User Personas and Journey Maps. (This process is the same for any type of project we undertake, including chatbots.)

[Image: user persona and journey map wireframe]

User Personas represent our target users and help our team identify with those users’ needs and goals by giving them faces and names. These personas usually include detailed demographic information, as well as feelings and frustrations uncovered in user surveys. If we’re working on a chatbot component of an existing app, we can reuse personas we’ve already developed.
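If it helps to make this concrete, a persona can be captured as a small record alongside your project docs. This is a minimal sketch, and every field and value here is illustrative, not part of any WillowTree template:

```python
from dataclasses import dataclass, field

@dataclass
class Persona:
    """A lightweight user persona; field names are illustrative."""
    name: str
    age: int
    occupation: str
    goals: list = field(default_factory=list)
    frustrations: list = field(default_factory=list)

# A hypothetical persona for a health-regimen chatbot.
jess = Persona(
    name="Jess",
    age=34,
    occupation="Nurse",
    goals=["Stick to a workout schedule"],
    frustrations=["Forgets to log meals", "Generic reminders feel spammy"],
)
```

Keeping frustrations as structured data makes it easy to trace each one forward into the journey map and, eventually, into a conversational branch.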

The Journey Map outlines the user’s interaction with the chatbot or the broader customer experience. In this example, we have illustrated the steps of adopting a new health regimen, with the needs and frustrations of each persona during those steps. We use these frustrations to identify points of friction that we can alleviate with our chatbot.
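The same idea works for the journey map itself: each step records what the persona needs and where they struggle, and the friction points fall out directly. A minimal sketch, with made-up steps for the health-regimen example:

```python
# Each journey step records the persona's needs and frustrations at that step.
journey = [
    {"step": "Choose a health regimen", "needs": ["clear options"], "frustrations": []},
    {"step": "Build the habit", "needs": ["daily nudges"], "frustrations": ["reminders are easy to ignore"]},
    {"step": "Track progress", "needs": ["quick logging"], "frustrations": ["logging takes too long"]},
]

# Friction points are simply the steps where frustrations were recorded --
# these become candidate jobs for the chatbot.
friction_points = [s["step"] for s in journey if s["frustrations"]]
```

Here `friction_points` would come out as `["Build the habit", "Track progress"]`, flagging the two steps where the chatbot could plausibly ease the experience.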

We then move on to the Focus phase, during which we crystallize our assumptions into concepts and validate those assumptions with user surveys.

Our concept of the chatbot’s capabilities is starting to take shape, so now is a good time to start mapping out the functionality. The bot’s UI is simply a conversation made up of different paths the user could take. We can visualize it as a conversation tree. Each point of friction from our exploration phase maps to a branch on our conversation tree. If it’s starting to look familiar, that’s because it’s just like a product or site map, with each conversational branch representing a feature set.

[Image: conversation tree wireframe]
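The conversation tree above maps naturally onto a nested data structure: each node is a bot prompt, and each branch is a user reply leading to the next node. This is a minimal sketch with hypothetical prompts and branch names, not a production dialog engine:

```python
# A conversation tree: each node holds a bot prompt plus the branches a
# user reply can take. Top-level branches map to friction points.
tree = {
    "prompt": "Hi! What can I help you with today?",
    "branches": {
        "log a meal": {
            "prompt": "Got it. What did you eat?",
            "branches": {},
        },
        "set a reminder": {
            "prompt": "When should I nudge you?",
            "branches": {},
        },
    },
}

def respond(node, user_reply):
    """Follow the branch matching the user's reply, with a fallback."""
    next_node = node["branches"].get(user_reply)
    return next_node["prompt"] if next_node else "Sorry, I didn't catch that."
```

Just like a site map, each top-level branch here is a feature set; the fallback reply is where the tactical work of part two (personality, helpfulness, cadence) will live.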

Now that we have a strategic vision based on our users’ needs and frustrations, plus the beginnings of our chatbot’s capabilities, we can look at a few key tactical areas that help refine our interactions. Specifically, we’ll address how usefulness, helpfulness, flow and cadence, personality, and intelligence can breathe life into your chatbot’s conversational flows. But I’ll be covering all that in part two of this series, so keep an eye out for that post next week!
