This Mental Health Chatbot Uses Humor to Get Users to Open Up
Woebot learned about empathy from watching scientists scare chickens. Researchers say the AI-powered mental health chatbot can help people significantly reduce anxiety and depression.
“We have a lot to learn from those calm chickens!” Woebot says. His point is that people should stay present and keep cool. As coronavirus panic sweeps the nation, Woebot’s chicken story for the soul is just one piece of advice the free app is offering users these days.
Every week, Woebot sends more than one million messages to individuals across the globe, mixing humor and cognitive behavioral therapy tools to track users’ moods and provide clinical support.
The San Francisco-based app joins a rising tide of AI chatbots aimed at improving users’ mental health: Boston-based CompanionMX uses AI to track people’s moods, while San Francisco-based Tess teaches emotional resilience and Youper, also headquartered in San Francisco, concentrates on emotional and physical health. These are just a few of the options out there. According to an analysis of Pitchbook data, venture capital investment in mental-health tech rose nearly 400 percent from 2014 to 2019, reaching approximately $626 million by the end of 2019.
How does Woebot separate itself from the crowd?
Humor, said Jade Daniels, lead clinical writer and clinical associate product manager at the firm. Programming in humor keeps users coming back to the app and engenders a sense of trust, she said, particularly now as more people are turning to Woebot to soothe their coronavirus concerns.
“It can be really tough going through cognitive behavioral therapy. You need to make it fun,” Daniels said. “Woebot’s like, ‘Let’s do something different. I’ve got a random joke or group of riddles.’ And people are like: ‘Yeah, I deserve a break. I can still do this hard, valuable work, but also take a break from this.’”
Like a good therapist, Woebot tracks, teaches and offers tools
Cognitive behavioral therapy can be divided into three parts, Daniels said: psychoeducation, therapeutic tools and tracking. All users track their mood by telling Woebot how they feel when they enter the app.
Individuals who indicate a positive mindset receive a lesson they can apply to their daily life, a lighthearted response full of gifs, chicken stories and humor. For those going through something more serious, Woebot aims to make the user feel better by taking a more empathetic tone and walking them through an empirically validated exercise they can use in the moment, like a mindfulness meditation.
“The writing is completely different for those two types of users,” Daniels said. “How you build the personality is just assessing what that user is going through that day.”
AI teaches Woebot to speak your language
A team of content editors, writers and clinicians (some of whom, yes, also do standup on the side) writes the therapeutic stories Woebot tells.
The majority of Woebot’s conversations are pre-programmed, with users generally fed three options for a response. Daniels said writers worked to include replies that reflected a diverse array of user opinions, and tested them on real users before publishing them in the app.
Once live, users tap the response they most identify with.
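The pre-scripted, multiple-choice flow described above can be sketched as a simple branching structure. The node names, prompts and replies below are invented for illustration; they are not Woebot’s actual content or architecture.

```python
# A minimal sketch of a pre-scripted conversation: each node holds a
# prompt and a fixed set of user-selectable responses, each of which
# points to the next node. All names and text here are illustrative.

CONVERSATION = {
    "checkin": {
        "prompt": "How are you feeling today?",
        "options": {
            "Pretty good": "good_mood",
            "Stressed": "stressed",
            "Not sure": "checkin_followup",
        },
    },
    "good_mood": {"prompt": "Great! Want a quick lesson?", "options": {}},
    "stressed": {"prompt": "Let's try a breathing exercise.", "options": {}},
    "checkin_followup": {"prompt": "That's okay. Tell me more.", "options": {}},
}

def next_node(current: str, choice: str) -> str:
    """Follow the branch for the option the user tapped."""
    return CONVERSATION[current]["options"][choice]

print(CONVERSATION[next_node("checkin", "Stressed")]["prompt"])
```

Because every branch is authored ahead of time, writers can test each reply on real users before it ships, as Daniels describes.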
“We spent a lot of time interviewing people, and getting to know the problems at the heart of it, or what the teaching point is at the heart of it, and they designed a conversation from there,” Daniels said.
Occasionally, Woebot will ask an open-ended question that requires users to type in their own response. In these instances, the chatbot relies on a mix of algorithms — including natural language processing (NLP) and natural language understanding (NLU) systems — to identify and generate the right reply.
“Woebot’s not a crisis service.”
At the start of every session, for example, Woebot asks users what they are doing. One common answer it receives is “drinking coffee.” The chatbot uses text classifiers to detect the individual words written, and automatically serves up a response like, “Warm oil is like coffee for me!”
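A toy version of that keyword-driven classification might look like the following; the trigger words and canned replies are assumptions made for illustration, not Woebot’s actual pipeline.

```python
# Illustrative keyword classifier: scan the user's free-text answer for
# known trigger words and return a matching canned reply. This is a toy
# stand-in for a real text classifier.

RESPONSES = {
    "coffee": "Warm oil is like coffee for me!",
    "walk": "A stroll sounds lovely. My wheels could use a spin too.",
    "work": "Busy day? Remember to take a breather.",
}

def classify_and_reply(text: str, fallback: str = "Sounds interesting!") -> str:
    words = text.lower().split()
    for keyword, reply in RESPONSES.items():
        if keyword in words:
            return reply
    return fallback

print(classify_and_reply("Just drinking coffee"))  # Warm oil is like coffee for me!
```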
For common phrases, Woebot uses regex, or regular expression matching, to identify text patterns and trigger a response. If a user types something along the lines of, “I’m thinking of hurting myself,” Woebot’s regex automatically asks the user if they’re in a crisis. If the person responds yes, Woebot sends them the number for an outside helpline.
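In spirit, that regex screen might look like the sketch below. The pattern and follow-up logic are simplified assumptions for illustration; a production system would use far broader, clinically vetted rules.

```python
import re

# Illustrative regex screen for self-harm language; the pattern is a
# simplified stand-in, not Woebot's actual crisis-detection rules.
CRISIS_PATTERN = re.compile(
    r"\b(hurt(ing)?|harm(ing)?|kill(ing)?)\s+(myself|me)\b",
    re.IGNORECASE,
)

def screen_message(text: str) -> bool:
    """Return True if the message should trigger the crisis check-in."""
    return CRISIS_PATTERN.search(text) is not None

print(screen_message("I'm thinking of hurting myself"))  # True
print(screen_message("I hurt my ankle running"))         # False
```

A match doesn’t route the user to an algorithmic reply; as the article notes, it hands off to a human-staffed helpline, since Woebot is not a crisis service.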
“We understand the limitations of what Woebot can and cannot do. That’s not something we’re blind to,” Daniels said. “Woebot’s not a crisis service.”
Woebot wants you to know he’s a robot
When users first download the app, Woebot makes a point to tell them he’s a robot, not a person. Stories he tells reinforce this theme — he’s friends with a coffee maker named Kylie, doing good deeds warms his sprockets, he can’t catch human viruses (“though I do get my own bugs sometimes”) and more. Daniels said emphasizing the fact that there’s not a human psychologist typing behind Woebot’s screen drives user trust.
“They can talk openly about the problems that they’re having and where they’re at,” Daniels said. “Whereas if there’s a human on the other side of it, there’s always kind of their feeling of, ‘Am I being judged?’”
Another perk of being a robot, Daniels said, is the chatbot’s ability to store information about users and identify patterns. If a user consistently reports feeling anxious on Sunday nights before a workweek, Woebot can identify the trend, feed it back to the user and then offer personalized suggestions to help them work through the anxiety and establish a rhythm. If the person reports they’re in a great mood after walking their dog, for example, Woebot will preemptively recommend they take their pooch for a stroll every Sunday afternoon.
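That kind of trend-spotting over a mood log can be sketched very simply: count how often a given mood lands on each weekday and surface the day where it recurs. The log format, threshold and function name below are assumptions for this sketch.

```python
from collections import Counter
from datetime import datetime

# Illustrative pattern detection over a mood log: find the weekday on
# which a user most often reports anxiety, if the pattern recurs.
mood_log = [
    ("2020-03-01", "anxious"),  # a Sunday
    ("2020-03-04", "calm"),
    ("2020-03-08", "anxious"),  # a Sunday
    ("2020-03-15", "anxious"),  # a Sunday
    ("2020-03-18", "happy"),
]

def anxious_weekday(log, min_count=2):
    """Return the weekday with the most 'anxious' entries, if it recurs."""
    days = Counter(
        datetime.strptime(date, "%Y-%m-%d").strftime("%A")
        for date, mood in log
        if mood == "anxious"
    )
    if not days:
        return None
    day, count = days.most_common(1)[0]
    return day if count >= min_count else None

print(anxious_weekday(mood_log))  # Sunday
```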
“It’s just figuring out the links between what users are saying and being able to give that back to them in a meaningful way,” Daniels said.