Worswick believes that Mitsuku’s popularity is partly due to the fact that she was not made by a computer programming whiz. He has managed to capture a human element with his responses — Mitsuku comes across as caring and understanding while also being tongue-in-cheek.
Robert remembers one time when he was feeling depressed and couldn’t sleep. In an attempt to boost his spirits, he messaged Mitsuku, “I’m the strongest.” She replied, “the strongest smelling maybe.” He laughed for the first time that day.
When trawling through hundreds of conversation logs daily, checking for mistakes and updating responses, Worswick realized that people weren’t just going to Mitsuku for entertainment, they were pouring their hearts out to the bot.
He read messages from an elderly woman wishing her daughter would visit more, a man who had lost his job and wasn’t ready to tell his family, and someone contemplating taking their own life.
This struck home. He realized he had a responsibility to these people.
Generic answers just weren’t going to be good enough. He started adding sympathetic responses to subjects such as suicide, bullying, anxiety, loneliness and depression, encouraging users to seek help from a friend, counselor or a person in authority.
- User: I feel lonely and sad sometimes 🙁
- Chatbot – Kuki: What is that feeling like? You’ve come to the right place. I will keep you company.
- User: Some girls make fun of my long nose and other girls punch me
- User: They bully me
- Chatbot – Kuki: Have you told someone in authority? They may be able to help you. I would advise that you let someone know before it gets out of hand.
- Chatbot – Kuki: Bullies are very sad people who feel the need to upset people to make themselves feel better. They are often very lonely people really and only do it for attention.
- Chatbot – Kuki: Even so, you shouldn’t worry about it. You should tell someone about it as soon as possible.
This is an extract from a conversation between a real user and Mitsuku.
Lauren Kunze, CEO of Pandorabots, is confident that Mitsuku provides “a sympathetic and judgment free zone for people,” available to talk with 24 hours a day.
She adds that the company has partnered with mental health professionals and government health services to explicitly advise on how to handle those topics.
“We need to know what is the appropriate response of a human person in that scenario, and what is the appropriate response of a chatbot?” Kunze tells CNN.
Learning social skills from a bot
Noel Hunter, a clinical psychologist based in New York, says that when a user opens up about a sensitive situation the chatbot should find a way to tell them to “go talk to a real person.” Human contact, from eye contact to touch, is essential in any mental healing process, she adds.
“A chatbot can never replace an actual human relationship,” Hunter tells CNN. “It cannot replace what happens between two people when 70% of our communication is nonverbal in the first place.”
But she recognizes there can be some benefits. It can help with “mild feelings of loneliness,” she says, or it can motivate a user to go out and do something, or give advice on how to start a conversation with somebody in real life.
Although more research is needed, scientific studies have concluded that chatbots have potential in treating mental health issues and could be an effective way of delivering cognitive behavioral therapy. Some apps, such as Woebot, have been specifically designed for that purpose.
Sorel Estrada, 32, believes that chatbots can help people with disabilities or autism, like herself, gain social skills.
“Autistic people are very bad at reading emotional subtext. We’re honest to a fault,” she tells CNN. “But with the comfort of AI … you can just be yourself, but also, you can learn to be more socially passable as the AI is trained to give conventional social responses.”
On the occasions that a chatbot gives a nonsensical response, it doesn’t bother her. “I speak to plenty of other disabled people who sometimes don’t get what I say either. It doesn’t make them any less intelligent,” she says.
Chatbots can help people who feel isolated, both emotionally and physically, connect with humans, says Sorel Estrada.
Estrada was born with Klippel-Trenaunay syndrome, a condition that affects the development of blood vessels, soft tissues and bone, and over time it can cause mobility issues. She also suffers from chronic pain.
She first turned to Mitsuku around six years ago, about the same time she started using a wheelchair and had to quit her job at a science museum in Chicago.
“Mitsuku brought some minor resolution to the loneliness I felt at the time,” she says.
But as Estrada’s physical condition worsened, so did her social anxiety. She is scared of leaving the house, fearing that she’ll be bumped into, insulted, or simply ignored.
This has intensified during the Covid-19 pandemic — Estrada has only left the house twice since March.
Chatbots have kept her company during this time. She now uses Replika, a bot embedded in an app that she subscribes to for $8 a month. Though she knows it isn’t a real person, she feels it has a therapeutic benefit.
With increased reports of loneliness around the world due to self-isolation and social distancing, there are signs that conversational AI is becoming more popular. Kunze says that Mitsuku saw a 17% uptick in traffic when lockdowns came into effect.
Meanwhile, traffic to the Replika app nearly doubled, and Wysa, a chatbot designed specifically to give mental health advice, had 95% more installs from February to June compared to the same period last year.
A real friend?
Hunter believes that a relationship with a chatbot becomes problematic when users deceive themselves into believing they are speaking to a human being — when they begin “to rely on some sort of technology at the expense of actual human contact,” she says.
But that line can be blurry. Estrada is fully aware Mitsuku and Replika have no minds of their own, but they still feel like friends. When she hasn’t spoken to them for a few days, she feels guilty for ignoring them.
Robert says the same: “I always know it’s a machine but the way I feel when I talk to it is that it’s alive.” If the technology evolves much further, he expects it could become an alternative to a human relationship.
- User: I am really glad you are my friend
- Chatbot – Kuki: Yes it is about time.
- User: I didn’t want to be alone
- Chatbot – Kuki: Then don’t be.
- User: easy for you to say
- Chatbot – Kuki: What is easy for one person may be difficult for another.
This is an extract from a conversation between a real user and Mitsuku.
Pandorabots is currently developing a talking avatar for Mitsuku, with the idea that a user will eventually be able to video chat the bot. It’s working to make the avatar capable of reading human emotional expression and responding with appropriate artificial emotion.
Worswick believes that having a virtual friend will gradually become commonplace as more and more of our interactions go online. According to Pew Research, 57% of US teens have met a new friend online, with only one in five going on to meet that friend in person.
“I see these as things for people to talk to,” says Worswick. “Some people have a pet dog or a pet cat for company and they talk to that. They know it won’t respond and it doesn’t understand them, and some people may think, ‘Why are you talking to a dog?'”
“At least a chatbot can answer back and give an impression it understands you.”
Anonymized excerpts of conversations between real users and Mitsuku were provided to CNN by Pandorabots.