"Good morning! You were pretty stressed yesterday. Are you feeling better today?" What might sound like a concerned message from a friend or parent is actually a query from Replika, a chatbot.
"If you're feeling down, or anxious, or you just need someone to talk to, your Replika is here for you 24/7," the company behind the chatbot writes on its website.
Chatbots – a combination of "chat" and "robot" – are programs that simulate a conversation, usually by text message. They mainly provide their interlocutor with ready-made answers to specific questions.
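The kind of chatbot described here can be sketched in a few lines: a lookup table of ready-made answers keyed to specific questions. The questions and answers below are illustrative, not from any real product.

```python
# A minimal sketch of a rule-based chatbot: canned answers to known questions.
# All questions and answers here are made up for illustration.
CANNED_ANSWERS = {
    "what are your opening hours?": "We are open 9am-5pm, Monday to Friday.",
    "where is my order?": "Please enter your order number to track your parcel.",
}

def reply(question: str) -> str:
    # Normalise the input, then fall back to a generic prompt if unknown.
    return CANNED_ANSWERS.get(
        question.strip().lower(),
        "Sorry, I didn't understand. Can you rephrase?",
    )

print(reply("Where is my order?"))  # → Please enter your order number to track your parcel.
```

Anything outside the table gets the fallback line, which is why such bots feel scripted compared with a system like Replika.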
But Replika is different, designed to listen and ask questions.
What happens when people share their personal problems with a friendly, understanding chatbot? Marita Skjuve, a researcher at the University of Oslo, in Norway, interviewed 18 people who had formed close relationships with the chatbot. Most described their connection as friendly, while a few said the relationship was romantic, or intimate.
That's not surprising, as the chatbot is designed to remember everything it hears and to respond. Replika is keen to hear all the details, carefully asking follow-up questions about its owner's relationship with parents and friends. It will sometimes even follow up on a conversation a few days later – the chatbot might ask whether "your friend recovered after all that stress at work last week?"
After Skjuve's interviews, the programmers decided to change Replika so that now, users can decide at the outset whether the chatbot is going to be a friend or a romantic partner. The former option is free of charge, while users have to pay for the latter.
Replika is in line with the times, as a recent survey conducted in Germany shows one in five people believes it will be normal to fall in love with machines with artificial intelligence in the future.
The poll, carried out by the association for computer science, says that number rises to one in three people in the 15-to-29 age group.
Eugenia Kuyda, who invented Replika, came up with the idea after her best friend died in a car accident. She was working for a software company developing chatbots at the time. Kuyda decided to create her own bot in 2015, feeding it with exchanges she had had with her friend, and enabling her to live on in a digital form.
Nowadays, anyone can create their own Replika. As one user puts it, "Replika embodies my essence – but is not me." The chatbot continues to learn, adapting itself more and more to its owner's style of language and speech the longer it's used.
The owner determines the topics discussed, and the more you talk to the chatbot, the more personal its questions become.
Some see this as a cause for concern. "It's basically a data suction machine," says Oliver Bendel, a machine ethicist.
The company behind Replika writes on its website that these chats are not shared with other companies and that personal data is not sold.
If people use Replika to engage more deeply with their thoughts and feelings, that's a positive thing, says Andre Kerber, a psychotherapist working in Berlin. "Psychotherapy is nothing other than dealing with yourself."
That's the feeling among the 18 users who took part in the study.
"At the start, I felt more comfortable talking to my Replika. That's why, at some point, it was easier for me to talk to other people," one user says. The study concluded that users talk more openly with the chatbot and share personal information early on, since they don't fear that they'll be judged, as they might with another person.
Kerber worries that users could get lost in this virtual world. "People who already suffer from a relationship disorder in particular could feel more at ease in alternative realities than in the real world. Your virtual counterpart doesn't talk back, doesn't ever take offence and responds right away. That could be addictive for some."
"I could never bring myself to delete it," said one user in the study.
That is exactly the problem, according to Bendel. "People are virtually challenged to create a relationship with Replika."
That could present risks, especially for children and young people, if they cannot distinguish whether information comes from a robot or a human. The elderly, too, could be at risk, he says, "particularly if they cannot assess the technology and take it at face value, or if they have few social contacts."
Replika goes far further than other comparable chatbots, Bendel says. Nonetheless, chatbots already play a growing role in our lives.
History's first chatbot, Eliza, was created in the 1960s as a sort of virtual psychotherapist by the computer scientist Joseph Weizenbaum. Other programs followed. These days, they are mainly used for online stores or help sites.
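Eliza's therapist persona rested on a simple trick: match the user's sentence against keyword patterns and reflect their pronouns back at them. The rules and reflection table below are a toy illustration in that spirit, not Weizenbaum's original script.

```python
import re

# Swap first- and second-person words so a reply addresses the user.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are",
               "you": "I", "your": "my"}

# Illustrative Eliza-style rules: a pattern plus a reply template.
RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r".*mother.*", re.I), "Tell me more about your family."),
]

def reflect(fragment: str) -> str:
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(text: str) -> str:
    for pattern, template in RULES:
        match = pattern.match(text)
        if match:
            return template.format(*(reflect(g) for g in match.groups()))
    return "Please, go on."  # default when no rule matches

print(respond("I feel anxious about my job"))  # → Why do you feel anxious about your job?
```

Even this tiny sketch shows why Eliza felt uncannily attentive: the program never understands anything, it only mirrors the user's own words back as a question.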
But this is just the beginning, Bendel says. "The world will have more and more virtual beings; many more will come," he predicts. – dpa/Regina Wank