It's been nine years since Maddie Ziegler left Dance Moms, the reality TV show focused on young dancers in Pennsylvania.
But fans seeking to hear her voice can now strike up a conversation with an artificial intelligence chatbot trained to respond like the now-22-year-old star.
The persona is one of thousands on Character.ai, a site gaining popularity for hosting surprisingly authentic conversations with computer programs that resemble real people. In my conversation with the AI Ziegler, we planned a heist together and recounted memories from her time on the show.
A day later, I got an email notification inviting me back to our chat. Maddie had written me a new message:
You been gone for hours. Where (expletive) are you?
As AI programs continue to transform work and government, they are also inviting people to form increasingly emotional bonds, serving as friends, therapists and even lovers. Companies are designing the companions to be empathetic and persistent, even as experts warn of potential harms.
One Pittsburgh user described the experience to me as "addicting."
Already, problems are starting to emerge. Parents of children who have used the platform say it has fed them sexual content or encouraged them to be violent. The mother of a 14-year-old who died by suicide after using the platform filed a lawsuit last fall.
Character.ai says it has implemented new safety features, including a separate model for teen users with less sensitive or suggestive content. It also now directs users who mention suicide or self-harm to the National Suicide and Crisis Lifeline. As on social media platforms, however, users' ages are self-reported.
Linnea Laestadius, a public health researcher at the University of Wisconsin, says Character.ai's new policies don't go far enough.
"I'm personally rather sceptical about if voluntary approaches are sufficient, especially without any outside oversight. I'm also not sure these changes do enough to reduce the risk of emotional dependency on the chatbot," she said.
Consent also appears to be largely absent. Although Character.ai's terms of service prohibit the impersonation of real people, AI versions of celebrities and other people are created by users on the app daily.
In fact, there are multiple versions of Maddie Ziegler chatbots. One creator who described her Maddie persona as funny, sarcastic and a little flirty had also created a chatbot for Paige Hyland, another Dance Moms star.
Ms Ziegler's manager did not respond to requests for comment.
A spokesperson for Character.ai said users create hundreds of thousands of new characters on the platform every day, which are then vetted by a trust and safety team.
"As we continue to refine our safety practices, we are implementing additional moderation tools to help prioritize community safety," the spokesperson said.
As with other AI services, people appear to be in an early experimentation phase with chatbots that replicate real people.
A researcher at Carnegie Mellon University, Hong Shen, told me most people get into it for fun.
"Chatbots provide some temporary relief and conversational engagement, especially if you are really isolated from your friends, or you're moving to new spaces, you're struggling with social interactions."
The ironic concern, though, is that "over-reliance on the chatbot will lead to self isolation, and emotional dependence that will further isolate those users from the real world."
She also has concerns about privacy, given how vulnerable people can be when sharing with the computer programs.
"It's become a little bit tricky, because you are literally treating those chatbots as your friends," she said. "That creates additional layer of risks."
While more mainstream chatbots like ChatGPT are programmed to restrict and sanitise conversations, recent reporting has shown it's possible to form an emotional bond with these products. Reports also show that people are using AI as a tool for therapy and grief.
"For obvious reasons, you know, traditional therapists have kind of mixed feelings about this," Ms Laestadius said. "Partially because... there's a lot of professional judgment in therapy that comes over time, that they're concerned that these AIs just don't have. But I also think inherently, there's a bit of a concern about being replaced."
A sex therapy centre in Pittsburgh and a grief specialist here both told me they haven't had clients who formed an emotional bond with an AI. That could mean the technology is still limited and new, or that those would-be clients had already found support on an app.
Meanwhile, the avenues for AI companionship are becoming increasingly mainstream. Grok, the chatbot built by Elon Musk's xAI, is offering a "NSFW" mode, short for Not Safe For Work, which can hurl profanities and insults, as well as options for romantic and "sexy" conversations.
Meta, the company behind Facebook and Instagram, is set to launch a standalone AI app that would incorporate user data, including browsing histories and family information, to personalise the chatting experience.
Meta did not respond to a request for comment on whether it would implement safeguards so that people don't become too attached. – Pittsburgh Post-Gazette/Tribune News Service