These tips from experts can help your teenager navigate AI companions


Bruce Perry, 17, demonstrates Character.AI, artificial intelligence chatbot software that allows users to chat with popular characters such as EVE from Disney's 2008 animated film WALL-E, in Russellville, Arkansas. — AP

As artificial intelligence technology becomes part of daily life, adolescents are turning to chatbots for advice, guidance and conversation. The appeal is clear: Chatbots are patient, never judgmental, supportive and always available.

That worries experts, who say the booming AI industry is largely unregulated and that many parents have no idea how their kids are using AI tools or how much personal information they are sharing with chatbots.

New research shows more than 70% of American teenagers have used AI companions and more than half converse with them regularly. The study by Common Sense Media focused on “AI companions” such as Character.AI, Nomi and Replika, which it defines as “digital friends or characters you can text or talk with whenever you want,” as distinct from AI assistants or tools like ChatGPT, though it notes those can be used in the same way.

It’s important for parents to understand the technology. Experts suggest several things parents can do to help protect their kids:

– Start a conversation, without judgment, says Michael Robb, head researcher at Common Sense Media. Approach your teen with curiosity and basic questions: “Have you heard of AI companions?” “Do you use apps that talk to you like a friend?” Listen and understand what appeals to your teen before being dismissive or saying you’re worried about it.

– Help teens recognise that AI companions are programmed to be agreeable and validating. Explain that’s not how real relationships work and that real friends with their own points of view can help navigate difficult situations in ways that AI companions cannot.

"One of the things that's really concerning is not only what's happening on screen but how much time it’s taking kids away from relationships in real life,” says Mitch Prinstein, chief of psychology at the American Psychological Association. "We need to teach kids that this is a form of entertainment. It's not real, and it's really important they distinguish it from reality and should not have it replace relationships in your actual life.”

The APA recently issued a health advisory on AI and adolescent well-being, along with tips for parents.

– Parents should watch for signs of unhealthy attachments.

"If your teen is preferring AI interactions over real relationships or spending hours talking to AI companions, or showing that they are becoming emotionally distressed when separated from them – those are patterns that suggest AI companions might be replacing rather than complementing human connection,” Robb says.

– Parents can set rules about AI use, just like they do for screen time and social media. Have discussions about when and how AI tools can and cannot be used. Many AI companions are designed for adult use and can mimic romantic, intimate and role-playing scenarios.

While AI companions may feel supportive, children should understand the tools are not equipped to handle a real crisis or provide genuine mental health support. If kids are struggling with depression, anxiety, loneliness, an eating disorder or other mental health challenges, they need human support – whether it is family, friends or a mental health professional.

– Get informed. The more parents know about AI, the better. “I don’t think people quite get what AI can do, how many teens are using it and why it’s starting to get a little scary,” says Prinstein, one of many experts calling for regulations to ensure safety guardrails for children. “A lot of us throw our hands up and say, ‘I don’t know what this is! This sounds crazy!’ Unfortunately, that tells kids if you have a problem with this, don’t come to me because I am going to diminish it and belittle it.”

Older teenagers have advice for parents and kids, too. Banning AI tools is not a solution because the technology is becoming ubiquitous, says Ganesh Nair, 18.

"Trying not to use AI is like trying to not use social media today. It is too ingrained in everything we do,” says Nair, who is trying to step back from using AI companions after seeing them affect real-life friendships in his high school. "The best way you can try to regulate it is to embrace being challenged.”

"Anything that is difficult, AI can make easy. But that is a problem,” says Nair. "Actively seek out challenges, whether academic or personal. If you fall for the idea that easier is better, then you are the most vulnerable to being absorbed into this newly artificial world.” – AP
