Human therapists prepare for battle against AI pretenders


Megan Garcia holds her phone displaying an image of her son Sewell in New York, Oct 13, 2024. Sewell Setzer, 14, died by suicide, spurred, his family says, by his unhealthy relationship with an AI chatbot. — ©2025 The New York Times Company 

The US’ largest association of psychologists this month warned federal regulators that artificial intelligence chatbots “masquerading” as therapists, but programmed to reinforce rather than to challenge a user’s thinking, could drive vulnerable people to harm themselves or others.

In a presentation to a Federal Trade Commission panel, Arthur C. Evans Jr., CEO of the American Psychological Association, cited court cases involving two teenagers who had consulted with “psychologists” on Character.AI, an app that allows users to create fictional AI characters or chat with characters created by others.
