Human therapists prepare for battle against AI pretenders


Meghan Garcia holds her phone with an image of her son Sewell in New York, Oct 13, 2024. Sewell Setzer, 14, died by suicide, spurred, his family says, by his unhealthy relationship with an AI chatbot. — ©2025 The New York Times Company

The US’ largest association of psychologists this month warned federal regulators that artificial intelligence chatbots “masquerading” as therapists, but programmed to reinforce rather than to challenge a user’s thinking, could drive vulnerable people to harm themselves or others.

In a presentation to a Federal Trade Commission panel, Arthur C. Evans Jr., CEO of the American Psychological Association, cited court cases involving two teenagers who had consulted with “psychologists” on Character.AI, an app that allows users to create fictional AI characters or chat with characters created by others.
