Megan Garcia holds her phone with an image of her son Sewell in New York, Oct. 13, 2024. Sewell Setzer, 14, died by suicide, spurred, his family says, by his unhealthy relationship with an AI chatbot. — ©2025 The New York Times Company
The US’ largest association of psychologists this month warned federal regulators that artificial intelligence chatbots “masquerading” as therapists, but programmed to reinforce rather than challenge a user’s thinking, could drive vulnerable people to harm themselves or others.
In a presentation to a Federal Trade Commission panel, Arthur C. Evans Jr., CEO of the American Psychological Association, cited court cases involving two teenagers who had consulted with “psychologists” on Character.AI, an app that allows users to create fictional AI characters or chat with characters created by others.
