Young Europeans turn to AI chatbots for emotional support, survey shows


FILE PHOTO: A message reading "AI artificial intelligence", a keyboard, and robot hands are seen in this illustration taken January 27, 2025. REUTERS/Dado Ruvic/Illustration/File Photo

May 5 (Reuters) - Nearly one in two young people in Europe have used AI chatbots to discuss intimate or personal matters, as the technology increasingly serves as a source of emotional support, an Ipsos BVA survey showed on Tuesday.

Of the 3,800 people surveyed, 51% said it was "easy" to discuss mental health and personal issues with a chatbot. Only 49% said the same about healthcare professionals and 37% about psychologists.

People close to respondents ranked at the top of the list, with 68% saying it was easy to discuss issues with friends and 61% with parents.

The survey, commissioned by France's privacy watchdog CNIL and insurer Groupe VYV, was carried out among people aged 11 to 25 across France, Germany, Sweden and Ireland in early 2026.

The findings underscored growing concerns over young people's mental health. About 28% of respondents met the threshold for suspected generalized anxiety disorder, the survey found.

Around 90% of those surveyed had used artificial intelligence tools before, with many citing their constant availability and non-judgmental nature. More than three in five users described AI as a "life adviser" or a "confidant".

However, concerns over the psychological impact of AI tools have also grown over the past year, and experts have warned about the limitations of AI in detecting human emotions and safely providing emotional support.

Earlier this year, the family of a Florida man sued Google, alleging its Gemini AI chatbot contributed to his paranoia and eventual suicide.

The results of the survey were not a surprise, said Ludwig Franke Föyen, a psychologist and digital health researcher at Stockholm's Karolinska Institutet.

Current large language models can produce high-quality responses, Franke Föyen told Reuters, adding that his research suggested even licensed professionals may struggle to distinguish AI-generated advice from that of human experts.

But he warned against relying on chatbots alone for mental health support, saying general-purpose AI systems were designed for engagement, and that companies' goals may not align with mental healthcare needs.

"AI can offer information and support, but it should not replace human relationships or professional care," Franke Föyen said.

"If someone turns to a chatbot instead of speaking to a parent, a friend, or a mental health professional, that is a concern. We do not want technology to make people feel more alone."

(Reporting by Lucie Barbier and Leo Marchandon in Gdansk, editing by Milla Nissi-Prussak)
