Medics concerned as young reach out to AI for mental health advice



LEIPZIG: Youngsters worldwide are increasingly reaching out to chatbots for advice on mental health issues, a trend medics warn is dangerous as it could discourage some from seeking much-needed help.

Some two-thirds (65%) of 16- to 39-year-olds have spoken to artificial intelligence (AI) about mental health issues at some point, says a new German poll.

Their conversations do not necessarily imply a diagnosed case of depression but often address general issues such as stress, grief or heartbreak.

The proportion is even higher among respondents who stated that they are currently experiencing a depressive phase (76%), says the German Depression Aid and Suicide Prevention Foundation poll.

Respondents say they often use systems such as ChatGPT (77%), Gemini (14%) or Microsoft Copilot (4%).

AI is already involved in cases of serious illness, with more than a third of respondents with a diagnosed depression (35%) saying they have recently spoken to chatbots about their condition, the data shows.

Often, young people said they simply wanted to talk about their problems or seek reassurance. More than half (56%) cite having someone to talk to as their reason, says the poll of 16- to 39-year-olds in Germany.

Some 46% hope to gain better control of their condition themselves, while 40% seek information on therapy and treatment options.

The trend is widespread. In the US, a recent poll found 53.6% of respondents were using AI to manage stress or anxiety.

As in Germany, the survey conducted for George Mason University’s College of Public Health in the US found AI usage highest among those aged 25–34. Some 80% say they turn to AI for these needs and nearly a third do so daily.

Accessible, widespread use

AI chatbots appeal mainly because they are so easy to access: they are anonymous, available around the clock and can be used without a wait, which for many makes them a first port of call for personal problems.

A large proportion of users describe the conversations as helpful and supportive, and many report feeling understood or finding it easier to open up. Around three-quarters of users (75%) say they have spoken to a chatbot about their problems within the past 30 days.

Some use the programmes more intensively: around a quarter (26%) have longer conversations or talk to the AI in a similar way to how they would with a real person.

As in Germany, Australians find the accessibility of chatbots a powerful draw. Nearly three in 10 Australian adults (28%) say they have opened up or been emotionally vulnerable with a chatbot such as ChatGPT at least once.

More than one in five (21%) say they’ve done this multiple times, says a 2025 YouGov poll.

And one in six Australians (17%) say they’d sometimes rather stay home and talk to a chatbot than go out with friends, raising concerns that AI could reduce human-to-human interaction.

Medics voice concerns

Health specialists say the technology offers opportunities but also comes with risks. "The way young people communicate about mental health issues has shifted significantly towards digital spaces in recent years," says Germany-based psychiatrist Malek Bajbouj.

The systems can help bridge gaps in care, he says. “AI-based systems – evidence-based, human-guided and used in a targeted manner – have great potential to break down barriers, reduce waiting times and enable more prevention.”

At the same time, AI can also hamper people in seeking proper care, says Bajbouj. “AI systems carry the risk of pseudo-treatment: instead of seeking professional help, people remain trapped in systems that are either ineffective or even harmful.”

US poll respondents raised some similar concerns, saying their main worries with AI mental health care are data security, reliability of advice, and the lack of human connection.

Not a substitute for therapy

Medics criticise the fact that some users perceive AI as an alternative to treatment. “AI cannot replace therapy,” says Bajbouj. Algorithms are programmed for empathy but lack the critical questioning and therapeutic guidance that are crucial to genuine therapy.

Some respondents say they see exchanges with AI chatbots as an alternative to seeing a doctor or undergoing psychotherapy, with 62% of users with depression saying conversations with AI have made a visit to a doctor or psychotherapist unnecessary for them.

Doctors say that is a problem for those with severe depression – and not enough research has been carried out in this area.

“The side effects of AI-supported treatment have hardly been systematically investigated. As things stand, AI systems are often not equipped to handle crises,” says Bajbouj.

In the worst-case scenario, AI systems could exacerbate distressing or suicidal thoughts, he says.

In fact, 53% of affected users report increased thoughts of self-harm or suicide following use.

A further problem is that many services are not developed for therapeutic purposes, and there is a lack of clear rules, quality standards and independent oversight. Whether AI ultimately helps or harms those affected has not yet been scientifically determined.

Use safe services only as a supplement

Given the drawbacks, medics say AI should be used only as a supplement. “Depression is a serious, often life-threatening illness, and those affected should definitely continue to consult doctors, psychiatrists or psychological psychotherapists,” says the German Depression Aid Foundation.

If you seek digital support, use only services that have been vetted – including approved digital health applications. Such prescription apps are recommended by health-care professionals and, depending on where you live, may be medically prescribed and paid for by your health insurance. Supervised online programmes are also available. – dpa

Those suffering from problems can reach out to the Mental Health Psychosocial Support Service at 03-2935 9935 or 014-322 3392; Talian Kasih at 15999 or 019-261 5999 on WhatsApp; Jakim’s (Department of Islamic Development Malaysia) family, social and community care centre at 0111-959 8214 on WhatsApp; and Befrienders Kuala Lumpur at 03-7627 2929 or go to befrienders.org.my/centre-in-malaysia for a full list of numbers nationwide and operating hours, or email sam@befrienders.org.my.

