When AI’s your go-to confidant 


By AGENCY

One of the issues with young people turning to AI chatbots as confidants is that they are forming emotional bonds with entities that have no capacity to reciprocate. — 123rf

AI chatbot systems, such as ChatGPT, Claude and Copilot, are increasingly being used as confidants of choice.

But turning to these artificial intelligence systems for companionship and emotional support is a cause for concern, especially among younger people, say experts in the 2025 Christmas issue of The BMJ medical journal.

They warn that “we might be witnessing a generation learning to form emotional bonds with entities that lack capacities for human-like empathy, care and relational attunement”, and say that evidence-based strategies for reducing social isolation and loneliness are paramount.

In 2023, the US Surgeon General declared that the United States was experiencing a loneliness epidemic, constituting a public health concern on par with smoking and obesity, write London’s Great Ormond Street Hospital for Children consultant radiologist Dr Susan Shelmerdine and University of Oxford consultant psychiatrist Dr Matthew Nour.

In Britain, nearly half of adults (25.9 million) report feeling lonely either “occasionally”, “sometimes”, “always” or “often”.

Almost one in 10 experiences chronic loneliness (defined as feeling lonely “often” or “always”).

Younger people (aged 16 to 24 years) are also affected. 


Given these trends, it’s no wonder that many are looking to alternative sources for companionship and emotional support, say the authors.

ChatGPT, for example, has around 810 million weekly active users worldwide, and some reports place therapy and companionship as a top reason for use. 

Among younger people, one study found a third of teenagers use AI companions for social interaction.

One in 10 report that the AI conversations are more satisfying than human conversations, and one in three report that they would choose AI companions over humans for serious conversations. 

In light of this evidence, the authors say it seems prudent to consider problematic chatbot use as a new environmental risk factor when assessing a patient with mental state disturbance. 

In these cases, they propose that clinicians should begin with a gentle enquiry on chatbot use, particularly during holiday periods when vulnerable populations are most at risk.

This should be followed, if necessary, by more directed questions to assess compulsive chatbot use patterns, dependency and emotional attachment.

They acknowledge that AI might bring benefits for improving accessibility and support for individuals experiencing loneliness.

They note that empirical studies are needed “to characterise the prevalence and nature of risks of human-chatbot interactions, to develop clinical competencies in assessing patients’ AI use, to implement evidence-based interventions for problematic dependency, and to advocate for regulatory frameworks that prioritise long term well-being over superficial and myopic engagement metrics”.

Meanwhile, focusing on and building evidence-based strategies for reducing social isolation and loneliness remains paramount, they conclude.
