AI therapy believed to do more harm than good


PETALING JAYA: The rising reliance of Malaysian teenagers on AI chatbots for emotional support and counselling is a “double-edged sword” that could potentially do more harm than good, according to mental health experts and child advocates.

While offering immediate comfort, they warn that the technology is largely unregulated and lacks the critical nuances of human interaction and professional therapy.

Consultant paediatrician and child disability activist Datuk Dr Amar Singh HSS said the use of such chatbots to meet emotional needs appears appealing in a world that is increasingly fragmented, busy and lonely.

“They are accessible 24 hours, unlike a human therapist or even a friend,” he said.

Also appealing, he added, was that chatbots offered a judgement-free and safe space for teenagers to express their thoughts in privacy.

However, he cautioned that most chatbots are unregulated, with a potential for inaccurate or biased information.

“Chatbots are not therapists and should not be treated like one. No AI application can compare with a sensible, trustworthy human friend or a trained human therapist,” he added.

Assoc Prof Dr Anasuya Jegathevi Jegathesan said that while AI chatbots may offer some comfort to teens seeking emotional support, they come at a price.

“It helps people not to feel so alone. Chatbot responses are fast so it does feel as though someone is responding to them.

“Some may find it easier to confide in the chatbot, something which they would not do with people,” she said.

With the advancement of AI, she said, some chatbots seem personalised, using human-like generated images or avatars when interacting with users.

Anasuya, who is dean of the Faculty of Psychology and Social Sciences at University of Cyberjaya, pointed out that chatbots are not designed by mental health practitioners but are computer programmes responding to questions based on algorithms.

“The chatbots are trained on data from the Internet, akin to the computing adage ‘garbage in, garbage out’.

“The outcome may not be the most intelligent or supportive response, but one that could be very negative and toxic.

“Individuals who need help, who are desperate or in pain, may end up receiving a negative response instead of support,” she said.

Anasuya added that over-reliance on chatbots could result in teenagers being unable to develop social interaction skills.

“Chatbots give a false sense of having somebody there for you when you are actually still alone.

“As humans, we not only require mental stimulation but also physical interaction,” she said.

She suggested that relevant authorities collaborate with professionals from the healthcare sector to develop chatbots specifically designed to give the young an avenue to express their thoughts.

Consultant psychiatrist Datuk Dr Andrew Mohanraj related his experiences with clients who had relied on chatbots for advice, which only worsened their emotional distress.

He said the risk is even greater if the individual is suffering from psychosis or entertaining suicidal or self-harm thoughts.

“Such individuals may not be able to articulate their feelings accurately, and it is unlikely that they will get an appropriate response from the chatbots,” he added.

He suggested strict regulation and rigorous testing of AI chatbots, with enforcement against deceptive companies, if they are to be used to provide advice or support to the young.

Parent Action Group for Education Malaysia chairman Datin Noor Azimah Abdul Rahim said fear of being judged, shame or repercussions may be why youths rely on chatbots for solace.

She said this is especially true when traditional support systems like friends and family are not available.

The danger, she said, was that there are risks of misinformation or oversimplification of the problem faced by the individual.

Azimah said young users must be educated that AI chatbots cannot replace human intervention, especially for serious mental health issues.

Both the National Health Service in the United Kingdom and the American Psychological Association have voiced concerns recently over the use of AI chatbots for mental health support.

The rise of so-called “AI therapy” has been described as “harmful and dangerous”, with the trend linked to cases of suicide.

Last month, a couple in California, the United States, sued the creator of an AI chatbot, alleging that it had encouraged their 16-year-old son to take his own life.
