PETALING JAYA: The death of a Sarawakian teenager who posted a story on Instagram moments before taking her own life has not only garnered worldwide attention, it has increased scrutiny of social media platforms.
The 16-year-old had uploaded a post with the heading “Really Important, Help Me Choose D/L”. D and L were largely taken to mean “die” and “live”.
The girl jumped to her death after 69% of those who voted in the poll supported her taking her own life.
Following a backlash, Instagram said it had put measures in place to prevent self-harm and suicide.
But its representatives acknowledged the difficulty of detecting such posts due to the different ways people express themselves.
Instagram chiefs were questioned in the UK Parliament on Wednesday about the poll.
BBC quoted Instagram head of product Vishal Shah as telling British lawmakers that they were looking at whether policy changes were needed.
The platform has in place a mechanism to detect self-harm thoughts and seeks to remove certain posts while offering support where appropriate, he said.
For example, if a user searches for the word “suicide”, a pop-up appears offering to put them in touch with organisations that can help.

However, Vishal Shah said, the way people expressed mental health issues was constantly evolving, which posed a challenge.

Instagram Asia Pacific communications head Wong Ching Yee said measures had been taken to provide users with suicide prevention tools and information.
These included how to report content, get support for a friend or contact experts for help, she said.
Wong explained that to build a supportive community, Instagram had taken to automatically filtering out offensive comments and adding screens to sensitive content.
“We want those struggling with mental health issues to be able to access support on Instagram when and where they need it,” she said.
The platform also works with experts to give people the tools and information they need while using the app, such as how to report content or get support for friends they are concerned about, she said.
“In Malaysia, we work with Befrienders KL,” she said in an e-mail to The Star yesterday.
Instagram, she added, has teams worldwide working 24 hours a day, seven days a week to review incoming reports and prioritise the most serious ones, such as suicide.

“We provide people who have expressed suicidal thoughts with a number of support options.
“For example, we prompt people to reach out to a friend and offer pre-populated text to make it easier for people to start a conversation.
“We also suggest contacting a helpline and offer other tips and resources for people to help them in that moment,” she said.
Wong said if someone posts on Facebook or Instagram about their wellbeing, other users are encouraged to reach out to the person or report the post to the admin.

Facebook, which owns Instagram, has had suicide prevention tools in place for more than 10 years.
These tools were developed in collaboration with over 50 mental health organisations across the world with input from people with personal experience thinking about or attempting suicide, she said.
“In the past few years we have worked to expand these tools, enhance our review tools for live broadcast and adopt the use of AI (artificial intelligence) to help identify when someone might be expressing thoughts of suicide, respond to reports faster and improve how we identify appropriate first responders,” she added.

A survey by The Star Online showed that although 43% of users spend more than three hours a day on social media, many are not too concerned with the comments they receive on these platforms.
Out of over 160 respondents, 80% said they did not go to social media to seek support for their problems.

When asked to rate how important social media comments were to them, 39% were neutral and 31% said they couldn’t be bothered.

About 47% of users said they would show concern if someone posted about their problems on social media, with 40% saying they would do so sometimes and 11% saying they would not.
Notably, 15% of respondents knew someone who had attempted or died by suicide because of social media comments.
The Malaysian Communications and Multimedia Commission (MCMC) urged social media users to be responsible and alert, especially when they see suicidal posts.
“If (users) find that any of their online friends are showing suicidal inclinations, a report must immediately be made to the police for further observation and action,” MCMC said in a statement on Wednesday.
It said police were verifying whether the online voting had elements of abetting the suicide, an act punishable under Section 305 of the Penal Code.
MCMC also urged Internet users to report cases of cyberbullying or inappropriate content to it.
Communications and Multimedia Minister Gobind Singh Deo said the government was making progress in its plan to come up with a law to tackle cyberbullying.
Gobind, who mentioned the proposal of such a law in December, said discussions with various ministries and the police were ongoing and he would make an announcement when the law was ready.
He said this to reporters after distributing bubur lambuk to TM Bhd staff at Menara TM yesterday.
Wanita MCA chief Datuk Heng Seai Kie called on the police and MCMC to bring the social media abusers to book for being complicit in the Sarawakian teenager’s suicide.
“To the cyberbullies who goaded the victim to take her own life, does your conscience not prick you, that there is blood on your fingers too?
“Immediate measures might have pre-empted the tragedy. For example, upon noticing her question, the girl’s followers should have tried to contact and dissuade her, while also informing her parents or guardians of her intentions,” said Heng.