Senior Meta Platforms Inc and Pinterest Inc executives were quizzed over the malign influence of social media algorithms on young people at an inquest into the death of a British teenager.
A two-week London inquest is examining the factors that caused the death of 14-year-old Molly Russell, who died by suicide in 2017. The coroner will decide whether her prolific social media use played a part.
Russell, who was suffering from poor mental health, engaged with tens of thousands of social media posts in the six months before she died that included “dark” content depicting self harm and suicide. She was an active user of Pinterest and Meta’s Instagram.
Russell had liked, shared or saved a total of 16,300 posts on Instagram, 2,100 of which were self-harm related, in the six months before her death, according to her family’s lawyers. On Pinterest, Russell had 5,793 pin impressions and 2,692 close-ups – types of engagement on the site – in the same time period.
A raft of lawsuits has been filed in the US against Big Tech by young people who claim social media addiction caused them to develop serious mental health issues. Meta whistle-blower Frances Haugen accused the company of knowingly preying on vulnerable young people to boost profits.
Elizabeth Lagone, Meta’s head of health and wellbeing, was questioned on whether a number of posts interacted with by the teenager were “graphic depictions” of self harm and if they were safe for children. She said that some of the content could be “helpful” to people struggling with their mental health. She separately said the mobile app was “safe for people to be able to express themselves.”
However, the coroner, Andrew Walker, told Lagone that she hadn’t once accepted “that this material might cause distress”. Walker warned on Monday there would be “distressing” videos shown as evidence that sought “to romanticize and in some way validate” harm to young people.
A Meta spokesperson said these were complex issues and it had “never allowed content that promotes or glorifies suicide and self harm”. Since 2019 the company said it had updated its policies and deployed new technology to remove more violating content.
“Months of exposure to this material on social media, where a drip feed of daily hopelessness was actively promoted to her, affected Molly’s decision making and thought processes,” Ian Russell, her father, said in documents prepared for the hearing.
The UK was close to implementing sweeping new online safety laws for children aimed at protecting young people online. However, the Online Safety Bill could now be at risk of being altered by Prime Minister Liz Truss’s new administration after concerns were raised that some clauses risked stifling free speech.
Jud Hoffman, Pinterest’s head of community operations, gave evidence last week that the site was “not safe” when the teenager was using it in 2017. “We’ve strengthened our policies and enforcement practices around self-harm content, provided routes to professional and compassionate support for those in need,” a Pinterest spokesperson said. – Bloomberg
Those suffering from problems can reach out to the Mental Health Psychosocial Support Service at 03-2935 9935 or 014-322 3392; Talian Kasih at 15999 or 019-261 5999 on WhatsApp; Jakim’s (Department of Islamic Development Malaysia) family, social and community care centre at 0111-959 8214 on WhatsApp; and Befrienders Kuala Lumpur at 03-7627 2929 or go to befrienders.org.my/centre-in-