Neveen Radwan said she didn't realise the triggers behind her daughter's eating disorder until she opened the teenager's phone and scrolled through her Instagram and TikTok feeds.
She said her daughter, then 15, was being bombarded with photoshopped images of rail-thin celebrities in bikinis and videos about extreme, low-calorie diets. In a separate note, Radwan's daughter had jotted down the heights and weights of a dozen female celebrities, like Kylie Jenner.
"The way that these algorithms work, there's just like a tidal wave of information," she said. "No matter what you tell them, how much this stuff isn't real, there's no way you can get through."
Radwan, who lives in San Jose, said she took away her daughter's phone and tried to persuade her to eat. But the teen was caught in the grips of anorexia — she was hospitalised after her heart rate fell dangerously low and was admitted to a treatment centre for people with eating disorders.
Her daughter's struggles illustrate what many privacy advocates say is the downside to 20-plus years of rapid growth in the digital space: Social media platforms and other sites routinely target child users with their content, but face little, if any, regulation over their practices dealing with youth-oriented content.
The outcome is a virtual "Wild West" of content accessible to children, which behavioural experts say has fuelled a mental-health epidemic among young people, including their struggles with body image, depression, anxiety, suicidal ideation and addictive patterns of digital behaviour.
But two California state lawmakers who are also parents — Assembly Members Jordan Cunningham, R-San Luis Obispo, and Buffy Wicks, D-Oakland — are taking the issue head on with proposed legislation that would require tech companies to provide online protections tailored for kids.
Their proposal, composed of two bills, would be the most sweeping package of children's Internet laws in the country. It could give momentum to similar efforts in other states, as attempts to pass child Internet protections have largely fizzled at the federal level.
Cunningham said the bills would hold tech companies responsible for their algorithms, the code that determines what content users see, and other design features they use to track child users or show them inappropriate or harmful content. For example, he said, teen users who search for information about healthy eating can be inundated with ads for diet pills or other weight-loss products.
"Folks, if you're going to create a product that you know children are going to use, you should design it in a way that doesn't require them to seek psychiatric help," he said during a floor debate last week. "This standard exists for every other product on the market."
Last fall, a study leaked by a former employee of Facebook, which owns Instagram, found that significant percentages of teens said the platform causes them major distress: 17% of girls said using it makes eating disorders worse, and 13.5% of girls said it makes their suicidal thoughts worse.
Cunningham said the bills would signal to big tech that "the era of unfettered social experimentation on children is over" and that companies are financially liable for the harm they cause.
Two main bills are at the heart of the effort in California:
— AB2408 would allow parents to sue social media platforms if their children become addicted, and allow them to seek a penalty of US$25,000 (RM109,700) per violation, in addition to other punitive damages. The bill defines addiction as a user's inability to "cease or reduce use of a social media platform" despite their desire to do so. To sue a company, the plaintiffs would have to prove that the child has been harmed physically, mentally, emotionally or developmentally.
— AB2273 would create an age-appropriate design code for websites or apps likely to be accessed by children, requiring tech firms to create products with the safety of children in mind. It would prohibit companies from using a child's online data, such as what terms they search, or otherwise profiling them to power algorithms that promote harmful content. Companies would also be required to provide the highest privacy settings to users younger than 18 by default.
Both bills have stirred opposition from large tech companies, but AB2408 has stirred the most complaints from Silicon Valley groups that worry companies could face never-ending lawsuits.
Dylan Hoffman, executive director for California and the Southwest for TechNet, the industry advocacy group, said making Internet companies liable for so-called social media addiction would be "incredibly costly and puts a lot of risk on our companies." He said the bill could also erode free-speech protections that shield Internet companies from being sued over the content users post on their platforms.
"I have a hard time picturing it benefiting anybody other than just trial attorneys," Hoffman said. "We've certainly taken a lot of steps proactively both to protect kids but also give parents more control."
TechNet and other Internet industry players argue that the best way to protect children online is to focus on developing better design standards that limit a child's access to harmful content from the outset.
Hoffman said tech companies are open to supporting AB2273 if the measure is amended in several key ways, including narrowing the definition of what sites are likely to be accessed by children and giving the state attorney general the sole authority to enforce the law, effectively barring private lawsuits.
Opposition from tech groups has been a major hurdle to state legislators passing aggressive Internet-privacy measures in past years. But political sentiment around the debate may have shifted.
Jim Steyer, the CEO of Common Sense Media, a San Francisco-based advocacy group, said the bills are drawing bipartisan support in California because isolation during the Covid-19 pandemic laid bare the mental health problems that social media has exacerbated for youth.
"There's no question that there's broader public support, but also among legislators," Steyer said. "It's just a new ball game."
Last month, both bills passed the Assembly with support from many Democrats and Republicans, though the margin was tighter for AB2408. The measures now head to the Senate, where there will likely be an effort to soften or shelve them.
Tiffany Chen, a junior at UCLA and volunteer with Good for MEdia, an advocacy group that helps young people use social media in healthy ways, said she's advocating for the state to pass AB2273 to better protect the next generation of teens from a firehose of targeted content.
Chen, who grew up in Cupertino, said she remembers how the emergence of apps like Instagram and TikTok around 2015 changed her digital experience as a teen. She said the apps created pressure to be part of a "bubble of perfectness online" as teens competed for the most "likes" or painstakingly staged selfies with the perfect lighting to be just like the influencers in their feeds.
"I did notice there was this increasing pressure to conform, to live your best life on social media even if, in reality, you're not living it," she said. "For me, it got really draining."
Radwan, the mother whose teen developed an eating disorder, said her daughter came home last fall, after spending about six months at a treatment facility in Denver. She said she's hopeful that if California passes laws regulating social media content for children, it will spare other families from the same trauma.
"It's been excruciating, absolutely excruciating," Radwan said. "There's got to be responsibility for the companies putting it out there. There's got to be some accountability." – San Francisco Chronicle/Tribune News Service