Former Meta researchers testify company buried child safety studies


A staffer places a visual aid on a stand as Sen. Josh Hawley (R-MO) questions Sattizahn and Savage during a Senate Judiciary Subcommittee on Privacy, Technology, and the Law hearing titled 'Hidden Harms: Examining Whistleblower Allegations that Meta Buried Child Safety Research', on Capitol Hill on Sept 9, 2025 in Washington, DC. The whistleblowers allege that Meta deleted or manipulated internal research showing children as young as 10 were exposed to sexual harassment, grooming and violence on its platforms in disclosures to Congress and federal regulators. — AFP

WASHINGTON: Meta systematically suppressed internal research highlighting serious child safety risks on its virtual reality platforms, according to allegations from current and former employees who testified to Congress on Sept 9.

The social media giant deployed lawyers to screen, edit and sometimes veto sensitive safety research after facing congressional scrutiny in 2021, six researchers alleged.

In their allegations, first revealed in the Washington Post, the whistleblowers claim Meta's legal team sought to "establish plausible deniability" about negative effects of the company's VR products on young users.

Though a major money loser for the company that owns Facebook and Instagram, Meta is a leading force in the VR industry, primarily through its Quest lineup of devices, including the successful Quest 3.

"Meta is aware that its VR platform is full of underage children. Meta purposely turns a blind eye to this knowledge, despite it being obvious to anyone using their products," said former Meta researcher Cayce Savage at the US Senate hearing.

According to the Post, internal documents show that after former Meta product manager Frances Haugen leaked damaging information about the company's policies on content issues, the company imposed new rules on any research into "sensitive" topics including children, gender, race and harassment.

This included advice to researchers to "be mindful" about how they framed studies, avoiding terms like "illegal" or saying something "violates" specific laws.

But the documents reveal employees repeatedly warned that children under 13 were bypassing age restrictions to use Meta's VR services, despite terms of service limiting access to users 13 and older.

As early as 2017, one employee estimated that in some virtual rooms as many as 80 to 90% of users were underage, warning: "This is the kind of thing that eventually makes headlines – in a really bad way."

Speaking to the Post, Meta vehemently denied the allegations, with spokeswoman Dani Lever calling them a "predetermined and false narrative" based on cherry-picked examples.

"We stand by our research team's excellent work and are dismayed by these mischaracterizations of the team's efforts," Lever said, noting the company has developed various safety protections for young users.

Researcher Jason Sattizahn told the Senate hearing that it was "very clear that Meta is incapable of change without being forced by Congress."

"Whether it's engagement or profits at any cost, they have, frankly, had unearned opportunities to correct their behaviour, and they have not," Sattizahn told senators.– AFP
