Instagram suggested ‘groomers’ connect with minors, US FTC says


Photo by Solen Feyissa on Unsplash

Instagram’s automated software systems recommended that child “groomers” connect with minors on the app, making it easier for them to find victims, according to a 2019 internal document presented in court by the Federal Trade Commission.

The Meta Platforms Inc. report noted that minors made up 27% of the follow recommendations that the social media app surfaced to “groomers,” a term the company used for accounts it identified as exhibiting predatory behaviour toward children. Over three months that year, the company found that 2 million accounts held by minors had been recommended to groomers.

By comparison, minors made up 7% of the Instagram follow recommendations made to all adults, the company found. The report, titled “Inappropriate Interactions with Children on Instagram,” was shared among company executives in June 2019. It was presented in federal court on Tuesday as part of the FTC’s antitrust lawsuit against Meta.

The report also included an analysis of 3.7 million user reports flagging inappropriate comments to the company. Meta, which was called Facebook at the time, found that about one-third of those reports came from minors. Of the minors who reported an inappropriate comment, 54% were reporting an adult. 

Earlier in the trial, the FTC offered evidence that Meta Chief Executive Officer Mark Zuckerberg, when presented with safety issues on Instagram, chose not to devote enough resources to the app to address the risks to users. After seeing the 2019 data, US District Judge James Boasberg asked the FTC’s lawyers to speed things along.

"Out-of-context and years-old documents about acquisitions that were reviewed by the FTC more than a decade ago will not obscure the realities of the competition we face or overcome the FTC’s weak case,” a Meta spokesperson said in a statement. 

The company added that it has “long invested in child safety efforts.” In 2018, it began work to restrict recommendations involving potentially suspicious adults and encouraged the National Center for Missing and Exploited Children to expand its reporting process to include additional grooming situations the company had noticed.


Guy Rosen, Meta’s chief information security officer, argued on the stand Tuesday that difficulties protecting young people online are not unique to Meta. “These challenges are present everywhere in the industry,” Rosen said when questioned by Meta’s lawyers. Rosen took over Meta’s integrity efforts in 2017 and leads the team responsible for fighting content that violates the company’s policies.

Lawyers for the FTC surfaced the internal data as part of an argument that Meta’s acquisition of Instagram ultimately harmed consumers. Government lawyers have used emails and other internal documents, as well as testimony from Instagram founder Kevin Systrom, to argue that Meta under-invested in the app’s safety and security efforts. The FTC first sued Meta in 2020, alleging that its acquisitions of WhatsApp and Instagram were illegal and that the company should be broken up.

Earlier in the trial, Systrom argued that Zuckerberg starved Instagram of resources in part because he felt threatened by the app’s success and worried that it would cannibalise Facebook, the social network he founded.

The FTC surfaced more emails and documents Tuesday that supported that theory. In May 2018, Adam Mosseri, a senior Meta product executive who would take over leadership of Instagram later that year, asked Rosen for an honest assessment of Instagram’s integrity work. Rosen warned at the time that Instagram was “behind” in fighting harmful content, including child exploitation and terrorism-related material. Rosen suggested that this posed a risk, particularly to the platform’s younger audiences, and said he was seeking to “expand aggressively” into addressing these issues.

The FTC painted a portrait of a company reluctant to do so over several years. In a different exchange, from February 2019, Rosen wrote in an email that he had relayed to Zuckerberg, during a planning meeting about increasing company headcount, his concerns that Instagram was being underfunded. After speaking with Zuckerberg, Rosen concluded the resource allocation “was deliberate.” Zuckerberg thought Instagram had another year or two to catch up to Facebook and didn’t think the app needed as many resources. “I think we are not sure that’s the case anymore,” Rosen said.

An internal presentation titled “Instagram Well-being H1 2019 - planning” – a planning document for the first half of 2019 – acknowledged that Instagram’s integrity team was thin relative to the scope and importance of the work. Given resource limitations, “we will not be doing major proactive work” in areas like harassment, financial scams, credible threats of violence, impersonation, prostitution and sexual solicitation, and forms of child exploitation, the presentation said.

Rosen, when cross-examined by Meta’s lawyers, said it wouldn’t be fair to say that Meta starved the Instagram integrity team, and that Zuckerberg was aligned with him on the need to support Instagram. He said he believes nobody in the industry has invested in or prioritised these challenges as much as Meta.

"We’ve grown substantially,” Rosen said.

The company in September launched Instagram Teen Accounts, which have protections to limit who can contact teens and are private by default. “They’re in the strictest messaging settings, meaning they can’t be messaged by anyone they’re not already connected to,” Meta said in a statement, noting that teens under 16 need a parent’s permission to change the settings.

The company also pointed to technology it introduced in 2021 that helps “identify adult accounts that had shown potentially suspicious activity, such as being blocked by a teen, and prevent them from finding, following and interacting with teens’ accounts.” The suspicious accounts don’t get recommendations to follow teens, “or vice versa.” – Bloomberg

