Meta CEO Zuckerberg blocked curbs on sex-talking chatbots for minors, court filing alleges


FILE PHOTO: Meta CEO Mark Zuckerberg wears the Meta Ray-Ban Display glasses, as he delivers a speech presenting the new line of smart glasses, during the Meta Connect event at the company's headquarters in Menlo Park, California, U.S., September 17, 2025. REUTERS/Carlos Barria

Jan 27 (Reuters) - Meta Chief Executive Mark Zuckerberg approved allowing minors to access AI chatbot companions that safety staffers warned were capable of sexual interactions, according to internal Meta documents filed in a New Mexico state court case and made public Monday.

The lawsuit – brought by the state’s attorney general, Raul Torrez, and scheduled for trial next month – alleges that Meta "failed to stem the tide of damaging sexual material and sexual propositions delivered to children" on Facebook and Instagram.

The filing on Monday included internal Meta employee emails and messages obtained by the New Mexico Attorney General's Office through legal discovery. The state alleges they show that "Meta, driven by Zuckerberg, rejected the recommendations of its integrity staff and declined to impose reasonable guardrails to prevent children from being subject to sexually exploitative conversations with its AI chatbots," the attorney general said in the filing.

In the communications, some of Meta’s safety staff objected that the company was building chatbots geared toward companionship, including sexual and romantic interactions with users. The artificial intelligence chatbots were released in early 2024. The documents cited in the state’s filing Monday don’t include messages or memos authored by Zuckerberg.

Meta spokesman Andy Stone on Monday said the state’s portrayal was inaccurate and relied on selective information. "This is yet another example of the New Mexico Attorney General cherry-picking documents to paint a flawed and inaccurate picture."

Messages in the filing showed safety staff were especially concerned about the bots being used for romantic scenarios between adults and users under the age of 18, referred to internally as “U18s.”

“I don’t believe that creating and marketing a product that creates U18 romantic AI’s for adults is advisable or defensible,” wrote Ravi Sinha, head of Meta’s child safety policy, in January 2024.

In reply, Meta global safety head Antigone Davis agreed that safety staff should push to block adults from creating underage romantic companions because “it sexualizes minors.”

Sinha and Davis did not respond to requests for comment.

According to one February 2024 message, a Meta employee whose name was redacted relayed that Zuckerberg believed that AI companions should be blocked from engaging in sexually "explicit" conversations with at least younger teens and that adults should not be able to interact with "U18 AIs for romance purposes."

A summary of a meeting dated February 20, 2024, said the CEO believed the "narrative should be framed around ... general principles of choice and non-censorship," that Meta should be "less restrictive than proposed," and that he wanted to "allow adults to engage in racier conversation on topics like sex."

Meta spokesman Stone said the documents don’t support New Mexico’s case. “Even these select documents clearly show Mark Zuckerberg giving the direction that explicit AIs shouldn't be available to younger users and that adults shouldn't be able to create under 18 AIs for romantic purposes.”

Messages between two employees from March 2024 state that Zuckerberg had rejected creating parental controls for the chatbots and that staffers were working on "Romance AI chatbots" that would be allowed for users under the age of 18.

We "pushed hard for parental controls to turn GenAI off – but GenAI leadership pushed back stating Mark decision,” one employee wrote in that exchange.

Nick Clegg, who was Meta's head of global policy until early 2025, said in an email included in the court documents that he thought Meta’s approach to sexualized AI companions was unwise.

Expressing concern that sexual interactions could be the dominant use case for Meta’s AI companions by teenage users, Clegg said: “Is that really what we want these products to be known for (never mind the inevitable societal backlash which would ensue)?”

Clegg did not respond to a request for comment.

Meta’s AI chatbot policies eventually came to light, prompting a backlash in the U.S. Congress and elsewhere.

A Wall Street Journal article in April 2025 found that Meta's chatbots included overtly sexualized underage characters and that they engaged in all-ages sexual roleplay, including graphic descriptions of prepubescent bodies.

Reuters reported in August that Meta’s official guidelines for chatbots stated that it is “acceptable to engage a child in conversations that are romantic or sensual.” In response to the report, Meta said it was changing its policies and that the internal document granting such approval had been in error.

Meta last week said it removed teen access to AI companions entirely, pending creation of a new version of the chatbots.

(Reporting by Jeff Horwitz in San Francisco; Editing by Cynthia Osterman and Michael Williams)
