Europe pushes for a gentler Internet for children


Stéphanie Mistre sits in the bedroom where her daughter, Marie Le Tiec, ended her life in 2021 at age 15, in Cassis, France, March 9, 2026. The European Union and national capitals are trying to make social media and algorithms less addictive and safer, especially for children. — Andrea Mantovani/The New York Times

BRUSSELS: Marie Le Tiec was 15 when she ended her life in 2021. Her mother, Stéphanie Mistre, went through her phone a month later, and said she was shocked to discover the kind of content on the girl’s TikTok feed.

There were songs glamorising death. There were people encouraging each other to take their lives. There were detailed instructions on how to do so that matched what Marie had done.

“They are making money on the mental health of our children,” said Mistre, who has joined other French parents in suing the social media platform, and has become an activist for children’s online safety.

Last month, a California jury found Meta and YouTube had harmed the mental health of a young user with addictive design features – a landmark case that could lead to further lawsuits in the United States. Efforts to protect children from dangers online are also well underway in Europe, where the European Union and national authorities in France and elsewhere are pushing for new measures.

The goal, they say, is an internet where children are not exposed to sexual or violent content before they are old enough to process it, and in which algorithms are not addictive. It’s a kind of gentler version of the web.

“We are setting a clear limit that you can’t do business by harming people’s mental health,” Henna Virkkunen, the commissioner at the European Union focused on technology issues, said during a recent interview.

TikTok declined to comment on the open case of Le Tiec, but it has pushed back on accusations that its design harms children, as have other social media companies. And some academics warn that choking off access to these online services could leave children less digitally savvy.

Still, the wave of efforts across Europe continues, and may be the most comprehensive attempt yet to limit what children can access on apps and the internet.

It builds on long-running efforts by the EU to better police the internet, including landmark legislation, the Digital Services Act, which prods large online platforms to set standards and monitor for harmful content on their own sites.

Among the new initiatives are these:

  • Investigations into social media companies, including one opened last week by EU regulators into child protection safeguards at Snapchat. Regulators say the social media service has an ineffective age-verification system for children. Snap has said its services had “security and privacy built in” and that it does not allow users younger than 13.
  • Another EU announcement last week finding that Pornhub, Stripchat and other porn platforms “did not diligently identify and assess” the risks their platforms posed to children, or do enough to keep minors off their websites, based on a preliminary investigation. If those results are confirmed, the platforms could face hefty fines. A spokesperson for Stripchat said the company disagreed with aspects of the finding, adding that discussions were ongoing. A Pornhub spokesperson noted that many age verification efforts do not work and drive people toward less regulated sites, and said the company was working with the commission to protect minors.
  • An announcement in February by the European Union that TikTok’s infinite scroll, auto-play features and recommendation algorithm may amount to an “addictive design” that violates EU online safety laws. TikTok has said it planned to challenge the findings.
  • A bloc-wide approach adopted by the European Union to counter cyberbullying.
  • Efforts in France, Greece, Denmark and Spain to explore minimum ages for social media, following in the footsteps of Australia, where a law barring those younger than 16 from social media took hold in late 2025.

“There is no reason our children should be exposed online to what is legally forbidden in the real world,” Emmanuel Macron, the president of France, said in a speech this year.

Ursula von der Leyen, the president of the European Commission, the EU executive arm, has recently convened a panel of experts to advise on whether Europe-wide social media age restrictions for children might make sense.

Like that conversation, many of Europe’s social media restrictions are still in development. If age restrictions do take hold, they will likely be backed by EU technology. The bloc is working on a digital identity wallet that would allow platforms and other websites to easily check how old users are before allowing them in.

The flurry of activity comes as officials become more worried about the role social media is playing in the lives of children, an issue that is politically salient in Europe when little else achieves wide consensus. Research and advocacy papers in France have raised concerns about algorithmic rabbit holes that can exacerbate depressive thinking. About 1 in 6 young adolescents experiences cyberbullying, according to World Health Organisation Europe data.

Many of the European Union’s efforts target non-American firms. TikTok’s owner is Chinese, and Pornhub is Canadian.

There’s a chance, though, that Europe’s push to protect minors could eventually extend to American companies like Meta, putting the bloc on a potential collision course with the United States.

The Trump administration often criticises Europe’s broader approach to digital regulation, especially its efforts to curb misinformation during elections and its push to make the recommender system on the social platform X more transparent.

But when it comes to child safety online, the United States shows some signs of moving in the same direction as Europe.

Many changes are happening through the courts. In addition to the California case against Meta and YouTube, a New Mexico jury recently found Meta liable for violating state laws in ways that it found had enabled sexual exploitation of young users.

Also in the United States, some rules have been enacted in states to keep young people away from algorithm-driven social media.

For Mistre, who lost her daughter, curbing the risks of social media is a major priority, requiring rapid action. She has been active in the European debate, regularly appearing in local media to push for more expansive regulation.

“It’s horrible, the way they use our children,” she said of social media sites. “We have to protect them.” – ©2026 The New York Times Company

This article originally appeared in The New York Times.

Those suffering from problems can reach out to the Mental Health Psychosocial Support Service at 03-2935 9935 or 014-322 3392; Talian Kasih at 15999 or 019-261 5999 on WhatsApp; Jakim’s (Department of Islamic Development Malaysia) family, social and community care centre at 0111-959 8214 on WhatsApp; and Befrienders Kuala Lumpur at 03-7627 2929 or go to befrienders.org.my/centre-in-malaysia for a full list of numbers nationwide and operating hours, or email sam@befrienders.org.my.
