Meta launches new teen safety features, removes 635,000 accounts that sexualise children

The heightened measures arrive as social media companies face increased scrutiny over how their platforms affect the mental health and well-being of younger users. — AP

Instagram parent company Meta has introduced new safety features aimed at protecting teens who use its platforms, including information about accounts that message them and an option to block and report accounts with one tap.

The company also announced on July 23 that it had removed 635,000 accounts that were leaving sexualised comments on, or requesting sexual images from, adult-run accounts featuring kids under 13. Of these, 135,000 were leaving sexualised comments and another 500,000 were linked to accounts that "interacted inappropriately", Meta said in a blog post.


