Meta launches new teen safety features, removes 635,000 accounts that sexualise children


The heightened measures arrive as social media companies face increased scrutiny over how their platforms affect the mental health and well-being of younger users. — AP

Instagram parent company Meta has introduced new safety features aimed at protecting teens who use its platforms, including information about accounts that message them and an option to block and report accounts with one tap.

The company also announced July 23 that it has removed 635,000 accounts that were leaving sexualised comments on, or requesting sexual images from, adult-run accounts of kids under 13. Of these, 135,000 were leaving sexualised comments and another 500,000 were linked to accounts that "interacted inappropriately", Meta said in a blog post.


