Instagram parent company Meta has introduced new safety features aimed at protecting teens who use its platforms, including information about accounts that message them and an option to block and report accounts with one tap.

The company also announced July 23 that it had removed hundreds of thousands of accounts that were leaving sexualised comments on, or requesting sexual images from, adult-run accounts of kids under 13. Of these, 135,000 were leaving comments and another 500,000 were linked to accounts that "interacted inappropriately", Meta said in a blog post.

The heightened measures arrive as social media companies face increased scrutiny over how their platforms affect the mental health and well-being of younger users. — AP
