TOKYO: The Japanese arm of US-based Meta Platforms, which runs Instagram, said on March 10 that it would introduce a feature in Japan in 2026 to notify parents if children ages 13 to 17 repeatedly try to search for content related to suicide or self-harm on the photo-sharing app.
To further protect children, it will also soon introduce a feature that restricts access to posts about drugs and dangerous behaviour.
For users ages 13 to 17, who are allowed on Instagram under the app’s terms of use, the “Teen Accounts” feature, which limits certain functions, will notify parents via the app or by e-mail if their children repeatedly try to search for suicide-related content. For this to work, parents must link their accounts to their children’s.
While this feature is already available in the United States and Britain, it has not yet been introduced in Japan.
Instagram will also soon introduce a feature to prevent teens up to age 17 from viewing posts containing drug-related content, extreme language such as threats, and dangerous acts such as shooting guns.
The platform already limits the display of posts when they contain sexual imagery or relate to alcohol or tobacco.
While social media allows for easy communication with friends and others, it has created concerns worldwide that it can lead to bullying and suicide.
In the United States, lawsuits have been filed against operating companies. In Australia, a law banning social media use by those under 16 took effect in December 2025. - The Japan News/ANN
