Apple to detect, report sexually explicit child photos on iPhone


Apple said it will detect abusive images by comparing photos with a database of known Child Sexual Abuse Material, or CSAM, provided by the US National Center for Missing and Exploited Children (NCMEC). — Photo by VASANTH on Unsplash

Apple Inc said it will launch new software later this year that will analyse photos stored in a user’s iCloud Photos account for sexually explicit images of children and then report instances to relevant authorities.

As part of new safeguards involving children, the company also announced a feature that will analyse photos sent and received in the Messages app to or from children to see if they are explicit. Apple is also adding features to its Siri digital voice assistant to intervene when users search for related abusive material. The Cupertino, California-based technology giant previewed the three new features on Thursday and said they would be put into use later in 2021.
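Conceptually, the photo-matching step works like a fingerprint lookup: each image is reduced to a compact hash and checked against a database of hashes of known CSAM. The sketch below is a deliberately simplified illustration in Python, using an exact SHA-256 digest and a made-up hash database; Apple's actual system relies on a perceptual hashing scheme (one that tolerates resizing and recompression) combined with on-device cryptographic protocols, none of which are reproduced here.

```python
import hashlib
from pathlib import Path

# Hypothetical database of fingerprints of known images.
# In the system described in the article, the fingerprints are
# perceptual-hash values derived from NCMEC's database, not
# plain SHA-256 digests of file bytes as shown here.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(image_path: Path) -> str:
    """Return a hex digest of the raw image bytes (illustrative only)."""
    return hashlib.sha256(image_path.read_bytes()).hexdigest()

def is_known_match(image_path: Path) -> bool:
    """Flag the image if its fingerprint appears in the known-hash set."""
    return fingerprint(image_path) in KNOWN_HASHES

if __name__ == "__main__":
    # Scan a hypothetical local photo library for matches.
    for path in Path("photos").glob("*.jpg"):
        if is_known_match(path):
            print(f"match: {path}")  # would be escalated for review
```

A real matcher of this kind tolerates near-duplicates rather than requiring byte-identical files, which is why the exact-digest check above is only illustrative of the lookup, not of the hashing itself.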
