Apple to detect, report sexually explicit child photos on iPhone


Apple said it will detect abusive images by comparing photos with a database of known Child Sexual Abuse Material, or CSAM, provided by the National Center for Missing & Exploited Children (NCMEC). — Photo by VASANTH on Unsplash

Apple Inc said it will launch new software later this year that will analyse photos stored in a user’s iCloud Photos account for sexually explicit images of children and then report instances to relevant authorities.

As part of new safeguards involving children, the company also announced a feature that will analyse photos sent to or from children in the Messages app to determine whether the images are explicit. Apple is also adding features to its Siri digital voice assistant that will intervene when users search for related abusive material. The Cupertino, California-based technology giant previewed the three new features on Thursday and said they would be put into use later in 2021.
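Apple did not publish implementation details in the announcement, but the iCloud Photos detection it describes is, at its core, a comparison of image fingerprints against a database of known material. The following Swift snippet is a rough, hypothetical sketch of that general idea only, not Apple's actual system (which relies on a proprietary perceptual hash and on-device cryptographic matching); all function and variable names here are illustrative:

```swift
import Foundation
import CryptoKit

// Hypothetical sketch of hash-based matching against a known-image database.
// SHA-256 is used here purely for brevity: a cryptographic hash only matches
// byte-identical files, whereas production systems use perceptual hashes.

/// Placeholder for hashes of known images, as they might be supplied
/// by a clearinghouse such as NCMEC. Empty here; purely illustrative.
func loadKnownHashes() -> Set<String> {
    return []
}

let knownImageHashes: Set<String> = loadKnownHashes()

/// Computes a hex-encoded SHA-256 digest of the file at `url`.
func hashForImage(at url: URL) throws -> String {
    let data = try Data(contentsOf: url)
    let digest = SHA256.hash(data: data)
    return digest.map { String(format: "%02x", $0) }.joined()
}

/// Returns true if the image's fingerprint appears in the known-hash set.
func isKnownImage(at url: URL) -> Bool {
    guard let hash = try? hashForImage(at: url) else { return false }
    return knownImageHashes.contains(hash)
}
```

A real deployment must tolerate re-encoded, resized or lightly edited copies, which an exact hash like SHA-256 cannot; perceptual hashing is used instead so that visually identical images map to matching fingerprints.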
