Apple to detect, report sexually explicit child photos on iPhone


Apple said it will detect abusive images by comparing photos with a database of known Child Sexual Abuse Material, or CSAM, provided by the National Center for Missing & Exploited Children (NCMEC). — Photo by VASANTH on Unsplash

Apple Inc said it will launch new software later this year that will analyse photos stored in a user’s iCloud Photos account for sexually explicit images of children and then report instances to relevant authorities.

As part of new safeguards involving children, the company also announced a feature that will analyse photos sent and received in the Messages app to or from children to see if they are explicit. Apple is also adding features to its Siri digital voice assistant to intervene when users search for related abusive material. The Cupertino, California-based technology giant previewed the three new features on Thursday and said they would be put into use later in 2021.
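The detection approach described above, matching a user's photos against a database of known CSAM fingerprints and reporting only when matches are found, can be sketched roughly as follows. This is an illustrative simplification, not Apple's implementation: the function and variable names are hypothetical, and a real system would use a perceptual hash (so visually similar images still match) and cryptographic safeguards, whereas this sketch uses an ordinary SHA-256 digest that only matches byte-identical files.

```python
import hashlib


def photo_fingerprint(photo_bytes: bytes) -> str:
    """Illustrative stand-in for a perceptual image hash.

    SHA-256 only matches byte-identical data; a production system
    would use a perceptual hash robust to resizing and re-encoding.
    """
    return hashlib.sha256(photo_bytes).hexdigest()


def scan_library(photos, known_hashes, threshold=1):
    """Count photos whose fingerprint appears in the known database.

    Returns (should_report, match_count). A report is triggered only
    when the number of matches reaches the threshold, mirroring the
    idea of acting on known material rather than novel images.
    """
    match_count = sum(
        1 for p in photos if photo_fingerprint(p) in known_hashes
    )
    return match_count >= threshold, match_count


# Hypothetical demo data, not real hashes or images.
known_hashes = {photo_fingerprint(b"flagged-image-bytes")}
library = [b"cat-photo-bytes", b"flagged-image-bytes", b"holiday-bytes"]

should_report, count = scan_library(library, known_hashes)
```

Because only fingerprints of already-catalogued material are compared, photos that do not appear in the database produce no match, which is the property such systems rely on to limit what is examined.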

