Apple to detect, report sexually explicit child photos on iPhone


Apple said it will detect abusive images by comparing photos with a database of known Child Sexual Abuse Material, or CSAM, provided by the US National Center for Missing & Exploited Children (NCMEC). Photo: VASANTH/Unsplash

Apple Inc said it will launch new software later this year that will analyse photos stored in a user’s iCloud Photos account for sexually explicit images of children and then report instances to relevant authorities.

As part of new safeguards involving children, the company also announced a feature that will analyse photos sent or received by children in the Messages app to determine whether they are explicit. Apple is also adding features to its Siri digital voice assistant to intervene when users search for related abusive material. The Cupertino, California-based technology giant previewed the three new features on Thursday and said they would be put into use later in 2021.
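The matching step described above can be pictured with a minimal sketch. This is purely illustrative: Apple's published design uses a perceptual hash (known as NeuralHash) matched on-device against a blinded version of the NCMEC database, not the exact cryptographic hash used here, and every name and value below is hypothetical.

    import hashlib
    from pathlib import Path

    # Hypothetical placeholder fingerprints; a real deployment would load
    # the hash set supplied by NCMEC rather than hard-coding entries.
    KNOWN_HASHES: set[str] = {
        "0" * 64,  # stand-in entry, not a real fingerprint
    }

    def fingerprint(image_path: Path) -> str:
        """Hex digest of the raw image bytes (a stand-in for a perceptual hash)."""
        return hashlib.sha256(image_path.read_bytes()).hexdigest()

    def flag_matches(upload_dir: Path) -> list[Path]:
        """Return images in upload_dir whose fingerprint appears in the known set."""
        return [p for p in sorted(upload_dir.glob("*.jpg"))
                if fingerprint(p) in KNOWN_HASHES]

A cryptographic digest like the one above changes completely if an image is resized or re-encoded, which is why the real system relies on a perceptual hash that tolerates such transformations while still matching known images.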
