Apple to check iCloud photo uploads for child abuse images


FILE PHOTO: The Apple logo is seen at an Apple Store, as Apple's new 5G iPhone 12 went on sale in Brooklyn, New York, U.S. October 23, 2020. REUTERS/Brendan McDermid

(Reuters) - Apple Inc on Thursday said it will implement a system that checks photos on iPhones in the United States before they are uploaded to its iCloud storage service to ensure the upload does not match known images of child sexual abuse.

Once enough matching uploads are detected to guard against false positives, the system will trigger a human review and a report of the user to law enforcement, Apple said. It said the system is designed to reduce false positives to one in one trillion.
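Apple has not published the details of its matching pipeline in this report. The sketch below is a minimal Python illustration of the general idea described above: hashing each photo before upload, comparing it against a list of hashes of known abuse images, and escalating to human review only after enough matches accumulate. All names (KNOWN_BAD_HASHES, REVIEW_THRESHOLD, scan_uploads) and the threshold value are hypothetical, and a plain cryptographic hash stands in for the perceptual hashing a real system would need so that resized or re-encoded copies still match.

```python
import hashlib
from pathlib import Path

# Hypothetical list of hashes of known abuse images, as supplied by a
# child-safety organisation. A real system would use perceptual hashes;
# SHA-256 of the raw bytes is used here only to keep the sketch simple.
KNOWN_BAD_HASHES: set[str] = set()

# Number of matches an account must accumulate before a human reviewer
# is involved. Illustrative value only, not Apple's actual threshold.
REVIEW_THRESHOLD = 30

def image_hash(path: Path) -> str:
    """Hash the raw bytes of an image file."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan_uploads(paths: list[Path]) -> bool:
    """Return True if enough uploads match the known-bad list to
    warrant human review of the account."""
    matches = sum(1 for p in paths if image_hash(p) in KNOWN_BAD_HASHES)
    return matches >= REVIEW_THRESHOLD
```

The threshold is what the article refers to as detection "sufficient to guard against false positives": a single chance match is not enough to flag an account for review.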
