After criticism, Apple to only seek abuse images flagged in multiple nations


FILE PHOTO: The Apple Inc logo is seen hanging at the entrance to the Apple store on 5th Avenue in Manhattan, New York, U.S., October 16, 2019. REUTERS/Mike Segar

(Reuters) - After a week of criticism over its planned new system for detecting images of child sex abuse, Apple Inc said on Friday that it will hunt only for pictures that have been flagged by clearinghouses in multiple countries.

That shift, along with others intended to reassure privacy advocates, was detailed to reporters in an unprecedented fourth background briefing since the initial announcement, eight days earlier, of a plan to monitor customer devices.

