'Fortnite' game-maker Epic's CEO slams Apple's iPhone child sexual abuse photos move, calls it 'spyware'


Sweeney said Apple's move on child safety initiatives would open the way for governments to conduct surveillance. — 123rf.com

Apple has revealed that it may scan users' iPhones for child abuse photos when they are uploaded to iCloud, in an attempt to wipe out this horrific online crime. While addressing this issue has been on the agenda of all tech majors, Apple's move has been blasted by others who say it is likely to cause more problems. The first to react was WhatsApp head Will Cathcart, followed by Tim Sweeney, CEO of Fortnite game-maker Epic Games.

What has Apple done? On Thursday, Apple announced a set of tools to reduce the spread of child sexual abuse material (CSAM). These introduce changes to iMessage, Siri and Search, and enable scanning of iCloud Photos for known CSAM imagery, thereby helping to protect children online.


