'Fortnite' game-maker Epic's CEO slams Apple's iPhone child sexual abuse photos move, calls it 'spyware'


Sweeney said Apple's move on child safety initiatives would open the way for governments to conduct surveillance. — 123rf.com

Apple has revealed that it may scan photos on users' iPhones for child sexual abuse imagery when they are uploaded to iCloud, in an attempt to stamp out this horrific online crime. While addressing the issue has been on the agenda of all major tech companies, Apple's move has been blasted by critics who say it could create broader privacy problems. The first to react was WhatsApp head Will Cathcart, followed by Fortnite game-maker Epic Games CEO Tim Sweeney.

What has Apple done? On Thursday, Apple announced a set of tools to curb the spread of child sexual abuse material (CSAM). The changes span iMessage, Siri and Search, and include scanning iCloud Photos uploads for known CSAM imagery to help protect children online.
