Facebook removes 8.7 million sexual photos of kids in last three months


FILE PHOTO: A Facebook page is displayed on a computer screen in Brussels, Belgium, April 21, 2010. REUTERS/Thierry Roge/File Photo

SAN FRANCISCO: Facebook Inc said on Oct 24 that company moderators during the last quarter removed 8.7 million user images of child nudity with the help of previously undisclosed software that automatically flags such photos.

The machine learning tool, rolled out over the last year, identifies images that contain both nudity and a child, allowing increased enforcement of Facebook's ban on photos that show minors in a sexualised context.
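The article describes the tool only at a high level: an image is flagged when it appears to contain both nudity and a minor. As a rough illustration of that kind of two-signal gating logic (this is not Facebook's actual system; the function names, thresholds and stub classifiers below are hypothetical), a sketch in Python might look like this:

# Hypothetical sketch of two-signal flagging, not Facebook's implementation.
# The scoring functions are stubs standing in for real ML classifiers.
from dataclasses import dataclass
from typing import Optional

NUDITY_THRESHOLD = 0.8  # illustrative cutoff, not a known Facebook value
MINOR_THRESHOLD = 0.8   # illustrative cutoff, not a known Facebook value

@dataclass
class Flag:
    image_id: str
    nudity_score: float
    minor_score: float

def score_nudity(image_bytes: bytes) -> float:
    """Stub for a nudity classifier; a real system would run an ML model here."""
    return 0.0

def score_minor(image_bytes: bytes) -> float:
    """Stub for a child/minor detector; a real system would run an ML model here."""
    return 0.0

def flag_for_review(image_id: str, image_bytes: bytes) -> Optional[Flag]:
    """Flag an image only when BOTH signals clear their thresholds, mirroring the
    article's description of photos that contain nudity AND a child."""
    n = score_nudity(image_bytes)
    m = score_minor(image_bytes)
    if n >= NUDITY_THRESHOLD and m >= MINOR_THRESHOLD:
        return Flag(image_id, n, m)
    return None

In a pipeline built this way, an image that returns a Flag would be routed to human moderators rather than removed automatically; the conjunction of the two signals is the key design point the article attributes to the tool.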



