Facebook removes 8.7 million sexual photos of kids in last three months


FILE PHOTO: A Facebook page is displayed on a computer screen in Brussels, Belgium, April 21, 2010. REUTERS/Thierry Roge/File Photo

SAN FRANCISCO: Facebook Inc said on Oct 24 that its moderators removed 8.7 million user images of child nudity during the last quarter, aided by previously undisclosed software that automatically flags such photos.

The machine learning tool, rolled out over the last year, identifies images that contain both nudity and a child, allowing increased enforcement of Facebook's ban on photos that show minors in a sexualised context.

