Predators exploit AI tools to generate images of child abuse


Bloomberg News found one open-web paedophile forum hosting a guide, posted in October last year, to generating fake child sexual abuse material with Stable Diffusion, an image-generation tool created by London-based billion-dollar startup Stability AI. — Photo by Philipp Katzenberger on Unsplash

Child predators are exploiting generative artificial intelligence technologies to share fake child sexual abuse material online and to trade tips on how to avoid detection, according to warnings from the National Center for Missing and Exploited Children and information seen by Bloomberg News.

In one example, users of a prominent child predation forum shared 68 sets of artificially generated images of child sexual abuse during the first four months of the year, according to Avi Jager, head of child safety and human exploitation at ActiveFence, a content moderation startup. The figure marks an increase from the 25 posts that ActiveFence observed during the final four months of 2022, said Jager, who declined to name the forum for safety purposes.
