
Bloomberg News found one open-web paedophile forum hosting a guide, posted in October of last year, to generating fake child sex abuse material with Stable Diffusion, an image generation tool created by London-based billion-dollar startup Stability AI. — Photo by Philipp Katzenberger on Unsplash
Child predators are exploiting generative artificial intelligence technologies to share fake child sexual abuse material online and to trade tips on how to avoid detection, according to warnings from the National Center for Missing and Exploited Children and information seen by Bloomberg News.
In one example, users of a prominent child predation forum shared 68 sets of artificially generated images of child sexual abuse during the first four months of the year, according to Avi Jager, head of child safety and human exploitation at ActiveFence, a content moderation startup. The figure marks an increase from the 25 posts that ActiveFence observed during the final four months of 2022, said Jager, who declined to name the forum for safety reasons.
