AI-generated child sexual abuse images could flood the Internet. A watchdog is calling for action


Computer-generated child sexual abuse images made with artificial intelligence tools like Stable Diffusion are starting to proliferate on the Internet and are so realistic that they can be indistinguishable from photographs depicting actual children, according to a new report. — AP

NEW YORK: The already-alarming proliferation of child sexual abuse images on the Internet could become much worse if something is not done to put controls on artificial intelligence tools that generate deepfake photos, a watchdog agency warned on Oct 24.

In a written report, the UK-based Internet Watch Foundation urges governments and technology providers to act quickly before a flood of AI-generated images of child sexual abuse overwhelms law enforcement investigators and vastly expands the pool of potential victims.
