Study shows AI image-generators being trained on explicit photos of children


Students walk on the Stanford University campus in Stanford, Calif. — AP

Hidden inside the foundation of popular artificial intelligence image-generators are thousands of images of child sexual abuse, according to a new report from the Stanford Internet Observatory that urges technology companies to take action to address a harmful flaw in the technology they built.

Those same images have made it easier for AI systems to produce realistic and explicit imagery of fake children, as well as to transform social media photos of fully clothed real teens into nudes, much to the alarm of schools and law enforcement around the world.
