Computer-generated child sexual abuse images made with artificial intelligence tools like Stable Diffusion are starting to proliferate on the Internet and are so realistic that they can be indistinguishable from photographs depicting actual children, according to a new report. — AP
NEW YORK: The already alarming proliferation of child sexual abuse images on the Internet could become much worse if controls are not placed on artificial intelligence tools that generate deepfake photos, a watchdog agency warned on Oct 24.
In a written report, the UK-based Internet Watch Foundation urges governments and technology providers to act quickly before a flood of AI-generated images of child sexual abuse overwhelms law enforcement investigators and vastly expands the pool of potential victims.
