Treat deepfaked child abuse images as real crimes, urges MCA info chief


The recent rise in child sexual abuse material (CSAM) generated by artificial intelligence (AI) demands our attention. We must treat this issue with the seriousness it deserves and recognise that it poses a very real danger. While AI has brought about many advancements, its misuse in generating synthetic, hyper-realistic CSAM is enabling new forms of exploitation that are harder to detect and more damaging than ever before. When a child’s image is manipulated into sexual content, it is an act of violation.

When the use of AI-generated images becomes normalised, it creates the dangerous illusion that “it is not real, so it is no big deal.” AI-generated abuse content is not a harmless fantasy. It is part of a wider issue that normalises child exploitation. Furthermore, under the Sexual Offences Against Children Act 2017, it is a crime to produce, possess, or distribute child sexual abuse material using any means, including artificial intelligence. If the depiction is sexual and involves a child, whether real, edited or computer-generated, an offence has been committed.
