Last year, deepfake pornographic images of Taylor Swift spread across the social media platform X. — AFP
IMAGE-based sexual abuse is the non-consensual creation and circulation of intimate images, including AI-assisted deepfake nudes and revenge porn.
> What are deepfake nudes?
Deepfake nudes or deepfake porn are explicit sexual images that have been digitally manipulated using AI-assisted technology.
Deepfake technology is evolving rapidly and is widely available online, making it possible to quickly generate nude photos and even pornographic videos.
Previously, the technology was used to cut-and-paste faces onto nude bodies, but with the rise of AI-powered “nudification” apps, perpetrators can “strip” a clothed person in their regular photos.
> What are some of the recent cases of image-based sexual abuse?
Last year, several male students at Issaquah High School in Washington, US, used an app to “strip” photos of girls who attended the school’s homecoming dance.
In November 2024, a group of Singapore Sports School students were caught and punished for creating and circulating deepfake nude images of their female schoolmates.
In August 2024, a Telegram channel with more than 220,000 members in South Korea was reportedly used to create and share AI-generated pornographic images.
In January 2024, AI-generated pornographic images of Taylor Swift circulated on social media, leading to significant public outrage and legislative action. The images were viewed over 45 million times before being taken down.
Malaysian singer Jaclyn Victor, 47, is filing a police report after a doctored image of her was circulated online, including on various social media platforms. Jaclyn is reportedly furious and expected to take legal action against the perpetrators.
In 2019, actor Zul Ariffin claimed that he was likely a victim of deepfake technology after a video of a man resembling him engaging in lewd acts spread online. This led him to temporarily deactivate his Instagram account.
> What can you do if you become the target of deepfake nudes?
The most important first step is to document evidence.
Take screenshots of the posts or videos, record the links or URLs, and save the messages and timestamps.
While many may rush to report the image or video as soon as possible to have it removed, recording as much evidence as possible is crucial, as the harmful content can be deleted, altered, or moved by the perpetrator, making it difficult to prove that the incident occurred.
Screenshots act as a timestamped record, ensuring that the evidence is not lost.
Platforms and authorities also often require concrete evidence when investigating cases of online harm. Having screenshots can strengthen your case and increase the likelihood of action being taken against the offender.
> How do you avoid becoming a deepfake nude victim?
With advanced AI tools becoming widely available and easier to use, anyone with an online presence is vulnerable, so it’s important to exercise caution when navigating the online world.
Limit who can see your posts through privacy settings and avoid sharing highly personal information such as full names or addresses.
Be wary of unfamiliar follower requests and suspicious behaviour on social media.
Be wary of overly friendly accounts, or accounts that are quick to give gifts or make offers that seem too good to be true.
> If you are a victim of image-based sexual abuse and you would like to seek help, you can:
<> File a police report with visual evidence of correspondence and photos.
<> Lodge a complaint with the Tribunal for Anti-Sexual Harassment.
This will enable authorities to start the investigation process.
<> You can also reach out to NGOs that provide a safe space and counselling for survivors, such as WAO via its hotline at 03-3000 8858 or by texting Tina at 018-988 8058.
The Knowledge and Rights with Young people through Safer Spaces (KRYSS) Network also has a publicly-accessible toolkit for online gender-based violence which can be accessed at kryss.network/resources/toolkits-and-guides.
Sources: The Straits Times/Asia News Network/Agencies