MORE than 10,000 people sought government assistance for digital sex crimes in 2023, marking the highest annual total since the establishment of the Digital Sex Crime Victim Support Centre in 2018.
Most victims were in their teens and 20s, with a sharp surge in the abuse of artificial intelligence (AI) to generate deepfakes driving the increase.
According to the digital sex crime report from South Korea’s Ministry of Gender Equality and Family, a total of 10,305 individuals received support in 2023 – a 14.7% rise from the previous year – in the form of counselling, assistance with content removal and referrals for legal, medical and investigative aid.
The number of illegal materials removed also exceeded 300,000 for the first time, up 22.3% year on year.
Teenagers accounted for 27.9% of the victims, up from 17.8% in 2022, while victims in their 20s made up 50.2%, rising steeply from 18.2%.
The authorities believe the actual number of teenage victims may be higher due to under-reporting.
“Teens are especially vulnerable as they are frequent users of social media and digital platforms,” a ministry official said.
The number of cases involving synthetic media abuse – including deepfake pornography – jumped dramatically.
In 2023, 1,384 such cases were reported, up 227.2% from 423 the previous year.
The ministry expressed concerns over the growing accessibility of AI tools that can create explicit synthetic content, including deepfakes.
Officials warned that as AI systems become more personalised through data accumulation, the scale and sophistication of digital sex crimes are likely to grow. — The Korea Herald/ANN
