SINGAPORE (The Straits Times/ANN): With online platforms continuing to be fertile ground for youth self-radicalisation, experts have called for a whole-of-society approach to raising awareness of the problem.
This means roping in parents, teachers, peers, as well as the platforms themselves to strengthen safeguards and identify those at risk before they can harm themselves or others, said those who work with detainees here.
The use of digital platforms such as social media, online games and video-sharing sites has been a common thread among recent radicalisation cases in Singapore.
All eight Singaporeans dealt with under the Internal Security Act (ISA) between July 2024 and June 2025 were self-radicalised by extremist materials they encountered online, the Internal Security Department (ISD) said in its latest annual report on terrorism.
Dr Muhammad Mubarak Habib Mohamed, a religious teacher who counsels young detainees, noted the role of some gaming platforms in amplifying their users’ political and ideological stances in a “free space of so-called creativity”.
“Games that allow for the use of violent behaviour provide a signal to users that this kind of behaviour is okay, especially for youth who are seeking a sense of identity,” said Dr Mubarak, who is a secretariat member of the Religious Rehabilitation Group.
Roblox and Gorebox are sandbox games that let their users generate virtual worlds and avatars, with the latter game also known for its graphic depictions of violence.
In 2023, a 16-year-old was issued a restriction order (RO) for being self-radicalised by online ISIS propaganda. Among other things, he had joined multiple ISIS-themed Roblox servers, where the virtual game settings replicated physical ISIS conflict zones, such as those in Syria and Marawi city in the southern Philippines.
The youth regarded himself as an ISIS member in the game, and said his shooting of enemies in the virtual world was intended to mimic his desire to become a real-life member of the group, noted ISD.
Are platforms doing enough?
Following the latest ISA case in January, there was some debate about whether online platforms such as Roblox are doing enough to protect their younger users.
Razer chief executive officer Tan Min-Liang was among those who cautioned against “knee-jerk blame on games”, and said that such gaming platforms are neutral systems and not “extremist tools by design”.
“What this case really exposes is the gap in how young people access, interpret, and are supervised on online platforms,” he wrote in a LinkedIn post that called for shared responsibility in child safety. “Technology moves faster than digital literacy, parental oversight, and sometimes even regulation.”
The reality, however, is that Roblox has been criticised for allowing extremists to exploit its platform for radicalisation. Far-right groups, for instance, have used its virtual environments to display Nazi symbols and to lure users to extremist channels on messaging platforms like Discord.
In the US, the platform has been sued by states such as Kentucky, Louisiana, Nebraska and Texas over harmful content and weak moderation. In response, it has rolled out measures such as parental controls and chat blocks with unknown adults, and has banned off-platform links and images.
Dr Jolene Jerard, who is executive director of public safety and management consultancy firm Centinel, said tech platforms cannot be absolved of the responsibility to protect their users, though many have strengthened their guard rails.
These protections remain patchy, however.
While larger companies may have moderation teams and AI tools to detect and remove extremist content, smaller platforms often lack the same capacity, said Dr Jolene, who is also an adjunct senior fellow at the S. Rajaratnam School of International Studies.
Apart from games and social platforms, experts also flagged the use of artificial intelligence chatbots as another radicalisation enabler.
In September 2024, a 17-year-old youth who was weeks away from carrying out a knife attack against non-Muslims in Tampines was detained. Investigations showed he had used an AI chatbot to generate a pledge of allegiance to ISIS, as well as a declaration of armed jihad against non-Muslims.
In another case, a youth influenced by far-right ideology used an AI chatbot to find instructions to produce ammunition, and considered 3D-printing his own firearms.
Dr Joachim Lee Tai Loong, a senior principal psychotherapist and a counsellor who works with radicalised youth, said AI technology can facilitate the self-radicalisation process by enabling a young person to develop a “mishmash” of extreme viewpoints.
Chatbots’ eagerness to agree and to appeal to one’s emotions could also result in a feedback loop that reinforces extreme views, he added.
“This is how radicalised thinking comes about – from uncertainty and chaos in their thinking, people are given a ‘clear’ way of thought,” he said. “It gives them a false sense of security.”
Given that these technologies are not going away, experts said the general public should be taught to recognise signs of radicalisation among their friends and family.
For instance, the Ministry of Education trains school counsellors and student welfare officers to look out for signs of radicalisation, such as a person expressing support for terrorist groups or making hateful remarks about other communities.
As extremist groups often look for ways to evade detection, such as by using coded content on gaming platforms, Dr Jolene said it is also important for youth themselves to have a sense of responsibility and resilience within such spaces.
“The individual also has a part to play, and the only way it can be done is with sufficient public education in terms of being able to identify extremist content and knowing who to flag it to – be it to their friends, teacher or the platform,” she said.
Dr Mubarak said protecting youth from extremism in this age requires a holistic approach.
“Tech platforms, government, parents, teachers – everyone has a role to play,” he said. - The Straits Times/Asia News Network
