EXISTING SAFEGUARDS ARE NOT ENOUGH


In today’s hyperconnected era, some of the most significant threats to children enter through the glowing screens of their devices. — Filepic

As digital risks outpace awareness, Malaysia is intensifying efforts to build more robust safeguards for younger users

FOR decades, the primary lesson in child safety was simple: “Don’t talk to strangers”.

But in today’s hyperconnected era, that advice has become dangerously obsolete.

Today, the most significant threats to children often bypass the front door entirely, entering instead through the glowing screens of smartphones in the supposed sanctuary of a child’s bedroom.

As the government moves forward with the implementation of the Online Safety Act (ONSA) 2025, the conversation is shifting to a sobering reality: our current digital safeguards are largely reactive, functioning more like an ambulance at the bottom of a cliff than a fence at the top.

Evolution of grooming

Local non-governmental organisation Protect & Save The Children (P.S. The Children) highlighted that the digital environment has fundamentally shifted exploitation from physical access to sophisticated psychological manipulation.

Its executive director, Amnani A. Kadir, believes that the traditional image of a predator is being replaced by something far more insidious in the digital world.

Modern exploitation often scales harm through networks, facilitated by online channels that leverage the very peer-to-peer connections children value most.

“Predators now pose as peers or trusted individuals, building emotional connections before introducing grooming,” she said.

“Networked grooming is also a growing concern, where children who have been exposed or manipulated are used to recruit or bring in other children.”

Amnani (left) said stronger protections are needed to change how online harm is viewed, stressing that it should be recognised as real harm while ensuring children feel safe to speak up without fear. — Filepic

She pointed to a local case in which a 12-year-old girl, together with her friends, set up a website to sell nude images of herself, highlighting how exploitation is becoming peer-driven and normalised in digital spaces rather than remaining a matter of isolated incidents.

This shift in exploitation reflects a more dangerous trend where digital spaces act as echo chambers for harmful behaviour.

Unlike traditional physical threats, networked grooming allows a single predator to manipulate multiple victims simultaneously by turning them against one another or using them to lure more children.

As these behaviours become normalised within peer groups, children could lose the ability to distinguish between a healthy online friendship and a calculated attempt at exploitation.

“This makes ‘stranger danger’ outdated as the threat often comes from someone familiar, either online friends or peers.

“These same tactics are also used in ideological grooming, where trust and belonging are leveraged to influence and recruit youth. In both cases, children may be drawn into harming others without fully understanding it,” she added.

Architecture of exposure

The ONSA framework explicitly moves away from treating digital platforms as “neutral conduits of information”.

It recognises them as powerful systems that shape user behaviour and risk, and holds them responsible for moderating harmful content.

However, the architecture of social media itself serves as a primary entry point for harm.

Platform designs – specifically auto-recommendations and private messaging – proactively increase a minor’s exposure to strangers and manipulative interactions.

The danger of these design features lies in their ability to foster a false sense of intimacy.

When an algorithm repeatedly suggests a “friend” or pushes content that resonates with a child’s current emotional state, it creates a “digital familiarity” that predators are quick to exploit.

Private messaging serves as the final gateway, moving children away from the public eye into unmonitored digital territories where the psychological grooming can intensify without any external oversight from parents or platforms.

Amnani noted that such design functions as a powerful tool to access and engage children at scale: these everyday features allow strangers to initiate contact in ways that feel normal or harmless, and those interactions can later shape the recipient’s algorithmic feed.

“These everyday interactions can quickly build familiarity and trust, while algorithms expand visibility and connect children to wider, and sometimes unsafe, networks,” she said.

“Private messaging then enables these interactions to move into unmonitored spaces where manipulation can deepen.

“If adults, despite their greater life experience, are also frequently deceived and scammed by fake accounts and online tactics, children are even more susceptible to similar forms of exploitation.”

Reactive, not proactive

Current safety measures rely almost entirely on reporting harm after it has occurred, a strategy deemed ineffective by the ONSA.

This reactive approach places the burden on children who may not recognise subtle abuse or exploitation as it happens.

By the time a child realises the impact, they are often too afraid or ashamed to speak up.

This leads to an underreporting of incidents, with many children suffering in silence, fearing that speaking up will only lead to the loss of their digital lifeline – their devices.

“Many cases go unreported because they are often misunderstood as less serious than physical harm, assuming it is temporary or less damaging. There is a common belief that online risks can be easily managed by blocking, switching off or disconnecting,” said Amnani.

“However, this overlooks the reality that harm in digital spaces is deeply psychological and emotional, and it can take place even within the supposed safety of a child’s bedroom.”

She added that when the harm involves grooming, bullying or sexual content, feelings of shame and confusion are even stronger.

The reality is that online abuse can be persistent and invasive, following a child across platforms and lingering in their minds long after the screen is turned off.

Additionally, because digital investigations are slow to trace anonymous accounts or recover deleted evidence, the harm continues to spread even as systems respond.

“This can lead to anxiety, fear, low self-esteem, isolation, self-harm or harming others,” said Amnani.

“In reality, we are managing the aftermath when a child is already affected emotionally or psychologically.

“Stronger protections are needed to shift this mindset, recognising that online harm is real harm and to create safer environments where children feel supported to speak up without fear.”

Protection, not punishment

The proposal for an age limit under the ONSA, specifically for those under the age of 16, is a move toward a safety-by-design philosophy.

It recognises that platforms, which are built to maximise engagement and profit, are not designed with a child’s best interests in mind.

“Today’s children were born into the digital world. For many, online communities are not just for entertainment – they are spaces where they find identity, belonging and support, especially when these may be lacking offline,” said Amnani.

“But, expecting children – especially those under 16 – to navigate these environments safely on their own is unrealistic. Therefore, an age limit is a protective measure, not a punishment.

“It creates time and space for children to develop the maturity and skills needed to engage safely, while pushing platforms to take responsibility for safer designs.”

However, regulations are only one part of the solution, as the enactment of the ONSA requires the holistic effort of multiple parties.

Amnani asserted that parents and schools must integrate digital literacy with child safeguarding education, equipping children to recognise “tricky” adults, grooming behaviours and sexual innuendos while understanding the risks of peer recruitment and manipulation.

“Children need to be equipped to recognise how ordinary online interactions can be used to gain their trust and exploit them,” she said.

“At the same time, solutions must include safer, regulated digital spaces and strong education, so children are not cut off from connection, but are protected within it.”

Bridging this gap requires a fundamental shift in how we view the digital world, not just as a “playground”, but as a space that requires the same level of safeguarding as the schoolyard.

Therefore, the transition period under ONSA is an opportunity for families to relearn how to connect in the present.

By reclaiming family time and fostering open, empathetic conversations, we could pave the path for a future where a child’s identity is shaped by real-world support systems rather than the unpredictable algorithms of social media platforms.
