Malaysia’s Online Safety Act 2025: a proactive approach to safeguarding online users
Online safety has recently emerged as a defining public-interest issue in Malaysia’s digital age.
What was once a technical or niche concern has turned into a matter that involves everyday life – from how children learn and socialise online to how families manage their finances and how citizens participate in public debate.
As Malaysians spend more time on digital platforms, the risks associated with online spaces have grown in scale, speed and complexity.
Last year, between January and November alone, Malaysians reported losses of RM2.7bil to online scams.
Since 2022, 38,470 items of cyberbullying and online harassment content have been taken down.
During Op Pedo 2.0, the Royal Malaysia Police and the Malaysian Communications and Multimedia Commission (MCMC) seized 880,000 files of Child Sexual Abuse Material.
These figures reflect not just isolated wrongdoing but systemic problems in how digital platforms manage risk, content and user safety.
Online harm now affects ordinary users, families and communities directly.
This raises legitimate questions about whether Malaysia’s existing laws – many written before social media and algorithmic platforms became commonplace – are adequate to address harm at scale, while preserving civil liberties.
It is against this backdrop that the Online Safety Act 2025 (ONSA) was introduced.
What ONSA covers
Passed by Parliament in December 2024, the ONSA was gazetted on May 22 last year to complement and augment the current Communications and Multimedia Act 1998.
Its purpose is to create a dedicated legal framework focused specifically on online safety risks.
The act is not designed to police individual speech or monitor users but to regulate systems, processes and governance structures within digital platforms.
The aim is to correct structural failures in platform design and risk management that allow harmful content to spread rapidly, persist and be repeatedly amplified by algorithms optimised for engagement rather than safety.
The act represents a shift from reactive enforcement towards preventive regulation.
It seeks to reduce the likelihood and impact of harm before it occurs rather than relying solely on takedowns, prosecutions or complaints after damage has already been done.
Responsible parties
The act applies to application service providers and content applications service providers.
This includes both domestic and foreign platforms that operate in or target the Malaysian market.
In practice, it covers social media platforms, content sharing services, online marketplaces and other digital services that host, distribute or recommend user-generated content.
By applying uniformly across platforms regardless of where they are based, the act aims to prevent regulatory arbitrage, where companies avoid responsibility by operating across borders.
Any platform benefiting from Malaysian users and the Malaysian digital economy is expected to meet equivalent safety obligations.
Addressing the harms
The act focuses on categories of harmful content that present significant risks to individuals and society.
These include child sexual abuse material, online scams and financial fraud, pornographic and obscene content, harassment and threats, content promoting violence or violent extremism, material encouraging self-harm (particularly among children), content inciting hatred or public disorder, and content related to dangerous drugs.
Rather than treating all content as equally risky, the act adopts a risk-based approach. Platforms are expected to assess which harms are most likely to arise within their services, who is most vulnerable to them and how their own design choices contribute to exposure and amplification.
Scope of the ONSA
A central concern in any form of online regulation is whether it becomes a tool for censorship or surveillance.
The act therefore draws clear boundaries around its scope.
It establishes enforceable safety duties for platforms, requires them to identify and mitigate risks, introduces transparency and accountability obligations and prioritises protection for children and other vulnerable users.
It does not regulate one-to-one private messaging, authorise mass surveillance, criminalise lawful expression or target legitimate online activity.
These limitations are built into the framework to ensure that online safety regulation strengthens trust rather than undermining fundamental rights, such as privacy and freedom of expression.
Role of platforms
Under the act, licensed platforms must take proactive steps to reduce users’ exposure to harmful content.
This includes prioritising making harmful content inaccessible, providing clear and effective reporting mechanisms, offering user assistance and safety guidance, and enabling user controls over interactions and content exposure.
One of the most significant requirements is the obligation to prepare and submit an Online Safety Plan to the MCMC.
The plan sets out how a platform identifies risks, what mitigation measures it uses, how it monitors their effectiveness and how it responds to emerging threats.
Rather than a box-ticking exercise, the plan functions as a central accountability tool for regulators and the public.
It allows scrutiny of whether a platform’s safety claims are matched by meaningful action and measurable outcomes.
Child protection at the core
Central to the act is child safety, as children are among the most vulnerable users online.
Many digital platforms were designed without children’s developmental, psychological or safety needs in mind.
The act’s child-protection measures are intended to embed safety into platform architecture rather than placing the burden entirely on parents, schools or children themselves.
The act recognises that design choices shape behaviour and exposure, and that safety must be built in, not added on.
Oversight and safeguards
To prevent misuse of regulatory power, the act incorporates governance safeguards.
Enforcement directions must be preceded by written notice; platforms have the opportunity to make representations; and a public Register of Directions will ensure transparency.
Decisions can be appealed to the Online Safety Appeal Tribunal and are subject to judicial review before the High Court.
These mechanisms ensure that enforcement is proportionate, accountable and open to independent scrutiny.
Crucial move for protection
The ONSA reflects a broader shift in how societies think about digital platforms.
These are no longer seen just as neutral conduits for information but as powerful systems that shape visibility, behaviour and risk through their design and algorithms.
By focusing on systems rather than speech, on prevention rather than punishment and on accountability rather than control, the act intends to strike a balance between safety and freedom.
It aims to protect users without limiting innovation or legitimate expression.
For Malaysia’s digital ecosystem to remain open, innovative and trustworthy, online safety must be governed in a way that is firm, fair and rights-respecting.
The act represents an attempt to move beyond ad hoc responses to scandals and crises towards a more coherent, principled framework, one that holds powerful platforms accountable and protects children and vulnerable communities.
It also ensures that the digital spaces Malaysians rely on every day are designed with care, responsibility and respect for fundamental rights.
