PETALING JAYA: The best interests of users, particularly children, will guide any future regulatory considerations on the use of beauty filters and digital alteration tools in livestream selling, says the Malaysian Communications and Multimedia Commission (MCMC).
In a reply to The Star, the MCMC said the Online Safety Act 2025 was already designed to strengthen online safety in Malaysia by regulating harmful content and setting out clear duties and responsibilities for licensed service providers.
Amid growing concerns about the rise of beauty filters in livestream sales, the commission said platforms must conduct risk assessments to identify potential online safety risks arising from their features and content practices.
“These include obligations to reduce the risk of users being exposed to harmful content, ensure the safe use of services by child users, and put in place appropriate safeguards through their Online Safety Plans.
“In this context, the commission expects platform providers to ensure their service design, recommendation systems and features do not expose users, particularly children, to content that may be misleading, exploitative or detrimental to their well-being,” it said.
The MCMC added that any further regulatory considerations would be guided by a balanced, proportionate and technology-neutral approach, taking into account platform accountability, consumer protection, commercial transparency and the best interests of child users.
“We will continue to work with relevant stakeholders, including platform providers, industry players and relevant authorities to ensure Malaysia’s online ecosystem remains safe, transparent and responsible,” it said.
On Jan 1, China implemented a ban on the excessive use of beauty filters in the livestreaming industry to curb online deception and address psychological harm linked to unrealistic portrayals.
However, cybersecurity specialist Fong Choong Fook said Malaysia should not rush to adopt a similar blanket ban, noting that widespread abuse of beauty filters has not been observed locally.
He said China’s move was driven by the scale of filter use among streamers there, where digital enhancements are frequently used to market products.
“China’s ban may appear excessive, but it reflects the scale of the issue there. I do not think it is a major concern outside China, as streamers elsewhere are not widely abusing filters for deception,” he said.
Fong added that while beauty filters can sometimes be detected during livestreams through visible distortions when a person moves, the technology is evolving.
“At present, there is no clear cybersecurity framework specifically designed to detect beauty filters, although distortions during movement are often seen as signs of digital alteration,” he said.
He noted that apart from China, no other country has introduced specific regulations targeting beautification tools, though Malaysia already has laws addressing online scams and fraudulent practices.
Be My Protector vice-chairman Prof Dr Isdawati Ismail said concerns over digital identity and filtered appearances must also be viewed in the broader context of how much time children now spend online.
She said many young users are increasingly exposed to social media content designed primarily for entertainment and engagement rather than healthy development.
“Children today are spending a significant amount of time online, and excessive smartphone use is already affecting sleep patterns, attention span, emotional regulation and real-world social interaction,” she said.
At the same time, she stressed that children are rights-holders who cannot simply be excluded from the digital environment, which has become essential for education, communication and access to services.
Isdawati said the challenge is therefore not whether children should be protected online, but how to do so responsibly through stronger digital literacy, better platform accountability, privacy protection and continuous monitoring of the impact on young users.
She cautioned that blanket age bans on social media may offer temporary control but risk oversimplifying a complex issue if not paired with proper safeguards.
“The goal should not be unlimited digital freedom or total restriction, but structured, age-appropriate and developmentally sound digital engagement,” she added.
She said protecting children from harm while preserving their rights should be seen as a duty of care shared by parents, educators, policymakers and digital platforms.
