Malaysia’s Online Safety Act came into force today, imposing new duties on internet platforms to protect children and embedding safety-by-design measures rather than relying solely on parental supervision. The law follows a marked rise in online harms, with authorities and child protection groups warning that the most vulnerable young people are disproportionately affected.
Malaysia online child safety measures target design and behaviour
The measures require platforms to set stricter default privacy and safety settings for child users, adopt age‑appropriate search, recommendation and content discovery systems, and restrict high‑risk interactions between adults and children. Regulators say these design changes are intended to reduce children’s exposure to sexual content, harassment and scams before harm occurs.
Shamir Rajadurai, a crime prevention specialist with Prevent Crime Now and AntiBuli.My, said children aged roughly nine to 15 are especially vulnerable because they are old enough to use digital services but too young to recognise complex online risks. He added that girls and young women, refugee and migrant children, low‑income (B40) families and children with disabilities face heightened danger.
The Malaysian Communications and Multimedia Commission (MCMC) has flagged fraud and scams, cyberbullying and online harassment, and child sexual abuse material (CSAM) as the primary categories of online harm. Between January and November 2025, Malaysians lost about RM2.7 billion to scams. During operations against CSAM, police and MCMC seized nearly 900,000 files, highlighting the scale of the problem.
Madeleine Yong, founder of Protect and Save the Children, warned that sharing of sexual content is one of the most serious threats children face online. She said UNICEF Malaysia now considers CSAM the top online risk for children, overtaking cyberbullying.
Data from the Internet Watch Foundation cited by authorities shows Malaysia recorded 12,656 CSAM reports from January to June 2025, a figure described as a substantial increase over previous years. Paediatrician Sasha Mohan told local media that studies indicate one in four Malaysian children has been exposed to sexual or disturbing content online, often without seeking it out.
“This implies that roughly 100,000 children in Malaysia may experience online sexual exploitation each year,” Sasha said, though she cautioned the true figure is likely higher because many incidents are never disclosed or recognised by adults.
Officials say recent laws provide a clearer legal basis to compel platforms to remove harmful content, but gaps in enforcement persist. Shamir noted that while the new rules and anti‑bullying legislation have improved public understanding of online safety, action on reported posts can be slow and some definitions in policy remain broad or vague.
MCMC previously reported that 92 per cent of harmful posts identified were removed, but that still left more than 58,000 posts related to fraud, bullying and CSAM online. Civil society groups and experts are urging platforms to speed up takedowns, refine detection and reporting processes, and invest in child‑centred design.
Child protection advocates also call for wider public education campaigns, improved reporting channels for caregivers and schools, and stronger cross‑agency cooperation to ensure the law translates into real reductions in harm. For now, the new legislation represents a significant step in Malaysia’s efforts to shield children from the most harmful consequences of a rapidly digitalising society.
Key Takeaways:
- Online child safety in Malaysia becomes statutory as the Online Safety Act comes into force, imposing design obligations on platforms to protect children.
- Experts warn children aged nine to 15, girls, migrant and B40 children, and those with disabilities face the highest online risks.
- Authorities reported a sharp rise in child sexual abuse material (CSAM) and large-scale seizures, prompting stricter rules and calls for faster enforcement.
- Advocates urge platforms to act more swiftly and for better public awareness and reporting to tackle under‑reported exploitation.