Key Takeaways:
- MeitY has issued an advisory demanding immediate compliance with India's online content rules, warning of prosecution for non-compliance.
- Intermediaries must follow Section 79 of the IT Act and the IT Rules, 2021, removing unlawful content within prescribed timelines.
- Platforms are required to review internal compliance frameworks, content moderation and user enforcement mechanisms.
- Failure to act may invite prosecution under the IT Act and other applicable criminal laws.
New Delhi — The Ministry of Electronics and Information Technology (MeitY) has issued a stern advisory to online platforms, chiefly social media firms, ordering immediate action against obscene, pornographic, paedophilic and otherwise unlawful content hosted on their services.
Dated 29 December 2025, the advisory makes clear that intermediaries must redouble efforts to comply with the Information Technology Act and the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. MeitY warned that platforms failing to meet their due diligence obligations may face prosecution under the IT Act and other applicable criminal laws.
India's online content rules and platform obligations
The advisory reminded firms that Section 79 of the IT Act makes intermediaries' exemption from liability conditional on the observance of due diligence. MeitY said it has observed inconsistency in how platforms identify, report and remove content that is obscene, indecent, vulgar, sexual, paedophilic or harmful to children.
Under the IT Rules, intermediaries are required to remove or disable access to content that is prima facie sexual in nature or otherwise unlawful within strict timelines. Where an affected individual files a complaint alleging an image or video depicts them in a sexual act, platforms must take action within 24 hours of receiving the complaint.
MeitY specifically instructed intermediaries to act expeditiously to remove unlawful content upon receipt of actual knowledge, a court order, or a reasoned intimation from an authorised agency of the government. Platforms were also asked to immediately review internal compliance frameworks, content moderation practices and user enforcement mechanisms to ensure continuous adherence to legal obligations.
The advisory stressed that intermediaries must not permit the hosting, displaying, uploading, publishing, transmission, storage, sharing or updating of content that is obscene, pornographic, sexually explicit, paedophilic, harmful to children or otherwise prohibited under law. Non-compliance, it said, may lead to consequences including prosecution under the IT Act and other criminal provisions.
Industry sources said the advisory is likely to prompt platforms to tighten automated detection systems and expand moderation teams to meet the prescribed timelines. Legal specialists noted that while the rules impose operational burdens on global platforms, they are designed to protect vulnerable users and uphold statutory obligations.
MeitY’s move follows ongoing concerns about the speed and consistency with which intermediaries address flagged content. The ministry’s message to firms is clear: compliance is not optional. Platforms that wish to retain the intermediary shield under Indian law must demonstrate robust processes for identifying, reporting and promptly removing unlawful material.
As platforms review their policies, the government has emphasised cooperation with authorised agencies and the need for transparent mechanisms to process complaints. Observers say this could accelerate investments in moderation technology and partnerships with child protection and civil-society organisations.
For users and affected individuals, the advisory underlines the availability of legal recourse and the expectation that intermediaries respond quickly to remove harmful content. For platforms, the deadline is immediate: strengthen compliance now or face the legal consequences set out under the IT Act and the IT Rules, 2021.