Key Takeaways:
- Government issues a fresh advisory to digital platforms in India, requiring proactive moderation of obscene and illegal material.
- Intermediaries must remove or block prohibited content on formal notice or face action under the Information Technology Act and Bharatiya Nyaya Sanhita, 2023.
- Ministry asks platforms to review moderation systems and warns repeated violations could lead to prosecution and other penalties.
- Grievance Appellate Committee recruitment and pay details underline expanded regulatory infrastructure for digital complaints.
The Indian government has issued a fresh advisory to digital intermediaries, including major social media companies, warning that failure to address obscene and unlawful content may invite legal action. The Ministry of Electronics and Information Technology told platforms to act proactively to prevent users from hosting or sharing material that is obscene, sexually explicit, paedophilic, harmful to children or otherwise illegal.
India digital platforms advisory demands faster action and stronger systems
According to the ministry, intermediaries must remove or block access to prohibited content promptly once they receive formal notice via court orders or government directions. The advisory reiterates obligations under the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 and warns that non-compliance could attract penalties under the Information Technology Act and provisions of the Bharatiya Nyaya Sanhita, 2023.
The communication asks companies to review their internal compliance and content moderation systems. Officials emphasised that repeated violations could lead to prosecution, signalling a firmer approach to enforcement as the government tightens oversight of the digital ecosystem.
Industry observers say the advisory is aligned with recent regulatory activity aimed at increasing accountability for online platforms. Earlier this month the ministry invited applications for whole-time members of the Grievance Appellate Committee, a quasi-judicial body that hears appeals against decisions taken by grievance officers of intermediaries, including social media companies. Users who are dissatisfied with the decisions of an intermediary’s grievance officer can appeal to the committee for review.
The ministry disclosed remuneration for whole-time members, who will receive consolidated monthly pay of 175,000 rupees and an allowance of 75,000 rupees in lieu of housing and transport. The recruitment drive highlights the government's intent to bolster the institutional framework that oversees complaints and appeals arising from intermediary decisions.
Legal analysts note the advisory underscores two parallel objectives: protecting children and vulnerable users from harmful content, and ensuring platforms meet due diligence obligations under existing rules. Platforms are required to appoint grievance officers, set up redressal mechanisms and prevent the spread of prohibited content. Failure to comply with these duties could lead to actions that range from monetary penalties to criminal prosecution.
Representatives of some technology companies have previously argued for greater clarity and predictable procedures in enforcement. The ministry’s advisory reiterates the need for proactive measures, while formal notices and court orders will continue to trigger takedown obligations. For platforms, the advisory is a reminder to strengthen automated and human moderation capabilities, improve complaint-handling timelines and maintain transparent records of actions taken.
For users, the strengthened oversight aims to deliver quicker redress when illegal or harmful content appears online. The Grievance Appellate Committee is positioned to act as a secondary review mechanism, giving affected users an option beyond the intermediary’s internal grievance officer.
The advisory comes as governments worldwide balance digital freedoms with responsibilities for safety and legality. In India, the renewed emphasis on enforcement marks a step towards more active oversight of online spaces, with potential implications for how platforms moderate content, comply with legal requests and manage appeals from dissatisfied users.