What Is Section 79 of the IT Act and Why It Matters for Content Removal
Section 79 of the Information Technology Act, 2000 provides that an intermediary — any entity that stores or transmits content on behalf of third parties, including Google, Facebook, YouTube, Twitter/X, and Indian news portals — is not liable for third-party content hosted on its platform, provided it satisfies certain statutory conditions. This protection is the legal foundation upon which every major platform operates in India, and understanding its precise terms is essential to any content removal strategy.
This protection is called "safe harbour." It is not a blanket immunity — it is a conditional protection that evaporates the moment the platform fails to meet its statutory obligations. Section 79(3)(b) states explicitly that the safe harbour is not available where the intermediary, upon receiving actual knowledge that the hosted content is unlawful, fails to expeditiously remove or disable access to that content. A valid legal notice creates that actual knowledge.
The practical consequence of this structure is that the safe harbour becomes legally untenable the moment a platform receives proper notice of unlawful content and does not act. A platform that ignores a formally addressed advocate notice invoking Section 79 is no longer operating as a neutral conduit — it has transitioned into the legal position of a knowing publisher of unlawful material, with corresponding civil and potentially criminal liability for that content.
The definition of "intermediary" under Section 2(1)(w) of the IT Act is deliberately broad. It covers telecom service providers, network service providers, internet service providers, web-hosting service providers, search engines, online payment and auction sites, online marketplaces, and any other entity that receives, stores, transmits, or provides any service with respect to electronic records on behalf of another person. Every major platform operating in India — domestic or foreign — falls within this definition.
The Shreya Singhal Judgment and Its Impact on Section 79 Notices
The Supreme Court judgment in Shreya Singhal v. Union of India (2015) struck down Section 66A of the IT Act as unconstitutional but significantly clarified the reading of Section 79. The Court held that Section 79(3)(b) — which removes safe harbour when a platform fails to act on actual knowledge — must be read narrowly: "actual knowledge" in the context of Section 79(3)(b) arises upon receipt of a court order or a government notification, not from a private individual complaint alone.
This reading departed from how Section 79 had been applied before Shreya Singhal, when many advocates treated private legal notices as sufficient to strip safe harbour. However, the subsequent IT Rules 2021 created an independent compliance framework that restored and strengthened the practical pressure of advocate-sent notices. Under the Rules, platforms are independently obligated to acknowledge and act on grievances within mandatory timelines — creating liability for non-compliance that operates separately from the Section 79(3)(b) safe harbour removal mechanism.
The operational effect post-Shreya Singhal is a dual-track system. Track one: the IT Rules 2021 impose mandatory processing timelines on platforms regardless of the Section 79(3)(b) court-order requirement. Track two: for definitive safe harbour removal, a court order remains the cleanest instrument — and interim injunction applications in Indian High Courts have become faster and more accessible as the volume of online defamation cases has grown substantially in recent years.
Practitioners who send Section 79 notices must accompany them with a clear invocation of the IT Rules 2021 compliance obligations, which are independent of and not affected by the Shreya Singhal ruling on Section 79(3)(b). The two legal instruments work in tandem, and a well-drafted notice invokes both simultaneously to create maximum compliance pressure on the platform from different legal directions.
IT Rules 2021 and the Due Diligence Framework for Significant Social Media Intermediaries
The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 introduced a tiered compliance framework that distinguishes between ordinary intermediaries and "significant social media intermediaries" — platforms with more than 5 million registered users in India. Every major platform — Facebook, Instagram, WhatsApp, YouTube, Twitter/X, LinkedIn, Telegram, and Snapchat — qualifies as a significant social media intermediary under this definition.
The Rules impose specific additional obligations on significant social media intermediaries that do not apply to smaller platforms. These include: appointment of a Chief Compliance Officer who is a resident of India and is personally liable for non-compliance; appointment of a Grievance Officer who is a resident of India and must respond to complaints within the statutory timelines; appointment of a Nodal Contact Person for law enforcement coordination; and publication of monthly compliance reports disclosing the number of complaints received and actions taken.
The Grievance Officer is the most operationally significant appointment for content removal purposes. Under Rule 3(2) of the IT Rules 2021, the Grievance Officer must acknowledge complaints within 24 hours and resolve them within 15 days. For the specific categories covered by Rule 3(2)(b) — content exposing a person's private areas, depicting nudity or a sexual act, or constituting impersonation, including artificially morphed images — the removal period is 24 hours. These are mandatory, enforceable timelines, not aspirational benchmarks.
Failure to comply with these due diligence obligations does not merely result in loss of safe harbour — it creates independent grounds for regulatory action by the Ministry of Electronics and Information Technology. The Ministry has the power to direct compliance and, in extreme cases of persistent non-compliance, to recommend blocking of platform access in India under Section 69A of the IT Act. This systemic exposure is why significant social media intermediaries take formally addressed legal notices far more seriously than private user complaints.
How a Legal Notice Removes Safe Harbour
When a formal legal notice — sent by an advocate, citing specific unlawful content, invoking applicable sections of the IPC or the IT Act, addressed to the platform's designated Grievance Officer, and requesting removal within a specified deadline — is received by the platform, the platform is placed on notice that specific content is claimed to be unlawful. Whether this constitutes "actual knowledge" under the Shreya Singhal reading of Section 79(3)(b) depends on whether the notice is accompanied by a court order, but it independently triggers the mandatory IT Rules 2021 processing obligation.
The legal mechanism is more nuanced than a simple on/off switch for safe harbour. A properly structured advocate notice achieves two concurrent effects: it invokes the IT Rules 2021 mandatory processing timeline, creating an enforceable compliance obligation with its own consequences for non-compliance, and it establishes a formal evidentiary record of notice that becomes the foundation for subsequent court applications if the platform fails to act within the mandatory period.
This is the fundamental distinction between an advocate notice and a user complaint. A user complaint enters the platform's automated moderation queue and is typically rejected for defamation content in the absence of an accompanying court order. An advocate notice is directed to the Grievance Officer by name, invokes specific statutory obligations, attaches the evidence of unlawfulness, specifies the precise URL, and contains an explicit threat of legal action — including an application for an interim injunction naming the platform as a respondent — if the mandatory timeline is not met.
Platforms that receive large volumes of notices have legal teams that triage incoming communications precisely on these criteria. A notice that demonstrates legal sophistication — correct statutory citations, Grievance Officer as addressee, evidence of unlawfulness, specific deadline — is processed as a live legal risk. A notice that reads as a user complaint dressed in formal language is routed to the standard moderation queue. The quality and legal precision of the notice directly determines the processing priority it receives.
How to Send a Section 79 Notice That Is Legally Effective
A legally effective Section 79 notice — one that creates an enforceable compliance obligation and serves as valid evidence in subsequent court proceedings — must satisfy several non-negotiable requirements. First, it must be authored and sent by an enrolled advocate or a law firm on official letterhead. Communications from individuals or non-legal entities do not carry the same evidentiary weight and are processed differently by platform legal teams.
Second, the notice must identify the specific URL or URLs of the offending content with precision. A notice complaining of "defamatory content about my business" without specifying the exact URLs is insufficient. The platform's Grievance Officer cannot act on a notice that does not identify the specific content. Every URL must be listed individually, with the platform-generated content identifier where available.
Third, the notice must explain why the content is unlawful under Indian law, citing the applicable provisions. For defamation, this means invoking IPC 499 and IPC 500, along with Section 79 IT Act and the IT Rules 2021. For privacy violations, Section 66E IT Act. For sexually explicit content, Section 67A IT Act. The legal basis must be explicit — it is not sufficient to assert that the content is harmful or offensive. Indian law has specific categories of unlawfulness, and the notice must identify which category applies to the specific content.
Fourth, the notice must be addressed to the designated Grievance Officer by name and designation, at the Grievance Officer's publicly listed contact details. Sending the notice to a generic help email or a non-designated contact does not start the mandatory IT Rules 2021 clock. Each significant social media intermediary is required to publish its Grievance Officer's details on its platform — checking and using these published details is a mandatory step in drafting a valid notice.
Platform Response Timelines Under IT Rules 2021
The IT Rules 2021 establish a clear, mandatory timeline structure for platform response to grievances. Acknowledgement of the grievance must occur within 24 hours of receipt. This acknowledgement obligation is independent of whether the platform agrees with the grievance — it is a mandatory procedural step. For the urgent categories under Rule 3(2)(b) — including non-consensual intimate imagery, content exposing a person's private areas, and impersonation through artificially morphed images — the mandatory removal period is 24 hours.
The standard resolution period of 15 days applies to all other categories of complaints. Within this 15-day window, the platform must either remove the content or communicate its decision not to remove it with reasons. A failure to communicate within 15 days is itself a non-compliance, giving rise to escalation rights under the IT Rules 2021 and providing grounds for a court application for an injunction.
In practice, platforms do not always comply within these timelines. Non-compliance by the Grievance Officer — including failure to acknowledge, failure to decide, or an unreasoned decision — can be escalated within the platform's compliance structure, including to the Chief Compliance Officer, and thereafter to the Grievance Appellate Committee, an appellate mechanism established under the IT Rules 2021 as an alternative to court proceedings for certain categories of content decisions.
The Grievance Appellate Committee, established by the Government of India under the October 2022 amendment to the IT Rules, provides an adjudicatory mechanism for appeals against platform content moderation decisions. An appeal must be filed within 30 days of the platform's decision, or of a deemed decision through non-response. The Committee is mandated to resolve appeals within 30 days. However, for serious defamation cases requiring immediate injunctive relief, court proceedings remain the more powerful and faster route to actual removal.
When the Platform Ignores Section 79 Notice: Escalation Options
When a platform fails to comply with a valid Section 79 notice within the mandatory IT Rules 2021 timeline, a structured escalation sequence follows. The first escalation step is a second notice — addressed now to both the Grievance Officer and the Chief Compliance Officer by name — recording the fact of non-compliance with the first notice, restating the mandatory timeline obligation, and specifying that court proceedings will commence within a defined period if removal is not effected.
The second escalation route is a complaint to the Grievance Appellate Committee where the platform has made an adverse decision or a deemed adverse decision through non-response. This administrative route is appropriate for straightforward cases and imposes costs on the platform without requiring court time. However, it does not provide interim injunctive relief — it can only direct the platform to reconsider or revise its decision on the specific piece of content.
The most powerful escalation route remains an application to the competent High Court for an interim injunction. An application naming the platform as respondent, relying on the documented record of notice and non-compliance, typically receives a hearing within days in High Courts with regular digital-defamation dockets such as Delhi and Bombay. The court may grant an ex parte interim order directing immediate removal pending the full hearing, which the platform is obliged to comply with or face contempt proceedings.
For platforms that persistently refuse to comply with court orders, the ultimate sanction is an application to the Ministry of Electronics and Information Technology to block access to the platform under Section 69A of the IT Act. This is a rare but real option — the Government of India has used Section 69A to block non-compliant platforms, and the threat of its exercise creates strong incentive for compliance at all prior escalation stages.
The Section 69A Route: Government-Ordered Content Blocking vs Private Removal
Section 69A of the IT Act is a distinct and separate legal instrument from Section 79. While Section 79 governs the conditions under which platforms retain or lose safe harbour protection, Section 69A empowers the Central Government to issue directions for blocking public access to information online where it is necessary in the interest of sovereignty and integrity of India, defence, security of the state, friendly relations with foreign states, public order, or for preventing incitement to any cognisable offence.
Section 69A orders are directed at URLs and are addressed to the platform by government notification rather than by private party action. A private individual cannot directly invoke Section 69A — only the designated government authority can issue a blocking order. However, the threat of a Section 69A referral, particularly where content threatens public order or where a platform is systematically non-compliant, is a legitimate escalation point in communications to the Ministry of Electronics and Information Technology.
The IT Blocking Rules 2009 prescribe a procedure for Section 69A orders that includes a committee review, a 48-hour emergency process for urgent matters, and confidentiality of the order itself. Platforms are prohibited from disclosing that a particular URL has been blocked pursuant to a 69A order. This confidentiality distinguishes 69A orders from private court-ordered removals, which are public record and can be referred to by the winning party.
In content removal practice, Section 69A is most relevant in two contexts. First, as an additional regulatory pressure point for cases involving defamatory content that also threatens public order — particularly where the content concerns ongoing legal proceedings, elections, or regulated financial events such as IPOs. Second, as a secondary measure in cases where a foreign platform refuses to comply with Indian court orders for private removal. The Government of India's demonstrated willingness to use Section 69A has altered the compliance calculus for major international platforms operating in India.
Combining Section 79 with IPC 499/500: Why Dual Notices Are More Effective
IPC Section 499 defines criminal defamation as making or publishing any imputation concerning any person, intending to harm or knowing or having reason to believe that such imputation will harm the reputation of such person. IPC Section 500 prescribes punishment of up to two years simple imprisonment, a fine, or both. These are not civil provisions — they are criminal offences, and a complaint under IPC 499/500 initiates criminal proceedings against the individual who posted the content.
A dual-track approach — serving a Section 79 IT Act notice on the platform simultaneously with an IPC 499/500 criminal complaint against the original poster — creates fundamentally different pressure dynamics than either track alone. The platform faces civil compliance obligations under Section 79 and the IT Rules 2021. The original poster faces criminal proceedings under IPC 499/500. These are independent proceedings with independent consequences that run simultaneously, each with its own enforcement mechanism.
From the platform's perspective, a dual-track approach signals that the complainant has access to competent legal counsel and is pursuing the matter seriously across multiple legal channels. Platforms are also aware that retaining content that is the subject of a criminal complaint against a named or identified user increases their own exposure — particularly in cases where a court order is subsequently obtained in the criminal proceedings directing content preservation or removal.
From the original poster's perspective, the criminal complaint is often the most immediate motivator for voluntary removal. An individual who receives a police notice or summons under IPC 499/500 typically removes the offending content unilaterally within days, often in the hope of having the offence compounded. This voluntary removal is sometimes faster than either the platform's mandatory 15-day timeline or the court hearing schedule. The dual-track approach therefore maximises the probability of rapid removal through any of several possible channels.
Jurisdiction Issues With Section 79 Notices Sent to Foreign Platforms
The IT Act applies to all persons and entities that commit offences or contravene its provisions using a computer, computer system, or computer network located in India, regardless of their nationality or the location of their servers. Section 75 of the IT Act explicitly extends its application to offences committed outside India if the act involves a computer, computer system, or computer network located in India. This is the jurisdictional basis for IT Act notices sent to US-headquartered platforms.
US platforms — including Meta, Google, YouTube, and Twitter/X — have designated representatives in India who are personally accountable under the IT Rules 2021. Service of legal notices on these designated representatives is valid service under Indian law. Courts have consistently held that foreign companies operating in India, deriving commercial revenue from Indian users, and appointing India-resident Grievance Officers have submitted to Indian jurisdiction for the purposes of IT Act compliance.
The US First Amendment, which protects speech from government censorship in the United States, does not apply to Indian court orders. Indian court orders are directed at the platforms' Indian operations and their India-resident Grievance Officers and Chief Compliance Officers — persons bound by Indian law. US-based platforms have generally complied with Indian court orders directing content removal, recognising that non-compliance creates personal liability for their India-resident officers and risks their ability to operate commercially in the Indian market.
GDPR creates a separate jurisdictional complexity for European platforms, but it does not prevent compliance with Indian law requirements. European data protection law generally permits disclosure and action in response to court orders from competent courts, including foreign courts. In practice, European-headquartered platforms operating in India have adopted the same compliance posture as US platforms and respond to properly served Indian court orders directing content removal or de-indexing.
Section 79 in Practice: Documented Cases of Successful IT Act Takedowns
Indian courts have developed a substantial body of case law on Section 79 notices and platform compliance. The Delhi High Court in Super Cassettes Industries v. MySpace India held that the safe harbour under Section 79 requires active compliance with takedown notices and that platforms cannot claim passive-conduit protection for content about which they have received specific, detailed notice. The judgment established the principle that the quality and specificity of the notice determines whether actual knowledge has been communicated.
In Myspace Inc. v. Super Cassettes Industries Ltd., the Division Bench of the Delhi High Court elaborated that platforms must implement reasonable measures to detect and prevent repeat infringement following notice. This principle — applicable by analogy to defamatory content as well as copyright-infringing content — means that a platform that restores removed content, or fails to prevent the same poster from reposting removed content, faces fresh liability for each such instance.
The range of content successfully removed through Section 79 IT Act notices and court orders includes: false product reviews on e-commerce platforms, defamatory social media posts about doctors and other professionals, forged documents and financial fraud allegations, leaked confidential business information, and reputationally harmful news articles on Indian digital portals. Compliance rates improve significantly when the notice is sent by an advocate, cites specific URLs, and is accompanied by evidence of the content's unlawfulness.
Timeline data from content removal practice in India indicates that advocate-sent IT Act notices to the designated Grievance Officer of significant social media intermediaries achieve voluntary compliance — without court proceedings — in approximately 60 to 70 percent of cases where the legal grounds are properly established and evidenced. For the remaining cases, court proceedings are necessary. The 24-hour urgent category has a substantially higher voluntary compliance rate because the shorter timeline increases the platform's exposure to immediate non-compliance findings.
RepuLex Editorial
Legal Researcher · IT Law & Defamation Practice
RepuLex's editorial team is composed of practising advocates and senior legal researchers specialising in IT Act 2000, defamation law, and digital content enforcement across Indian High Courts. All articles are reviewed for legal accuracy before publication. Nothing in this article constitutes legal advice — consult a qualified advocate for your specific situation.