Authored By: Akash Kailas Rathod
MANIKCHAND PAHADE LAW COLLEGE
Abstract
The regulation of online speech has emerged as one of the most complex and pressing constitutional challenges of the twenty-first century.1 In India, this challenge has become particularly acute with the Government’s recent decision to drastically reduce the timeline for takedown of unlawful online content from thirty-six hours to a mere three hours.2 This shift represents a significant escalation in the responsibilities and legal risks imposed on intermediaries, including social media platforms and other digital content hosts.3 While the State presents this accelerated framework as a necessary measure to swiftly curb the spread of harmful, offensive, or illegal content that can incite violence, create panic, or damage public order,4 critics caution that such a compressed timeframe effectively incentivises the automation of compliance.5 This, in turn, may normalise pre-emptive censorship, suppress legitimate debate, and erode the constitutional guarantees of free speech.6 Through a careful review of statutory provisions, landmark Supreme Court rulings, and the application of the proportionality principle, this article contends that a blanket three-hour takedown requirement cannot be reconciled with Article 19(1)(a) of the Constitution of India.7 While the objective of protecting public order and preventing harm is indisputably legitimate,8 the absence of adequate safeguards — such as procedural fairness, differentiated treatment of content categories, and robust appellate mechanisms — risks creating an environment of indirect censorship,9 where the fear of liability may prompt intermediaries to remove lawful content reflexively, thereby chilling democratic discourse and undermining the very essence of participatory expression in a constitutional democracy.10
Introduction
Social media platforms are increasingly assuming the role of public forums for debate.11 Electoral campaigning, investigative journalism, citizen journalism, and dissent are all happening online.12 Social media is not merely a complement to democracy; it has become a democratic space in its own right.13
But social media also transmits misinformation, hate speech, and incitement to violence at an unprecedented scale.14 Rumours cross borders in minutes.15 Inflammatory content spreads faster than authorities can act to counter it.16 Governments worldwide are grappling with the problem and searching for an appropriate regulatory response.17
The shift to a three-hour timeframe for content takedown in India is about more than regulatory housekeeping.18 It signals a decisive shift in emphasis regarding the urgency with which the State seeks to regulate online speech.19 The real constitutional question is not whether harmful content should be addressed — it certainly should be — but whether the means of addressing it respect the balance between safety and freedom.20
Is a three-hour compliance window genuinely about safety, or does it inherently encourage an overreach that violates Article 19(1)(a)?21 This article sets out to examine that question.22
Recent Developments: The 2026 IT Rules Amendment and Public Response
In February 2026, the Union Government notified a significant amendment to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (IT Rules).89 The newly notified rules, designated G.S.R. 120(E) and effective from 20 February 2026, introduce a statutory definition of “synthetically generated information” (SGI), encompassing deepfakes, AI-generated audio and video, and similar content, along with new obligations on intermediaries that host such material.90
A defining feature of the 2026 amendments is the dramatically accelerated timeframe for intermediaries to take down or disable unlawful content, reduced from thirty-six hours to just three.91 The 2026 amendments do not, however, prescribe a single uniform compliance schedule.92 Instead, they establish tiers with varying compliance windows.93 The three-hour window applies to unlawful content generally, upon receipt of a valid government or court order.94 Non-consensual intimate imagery (NCII), deepfake-generated nudity, and other SGI content are subject to a still more aggressive removal timeline of two hours.95 This two-hour window applies specifically to impersonative and sexually explicit content.96 The accelerated timetable reflects the State’s recognition of the severe harm such content can cause to dignity, privacy, and reputation.97
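The tiered structure can be sketched, purely for illustration, as the kind of triage routine a compliance team might use to compute the removal deadline for an incoming order. The category labels and the date handling below are the author's hypothetical constructions, not terms defined in the Rules:

```python
from datetime import datetime, timedelta

# Hypothetical illustration of the tiered deadlines described above:
# two hours for NCII and sexually explicit or impersonative SGI,
# three hours for unlawful content generally.
DEADLINES = {
    "ncii_or_explicit_sgi": timedelta(hours=2),
    "unlawful_general": timedelta(hours=3),
}

def removal_due_by(received_at: datetime, category: str) -> datetime:
    """Return the latest time by which the content must be disabled."""
    return received_at + DEADLINES[category]

order_received = datetime(2026, 2, 21, 10, 0)
print(removal_due_by(order_received, "unlawful_general"))      # 2026-02-21 13:00:00
print(removal_due_by(order_received, "ncii_or_explicit_sgi"))  # 2026-02-21 12:00:00
```

The point of the sketch is simply that the deadline attaches mechanically at the moment of receipt, regardless of the legal complexity of the order.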
The existence of these different tiers indicates that speech is not regulated uniformly, and it raises further questions about whether the timelines adopted are appropriate for all categories of speech.98,99 The government has maintained that the changes are proportionate to the rapidly proliferating harm posed by SGI content, impersonation content, and coordinated networks of misinformation — content capable of causing reputational harm, enabling fraud, and threatening public order.100
The statutory definition of SGI, together with strict requirements for visible labelling, metadata retention, and provenance markers, is designed to enhance traceability, accountability, and authenticity verification across leading social media platforms.101 These changes represent a shift from notice-and-takedown to interventionist regulation, imposing detailed due-diligence obligations and strict deadlines on social media intermediaries.102
The amendments have drawn a mixed response.103 Industry body NASSCOM has described the three-hour takedown deadline as operationally impracticable, cautioning that such tight deadlines may exceed the capacity of social media platforms, particularly smaller platforms without effective moderation resources.104 Critics have also raised concerns about heightened compliance costs and the attendant risk that cost pressures will translate into the misclassification and removal of protected speech.105
Digital rights advocates, led by the Internet Freedom Foundation (IFF), have expressed particular concern that tight deadlines will drive increased reliance on automated content removal processes, resulting in overreach that suppresses protected speech.106 Civil liberties advocates warn that the absence of procedural safeguards — including requirements for transparency, user notification, and effective appeal processes — risks intensifying the chilling effect on free expression rather than alleviating it.107
While the 2026 amendments reflect a growing government appetite for addressing AI-based harms, they come at a significant constitutional cost: the rushed implementation of these rules places Article 19’s core due process and free speech guarantees under considerable strain.108
Constitutional and Statutory Framework
1. Freedom of Speech: Article 19(1)(a)
Article 19(1)(a) confers upon all citizens the right to freedom of speech and expression.23 The Supreme Court has consistently held that such speech is integral to participation in a democracy.24 The guarantee is not confined to speech that others approve of or enjoy; it extends equally to speech that is dissenting, challenging, and often uncomfortable.25 Several court rulings have recognised the internet as a constitutionally protected forum for speech.26 Speech on the internet is not a lesser form of speech.27
2. Reasonable Restrictions: Article 19(2)
Article 19(2) permits reasonable restrictions on speech in the interests of:28
- The sovereignty and integrity of India
- The security of the State
- Friendly relations with foreign States
- Public order
- Decency or morality
- Contempt of court
- Defamation
- Incitement to an offence29
Restrictions must conform to the constitutional discipline imposed upon them.30 They cannot be overbroad, vague, or disproportionate.31 The State’s objective in imposing a three-hour takedown window (minimising the spread of illegal content) falls within the heads of “public order” and “incitement to an offence.”32 The legitimacy of that purpose is not in dispute.33 Whether the chosen means are proportionate, however, remains a live constitutional question.34
3. Intermediary Liability Under the IT Act, 2000
Section 79 of the Information Technology Act, 2000 provides a conditional safe harbour: an intermediary retains immunity from liability for third-party content only if it observes due diligence and complies with lawful orders of the authorities.35 The new amendments raise that compliance threshold considerably.37
As the compliance window narrows, the burden of regulation shifts onto the intermediary. Intermediaries are afforded minimal opportunity to assess the legality of content, even in genuinely ambiguous categories such as satire, political commentary, and investigative journalism.39 It is therefore necessary to assess the effect of such a rule not merely on its face, but in its structural operation.40
4. Judicial Guidance on Online Speech
Shreya Singhal v. Union of India (2015)41
In striking down Section 66A of the IT Act as unconstitutional, the Supreme Court emphasised two cardinal principles: restrictive speech regimes must be narrowly defined,42 and vague, overbroad provisions with a chilling effect are unconstitutional.43
The Court also clarified that intermediaries must not be made the private guardians of legality at the risk of liability.44 Removal must follow a proper process, namely a court order or an appropriate government notification, rather than the intermediary’s own assessment.45 The judgment reflects the caution that courts have consistently applied against indirect censorship.46
Anuradha Bhasin v. Union of India (2020)47
The Court held that the right to free speech and expression on the internet enjoys constitutional protection.48 It ruled that restrictions must satisfy the proportionality requirement49 and cannot be limitless or excessive.50 This decision firmly established that digital governance cannot sidestep constitutional safeguards simply because the medium is digital.51
Constitutional Questions Involving the Three-Hour Deadline
The Proportionality Test
To constitute a justifiable limitation on speech, a measure must:52
- Have a legitimate goal
- Bear a rational connection to that goal
- Represent the least restrictive option necessary to achieve the goal
- Strike a proper balance between the competing interests at stake
Removal of content that poses an immediate threat of harm is undoubtedly necessary, but is a universal three-hour deadline across all content categories truly the least restrictive option?53 Content that directly incites violence is the least disputable candidate for swift removal.54 Political commentary or factually contested statements may require contextual inquiry.55 A uniform timeline applied indiscriminately across all categories is overbroad.56
Structural Incentives for Excessive Removal
In practice, intermediaries operate within a risk-minimisation paradigm.57 Faced with penalties or the loss of safe harbour protection for hosting any offending content, platforms are far more likely to remove content than to undertake contextual interpretation.58 This phenomenon of “collateral censorship”, in which private parties suppress speech to avoid regulatory exposure, involves no direct state action.59 But the regulatory framework produces this outcome as a structural consequence.60
This indirect burden on speech carries independent constitutional significance.
1. The “Application of Mind” Problem
Legal experts and industry bodies such as NASSCOM have noted that a three-hour window effectively eliminates the possibility of “application of mind.”
Within three hours, a platform cannot realistically conduct a legal review of a government order to assess whether it meets the Shreya Singhal standards — that is, whether the order is actually constitutional. The result is that platforms will migrate to a “delete-by-default” approach to avoid criminal liability, producing what may be termed “preventive silence.”61
2. The Safe Harbour Pincer
The 2026 rules provide that intermediaries lose safe harbour protection under Section 79 if they fail to meet these timelines.62 Having lost safe harbour, a platform becomes legally responsible for a post as if it were its original author.63 The result is a structural pincer:
- If a platform does not remove content within three hours, it is liable for that content.64
- If it does remove the content and that content turns out to be lawful speech — such as satire or journalism — it has violated the user’s fundamental rights.65
The current regulatory architecture incentivises platforms to fear the Government more than the user.66
3. The State Overreach Problem
The 2026 amendment rolled back a crucial safeguard introduced by the 2025 Rules:67
- The Prior Rule (2025): Only one designated officer per State was authorised to issue takedown orders, ensuring senior-level oversight.68
- The New Rule (2026): States may now designate multiple authorised officers at the rank of Deputy Inspector General (DIG) or above.69
This change dramatically increases the volume of orders that platforms must process.70 A platform could simultaneously receive hundreds of three-hour mandates from different districts, making compliance physically impossible even for the largest technology companies.71
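The operational strain can be made concrete with back-of-the-envelope arithmetic. All figures below are hypothetical, chosen only to illustrate the scaling problem, and are not drawn from the Rules or from any platform's actual data:

```python
import math

# Hypothetical illustration: how many legal reviewers must be on shift
# for a platform to clear a burst of takedown orders within the window?
def reviewers_needed(orders: int, minutes_per_review: int, window_minutes: int) -> int:
    """Minimum number of simultaneous reviewers to finish every review in the window."""
    total_review_minutes = orders * minutes_per_review
    return math.ceil(total_review_minutes / window_minutes)

# 300 orders arriving at once, 45 minutes of legal review each,
# and a 180-minute (three-hour) compliance window:
print(reviewers_needed(300, 45, 180))  # 75
```

Even under these modest assumptions, a single burst of orders would demand a round-the-clock review bench of seventy-five lawyers, a capacity far beyond what most platforms, and virtually all smaller ones, maintain.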
4. The Automation Trap
Because three hours is too short for human moderation at scale, the rules explicitly mandate the use of “automated tools.”72 AI tools are, however, notoriously poor at understanding context, irony, or political nuance.73 An automated tool might flag a documentary about a riot as “incitement to violence” merely because it contains footage of the event.74 Under the three-hour rule, there is no time for a human reviewer to override the automated system’s error.75
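The context-blindness of automated flagging can be illustrated with a deliberately naive keyword-based filter. This is a toy construction of the author's, not a description of any platform's actual system, but it captures why the same trigger words condemn both incitement and journalism about incitement:

```python
# Toy illustration of context-blind automated flagging: the same
# trigger words appear in incitement and in reporting about it.
TRIGGER_WORDS = {"riot", "burn", "attack"}

def naive_flag(text: str) -> bool:
    """Flag text whenever any trigger word appears, ignoring all context."""
    words = {w.strip(".,!?\"'").lower() for w in text.split()}
    return bool(words & TRIGGER_WORDS)

incitement = "Go out and riot tonight!"
documentary = "Our documentary examines how the 2020 riot began."

print(naive_flag(incitement))   # True
print(naive_flag(documentary))  # True -- lawful journalism flagged identically
```

Real moderation systems are far more sophisticated than this, but the underlying failure mode, an inability to distinguish depiction from endorsement, persists even in state-of-the-art classifiers, and a three-hour window leaves no room for the human review that would catch it.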
5. The Transparency Gap
Critics — including the Internet Freedom Foundation — have highlighted that the rules lack a pre-decisional hearing.76 Users are frequently not notified of the reason for content removal until after it has already been taken down.77 In the three-hour rush, reasoned orders grounded in specific Article 19(2) justifications are likely to be displaced by generic “Public Order” templates.78
Summary: The Core Proportionality Problem
The core constitutional problem is one of proportionality.79 While the government’s intent — stopping the viral spread of deepfakes and incitement — is legitimate, applying a “wartime” timeline of three hours to “peacetime” speech such as political debate creates a structural regime of censorship.80
Procedural Safeguards and Transparency
Constitutional case law imposes both substantive and procedural rationality on speech-restricting measures.81 The following safeguards are required:
- Written orders
- Stated grounds
- A right to contest
- An independent adjudicator82
These protections guard against the arbitrary exercise of power.83 Where swift takedown is not accompanied by robust safeguards for transparency and review, the potential for overreach increases substantially.84
The Chilling Effect on Democratic Participation
A chilling effect occurs when individuals refrain from engaging in lawful speech because they fear the consequences.85 In the online space, if a user knows that their content can be removed within hours without meaningful redress, they may think twice before speaking at all.86 The essence of democracy is the unfettered exchange of ideas.87 The Constitution safeguards speech not only after it has been stifled, but also ensures that a climate for free exchange exists in the first place.88
Comparative Approach
Regulatory developments in other jurisdictions suggest a trend toward expediency, but with meaningful safeguards. The European Union’s Digital Services Act, for example, includes similarly accelerated requirements for action against unlawful content, but pairs these with obligations to provide transparency reporting and enforceable rights of appeal.109,110
Comparative experience demonstrates that urgency and procedural safeguards can co-exist. The constitutional question for India is whether its current regulatory regime achieves this balance.111
Potential Improvements
A constitutionally sound regime must go beyond merely facilitating compliance; it must embed safeguards that honour the separation of powers and the proportionality principle.112 The following improvements merit consideration:
Judicial Review for Takedown Requests
Takedown requests targeting political or journalistic speech should originate from, or be capable of confirmation by, a Judicial Magistrate, rather than from purely administrative actors at the level of a Joint Secretary.113 This would ensure that the power to regulate speech does not become concentrated in the executive; rather, the application of Article 19(2) justifications would be entrusted to an independent judicial officer.114 This approach is more consonant with the doctrine of separation of powers as constitutionally mandated.115
Differentiated Liability Based on Platform Scale
The compliance obligation must distinguish between Significant Social Media Intermediaries (SSMIs) and smaller players.116 A major SSMI with extensive technical and legal infrastructure may, in theory, absorb even a rigid two-hour compliance window. The same cannot be said of start-ups and smaller platforms that do not maintain technical, legal, and compliance staff on a twenty-four-hour basis.117 A uniform compliance approach would entrench the market dominance of large platforms by burdening smaller players beyond their operational capacity.118
Differentiated Timelines Based on Type of Harm
While the justification for a two-hour timeline for the removal of non-consensual intimate content is well-founded, other categories of content require a contextualised application of compliance timelines in order to satisfy the requirements of Article 19(2).119
Reasoned Orders Under Article 19(2)
Any takedown order must specify the relevant provision of Article 19(2) relied upon and provide written reasoning for that determination.120
Post-Removal Right of Appeal and Access to Justice
Users whose content has been removed must have prompt access to an independent appellate body capable of providing effective redress.121
Structured Reporting and Periodic Third-Party Auditing
Platforms should be required to submit structured transparency reports detailing the volume and nature of removals, the content affected, and the justifications relied upon.122 In addition, periodic independent third-party audits of removal decisions will bolster institutional trust in the regulatory framework.123
These design options would ensure that regulatory urgency is tempered by constitutional discipline.124
Conclusion
The concern that animates the three-hour takedown rule — the digital devastation that unchecked harmful content can cause — is understandable.125 However, in a constitutional democracy, it is not the power one possesses that defines a government’s character, but the power it is willing to constrain.126
Freedom of speech under Article 19(1)(a) is not absolute. It is subject to the reasonable restrictions enumerated under Article 19(2), but those restrictions must be proportional, calibrated, and procedurally sound.127,128 If the rush to regulate overwhelms the Constitution’s demand for careful calibration, the risk is not merely censorship — it is the normalisation of preventive silence.129
This is not a debate between safety and freedom. It is a demand for safety that does not sacrifice the idea of freedom.130 The credibility of digital governance in India will ultimately depend upon that calibration.131
Bibliography
Primary Sources
Statutes and Statutory Instruments
- Constitution of India 1950
- Digital Services Act (Regulation (EU) 2022/2065)
- Information Technology Act 2000
- Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules 2021
- Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules 2025 (notified 22 October 2025)
- Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules 2026 (G.S.R. 120(E), notified 10 February 2026)
- Ministry of Electronics and Information Technology (MeitY) Notification No G.S.R. 120(E) (10 February 2026)
- Online Safety Act 2023 (UK)
Cases
- Anuradha Bhasin v Union of India (2020) 3 SCC 637
- Faheema Shirin v State of Kerala 2019 (4) KLT 301
- Foundation for Media Professionals v Union Territory of Jammu and Kashmir (2020) 5 SCC 746
- Google India Pvt Ltd v Visaka Industries (2020) 4 SCC 162
- Hemant Malviya v State of Madhya Pradesh [2025] INSC 468
- Indibleu Pvt Ltd v State of West Bengal [2019] SCC OnLine Cal 605
- IR Coelho v State of Tamil Nadu (2007) 2 SCC 1
- Justice KS Puttaswamy (Retd) v Union of India (2017) 10 SCC 1
- Kaushal Kishore v State of Uttar Pradesh (2023) 4 SCC 1
- Kunal Kamra v Union of India [2024] SCC OnLine Bom 2949 [see also SCC OnLine Bom 3058 — author to verify whether these are separate orders]
- Maneka Gandhi v Union of India (1978) 1 SCC 248
- Modern Dental College & Research Centre v State of Madhya Pradesh (2016) 7 SCC 353
- Ram Jethmalani v Union of India (2011) 8 SCC 1
- S Khushboo v Kanniammal (2010) 5 SCC 600
- S Rangarajan v P Jagjivan Ram (1989) 2 SCC 574
- Sakal Papers (P) Ltd v Union of India AIR 1962 SC 305
- Shreya Singhal v Union of India (2015) 5 SCC 1
- Wazahat Khan v Union of India [2025] INSC 442
- Wikimedia Foundation v ANI Media Pvt Ltd [2025] INSC 812
Secondary Sources
Books
- Jain MP, Indian Constitutional Law (8th edn, LexisNexis 2025)
Journal Articles
- Balkin JM, ‘Free Speech in the Algorithmic Society: Big Data, Private Governance, and New School Speech Regulation’ (2018) 51(3) UC Davis Law Review 1149
- Baxi U, ‘The Rule of Law and the Digital Leviathan’ (2026) 15(1) Journal of Indian Constitutional Law 12
- Bhatia G, ‘The Proportionality Test in the Digital Sphere’ (2025) 14(2) Journal of Indian Constitutional Law 112
- Chowdhury PR, ‘From Passive Host to Proactive Policeman: The 2026 Intermediary Shift’ (2026) 15(1) Journal of Digital Governance 22
- ‘The Doctrine of Indirect Censorship in the Digital Age’ (2025) 14(1) Journal of Indian Constitutional Law 88
- ‘The Pincer Effect: Intermediary Liability in the 2026 Regulatory Regime’ (2026) 15(1) Journal of Indian Constitutional Law 42
Government and NGO Reports
- Internet Freedom Foundation, ‘Statement on the IT (IGDME) Amendment Rules 2026’ (11 February 2026)
- Ministry of Electronics and Information Technology, ‘Explanatory Note on the IT Amendment Rules 2026’ (MeitY, 10 February 2026)
- NASSCOM, ‘Industry Representation on the Operational Challenges of G.S.R. 120(E)’ (NASSCOM Press Release, 11 February 2026)
- UN Human Rights Council, ‘Report of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression’ (2011) A/HRC/17/27
Newspapers and Websites
- ‘AI three-hour takedown rule: When speed becomes the censor’ (The Federal, 11 February 2026) https://thefederal.com accessed 12 February 2026
- ‘Curbing Deepfakes and Digital Harm: India’s New Framework for Regulating Synthetic Media’ (Sanskriti IAS, 6 February 2026) https://www.sanskritiias.com accessed 12 February 2026
- ‘IT Ministry mandates label for AI-generated content, reduces takedown timeline to 2–3 hours’ The Hindu (New Delhi, 11 February 2026)
- ‘Safe Harbor Under Section 79 IT Act: Why Legal Immunity for Platforms Is Crumbling’ (K&S Partners, 19 June 2025) https://ksandk.com/corporate/safe-harbor-intermediary-immunity-indian-law/ accessed 11 February 2026
- ‘Towards Safe, Trusted, and Responsible AI in India: Legal and Institutional Alignment’ (Official Round Table, India AI Impact Summit, National Law University Delhi, 30 January 2026)





