Syllabus: GS2/Polity and Governance
Context
- Through the Indian Cyber Crime Coordination Centre (I4C)-led Sahyog portal, the government has issued 130 content notices to online platforms such as Google, YouTube, Amazon, Apple, and Microsoft in the past six months.
About
- These notices effectively act as content blocking orders and are sent under Section 79(3)(b) of the Information Technology Act, 2000.
- These fall outside Section 69A of the Information Technology Act, which has been commonly used to issue online censorship orders.
- As per Section 79(3)(b) of the IT Act, online intermediaries can lose their safe harbour protections if they fail to block access to content which has been flagged by an “appropriate” government agency.
- Safe harbour protections provide legal immunity to online intermediaries, including social media platforms, for third-party user-generated content.
Indian Cyber Crime Coordination Centre (I4C)
- It is an initiative of the Ministry of Home Affairs, launched in 2020, to deal with cybercrime in the country in a coordinated and comprehensive manner.
- I4C focuses on tackling all issues related to cybercrime for citizens, including improving coordination between various law enforcement agencies and other stakeholders.
Legal Framework: Section 69A vs. Section 79(3)(b)
- Section 69A of the IT Act, 2000: This section empowers the government to block public access to content on the internet in certain circumstances, such as concerns over national security, sovereignty, public order, or to prevent incitement.
- It includes safeguards laid down by the Supreme Court in the Shreya Singhal case (2015):
- A reasoned order explaining the necessity of blocking content.
- The person or entity affected should have a chance to contest the order.
- Section 79(3)(b) of the IT Act: This section deals with the liability of intermediaries (such as X Corp) for third-party content.
- It exempts platforms from liability for illegal content unless they fail to act swiftly to remove or disable access to that content when notified by the government.
- Intermediaries argue that this provision should not be used to directly block content, as it is not intended for that purpose.
The Sahyog Portal
- It was launched by the Ministry of Home Affairs in 2024.
- The portal acts as a centralized system for government agencies at various levels, from ministries to local police stations, to issue blocking orders more efficiently.
Digital content censorship
- Digital content censorship refers to the control of online content by governments, organizations, or other entities. This includes:
- Blocking websites and apps
- Removal of social media content
- Regulation of OTT (Over-The-Top) streaming platforms
- Restrictions on digital news and journalism
Legal Framework Governing Digital Censorship in India
- Right to Freedom of Speech (Article 19(1)(a)): Subject to reasonable restrictions under Article 19(2) on grounds such as decency, morality, and public order.
- Information Technology (IT) Act, 2000: Section 69A grants the government power to block online content for security or public order concerns.
- Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021: These regulate social media, OTT platforms, and digital news media.
- Self-Regulation by OTT Platforms: Platforms like Netflix and Amazon Prime follow self-regulatory frameworks such as the Digital Publishers Content Grievances Council (DPCGC).
- Central Board of Film Certification (CBFC): Established under the Cinematograph Act, 1952, it is responsible for certifying and censoring films in India.
Challenges in Digital Censorship in India
- Balancing Freedom of Speech & Regulation: Over-regulation can suppress creativity, while under-regulation can spread harmful content.
- Transparency & Accountability: Content moderation and censorship decisions often lack clear guidelines, raising concerns about misuse.
- Jurisdictional Issues: Many digital platforms operate from outside India, making enforcement difficult.
- Technological Advancements: The rapid evolution of digital media complicates consistent and fair regulation.
- Ethical Concerns: The subjective nature of obscenity laws can lead to arbitrary censorship.
Way Forward
- Strengthening Independent Regulatory Bodies: Ensuring that courts and neutral institutions review censorship decisions.
- Enhancing Transparency in Content Moderation: Digital platforms should publish periodic transparency reports on content takedowns.
- Encouraging Digital Literacy: Educating citizens to identify fake news rather than enforcing restrictive censorship.
- Public Consultation in Policymaking: Involving journalists, legal experts, and civil society in framing digital content regulations.
Source: IE