Intermediaries in India are largely governed by the Information Technology Act, 2000 (the "IT Act") and the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (the "2021 Rules") and, by virtue of their nature, are granted safe harbour protection under Section 79 of the IT Act from liability for third-party information hosted on their platforms. However, non-adherence to the IT Act and the 2021 Rules vitiates this protection and exposes intermediaries to liability for offences committed using their platforms.[1]
Intermediaries are required by law to inform their users not to host, display, upload, modify, publish, transmit, store, update or share any information that is obscene, pornographic, paedophilic, otherwise inconsistent with or contrary to the laws in force, or harmful to children.[2] Where such content is found on their platforms, intermediaries must remove or disable access to it and, upon doing so, must preserve the information and associated records as evidence for a period of 180 days for investigative purposes, or longer if required by a court or a lawfully authorized government agency.[3] Intermediaries that primarily or solely enable online interaction between two or more users and have a significant number of registered users in India (termed "significant social media intermediaries" under the 2021 Rules) are additionally required to deploy technology-based measures, including AI tools with human oversight, to proactively identify information that depicts any act or simulation, in any form, of rape, child sexual abuse or conduct, whether explicit or implicit.[4]
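For the teams that implement these obligations, the sketch below shows, purely as an illustration, how the three elements described above (proactive automated identification, human oversight before action, and removal with 180-day evidence preservation) might fit together in a single moderation workflow. It is not drawn from the Rules or from any platform's actual systems; the function names, data structure and threshold are hypothetical.

```python
# Hypothetical illustration only: neither the 2021 Rules nor any cited case
# prescribes this design. All names, signatures and thresholds are invented.
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

EVIDENCE_RETENTION = timedelta(days=180)  # minimum preservation period noted above

@dataclass
class ModerationRecord:
    content_id: str
    flagged_at: datetime
    reason: str
    access_disabled: bool
    preserve_until: datetime  # may be extended on a court or agency direction

def automated_score(content: bytes) -> float:
    """Placeholder for an automated/AI classifier scoring content from 0 to 1."""
    raise NotImplementedError

def human_review(content_id: str) -> bool:
    """Placeholder for the human-oversight step that confirms or rejects a flag."""
    raise NotImplementedError

def handle_upload(content_id: str, content: bytes,
                  threshold: float = 0.9) -> Optional[ModerationRecord]:
    # Step 1: proactively identify potentially prohibited material with an automated tool.
    if automated_score(content) < threshold:
        return None
    # Step 2: keep a human in the loop before acting on the automated flag.
    if not human_review(content_id):
        return None
    # Step 3: disable access and preserve the evidence and associated records
    # for at least 180 days (longer if required by a court or authorized agency).
    now = datetime.utcnow()
    return ModerationRecord(
        content_id=content_id,
        flagged_at=now,
        reason="automated flag confirmed by human reviewer",
        access_disabled=True,
        preserve_until=now + EVIDENCE_RETENTION,
    )
```

The point of the sketch is the ordering: the automated tool only flags, a human decision triggers removal, and the retention deadline is recorded at the moment access is disabled.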
Platform safety is of prime importance, especially on platforms where children form a significant part of the user base. It is thus imperative for intermediaries to promptly take down objectionable material and to maintain appropriate safety measures. Recently, there has been growing concern and discussion over the increasing online presence of children, a trend further accelerated by the COVID-19 pandemic.
In a 2015 suo motu petition before the Supreme Court of India concerning two objectionable and explicit videos shared on the messaging platform WhatsApp, the Apex Court noted that intermediaries should make efforts towards implementing certain platform-safety measures.[5] The guidance sets out that intermediaries should set up proactive monitoring tools, including AI-based tools and content flaggers, to identify and auto-delete unlawful content; maintain a 24×7 mechanism to deal with requisitions from law enforcement agencies and promptly dispose of requisitions to remove unlawful content; and appoint escalation officers and an India-based contact officer.
More recently, a petition was moved before the Madras High Court seeking a ban on the popular video-sharing application TikTok on the grounds that it exposed children to pornographic material and sexual predators.[6] A ban was imposed but later lifted after the Court was convinced that TikTok, as a platform, did not have any control over its users and that it had maintained safety features such as machine and manual moderation for filtering inappropriate, obscene and negative content posted on the application. This was based on TikTok's presentation before the Court that it was adequately equipped to ensure safety on its platform through published community guidelines, terms of use and a privacy statement; an in-app content reporting feature; an India-based grievance officer; a dedicated channel for local government inquiries; a moderation team; a safety centre in several languages; automated tools to detect pornographic content; and specific measures to prevent use of the platform by children (while still allowing access to teenagers and above), alongside other protective measures.
Even where an intermediary is ensuring compliance and maintaining the necessary safeguards, it must, whenever an incident concerning child safety arises, conduct an internal investigation into the concerns raised and take prompt action. If the investigation conclusively establishes that an incident has taken place, or that a crime against a child is likely to be committed or has already been committed, the intermediary is obliged to report the incident to the relevant authority prescribed under the Protection of Children from Sexual Offences Act, 2012 ("POCSO"), legislation that penalizes a range of sexual offences against children, and to make greater efforts to ensure a safe platform for its users. While victims can seek recourse under the Indian Penal Code and POCSO for offences committed against them and register cyber-crime complaints online on the Government of India's 'National Cyber Crime Reporting Portal,' the intermediary must follow through on its own required course of action.
Notes
[1] Section 79(3) of the IT Act.
[2] Rule 3(1)(b)(ii) and (iii) of the 2021 Rules.
[3] Rule 3(1)(g) of the 2021 Rules.
[4] Rule 4(4) of the 2021 Rules.
[5] In Re: Prajwala Letter dated 18.02.2015.
[6] S. Muthukumar vs. The Telecom Regulatory Authority of India and Ors., Writ Petition (MD) No. 7855 of 2019.
Raghunath Ananthapur is a Partner with a law firm in Bangalore, India, and advises on transactional and advisory matters in the areas of corporate law, technology, media and intellectual property. Raghunath can be reached at: raghunath.ananthapur@magnahlaw.com
Prithvika Prasad is a recent law graduate and currently works as an Associate at a law firm in Bangalore, India. She can be reached at: prithvika.prasad@magnahlaw.com