Relevance: GS-2: Important aspects of governance, transparency and accountability, e-governance- applications, models, successes, limitations, and potential; transparency & accountability and institutional and other measures.
Key Phrases: Information Technology Act 2000, Technology Platforms, non-consensual intimate images, Intermediary Guidelines 2021, Hashing Technology, Child-Sex Abuse Material, Threat to Free Speech, e-Safety Commissioner, Automated Tools
Why in News?
- Recently, the intimate pictures of a woman were shared online without her consent.
- The case highlights the need for courts, law enforcement, and technology platforms to have a coordinated response to the sharing of non-consensual intimate images (NCII) online.
Key Highlights:
- Publishing NCII is a criminal offence under the Information Technology Act, 2000, with platforms expected to filter out such content.
- While a criminal conviction is desirable, the more urgent need is to stop the spread of this illegal content.
- The Intermediary Guidelines 2021 provide a partial solution.
Intermediary Guidelines 2021:
- The Guidelines empower victims to complain directly to any website that has allowed the uploading of non-consensual images or videos of a person in a state of nudity or engaging in a sexual act.
- This includes content that has been digitally altered to depict the person as such.
- The website must remove the content within 24 hours of receiving a complaint or risk facing criminal charges.
Issues with the Guidelines:
- The approach relies on victims identifying and sharing every URL hosting their intimate images.
- The same images may be re-uploaded at different locations or by different user accounts in the future.
- While the Intermediary Guidelines do encourage large social media platforms to proactively remove certain types of content, the focus is on child pornography and rape videos.
- Victims of NCII abuse have few options other than lodging complaints every time their content surfaces, forcing them to approach courts.
Right to Free Speech and Indian Constitution
- The Constitution of India guarantees all citizens the fundamental right to freedom of speech and expression under Article 19(1)(a), in keeping with the Preamble's resolve to secure to all citizens liberty of thought and expression.
- This right enables the full development of an individual's personality and helps a vibrant democracy thrive.
- The Indian judiciary has progressively widened the scope of this right over time.
- Thus, it now includes: Freedom of Press, Freedom of Commercial Speech, Right to Broadcast, Right to information, Right to criticize, Right to expression beyond national boundaries, Right not to speak or right to silence.
- However, the right is not absolute and is subject to reasonable restrictions imposed by the State in the interests of: (1) sovereignty and integrity of India, (2) security of the State, (3) friendly relations with foreign States, (4) public order, (5) decency or morality, (6) contempt of court, (7) defamation, and (8) incitement to an offence.
Freedom of Speech and Expression and Social Media/ Internet
- Freedom of speech and expression is recognized as a fundamental right in whatever medium it is exercised under the Constitution of India and other international documents.
- In light of the growing use of the internet and social media as a medium for exercising this right, access to this medium has also been recognized as a fundamental human right.
- Although there is no specific legislation in India which deals with social media, there are several provisions in the existing cyber laws which can be used to seek redress in case of violation of any rights in the cyberspace, internet and social media.
The Information Technology Act, 2000
- The Information Technology Act, 2000 is the primary law in India dealing with cybercrime and electronic commerce.
- Secondary or subordinate legislation to the IT Act includes the Information Technology (Intermediaries Guidelines) Rules, 2011 and the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.
- The Act provides a legal framework for electronic governance by giving recognition to electronic records and digital signatures.
- It also defines cyber crimes and prescribes penalties for them.
- The Act directed the formation of a Controller of Certifying Authorities to regulate the issuance of digital signatures. It also established a Cyber Appellate Tribunal to resolve disputes arising from this new law.
- The Act also amended various sections of the Indian Penal Code, 1860, the Indian Evidence Act, 1872, the Banker's Book Evidence Act, 1891, and the Reserve Bank of India Act, 1934 to make them compliant with new technologies.
- Section 69 of the Act grants power to the Central or a State Government to issue directions for interception or monitoring or decryption of any information through any computer resource.
- Section 69A grants power to the Central Government to issue directions to block public access to any information through any computer resource on similar grounds.
What is Stop NCII.org (Stop Non-Consensual Intimate Image Abuse)?
- The social media company Meta (formerly Facebook) has launched this website to stop the exploitation of women's intimate images on the internet.
- Meta operates the platform in partnership with the UK’s Revenge Porn Helpline, an independent organisation.
- The platform is used to stop non-consensual intimate images from being shared on Facebook and Instagram.
- The tool relies on “hashing” technology to match known NCII against future uploads.
- A victim whose intimate images have been shared without their consent can use the tool to create a “hash” (or unique identifier) of the offending image, which is shared with the platform.
- The platform then compares this user-generated hash with the hashes of all other images on its site, allowing content identical to that reported by the victim to be identified and taken down (a simplified sketch of this matching logic follows this list).
- The victim’s private images stay with them, with only the hash being added to a database to guard against future uploads.
- Similar technology is already used against child-sex abuse material (CSAM) with promising results.
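The matching workflow described above can be illustrated with a minimal Python sketch. This is only a conceptual illustration under stated assumptions: it uses a cryptographic hash (SHA-256), which matches only byte-identical files, whereas production tools rely on perceptual hashing (such as the open-source PDQ algorithm) so that resized or re-encoded copies still match; the function names and in-memory database here are hypothetical.

```python
import hashlib
from pathlib import Path

# Hypothetical in-memory hash database. Real systems store only hashes,
# never the images themselves, so the victim's private images stay on
# their own device.
known_ncii_hashes: set[str] = set()

def compute_hash(image_path: Path) -> str:
    """Return a hex digest of the image file.

    SHA-256 is used here for simplicity; it matches only byte-identical
    copies. Perceptual hashes are used in practice so that lightly
    altered copies are still detected.
    """
    return hashlib.sha256(image_path.read_bytes()).hexdigest()

def report_image(image_path: Path) -> None:
    """Victim-side step: generate a hash locally and add it to the database."""
    known_ncii_hashes.add(compute_hash(image_path))

def should_block_upload(image_path: Path) -> bool:
    """Platform-side step: block an upload whose hash is already known."""
    return compute_hash(image_path) in known_ncii_hashes
```

The design choice mirrored here is that only the hash travels off the victim's device; the platform compares hashes of new uploads against the database and never needs access to the original image.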
What are the concerns associated with these Automated Tools?
- Threat to Free Speech:
- The use of automated tools raises the free speech concern that lawful content may accidentally be taken down.
- Automated filters often ignore context, i.e., content that is illegal in one context may not be illegal in another.
- This is precisely why free speech advocates are wary of using automated tools to remove harmful content online.
- Content depicting Assaults by Public Figures likely to be taken down:
- Depictions of a public figure committing sexual assault may carry a public-interest element, so courts may be required to intervene before such content is removed.
- The vast majority of NCII has no public interest component and can be taken down quickly.
- In the past, courts have required victims to continually supply URLs or directed intermediaries to remove all content remotely related to NCII.
Way Forward:
- Use of NCII hash database by other Websites:
- If well-designed and administered, other websites could eventually use this NCII hash database to identify illegal content they may be unwittingly hosting (see the sketch below).
- Victims could report NCII abuse at a centralised location and have it taken down across a range of websites.
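A minimal sketch of how such a centralised hash registry might be consulted by multiple websites follows. The class, method names, and website names are hypothetical; a real deployment would expose an authenticated API and rely on perceptual hashing rather than this in-process illustration.

```python
class NCIIHashRegistry:
    """Hypothetical centralised registry: a victim reports once, and
    every participating website checks uploads against the same database."""

    def __init__(self) -> None:
        self._hashes: set[str] = set()

    def report(self, image_hash: str) -> None:
        # Only the hash is stored; the image itself never reaches the registry.
        self._hashes.add(image_hash)

    def is_known_ncii(self, image_hash: str) -> bool:
        return image_hash in self._hashes


# Usage: two hypothetical websites consult the shared registry before
# publishing a fresh upload.
registry = NCIIHashRegistry()
registry.report("3f6a9c...")       # hash reported once by the victim

for site in ("website-a.example", "website-b.example"):
    upload_hash = "3f6a9c..."      # hash of a new upload attempt
    if registry.is_known_ncii(upload_hash):
        print(f"{site}: upload blocked pending review")
```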
- Grievance Redressal Mechanism:
- The government can also play a role in facilitating a redressal mechanism.
- For example, Australia has appointed an “e-Safety Commissioner” who receives complaints about NCII and coordinates between complainants, websites, and the individuals who posted the content; the Commissioner is empowered to issue “removal notices” against illegal content.
- Pairing a hash database with an independent body like the Commissioner may significantly reduce the spread of NCII.
- Risk of Image-matching Technology being used for Surveillance:
- Image-matching technology could also be misused for surveillance or simply to remove unpopular (but not illegal) content from the internet.
- For example, the CBI has reportedly used the image-matching tool “PhotoDNA”, which was built to identify CSAM, for broader investigatory purposes.
- To counter such risks, the hash database for CSAM is maintained neither by private companies nor by governments, but by independent organisations.
- Similarly, Meta has partnered with the Revenge Porn Helpline to administer its NCII tool.
- Coordinated Response of Government, Companies and Independent Organisations:
- The government’s reported overhaul of the Information Technology Act is an opportunity to develop a coordinated response to NCII abuse that provides victims meaningful redress without restricting online speech.
- In the interim, courts should balance the harm caused by NCII with the need to protect online speech.
- Verification of URL:
- Courts may consider tasking a state functionary or independent body with verifying the URLs and coordinating with online platforms and internet service providers.
- Taking Down of Illegal content only:
- Courts should direct platforms to take down NCII only where the content would be illegal in every foreseeable context.
- Reinstatement of Content:
- It must be ensured that the individual who posted the content can seek reinstatement.
- Also, courts should not demand absolute outcomes but rather require that platforms take affirmative steps to address the issue.
Source: Indian Express
Mains Question:
Q. What are the legal provisions in India with respect to sharing of non-consensual intimate images (NCII)? Highlight the issues associated with them and the efforts taken to tackle the spread of NCII. (250 words).