
03 Jul 2024

Digital Jurisprudence in India: Daily News Analysis


Context:

While Generative AI (GAI) is a transformative force, existing legal frameworks and judicial precedents, designed for a pre-AI world, may struggle to govern this rapidly evolving technology effectively.

Generative AI

  • Concept of Generative AI (GAI)
    • Generative AI (GAI) is a transformative technology capable of reshaping society in groundbreaking ways. It refers to a form of artificial intelligence that produces new content such as text, images, audio, and video. Unlike traditional AI systems, which are designed to recognize patterns and make predictions, generative AI creates entirely new content. It is powered by foundation models: large AI models capable of multi-tasking and performing a variety of tasks out of the box, including summarization, Q&A, and classification.
    • Some popular generative AI tools include ChatGPT, an AI-powered chatbot developed by OpenAI that can generate written content and converse fluently with users, and Bard, a generative AI chatbot created by Google based on LaMDA language model technology, which can answer user questions or create new content from text or image prompts.
  • Generative AI as Intermediaries or Conduits
    • There are contrasting views on the role of GAI tools. Some argue that they should be considered intermediaries like search engines, even though they do not host links to third-party websites. Others see them as mere “conduits” for user prompts, where altering the prompt changes the output, making the generated content akin to third-party speech, thereby attracting lesser liability.

Safe Harbour and Liability Fixation

  • Intermediary Liability
    • One persistent and contentious issue in Internet governance is fixing liability on “intermediaries” for content they host. The landmark Shreya Singhal judgement (2015) upheld Section 79 of the IT Act, which grants intermediaries ‘safe harbour’ protection against liability for third-party content, contingent on meeting the due diligence requirements outlined in Rule 3(1)(b) of the Information Technology (Intermediaries Guidelines) Rules. However, its application to Generative AI tools remains challenging.
  • Legal Precedents
    • In Christian Louboutin SAS vs. Nakul Bajaj and Ors. (2018), the Delhi High Court held that safe harbour protection applies solely to “passive” intermediaries. However, distinguishing between user-generated and platform-generated content in the context of Large Language Models (LLMs) is increasingly difficult. Arguably, liability for an AI chatbot’s output arises only once the information is reposted on other platforms by the user; a mere response to a user prompt may not amount to dissemination.
  • Case Studies
    • Generative AI outputs have led to legal disputes in various jurisdictions. For instance, in June 2023, a radio host in the United States sued OpenAI, alleging that ChatGPT had defamed him. The ambiguity in classifying GAI tools complicates courts' ability to assign liability, particularly where users repost the generated content.

The Copyright Conundrum

  • Current Copyright Laws
    • Section 16 of the Copyright Act, 1957 states that “no person” shall be entitled to copyright protection except under and in accordance with the provisions of the Act. Globally, there is reluctance to extend copyright protection to works generated by AI.
  • Key Questions
    • Critical questions include whether existing copyright provisions should be revised to accommodate AI, whether co-authorship with a human should be mandatory for AI-generated works, and whether recognition should extend to the user, to the program itself (and by extension its programmer), or to both jointly. The 161st Parliamentary Standing Committee Report found the Copyright Act, 1957 “not well equipped to facilitate authorship and ownership by Artificial Intelligence.”
  • Legal Responsibilities
    • Under current Indian law, a copyright owner can take legal action against anyone who infringes their work, with remedies such as injunctions and damages. However, it remains unclear who is responsible when an AI tool infringes copyright. The difficulty of classifying GAI tools as intermediaries, conduits, or active creators further complicates courts' ability to assign liability. ChatGPT’s ‘Terms of Use’ attempt to shift liability to the user for any illegal output, but the enforceability of such terms in India is uncertain.

Privacy and Data Protection

  • Privacy Jurisprudence
    • The landmark K.S. Puttaswamy judgement (2017) by the Supreme Court of India established a strong foundation for privacy jurisprudence, leading to the enactment of the Digital Personal Data Protection Act, 2023 (DPDP). While traditional data aggregators or consent managers raise privacy concerns, Generative AI introduces a new layer of complexity.
  • Right to Erasure and Right to Be Forgotten
    • The DPDP Act introduces the “right to erasure” and the “right to be forgotten.” However, once a GAI model is trained on a dataset, it cannot truly “unlearn” the information it has already absorbed, raising critical questions about how individuals can exercise control over their personal information integrated into AI models.

Steps to Pursue

  • Learning by Doing
    • Consider granting GAI platforms temporary immunity from liability under a sandbox approach. This would allow responsible development while gathering data to identify legal issues that could inform future laws and regulations.
  • Data Rights and Responsibilities
    • The process of data acquisition for GAI training requires an overhaul. Developers must prioritize legal compliance by ensuring proper licensing and compensation for the intellectual property used in training models. Solutions could include revenue-sharing or licensing agreements with data owners.
  • Licensing Challenges
    • Licensing data for GAI training is complex, as web data lacks a centralized licensing body similar to the copyright societies in the music industry. A potential solution is to create centralized platforms, akin to stock photo websites like Getty Images, to simplify licensing, streamline data access for developers, and ensure data integrity by guarding against historical bias and discrimination.

Conclusion

The jurisprudence around Generative AI (GAI) remains hazy and is yet to evolve, demanding a comprehensive re-evaluation of existing digital jurisprudence. A holistic, government-wide approach and judicious interpretations by constitutional courts are essential to maximize the benefits of this powerful technology while safeguarding individual rights and protecting against harm.

Probable Questions for UPSC Mains

  1. How can the current safe harbour provisions under Section 79 of the IT Act be revised to address the unique challenges posed by generative AI tools? (10 Marks, 150 Words)
  2. What are the primary challenges in categorizing generative AI tools under existing legal definitions of intermediaries and content creators, and how do these challenges impact liability and safe harbour protections? What legal and regulatory measures should be implemented to ensure responsible development and use of generative AI in India, while balancing innovation with the protection of individual rights? (12 Marks, 250 Words)

Source: The Hindu