Date: 04/11/2022
Relevance: GS-2: Effect of policies and politics of developed and developing countries on India's interests
Key Phrases: Gonzalez vs Google case, Section 230 of the Communications Decency Act of 1996, Twitter, Inc. v. Taamneh Case, Issue of YouTube algorithms, Stratton Oakmont, Inc vs Prodigy Services Co case in 1995, The Twenty-Six Words That Created the Internet
Why in News?
- The US Supreme Court has decided to hear a high-stakes case that will decide to what extent tech companies can be held legally liable for the content published on their platforms.
- The case concerns whether Google was in the wrong for recommending YouTube videos that helped encourage ISIS recruitment, and by extension a separate case brought by Twitter over similar content.
Gonzalez vs Google case:
- The case arose from the killing of Nohemi Gonzalez, a 23-year-old American law student studying in Paris, in the ISIS attacks of November 2015 that left 129 people dead.
- Nohemi’s family sued Google, alleging that ISIS posted “hundreds of radicalising videos inciting violence and recruiting potential supporters” on YouTube, which is owned by Google, and that YouTube’s algorithms promoted this content to users whose characteristics indicated they would be interested in ISIS videos.
- The Supreme Court will decide whether the protections under Section 230 of the Communications Decency Act of 1996 extend to targeted video recommendations on social media platforms, or whether platforms are legally protected only for the content published on them.
- Reynaldo Gonzalez, who brought the case, has argued that platforms’ legal protection should be limited to “traditional editorial functions” such as “whether to publish, withdraw, postpone or alter content”, and not extend to recommendations, while Google argues that its recommendations are protected under Section 230.
Twitter, Inc. v. Taamneh Case:
- The Supreme Court will also take up Twitter, Inc. v. Taamneh, a related case that was brought against Twitter, Facebook and YouTube seeking to hold them liable for extremist content published on their platforms in light of a 2017 terrorist attack in Turkey.
Issue of YouTube algorithms:
- Algorithms determine the content people watch on social media, the websites they visit and the advertisements they are shown on search engines.
- YouTube has been described as one of the most powerful radicalising instruments of the 21st century because of its algorithms’ propensity to serve up ever more extreme versions of the content its users choose to watch.
- As per a Mozilla Foundation study published in July 2021, 70% of the objectionable videos that participants flagged had been surfaced by the platform’s recommendation system.
- Current and former YouTube engineers are of the view that YouTube doesn’t consciously try to recommend extremist content.
- However, the platform’s algorithm highlights videos that are “already drawing high traffic and keeping people on the site,” which tend to be “sensationalistic” (a simplified illustration of this engagement-driven ranking follows below).
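- To make the engagement-driven ranking described above concrete, the following is a minimal, purely hypothetical sketch, not YouTube’s actual system: the video titles, fields and weights are assumptions used only for illustration. It shows how a ranker whose objective is watch time and clicks will, by construction, keep surfacing whatever already draws high traffic, with nothing in the objective to penalise extreme or sensationalist content.

```python
# Hypothetical illustration only: NOT YouTube's actual algorithm.
# A ranker that scores candidate videos purely on past engagement keeps
# promoting whatever already draws high traffic, because nothing in the
# objective distinguishes harmful or extreme content.
from dataclasses import dataclass


@dataclass
class Video:
    title: str
    avg_watch_minutes: float    # how long viewers typically stay (assumed signal)
    click_through_rate: float   # fraction of impressions clicked (assumed signal)


def engagement_score(v: Video) -> float:
    # Assumed weighting: watch time and clicks are all that matter.
    return 0.7 * v.avg_watch_minutes + 0.3 * (v.click_through_rate * 100)


def recommend(candidates: list[Video], k: int = 3) -> list[Video]:
    # Pure engagement ranking: the most attention-holding videos rise to the top.
    return sorted(candidates, key=engagement_score, reverse=True)[:k]


if __name__ == "__main__":
    catalogue = [
        Video("Calm explainer", avg_watch_minutes=4.0, click_through_rate=0.02),
        Video("Sensational claim", avg_watch_minutes=9.0, click_through_rate=0.12),
        Video("Extreme follow-up", avg_watch_minutes=11.0, click_through_rate=0.15),
    ]
    for video in recommend(catalogue):
        print(video.title)
```

- The point of the sketch is that amplification is a property of the ranking objective rather than of hosting as such, which is precisely the distinction the Gonzalez family asks the courts to draw between hosting content and recommending it.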
Historical background of the framing of internet laws:
- Stratton Oakmont, Inc vs Prodigy Services Co case in 1995: The New York Supreme Court held that online service providers could be held liable for the speech of their users. The case was instrumental in the framing of Section 230 of the Communications Decency Act in 1996.
- Section 230 of the Communications Decency Act of 1996:
- It offers two protections to websites that host third-party content online:
- First, it shields websites from civil lawsuits arising out of illegal content posted by their users.
- Second, Section 230 states that websites retain this lawsuit immunity even if they engage in content moderation that removes or “restricts access to or availability of material” posted on their site.
- These twin safeguards have fundamentally shaped the internet’s development.
Issues with Section 230 of the Communications Decency Act of 1996:
- While Section 230 protects websites that remove content they find objectionable, it is not clear that it protects websites that promote illegal content.
- For instance, if a defamatory tweet is published about someone and Twitter sends a promotional email to its users telling them to check out this tweet, Twitter could be sued over that email promoting a false claim, even though Section 230 prevents it from being sued over the tweet itself.
The Twenty-Six Words That Created the Internet:
- It is a book by cybersecurity law professor Jeff Kosseff about Section 230, whose 26 words remain the backbone of the modern internet.
- The 26 words are: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider”.
Arguments of Gonzalez family:
- The Gonzalez family argues that YouTube’s algorithm should be treated the same way as Twitter would be treated if it sent mass emails promoting defamatory tweets.
- While Google cannot be sued because ISIS posts a video to one of its websites, the Gonzalez family claims that Google can be sued because one of its websites uses an algorithm that shows ISIS content to users who otherwise most likely would not have seen it.
- Social media companies have so far been shielded from legal liability for the content that users publish on their platforms under Section 230 of the Communications Decency Act.
- However, the Gonzalez suit argues that while YouTube may have legal protection for hosting whatever its users post on it, it should not have protection for its machine-learning “recommendation” algorithms that decide what viewers should watch next.
Conclusion:
- There is an urgent need for reforms in order to hold platforms accountable for misinformation and hate speech.
- Exactly 26 years have passed since the 26 words that ‘created’ the internet were framed.
- Internet giants are no longer in their formative stage, and they now face a range of issues, from antitrust and privacy to misinformation, algorithmic discrimination and lack of transparency.
- Is it time for a paradigm shift in the way the internet runs, with a new law that strikes a sensible balance between ensuring that important websites continue to function and including safeguards against the promotion of illegal content?
Source: The Hindu BL
Mains Question:
Q. Internet giants are no longer in their formative stage and face a range of issues - from antitrust and privacy to misinformation, algorithmic discrimination and lack of transparency. In this context, discuss the need for reforms to hold platforms accountable for misinformation and hate speech. (150 words).