Daily Current Affairs / 20 May 2024

EU Probes Facebook and Instagram Over Child Safety Concerns: Daily News Analysis


Context

The European Union has launched a fresh investigation into Meta's social media platforms, Facebook and Instagram, over concerns that they may be failing to protect children effectively. If violations are found, the investigation could lead to significant fines. The analysis below covers the background, the specific concerns, the regulatory framework, Meta's current efforts, and the broader implications.

Background of the Investigation

  • Concerns Raised by the EU: The EU's executive arm has raised concerns that Facebook and Instagram's recommendation engines could be exploiting the vulnerabilities and inexperience of children, potentially fostering addictive behavior. There is a specific worry about the so-called "rabbit hole" effect, in which users are led to increasingly disturbing content. These concerns have prompted the EU to investigate whether Meta's platforms comply with the Digital Services Act (DSA) and effectively protect the privacy, safety, and security of minors.

What is the Rabbit Hole Effect?

The "rabbit hole effect" on social media refers to the tendency of users to consume similar media in a row, which can lead to them preferring more similar media in the future. This effect can be caused by algorithms that promote inflammatory content, such as hate speech, misinformation, and extremist content. It can also be caused by content creators who encourage interaction to get users to pay attention to their pages. This can cause users to lose sight of their original purpose for searching online and get derailed.

  • Scope of the Investigation: The investigation will examine Meta's use of age-verification tools designed to prevent children under 13 from accessing its platforms. The EU will gather evidence through additional requests for information, interviews, and inspections. The Commission can also accept commitments from Meta to remedy the issues identified during the investigation.

The Digital Services Act (DSA)

  • Protection Measures for Minors: Under the DSA, platforms must implement measures to protect minors from content that could harm their physical, mental, or moral development, including age-verification systems, parental control tools, and mechanisms that let minors report abuse and access support. Because Facebook and Instagram each exceed the DSA's threshold of 45 million monthly active users in the EU, they are designated "very large online platforms" and are subject to its most stringent obligations.
  • Meta’s Efforts to Protect Children: Meta announced earlier this year that it was testing an AI-driven "nudity protection" tool designed to detect and blur images containing nudity sent to minors through its messaging system. This initiative aims to shield minors from inappropriate content and enhance their online safety.
  • Additional Safety Measures: Meta has also introduced several measures to protect users under 18, including tightening content restrictions and enhancing parental supervision tools. These efforts reflect Meta’s attempt to comply with regulatory requirements and improve the safety of its younger users.

Broader Regulatory Scrutiny

     Previous Investigations by the EU

This is not the first time Meta's platforms have come under EU scrutiny. In April, the European Commission opened an investigation into Meta over allegations that it failed to address deceptive advertising and disinformation in the lead-up to the European Parliament elections. That investigation focused on Meta's potential role in disseminating disinformation from foreign actors, including Russia, China, and Iran, aimed at influencing EU voters.

     U.S. Regulatory Actions

Meta's Instagram has also faced significant criticism in the United States. A Wall Street Journal report in June 2023 revealed that the platform was connecting and promoting networks of accounts involved in underage-sex content. In response, Meta said it was strengthening internal controls and reported removing 27 pedophile networks and 490,000 accounts that violated its child safety policies within a month.

 

Regulatory Laws for Child Protection Across the Globe

     Children's Online Privacy Protection Act (COPPA): Enacted in the USA in 1998 and in force since 2000, COPPA prohibits platforms from collecting personal information from children under 13 without parental consent.

     UN Committee on the Rights of the Child (CRC): The Committee recommends that states adopt strong measures, including legislation, to protect children from harmful and misleading content.

     Guidelines Issued by China: China restricts minors' internet use, banning access from 10 p.m. to 6 a.m. and limiting usage for those aged 16 to 18 to two hours a day.

     World Health Organization Recommendations: The WHO advises no screen time for children under 2 and recommends no more than one hour a day for children aged 2 to 4.

     Digital Personal Data Protection (DPDP) Act, 2023: India's data protection law requires parental consent for processing children's data; its mechanisms for age verification and parental consent should adhere to basic data protection principles and safeguards such as data minimisation and purpose limitation.

Best Practices for Parents and Guardians

The digital age has transformed the way children interact with the world, bringing both opportunities and risks. Social media platforms, while offering avenues for creativity and connection, also pose significant dangers, particularly for young users.

While regulatory measures are essential, the role of parents and guardians in safeguarding children's online experiences cannot be overstated.

     Parental Guidance and Supervision: In the digital age, ensuring children's online safety is challenging. Parents should stay informed about online risks and implement safeguards like setting up child profiles, selecting age-appropriate apps, and using child-friendly sites and search engines.

     Age-Restricted Content: Parents must block age-restricted content on their children's devices. Supervising children's online activities and spending time with them online can prevent harmful interactions and protect against online predators.

     Reporting and Blocking Offensive Material: Minors should know how to report and block offensive content. Open conversations about online experiences ensure children feel comfortable seeking help when needed.

Conclusion

The ongoing EU investigation into Facebook and Instagram underscores the delicate balance between technological innovation and user safety. As social media platforms continue to evolve, so too must the measures to protect their most vulnerable users. The outcome of this investigation could set important precedents for how digital platforms operate and prioritize child safety in the future.

Probable Questions for UPSC Mains Exam

  1. What are the primary concerns raised by the European Union regarding Facebook and Instagram's impact on child safety, and how do these concerns relate to the "rabbit hole effect"? (10 Marks, 150 Words)
  2. How does the Digital Services Act (DSA) aim to protect minors on large online platforms, and what specific measures has Meta implemented to comply with these regulations? (15 Marks, 250 Words)

Source: The Hindu