Australia Passes World-First Law Banning Under-16s From Social Media

Syllabus: GS1/ Social Issue

In News

  • The Australian Senate passed a law that imposes fines on platforms like TikTok, Facebook, Snapchat, Reddit, X, and Instagram if they fail to prevent users under the age of 16 from creating accounts. 

About the Legislation

  • Objective: To protect young people from the potential harms of online platforms, such as cyberbullying, addiction, and exposure to harmful content.
  • Strict Enforcement: Social media platforms will be held accountable for enforcing age restrictions and could face significant fines for non-compliance.

Challenges in Banning Social Media

  • Privacy Concerns: Platforms may require users to verify their age using government-issued identification, raising concerns over the collection and storage of sensitive personal data.
  • Age Verification: Reliable age verification remains the biggest implementation challenge, as current methods can be inaccurate or easily bypassed.
  • Potential for Circumvention: Experts argue that the ban could lead to increased use of anonymous platforms and VPNs, making it difficult to monitor online activity.
  • Exposure to Harmful Sites: The ban could inadvertently push young people towards more dangerous online spaces, such as the Dark Web, increasing their exposure to cybercrime.

Impact of Social Media Addiction on Children

  • Psychological Impacts: Excessive social media use has been linked to increased rates of anxiety, depression, and low self-esteem.
    • Children can also be exposed to cyberbullying, which can have severe emotional and psychological consequences.
  • Physical Impacts: Excessive screen time can lead to a sedentary lifestyle, contributing to obesity and other health problems such as eye strain and poor posture.
  • Social and Emotional Impacts: Social media can fuel FOMO (Fear of Missing Out), hinder the development of face-to-face communication skills, and erode real-life relationships.

Way Ahead

  • Stricter Age Verification: Social media platforms should implement robust age verification systems to ensure that only users who meet the minimum age requirements can access their services.
  • Parental Consent: Platforms could require parental consent for users below a certain age.
  • Digital Literacy Education: Schools should incorporate digital literacy into their curriculum to teach young people about the responsible use of technology.
  • Platform-Based Interventions like Time Limits: Social media platforms can implement features that limit screen time, especially for younger users.
    • Platforms can use AI-powered tools to filter harmful content and promote positive content.
  • Government Regulations: Strong data privacy laws can protect users’ personal information and prevent data breaches.
    • Governments can work with social media platforms to develop and enforce stricter content moderation standards.
  • Digital Detox Camps: Organizing camps to encourage digital detox and promote offline activities.

Source: LM