Australia social media ban: Meta blocks 550,000 accounts under new law - BBC

Australia's Social Media Ban for Kids: A Deep Dive

In November 2024, Australia passed a landmark law banning children under 16 from holding social media accounts. The law, which came into force on 10 December 2025, requires major social media platforms to take reasonable steps to detect and deactivate accounts held by under-16s. In this article, we will summarize the key points of the law and its impact on social media companies.

The Law: A Brief Overview

The new law, the Online Safety Amendment (Social Media Minimum Age) Act 2024, sets a minimum age of 16 for holding an account on major social media platforms such as Facebook, Instagram, TikTok and Snapchat. Platforms must take reasonable steps to prevent under-16s from creating accounts and to deactivate existing under-age accounts, with compliance overseen by Australia's eSafety Commissioner.

The Numbers: 550,000 Accounts Blocked

In the days leading up to the law taking effect, Meta (the parent company of Facebook and Instagram) reported that it had blocked or removed around 550,000 Australian accounts belonging to users aged 13 to 15. The blocking was carried out to comply with the law's requirement that platforms take reasonable steps to keep under-16s off their services.

Why Was the Law Introduced?

The law was introduced in response to concerns about the impact of social media on children's mental health and wellbeing. Numerous studies have linked excessive social media use to increased rates of depression, anxiety and loneliness among young people.

What Does the Law Say About Social Media Companies?

The law requires social media companies to take reasonable steps to enforce the minimum age of 16. In practice, this means:

  • Age assurance: platforms must take reasonable steps to establish users' ages, for example through age estimation or verification checks.
  • Account deactivation: existing accounts held by users under 16 must be detected and deactivated.
  • No parental consent exemption: parents cannot authorise a child under 16 to keep an account.
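The age threshold at the heart of these requirements can be illustrated with a minimal sketch. This is not how platforms actually implement age assurance (real systems rely on document checks, age estimation or behavioural signals rather than a self-declared birthdate); the function names here are purely illustrative:

```python
from datetime import date

MINIMUM_AGE = 16  # threshold set by the Australian law


def age_on(dob: date, today: date) -> int:
    """Full years elapsed between the date of birth and today."""
    # Subtract one year if this year's birthday has not yet occurred.
    return today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))


def may_hold_account(dob: date, today: date) -> bool:
    """True if the user meets the minimum-age requirement."""
    return age_on(dob, today) >= MINIMUM_AGE


# On the day the law took effect: a 15-year-old is blocked, a 16-year-old is not.
print(may_hold_account(date(2010, 6, 1), date(2025, 12, 10)))  # False
print(may_hold_account(date(2009, 6, 1), date(2025, 12, 10)))  # True
```

The hard problem the law poses is not this comparison but establishing the date of birth reliably in the first place, which is why the Act speaks of "reasonable steps" rather than prescribing a single verification method.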

What Are the Consequences of Non-Compliance?

Social media companies that fail to comply with the law may face significant consequences, including:

  • Financial penalties: fines of up to A$49.5 million for systemic failures to take reasonable steps.
  • Regulatory action: enforcement by the eSafety Commissioner, which oversees platforms' compliance.
  • Loss of reputation: platforms seen to flout the law risk losing the trust of users and regulators.

What Does the Law Mean for Parents?

The law places the compliance burden on platforms rather than on families: neither parents nor under-16s face penalties, and there is no parental consent route that allows a child to keep an account. This is welcome news for parents concerned about their child's online activity, but it also raises questions about how effective the measure will be, given workarounds such as VPNs and false birthdates.

How Will the Law Affect Social Media Companies?

The law will require social media companies to make significant changes to their platforms and policies. This may include:

  • Implementing robust age assurance: platforms must be able to establish, with reasonable accuracy, which users are under 16, whether through age estimation, ID checks or other signals.
  • Detecting and deactivating under-age accounts: existing accounts held by under-16s must be identified and closed.
  • Providing review mechanisms: users wrongly flagged as under 16 need a way to challenge the decision and restore their accounts.

Conclusion

Australia's social media ban for kids is a landmark law that sets a minimum age of 16 for holding social media accounts and requires major platforms to take reasonable steps to keep under-16s off their services, on pain of fines of up to A$49.5 million. While the law has the potential to have a positive impact on children's mental health and wellbeing, it also raises questions about how effectively it can be enforced and whether determined children will simply find workarounds.

Recommendations

Based on our analysis of the law, we recommend that social media companies take the following steps:

  • Invest in accurate age assurance: age-estimation and verification systems must be reliable enough to identify under-16 users without wrongly locking out adults.
  • Deactivate under-age accounts transparently: tell affected users why their account is being closed and what will happen to their data.
  • Provide a review process: give users wrongly flagged as under 16 a straightforward way to appeal and restore their accounts.

By taking these steps, social media companies can help ensure that the law is effective in keeping under-16s off their platforms while minimising disruption for everyone else.
