Australia’s internet regulator has raised concerns about how major platforms handle the country’s ban on social media use for children under 16. The rule came into force late last year, but enforcement remains inconsistent.
The watchdog reported that companies including Meta (owner of Facebook and Instagram), Snap Inc. (maker of Snapchat), TikTok, and YouTube have not taken sufficient steps to stop underage users from accessing their services.
Authorities introduced the ban to protect young users from harmful content and addictive algorithms. However, early findings show that many companies still struggle to enforce the rules effectively.
Key Issues Identified
Regulators highlighted several weaknesses in how platforms apply the law:
- Some platforms allowed users who previously declared themselves under 16 to re-verify their age and regain access
- Systems let children retry age verification methods multiple times
- Companies failed to block new underage users from creating accounts
- Reporting tools for parents remain limited and ineffective
These gaps suggest that enforcement relies too heavily on verification flows that children can simply reset and retry, as the sketch below illustrates.
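For illustration only: the first two loopholes boil down to missing persistence and missing rate limits in the age-gate flow. The minimal Python sketch below shows one way a platform could close them, by treating a self-declaration of being under 16 as permanent and capping verification retries. The class, method names, and retry limit are hypothetical assumptions for this example, not details from the regulator's report or any platform's real system.

```python
from dataclasses import dataclass, field

# Illustrative cap on retries; not a figure from the regulator's report.
MAX_VERIFICATION_ATTEMPTS = 3


@dataclass
class AgeGate:
    """Hypothetical age-gate addressing the first two loopholes above:
    an under-16 self-declaration is persisted permanently, and
    verification retries are bounded per account."""
    declared_under_16: set[str] = field(default_factory=set)
    attempts: dict[str, int] = field(default_factory=dict)

    def declare_age(self, user_id: str, age: int) -> None:
        # Persist the declaration: a user who once said they were under 16
        # cannot later "re-verify" their way back in.
        if age < 16:
            self.declared_under_16.add(user_id)

    def may_attempt_verification(self, user_id: str) -> bool:
        # Block users with a standing under-16 declaration outright,
        # and give everyone else a bounded number of attempts.
        if user_id in self.declared_under_16:
            return False
        used = self.attempts.get(user_id, 0)
        if used >= MAX_VERIFICATION_ATTEMPTS:
            return False
        self.attempts[user_id] = used + 1
        return True
```

The design point is simply that a self-declaration should be a one-way door and retries should be bounded; the loopholes the regulator describes arise when both checks can be reset and replayed until a child passes.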
Initial Impact of the Ban
In the first month after implementation, platforms restricted or removed around 4.7 million accounts. Despite this, regulators believe many underage users remain active on these platforms.
Officials now plan to move from monitoring to strict enforcement. They will collect evidence to determine whether companies have taken reasonable steps to comply with the law.
Industry Response
Social media firms argue that verifying age accurately remains a major technical challenge. Meta stated that stronger age verification at the app store level, combined with parental approval, could provide a better solution.
Meanwhile, Snap Inc. reported that it has already locked hundreds of thousands of accounts and continues to take action daily.
What Comes Next?
Australia’s approach is drawing global attention. Other countries are watching closely as governments explore stricter controls to protect children online.
Regulators now expect platforms to implement stronger systems, improve transparency, and provide better tools for parents. Without these changes, companies may face legal consequences.
