Tech Giants Concede but Contest Australia’s Under-16 Social Media Law
Meta Platforms, TikTok and Snapchat signal compliance with Australia’s landmark youth social-media law—while warning of unintended consequences.
Australia | Tuesday, October 28, 2025
Major social-media companies have announced they will comply with Australia’s pioneering law restricting use of platforms by children under 16, while simultaneously expressing strong reservations about the legislation’s effectiveness and broader impact.
At a parliamentary hearing on Tuesday, executives from Meta Platforms (the parent company of Facebook and Instagram), ByteDance’s TikTok, and Snap Inc. (owner of Snapchat) confirmed that they will abide by the new regulations when they come into force on December 10, 2025.
However, each company made clear that although it is prepared to comply, it considers the policy flawed, warns of unintended consequences, and flags significant implementation challenges. Meta’s Australia & New Zealand policy director, Mia Garlick, said the company was “working through many significant engineering and age-assurance challenges” as it prepares to deactivate accounts it determines to belong to users under 16.
TikTok’s Australia public-policy lead, Ella Woods-Joyce, told the inquiry that while TikTok will meet its obligations, “we don’t agree the ban will protect young people” and argued the measure could drive under-16s toward less moderated corners of the internet.
Snap’s senior vice-president of global policy and platform operations, Jennifer Stout, summarised the company’s position: “We don’t agree, but we accept and we will abide by the law.”
What the Law Requires
The legislation, the Online Safety Amendment (Social Media Minimum Age) Act 2024, mandates that social-media platforms take “reasonable steps” to prevent children under 16 from creating or maintaining accounts. Non-compliance may result in fines up to A$49.5 million (approx. US$32 million).
Key responsibilities for platforms include:
- Identifying and deactivating existing accounts held by under-16s.
- Preventing new account registrations by under-16s.
- Taking steps to prevent work-arounds, such as false age claims or VPN usage.
Australia’s regulator, the eSafety Commissioner, has indicated the law is not about verifying the age of every user but about ensuring platforms apply “reasonable steps” to restrict under-16s from accounts.
Why the Companies Oppose It
Despite their agreement to abide by the law, the tech firms voiced several key concerns:

1. Effectiveness: TikTok and others argued the ban will not protect young people and could instead push under-16s toward less moderated corners of the internet.
2. Implementation complexity: Meta said it faces “significant new engineering and age-assurance challenges.” Identifying and verifying under-16s with accuracy, while preserving privacy, remains technically difficult.
The Regulatory & Political Backdrop
Australia’s push for stricter youth online protections comes amid mounting concern about social media’s harm to young people’s mental health. The legislation was passed in late 2024, giving platforms a transition period until December 10, 2025.
The eSafety Commissioner has led engagement with platforms and published FAQs explaining that the measure aims not to “ban” under-16s outright but to delay access until age 16 and require better safeguards, stating: “We want kids to know who they are before platforms assume who they are.”
What Happens Next: Compliance & Risk
From December 10 onwards, platforms must begin deactivating or archiving accounts of users they believe to be under 16. Meta estimates it has around 450,000 under-16 accounts in Australia (across Facebook and Instagram); TikTok estimates 200,000; Snap estimates 440,000.
Platforms plan to notify affected account holders, provide data-download options, or suspend accounts until the user turns 16. For those falsely flagged, Meta and TikTok will offer third-party age estimation tools; Snap is still developing appeals procedures.
The eSafety Commissioner will monitor compliance and may impose penalties if platforms fail to meet their obligations. Enforcement will focus on whether “reasonable steps” have been taken rather than on guaranteeing zero under-16 usage.
Global Implications & Stakeholder Reactions
The legislation is being watched globally as one of the most stringent efforts to regulate youth access to social media. Analysts believe other jurisdictions may follow suit.
Privacy advocates caution that age-verification and account deactivation carry risks of exclusion, error and migration to unregulated platforms. In Australia, critics have likened the measure to delaying account creation rather than banning use entirely.
Parents, educators and mental-health professionals are divided. Some welcome stronger protections; others worry about pushing teens offline or into private apps harder to monitor.
Bottom Line
Meta, TikTok and Snap are walking a line: they are complying with Australia’s under-16 social-media restrictions to maintain access to one of the world’s most advanced digital markets, yet they have not endorsed the principle behind the law. Their unified position—“we don’t agree, but we will comply”—speaks to the tension between regulatory demands and platform business models.
For the Australian government, this moment offers both a test and a precedent: can regulation meaningfully curb youth exposure online without pushing young people into the internet’s darker corners or creating unintended harms? The coming months of implementation will show whether “reasonable steps” is a meaningful standard—and whether platforms can truly enforce age restrictions at scale.