Social media restrictions for minors under 16 years old are now outlined by Australia's online safety authority
Australia's eSafety Commissioner has published regulatory guidance for social media platforms on age assurance requirements under the Online Safety Act 2021. The 55-page document outlines how regulated platforms can comply with Australia's social media minimum age obligation. From December 10, 2025, social media platforms will be required to take 'reasonable steps' to prevent users under the age of 16 from having accounts.

Platforms are encouraged to take a layered approach to age assurance, and self-declaration by users is not considered sufficient to meet the legal obligation. The guidance is principles-based and does not prescribe any specific technology that must be used. Platforms are expected to consider which existing systems they can leverage for compliance and what additional measures will be necessary. They cannot rely on government ID as the sole method of age verification and must always offer reasonable alternatives.

Veronica Scott, a privacy law expert at Pinsent Masons, said platforms will prioritize identifying and engaging with existing underage users, with no legally enforced minimum standards or specific technology mandated by the government. Platforms are expected to treat existing underage users with care, deactivating or removing accounts with clear communication and preventing re-registration or other circumvention by underage users.

Separately, the Australian government will introduce a Children's Online Privacy Code by December 2026, aimed at strengthening privacy protections for children. Social media services, messaging apps, online games, cloud storage platforms, and any other service likely to be accessed by children will be subject to the code.

Age assurance measures need to be reliable, robust, and effective, and cannot remain static. Platforms will need a range of measures, which may include in-house or third-party technology.
A range of test data from the government's age assurance technology trial has been released, which should provide helpful context on the effectiveness of different age verification methods. The guidance also emphasizes record-keeping for age assurance, focusing on systems and processes rather than user-level data. Platforms are expected to offer an accessible review mechanism for users who believe they have been wrongly flagged as underage.

The guidance builds on the recently released self-assessment guide for age-restricted social media platforms. Notably, platforms operating in Australia such as Snapchat, Instagram, TikTok, and Facebook have been required since July 25, 2025, to implement age verification measures barring users under 16 from having accounts, with no unified technical solution prescribed. These platforms must identify and remove underage users to comply with the new laws, which impose significant fines for non-compliance.

In conclusion, the new regulatory guidance provides a comprehensive framework for social media platforms to achieve age assurance compliance in Australia. Platforms are expected to prioritize the safety and privacy of young users while implementing effective and adaptable age verification measures.