From 10 December 2025, platforms classified as “age-restricted social media platforms” (ARSMPs) under Part 4A of the Online Safety Act 2021 (Cth) (Part 4A) must be able to demonstrate that they are taking “reasonable steps” to prevent people in Australia under 16 from holding accounts on their services. Failure to do so will expose companies to investigation and enforcement by the eSafety Commissioner.
Part 4A operates alongside the Privacy Act 1988 (Cth) (Privacy Act) and the Australian Privacy Principles (APPs) in Schedule 1, and introduces additional obligations on ARSMPs and third-party age-assurance providers when handling information for compliance purposes. On 9 October 2025, the Office of the Australian Information Commissioner (OAIC) released “Privacy Guidance on Part 4A of the Online Safety Act 2021” (OAIC Guidance), which provides practical guidance on how entities should handle personal information to comply with the new Social Media Minimum Age (SMMA) scheme and the Privacy Act.
The OAIC Guidance complements the eSafety Commissioner’s Social Media Minimum Age Regulatory Guidance, which we explain in our article here. Together, these materials mark a coordinated regulatory approach: compliance with the SMMA scheme is not just about stopping under-16s from signing up — it is about doing so in a privacy-preserving and demonstrably accountable way.
For background on how age assurance technologies have been tested and evaluated in Australia, see our article on the recent Age Assurance Technology Trial here.
The OAIC makes clear that compliance with Part 4A will not be “reasonable” unless it also complies with the Privacy Act and APPs. Part 4A introduces stricter obligations than the existing privacy framework, including mandatory destruction (not de-identification) of personal information once the SMMA purpose is complete.
Platforms cannot retain information “just in case” it is useful later. Section 63F demands destruction — not repurposing, unless specifically permitted — and any breach will be treated as an interference with privacy under the Privacy Act. That means the OAIC can investigate and enforce directly, even against entities not previously regulated, such as small technology providers or overseas processors.
The OAIC expects age assurance solutions to be built on privacy-by-design principles, backed by an early-stage Privacy Impact Assessment (PIA) that examines proportionality, necessity and data minimisation. Entities should choose the least privacy-invasive method capable of achieving the SMMA goal and test its justification through a PIA before deployment.
The OAIC recommends establishing a “ring-fenced SMMA environment” — a segregated technical and data structure where age assurance information is processed, stored and destroyed separately from other systems. Only minimal artefacts, such as a binary “16+ yes/no” result, method and timestamp, should persist. Inputs like ID scans or selfies must be deleted immediately after use.
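By way of illustration only (the OAIC Guidance does not prescribe any particular architecture), a ring-fenced age check might persist nothing beyond the binary result, method and timestamp, and destroy the raw input in the same code path that produces the decision. The names below, such as AgeDecisionArtefact and verify_and_destroy, are hypothetical:

```python
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass(frozen=True)
class AgeDecisionArtefact:
    """The only record that persists after an age check: no raw inputs."""
    over_16: bool          # binary "16+ yes/no" result
    method: str            # e.g. "facial_estimation", "id_document"
    checked_at: datetime   # timestamp of the check


def verify_and_destroy(raw_input: bytearray, method: str) -> AgeDecisionArtefact:
    """Run an age check inside the ring-fenced environment.

    The raw input (ID scan, selfie, etc.) is zeroed and released in the
    same code path that produced the decision, so nothing sensitive can
    leak into general-purpose systems.
    """
    try:
        over_16 = _estimate_age(raw_input, method)  # provider-specific check
        return AgeDecisionArtefact(
            over_16=over_16,
            method=method,
            checked_at=datetime.now(timezone.utc),
        )
    finally:
        # Destruction, not de-identification: overwrite then discard.
        for i in range(len(raw_input)):
            raw_input[i] = 0
        raw_input.clear()


def _estimate_age(raw_input: bytearray, method: str) -> bool:
    raise NotImplementedError("stand-in for the actual age assurance call")
```

The design choice worth noting is that destruction sits in a finally block, so the input is erased even if the check itself fails.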
Information collected for SMMA purposes must be reasonably necessary and proportionate. Reuse of existing data (such as self-declared dates of birth or account tenure) to infer age is allowed only if it fits within APP 6 — consent, a reasonable expectation, or a legal basis under Part 4A.
The OAIC supports inference-based and AI-driven approaches, but with clear caveats: they must be transparent and demonstrably accurate, and must not rely on continuous behavioural tracking or on unnecessary sensitive data such as biometric information or content analysis.
Transparency is central to compliance. The OAIC expects just-in-time notifications at the point of data collection, explaining what information is being collected, by whom, for how long, and why. Updated privacy policies must clearly describe SMMA-specific processing and destruction practices.
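As a hypothetical sketch (not drawn from the Guidance itself), the OAIC's four expected disclosures could be made unavoidable by modelling the notice as a structure that requires each field before any check can run:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class JustInTimeNotice:
    """Fields a point-of-collection notice should surface to the user."""
    what_is_collected: str    # what information is being collected
    collected_by: str         # platform or named third-party provider
    retention: str            # for how long it will be held
    purpose: str              # why it is needed


notice = JustInTimeNotice(
    what_is_collected="a short selfie video",
    collected_by="ExampleAge Pty Ltd (third-party provider)",
    retention="destroyed immediately after the age estimate is produced",
    purpose="to check whether you are 16 or older, as required by Part 4A",
)
```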
Legal, product and design teams need to collaborate. Poorly designed consent or information screens — even if legally accurate — can amount to non-compliance.
Part 4A sets a higher bar for consent to secondary uses of information collected for SMMA purposes than the standard APP test. It must be voluntary, informed, current, specific and unambiguous, and capable of being withdrawn. The OAIC Guidance provides that practical considerations include no bundled or pre-ticked consents, no reliance on general terms of use, and simple withdrawal mechanisms in dedicated privacy settings or contextually appropriate screens. Each consent should be purpose-specific and time-limited, for example a one-time “16+ yes” token.
To demonstrate compliance, ARSMPs are advised to build withdrawal toggles and consent logs capable of showing when, how and for what purpose consent was given or withdrawn.
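A minimal sketch of such a consent log, assuming purpose-specific records with built-in expiry (the field names and example purposes are illustrative, not drawn from the Guidance):

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Optional


@dataclass
class ConsentRecord:
    """One purpose-specific, time-limited consent, with an audit trail."""
    purpose: str                       # a single, narrowly defined purpose
    granted_at: datetime               # when consent was given
    expires_at: datetime               # time-limited by design
    channel: str                       # how it was captured, e.g. "settings_screen"
    withdrawn_at: Optional[datetime] = None

    def is_active(self, now: datetime) -> bool:
        return self.withdrawn_at is None and now < self.expires_at

    def withdraw(self, now: datetime) -> None:
        # Withdrawal is logged, never erased, so the history stays auditable.
        self.withdrawn_at = now


# Example: a one-time consent that lapses after 30 days unless renewed.
consent = ConsentRecord(
    purpose="retain_16plus_result_for_account_recovery",
    granted_at=datetime.now(timezone.utc),
    expires_at=datetime.now(timezone.utc) + timedelta(days=30),
    channel="dedicated_privacy_settings",
)
```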
Section 63F’s destruction requirement is uncompromising: inputs like photos, document scans and biometric data must be destroyed as soon as the age check is complete, unless unambiguous consent to other purposes is obtained or another limited exception applies. The OAIC Guidance provides that entities should retain only minimal decision artefacts for troubleshooting, preventing fraud and circumvention, responding to complaints and reviews, and evidencing compliance. Retention for these purposes must be transparent and should be subject to time-based limits in accordance with industry standards.
Where the same collection serves multiple purposes (for instance, onboarding under AML/CTF or other regimes), each purpose must be narrowly defined with separate retention rules. Information should be destroyed once the last retention period has expired.
The OAIC urges organisations to develop retention matrices and automate deletion within ring-fenced environments.
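One way to express this in code, as a sketch rather than a prescribed approach, is a retention matrix keyed by purpose, with destruction falling due only once the last applicable period has expired (the purposes and periods below are invented for illustration):

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention matrix: each narrowly defined purpose carries
# its own time-based limit.
RETENTION_MATRIX = {
    "smma_decision_artefact": timedelta(days=90),
    "fraud_and_circumvention": timedelta(days=180),
    "complaint_handling": timedelta(days=365),
}


def destruction_due(collected_at: datetime,
                    purposes: list[str],
                    now: datetime) -> bool:
    """A record is destroyed once the LAST applicable period has expired."""
    if not purposes:
        return True  # no remaining purpose means no basis to retain
    deadlines = [collected_at + RETENTION_MATRIX[p] for p in purposes]
    return now >= max(deadlines)
```

A scheduled job inside the ring-fenced environment can then evaluate destruction_due against each record and delete those that return True, producing an auditable, automated deletion trail.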
Third-party age assurance providers are subject to the same privacy rules as platforms. The OAIC Guidance recommends that providers give destruction attestations to their platform clients, and providers may be investigated directly by the OAIC, including where they are based offshore.
Small businesses are also likely to be caught: the privacy obligations in section 63F apply to any entity that holds information collected for SMMA scheme purposes, and any provider that discloses or commercialises personal information for a benefit, service or advantage will not be able to rely on the small business exemption in section 6D(4)(c) of the Privacy Act.
While eSafety expects platforms to monitor for under-age users, the OAIC reminds industry that each reuse or re-check of personal information must still comply with the APPs: lawful reuse, quality, security and destruction. Ongoing or repeated checks should be necessary to comply with SMMA obligations, proportionate to risk, and not a standing surveillance feature.
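One way to keep re-checks event-driven rather than continuous, sketched here with invented signal names, is to gate them behind discrete, documented risk triggers:

```python
def recheck_required(signals: dict) -> bool:
    """Trigger a re-check only on discrete, documented risk signals,
    rather than continuously monitoring every user."""
    triggers = (
        signals.get("credible_underage_report", False),
        signals.get("failed_token_validation", False),
        signals.get("account_recovery_with_age_mismatch", False),
    )
    return any(triggers)
```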
This alignment between OAIC and eSafety guidance is explicit in the OAIC Guidance: steps to comply with the SMMA scheme will not be considered by the OAIC to be “reasonable” unless they are also privacy-compliant. In practice, that means privacy and safety teams will need to work in lock-step — one without the other is unlikely to pass regulatory muster.
The OAIC Guidance signals regulatory expectations for age assurance in Australia. ARSMPs must demonstrate that their systems are not only effective but also necessary, proportionate and privacy-preserving. Compliance is a multidisciplinary exercise, spanning privacy law, engineering, UX and risk management.
Entities should move quickly to take practical steps, including to:

- conduct an early-stage PIA covering necessity, proportionality and data minimisation;
- ring-fence SMMA data flows and automate destruction of raw inputs;
- build purpose-specific consent records with simple withdrawal mechanisms;
- map retention periods for each purpose in a retention matrix; and
- update privacy policies and point-of-collection notices to describe SMMA-specific processing.
Failure to integrate privacy and safety compliance could attract parallel enforcement from both regulators. While the SMMA scheme remains in its early implementation phase, organisations that invest now in privacy-preserving architecture, transparent design and governance frameworks will be best placed to demonstrate compliance once enforcement begins. Early alignment across privacy, safety and product functions will not only mitigate regulatory risk but also help build public trust in the next generation of age-appropriate digital services.