Legal development

The kids are online – what Australia's Children's Online Privacy Code means for you

    What you need to know

    • Australia's privacy regulator is consulting on a draft Children's Online Privacy Code until 5 June 2026. A final code must be registered by 10 December 2026 – but we don't yet know when obligations will commence.
    • The code will apply to services likely to be accessed by children under 18 years, or that are primarily concerned with the activities of children – whether or not children are the intended audience.
    • Services likely to be accessed by children could include a range of apps, games, websites, streaming services and educational tools. Services primarily concerned with the activities of children could cover early childhood development trackers, family photo-sharing apps and school management systems. This broad scope is closer to the UK's Age Appropriate Design Code model than the narrower categories covered by the social media ban.
    • The code introduces a wide range of new legal requirements, including default privacy settings that minimise collection, checking age before collecting information, stricter rules for consent, mandatory privacy impact assessments, data destruction (deletion), and a "best interests of the child" data handling requirement.
    • Even for organisations that are not caught, the code is an important signal of the regulator's future areas of focus, and of the increasing crossover between privacy and online safety concerns, with the privacy regulator and the eSafety regulator each pursuing mechanisms requiring age checks for access to online services.

    What you need to do

    • Engage in the consultation process: There is still time to prepare submissions for the consultation.
    • Audit your services: The draft code could capture a broad range of digital services and could apply to data handling you are not aware of. Remember, the code can apply to services likely to be accessed by children whether or not they target children.
    • Choose your approach to age assurance: Decide whether you will take steps to check the age of users or apply child-safe practices universally.
    • Start to integrate defensible transparency and impact assessments into your processes: Various code requirements will bring internal processes into the clear view of customers, regulators and competitors. Increased transparency brings the potential for increased trust, but also the risk of reputational harm and regulatory or private legal action. The draft code includes mandatory privacy impact assessments, published on an online register, which can be scrutinised by the regulator.
    • Revisit privacy protections through a child-appropriate lens: Many organisations will underestimate how general systems, processes and procedures will need to be adapted to comply with the code and to apply child-specific requirements even for services that are not primarily for children. Code obligations work together to reduce the amount of information collected about children, to introduce more transparency and friction into customer journeys, and to limit what can be done with information.
    • Make governance and code compliance demonstrable and defensible: The code is supported by a broader underlying obligation to take reasonable steps to implement practices, procedures, and systems to comply with privacy laws – underscoring the focus of regulatory action on inadequate privacy risk management measures.

    The draft Children's Online Privacy Code at a glance

    Broad range of electronic services covered: 

    The scope is extremely broad, capturing websites, instant messaging and social media platforms. The code covers:

    • social media services – websites or services which enable online social interaction, or allow material to be posted,
    • relevant electronic services – such as messaging, chat and online games, and
    • designated internet services – services that make material available over the internet, which extends to various websites and apps.

    The code does not apply to health service providers or telecommunications carriage service providers.

    Applies to services:
    • likely to be accessed by children; or
    • primarily concerned with the activities of children,

    whether or not children are the intended audience.

    Key obligations for both service categories:

    The following obligations apply to both:

    • services likely to be accessed by children; and
    • services primarily concerned with the activities of children.

    • Either do age checks, or apply code protections to all users:
      • take reasonable steps to ascertain the age of end-users before collecting personal information (and destroy sensitive information used for age checks as soon as practicable), OR
      • apply the code for all users.
    • Data minimisation by default – implement technical and organisational measures ensuring only strictly necessary personal information is collected, used or disclosed by default – users can change these settings later.
    • Best interests – collection, use and disclosure of personal information must be consistent with the best interests of the child. Activities not in the best interests of the child will not be considered "lawful and fair" under the APPs. The draft explanatory memorandum includes a list of "best interest" factors based on Article 3 of the United Nations Convention on the Rights of the Child (UNCRC).
    • Consent and assent – consent must be voluntary, informed, current, specific, unambiguous, not obtained by coercion, and capable of being withdrawn. Consents from children are valid for a maximum of 12 months, and cannot be "bundled". Children under 15 require parent/guardian consent, and a second layer – assent from the child – is needed to collect sensitive information, use and disclose information for a secondary purpose, or for direct marketing (in addition to parental consent). Children aged 15 or over can consent themselves (provided they have the relevant capacity).
    • Access to personal information – must be given in terms that are simple, easy to understand and age appropriate. Responses to access and correction requests must be provided within 30 days (extendable to 60 days in complex cases).
    • Right to request information – children or person with parental responsibility may request information about handling of the child's personal information – such as the information collected, where it came from, who may receive it, how long it is held, and "clear and meaningful" information about how it is used for automated decisions (including whether profiling has been used, the context of the decisions and the consequences of making those decisions).
    • Opting out of direct marketing – maintain a clear, simple and easily accessible opt-out mechanism, and not include retention features that make it difficult to opt out – intended to reduce the use of design practices and "dark patterns" such as "confirm shaming".
    • Destruction of personal information – personal information must be destroyed (ie, permanently deleted) on request of the child or person with parental responsibility (with some limited exceptions). Actual destruction (not merely de-identification) is required.
    • Notification of parental control and monitoring – ongoing notification to the child of parental control or monitoring, or geolocation monitoring.
    • Privacy Impact Assessments (PIAs) on changes to information handling likely to have a significant impact on children's privacy, or on new services or new activities. Maintain and publish an online register of PIAs and provide copies to the OAIC upon request.
    • Child-specific privacy training – all staff with regular access to children's personal information must receive training, when employed/engaged and at least annually.
    • Review of privacy practices – review and update practices, procedures and systems at least annually.
    • Cross-border disclosure consent – when seeking consent to disclose information overseas, information provided to a child must be clear, concise, age appropriate, and not misleading.

    Additional obligations for services likely to be accessed by children

    The following obligations apply to this service category only:

    • Child-friendly privacy policy – maintain a standalone child-friendly privacy policy that is clear, concise, age appropriate, and incorporates non-text material where appropriate.
    • Age appropriate notices of collection – clear, concise, age appropriate, and must not obscure or misrepresent the nature of collection.
    • Child-friendly inquiry and complaints – in addition to being clear, simple, and easily accessible, must be expressed in age appropriate language.

    If the service is not specifically targeted at only one age range, content should be appropriate for 10–12-year-olds.

    The international context

    Australia's draft Children's Online Privacy Code looks to align with the UK Age Appropriate Design Code, which was introduced in September 2020 (with a 12-month enforcement grace period). Age Appropriate Design Codes have since been adopted in various US states, including most recently Nebraska and Vermont in 2025.

    Not just services aimed at kids

    The code will apply to a broad range of online services that are either:

    • likely to be accessed by children (under 18 years); or
    • primarily concerned with the activities of children.

    The "primarily concerned with the activities of children" limb does not appear in the Privacy Act – it has been added in the code to address services that might not be used directly by children, but that will manage information about children – such as applications that track early childhood development, family photo sharing, school performance tracking, and connected baby monitors. However, the "likely to be accessed by children" limb is likely to be the most expansive category.

    The code applies to social media services, relevant electronic services, and designated internet services, as those categories are defined in Australia's online safety laws. While there's some detail underneath these concepts, they are extremely broad – extending to messaging services, generative AI, websites, and apps.

    The eSafety Commissioner has provided non-exhaustive examples of the kinds of services covered in each of these categories:

    • Social media services: examples given include social networks, public media sharing networks, discussion forums, and consumer review networks.
    • Relevant electronic services: examples given include instant messaging services, SMS and MMS, chat, online multi-player gaming, email, online dating, and enterprise messaging.
    • Designated internet services: examples given include generative AI, file storage services managed by end-users in Australia, and other websites and apps.

    Health service providers and telecommunications carriage service providers (including internet service providers) are not covered by the code.

    The UK Age Appropriate Design Code uses a different formulation for which services are covered, but its test is also extremely broad – applying to any "relevant information society service", which covers services normally provided for remuneration, electronically, at a distance, on request. However, unlike the UK, Australia's approach will extend to unpaid services.

    Part of a bigger reform agenda

    Laws enabling the code were introduced as part of the Privacy and Other Legislation Amendment Act 2024 (Cth) – the first tranche of a broader reform agenda proposed in the 2023 Privacy Act Review Report. A second tranche of reforms was deferred for later implementation.

    However, in late 2025, the Productivity Commission recommended a major pivot from the current prescriptive privacy framework and reform agenda to an "outcomes-based" duty to act "fairly and reasonably", potentially complicating the remainder of the privacy reform agenda. The Government's response remains to be seen. We may see some proposed reforms progressed in 2026, but a significant pivot would be a multi-year project.

    The OAIC has followed a thorough three-phase consultation process for the code. Phase 1 (January to August 2025) involved direct consultations with children and young people. Phase 2 (April to August 2025) engaged industry stakeholders. We're now in Phase 3, running until June 2026, where the OAIC is seeking feedback on the draft code itself. Submissions during this time will shape the final code before its required registration by 10 December 2026.

    Practices, procedures and systems to support the code

    Organisations must not only comply with the code; under Australian Privacy Principle 1.2, they must also take reasonable steps to implement practices, procedures, and systems that ensure compliance with privacy obligations (including the code).

    This distinct requirement means that organisations without adequate privacy risk management processes can be in breach of their privacy obligations even if no privacy incident or specific breach of the code has occurred.

    Privacy enforcement actions regularly examine not only privacy failures, but the underlying practices, procedures and systems that allowed the failures to occur.

    The need to embed good privacy in practices, procedures, and systems makes it hard to "bolt on" privacy protections at the end of a project. A "privacy by design" mentality from project initiation makes it easier to safeguard data, comply with increasingly specific requirements, and prove you have effective practices, procedures, and systems.

    Intersection with online safety

    The code comes in the wake of a significant run of changes in online safety in Australia, including the implementation of the under-16 social media ban (the Social Media Minimum Age) in December 2025, as well as new online codes that require age assurance for a wide range of websites and services (the Age Restricted Material Codes) in March 2026.

    While the Office of the Australian Information Commissioner (OAIC) is Australia's privacy regulator, the eSafety Commissioner is a stand-alone online safety regulator. The OAIC has responsibility for privacy aspects of several regimes, including Digital ID, the Consumer Data Right, and online safety under the Social Media Minimum Age.

    eSafety and the OAIC have recently signed a memorandum of understanding to provide for more cohesive regulatory efforts – with age assurance under online safety codes and standards, social media minimum age laws, and the Children's Online Privacy Code expected to be a key focus.

    This intersection is also driving complexity, with different requirements triggering at different ages:

    • the draft code and the online safety age restricted content rules apply to children under 18 years;
    • the Social Media Minimum Age age-check applies to children under 16 years;
    • the Online Safety Codes require age checks for children under 18 years; and
    • under the draft code, the proposed age at which a child can consent on their own behalf to the collection and use of their personal information is 15.

    Tension between children's safety and privacy

    The draft code introduces age verification requirements to our privacy laws, but age assurance has been a focus in online safety for some time – underpinning the Social Media Minimum Age as well as the online safety standards and codes.

    Under the draft Children's Online Privacy Code, entities must take steps that are reasonable in the circumstances to establish the age of an end user before collecting personal information, or alternatively apply code protections to all users regardless of age.

    There is a natural tension between approaches to age assurance and privacy – with more privacy invasive techniques (like checking identity documents) generally providing more reliable age assurance than less privacy invasive techniques (like self-declaration). The right approach balances safety risks against privacy risks.

    • "Reasonable steps" are a balancing act: Exactly what constitutes "reasonable steps" is a balancing act between the risk of harm of a service against the age assurance techniques available. Draft explanatory material for the code gives non-exhaustive examples of the kinds of issues to consider. Deciding what age assurance technique to apply is its own privacy impact assessment.
      Risks of harm – eg:
      • types of personal information
      • volume of personal information
      • whether information is shared with third parties

      Age assurance options – eg:
      • suite of age assurance methods available, and their relative effectiveness
      • costs of implementation
      • data and privacy implications for users
    • Take care collecting information to do age checks: Some personal information can be collected to verify age, but only the information necessary to perform the age check. In early trials of age assurance technologies, concerns were raised about providers over-collecting information for age checks. There are significant penalties for using age assurance information collected under the Social Media Minimum Age for other purposes – however, there is no equivalent mechanism under the code.
    • Current techniques may need revisiting: Lower friction age assurance approaches that rely on a deep understanding of the user, like inferring age from online behaviours, might be challenging because (in general) age checks need to occur before collecting information. Age verification requirements only apply for information collected after the code commences, so historical data might be useful for age inference for existing users but might be difficult to apply for new users after the code commences.
    • Integrate online safety and children's privacy processes: Draft explanatory materials for the code point out that data-minimised age results from other age assurance checks (eg done to comply with online safety laws) can be re-used to verify age under the code, but only if compliance with the code was a stated purpose of collection, restrictions under those other laws are complied with, and reasonable steps are taken to confirm information is accurate, up-to-date, complete, and relevant. As discussed above, you may need to know under different obligations whether users are under 15, 16, or 18, and obligations under those laws may limit how certain information is used.
    • Self-declared age might not be enough: In the UK, we've seen a firm push under the Age Appropriate Design Code towards more robust age assurance where children may be exposed to "high risk" activities such as targeted advertising and behavioural profiling. The UK Information Commissioner has said "[r]elying on users to declare their age themselves is not enough when children may be at risk and we are focusing now on companies that are primarily using this method."

    Age assurance technologies – key considerations

    The OAIC has released guidance on age assurance technologies under the Privacy Act (the guidance does not currently address additional requirements under online safety laws or the draft Children's Online Privacy Code).

    This guidance includes a list of key issues the regulator expects organisations to address:

    • Is age assurance needed? Take a privacy by design approach and consider privacy impacts
    • Undertake due diligence to ensure the security of your age assurance ecosystem
    • Make sure age assurance options are reasonably necessary and proportionate to legitimate aims
    • Escalate to more intrusive personal information handling only as necessary
    • Be transparent in notices, at the moment it matters
    • Define primary and secondary purposes precisely
    • Provide clear contact information and meaningful support
    • Minimise personal and sensitive information in age assurance processes
    • Make sure consent requests for sensitive information, or for secondary use or disclosure are understood by people of all abilities
    • Destroy or de-identify information used for age assurance immediately once purposes of collection have been met

    A new focus on fairness

    The 2023 Privacy Act Review Report proposed a new overriding requirement that data handling be "fair and reasonable" – a change that the Privacy Commissioner described as a "new keystone of the Australian privacy framework", primarily because it would apply even where consent was obtained. That reform has not been progressed, but in its absence the Commissioner has pointed to existing obligations to collect information by lawful and fair means.

    Elements of the draft code emphasise this focus on ensuring consent is meaningful and will not "rubber stamp" harmful privacy practices. Under the draft code, information will not be collected "fairly" if collection is not in the best interests of the child; there is a specific prohibition on coercing consent; and various provisions emphasise accurate and understandable information and choices.

    In a recent privacy determination, the Privacy Commissioner found (among other things) that the way in which information and choices were presented in online forms used by RentTech provider 2Apply unfairly influenced individuals.

    "Whilst [fairness in choice architecture] is a novel approach to considering fairness for the purpose of the Privacy Act, it is becoming an increasingly important and relevant, privacy and data protection issue."

    Carly Kind, Privacy Commissioner (IRE Pty Ltd (2Apply) privacy determination)

    Investigating "fairness" of online practices will be an important tool for the Privacy Commissioner to drive industry change in data practices, allowing the Commissioner to take a more flexible approach to consumer harm – and enhanced code-making powers introduced in 2024 (including the power to make the Children's Online Privacy Code) provide a lever to set enforceable expectations.

    An early sign of regulator focus areas

    The draft Children's Online Privacy Code signals future focus areas and enforcement activity from Australia's privacy regulator. The code demonstrates an increasing level of creativity in expanding the existing regime beyond the previous consent model, with:

    • previously proposed reforms that have not yet been implemented appearing in the code, sometimes in ways more comprehensive than the original proposal – such as mandatory privacy impact assessments and the new rules regarding consent and the "two step" consent and assent process;
    • models for fairness and other more subjective assessments being implemented within existing requirements – such as the reliance on lawful and fair means to import a "best interests of the child" requirement; and
    • areas of focus from other regulatory regimes being brought into the privacy regime – such as the age assurance requirement.

    While these proposed mechanisms may be seen as applying narrowly today (in that they only apply to children), this might not play out in practice as organisations start to assess whether they are captured and whether their services are "likely to be accessed" by children.

    As this area continues to evolve, both in the privacy space and across other areas such as online safety, expect an increasing focus on more prescriptive regulatory requirements for organisations that deal with large volumes of personal information. Slowly, the regulator's preferred positions are appearing within the privacy regime, notwithstanding that the privacy reform agenda has slowed since the "tranche 1" privacy reforms were passed in 2024.

    Other authors: Cindy Nguyen, Graduate; Zoe Huang, Lawyer; Imogen Loxton, Senior Associate; Andrew Hilton, Expertise Counsel; Nick Perkins, Partner; and Geoff McGrath, Partner.

    The information provided is not intended to be a comprehensive review of all developments in the law and practice, or to cover all aspects of those referred to.
    Readers should take legal advice before applying it to specific issues or transactions.