This page was created programmatically; to read the article in its original location you can visit the link below:
https://www.allens.com.au/insights-news/insights/2025/10/oaic-weighs-in-on-privacy-aspects-of-social-media-minimum-age-regime/
and if you wish to remove this article from our website, please contact us.
From 10 December 2025, age-restricted social media platforms must take 'reasonable steps' to prevent users under the age of 16 from having accounts. Although the social media minimum age (SMMA) regime sits within the Online Safety Act and will be primarily enforced by the eSafety Commissioner (eSafety), the Office of the Australian Information Commissioner (OAIC) will play a crucial role in regulating privacy compliance.
In this Insight, we explore key themes from the OAIC's recent guidance and outline action items for platform providers and their technology partners preparing for the new regime.
Overview of relevant legislation, rules and guidance
The SMMA regime will come into effect on 10 December, requiring social media platform providers to take ‘reasonable steps’ to prevent Australians under the age of 16 (age-restricted users) from having accounts on their platforms. This follows amendments to the Online Safety Act in late 2024 introducing the SMMA framework under Part 4A.
The Online Safety Act does not prescribe specific methods for how to ensure users meet minimum age requirements. Rather, platform providers must implement 'reasonable steps' tailored to their context. eSafety's guidance sets out baseline expectations around reliability, accuracy, robustness and effectiveness in using age assurance to assess user age. These expectations mirror the findings of the community consultation on the implementation of the SMMA regime conducted by eSafety and the Age Assurance Technology Trial, which we explore in our Insight.
This ‘reasonable steps’ requirement applies to both existing accounts and new accounts. This means platform providers will need to determine whether existing accounts on their platforms are held by age-restricted users and deactivate or remove those accounts, as well as prevent age-restricted users from creating new accounts.
The SMMA requirement applies to providers of ‘age-restricted social media platforms’ (platform providers), which are defined as services that meet the following conditions:
This definition is broad and will encompass most social media platforms.
The Online Safety (Age-Restricted Social Media Platforms) Rules 2025 issued under the Online Safety Act specify the types of online services that are not covered by the SMMA. These are services with the sole or primary purpose of:
In September 2025, eSafety issued guidance on the 'reasonable steps' platform providers are expected to take to prevent age-restricted users from having online accounts. This includes, at a minimum, assessing user age through 'age assurance' mechanisms, defined broadly as processes used to verify or infer age. Age assurance may be undertaken by platform providers themselves or outsourced to contracted vendors, referred to as third-party age assurance providers.
The guidance also sets out eSafety's principles-based approach to the SMMA restrictions, emphasising that:
The guidance also provides guardrails to assist platform providers in developing their compliance measures, clarifying what is strictly not prescribed under the SMMA rules and what measures will not be considered reasonable steps. Some key examples of these guardrails are below.
Platform providers should:
Platform providers should not:
On 9 October 2025, the OAIC released guidance setting out its expectations regarding privacy compliance for both platform providers and third-party age assurance providers handling personal information for age assurance purposes in the SMMA context. The OAIC guidance sheds light on how these entities should comply with their obligations under the Privacy Act 1988 (Cth) (Privacy Act) when taking the 'reasonable steps' required for SMMA compliance under the Online Safety Act.
Together, the guidance issued by eSafety and the OAIC reflects a regulatory approach that seeks to strike a balance between protecting young people from harms associated with social media use and emphasising privacy and proportionality.
Part 4A of the Online Safety Act operates alongside the Privacy Act, introducing stricter obligations on platform providers and third-party age assurance providers when handling personal information for SMMA compliance purposes.
This means the regulators for the Online Safety Act and the Privacy Act (eSafety and the OAIC respectively) play different but complementary enforcement roles in the SMMA context.
Part 4A of the Online Safety Act (s 63F) imposes the following data-handling requirements on platform providers and third-party age assurance providers, which apply in addition to the Privacy Act more generally:
Failure to comply with the Part 4A privacy obligations is an interference with the privacy of an individual for the purposes of the Privacy Act. This means:
Steps taken to comply with SMMA obligations will also not be 'reasonable' unless an entity also complies with its Privacy Act obligations. eSafety is responsible for enforcing the 'reasonable steps' obligation under the Online Safety Act.
The OAIC's guidance complements eSafety's guidance by outlining how it expects entities to align their privacy practices with these technical measures. The guidance categorises personal information used for SMMA purposes into three types:
Each category requires strict controls on collection, use, disclosure, storage and destruction.
The OAIC encourages a privacy-by-design approach when selecting age assurance methods, emphasising the importance of privacy impact assessments. The OAIC is clear that compliance with the SMMA regime may increase data breach risk, so data security must be the priority, particularly when handling sensitive information in the form of biometric data. The OAIC states that entities should build and maintain their age assurance practices so that quality (APP 10) and security and retention limitations (APP 11) are enforced by design.
In addition, the OAIC reiterates eSafety's guidance that the measures taken by platform providers to comply with the SMMA regime should not be static: providers should 'proactively monitor and respond to changes in their platforms' features, functions and end-user practices'. eSafety also expects platforms to take proactive steps to detect accounts held by age-restricted users on an ongoing basis.
The OAIC is clear that platform providers and third-party age assurance providers must limit their collection to what is actually necessary for compliance with the SMMA regime. Otherwise, entities risk breaching the APP 3 requirement that collection be 'reasonably necessary' for their functions or activities. The OAIC acknowledges that assessing what data is 'necessary' in the circumstances involves weighing competing interests, but emphasises data minimisation as key. The OAIC recommends, for example, that entities:
Part 4A of the Online Safety Act provides that personal information collected for age assurance cannot be repurposed (ie used or disclosed for secondary purposes) without unambiguous consent (outside standard APP 6 exceptions, such as where use or disclosure is required for law enforcement).
The Guidance gives the example of a platform provider allowing a user to consent to the platform provider sharing an output (eg a 16+ token) with a third party to allow the user to sign up to that third party's service. The OAIC is clear that consent to any secondary uses or disclosures cannot be achieved through pre-selected settings or an opt-out approach; a separate, dedicated consent flow is required. Further, the OAIC's view is that unambiguous consent requires the user to have the ability to withdraw consent; in the example given, this would require both parties to delete the token from their systems upon withdrawal of consent.
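As an illustration only, the shape of the consent flow in the OAIC's token example might be modelled along these lines. Every class, method and field name here is hypothetical; nothing is drawn from any real platform API or from the guidance itself:

```python
from dataclasses import dataclass, field

@dataclass
class TokenStore:
    """Holds 16+ tokens for one party (the platform or the third party)."""
    tokens: dict = field(default_factory=dict)  # user_id -> token

    def delete(self, user_id: str) -> None:
        self.tokens.pop(user_id, None)

class ConsentRegistry:
    """Illustrative consent record for sharing an age-check output."""

    def __init__(self):
        self._consents: set = set()

    def record_opt_in(self, user_id: str, explicit: bool) -> None:
        # Pre-selected settings or opt-out flows do not count as
        # unambiguous consent: only an explicit, affirmative act does.
        if not explicit:
            raise ValueError("unambiguous opt-in consent is required")
        self._consents.add(user_id)

    def share_token(self, user_id: str, token: str,
                    platform: TokenStore, third_party: TokenStore) -> None:
        # The output (eg a 16+ token) is disclosed only after consent.
        if user_id not in self._consents:
            raise PermissionError("no consent recorded for this disclosure")
        platform.tokens[user_id] = token
        third_party.tokens[user_id] = token

    def withdraw(self, user_id: str,
                 platform: TokenStore, third_party: TokenStore) -> None:
        # Withdrawal requires BOTH parties to delete the token.
        self._consents.discard(user_id)
        platform.delete(user_id)
        third_party.delete(user_id)
```

The point of the sketch is the flow, not the data structures: consent is a separate, dedicated, opt-in step, and withdrawal propagates deletion to both systems.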
Part 4A of the Online Safety Act requires platform providers and third-party age assurance providers to destroy personal information collected for SMMA purposes after handling it for that purpose. This is a far stricter obligation than APP 11.2, which allows for:
In particular, the OAIC has stressed that inputs (such as document images, OCR text, selfies, liveness videos or other biometric information or templates used for a point-in-time age check) must be destroyed immediately following the age assurance check, including from caches and storage; the OAIC sees inputs as the highest-risk category. Outputs or 'decision artefacts' (such as binary 16+ yes/no results, timestamps and tokens) are seen as lower risk: these can be retained briefly, but only within ring-fenced environments for limited operational needs, and provided the entity is transparent about the directly related purposes arising from the age check that involve retention for a longer period. The OAIC gives three examples of such directly related purposes:
The OAIC strongly suggests that entities create a ring-fenced SMMA environment to comply with these destruction requirements, and block advertising, analytics and machine-learning pipelines from that environment. Appropriate time-based retention periods should be applied to each category, with information being destroyed once the time period for the last allowed purpose has expired.
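A minimal sketch of this destruction pattern, assuming a point-in-time selfie-based check. Function names, the retention table and its periods are all illustrative assumptions, not drawn from any real age assurance SDK or from the guidance:

```python
import time

# Hypothetical time-based retention periods per allowed purpose; the
# longest period governs when a decision artefact is finally destroyed.
RETENTION_SECONDS = {
    "appeals": 30 * 86400,  # 30 days (illustrative)
    "audit": 90 * 86400,    # 90 days (illustrative)
}

def run_age_check(selfie_bytes: bytes, estimator) -> dict:
    """Return only a minimal decision artefact; the raw input never
    leaves this function's scope."""
    is_16_plus = estimator(selfie_bytes)
    # Drop the local reference immediately; a real system would also
    # clear any caches and storage holding the input.
    del selfie_bytes
    return {"result": bool(is_16_plus), "checked_at": time.time()}

def purge_expired(ring_fenced_store: dict) -> None:
    """Destroy artefacts once the period for the last allowed purpose
    has expired. The store stands in for a ring-fenced environment
    with advertising/analytics/ML pipelines blocked from it."""
    max_retention = max(RETENTION_SECONDS.values())
    now = time.time()
    for user_id in list(ring_fenced_store):
        if now - ring_fenced_store[user_id]["checked_at"] > max_retention:
            del ring_fenced_store[user_id]
```

The design point is that only the yes/no result and a timestamp survive the check, and even those are held behind a retention clock rather than indefinitely.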
The Guidance also confirms that this immediate destruction requirement does not apply to existing data already held by an entity merely because that data is used for an SMMA-compliance purpose (eg where a platform provider uses existing data it already holds about a user to determine whether they are under 16). Such data must continue to be used, disclosed and destroyed (or de-identified) in accordance with the APPs more generally.
As noted above, the OAIC is clear that inputs (such as document images, selfies and other biometric information) must be destroyed immediately. The OAIC provides the following additional guidance on biometric information and templates specifically.
As part of preventing users under 16 from having accounts, age-restricted platform providers must assess existing accounts and take steps to de-register users under 16.
The OAIC makes the following key comments regarding the use of existing data for age assurance purposes.