In September 2023, TikTok was fined 345 million euros (approximately AUD 575 million) by the Irish Data Protection Commission (DPC) under the European General Data Protection Regulation (GDPR) for breaches in its processing of children’s personal data. The fine was based on a lack of transparency, namely vague language explaining TikTok’s data handling practices, and a failure to implement privacy-by-design by automatically making children’s accounts public. Another important part of the decision is its consideration of age-verification measures.
We have written previously about children’s privacy, which sits at the intersection of privacy and child safety, including in our annual eSafety coverage for Safer Internet Day each February.
This article considers three key aspects of the TikTok fine – transparency, privacy-by-design and age-verification measures – in the context of Australian privacy regulation, as relevant to charities and schools that work with children.
Transparency: a pillar of privacy regulation
Transparency is a pillar of privacy regulation in both Europe and Australia. In the TikTok decision, the DPC took issue with vague wording: the use of “public”, “everyone” and “anyone” to describe who could see a user’s account was not sufficiently clear as to whether that meant all registered TikTok users or anyone who could access the platform. Another transparency breach was TikTok’s failure to provide information about its information handling processes in a concise, transparent, intelligible and easily accessible form, using clear and plain language. We encourage all organisations to ensure their privacy policies and collection notices are clear, easy to understand and tailored to their particular audience.
In Australia, Australian Privacy Principle 1 (APP 1) enshrines openness and transparency as requirements for how organisations handle personal information. Specifically, APP 1 requires organisations to:
- take reasonable steps to implement practices, procedures and systems that ensure compliance with the APPs and allow them to deal with related inquiries and complaints; and
- have a clearly expressed and up-to-date privacy policy, publicly available, that explains how they manage the personal information they hold.
Further, improving organisations’ transparency and individuals’ control is a key aim of the proposed amendments to the Privacy Act 1988 (Cth).1 The reforms propose to increase transparency and control through improved notice and consent mechanisms. This is, in part, a response to the 2023 Office of the Australian Information Commissioner (OAIC) Australian Community Attitudes to Privacy Survey, which found that 84% of Australians want more control over the collection and use of their personal information.
For charities and schools, providing transparency and control is critical to maintaining a strong and healthy relationship of trust with your community members. Transparency is a pillar of privacy regulation because privacy recognises that handing over information about ourselves or our children is deeply personal; it is akin to handing over part of our identity. Privacy, and the transparency that underpins it, is inherently about trust.
Privacy-by-design: your proactive tool
We discussed what we mean by privacy-by-design in a recent article. In TikTok’s case, making children’s accounts public by default was found to be inconsistent with the GDPR’s data protection by design and by default obligations. This was partly because TikTok’s web browser version can be accessed by non-registered users; that is, the public at large. An additional, specific setting was required to “Go private”.
In Australia, no privacy-by-design obligation currently exists, although one may be introduced through the proposed amendments. The government has committed to implementing “new organisational accountability requirements [that] will encourage entities to incorporate privacy-by-design into their operating processes.” Regardless of any compliance obligation, privacy-by-design is a strong risk mitigation step against the threat of data breaches because:
- it shifts the focus from compliance to prevention;
- it increases awareness of privacy across your organisation; and
- it addresses breaches caused by human error (one third of all notifiable data breaches) through awareness and system design.
Privacy-by-design is particularly relevant to children’s privacy: the government has agreed with the Attorney-General’s Department’s recommendation to introduce a Children’s Online Privacy Code.2 The Code would apply to online services that are likely to be accessed by children.
Age-verification: an emerging area
The decision is also novel from a pan-European perspective: it is the first in which the European Data Protection Board (EDPB) has examined age-verification measures against the backdrop of the GDPR. While the EDPB’s dispute resolution procedure, somewhat oddly, directed the DPC to reach an inconclusive outcome on this point, the decision contains important markers that digital services with a mixed user population should note, as they may indicate future regulatory approaches to age verification.
Other key takeaways from the DPC’s fine of TikTok
The decision reaffirms the strong focus of European regulators, and of the DPC as the bloc’s lead regulator in this area, on children’s data. We expect this topic to appear increasingly often in regulatory investigations and enforcement decisions.
The DPC’s findings on the risks to children arising from the processing of their data indicate how the DPC will expect organisations to assess such risks in relation to their own processing operations.
Finally, the decision signals the EDPB’s willingness to use the fairness principle to bolt on additional findings of infringement at the dispute resolution stage, even where the lead supervisory authority’s investigation did not include such an issue within its scope.
Read our latest article for more detail on how the DPC reached its decision against TikTok.
How we can help
With the growing focus in both Europe and Australia on children’s data, organisations that work with children must carefully consider how they handle personal information.
Our privacy and data security team works with organisations to create workable and compliant privacy frameworks and to implement information handling practices that are resilient to data security threats. Our deep understanding of the education and not-for-profit sectors means we are well equipped to support organisations that work with children on privacy requirements.
Contact us
Please contact us for more detailed and tailored help.
Subscribe to our email updates and receive our articles directly in your inbox.
Disclaimer: This article provides general information only and is not intended to constitute legal advice. You should seek legal advice regarding the application of the law to you or your organisation.
1 Australian Government, Government Response to the Privacy Act Review Report (28 September 2023).
2 Australian Government, Government Response to the Privacy Act Review Report (28 September 2023), page 13; Attorney-General’s Department, Privacy Act Review: Final Report (23 February 2023), Proposal 16.5.