Under the EU General Data Protection Regulation (GDPR), the Irish Data Protection Commission (DPC) recently fined TikTok 345 million euros. The fine was the result of an inquiry launched by the DPC into TikTok’s processing of children’s personal data.
The DPC’s decision demonstrates the growing focus from privacy regulators on how organisations handle children’s personal information.
Brief Background: The DPC’s inquiry into TikTok
The decision is the result of an own-volition inquiry launched by the DPC in September 2021. The inquiry covered only the period from 31 July 2020 to 31 December 2020. Since then, TikTok Technology Limited (TTL) has made several service modifications addressing most of the criticisms in the decision.
TikTok’s terms do not allow users under the age of 13 to use the platform. The decision focuses on the processing of personal data relating to users aged 13-17, but also examines TTL’s compliance regarding personal data of children under 13 in the context of the company’s age-verification measures.
The case went through the GDPR’s dispute resolution mechanism under Article 65. While there was general consensus on the DPC’s proposed findings in its draft decision, objections were raised by the Italian and the Berlin supervisory authorities. Although these objections represented a minority view, the Article 65 process requires that even a single unresolved objection be referred to the European Data Protection Board (EDPB) for determination, and these objections were referred accordingly.
The EDPB adopted its binding decision on these objections on 2 August 2023. It required the DPC to include a new finding of infringement of the fairness principle and an order to bring the relevant processing operations into compliance, and also required the DPC to amend the conclusion in its draft decision on whether TTL’s age-verification measures were GDPR-compliant.
It should be emphasised that the period covered by the DPC’s inquiry pre-dated the DPC’s guidance on children’s data, the Fundamentals for a Child-Oriented Approach to Data Processing. The decision therefore assesses TTL’s compliance by reference to the GDPR itself and does not refer to the Fundamentals, although the DPC clarifies that the Fundamentals introduce “child-specific data protection interpretative principles” and that it would still be permissible to refer to principles derived from the GDPR.
Family pairing and direct messaging
The “Family Pairing” feature allowed another user to exercise certain parental-type controls over a child user’s account. Notably, the DPC acknowledged that, in general, the “Family Pairing” options allowed the paired account user to make the child user’s privacy settings stricter, for example by narrowing available content, disabling search and direct messages, making the account private and limiting comments.
However, the other user could also enable direct messages for accounts of child users over the age of 16 (although, based on the TTL submissions quoted by the DPC, this applied only to “Friends”) where the child user had themselves switched off this feature. Accounts were paired by generating a QR code for the non-child user, which the child user then scanned before confirming whether they wished the accounts to be linked. The DPC took the view that, despite this process, there was no verification of the relationship between the two users.
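To make the pairing mechanics concrete, the flow described above might be sketched as follows. This is a minimal illustrative sketch based only on the process described in the decision, not TikTok’s actual implementation, and every name in it is hypothetical. It also shows where the gap identified by the DPC sits: nothing in the flow verifies the relationship between the two users.

```python
# Hypothetical sketch of a QR-based "Family Pairing" flow, based solely on
# the process described in the decision. Not TikTok's implementation.
import secrets

PENDING_PAIRINGS: dict[str, str] = {}  # one-time token -> requesting (non-child) user
PAIRED_ACCOUNTS: dict[str, str] = {}   # child user -> paired (non-child) user

def generate_pairing_token(requesting_user: str) -> str:
    """The non-child user requests pairing and receives a one-time token,
    which would be rendered to them as a QR code."""
    token = secrets.token_urlsafe(16)
    PENDING_PAIRINGS[token] = requesting_user
    return token

def confirm_pairing(child_user: str, scanned_token: str, child_accepts: bool) -> bool:
    """The child user scans the QR code and confirms (or declines) the link.
    Note that nothing here verifies that the requesting user is actually
    the child's parent or guardian -- the gap the DPC identified."""
    requester = PENDING_PAIRINGS.pop(scanned_token, None)
    if requester is None or not child_accepts:
        return False
    PAIRED_ACCOUNTS[child_user] = requester
    return True
```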
The DPC considered that allowing a user who was not a verified parent or guardian to enable direct messages in this way for child users over 16 posed risks: it enabled third parties to contact the child user and thereby constituted unauthorised processing of the child’s personal data, since the child had not chosen to have their data processed in this manner.
On this basis, the DPC concluded that TTL failed to apply appropriate technical and organisational measures to effectively implement the integrity and confidentiality principle and to integrate safeguards to meet GDPR requirements.
This finding again demonstrates the heightened risk the DPC associates with features that allow third parties to contact children directly, whether through the comments function or via direct messaging.
Age verification
TikTok had age-verification measures in place to prevent users under 13 from accessing the platform. These consisted of an age gate requesting the user’s birthdate, along with technical measures to prevent users from re-submitting an older age, and ex-post measures to identify and remove accounts of underage users.
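As a rough illustration of how these measures fit together, the sketch below pairs a neutral age gate with a device-level block on re-submission. It is a simplified, hypothetical example, not a description of TikTok’s systems, and it also makes the gate’s central weakness visible: everything turns on a self-declared birth date.

```python
# Hypothetical sketch of a neutral age gate with a block on re-submission.
# Simplified for illustration; not a description of TikTok's systems.
from datetime import date

MINIMUM_AGE = 13
BLOCKED_DEVICES: set[str] = set()  # devices that previously entered an under-age date

def age_on(birth_date: date, today: date) -> int:
    """Whole years between birth_date and today."""
    years = today.year - birth_date.year
    if (today.month, today.day) < (birth_date.month, birth_date.day):
        years -= 1
    return years

def submit_birth_date(device_id: str, birth_date: date, today: date) -> bool:
    """A 'neutral' age gate: the prompt gives no hint of the minimum age,
    and a rejected device cannot simply re-submit an older date."""
    if device_id in BLOCKED_DEVICES:
        return False  # blocks re-submission of a different (older) birth date
    if age_on(birth_date, today) < MINIMUM_AGE:
        BLOCKED_DEVICES.add(device_id)
        return False
    return True  # the gate relies entirely on the self-declared date
```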
TTL’s data protection impact assessment on age-appropriate design did not identify the specific risks of users under 13 accessing the platform, or the further risks arising from this. The DPC viewed this as a failure to implement appropriate measures to ensure and demonstrate compliance with the GDPR, contrary to Article 24(1). This is an important indicator of the regulatory expectation that digital services with minimum age thresholds must still account, including in their data protection impact assessments, for risks to users below the permitted minimum age.
The DPC proposed in its draft decision to find that TTL’s age-verification measures otherwise complied with the GDPR. Following an objection from the Italian supervisory authority, the EDPB analysed this point, concluded that there was insufficient information to conclusively assess TTL’s compliance, and instructed the DPC to amend its finding accordingly.
As such, the DPC’s decision included a statement to the effect that it could not be concluded that the age-verification measures deployed by TikTok infringed the GDPR. In other words, the positive statement in the draft decision expressing the DPC’s view that TikTok had complied with Article 25 in this regard was removed at the direction of the EDPB.
The decision also contains a comment on requiring hard identifiers as a method of age verification. The DPC accepted TTL’s contention that this would be a disproportionate measure: its view was that, because children are unlikely to hold or have access to hard identifiers, such a requirement would exclude or lock out child users who would otherwise be able to use the platform.
EDPB’s view on age verification
For organisations whose services impose minimum user ages but attract mixed-age populations of adults and children, the portion of the decision relating to age verification is potentially the most significant. This is because the EDPB carried out a lengthy analysis of TTL’s age-verification measures, taking into account, as required by Article 25(1) of the GDPR, the nature, scope, context and purposes of processing, the risks to child users, the state of the art and the costs of implementation.
In its analysis, the EDPB pointed out that “appropriate” technical and organisational measures under Article 25 must be effective, which in turn requires robustness. Given the high risk of the processing, the EDPB expressed serious doubts about the effectiveness of TTL’s neutral age gate as an age-verification solution, noting that:
- the age gate could be easily circumvented;
- presenting the age gate in a neutral manner does not itself sufficiently discourage users from entering an incorrect birth date;
- once a method of circumvention is known, it can easily be shared with peers; and
- since TikTok was rated 12+ in the Apple App Store, users could easily infer that they had to enter a birth date above the age of 12 to use the platform.
Similarly, the EDPB expressed doubts about the effectiveness of TTL’s ex-post measures to identify and remove users under 13 from the platform. Despite these concerns, the EDPB considered that it did not have sufficient information to conclusively assess the state-of-the-art element of TTL’s age-verification measures and, as such, could not conclusively assess TTL’s compliance with data protection by design.
While the EDPB’s decision does not explain why it felt there was not enough information to reach a conclusion here, it is worth noting that its analysis concerned the six months between July and December 2020, and the need to carry out a historical examination some three years after the fact may have been a factor.
Notably, the EDPB’s view is that the appropriateness of age-verification measures changes over time, given its link to the state of the art and the associated risks, and that a controller must periodically review whether such measures remain appropriate.
Overall, though, controllers should not read the lack of an infringement finding concerning the use of the age gate in this case as a green light to use this means of age verification.
Fairness and design choices
The last finding in the DPC’s decision, regarding infringement of the fairness principle, was not a finding originally proposed by the DPC. It was instead mandated by the EDPB’s binding decision and results from an objection raised by the Berlin supervisory authority on behalf of itself and the Baden-Württemberg supervisory authority.
More specifically, the EDPB concluded that the design of the “Registration Pop-Up,” with its “Go Private” or “Skip” options, and the “Video Posting Pop-Up,” with its “Cancel” or “Post Now” options, nudged users towards a certain decision, “leading them subconsciously to decisions violating their privacy interests.” The EDPB took into account the chosen language, sharing the DPC’s view that the word “Skip” seemed to incentivise users to bypass, and even to trivialise, the decision to opt for a private account, which it saw as evidence of nudging. It also considered the placement of the “Skip” and “Post Now” buttons on the right-hand side of the screen, where users are accustomed to clicking to continue and which, in the EDPB’s view, would lead most users to choose that option, as well as the different colour of each option: light grey for “Cancel” and black for “Post Now.”
The EDPB’s direction on this point in its binding decision, requiring the DPC to insert a finding of infringement of the fairness principle, demonstrates the EDPB’s propensity to use the general principles provision in Article 5(1)(a) as a route to additional umbrella-type infringement findings, even where the lead supervisory authority’s investigation did not include such an issue within its scope.
Corrective powers
In respect of the above infringements, the DPC exercised the following corrective powers:
- A reprimand.
- An order to bring TTL’s processing into compliance with the GDPR within three months, to the extent (if any) that TTL is still conducting the processing operations described in the decision. TTL made several service modifications, both during and after the relevant period, which the DPC also treated as a mitigating factor.
- Three administrative fines totalling 345 million euros: 100 million euros for TTL’s infringement of Articles 5(1)(c) and 25(1) and (2), regarding the public-by-default account settings; 65 million euros for infringement of Articles 5(1)(f) and 25(1), regarding the “Family Pairing” feature; and 180 million euros for infringement of Articles 12(1) and 13(1), regarding transparency.
The DPC did not impose a fine for the infringements of Article 24, regarding the public-by-default settings and TTL’s age-verification measures, because the GDPR does not provide for an administrative fine for this infringement. The DPC also did not impose a fine for the infringement of the fairness principle, although the Berlin supervisory authority had requested one in its objection; instead, TTL was ordered to bring its processing into compliance.
Finally, it should be noted that TTL has appealed the DPC’s decision to the Irish High Court and has also issued annulment proceedings before the Court of Justice of the European Union against the EDPB in relation to its binding decision.
How we can help
With the growing focus on children’s data in both Europe and Australia, organisations that work with children must carefully consider how they handle personal information.
Our privacy and data security team works with organisations to create workable, compliant privacy frameworks and to implement information-handling practices that are resilient to data security threats. Our deep understanding of the education and not-for-profit sectors means we are well equipped to support organisations that work with children on privacy requirements.
Contact us
Please contact us for more detailed and tailored help.
Disclaimer: This article provides general information only and is not intended to constitute legal advice. You should seek legal advice regarding the application of the law to you or your organisation.