
By Karim El Labban · ZERO|TOLERANCE

EU GDPR · September 15, 2023 · 9 min read

# TikTok Fined EUR 345M for Children's Data Violations

The Irish Data Protection Commission (DPC) fined TikTok Technology Limited EUR 345 million in September 2023 for multiple violations of the GDPR in its processing of children's personal data.

The investigation found that TikTok set children's accounts (users aged 13-17) to public by default, exposing their content and profile information to all TikTok users and the wider internet.

The DPC also found critical flaws in TikTok's “Family Pairing” feature, which failed to verify that the adult account linking to a child's account was actually the child's parent or guardian. It also identified that the platform's direct messaging settings for children were insufficiently restrictive.

## Key Facts

  • **What:** TikTok set children's accounts to public by default, exposing their data.
  • **Who:** TikTok users aged 13-17 across the EEA.
  • **Data Exposed:** Children's profiles, videos, locations, and direct messaging access.
  • **Outcome:** Irish DPC fined TikTok EUR 345M; EDPB issued binding decision.

## What Was Exposed

  • Children's (ages 13-17) profile information including usernames, profile photos, biographical details, and account activity visible to all users globally due to public-by-default settings
  • Video content created by children publicly accessible and indexable by search engines, including content revealing faces, locations, schools, and daily routines
  • Children's direct messaging capabilities exposed to potential contact from unknown adults through insufficiently restrictive default messaging settings
  • Family Pairing feature allowing any adult TikTok account to link to a child's account without verification of parental or guardian relationship
  • Behavioral and engagement data from children's accounts processed for algorithmic content recommendation without appropriate safeguards for minor users

## Regulatory Analysis

The DPC's investigation, initiated in September 2021 as part of two own-volition inquiries into TikTok's processing of children's data, identified violations of multiple GDPR provisions.

The decision was adopted under the GDPR's Article 60 cooperation procedure, with the European Data Protection Board (EDPB) issuing a binding decision under the Article 65 dispute-resolution mechanism that directed the DPC to strengthen certain aspects of its findings.

The primary violation concerned Articles 25(1) and 25(2): data protection by design and by default.

The DPC found that TikTok's default setting of children's accounts to “public” directly violated Article 25(2), which requires that by default, only personal data necessary for each specific purpose of the processing is made accessible.

A public profile setting maximizes rather than minimizes the accessibility of children's personal data, representing the exact opposite of what data protection by default requires.

The DPC noted that children, as a vulnerable category of data subjects identified in Recital 38 of the GDPR, are entitled to enhanced protections that TikTok failed to implement.

Article 5(1)(c), the principle of data minimization, was violated because TikTok collected and made publicly available more personal data from children than was necessary for providing the platform's core functionality.

A child's content and profile could be viewed by the entire internet without any demonstrated necessity for this breadth of exposure.

The DPC determined that a private-by-default setting for children would have achieved TikTok's stated purpose of enabling content sharing within a controlled audience without exposing children's data to the open internet.
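The minimization principle can be illustrated with a small sketch. The field names and visibility rules below are hypothetical, not TikTok's actual data model; the point is simply that a minor's publicly served data defaults to the minimum, while any broader exposure must be deliberately configured.

```python
# Hypothetical sketch of Article 5(1)(c) data minimization applied to
# profile visibility. Field names and rules are illustrative only.

MINOR_PUBLIC_FIELDS = set()                 # nothing public by default for minors
ADULT_PUBLIC_FIELDS = {"username", "bio"}   # adults may expose a limited subset

def publicly_visible_fields(profile: dict, age: int) -> dict:
    """Return only the fields that may be served to unauthenticated viewers."""
    allowed = ADULT_PUBLIC_FIELDS if age >= 18 else MINOR_PUBLIC_FIELDS
    return {k: v for k, v in profile.items() if k in allowed}

profile = {"username": "demo_user", "bio": "hi", "school": "Example High", "location": "Dublin"}
print(publicly_visible_fields(profile, age=15))  # → {}
print(publicly_visible_fields(profile, age=30))  # → username and bio only
```

Under this kind of allow-list design, exposing a minor's school or location would require an explicit rule change rather than being the accidental consequence of a public default.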

Article 5(1)(f), the principle of integrity and confidentiality, was violated through the Family Pairing feature.

The DPC found that TikTok did not implement any meaningful verification to confirm that an adult user linking to a child's account through Family Pairing was actually the child's parent or guardian.

This meant that any adult user could potentially gain a degree of control over a child's account settings, including privacy and content filtering preferences.

The DPC characterized this as a failure to ensure appropriate security of children's personal data, including protection against unauthorized access.

The investigation also identified violations of Article 24 (responsibility of the controller) and Article 12 (transparent information and communication).

TikTok's privacy information directed at children was found to use language and formatting that was not genuinely accessible to the age group.

The DPC referenced the EDPB's Guidelines 05/2020 on consent, which emphasize that information provided to children must be in clear, plain language that the child can easily understand.

The EDPB's binding decision under Article 65 was significant because it required the DPC to expand its findings regarding Article 25(1) to include TikTok's failure to implement age-appropriate design features throughout the platform, not only in the specific settings examined.

The EUR 345 million fine reflected the vulnerability of the affected data subjects (children), the scale of the processing during the period examined by the inquiry (31 July to 31 December 2020), and the global reach of TikTok's platform.

## What Should Have Been Done

TikTok should have implemented private-by-default settings for all accounts identified as belonging to users under 18 from the earliest stage of its European operations.

This means that children's accounts should have been set to private visibility, with content accessible only to approved followers, and with direct messaging restricted to existing mutual connections.

These defaults should have been the baseline, with any loosening of restrictions requiring explicit, informed action by the account holder with age-appropriate guidance explaining the implications.
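The defaults described above can be sketched in a few lines. The setting names and age threshold here are illustrative assumptions, not TikTok's real configuration schema; the sketch shows the structure Article 25(2) demands: the most protective configuration is the starting point, and loosening it for a minor requires an explicit, informed confirmation step.

```python
# Hypothetical sketch of data protection by default (Article 25(2)) for
# account creation. Setting names and the age threshold are illustrative.

from dataclasses import dataclass

@dataclass
class AccountSettings:
    visibility: str   # "private" or "public"
    dm_policy: str    # "mutual_only", "followers", or "everyone"

def default_settings(age: int) -> AccountSettings:
    """Most protective defaults for under-18 accounts."""
    if age < 18:
        return AccountSettings(visibility="private", dm_policy="mutual_only")
    return AccountSettings(visibility="public", dm_policy="followers")

def loosen_visibility(settings: AccountSettings, age: int, confirmed: bool) -> AccountSettings:
    """A minor may only go public after an explicit, informed confirmation."""
    if age < 18 and not confirmed:
        raise PermissionError("explicit, informed confirmation required for minors")
    settings.visibility = "public"
    return settings
```

The key design point is that the protective state is what the code produces when nothing else happens; exposure is the exception that must be actively and knowingly chosen.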

The Family Pairing feature should have incorporated robust verification of the parental or guardian relationship before granting any control over a child's account.

This could have been achieved through multi-factor verification including requiring the child to confirm the pairing from their device, verifying the adult's identity through government-issued identification, or implementing a challenge-response mechanism that only a genuine parent or guardian could complete.

At minimum, the feature should have notified the child when a pairing request was received and provided a clear mechanism to revoke the pairing at any time.
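The minimum safeguards just described (child notification, child-side confirmation, and revocation at any time) can be sketched as a simple state machine. The class and method names below are hypothetical, not TikTok's actual implementation; the sketch shows that no adult control exists until the child confirms, and none survives revocation.

```python
# Hypothetical sketch of a Family Pairing flow with child-side confirmation
# and revocation. Class and method names are illustrative.

class PairingRequest:
    def __init__(self, adult_id: str, child_id: str):
        self.adult_id = adult_id
        self.child_id = child_id
        self.child_confirmed = False
        self.active = False

    def notify_child(self) -> str:
        # The child is told who is requesting control before anything links.
        return f"Account {self.adult_id} requested pairing with your account."

    def confirm_by_child(self) -> None:
        # No control is granted until the child confirms from their own device.
        self.child_confirmed = True
        self.active = True

    def revoke(self) -> None:
        # The child can sever the pairing at any time.
        self.active = False

def adult_can_change_settings(req: PairingRequest) -> bool:
    """Control exists only while a child-confirmed pairing is active."""
    return req.child_confirmed and req.active
```

Compare this with the flaw the DPC found: in the absence of the confirmation gate, `adult_can_change_settings` would effectively return true for any adult who initiated a pairing.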

TikTok should have implemented a comprehensive age-appropriate design framework aligned with the principles articulated in the UK Information Commissioner's Office's Children's Code (Age Appropriate Design Code) and the EDPB's guidance on children's data.

This framework should have included age-gated content recommendation algorithms that do not expose children to content designed for adults, reduced data collection for children's accounts limiting processing to the minimum necessary for core functionality, and enhanced parental controls that are verified and transparent.
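One element of that framework, the age-gated recommendation filter, can be sketched as follows. The content labels and threshold are illustrative assumptions rather than a real moderation taxonomy; the sketch shows adult-labeled candidates being excluded from a minor's feed before ranking.

```python
# Hypothetical sketch of an age-gated recommendation filter. The "audience"
# labels and the age threshold are illustrative, not a real taxonomy.

def filter_feed(candidates: list, viewer_age: int) -> list:
    """Drop items labeled for adult audiences when the viewer is a minor."""
    if viewer_age >= 18:
        return candidates
    return [c for c in candidates if c.get("audience") != "adult"]

feed = [{"id": 1, "audience": "general"}, {"id": 2, "audience": "adult"}]
print(filter_feed(feed, viewer_age=15))  # only item 1 remains
```

Placing the gate before the ranking stage, rather than after, ensures that adult-oriented content never competes for a child's attention in the first place.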

Privacy information and terms of service directed at children should have been developed with child development specialists to ensure genuine comprehension.

This includes using visual explanations, age-appropriate language, and interactive tutorials rather than lengthy text-based policies.

Regular user testing with children in the target age range should have been conducted to verify that privacy information was actually understood by its intended audience.

From a governance perspective, TikTok should have appointed a dedicated children's safety officer with board-level reporting authority, and should have conducted regular Data Protection Impact Assessments (DPIAs) under Article 35 specifically focused on the processing of children's data.

These DPIAs should have been updated whenever new features were launched or existing features were modified in ways that could affect child users.
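The DPIA-refresh rule above amounts to a simple predicate that could be wired into a feature-release checklist. The change-record fields below are hypothetical, not drawn from any real release process; the sketch only illustrates the trigger condition: any new or modified processing that can affect child users requires an updated assessment.

```python
# Hypothetical sketch of an Article 35 DPIA trigger check for a release
# process. The change-record field names are illustrative.

def dpia_update_required(change: dict) -> bool:
    """An updated DPIA is needed when a change affecting minors introduces
    a new feature or modifies existing processing of their data."""
    return bool(change.get("affects_minors")) and (
        bool(change.get("new_feature")) or bool(change.get("modifies_processing"))
    )
```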

The EUR 345 million fine establishes that platforms must treat child users as a protected class requiring maximum default protections.

Public-by-default is never acceptable for children's accounts. Any feature purporting to provide parental controls must actually verify the parental relationship; otherwise it creates a security vulnerability, not a safeguard.
