UK: +44 (0) 207 0528 285 | IE: +353 (0) 51 833 958

Transparency while processing Children’s personal data

Transparency is a key obligation for any data controller to discharge when processing personal data, as set out in Article 5(1)(a) of the GDPR. Recently, documents filed in the UK courts have accused TikTok, a popular social media platform, of processing children's personal data without meeting transparency obligations or seeking consent as required by law.

This article analyses the allegations levelled against TikTok and provides guidance on meeting transparency obligations when processing children's personal data, with a focus on the UK's data protection regime.

Accusations: 

TikTok has been accused of violating the following provisions of the UK General Data Protection Regulation (UK GDPR):

  • Failing to adhere to Article 8 of the UK GDPR, concerning the conditions applicable to a child's consent in relation to information society services;
  • Failing to adhere to Article 13 of the UK GDPR, concerning the information to be provided when personal data is obtained directly from the data subject;
  • Failing to adhere to Article 14 of the UK GDPR, concerning the information to be provided when personal data is not obtained from the data subject; and
  • Failing to comply with Article 6 of the UK GDPR, concerning the lawful processing of data.

TikTok allegedly processed children's personal information, including phone numbers, videos, exact location and biometric data. Articles 13(e) and 14(e) clearly state that data subjects must be informed about the types of data being processed about them, including videos and biometric data, how that data will be used (e.g., for marketing) and whether it will be shared with third parties.

The accusation also alleges that TikTok did not obtain the relevant consent. Article 8 of the UK GDPR sets the age of digital consent at 13 years old. Where the child is below the age of 13 years, such processing shall be lawful only if and to the extent that consent is given or authorised by the holder of parental responsibility over the child.

Guidance on ensuring Transparency while processing Children’s Personal Data: 

Data controllers that process children's personal data must have a child-friendly Privacy Notice that is clear and presented in plain, age-appropriate language. We recommend that data controllers and other organisations consider the following necessary elements of a child-friendly privacy notice to ensure the discharge of the transparency obligations set out in Article 5(1)(a) of the UK GDPR:

  • Use child-friendly ways of presenting privacy information, such as diagrams, cartoons, graphics and videos, dashboards, layered and just-in-time notices, and icons and symbols.
  • Explain to children why the personal data being requested is required, and what will be done with it, in a way which they can understand.
  • Explain the risks inherent in the processing, and how safeguards will be implemented, in a child-friendly way, so that children (and their parents) understand the implications of sharing their personal data.
  • Tell children what rights they have over their personal data in language they can understand.

Additionally, organisations will benefit from considering the ICO's guidance on processing children's personal data.

Future Legislation to comply with when processing Children's Personal Data:

Since the TikTok suit was filed, a Children's Code (also known as the Age Appropriate Design Code) has come into force in the UK, requiring digital services that are likely to be accessed by children to comply with a set of design standards intended to prioritise privacy and safeguard minors from being tracked and profiled. Platforms that fail to respect the code may draw wider UK GDPR scrutiny (and fines) from the UK's Information Commissioner's Office.

Tech platforms operating in the UK are also facing a major new content oversight regime with a strong child protection component via the proposed Online Safety Bill, which currently proposes heavy fines for violations.

It is clear that data controllers and social media platforms that process children's personal data in the UK will need to be alert in monitoring new compliance obligations as the data protection regime concerning children's data continues to evolve.

Trilateral’s Data Governance and Cyber Risk Team have data protection specialists with extensive expertise and experience in implementing and monitoring processing of children’s data to meet legislative requirements. Please feel free to contact our advisors, who would be happy to speak with you about your compliance needs.
