Longer Reads

Two key decisions highlight issues when handling children’s data

TikTok and ChatGPT: our Data Privacy team comments on two major data protection decisions and offers its recommendations to organisations on how to process personal information relating to minors.

2 minute read

Published 4 April 2023


Key information

Two regulatory actions taken in the last few days have provided a salutary warning to organisations that process personal information relating to minors.

First, despite its reputation as a pragmatic watchdog that is generally less likely than its EU counterparts to impose drastic sanctions for non-compliance, the Information Commissioner’s Office (the UK’s data protection regulator, commonly known as the ICO) has today fined the video-sharing platform TikTok £12.7 million for its misuse of children’s data.

Second, the ICO’s Italian counterpart, the Garante per la protezione dei dati personali (the ‘Garante’ for short), took the decision last Friday to block the widely publicised chatbot, ChatGPT, in Italy. The Garante’s decision was partly based on concerns that ChatGPT lacks an appropriate age verification mechanism, resulting in children potentially receiving inappropriate responses to their inputs.

Why did the ICO fine TikTok?

The ICO has stated that TikTok ‘did not do enough to check who was using their platform or take sufficient action to remove the underage children that were using their platform’. Even though the Chinese-owned platform’s terms of service prohibit its use by individuals under the age of 13, the ICO found that TikTok had breached the UK GDPR between May 2018 and July 2020 by allowing up to 1.4 million children below that age to use its services without parental consent. UK law permits only children aged 13 or over to give consent themselves. As a result, TikTok may have used underage children’s data to track and profile them, potentially serving up harmful or inappropriate content to them.

In addition, the ICO found that TikTok had failed to provide adequate information, in an easy-to-understand way, to those using the platform about how their data is collected, used, and shared. Privacy notices must be written with the intended audience in mind, in plain, age-appropriate language. By failing to do this, TikTok did not, in the ICO’s view, enable its users under the legal age of majority to make informed choices about whether and how to engage with the platform.

The scale of the fine reflects the findings of the ICO’s investigation into TikTok’s processing of children’s data, which revealed failures to respond in a timely and adequate manner to internal concerns raised by TikTok’s own employees about underage children’s use of the app.

Why did the Garante block ChatGPT in Italy? 

In its explanation for blocking ChatGPT, the Garante cited a number of concerns regarding the AI-powered system, including a data breach in late March (which exposed conversations and certain subscribers’ personal information) and the apparent lack of any lawful basis on which to process the enormous volumes of personal data used by OpenAI (the US organisation developing the chatbot) to train the underlying algorithm.

In the context of children’s data specifically, the Garante stated that ChatGPT’s lack of a suitable age verification mechanism ‘exposes children to receiving responses that are absolutely inappropriate to their age and awareness, even though the service is allegedly addressed to users aged above 13 according to OpenAI’s terms of service’.

What are the key takeaways from these decisions?

The decisions taken this week by the ICO and the Garante are a timely reminder that children require particular protection when you collect and process their personal information, since they may be less aware of the risks involved. Specific points to note include the following:

– Neither TikTok nor ChatGPT is owned by a UK- or EU-based organisation. Both are nonetheless subject to the UK GDPR and EU GDPR, which apply to the extent those services target, monitor, or otherwise process the personal data of individuals based in the UK or EU (as applicable). If your business handles information relating to UK- or EEA-based customers, the obligations under the relevant legislation will therefore govern your use of that information even if you have no establishment in the UK or the EEA.

– Even if your terms and conditions state that individuals below a certain age cannot use your services, if you anticipate that underage use is nonetheless likely, you should implement and maintain an effective age verification mechanism (a minimal sketch of the underlying decision logic follows this list). You should also consider introducing other suitable technical and organisational measures to police this, such as training moderators to identify underage accounts and providing tools for parents and guardians to request the deletion of their underage children’s accounts. If concerns are raised by your customers or staff, it is essential to act swiftly and to document the steps you take so that you can demonstrate accountability.

– If you provide services that children are likely to access, such as online gaming services and social media sites, then you should as a minimum follow relevant guidance issued by data protection regulators, such as the ICO’s Age-Appropriate Design Code. This will help you to design your systems and processes with the protection of children in mind at every stage, and to ensure that children are able to understand how you use their personal information and what their rights are.
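To make the age-gate point above concrete, the following is a minimal sketch, in Python, of the decision logic such a mechanism might implement. The 13-year threshold reflects the UK consent age discussed above; the function and parameter names are hypothetical, and a self-declared date of birth would not by itself amount to ‘effective’ verification, so a real deployment would pair this logic with stronger verification signals and an auditable record of each decision.

    from datetime import date

    UK_CONSENT_AGE = 13  # minimum age to give consent under UK law

    def age_on(dob: date, today: date) -> int:
        # Age in whole years on a given date, accounting for whether
        # the birthday has already occurred in the current year.
        return today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))

    def registration_decision(dob: date, today: date, parental_consent: bool = False) -> str:
        # Decide whether a sign-up may proceed on the user's own consent,
        # may proceed only with verified parental consent, or must be refused.
        if age_on(dob, today) >= UK_CONSENT_AGE:
            return 'allow'
        if parental_consent:
            return 'allow_with_parental_consent'
        return 'refuse'  # refusals should also be logged for accountability

    # Example: a child born in June 2012 is 10 years old on 4 April 2023,
    # so the sign-up is refused absent verified parental consent.
    assert registration_decision(date(2012, 6, 1), date(2023, 4, 4)) == 'refuse'

Keeping the consent age in a single named constant, rather than hard-coding it at each decision point, also makes the threshold straightforward to vary by jurisdiction: the EU GDPR permits member states to set it anywhere between 13 and 16.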

For more information, please visit our Data protection lawyers page.
