How to comply with the Children’s Code

If your company provides information society services (ISS) likely to be accessed by children in the United Kingdom (UK), the Children’s Code will apply. It sets out how to design data protection safeguards into online services so that they are appropriate for use by, and meet the development needs of, children, with the aim of better protecting children online. Our guide explains all you need to know about the Children’s Code for UK companies and for non-UK companies with a branch, office or establishment in the UK.

Update, 30 May 2023

The ICO provided an update to the Children’s Code clarifying when an edtech provider may be within scope of the code (see below).

What is the Children’s Code?

The Children’s Code (formally entitled the ‘Age Appropriate Design Code’) is a statutory code of practice under the Data Protection Act 2018 (DPA 2018) requiring organisations to provide better online privacy protections for children. Organisations have had to conform to it since 2 September 2021. It aims to ensure that children automatically have a baseline of protection by design and by default. Importantly, it is not limited to services specifically directed at children.

Does the Children’s Code apply to my organisation?

According to the Information Commissioner’s Office (ICO), the code applies to organisations providing ‘information society services’, i.e. online products or services (including apps, programs, websites, games, community environments, and connected toys or devices with or without a screen) that process personal data and are likely to be accessed by children under the age of 18 in the UK. It is not, however, limited to services aimed specifically at children.

It also applies to online services based outside the UK that have a branch, office or other ‘establishment’ in the UK and process personal data in the context of the activities of that establishment; in essence, companies which, because they have a UK establishment, would not fall under the requirement to appoint a UK representative.

As the code is risk-based, it does not apply to all organisations in the same way. Your organisation is more likely to have to take steps to conform with the code if you are responsible for designing, developing or providing online services such as apps, connected toys, social media platforms, online games, educational websites and streaming services that use, analyse and profile children’s data.

All of the major social media platforms (e.g. TikTok) and online services (e.g. Google) likely to be accessed by children in the UK will therefore need to conform to the code.

In its May 2023 update to the Children’s Code, the ICO clarified that providers of edtech products and services that process children’s personal information may be in scope of the code, even if the edtech is provided via a school or on a non-profit basis. Edtech likely to be accessed by children on a direct-to-consumer basis may fall under the code if it also does any of the following:

  • determines or influences the purposes for which personal information will be processed (e.g. by setting the parameters of how the information can and will be processed);
  • processes children’s personal information for research purposes, where the research is not the core service procured by the school;
  • processes children’s personal information for marketing and advertising; or
  • processes children’s personal information for its own commercial purposes, which includes product development.

How do you know if services are ‘likely to be accessed by children’?

If your service is designed for and aimed at under-18s, the code applies. However, even if your service is not specifically targeted at children but is likely to be used by under-18s, the code will still apply (s. 123 of the DPA 2018). In practical terms, whether or not your service is likely to be accessed by children will depend on:

  • the nature and content of the service and whether it particularly appeals to children;
  • how the service is accessed and the measures you have put in place to prevent children from gaining access (one such measure is sketched below).
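
To make the second factor concrete, here is a minimal, purely illustrative sketch of one access-prevention measure: a sign-up flow that routes anyone who cannot be established as over 18 to a high-privacy experience. The code itself is technology-neutral, so every name, type and threshold below is our own assumption rather than an ICO requirement.

  // Hypothetical sketch: route sign-ups whose age cannot be established
  // as 18+ to a protective, high-privacy experience. Illustration only.

  interface SignupRequest {
    dateOfBirth: string; // ISO 8601, e.g. "2010-04-01"
  }

  type Experience = "adult" | "child-high-privacy";

  function yearsBetween(from: Date, to: Date): number {
    const years = to.getFullYear() - from.getFullYear();
    const hadBirthdayThisYear =
      to.getMonth() > from.getMonth() ||
      (to.getMonth() === from.getMonth() && to.getDate() >= from.getDate());
    return hadBirthdayThisYear ? years : years - 1;
  }

  function routeSignup(req: SignupRequest, now: Date = new Date()): Experience {
    const dob = new Date(req.dateOfBirth);
    if (Number.isNaN(dob.getTime())) {
      // Age unknown: default to the protective path, not the adult one.
      return "child-high-privacy";
    }
    return yearsBetween(dob, now) >= 18 ? "adult" : "child-high-privacy";
  }

  console.log(routeSignup({ dateOfBirth: "2010-04-01" })); // "child-high-privacy"

Note that self-declared dates of birth are only one possible measure; what counts as adequate will depend on the risks your service poses.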

The ICO is currently consulting on its draft guidance on ‘likely to be accessed’ in the context of the code; the consultation closes on 19 May 2023. To assist organisations, it has already published a list of factors to examine when determining whether the code applies, FAQs, and four case studies covering online dating, pornography, games and social media.

In any case, you must assess whether children are likely to access your service, even if you run an adult-only service.

Standards of age-appropriate design

Organisations falling under the code must ensure their services and products adhere to the “standards of age-appropriate design”. As the ICO explains,

“the standards are not intended as technical standards, but as a set of technology-neutral design principles and practical privacy features. The focus of the code is to set a benchmark for the appropriate protection of children’s personal data. Different services will require different technical solutions.”

The code sets out 15 standards, among them transparency, age appropriate application, online tools, parental controls, default settings and data sharing. Which standards matter most for a given service or product must be assessed on a case-by-case basis; consulting an expert is recommended.
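
By way of illustration, the ‘default settings’ standard could translate into code along the following lines. This is a hypothetical sketch, with all names and fields being our own assumptions rather than ICO specifications: a child account starts at the most protective values, and any relaxation is an explicit opt-in.

  // Hypothetical sketch of "high privacy by default": protective settings
  // first, opt-ins layered on top. Illustration only.

  interface PrivacySettings {
    profileVisibility: "private" | "public";
    geolocationEnabled: boolean;
    personalisedAdsEnabled: boolean;
    dataSharingWithThirdParties: boolean;
  }

  // Most protective values by default.
  const CHILD_DEFAULTS: PrivacySettings = {
    profileVisibility: "private",
    geolocationEnabled: false,
    personalisedAdsEnabled: false,
    dataSharingWithThirdParties: false,
  };

  function createChildAccountSettings(
    optIns: Partial<PrivacySettings> = {}
  ): PrivacySettings {
    // Opt-ins are applied on top of the defaults; in a real service each
    // change would also need to be age-appropriate, explained and auditable.
    return { ...CHILD_DEFAULTS, ...optIns };
  }

  console.log(createChildAccountSettings()); // everything off/private by default

Starting from protective defaults and layering explicit opt-ins on top keeps the burden of action on the service rather than on the child, which is the thrust of the standard.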

Organisations must build the standards set out in the code into their design processes from the start, into subsequent upgrades and service development processes and into their data protection impact assessment (DPIA) process.

Enforcement and fines for the Children’s Code

Organisations have been required to conform to the code since 2 September 2021. If they fail to comply, the ICO has the power to conduct compulsory audits, issue orders to stop processing and impose fines of up to £17.5 million or 4% of annual worldwide turnover, whichever is higher.

The ICO has already investigated and enforced several failures to comply with the Children’s Code; most recently and most publicly, it issued a fine of £12.7 million against the social media service TikTok.

Let us help with your data protection needs

Our team of specialist data protection lawyers can provide support on all of your company’s data protection needs, including reviewing:

  • Your existing services to ensure that they comply with regulatory standards
  • Your DPIAs (or conducting new ones)
  • The measures you have taken to protect and secure the personal data you process

We will identify any additional measures you need to take now and provide detailed recommendations on how to implement them. See all of our data protection support services.