On April 15, 2019, the UK Information Commissioner’s Office (the “ICO”) issued for public consultation a draft code of practice, “Age Appropriate Design,” that will regulate the provision of online services likely to be accessed by children in the UK. Given the extraterritorial reach of the UK Data Protection Act 2018, organizations based outside of the UK may be subject to the code, which is expected to take effect by the end of 2019. The deadline for responding to the public consultation is May 31, 2019.

The draft code was published in accordance with the ICO’s obligation under section 123 of the Data Protection Act 2018 to prepare a code of practice on standards of age-appropriate design of online services likely to be accessed by children. The scope of the draft code is broad; it covers social media platforms, apps, online games, messaging services, search engines, online marketplaces, streaming services, news and educational websites, connected toys or devices, and any websites offering goods and services over the Internet. Free services (e.g., funded by advertising revenue) are covered, as are not-for-profit services that would normally be provided for remuneration.

The code will apply to any service that a child (defined as someone under the age of 18) is likely to access, regardless of whether the service provider intends to target children. Even where a service is ostensibly aimed at adults, service providers must be able to demonstrate, with specific documented evidence, that children are not likely to access the service.

The draft code is based on 16 headline standards of age-appropriate design and aims to protect the best interests and privacy of children. The standards are cumulative and interdependent, so that each one must be met in order for a service provider to demonstrate compliance with the code.

Many of the standards expand requirements already included in the EU General Data Protection Regulation (“GDPR”), with a view to providing additional, specific safeguards for children. For example, standard 8 provides that children’s personal data should not be disclosed unless there is a compelling reason for disclosure, considering the best interests of the child. Generalized data sharing for the purposes of commercial reuse is unlikely to meet this standard. The transparency standard (standard 3) reflects the transparency requirement of the GDPR, but specifies that “bite-sized” explanations of how personal data is used should be provided to children “at the point that use is activated.” Information must be provided in “clear language suited to the age of the child.”

The standards also require that all profiling and geolocation settings are, by default, set to “off,” and that a website or app’s settings are “high privacy” by default, meaning that children’s personal data should only be visible or accessible to other users to the extent that the child actively selects these options (standards 6, 9 and 11). Children should be informed of parental monitoring of their online activities (standard 10). When conducting a data protection impact assessment, or DPIA (standard 15), companies are encouraged to take into account the additional risk factors relevant to children accessing online services, such as features that may encourage excessive screen time, or increase exposure to online grooming.

The draft code emphasizes that the best interests of the child should be a primary consideration in the design of online services (standard 1), and that data should not be processed in a way that could be detrimental to a child’s physical or psychological well-being (standard 4). Further, the draft code states that the interests of the processing organization are unlikely to outweigh a child’s right to privacy.

In order to meet the draft code’s requirement to deliver services in an age-appropriate manner, service providers must either apply the code’s standard of protection to all users, or have robust age-verification mechanisms to distinguish children from adult users. The ICO notes that “asking users to self-declare their age or age range does not in itself amount to a robust age-verification mechanism under this code.” The draft code recommends that service providers deliver a child-appropriate service to all users, but provide age-verification options for adults to opt out of the code’s protections, disincentivizing children from lying about their age.

Companies must also avoid using “nudge techniques,” designed to encourage users to select the option favored by the service provider (often a lower privacy option). Such nudges are sometimes employed, for example, where a provider frames one option in more positive language than another, or makes one option more cumbersome to select. The draft code encourages the use of nudges towards higher privacy options, particularly for younger children.

The finalized code will be enforced by the ICO under the Data Protection Act 2018. Processing children’s personal data in breach of the code is likely to result in regulatory action, including enforcement notices and administrative fines of up to €20 million or 4% of annual worldwide turnover, whichever is greater. The ICO will take the code into account when considering whether an online service has complied with its obligations under the GDPR and the Privacy and Electronic Communications Regulations.

The ICO anticipates that the code, when finalized, will become an international benchmark.