On January 21, 2020, the UK Information Commissioner’s Office (“ICO”) published the final version of its Age Appropriate Design Code (“the code”), which sets out the standards that online services need to meet in order to protect children’s privacy. It applies to providers of information society services likely to be accessed by children in the UK, including applications, programs, websites, social media platforms, messaging services, games, community environments, and connected toys and devices, where these offerings involve the processing of personal data.
Online services will fall within the scope of the code even where the service’s funding does not come from the user (e.g., where the funding comes from advertising). Not-for-profit organizations are also covered to the extent that the service constitutes an “economic activity,” though public authorities and websites that merely provide information about a real-world business, without access to products or services, will generally fall outside the code’s scope.
The code lists 15 standards that organizations must meet, including requirements to (1) take into consideration the best interests of children; (2) refrain from using children’s personal data in ways that are detrimental to their wellbeing; and (3) ensure that settings are “high privacy” by default.
The code will now be laid before Parliament for approval. Once the code comes into force, there will be a 12-month transition period to allow organizations to implement the standards. The ICO expects the transition period to end by autumn 2021, and has said that it is preparing a significant package of support for organizations as they work towards compliance.
The code has been published in accordance with the ICO’s obligation under section 123 of the Data Protection Act 2018 to prepare a code of practice on standards of age appropriate design for online services likely to be accessed by children. The ICO consulted on a draft version of the code in April 2019 and received 450 responses. Following this consultation, the ICO updated the draft, reducing the number of standards from 16 to 15 and removing the standard on governance and accountability.
In relation to standard 3 concerning age appropriate application, the ICO added a requirement for organizations to take a risk-based and proportionate approach to recognize the age of users in order to ensure that the standards are effectively applied to child users. Specifically, the standard requires that organizations “[e]ither establish age with a level of certainty that is appropriate to the risks to the rights and freedoms of children that arise from your data processing, or apply the standards in this code to all your users instead.”
Otherwise, the standards themselves remain largely unchanged from the draft circulated for consultation in April 2019.
The final version of the code clarifies that a service is likely to be accessed by children (i.e., those under 18, as per the definition in the UN Convention on the Rights of the Child) if the possibility of the service being accessed by children is more probable than not, adding, “[t]his recognises the intention of Parliament to cover services that children use in reality, but does not extend the definition to cover all services that children could possibly access.”
The code adds that organizations should take a common-sense approach when assessing this likelihood, taking into account the nature and content of the service, its appeal to children, and the accessibility of the service to children.
The code states in its foreword: “For all the benefits the digital economy can offer children, we are not currently creating a safe space for them to learn, explore and play. This statutory code of practice looks to change that, not by seeking to protect children from the digital world, but by protecting them within it.”
The ICO describes the standards as “flexible,” but emphasizes that, “[c]onforming to the standards in this code will be a key measure of your compliance with data protection laws.”