On February 16, 2024, the UK Information Commissioner’s Office (the “ICO”) published its first piece of guidance on content moderation. The ICO defines content moderation in the guidance as the analysis of user-generated content to assess whether it meets certain standards, and any action a service takes as a result of this analysis. This process includes the processing of personal data and, according to the ICO in its statement, “can cause harm if incorrect decisions are made,” for example, where content is incorrectly classified as illegal.

The guidance outlines how UK data protection law applies to content moderation and the impact content moderation can have on information rights. According to the ICO’s statement, the guidance will “help” organizations subject to the UK Online Safety Act 2023 comply with data protection law as they carry out content moderation to meet their online safety duties. The guidance covers areas such as:

  • assessing and mitigating data processing risks;
  • conducting lawful content moderation;
  • transparency regarding content moderation; and
  • ensuring data minimization in content moderation.

The guidance forms part of the ICO’s collaboration with the UK Office of Communications (“Ofcom”) and will be updated, where appropriate, to reflect technological developments and Ofcom’s finalized online safety codes of practice.