On September 4, 2019, the High Court of England and Wales dismissed a challenge to South Wales Police’s use of Automated Facial Recognition technology (“AFR”). The Court determined that the police’s use of AFR had been necessary and proportionate to achieve their statutory obligations.

An individual (Mr. Bridges) had brought judicial review proceedings after South Wales Police launched a project involving the use of AFR (“AFR Locate”). The technology was deployed at certain events and in certain public locations where crime was likely to occur, and was able to capture up to 50 faces per second. The police would then match the captured images, using biometric data analysis, against watchlists of wanted persons held in their own databases. Where an image did not match any of these watchlists, it was immediately and automatically deleted.

Bridges had not been identified as a wanted person but had likely been captured by AFR Locate during its pilot deployment in Cardiff. He considered this to be unlawfully intrusive, specifically under Article 8 of the European Convention on Human Rights (“ECHR”) (the right to respect for private and family life) and UK data protection law, including both the Data Protection Act 1998 (“DPA 1998”) and the Data Protection Act 2018 (“DPA 2018”). With regard to the DPA 1998, Bridges claimed that the prior use of AFR Locate had infringed Section 4(4), as it failed to comply with the data protection principles. He also claimed that future use would constitute a failure to comply with Section 35 of the DPA 2018, which requires that processing of personal data for law enforcement purposes be lawful and fair, and pointed to the fact that the police had failed to carry out an adequate data protection impact assessment (“DPIA”), as required under Section 64(1) of the DPA 2018.

The Court found that the use of AFR did interfere with an individual’s rights under Article 8 of the ECHR, and that this type of biometric data has an intrinsically private character, similar to DNA, as it enables “the extraction of unique information and identifiers about an individual allowing … identification with precision in a wide range of circumstances.” Even though the images were immediately deleted, the process still constituted an interference with Article 8 of the ECHR: momentary storage of the data was sufficient.

The Court nonetheless found that the interference was carried out in accordance with the law, as it fell within the police’s common law powers to prevent and detect crime. The Court also found that the use of the AFR system was proportionate and met existing criteria that the technology be deployed openly and transparently and with significant public engagement: it was deployed only for a limited period and for a specific purpose, and was publicized before its use (for example, on Facebook and Twitter). The Court also pointed to the fact that the pilot had been successful in identifying wanted individuals, noting that “this new technology has resulted in arrests or disposals in 37 cases where the individual in question had not been capable of location by existing methods.”

With regard to data protection law, the Court considered that the images of individuals captured (even those not matched against any watchlist) did constitute personal data, as the technology singled individuals out and made them distinguishable from others. The Court specified that AFR is more complex than simple CCTV, stating:

“AFR technology uses … digital information to isolate pictures of individual faces, extract information about facial features from those pictures, compare that information with … watchlist information, and indicate matches between faces captured through the CCTV recording and those held on the watchlist.”

By its nature, AFR had to make all captured images distinguishable from one another in order to attempt to match them to a watchlist. In fact, the processing was judged to constitute “sensitive processing” under Section 35(8)(b) of the DPA 2018, even though the police had no intention of identifying individuals not present on any of their watchlists.
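To make the pipeline the Court describes concrete, the sketch below walks through the capture-compare-delete logic in Python: extract a feature vector from a detected face, compare it against watchlist entries, flag matches and discard everything else. It is not the actual AFR Locate implementation, which is not public; the function names, the 128-dimension embedding, the similarity threshold and the random-vector placeholder for feature extraction are all illustrative assumptions.

```python
import numpy as np

# Assumed similarity cut-off; the real system's matching threshold is not public.
MATCH_THRESHOLD = 0.8


def extract_face_embedding(face_image: np.ndarray) -> np.ndarray:
    """Placeholder for the 'extract information about facial features' step.

    A real deployment would use a trained face-recognition model; here we
    derive a deterministic pseudo-random vector so the sketch runs end to end.
    """
    rng = np.random.default_rng(abs(hash(face_image.tobytes())) % (2**32))
    vector = rng.standard_normal(128)
    return vector / np.linalg.norm(vector)


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def process_frame(faces: list[np.ndarray], watchlist: dict[str, np.ndarray]) -> list[str]:
    """Compare each captured face with the watchlist and keep only matches."""
    alerts = []
    for face in faces:
        embedding = extract_face_embedding(face)
        best_id, best_score = None, 0.0
        for person_id, reference in watchlist.items():
            score = cosine_similarity(embedding, reference)
            if score > best_score:
                best_id, best_score = person_id, score
        if best_score >= MATCH_THRESHOLD:
            alerts.append(best_id)  # flag the match for review by an officer
        else:
            # No watchlist match: discard the biometric data immediately,
            # mirroring the automatic deletion described in the judgment.
            del embedding
    return alerts
```

The point the Court attached weight to is captured in the else branch: in this sketch, biometric data relating to anyone not on a watchlist is never retained beyond the comparison itself.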

However, the processing did not infringe the relevant principle under the DPA 1998, for the same reasons the Court discussed regarding Article 8 of the ECHR: the Court found that the processing satisfied the conditions of lawfulness and fairness, and was necessary for the police’s legitimate interest in preventing and detecting crime, in line with their common law obligations. The requirement under Section 35(5) of the DPA 2018 that the processing be strictly necessary was also satisfied, as was the requirement that the processing be necessary for the exercise of the police’s functions.

The final requirement under Section 35(5) of the DPA 2018 was that there be an appropriate policy document in place to govern the processing. Although the Court found the relevant policy document in this case to be brief and lacking in detail, it declined to rule on whether the document was adequate, stating that it would leave that assessment to the police in light of more detailed guidance to be released by the UK Information Commissioner’s Office (“ICO”).

Finally, the Court determined that the impact assessment carried out by the police had been sufficient to meet the requirements under Section 64 of the DPA 2018.

The ICO, which recently completed its own investigation into the police’s piloting of this technology, emphasized that it would be reviewing the judgement carefully: “This new and intrusive technology has the potential, if used without the right privacy safeguards, to undermine rather than enhance confidence in the police… Any police forces or private organisations using these systems should be aware that existing data protection law and guidance still apply.”

The ICO stated that it will take the High Court’s judgement into consideration when finalizing its recommendations and guidance regarding the use of Live Facial Recognition systems.