In Bridges, R (On Application of) v The Chief Constable of South Wales Police [2019] EWHC 2341 (Admin), the court was asked to decide whether the current legal regime in the UK is adequate to ensure the appropriate and non-arbitrary use of automated facial recognition (AFR) in a free and civilised society. The case was said to be the first time any court in the world had considered AFR. The parties brought the proceedings before the High Court to seek early guidance on the legal parameters and framework relating to AFR while the technology was still in its trial phase, and before any national roll-out.
The judgment was directed specifically to the way in which the technology had been used to date by South Wales Police (SWP) in a pilot project known as “AFR Locate”. In summary, AFR Locate involves the use of surveillance cameras to capture digital images of members of the public, which are then processed and compared with digital images of people on SWP watchlists. The question for the court was the adequacy of the current legal framework governing AFR Locate under human rights, data protection and equality legislation.
Human rights claim
The court ruled that the use of AFR Locate does entail an interference with the rights of those in the position of the claimant under Article 8 of the European Convention on Human Rights (the right to respect for private life). However, the court considered the police’s common law powers sufficient for this purpose, so no new express statutory powers were needed.
The court also held that there is a clear and sufficient legal framework governing whether, when and how AFR Locate may be used. What is important is to focus on the substance of the actions that using AFR Locate entails, not simply that it involves a first-time deployment by SWP of an emerging technology. The fact that a technology is new does not mean that it is outside the scope of existing regulation, or that it is always necessary to create a bespoke legal framework for it. The legal framework within which AFR Locate operates comprises three elements or layers (in addition to the common law), namely: (a) primary legislation; (b) secondary legislative instruments in the form of codes of practice issued under primary legislation; and (c) SWP’s own local policies. Each element provides legally enforceable standards.
It was neither necessary nor practical for legislation to define the precise circumstances in which AFR Locate may be used, that is, to the extent of identifying precisely which offences might justify inclusion as a subject of interest or precisely what the sensitivity settings should be. Taking these matters as examples, the data protection principles provide sufficient regulatory control to avoid arbitrary interferences with human rights. The legal framework provides a sufficient level of certainty and foreseeability. It provides clear legal standards to which SWP will be held. The court also took account of the fact that AFR Locate is still in a trial period. The content of SWP’s policies may be altered and improved over the course of this trial. The possibility (or even the likelihood) of such improvement was not evidence of present deficiency.
The court said that the use of AFR Locate does entail sensitive processing of the personal data of members of the public under section 35 of the Data Protection Act 2018 (DPA 2018). Such processing must not be undertaken other than for cogent and robust reasons.
Nevertheless, the court was satisfied that the use of AFR Locate struck a fair balance and was not disproportionate. AFR Locate was deployed in an open and transparent way, with significant public engagement. On each occasion, it was used for a limited time, and covered a limited footprint. It was deployed for the specific and limited purpose of seeking to identify particular individuals (not including the claimant) who may have been in the area and whose presence was of justifiable interest to the police. On one occasion it led to two arrests. On the other occasion it identified a person who had made a bomb threat at the same event the previous year and who had been subject to a (suspended) custodial sentence. On neither occasion did it lead to a disproportionate interference with anybody’s rights. Nobody was wrongly arrested. Nobody complained as to their treatment (except the claimant on a point of principle).

Any interference with the claimant’s rights would have been very limited. The interference would be limited to the near instantaneous algorithmic processing and discarding of the claimant’s biometric data. No personal information relating to the claimant would have been available to any police officer, or to any human agent. No data would be retained. There was no attempt to identify the claimant. He was not spoken to by any police officer.
Data protection claim
The court considered whether facial images were personal data where the person could not be identified because they were not on a police watchlist. It considered both direct and indirect identification, as well as individuation.
The court held that the claimant succeeded on the individuation point: the information recorded by AFR Locate individuated him, that is, it singled him out and distinguished him from all others. The court said that individuals caught on the CCTV cameras were sufficiently individuated because the AFR Locate equipment takes images of their faces; that information is processed to extract biometric facial data, which is in turn compared with information drawn from the watchlist. By its nature, the facial biometric data is information about a natural person. That person is identifiable under data protection legislation because the biometric facial data is used to distinguish that person from any other person so that the matching process can take place. The biometric facial data in issue in the case was qualitatively different and clearly comprised personal data because, per se, it permitted immediate identification of a person. It followed that SWP was (and is) required to process that data consistently with the data protection principles.
The court ruled that the use of AFR Locate was necessary for SWP’s legitimate interests, taking account of the common law obligation to prevent and detect crime, and that the processing was not unwarranted.
The court said it was satisfied that the operation of AFR Locate involved the sensitive processing of the biometric data of members of the public, ie those who are not on the watchlist. The process of comparing the images to see whether someone was on the watchlist could only take place if each image uniquely identified the individual to whom it related. Although SWP’s overall purpose was to identify the people on the watchlist, the biometric information of members of the public also had to be processed so that each of them was uniquely identified for the comparison to take place. That processing therefore came within section 35(8)(b) of the DPA 2018. However, the court declined to rule on this issue, as the Information Commissioner was preparing guidance and it felt that a ruling would be premature.
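To illustrate the processing the court described, the sketch below models, in Python, a purely conceptual matching pipeline: every captured face is converted into a biometric template, compared against watchlist templates, and discarded immediately if there is no match. It is not SWP’s actual system; the feature extractor, the similarity measure and the threshold (a stand-in for the “sensitivity settings” mentioned above) are assumptions made for illustration only.

```python
import numpy as np

# Conceptual sketch only - not SWP's AFR Locate system. It mirrors the
# processing described by the court: every face captured by the camera is
# turned into a biometric template, compared against watchlist templates,
# and discarded straight away if there is no match.

SIMILARITY_THRESHOLD = 0.8  # hypothetical stand-in for a "sensitivity setting"


def extract_biometric_template(face_image: np.ndarray) -> np.ndarray:
    """Stand-in for a facial feature extractor; returns a normalised vector."""
    template = face_image.astype(float).flatten()
    return template / np.linalg.norm(template)


def best_watchlist_match(template: np.ndarray,
                         watchlist: dict[str, np.ndarray]) -> str | None:
    """Return the closest watchlist identity, or None if below the threshold."""
    best_id, best_score = None, 0.0
    for identity, reference in watchlist.items():
        score = float(np.dot(template, reference))  # cosine similarity of unit vectors
        if score > best_score:
            best_id, best_score = identity, score
    return best_id if best_score >= SIMILARITY_THRESHOLD else None


def process_frame(face_image: np.ndarray,
                  watchlist: dict[str, np.ndarray]) -> str | None:
    # Sensitive processing happens here for every passer-by: a biometric
    # template is derived that individuates that person, whether or not
    # they appear on the watchlist.
    template = extract_biometric_template(face_image)
    match = best_watchlist_match(template, watchlist)
    # If there is no match, the template is simply discarded: nothing is
    # retained and no human agent ever sees it.
    del template
    return match
```

The point the court drew from this structure is visible in the sketch: the comparison cannot happen without first deriving a biometric template of every passer-by (hence the sensitive processing of members of the public), but where there is no match the template is discarded near instantaneously and nothing is retained, which underpinned the proportionality finding described above.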
SWP prepared a data protection impact assessment about its use of AFR, which the claimant contended was defective. The court disagreed and held that it complied with section 64 of the DPA 2018. There was a clear narrative explaining the proposed processing, and SWP’s assessment specifically considered the potential for breach of human rights. It also recognised that personal data would be processed, and identified the safeguards in place, such as the duration for which any such data would be retained and the purposes for which it would be used.
Equality duty claim
The court also rejected the claimant’s argument that the use of AFR breached the public sector equality duty in section 149(1) of the Equality Act 2010.
Conclusion of the court
The claimant’s claim for judicial review was therefore dismissed on all grounds. The court was satisfied both that the current legal regime is adequate to ensure the appropriate and non-arbitrary use of AFR Locate, and that SWP’s use of AFR Locate to date had been consistent with the requirements of the Human Rights Act 1998 and data protection legislation.