The ICO has issued its first Opinion on live facial recognition technology (LFR), together with a report on its investigation into the technology's use.
The use of LFR by police has increased in recent years. The ICO’s investigation raises serious concerns about its use, as it relies on huge amounts of sensitive personal information. The ICO found that the current combination of laws, codes and practices relating to LFR will not drive the ethical and legal approach that is needed to manage the risks it presents.
The absence of a statutory code that covers the specific challenges posed by LFR will increase the likelihood of legal failures and undermine public confidence in its use. As a result, the key recommendation from the ICO’s investigation is a call for government to introduce a statutory and binding code of practice on the deployment of LFR. The ICO says that this is needed to give the police and the public enough knowledge about when and how the police can use LFR systems in public spaces. The ICO will therefore be liaising with various bodies on how to progress the recommendation for a statutory code of practice.
The ICO also recommends that more work should be done by a range of agencies and organisations, including the police, government and developers of LFR technology, to eliminate bias in the algorithms, particularly bias associated with ethnicity. This will help build and maintain public confidence and cross-community support.
Taken together, the recommendations from the ICO’s investigation have such far-reaching implications for law enforcement in the UK that the Commissioner has taken the step of issuing the first Commissioner’s Opinion under data protection laws.
The Opinion makes clear that there are well-defined data protection rules which police forces need to follow before and during deployment of LFR. The Opinion recognises the high statutory threshold that must be met to justify the use of LFR, and demonstrate accountability, under the UK’s data protection laws. That threshold is appropriate considering the potential invasiveness of LFR. The Opinion also sets out the practical steps police forces must take to demonstrate legal compliance.
The Opinion is significant because it brings together the findings in the ICO’s investigation, the current landscape in which the police operate, and the recent High Court judgment in R (Bridges) v The Chief Constable of South Wales.
In that case, the High Court held that South Wales Police had used LFR lawfully. However, the judgment concerned specific examples of LFR deployment. It is the Commissioner’s view that the High Court judgment should not be seen as a blanket authorisation for police forces to use LFR systems in all circumstances. When LFR is used, the Opinion should be followed. The Opinion recognises there is a balance to be struck between the privacy that people rightly expect when going about their daily lives and the surveillance technology that the police need to carry out their role effectively. Accordingly, police forces must provide demonstrably sound evidence to show that LFR technology is strictly necessary, balanced and effective in each specific context in which it is deployed.
The ICO’s investigation has concluded but its work in this area will continue. It has carried out research to understand the public’s views on the subject. Public support for the police using facial recognition to catch criminals is high, but less so when it comes to the private sector operating the technology in a quasi-law enforcement capacity. The ICO is therefore separately investigating the use of LFR in the private sector, including where LFR is used in partnership with law enforcement. It will report on this investigation in due course.
From LFR to the development of artificial intelligence systems that analyse gait and predict emotions based on facial expressions, technology moves quickly and police forces will investigate how new techniques can improve their work. But from a regulator’s perspective, the Commissioner says that everyone working in this developing area needs to satisfy the full rigour of UK data protection law. Moving too quickly to deploy technologies that can be overly invasive in people’s lawful daily lives risks damaging trust not only in the technology, but in the fundamental model of policing by consent. The various parties must all work together to protect and enhance that consensus.
In a separate development, the Automated Facial Recognition Technology (Moratorium and Review) Bill has had its first reading. It aims to prohibit the use of automated facial recognition technology in public places and to provide for a review of its use. It is a private member’s Bill and it is unclear whether it will return to Parliament in the new session after the December election.