According to a statement by the Joint Council for the Welfare of Immigrants, the Home Office has agreed, in response to legal action, to stop using its ‘visa streaming’ algorithm. The algorithm will be suspended from 7 August, “pending a redesign of the process,” which will consider “issues around unconscious bias and the use of nationality” in the automated processing of visa applications.
The JCWI says that this is the UK’s first successful court challenge to an algorithmic decision system. The JCWI had asked the court to declare the streaming algorithm unlawful, and to order a halt to its use to assess visa applications, pending a review. The Home Office has agreed to amend the process in response to the claim.
Since 2015, the Home Office has used the algorithm to grade every application for an entry visa to the UK on a traffic-light scale, assigning each applicant a Red, Amber or Green risk rating. Once assigned, this rating plays a major role in determining the outcome of the visa application. Applicants from countries with predominantly non-white populations were more likely to receive a Red rating.
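The Home Office has not published how the tool actually works, but a simplified sketch gives a sense of what a nationality-driven traffic-light rating could look like. Every country name, weight and threshold below is invented for illustration and is not drawn from the real system.

```python
# Hypothetical sketch of a nationality-driven traffic-light rating.
# The real streaming tool's rules were never published; the country
# names, weights and thresholds here are invented for illustration.

RED, AMBER, GREEN = "Red", "Amber", "Green"

# Invented stand-ins for nationalities flagged as "suspect"
SUSPECT_NATIONALITIES = {"Examplestan", "Samplavia"}

def risk_score(application: dict) -> int:
    """Return a toy risk score for a visa application."""
    score = 0
    if application["nationality"] in SUSPECT_NATIONALITIES:
        score += 50   # nationality alone is enough to push the applicant into Red
    if application.get("prior_refusal"):
        score += 30
    return score

def streaming_rating(application: dict) -> str:
    """Map the toy score onto a Red/Amber/Green rating."""
    score = risk_score(application)
    if score >= 50:
        return RED
    if score >= 30:
        return AMBER
    return GREEN

print(streaming_rating({"nationality": "Examplestan"}))  # Red
print(streaming_rating({"nationality": "Othervia"}))     # Green
```

The point the JCWI made is visible even in a sketch this crude: because nationality feeds directly into the score, two otherwise identical applications can end up in different queues.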
The JCWI said that the visa algorithm discriminated on the basis of nationality. Applications made by people holding ‘suspect’ nationalities received a higher risk score; they were subjected to intensive scrutiny by Home Office officials, were approached with more scepticism, took longer to determine, and were much more likely to be refused.
The algorithm was opaque and suffered from a feedback loop, in which biased enforcement and visa statistics reinforced which countries stayed on the list of suspect nationalities. Such feedback loops are a well-documented problem with automated decision systems.
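A toy simulation shows why such a loop is self-sustaining: once a nationality is on the suspect list, the extra scrutiny raises its refusal rate, and that inflated refusal rate is then the evidence used to keep it on the list. All figures below are invented and are not drawn from Home Office data.

```python
# Toy simulation of the feedback loop: countries on the "suspect" list
# receive heavier scrutiny, which raises their refusal rate, which in
# turn keeps them on the list. All numbers are invented for illustration.
import random

random.seed(0)

BASE_APPROVAL = 0.8      # assumed underlying approval chance, identical everywhere
SCRUTINY_PENALTY = 0.3   # assumed extra refusals caused by intensive scrutiny
LIST_THRESHOLD = 0.3     # refusal rate above which a country stays "suspect"

suspect = {"Examplestan"}  # hypothetical starting list

def simulate_year(countries, suspect_list):
    """Return the refusal rate per country for one year of 1,000 applications."""
    refusal_rates = {}
    for country in countries:
        approval = BASE_APPROVAL - (SCRUTINY_PENALTY if country in suspect_list else 0)
        decisions = [random.random() < approval for _ in range(1000)]
        refusal_rates[country] = 1 - sum(decisions) / len(decisions)
    return refusal_rates

for year in range(3):
    rates = simulate_year(["Examplestan", "Othervia"], suspect)
    # The list is rebuilt from the very statistics the list itself distorted.
    suspect = {c for c, r in rates.items() if r > LIST_THRESHOLD}
    print(year, {c: round(r, 2) for c, r in rates.items()}, sorted(suspect))
```

Even though both countries start with the same underlying approval chance, the country that begins on the list never leaves it, because the list is refreshed from refusal statistics that the list itself produced.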
The Home Secretary is putting in place an interim process for visa applications, and has agreed to carry out equality impact assessments and data protection impact assessments for the new system.
The Law Society has welcomed the Home Office’s decision, highlighting its own 2019 report on the use of algorithms, as well as a 2017 report by the Independent Chief Inspector of Borders and Immigration which raised concerns about the “streaming tool”. Other commentators have pointed out that this is another example of the public sector failing to carry out equality and data protection impact assessments, after the Department of Health and Social Care admitted that none had been carried out for the Covid-19 test and trace programme.