The Equality and Human Rights Commission (EHRC) has called for the use of automated facial recognition (AFR) and predictive algorithms in policing in England and Wales to be suspended until their impact has been independently scrutinised and the laws governing their use are improved.
In evidence submitted to the UN on a range of civil and political rights issues, the EHRC has highlighted concerns about how the use of AFR is regulated, and has suggested that AFR may not comply with the UK’s obligation to respect privacy rights under the International Covenant on Civil and Political Rights (ICCPR). The report also raises questions about the technology’s accuracy and points to evidence that many AFR algorithms disproportionately misidentify Black people and women, and therefore could be discriminatory.
The EHRC has also expressed concerns over the use of predictive policing programmes, which use algorithms to analyse data and identify patterns, suggesting that such programmes could replicate and magnify discrimination in policing. Predictive technologies also rely on ‘big data’, which encompasses large amounts of personal information, and the EHRC has warned that this may infringe on privacy rights and result in self-censorship, having a chilling effect on freedom of expression and association.
Rebecca Hilsenrath, Chief Executive of the EHRC, said: