Information Commissioner's Office
Blog: Live facial recognition technology – police forces need to slow down and justify its use
Blog posted by: Elizabeth Denham, Information Commissioner, 31 October 2019.
As far back as Sir Robert Peel, the powers of the police have always been seen as dependent on public support of their actions. It’s an ideal starting point as we consider uses of technology like live facial recognition (LFR).
How far should we, as a society, consent to police forces reducing our privacy in order to keep us safe?
That was the starting point to my office’s investigation into the trials of LFR by the Metropolitan Police Service (MPS) and South Wales Police (SWP). LFR is a step change in policing techniques; never before have we seen technologies with the potential for such widespread invasiveness. The results of that investigation raise serious concerns about the use of a technology that relies on huge amounts of sensitive personal information.
We found that the current combination of laws, codes and practices relating to LFR will not drive the ethical and legal approach that’s needed to truly manage the risk that this technology presents.
The absence of a statutory code that speaks to the specific challenges posed by LFR will increase the likelihood of legal failures and undermine public confidence in its use. As a result, the key recommendation arising from the ICO’s investigation is to call for government to introduce a statutory and binding code of practice on the deployment of LFR. This is necessary in order to give the police and the public enough knowledge as to when and how the police can use LFR systems in public spaces. We will therefore be liaising with the Home Office, the Investigatory Powers Commissioner, the Biometrics Commissioner, the Surveillance Camera Commissioner and policing bodies on how to progress our recommendation for a statutory code of practice.
We also recommend that more work should be done by a range of agencies and organisations, including the police, government and developers of LFR technology, to eliminate bias in the algorithms, particularly bias associated with ethnicity. This will help build and maintain public confidence and cross-community support.
Taken together, the recommendations from our investigation have such far reaching applications for law enforcement in the UK that I have taken the step of issuing the first Commissioner’s Opinion under our data protection laws.
That Opinion makes clear that there are well-defined data protection rules which police forces need to follow before and during deployment of LFR. The Opinion recognises the high statutory threshold that must be met under the UK’s data protection law to justify the use of LFR and demonstrate accountability. That threshold is appropriate considering the potential invasiveness of this technology. My Opinion also sets out the practical steps police forces must take to demonstrate legal compliance.
This Opinion is significant because it brings together the findings of our investigation, the current landscape in which the police operate, and the recent judgment of the High Court in R (Bridges) v The Chief Constable of South Wales. In that case, a member of the public had concerns that his image may have been captured on LFR from a police van while he was out shopping in Cardiff city centre, and he brought the case to ask the courts to decide whether the use of facial recognition in this way by SWP was lawful. The High Court judged that in these instances, SWP used LFR lawfully.
However, the SWP case was a judgment on specific examples of LFR deployment. It is my view that this High Court judgment should not be seen as a blanket authorisation for police forces to use LFR systems in all circumstances. When LFR is used, my Opinion should be followed. My Opinion recognises there is a balance to be struck between the privacy that people rightly expect when going about their daily lives and the surveillance technology that the police need to effectively carry out their role. Therefore it makes clear that police forces must provide demonstrably sound evidence to show that LFR technology is strictly necessary, balanced and effective in each specific context in which it is deployed.
My office’s investigation has concluded, but our work in this area is far from over. We have undertaken our own research to understand the public’s thoughts on the subject. Public support for the police using facial recognition to catch criminals is high, but less so when it comes to the private sector operating the technology in a quasi-law enforcement capacity. We are separately investigating this use of LFR in the private sector, including where LFR is used in partnership with law enforcement. We will be reporting on those findings in due course.
From LFR to the development of artificial intelligence systems that analyse gait and predict emotions based on facial expressions, technology moves quickly. It is right that our police forces should explore how new techniques can help keep us safe. But from a regulator’s perspective, I must ensure that everyone working in this developing area stops to take a breath and works to satisfy the full rigour of UK data protection law. Moving too quickly to deploy technologies that can be overly invasive in people’s lawful daily lives risks damaging trust not only in the technology, but in the fundamental model of policing by consent. We must all work together to protect and enhance that consensus.
For more information, please read:
The Commissioner’s Opinion, the first issued under the Data Protection Act 2018, which sets out advice and recommendations.
The ICO investigation into how the police use facial recognition technology in public places, a report containing our investigation findings.
Elizabeth Denham was appointed UK Information Commissioner on 15 July 2016, having previously held the position of Information and Privacy Commissioner for British Columbia, Canada.
Notes to Editors
- The Information Commissioner’s Office (ICO) is the UK’s independent regulator for data protection and information rights law, upholding information rights in the public interest, promoting openness by public bodies and data privacy for individuals.
- The ICO has specific responsibilities set out in the Data Protection Act 2018 (DPA2018), the General Data Protection Regulation (GDPR), the Freedom of Information Act 2000 (FOIA), Environmental Information Regulations 2004 (EIR) and Privacy and Electronic Communications Regulations 2003 (PECR).
- The General Data Protection Regulation (GDPR) has provisions included in the Data Protection Act 2018. The Act also includes measures related to wider data protection reforms in areas not covered by the GDPR, such as law enforcement and security.
- The Data Protection Act 2018 (DPA 2018), specifically s116 (2) in conjunction with Schedule 13 (2)(d), allows for the Information Commissioner to issue opinions to Parliament, Government or other relevant bodies and the public, on any issue related to the protection of personal data.
The Commissioner can issue opinions on her own initiative or on request. This opinion may also form the basis of the Commissioner’s approach to enforcing Parts 3 and 4 of the DPA 2018 in this particular area.
The opinion may be subject to change or may lead to future guidance and the Commissioner reserves the right to make changes or form new opinions based on further findings or other changes in circumstances.
- To report a concern to the ICO go to ico.org.uk/concerns.