Information Commissioner's Office
Blog: Live facial recognition technology – police forces need to slow down and justify its use
Blog posted by: Elizabeth Denham, Information Commissioner, 31 October 2019.
As far back as Sir Robert Peel, the powers of the police have always been seen as dependent on public support of their actions. It’s an ideal starting point as we consider uses of technology like live facial recognition (LFR).
How far should we, as a society, consent to police forces reducing our privacy in order to keep us safe?
That was the starting point to my office’s investigation into the trials of LFR by the Metropolitan Police Service (MPS) and South Wales Police (SWP). LFR is a step change in policing techniques; never before have we seen technologies with the potential for such widespread invasiveness. The results of that investigation raise serious concerns about the use of a technology that relies on huge amounts of sensitive personal information.
We found that the current combination of laws, codes and practices relating to LFR will not drive the ethical and legal approach that’s needed to truly manage the risk that this technology presents.
The absence of a statutory code that speaks to the specific challenges posed by LFR will increase the likelihood of legal failures and undermine public confidence in its use. As a result, the key recommendation arising from the ICO’s investigation is to call for government to introduce a statutory and binding code of practice on the deployment of LFR. This is necessary in order to give the police and the public enough knowledge as to when and how the police can use LFR systems in public spaces. We will therefore be liaising with the Home Office, the Investigatory Powers Commissioner, the Biometrics Commissioner, the Surveillance Camera Commissioner and policing bodies on how to progress our recommendation for a statutory code of practice.
We also recommend that more work should be done by a range of agencies and organisations, including the police, government and developers of LFR technology, to eliminate bias in the algorithms, particularly that associated with ethnicity. This will help maintain public confidence and cross-community support.
Taken together, the recommendations from our investigation have such far reaching applications for law enforcement in the UK that I have taken the step of issuing the first Commissioner’s Opinion under our data protection laws.
That Opinion makes clear that there are well-defined data protection rules which police forces need to follow before and during deployment of LFR. The Opinion recognises the high statutory threshold that must be met to justify the use of LFR, and to demonstrate accountability, under the UK’s data protection law. That threshold is appropriate considering the potential invasiveness of this technology. My Opinion also sets out the practical steps police forces must take to demonstrate legal compliance.
This Opinion is significant because it brings together the findings in our investigation, the current landscape in which the police operate, and the recent judgment from the High Court in R (Bridges) v The Chief Constable of South Wales, in which a member of the public had concerns that his image may have been captured on LFR from a police van while he was out shopping in Cardiff city centre. He brought the case to ask the courts to decide whether the use of facial recognition in this way by SWP was lawful. The High Court judged that in these instances, SWP used LFR lawfully.
However, the SWP case was a judgment on specific examples of LFR deployment. It is my view that this High Court judgment should not be seen as a blanket authorisation for police forces to use LFR systems in all circumstances. When LFR is used, my Opinion should be followed. My Opinion recognises there is a balance to be struck between the privacy that people rightly expect when going about their daily lives and the surveillance technology that the police need to carry out their role effectively. It therefore makes clear that police forces must provide demonstrably sound evidence to show that LFR technology is strictly necessary, balanced and effective in each specific context in which it is deployed.
My office’s investigation has concluded, but our work in this area is far from over. We have undertaken our own research to understand the public’s thoughts on the subject. Public support for the police using facial recognition to catch criminals is high, but less so when it comes to the private sector operating the technology in a quasi-law enforcement capacity. We are separately investigating this use of LFR in the private sector, including where LFR is used in partnership with law enforcement. We will be reporting on those findings in due course.
From LFR to the development of artificial intelligence systems that analyse gait and predict emotions based on facial expressions, technology moves quickly. It is right that our police forces should explore how new techniques can help keep us safe. But from a regulator’s perspective, I must ensure that everyone working in this developing area stops to take a breath and works to satisfy the full rigour of UK data protection law. Moving too quickly to deploy technologies that can be overly invasive in people’s lawful daily lives risks damaging trust not only in the technology, but in the fundamental model of policing by consent. We must all work together to protect and enhance that consensus.
For more information, please read:
The Commissioner’s Opinion, the first issued under the Data Protection Act 2018, which sets out advice and recommendations.
The ICO’s report on its investigation into how the police use facial recognition technology in public places, which contains our investigation findings.
Elizabeth Denham was appointed UK Information Commissioner on 15 July 2016, having previously held the position of Information and Privacy Commissioner for British Columbia, Canada.
Notes to Editors
- The Information Commissioner’s Office (ICO) is the UK’s independent regulator for data protection and information rights law, upholding information rights in the public interest, promoting openness by public bodies and data privacy for individuals.
- The ICO has specific responsibilities set out in the Data Protection Act 2018 (DPA2018), the General Data Protection Regulation (GDPR), the Freedom of Information Act 2000 (FOIA), Environmental Information Regulations 2004 (EIR) and Privacy and Electronic Communications Regulations 2003 (PECR).
- The General Data Protection Regulation (GDPR) has provisions included in the Data Protection Act 2018. The Act also includes measures related to wider data protection reforms in areas not covered by the GDPR, such as law enforcement and security.
- The Data Protection Act 2018 (DPA 2018), specifically s116(2) in conjunction with Schedule 13(2)(d), allows the Information Commissioner to issue opinions to Parliament, Government or other relevant bodies and the public, on any issue related to the protection of personal data.
The Commissioner can issue opinions on her own initiative or on request. This opinion may also form the basis of the Commissioner’s approach to enforcing Parts 3 and 4 of the DPA 2018 in this particular area.
The opinion may be subject to change or may lead to future guidance and the Commissioner reserves the right to make changes or form new opinions based on further findings or other changes in circumstances.
- To report a concern to the ICO go to ico.org.uk/concerns.