Information Commissioner's Office
Blog: Live facial recognition technology - data protection law applies
Blog posted by: Elizabeth Denham, Information Commissioner, 09 July 2019.
Any organisation using software that can recognise a face in a crowd and then scan large databases of people to check for a match in a matter of seconds is processing personal data.
For the past year, South Wales Police and the Met Police have been trialling live facial recognition (LFR) technology that uses this software in public spaces to identify individuals at risk or those linked to a range of criminal activity, from violent crime to less serious offences.
We understand the purpose is to catch criminals. But these trials also represent the widespread processing of biometric data of thousands of people as they go about their daily lives. And that is a potential threat to privacy that should concern us all.
LFR is a high priority area for the ICO. My office has been conducting an investigation, monitoring the trials carried out by the police. The relevant forces piloting this technology have cooperated with our investigation, and the ICO has learned a great deal from examining closely how it works in practice. Legitimate aims have been identified for the use of LFR. But there remain significant privacy and data protection issues that must be addressed, and I remain deeply concerned about the rollout of this technology.
I believe that there needs to be demonstrable evidence that the technology is necessary, proportionate and effective considering the invasiveness of LFR.
There is also public concern about LFR; it represents a step change from the CCTV of old. Police forces have more to do to demonstrate their compliance with data protection law, including in how watch lists are compiled and which images are used. And facial recognition systems have yet to fully resolve their potential for inherent technological bias, a bias which can produce more false positive matches for certain ethnic groups.
A key concern, currently being looked at in the courts, relates to the need for a detailed framework of safeguards prior to making decisions to implement LFR systems, and governing their use at all stages.
So when a member of the public, supported by civil rights group Liberty, challenged the lawfulness of South Wales Police’s use of LFR via the courts in May, it was crucial for me, as the regulator, to intervene to advise the court about the data protection issues in play.
The case - R (Bridges) v Chief Constable of South Wales Police (SWP) - involves a member of the public who has concerns that his image may have been captured by LFR from a police van while he was out shopping in Cardiff city centre. He has brought the case to ask the courts to decide whether the use of facial recognition in this way by SWP is lawful.
The resulting judgment will form an important part of our investigation and we will need to consider it before we publish our findings.
Whilst the judgment will be important, any force deploying LFR needs to consider a wide range of issues. Our guidance for police forces considering LFR is:
- Carry out a data protection impact assessment and update this for each deployment - because of the sensitive nature of the processing involved in LFR, the volume of people affected, and the intrusion that can arise. Law enforcement organisations are advised to submit data protection impact assessments to the ICO for consideration, with a view to early discussions about mitigating risk.
- Produce a bespoke ‘appropriate policy document’ to cover the deployments - it should set out why, where, when and how the technology is being used.
- Ensure the algorithms within the software do not treat the race or sex of individuals unfairly.
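One way to begin testing the fairness point above is to compare false positive match rates across demographic groups, since the blog notes that bias can produce more false positives for certain ethnic groups. The sketch below is purely illustrative and is not drawn from any ICO guidance or police system; the record fields (`group`, `matched`, `on_watchlist`) are hypothetical names for evaluation data an operator might hold.

```python
# Illustrative sketch only: comparing false positive rates per group.
# Field names ('group', 'matched', 'on_watchlist') are hypothetical.
from collections import defaultdict

def false_positive_rates(results):
    """results: iterable of dicts with 'group' (demographic label),
    'matched' (bool: system flagged a watch-list match) and
    'on_watchlist' (bool: ground truth)."""
    false_pos = defaultdict(int)   # wrongly flagged people, per group
    not_listed = defaultdict(int)  # people not on the watch list, per group
    for r in results:
        if not r["on_watchlist"]:
            not_listed[r["group"]] += 1
            if r["matched"]:
                false_pos[r["group"]] += 1
    return {g: false_pos[g] / not_listed[g] for g in not_listed}

# Tiny worked example with made-up evaluation records:
results = [
    {"group": "A", "matched": True,  "on_watchlist": False},
    {"group": "A", "matched": False, "on_watchlist": False},
    {"group": "B", "matched": False, "on_watchlist": False},
    {"group": "B", "matched": False, "on_watchlist": False},
]
rates = false_positive_rates(results)
# A marked disparity between groups here would warrant investigation
# before any deployment.
```

In practice such a check would need far larger, representative evaluation sets and careful statistical treatment, but the basic comparison of error rates across groups is the starting point.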
Police forces should also ensure they have familiarised themselves with our Guide to Law Enforcement Processing covering Part 3 of the Data Protection Act 2018.
Although data protection law differs for commercial companies using LFR, the technology is the same and the intrusion that can arise could still have a detrimental effect. In recent months we have widened our focus to consider the use of LFR in public spaces by private sector organisations, including where they are partnering with police forces. We’ll consider taking regulatory action where we find non-compliance with the law.
We will continue to contribute to cross-government and international discussions about surveillance technology. We’re planning to report on all of our findings once the judgment in the South Wales Police case has been issued and we will then be setting out what action needs to be taken.
Elizabeth Denham was appointed UK Information Commissioner on 15 July 2016, having previously held the position of Information and Privacy Commissioner for British Columbia, Canada.