Information Commissioner's Office
Blog: Using biometric data in a fair, transparent and accountable manner
As technology takes ever greater strides, so organisations and businesses are harnessing its capabilities to help manage their contact with customers, including using it for means of identification and authentication.
While there are undoubtedly significant benefits in using new technologies, organisations need to be aware of the potential challenges when choosing and using any systems involving biometric data.
In January 2017, HMRC adopted a voice authentication service which asked callers to some of its helplines to record their voice as their password.
A complaint from Big Brother Watch to the ICO revealed that callers were not given further information or advised that they did not have to sign up to the service. There was no clear option for callers who did not wish to register. In short, HMRC did not have adequate consent from its customers and we have issued an enforcement notice ordering HMRC to delete any data it continues to hold without consent.
In the notice, the Information Commissioner says that HMRC appears to have given 'little or no consideration to the data protection principles when rolling out the Voice ID service'.
She highlights the scale of the data collection – seven million voice records – and that HMRC collected it in circumstances where there was a significant imbalance of power between the organisation and its customers. It did not explain to customers how they could decline to participate in the Voice ID system. It also did not explain that customers would not suffer a detrimental impact if they declined to participate.
The case raises significant data governance and accountability issues that require monitoring. We therefore plan to follow up the enforcement notice with an audit that will assess HMRC’s compliance with good practice in the processing of personal data.
The investigation also found that, before the system was launched, there was no data protection impact assessment (DPIA) in place that appropriately considered the compliance risks associated with processing biometric data.
Any organisations planning on using new and innovative technologies that involve personal data, including biometric data, need to think about these key points:
- Under the GDPR, controllers are required to complete a DPIA where their processing is ‘likely to result in a high risk to the rights and freedoms of natural persons’, such as the large-scale use of biometric data. A DPIA is a process which should also ensure that responsible controllers incorporate ‘data protection by design and by default’ principles into their projects. Data protection by design and default is a key concept at the heart of GDPR compliance.
- When you’ve done your DPIA, make sure you act upon the risks identified and demonstrate that you have taken them into account. Use it to inform your work.
- Accountability is one of the data protection principles of the GDPR - it makes you responsible for complying with the GDPR and says that you must be able to demonstrate your compliance by putting appropriate technical and organisational measures in place.
- If you are planning to rely on consent as a legal basis, then remember that biometric data is classed as special category data under GDPR and any consent obtained must be explicit. The benefits from the technology cannot override the need to meet this legal obligation.
This is the first enforcement action taken in relation to biometric data since the advent of GDPR when, for the first time, biometric data was specifically identified as special category data that requires greater protection.
Our guidance on informed consent provides advice for organisations planning to use these kinds of systems and we are currently developing our guidance on biometric data.
With the adoption of new systems comes the responsibility to make sure that data protection obligations are fulfilled and customers’ privacy rights addressed alongside any organisational benefit. The public must be able to trust that their privacy is at the forefront of the decisions made about their personal data.
Steve Wood is Deputy Commissioner for Policy and responsible for the ICO’s policy position on the proper application of information rights law and good practice, through lines to take, guidance, internal training, advice and specific projects.