BCS, The Chartered Institute for IT, explains why abuses of facial biometric data mean we need to get serious about safeguards
BCS, The Chartered Institute for IT, has warned against the rise of a ‘cavalier attitude’ by organisations using ‘flawed’ facial recognition technology to monitor crowds in public spaces.
Dr Bill Mitchell, Director of Policy at BCS, says there is an unprecedented danger of biometric data being misused, including for identity theft, because of a combination of flawed technology and a lack of ethical and rigorous safeguards around how that data is captured, stored and processed.
The top concerns expressed by IT professionals in consultations BCS has carried out over the last 18 months highlight the severe risks of biometric data misuse. They are:
- Poor data governance leaving companies unable to effectively monitor how data is used, who is using it, or where duplicates are stored, which may allow unethical practices to go undetected.
- Lack of diversity in product development teams leading to hard-wired unconscious bias in new products or services that are data-dependent.
- Using incomplete data to incorrectly infer personal characteristics.
- Allowing data to be improperly shared within organisations.
- Improperly aggregating data from different sources to infer personal characteristics.
- Incorrectly cleaning data.
- Incorrectly restructuring data resulting in the wrong data being associated with an individual.
- Incorrectly merging different data pipelines from third parties.
- Not conducting proper due diligence to ensure correct provenance of data through the supply chain (which may well be offshored and distributed across different national jurisdictions).
- Using data analysis methodologies that are invalid in a particular context.
- Applying poorly tested analytical models as part of decision-making processes (including, for example, inappropriate machine learning-based neural networks).
- Using invalid anonymisation techniques that do not provide enough protection against deanonymisation.
- Storing data insecurely so that it is at risk of being misappropriated.
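The anonymisation risk in the list above can be made concrete with a toy "linkage attack": a dataset stripped of names can still be re-identified by joining it against a public dataset on shared quasi-identifiers. This is only an illustrative sketch; all records, field names and the `reidentify` helper below are invented for the example.

```python
# Toy linkage ("deanonymisation") attack: an "anonymised" dataset with
# names removed is joined to a public register on quasi-identifiers
# (here, postcode district and birth year). All data is fictional.

anonymised_health_records = [
    {"postcode": "GU21", "birth_year": 1984, "diagnosis": "asthma"},
    {"postcode": "N1C", "birth_year": 1990, "diagnosis": "diabetes"},
]

# A separate, public dataset that still carries names.
public_register = [
    {"name": "A. Smith", "postcode": "N1C", "birth_year": 1990},
    {"name": "B. Jones", "postcode": "SW1", "birth_year": 1975},
]

def reidentify(anon_rows, public_rows):
    """Join the two datasets on the shared quasi-identifiers."""
    matches = []
    for anon in anon_rows:
        for person in public_rows:
            if (anon["postcode"] == person["postcode"]
                    and anon["birth_year"] == person["birth_year"]):
                matches.append({"name": person["name"],
                                "diagnosis": anon["diagnosis"]})
    return matches

print(reidentify(anonymised_health_records, public_register))
# A single unique match re-attaches a name to a "de-identified" record.
```

Even this trivial join recovers one identity, which is why valid anonymisation has to consider which combinations of remaining fields are unique, not just whether names have been deleted.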
Dr Mitchell says the feedback from these consultations has been quite clear: “Virtually every time we hear the same alarming worries about data governance practices. This directly links to worries about the current cavalier attitude to facial recognition technology. For instance, misappropriated facial biometric data could lead to opportunities for virtual doppelgängers, and poorly captured biometric data can lead to cases of mistaken identity that can have dire consequences that are hard to correct. Much of the concern has been focused on the immaturity of the technology. An even bigger concern is what your biometric data is used for, or rather misused for, once it’s been captured and added to a database.”
The concerns raised by the IT profession come after a series of recent revelations about the widespread use of facial recognition technology. This includes the release of a report by Big Brother Watch, a civil liberties and privacy campaigning organisation, which says there is a facial recognition ‘epidemic’ across privately owned sites in the UK. It says it has found major property developers, shopping centres, museums, conference centres and casinos using the technology. Also, the Information Commissioner’s Office, the UK’s privacy watchdog, has opened an investigation into the use of facial recognition cameras at Granary Square, a busy part of central London close to King’s Cross station.
Dr Mitchell said: “All of this should mean we treat facial recognition technology with extreme caution.
“For instance, in July 2019 the University of Essex published a report that found there have been ‘significant flaws’ in the way UK police forces have trialled AI-enabled facial recognition technology.
“If the police can’t get it to work properly, why should we assume that property developers, museums, or music festival organisers can make it work?”