
BCS, The Chartered Institute for IT, explains why abuses of facial biometric data mean we need to get serious about safeguards

BCS, The Chartered Institute for IT, has warned against the rise of a ‘cavalier attitude’ by organisations using ‘flawed’ facial recognition technology to monitor crowds in public spaces.

Dr Bill Mitchell, Director of Policy at BCS, says there is an unprecedented danger of the misuse of biometric data, including identity theft, because of a combination of flawed technology and a lack of ethical and rigorous safeguards around how that data is captured, stored and processed.

The top concerns expressed by IT professionals in consultations BCS has carried out over the last 18 months highlight the severe risks of biometric data misuse. These are:

  1. Poor data governance that leaves companies unable to monitor effectively how data is used, who is using it, or where duplicates of data are stored, so that unethical practices may go undetected.
  2. Lack of diversity in product development teams leading to hard-wired unconscious bias in new products or services that are data-dependent.
  3. Using incomplete data to incorrectly infer personal characteristics.
  4. Allowing data to be improperly shared within organisations.
  5. Improperly aggregating data from different sources to infer personal characteristics.
  6. Incorrectly cleaning data.
  7. Incorrectly restructuring data resulting in the wrong data being associated with an individual.
  8. Incorrectly merging different data pipelines from third parties.
  9. Not conducting proper due diligence to ensure correct provenance of data through the supply chain (which may well be offshored and distributed across different national jurisdictions).
  10. Using data analysis methodologies that are invalid in a particular context.
  11. Applying poorly tested analytical models as part of decision-making processes (including, for example, inappropriate Machine Learning based neural networks).
  12. Using invalid anonymisation techniques that do not provide enough protection against deanonymisation.
  13. Storing data insecurely so that it is at risk of being misappropriated.

Dr Mitchell says the feedback from these consultations has been quite clear: “Virtually every time we hear the same alarming worries about data governance practices. This directly links to worries about the current cavalier attitude to facial recognition technology. For instance, misappropriated facial biometric data could lead to opportunities for virtual doppelgängers, and poorly captured biometric data can lead to cases of mistaken identity that can have dire consequences that are hard to correct. Much of the concern has been focused on the immaturity of the technology. An even bigger concern is what your biometric data is used for, or rather misused for, once it’s been captured and added to a database.” 

The concerns raised by the IT profession come after a series of recent revelations about the widespread use of facial recognition technology. This includes the release of a report by Big Brother Watch, a civil liberties and privacy campaigning organisation, that says there is a facial recognition ‘epidemic’ across privately owned sites in the UK. It says it has found major property developers, shopping centres, museums, conference centres and casinos using the technology. Also, the Information Commissioner’s Office, the UK’s privacy watchdog, has opened an investigation into the use of facial recognition cameras in Granary Square, a busy part of central London close to King’s Cross station.

Dr Mitchell said: “All of this should mean we treat facial recognition technology with extreme caution.

“For instance, in July 2019 the University of Essex published a report that found ‘significant flaws’ in the way UK police forces have trialled AI-enabled facial recognition technology.

“If the police can’t get it to work properly, why should we assume that property developers, museums, or music festival organisers can make it work?”

Channel website: http://www.bcs.org/

Original article link: https://www.bcs.org/more/about-us/press-office/press-releases/bcs-the-chartered-institute-for-it-explains-why-abuses-of-facial-biometric-data-means-we-need-to-get-serious-about-safeguards/
