Information Commissioner's Office
Blog: Addressing concerns on the use of AI by local authorities
A blog by Stephen Bonner, Deputy Commissioner – Regulatory Supervision
So many of our interactions with government, both local and central, involve handing over data about ourselves. This could be as simple as our name or date of birth, or as personal as our financial history or health information.
People should feel confident that this data is handled appropriately, lawfully, and fairly. This should especially be the case when accessing welfare or social support, where an individual may be at their most vulnerable. They should also be confident that none of their personal data is being used to discriminate against them, either consciously or unconsciously.
When concerns were raised about the use of algorithms in decision-making around benefit entitlement and in the welfare system more broadly, we conducted an inquiry to understand the development, purpose and functions of algorithms and similar systems being used by local authorities. We wanted to make sure people could feel confident in how their data was being handled.
As part of this inquiry, we consulted with a range of technical suppliers, a representative sample of local authorities across the country and the Department for Work and Pensions. Overall, 11 local authorities were identified through a risk assessment process to ensure a representative sample based on geographical location and those with the largest benefits workload. This inquiry has greatly increased our understanding of the development, practical application and use of this technology in this sector, and the findings will be fed into the ICO's wider work in this area.
In this instance, we have not found any evidence to suggest that claimants are subjected to any harms or financial detriment as a result of the use of algorithms or similar technologies in the welfare and social care sector. It is our understanding that there is meaningful human involvement before any final decision is made on benefit entitlement. Many of the providers we spoke with confirmed that the processing is not carried out using AI or machine learning but with what they describe as a simple algorithm to reduce administrative workload, rather than making any decisions of consequence.
It is not the role of the ICO to endorse or ban a technology, but as the use of AI in everyday life increases we have an opportunity to ensure it does not expand without due regard for data protection, fairness and the rights of individuals.
While we did not find evidence of discrimination or unlawful usage in this case, we understand that these concerns exist. In order to alleviate concerns around the fairness of these technologies, as well as remaining compliant with data protection legislation, there are a number of practical steps that local authorities and central government can take when using algorithms or AI.
Take a data protection by design and default approach
As data controllers, local authorities are responsible for ensuring that their processing complies with the UK GDPR. That means having a clear understanding of what personal data is being held and why it is needed, how long it is kept for, and erasing it when it is no longer required. Data processed using algorithms, data analytics or similar systems should be reactively and proactively reviewed to ensure it is accurate and up to date. This includes any processing carried out by an organisation or company on their behalf. If a local authority decides to engage a third party to process personal data using algorithms, data analytics or AI, it is responsible for assessing that the third party is competent to process personal data in line with the UK GDPR.
Be transparent with people about how you are using their data
Local authorities should regularly review their privacy policies, and identify areas for improvement. There are some types of information that organisations must always provide, while the provision of other types of information depends on the particular circumstances of the organisation, and how and why people’s personal data is used. They should also bring any new uses of an individual’s personal data to their attention.
Identify the potential risks to people’s privacy
Local authorities should consider conducting a Data Protection Impact Assessment (DPIA) to help identify and minimise the data protection risks of using algorithms, AI or data analytics. A DPIA should consider compliance risks, but also broader risks to the rights and freedoms of people, including the potential for any significant social or economic disadvantage. Our DPIA checklist can help when carrying out this screening exercise.
The potential benefits of AI are plain to see. It can streamline processes, reduce costs, improve services and free up staff capacity. Yet the economic and societal benefits of these innovations are only possible by maintaining the trust of the public. It is important that where local authorities use AI, it is employed in a way that is fair, in accordance with the law, and repays the trust that the public place in them when they hand their data over.
We will continue to work with and support the public sector to ensure that the use of AI is lawful, and that a fair balance is struck between their own purposes and the interests and rights of the public.