Information Commissioner's Office
Information Commissioner’s report brings the ICO’s investigation into the use of data analytics in political campaigns up to date
When we launched our investigation into the use of data analytics for political purposes in May 2017, we had little idea of what was to come.
We were concerned about invisible processing – the ‘behind the scenes’ algorithms, analysis, data matching and profiling that involves people’s personal information.
When the purpose for using these techniques is related to the democratic process, the case for a high standard of transparency is very strong.
Since we began, the scope of our investigation has extended to 30 organisations. We have formally interviewed 33 individuals and are working through forensic analysis of 700 terabytes of data; in layman’s terms, that’s the equivalent of 52 billion pages.
Now we have published a report to Parliament that brings the various strands of our investigation up to date.
It sets out what we have found and what we now know. But it is not the end. Some of the issues uncovered in our investigation are still ongoing or will require further investigation or action.
Throughout our enquiries we found a disturbing disregard for voters’ personal privacy by players across the political campaigning ecosystem, from data companies and data brokers to social media platforms, campaign groups and political parties.
Where there have been breaches of the law, we have acted. We have issued monetary penalties, including the maximum £500,000 (under the previous law) to Facebook, and enforcement notices that compel companies and campaigns to comply with the law. We have instigated criminal proceedings against SCL Elections Ltd and referred issues to other regulators and law enforcement agencies. And where we have found no evidence of illegality, we have shared those findings openly too.
But it’s not just about enforcement action.
We are at a crossroads. Trust and confidence in the integrity of our democratic processes risk being disrupted because the average person has little idea of what is going on behind the scenes.
This must change. People can only make truly informed choices about who to vote for if they are sure those decisions have not been unduly influenced.
What can we do to ensure that we preserve the integrity of future elections? How can we make sure that voters are truly in control of the outcome?
Whilst voluntary initiatives by the social media platforms are welcome, a self-regulatory approach will not guarantee consistency or rigour, nor shore up public confidence.
That is why we are calling for views on a code of practice covering the use of data in campaigns and elections. It will simplify the rules and give certainty and assurance about using personal data as a legitimate tool in campaigns and elections.
This code should be given the same statutory footing as other codes of practice in the Data Protection Act 2018.
Codes covering data sharing, age-appropriate design and the media are all enshrined in law. The integrity of our democracy matters just as much as these issues. It is important enough to the public, and to the wider world, that the regulator’s guidance be given a sharper edge and be included in primary legislation too.
We have also called for the UK Government to consider where there are regulatory gaps in the current data protection and electoral law landscape to ensure we have a regime fit for purpose in the digital age. We are working with the Electoral Commission, law enforcement and other regulators in the UK to increase transparency in election campaign techniques.
Finally, this is a global issue, which requires global solutions. Our work has helped inform the EU’s initiatives to combat electoral interference. A Canadian Parliamentary Committee has recommended extending privacy law to political parties and the US is considering introducing its first comprehensive data protection law.
We are immensely proud of the work of the team and the impact that our investigation has had.
We hope our investigation provides a blueprint for other jurisdictions to take action and sets the standard for future investigations.