Information Commissioner's Office

The certainty of change: regulation in a time of political and social challenges.

Elizabeth Denham reflects on her time at the ICO, in a speech delivered to BCS, The Chartered Institute for IT (26 November 2021).

As you’ve heard, I am coming to the end of my spell as UK Information Commissioner. This will be my final formal speech, and as I come to the end of my term, my attention has turned to reflecting back over the past five years.

Earlier this week I read the first speech I delivered as Information Commissioner, back in September 2016. It was delivered to a busy room of people from businesses wanting to make the most of the opportunities of data-driven innovation.

The focus was on a changing law. The GDPR still wasn’t live, and the arrival of Brexit had left uncertainty over whether the new law would ever apply in the UK. Organisations wanted to know how to prepare for whatever came next, but were wary of anything that could stymie the opportunities data might offer.

It was a time when privacy online meant being careful what you posted on social media, rather than checking the privacy settings. A time when the ICO had a little under 500 staff, rather than the more than 800 we need today. A time when a national, government-led contact tracing programme would have been inconceivable.

--

Clearly much has changed since 2016. The past couple of years have seen an acceleration in the take-up of digital services, alongside a growing awareness of privacy rights and their value. And that’s made my world, and your worlds, look a whole lot different.

When I think back to my role as a regulator, even as recently as ten years ago, the focus was on health records left in the filing cabinets of decommissioned buildings, or MPs leaving paper files on trains. My work now is about issues that impact society. Data protection is about fairness, transparency and encouraging trust in innovation. It is about free and fair elections, how the criminal justice system serves victims, and how children can flourish safely in an online world. It’s about people.

It’s been an enormous change, and I’ve seen a similar change in your world. It’s not that long ago that digital, tech and IT were seen as backroom functions, and your roles were treated accordingly. Your office was probably tucked away somewhere, surrounded by old pieces of kit - it was probably next door to the data protection officer. But now digital is the starting point for so much innovation and so much opportunity, not just within your organisation, but within society.

I’ve seen BCS follow a similar journey, shifting from supporting IT professionalism to a broader focus on empowering digital tech to support society. I see the work being done to support an IT sector that is diverse and aware of its impact on people, in areas like diversity, sustainability and digital inequality.

There’s a clear overlap between all of our work today. I see our children’s code is a theme for your latest Animation and Games Development competition, and think back to the BCS Society Medal I was awarded last year – that recognition of the role we all play in enhancing the reputation of digital technology just wasn’t there a decade ago.

--

And yet for all that change, so many of the underlying principles behind our work remain constant.

A key message of that first speech was that organisations should not be thinking about privacy or innovation, but about privacy and innovation.

That remains as true today as ever.

It was central through the pandemic, as the value of data protection as an enabler shone through, encouraging people to trust innovation by showing them that their views were being respected.

We saw data protection considered from the very start of the development of contact tracing apps, for instance. Across the UK, government recognised the value of good data protection in encouraging public trust in the apps, and the urgency with which it needed to be considered. I was pleased my office was able to support this work. And we saw public authorities sharing data about vulnerable people with supermarkets, when grocery delivery spots were scarce.

In both of those examples, privacy was considered as a way of encouraging trust in innovation. That role of data protection as enabling innovation is not new: one of the motivations behind the first data protection law in the UK was to encourage public trust in the fledgling computer technologies of the day.

But the value of data protection as enabling innovation is greater than ever, whether that’s in enabling contact tracing apps or in protecting firms from cyber attack. Privacy, cyber security, considering the impact of digital innovation – these are all board level concerns.

--

What is crucial, though, is that amid the pace of change, we don’t forget this relationship between innovation and privacy.

The data-driven innovations of today have the potential to change the society of tomorrow. But that only happens if society buys into these technical advances.

We’ve seen already what happens when society is unsure about innovations. The opportunities for life-changing innovation are huge in the health sector, for instance, but people have been nervous about allowing their data to be used where the explanation of the process isn’t clear. That was one of the crucial lessons of care.data, and is being seen in the public pushback against the government’s plans around GP patient data.

Transparency is key here. And by that I mean real transparency: sensible explanations of how data is being used, the benefits that will result, and taking the time to check people understand.

--

Regulation plays an important role here too.

My eye was caught by a CDEI study earlier this year, which showed the single biggest predictor of whether someone believed in the role of digital innovation in response to the pandemic was not their level of concern about the pandemic, nor their age or education.

It was trust in the rules and regulation governing the technology.

That makes the DCMS consultation on potential reforms to data protection law well timed. The opportunities of digital innovation rely on trust in the law, and how we deliver those high standards cannot be static. We’ve spoken already of the changes just in the past five years.

But no matter how the technology evolves and the world changes, any future legislative framework must keep people, and people’s trust, at its centre. That must be constant.

It is crucial we continue to consider the opportunities of digital innovation and the maintaining of high data protection standards as two sides of the same coin. Innovation is enabled by high data protection standards.

--

With that in mind, I am deeply concerned about any changes to the data protection regime that would remove the centrality of fairness in how people’s data is used.

I am thinking specifically of AI and algorithms here, and questions in the consultation about the TIGRR proposal to remove the right to human review of automated decisions.

This feels like a step backwards. In the ICO’s consultation response I have set out clearly why I believe the right to human review must remain. And I note BCS’s concerns in this area, and its call for clarity on the rights people should expect.

AI and algorithms rely on the data that is fed into them, in order to produce the world-changing outputs that come out the other end. Put simply, if people start to mistrust those outputs, then they’ll start to block their data being used as an input.

Building that trust starts with transparency, and continues in a commitment to fairness wherever people’s data is used. Without that trust, we risk losing so many opportunities that technologies can offer our society.

--

This consideration of AI is another example of how everything changes, and nothing changes. The technologies and opportunities are new, but the fundamental principles – of maintaining trust, of asking if data is being used fairly and transparently – remain constant.

I don’t have a crystal ball to see what challenges we will all face in the next five years, other than to be sure both IT and data protection will be at the centre of the continued push to maximise the impact of data-driven innovation. We’ll have to respond to the changes brought by the continued uptake of digital services. We’ll need to consider how society’s attitudes to privacy evolve and adapt post-pandemic. And we’ll need to consider changes to the law.

Change is certain.

But the fundamental principles that have been true throughout my time as Information Commissioner will remain. And so I’ll end with a couple of lines that I delivered in that first speech five years ago. They still feel very relevant today.

“This is an exciting time, with change happening day by day, hour by hour. Your job is to make sure that change, so reliant on people’s personal information, doesn’t leave those people behind.”


Channel website: https://ico.org.uk/

Original article link: https://ico.org.uk/about-the-ico/news-and-events/news-and-blogs/2021/11/the-certainty-of-change-regulation-in-a-time-of-political-and-social-challenges/
