Information Commissioner's Office

The GDPR and Beyond: Privacy, Transparency and the Law

Elizabeth Denham spoke at the Alan Turing Institute on 23 March as part of its event, The GDPR and Beyond: Privacy, Transparency and the Law.

Ms Denham’s speech looked at how developments in Artificial Intelligence must take privacy into account.

Thank you to the Alan Turing Institute for the invitation to speak here today. You know, about a mile from my office in Wilmslow – and just a brisk walk from my own home – is a five-bed Victorian semi. Fixed to its red-brick facade is a blue plaque that reads:

“Alan Turing, founder of computer science and cryptographer, whose work was key to breaking the wartime Enigma codes, lived and died here.”

Now here we are 64 years after his death, still admiring him, his work, his legacy.

In his day, Turing was one of few futurists. He had the ability to look beyond what was probable and into what might one day be possible. He had the power to identify potential and apply it in ways his contemporaries couldn’t begin to imagine.

Strange to think that Turing’s achievements have gone down in “history” – when his discoveries and ideas were created barely a lifetime ago.

We’ve come so far in such a short space of time. In the Manchester Museum of Science and Industry there’s a replica of Baby – developed in 1948, it was the first computer to store and run a programme. This is a machine that would fill my living room, yet it has less power and capability than the iPhone in my pocket.

So what does all this have to do with me – the UK’s Information Commissioner, charged with upholding the rights of individuals to keep control of their personal information?

Well, the most significant risks to individuals’ personal information are now driven by the use of new technologies. The revelations over the last few days involving Cambridge Analytica, Facebook and political campaigns are a dramatic case in point.

But we’re also dealing with a rise in cyber-attacks, as well as web and cross-device tracking and, of course, the rise of Artificial Intelligence, big data and machine learning.

These technologies use high volumes of personal data from a wide range of sources, making decisions and providing new insights about individuals. And cloud computing platforms provide the storage and processing power to do this at scale.

AI is not the future. It is the now. New facial recognition tools are being used in law enforcement – I’ll be blogging soon about this – and the credit and finance sectors are already using social scoring techniques.

The ability of AI to intrude into private life and affect human behaviour by manipulating personal data makes highlighting the importance of this topic a priority for the ICO.

So, my office has a significant role to play. I have often spoken about how innovation and privacy must go hand in hand. As technological developments progress ever more rapidly, I am duty bound to stand up for the privacy rights of UK citizens. I will not allow their fundamental right to privacy to be carried away on a wave of progress.

Sixty-four years from now, historians will look back at what we did. Not just at the nuts and bolts of our inventions, but at the steps we took to ensure they were used in ways that were ethical and moral. That we anticipated the risks, we mitigated them and, in turn, protected individuals and the broader society.

Winston Churchill famously said: “Those who fail to learn from history are doomed to repeat it.”

History has much to teach us. We know that once the genie is out of the lamp, it’s darn near impossible to shove him back in.

Our history books are full of examples of good inventions used for bad things. Or great discoveries that ran amok in ways no-one foresaw.

When the Curies discovered radium it was hailed as a wonder. It was used in cosmetics, toothpaste, toys and novelty watches. Even when it became clear that radium might be responsible for sickness and even deaths, corporations seemed loath to take it off the market. Where was the concern for the public? What should have been done differently?

If we had known that the Internet would be used to sell illegal drugs and create a dark web where terrorists flourish, would we have been more cautious?

Okay – we can’t know what we don’t know. But history has taught us that there are repercussions, consequences. It’s our job to search out what they might be and act before it’s too late.

History has its eyes on us.


Channel website: https://ico.org.uk/

Original article link: https://ico.org.uk/about-the-ico/news-and-events/news-and-blogs/2018/03/the-gdpr-and-beyond-privacy-transparency-and-the-law/
