Information Commissioner's Office
Speech: The future of online advertising regulation
Simon McDougall, Executive Director for Technology Policy and Innovation’s speech at the Westminster Media Forum Keynote Seminar: The future of online advertising regulation.
Original script may differ from delivered version.
Privacy in adtech is having a moment. There have been some fascinating developments recently! Amazon continues to carve out a piece of the adtech pie with its recent purchase of Sizmek. The new Global Alliance for Responsible Media held its inaugural meeting in Cannes a few weeks ago. And Uber has launched legal action against some adtech companies on grounds of ad fraud. Closer to home, our neighbours at the Irish Data Protection Commission launched investigations into Quantcast and Google. And the French data protection authority announced its plans to review the adtech sector.
Now, I’m speaking to a room of experts who are familiar with these developments. However, if we stepped outside right now and asked anyone walking along the street how advertising technology works, I bet most people would give you a blank stare in return. The majority of folk don’t understand that many of the adverts they see online have been specially selected for them via a complicated process called real-time bidding, which relies on adverts being personalised using information, sometimes very private information, about you. This information could be basic, like which country you are in. Or it could be more detailed, like which websites you’ve visited or what your perceived interests are. It could even be information about a health condition you’ve been searching for.
Just one visit to a website can trigger an auction among advertisers which involves our personal data and hundreds of interested bidders. It’s impressive; it’s hard to imagine the speed, scale and complexity of this real-time bidding. It’s also the stuff of data protection nightmares. And it’s what has been keeping me up at night recently.
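For those less familiar with the mechanics, the auction described above can be pictured with a deliberately simplified sketch. This is illustrative only: the function names and fields are hypothetical, loosely inspired by OpenRTB-style bid requests, and a real exchange involves far more data, hundreds of bidders and sub-100-millisecond time budgets.

```python
def build_bid_request(page_url, user_profile):
    """A single page visit produces a bid request carrying user data."""
    return {
        "page": page_url,
        "user": user_profile,  # location, interests, browsing history...
    }

def run_auction(bid_request, bidders):
    """Broadcast the request to every bidder; the highest bid wins."""
    bids = []
    for bidder in bidders:
        price = bidder(bid_request)  # each bidder sees the user data
        if price is not None:
            bids.append((price, bidder.__name__))
    return max(bids) if bids else None

# Two hypothetical bidders: one targets an interest, one bids on anything.
def travel_advertiser(req):
    return 2.50 if "travel" in req["user"]["interests"] else None

def generic_advertiser(req):
    return 0.10  # bids a low price on any impression

request = build_bid_request(
    "https://example-news-site.test/article",
    {"country": "UK", "interests": ["travel", "cycling"]},
)
winner = run_auction(request, [travel_advertiser, generic_advertiser])
print(winner)  # (2.5, 'travel_advertiser')
```

Note the privacy-relevant detail the sketch makes visible: every bidder receives the user data in the request, whether or not it wins the impression.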
Our role at the ICO is to determine whether the use of personal data is compliant with the law. That’s why my team has prioritised our ongoing review of the adtech sector to learn more and firm up our thinking. From what we know so far, there’s a lot we’re concerned about. We’ve focused on programmatic advertising and real-time bidding because we’re concerned about its complexity and scale, the risks it poses to people’s data protection rights and the submissions we’ve received. We are not convinced current practices comply with the law.
Our work began late last year. We started by looking at transparency, which includes what people understand about how their information is used when they visit a website or use an app. We also looked at the lawful basis used to justify this processing. And we looked at security and just how safe personal data is when it flows through the ecosystem.
We spoke to many people about these three areas. We engaged with industry and spoke to many different groups, including publishers, advertisers, civil society, start-ups, adtech firms and legal specialists. We also considered concerns from consumers about how their data was handled. Back in March we brought together more than a hundred people for a full day fact-finding forum where we explored the challenges of transparency, lawful basis and security in greater detail.
We learned a lot from the forum – some things were uplifting and some were worrying. For example, we found real confusion around legitimate interests and how they can be applied. However, we had some really interesting discussions about the potential of innovative solutions like on-device bidding to make this process less privacy intrusive.
During this year, we’ve continued to develop our understanding and we weren’t satisfied with what we were hearing. We asked questions, and kept receiving answers that left us unsettled. Our engagement work, the submissions we received and our own policy thinking left us concerned and convinced that change is needed.
The creation and sharing of personal data profiles about people at the scale we’ve seen, often based on sensitive characteristics, feels wrong. It feels disproportionate, intrusive and unfair. At the moment, people’s data is being processed in ways they would not expect or easily understand. That certainly isn’t transparent or fair. And we think it needs to change.
We’re concerned about the adtech industry as a whole, but have narrowed our focus to two areas of real-time bidding. I should be clear that these aren’t the sum total of our concerns with the industry – our update report refers to a much broader set of problems. We’ve spoken about these areas with our European colleagues, and we broadly share the same concerns.
The first area we’re looking at is how special category data, what used to be called sensitive data, is processed within real-time bidding. At the moment, real-time bidding protocols include data fields such as race, religion and mental health.
Special category data is highly sensitive personal information, and explicit consent is needed to process these types of data within real-time bidding. If one hundred people visit a site promoting Pride in London, and that site is tagged as relating to lesbian, gay, bisexual or transgender content, it’s probable that a number of these people would not ordinarily choose to share their information with hundreds of organisations. It’s just as probable that a number of them would be very surprised and unhappy this has occurred.
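One way to picture the compliance point is as the check any system handling content tags would need before broadcasting a bid request. This is a sketch of the principle only, with hypothetical category names and a consent flag; it is not a claim about how any real RTB protocol behaves.

```python
# Hypothetical list of tags that would constitute special category data
# under the GDPR (Article 9) if attached to an identifiable person.
SPECIAL_CATEGORIES = {"sexual_orientation", "religion", "ethnicity", "health"}

def build_bid_request(page_tags, explicit_consent):
    """Attach page content tags to an outgoing bid request only where lawful."""
    special = set(page_tags) & SPECIAL_CATEGORIES
    if special and not explicit_consent:
        # Without explicit consent, the request must not carry these tags.
        raise PermissionError(
            f"special category data {sorted(special)} requires explicit consent"
        )
    return {"tags": sorted(page_tags)}

# A visit to a Pride-related site tagged with sexual orientation content:
try:
    build_bid_request({"events", "sexual_orientation"}, explicit_consent=False)
except PermissionError as e:
    print(e)  # the request is blocked rather than broadcast
```

The point of the sketch is that the lawful-basis question arises before the request leaves the publisher’s side, not somewhere downstream in the supply chain.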
Our second area of focus is the problems caused by relying on contracts for sharing data across the supply chain. The GDPR states that before any organisation decides to share personal data, it should have formed a view on whether the organisation it wants to share the data with has the proper technical and organisational controls in place to handle this data safely. We know that a single real-time bid request can be seen by potentially hundreds of organisations. The additional steps an organisation has to take before sharing data will depend on the context – and this is the case for any data controller in any sector. The same rules apply to real-time bidding. Given the nature of the real-time bidding ecosystem, we do not think a sole reliance on contractual agreements is valid.
Now, these are GDPR basics. The need for explicit consent for processing special category data in real-time bidding is unambiguous. The need for organisations to be accountable for who they share data with is unambiguous. This is simple stuff. Neither of our areas of focus is insignificant, theoretical, or exotic. We’ve focused on some of the foundational requirements of the GDPR – these things are the bread and butter of good data protection practice.
From what we’ve seen, it appears many real-time bidding practices are unlawful. What we’re less sure about is whether industry players are unaware that what they are doing is unlawful, or whether they know and are continuing to flout the law regardless.
The response to our report has been encouraging. A large majority of commentators concurred with our description of the market and welcomed our report and call for engagement. We’ve already had a number of productive and valuable discussions with Google, IAB Europe and IAB UK. We look forward to building on these conversations and engaging with other industry players.
However, in some pockets of the industry, heads are still firmly in the sand. Either they don’t understand what we are saying, or they simply don’t want to listen and hope we’ll give up and go away. We are not going anywhere. We’re here to make sure real change happens.
I absolutely concur with Stephen Woodford’s description of the UK advertising and adtech sectors as being vibrant and world leading. And Doug Miller’s perceptive analysis of the GDPR in advertising demonstrates many firms in this space have the talent to engage in this area. They get it.
However, we have deliberated long and hard over how we should act on our concerns. There were a number of options on the table, and we needed to decide which approach would effect the greatest change. We could have issued information notices to gather further information from organisations, or issued assessment notices to observe this processing, or even issued stop processing notices to order organisations to cease processing personal data.
We considered all the options available to us and decided the most effective course of action would be to give the industry time to reflect, review and address our concerns. We felt a six-month period would give us time to continue our learning, and give the industry time to begin addressing these issues.
Let me be clear on our reasons for this. We know how complex this ecosystem and market is, and how many organisations and complicated technologies are involved. We understand that many smaller publishers rely on this business model and would be left vulnerable if we decided to impose regulatory action now. We also understand that programmatic advertising and real-time bidding can offer enormous value if done in the right way, with privacy respectful and compliant processes and systems in place. But that doesn’t change our bottom line: Changes need to be made.
We’re still learning. And although our understanding of the industry and technology has come a long way, there is still further thinking and analysis for us to do. We don’t take action lightly, quickly, or without serious thought for the consequences. Our Regulatory Action Policy outlines our selective approach to action and how we decide to respond to infringements of information rights obligations – we have a spectrum of measures available to us. Our view is that an iterative approach and working directly with adtech firms gives the best chance of encouraging substantial and sustainable industry change.
Now, please do not mistake our measured approach for a slow one. Our report is intended to be a warning and a wake-up call. After this period of reflection and change, we’ll be undertaking a review of the industry. The current situation is unacceptable and untenable. We’ll be watching and engaging over the next six months, and we expect action to be taken.
No one understands this business better than the people working in it, including the people in this room. There is a real opportunity for the industry to develop new practices and business models that address these concerns.
Please allow me to repeat myself. If these changes do not happen then we’ll need to take action. If we do not see fundamental changes being made, we are ready to respond and consider the full range of enforcement actions available to us.
To conclude, people expect their personal information to be used in a way that is legal, fair, secure and transparent. The advertising industry is no exception. The technology used in RTB is truly impressive, and the revenue generated for players across this industry is substantial. Some of the energy and expertise that has been dedicated to maximising returns in adtech must now be channelled into meeting these challenges. We articulate these challenges in our Update Report, but they are not solely our concerns. They are shared by other regulators around the world, by industry players and civil society groups, and by individual users as they realise how their personal data is being used.
People want clarity and explanations. It is no longer sufficient to hide behind complex and opaque technology. “It’s complicated” is no longer an excuse. People expect players across the advertising industry to use their information in a way that is respectful, lawful, transparent and secure.
The GDPR applies to all sectors and adtech is no exception. We understand that real change is challenging, but real change is what is needed. We look forward to working with industry over the coming months to make this change happen.
Thank you for your attention.