Information Commissioner's Office
Blog: ICO regulatory sandbox
Discussion Paper and Intention to Apply Survey
In November we published an analysis of the call for views on our proposed regulatory sandbox. Since then, we have continued to develop the systems and processes necessary to launch a fully functioning beta phase of our sandbox, with the aim of opening for applications at the end of April.
We’ve had lots of questions about how our sandbox might work in practice and we know that organisations will be considering whether an application will be right for them.
With that in mind, we have published our sandbox discussion paper which explains to potential participants how we see the sandbox working in practice. The paper sets out our thinking so far - from early engagement through to application, sandbox entry and, ultimately, exit.
The paper will form part of the discussion at our sandbox workshop event in London on 6 February. This event is now fully booked but we would still welcome feedback on the project, including from any potential sandbox participants. We’ve included discussion questions throughout the paper and are asking people to send their views to firstname.lastname@example.org.
To help us plan our resources and build the sandbox appropriately, we are also now opening our 'Intention to apply' survey. This will enable organisations to tell us in advance about any product or service that they might consider entering into the sandbox.
It is not an application – the full formal process will open later in the year – but will give us more of an idea about the numbers and types of formal applications we are likely to receive.
The survey will remain open until we open for applications and is entirely voluntary and non-binding.
The sandbox is open to all sectors and all sizes of organisation, so whether public, private or third sector, a tech start-up or an innovation hub at a large established company or Government body, please do get in touch if you plan to use personal data in a new and innovative way.
Chris Taylor is Head of Assurance at the ICO, working on the development of the ICO's operational approach to codes of conduct, certification schemes, the regulatory sandbox and eIDAS.