Information Commissioner's Office
ICO launches consultation on Code of Practice to help protect children online
The Information Commissioner’s Office has opened a consultation on 16 standards that online services must meet to protect children’s privacy.
Age appropriate design: a code of practice for online services sets out the standards expected of those responsible for designing, developing or providing online services likely to be accessed by children and which process their data.
When finalised, it will be the first of its kind and become an international benchmark.
Introduced by the Data Protection Act 2018, the draft code sets out 16 standards of age appropriate design for online services like apps, connected toys, social media platforms, online games, educational websites and streaming services. It is not restricted to services specifically directed at children.
The draft code says that the best interests of the child should be a primary consideration when designing and developing online services. It says that privacy must be built in and not bolted on.
Settings must be “high privacy” by default (unless there’s a compelling reason not to); only the minimum amount of personal data should be collected and retained; children’s data should not usually be shared; and geolocation services should be switched off by default in most circumstances. So-called “nudge techniques” should not be used to encourage children to provide unnecessary personal data, to weaken their privacy settings or to carry on using the service longer than they had intended. The code also addresses issues of parental control and profiling.
Ms Denham said:
“The ICO’s Code of Practice is a significant step, but it’s just part of the solution to online harms. We see our work as complementary to the current focus on online harms, and look forward to participating in discussions regarding the Government’s white paper.”
The code gives practical guidance on data protection safeguards that ensure online services are appropriate for use by children. It leaves online service providers in no doubt about what is expected of them when it comes to looking after children’s personal data. It helps create an open, transparent and safer place for children to play, explore and learn online.
The standards in the code are rooted in existing data protection laws that are regulated by the ICO. Organisations should follow the code and demonstrate that their services use children’s data fairly and in compliance with data protection law. Those that don’t could face enforcement action, including fines of up to £17 million or 4% of global turnover, or orders to stop processing data.
Baroness Kidron, who led the parliamentary debate about the creation of the code, said:
“I welcome the draft code released today which represents the beginning of a new deal between children and the tech sector.
“For too long we have failed to recognise children’s rights and needs online, with tragic outcomes.
“I firmly believe in the power of technology to transform lives, be a force for good and rise to the challenge of promoting the rights and safety of our children. But in order to fulfil that role it must consider the best interests of children, not simply its own commercial interests. That is what the code will require online services to do. This is a systemic change.”
The code is out for consultation until 31 May. The final version will be laid before Parliament and is expected to come into effect before the end of the year.
The code was informed by initial views and evidence gathered from designers, app developers, academics and civil society. You can read the responses here.
The ICO also sought views from parents and children by working with research company Revealing Reality. The findings from that work are published for the first time today.
Notes to Editors
- The Information Commissioner’s Office (ICO) is the UK’s independent regulator for data protection and information rights law, upholding information rights in the public interest, promoting openness by public bodies and data privacy for individuals.
- The Government included provisions in the Data Protection Act 2018 to create world-leading standards that provide proper safeguards for children when they are online.
As part of that, the ICO is required to produce an age-appropriate design code of practice to give guidance to organisations about the privacy standards they should adopt when offering online services and apps that children are likely to access and which will process their personal data. (A link to the parliamentary debate, led by Baroness Kidron, is here.)
The standards in the code will be backed by existing data protection laws, which are legally enforceable and regulated by the ICO. The regulator has powers to take action against organisations that break the law, including tough sanctions such as orders to stop processing data and fines of up to £17 million or 4% of global turnover.
- The ICO has specific responsibilities set out in the Data Protection Act 2018 (DPA2018), the General Data Protection Regulation (GDPR), the Freedom of Information Act 2000 (FOIA), Environmental Information Regulations 2004 (EIR) and Privacy and Electronic Communications Regulations 2003 (PECR).
- Since 25 May 2018, the ICO has the power to impose a civil monetary penalty (CMP) on a data controller of up to £17 million (€20 million) or 4% of global turnover.
- The GDPR and the DPA2018 gave the ICO new strengthened powers.
- The data protection principles in the GDPR evolved from the original DPA, and set out the main responsibilities for organisations.
- To report a concern to the ICO, go to org.uk/concerns.