Information Commissioner's Office
Children’s privacy – new standards for online services will help protect children
Today we’re setting out the standards expected of those responsible for designing, developing or providing online services likely to be accessed by children, when they process their personal data.
Parents worry about a lot of things. Are their children eating too much sugar, getting enough exercise, doing well at school? Are they happy?
In this digital age, they also worry about whether their children are protected online. Open any news story, on any day, and you can see how children are being affected by what they can access from the tiny computers in their pockets.
Last week the Government published its white paper covering online harms.
Its proposals reflect people’s growing mistrust of social media and online services. While we can all benefit from these services, we are also increasingly questioning how much control we have over what we see and how our information is used.
There has to be a balancing act: protecting people online while embracing the opportunities that digital innovation brings.
And when it comes to children, that’s more important than ever. In an age when children learn how to use a tablet before they can ride a bike, making sure they have the freedom to play, learn and explore in the digital world is of paramount importance.
The answer is not to protect children from the digital world, but to protect them within it.
So today we’re setting out the standards expected of those responsible for designing, developing or providing online services likely to be accessed by children, when they process their personal data. Age appropriate design: a code of practice for online services has been published for consultation.
When finalised, it will be the first of its kind and set an international benchmark.
It will leave online service providers in no doubt about what is expected of them when it comes to looking after children’s personal data. It will help create an open, transparent and protected place for children when they are online.
Organisations should follow the code and demonstrate that their services use children's data fairly and in compliance with data protection law. Those that don't could face enforcement action, including a fine or an order to stop processing data.
Introduced by the Data Protection Act 2018, the code sets out 16 standards of age appropriate design for online services like apps, connected toys, social media platforms, online games, educational websites and streaming services, when they process children’s personal data. It’s not restricted to services specifically directed at children.
The code says that the best interests of the child should be a primary consideration when designing and developing online services. It says that privacy must be built in and not bolted on.
Settings must be “high privacy” by default (unless there’s a compelling reason not to); only the minimum amount of personal data should be collected and retained; children’s data should not usually be shared; geolocation services should be switched off by default. Nudge techniques should not be used to encourage children to provide unnecessary personal data, weaken or turn off their privacy settings or keep on using the service. It also addresses issues of parental control and profiling.
The code is out for consultation until 31 May. We will then draft a final version to be laid before Parliament, and we expect it to come into effect before the end of the year.
The code was informed by views and evidence gathered from designers, app developers, academics and civil society. You can read the responses here.
We also sought views from parents and children by working with research company Revealing Reality. The findings from that work are published today here.
Our Code of Practice is a significant step, but it’s just part of the solution to online harms. We see our work as complementary to the current initiatives on online harms, and look forward to participating in discussions regarding the Government’s white paper.