A public call for online platforms to do more to tackle social media content which is harmful to children

30 Jan 2019 02:10 PM

Blog posted by: Anne Longfield, Children’s Commissioner for England, 30 January 2019.

The following letter, written by Anne Longfield, Children’s Commissioner for England, has been sent to several of the leading social media organisations, including Facebook (which owns Instagram and WhatsApp), Snapchat, YouTube and Pinterest.

The tragic suicide of Molly Russell and her father’s appalled response to the material she was viewing on social media before her death have again highlighted the horrific amount of disturbing content that children are accessing online.

In January 2017, I published “Growing Up Digital”, a year-long study looking at the experiences of children as they navigate their way through today’s online world. A year later I followed it with “Life in Likes”, looking at the relationship 12- and 13-year-olds have with social media. Both of these reports, and the subsequent work my office has done casting light on the data that tech companies collect from children, make clear that social media companies have expanded in ways not envisaged even a few years ago. I do not think it is going too far to question whether even you, the owners, still have any control over the content on your platforms. If that is the case, then younger children should not be accessing your services at all, and parents should be aware that the idea of any authority overseeing algorithms and content is a mirage.

I come from the starting point that today’s children see no difference between offline and online, and that the digital world provides a range of positive experiences, from learning to connecting with others. However, none of the platforms regularly used by vast numbers of children were designed or developed with children in mind, and for some children this is proving harmful, whether that is due to addictive in-app features, inappropriate algorithms or a lack of responsibility for the hosting of dangerous content.

Over the last few years, I have been in dialogue with many of the big social media companies about how best to make sure children have the resilience, information and power they need to make safe and informed choices about their digital lives. I have been reassured time and time again that this is an issue that is taken seriously. However, I believe that there is still a failure to engage and that children remain an afterthought.

I share the frustrations of the Duke of Cambridge who invited the tech companies to work with him to improve the online experiences of children and young people. He said: “I am very concerned though that on every challenge they face—fake news, extremism, polarization, hate speech, trolling, mental health, privacy, and bullying—our tech leaders seem to be on the back foot … The noise of shareholders, bottom lines, and profits is distracting them from the values that made them so successful in the first place.”

My experiences are the same. The potential disruption to all user experiences should no longer be a brake on making the safety and wellbeing of young people a top priority. Neither should hiding behind servers and apparatus in other jurisdictions be an acceptable way of avoiding responsibility.

The recent tragic cases of young people who had accessed and drawn on material from sites that post deeply troubling content around suicide and self-harm, and who in the end took their own lives, should be a moment of reflection. I would appeal to you to accept that there are problems and to commit to tackling them – or to admit publicly that you are unable to.

By law, I have the power to demand data pertaining to children from public bodies. Your company is not covered by this legislation, but in the interests of greater transparency I would ask you to answer the following questions, or to explain to your users why you will not.

In “Growing Up Digital”, I called for the establishment of a Digital Ombudsman, financed by the internet companies themselves but independent of them. This would be an arbiter able to respond to the concerns of children and parents by demanding greater transparency and action from internet companies, so that material that is detrimental to the wellbeing of children is removed quickly. I am more convinced than ever that this is needed now and that the time has come for action. I have also called for companies like yourselves to be bound by a statutory duty of care: a legal obligation to prioritise the safety and wellbeing of children using your platforms.

With great power comes great responsibility, and it is your responsibility to support measures that give children the information and tools they need growing up in this digital world – or to admit that you cannot control what anyone sees on your platforms.