Tackling harmful content on Facebook – Anne Longfield’s letter to Facebook’s Head of Global Affairs, Nick Clegg
The following open letter to Nick Clegg, Facebook’s Head of Global Affairs, was published in The Daily Telegraph on Jan 30, 2020.
Dear Sir Nick,
I was very interested to hear the interview you gave the BBC from Davos as Facebook’s Head of Global Affairs. You were discussing concerns I have been raising with your company and other social media giants about the worrying ease with which children, some of whom are particularly vulnerable, can access harmful, upsetting and even dangerous content on your sites and theirs.
You accepted on the BBC’s Today programme that in terms of tackling this problem across Facebook, WhatsApp and Instagram, “It’s not something we are completely on top of”. I welcome that recognition but, despite some movement from yours and other companies, and waves of warm words, I regard it as an understatement.
I detected a sense of frustration, and your answers conveyed the idea that somehow parents and those raising concerns don’t understand the difficulties Facebook has in practically tackling this, or don’t appreciate what has already been done. I can only say that any frustration you feel is matched by my own that social media companies have in reality so far failed to match the scale of the valid concerns that parents and children have raised with me, in large numbers.
I accept that some action has been taken, but I know you recognise that vulnerable children can still see inappropriate content across your platforms right now. A platform used by a third of the world’s population may indeed have practical problems ‘policing’ its usage, but that raises more questions about Facebook than it provides reassurance to parents. The scale of Facebook’s growth is down to Facebook, and Facebook has to manage the problems that growth brings. With power comes responsibility.
The word ‘policing’ is also troubling. Removing clearly harmful content shouldn’t carry a fear of being branded heavy handed or at odds with the spirit of free speech. We believe, when it comes to safeguarding children, removing all harmful content, quickly, should be the main priority of your social responsibilities as a company. Achieving that should be a key focus for the whole company across all three platforms.
Any lack of appreciation of progress from parents or wider society might be due to the selective way in which Facebook is prepared to be transparent. You could quickly pinpoint that “149 billion messages” had been shared on your platforms on New Year’s Eve, and yet repeated requests from us, and others, to reveal how many children under 13 regularly use Facebook, WhatsApp and Instagram have never been met with a figure.
I am particularly concerned by Facebook’s reliance on arguments around privacy in explaining its plans to encrypt Facebook Messenger and Instagram messages. The decision to encrypt represents a real threat to children, who may come to harm when interacting with other users via these routes, with your company and the police left with no real way of knowing or intervening.
You told the interviewer that Facebook takes material down “when it is reported to us”. Despite your own proactive work, Facebook still relies very heavily on users reporting material to you themselves – self-policing – by which time, of course, the content has already been seen, often by children.
From what children tell us, there are still big issues here: a lack of response; long waiting times for a response, if one comes at all; and often a lack of action, excused with a simple explanation that the content doesn’t breach your terms and conditions, without actually addressing the reasons why the child felt it was harmful and reported it. Children have told me many times that this is one of their biggest issues, especially with regard to bullying and its detrimental effect on their mental health. Indeed, many tell me platforms have so often been unresponsive in the past that they now no longer bother to alert you.
You said in response to one question, “There is nothing in the business model that lends itself to showing harmful and unpleasant, and offensive or dangerous material to anybody”. What would be more exciting and positive would be to see your business model recognise the commercial advantage of very publicly tackling online harms on a scale it hasn’t yet come close to. Saying, albeit with regret, that it would be hard to do is not good enough for a company that has been at the cutting edge of huge “speed and scale” solutions when it wants to promote its own growth. Algorithms and the smartest minds can be found to do the latter, so why not the former?
Furthermore, the business model does provide a disincentive to applying one possible solution. Social media platforms build huge user numbers by offering “seamless on-boarding”, i.e. using platform design to make it incredibly easy and very quick to become a user. There seems to have been little appetite so far from Facebook, Instagram or WhatsApp, and indeed other companies, to retro-fit safety measures, such as age verification, lest they clutter up the gateway with delays which damage that ability to grow users. Given that the Information Commissioner’s Age Appropriate Design Code will be effective in 12–18 months’ time, that argument will no longer be viable, and we’d welcome hearing how the company might be thinking ahead of that.
That code should mean you have to do what you could already have done: genuinely restrict the platforms to the ages they are designed for. The age limit is laid out in your own terms and conditions. For children under 13, the best way to remove their access to harmful content is not to allow them access to your sites, under your own rules.
You were in Davos making the argument that the UK Government shouldn’t impose a tech tax on companies such as yours until there has been global discussion of how to do it properly. As an experienced politician, you’ll know that, historically, such arguments regularly have the secondary effect of delaying something one doesn’t want. However, on the issue of online harms and vulnerable children, this debate has been taking place for some years; the arguments have been made, and solutions suggested, repeatedly. Facebook has had time to make its case. From my perspective, and that of too many parents and children, you and many other social media giants simply haven’t made that case convincingly enough, forcing Government to be the driver here.
We warned, when Government did indeed start looking at ways to tackle these problems, that any proposed legislation or code would get big push-back from tech companies. So it has proved, and perhaps your interview should be seen in that context. I find that surprising. When you were at the top of Government yourself, I find it hard to believe you wouldn’t have been full-square behind the legislation and codes now being suggested. If you are, I’d welcome you saying so.
Anne Longfield
Children’s Commissioner for England