Remarks by Melanie Dawes, Ofcom Chief Executive, at the NSPCC
For almost a century and a half now, the NSPCC has given a voice to every child in this country who faces abuse or neglect. And as someone who remembers Childline being set up, it’s incredible to think that it’s been helping children now for forty years.
Over that time, some of the common challenges faced by young people have endured. Growing up has never been easy.
But so many aspects of a modern child’s life are relatively new. Forty years ago – even twenty-five, for that matter – there was no social media. No online video, gaming, posting or messaging. No likes, shares or emojis.
As a child before social media, you didn’t get a notification unless you’d done something seriously bad at school. You weren’t concerned about viral trends until you got the mumps or the measles. Your only phone was the dusty handset in the hall.
Today, that world is almost unrecognisable. Online services have grown at extraordinary speed. Social media, search, messaging and gaming are deeply woven into everyday life. For many people, disengaging from the online world simply isn’t an option.
Firms have focused on maximising profits, with algorithms set to maximise engagement. Without the right protections, such as effective age checks, children have been routinely exposed to risks they did not choose, on platforms they cannot realistically avoid.
But all of this is changing.
The Online Safety Act introduces clear duties on tech firms. They must now assess these risks upfront, and address them effectively.
The scale of change required is immense. This is not just about new processes or technical fixes. It’s about changing a culture that has operated without clear rules for a generation, so that safety is built in from the start, not added as an afterthought.
And the risks are not standing still. AI is creating new services and new harms – from powerful chatbots to the spread of intimate images. We strongly support the work the Government is doing to respond to this and look at strengthening the law.
What we’re doing
Ofcom’s powers came into force last March. Since then, we’ve moved quickly.
We have taken enforcement action, secured changes to disrupt the sharing of child sexual abuse material, and seen high-risk services either get in line or block access to the UK altogether.
We’ve also made big progress on age assurance. Since July, most daily visits to porn sites now require highly effective age checks.
Major platforms, including X, Telegram, Discord and Reddit, have also introduced age controls to prevent children accessing adult or harmful content.
So far we have tackled some of the nastiest and most harmful parts of the internet – including suicide forums and abuse sites that have refused to engage properly. Where companies do not comply, we will continue to act. That might include fines, but also business disruption measures.
We, like you, were deeply concerned by reports that X’s AI engine Grok was being used to alter images of children and adults in ways that were highly inappropriate, and without their consent. As a result, at the start of the year, we opened a major investigation into X. We were the first regulator globally to use our formal powers in this case, and we’re prioritising the investigation as a matter of urgency.
While there are many examples of progress to be welcomed, the industry has not done enough.
When it comes to the largest sites – household names that children use the most – many have fallen short of putting children’s safety at the heart of their products. There’s a gap between what tech companies are promising in private, and what they’re doing publicly to keep children safe on their platforms. In short, too many still prioritise profit at the expense of their users’ safety.
Parents have lost trust in tech firms’ ability to keep their children safe, while the Government is consulting on further legislative measures to address public concern.
What we expect now
So today, we are setting out four clear demands for further action, so that tech firms are held publicly accountable for delivering the safest possible online environment for UK children – something many have privately assured Ofcom they are serious about offering.
We have set Facebook, Instagram, Roblox, Snapchat, TikTok and YouTube a deadline of 30 April to report back to us on the action they will take, and we are urging them to publish this. In May, Ofcom will report on how the companies have responded and will announce any next steps for regulatory action.
First, platforms must finally apply their own minimum age rules properly. It’s time to keep underage children off their platforms. If they don’t, how can they expect parents to trust them?
Second, they must put in place failsafe grooming protections to stop strangers from being able to contact children on their platforms.
Third, they must make their algorithms safe. Algorithms which provide personalised recommendations to users are children’s main pathway to harm online. We expect platforms to meet their legal duties.
Fourth, companies must not test new products on children. Significant updates – especially those involving AI – must be risk assessed before they are deployed. Safety by design is not optional. It is the law.
These companies now have two months to respond and set out how they’ll make the changes. We are encouraging them to make these public.
In May – as well as reporting on companies’ responses to us – we will publish the first year of our research into children’s online experiences – a crucial test of whether the industry is really becoming safer.
And we will not hesitate to act. Where progress falls short, we will use enforcement, strengthen our regulation, and support further laws if Parliament pursues them.
We are clear-eyed about the challenge. This is a completely new field of regulation. Fundamental and lasting change takes time to achieve. What we are doing today is supplementing the private pressure we put on companies with a challenge for them to be accountable to the British public.
We cannot do it alone. Strong partnerships are vital: not just with Government and law enforcement, but with other regulators – and we’re working closely with the ICO on age assurance, for example.
And of course, with expert bodies like the NSPCC.
You have already made a massive contribution to the changes that are taking place. Responding to our plans. Helping to shape our Codes. Offering support and critical advice. As the UK’s leading children’s charity, your dedication and expertise are invaluable to our task. So we look forward to continuing that relationship.
Of course, Ofcom will also continue to speak to children, understanding their needs and concerns – including those who have suffered abuse or exploitation. Through these conversations, our research and our experience of creating the regulations and enforcing the Act, we’ll continue to adapt our approach.
But throughout, our message remains clear: for social media users, especially children, harm cannot be the price of engagement.
Change is already happening, and today we are going further.
With your help, we can create a safer life online for every child in the UK.
Original article link: https://www.ofcom.org.uk/online-safety/protecting-children/remarks-by-melanie-dawes-ofcom-chief-executive-at-the-nspcc


