Making life safer online: our priorities for the year ahead
Oliver Griffiths, Group Director for Online Safety – statement on priorities for the year ahead
In March we published the Ofcom Plan of Work for financial year 2026-27, following consultation on our high-level objectives for making life safer online. A short document covering all Ofcom’s work inevitably skips over many of the competing considerations that we weighed up. This article shines a light on some of the key trade-offs involved and the judgements we have made.
I’ll start with two points. First, we are simultaneously driving companies’ compliance with the Online Safety Act, while implementing and operationalising remaining parts of the OSA, and now preparing for a raft of new legislative and regulatory online safety initiatives being introduced by the government. Second, the OSA is an extremely ambitious piece of legislation: it covers over 130 priority offences and more than 100,000 services are in its scope.
Taken together, this means that we need to make choices about where to concentrate effort. If we spread ourselves too thinly, we won’t be able to drive effective and sustained change for UK users. As they say, when everything is a priority, nothing is. I’ll cover the implementation and preparation work quickly before spending most of my time on our compliance choices.
Implementing the current Act
Most of our remaining implementation work is set by the OSA itself. This year we must publish statutory reports on the effectiveness of age assurance, trends in content harmful to children and app stores.
We’re providing advice on Technology Notices this month. Then we’ll publish the register of categorised services in the summer, together with a consultation on the duties applying to these services, including a draft code of practice on fraudulent advertising. We will publish updated Codes in the autumn with additional safety measures, including AI and other automated content moderation tools. We have decided to bring forward our crisis response measure to the summer. Further details are contained in the Roadmap update.
New legislation
Recent and upcoming legislation will require substantial policy work from Ofcom to bring it into effect. We are already working on some important new priority offences under the OSA: for example, in March we launched a new consultation to expand our Codes to cover encouraging serious self-harm and cyberflashing.
The Crime and Policing Act, which received Royal Assent at the end of April, introduces further duties for services and will require detailed rules from Ofcom. These include duties to take down non-consensual intimate imagery within 48 hours and report it to a central registry, automatic Data Preservation Notices in the event of the death of a child, and a power for Government to bring more GenAI services into scope of the OSA.
Moreover, the government’s important consultation on preparing children for the future in an age of rapid technological change contains many proposals which – if adopted – would create new duties and require comprehensive policy work to implement. We are following the consultation closely and are on hand to advise government on how the new duties would interact with the existing rules and how they could be implemented.
Driving change
The first parts of the OSA came into force in March 2025, and in the first year of regulation our overriding priority was the protection of children. We are proud of the changes we have driven so far.
As a result of our interventions, all of the top ten pornography sites in the UK, and most of the top 100, now use age checks to keep children off. We have opened investigations into nearly a hundred services, securing changes to disrupt the sharing of child sexual abuse material and taking action against porn sites that failed to comply with the new rules.
Roblox now uses age checks to restrict adults from contacting children they’re not already connected to, and dating apps like Tinder, Grindr, Bumble and Hinge are also now using age checks to prevent inappropriate contact between adults and under-18s. Major platforms including X, Telegram, Discord, Reddit, Bluesky, Xbox and Steam have also introduced age checks.
Setting our priorities for action
With the new financial year underway, our compliance programmes are expanding. In making decisions about where to direct our resources, we need to strike a balance between tackling the most widespread harms, such as fraud, and addressing the less frequent but even more severe harms, like child sexual abuse and grooming.
Similarly, we need to decide how far to focus our compliance drives on the large household names versus smaller services which are accessed less frequently but pose the highest risk of serious or fatal harm.
The answer, inevitably, involves a bit of all of the above. When making choices we use the following factors to guide us: how many people use a service, how serious the harm can be, whether users are especially vulnerable, whether there are widespread problems that keep happening, and how services act when risks are found.
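To make the weighing of these factors concrete, here is a purely illustrative sketch of how such a multi-factor prioritisation might be scored. The weights, the 0–1 scale and the function itself are hypothetical assumptions for illustration only; this is not Ofcom’s actual methodology.

```python
# Illustrative only: a weighted score combining the five factors named above
# (reach, severity of harm, user vulnerability, recurrence of problems, and
# how the service responds). Weights are hypothetical assumptions.

def priority_score(reach, severity, vulnerability, recurrence, responsiveness):
    """Each factor is rated 0-1. A less responsive service scores higher."""
    weights = {
        "reach": 0.25,          # how many people use the service
        "severity": 0.30,       # how serious the harm can be
        "vulnerability": 0.20,  # whether users are especially vulnerable
        "recurrence": 0.15,     # widespread problems that keep happening
        "responsiveness": 0.10, # how the service acts when risks are found
    }
    factors = {
        "reach": reach,
        "severity": severity,
        "vulnerability": vulnerability,
        "recurrence": recurrence,
        "responsiveness": 1.0 - responsiveness,  # poor response raises priority
    }
    return sum(weights[k] * factors[k] for k in weights)

# A small, low-severity service with a responsive operator scores low...
low = priority_score(reach=0.1, severity=0.2, vulnerability=0.1,
                     recurrence=0.1, responsiveness=0.9)
# ...while a severe-harm service that ignores known risks scores high.
high = priority_score(reach=0.3, severity=0.9, vulnerability=0.8,
                      recurrence=0.7, responsiveness=0.1)
```

The point of the sketch is simply that no single factor dominates: a small service can still rank highly if the harm is severe and the operator is unresponsive.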
We will always be led by the evidence. Our new Data and Intelligence team marshals information from our own research, data we formally request from services, intelligence supplied by our partners in civil society and law enforcement, and insight from those with lived experience.
For 2026-27, as well as continuing to press hard on the protection of children, we will be focusing effort on countering terrorism and illegal hate, and improving women’s and girls’ safety online. That means that our different specialist staff – research, supervision, enforcement, technologists, lawyers and analysts – will be aligned to drive outcomes in these priority areas.
Of course, all the legal duties and relevant offences under the Online Safety Act continue to apply – our prioritisation does not change what the law requires. So while most of our staff will be supporting our priorities, we will also intervene beyond them.
Wherever there appear to be egregious breaches of the Act we will look into them promptly and act where appropriate, as we did with our investigation of a prominent suicide forum linked with over 100 deaths in the UK, or the probe we launched in January into X over AI-generated sexual deepfakes.
Improving protections for children
Our enforcement programmes on child sexual abuse material and age assurance will continue. In July we will provide an update on our analysis of major providers’ algorithms. We have told major services used by children to meet clear expectations on effective age checks, protections against grooming and child sexual abuse material, safer feeds and recommendations, and proper testing and risk assessment before new products are launched. Services responded at the end of April, and we will report publicly on progress later this month.
Removing illegal content, especially hate and terror
There are reports of disturbing rises in hate speech online, including a recent surge in antisemitic content. This has no place in our society and we are prioritising action on illegal hate and terrorist content. We will press services to demonstrate that they are finding and removing this material. Our focus is on making sure the protections they have in place - for example their content moderation systems - are effective. We will be providing an update on this compliance programme shortly.
Protecting women and girls
Intimate image abuse disproportionately affects women and girls, and we are acting on the urgent need to protect them. We will build on enforcement action this year into sexual deepfakes, a nudification site and image-based sexual abuse. This month we will set a new technical standard – known as ‘hash-matching’ – to prevent the upload of known non-consensual intimate images. This requires services to move beyond reactive takedown and ensure proactive protections work effectively at scale, including as new forms of abuse emerge. We will continue to prioritise taking enforcement action against services that fail to do so.
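Hash-matching works by comparing each upload against a database of digital fingerprints of known abusive images, so a match can be blocked before the image is ever published. The following is a minimal sketch of that idea; real deployments use perceptual hashes (which also catch near-duplicates, e.g. resized or re-compressed copies) drawn from shared databases such as StopNCII, rather than the exact SHA-256 matching assumed here.

```python
import hashlib

# Minimal sketch of hash-matching at upload time. Exact SHA-256 matching is a
# simplifying assumption: production systems use perceptual hashing so that
# slightly altered copies of a known image still match.

KNOWN_HASHES = set()  # fingerprints of known non-consensual intimate images

def register_known_image(data: bytes) -> None:
    """Add an image's fingerprint to the blocklist (e.g. after a report)."""
    KNOWN_HASHES.add(hashlib.sha256(data).hexdigest())

def upload_allowed(data: bytes) -> bool:
    """Proactively block the upload if the image matches a known fingerprint."""
    return hashlib.sha256(data).hexdigest() not in KNOWN_HASHES

# Hypothetical image bytes for illustration:
register_known_image(b"reported-image-bytes")
assert not upload_allowed(b"reported-image-bytes")  # blocked proactively
assert upload_allowed(b"unrelated-image-bytes")     # unaffected uploads pass
```

The key design point, reflected in the text above, is that the check runs at upload time: the service prevents redistribution proactively rather than waiting for a takedown request after the harm has occurred.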
We will also be collecting evidence for a report in 2027 on how tech firms have applied the good practice set out in our ‘Safer life online for women and girls’ guidance published in November 2025. The report will shine a light on the progress made towards reducing online gender-based harms with a view to directing future action.
How we drive change
To simplify massively, we drive change through three channels. First, through awareness raising. There are tens of thousands of sites in scope of the OSA that do not pose high risks of illegal content or to children. However, they need to carry out risk assessments to check that they are on top of the issues. Our digital tools and resources for industry are here to help.
Second, we directly supervise 40 of the highest profile services – from social media, to gaming, search, dating and AI. Supervision involves developing a detailed understanding of companies’ business models and operating systems, checking their understanding of what they need to do, and setting out bespoke requirements for change, company by company.
This is inevitably the least visible part of driving compliance but it is essential – the move to adopt age checks was the result of deep engagement over many months. We have also signalled that we will play more of these conversations out in public when we think it will drive greater change, such as reporting later this month on responses from the six services most used by children to our challenge across four key areas.
Third, enforcement. This is the most visible element. We can fine companies up to 10% of their qualifying worldwide revenue (or £18m, whichever is higher) for breaches of the OSA. We have launched more online safety investigations over the past year than any of our international regulator peers. Investigations will always be a key part of our toolkit, particularly where a service is in breach of the OSA and shows no intention of complying or seeks to avoid regulation. However, enforcement will not always be the best approach: supervision can often drive meaningful change for users more quickly and with less resource.
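The penalty ceiling quoted above is the greater of two figures: 10% of qualifying worldwide revenue or £18m. As a quick illustration (the function name and units are assumptions for this sketch, not statutory terminology):

```python
# The OSA maximum penalty described above: whichever is higher of 10% of
# qualifying worldwide revenue or £18m. Illustrative helper, not a legal tool.

def max_osa_fine(qualifying_worldwide_revenue_gbp: float) -> float:
    return max(0.10 * qualifying_worldwide_revenue_gbp, 18_000_000.0)

# For £50m of qualifying revenue, 10% is only £5m, so the £18m floor applies.
# For £1bn, 10% (£100m) exceeds the floor and becomes the ceiling.
```

In other words, the £18m figure acts as a floor on the maximum penalty, so even small-revenue services face a substantial potential fine for breaches.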
Keeping you updated
We will continue to keep you updated on our supervision and enforcement work, how our priorities evolve as new risks and evidence emerge, and the improvements that our compliance work is driving.
Original article link: https://www.ofcom.org.uk/online-safety/illegal-and-harmful-content/making-life-safer-online-our-priorities-for-the-year-ahead


