Ofcom investigates online forums hosting image-based sexual abuse

6 Mar 2026 12:40 PM

Ofcom has today launched an investigation into whether the provider of two online image boards has failed to comply with duties to protect people in the UK from illegal content.

Due to the nature of these sites, we have decided not to name them or their provider. 

Tackling online harms against women and girls

It is illegal in the UK to share non-consensual intimate images (NCII) or child sexual abuse material (CSAM). Under the UK’s Online Safety Act, providers of ‘user-to-user’ services are required to assess and mitigate the risk of UK users encountering this type of content on their platforms.[1]

This is something that disproportionately impacts women and girls, and making sure sites and apps tackle this is one of Ofcom’s highest priorities.[2]

When the new duties on tech firms came into force last year, we immediately launched a programme of enforcement action against services that are used to distribute CSAM. As a result, some have deployed automated tools to detect and swiftly remove this vile content, while others have withdrawn from the UK.

In total, under the Act we have launched investigations into nearly 100 platforms – including X, when Grok was used to create and share demeaning sexual deepfakes of women and children. We have issued nearly a dozen fines for non-compliance, including against a nudification site, which has withdrawn from the UK.

We also recently announced that we will be fast-tracking our decision on proposed new requirements for tech firms to use technology to block non-consensual intimate images at source, bringing it forward to May.

New investigation into image boards

Our job is to judge whether platforms have taken appropriate steps to comply with their legal obligations – it’s not to tell platforms which specific posts or accounts to take down.

We have engaged extensively with victims, survivors and advocacy groups, and carried out an initial assessment of two sites used to facilitate image-based sexual abuse. Today, we have opened a formal investigation to establish whether the provider of these sites has failed to comply with its duties under the Act.[3]

We will provide an update on this investigation as soon as possible.

Ofcom’s investigation process

The Online Safety Act sets out the process Ofcom must follow when investigating a company and deciding whether it has failed to comply with its legal obligations.[4]

Our first step is to gather and analyse evidence to determine whether a breach has occurred. If, based on that evidence, we consider that a compliance failure has taken place, we will issue a provisional decision to the company, which will then have an opportunity to respond to our findings in full, as required by the Act, before we make our final decision.

[Image: Typical Online Safety Act investigation process]

Enforcement powers

If our investigation finds that a company has broken the law, we can require platforms to take specific steps to come into compliance or to remedy harm caused by the breach. We can also impose fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater.

In the most serious cases of ongoing non-compliance, we can make an application to a court for ‘business disruption measures’, through which a court could impose an order requiring payment providers or advertisers to withdraw their services from a platform, or requiring internet service providers to block access to a site in the UK.

UK jurisdiction

As in other industries, companies that provide an online service to people in the UK must comply with UK laws. The Online Safety Act is concerned with protecting people in the UK. It does not require platforms to restrict what people in other countries can see.[5]

Notes to editors 

  1. User-to-user services are those where people may encounter content – including images, videos, messages or comments – that has been generated, uploaded or shared by other users. See section 66D, subsections 5-7 of the Sexual Offences Act, as inserted by the Online Safety Act, for the definition of intimate image abuse. Schedule 6 sets out the child sexual exploitation and abuse offences that are priority offences under the Act.
  2. In November, we launched new industry guidance demanding that tech firms step up to deliver a safer online experience for women and girls in the UK.
  3. Ofcom’s illegal harms codes of practice set out safety measures providers can implement to comply with their duties, such as: having user reporting and complaints processes for illegal content that are easy to find, access and use; adequately resourcing and training content moderation teams as appropriate to deal with illegal content; and having content moderation systems that are designed to take down illegal content swiftly when they become aware of it. The responsibility is on platforms to decide whether content is illegal, and they can use Ofcom’s Illegal Content Judgements Guidance when making these decisions.
  4. Our Online Safety Enforcement Guidance can be found here.
  5. More information on jurisdiction is available here.