Ofcom investigates Telegram and teen chat sites
21 Apr 2026 12:57 PM
Enforcement action launched after evidence suggested that child sexual abuse material was being shared on Telegram, and that teen chat sites were being used by predators to groom children.
Ofcom has launched an investigation into Telegram under the UK’s Online Safety Act, to examine whether it is complying with its duties to prevent child sexual abuse material being shared.
The UK’s online safety watchdog has also opened investigations into Teen Chat and Chat Avenue to examine whether they are meeting their duties to protect children from the risk of being groomed by predators.
Additionally, we have provided updates on file-sharing services that are now either using hash-matching technology to detect and swiftly remove child sexual abuse material (CSAM) or have taken steps to prevent people in the UK from accessing their sites.
Suzanne Cater, Director of Enforcement at Ofcom, said:
“Child sexual exploitation and abuse causes devastating harm to victims, and making sure sites and apps tackle this is one of our highest priorities. It’s why we work so closely with partners in law enforcement and child protection organisations to identify where these harms are occurring and hold providers to account where they’re failing to meet their obligations.
“Progress has undeniably been made, particularly with file-sharing services, which are too often used to share horrific child sexual abuse imagery. But this problem extends to big platforms too, and teen-focused chat services are too easily being used by predators to groom children. These firms must do more to protect children, or face serious consequences under the Online Safety Act.”
CSAM on Telegram
It is illegal in the UK to share or be in possession of CSAM. Under the UK’s Online Safety Act, providers of ‘user-to-user’ services are required to assess and mitigate the risk of this horrific crime being perpetrated on their platforms.[1]
We work closely with law enforcement agencies and other organisations to identify platforms that are particularly susceptible to being used by offenders for the sharing of image-based CSAM.
We received evidence from the Canadian Centre for Child Protection regarding the alleged presence and sharing of child sexual abuse material on Telegram, and carried out our own assessment of the platform. In light of this, we have decided to open an investigation to examine whether Telegram has failed, or is failing, to comply with its duties in relation to illegal content.
Grooming on teen chat sites
The sexual exploitation and abuse of children online has devastating consequences for those affected. Online grooming crimes against children can include coercing a child to send sexual images of themselves, sexual extortion, and arranging in-person sexual abuse of a child.
Ofcom works with child protection agencies to identify services that present particular risks of grooming. This work has raised concerns about the risk to children on two chat services called Teen Chat and Chat Avenue, which have open chatrooms, private messaging, profile creation and media sharing functionalities.
Ofcom has engaged with representatives of the providers of these services to try to address these concerns. However, we are not satisfied that they are adequately protecting children in the UK from the risk of grooming.
We have therefore opened investigations into whether the providers of Teen Chat and Chat Avenue are taking appropriate steps to assess and mitigate the risk of UK users encountering illegal content and activity, including grooming. The investigation into Chat Avenue will also consider whether the provider is taking adequate steps to prevent children from encountering harmful content, including pornography, on the site.
CSAM is being tackled on file-sharing services
When duties under the Act came into effect last year, we immediately launched enforcement action to assess the safety measures being taken by file-sharing providers to prevent offenders from disseminating CSAM on their services.
As part of this work, we became concerned that the provider of file-sharing service Pixeldrain had not taken appropriate measures to assess and mitigate this risk.
In response to us raising our concerns with them, the provider of Pixeldrain made material improvements to its Illegal Content Risk Assessment and implemented perceptual hash matching – an automated tool that can detect and swiftly remove CSAM.
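For readers unfamiliar with the technique, the sketch below illustrates the general principle behind perceptual hash matching using a simple “difference hash” (dHash), one common approach of this kind. It is not Pixeldrain’s implementation: the filenames, the blocklist hash and the matching threshold are invented for the example, and in practice services match uploads against curated hash databases of known material supplied by child protection organisations rather than computing their own blocklists.

```python
# Illustrative sketch only: a basic perceptual "difference hash" (dHash).
# Unlike an exact checksum, a perceptual hash changes little when an image
# is re-encoded or resized, so near-duplicates can be detected.
from PIL import Image  # pip install Pillow


def dhash(path: str, hash_size: int = 8) -> int:
    """Each bit records whether a pixel is brighter than its right-hand
    neighbour in a tiny greyscale copy of the image."""
    img = Image.open(path).convert("L").resize((hash_size + 1, hash_size))
    bits = 0
    for y in range(hash_size):
        for x in range(hash_size):
            bits = (bits << 1) | (1 if img.getpixel((x, y)) > img.getpixel((x + 1, y)) else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits; a small distance means visually similar images."""
    return bin(a ^ b).count("1")


# Hypothetical usage: flag an upload if its hash is close to one on a blocklist.
# The hash value, filename and threshold below are placeholders for illustration.
blocklist = {0x8F1C2A3B4D5E6F70}
if any(hamming_distance(dhash("upload.jpg"), known) <= 5 for known in blocklist):
    print("Match found: remove the content and escalate for review")
```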
We have also today closed our investigation into file-sharing service Yolobit, which has taken steps to make itself unavailable to people in the UK.
This follows on from five other file-sharing providers taking steps to make their services unavailable to UK users after we launched enforcement proceedings against them, and two other services deploying hash matching as a direct result of our action.
Ofcom’s investigation process
The Online Safety Act sets out the process Ofcom must follow when investigating a company and deciding whether it has failed to comply with its legal obligations.[2]
Our first step is to gather and analyse evidence to determine whether a breach has occurred. If, based on that evidence, we consider that a compliance failure has taken place, we will issue a provisional decision to the company, which will then have an opportunity to respond to our findings in full, as required by the Act, before we make our final decision.
We will provide updates on our investigations as soon as possible.

Enforcement powers
If we find that a company has broken the law, we can require it to take specific steps to come into compliance or to remedy harm caused by the breach. We can also impose fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater.
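To make the “whichever is greater” rule concrete, the minimal calculation below shows how the cap works; the £18 million figure and 10% rate come from the text above, while the revenue figures are invented for illustration and “qualifying worldwide revenue” is defined by the Act, not by this snippet.

```python
# Illustrative only: the statutory cap is the greater of £18m or 10% of
# qualifying worldwide revenue.
def maximum_fine(qualifying_worldwide_revenue: float) -> float:
    return max(18_000_000, 0.10 * qualifying_worldwide_revenue)

print(maximum_fine(50_000_000))     # 18,000,000 – the flat £18m cap applies
print(maximum_fine(1_000_000_000))  # 100,000,000 – 10% of revenue is greater
```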
In the most serious cases of ongoing non-compliance, we can make an application to a court for ‘business disruption measures’, through which a court could impose an order requiring payment providers or advertisers to withdraw their services from a platform, or requiring internet service providers to block access to a site in the UK.
UK jurisdiction
As in other industries, companies that provide an online service to people in the UK must comply with UK laws. The Online Safety Act is concerned with protecting people in the UK. It does not require platforms to restrict what people in other countries can see.[3]
- User-to-user services are where people may encounter content – including images, videos, messages or comments – that has been generated, uploaded or shared by other users. Schedule 6 of the Online Safety Act explains the child sexual exploitation and abuse offences that are priority offences under the Act.
- Our Online Safety Enforcement Guidance can be found here.
- More information on jurisdiction is available here.