Ofcom launches investigation into X over Grok sexualised imagery

12 Jan 2026 12:00 PM

The UK’s independent online safety watchdog, Ofcom, has today opened a formal investigation into X under the UK’s Online Safety Act, to determine whether the company has complied with its duties to protect people in the UK from content that is illegal in the UK.

Our initial assessment

There have been deeply concerning reports of the Grok AI chatbot account on X being used to create and share undressed images of people – which may amount to intimate image abuse or pornography – and sexualised images of children that may amount to child sexual abuse material (CSAM).[1]

As the UK’s independent online safety watchdog, we urgently made contact with X on Monday 5 January and set a firm deadline of Friday 9 January for it to explain what steps it has taken to comply with its duties to protect its users in the UK.

The company responded by the deadline, and we carried out an expedited assessment of the available evidence.[2]

What our investigation will examine

Ofcom has decided to open a formal investigation to establish whether X has failed to comply with its legal obligations under the Online Safety Act.

Ofcom’s role

The legal responsibility is on platforms to decide whether content breaks UK laws, and they can use our Illegal Content Judgements Guidance when making these decisions. Ofcom is not a censor – we do not tell platforms which specific posts or accounts to take down.

Our job is to judge whether sites and apps have taken appropriate steps to protect people in the UK from content that is illegal in the UK, and protect UK children from other content that is harmful to them, such as pornography.

Ofcom’s investigation process

The Online Safety Act sets out the process Ofcom must follow when investigating a company and deciding whether it has failed to comply with its legal obligations.[5]

Our first step is to gather and analyse evidence to determine whether a breach has occurred. If, based on that evidence, we consider that a compliance failure has taken place, we will issue a provisional decision to the company, which will then have an opportunity to respond to our findings in full, as required by the Act, before we make our final decision.

[Infographic: How Ofcom investigates under the Online Safety Act]

Enforcement powers

If our investigation finds that a company has broken the law, we can require platforms to take specific steps to come into compliance or to remedy harm caused by the breach. We can also impose fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater.[6]

In the most serious cases of ongoing non-compliance, we can make an application to a court for ‘business disruption measures’, through which a court could impose an order, on an interim or full basis, requiring payment providers or advertisers to withdraw their services from a platform, or requiring internet service providers to block access to a site in the UK. The court may only impose such orders where appropriate and proportionate to prevent significant harm to individuals in the UK.[7]

UK jurisdiction

In any industry, companies that want to provide a service to people in the UK must comply with UK laws. The UK’s Online Safety Act is concerned with protecting people in the UK. It does not require platforms to restrict what people in other countries can see.

There are ways platforms can protect people in the UK without stopping their users elsewhere in the world from continuing to see that content.[8]

An Ofcom spokesperson said:

“Reports of Grok being used to create and share illegal non-consensual intimate images and child sexual abuse material on X have been deeply concerning. Platforms must protect people in the UK from content that’s illegal in the UK, and we won’t hesitate to investigate where we suspect companies are failing in their duties, especially where there’s a risk of harm to children.

“We’ll progress this investigation as a matter of the highest priority, while ensuring we follow due process. As the UK’s independent online safety enforcement agency, it’s important we make sure our investigations are legally robust and fairly decided.”

We will provide an update on this investigation as soon as possible.

Notes to Editors 

  1. It is illegal in the UK to share non-consensual intimate images or child sexual abuse material. See section 66D, subsections (5) to (7), of the Sexual Offences Act 2003, as inserted by the Online Safety Act, for the definition of intimate image abuse; Schedule 6 of the Online Safety Act sets out the child sexual exploitation and abuse offences that are priority offences under the Act.
  2. We also received a response from xAI on Friday 9 January. We are assessing whether there are potential compliance issues with xAI – in connection with the provision of Grok – under the Online Safety Act that warrant investigation. We have sought urgent clarification from xAI on the steps it is taking to protect users in the UK.
  3. Ofcom’s illegal harms codes of practice set out safety measures providers can implement to comply with their duties, such as: having user reporting and complaints processes for illegal content that are easy to find, access and use; adequately resourcing and training content moderation teams as appropriate to deal with illegal content; and having content moderation systems that are designed to take down illegal content swiftly when they become aware of it.
  4. We have published guidance here and here setting out age assurance methods that we consider are capable of being highly effective at correctly determining whether or not a user is a child.
  5. Our Online Safety Enforcement Guidance can be found here.
  6. Since duties came into force less than a year ago, we have made use of a range of our powers under the Online Safety Act and have taken enforcement action against a number of services. A full list of our enforcement actions under the Online Safety Act is available here.

  7. While we will not hesitate to use these powers where it is appropriate and proportionate, business disruption measures would be a significant regulatory intervention because of their impact on the availability of services and information online for people in the UK. We recently put the provider of a suicide forum on notice that, if any non-compliance we may identify in our provisional decision continues, we are prepared to apply to a court for business disruption measures swiftly after the period for making representations on that provisional decision has elapsed.
  8. More information on jurisdiction is available here.