Ofcom fines porn company £1 million for not having robust age checks
Ofcom yesterday fined AVS Group Ltd – which runs 18 adult websites – £1 million for not having robust age checks in place, plus £50,000 for failing to respond to information requests.
- Watchdog also reports on how platforms have responded to the UK’s new online safety laws this year
These fines come as the regulator reports on how the online safety landscape is changing, following new UK laws coming into effect this year.
Porn provider fined for not having robust age checks
Under the UK’s Online Safety Act, sites that host pornographic material must use highly effective age assurance to prevent children from readily accessing that content.
Within days of this duty coming into force in July, Ofcom launched investigations into the providers of dozens of adult sites, including AVS Group Ltd. With millions of monthly UK visitors, these websites were prioritised based on their user numbers and the risk of harm they posed.
While AVS has implemented what it refers to as age verification, we do not consider it to be highly effective, and have fined the company £1,000,000. AVS must now implement highly effective age assurance within 72 hours of this decision, or face a daily penalty of £1,000.[1]
We continue to investigate other services’ compliance with age check requirements and will take action where necessary.
We have also fined AVS £50,000 for failing to respond to our legally binding information request. We will impose a daily penalty of £300 on the company, starting from tomorrow, until it responds or for 60 days, whichever is sooner.[2]
What’s changing under new online safety rules
For more than two decades, online platforms have been unregulated, unaccountable and often unwilling to prioritise people’s safety over profits. Illegal content duties under the Act came into force in March, and children’s safety duties came into force in July.
New Ofcom research, published yesterday, shows that 58% of parents believe the measures in our codes of practice are already improving the safety of UK children online; 67% feel the measures will make a difference in the future; and 36% have noticed a potential impact on their child’s online activity.
Ofcom’s job is to make sure tech firms assess the risk their platforms pose to UK users and take appropriate steps to address those risks. So far, we have opened investigations into 92 online services, fined three providers, and some high-risk sites are no longer available to UK IP addresses.
We are scrutinising what providers are doing, and whether their actions are translating into a safer life online for UK users. This includes, but is not limited to:
Risk assessments - We have reviewed 104 risk assessment records from a range of large and small services, spanning over 10,000 pages. We told 11 providers to do more work, and all submitted revised records or supplementary information. In addition, one major social media company is going through compliance remediation with our Enforcement team, which may result in formal action if we do not see sufficient improvement soon. Between 1 May and 31 July 2026, providers will have to submit their second risk assessment records to us on request, which should reflect any significant changes to their services and show they have assessed risks before changes are implemented.[3]
Age checks and other child protections - More than half of the top 100 most popular adult services in the UK have deployed age assurance, as have widely used services including X, TikTok, Reddit, Bluesky, Telegram, Discord, Roblox, Xbox, Steam, Bumble, Tinder, Hinge and Grindr. Ofcom’s new research has found that 47% of children aged 8-17 encountered an age check online when trying to access age-restricted content after the July deadline, compared to 30% before. We will publish a report on the deployment and effectiveness of age assurance by July 2026.
The platforms most used by kids – including Facebook, Instagram, Pinterest, TikTok, YouTube, Roblox and Snap – must tell us what they are doing to keep children safe, and make improvements where needed. By May 2026, we will publish data and analysis on children’s online experience. We will provide further updates on any enforcement action we deem necessary.
Combatting CSAM and grooming - More sites have implemented ‘hash-matching’ technologies to detect and remove child sexual abuse material (CSAM), in line with our codes; and many are now restricting adults from seeing children in network expansion prompts, or from sending private messages to children before being connected, as our codes recommend, to reduce grooming risks. However, several recent reports have underlined the scale of the remaining problem that platforms need to address.
Tackling terrorist content and hate speech - We have worked with a range of civil society organisations to gather evidence that suggests terrorist content and illegal hate speech is persisting on some of the largest social media sites. Since the terrorist attack on the Heaton Park Hebrew Congregation synagogue, we have kicked off a new compliance programme to determine whether the biggest social media companies have adequate systems and processes for assessing and swiftly removing illegal hate and terror material that has been reported to them. By April 2026, we will have reviewed one major platform’s systems for taking down illegal terror and hate content – including antisemitic and anti-Muslim material – and consider whether formal enforcement action is necessary.
Fighting fraud - Organisations with fraud expertise have said platforms improved their reporting processes, making it easier for them to report fraud directly. However, substantially more needs to be done to address one of the most widespread online harms faced by UK users.
Oliver Griffiths, Ofcom’s Online Safety Group Director, said: “The tide on online safety is beginning to turn for the better. This year has seen important changes for people, with new measures across many sites and apps now better protecting children from harmful content. But we need to see much more from tech companies next year and we’ll use our full powers if they fall short.”
Technology Secretary Liz Kendall said: “Since the enforcement of the Online Safety Act, platforms have finally started taking responsibility for protecting children and removing illegal and hateful content. Ofcom has the government’s full backing to use all its powers to ensure that services put users’ safety first. Keeping children safe online is this government’s and my personal priority.”
Kerry Smith, Chief Executive of the Internet Watch Foundation (IWF), said: “The threats children face online evolve and escalate at a rate that sometimes outstrips regulation. As Ofcom enforces the Online Safety Act, we are starting to see the UK living up to its ambition to be the safest place in the world to be online.
“This crucial legislation is already making a difference. Children and young people deserve an internet where they can play, socialise, learn, and be themselves online without the fear of grooming, coercion, sexual abuse, and exposure to sexual imagery.
“We strongly believe there should be no safe spaces online for predators to hide. So while we welcome the steps taken by Ofcom so far, there is much more to do, and we look forward to working together to ensure children are given the safety they need.”
Chris Sherwood, CEO at the NSPCC, said: “All children deserve safe and positive experiences online. We welcome the progress Ofcom is making to ensure their codes of practice result in safer platforms and their drive to target clear cases of non-compliance.
“However, there is much still to do. In 2026 Ofcom must act with ambition as they implement and enforce the regulation, putting meaningful change for children at the heart of decision making. All services must be pushed to put safety before profit and deliver platforms which are genuinely safe. Only then can we ensure no child is exposed to preventable harm online.”
Helen Rance, Deputy Director CSA Threat at the National Crime Agency said: “As our lives - and particularly children’s lives - increasingly move online, ensuring that platforms protect both children and adults is more essential than ever. We estimate that around three-quarters of a million UK-based adults pose varying degrees of sexual threat to children, and online services provide far too many opportunities for them to engage with young people and with each other.
“Each month, industry reports contribute significantly to coordinated action by the NCA and UK policing, resulting in approximately 1,000 arrests and 1,200 children being safeguarded. But we all need to do more to reverse the rise in offending. We will continue to work closely with Ofcom and online platforms to drive stronger protections, close down opportunities for offenders, and make the online world safer for children.”
Notes to editors
- If a provider fails to pay a fine, Ofcom can seek recovery of those penalties. Where appropriate, we can also seek a court order for ‘business disruption measures’, such as requiring payment providers or advertisers to withdraw their services from a platform, or requiring Internet Service Providers to block a site in the UK.
- We have also expanded the scope of our ongoing investigation into 4chan to include its compliance with children’s safety duties, in addition to its compliance with duties to protect UK users from illegal content.
- We have published a comprehensive report on risk assessments, which provides more information on our views of the first set of submitted risk assessments.
Original article link: https://www.ofcom.org.uk/online-safety/protecting-children/ofcom-fines-porn-company-1million-for-not-having-robust-age-checks