Age checks to protect children online

16 Jan 2025 12:43 PM

Children will be prevented from encountering online pornography and protected from other types of harmful content under Ofcom’s new industry guidance, which sets out how we expect sites and apps to introduce highly effective age assurance.

These decisions are the next step in Ofcom implementing the Online Safety Act and creating a safer life online for people in the UK, particularly children. They follow tough industry standards, announced last month, to tackle illegal content online, and come ahead of broader child-protection measures which will launch in the spring.

Robust age checks are a cornerstone of the Online Safety Act. It requires services which allow pornography or certain other types of harmful content to introduce ‘age assurance’ to ensure that children are not normally able to encounter it.[1] Age assurance methods – which include age verification, age estimation or a combination of both – must be ‘highly effective’ at correctly determining whether a particular user is a child.

We have published industry guidance on how we expect age assurance to be implemented in practice for it to be considered highly effective. Our approach is designed to be flexible, tech-neutral and future-proof. It also allows space for innovation in age assurance, which represents an important part of a wider safety tech sector where the UK is a global leader.[2] We expect the approach to be applied consistently across all parts of the online safety regime over time.

While providing strong protections to children, our approach also takes care to ensure that privacy rights are protected and that adults can still access legal pornography. As platforms take action to introduce age assurance over the next six months, adults will start to notice changes in how they access certain online services. Our evidence suggests that the vast majority of adults (80%) are broadly supportive of age assurance measures to prevent children from encountering online pornography.[3]

What are online services required to do, and by when?

The Online Safety Act divides online services into different categories with distinct routes to implement age checks. However, the action we expect all of them to take starts now.

What does highly effective age assurance mean?

Our approach to highly effective age assurance, and how we expect it to be implemented in practice, applies consistently across three pieces of industry guidance published yesterday.[5] Our final position, in summary:

We consider that this approach will secure the best outcomes for the protection of children online in the early years of the Act being in force. While we have decided not to introduce numerical thresholds for highly effective age assurance at this stage (e.g. 99% accuracy), we acknowledge that numerical thresholds may complement our four criteria in the future, pending further developments in testing methodologies, industry standards and independent research.

Opening a new enforcement programme

We expect all services to take a proactive approach to compliance and meet their respective implementation deadlines. Ofcom is opening an age assurance enforcement programme, focusing our attention first on Part 5 services that display or publish their own pornographic content.

We will contact a range of adult services – large and small – to advise them of their new obligations. We will not hesitate to take action and launch investigations against services that do not engage or ultimately comply.

For too long, many online services which allow porn and other harmful material have ignored the fact that children are accessing their services. Either they don’t ask or, when they do, the checks are minimal and easy to avoid. That means companies have effectively been treating all users as if they’re adults, leaving children potentially exposed to porn and other types of harmful content. Today, this starts to change.

As age checks start to roll out in the coming months, adults will start to notice a difference in how they access certain online services. Services which host their own pornography must start to introduce age checks immediately, while other user-to-user services – including social media – which allow pornography and certain other types of content harmful to children will have to follow suit by July at the latest.

We’ll be monitoring the response from industry closely. Those companies that fail to meet these new requirements can expect to face enforcement action from Ofcom.

- Melanie Dawes, Ofcom’s Chief Executive

Notes to editor

  1. Research shows that children are being exposed to online pornography from an early age. Of those who have seen online pornography, the average age at which they first encounter it is 13 – although more than a quarter come across it by age 11 (27%), and one in ten as young as 9 (10%). Source: ‘A lot of it is actually just abuse’ – Young people and pornography, Children’s Commissioner for England.
  2. Research from the UK Government indicates that UK firms account for an estimated one in four (23%) of the global safety tech workforce, while 28% of safety tech companies are based in the UK, according to recent research by Paladin Capital and PUBLIC.
  3. Source: Yonder Consulting – Adult Users’ Attitudes to Age Verification on Adult Sites.
  4. ‘Part 3’ services include those that host user-generated content, such as social media, tube sites, cam sites, and fan platforms.
  5. Services that conclude they are not likely to be accessed by children – including where this is because they are using highly effective age assurance – must record the outcome of their assessment and must repeat the children’s access assessment at least annually.
  6. ‘Part 5’ services are those that publish their own pornographic content, such as studios or pay sites, where operators control the material available.