Keep underage children off your platforms, Ofcom tells tech firms
Major sites and apps must enforce their minimum age rules with highly effective age checks, Ofcom warns today, as the online safety regulator examines continued failings by the services most popular among children
- Popular sites and apps must enforce minimum ages, tackle grooming, make feeds safer and test products rigorously
- Facebook, Instagram, Roblox, Snapchat, TikTok and YouTube have until end of next month to say what they're doing, with Ofcom to report on this in May
We have today written to the major sites and apps that children use the most – Facebook, Instagram, Roblox, Snapchat, TikTok and YouTube – requiring them to prove to parents a genuine commitment to protecting children online.
Since the UK’s online safety laws came into force last year, Ofcom has been investigating nearly a hundred services. We have taken enforcement action, secured changes to disrupt the sharing of child sexual abuse material, and seen high-risk services either get in line or block access to the UK altogether. Millions of daily visits to porn sites now require highly effective age checks. Major platforms, including X, Telegram, Discord and Reddit, have also introduced age controls to prevent children accessing adult or harmful content.
While there are many examples of progress to be welcomed,[1] the industry has not done enough. Parents have lost trust in tech firms’ ability to keep their children safe, while the Government is consulting on further legislative measures to address public concern.
Four demands for further action
We are today setting out four clear demands for further action, so that tech firms are held publicly accountable for delivering the safest possible online environment for UK children – which many have claimed, privately in their engagement with Ofcom, they are serious about offering.
We have set Facebook, Instagram, Roblox, Snapchat, TikTok and YouTube a deadline of 30 April to report back to us on the action they will take, and we are urging them to publish this. In May, Ofcom will report on how the companies have responded and will announce any next steps for regulatory action.
Dame Melanie Dawes, Ofcom Chief Executive, said: “These online services are household names, but they’re failing to put children’s safety at the heart of their products. There is a gap between what tech companies promise in private, and what they’re doing publicly to keep children safe on their platforms.
“Without the right protections, like effective age checks, children have been routinely exposed to risks they didn’t choose, on services they can’t realistically avoid. That must now change quickly, or Ofcom will act.”
The companies must implement:
- Effective minimum-age policies. Ofcom research shows that widespread minimum age policies of 13 are still not being properly enforced by tech companies, with 72% of children aged 8-12 accessing their sites and apps.[2] Although this is not explicitly required by the Online Safety Act, we are calling on platforms to enforce their minimum ages using highly effective age assurance.[3]
- Failsafe grooming protections. This means ensuring strict controls to stop strangers being able to contact children they do not know on their platforms. That includes using highly effective age assurance to check users’ ages.
- Safer feeds for children. Algorithms are children’s main pathway to harm online.[4] To inform our assessment of these systems, we are issuing legally-binding information requests to large platforms. Given the complexities and technical nature of this work, it will take time to assess their responses. However, we will not hesitate to take enforcement action if we identify failings in how companies promote content to children.
- An end to product testing on children. New AI tools are launched regularly and widely used by children, without parents knowing whether they have been tested for safety. This cannot continue. We expect platforms to notify Ofcom that they have, as required by law, assessed the risks of significant updates before they are deployed.
Holding tech firms to account
At the same time as we publicly report on platforms’ responses in May, we will release new research on how far children’s online experiences have changed during the first year of the Online Safety Act being in force.
If we are not satisfied with platforms’ responses, we will be ready to take enforcement action. If necessary, we will also consider strengthening the regulatory requirements under our industry Codes to ensure further change. Separately, the Government has already begun seeking legislative changes from Parliament as new technology develops.[5]
Ofcom’s Chief Executive, Dame Melanie Dawes, and Group Director of Online Safety, Oliver Griffiths, will meet senior leaders and helpline counsellors on Thursday at the NSPCC and Childline headquarters. The remarks Dame Melanie Dawes is expected to deliver are available here.
Chris Sherwood, CEO at the NSPCC, said: “For too long, social media giants have looked the other way while harmful and addictive content floods children’s feeds, undermining their safety and wellbeing. That’s why Ofcom’s demand for far greater transparency about the risks children face online, and how tech companies plan to protect them, is absolutely essential.
“We’ve long called for minimum age limits to be properly enforced on social media, so it’s encouraging to see Ofcom confront this head on. Platforms must finally know who is using their services so that they can stop children accessing spaces that were never designed for them.
“As an urgent priority, Government must now give Ofcom the full powers it needs to enforce these effective age checks on young users, rein in dangerous algorithms, and finally hold tech companies to account when they fail to keep children safe.”
Notes to editors:
1. For example, Meta has announced improvements to Teen Accounts, TikTok has introduced new user support tools and reporting changes, and Roblox has put age checks in place to restrict adults from contacting children they don’t know. On Snapchat, we have seen stricter default privacy protections for teens, including on location sharing. YouTube has banned livestreaming for under 16s and allowed parents to limit the time their children spend on YouTube Shorts to zero.
2. The minimum age for YouTube, Instagram, Facebook, Snapchat and TikTok is 13 years of age. The minimum age for Roblox is 5 years of age. Under 13s may use YouTube through a supervised account (with parental consent). YouTube also offers YouTube Kids, for which there is no minimum age.
3. We welcome the ICO’s clear statement today that services must enforce their minimum age requirements to comply with data protection law, and we are working closely with them. Both regulators will publish an updated joint statement in March 2026, which outlines the main areas of interaction between online safety and data protection as they relate to age assurance. Ofcom also supports the ICO’s work on harms arising from the processing of personal information by recommender systems.
4. The Act is clear that pornography, suicide and self-harm material and eating disorder content must not be made available at all to under 18s. Other harmful content – including abusive and violent material and dangerous challenges – must not be pushed to children in their feeds. Ofcom’s Protection of Children Codes of Practice contain 50 measures designed to improve child safety online, covering areas such as content moderation, age assurance and recommender systems. Ofcom has subsequently proposed additional measures to ensure children’s online safety, covering areas such as livestreaming and the use of proactive technology.
5. Government announcement of new legislative measures on online safety: https://www.gov.uk/government/news/pm-no-platform-gets-a-free-pass-government-takes-action-to-keep-children-safe-online
Original article link: https://www.ofcom.org.uk/online-safety/protecting-children/keep-underage-children-off-your-platforms-ofcom-tells-tech-firms


