Tech firms must clamp down on illegal online materials
Tech firms must use a range of measures to protect their users from illegal content online – from child sexual abuse material and grooming to fraud – under detailed plans recently set out by the new online safety regulator.
- Ofcom sets out first steps tech firms can take to create a safer life online
- Children are key priority, with measures proposed to tackle child sexual abuse and grooming, and pro-suicide content
- New research reveals children’s experiences of potentially unwanted contact online
- Tech companies will also need to address fraud and terrorist content
Ofcom is exercising its new powers to release draft Codes of Practice that social media, gaming, pornography, search and sharing sites can follow to meet their duties under the Online Safety Act, which came into law last month.
Ofcom’s role will be to force firms to tackle the causes of online harm by making their services fundamentally safer. Our powers will not involve us making decisions about individual videos, posts, messages or accounts, or responding to individual complaints.
Firms will be required to assess the risk of users being harmed by illegal content on their platform, and take appropriate steps to protect them from it. There is a particular focus on ‘priority offences’ set out in the legislation, such as child abuse, grooming and encouraging suicide; but it could be any illegal content.
Dame Melanie Dawes, Ofcom's Chief Executive, said:
“Regulation is here, and we’re wasting no time in setting out how we expect tech firms to protect people from illegal harm online, while upholding freedom of expression. Children have told us about the dangers they face, and we’re determined to create a safer life online for young people in particular.”
Combatting child sexual abuse and grooming
Protecting children will be Ofcom’s first priority as the online safety regulator. Scattergun friend requests are frequently used by adults looking to groom children for the purposes of sexual abuse. Our new research, published recently, sets out the scale and nature of children’s experiences of potentially unwanted and inappropriate contact online.
Three in five secondary-school-aged children (11-18 years) have been contacted online in a way that potentially made them feel uncomfortable. Some 30% have received an unwanted friend or follow request. And around one in six secondary-schoolers (16%) have either been sent naked or half-dressed photos, or been asked to share these themselves.
Dame Melanie added:
“Our figures show that most secondary-school children have been contacted online in a way that potentially makes them feel uncomfortable. For many, it happens repeatedly. If these unwanted approaches occurred so often in the outside world, most parents would hardly want their children to leave the house. Yet somehow, in the online space, they have become almost routine. That cannot continue.”
Given the range and diversity of services in scope of the new laws, we are not taking a ‘one size fits all’ approach. We are proposing some measures for all services in scope, and other measures that depend on the risks the service has identified in its illegal content risk assessment and the size of the service.
Under our draft codes published recently, larger and higher-risk services should ensure that, by default:
- Children are not presented with lists of suggested friends;
- Children do not appear in other users’ lists of suggested friends;
- Children are not visible in other users’ connection lists;
- Children’s connection lists are not visible to other users;
- Accounts outside a child’s connection list cannot send them direct messages; and
- Children’s location information is not visible to any other users.
We are also proposing that larger and higher-risk services should:
- use a technology called ‘hash matching’ – which identifies illegal images of child sexual abuse by matching them against a database of known illegal images – to help detect and remove child sexual abuse material (CSAM) circulating online; and
- use automated tools to detect URLs that have been identified as hosting CSAM.
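To make the two proposals above concrete, here is a minimal sketch of hash matching and URL detection in Python. Everything in it is an illustrative assumption: the database entries, the blocklist, and the function names are invented for the example, and real deployments use perceptual hashing (robust to resizing and re-encoding) supplied by bodies such as the Internet Watch Foundation, not plain cryptographic hashes, which only catch exact byte-for-byte copies.

```python
import hashlib

# Hypothetical database of SHA-256 digests of known illegal images.
# (This entry is the digest of the bytes b"test", used purely for the demo.)
KNOWN_IMAGE_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

# Hypothetical blocklist of URLs identified as hosting CSAM.
BLOCKED_URLS = {"example.invalid/blocked-page"}


def image_hash(data: bytes) -> str:
    """Return the SHA-256 hex digest of raw image bytes."""
    return hashlib.sha256(data).hexdigest()


def is_known_image(data: bytes) -> bool:
    """Hash matching: flag content whose digest appears in the database."""
    return image_hash(data) in KNOWN_IMAGE_HASHES


def is_blocked_url(url: str) -> bool:
    """URL detection: normalise the URL and compare it to the blocklist."""
    normalised = (
        url.lower()
        .removeprefix("https://")
        .removeprefix("http://")
        .rstrip("/")
    )
    return normalised in BLOCKED_URLS
```

The key design point is that matching happens against digests and normalised strings, so the service never needs to interpret the content itself – which is why hash matching can run automatically and at scale.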
All large general search services should provide crisis prevention information in response to search requests regarding suicide and queries seeking specific, practical or instructive information regarding suicide methods.