Tech firms must clamp down on illegal online materials

13 Nov 2023 12:41 PM

Tech firms must use a range of measures to protect their users from illegal content online – from child sexual abuse material and grooming to fraud – under detailed plans set out by the new online safety regulator recently.

Ofcom is exercising its new powers to release draft Codes of Practice that social media, gaming, pornography, search and sharing sites can follow to meet their duties under the Online Safety Act, which came into law last month.

Ofcom’s role will be to force firms to tackle the causes of online harm by making their services fundamentally safer. Our powers will not involve us making decisions about individual videos, posts, messages or accounts, or responding to individual complaints.

Firms will be required to assess the risk of users being harmed by illegal content on their platform, and take appropriate steps to protect them from it. There is a particular focus on ‘priority offences’ set out in the legislation, such as child abuse, grooming and encouraging suicide; but it could be any illegal content.

Dame Melanie Dawes, Ofcom's Chief Executive, said:

“Regulation is here, and we’re wasting no time in setting out how we expect tech firms to protect people from illegal harm online, while upholding freedom of expression. Children have told us about the dangers they face, and we’re determined to create a safer life online for young people in particular.”

Combating child sexual abuse and grooming

Protecting children will be Ofcom’s first priority as the online safety regulator. Scattergun friend requests are frequently used by adults looking to groom children for the purposes of sexual abuse. Our new research, published recently, sets out the scale and nature of children’s experiences of potentially unwanted and inappropriate contact online.

Three in five secondary-school-aged children (11-18 years) have been contacted online in a way that potentially made them feel uncomfortable. Some 30% have received an unwanted friend or follow request. And around one in six secondary-schoolers (16%) have either been sent naked or half-dressed photos, or been asked to share these themselves.

Dame Melanie recently said:

“Our figures show that most secondary-school children have been contacted online in a way that potentially makes them feel uncomfortable. For many, it happens repeatedly. If these unwanted approaches occurred so often in the outside world, most parents would hardly want their children to leave the house. Yet somehow, in the online space, they have become almost routine. That cannot continue.”

Given the range and diversity of services in scope of the new laws, we are not taking a ‘one size fits all’ approach. We are proposing some measures for all services in scope, and other measures that depend on the risks the service has identified in its illegal content risk assessment and the size of the service.

Under our draft codes published recently, larger and higher-risk services should ensure that, by default:

We are also proposing that larger and higher-risk services should:

All large general search services should provide crisis prevention information in response to search requests regarding suicide and queries seeking specific, practical or instructive information regarding suicide methods.
