Ofcom

Tech firms must clamp down on illegal online materials

Tech firms must use a range of measures to protect their users from illegal content online – from child sexual abuse material and grooming to fraud – under detailed plans recently set out by the new online safety regulator.

  • Ofcom sets out first steps tech firms can take to create a safer life online
  • Children are a key priority, with measures proposed to tackle child sexual abuse and grooming, and pro-suicide content
  • New research reveals children’s experiences of potentially unwanted contact online
  • Tech companies will also need to address fraud and terrorist content

Ofcom is exercising its new powers to release draft Codes of Practice that social media, gaming, pornography, search and sharing sites can follow to meet their duties under the Online Safety Act, which came into law last month.

Ofcom’s role will be to force firms to tackle the causes of online harm by making their services fundamentally safer. Our powers will not involve us making decisions about individual videos, posts, messages or accounts, or responding to individual complaints.

Firms will be required to assess the risk of users being harmed by illegal content on their platform, and take appropriate steps to protect them from it. There is a particular focus on the ‘priority offences’ set out in the legislation, such as child abuse, grooming and encouraging suicide, but the duties cover any illegal content.

Dame Melanie Dawes, Ofcom's Chief Executive, recently said:

“Regulation is here, and we’re wasting no time in setting out how we expect tech firms to protect people from illegal harm online, while upholding freedom of expression. Children have told us about the dangers they face, and we’re determined to create a safer life online for young people in particular.”

Combatting child sexual abuse and grooming

Protecting children will be Ofcom’s first priority as the online safety regulator. Scattergun friend requests are frequently used by adults looking to groom children for the purposes of sexual abuse. Our new research, published recently, sets out the scale and nature of children’s experiences of potentially unwanted and inappropriate contact online.

Three in five secondary-school-aged children (11-18 years) have been contacted online in a way that potentially made them feel uncomfortable. Some 30% have received an unwanted friend or follow request. And around one in six secondary-schoolers (16%) have either been sent naked or half-dressed photos, or been asked to share these themselves.

Dame Melanie recently said:

“Our figures show that most secondary-school children have been contacted online in a way that potentially makes them feel uncomfortable. For many, it happens repeatedly. If these unwanted approaches occurred so often in the outside world, most parents would hardly want their children to leave the house. Yet somehow, in the online space, they have become almost routine. That cannot continue.”


Given the range and diversity of services in scope of the new laws, we are not taking a ‘one size fits all’ approach. We are proposing some measures for all services in scope, and other measures that depend on the risks the service has identified in its illegal content risk assessment and the size of the service.

Under our draft codes published recently, larger and higher-risk services[4] should ensure that, by default (see the illustrative sketch after this list):

  • Children are not presented with lists of suggested friends;
  • Children do not appear in other users’ lists of suggested friends;
  • Children are not visible in other users’ connection lists;
  • Children’s connection lists are not visible to other users;
  • Accounts outside a child’s connection list cannot send them direct messages;[5] and
  • Children’s location information is not visible to any other users.
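
The draft codes describe these as outcomes rather than specific product settings. As a purely illustrative sketch, a service's child-account defaults might be represented roughly as follows; all of the field names below are hypothetical and are not taken from the Act or from Ofcom's draft codes.

```python
from dataclasses import dataclass


@dataclass
class ChildAccountDefaults:
    """Illustrative defaults reflecting the measures listed above.

    Field names are hypothetical; the draft codes describe outcomes,
    not configuration keys.
    """
    receive_friend_suggestions: bool = False        # no suggested-friend lists shown to the child
    appear_in_friend_suggestions: bool = False      # child not suggested to other users
    visible_in_others_connection_lists: bool = False
    connection_list_visible: bool = False           # child's own connection list hidden
    accept_dms_from_non_connections: bool = False   # only existing connections can message
    share_location: bool = False                    # location never visible to other users
```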

We are also proposing that larger and higher-risk services should:

  • use a technology called ‘hash matching’, which identifies illegal images of child sexual abuse by matching them against a database of known illegal images, to help detect and remove child sexual abuse material (CSAM) circulating online (a minimal sketch follows this list);[6] and
  • use automated tools to detect URLs that have been identified as hosting CSAM.
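
As a rough illustration of the two measures above, the sketch below checks an uploaded image against a database of known hashes and a posted link against a URL list. It is a simplified, assumed implementation: the lists, names and use of exact SHA-256 hashing are placeholders, and real deployments typically rely on perceptual hash technologies and on hash and URL lists supplied by expert bodies.

```python
import hashlib
from urllib.parse import urlsplit

# Hypothetical blocklists: in practice these would be hash and URL lists
# supplied by expert bodies, not values hard-coded into the service.
KNOWN_IMAGE_HASHES: set[str] = set()   # hex digests of known illegal images
KNOWN_CSAM_URLS: set[str] = set()      # normalised URLs identified as hosting CSAM


def image_matches_known_hash(image_bytes: bytes) -> bool:
    """Flag an upload whose digest appears in the hash database.

    Deployed systems generally use perceptual hashes, which survive resizing
    and re-encoding; exact SHA-256 matching is used here only to keep the
    sketch self-contained.
    """
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_IMAGE_HASHES


def url_is_flagged(url: str) -> bool:
    """Check a posted link against the URL list after light normalisation."""
    parts = urlsplit(url.strip().lower())
    normalised = f"{parts.netloc}{parts.path}".rstrip("/")
    return normalised in KNOWN_CSAM_URLS
```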

All large general search services should provide crisis prevention information in response to search requests regarding suicide, and to queries seeking specific, practical or instructive information about suicide methods.
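
As a minimal, hypothetical sketch of how a search service might surface such information, the example below prepends a support message to results when a query appears to relate to suicide. The keyword list, banner text and function names are illustrative only; a real service would use more careful query classification and locally appropriate crisis resources.

```python
# Placeholder terms and message; a real service would use more robust query
# classification and locally appropriate crisis resources.
CRISIS_TERMS = ("suicide", "kill myself", "end my life")
CRISIS_BANNER = (
    "You are not alone. Free, confidential support is available, "
    "for example from Samaritans on 116 123 in the UK."
)


def results_page(query: str, organic_results: list[str]) -> list[str]:
    """Prepend crisis prevention information when a query relates to suicide."""
    if any(term in query.lower() for term in CRISIS_TERMS):
        return [CRISIS_BANNER, *organic_results]
    return organic_results
```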


 

Channel website: https://www.ofcom.org.uk/

Original article link: https://www.ofcom.org.uk/news-centre/2023/tech-firms-must-clamp-down-on-illegal-online-materials


