How we’re keeping people safe from harmful video content online

6 Oct 2021 01:19 PM

Yesterday we announced guidance for video-sharing platforms (VSPs) to help protect the people who use them from harmful content.

What are VSPs?

VSPs are a type of online video service where users can upload and share videos with other members of the public. They allow people to engage with a wide range of content and social features.

VSPs established in the UK are required by law to take measures to protect under-18s from potentially harmful video content, and to protect all users from videos likely to incite violence or hatred, as well as from certain types of criminal content.

According to our research, 70% of VSP users say they have been exposed to a potentially harmful experience online. Around a third (32%) report seeing hateful content; around a quarter report bullying, abusive behaviour and threats (26%) or violent and disturbing content (26%); and one in five (21%) report videos or content that encouraged racism.

Our guidance is aimed at helping these platforms to understand their new obligations and judge how best to protect their users from this kind of harmful material.

What is Ofcom's role in regulating VSPs?

Our job is to make sure VSPs within our jurisdiction have appropriate measures in place to protect users from videos which:

  1. might impair the physical, mental or moral development of under-18s;
  2. are likely to incite violence or hatred based on particular grounds such as sex, race, colour, ethnic or social origin, genetic features, language, religion or belief, political or any other opinion, membership of a national minority, property, birth, disability, age or sexual orientation; and/or
  3. directly or indirectly encourage acts of terrorism; show or involve conduct that amounts to child sexual abuse; or show or involve conduct that incites racism or xenophobia.

Appropriate measures may include: terms and conditions; reporting and flagging functions; viewer rating systems; age verification; parental control functions; complaints procedures; or media literacy tools and information.

Which video-sharing platforms fall under Ofcom’s powers?

VSP providers must assess whether they fall under the regulations and come under UK jurisdiction. If they do, they are legally obliged to notify their service to Ofcom. We have published guidance to help them do this.

We maintain an up-to-date published list of the VSPs that have notified Ofcom.

What powers does Ofcom have to keep video-sharing platforms in check?

We have information-gathering powers that enable us to demand relevant information from the platforms, for purposes such as assessing and monitoring their compliance and conducting investigations. We also have the power to take enforcement action if a breach of the rules occurs.

What penalties could video-sharing platforms face if they break the rules?

If we find that a VSP has broken the rules, we can impose a financial penalty of up to 5% of its qualifying revenue or £250,000, whichever is greater. For example, a platform with qualifying revenue of £10 million could face a penalty of up to £500,000, since 5% of that revenue exceeds £250,000.

Does this mean you’ll be censoring content on the internet?

No. Freedom of expression is central to our democracy, values and modern society. Unlike in our broadcasting work, Ofcom’s role is not focused on determining whether particular items of content should or should not be made available or whether they comply with specific content standards. Rather, our role is to ensure platforms have safety systems and processes in place that provide effective protection to their users from the harms mentioned above.

In carrying out our responsibilities, we will always take account of users’ rights, including freedom of expression.

Can I complain to Ofcom about harmful content on video-sharing platforms?

You should complain directly to the video-sharing platform if you have concerns about harmful content on it.

If you have reported content to a platform and are concerned that no action was taken, you can tell Ofcom through our online complaints portal. You can also report any concerns about a platform's safety measures – for example, problems with its reporting, flagging or age verification functions.

Our role is to make sure providers have appropriate measures in place to protect users. Complaints from the public help us identify potential compliance issues, but we do not resolve individual complaints.