Ofcom

Nudging users to report potentially harmful online content

Despite the many benefits of being online, most of us come across potentially harmful material. Ofcom research found that six in 10 users say they encountered at least one piece of harmful content online in the previous four weeks alone. In some cases the material is illegal and should be taken down. But in others, such as instances of bullying and harassment, or the promotion of self-harm, the content is legal but can cause harm, and users might wish to avoid seeing it.

Reporting potentially harmful content

People continue to play an important role in identifying potentially harmful content, which can then be moderated and labelled as sensitive, particularly where it is ambiguous and difficult for automated systems to pick up. But despite how widely such content is encountered, only two in 10 users say they reported the last piece of potentially harmful content they came across.

The reasons are complex. Sometimes people simply aren't aware that they can report content, or don't know how, which is why Ofcom launched a social media campaign to raise awareness among young people. Other reasons include a feeling that reporting doesn't make any difference, or that it's someone else's responsibility. But another, deceptively simple, factor is the nudges and prompts to report content that users encounter as they browse. Behavioural research has long established that even small changes to the environment in which we make decisions (the choice architecture) can have surprisingly large effects. We set out to measure the difference such prompts can make to the reporting of harmful content.


Channel website: https://www.ofcom.org.uk/

Original article link: https://www.ofcom.org.uk/news-centre/2023/nudging-users-to-report-potentially-harmful-online-content
