Looking ahead to online regulation: It’s time to rethink transparency reporting
As Ofcom prepares to take on new powers to protect online users from harm, Anna-Sophie Harling, Declan Henesy and Eleanor Simmance from our online safety team discuss the power of transparency reporting.
Their full paper on this topic has been published in the Journal of Online Trust and Safety.
Online services have not faced comprehensive regulatory oversight of their trust and safety practices, leaving the public in the dark about how platforms make decisions, how they design their products, and how those choices affect people. The UK’s Online Safety Bill, which is currently making its way through Parliament, will give Ofcom new regulatory tools, including mandatory transparency reporting.
Many online platforms already publish voluntary transparency reports, with companies choosing how, what and when they report. These reports provide only a partial account of what’s happening inside companies and across the platforms they operate.
Under the Online Safety Bill, Ofcom will be required to issue transparency notices to a subset of in-scope platforms. These notices may be tailored to each particular service, specifying the information and data different platforms must publish, the methodology used, and the format in which the information is gathered and published. Ofcom will also be obligated to publish its own transparency reports each year, based on the information published in platform transparency reports.
We believe there is a benefit in rethinking the approach to transparency reporting. The publication of key information can help drive change in regulated services by exposing good and bad practice and ensuring the industry learns from this. Revelations about online platforms failing to prioritise user safety can have immediate impacts on user numbers, advertising spend, and share prices. Targeted transparency requirements will therefore be a major tool for driving behaviour change.
What should we measure?
We’re thinking carefully about what information we want platforms to publish in their transparency reports. We may require them to publish metrics around the prevalence and dissemination of illegal or harmful content, and the number of users who have encountered this content. We could ask them to explain how they enforce their policies and community guidelines, publish information about user reporting systems and user empowerment tools, or disclose details about content moderation technologies and user identity verification. Other areas of focus might be corporate governance structure and decision-making, risk assessment outcomes, and internal key performance indicators across teams.
The information that platforms currently publish provides some insight, but it has its limitations. For example, a transparency report might state that “140,000 pieces of hate speech were removed in Q1.” If this figure goes up in Q2, does that mean there was more hate speech on the platform than in Q1? Does it mean that the systems in place to identify this content became more effective? Does it mean that the platform changed its definition of hate speech in Q2, resulting in a greater number of pieces of content violating its rules? What was the impact of major international, national or local events on the amount of hateful content uploaded or resurfaced by users in Q2? Even if overall levels of violating content are low, there could still be a risk of harm if users with particular vulnerabilities or characteristics, such as children, are more likely than average to be repeatedly exposed to it.
There are a lot of challenges associated with metrics, so we’ll have to work hard to get them right. We will also consider how transparency reporting can go beyond content moderation to address the different ways that services protect their users from online harms and highlight good and bad practice, all the while keeping in mind the potential risks around arming bad actors with information on how to circumvent safety systems.
Other regulators around the world are in the process of implementing their own transparency reporting regimes. Ofcom will have to think about the extent to which we want to align our transparency reporting requirements with those of other regulators.
The UK Online Safety Bill gives Ofcom the power to tailor transparency reporting requirements to each platform, which means that we will have the ability to go above and beyond other standardised reporting regimes. Product changes can happen at a global level, meaning that a successful transparency regime might nudge platforms to make systemic changes that impact users around the world.
Transparency will be a powerful and essential tool in our regulatory arsenal. As the future online safety regulator, we plan to think long and hard about the numerous challenges and trade-offs associated with mandatory transparency reporting. A carefully designed transparency regime could transform Ofcom’s ability to hold platforms accountable and fundamentally change the way the industry prioritises the safety of its users.
Original article link: https://www.ofcom.org.uk/news-centre/2023/looking-ahead-to-online-regulation-transparency-reporting