Tight guidelines needed to make sure new Online Safety law is a success, says professional body for IT

13 May 2021 03:53 PM

The government needs to be clear about what makes online material harmful if new laws are to be effective, says BCS, The Chartered Institute for IT. This follows the publication of the Online Safety Bill, which could lead to large fines for tech companies that fail to protect their social media users from harm.

Dr Bill Mitchell, Director of Policy at BCS, The Chartered Institute for IT, said yesterday:

“This is a positive step that will reassure parents and increase online safety and accountability. However, the proposed Bill will be challenging in practice unless there is robust and objective guidance for social media platforms on which legal online content counts as harmful and must be removed.

"That must be informed by a comprehensive public debate on how we balance the need to limit online harm and at the same time nurture freedom of speech and the freedom to disagree in a civilised manner, which underpins a democratic society.

“We fully support the Government’s focus on tackling child abuse and racist and misogynistic abuse online, while restating that people from a wide range of backgrounds need more help with online safety, and indeed with access to the benefits of the internet.

“We believe the UK won’t reach its full digital potential until everyone who is willing and able is equipped with the skills and opportunity to use the internet safely. Lack of access to digital technology is an ‘offline harm’, and further coordinated government action to close the digital divide is needed to understand who is being left behind and how they can be included.”

When the law comes into effect, Ofcom, in its new role as online safety regulator, will have the power to fine companies up to £18 million or 10 per cent of their annual global turnover, whichever is higher – a figure that could run into billions of pounds for larger companies. The Bill also includes a deferred power making senior managers of firms criminally liable for failing to follow a new duty of care. Ofcom will also have the power to block access to sites, the Government said.

Professor Andy Phippen FBCS, Professor of IT Ethics and Digital Rights at Bournemouth University and a member of the BCS Law Specialist Group Committee, said yesterday:

“As it stands, the Bill focuses almost entirely on providers fixing and safeguarding against potential harms; there is little mention of other stakeholders with influence and social capital in this area.

“Young people tell me they want better education, skills and knowledgeable adults they can talk to about their concerns, rather than calling on platforms to adopt an intangible duty of care.

"We can’t make kids safe online, but we can make them informed about risk and help them mitigate them. Industry are responsible for providing them with the tools to help mitigate risk, but can’t solve it on their own.”

“Educational organisations, civil society and many other sectors are key not only to understanding the nature of these problems, but also to delivering better digital education and guidance and to building functional resilience to online harms.

“We need more emphasis on creating opportunities to harness the collective skills across our communities, and the proposed legislation should do that.”
