BCS

Demonising algorithms won't protect children from social media harm, BCS warns in response to the Online Safety Bill

Blaming social media algorithms for harming children, without equal emphasis on education, risks the Online Safety Bill’s success, the professional body for IT has warned.

Announcing the new regulation of digital platforms today (17 March), Culture Secretary Nadine Dorries said:

“If we fail to act, we risk sacrificing the wellbeing and innocence of countless generations of children to the power of unchecked algorithms.”

Singling out algorithms as responsible for children's online safety was 'passing the buck', BCS, The Chartered Institute for IT said. A combination of ethical design choices in those algorithms, supported by strong digital education and adult media literacy, was more likely to make the internet safer in the long term, BCS added.

Dr Bill Mitchell OBE, Director of Policy at BCS, The Chartered Institute for IT said:

“It’s true that the public don’t trust any organisation – including tech companies and governments – to use algorithms to make decisions about them.

“But blaming the algorithm in this way is passing the buck. It's in how we collectively make design choices that the problems lie. It's when we make badly informed choices, or ones influenced by our own ignorance or unconscious prejudice, that we end up with algorithms that cause harm.

“When algorithms are ethically designed and competently developed, they can genuinely help improve our daily lives and the chances of solving the big problems in the world, such as caring for an ageing population and climate change.”

The Online Safety Bill aims to protect online users from harmful content, with Ofcom acting as regulator; Ofcom also has a duty to promote 'media literacy' among the general public.

BCS welcomed the oversight role of Ofcom and the requirement for platforms to produce risk assessments against certain kinds of harm and to set out what changes they will make to mitigate those risks.

Read the BCS policy briefing on the Online Safety Bill (March 2022).

Professor Andy Phippen, a Fellow of BCS and a specialist in Ethics and Digital Rights at Bournemouth University added:

“The rhetoric around the bill focuses on making children safe, but the entire drive of the proposed legislation is tech sector regulation.

“When I talk to young people about online safety, they usually say they need supportive adults who understand the issues and better education. None have ever demanded that big tech billionaires need to be brought to heel. A group of young people I spoke to recently were clear that good online safety comes from good education and the opportunity to discuss and ask questions, and parents and teachers need to be able to have those discussions.”

Contact the Press Office


Channel website: http://www.bcs.org/

Original article link: https://www.bcs.org/about-us/press-office/press-releases/demonising-algorithms-wont-protect-children-from-social-media-harm-bcs-warns-in-response-to-the-online-safety-bill/
