The public don’t trust computer algorithms to make decisions about them, survey finds

The majority of people do not trust computers to make decisions about any aspect of their lives, according to a new survey. 

Over half (53%) of UK adults have no faith in any organisation to use algorithms when making judgements about them, on issues ranging from education to welfare decisions, according to the poll for BCS, The Chartered Institute for IT.

The survey was conducted in the wake of the UK exams crisis where an algorithm used to assign grades was scrapped in favour of teachers’ predictions. 

Just 7% of respondents trusted the education sector to use algorithms - joint lowest with social services and the armed forces. Confidence in the use of algorithms in education also differed dramatically between age groups: 16% of 18-24-year-olds trusted their use, compared with only 5% of over-55s.

Trust in social media companies’ algorithms to serve content and direct user experience was similar at 8%. Automated decision making had the highest trust when it came to the NHS (17%), followed by financial services (16%) and intelligence agencies (12%), reflecting areas like medical diagnosis, credit scoring and national security. 

Police and ‘Big Tech’ companies (such as Apple and Google) were level, with 11% of respondents having faith in how each uses algorithms to make decisions about them personally.

Older people are less trusting of the general use of algorithms in public life: 63% of over-55s said they felt negative about this, compared with 42% of 18-24-year-olds. Attitudes to computerised decisions in the NHS, private healthcare and local councils also differ strongly by age: 30% of 18-24-year-olds said they trusted the use of algorithms in these sectors, compared with 14% of over-55s.

Over 2,000 people responded to the survey conducted for BCS, The Chartered Institute for IT by YouGov; all were shown a description of algorithms before answering any questions.

Dr Bill Mitchell, Director of Policy at BCS, said:

“People don’t trust algorithms to do the right thing by them – but there is little understanding of how deeply they are embedded in our everyday life. 

“People get that Netflix and the like use algorithms to offer up film choices, but they might not realise that more and more algorithms decide whether we’ll be offered a job interview, are used by our employers to judge whether we’re working hard enough, or even determine whether we might be a suspicious person needing to be monitored by the security services.

"The problem government and business face are balancing people’s expectations of instant decisions, on something like credit for a sofa, with fairness and accounting for the individual, when it comes to life-changing moments like receiving exam grades.

“That’s why we need a professionalised data science industry, independent impact assessments wherever algorithms are used in making high-stakes judgements about people’s lives, and a better understanding of AI and algorithms by the policymakers who give them sign-off.”

View the algorithms report (PDF)

Contact the Press Office


Channel website: http://www.bcs.org/

Original article link: https://www.bcs.org/more/about-us/press-office/press-releases/the-public-don-t-trust-computer-algorithms-to-make-decisions-about-them-survey-finds/
