POST (Parliamentary Office of Science and Technology)
Online Information and Fake News
Internet search engines and social media platforms are an increasingly popular way of accessing news and information. In 2017, the proportion of UK adults consuming news online exceeded those who watched news on TV (74% versus 69%). This note considers how people access news online, how algorithms (sequences of instructions) and social networks influence the content that users see, and options for mitigating any negative impact.
Social media platforms and Internet search engines enable users to find information that they find most interesting or relevant by filtering content. Information is filtered by both algorithms and user behaviour (e.g. selecting who to follow or which pages to like). There are differing views on the potential effect that these technological changes are having on the opinions of individual users.
Some have suggested that filtering could lead to users only seeing content that conforms to their pre-existing beliefs, and that it could unintentionally limit the range of information that users see. Two phenomena have been proposed:
- Echo-chambers – in which people form social networks with those who largely reflect their own viewpoints.
- Filter bubbles – in which search engines, social media sites and news aggregators automatically recommend content that an individual is likely to agree with, based on the previous behaviour of the user and others.
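The recommendation logic behind filter bubbles can be illustrated with a toy scoring function. This is purely illustrative: real platforms use far more sophisticated models, and every name, topic label and data structure below is invented for the example.

```python
from collections import Counter

def recommend(past_clicks, candidates, k=2):
    """Rank candidate items by overlap with topics the user has
    engaged with before -- a minimal sketch of behaviour-based
    filtering, not any platform's actual algorithm."""
    # Weight each topic by how often it appears in the user's history.
    topic_weights = Counter(
        topic for item in past_clicks for topic in item["topics"]
    )
    # Score each candidate by the summed weight of its topics, so
    # items resembling past behaviour rise to the top.
    scored = sorted(
        candidates,
        key=lambda item: sum(topic_weights[t] for t in item["topics"]),
        reverse=True,
    )
    return [item["title"] for item in scored[:k]]

history = [
    {"title": "A", "topics": ["politics", "uk"]},
    {"title": "B", "topics": ["politics", "eu"]},
]
candidates = [
    {"title": "More politics", "topics": ["politics"]},
    {"title": "Science story", "topics": ["science"]},
    {"title": "UK politics", "topics": ["politics", "uk"]},
]
print(recommend(history, candidates))  # ['UK politics', 'More politics']
```

Because the score depends only on past engagement, content on unfamiliar topics (here, the science story) never surfaces, which is the self-reinforcing loop the filter-bubble hypothesis describes.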
However, a growing body of research suggests that these filtering effects do not fully eliminate exposure to attitude-challenging information, for example because users on social media typically have a diverse social network spanning multiple geographic regions.
Concerns have been raised internationally by politicians, journalists and others about the spread of false information (“fake news”) online, and the effect that it may have on political events such as elections. There is no clear, agreed definition for fake news. Generally, it is defined as content intended to misinform or influence the reader. It is often financially or politically motivated.
The UK Government has no specific policies for addressing fake news, filter bubbles or echo-chambers. Attempts to address these issues have mainly focused on fake news, and have been largely industry-led, although other approaches include regulation and user education.
- Social media platforms and Internet search engines have made it easier to produce, distribute and access information and opinions online.
- These technologies, combined with user behaviour, filter the content that users see. On the one hand, some studies suggest that this limits users’ exposure to attitude-challenging information and that echo-chambers or filter bubbles may form. On the other hand, other studies argue that users still encounter a wider range of information online than they would offline.
- Online fake news has the potential to confuse and deceive users, and is often financially or politically motivated.
- UK efforts to address these issues are largely led by industry and focus on fake news. They include better identification, fact-checking and user education.
POSTnotes are based on literature reviews and interviews with a range of stakeholders and are externally peer reviewed. POST would like to thank interviewees and peer reviewers for kindly giving up their time during the preparation of this briefing, including:
- Prof Rob Procter, University of Warwick*
- Dr Jonathon Bright, Oxford Internet Institute, University of Oxford*
- Prof Philip Howard, Oxford Internet Institute, University of Oxford
- Sam Woolley, Oxford Internet Institute, University of Oxford
- Monica Kaminska, Oxford Internet Institute, University of Oxford*
- Dr Helena Webb, Computer Science, University of Oxford*
- Dr Richard Fletcher, Reuters Institute for the Study of Journalism, University of Oxford*
- Prof Adam Joinson, University of Bath*
- Dr Emma Williams, University of Bath*
- Dr Ana Levordashka, University of Bath*
- Dr Felipe Romero Moreno, University of Hertfordshire*
- Dr Frederik Zuiderveen Borgesius, University of Amsterdam*
- Amy Sippett, Full Fact
- Phoebe Arnold, Full Fact*
- Claire Wardle, First Draft News
- Jessica Montgomery, The Royal Society*
- Fergus Bell, Dig Deeper Media*
- Emma Collins, Facebook*
- Karim Palant, Facebook
- Nick Pickles, Twitter*
- Dave Skelton, Google
- Niall Duffy, Independent Press Standards Organisation*
- Jim Waterson, BuzzFeed News*
- Patrick Worrall, Channel 4 News*
- Department for Digital, Culture, Media and Sport*
- Department for Education*
- Cabinet Office
*Denotes people who acted as external reviewers of the briefing.