Metrics cannot replace peer review in the next REF

9 Jul 2015 01:55 PM

The findings of the Independent Review of the Role of Metrics in Research Assessment and Management have been published.

The four UK funding bodies that manage the Research Excellence Framework (REF), through which £1.6 billion of research funding is distributed each year, have been told by an independent review that ‘no set of numbers is likely to be able to capture the nuanced judgments that the REF process currently provides’, and that it is not currently feasible to assess research outputs or impacts in the REF using quantitative indicators alone.

The findings of the Independent Review of the Role of Metrics in Research Assessment and Management are based on 15 months of evidence-gathering and consultation, including the most comprehensive analysis to date of the correlation between REF scores at the paper-by-author level and a set of 15 bibliometrics and altmetrics, undertaken by HEFCE with data provided by Elsevier. This analysis covered 149,670 individual outputs, and found only weak correlations between REF scores and individual metrics, significantly lower correlations for more recently published works, and highly variable coverage of metrics across subject areas. The analysis concludes that no metric can currently provide a like-for-like replacement for REF peer review.

In addition, over 150 responses to the review’s call for evidence uncovered considerable scepticism among researchers, universities, representative bodies and learned societies about the broader use of metrics in research assessment and management. Concerns include the ‘gaming’ of particular indicators, uneven coverage across individual disciplines, and effects on equality and diversity across the research system.

The review was chaired by James Wilsdon, professor of science and democracy at the University of Sussex, supported by an independent and multidisciplinary group of experts in scientometrics, research funding, research policy, publishing, university management and research administration. Its report, ‘The Metric Tide’, takes a closer look at the potential uses and limitations of research metrics and indicators, exploring the use of metrics within institutions and across disciplines.

The review has identified 20 specific recommendations for further work and action by stakeholders across the UK research system. The recommendations, provided in full in the report, propose action in the following areas: supporting the effective leadership, governance and management of research cultures; improving the data infrastructure that supports research information management; increasing the usefulness of existing data and information sources; using metrics in the next REF; and coordinating activity and building evidence.

Professor James Wilsdon, who chaired the review, said:

‘Metrics touch a raw nerve in the research community. It’s right to be excited about the potential of new sources of data, which can give us a more detailed picture of the qualities and impacts of research than ever before. But there are also real concerns about harmful uses of metrics such as journal impact factors, h-indices and grant income targets. A lot of the things we value most in academic culture resist simple quantification, and individual indicators can struggle to do justice to the richness and diversity of our research.

‘The metric tide is rising. But we have the opportunity – and through this report, a serious body of evidence – to influence how it washes through higher education and research. We are setting out a framework for responsible metrics, which I hope research funders, university leaders, publishers and others can now endorse and carry forward.’

David Sweeney, Director of Research, Education and Knowledge Exchange, HEFCE, said:

‘This review provides a comprehensive and soundly reasoned analysis of the current and future role of metrics in research assessment and management, and should be warmly welcomed. The findings and recommendations of this review are clearly far-reaching, with implications for a wide range of stakeholders, including research funders, governments, higher education institutions, publishers and researchers.

‘We will discuss the specific REF-related findings and recommendations with the other UK HE funding bodies to agree next steps, including as part of preparations for consulting on a future exercise later in 2015. We will also be looking to work actively with other stakeholders, where noted in the recommendations, to address specific challenges and to take forward this broader agenda as part of a collective effort.’

Notes

1. The four UK HE funding bodies that manage the REF are: HEFCE, the Higher Education Funding Council for Wales, the Scottish Funding Council, and the Department for Employment and Learning (Northern Ireland).

2. The Independent Review of the Role of Metrics in Research Assessment and Management was set up in April 2014 at the request of the Rt Hon David Willetts, then the UK minister of universities and science. Full details of the review are available.

3. Professor Wilsdon was supported by an independent steering group with the following members:

Liz Allen (Head of Evaluation, Wellcome Trust)

Eleonora Belfiore (Associate Professor of Cultural Policy, University of Warwick)

Sir Philip Campbell (Editor-in-Chief, Nature)

Professor Stephen Curry (Department of Life Sciences, Imperial College London)

Steven Hill (Head of Research Policy, HEFCE)

Professor Richard Jones FRS (Pro Vice-Chancellor for Research and Innovation, University of Sheffield) – representative of the Royal Society

Professor Roger Kain FBA (Dean and Chief Executive, School of Advanced Study, University of London) – representative of the British Academy

Simon Kerridge (Director of Research Services, University of Kent) – representative of the Association of Research Managers and Administrators

Professor Mike Thelwall (Statistical Cybermetrics Research Group, University of Wolverhampton)

Jane Tinkler (London School of Economics and Political Science)

Ian Viney (Head of Evaluation, Medical Research Council) – representative of Research Councils UK

Professor Paul Wouters (Centre for Science and Technology Studies, University of Leiden)

4. ORCID is a non-proprietary alphanumeric code to uniquely identify academic authors. Its stated aim is to aid ‘the transition from science to e-Science, wherein scholarly publications can be mined to spot links and ideas hidden in the ever-growing volume of scholarly literature’. ORCID provides a persistent identity for individual people, similar to that created for content-related entities on digital networks by digital object identifiers (DOIs).

Read the full report, ‘The Metric Tide’.