High-impact papers score well in REF, study finds
Researchers in the UK find that the impact factors of the journals in which articles are published align with expert panels’ judgements of research quality.
9 October 2017
A study has found that, in science-based subjects, universities with a large proportion of articles published in high-impact-factor journals are likely to score highly in the Research Excellence Framework (REF), the United Kingdom’s system for assessing the quality of university research.
As part of the REF process, university faculty were asked to submit outputs — such as journal articles, book chapters or exhibitions — to a panel of experts, who scored them for research quality. The entire process cost UK funding bodies and institutions an estimated £246 million.
Kushwanth Koya and Gobinda Chowdhury, information and computer scientists at Northumbria University, who authored the recent study published in PLOS ONE, describe the REF as “a huge activity, with lots of meetings and time spent reading the papers, scoring them and crosschecking them.”
They suggest, based on their analysis, that using journal impact factor to assess the quality of research outputs might make the process “cheaper, easier and more transparent.”
However, researchers involved with the national-level evaluation warn against reducing the process to a metric. Ole Petersen, a physiologist at Cardiff University and chair of the REF14 subpanel on biological sciences, says that doing so “would send the wrong message to academics: that it’s all about some quantitative parameter.”
“Our assessment was based on reading the articles and judging whether we felt the research was new, important and reliable,” says Petersen. “Impact factor is just one of many possible quantitative indicators that journals use, and it does not necessarily say anything about any individual article in the journal.”
The next round of the REF, for which consultation is underway, will take place in 2021.
In their study, Koya and Chowdhury analysed open-access data on REF14 submissions from the Higher Education Funding Council for England (HEFCE), which is responsible for allocating research funding to English universities and colleges, and impact factor data from Thomson Reuters, now Clarivate Analytics.
They found a statistically significant correlation between the percentage of papers submitted to high-impact-factor journals and the percentage of top-ranking research outputs as determined by the REF panels, in the fields of clinical medicine, physics, psychology, psychiatry and neuroscience.
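An institution-level correlation of this kind can be computed with Pearson’s coefficient. The sketch below uses entirely hypothetical figures (not the study’s data) to show the shape of the calculation: each pair is one university’s percentage of submissions in high-impact-factor journals against its percentage of outputs rated top-ranking by a REF panel.

```python
# Hypothetical institution-level data, for illustration only:
# (% of submitted papers in high-impact-factor journals,
#  % of outputs rated top-ranking by the REF panel)
data = [
    (10, 8), (25, 20), (40, 35), (55, 50), (70, 62), (85, 80),
]

def pearson(pairs):
    """Pearson correlation coefficient for a list of (x, y) pairs."""
    n = len(pairs)
    xs = [x for x, _ in pairs]
    ys = [y for _, y in pairs]
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in pairs)
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson(data)
print(round(r, 3))  # near 1 for this strongly linear toy data
```

A coefficient near 1 indicates a strong positive association, though, as the article notes, correlation alone cannot distinguish between competing explanations of why the two percentages move together.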
For research in classics, anthropology and development studies, they observed no correlation. Journal articles are not the primary means of communication in these subjects, the authors explain.
A similar correlation between REF scores and journal impact factors was found in 2015 in The Metric Tide report, an analysis of REF14 results commissioned by HEFCE.
However, Steven Hill, head of research policy at HEFCE, cautions institutions against adjusting their submission practices based on findings of a correlation alone. “It would be unwise for universities to select their outputs for submission on this basis,” he says. “An equally plausible explanation of the data is that high-quality research both scores well in the REF and is more likely to be submitted to journals with high citation rates.”
Chowdhury and Koya are currently looking for similar patterns across all 36 REF subject areas.
In the same study, they also calculate the monetary value of research outputs and find that outputs with a score of four stars (deemed top-ranking in the REF) were awarded four times as much funding as those with a score of three stars (internationally excellent).
Across subject areas, engineering-based subjects, pure sciences and environmental sciences were the highest earners in REF14 (£14,639 for a top-ranking output), while humanities, language and area studies earned the least (£7,504 for a top-ranking output).