News

Review articles cause “dramatic loss” in citations for original research

But there are some influential exceptions.

11 May 2021

Clare Watson

Credit: erhui1979/Getty Images

Researchers have long suspected that review papers poach citations from the original papers they cite. A new analysis of millions of journal articles shows that certain review papers do indeed draw citations away from some, but not all, original works.

The exception is a special subset of so-called ‘bridging’ papers, which not only get a boost in citations from being included in a review article, but also become central hubs for future research, restructuring research fields in the process.

The study, published in American Sociological Review, looked at review articles published in Annual Reviews journals. Spanning the biological, physical and social sciences, the analysis was most revealing for its insights into how review articles consolidate scientific knowledge and transform research communities, says lead author Peter McMahan, a sociologist at McGill University in Montreal, Canada.

“We realized that it was more about this idea of what curation does to bodies of [scientific] knowledge than about the specific fate of articles that are cited by reviews,” says McMahan.

Some papers take a back seat

McMahan and co-author Daniel McFarland, from Stanford University in California, crunched Web of Science citation data on nearly six million articles published between 1990 and 2016 in 1,155 highly cited journals.

By comparing the predicted citation life cycle of each paper with the citations it actually received after being included in a review, they found that most papers experienced a dramatic loss, of nearly 40%, in future citations once they were referenced in an Annual Reviews article.

However, a handful of the cited articles, dubbed ‘bridging’ articles for the way they draw links between otherwise disconnected research fields, received considerably more attention within their fields, as indicated by co-citations, than other papers after being included in one of the 6,495 Annual Reviews articles analysed.

Further analyses of co-citation networks, which trace how research papers are tied to and cited alongside other articles in their field, showed that the research literature in a particular field became more integrated seven years after an Annual Reviews article on it was published.

Distinct research communities dissolved into one large cluster centred around the bridging articles highlighted by reviewers, McMahan says, with these ‘exemplars’ becoming the focal points for future research on the topic.

Taken together, the results suggest that the drop in citations for original works reflects how scientific knowledge progresses, shaped by curatorial review papers that help scientists make sense of the huge volume of research being produced, McMahan says.

As new specialities emerge, “some of the early, original research has to take a backseat to the more schematic view that makes it a comprehensible subfield as opposed to a bunch of disparate research,” he says.

Leading voices author reviews

Although their study analysed only Annual Reviews, and not review articles from other publishers, McMahan and McFarland posit that Annual Reviews journals are distinctly authoritative sources that lend legitimacy to emerging subfields. An Annual Review carries weight, McMahan says, especially for people who are outside or more distant from that particular area.

A 2014 study of biomedical research, for example, compared papers cited in review articles (of any kind) to those not cited and found no difference in their lifetime citations.

University of Copenhagen sociologist Mathias Wullum Nielsen says the American Sociological Review study takes a more complex statistical approach and clearly shows how leading figures authoring Annual Review articles can steer scientific scholars in new directions by connecting research communities that rarely talk to each other.

“We know from other sociology of science literature, that’s where new innovations [emerge] – that’s where the novelty occurs,” he says.

However, the observed effects might be specific to publications from Annual Reviews, and not necessarily a broader trend, Nielsen says.

As McMahan and McFarland note, Annual Reviews are high-impact journals that do not follow standard peer-review processes and that recruit leading voices in a given field to write reviews. Because of this, the findings should also prompt reflection on who is tasked with writing reviews and how that might shape research, Nielsen says.

“These Annual Reviews articles represent expert opinions of influential scholars,” says Nielsen, who published a recent analysis showing that top-cited scientists are increasingly accruing a larger share of all citations, year on year.

“When you recruit experts who are already highly cited and visible in a field [to write reviews], you give them extra influence,” he says.

The American Sociological Review analysis also raises familiar questions about whether citation counts are an adequate measure of research impact.

“Citations are this really potent currency to academics,” says McMahan. But “the impact of research, of any particular project that a team or a lab or an individual is doing – that impact is actually a much richer, larger and more complex thing than can actually be measured by citations.”