Chemistry researchers cite their own work the most
But we shouldn't assume selfish motives.
27 April 2020
Researchers in chemistry are the most likely to cite their own research, an analysis of citation data for nearly 400,000 authors across 15 subject areas has shown. Science and technology, physics and biology follow chemistry in the self-citing stakes.
Without exception, the top-scoring authors for self-citations were male, highly productive and in the later stages of their careers. Among the top self-citers in these four fields, the biologist's own work accounted for almost a quarter of the works he cited, while the chemist referenced his own work in 11% of citations.
While self-citing is commonly demonized as a gaming behaviour, the study's authors argue that this view is too simplistic, and that the practice should be brought into the open to allow a better understanding of the bibliometric impact of researchers' work.
Justin Flatt of the University of Helsinki, Finland, and co-authors analyzed the citation data of 385,616 authors in 15 subject areas, using the self-citation index, a metric Flatt proposed in 2017. The findings were published in Scientometrics.
Collectively, these authors had published 3,240,973 papers that attracted 90,806,462 citations, with references to their own work making up around 5% of the total.
“I think we tend to have a very suspicious view of self-citations,” says Flatt. “Instead of viewing them as a form of progress, we often see them as nothing more than self-promotion.”
For example, gaming behaviour is evident among scientists in Italy, who began citing their own work more frequently after a 2010 law required researchers to meet productivity thresholds in order to be promoted.
A 2019 analysis of 100,000 of the most highly cited scientists found a median self-citation rate of 12.7% and identified at least 250 extreme self-citers who reference their own work more than 50% of the time.
But Flatt, a biologist, says that recognizing how self-citing behaviour differs across disciplines can help evaluators distinguish between excessive self-citers and those who are simply building on their own ideas over time.
“We need context,” he says. “I think we have to be careful about trying to use a single summarizing statistic to explain away the behaviour of large populations of authors.”
Keep it open
Self-references are often excluded from citation data to prevent authors from deliberately increasing their h-index, but the authors write that hiding self-citation data in this way “needlessly confuses any attempts to understand the bibliometric impacts of one’s work.”
“We urge academics to embrace visibility of citation data in a community of peers, which relies on nuance and openness rather than curated scorekeeping,” the authors wrote.
No one size fits all
When Flatt and his team calculated scores on the self-citation index (s-index) for each of the 385,616 authors in their study, they found that 98% achieved a score below 5, indicating that they rarely cited their own work.
Almost 2% scored between 6 and 10, and just 0.1% of the authors had an s-index above 10.
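The s-index is constructed analogously to the familiar h-index, but counts only self-citations: an author has an s-index of s when s of their papers have each been self-cited at least s times. A minimal sketch of that calculation (the function name and the input format, a list of per-paper self-citation counts, are illustrative, not taken from the paper):

```python
def s_index(self_citations):
    """Largest s such that s papers each have at least s self-citations.

    `self_citations` is a list with one entry per paper: the number of
    times the author cited that paper in their own later work.
    """
    counts = sorted(self_citations, reverse=True)
    s = 0
    for rank, count in enumerate(counts, start=1):
        if count >= rank:
            s = rank  # at least `rank` papers have >= `rank` self-citations
        else:
            break
    return s

# Example: five papers with these self-citation counts
print(s_index([10, 8, 5, 4, 3]))  # 4 papers have >= 4 self-citations each
```

Under this definition, most authors in the study would score below 5; an extreme self-citer would need many papers each repeatedly self-cited to push the score above 10.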
John Ioannidis, an epidemiologist who has previously published work on self-citing behaviour, says that openly reporting self-citations should be a routine part of research evaluation and could be useful for detecting suspect practices.
“Besides self-citations by the same author, counting self-citations from all co-authors of a paper is an additional useful instrument,” says Ioannidis, who was not involved in the study. “Much inappropriate or excessive self-citing is not done directly by an author but through ‘citation farms’ and networks of scientists jointly promoting their careers.”