How altmetrics could help level the playing field for women in STEM

Traditional measures of success favour male researchers. Altmetrics could offer an alternative to help democratize academic evaluation.

25 March 2021

Bjarne Bartlett, Julie Fortin, Michael Kantar, Michelle Tseng & Zia Mehrabi


Mykyta Dolmatov/Getty Images

Academic hiring, promotion and tenure decisions are based on performance, which in most institutions is evaluated via metrics that disproportionately favour men.

Unconscious and conscious gender biases have been uncovered in almost all traditional academic evaluation metrics, including citation scores, grant allocations, and reference letters.

For example, men self-cite at much higher rates, which can lead to higher h-index scores. Reference letter writers tend to use more stand-out adjectives for men and more doubt-raising adjectives for women, which can affect how candidates are perceived by hiring committees.
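To see why self-citation inflates this metric, recall how the h-index is computed: it is the largest h such that h of an author's papers each have at least h citations. A minimal sketch, with made-up citation counts, shows how a few self-citations added to borderline papers can bump the score:

```python
def h_index(citations):
    """h-index: the largest h such that h papers have >= h citations each."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(cites, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation counts for one author's papers.
papers = [10, 8, 5, 4, 3, 3, 1]
print(h_index(papers))  # 4: four papers have at least 4 citations each

# A few self-citations on the borderline papers raise the score.
papers_with_self_cites = [10, 8, 5, 5, 5, 3, 1]
print(h_index(papers_with_self_cites))  # 5
```

Because the metric is so sensitive to citations near the threshold, even small gendered differences in self-citation rates can compound into visible gaps.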

These biases can manifest in unequal allocation of in-demand equipment and major prizes, and a slower uptake of innovative ideas.

Women have been shown to be significantly more likely to drop out of academia due to issues such as pay gaps and family responsibilities, and to win fewer major medical grants.

Altmetrics (short for "alternative metrics") have emerged as a tool for quantifying the amount of interest a paper garners on digital platforms such as online news and social media. Altmetric is one of the largest aggregators of altmetric scores, and hundreds of journals now publish the Altmetric Attention Score for individual journal articles.

Because of the democratizing effect of broad online audiences, altmetrics could help level the playing field for women in science, technology, engineering and mathematics (STEM).

In our study, published in the journal Scientometrics, we analyzed the Altmetric Attention Score of more than 200,000 articles published in seven major journals. We found that it displayed no clear gender bias.

Testing bias

We examined 208,804 original research articles published between 2011 and 2018 in Science, PNAS, PLOS One, New England Journal of Medicine (NEJM), Nature, Cell, and BioRxiv.

These journals represent high-quality journal types, specifically general interest (Nature, Science, PNAS), open access (PLOS One), disciplinary (NEJM, Cell), and a preprint server (BioRxiv).

We extracted author names, journal name, publication date and the Altmetric Attention Score one year after publication.

We used the genderizeR algorithm, developed by data scientist Kamil Wais in 2016 as a way to better explore bibliometric data, to assign gender to author names based on a probability score. We acknowledge that this approach carries a small error rate.
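The study used the R package genderizeR; purely as an illustration of the underlying idea, the sketch below shows probability-based assignment in Python. The name-probability table and the 0.9 threshold are hypothetical, not values from the study:

```python
# Illustrative probability table: for each first name, the most likely
# gender and the probability that a bearer of the name has that gender.
# These entries are invented for the example.
NAME_PROBS = {
    "julie": ("female", 0.98),
    "michael": ("male", 0.97),
    "alex": ("male", 0.56),  # ambiguous name: low probability
}

def assign_gender(first_name, threshold=0.9):
    """Return the predicted gender when confidence exceeds the threshold,
    otherwise None (the name would be excluded from the analysis)."""
    entry = NAME_PROBS.get(first_name.lower())
    if entry is None:
        return None  # name not in the reference data
    gender, prob = entry
    return gender if prob >= threshold else None

print(assign_gender("Julie"))  # female
print(assign_gender("Alex"))   # None: too ambiguous to classify
```

Thresholding like this trades coverage for accuracy: ambiguous or unknown names are dropped rather than guessed, which is one source of the small error rate acknowledged above.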

Unlike traditional citation metrics, such as the 10 featured in the image below, we found no evidence of bias for Altmetric Attention Score across major journals, with equal attention afforded to articles authored by men and women alike.

Credit: Bjarne Bartlett et al.

The exception to this rule is the journal Science, which showed marked gender bias against women in 2018, as shown in the graphs below.

The graphs show the difference between Altmetric Attention Scores for male and female authors. Positive values indicate bias in favour of women, and negative values indicate bias in favour of men.

The grey ribbons represent 95% confidence intervals, such that if the interval overlaps zero, there is no evidence of bias.

More analysis is needed to understand why Science papers appear to be more affected by bias than the other journals.

Credit: Bjarne Bartlett et al.

Meagan Phelan, executive director of Science Press Package at the Science Family of Journals, who was not involved in the study, told Nature Index that the journal is actively seeking to move beyond longstanding systemic inequities to ensure greater equity and visibility for all authors.

“Leadership at Science has placed a great priority on understanding the racial and ethnic diversity of our ecosystem,” says Phelan.

What does this mean for scientists?

Altmetrics are part of a suite of new big data-driven tools that could help to establish a more equitable system for the promotion, tenure and hiring of researchers in STEM.

Alone, altmetrics do not capture the entire impact of a particular paper or research work: they complement traditional citation metrics rather than replacing them. They do, however, provide a good indication of perceived impact, which can influence funding decisions.

Yet public engagement is rarely evaluated in institutional hiring and promotion structures.

We believe institutionalizing attention metrics for public engagement could have the dual benefit of fairness and of recognizing science communication efforts within universities. Beyond that, we believe further research is needed to examine why altmetrics did not show evidence of substantial gender bias for most journals.

If we can identify the mechanisms that help to eliminate gender bias in altmetrics, perhaps we can apply those mechanisms to improve other performance assessment tools, thus ensuring fairer evaluation of scientists and academics across the board.

This article has been contributed by members of the Nature Index community.

Julie Fortin, Institute for Resources, Environment, and Sustainability, University of British Columbia, Vancouver, Canada

Bjarne Bartlett, Department of Molecular Biosciences and Bioengineering, University of Hawai'i at Mānoa, Honolulu, USA

Michael Kantar, Department of Tropical Plant and Soil Sciences, University of Hawai'i at Mānoa, Honolulu, USA

Michelle Tseng, Departments of Botany and Zoology, Biodiversity Research Centre, University of British Columbia, Vancouver, Canada

Zia Mehrabi, School of Public Policy and Global Affairs, University of British Columbia, Vancouver, Canada

