Clarivate Analytics’s new author profiles tool to include peer-review activity
The aim is to give users more context around citation performance.
1 February 2022
Dalmeet Singh Chawla
Clarivate Analytics, the UK-based firm that owns the scholarly database Web of Science (WoS), has begun automatically generating author profiles, including information about peer-review activity for researchers whose papers are indexed on the database.
The peer-review information will be integrated from early 2022. It will come primarily from Publons, a free-to-use website that allows researchers to claim credit for peer review, which was bought by Clarivate in 2017 and now has more than three million users.
Previously, WoS users found papers by performing a document search in the WoS database, although an author search to find individual authors and their academic activities has also been possible since 2019. The tool is currently free to use, but the firm will offer a version with additional features for a fee.
The WoS tool is aimed particularly at evaluating researchers’ grant-funding applications, as well as assessing academics for hiring and promotions, its creators say. It will help assessors spot any conflicts of interest and give them a fuller picture of an academic’s scholarly activities, they note.
The addition of the new Publons feature makes the WoS tool the first to pull together details about researchers’ publishing records, their collaborators, the journal editorial boards they serve, and other professional research activities, such as the manuscripts they have refereed for journals, and the grant reviews they have completed.
There is a need for such a tool, says Ludo Waltman, deputy director of the Centre for Science and Technology Studies at Leiden University in the Netherlands, who has used it, but isn’t involved with the project. There have been other attempts to create such profiles, but the peer-review integration function is unique, he says.
The tool may also be useful for researchers conducting text-mining studies, Waltman notes.
Jeroen Huisman, a higher-education researcher at Ghent University, in Belgium, who has conducted research on the internal and external pressures that affect institutions’ public profiles and their market positions, says there isn’t much literature on the topic of researcher profiles. “Anything that researchers or their institutions can do to show the picture to the outside world is helpful,” he says.
“We’re very keen to try and provide tools that promote a responsible way of evaluating research,” says Philip Reimann, a senior product manager at Clarivate Analytics. For instance, in 2021, Clarivate added an author impact beam plot, a visualization feature that showcases researchers’ publication and citation impact, to the tool. The aim is to give users more context around citation performance and steer them away from single-point metrics, says Jeffrey Clovis, senior director of solutions support at Clarivate.
Overreliance or misuse of metrics has been a thorny issue in research evaluation that has gained more attention in the past decade or so — particularly through such advocacy as the Leiden Manifesto and the San Francisco Declaration on Research Assessment (DORA).
Researchers can indicate that a profile is theirs, thereby earning the account a ‘green tick’ in WoS. They can also correct errors or add missing information, such as awards received, Reimann says. With Publons functions merged into the tool, academics will be able to manually add information about their peer-review activities, as they currently do with Publons, or about other scholarly outputs such as published data sets or code.
Researchers using the tool can also link their profiles to their ORCID iD, a unique digital identifier. Doing so means that when a researcher adds a paper to their ORCID profile, it is automatically added to their WoS author profile, and vice versa, Reimann says. “We just really want to minimize the replication of effort.”
Waltman says that although integrating peer-review information is an interesting step, the tool doesn’t cover other non-traditional research outputs, such as preprints and preregistered reports. He wants Clarivate and other companies to broaden the information they offer “so that it becomes more and more clear that relying on any single piece of information, like a h-index, doesn’t really make sense”.
Giving visibility to peer review is an important step in fixing its lack of recognition as a component of research performance, Waltman says. It recognizes that referee reports are “important scholarly outputs that need to be treated in a similar way we treat other types of publications”.