Technology feature

Automated software saves researchers valuable hours

Online tools are lightening the load for authors and journal editors.

Brian Owens

12 October 2017

Credit: Westend61/Getty

An international partnership is developing online tools that could save authors and journal editors hours in manuscript checking, while ensuring, with the help of peer review, that published science is high-quality, replicable, and useful.

Before submitting a paper, researchers often use reporting guidelines, such as the CONSORT guideline for randomized clinical trials, to meet the highest standards of publication. But with almost 400 guidelines in health research alone, each covering a different kind of study, picking the right one can get complicated.

“People love them, but it can be difficult to know which ones to use,” says Caroline Struthers, education and training manager at the EQUATOR Network, a group dedicated to improving the quality of health research. EQUATOR maintains a library of reporting guidelines. “Sometimes you need more than one, and some of the language can be a bit ‘researchy’,” requiring a good understanding of technical jargon.

Enter the algorithms.

EQUATOR has teamed up with James Harwood, founder of the manuscript-checking company Penelope, to make choosing a guideline easier. Harwood designed a simple online questionnaire, which asks researchers about their work — such as whether it involved human subjects — and then recommends a guideline. The questionnaire, first tested by BioMed Central in 2016, is available on the Penelope website.
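The article does not reveal the questionnaire’s internal logic, but a tool like this can be pictured as a rule-based mapping from answers to guidelines. The Python sketch below is a minimal illustration in that spirit; the questions, answer keys and fallback message are assumptions, and only the guideline names are real entries in the EQUATOR library.

```python
# Minimal sketch of a questionnaire-driven guideline recommender,
# in the spirit of the EQUATOR/Penelope tool described above.
# The answer keys and rules are illustrative assumptions; only the
# guideline names (CONSORT, PRISMA, STROBE, ARRIVE) are real.

def recommend_guideline(answers: dict) -> str:
    """Map questionnaire answers to a suggested reporting guideline."""
    if answers.get("randomized_trial"):
        return "CONSORT"   # randomized clinical trials
    if answers.get("systematic_review"):
        return "PRISMA"    # systematic reviews and meta-analyses
    if answers.get("observational_study"):
        return "STROBE"    # observational studies in epidemiology
    if answers.get("animal_study"):
        return "ARRIVE"    # in vivo animal research
    return "No single match; consult the EQUATOR library"

print(recommend_guideline({"randomized_trial": True}))  # -> CONSORT
```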

The goal is to create tools that can check whether manuscripts comply with the guidelines, potentially saving journal editors and authors rounds of lengthy rewrites and ensuring that the final article contains the data and information needed to be useful to the scientific community. The need is real: a survey by Nature last year found that 70% of scientists have tried and failed to reproduce another researcher’s work.

Struthers estimates that it will be several years before the software is able to process the complex guidelines. In the meantime, Penelope can already check manuscripts against journal submission guidelines, which are more straightforward than reporting guidelines such as CONSORT, and flag any missing pieces. The tool, also called Penelope, uses text-mining techniques such as pattern matching, probabilistic algorithms and natural-language processing to determine whether a manuscript meets a particular journal’s requirements.
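As a rough illustration of the pattern-matching side of such a check, the Python sketch below scans a manuscript for required sections and an ethics statement. The section list, the regular expression and the flag wording are illustrative assumptions, not Penelope’s actual rules.

```python
import re

# Illustrative pattern-matching checks of the kind described above.
# The required sections and the ethics regex are assumptions, not
# Penelope's actual rules.
REQUIRED_SECTIONS = ["abstract", "methods", "results", "references"]

ETHICS_PATTERN = re.compile(
    r"ethic(s|al)\s+(approval|statement|committee)|institutional review board",
    re.IGNORECASE,
)

def check_manuscript(text: str) -> list[str]:
    """Return human-readable flags for anything that appears to be missing."""
    flags = []
    lowered = text.lower()
    for section in REQUIRED_SECTIONS:
        if section not in lowered:
            flags.append(f"Missing section: {section.title()}")
    if ETHICS_PATTERN.search(text) is None:
        flags.append("No ethics statement found")
    return flags

print(check_manuscript("Abstract ... Methods ... Results ... References ..."))
# -> ['No ethics statement found']
```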

The tool can check, for example, whether all the relevant sections are present, whether an ethics statement has been provided, and whether a trial included randomization and blinding. It can even pull out some of the statistics and check that they are correct, says Harwood.
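The article does not say how the statistics check works. One plausible mechanism, sketched below, is to parse reported test results and recompute the p-value, in the spirit of statcheck-style tools; the regular expression, the two-tailed t-test assumption and the tolerance are all illustrative, not Penelope’s method.

```python
import re
from scipy import stats

# A statcheck-style consistency check: re-derive the p-value from a
# reported t statistic and degrees of freedom, then compare it with the
# p-value the authors state. The regex, the test choice (two-tailed t)
# and the tolerance are illustrative assumptions.
STAT_RE = re.compile(r"t\((\d+)\)\s*=\s*(\d+\.?\d*),\s*p\s*=\s*(\d*\.\d+)")

def check_stats(text: str, tol: float = 0.01) -> list[str]:
    """Flag reported t-test results whose p-values do not recompute."""
    flags = []
    for df, t, p_reported in STAT_RE.findall(text):
        p_computed = 2 * stats.t.sf(float(t), int(df))  # two-tailed p-value
        if abs(p_computed - float(p_reported)) > tol:
            flags.append(
                f"t({df}) = {t}: reported p = {p_reported}, "
                f"recomputed p = {p_computed:.3f}"
            )
    return flags

print(check_stats("The effect was significant, t(24) = 2.70, p = 0.05."))
# -> flags the mismatch: the recomputed two-tailed p is about 0.012
```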

The journals Addiction and BMJ Open have been prompting authors to use Penelope for the past year. Hundreds of authors have used the service so far, and Molly Jarvis, the editorial manager of Addiction, says the results have been promising.

“We are sending back fewer submissions because they don’t meet our basic requirements. The general quality of submissions — in terms of including things we ask for such as declaration of interest, structured abstract, references in correct format — has improved in the past year,” she says.

Automation is intended to supplement the journal review process, not replace it, says Harwood. “It’s one thing to ask a computer whether something is in a manuscript, but whether it is good science is something else; that needs a human peer reviewer.”