How can institutions and funders help to police questionable research practices?
A new toolkit is being trialed as a way to bring clarity to issues of research misconduct and dishonest practices.
7 September 2021
Dalmeet Singh Chawla
A pilot study at universities and private and public funding agencies in Europe is testing a new toolbox of topics aimed at helping institutions draw up plans to curb shoddy or dishonest research practices.
The pilot is part of the European Union-funded Standard Operating Procedures for Research Integrity (SOPs4RI) project, set up to “stimulate transformational processes across European research institutions” to cultivate research integrity and reduce questionable research practices (QRPs).
Several studies have found that QRPs, such as cherry-picking data, withholding negative findings and concealing conflicts of interest, are common. A recent large-scale survey conducted in the Netherlands reported that more than half of the respondents had participated in at least one QRP in the previous three years; around 8% of researchers surveyed admitted to falsifying or fabricating data.
It is a condition of obtaining funding from the EU that an institution draws up a research-integrity promotion plan, which explains how it is managing the issue, says Joeri Tijdink, a psychiatry clinician and researcher at VU Amsterdam in the Netherlands, and one of the researchers involved in the project.
“Institutions have not [got] a lot of guidance [on] how they can improve research integrity with guidelines,” he says, referring to the SOPs4RI project. “We are the first big project that is trying to provide this to institutions.”
Topics covered by the toolbox – including research environment, education and training, declaration of interests, and supervision and mentoring – were decided following 30 focus group interviews in eight countries, says Mads Sørensen, who studies research integrity at Aarhus University in Denmark and is also a researcher on the project.
Institutions are encouraged to draw on the prompts and guidelines in the toolbox when writing their own research-integrity plans. How much importance is placed on each topic varies by discipline within institutions, Sørensen notes.
For instance, the three topics most important to humanities scholars are research environment (described as “collegiality, openness, reflection and shared responsibility in the work environment”), declaration of competing interests, and education and training in research and technology. In the social sciences, supervision, mentoring, data management and dealing with breaches were also emphasized, Sørensen says.
The pilot study will explore how useful the toolbox could be for universities and funders in policing research integrity. Sørensen admits that it will be difficult to measure the precise impact of the toolbox, but hopes to see some indications that it is providing benefits.
Nele Bracke, research-policy advisor at Ghent University in Belgium – one of the institutions that is taking part in the pilot – says the institution has been training experts on research integrity who can, in turn, train others. The aim, she says, is to educate students early in their careers, such as while they are still studying bachelor’s and master’s degrees.
Quality of research is not only the responsibility of individual researchers, she says. “It's also a responsibility of the university as a system that needs to create an environment and a setting that fosters research integrity,” says Bracke.
As yet, no university or funder has formally adopted the SOPs4RI toolbox.
No one-size-fits-all solution
The project stops short of providing templates for exactly what institutions should do. “We do not want to be extremely prescriptive when it comes to what goes into a research-integrity promotion plan,” says Niels Mejlgaard, a political scientist, also at Aarhus, who is involved with SOPs4RI. “But we think that it is a fair requirement or suggestion that each of these areas is tackled.”
Whether having such a plan in place will make a discernible difference in improving the integrity of research practices is open for debate. But, based on his clinical experience, Tijdink says having the right tools in place is useful: “In clinical practice, we have a lot of guidelines that are helping those who make good decisions for our patients.”
“The core idea or ambition is to start invoking some sense of self-reflection among research and funding organizations across Europe,” Mejlgaard adds.
Ultimately, Mejlgaard hopes to see translated into practice the broad principles that are mentioned in the European Code of Conduct for Research Integrity, such as reliability, honesty, respect, and accountability, but are often not so easy to roll out at an organizational level. For Mejlgaard, part of the effort is to make sure institutions don’t have to start from scratch.
Mads Paludan Goddiksen of the University of Copenhagen in Denmark, co-author of a recent study about cheating among secondary-school, bachelor’s-degree and doctoral students, warns that a zero-tolerance approach towards misconduct can cause students to mistrust and circumvent university policies and processes – for example, by failing to report cheating by fellow students.
“There’s a balance to strike for institutions about being clear that misconduct and questionable research practices is not something we will accept,” he says. But they should do this while “also communicating to students and young researchers that the system that handles these cases is actually fair, and it actually considers the context in which the case takes place”.