AI writing tools promise faster manuscripts for researchers

Automation brings plagiarism risks, and software still needs human input for analysis and narrative.

17 August 2021

Andy Tay


Writing tools powered by artificial intelligence (AI) have the potential to reduce manuscript preparation time to a few days, or even hours. Deep-learning technologies that run chatbots, spellchecks and auto-generated tweets are being used in a growing number of products pitched at students and academics.

Grammarly, for example, claims to “inspect your writing carefully to improve clarity, word choice and more”, offering free and fee-based services.

But are these tools up to the task? In an experiment run by education-information site EduRef, a group of recent graduates, undergraduates and self-described undergraduate-level writers were given the same assignments as GPT-3, an AI language program developed by OpenAI, a research company co-founded by Elon Musk.

The assignments were evaluated by instructors who did not know who (or what) had written them. GPT-3 performed in line with the humans, according to EduRef, and received “more or less the same feedback”.

The program was praised for writing excellent openings and transitions, but was criticised for using vague, blunt and awkward language, and for failing to craft a strong narrative. It wrote shallower, less descriptive papers than the students, according to EduRef’s write-up on its website, but it took between three and 20 minutes to write a paper, compared with three days for the students.

Hilde van Zeeland is chief applied linguist at start-up company Writefull, which offers AI-based language editing, and is part of London-based Digital Science (see disclosure, below). She says AI tools are already powerful enough to improve a writer’s sentence flow and structure, and will continue to improve.

The company recently analysed more than 250,000 abstracts to identify the most commonly used phrases in each of four different parts of the abstract.

They found, for example, that the phrase ‘aim of this study’ occurred most frequently in part 1 of the abstract (where the study’s aim and background are described) and the phrase ‘95% confidence interval’ occurred most often in part 4 (which covers the meaning of the results, their contribution and future research). Users can choose phrases and connecting sentences to use in their own papers.
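The underlying analysis amounts to counting word n-grams separately within each section of a large corpus of abstracts. Writefull has not published its method; the sketch below is an illustrative reconstruction of that kind of per-section phrase count, with a toy two-abstract corpus and the function name `top_phrases` as assumptions.

```python
from collections import Counter

def top_phrases(sections, phrase_len=4, n=3):
    """Return the n most common word n-grams for each abstract section.

    `sections` maps a section label (e.g. 1 for aim/background,
    4 for meaning of results) to a list of section texts.
    """
    results = {}
    for label, texts in sections.items():
        counts = Counter()
        for text in texts:
            words = text.lower().split()
            # Slide a window of `phrase_len` words over the text
            for i in range(len(words) - phrase_len + 1):
                counts[" ".join(words[i:i + phrase_len])] += 1
        results[label] = counts.most_common(n)
    return results

# Toy corpus standing in for Writefull's 250,000+ abstracts
corpus = {
    1: ["the aim of this study was to measure cell growth",
        "the aim of this study is to compare two methods"],
}
print(top_phrases(corpus)[1][0])  # most frequent 4-gram in section 1
```

On this toy corpus the top phrase in section 1 is a fragment of “aim of this study”, mirroring the pattern reported above; at scale the same counting, applied section by section, yields the phrase database the tool draws on.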


“With this database of phrases, when a user struggles with the right word to use for each section, our software will be able to provide good alternatives,” says van Zeeland.

The website tells users not to worry about plagiarism, reassuring them that certain phrases won’t be flagged by a plagiarism checker because they are short and commonly used.

Writefull is running a separate experiment that involves feeding an abstract into an AI tool, which generates a paper title based on the input. This function can enhance title readability, draw readership to the abstract and make the article more visible to search engines, according to van Zeeland.

AI tools can do more than check a writer’s grammar and spelling and suggest frequently used phrases, says computer scientist Guillaume Cabanac of the University of Toulouse in France.

An analysis of journal-published computer-science papers by Cabanac and his colleagues, posted as a preprint on arXiv in July and yet to be peer-reviewed, found swathes of articles containing tortured phrases and nonsensical text. For example, the phrase ‘colossal information’ was used instead of ‘big data’.

The authors suspect that these papers were the result of using automated translations or software that rewrites existing text to disguise plagiarism, according to a report in Nature.

Their study prompted warnings from research integrity experts that a new type of fabricated paper was entering the scientific literature, which will be hard to spot as AI tools learn to use more sophisticated language.

Considerations for using AI writing tools in academic work

Shu Chian Tay, a science communicator at the National University of Singapore, fears that AI writing tools could exacerbate inequality between labs with different access to resources or openness to trying new technologies.


“Consider a hypothetical situation where AI science-writing is possible and two labs have made the same research breakthrough at the same time,” says Tay.

“One lab stuck to human writing, which took weeks, while the other made use of AI writing tools. The latter, which is likely to publish earlier, would get most, if not all, of the scientific recognition and lucrative intellectual property rights.”

Will AI replace human writing?

Tay notes that although AI writing tools are quite powerful, they have not yet reached a point where they can write a scientific manuscript from scratch. “We still need researchers to analyse data, present them as figures, and develop a coherent story before feeding all this information to the AI machine,” she says.

Ryan Morrison, professor of English as a second language at George Brown College in Toronto, Canada, says that although the adoption of AI in academic writing is inevitable, it is unlikely to replace human writing in the near future.


“Just as supercomputers can process large sets of numbers quickly, AI writing tools outperform humans at tasks such as spellchecking and text generation,” says Morrison. “But the creativity of AI is ultimately limited by the input materials and the discretion of the human curator.”

Software such as Turnitin is also incorporating AI technology to enhance its ability to detect plagiarism. This could potentially be used to identify documents written using AI tools.

Be transparent about using AI writing tools in academic work

Michael Mindzak, assistant professor in the department of educational studies at Brock University in Ontario, Canada, cautions that the academic community has not yet agreed on how to manage potential problems related to AI writing tools, such as plagiarism and authorship credits.


Most institutions are yet to formulate policies on the use of these tools by students and staff. This can be problematic when committee members making decisions on tenure and promotion have different views about the appropriateness of using them, says Mindzak.

“My advice is to be transparent,” says Mindzak. “Declare and add a disclaimer if you have used AI-assisted writing. As more people in the community do this, it may become a norm, just like how the ‘conflict of interest’ section has been added to academic papers in response to a rise in company-funded research and start-ups from academic labs.”

*Digital Science is a subsidiary of Holtzbrinck Publishing Group, which owns 53% of Springer Nature, publisher of the Nature Index. Nature Index is editorially independent of its publisher.
