A research integrity crisis has swept through the scientific world, and thousands of published articles have been retracted in recent years. A new report from the International Mathematical Union (IMU), Fraudulent Publishing in the Mathematical Sciences, warns that the relatively low rates of publication and citation in mathematics make the discipline particularly vulnerable to manipulation by those who aim to artificially boost their publication record or improve their citation metrics.
The report divides misconduct into three categories: “occasional poor practice”, which includes exaggerated numbers of self-citations in papers; “systematic bad practice”, such as falsely claiming co-authorship; and “fraudulent behaviour”, such as publishing fake papers produced by paper mills.
Quantitative assessment of research output is named as one of the main drivers of efforts to manipulate publication and citation records, and much of the report explains how researchers are incentivised to game metrics for personal advancement. It also explains that institutions can be incentivised to overlook, or even support, this manipulation in order to boost their university rankings.
Changing business models in publishing mean that many journals are now financially incentivised to publish more, and the report suggests that this can exacerbate misconduct by giving bad actors a route to publish work intended to manipulate metrics. However, it is important to remember that reputable journals retain high quality standards regardless of financial model, and that Gold Open Access journals should not be equated with predatory journals, which seek to exploit unsuspecting authors for financial gain.
The report criticises commercial companies like Clarivate (owners of Web of Science) for calculating citation metrics (like the Journal Impact Factor) and publishing lists (like the Highly Cited Researchers), which it says are having “detrimental effects”. However, it does not level the same criticism at the Mathematical Citation Quotient (MCQ), which uses the same calculation as Clarivate’s Five-Year Impact Factor and differs only in being applied to a different database of journals.
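To see why the two metrics are essentially the same, it may help to write out the shared calculation. The sketch below is a standard description of a five-year citation ratio, not a formula taken from the report itself; only the underlying database of journals and citations differs between the MCQ and Clarivate’s Five-Year Impact Factor.

```latex
\[
\text{Metric}(J, Y) \;=\;
\frac{\text{citations received in year } Y \text{ by items that journal } J \text{ published in years } Y{-}5 \text{ to } Y{-}1}
     {\text{number of items journal } J \text{ published in years } Y{-}5 \text{ to } Y{-}1}
\]
```

In both cases the result is an average citation count per recent article, so any behaviour that inflates the numerator (for example, coordinated citation) raises the metric in exactly the same way.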
This highlights that the problem with metrics is not their existence, but the way certain metrics and lists are misused by policymakers, funders, and hiring and promotion panels as a proxy for the quality of a researcher’s work. Until the assessment of research is truly decoupled from publication output and citation numbers, researchers will continue to be incentivised to artificially improve their metrics.
A second document contains a set of recommendations for policymakers, institutions, and individuals to tackle the misconduct highlighted in the report. Many of these recommendations align with the 2012 San Francisco Declaration on Research Assessment (DORA). This declaration, whose more than 3,500 organisational signatories include the London Mathematical Society, makes recommendations aimed at eliminating the use of journal-based metrics in funding, appointment, and promotion considerations.
This new report and recommendations have been produced as a partnership between the IMU and the International Council of Industrial and Applied Mathematics (ICIAM).
Simon Buckmaster
Head of Academic Publications