By Kris A Willems and Kirstin Krauss

In this blog we illustrate the complexity of the problem of citation pollution, along with an approach to due diligence assessment – to show what one can do when questionable science turns up on one’s desk.

Since the launch of Fidelior™, the research team and advisory board have received many requests for support with assessing a journal title or a list of references:

The journal looks very suspicious as the website advertises a 2-week turnaround time from submission to publication …

There is only one suspect title in the list of references. Do you think it is necessary to remove it?

I wonder if we should standardise Fidelior™ testing for all journal submissions?

Recently a member of our team was approached on a public forum to participate in a paper discussion. The request read more or less as follows:

We invite you to join a discussion of a paper, titled “Indian Journal of Plant Sciences: A Scientometric Approach” in the International Journal of Information Studies and Libraries. You have been invited either because you are following the author or we think you’d be interested based on the overlap between this paper and what you read and write about on this forum.

We did an assessment of this paper, and responded on the forum – and ironically our critical response was removed quite quickly. We believe, however, that this particular example is quite useful to learn from. It is also a typical example of what authors have to deal with in order to avoid citing questionable research.

As with a Turnitin report, we show that Fidelior™ is a tool that can help flag areas of concern. But it is up to the users to perform their own due diligence assessment using what Fidelior™ flags in its services. What we do not do is offer conclusive answers about the specific case in question. We leave that to the reader to infer from the data and arguments we offer.

Here are our critical and hopefully also informative views on this paper, the publication outlet, and the data it expounds on.

While the paper seeks to address important issues emanating from a citation analysis (or scientometrics, as explained in the paper) within a specific field of interest, it is flawed in a very critical area. It does not consider the concerns and implications of citation pollution, or whether the paper is informed by research that is potentially fraudulent. In a recent paper we co-authored, we illustrated the dilemma of citation pollution, drawing on a list of 33 retracted Covid-19 papers from Retraction Watch. A concern that we highlighted is that any type of inferior publication – whether predatory publications, retracted research, or research that is simply weak or of low quality – can hurt scientific discourse and potentially misdirect future scientific endeavours. Therefore, since a key objective of the paper in question is “to measure the impact of science on society”, it is imperative to include a perspective on citation pollution if this research is to be taken seriously and considered relevant.

In order to avoid inferior publications and citation pollution, a thorough due diligence assessment of publication outlets and references should be carried out. Generally, three usage scenarios should be considered: selecting a title to cite, selecting a journal to publish in, and assessing a paper’s list of references. In this paper’s case there is a fourth scenario, namely that of assessing the data incorporated into the analysis (i.e., the articles published in the Indian Journal of Plant Sciences) and assessing the outcomes of the data analysis (i.e., the quality of the journals, books, conferences, etc. used by these articles).

We used Fidelior™ to assess the journal in which this paper was published (International Journal of Information Studies & Libraries) and the journal title that was reviewed in the paper (i.e., the Indian Journal of Plant Sciences). We also ran the paper’s list of references through Fidelior™ and assessed “the rank list of periodicals containing research literature in the field of botany and plant sciences”.

We applied the following principles:

  1. Access to multiple journal sources and lists (e.g., citation databases, accredited journal lists, disciplinary journal lists) allows one to better understand and assess the reputability and quality of a journal title.
  2. One can say with fair confidence that a journal title is of reputable quality if it is listed on multiple reputable journal lists.
  3. If a title appears on no journal list, or if it appears on some of the so-called lists of predatory journals, the reader/author should conduct a further due diligence assessment of the title.
  4. If a title appears on multiple lists of questionable sources, one can accept with good confidence that the title is of a questionable nature.
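The triage principles above can be sketched as a small decision function. This is our own minimal illustration of the reasoning, not part of Fidelior™ or any real API; the function name, thresholds, and labels are illustrative assumptions.

```python
# Illustrative sketch only: the thresholds and labels below are our own
# assumptions, not a Fidelior(TM) feature or API.

def triage_title(reputable_hits: int, predatory_hits: int) -> str:
    """Triage a journal title from counts of list appearances.

    reputable_hits: how many reputable journal lists the title appears on
    predatory_hits: how many predatory/questionable-source lists it appears on
    """
    if predatory_hits >= 2:
        # Principle 4: multiple questionable-list hits -> good confidence
        # that the title is of a questionable nature
        return "questionable"
    if reputable_hits >= 2 and predatory_hits == 0:
        # Principle 2: multiple reputable-list hits -> fair confidence
        return "reputable"
    # Principles 1 and 3: no hits, or only a single hit either way,
    # require further qualitative due diligence by the reader/author
    return "needs further due diligence"

print(triage_title(3, 0))  # reputable
print(triage_title(0, 0))  # needs further due diligence
print(triage_title(0, 3))  # questionable
```

In practice no count of list hits replaces the qualitative checks discussed below (review processes, editorial board, standing in the community); the function only shows where each principle draws its line.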

Our assessment suggests the following:

  1. The International Journal of Information Studies & Libraries does not appear on any of the 37 journal sources incorporated into Fidelior™’s metadatabase. This potentially raises questions about the publication outlet that the authors opted for: is it predatory, or simply an outlet for poor-quality research? A further qualitative assessment of the title is necessary, e.g.: How good are the review processes? What is the title’s standing in the scholarly community? Who serves on the editorial board?
  2. The analysed Indian Journal of Plant Sciences likewise appears on none of the 37 journal sources incorporated into Fidelior™. The same questions as above emerge. One should also ask why issues from this particular title were selected for data analysis. Why not source a journal title that is seen as more reputable in the community, i.e., a title listed on multiple reputable journal sources?
  3. When we submitted the paper’s references to the Fidelior™ service, no serious issues were flagged. The only concern is that an article from the Journal of Education and Social Policy was cited. This title appears on none of the journal sources incorporated into Fidelior™’s scanning, which may be a concern as explained above.
  4. In the “Ranking List of Core Journals” section a number of titles are listed as a result of the analysis. The following titles were flagged as potentially suspect:
| Journal Title | Comment |
| --- | --- |
| Applied and Pure Biology | No journal source hits |
| Journal of Plant Science | Flagged by Scopus discontinued sources: “Publication concerns” |
| Indian Journal of Plant Science | No journal source hits |
| International Journal of Applied Life Science | No journal source hits |
| Asian Journal of Conservative Biology | No journal source hits |
| Critical Review of Plant Science | No journal source hits |
| Bionature | No journal source hits |
| Applied Environmental Microbiology | No journal source hits |
| Journal of Pharmacy Research | Flagged by two predatory journal lists |
| Plant and Cell Report | No journal source hits |
| Functional Biology | No journal source hits |
| International Journal of Recent Biotechnology | No journal source hits |
| National Journal of Life Sciences | Flagged by three lists as potentially questionable or predatory |
| Bioinfolet | No journal source hits |
| International Journal of Bioessays | No journal source hits |
| Journal of Experimental Science | Flagged by two predatory journal lists |
| Field Crop | No journal source hits |

Quite a number of the references cited by the analysed papers are of low quality or possibly fraudulent. This potentially raises questions about the journal’s review processes. What is also quite concerning about this list, and the authors’ summary in the paper, is the conclusion that Applied and Pure Biology and Journal of Plant Science are considered to be “the top core journals used by the researchers in the field of botany, plant sciences, and agricultural sciences.” As scientists, we are concerned about what sorts of scientific claims, underpinned by weak or fake research, feed into the journal’s scientific record.

We also noted mistakes in proper journal title names, for example: ‘Field Crop’ should probably be ‘Field Crops Research’, and ‘International Journal of Bioessays’ should probably be ‘International Journal of Bioassays’. Referencing mistakes can be considered an indication of a lack of scientific rigour.

Our concluding assessment of this publication is that while the intention is interesting, and possibly useful if carried out with a different selection of journals, this specific paper has very little relevance for the scholarly community or the discipline with which it associates. It does not consider how questionable science is penetrating the scientific record through citation pollution. It does not consider the possible presence of predatory publications. Even the findings are somewhat meaningless, because many of the titles listed in the findings are potentially weak or fraudulent.

As a note: Fidelior™ does not vet any titles, nor does it claim to offer a final authoritative assessment of titles. It merely provides usable access so that users can make their own assessments, as stated on its website:

  • “Fidelior™ is not responsible for vetting and inclusion criteria of journal titles listed in the various journal sources. Some lists provide limited information about inclusion criteria. Fidelior™ reports will only include what is available from journal sources and lists.”
  • “Through its propriety Metadatabase, Fidelior™ merely facilitates usable access to available journal sources in the public domain as matched to Fidelior™’s Metadatabase. Should a title be flagged as appearing in any of the sources, it is the user’s responsibility to check and assess the quality, journal metrics and journal ratings of the journal flagged, and secondly carefully clarify the reasons for inclusion of a citation in a particular manuscript. In some instances, a Fidelior™ report may reveal false positive hits. This may be due to a variety of reasons outside of Fidelior™’s control, such as references formatting issues, document formatting issues, spelling mistakes in journal titles, journal source data inaccuracies, etc. A Fidelior™ report is merely a tool to assist the user. No software tool or journal list can replace peer review, due diligence and individual assessment of a journal or paper.”

Photo by Tim Mossholder on Unsplash