
Journal Impact Factor Manipulation and Citation Cartels

Loaded dice image by Candace McDaniel on Negative Space

In a 1955 Science article, Eugene Garfield proposed a citation index for the sciences (Garfield, 1955). The purpose was to make it easier to see which works cited or criticized a research paper, enabling researchers to find both frequently cited works and those that had flaws. The idea was that this would root out bad research and elevate that which had merit.

The first Science Citation Index was published in 1964. The indexes quickly became a staple tool for science and social science researchers, existing in print for decades before going online and evolving into the present-day Web of Science database. Researchers and faculty members in the sciences are now very cognizant of their h-index, a measure of the relevance of their published works based on citation counts (this site explains how to calculate your h-index and the difference between Google Scholar’s and WOS indexes). Similarly, the Journal Impact Factor emerged as a way to gauge the relevance of a scientific journal. Publishing in a high-impact-factor journal means more prestige for the author, and the journal impact factor is a measure that librarians frequently use to make subscription decisions.
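The h-index itself is simple to compute: it is the largest h such that an author has h papers with at least h citations each. A minimal sketch:

```python
def h_index(citations):
    """Return the largest h such that the author has h papers
    with at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # still have `rank` papers with >= `rank` citations
        else:
            break
    return h

# An author with papers cited 10, 8, 5, 4, and 3 times has four
# papers with at least four citations each, so h = 4.
print(h_index([10, 8, 5, 4, 3]))  # 4
```

Because the index depends only on citation counts, it differs between databases: Google Scholar typically indexes more documents than Web of Science, so the same author's h-index is usually higher there.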

Some publishers and researchers have figured out ways to game the system over the years. Puffing up your h-index can win you grants and tenure. Similarly, inflating a journal’s impact factor means it will attract top researchers who want to publish their findings in it, as well as more subscriptions and revenue.

In 2013, Nature revealed that a group of Brazilian journals had arranged to cite works from each other’s publications in a citation-stacking scheme (Van Noorden, 2013). Journals have also found ways to manipulate impact factors by exploiting the types of content they publish. A recent analysis of the British Journal of Sports Medicine, which had seen a sudden rise in impact factor, found a corresponding “exponential rise” in the number of editorials it published (Heathers & Grimes, 2022). Publishing a large number of small citable items, like editorials, can boost impact factors because of how the metric is calculated, and this worked for BJSM, making it the top-ranked sports medicine journal. Publishers also game the calendar by publishing items digitally and letting them accumulate citations before giving them an official publication date, or by “front loading”: publishing more research early in the year so it accumulates additional citations by the time the impact factor calculation is run at the end of the year.
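The mechanics can be sketched numerically. The formula below is the standard two-year Journal Impact Factor; the citation and item counts are made-up illustrative numbers, not BJSM’s actual figures. The trick works because editorials and letters typically earn citations that count in the numerator but are excluded from the “citable items” denominator:

```python
def impact_factor(citations, citable_items):
    """Two-year Journal Impact Factor for year Y:
    citations received in Y to items published in Y-1 and Y-2,
    divided by the number of *citable* items (research articles
    and reviews) published in Y-1 and Y-2. Editorials and letters
    are generally excluded from the denominator."""
    return citations / citable_items

# A journal whose 200 citable items drew 500 citations:
base = impact_factor(500, 200)          # 2.5

# Add editorials drawing 150 extra citations: the numerator grows,
# the denominator of citable items does not.
boosted = impact_factor(500 + 150, 200)  # 3.25
print(base, boosted)
```

In this illustration, the extra editorial citations lift the impact factor from 2.5 to 3.25 without a single additional research article.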

Richard Phelps at Retraction Watch recently wrote a brief article on citation cartels: established scholars in a field cite each other’s works in an ‘I’ll scratch your back, you scratch mine’ arrangement that is mutually beneficial. His analysis showed how a group of ‘strategic scholars’ could boost their citation counts to three times those of ‘sincere scholars’ over the course of a few years. This increases their influence and mutes the voices of others. It reinforces the old boys’ club aspect of scientific and medical research and is particularly problematic in light of diversity and equity concerns.
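The dynamic can be illustrated with a toy simulation. This is not Phelps’s actual model; the cartel size, publication rate, and citation rates below are invented parameters chosen only to show the shape of the effect:

```python
def simulate(years=5, cartel_size=7, papers_per_year=2, organic_rate=3):
    """Toy model of a citation cartel. Every scholar publishes
    `papers_per_year` papers a year, each drawing `organic_rate`
    citations on its merits. A 'strategic' scholar additionally
    receives one citation per paper from each of the other
    cartel members."""
    sincere_total = 0
    strategic_total = 0
    for _ in range(years):
        organic = papers_per_year * organic_rate
        sincere_total += organic
        # cartel members add one citation per paper each
        strategic_total += organic + papers_per_year * (cartel_size - 1)
    return sincere_total, strategic_total

sincere, strategic = simulate()
print(strategic / sincere)  # 3.0: the cartel member ends up with triple the citations
```

With these (illustrative) parameters, the strategic scholar accumulates three times the citations of the sincere one after five years, despite publishing identical work.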

The fairness and effectiveness of impact factors have been addressed by the Declaration on Research Assessment (DORA). The declaration came out of the 2012 meeting of the American Society for Cell Biology in San Francisco. It is now an international initiative covering all scholarly disciplines. DORA confronts issues of consistency, transparency and equity in research assessment and calls for:

  • the need to eliminate the use of journal-based metrics, such as Journal Impact Factors, in funding, appointment, and promotion considerations;
  • the need to assess research on its own merits rather than on the basis of the journal in which the research is published; and
  • the need to capitalize on the opportunities provided by online publication (such as relaxing unnecessary limits on the number of words, figures, and references in articles, and exploring new indicators of significance and impact).

You can read the entire declaration here.

Garfield, E. (1955). Citation indexes for science: A new dimension in documentation through association of ideas. Science, 122(3159), 108–111. https://www.science.org/doi/10.1126/science.122.3159.108

Heathers & Grimes. (2022). The mechanics behind a precipitous rise in impact factor: A case study from the British Journal of Sports Medicine. OSF Preprints. https://osf.io/pt7cv/

Phelps, R. (2022). How citation cartels give “strategic scholars” an advantage. Retraction Watch. https://retractionwatch.com/2022/05/17/how-citation-cartels-give-strategic-scholars-an-advantage-a-simple-model/

Van Noorden. (2013). Brazilian citation scheme outed. Nature, 500(7464), 510–511. https://doi.org/10.1038/500510a
