Coercive citation

Coercive citation is an academic publishing practice in which an editor or referee of a scientific or academic journal forces an author to add spurious citations to an article before the journal will agree to publish it. This is done to inflate the journal's impact factor, thus artificially boosting the journal's scientific reputation. Manipulation of impact factors through self-citation has long been frowned upon in academic circles; [1] however, the results of a 2012 survey indicate that about 20% of academics working in economics, sociology, psychology, and multiple business disciplines have experienced coercive citation. [2] Individual cases have also been reported in other disciplines. [3]

Background

The impact factor (IF) of a journal is a measure of how often, on average, papers published in the journal are cited in other academic publications. The IF was devised in the 1950s as a simple way to rank scientific journals. Today, in some disciplines, the prestige of a publication is determined largely by its impact factor. [4]
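For concreteness, the standard two-year calculation can be sketched as follows (a simplified statement of the formula; the exact counting rules depend on the citation index used):

\[
\mathrm{IF}_{y} \;=\; \frac{\text{citations received in year } y \text{ to items the journal published in years } y-1 \text{ and } y-2}{\text{number of citable items the journal published in years } y-1 \text{ and } y-2}
\]

Because the denominator counts only the journal's own recent output, every additional citation to that output, including citations the journal itself solicits, raises the ratio.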

Use of the impact factor is not necessarily undesirable, as it can reasonably incentivise editors to improve their journals through the publication of good science. Two well-known academic journals, Nature and Science, had impact factors of 36 and 31, respectively; a respected journal in a sub-field, such as cognitive science, might have an impact factor of around 3. [5]

However, impact factors have also become a source of increasing controversy. As early as 1999, in a landmark essay, Scientific Communication – A Vanity Fair?, Georg Franck criticized citation counts as creating a marketplace in which "success in science is rewarded with attention". In particular, he warned of a future "shadow market" in which journal editors might inflate citation counts by requiring spurious references. [6] In 2005, an article in The Chronicle of Higher Education called the impact factor "the number that's devouring science". [4]

Definition

When an author submits a manuscript for publication in a scientific journal, the editor may request that the article's citations be expanded before it is published. Such requests are part of the standard peer review process and are meant to improve the paper.

Coercive citation, on the other hand, is a specific unethical business practice in which the editor asks the author to add citations to papers published in that same journal (self-citation), and in particular to cite papers that the author regards as duplicative or irrelevant. [5] Specifically, the term refers to requests that give no indication that the manuscript was lacking in attribution, make no suggestion as to specific articles, authors, or bodies of work requiring review, and direct authors to add citations only from the editor's own journal. [2]

In one incident, which has been cited as a particularly blatant example of coercive citation, a journal editor wrote: "you cite Leukemia [once in 42 references]. Consequently, we kindly ask you to add references of articles published in Leukemia to your present article". [3] [2]

Such a request would convey a clear message to authors: "add citations or risk rejection." [2]

The effect of coercive citation is to artificially inflate the journal's impact factor. Not all self-citation is coercive, or indeed improper, but self-citation can have an appreciable effect: in one published analysis, for example, a journal's impact factor dropped from 2.731 to 0.748 when self-citations were excluded from the calculation. [7]
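The arithmetic behind such a drop is straightforward. Assuming, purely for illustration, 1,000 citable items in the relevant two-year window (the item and citation counts below are hypothetical; only the two ratios come from the cited report [7]), the calculation works out as:

\[
\mathrm{IF} \;=\; \frac{2731}{1000} \;=\; 2.731,
\qquad
\mathrm{IF}_{\text{excluding self-citations}} \;=\; \frac{2731 - 1983}{1000} \;=\; 0.748
\]

In other words, roughly three quarters of the citations counted toward that impact factor were citations from the journal to itself.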

The practice of coercive citation is risky: it may damage the journal's reputation and thus has the potential to actually reduce the impact factor. Journals also risk temporary exclusion from Thomson Reuters' Journal Citation Reports, an influential list of impact factors, for such practices. [5]

Practice

In 2012, Wilhite and Fong published the results of a comprehensive survey of 6,700 scientists and academics in economics, sociology, psychology, and multiple business disciplines. [2] Respondents were asked whether, when submitting a manuscript to a journal, they had ever been asked by the editor to include spurious citations to other papers in the same journal. The findings indicate that one in five respondents had experienced coercive citation and that 86% regarded the practice as unethical.

A number of factors related to coercive citation have been identified. Coercion is significantly more prevalent in some academic disciplines than in others. Wilhite and Fong found that "journals in the business disciplines" (such as marketing, management, or finance) "coerce more than economics journals", whereas coercion in psychology and sociology is "no more prevalent, or even less prevalent" than it is in economics. [2] Despite these differences in prevalence, however, they noted that "every discipline" in their study "reported multiple instances of coercion" and that "there are published references to coercion in fields beyond the social sciences." Editors of business journals have responded that they intend to confront the practice more directly. [8]

Wilhite and Fong also found that characteristics of publishers are correlated with coercion. In their findings, "journals published by commercial, for-profit companies show significantly greater use of coercive tactics than journals from university presses", and journals published by academic societies also showed a higher likelihood of coercion than journals from university presses. Five of the top ten offenders identified in their research came from the same commercial publishing house, Elsevier. [2]

There may also be a correlation between journal ranking and coercive citation. Some have suggested that larger and more highly ranked journals have more valuable reputations at stake and thus may be more reluctant to put them at risk by using the practice. [2] [9] However, Wilhite and Fong found that:

somewhat surprisingly, the results … suggest that more highly ranked journals are more likely to coerce … Focusing on the top 30 journals in each field tempers the results in a minor fashion, but the rank effect is still present and strong. Sadly, in the disciplines identified as practicing coercion, it is some of the most highly ranked journals that are leading the way … Our data cannot discern a direction of causality because some top journals may use coercion to maintain their position, whereas other journals may have attained their lofty position through coercion. But either situation is unsettling. [2]

Commonalities have also been identified among the targets of coercion. Coercive citation is primarily targeted at younger researchers with less senior academic ranks and at papers with a smaller number of authors in order to have the greatest effect on the impact factor. It was also found that authors from non-English-speaking countries were more likely to be targeted. [2]

References

  1. McLeod, Sam (25 September 2020). "Should authors cite sources suggested by peer reviewers? Six antidotes for handling potentially coercive reviewer citation suggestions". Learned Publishing. 34 (2): 282–286. doi:10.1002/leap.1335. ISSN 0953-1513. S2CID 225004022.
  2. Wilhite, A. W.; Fong, E. A. (2012). "Coercive Citation in Academic Publishing". Science. 335 (6068): 542–543. Bibcode:2012Sci...335..542W. doi:10.1126/science.1212540. PMID 22301307. S2CID 30073305.
  3. Smith, R. (1997). "Journal accused of manipulating impact factor". The BMJ. 314 (7079): 461. doi:10.1136/bmj.314.7079.461d. hdl:10822/901737. S2CID 72372391.
  4. Monastersky, R. (14 October 2005). "The number that's devouring science". The Chronicle of Higher Education.
  5. Mathôt, Sebastiaan (4 February 2012). "Cite my journal or else: Coercive self-citation in academic publishing". COGSCIdotNL: Cognitive Science and more.
  6. Franck, G. (1999). "Essays on Science and Society: Scientific Communication – A Vanity Fair?". Science. 286 (5437): 53. doi:10.1126/science.286.5437.53. S2CID 154057884.
  7. Pagano, Marco; Zechner, Josef (17 August 2011). "Review of Finance Report by the Managing Editors". Stockholm. Retrieved 28 May 2012.
  8. Lynch, J. G. (2012). "Business Journals Combat Coercive Citation". Science. 335 (6073): 1169. Bibcode:2012Sci...335.1169L. doi:10.1126/science.335.6073.1169-a. PMID 22403371.
  9. Huggett, Sarah (4 June 2012). "Impact Factor Ethics for Editors – Editors' Update – Your network for knowledge". Elsevier. Archived from the original on 28 June 2012. Retrieved 5 June 2012.
