Citation impact

Citation impact or citation rate is a measure of how many times an academic journal article, book, or author is cited by other articles, books, or authors. [1] [2] [3] [4] [5] [6] Citation counts are interpreted as measures of the impact or influence of academic work and have given rise to the field of bibliometrics or scientometrics, [7] [8] which specializes in the study of patterns of academic impact through citation analysis. The importance of a journal can be measured by its average citation rate, [9] [6] the ratio of the number of citations to the number of articles published within a given time period and in a given index, such as the journal impact factor or CiteScore. Such measures are used by academic institutions in decisions about tenure, promotion, and hiring, and hence are also used by authors in deciding which journal to publish in. Citation-like measures are also used in other fields that do ranking, such as Google's PageRank algorithm, software metrics, college and university rankings, and business performance indicators.

Article-level

One of the most basic citation metrics is how often an article is cited in other articles, books, or other sources (such as theses). Citation rates depend heavily on the discipline and the number of people working in that area. For instance, many more scientists work in neuroscience than in mathematics, and neuroscientists publish more papers than mathematicians; hence neuroscience papers are much more often cited than papers in mathematics. [10] [11] Similarly, review papers are more often cited than regular research papers because they summarize results from many papers. This may also be why papers with shorter titles get more citations, given that they usually cover a broader area. [12]

Most-cited papers

The most-cited paper in history is a paper by Oliver Lowry describing an assay to measure the concentration of proteins. [13] By 2014 it had accumulated more than 305,000 citations. The 10 most-cited papers all had more than 40,000 citations, [14] and reaching the top 100 required 12,119 citations by 2014. [14] Of the more than 58 million items in Thomson Reuters' Web of Science database, only 14,499 papers (~0.026%) had more than 1,000 citations in 2014. [14]

Journal-level

The simplest journal-level metric is the journal impact factor (JIF), the average number of citations that articles published by a journal in the previous two years have received in the current year, as calculated by Clarivate. Other companies report similar metrics, such as the CiteScore (CS), based on Scopus.
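
Schematically, the JIF for a year y can be written as the ratio below (a simplified rendering of the definition above; Clarivate's actual calculation distinguishes "citable items" from other front matter):

```latex
% JIF for year y: citations received in year y by items published
% in the two preceding years, divided by the number of citable
% items published in those two years.
\mathrm{JIF}_{y} = \frac{C_{y}\left(P_{y-1} \cup P_{y-2}\right)}{\left|P_{y-1}\right| + \left|P_{y-2}\right|}
```

Here P_y denotes the items a journal published in year y and C_y(·) the citations those items received during year y.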

However, a very high JIF or CS is often based on a small number of very highly cited papers. For instance, most papers in Nature (impact factor 38.1, 2016) were cited only 10 or 20 times during the reference year. Journals with a lower impact factor (e.g. PLOS ONE, impact factor 3.1) publish many papers that are cited 0 to 5 times but few highly cited articles. [15]

Journal-level metrics are often misinterpreted as a measure for journal quality or article quality. However, the use of non-article-level metrics to determine the impact of a single article is statistically invalid. Moreover, studies of methodological quality and reliability have found that "reliability of published research works in several fields may be decreasing with increasing journal rank", [16] contrary to widespread expectations. [17]

Citation distributions are skewed for journals because a very small number of articles drive the vast majority of citations; some journals, such as those of the American Society for Microbiology, have therefore stopped publicizing their impact factors. [18] Citation counts mostly follow a lognormal distribution, except for the long tail, which is better fit by a power law. [19]
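
The practical consequence of this skew can be seen in a minimal simulation (a sketch with hypothetical parameters, assuming NumPy is available): the mean citation rate, which is what an impact-factor-style average reports, sits well above what a typical paper receives.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-article citation counts for one journal:
# a lognormal body produces a few very highly cited outliers.
citations = rng.lognormal(mean=1.5, sigma=1.2, size=1000).astype(int)

mean_cites = citations.mean()        # what a JIF-style average reports
median_cites = np.median(citations)  # what a typical paper receives
top_share = np.sort(citations)[-10:].sum() / citations.sum()  # top 1% of papers

print(f"mean: {mean_cites:.1f}, median: {median_cites:.1f}")
print(f"citation share of the 10 most-cited papers: {top_share:.1%}")
```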

Other journal-level metrics include the Eigenfactor and the SCImago Journal Rank.

Author-level

Total citations, or average citation count per article, can be reported for an individual author or researcher. Many other measures have been proposed, beyond simple citation counts, to better quantify an individual scholar's citation impact. [20] The best-known measures include the h-index [21] and the g-index. [22] Each measure has advantages and disadvantages, [23] spanning from bias to discipline-dependence and limitations of the citation data source. [24] Counting the number of citations per paper is also employed to identify the authors of citation classics. [25]
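
For illustration, both measures can be computed in a few lines (a minimal sketch over a hypothetical list of per-paper citation counts, following the definitions of Hirsch [21] and Egghe [22]):

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each (Hirsch)."""
    ranked = sorted(citations, reverse=True)
    return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

def g_index(citations):
    """Largest g such that the g most-cited papers have at least
    g**2 citations in total (Egghe)."""
    ranked = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, c in enumerate(ranked, start=1):
        total += c
        if total >= rank * rank:
            g = rank
    return g

papers = [42, 17, 9, 6, 6, 3, 1, 0]  # hypothetical citation counts
print(h_index(papers))  # 5 -- five papers with at least 5 citations each
print(g_index(papers))  # 8 -- the top 8 papers hold 84 >= 64 citations
```

As the example shows, the g-index rewards a few very highly cited papers more than the h-index does, because the cumulative count lets outliers carry lower-ranked papers.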

Citations are distributed highly unequally among researchers. In a study based on the Web of Science database across 118 scientific disciplines, the top 1% most-cited authors accounted for 21% of all citations, a share that grew from 14% in 2000 to 21% in 2015. The highest concentrations of 'citation elite' researchers were in the Netherlands, the United Kingdom, Switzerland, and Belgium. Since 70% of the authors in the Web of Science database have fewer than five publications, the most-cited authors among the roughly 4 million included in this study constitute a tiny fraction. [26]

Alternatives

An alternative approach to measuring a scholar's impact relies on usage data, such as the number of downloads from publishers, often analyzed at the article level alongside citation performance. [27] [28] [29] [30]

As early as 2004, the BMJ published the number of views of its articles, which was found to be somewhat correlated with citations. [31] In 2008 the Journal of Medical Internet Research began publishing view counts and tweet counts. These "tweetations" proved to be a good indicator of highly cited articles, leading the author to propose a "Twimpact factor", the number of tweets an article receives in its first seven days after publication, as well as a "Twindex", the rank percentile of an article's Twimpact factor. [32]

In response to growing concerns over the inappropriate use of journal impact factors in evaluating scientific outputs and scientists themselves, the Université de Montréal, Imperial College London, PLOS, eLife, EMBO Journal, The Royal Society, Nature and Science proposed citation-distribution metrics as an alternative to impact factors. [33] [34] [35]

Open Access publications

Open access (OA) publications are accessible to readers without cost, and so might be expected to be cited more frequently. [36] Some experimental and observational studies have found that articles published in OA journals do not receive more citations, on average, than those published in subscription journals; [37] other studies have found that they do. [38] [39] [40]

The evidence that author-self-archived ("green") OA articles are cited more than non-OA articles is somewhat stronger than the evidence that ("gold") OA journals are cited more than non-OA journals. [41] Two reasons for this are that many of today's top-cited journals are still only hybrid OA (the author has the option to pay for gold), [42] and that many pure author-pays OA journals are of low quality or are outright fraudulent "predatory journals" that prey on authors' publish-or-perish pressures, thereby lowering the average citation counts of OA journals. [43]

Recent developments

An important recent development in research on citation impact is the discovery of universality: citation impact patterns that hold across different disciplines in the sciences, social sciences, and humanities. For example, it has been shown that the number of citations received by a publication, once properly rescaled by its average across articles published in the same discipline and in the same year, follows a universal log-normal distribution that is the same in every discipline. [44] This finding suggested a universal citation impact measure that extends the h-index by properly rescaling citation counts and re-ranking publications; however, computing such a measure requires collecting extensive citation data and statistics for every discipline and year. Social crowdsourcing tools such as Scholarometer have been proposed to address this need. [45] [46] Kaur et al. proposed a statistical method to evaluate the universality of citation impact metrics, i.e., their capability to compare impact fairly across fields. [47] Their analysis identified universal impact metrics, such as the field-normalized h-index.
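
In symbols, the rescaling described above divides each paper's raw citation count by the average of its discipline-and-year cohort (a sketch of the normalization used by Radicchi et al. [44]):

```latex
% Relative citation indicator: raw count c rescaled by the mean
% count c_0 of papers in the same discipline and year.
c_f = \frac{c}{c_0}, \qquad c_0 = \langle c \rangle_{\text{discipline, year}}
```

The distribution of c_f is then approximately the same log-normal curve in every discipline, which is what allows impact to be compared fairly across fields.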

Research suggests that the impact of an article can be partly explained by superficial factors, not only by its scientific merits. [48] Field-dependent factors are usually listed as an issue to be tackled not only when comparisons across disciplines are made, but also when different fields of research within one discipline are compared. [49] For instance, in medicine the number of authors, the number of references, the article length, and the presence of a colon in the title all influence impact, while in sociology the number of references, the article length, and the title length are among the relevant factors. [50] Scholars have also been found to engage in ethically questionable behavior in order to inflate the number of citations their articles receive. [51]

Automated citation indexing [52] has changed the nature of citation analysis research, allowing millions of citations to be analyzed for large-scale patterns and knowledge discovery. The first automated citation index was CiteSeer, later followed by Google Scholar. More recently, advanced models for the dynamic analysis of citation aging have been proposed. [53] [54] The latter model can even be used as a predictive tool to estimate the citations a corpus of publications might obtain at any point in its lifetime.

Some researchers also propose that the journal citation rate on Wikipedia, alongside the traditional citation index, "may be a good indicator of the work's impact in the field of psychology." [55] [56]

According to Mario Biagioli: "All metrics of scientific evaluation are bound to be abused. Goodhart's law [...] states that when a feature of the economy is picked as an indicator of the economy, then it inexorably ceases to function as that indicator because people start to game it." [57]

References

  1. Garfield, E. (1955). "Citation Indexes for Science: A New Dimension in Documentation through Association of Ideas". Science . 122 (3159): 108–111. Bibcode:1955Sci...122..108G. doi:10.1126/science.122.3159.108. PMID   14385826.
  2. Garfield, E. (1973). "Citation Frequency as a Measure of Research Activity and Performance" (PDF). Essays of an Information Scientist. 1: 406–408.
  3. Garfield, E. (1988). "Can Researchers Bank on Citation Analysis?" (PDF). Essays of an Information Scientist. 11: 354.
  4. Garfield, E. (1998). "The use of journal impact factors and citation analysis in the evaluation of science". 41st Annual Meeting of the Council of Biology Editors.
  5. Moed, Henk F. (2005). Citation Analysis in Research Evaluation. Springer. ISBN   978-1-4020-3713-9.
  6. Haustein, S. (2012). Multidimensional Journal Evaluation: Analyzing Scientific Periodicals beyond the Impact Factor. Knowledge and Information. De Gruyter. ISBN 978-3-11-025555-3. Retrieved 2023-06-06.
  7. Leydesdorff, L., & Milojević, S. (2012). Scientometrics. arXiv preprint arXiv:1208.4566.
  8. Harnad, S. (2009). Open access scientometrics and the UK Research Assessment Exercise. Scientometrics, 79(1), 147-156.
  9. Garfield, Eugene (1972-11-03). "Citation Analysis as a Tool in Journal Evaluation". Science. American Association for the Advancement of Science (AAAS). 178 (4060): 471–479. Bibcode:1972Sci...178..471G. doi:10.1126/science.178.4060.471. ISSN   0036-8075. PMID   5079701.
  10. de Solla Price, D. J. (1963). Little Science, Big Science . Columbia University Press. ISBN   9780231085625.
  11. Larsen, P. O.; von Ins, M. (2010). "The rate of growth in scientific publication and the decline in coverage provided by Science Citation Index". Scientometrics . 84 (3): 575–603. doi:10.1007/s11192-010-0202-z. PMC   2909426 . PMID   20700371.
  12. Deng, B. (26 August 2015). "Papers with shorter titles get more citations". Nature News . doi:10.1038/nature.2015.18246. S2CID   186805536.
  13. Lowry, O. H.; Rosebrough, N. J.; Farr, A. L.; Randall, R. J. (1951). "Protein measurement with the Folin phenol reagent". The Journal of Biological Chemistry . 193 (1): 265–275. doi: 10.1016/S0021-9258(19)52451-6 . PMID   14907713.
  14. van Noorden, R.; Maher, B.; Nuzzo, R. (2014). "The top 100 papers". Nature . 514 (7524): 550–553. Bibcode:2014Natur.514..550V. doi: 10.1038/514550a . PMID   25355343.
  15. Callaway, E. (2016). "Beat it, impact factor! Publishing elite turns against controversial metric". Nature . 535 (7611): 210–211. Bibcode:2016Natur.535..210C. doi: 10.1038/nature.2016.20224 . PMID   27411614.
  16. Brembs, Björn (2018). "Prestigious Science Journals Struggle to Reach Even Average Reliability". Frontiers in Human Neuroscience. 12: 37. doi: 10.3389/fnhum.2018.00037 . PMC   5826185 . PMID   29515380.
  17. Triggle, Chris R; MacDonald, Ross; Triggle, David J.; Grierson, Donald (2022-04-03). "Requiem for impact factors and high publication charges". Accountability in Research. 29 (3): 133–164. doi: 10.1080/08989621.2021.1909481 . PMID   33787413. One might expect, therefore, that a high JIF factor indicates a higher standard of interest, accuracy and reliability of papers published therein. This is sometimes true but unfortunately is certainly not always the case (Brembs 2018, 2019). Thus, Björn Brembs (2019) concluded: "There is a growing body of evidence against our subjective notion of more prestigious journals publishing 'better' science. In fact, the most prestigious journals may be publishing the least reliable science."
  18. Casadevall, A.; Bertuzzi, S.; Buchmeier, M. J.; Davis, R. J.; Drake, H.; Fang, F. C.; Gilbert, J.; Goldman, B. M.; Imperiale, M. J. (2016). "ASM Journals Eliminate Impact Factor Information from Journal Websites". mSphere . 1 (4): e00184–16. doi:10.1128/mSphere.00184-16. PMC   4941020 . PMID   27408939.
  19. Chatterjee, Arnab; Ghosh, Asim; Chakrabarti, Bikas K. (2016-01-11). Bornmann, Lutz (ed.). "Universality of Citation Distributions for Academic Institutions and Journals". PLOS ONE. Public Library of Science (PLoS). 11 (1): e0146762. Bibcode:2016PLoSO..1146762C. doi: 10.1371/journal.pone.0146762 . ISSN   1932-6203. PMC   4709109 . PMID   26751563.
  20. Belikov, A. V.; Belikov, V. V. (2015). "A citation-based, author- and age-normalized, logarithmic index for evaluation of individual researchers independently of publication counts". F1000Research . 4: 884. doi: 10.12688/f1000research.7070.1 . PMC   4654436 .
  21. Hirsch, J. E. (2005). "An index to quantify an individual's scientific research output". PNAS . 102 (46): 16569–16572. arXiv: physics/0508025 . Bibcode:2005PNAS..10216569H. doi: 10.1073/pnas.0507655102 . PMC   1283832 . PMID   16275915.
  22. Egghe, L. (2006). "Theory and practise of the g-index". Scientometrics . 69 (1): 131–152. doi:10.1007/s11192-006-0144-7. hdl: 1942/981 . S2CID   207236267.
  23. Gálvez RH (March 2017). "Assessing author self-citation as a mechanism of relevant knowledge diffusion". Scientometrics. 111 (3): 1801–1812. doi:10.1007/s11192-017-2330-1. S2CID   6863843.
  24. Couto, F. M.; Pesquita, C.; Grego, T.; Veríssimo, P. (2009). "Handling self-citations using Google Scholar". Cybermetrics. 13 (1): 2. Archived from the original on 2010-06-24. Retrieved 2009-05-27.
  25. Serenko, A.; Dumay, J. (2015). "Citation classics published in knowledge management journals. Part I: Articles and their characteristics" (PDF). Journal of Knowledge Management. 19 (2): 401–431. doi:10.1108/JKM-06-2014-0220.
  26. Reardon, Sara (2021-03-01). "'Elite' researchers dominate citation space". Nature. 591 (7849): 333–334. Bibcode:2021Natur.591..333R. doi: 10.1038/d41586-021-00553-7 . PMID   33649475.
  27. Bollen, J.; Van de Sompel, H.; Smith, J.; Luce, R. (2005). "Toward alternative metrics of journal impact: A comparison of download and citation data". Information Processing and Management . 41 (6): 1419–1440. arXiv: cs.DL/0503007 . Bibcode:2005IPM....41.1419B. doi:10.1016/j.ipm.2005.03.024. S2CID   9864663.
  28. Brody, T.; Harnad, S.; Carr, L. (2005). "Earlier Web Usage Statistics as Predictors of Later Citation Impact". Journal of the Association for Information Science and Technology . 57 (8): 1060. arXiv: cs/0503020 . Bibcode:2005cs........3020B. doi:10.1002/asi.20373. S2CID   12496335.
  29. Kurtz, M. J.; Eichhorn, G.; Accomazzi, A.; Grant, C.; Demleitner, M.; Murray, S. S. (2004). "The Effect of Use and Access on Citations". Information Processing and Management . 41 (6): 1395–1402. arXiv: cs/0503029 . Bibcode:2005IPM....41.1395K. doi:10.1016/j.ipm.2005.03.010. S2CID   16771224.
  30. Moed, H. F. (2005b). "Statistical Relationships Between Downloads and Citations at the Level of Individual Documents Within a Single Journal". Journal of the American Society for Information Science and Technology . 56 (10): 1088–1097. doi:10.1002/asi.20200.
  31. Perneger, T. V. (2004). "Relation between online "hit counts" and subsequent citations: Prospective study of research papers in the BMJ". BMJ . 329 (7465): 546–7. doi:10.1136/bmj.329.7465.546. PMC   516105 . PMID   15345629.
  32. Eysenbach, G. (2011). "Can Tweets Predict Citations? Metrics of Social Impact Based on Twitter and Correlation with Traditional Metrics of Scientific Impact". Journal of Medical Internet Research . 13 (4): e123. doi: 10.2196/jmir.2012 . PMC   3278109 . PMID   22173204.
  33. Veronique Kiermer (2016). "Measuring Up: Impact Factors Do Not Reflect Article Citation Rates". The Official PLOS Blog .
  34. "Ditching Impact Factors for Deeper Data". The Scientist. Retrieved 2016-07-29.
  35. "Scientific publishing observers and practitioners blast the JIF and call for improved metrics". Physics Today. 2016. doi:10.1063/PT.5.8183.
  36. Hitchcock, Steve (2013) [2004]. "The effect of open access and downloads ('hits') on citation impact: a bibliography of studies". opcit.eprints.org. University of Southampton. Retrieved 2023-01-22.
    Brody, T.; Harnad, S. (2004). "Comparing the Impact of Open Access (OA) vs. Non-OA Articles in the Same Journals". D-Lib Magazine . 10: 6.
    Eysenbach, G.; Tenopir, C. (2006). "Citation Advantage of Open Access Articles". PLOS Biology . 4 (5): e157. doi: 10.1371/journal.pbio.0040157 . PMC   1459247 . PMID   16683865.
    Eysenbach, G. (2006). "The Open Access Advantage". Journal of Medical Internet Research . 8 (2): e8. doi: 10.2196/jmir.8.2.e8 . PMC   1550699 . PMID   16867971.
    Hajjem, C.; Harnad, S.; Gingras, Y. (2005). "Ten-Year Cross-Disciplinary Comparison of the Growth of Open Access and How It Increases Research Citation Impact" (PDF). IEEE Data Engineering Bulletin . 28 (4): 39–47. arXiv: cs/0606079 . Bibcode:2006cs........6079H.
    Lawrence, S. (2001). "Free online availability substantially increases a paper's impact". Nature . 411 (6837): 521. Bibcode:2001Natur.411..521L. doi:10.1038/35079151. PMID   11385534. S2CID   4422192.
    MacCallum, C. J.; Parthasarathy, H. (2006). "Open Access Increases Citation Rate". PLOS Biology . 4 (5): e176. doi: 10.1371/journal.pbio.0040176 . PMC   1459260 . PMID   16683866.
    Gargouri, Y.; Hajjem, C.; Lariviere, V.; Gingras, Y.; Brody, T.; Carr, L.; Harnad, S. (2010). "Self-Selected or Mandated, Open Access Increases Citation Impact for Higher Quality Research". PLOS ONE . 5 (10): e13636. arXiv: 1001.0361 . Bibcode:2010PLoSO...513636G. doi: 10.1371/journal.pone.0013636 . PMC   2956678 . PMID   20976155.
  37. Davis, P. M.; Lewenstein, B. V.; Simon, D. H.; Booth, J. G.; Connolly, M. J. L. (2008). "Open access publishing, article downloads, and citations: randomised controlled trial". BMJ . 337: a568. doi:10.1136/bmj.a568. PMC   2492576 . PMID   18669565.
  38. Chua, SK; Qureshi, Ahmad M; Krishnan, Vijay; Pai, Dinker R; Kamal, Laila B; Gunasegaran, Sharmilla; Afzal, MZ; Ambawatta, Lahiru; Gan, JY (2017-03-02). "The impact factor of an open access journal does not contribute to an article's citations". F1000Research. 6: 208. doi: 10.12688/f1000research.10892.1 . PMC   5464220 . PMID   28649365.
  39. Tang, M., Bever, J. D., & Yu, F. H. (2017). Open access increases citations of papers in ecology. Ecosphere, 8(7), e01887.
  40. Niyazov, Y., Vogel, C., Price, R., Lund, B., Judd, D., Akil, A., ... & Shron, M. (2016). Open access meets discoverability: Citations to articles posted to Academia.edu. PLOS ONE, 11(2), e0148257.
  41. Young, J. S., & Brandes, P. M. (2020). Green and gold open access citation and interdisciplinary advantage: A bibliometric study of two science journals. The Journal of Academic Librarianship, 46(2), 102105.
  42. Torres-Salinas, D., Robinson-Garcia, N., & Moed, H. F. (2019). Disentangling Gold Open Access. In Springer Handbook of Science and Technology Indicators (pp. 129–144). Springer, Cham.
  43. Björk, B. C., Kanto-Karvonen, S., & Harviainen, J. T. (2020). How frequently are articles in predatory open access journals cited. Publications, 8(2), 17.
  44. Radicchi, F.; Fortunato, S.; Castellano, C. (2008). "Universality of citation distributions: Toward an objective measure of scientific impact". PNAS . 105 (45): 17268–17272. arXiv: 0806.0974 . Bibcode:2008PNAS..10517268R. doi: 10.1073/pnas.0806977105 . PMC   2582263 . PMID   18978030.
  45. Hoang, D.; Kaur, J.; Menczer, F. (2010). "Crowdsourcing Scholarly Data" (PDF). Proceedings of the WebSci10: Extending the Frontiers of Society On-Line. Archived from the original (PDF) on 2016-03-16. Retrieved 2017-02-20.
  46. Kaur, J.; Hoang, D.; Sun, X.; Possamai, L.; JafariAsbagh, M.; Patil, S.; Menczer, F. (2012). "Scholarometer: A Social Framework for Analyzing Impact across Disciplines". PLOS ONE . 7 (9): e43235. Bibcode:2012PLoSO...743235K. doi: 10.1371/journal.pone.0043235 . PMC   3440403 . PMID   22984414.
  47. Kaur, J.; Radicchi, F.; Menczer, F. (2013). "Universality of scholarly impact metrics". Journal of Informetrics . 7 (4): 924–932. arXiv: 1305.6339 . doi:10.1016/j.joi.2013.09.002. S2CID   7415777.
  48. Bornmann, L.; Daniel, H. D. (2008). "What do citation counts measure? A review of studies on citing behavior". Journal of Documentation . 64 (1): 45–80. doi:10.1108/00220410810844150. hdl: 11858/00-001M-0000-0013-7A94-3 . S2CID   17260826.
  49. Anauati, M. V.; Galiani, S.; Gálvez, R. H. (2014). "Quantifying the Life Cycle of Scholarly Articles Across Fields of Economic Research". SSRN. doi:10.2139/ssrn.2523078. SSRN   2523078.
  50. van Wesel, M.; Wyatt, S.; ten Haaf, J. (2014). "What a difference a colon makes: how superficial factors influence subsequent citation" (PDF). Scientometrics . 98 (3): 1601–1615. doi:10.1007/s11192-013-1154-x. hdl: 20.500.11755/2fd7fc12-1766-4ddd-8f19-1d2603d2e11d . S2CID   18553863.
  51. van Wesel, M. (2016). "Evaluation by Citation: Trends in Publication Behavior, Evaluation Criteria, and the Strive for High Impact Publications". Science and Engineering Ethics . 22 (1): 199–225. doi:10.1007/s11948-015-9638-0. PMC   4750571 . PMID   25742806.
  52. Giles, C. L.; Bollacker, K.; Lawrence, S. (1998). "CiteSeer: An Automatic Citation Indexing System". DL'98 Digital Libraries, 3rd ACM Conference on Digital Libraries. pp. 89–98. doi:10.1145/276675.276685.
  53. Yu, G.; Li, Y.-J. (2010). "Identification of referencing and citation processes of scientific journals based on the citation distribution model". Scientometrics . 82 (2): 249–261. doi:10.1007/s11192-009-0085-z. S2CID   38693917.
  54. Bouabid, H. (2011). "Revisiting citation aging: A model for citation distribution and life-cycle prediction". Scientometrics . 88 (1): 199–211. doi:10.1007/s11192-011-0370-5. S2CID   30345334.
  55. Banasik-Jemielniak, Natalia; Jemielniak, Dariusz; Wilamowski, Maciej (2021-02-16). "Psychology and Wikipedia: Measuring Psychology Journals' Impact by Wikipedia Citations". Social Science Computer Review. 40 (3): 756–774. doi:10.1177/0894439321993836. ISSN   0894-4393. S2CID   233968639.
  56. "Psychology and Wikipedia: Measuring journals' impact by Wikipedia citations". phys.org. Retrieved 2021-09-08.
  57. Biagioli, M. (2016). "Watch out for cheats in citation game". Nature . 535 (7611): 201. Bibcode:2016Natur.535..201B. doi: 10.1038/535201a . PMID   27411599. S2CID   4392261.
