Science-wide author databases of standardized citation indicators

The science-wide author databases of standardized citation indicators are a multidimensional ranking of the world's scientists, produced since 2015 by a team of researchers led by John P. A. Ioannidis at Stanford University. [1] [2]

Main

Based on data from Scopus, the indicators draw on about 8 million records of scientists' citations in order to rank a subset of the roughly 200,000 most-cited authors across all scientific fields. This is commonly referred to as the Stanford ranking of the top 2% of scientists. [3]

The ranking is achieved via a composite indicator built on six citation metrics.
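
According to the methodology papers, the composite (the c-score discussed below) sums the six metrics after a log-based normalization against the largest value observed for each metric. The following Python sketch illustrates that idea only; the metric names, example numbers, and normalization details are assumptions made for illustration, not the authors' released code.

```python
import math

# Hypothetical names for the six citation metrics used by the composite:
# total citations, h-index, co-authorship-adjusted hm-index, and citations
# to papers as single, single+first, and single+first+last author.
METRICS = ["nc", "h", "hm", "ncs", "ncsf", "ncsfl"]

def c_score(author, observed_max):
    """Sum of log-normalized metrics: log(1+x) / log(1+max) for each metric.

    `author` and `observed_max` map metric names to values; `observed_max`
    holds the largest value of each metric among the authors being compared.
    """
    total = 0.0
    for m in METRICS:
        denom = math.log(1 + observed_max[m])
        if denom > 0:
            total += math.log(1 + author[m]) / denom
    return total

# Toy example: one author compared against assumed observed maxima.
author = {"nc": 3200, "h": 28, "hm": 14.5, "ncs": 400, "ncsf": 900, "ncsfl": 1500}
observed_max = {"nc": 250000, "h": 180, "hm": 90.0, "ncs": 40000, "ncsf": 90000, "ncsfl": 160000}
print(round(c_score(author, observed_max), 3))  # a value between 0 and 6
```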

Data

Data (about 200,000 records) are freely downloadable from Elsevier through the International Center for the Study of Research (ICSR) Lab. [2] [4] [5] [6]
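
Each release is distributed as downloadable tables. A minimal sketch for exploring one such table with pandas is shown below, assuming the file has been saved locally as career_2024.csv; the file name and the column names used here are placeholders for illustration, not the repository's actual schema.

```python
import pandas as pd

# Placeholder file name; in practice this is one of the tables downloaded
# from the Elsevier Data Repository release (doi:10.17632/btchxktzyw).
df = pd.read_csv("career_2024.csv")

# Hypothetical column names ("author", "field", "rank") used purely for
# illustration: list the ten best-ranked authors within each field.
top_by_field = (
    df.sort_values("rank")
      .groupby("field")
      .head(10)[["author", "field", "rank"]]
)
print(top_by_field.head())
```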

Output

The index classifies researchers into 22 scientific fields and 174 sub-fields. Different rankings are produced: career-long and most recent single year, each with and without self-citations, resulting in four configurations. The difference between this ranking (based on the so-called c-score) and the pure h-index is that it is sensitive to co-authorship and author positions: contributions as single, first, and last author are given more weight. Many authors point to the importance of the index created by Ioannidis in the context of accurate, cheap and simple descriptions of research systems. [7] [8] [9] Being listed in the Stanford ranking is treated as prestigious and translates into increased visibility for scientists, which may in turn improve networking opportunities and access to research funding. [10] [11] [12] Moreover, the ranking offers researchers an opportunity to compare the citation behavior of their field with that of others. [7]
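
To make the "with and without self-citations" distinction concrete, the sketch below computes an ordinary h-index from per-paper citation counts, and again after subtracting each paper's self-citations. It is a generic illustration of the idea, not the database's own computation, and the input numbers are invented.

```python
def h_index(citations):
    """Largest h such that at least h papers have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(counts, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

# Invented example: (total citations, self-citations) per paper for one author.
papers = [(52, 10), (31, 2), (20, 8), (12, 0), (9, 5), (4, 1)]

with_self = h_index([total for total, _ in papers])
without_self = h_index([total - selfc for total, selfc in papers])
print(with_self, without_self)  # 5 vs 4 once self-citations are removed
```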

Reception and applications

The papers introducing the ranking have been cited extensively by authors working in bibliometrics and scientometrics. For example, reference [3], which describes an update to the methodology of the index, is cited [13] by authors publishing in journals such as SAGE's Research on Social Work Practice, [11] Elsevier's Perspectives in Ecology and Conservation, [14] Springer's Forensic Science, Medicine and Pathology, [15] Oxford Academic's The Journals of Gerontology: Series A, [16] and Springer's Scientometrics. [8] [17]

The older methodological paper [18] is cited even more widely, in journals such as MIT Press's Quantitative Science Studies, [19] Springer's Scientometrics, [20] and many others.

These articles variously point to the methodological papers and the associated measure to discuss social aspects of publication activity, such as unequal access to publishing across social or national groups, including gender bias, [14] [21] or the properties of the underlying Scopus abstract and citation database. The database has also been linked to data on retractions in academic publishing to study the frequency of retractions among highly cited researchers. [22] Frietsch et al. [23] compare the Stanford ranking against Clarivate's Highly Cited Researchers database, noting that while the latter captures only 10% of Nobel laureates, the former identifies over 90%.

Comparative analyses run on the database show the importance of excluding self-citations, as the ranking does. [24] Another study praises the index over the simple Hirsch number but proposes a different accounting for authors who are neither first nor last on a paper's author list. [25] The database is not without errors, [26] and is faulted by some for privileging US-based scholars, notably in legal studies. [27]

Wil van der Aalst reviews some of the known shortcomings of scientific rankings, including subjectivity, the Matthew effect, country bias, and discipline bias, as well as the Leiden Manifesto for research metrics' admonition that quantitative measures should only support expert assessment, not replace informed judgment. At the same time, he notes that one should not resort to subjective evaluations of research productivity and impact while ignoring the data that are available, and thus praises the usefulness of the database, including as input for further analyses. [28]

References

  1. Ioannidis, J. P. A., Klavans, R., Boyack, K. W. (1 July 2016). "Multiple Citation Indicators and Their Composite across Scientific Disciplines". PLOS Biology. 14 (7): e1002501. Public Library of Science. doi:10.1371/journal.pbio.1002501. ISSN 1545-7885. PMC 4930269. PMID 27367269.
  2. Ioannidis, J. P. A., Baas, J., Klavans, R., Boyack, K. W. (12 August 2019). "A standardized citation metrics author database annotated for scientific field". PLOS Biology. 17 (8): e3000384. Public Library of Science. doi:10.1371/journal.pbio.3000384. ISSN 1545-7885. PMC 6699798. PMID 31404057.
  3. Ioannidis, J. P. A., Boyack, K. W., Baas, J. (16 October 2020). "Updated science-wide author databases of standardized citation indicators". PLOS Biology. 18 (10): e3000918. Public Library of Science. doi:10.1371/journal.pbio.3000918. ISSN 1545-7885. PMC 7567353. PMID 33064726.
  4. Ioannidis, J. P. A. (2022). September 2022 data-update for "Updated science-wide author databases of standardized citation indicators". Vol. 4. Elsevier BV. doi:10.17632/btchxktzyw.4. Retrieved 17 November 2022.
  5. Ioannidis, J. P. A. (2024). August 2024 data-update for "Updated science-wide author databases of standardized citation indicators". Vol. 7. Elsevier Data Repository. doi:10.17632/btchxktzyw.7. Retrieved 5 October 2024.
  6. Ioannidis, J. P. A. (2025). August 2025 data-update for "Updated science-wide author databases of standardized citation indicators". Vol. 8. Elsevier Data Repository. doi:10.17632/btchxktzyw.8. Retrieved 19 September 2025.
  7. Petersen, Kai; Ali, Nauman Bin (2021). "An analysis of top author citations in software engineering and a comparison with other fields". Scientometrics. 126 (11): 9147–9183. doi:10.1007/s11192-021-04144-1.
  8. Singh, P. K. (1 January 2022). "t-index: entropy based random document and citation analysis using average h-index". Scientometrics. 127 (1): 637–660. doi:10.1007/s11192-021-04222-4. ISSN 1588-2861.
  9. Tohalino, Jorge A. V.; Amancio, Diego R. (2022). "On predicting research grants productivity via machine learning". Journal of Informetrics. 16 (2): 101260. arXiv:2106.10700. doi:10.1016/j.joi.2022.101260.
  10. Oliveira, Leticia de; Reichert, Fernanda; Zandonà, Eugenia; Soletti, Rossana C.; Staniscuaski, Fernanda (2021). "The 100,000 most influential scientists rank: The underrepresentation of Brazilian women in academia". Anais da Academia Brasileira de Ciências. 93 (suppl 3): e20201952. doi:10.1590/0001-3765202120201952. hdl:10183/233727. PMID 34550208.
  11. Hodge, D. R., Turner, P. R. (1 March 2023). "Who are the Top 100 Contributors to Social Work Journal Scholarship? A Global Study on Career Impact in the Profession". Research on Social Work Practice. 33 (3): 338–349. SAGE Publications. doi:10.1177/10497315221136623. ISSN 1049-7315.
  12. Perneger, T. (5 September 2023). "Authorship and citation patterns of highly cited biomedical researchers: a cross-sectional study". Research Integrity and Peer Review. 8 (1): 13. doi:10.1186/s41073-023-00137-1. ISSN 2058-8615. PMC 10478343. PMID 37667388.
  13. Updated count in Google Scholar: https://scholar.google.com/scholar?hl=en&as_sdt=0%2C5&q=science-wide+author+databases+of+standardized+citation+indicators&btnG=
  14. Zandonà, E. (1 July 2022). "Female ecologists are falling from the academic ladder: A call for action". Perspectives in Ecology and Conservation. 20 (3): 294–299. Bibcode:2022PEcoC..20..294Z. doi:10.1016/j.pecon.2022.04.001. ISSN 2530-0644.
  15. Jones, A. W. (1 March 2022). "Highly cited forensic practitioners in the discipline legal and forensic medicine and the importance of peer-review and publication for admission of expert testimony". Forensic Science, Medicine and Pathology. 18 (1): 37–44. doi:10.1007/s12024-021-00447-0. ISSN 1556-2891. PMID 35129820.
  16. Ferraro, K. F. (1 November 2022). "Disciplinary Roots of 300 Top-Ranked Scientific Contributors to Gerontology: From Legacy to Enriching Our Discovery". The Journals of Gerontology: Series A. 77 (11): 2149–2154. doi:10.1093/gerona/glac129. ISSN 1079-5006. PMC 9678198. PMID 36409829.
  17. Monte-Serrat, D. M., Cattani, C. (1 June 2021). "Interpretability in neural networks towards universal consistency". International Journal of Cognitive Computing in Engineering. 2: 30–39. doi:10.1016/j.ijcce.2021.01.002. ISSN 2666-3074.
  18. Ioannidis, J. P. A., Baas, J., Klavans, R., Boyack, K. W. (12 August 2019). "A standardized citation metrics author database annotated for scientific field". PLOS Biology. 17 (8): e3000384. Public Library of Science. doi:10.1371/journal.pbio.3000384. ISSN 1545-7885. PMC 6699798. PMID 31404057.
  19. Baas, J., Schotten, M., Plume, A., Côté, G., Karimi, R. (1 February 2020). "Scopus as a curated, high-quality bibliometric data source for academic research in quantitative science studies". Quantitative Science Studies. 1 (1): 377–386. doi:10.1162/qss_a_00019. ISSN 2641-3337.
  20. Szomszor, M., Pendlebury, D. A., Adams, J. (1 May 2020). "How much is too much? The difference between research influence and self-citation excess". Scientometrics. 123 (2): 1119–1147. doi:10.1007/s11192-020-03417-5. ISSN 1588-2861.
  21. Wu, C. (2023). "The gender citation gap: Why and how it matters". Canadian Review of Sociology/Revue Canadienne de Sociologie. 60 (2): 188–211. doi:10.1111/cars.12428. ISSN 1755-618X. PMID 36929271.
  22. Ioannidis, J. P. A., Pezzullo, A. M., Cristiano, A., Boccia, S., Baas, J. (30 January 2025). "Linking citation and retraction data reveals the demographics of scientific retractions among highly cited authors". PLOS Biology. 23 (1): e3002999. Public Library of Science. doi:10.1371/journal.pbio.3002999. ISSN 1545-7885. PMC 11781634. PMID 39883670.
  23. Frietsch, R., Gruber, S., Bornmann, L. (21 January 2025). "The definition of highly cited researchers: the effect of different approaches on the empirical outcome". Scientometrics. 130 (2): 881–907. doi:10.1007/s11192-024-05158-1. ISSN 1588-2861.
  24. Abdalla, B. A., Mustafa, A. M., Fattah, F. H., Kakamad, F. H., Omar, S. S., Salih, A. M., Muhialdeen, A. S., Ahmed, J. O., Bapir, R., Mohammed, S. H., Mohammed, K. K., Baba, H. O., Ahmed, S. M., Mustafa, S. M., Najar, K. A. (15 February 2025). "Self-citation pattern among world's top 2 % of the scientists". Heliyon. 11 (3): e42471. Bibcode:2025Heliy..1142471A. doi:10.1016/j.heliyon.2025.e42471. ISSN 2405-8440. PMC 11849628. PMID 39995918.
  25. Jain, H. A., Chandra, R. (2025). Research impact evaluation based on effective authorship contribution sensitivity: h-leadership index. arXiv:2503.18236.
  26. Abduh, A. J. (2023). "A Critical Analysis of the World's Top 2% Most Influential Scientists: Examining the Limitations and Biases of Highly Cited Researchers Lists". Authorea. Retrieved 6 November 2025.
  27. Sasvári, P., Lendvai, G. F. (1 August 2025). "The overrepresentation of the United States in the field of legal studies in the science-wide author databases of standardized citation indicators". Journal of Informetrics. 19 (3): 101680. doi:10.1016/j.joi.2025.101680. ISSN 1751-1577.
  28. Van der Aalst, W. (2023). Yet Another View on Citation Scores.