Rankings of academic publishers

There are a number of approaches to ranking academic publishing groups and publishers. [1] [2] [3] [4] [5] [6] Rankings draw on subjective impressions of the scholarly community, analyses of the prize winners of scientific associations, the discipline covered, a publisher's reputation, and its impact factor (particularly in the sciences).

Ranking challenges

Publications are often judged by venue, rather than merit. [7] This has been criticized in the Leiden Manifesto [8] and the San Francisco Declaration on Research Assessment. According to the manifesto, "Science and technology indicators are prone to conceptual ambiguity and uncertainty and require strong assumptions that are not universally accepted. The meaning of citation counts, for example, has long been debated. Thus, best practice uses multiple indicators to provide a more robust and pluralistic picture." [8]

Moreover, studies of methodological quality and reliability have found that "reliability of published research works in several fields may be decreasing with increasing journal rank", [9] contrary to widespread expectations. [10]

In a study assessing an increasingly-diversified array of publishers and their service to the academic community, Janice S. Lewis concluded that college and university librarians ranked university presses higher and commercial publishers lower than did members of the American Political Science Association. [4]

According to Colin Steele, a librarian at the Australian National Library in Canberra, "Listings of publishers by title also fail to take into account that some university presses are strong in certain disciplines, but not across the whole spectrum." [11] Rankings can vary widely by discipline.

Australian Political Science rankings

The Australian Political Studies Association (APSA) ranked academic publishers in 2007, taking into consideration both book and journal publication. [12] By 2022 this was replaced by a ranking of journal titles only. [13]

In 2007, their top-ranked (A+) publishers were:

In 2007, their second-ranked (A) publishers were:

SENSE rankings

The Research School for Socio-Economic and Natural Sciences of the Environment (SENSE Research School) ranked scientific publishers every year from 2006 until 2022. [14] The ranking was intended for internal use only and is no longer available.

Spanish National Research Council rankings

In 2012 and 2014, the Spanish National Research Council asked 11,864 Spanish academics to name the 10 most prestigious academic publishers from over 600 international and 500 Spanish-language publishers. It received 2,731 responses, a response rate of 23.05 percent. Results were compiled using a weighted average. [15] The results were:

  1. Cambridge University Press
  2. Oxford University Press
  3. Springer Nature
  4. Routledge
  5. Elsevier
  6. Peter Lang
  7. Thomson Reuters
  8. Blackwell
  9. De Gruyter
  10. McGraw Hill [15]
  11. IGI Global
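The weighted-average compilation step can be sketched in Python. The positional weighting below (a publisher named in position p by a respondent receives 11 − p points) is an illustrative assumption, not the survey's documented scheme, and the sample responses are invented.

```python
from collections import defaultdict

def compile_ranking(responses):
    """Aggregate ordered top-10 lists into one ranking.

    responses: list of lists, each an ordered list of publisher names
    (most prestigious first). A publisher in position p receives
    11 - p points; this weighting is a stand-in for the survey's
    actual scheme.
    """
    scores = defaultdict(float)
    for ranked_list in responses:
        for position, publisher in enumerate(ranked_list, start=1):
            scores[publisher] += 11 - position
    # Highest aggregate score first.
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

responses = [
    ["Cambridge University Press", "Oxford University Press", "Springer"],
    ["Cambridge University Press", "Elsevier", "Oxford University Press"],
]
for publisher, score in compile_ranking(responses):
    print(publisher, score)
```

Because each respondent contributes a bounded number of points, a publisher named often but in low positions can still outrank one named rarely in top positions, which is the usual rationale for position-weighted aggregation.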

Granada rankings

In 2014, a research group associated with the University of Granada created a methodology for quantitatively assessing the output of publishing companies, based on the Thomson Reuters Book Citation Index. [16] Each publisher's quantitative weight is based on output data, impact (citations), and publisher profile. According to the Granada study, the leading companies were: [16]

  1. Springer
  2. Palgrave Macmillan
  3. Routledge
  4. Cambridge University Press
  5. Elsevier
  6. Nova Science Publishers
  7. Edward Elgar
  8. Information Age Publishing
  9. Princeton University Press
  10. University of California Press
  11. IGI Global
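As a rough illustration of combining the three components (output, citations, profile) into a single publisher weight, the sketch below min-max normalises each dimension across publishers and takes a weighted sum. The weights, the normalisation, and the example figures are all illustrative assumptions, not the published Granada methodology.

```python
def composite_weights(publishers, weights=(0.4, 0.4, 0.2)):
    """publishers: dict mapping name -> (output, citations, profile).

    Each dimension is min-max normalised to [0, 1] across publishers,
    then combined as a weighted sum. Weights are illustrative.
    """
    dims = list(zip(*publishers.values()))  # one tuple per dimension
    lows = [min(d) for d in dims]
    spans = [max(d) - min(d) or 1 for d in dims]  # avoid division by zero
    scores = {}
    for name, values in publishers.items():
        norm = [(v - lo) / span for v, lo, span in zip(values, lows, spans)]
        scores[name] = sum(w * n for w, n in zip(weights, norm))
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Invented figures, for illustration only.
data = {
    "Springer": (3000, 45000, 0.9),
    "Routledge": (2500, 20000, 0.8),
    "Small Press": (100, 500, 0.3),
}
for name, score in composite_weights(data):
    print(f"{name}: {score:.3f}")
```

Min-max normalisation is one of several common choices here; rank-based or logarithmic scaling would damp the influence of very large publishers.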

Libcitation rankings

The Research Impact Measurement Service (RIMS) at the University of New South Wales presented a quantitative methodology for bibliometric comparison of book publishers. [17] [18] [19] In a Journal of the American Society for Information Science and Technology article, Howard D. White et al. wrote: "Bibliometric measures for evaluating research units in the book-oriented humanities and social sciences are underdeveloped relative to those available for journal-oriented science and technology". The RIMS proposed what it called a "libcitation count": the number of libraries holding a given book, as reported in a national (or international) union catalog. Follow-up literature extended the approach to comparing research units and even the output of publishing companies. [17] [20] White et al. wrote,

Libcitation counts reflect judgments by librarians on the usefulness of publications for their various audiences of readers. The Libcitation measure thus resembles a citation impact measure in discriminating values of publications on a defined ground. It rewards authors whose books (or other publications) are seen by librarians as having relatively wide appeal. A book's absolute appeal can be determined simply by counting how many libraries hold it, but it can also be gauged in relation to other books in its subject class. [17]

Libcitations, according to the RIMS, reflect what librarians know about the prestige of publishers, the opinions of reviewers, and the reputations of authors. [17]
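The libcitation idea described above can be sketched in a few lines: count the libraries holding each book, then gauge each book against the other books in its subject class. The data structures and sample records below are assumptions for illustration; real implementations would read holdings from a union catalog such as WorldCat or a national equivalent.

```python
from collections import defaultdict

def libcitations(holdings, subject_class):
    """holdings: dict book -> set of library identifiers holding it.
    subject_class: dict book -> subject-class label.

    Returns dict book -> (count, class_percentile), where the
    percentile is the fraction of books in the same subject class
    with strictly fewer holding libraries.
    """
    counts = {book: len(libs) for book, libs in holdings.items()}
    by_class = defaultdict(list)
    for book, cls in subject_class.items():
        by_class[cls].append(counts[book])
    result = {}
    for book, count in counts.items():
        peers = by_class[subject_class[book]]
        below = sum(1 for c in peers if c < count)
        result[book] = (count, below / len(peers))
    return result

holdings = {
    "History of X": {"lib1", "lib2", "lib3"},
    "History of Y": {"lib1"},
    "Physics of Z": {"lib2", "lib3"},
}
subject_class = {
    "History of X": "history",
    "History of Y": "history",
    "Physics of Z": "physics",
}
print(libcitations(holdings, subject_class))
```

The raw count captures absolute appeal; the within-class percentile captures the relative appeal White et al. describe, so a niche monograph can score well against its peers even with modest total holdings.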


References

  1. Goodson, Larry P.; Dillman, Bradford; Hira, Anil (1999). "Ranking the Presses: Political Scientists' Evaluations of Publisher Quality". PS: Political Science and Politics. 32 (2): 257–262. doi:10.1017/S1049096500049416. JSTOR 420561.
  2. Steele, Colin (2008). "Scholarly Monograph Publishing in the 21st Century: The Future More Than Ever Should be an Open Book". The Journal of Electronic Publishing. 11 (2). doi:10.3998/3336451.0011.201.
  3. Garand, James C.; Giles, Micheal W. (2011). "Ranking Scholarly Publishers in Political Science: An Alternative Approach". PS: Political Science and Politics. 44 (2): 375–383. doi:10.1017/S1049096511000229. JSTOR 41319924.
  4. Lewis, Janice S. (2000). "An Assessment of Publisher Quality by Political Science Librarians". College & Research Libraries. 61 (4): 313–323. doi:10.5860/crl.61.4.313.
  5. Samuels, David (2013). "Book Citations Count". PS: Political Science & Politics. 46 (4): 785–790. doi:10.1017/S1049096513001054.
  6. Rhodes, R. A. W.; Hamilton, Margaret (2007). "Australian Political Science: Journal and Publisher Rankings" (PDF).
  7. Lee, Icy (2014). "Publish or perish: The myth and reality of academic publishing". Language Teaching. 47 (2): 250–261. doi:10.1017/S0261444811000504. S2CID 146536290.
  8. Hicks, D.; Wouters, P.; Waltman, L.; de Rijcke, S.; Rafols, I. (23 April 2015). "The Leiden Manifesto for research metrics" (PDF). Nature. 520 (7548): 429–431. doi:10.1038/520429a. PMID 25903611. S2CID 4462115. Retrieved 18 October 2017.
  9. Brembs, Björn (2018). "Prestigious Science Journals Struggle to Reach Even Average Reliability". Frontiers in Human Neuroscience. 12: 37. doi:10.3389/fnhum.2018.00037. PMC 5826185. PMID 29515380.
  10. Triggle, Chris R.; MacDonald, Ross; Triggle, David J.; Grierson, Donald (3 April 2022). "Requiem for impact factors and high publication charges". Accountability in Research. 29 (3): 133–164. doi:10.1080/08989621.2021.1909481. PMID 33787413. "One might expect, therefore, that a high JIF factor indicates a higher standard of interest, accuracy and reliability of papers published therein. This is sometimes true but unfortunately is certainly not always the case (Brembs 2018, 2019). Thus, Björn Brembs (2019) concluded: 'There is a growing body of evidence against our subjective notion of more prestigious journals publishing "better" science. In fact, the most prestigious journals may be publishing the least reliable science.'"
  11. Steele, Colin (2008). "Scholarly Monograph Publishing in the 21st Century: The Future More Than Ever Should be an Open Book". The Journal of Electronic Publishing. 11 (2). doi:10.3998/3336451.0011.201.
  12. https://www.eduhk.hk/include_n/getrichfile.php?key=95030d9da8144788e3752da05358f071&secid=50424&filename=secstaffcorner/research_doc/Compiled_Publisher_List.pdf (PDF)
  13. https://auspsa.org.au/news/the-2019-australian-political-studies-association-journal-list-review/
  14. "SENSE – Quality & Criteria". www.sense.nl.
  15. http://ilia.cchs.csic.es/SPI/metodologia_2014.html and http://ilia.cchs.csic.es/SPI/prestigio_expertos_2014.php
  16. Torres-Salinas, Daniel; Robinson-Garcia, Nicolas; Miguel Campanario, Juan; Delgado López-Cózar, Emilio (2014). "Coverage, field specialisation and the impact of scientific publishers indexed in the Book Citation Index". Online Information Review. 38: 24–42. arXiv:1312.2791. doi:10.1108/OIR-10-2012-0169. S2CID 3794376.
  17. White, Howard D.; Boell, Sebastian K.; Yu, Hairong; Davis, Mari; Wilson, Concepción S.; Cole, Fletcher T.H. (2009). "Libcitations: A measure for comparative assessment of book publications in the humanities and social sciences". Journal of the American Society for Information Science and Technology. 60 (6): 1083–1096. doi:10.1002/asi.21045. hdl:1959.4/44715. S2CID 33661687.
  18. Drummond, Robyn; Wartho, Richard (2009). "RIMS: The Research Impact Measurement Service at the University of New South Wales". Australian Academic & Research Libraries. 40 (2): 76–87. doi:10.1080/00048623.2009.10721387.
  19. For a summary of the literature, see Tausch, Arno (2017). "Die Buchpublikationen der Nobelpreis-Ökonomen und die führenden Buchverlage der Disziplin. Eine bibliometrische Analyse". Bibliotheksdienst, March 2017: 339–374. SSRN 2674502.
  20. Zuccala, A.; Guns, R.; Cornacchia, R.; Bod, R. (2014). "Can we rank scholarly book publishers? A bibliometric experiment with the field of history". Journal of the Association for Information Science and Technology.