SCImago Journal Rank

Screenshot of the SCImago Journal & Country Rank portal

The SCImago Journal Rank (SJR) indicator is a measure of the prestige of scholarly journals that accounts for both the number of citations received by a journal and the prestige of the journals where the citations come from.


Rationale

Citations are an indicator of popularity of scientific works and can be perceived as endorsement; prestige can be understood as a combination of the number of endorsements and the prestige of the works publishing them. Adopting this view, the SJR indicator assigns different values to citations depending on the perceived prestige of the journals where they come from.

However, studies of methodological quality and reliability have found that "reliability of published research works in several fields may be decreasing with increasing journal rank", [1] contrary to widespread expectations. [2]

The calculation of the SJR indicator is similar to that of the Eigenfactor score, with the former based on the Scopus database and the latter on the Web of Science database; [3] the two also differ in other methodological respects. [4]

Computation

A journal's SJR indicator is a numeric value representing the average number of weighted citations received during a selected year per document published in that journal during the previous three years, as indexed by Scopus. Higher SJR indicator values are meant to indicate greater journal prestige. SJR is developed by Scimago Lab, [5] which originated from a research group at the University of Granada.

The SJR indicator is a variant of the eigenvector centrality measure used in network theory. Such measures establish the importance of a node in a network based on the principle that connections to high-scoring nodes contribute more to its score than connections to low-scoring ones. The SJR indicator has been developed for use in very large and heterogeneous journal citation networks. It is a size-independent indicator; its values order journals by their "average prestige per article", which makes them suitable for journal comparisons in science evaluation processes. The SJR indicator is a free journal metric inspired by, and using an algorithm similar to, PageRank.

The SJR indicator is computed with an iterative algorithm that distributes prestige values among journals until a steady-state solution is reached. The algorithm begins by assigning an identical amount of prestige to each journal; this prestige is then redistributed in an iterative process in which journals transfer their accumulated prestige to one another through citations. The process ends when the difference between journal prestige values in consecutive iterations no longer exceeds a minimum threshold. The computation proceeds in two phases: (a) the computation of Prestige SJR (PSJR) for each journal, a size-dependent measure that reflects the journal's overall prestige, and (b) the normalization of this measure to obtain a size-independent measure of prestige, the SJR indicator. [6]
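The published SJR definition contains additional terms (for example a minimum-prestige component and the restriction of counted citations to a three-year window) that are omitted here. The following sketch, with invented journals and figures, only illustrates the two-phase idea described above: an iterative, PageRank-like redistribution of prestige over a citation matrix (phase a), followed by normalization by document counts (phase b).

```python
import numpy as np

# Toy citation matrix for three hypothetical journals:
# C[i, j] = citations from journal i to journal j (invented numbers).
C = np.array([
    [0.0, 30.0, 10.0],
    [20.0, 0.0, 40.0],
    [5.0, 15.0, 0.0],
])
docs = np.array([120.0, 300.0, 80.0])   # documents published by each journal
n = len(docs)
damping = 0.85                           # share of prestige passed on via citations

# Fraction of each journal's outgoing citations that points to each target.
out_totals = C.sum(axis=1, keepdims=True)
transfer = np.divide(C, out_totals, out=np.zeros_like(C), where=out_totals > 0)

# Phase (a): start with identical prestige and redistribute it iteratively
# until the change between consecutive iterations falls below a threshold.
prestige = np.full(n, 1.0 / n)
for _ in range(1000):
    new_prestige = (1.0 - damping) / n + damping * (transfer.T @ prestige)
    if np.abs(new_prestige - prestige).sum() < 1e-9:
        prestige = new_prestige
        break
    prestige = new_prestige

# Phase (b): normalize the size-dependent prestige (PSJR-like) by each
# journal's document count to obtain a size-independent, per-document value.
sjr_like = prestige / docs
print(sjr_like)
```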

In addition to the network-based SJR indicator, the SJR also provides a more direct alternative to the impact factor (IF), in the form of average citations per document in a 2-year period, abbreviated as Cites per Doc. (2y). [7] [8]
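Unlike the network-based indicator, Cites per Doc. (2y) is a plain, unweighted ratio. A minimal sketch with invented figures (real values are computed from Scopus data):

```python
# Hypothetical example of Cites per Doc. (2y): citations received in the
# selected year (2023 here) to items published in the previous two years,
# divided by the number of those items. All figures are invented.
citations_2023_to_2021_2022_items = 840
items_published_2021_2022 = 400

cites_per_doc_2y = citations_2023_to_2021_2022_items / items_published_2021_2022
print(f"Cites per Doc. (2y) = {cites_per_doc_2y:.2f}")  # 2.10
```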


Related Research Articles

Scopus is Elsevier's abstract and citation database, launched in 2004. Scopus covers 36,377 titles from 11,678 publishers, of which 34,346 are peer-reviewed journals in top-level subject fields: life sciences, social sciences, physical sciences and health sciences. It covers three types of sources: book series, journals, and trade journals. Scopus also allows patent searches in a dedicated Lexis-Nexis patent database, albeit with limited functionality.

The impact factor (IF) or journal impact factor (JIF) of an academic journal is a scientometric index calculated by Clarivate that reflects the yearly mean number of citations of articles published in the last two years in a given journal, as indexed by Clarivate's Web of Science.

Scientometrics is the field of study which concerns itself with measuring and analysing scholarly literature. Scientometrics is a sub-field of informetrics. Major research issues include the measurement of the impact of research papers and academic journals, the understanding of scientific citations, and the use of such measurements in policy and management contexts. In practice there is a significant overlap between scientometrics and other scientific fields such as information systems, information science, science of science policy, sociology of science, and metascience. Critics have argued that over-reliance on scientometrics has created a system of perverse incentives, producing a publish or perish environment that leads to low-quality research.

Citation impact or citation rate is a measure of how many times an academic journal article, book, or author is cited by other articles, books, or authors. Citation counts are interpreted as measures of the impact or influence of academic work and have given rise to the field of bibliometrics or scientometrics, specializing in the study of patterns of academic impact through citation analysis. The importance of journals can be measured by the average citation rate, the ratio of the number of citations to the number of articles published within a given time period and in a given index, such as the journal impact factor or CiteScore. It is used by academic institutions in decisions about academic tenure, promotion and hiring, and hence also used by authors in deciding which journal to publish in. Citation-like measures are also used in other fields that do ranking, such as Google's PageRank algorithm, software metrics, college and university rankings, and business performance indicators.

The h-index is an author-level metric that measures both the productivity and citation impact of the publications, initially used for an individual scientist or scholar. The h-index correlates with success indicators such as winning the Nobel Prize, being accepted for research fellowships and holding positions at top universities. The index is based on the set of the scientist's most cited papers and the number of citations that they have received in other publications. The index has more recently been applied to the productivity and impact of a scholarly journal as well as a group of scientists, such as a department or university or country. The index was suggested in 2005 by Jorge E. Hirsch, a physicist at UC San Diego, as a tool for determining theoretical physicists' relative quality and is sometimes called the Hirsch index or Hirsch number.
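As an illustration of the definition only (the largest h such that h of the author's papers have at least h citations each), a minimal sketch with invented citation counts:

```python
def h_index(citation_counts):
    """Largest h such that h papers have at least h citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical author with eight papers and these citation counts:
print(h_index([52, 31, 17, 9, 6, 4, 2, 0]))  # 5
```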

Journal Citation Reports (JCR) is an annual publication by Clarivate. It has been integrated with the Web of Science and is accessed from the Web of Science Core Collection. It provides information about academic journals in the natural and social sciences, including impact factors. The JCR was originally published as a part of the Science Citation Index. Currently, the JCR, as a distinct service, is based on citations compiled from the Science Citation Index Expanded and the Social Sciences Citation Index. As of the 2023 edition, journals from the Arts and Humanities Citation Index and the Emerging Sources Citation Index are also included.

Journal ranking is widely used in academic circles in the evaluation of an academic journal's impact and quality. Journal rankings are intended to reflect the place of a journal within its field, the relative difficulty of being published in that journal, and the prestige associated with it. They have been introduced as official research evaluation tools in several countries.

Web of Science: Online subscription index of citations

The Web of Science is a paid-access platform that provides access to multiple databases of reference and citation data from academic journals, conference proceedings, and other documents in various academic disciplines. It was produced by the Institute for Scientific Information until 1997 and is currently owned by Clarivate.

Journal of Chemical Theory and Computation: Academic journal

Journal of Chemical Theory and Computation is a peer-reviewed scientific journal, established in 2005 by the American Chemical Society. It is indexed in Chemical Abstracts Service (CAS), Scopus, British Library, and Web of Science. The current editor-in-chief is Laura Gagliardi. As of 2022, JCTC has published 18 volumes.

The Eigenfactor score, developed by Jevin West and Carl Bergstrom at the University of Washington, is a rating of the total importance of a scientific journal. Journals are rated according to the number of incoming citations, with citations from highly ranked journals weighted to make a larger contribution to the Eigenfactor than those from poorly ranked journals. As a measure of importance, the Eigenfactor score scales with the total impact of a journal. All else equal, journals generating higher impact on the field have larger Eigenfactor scores. Citation metrics like the Eigenfactor or PageRank-based scores reduce the effect of self-referential groups.

PageRank: Algorithm used by Google Search to rank web pages

PageRank (PR) is an algorithm used by Google Search to rank web pages in their search engine results. It is named after both the term "web page" and co-founder Larry Page. PageRank is a way of measuring the importance of website pages. According to Google:

PageRank works by counting the number and quality of links to a page to determine a rough estimate of how important the website is. The underlying assumption is that more important websites are likely to receive more links from other websites.
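As a rough illustration of this principle, the sketch below runs the PageRank implementation from the networkx library (assumed to be installed) on a tiny, made-up link graph:

```python
import networkx as nx

# Tiny hypothetical link graph: an edge A -> B means page A links to page B.
G = nx.DiGraph([
    ("A", "B"), ("A", "C"),
    ("B", "C"),
    ("C", "A"),
    ("D", "C"),
])

# Damping factor 0.85 is the value commonly cited for PageRank.
ranks = nx.pagerank(G, alpha=0.85)
for page, score in sorted(ranks.items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))
# Page C collects links from every other page, so it receives the highest score.
```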

Social Compass: Academic journal

Social Compass is a peer-reviewed academic journal that covers research in the field of sociology of religion. The journal's co-directors are Olivier Servais and Frédéric Laugrand. The current Editor is Carolina Sappia and the journal is published by SAGE Publications.

Altmetrics: Alternative metrics for analyzing scholarship

In scholarly and scientific publishing, altmetrics are non-traditional bibliometrics proposed as an alternative or complement to more traditional citation impact metrics, such as impact factor and h-index. The term altmetrics was proposed in 2010, as a generalization of article level metrics, and has its roots in the #altmetrics hashtag. Although altmetrics are often thought of as metrics about articles, they can be applied to people, journals, books, data sets, presentations, videos, source code repositories, web pages, etc.

Acta Ethologica is a triannual peer-reviewed scientific journal established in 1998. The journal covers all aspects of the behavioural biology of humans and other animals, including behavioural ecology, evolution of behaviour, sociobiology, ethology, behavioural physiology, and population biology.

Author-level metrics are citation metrics that measure the bibliometric impact of individual authors, researchers, academics, and scholars. Many metrics have been developed that take into account varying numbers of factors.

The domain authority of a website describes its relevance for a specific subject area or industry; Domain Authority is a search engine ranking score developed by Moz. This relevance has a direct impact on a website's ranking by search engines, which try to assess domain authority through automated analytic algorithms. The importance of domain authority for website listings in the search engine results pages (SERPs) gave rise to a whole industry of black-hat SEO providers who try to feign an increased level of domain authority. The ranking by major search engines, e.g. Google's PageRank, is agnostic of specific industries or subject areas and assesses a website in the context of the totality of websites on the Internet. The results on the SERP set PageRank in the context of a specific keyword. In a less competitive subject area, even websites with a low PageRank can achieve high visibility in search engines, because the highest-ranked sites matching the specific search words are placed in the first positions of the SERPs.

The International Journal of Health Services is a quarterly peer-reviewed academic journal covering health policy. It was established in 1971 and is published by SAGE Publications. The current editors-in-chief are Carles Muntaner and Joan Benach.

There are a number of approaches to ranking academic publishing groups and publishers. Rankings rely on subjective impressions by the scholarly community, on analyses of the prize winners of scientific associations, on discipline, on a publisher's reputation, and on its impact factor.

Dental Hypotheses: Academic journal

Dental Hypotheses is a quarterly peer-reviewed open access medical journal covering all aspects of dentistry. It was established in 2010 by Jafar Kolahi and Edward F. Rossomando. The journal is published by Medknow Publications and the editor-in-chief is Edward F. Rossomando. It is an official journal of the American Biodontics Society and the Center for Research and Education in Technology. In the 2020 SJR report the journal was ranked in quartile Q4. Dental Hypotheses is a member of the Committee on Publication Ethics (COPE).

Democratic Theory is a peer-reviewed journal published and distributed by Berghahn.

References

  1. Brembs, Björn (2018). "Prestigious Science Journals Struggle to Reach Even Average Reliability". Frontiers in Human Neuroscience. 12: 37. doi:10.3389/fnhum.2018.00037. PMC 5826185. PMID 29515380.
  2. Triggle, Chris R.; MacDonald, Ross; Triggle, David J.; Grierson, Donald (2022-04-03). "Requiem for impact factors and high publication charges". Accountability in Research. 29 (3): 133–164. doi:10.1080/08989621.2021.1909481. PMID 33787413. One might expect, therefore, that a high JIF factor indicates a higher standard of interest, accuracy and reliability of papers published therein. This is sometimes true but unfortunately is certainly not always the case (Brembs 2018, 2019). Thus, Björn Brembs (2019) concluded: "There is a growing body of evidence against our subjective notion of more prestigious journals publishing 'better' science. In fact, the most prestigious journals may be publishing the least reliable science."
  3. "SCImago Journal & Country Rank (SJR) as an alternative to Thomson Reuters's Impact Factor and EigenFactor". 21 August 2008. Archived from the original on 26 April 2013. Retrieved 20 September 2012.
  4. "Network-based Citation Metrics: Eigenfactor vs. SJR". 28 July 2015. Archived from the original on 26 April 2020. Retrieved 26 April 2020.
  5. "Scimago Lab". Scimago Lab website. Retrieved 22 July 2021.
  6. SCImago Journal & Country Rank. "Description of SCImago Journal Rank Indicator" (PDF). Retrieved 20 March 2018.
  7. Butler, Declan (2 January 2008). "Free journal-ranking tool enters citation market". Nature. 451 (6): 6. Bibcode:2008Natur.451....6B. doi:10.1038/451006a. PMID 18172465.
  8. Falagas, Matthew E.; et al. (2008). "Comparison of SCImago journal rank indicator with journal impact factor". The FASEB Journal. 22 (8): 2623–2628. doi:10.1096/fj.08-107938. PMID 18408168.