Article-level metrics

Article-level metrics are citation metrics that measure the usage and impact of individual scholarly articles.

Adoption

Traditionally, bibliometrics have been used to evaluate the usage and impact of research, but they have usually focused on journal-level metrics such as the impact factor or researcher-level metrics such as the h-index.[1] Article-level metrics, by contrast, measure the impact of an individual article. They are related to, but distinct from, altmetrics.[2]

Starting in March 2009, the Public Library of Science (PLOS) introduced article-level metrics for all articles.[3] The open-access publisher provides article-level metrics for all of its journals,[4] including downloads, citations, and altmetrics.[5] In March 2014 it was announced that COUNTER statistics, which measure the usage of online scholarly resources, were available at the article level.[6]
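The kind of aggregation described above can be illustrated with a minimal sketch. The source names and counts below are hypothetical, not taken from any real PLOS or Lagotto API response; the sketch only shows the general idea of rolling per-source counts for one article into a single summary, with usage (views and downloads) reported separately from citations, as COUNTER-style statistics distinguish them.

```python
from collections import Counter

# Hypothetical per-source counts for a single article. Real article-level
# metrics services aggregate similar sources (downloads, citations, mentions).
sources = [
    {"source": "pdf_downloads", "count": 1520},
    {"source": "html_views", "count": 8430},
    {"source": "crossref_citations", "count": 12},
    {"source": "twitter_mentions", "count": 47},
]

def summarize(events):
    """Roll raw per-source counts up into one article-level summary."""
    totals = Counter()
    for event in events:
        totals[event["source"]] += event["count"]
    # Usage (views + downloads) is reported separately from citation counts,
    # mirroring the distinction between usage statistics and citation impact.
    return {
        "usage": totals["pdf_downloads"] + totals["html_views"],
        "citations": totals["crossref_citations"],
        "altmetric_mentions": totals["twitter_mentions"],
    }

print(summarize(sources))
# → {'usage': 9950, 'citations': 12, 'altmetric_mentions': 47}
```

A real service would fetch these counts from each provider on a schedule and store the per-source totals, so that both the combined summary and the individual sources remain inspectable.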


References

  1. "Article-Level Metrics". SPARC. Archived from the original on 25 March 2014. Retrieved 13 March 2014.
  2. "Article-Level Metrics: A SPARC Primer" (PDF). SPARC. April 2013. Archived from the original (PDF) on 13 March 2014. Retrieved 13 March 2014.
  3. Fenner, Martin (1 July 2005). "Article-Level Metrics Information". Lagotto. PLoS ONE. Retrieved 29 May 2012.
  4. "Overview". PLOS: Article-Level Metrics. Archived from the original on 14 February 2014. Retrieved 13 March 2014.
  5. Pattinson, Damian (March 2014). "The future is open: opportunities for publishers and institutions". Insights. 27 (1): 38–44. doi:10.1629/2048-7754.139.
  6. "Introduction to Release 1 of the COUNTER Code of Practice for Articles". COUNTER. Archived from the original on 22 March 2014. Retrieved 21 March 2014.

Further reading