Scientometrics

Scientometrics is a subfield of informetrics that studies quantitative aspects of scholarly literature. Major research issues include the measurement of the impact of research papers and academic journals, the understanding of scientific citations, and the use of such measurements in policy and management contexts. [1] In practice there is a significant overlap between scientometrics and other scientific fields such as information systems, information science, science of science policy, sociology of science, and metascience. Critics have argued that overreliance on scientometrics has created a system of perverse incentives, producing a publish or perish environment that leads to low-quality research.

Historical development

Modern scientometrics is mostly based on the work of Derek J. de Solla Price and Eugene Garfield. [2] [3] [4] [5] The latter created the Science Citation Index [1] and founded the Institute for Scientific Information, which is heavily used for scientometric analysis. A dedicated academic journal, Scientometrics, was established in 1978. The industrialization of science increased the number of publications and research outcomes, and the rise of computers allowed effective analysis of this data. [6] While the sociology of science focused on the behavior of scientists, scientometrics focused on the analysis of publications. [1] Accordingly, scientometrics is also referred to as the scientific and empirical study of science and its outcomes. [7] [8]

The International Society for Scientometrics and Informetrics, founded in 1993, is an association of professionals in the field. [9]

Later, around the turn of the century, the evaluation and ranking of scientists and institutions came more into the spotlight. Based on bibliometric analysis of scientific publications and citations, the Academic Ranking of World Universities ("Shanghai ranking") was first published in 2004 by Shanghai Jiao Tong University. Impact factors became an important tool for choosing between journals, and rankings such as the Academic Ranking of World Universities and the Times Higher Education World University Rankings (THE ranking) became indicators of the status of universities. The h-index became an important indicator of the productivity and impact of a scientist's work, although alternative author-level metrics have been proposed. [10] [11]

Around the same time, the interest of governments in evaluating research for the purpose of assessing the impact of science funding increased. As the investments in scientific research were included as part of the U.S. American Recovery and Reinvestment Act of 2009 (ARRA), a major economic stimulus package, programs like STAR METRICS were set up to assess if the positive impact on the economy would actually occur. [12]

Methods and findings

Methods of research include qualitative, quantitative and computational approaches. The main focus of studies has been on institutional productivity comparisons, institutional research rankings, journal rankings, [7] [8] [13] establishing faculty productivity and tenure standards, [14] assessing the influence of top scholarly articles, [15] and developing profiles of top authors and institutions in terms of research performance. [16]

One significant finding in the field is a principle of cost escalation: achieving further findings at a given level of importance grows exponentially more costly in the expenditure of effort and resources. However, new algorithmic methods in search, machine learning and data mining suggest that this is not the case for many information retrieval and extraction-based problems. [citation needed]

More recent methods rely on open source and open data to ensure transparency and reproducibility in line with modern open science requirements. For instance, the Unpaywall index and attendant research on open access trends is based on data retrieved from OAI-PMH endpoints of thousands of open archives provided by libraries and institutions worldwide. [17]
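Harvesting metadata from an OAI-PMH endpoint, as such studies do, can be sketched as follows. This is an illustrative sketch, not the Unpaywall implementation: the base URL is a placeholder, and a real harvester must also follow `resumptionToken` paging and respect rate limits.

```python
# Minimal OAI-PMH harvesting sketch (illustrative; endpoint URL is a placeholder).
from urllib.parse import urlencode
import xml.etree.ElementTree as ET

OAI = "{http://www.openarchives.org/OAI/2.0/}"  # OAI-PMH XML namespace

def list_records_url(base_url, metadata_prefix="oai_dc"):
    """Build a ListRecords request URL for an OAI-PMH endpoint."""
    return base_url + "?" + urlencode(
        {"verb": "ListRecords", "metadataPrefix": metadata_prefix})

def record_identifiers(xml_text):
    """Extract record identifiers from a ListRecords XML response."""
    root = ET.fromstring(xml_text)
    return [el.text for el in root.iter(OAI + "identifier")]

print(list_records_url("https://example.org/oai"))
# https://example.org/oai?verb=ListRecords&metadataPrefix=oai_dc
```

In practice the response for each request is fetched over HTTP and fed to `record_identifiers`; repeating the request with the returned `resumptionToken` walks the full archive.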

Recommendations for avoiding common errors in scientometrics include: selecting topics with sufficient data, using data mining and web scraping, combining methods, and eliminating "false positives". [18] [19] It is also necessary to understand the limits of search engines (e.g. Web of Science, Scopus and Google Scholar), which fail to index thousands of studies in small journals and in underdeveloped countries. [20]

Common scientometric indexes

Indexes may be classified as article-level metrics, author-level metrics, and journal-level metrics depending on which feature they evaluate.

Impact factor

The impact factor (IF) or journal impact factor (JIF) of an academic journal is a measure reflecting the yearly average number of citations to recent articles published in that journal. It is frequently used as a proxy for the relative importance of a journal within its field; journals with higher impact factors are often deemed to be more important than those with lower ones. The impact factor was devised by Eugene Garfield, the founder of the Institute for Scientific Information (ISI).
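The arithmetic behind the two-year impact factor can be shown with a toy calculation; all numbers below are invented for illustration.

```python
# Toy illustration of the two-year journal impact factor (JIF).
# All numbers are invented for demonstration.

def impact_factor(citations_to_prev_two_years, items_published_prev_two_years):
    """JIF for year Y = citations received in Y to items from years Y-1 and Y-2,
    divided by the number of citable items published in Y-1 and Y-2."""
    return citations_to_prev_two_years / items_published_prev_two_years

# e.g. 210 citations in 2023 to articles the journal published in 2021-2022,
# which comprised 60 citable items:
print(impact_factor(210, 60))  # 3.5
```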

Science Citation Index

The Science Citation Index (SCI) is a citation index originally produced by the Institute for Scientific Information (ISI) and created by Eugene Garfield. It was officially launched in 1964. It is now owned by Clarivate Analytics (previously the Intellectual Property and Science business of Thomson Reuters). [21] [22] [23] [24] The larger version (Science Citation Index Expanded) covers more than 8,500 notable and significant journals, across 150 disciplines, from 1900 to the present. These are alternatively described as the world's leading journals of science and technology, because of a rigorous selection process. [25] [26] [27]

Acknowledgment index

An acknowledgment index (British acknowledgement index) [28] is a method for indexing and analyzing acknowledgments in the scientific literature and thus quantifying the impact of acknowledgments. Typically, a scholarly article has a section in which the authors acknowledge entities such as funders, technical staff, and colleagues that have contributed materials or knowledge or have influenced or inspired their work. Like a citation index, it measures influences on scientific work, but in a different sense: it measures institutional and economic influences as well as informal influences of individual people, ideas, and artifacts. Unlike the impact factor, it does not produce a single overall metric, but analyzes the components separately. However, the total number of acknowledgments of an acknowledged entity can be measured, and so can the number of citations to the papers in which the acknowledgments appear. The ratio of this total number of citations to the total number of papers in which the acknowledged entity appears can be construed as the impact of that acknowledged entity. [29] [30]
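The ratio described above can be sketched in a few lines; the acknowledgment records here are invented, and only the ratio logic follows the definition in the text.

```python
# Hypothetical acknowledgment records: (acknowledged_entity, citations_to_paper).
# The data are invented; the ratio is total citations / papers per entity.
from collections import defaultdict

def acknowledgment_impact(records):
    """Mean citations per paper in which each acknowledged entity appears."""
    totals = defaultdict(lambda: [0, 0])  # entity -> [citation_sum, paper_count]
    for entity, cites in records:
        totals[entity][0] += cites
        totals[entity][1] += 1
    return {e: s / n for e, (s, n) in totals.items()}

records = [("NSF", 40), ("NSF", 10), ("J. Doe", 5)]
print(acknowledgment_impact(records))  # {'NSF': 25.0, 'J. Doe': 5.0}
```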

Altmetrics

In scholarly and scientific publishing, altmetrics are nontraditional bibliometrics [31] proposed as an alternative [32] or complement [33] to more traditional citation impact metrics, such as the impact factor and h-index. [34] The term altmetrics was proposed in 2010 [35] as a generalization of article-level metrics [36] and has its roots in the #altmetrics hashtag. Although altmetrics are often thought of as metrics about articles, they can be applied to people, journals, books, data sets, presentations, videos, source code repositories, web pages, etc. Altmetrics use public APIs across platforms to gather data with open scripts and algorithms. Altmetrics did not originally cover citation counts [37] but instead calculate scholarly impact based on diverse online research output, such as social media, online news media, and online reference managers. [38] [39] They demonstrate both the impact and the detailed composition of that impact. [35] Altmetrics can be applied to research filtering, [35] promotion and tenure dossiers, grant applications, [40] [41] and the ranking of newly published articles in academic search engines. [42]
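The idea that altmetrics capture both impact and its composition can be sketched with a toy aggregation. The platform names and events below are invented; real systems collect such mention events from public platform APIs.

```python
# Toy altmetric aggregation: fold per-platform mention events into an
# article profile. Event data are invented for illustration.

def altmetric_profile(mentions):
    """mentions: iterable of (article_id, platform) mention events.
    Returns per-article counts broken down by platform."""
    profile = {}
    for article, platform in mentions:
        counts = profile.setdefault(article, {})
        counts[platform] = counts.get(platform, 0) + 1
    return profile

events = [("10.1/abc", "twitter"), ("10.1/abc", "news"), ("10.1/abc", "twitter")]
print(altmetric_profile(events))  # {'10.1/abc': {'twitter': 2, 'news': 1}}
```

Keeping the per-platform breakdown, rather than a single summed score, is what lets altmetrics show the composition of impact as well as its size.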

Criticisms

Critics have argued that overreliance on scientometrics has created a publish or perish environment with perverse incentives that lead to low-quality research. [43] [44]

The main character in Michael Frayn’s novel Skios is a Professor of Scientometrics.

References and footnotes

  1. Leydesdorff, L. and Milojevic, S., "Scientometrics", arXiv:1208.4566 (2013); forthcoming in: Lynch, M. (ed.), International Encyclopedia of Social and Behavioral Sciences, subsection 85030 (2015)
  2. Nalimov, Vasily Vasilyevich; Mulchenko, B. M. (1969). "Scientometrics": The Study of Science as a Process of Information (in Russian). Moscow: Nauka.
  3. Garfield, Eugene (2009). "From the science of science to Scientometrics visualizing the history of science with HistCite software" (PDF). Journal of Informetrics. 3 (3): 173–179. doi:10.1016/j.joi.2009.03.009. ISSN   1751-1577 . Retrieved 15 May 2021.
  4. Валеев, Д. Х.; Голубцов, В. Г. (2018). "Юридическая наукометрия и цивилистические исследования" [Legal scientometrics and civil-law research]. Методологические проблемы цивилистических исследований [Methodological Problems of Civil-Law Research] (in Russian): 45–57. Retrieved 15 May 2021.
  5. Борисов, М. В.; Майсуразде, А. И. (2014). Восстановление связей в научном рубрикаторе на основе кластеризации гетерогенной сети [Restoring links in a scientific subject classifier based on clustering of a heterogeneous network] (PDF) (Thesis) (in Russian). Lomonosov Moscow State University. Retrieved 15 May 2021.
  6. De Solla Price, D., editorial statement. Scientometrics Volume 1, Issue 1 (1978)
  7. Lowry, Paul Benjamin; Romans, Denton; Curtis, Aaron (2004). "Global journal prestige and supporting disciplines: A scientometric study of information systems journals". Journal of the Association for Information Systems. 5 (2): 29–80. doi:10.17705/1jais.00045. SSRN 666145.
  8. Lowry, Paul Benjamin; Moody, Gregory D.; Gaskin, James; Galletta, Dennis F.; Humpherys, Sean; Barlow, Jordan B.; Wilson, David W. (2013). "Evaluating journal quality and the Association for Information Systems (AIS) Senior Scholars' journal basket via bibliometric measures: Do expert journal assessments add value?". MIS Quarterly (MISQ). 37 (4): 993–1012. Also, see a YouTube video narrative of this paper at: https://www.youtube.com/watch?v=LZQIDkA-ke0.
  9. "About". International Society for Scientometrics and Informetrics. Retrieved 2021-01-18.
  10. Belikov, A.V.; Belikov, V.V. (2015). "A citation-based, author- and age-normalized, logarithmic index for evaluation of individual researchers independently of publication counts". F1000Research. 4: 884. doi: 10.12688/f1000research.7070.1 . PMC   4654436 .
  11. Kinouchi, O. (2018). "A simple centrality index for scientific social recognition". Physica A: Statistical Mechanics and Its Applications. 491: 632–640. arXiv: 1609.05273 . Bibcode:2018PhyA..491..632K. doi:10.1016/j.physa.2017.08.072. S2CID   22795899.
  12. Lane, J (2009). "Assessing the Impact of Science Funding". Science. 324 (5932): 1273–1275. doi:10.1126/science.1175335. PMID   19498153. S2CID   206520769.
  13. Lowry, Paul Benjamin; Humphreys, Sean; Malwitz, Jason; Nix, Joshua C (2007). "A scientometric study of the perceived quality of business and technical communication journals". IEEE Transactions on Professional Communication. 50 (4): 352–378. doi:10.1109/TPC.2007.908733. S2CID   40366182. SSRN   1021608. Recipient of the Rudolph Joenk Award for Best Paper Published in IEEE Transactions on Professional Communication in 2007.
  14. Dean, Douglas L; Lowry, Paul Benjamin; Humpherys, Sean (2011). "Profiling the research productivity of tenured information systems faculty at U.S. institutions". MIS Quarterly. 35 (1): 1–15. doi:10.2307/23043486. JSTOR   23043486. SSRN   1562263.
  15. Karuga, Gilbert G.; Lowry, Paul Benjamin; Richardson, Vernon J. (2007). "Assessing the impact of premier information systems research over time". Communications of the Association for Information Systems. 19 (7): 115–131. doi: 10.17705/1CAIS.01907 . SSRN   976891.
  16. Lowry, Paul Benjamin; Karuga, Gilbert G.; Richardson, Vernon J. (2007). "Assessing leading institutions, faculty, and articles in premier information systems research journals". Communications of the Association for Information Systems. 20 (16): 142–203. doi: 10.17705/1CAIS.02016 . SSRN   1021603.
  17. Piwowar, Heather; Priem, Jason; Orr, Richard (2019-10-09). "The Future of OA: A large-scale analysis projecting Open Access publication and readership". bioRxiv   10.1101/795310 .
  18. Han, J.; Kamber, M.; Pei, J. (2012). Data Mining: Concepts and Techniques. Morgan Kaufmann, Waltham, MA, USA.
  19. Quintero, Erika; Saavedra, Dalys; Murillo, Danny (2018). "Extracción de datos de perfiles en Google Scholar utilizando un algoritmo en el lenguaje R para hacer minería de datos". I+D Tecnológico. 14: 94–104. doi: 10.33412/idt.v14.1.1807 . S2CID   165340425.
  20. Añino Ramos, Yostin Jesús; Monge Najera, Julian; Murillo-Gonzalez, Danny; Michán-Aguirre, Layla (2021). "Cómo aplicar la cienciometría a la investigación ecológica". Ecosistemas. 30 (2): 1–4. doi: 10.7818/ECOS.2256 . S2CID   238733389.
  21. Garfield, E. (1955). "Citation Indexes for Science: A New Dimension in Documentation through Association of Ideas". Science . 122 (3159): 108–111. Bibcode:1955Sci...122..108G. doi:10.1126/science.122.3159.108. PMID   14385826.
  22. Garfield, Eugene (2011). "The evolution of the Science Citation Index" (PDF). International Microbiology . 10 (1): 65–69. doi:10.2436/20.1501.01.10. PMID   17407063.
  23. Garfield, Eugene (1963). "Science Citation Index" (PDF). University of Pennsylvania Garfield Library. pp. v–xvi. Retrieved 2013-05-27.
  24. "History of Citation Indexing". Clarivate Analytics. November 2010. Retrieved 2010-11-04.
  25. "Science Citation Index Expanded" . Retrieved 2017-01-17.
  26. Ma, Jiupeng; Fu, Hui-Zhen; Ho, Yuh-Shan (December 2012). "The Top-cited Wetland Articles in Science Citation Index Expanded: characteristics and hotspots". Environmental Earth Sciences. 70 (3): 1039. doi:10.1007/s12665-012-2193-y. S2CID 18502338.
  27. Ho, Yuh-Shan (2012). "The top-cited research works in the Science Citation Index Expanded" (PDF). Scientometrics . 94 (3): 1297. doi:10.1007/s11192-012-0837-z. S2CID   1301373.
  28. "Acknowledgement vs. Acknowledgment". 22 September 2012.
  29. Councill, Isaac G.; Giles, C. Lee; Han, Hui; Manavoglu, Eren (2005). "Automatic acknowledgement indexing: expanding the semantics of contribution in the CiteSeer digital library". Proceedings of the 3rd international conference on Knowledge capture. K-CAP '05. pp. 19–26. CiteSeerX   10.1.1.59.1661 . doi:10.1145/1088622.1088627. ISBN   1-59593-163-5.
  30. Giles, C. L.; Councill, I. G. (December 15, 2004). "Who gets acknowledged: Measuring scientific contributions through automatic acknowledgment indexing" (PDF). Proc. Natl. Acad. Sci. U.S.A. 101 (51): 17599–17604. Bibcode:2004PNAS..10117599G. doi: 10.1073/pnas.0407743101 . PMC   539757 . PMID   15601767.
  31. "PLOS Collections". Public Library of Science (PLOS). 3 November 2021. "Altmetrics is the study and use of nontraditional scholarly impact measures that are based on activity in web-based environments."
  32. "The 'alt' does indeed stand for 'alternative'." Jason Priem, leading author of the Altmetrics Manifesto, comment 592.
  33. Haustein, Stefanie; Peters, Isabella; Sugimoto, Cassidy R.; Thelwall, Mike; Larivière, Vincent (2014-04-01). "Tweeting biomedicine: An analysis of tweets and citations in the biomedical literature". Journal of the Association for Information Science and Technology. 65 (4): 656–669. arXiv: 1308.1838 . doi:10.1002/asi.23101. ISSN   2330-1643. S2CID   11113356.
  34. Chavda, Janica; Patel, Anika (30 December 2015). "Measuring research impact: bibliometrics, social media, altmetrics, and the BJGP". British Journal of General Practice. 66 (642): e59–e61. doi:10.3399/bjgp16X683353. PMC   4684037 . PMID   26719483.
  35. Priem, Jason; Taraborelli, Dario; Groth, Paul; Neylon, Cameron (September 28, 2011). "Altmetrics: A manifesto (v 1.01)". Altmetrics.
  36. Binfield, Peter (9 November 2009). "Article-Level Metrics at PLoS - what are they, and why should you care?" (Video). University of California, Berkeley . Archived from the original on 2021-12-12.
  37. Bartling, Sönke; Friesike, Sascha (2014). Opening Science: The Evolving Guide on How the Internet Is Changing Research, Collaboration and Scholarly Publishing. Cham: Springer International Publishing. p.  181. doi:10.1007/978-3-319-00026-8. ISBN   978-3-31-900026-8. OCLC   906269135. Altmetrics and article-level metrics are sometimes used interchangeably, but there are important differences: article-level metrics also include citations and usage data; ...
  38. Mcfedries, Paul (August 2012). "Measuring the impact of altmetrics [Technically Speaking]". IEEE Spectrum. 49 (8): 28. doi:10.1109/MSPEC.2012.6247557. ISSN   0018-9235.
  39. Galligan, Finbar; Dyas-Correia, Sharon (March 2013). "Altmetrics: Rethinking the Way We Measure". Serials Review. 39 (1): 56–61. doi:10.1016/j.serrev.2013.01.003.
  40. Moher, David; Naudet, Florian; Cristea, Ioana A.; Miedema, Frank; Ioannidis, John P. A.; Goodman, Steven N. (2018-03-29). "Assessing scientists for hiring, promotion, and tenure". PLOS Biology. 16 (3): e2004089. doi: 10.1371/journal.pbio.2004089 . ISSN   1545-7885. PMC   5892914 . PMID   29596415.
  41. Nariani, Rajiv (2017-03-24). "Supplementing Traditional Ways of Measuring Scholarly Impact: The Altmetrics Way". ACRL 2017 Conference Proceedings. hdl:10315/33652.
  42. Mehrazar, Maryam; Kling, Christoph Carl; Lemke, Steffen; Mazarakis, Athanasios; Peters, Isabella (2018-04-08). "Can We Count on Social Media Metrics? First Insights into the Active Scholarly Use of Social Media". Proceedings of the 10th ACM Conference on Web Science. p. 215. arXiv: 1804.02751 . doi: 10.1145/3201064.3201101 . ISBN   978-1-4503-5563-6.
  43. Roche, Christopher (2022-08-31). "The research conduct spectrum for surgeons: your career in their rule bending hands?". Bull Roy Coll Surg Engl. 104 (6): 274–277. doi: 10.1308/rcsbull.2022.112 .
  44. Weingart, Peter (2005-01-01). "Impact of bibliometrics upon the science system: Inadvertent consequences?". Scientometrics. 62 (1): 117–131. doi:10.1007/s11192-005-0007-7. ISSN   0138-9130. S2CID   12359334.
