Scientific citation is the practice of providing detailed references in a scientific publication, typically a paper or book, to previously published (or occasionally private) communications that have a bearing on the subject of the new publication. The purpose of citations in original work is to allow readers of the paper to refer to the cited work in judging the new work, to supply background information vital for future development, and to acknowledge the contributions of earlier workers.
To a considerable extent, and in the absence of other criteria, the quality of work is judged by the number of citations received, adjusted for the volume of work on the relevant topic. While this is not necessarily a reliable measure, counting citations is trivially easy, whereas judging the merit of complex work can be very difficult.
Previous work may be cited regarding experimental procedures, apparatus, goals, theses, and previous theoretical results upon which the new work builds. Typically such citations establish the general framework of influences and the mindset of the research, indicate what part of science the work belongs to, and can help determine who conducts the peer review.
In patent law, the citation of previous works, or prior art, helps establish the uniqueness of the invention being described. The focus of this practice is to claim originality for commercial purposes, so the author is motivated to avoid citing works that cast doubt on that originality; in this respect it is not "scientific" citation. Inventors and their lawyers nevertheless have a legal obligation to cite all relevant art; failing to do so risks invalidating the patent. The patent examiner is obliged to list all further prior art found in searches.
A digital object identifier (DOI) is a persistent identifier or handle used to uniquely identify various objects, standardized by the International Organization for Standardization (ISO). [1] DOIs are an implementation of the Handle System; [2] [3] they also fit within the URI system (Uniform Resource Identifier). They are widely used to identify academic, professional, and government information, such as journal articles, research reports, data sets, and official publications.
A DOI aims to resolve to its target, the information object to which the DOI refers. This is achieved by binding the DOI to metadata about the object, such as a URL where the object is located. Thus, by being actionable and interoperable, a DOI differs from identifiers such as ISBNs and ISRCs, which are identifiers only. The DOI system uses the indecs Content Model to represent metadata.

Citation analysis is a method widely used in metascience:
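The mechanics described above can be sketched in a few lines: a DOI has the form prefix/suffix, where the prefix begins with the directory indicator "10." followed by a registrant code, and the public doi.org proxy resolves a DOI to the current location of its target. The sketch below deliberately simplifies validation and is illustrative only:

```python
def parse_doi(doi: str) -> tuple[str, str]:
    """Split a DOI into its registrant prefix and suffix.

    Every DOI has the form "<prefix>/<suffix>": the prefix begins
    with the directory indicator "10." followed by a registrant
    code, and the suffix is chosen by the registrant.
    """
    prefix, _, suffix = doi.partition("/")
    if not prefix.startswith("10.") or not suffix:
        raise ValueError(f"not a valid DOI: {doi!r}")
    return prefix, suffix


def resolver_url(doi: str) -> str:
    """Return the URL at which the DOI resolves via the doi.org proxy."""
    parse_doi(doi)  # validate the basic syntax first
    return f"https://doi.org/{doi}"


# "10.1000/182" is the DOI of the DOI Handbook itself.
print(parse_doi("10.1000/182"))
print(resolver_url("10.1000/182"))
```

Following such a URL issues an HTTP redirect to wherever the registrant currently hosts the object, which is what makes the identifier actionable rather than merely descriptive.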
Citation analysis is the examination of the frequency, patterns, and graphs of citations in documents. It uses the directed graph of citations — links from one document to another document — to reveal properties of the documents. A typical aim would be to identify the most important documents in a collection. A classic example is that of the citations between academic articles and books. [4] [5] For another example, judges of law support their judgements by referring back to judgements made in earlier cases (see citation analysis in a legal context). An additional example is provided by patents which contain prior art, citation of earlier patents relevant to the current claim. The digitization of patent data and increasing computing power have led to a community of practice that uses these citation data to measure innovation attributes, trace knowledge flows, and map innovation networks. [6]
Documents can be associated with many other features in addition to citations, such as authors, publishers, journals as well as their actual texts. The general analysis of collections of documents is known as bibliometrics and citation analysis is a key part of that field. For example, bibliographic coupling and co-citation are association measures based on citation analysis (shared citations or shared references). The citations in a collection of documents can also be represented in forms such as a citation graph, as pointed out by Derek J. de Solla Price in his 1965 article "Networks of Scientific Papers". [7] This means that citation analysis draws on aspects of social network analysis and network science.
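The directed-graph view above, including the bibliographic-coupling and co-citation measures, can be made concrete with a small sketch (the paper names are hypothetical):

```python
from collections import defaultdict

# A tiny citation graph: each key cites the papers in its list.
# Paper names are hypothetical, for illustration only.
cites = {
    "A": ["C", "D"],
    "B": ["C"],
    "C": ["D"],
    "D": [],
}

# A paper's citation count is its in-degree in the directed graph.
counts = defaultdict(int)
for paper, refs in cites.items():
    for ref in refs:
        counts[ref] += 1


def coupling(p: str, q: str) -> int:
    """Bibliographic coupling: number of references p and q share."""
    return len(set(cites[p]) & set(cites[q]))


def cocitation(p: str, q: str) -> int:
    """Co-citation: number of papers that cite both p and q."""
    return sum(1 for refs in cites.values() if p in refs and q in refs)


print(dict(counts))        # C and D each receive two citations
print(coupling("A", "B"))  # A and B share one reference (C)
print(cocitation("C", "D"))  # one paper (A) cites both C and D
```

On this graph, C and D are the "most important" documents by raw in-degree; real citation analysis applies the same idea, plus network measures, to millions of nodes.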
An early example of automated citation indexing was CiteSeer, which was used for citations between academic papers, while Web of Science is an example of a modern system that indexes a wider range of information sources than academic books and articles alone. Today, automated citation indexing [8] has changed the nature of citation analysis research, allowing millions of citations to be analyzed for large-scale patterns and knowledge discovery. Citation analysis tools can be used to compute various impact measures for scholars based on data from citation indices. [9] [10] These have various applications, from identifying expert referees to review papers and grant proposals, to providing transparent data in support of academic merit review, tenure, and promotion decisions. This competition for limited resources may lead to ethically questionable behavior to increase citations. [11] [12]
A great deal of criticism has been made of the practice of naively using citation analyses to compare the impact of different scholarly articles without taking into account other factors which may affect citation patterns. [13] Among these criticisms, a recurrent one focuses on "field-dependent factors": citation practices vary from one area of science to another, and even between fields of research within a discipline. [14]

Modern scientists are sometimes judged by the number of times their work is cited by others; this is a key indicator of the relative importance of a work in science. Accordingly, individual scientists are motivated to have their own work cited early, often, and as widely as possible, while all other scientists are motivated to eliminate unnecessary citations so as not to devalue this means of judgment. [15] A formal citation index tracks which refereed and reviewed papers have cited which other such papers. Baruch Lev and other advocates of accounting reform consider the number of times a patent is cited to be a significant metric of its quality, and thus of innovation. Reviews often replace citations to primary studies. [16]
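One common correction for such field-dependent factors is to normalize a paper's citation count by the average count for papers from the same field and publication year. A minimal sketch, with hypothetical numbers (the function name is illustrative, not a standard API):

```python
from statistics import mean


def field_normalized_citations(paper_cites: int, field_cites: list[int]) -> float:
    """Citations of one paper divided by the mean citation count of
    papers from the same field and publication year.

    A value above 1.0 means the paper is cited more than the
    field-and-year average; below 1.0, less.
    """
    return paper_cites / mean(field_cites)


# A paper with 10 citations in a cohort averaging 5 citations scores 2.0.
print(field_normalized_citations(10, [2, 4, 6, 8]))
```

This kind of normalization is why comparing raw counts across, say, mathematics and molecular biology is misleading: the denominators differ by an order of magnitude.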
Citation-frequency is one indicator used in scientometrics.
Some studies explore citations and citation frequencies. Researchers have found that papers in leading journals whose findings cannot be replicated tend to be cited more than reproducible science. Results that are published irreproducibly, or not in a sufficiently transparent and replicable way, are more likely to be wrong and may slow progress; according to one author, "a simple way to check how often studies have been repeated, and whether or not the original findings are confirmed" is needed. The authors also put forward possible explanations for this state of affairs. [17] [18]
Two metascientists reported that in a growing scientific field, citations disproportionately go to already well-cited papers, which may slow and inhibit canonical progress in some cases. They find that "structures fostering disruptive scholarship and focusing attention on novel ideas" could be important. [20] [21] [22]
Other metascientists introduced the 'CD index', intended to characterize "how papers and patents change networks of citations in science and technology", and reported that it has declined, which they interpreted as "slowing rates of disruption". They proposed linking this to changes in three indicators of the use of previous knowledge, which they interpreted as "contemporary discovery and invention" being informed by "a narrower scope of existing knowledge". The overall number of papers has risen while the total of "highly disruptive" papers has not; the 1998 discovery of the accelerating expansion of the universe, for example, has a CD index of 0. Their results also suggest scientists and inventors "may be struggling to keep up with the pace of knowledge expansion". [23] [21] [19]
Recommendation systems sometimes also use citations to find studies similar to the one the user is currently reading, or studies the user may find relevant. [25] Better availability of integrable open citation information could help in addressing the "overwhelming amount of scientific literature". [24]
Knowledge agents may use citations to find studies that are relevant to the user's query; in particular, citation statements are used by scite.ai to answer a question while providing the associated reference(s). [26]
There have been analyses of citations of science information on Wikipedia, e.g. enabling lists of the most relevant or most-cited scientific journals, categories, and dominant domains. [27] Since 2015, the altmetrics platform Altmetric.com has shown citing English Wikipedia articles for a given study, later adding other language editions. [27] [28] The Wikimedia platform under development, Scholia, also shows "Wikipedia mentions" of scientific works. [29] A study suggests a citation on Wikipedia "could be considered a public parallel to scholarly citation". [30] A scientific publication being "cited in a Wikipedia article is considered an indicator of some form of impact for this publication", and it may be possible to detect certain publications through changes to Wikipedia articles. [31] Wikimedia Research's Cite-o-Meter tool showed a league table of which academic publishers are most cited on Wikipedia, [30] as does a page by the "Academic Journals WikiProject". [32] [33] Research indicates a large share of academic citations on the platform are paywalled and hence inaccessible to many readers. [34] [35] "[citation needed]" is a tag added by Wikipedia editors to unsourced statements in articles, requesting that citations be added. [36] The phrase reflects Wikipedia's policies of verifiability and no original research and has become a general Internet meme. [37]
The tool scite.ai tracks and links citations of papers as 'Supporting', 'Mentioning', or 'Contrasting' the study, differentiating between these citation contexts to some degree. This may be useful for evaluation and metrics, and, for example, for discovering studies or statements that contrast statements within a specific study. [39] [40] [41]
The Scite Reference Check bot is an extension of scite.ai that scans new article PDFs "for references to retracted papers, and posts both the citing and retracted papers on Twitter" and also "flags when new studies cite older ones that have issued corrections, errata, withdrawals, or expressions of concern". [41] Studies have suggested as few as 4% of citations to retracted papers clearly recognize the retraction. [41] Research found "that authors tend to keep citing retracted papers long after they have been red flagged, although at a lower rate". [42]
A citation is a reference to a source. More precisely, a citation is an abbreviated alphanumeric expression embedded in the body of an intellectual work that denotes an entry in the bibliographic references section of the work for the purpose of acknowledging the relevance of the works of others to the topic of discussion at the spot where the citation appears.
Open access (OA) is a set of principles and a range of practices through which nominally copyrightable publications are delivered to readers free of access charges or other barriers. With open access strictly defined, or libre open access, barriers to copying or reuse are also reduced or removed by applying an open license for copyright, which regulates post-publication uses of the work.
Scientific literature encompasses a vast body of academic papers that spans various disciplines within the natural and social sciences. It primarily consists of academic papers that present original empirical research and theoretical contributions. These papers serve as essential sources of knowledge and are commonly referred to simply as "the literature" within specific research fields.
A citation index is a kind of bibliographic index, an index of citations between publications, allowing the user to easily establish which later documents cite which earlier documents. A form of citation index is first found in 12th-century Hebrew religious literature. Legal citation indexes are found in the 18th century and were made popular by citators such as Shepard's Citations (1873). In 1961, Eugene Garfield's Institute for Scientific Information (ISI) introduced the first citation index for papers published in academic journals, first the Science Citation Index (SCI), and later the Social Sciences Citation Index (SSCI) and the Arts and Humanities Citation Index (AHCI). The American Chemical Society converted its printed Chemical Abstracts Service into the internet-accessible SciFinder in 2008. The first automated citation indexing was done by CiteSeer in 1997 and was patented. Other sources for such data include Google Scholar, Microsoft Academic, Elsevier's Scopus, and the National Institutes of Health's iCite.
The Institute for Scientific Information (ISI) was an academic publishing service, founded by Eugene Garfield in Philadelphia in 1956. ISI offered scientometric and bibliographic database services. Its specialty was citation indexing and analysis, a field pioneered by Garfield.
The impact factor (IF) or journal impact factor (JIF) of an academic journal is a scientometric index calculated by Clarivate that reflects the yearly mean number of citations of articles published in the last two years in a given journal, as indexed by Clarivate's Web of Science.
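Concretely, the two-year impact factor for year Y is the number of citations received in Y by items the journal published in Y-1 and Y-2, divided by the number of citable items it published in those two years. A minimal sketch with hypothetical numbers:

```python
def impact_factor(citations_in_year: int, items_published: int) -> float:
    """Two-year journal impact factor for a given year Y.

    citations_in_year: citations received in year Y by articles the
        journal published in years Y-1 and Y-2 (combined).
    items_published: citable items the journal published in Y-1 and Y-2.
    """
    return citations_in_year / items_published


# Hypothetical journal: 210 citations in 2024 to the 140 articles
# it published in 2022 and 2023 gives an impact factor of 1.5.
print(impact_factor(210, 140))
```

Note that what counts as a "citable item" in the denominator is decided by the indexer, which is one reason impact factors from different providers can disagree.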
Bibliometrics is the application of statistical methods to the study of bibliographic data, especially in scientific and library and information science contexts, and is closely associated with scientometrics to the point that both fields largely overlap.
Scientometrics is a subfield of informetrics that studies quantitative aspects of scholarly literature. Major research issues include the measurement of the impact of research papers and academic journals, the understanding of scientific citations, and the use of such measurements in policy and management contexts. In practice there is a significant overlap between scientometrics and other scientific fields such as information systems, information science, science of science policy, sociology of science, and metascience. Critics have argued that overreliance on scientometrics has created a system of perverse incentives, producing a publish or perish environment that leads to low-quality research.
Citation impact or citation rate is a measure of how many times an academic journal article or book or author is cited by other articles, books or authors. Citation counts are interpreted as measures of the impact or influence of academic work and have given rise to the field of bibliometrics or scientometrics, specializing in the study of patterns of academic impact through citation analysis. The importance of journals can be measured by the average citation rate, the ratio of the number of citations to the number of articles published within a given time period and in a given index, such as the journal impact factor or the CiteScore. It is used by academic institutions in decisions about academic tenure, promotion and hiring, and hence also used by authors in deciding which journal to publish in. Citation-like measures are also used in other fields that do ranking, such as Google's PageRank algorithm, software metrics, college and university rankings, and business performance indicators.
The h-index is an author-level metric that measures both the productivity and citation impact of the publications, initially used for an individual scientist or scholar. The h-index correlates with success indicators such as winning the Nobel Prize, being accepted for research fellowships and holding positions at top universities. The index is based on the set of the scientist's most cited papers and the number of citations that they have received in other publications. The index has more recently been applied to the productivity and impact of a scholarly journal as well as a group of scientists, such as a department or university or country. The index was suggested in 2005 by Jorge E. Hirsch, a physicist at UC San Diego, as a tool for determining theoretical physicists' relative quality and is sometimes called the Hirsch index or Hirsch number.
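The index itself is simple to compute: an author has index h if h of their papers have at least h citations each. A minimal sketch:

```python
def h_index(citations: list[int]) -> int:
    """Largest h such that h papers have at least h citations each."""
    h = 0
    # Rank papers from most to least cited; at rank i, check whether
    # the i-th most cited paper still has at least i citations.
    for i, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= i:
            h = i
        else:
            break
    return h


# Hypothetical author with five papers cited 10, 8, 5, 4, and 3 times:
# four papers have at least 4 citations each, so h = 4.
print(h_index([10, 8, 5, 4, 3]))
```

Because the metric combines productivity and impact in one number, a single highly cited paper cannot raise it by itself, which is both its strength and a frequent point of criticism.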
PLOS One is a peer-reviewed open access mega journal published by the Public Library of Science (PLOS) since 2006. The journal covers primary research from any discipline within science and medicine. The Public Library of Science began in 2000 with an online petition initiative by Nobel Prize winner Harold Varmus, formerly director of the National Institutes of Health and at that time director of Memorial Sloan–Kettering Cancer Center; Patrick O. Brown, a biochemist at Stanford University; and Michael Eisen, a computational biologist at the University of California, Berkeley, and the Lawrence Berkeley National Laboratory.
Journal ranking is widely used in academic circles in the evaluation of an academic journal's impact and quality. Journal rankings are intended to reflect the place of a journal within its field, the relative difficulty of being published in that journal, and the prestige associated with it. They have been introduced as official research evaluation tools in several countries.
In scholarly and scientific publishing, altmetrics are non-traditional bibliometrics proposed as an alternative or complement to more traditional citation impact metrics, such as impact factor and h-index. The term altmetrics was proposed in 2010, as a generalization of article level metrics, and has its roots in the #altmetrics hashtag. Although altmetrics are often thought of as metrics about articles, they can be applied to people, journals, books, data sets, presentations, videos, source code repositories, web pages, etc.
OurResearch, formerly known as ImpactStory, is a nonprofit organization that creates and distributes tools and services for libraries, institutions and researchers. The organization follows open practices with their data, code, and governance. OurResearch is funded by the Alfred P. Sloan Foundation, the National Science Foundation, and Arcadia Fund.
Author-level metrics are citation metrics that measure the bibliometric impact of individual authors, researchers, academics, and scholars. Many metrics have been developed that take into account varying numbers of factors.
Semantic Scholar is a research tool for scientific literature powered by artificial intelligence. It is developed at the Allen Institute for AI and was publicly released in November 2015. Semantic Scholar uses modern techniques in natural language processing to support the research process, for example by providing automatically generated summaries of scholarly papers. The Semantic Scholar team is actively researching the use of artificial intelligence in natural language processing, machine learning, human–computer interaction, and information retrieval.
Metascience is the use of scientific methodology to study science itself. Metascience seeks to increase the quality of scientific research while reducing inefficiency. It is also known as "research on research" and "the science of science", as it uses research methods to study how research is done and find where improvements can be made. Metascience concerns itself with all fields of research and has been described as "a bird's eye view of science". In the words of John Ioannidis, "Science is the best thing that has happened to human beings ... but we can do it better."
The Leiden Manifesto for research metrics (LM) is a list of "ten principles to guide research evaluation", published as a comment in Volume 520, Issue 7548 of Nature, on 22 April 2015. It was formulated by public policy professor Diana Hicks, scientometrics professor Paul Wouters, and their colleagues at the 19th International Conference on Science and Technology Indicators, held 3–5 September 2014 in Leiden, The Netherlands.
The open science movement has expanded the uses of scientific output beyond specialized academic circles.