The Book Citation Index (BCI, BKCI) is an online subscription-based scientific citation indexing service maintained by Clarivate Analytics and is part of the Web of Science Core Collection. [1] It was first launched in 2011 and indexes over 60,000 editorially selected books, starting from 2005. [2] Books in the index are electronic and print scholarly texts that contain articles based on original research and/or reviews of such literature. [2]
The index covers both series and non-series books, provided they include full footnotes. It has two separate editions: a Science edition and a Social Sciences & Humanities edition. The Science edition covers physics and chemistry, engineering, computing and technology, clinical medicine, life sciences, and agriculture and biology. Currently both editions only contain books dating back to 2005. [3]
In their 2014 book Beyond Bibliometrics: Harnessing Multidimensional Indicators of Scholarly Impact, Blaise Cronin and Cassidy R. Sugimoto noted that "for impact assessment of book-based fields, bibliometricians need a database with large numbers of books" and that while the Book Citation Index did meet this need, Google Books also fulfilled this purpose and was not only free, but was (at the time) more comprehensive for bibliometric analyses. [4] A 2013 article in the Journal of the American Society for Information Science and Technology remarked on the index's opportunities and limitations. It stated that the "most significant limitations to this potential application are the high share of publications without address information, the inflation of publication counts, the lack of cumulative citation counts from different hierarchical levels, and inconsistency in citation counts between the cited reference search and the book citation index." [5] The authors also stated that the Book Citation Index was "a first step toward creating a reliable and necessary citation data source for monographs — a very challenging issue, because, unlike journals and conference proceedings, books have specific requirements, and several problems emerge not only in the context of subject classification, but also in their role as cited publications and in citing publications." [5]
A citation is a reference to a source. More precisely, a citation is an abbreviated alphanumeric expression embedded in the body of an intellectual work that denotes an entry in the work's bibliographic references section, acknowledging the relevance of the works of others to the topic of discussion at the spot where the citation appears.
Scientific citation is the practice of providing a detailed reference in a scientific publication, typically a paper or book, to previously published communications that have a bearing on the subject of the new publication. The purpose of citations in original work is to allow readers to consult the cited work in order to judge the new work, to find background information vital for future development, and to acknowledge the contributions of earlier workers. Citations in, say, a review paper bring together many sources, often recent, in one place.
A citation index is a kind of bibliographic index, an index of citations between publications, allowing the user to easily establish which later documents cite which earlier documents. A form of citation index is first found in 12th-century Hebrew religious literature. Legal citation indexes date to the 18th century and were popularized by citators such as Shepard's Citations (1873). In 1961, Eugene Garfield's Institute for Scientific Information (ISI) introduced the first citation index for papers published in academic journals, first the Science Citation Index (SCI), and later the Social Sciences Citation Index (SSCI) and the Arts and Humanities Citation Index (AHCI). The American Chemical Society converted its printed Chemical Abstracts Service into the internet-accessible SciFinder in 2008. The first automated citation indexing was done by CiteSeer in 1997 and was patented. Other sources for such data include Google Scholar, Microsoft Academic, Elsevier's Scopus, and the National Institutes of Health's iCite.
The impact factor (IF) or journal impact factor (JIF) of an academic journal is a scientometric index calculated by Clarivate that reflects the yearly mean number of citations received by articles published in that journal during the preceding two years, as indexed by Clarivate's Web of Science.
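As a simplified illustration (the precise set of "citable items" is determined editorially by Clarivate, and the numbers below are invented), the impact factor for a given year can be written as:

```latex
\mathrm{IF}_{2023} =
  \frac{\text{citations received in 2023 to items the journal published in 2021 and 2022}}
       {\text{number of citable items the journal published in 2021 and 2022}}
```

For example, a journal that published 100 citable items across 2021 and 2022, and whose items received 250 citations in 2023, would have a 2023 impact factor of 2.5.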
Bibliometrics is the use of statistical methods to analyse books, articles and other publications, especially in regard to their scientific content. Bibliometric methods are frequently used in the field of library and information science. Bibliometrics is closely associated with scientometrics, the analysis of scientific metrics and indicators, to the point that both fields largely overlap.
Scientometrics is the field of study which concerns itself with measuring and analysing scholarly literature. Scientometrics is a sub-field of informetrics. Major research issues include the measurement of the impact of research papers and academic journals, the understanding of scientific citations, and the use of such measurements in policy and management contexts. In practice there is a significant overlap between scientometrics and other scientific fields such as information systems, information science, science of science policy, sociology of science, and metascience. Critics have argued that over-reliance on scientometrics has created a system of perverse incentives, producing a publish or perish environment that leads to low-quality research.
Citation analysis is the examination of the frequency, patterns, and graphs of citations in documents. It uses the directed graph of citations — links from one document to another document — to reveal properties of the documents. A typical aim would be to identify the most important documents in a collection. A classic example is that of the citations between academic articles and books. For another example, judges of law support their judgements by referring back to judgements made in earlier cases. An additional example is provided by patents, which cite prior art: earlier patents relevant to the current claim.
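To make the directed-graph view concrete, the following minimal Python sketch (using invented example documents, not data from any real index) builds a citation graph from a list of edges and ranks documents by how often they are cited:

```python
from collections import Counter

# Hypothetical citation edges: (citing document, cited document).
citations = [
    ("paper_B", "paper_A"),
    ("paper_C", "paper_A"),
    ("paper_C", "paper_B"),
    ("paper_D", "paper_A"),
]

# A document's in-degree in the citation graph is its citation count.
times_cited = Counter(cited for _citing, cited in citations)

# Ranking by citation count surfaces the most-cited documents first.
for document, count in times_cited.most_common():
    print(f"{document}: cited {count} times")
```

Real citation analysis applies richer measures to the same graph, such as co-citation strength or PageRank-style weighting, but the underlying data structure is the same set of directed edges.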
Eugene Eli Garfield was an American linguist and businessman, one of the founders of bibliometrics and scientometrics. He helped to create Current Contents, Science Citation Index (SCI), Journal Citation Reports, and Index Chemicus, among others, and founded the magazine The Scientist.
Citation impact is a measure of how many times an academic journal article, book, or author is cited by other articles, books, or authors. Citation counts are interpreted as measures of the impact or influence of academic work and have given rise to the field of bibliometrics or scientometrics, which specializes in the study of patterns of academic impact through citation analysis. The journal impact factor, the two-year average ratio of citations to articles published, is a measure of the importance of journals. It is used by academic institutions in decisions about academic tenure, promotion and hiring, and hence is also used by authors in deciding which journal to publish in. Citation-like measures are also used in other fields that do ranking, such as Google's PageRank algorithm, software metrics, college and university rankings, and business performance indicators.
The h-index is an author-level metric that measures both the productivity and citation impact of a researcher's publications; it was initially used for an individual scientist or scholar. The h-index correlates with obvious success indicators such as winning the Nobel Prize, being accepted for research fellowships and holding positions at top universities. The index is based on the set of the scientist's most cited papers and the number of citations that they have received in other publications. The index has more recently been applied to the productivity and impact of a scholarly journal as well as a group of scientists, such as a department or university or country. The index was suggested in 2005 by Jorge E. Hirsch, a physicist at UC San Diego, as a tool for determining theoretical physicists' relative quality and is sometimes called the Hirsch index or Hirsch number.
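A brief Python sketch of the calculation, using invented citation counts: an author's h-index is the largest h such that h of their papers each have at least h citations.

```python
def h_index(citation_counts):
    """Largest h such that the author has at least h papers
    with at least h citations each."""
    h = 0
    for rank, cites in enumerate(sorted(citation_counts, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Example: five papers cited 10, 8, 5, 4 and 3 times give an h-index of 4,
# because four papers have at least four citations each, but there are not
# five papers with at least five citations each.
print(h_index([10, 8, 5, 4, 3]))  # prints 4
```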
Nova Science Publishers is an academic publisher of books, encyclopedias, handbooks, e-books and journals, based in Hauppauge, New York. It was founded in 1985. A prolific publisher of books, Nova has received criticism from librarians for not always subjecting its publications to academic peer review and for republishing public domain book chapters and freely accessible government publications at high prices.
The Web of Science is a paid-access platform providing access to multiple databases that contain reference and citation data from academic journals, conference proceedings, and other documents in various academic disciplines. It was originally produced by the Institute for Scientific Information and is currently owned by Clarivate.
The Redalyc project is a bibliographic database and a digital library of Open Access journals, supported by the Universidad Autónoma del Estado de México with the help of numerous other higher education institutions and information systems.
An academic discipline or academic field is a subdivision of knowledge that is taught and researched at the college or university level. Disciplines are defined and recognized by the academic journals in which research is published, and the learned societies and academic departments or faculties within colleges and universities to which their practitioners belong. Academic disciplines are conventionally divided into the humanities, including language, art and cultural studies, and the scientific disciplines, such as physics, chemistry, and biology; the social sciences are sometimes considered a third category.
In scholarly and scientific publishing, altmetrics are non-traditional bibliometrics proposed as an alternative or complement to more traditional citation impact metrics, such as impact factor and h-index. The term altmetrics was proposed in 2010, as a generalization of article level metrics, and has its roots in the #altmetrics hashtag. Although altmetrics are often thought of as metrics about articles, they can be applied to people, journals, books, data sets, presentations, videos, source code repositories, web pages, etc.
Blaise Cronin is an Irish-American information scientist and bibliometrician. He is the Rudy Professor Emeritus of Information Science at Indiana University, Bloomington, where he was Dean of the School of Library and Information Science for seventeen years. From 1985 to 1991 he held the Chair of Information Science and was Head of the Department of Information Science at the University of Strathclyde in Glasgow, U.K.
Author-level metrics are citation metrics that measure the bibliometric impact of individual authors, researchers, academics, and scholars. Many metrics have been developed that take into account varying numbers of factors.
There are a number of approaches to ranking academic publishing groups and publishers. Rankings rely on subjective impressions within the scholarly community, on analyses of prize winners from scientific associations, on discipline, on a publisher's reputation, and on its impact factor.
The Leiden Manifesto for research metrics (LM) is a list of "ten principles to guide research evaluation", published as a comment in Volume 520, Issue 7548 of Nature, on 22 April 2015. It was formulated by public policy professor Diana Hicks, scientometrics professor Paul Wouters, and their colleagues at the 19th International Conference on Science and Technology Indicators, held from 3 to 5 September 2014 in Leiden, The Netherlands.
Cassidy R. Sugimoto is an American information scientist who is the Professor and Tom and Marie Patton School Chair in the School of Public Policy at the Georgia Institute of Technology. She studies the ways knowledge is processed and disseminated. She is the author of the 2016 MIT Press book Big Data is Not a Monolith.