In academia and librarianship, a conference proceeding is a collection of academic papers published in the context of an academic conference or workshop. Conference proceedings typically contain the contributions made by researchers at the conference. They are the written record of the work that is presented to fellow researchers. In many fields, they are published as supplements to academic journals; in some, they are considered the main dissemination route; in others they may be considered grey literature. They are usually distributed in printed or electronic volumes, either before the conference opens or after it has closed.
A less common, broader meaning of proceedings is the acts and happenings of an academic field or a learned society. For example, the title of the Acta Crystallographica journals is New Latin for "Proceedings in Crystallography"; the Proceedings of the National Academy of Sciences of the United States of America is the main journal of that academy. Scientific journals whose ISO 4 title abbreviations start with Proc, Acta, or Trans are journals of the proceedings (transactions) of a field or of an organization concerned with it, in that secondary meaning of the word.
The selection and collection of papers for a conference is organized by one or more persons, who form the editorial team. The quality of the papers is typically ensured by having external people read the papers before they are accepted into the proceedings. The level of quality control varies considerably from conference to conference: some have only a binary accept/reject decision, while others go through more thorough feedback and revision cycles (peer reviewing or refereeing). Depending on the level of the conference, this process can take up to a year. The editors decide on the composition of the proceedings and the order of the papers, and produce the preface and possibly other pieces of text. Although most changes to papers are made on the basis of consensus between editors and authors, editors can also single-handedly make changes to papers.
Since the collection of papers comes from individual researchers, the character of proceedings is distinctly different from that of an educational textbook. Each paper is typically quite isolated from the other papers in the proceedings, and there is usually no general argument leading from one contribution to the next.
In some cases, the editors of the proceedings may decide to further develop the proceedings into a textbook. This may even be a goal at the outset of the conference.
Conference proceedings are published in-house by the organizing institution of the conference or via an academic publisher. For example, the Lecture Notes in Computer Science series by Springer takes much of its input from proceedings. Conference proceedings may also be published as edited volumes in dedicated proceedings series, in which all contributions come from conference papers; for example, the AIJR Proceedings series[1] published by the academic publisher AIJR.[2] Publication of proceedings as an edited volume in such a series differs from publishing conference papers in a journal,[3] sometimes called a conference issue. Increasingly, proceedings are published in electronic format via the internet or on CD, USB, etc.
In the sciences, the quality of publications in conference proceedings is usually not as high as that of international scientific journals. However, in computer science, papers published in conference proceedings are accorded a higher status than in other fields, due to the fast-moving nature of the field.
A number of full-fledged academic journals unconnected to particular conferences also use the word "proceedings" as part of their name, for example, Proceedings of the National Academy of Sciences of the United States of America.
Conference proceedings may be published as a book or book series, in a journal, or otherwise as a serial publication (see examples).[4] In many cases, impact factors are not available,[5] although other journal metrics (such as the Google Scholar h-index and SCImago metrics) might exist. Bibliographic indexing is often done in separate bibliographic databases and citation indexes, e.g., the Conference Proceedings Citation Index instead of the Science Citation Index.
In academic publishing, a scientific journal is a periodical publication intended to further the progress of science, usually by reporting new research.
BibTeX is reference management software for formatting lists of references. The BibTeX tool is typically used together with the LaTeX document preparation system. Within the typesetting system, its name is styled as BibTeX. The name is a portmanteau of the word bibliography and the name of the TeX typesetting software.
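As a sketch of how this relates to conference proceedings, a paper published in proceedings is typically recorded in a BibTeX `.bib` file as an `@inproceedings` entry, with the proceedings volume named in the `booktitle` field (all field values below are invented for illustration):

```bibtex
@inproceedings{doe2024example,
  author    = {Jane Doe and John Smith},
  title     = {An Example Conference Paper},
  booktitle = {Proceedings of the Example Conference on Scholarly Publishing},
  year      = {2024},
  pages     = {1--10},
  publisher = {Example Press}
}
```

A citation to this entry in a LaTeX document (`\cite{doe2024example}`) is then formatted by BibTeX according to the chosen bibliography style.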
Academic publishing is the subfield of publishing which distributes academic research and scholarship. Most academic work is published in academic journal articles, books or theses. The part of academic written output that is not formally published but merely printed up or posted on the Internet is often called "grey literature". Most scientific and scholarly journals, and many academic and scholarly books, though not all, are based on some form of peer review or editorial refereeing to qualify texts for publication. Peer review quality and selectivity standards vary greatly from journal to journal, publisher to publisher, and field to field.
An academic journal or scholarly journal is a periodical publication in which scholarship relating to a particular academic discipline is published. Academic journals serve as permanent and transparent forums for the presentation, scrutiny, and discussion of research. They are usually peer-reviewed or refereed. Content typically takes the form of articles presenting original research, review articles, or book reviews. The purpose of an academic journal, according to Henry Oldenburg, is to give researchers a venue to "impart their knowledge to one another, and contribute what they can to the Grand design of improving natural knowledge, and perfecting all Philosophical Arts, and Sciences."
Scientific literature comprises scholarly publications that report original empirical and theoretical work in the natural and social sciences. Within an academic field, scientific literature is often referred to as the literature. Academic publishing is the process of contributing the results of one's research into the literature, which often requires a peer-review process.
A citation index is a kind of bibliographic index, an index of citations between publications, allowing the user to easily establish which later documents cite which earlier documents. A form of citation index is first found in 12th-century Hebrew religious literature. Legal citation indexes are found in the 18th century and were made popular by citators such as Shepard's Citations (1873). In 1960, Eugene Garfield's Institute for Scientific Information (ISI) introduced the first citation index for papers published in academic journals, first the Science Citation Index (SCI), and later the Social Sciences Citation Index (SSCI) and the Arts and Humanities Citation Index (AHCI). The first automated citation indexing was done by CiteSeer in 1997 and was patented. Other sources for such data include Google Scholar, Elsevier's Scopus, and the National Institutes of Health's iCite.
The Institute for Scientific Information (ISI) was an academic publishing service, founded by Eugene Garfield in Philadelphia in 1956. ISI offered scientometric and bibliographic database services. Its specialty was citation indexing and analysis, a field pioneered by Garfield.
The impact factor (IF) or journal impact factor (JIF) of an academic journal is a scientometric index calculated by Clarivate that reflects the yearly mean number of citations to articles published in the last two years in a given journal, as indexed by Clarivate's Web of Science. As a journal-level metric, it is frequently used as a proxy for the relative importance of a journal within its field; journals with higher impact factor values are accorded more importance, or carry more prestige in their respective fields, than those with lower values. While frequently used by universities and funding bodies to decide on promotion and research proposals, it has come under attack for distorting good scientific practices.
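The "yearly mean number of citations" can be made concrete as follows. The impact factor of a journal for a given year is the number of citations received that year by items the journal published in the preceding two years, divided by the number of citable items it published in those two years. For example, a 2024 impact factor would be:

\[
\mathrm{IF}_{2024} \;=\; \frac{\text{Citations in 2024 to items published in 2022 and 2023}}{\text{Number of citable items published in 2022 and 2023}}
\]

So a journal that published 100 citable items in 2022–2023 which together received 250 citations in 2024 would have a 2024 impact factor of 2.5.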
Scientometrics is the field of study which concerns itself with measuring and analysing scholarly literature. Scientometrics is a sub-field of bibliometrics. Major research issues include the measurement of the impact of research papers and academic journals, the understanding of scientific citations, and the use of such measurements in policy and management contexts. In practice there is a significant overlap between scientometrics and other scientific fields such as information systems, information science, science of science policy, sociology of science, and metascience. Critics have argued that over-reliance on scientometrics has created a system of perverse incentives, producing a publish or perish environment that leads to low quality research.
Citation analysis is the examination of the frequency, patterns, and graphs of citations in documents. It uses the directed graph of citations — links from one document to another document — to reveal properties of the documents. A typical aim would be to identify the most important documents in a collection. A classic example is that of the citations between academic articles and books. For another example, judges of law support their judgements by referring back to judgements made in earlier cases. An additional example is provided by patents which contain prior art, citation of earlier patents relevant to the current claim.
Springer Science+Business Media, commonly known as Springer, is a German multinational publishing company of books, e-books and peer-reviewed journals in science, humanities, technical and medical (STM) publishing.
Google Scholar is a freely accessible web search engine that indexes the full text or metadata of scholarly literature across an array of publishing formats and disciplines. Released in beta in November 2004, the Google Scholar index includes most peer-reviewed online academic journals and books, conference papers, theses and dissertations, preprints, abstracts, technical reports, and other scholarly literature, including court opinions and patents. Google Scholar uses a web crawler, or web robot, to identify files for inclusion in the search results. For content to be indexed in Google Scholar, it must meet certain specified criteria. An earlier statistical estimate published in PLOS ONE, using a mark-and-recapture method, put Google Scholar's coverage at approximately 80–90% of all articles published in English, out of an estimated total of 100 million documents. The same study also estimated how many documents were freely available on the internet.
Citation impact is a measure of how many times an academic journal article or book or author is cited by other articles, books or authors. Citation counts are interpreted as measures of the impact or influence of academic work and have given rise to the field of bibliometrics or scientometrics, specializing in the study of patterns of academic impact through citation analysis. The journal impact factor, the two-year average ratio of citations to articles published, is a measure of the importance of journals. It is used by academic institutions in decisions about academic tenure, promotion and hiring, and hence also used by authors in deciding which journal to publish in. Citation-like measures are also used in other fields that do ranking, such as Google's PageRank algorithm, software metrics, college and university rankings, and business performance indicators.
The h-index is an author-level metric that measures both the productivity and citation impact of a scholar's publications; it was initially used for individual scientists or scholars. The h-index correlates with obvious success indicators such as winning the Nobel Prize, being accepted for research fellowships, and holding positions at top universities. The index is based on the set of the scientist's most cited papers and the number of citations that they have received in other publications. The index has more recently been applied to the productivity and impact of a scholarly journal, as well as to groups of scientists, such as a department, university, or country. The index was suggested in 2005 by Jorge E. Hirsch, a physicist at UC San Diego, as a tool for determining theoretical physicists' relative quality, and is sometimes called the Hirsch index or Hirsch number.
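The definition above can be stated operationally: a scholar has index h if h of their papers have at least h citations each, and the rest have no more than h citations each. A minimal sketch of the computation (the citation counts below are invented for illustration):

```python
def h_index(citations):
    """Compute the h-index from a list of per-paper citation counts.

    A scholar has index h if h of their papers have at least
    h citations each.
    """
    # Sort citation counts in descending order.
    counts = sorted(citations, reverse=True)
    h = 0
    # The h-index is the largest rank i such that the i-th most
    # cited paper has at least i citations.
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Five papers with 10, 8, 5, 4, and 3 citations: four papers have
# at least 4 citations each, but not five papers with at least 5.
print(h_index([10, 8, 5, 4, 3]))  # 4
```

The same function applies unchanged to a journal, department, or country, by passing in the citation counts of all of its papers.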
Journal Citation Reports (JCR) is an annual publication by Clarivate Analytics. It has been integrated with the Web of Science and is accessed from the Web of Science-Core Collections. It provides information about academic journals in the natural sciences and social sciences, including impact factors. The JCR was originally published as a part of Science Citation Index. Currently, the JCR, as a distinct service, is based on citations compiled from the Science Citation Index Expanded and the Social Sciences Citation Index.
Web of Science is a website that provides subscription-based access to multiple databases that provide comprehensive citation data for many different academic disciplines. It was originally produced by the Institute for Scientific Information (ISI) and is currently maintained by Clarivate Analytics.
The Journal of Applied Behavioral Science is a peer-reviewed academic journal that publishes papers in the field of psychology. The journal's editor is W. Warner Burke. It has been in publication since 1965 and is currently published by SAGE Publications in association with the NTL Institute for Applied Behavioral Science.
In scholarly and scientific publishing, altmetrics are non-traditional bibliometrics proposed as an alternative or complement to more traditional citation impact metrics, such as the impact factor and h-index. The term altmetrics was proposed in 2010, as a generalization of article-level metrics, and has its roots in the #altmetrics hashtag. Although altmetrics are often thought of as metrics about articles, they can be applied to people, journals, books, data sets, presentations, videos, source code repositories, web pages, etc. Altmetrics use public APIs across platforms to gather data with open scripts and algorithms. Altmetrics did not originally cover citation counts, but calculate scholarly impact based on diverse online research outputs, such as social media, online news media, and online reference managers. They capture both the impact and the detailed composition of that impact. Altmetrics could be applied to research filtering, promotion and tenure dossiers, grant applications, and the ranking of newly published articles in academic search engines.
Author-level metrics are citation metrics that measure the bibliometric impact of individual authors, researchers, academics, and scholars. Many metrics have been developed that take into account varying numbers of factors.
The Leiden Manifesto for research metrics is a list of "ten principles to guide research evaluation", published as a comment in Volume 520, Issue 7548 of Nature, on 22 April 2015. It was formulated by public policy professor Diana Hicks, scientometrics professor Paul Wouters, and their colleagues at the 19th International Conference on Science and Technology Indicators, held on 3–5 September 2014 in Leiden, the Netherlands.