Producer | Clarivate Analytics (Canada and Hong Kong)
---|---
Coverage |
Disciplines | Social sciences
Record depth | Index & citation indexing
Links |
Website | clarivate.com/ssci
Title list(s) | Master Journal List
The Social Sciences Citation Index (SSCI) is a commercial citation index product of Clarivate Analytics. It was originally developed by the Institute for Scientific Information from the Science Citation Index. The SSCI is a multidisciplinary index covering over 3,400 journals across 58 social science disciplines (1985 to present) and 122 million cited references (1900 to present). It also selectively covers items from 3,500 of the world's leading scientific and technical journals. It offers a range of useful search functions, such as cited-reference searching and searching by author, subject, or title. [1] While the SSCI provides extensive support for bibliographic analysis and research, a number of scholars have criticized it for ideological bias and for its English-dominant publishing nature.
The SSCI citation database covers some 3,400 academic journals in the social sciences across more than 50 disciplines. [2] It is made available online, for a fee, through the Web of Science service. The database records which articles are cited by other articles and supports many kinds of bibliographic analysis. The Master Journal List lets users search the indexed journals through a simple interface, by author, title, or citation. The SSCI was conceptualized in 1961, when the founder of the Institute for Scientific Information, Eugene Garfield, received funding from the United States National Institutes of Health to produce a comprehensive index of genetics literature, the Genetics Citation Index, which later became the Science Citation Index. The SSCI itself was established in 1972 by the Institute for Scientific Information and was acquired by Clarivate in 2017. [3]
Philip Altbach has criticized the Social Sciences Citation Index for favoring English-language journals generally and American journals specifically, while greatly under-representing journals in other languages. [4] Academics such as June Yichun Liu have voiced similar criticisms, noting that only two percent of SSCI publications come from developing nations, which creates an artificial importance in countries such as Taiwan. [5] This artificial importance deems all scholarly work published in the SSCI canonical and most other work immaterial, affecting scholarships and the funding of other research. Liu suggests this influences the research scholars undertake, shaping the subjects they choose and ultimately limiting the scope of academic work. In a similar vein, Yusuf Ziya Olpak and Muhammet Arican found that only 2.138% of all SSCI-indexed articles had a variable related to Turkey, such as research area or author address. [6] They also found only three highly cited articles (0.707%), suggesting poor representation of Turkish academic literature and Turkish-addressed articles. [6] By contrast, the United States was listed as the address in over half of the articles indexed in the SSCI, though they note that the number of academics publishing social science research is perhaps the determining variable. [6] This suggests that the English-dominant nature of the SSCI affects both the number of non-English articles and their impact factor. Comparing the average citation counts of articles from non-English-speaking countries, Olpak and Arican found that Turkey's average was 6.653, against 17.35 for Taiwan, 17.29 for Germany, and 12.77 for Spain. They suggest Turkey's lower figure is partly due to some of the 272 listed journal articles having been removed. [7]
Another set of criticisms of the SSCI was authored by John B. Davis, who concluded that "one should only apply the impact-adjusted rankings using SSCI data with considerable caution when evaluating scholarly productivity of individuals and departments." [8] He also noted that research approaching the boundaries of economics is less likely to be published in prominent journals, so such rankings evaluate scholarly productivity unevenly depending on a scholar's orientation within the discipline. Analysing the Journal Citation Reports for the Social Sciences Citation Index and the Science Citation Index, Loet Leydesdorff likewise found it harder to analyse correlations between disciplines in the social sciences than in the natural sciences, for several reasons including methodology: "Unlike the natural and life sciences, the social sciences often construct their subject matter both in terms of 'what' they study and in terms of 'how' the subject under study is to be analyzed." [9] Leydesdorff also noted that developments in specialty clusters are small, and that recognition on an international scale is both limited and volatile.
Alongside these criticisms, there are opposing views on different aspects of the SSCI. In stark contrast to the social sciences as a whole, Leydesdorff found that the field of information science has stabilized under the subset labelled 'library and information science' over the course of the past two decades. [10] He notes: "The relevant set of journals is visible in 2001 both as a factor with 35 journals and as a bi-component of 28 journals. The two composing substructures of library and information science have remained visible in this representation." [10] In a separate study of the merged SCI and SSCI Journal Citation Reports, he adds that where a journal is potentially marginal in one context, the addition of the other "can provide interesting perspectives on its position in the field and its function in the network." [11] Institutional repositories are indexed within library and information science; however, as Yi-Ping Liao and Tsu-Jui Ma note, this is the only subset in which they are listed, which has practical implications for academics. [3] Liao and Ma suggest that listing institutional repositories in other subsets of the SSCI would help further academic research in future.
Several scholars, such as Eric Chiang and Daniel Klein, regard ideological bias within the Social Sciences Citation Index as a potential hazard. Aspects of the index have been rigorously studied for ideological bias, with evidence found but not conclusive. [12] In 2003, using a criterion of consistency and outspokenness, Chiang and Klein analysed just under a quarter of the 1,768 available published articles for ideological bias. They clarified their criterion of consistency and outspokenness for Republican articles as follows: "an organization is orientated toward a particular ideology if, relative to the norm, it dwells on, expresses, or espouses the sensibilities of that ideology". [13] They do, however, discuss the importance of the benchmark 'norm', noting that "it is well established that social democratic sensibilities dominate the social sciences and humanities." [14] They concluded that the SSCI has a clear ideological bias towards 'social democratic' articles, though the opposite could be argued given the lopsided number of social democratic journals. [15]
Citation count in academic work is commonly associated with the quality of research, though some researchers suggest it merely reflects influence and visibility within a citation index. [16] Using a variety of quantitative techniques, Dragan Ivanović and Yuh-Shan Ho elaborated an understanding of the direction of the information science and library science fields within the SSCI, documenting a number of trends and patterns.
A citation index is a kind of bibliographic index, an index of citations between publications, allowing the user to easily establish which later documents cite which earlier documents. A form of citation index is first found in 12th-century Hebrew religious literature. Legal citation indexes are found in the 18th century and were made popular by citators such as Shepard's Citations (1873). In 1961, Eugene Garfield's Institute for Scientific Information (ISI) introduced the first citation index for papers published in academic journals, first the Science Citation Index (SCI), and later the Social Sciences Citation Index (SSCI) and the Arts and Humanities Citation Index (AHCI). The American Chemical Society converted its printed Chemical Abstracts Service into the internet-accessible SciFinder in 2008. The first automated citation indexing was done by CiteSeer in 1997 and was patented. Other sources for such data include Google Scholar, Microsoft Academic, Elsevier's Scopus, and the National Institutes of Health's iCite.
In sociology, social complexity is a conceptual framework used in the analysis of society. In the sciences, contemporary definitions of complexity are found in systems theory, wherein the phenomenon being studied has many parts and many possible arrangements of the parts; simultaneously, what is complex and what is simple are relative and change in time.
The Institute for Scientific Information (ISI) was an academic publishing service, founded by Eugene Garfield in Philadelphia in 1956. ISI offered scientometric and bibliographic database services. Its specialty was citation indexing and analysis, a field pioneered by Garfield.
A book review is a form of literary criticism in which a book is described or analyzed based on content, style, and merit.
Scientometrics is a subfield of informetrics that studies quantitative aspects of scholarly literature. Major research issues include the measurement of the impact of research papers and academic journals, the understanding of scientific citations, and the use of such measurements in policy and management contexts. In practice there is a significant overlap between scientometrics and other scientific fields such as information systems, information science, science of science policy, sociology of science, and metascience. Critics have argued that overreliance on scientometrics has created a system of perverse incentives, producing a publish or perish environment that leads to low-quality research.
Citation analysis is the examination of the frequency, patterns, and graphs of citations in documents. It uses the directed graph of citations — links from one document to another document — to reveal properties of the documents. A typical aim would be to identify the most important documents in a collection. A classic example is that of the citations between academic articles and books. For another example, judges of law support their judgements by referring back to judgements made in earlier cases. An additional example is provided by patents which contain prior art, citation of earlier patents relevant to the current claim. The digitization of patent data and increasing computing power have led to a community of practice that uses these citation data to measure innovation attributes, trace knowledge flows, and map innovation networks.
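The in-degree counting at the heart of citation analysis can be sketched in a few lines. The document names and citation edges below are hypothetical, chosen only to illustrate how a directed citation graph reveals the most-cited document:

```python
from collections import Counter

# Hypothetical citation edges: (citing_document, cited_document)
edges = [
    ("C", "A"), ("D", "A"), ("D", "B"),
    ("E", "B"), ("E", "C"), ("F", "A"),
]

# Count how often each document is cited (its in-degree in the graph)
in_degree = Counter(cited for _, cited in edges)

# The document with the highest in-degree is the most "important" by this measure
most_cited = in_degree.most_common(1)[0]
print(most_cited)  # ('A', 3)
```

Real citation analysis works on graphs with millions of edges, but the principle is the same: the structure of the links, not the content of the documents, drives the measurement.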
Google Scholar is a freely accessible web search engine that indexes the full text or metadata of scholarly literature across an array of publishing formats and disciplines. Released in beta in November 2004, the Google Scholar index includes peer-reviewed online academic journals and books, conference papers, theses and dissertations, preprints, abstracts, technical reports, and other scholarly literature, including court opinions and patents.
The h-index is an author-level metric that measures both the productivity and citation impact of the publications, initially used for an individual scientist or scholar. The h-index correlates with success indicators such as winning the Nobel Prize, being accepted for research fellowships and holding positions at top universities. The index is based on the set of the scientist's most cited papers and the number of citations that they have received in other publications. The index has more recently been applied to the productivity and impact of a scholarly journal as well as a group of scientists, such as a department or university or country. The index was suggested in 2005 by Jorge E. Hirsch, a physicist at UC San Diego, as a tool for determining theoretical physicists' relative quality and is sometimes called the Hirsch index or Hirsch number.
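The h-index definition above (the largest h such that h papers each have at least h citations) lends itself to a short computation. A minimal sketch, with made-up citation counts for illustration:

```python
def h_index(citations):
    """Return the largest h such that the author has at least
    h papers with at least h citations each."""
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank  # this paper still supports a larger h
        else:
            break  # papers are sorted, so no later paper can qualify
    return h

print(h_index([10, 8, 5, 4, 3]))   # 4: four papers with >= 4 citations each
print(h_index([25, 8, 5, 3, 3]))   # 3: the single 25-citation paper cannot raise h alone
```

The second example shows the metric's deliberate insensitivity to one outlier paper, which is why it is read as combining productivity with impact.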
The Science Citation Index Expanded is a citation index originally produced by the Institute for Scientific Information (ISI) and created by Eugene Garfield.
Econ Journal Watch is a semiannual peer-reviewed electronic journal established in 2004. It is published by the Fraser Institute. According to its website, the journal publishes comments on articles appearing in other economics journals, essays, reflections, investigations, and classic critiques. As of 2017, the journal maintained a podcast, voiced by Lawrence H. White.
The Korea Citation Index (KCI) is a non-commercial South Korean citation index operated by the National Research Foundation of Korea. Though it mainly covers research written in Korean, some English-language journals are also indexed in KCI. Established in 2007, KCI covers selected South Korean journals and their research published from 2004 onward. KCI-indexed journals are reported to respond faster than SCI- and SSCI-indexed journals.
Journal ranking is widely used in academic circles in the evaluation of an academic journal's impact and quality. Journal rankings are intended to reflect the place of a journal within its field, the relative difficulty of being published in that journal, and the prestige associated with it. They have been introduced as official research evaluation tools in several countries.
The Web of Science is a paid-access platform that provides access to multiple databases that provide reference and citation data from academic journals, conference proceedings, and other documents in various academic disciplines.
Louis André (Loet) Leydesdorff was a Dutch sociologist, cyberneticist, communication scientist and Professor in the Dynamics of Scientific Communication and Technological Innovation at the University of Amsterdam. He was known for his work in the sociology of communication and innovation, especially for his Triple Helix model of innovation developed with Henry Etzkowitz in the 1990s.
The CWTS Leiden Ranking is an annual global university ranking based exclusively on bibliometric indicators. The rankings are compiled by the Centre for Science and Technology Studies at Leiden University in the Netherlands. The Clarivate Analytics bibliographic database Web of Science is used as the source of the publication and citation data.
The Chinese Science Citation Database (CSCD) is a bibliographic database and citation index managed by Clarivate in partnership with the Chinese Academy of Sciences.
Main path analysis is a mathematical tool, first proposed by Hummon and Doreian in 1989, to identify the major paths in a citation network, which is one form of a directed acyclic graph (DAG). It has since become an effective technique for mapping technological trajectories, exploring scientific knowledge flows, and conducting literature reviews.
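As a rough illustration of the idea, the Search Path Count (SPC) weighting commonly used in main path analysis can be computed on a toy citation DAG. The node names and edges here are hypothetical; an edge (u, v) means knowledge flows from u to v:

```python
from functools import lru_cache

# Hypothetical citation DAG: successors (who builds on a work) and predecessors
succ = {"A": ["B", "C"], "B": ["D"], "C": ["D", "E"], "D": ["E"], "E": []}
pred = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"], "E": ["C", "D"]}

@lru_cache(maxsize=None)
def paths_from(n):
    """Number of paths from n down to any sink."""
    return 1 if not succ[n] else sum(paths_from(m) for m in succ[n])

@lru_cache(maxsize=None)
def paths_to(n):
    """Number of paths from any source up to n."""
    return 1 if not pred[n] else sum(paths_to(m) for m in pred[n])

# SPC weight of edge (u, v) = number of source-to-sink paths traversing it
spc = {(u, v): paths_to(u) * paths_from(v) for u in succ for v in succ[u]}
# Here edges ('A', 'C') and ('D', 'E') carry the most source-to-sink paths,
# so a greedy traversal of the heaviest edges yields the main path A -> C -> D -> E.
```

Main path analysis then follows the edges with the highest weights from sources to sinks, picking out the trajectory along which most knowledge appears to flow.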
Clarivate Plc is a British-American publicly traded analytics company that operates a collection of subscription-based services in the areas of bibliometrics and scientometrics; business and market intelligence and competitive profiling for pharmacy and biotech, patents, and regulatory compliance; and trademark, domain, and brand protection. In the academy and the scientific community, Clarivate is known as the company that calculates the impact factor, using data from its Web of Science product family, which also includes services and applications such as Publons, EndNote, EndNote Click, and ScholarOne. Its other product families are Cortellis, DRG, CPA Global, Derwent, CompuMark, and Darts-ip, as well as the various ProQuest products and services.
Clarivate's Highly Cited Researchers program measures the number of citations a paper or article has amassed in Web of Science–indexed journals, and honors certain top authors as "Highly Cited Researchers" (HCR). Within the scientific community, Highly Cited Researchers represent only about 1 in every 1,000 scientists.