Citation index

A citation index is a kind of bibliographic index, an index of citations between publications, allowing the user to easily establish which later documents cite which earlier documents. A form of citation index first appears in 12th-century Hebrew religious literature. Legal citation indexes appeared in the 18th century and were made popular by citators such as Shepard's Citations (1873). In 1961, Eugene Garfield's Institute for Scientific Information (ISI) introduced the first citation index for papers published in academic journals: first the Science Citation Index (SCI), and later the Social Sciences Citation Index (SSCI) and the Arts and Humanities Citation Index (AHCI). The American Chemical Society converted its printed Chemical Abstracts Service (established in 1907) into the internet-accessible SciFinder in 2008. The first automated citation indexing [1] was done by CiteSeer in 1997 and was patented. [2] Other sources for such data include Google Scholar, Microsoft Academic, Elsevier's Scopus, and the National Institutes of Health's iCite. [3]
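
Conceptually, a citation index inverts documents' reference lists into a "cited-by" mapping, so that an earlier work can be looked up to find the later works that cite it. A minimal sketch of this inversion in Python (the paper identifiers are invented for illustration):

```python
from collections import defaultdict

# Outgoing references: each paper maps to the earlier papers it cites.
# Identifiers here are purely illustrative.
references = {
    "paper_2021_a": ["paper_1963_garfield", "paper_1955_garfield"],
    "paper_2021_b": ["paper_1963_garfield"],
    "paper_1963_garfield": ["paper_1955_garfield"],
}

# Invert the mapping to build the citation index: for each cited
# document, record which later documents cite it.
cited_by = defaultdict(set)
for citing, cited_list in references.items():
    for cited in cited_list:
        cited_by[cited].add(citing)

# Which later documents cite the 1955 paper?
print(sorted(cited_by["paper_1955_garfield"]))
# ['paper_1963_garfield', 'paper_2021_a']
```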

History

The earliest known citation index is an index of biblical citations in rabbinic literature, the Mafteah ha-Derashot, attributed to Maimonides and probably dating to the 12th century. It is organized alphabetically by biblical phrase. Later biblical citation indexes follow the order of the canonical text. These citation indexes were used both for general and for legal study. The Talmudic citation index En Mishpat (1714) even included a symbol to indicate whether a Talmudic decision had been overridden, just as in the 19th-century Shepard's Citations. [4] [5] Unlike modern scholarly citation indexes, however, these indexed references to only one work: the Bible.

In English legal literature, volumes of judicial reports included lists of cases cited in that volume, starting with Raymond's Reports (1743) and followed by Douglas's Reports (1783). Simon Greenleaf (1821) published an alphabetical list of cases with notes on later decisions affecting the precedential authority of the original decision. [6] These early tables of legal citations ("citators") were followed by more complete, book-length indexes: Labatt's Table of Cases...California... (1860) and, in 1872, Wait's Table of Cases...New York.... The most important and best-known citation index for legal cases arrived in 1873 with the publication of Shepard's Citations. [6]

William Adair, a former president of Shepard's Citations, suggested in 1920 that citation indexes could serve as a tool for tracking science and engineering literature. [7] After learning that Eugene Garfield held a similar opinion, Adair corresponded with Garfield in 1953. [8] The correspondence prompted Garfield to examine Shepard's Citations as a model that could be extended to the sciences. Two years later Garfield published "Citation indexes for science" in the journal Science. [9] In 1959, Garfield started a consulting business, the Institute for Scientific Information (ISI), in Philadelphia and began a correspondence with Joshua Lederberg about the idea. [7] In 1961 Garfield received a grant from the U.S. National Institutes of Health to compile a citation index for genetics; to do so, his team gathered 1.4 million citations from 613 journals. [8] From this work, Garfield and the ISI produced the first version of the Science Citation Index, published as a book in 1963. [10]

Major citation indexing services

General-purpose, subscription-based academic citation indexes include Clarivate's Web of Science and Elsevier's Scopus. Each offers an index of citations between publications and a mechanism for establishing which documents cite which other documents. Neither is open access, and they differ widely in cost; both are available by subscription (generally to libraries).

In addition, CiteSeer and Google Scholar are freely available online.

Several open-access, subject-specific citation indexing services also exist; SciELO and the Serbian Citation Index, described below, are examples.

Representativeness of proprietary databases

Clarivate Analytics' Web of Science (WoS) and Elsevier's Scopus databases are widely treated as synonymous with data on international research, and are considered the two most trusted or authoritative sources of bibliometric data on peer-reviewed global research across disciplines. [12] [13] [14] [15] [16] [17] Both are also used widely for researcher evaluation and promotion, institutional impact assessment (for example, the role of WoS in the UK Research Excellence Framework 2021 [note 1]), and international league tables (bibliographic data from Scopus represent more than 36% of assessment criteria in the THE rankings [note 2]). But while these databases are generally agreed to contain rigorously assessed, high-quality research, they do not represent the sum of current global research knowledge. [18]

Popular science articles often state that the research output of countries in South America, Asia, and Africa is disappointingly low; Sub-Saharan Africa, for example, is cited as having "13.5% of the global population but less than 1% of global research output". [note 3] This claim is based on a 2012 World Bank/Elsevier report that relies on data from Scopus. [note 4] Similarly, many others have analysed putatively global or international collaborations and mobility using the even more selective WoS database. [19] [20] [21] "Research output" in this context therefore refers specifically to papers published in peer-reviewed journals indexed in Scopus or WoS.

Both WoS and Scopus are considered highly selective. Both are commercial enterprises whose standards and assessment criteria are mostly controlled by panels in North America and Western Europe. Their selectivity is evident by comparison with more comprehensive databases such as Ulrich's Web, which lists as many as 70,000 journals; [22] Scopus covers fewer than 50% of these, and WoS fewer than 25%. [12] While Scopus is larger and geographically broader than WoS, it still covers only a fraction of journal publishing outside North America and Europe. For example, it reports coverage of over 2,000 journals in Asia ("230% more than the nearest competitor"), [note 5] which may seem impressive until one considers that in Indonesia alone more than 7,000 journals are listed on the government's Garuda portal [note 6] (of which more than 1,300 are currently listed in DOAJ), [note 7] while at least 2,500 Japanese journals are listed on the J-STAGE platform. [note 8] Similarly, Scopus claims about 700 journals from Latin America, compared with SciELO's 1,285 active journals; [note 9] and that is just the tip of the iceberg, judging by the 1,300+ DOAJ-listed journals in Brazil alone. [note 10] Furthermore, the editorial boards of the journals in the WoS and Scopus databases are composed largely of researchers from Western Europe and North America. For example, in the journal Human Geography, 41% of editorial board members are from the United States and 37.8% from the UK. [23] Similarly, a study of ten leading marketing journals in the WoS and Scopus databases concluded that 85.3% of their editorial board members were based in the United States. [24] It is no surprise that the research published in these journals is that which fits the editorial boards' world view. [24]

Comparison with subject-specific indexes has further revealed geographical and topical bias. For example, Ciarli and colleagues [25] compared the coverage of rice research in CAB Abstracts (an agriculture and global health database) with that of WoS and Scopus, and found that the latter "may strongly under-represent the scientific production by developing countries, and over-represent that by industrialised countries"; this is likely to apply to other fields of agriculture. Such under-representation of applied research in Africa, Asia, and South America may have an additional negative effect on the framing of research strategies and policy development in these countries. [26] The overpromotion of these databases also diminishes the important role of "local" and "regional" journals for researchers who want to publish and read locally relevant content. Some researchers deliberately bypass "high impact" journals when they want to publish locally useful or important research, favouring outlets that will reach their key audience more quickly or, in other cases, outlets that allow them to publish in their native language. [27] [28] [29]

Furthermore, the odds are stacked against researchers for whom English is a foreign language: 95% of WoS journals are in English, [30] a state of affairs that Tietze and Dick [31] consider a hegemonic and unreflective linguistic practice. The consequences include that non-native speakers spend part of their budget on translation and correction and invest significant time and effort on subsequent corrections, making publishing in English a burden. [32] [33] A far-reaching consequence of the use of English as the lingua franca of science lies in knowledge production, because its use benefits the "worldviews, social, cultural, and political interests of the English-speaking center" ([31] p. 123).

The small proportion of research from South East Asia, Africa, and Latin America that makes it into WoS and Scopus journals is not attributable to a lack of effort or of research quality, but to hidden and invisible epistemic and structural barriers (Chan 2019 [note 11]). These reflect "deeper historical and structural power that had positioned former colonial masters as the centers of knowledge production, while relegating former colonies to peripheral roles" (Chan 2018 [note 12]). Many North American and European journals demonstrate conscious and unconscious bias against researchers from other parts of the world. [note 13] Many of these journals call themselves "international" yet represent interests, authors, and even references only in their own languages. [note 14] [34] As a result, researchers in countries outside Europe and North America are commonly rejected because their research is said to be "not internationally significant" or only of "local interest" (the wrong "local"). This reflects the current concept of "international" as limited to a Euro/Anglophone-centric mode of knowledge production. [35] [30] In other words, "the ongoing internationalisation has not meant academic interaction and exchange of knowledge, but the dominance of the leading Anglophone journals in which international debates occur and gain recognition" ([36] p. 8).

Clarivate Analytics has made some positive steps to broaden the scope of WoS, integrating the SciELO citation index – a move not without criticism [note 15] – and creating the Emerging Sources Citation Index (ESCI), which has given many more international titles database access. However, much work remains to be done to recognise and amplify the growing body of research literature generated by those outside North America and Europe. The Royal Society has previously identified that "traditional metrics do not fully capture the dynamics of the emerging global science landscape", and that academia needs to develop more sophisticated data and impact measures to provide a richer understanding of the global scientific knowledge available to us. [37]

Academia has not yet built digital infrastructures that are equal, comprehensive, and multilingual, and that allow fair participation in knowledge creation. [38] One way to bridge this gap is with discipline- and region-specific preprint repositories such as AfricArXiv and InarXiv. Open access advocates recommend remaining critical of "global" research databases built in Europe or North America, and being wary of celebrating such products as representations of the global sum of human scholarly knowledge. Finally, we should be aware of the geopolitical impact that such systematic discrimination has on knowledge production and on the inclusion and representation of marginalised research demographics within the global research landscape. [18]

Notes

  1. "Clarivate Analytics will provide citation data during REF2021"..
  2. "World University Rankings 2019: Methodology". 7 September 2018., Times Higher Education.
  3. "Africa produces just 1.1% of global scientific knowledge – but change is coming". TheGuardian.com . 26 October 2015..
  4. "A decade of development in sub-Saharan African science, technology, engineering, and Mathematics research" (PDF)..
  5. "Scopus content coverage guide" (PDF). Archived from the original (PDF) on 2019-09-04. Retrieved 2020-01-04., 2017.
  6. "Garuda portal". Archived from the original on 2020-02-27. Retrieved 2020-01-04..
  7. "DOAJ journals from Indonesia"..
  8. "Homepage". J-STAGE. Retrieved April 24, 2022.
  9. "SciELO". portal.
  10. "DOAJ journals from Brazil"..
  11. "Leslie Chan"., Twitter.
  12. "Open Access, the Global South and the Politics of Knowledge Production and Circulation"., Leslie Chan interview with Open Library of Humanities.
  13. "Richard Smith: Strong evidence of bias against research from low income countries". 5 December 2017..
  14. Neylon, Cameron (3 September 2018). "The Local and the Global: Puncturing the myth of the "international" journal".
  15. "SciELO, Open Infrastructure and Independence". 3 September 2018., Leslie Chan.

Related Research Articles

Scopus is Elsevier's abstract and citation database, launched in 2004. Scopus covers 36,377 titles from 11,678 publishers, of which 34,346 are peer-reviewed journals in top-level subject fields: life sciences, social sciences, physical sciences, and health sciences. It covers three types of sources: book series, journals, and trade journals. Scopus also allows patent searches in a dedicated LexisNexis patent database, albeit with limited functionality.

The Institute for Scientific Information (ISI) was an academic publishing service, founded by Eugene Garfield in Philadelphia in 1956. ISI offered scientometric and bibliographic database services. Its specialty was citation indexing and analysis, a field pioneered by Garfield.

The impact factor (IF) or journal impact factor (JIF) of an academic journal is a scientometric index calculated by Clarivate that reflects the yearly mean number of citations of articles published in the last two years in a given journal, as indexed by Clarivate's Web of Science.
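
As a rough illustration of the calculation just described: the impact factor for a given year divides the citations received that year by items the journal published in the two preceding years by the number of citable items published in those years. A sketch with invented figures for a hypothetical journal:

```python
# Illustrative (invented) figures for a hypothetical journal.
citations_in_2023_to_2021_2022_items = 550
citable_items_2021 = 120
citable_items_2022 = 130

# JIF for 2023 = citations in 2023 to items from 2021-2022,
# divided by the count of citable items from 2021-2022.
impact_factor_2023 = citations_in_2023_to_2021_2022_items / (
    citable_items_2021 + citable_items_2022
)
print(f"JIF 2023 = {impact_factor_2023:.2f}")  # JIF 2023 = 2.20
```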

<span class="mw-page-title-main">Bibliometrics</span> Statistical analysis of written publications

Bibliometrics is the use of statistical methods to analyse books, articles, and other publications, especially with regard to scientific content. Bibliometric methods are frequently used in the field of library and information science. Bibliometrics is closely associated with scientometrics, the analysis of scientific metrics and indicators, to the point that the two fields largely overlap.

Scientometrics is the field of study which concerns itself with measuring and analysing scholarly literature. Scientometrics is a sub-field of informetrics. Major research issues include the measurement of the impact of research papers and academic journals, the understanding of scientific citations, and the use of such measurements in policy and management contexts. In practice there is a significant overlap between scientometrics and other scientific fields such as information systems, information science, science of science policy, sociology of science, and metascience. Critics have argued that over-reliance on scientometrics has created a system of perverse incentives, producing a publish or perish environment that leads to low-quality research.

Citation analysis is the examination of the frequency, patterns, and graphs of citations in documents. It uses the directed graph of citations — links from one document to another document — to reveal properties of the documents. A typical aim would be to identify the most important documents in a collection. A classic example is that of the citations between academic articles and books. For another example, judges of law support their judgements by referring back to judgements made in earlier cases. An additional example is provided by patents which contain prior art, citation of earlier patents relevant to the current claim. The digitization of patent data and increasing computing power have led to a community of practice that uses these citation data to measure innovation attributes, trace knowledge flows, and map innovation networks.
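
As a minimal illustration of this idea, the simplest citation analysis ranks documents by in-degree in the citation graph, i.e. by how many other documents cite them; real analyses typically use richer measures (co-citation, PageRank-style scores). The documents and edges below are invented:

```python
from collections import Counter

# Directed citation edges: (citing document, cited document).
# Document identifiers are invented for illustration.
edges = [
    ("D", "A"), ("C", "A"), ("B", "A"),
    ("D", "B"), ("C", "B"),
    ("D", "C"),
]

# Rank documents by in-degree: how many other documents cite each one.
in_degree = Counter(cited for _, cited in edges)
for doc, n in in_degree.most_common():
    print(doc, n)   # A 3, then B 2, then C 1
```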

<span class="mw-page-title-main">Eugene Garfield</span> American linguist and businessman (1925–2017)

Eugene Eli Garfield was an American linguist and businessman, one of the founders of bibliometrics and scientometrics. He helped to create Current Contents, Science Citation Index (SCI), Journal Citation Reports, and Index Chemicus, among others, and founded the magazine The Scientist.

<span class="mw-page-title-main">SciELO</span> Bibliographic database of open access journals

SciELO is a bibliographic database, digital library, and cooperative electronic publishing model of open access journals. SciELO was created to meet the scientific communication needs of developing countries and provides an efficient way to increase the visibility of and access to scientific literature. Originally established in Brazil in 1997, the SciELO network and its journal collections today span 16 countries: Argentina, Bolivia, Brazil, Chile, Colombia, Costa Rica, Cuba, Ecuador, Mexico, Paraguay, Peru, Portugal, South Africa, Spain, Uruguay, and Venezuela.

The h-index is an author-level metric that measures both the productivity and citation impact of a researcher's publications; it was initially used for individual scientists or scholars. The h-index correlates with success indicators such as winning the Nobel Prize, being accepted for research fellowships, and holding positions at top universities. The index is based on the set of the scientist's most cited papers and the number of citations that they have received in other publications. It has more recently been applied to the productivity and impact of a scholarly journal as well as groups of scientists, such as a department, university, or country. The index was suggested in 2005 by Jorge E. Hirsch, a physicist at UC San Diego, as a tool for determining theoretical physicists' relative quality and is sometimes called the Hirsch index or Hirsch number.
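
The definition lends itself to a short computation: h is the largest value such that h of the author's papers each have at least h citations. A sketch in Python (the citation counts are invented):

```python
def h_index(citations):
    """Return the largest h such that at least h papers
    have at least h citations each."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(cites, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

# A scholar with papers cited 10, 8, 5, 4, and 3 times has h = 4:
# four papers each have at least four citations.
print(h_index([10, 8, 5, 4, 3]))  # 4
```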

The Science Citation Index Expanded – previously titled Science Citation Index – is a citation index originally produced by the Institute for Scientific Information (ISI) and created by Eugene Garfield.

<span class="mw-page-title-main">Social Sciences Citation Index</span> Citation index product of Clarivate Analytics

The Social Sciences Citation Index (SSCI) is a commercial citation index product of Clarivate Analytics. It was originally developed by the Institute for Scientific Information from the Science Citation Index. The Social Sciences Citation Index is a multidisciplinary index which indexes over 3,400 journals across 58 social science disciplines – 1985 to present, and it has 122 million cited references – 1900 to present. It also includes a range of 3,500 selected items from some of the world's finest scientific and technical journals. It has a range of useful search functions such as 'cited reference searching', searching by author, subject, or title. Whilst the Social Sciences Citation Index provides extensive support in bibliographic analytics and research, a number of academic scholars have expressed criticisms relating to ideological bias and its English-dominant publishing nature.

Journal Citation Reports (JCR) is an annual publication by Clarivate. It has been integrated with the Web of Science and is accessed from the Web of Science Core Collection. It provides information about academic journals in the natural and social sciences, including impact factors. The JCR was originally published as a part of the Science Citation Index. Currently, the JCR, as a distinct service, is based on citations compiled from the Science Citation Index Expanded and the Social Sciences Citation Index. As of the 2023 edition, journals from the Arts and Humanities Citation Index and the Emerging Sources Citation Index are also included.

<span class="mw-page-title-main">Web of Science</span> Online subscription index of citations

The Web of Science is a paid-access platform that provides access to multiple databases of reference and citation data from academic journals, conference proceedings, and other documents in various academic disciplines. It was produced by the Institute for Scientific Information until 1997 and is currently owned by Clarivate.

The Redalyc project is a bibliographic database and a digital library of Open Access journals, supported by the Universidad Autónoma del Estado de México with the help of numerous other higher education institutions and information systems.

ResearcherID is an identifying system for scientific authors. The system was introduced in January 2008 by Thomson Reuters Corporation.

<span class="mw-page-title-main">Serbian Citation Index</span>

Serbian Citation Index is a combination of an online multidisciplinary bibliographic database, a national citation index, an Open Access full-text journal repository and an electronic publishing platform. It is produced and maintained by the Centre for Evaluation in Education and Science (CEON/CEES), based in Belgrade, Serbia. In July 2017, it indexed 230 Serbian scholarly journals in all areas of science and contained more than 80,000 bibliographic records and more than one million bibliographic references.

<span class="mw-page-title-main">Clarivate</span> American analytics company

Clarivate Plc is a British-American publicly traded analytics company that operates a collection of subscription-based services in the areas of bibliometrics and scientometrics; business and market intelligence and competitive profiling for pharmacy and biotech, patents, and regulatory compliance; and trademark, domain, and brand protection. In academia and the scientific community, Clarivate is known as the company that calculates the impact factor, using data from its Web of Science product family, which also includes services and applications such as Publons, EndNote, EndNote Click, and ScholarOne. Its other product families are Cortellis, DRG, CPA Global, Derwent, MarkMonitor, CompuMark, and Darts-ip, as well as the various ProQuest products and services.

The open science movement has expanded the uses of scientific output beyond specialized academic circles.

References

  1. Giles, C. Lee, Kurt D. Bollacker, and Steve Lawrence. "CiteSeer: An automatic citation indexing system." In Proceedings of the third ACM conference on Digital libraries, pp. 89-98. 1998.
  2. Lawrence, S. R.; Bollacker, K. D.; Giles, C. L. "Autonomous citation indexing and literature browsing using citation context". US Patent 6,738,780, 2004.
  3. Hutchins, BI; Baker, KL; Davis, MT; Diwersy, MA; Haque, E; Harriman, RM; Hoppe, TA; Leicht, SA; Meyer, P; Santangelo, GM (October 2019). "The NIH Open Citation Collection: A public access, broad coverage resource". PLOS Biology. 17 (10): e3000385. doi:10.1371/journal.pbio.3000385. PMC 6786512. PMID 31600197.
  4. Bella Hass Weinberg, "The Earliest Hebrew Citation Indexes" in Trudi Bellardo Hahn, Michael Keeble Buckland, eds., Historical Studies in Information Science, 1998, p. 51ff
  5. Bella Hass Weinberg, "Predecessors of Scientific Indexing Structures in the Domain of Religion" in W. Boyden Rayward, Mary Ellen Bowden, The History and Heritage of Scientific and Technological Information Systems, Proceedings of the 2002 Conference, 2004, p. 126ff
  6. Shapiro, Fred R. (1992). "Origins of bibliometrics, citation indexing, and citation analysis: The neglected legal literature". Journal of the American Society for Information Science. 43 (5): 337–339. doi:10.1002/(SICI)1097-4571(199206)43:5<337::AID-ASI2>3.0.CO;2-T.
  7. Small, Henry (2018-03-02). "Citation Indexing Revisited: Garfield's Early Vision and Its Implications for the Future". Frontiers in Research Metrics and Analytics. 3: 8. doi:10.3389/frma.2018.00008. ISSN 2504-0537.
  8. Garfield, Eugene (2000). The Web of Knowledge: A Festschrift in Honor of Eugene Garfield. Information Today, Inc. pp. 16–18. ISBN 978-1-57387-099-3.
  9. Garfield, Eugene (1955-07-15). "Citation Indexes for Science: A New Dimension in Documentation through Association of Ideas". Science. 122 (3159): 108–111. Bibcode:1955Sci...122..108G. doi:10.1126/science.122.3159.108. ISSN 0036-8075. PMID 14385826.
  10. Garfield, Eugene (1963). "Science Citation Index" (PDF). University of Pennsylvania Garfield Library. pp. v–xvi. Retrieved 2013-05-27.
  11. "Web of Science". Clarivate. Retrieved April 24, 2022.
  12. Mongeon, Philippe; Paul-Hus, Adèle (2016). "The Journal Coverage of Web of Science and Scopus: A Comparative Analysis". Scientometrics. 106: 213–228. arXiv:1511.08096. doi:10.1007/s11192-015-1765-5. S2CID 17753803.
  13. Archambault, Éric; Campbell, David; Gingras, Yves; Larivière, Vincent (2009). "Comparing Bibliometric Statistics Obtained from the Web of Science and Scopus". Journal of the American Society for Information Science and Technology. 60 (7): 1320–1326. arXiv:0903.5254. Bibcode:2009arXiv0903.5254A. doi:10.1002/asi.21062. S2CID 1168518.
  14. Falagas, Matthew E.; Pitsouni, Eleni I.; Malietzis, George A.; Pappas, Georgios (2008). "Comparison of PubMed, Scopus, Web of Science, and Google Scholar: Strengths and Weaknesses". The FASEB Journal. 22 (2): 338–342. doi:10.1096/fj.07-9492LSF. PMID 17884971. S2CID 303173.
  15. Alonso, S.; Cabrerizo, F.J.; Herrera-Viedma, E.; Herrera, F. (2009). "H-Index: A Review Focused in Its Variants, Computation and Standardization for Different Scientific Fields" (PDF). Journal of Informetrics. 3 (4): 273–289. doi:10.1016/j.joi.2009.04.001.
  16. Harzing, Anne-Wil; Alakangas, Satu (2016). "Google Scholar, Scopus and the Web of Science: A Longitudinal and Cross-Disciplinary Comparison" (PDF). Scientometrics. 106 (2): 787–804. doi:10.1007/s11192-015-1798-9. S2CID 207236780. Archived from the original (PDF) on 2021-06-23. Retrieved 2021-02-21.
  17. Robinson-Garcia, Nicolas; Chavarro, Diego Andrés; Molas-Gallart, Jordi; Ràfols, Ismael (2016-05-28). "On the Dominance of Quantitative Evaluation in 'Peripheral' Countries: Auditing Research with Technologies of Distance". SSRN 2818335.
  18. Vanholsbeeck, Marc; Thacker, Paul; Sattler, Susanne; Ross-Hellauer, Tony; Rivera-López, Bárbara S.; Rice, Curt; Nobes, Andy; Masuzzo, Paola; Martin, Ryan; Kramer, Bianca; Havemann, Johanna; Enkhbayar, Asura; Davila, Jacinto; Crick, Tom; Crane, Harry; Tennant, Jonathan P. (2019-03-11). "Ten Hot Topics around Scholarly Publishing". Publications. 7 (2): 34. doi:10.3390/publications7020034.
  19. Ribeiro, Leonardo Costa; Rapini, Márcia Siqueira; Silva, Leandro Alves; Albuquerque, Eduardo Motta (2018). "Growth Patterns of the Network of International Collaboration in Science". Scientometrics. 114: 159–179. doi:10.1007/s11192-017-2573-x. S2CID 19052437.
  20. Chinchilla-Rodríguez, Zaida; Miao, Lili; Murray, Dakota; Robinson-García, Nicolás; Costas, Rodrigo; Sugimoto, Cassidy R. (2018). "A Global Comparison of Scientific Mobility and Collaboration According to National Scientific Capacities". Frontiers in Research Metrics and Analytics. 3. doi:10.3389/frma.2018.00017. hdl:10261/170058.
  21. Boshoff, Nelius; Akanmu, Moses A. (2018). "Scopus or Web of Science for a Bibliometric Profile of Pharmacy Research at a Nigerian University?". South African Journal of Libraries and Information Science. 83 (2). doi:10.7553/83-2-1682.
  22. Wang, Yuandi; Hu, Ruifeng; Liu, Meijun (2017). "The Geotemporal Demographics of Academic Journals from 1950 to 2013 According to Ulrich's Database". Journal of Informetrics. 11 (3): 655–671. doi:10.1016/j.joi.2017.05.006. hdl:10722/247620.
  23. Gutiérrez, Javier; López-Nieva, Pedro (2001). "Are International Journals of Human Geography Really International?". Progress in Human Geography. 25: 53–69. doi:10.1191/030913201666823316. S2CID 144150221.
  24. Rosenstreich, Daniela; Wooliscroft, Ben (2006). "How International Are the Top Academic Journals? The Case of Marketing". European Business Review. 18 (6): 422–436. doi:10.1108/09555340610711067.
  25. "The Under-Representation of Developing Countries in the Main Bibliometric Databases: A Comparison of Rice Studies in the Web of Science, Scopus and CAB Abstracts". Context Counts: Pathways to Master Big and Little Data. Proceedings of the Science and Technology Indicators Conference 2014 Leiden. pp. 97–106.
  26. Rafols, Ismael; Ciarli, Tommaso; Chavarro, Diego (2015). "Under-Reporting Research Relevant to Local Needs in the Global South. Database Biases in the Representation of Knowledge on Rice". ISSI. doi:10.13039/501100000269. S2CID 11720845.
  27. Chavarro, D.; Tang, P.; Rafols, I. (2014). "Interdisciplinarity and Research on Local Issues: Evidence from a Developing Country". Research Evaluation. 23 (3): 195–209. arXiv:1304.6742. doi:10.1093/reseval/rvu012. hdl:10251/85447. S2CID 1466718.
  28. Justice and the Dynamics of Research and Publication in Africa: Interrogating the Performance of "Publish or Perish". Uganda Martyrs University. 2017. ISBN 978-9970-09-009-9.
  29. Alperin, Juan Pablo; Rozemblum, Cecilia (2017). "La reinterpretación de visibilidad y calidad en las nuevas políticas de evaluación de revistas científicas" [The reinterpretation of visibility and quality in the new policies for evaluating scientific journals]. Revista Interamericana de Bibliotecología (in Spanish). 40 (3): 231–241. doi:10.17533/udea.rib.v40n3a04.
  30. Paasi, Anssi (2015). "Academic Capitalism and the Geopolitics of Knowledge". The Wiley Blackwell Companion to Political Geography. pp. 507–523. doi:10.1002/9781118725771.ch37. ISBN 978-1-118-72577-1.
  31. Tietze, Susanne; Dick, Penny (2013). "The Victorious English Language: Hegemonic Practices in the Management Academy" (PDF). Journal of Management Inquiry. 22 (1): 122–134. doi:10.1177/1056492612444316. S2CID 143610201.
  32. Aalbers, Manuel B. (2004). "Creative Destruction through the Anglo-American Hegemony: A Non-Anglo-American View on Publications, Referees and Language". Area. 36 (3): 319–22. doi:10.1111/j.0004-0894.2004.00229.x.
  33. Hwang, Kumju (June 1, 2005). "The Inferior Science and the Dominant Use of English in Knowledge Production: A Case Study of Korean Science and Technology". Science Communication. doi:10.1177/1075547005275428. S2CID 144242790.
  34. Rivera-López, Bárbara Sofía (September 1, 2016). Uneven Writing Spaces in Academic Publishing: A Case Study on Internationalisation in the Disciplines of Biochemistry and Molecular Biology (Thesis). doi:10.31237/osf.io/8cypr. S2CID 210180559.
  35. Lillis, Theresa M.; Curry, Mary Jane (2013). Academic Writing in a Global Context: The Politics and Practices of Publishing in English. Routledge. ISBN 978-0-415-46881-7.
  36. Minca, C. (2013). "(Im)Mobile Geographies". Geographica Helvetica. 68 (1): 7–16. doi:10.5194/gh-68-7-2013.
  37. "Knowledge and Nations: Global Scientific Collaboration in the 21st Century". March 2011.
  38. Okune, Angela; Hillyer, Rebecca; Albornoz, Denisse; Posada, Alejandro; Chan, Leslie (June 20, 2018). "Whose Infrastructure? Towards Inclusive and Collaborative Knowledge Infrastructures in Open Science". Connecting the Knowledge Commons: From Projects to Sustainable Infrastructure. doi:10.4000/proceedings.elpub.2018.31.