Scientific citation

Reference section in a scientific paper.

A scientific citation is a detailed reference, in a scientific publication such as a paper or book, to a previously published (or occasionally private) communication that has a bearing on the subject of the new work. Citations in original work allow readers to consult the cited material to help them judge the new work, to locate background information vital for future development, and to acknowledge the contributions of earlier workers. Citations in a review paper, by contrast, bring together many sources, often recent, in one place.

To a considerable extent, the quality of scientific work is judged, in the absence of other criteria, by the number of citations it receives, adjusted for the volume of work in the relevant topic. While this is not necessarily a reliable measure, counting citations is trivially easy, whereas judging the merit of complex work can be very difficult.

Previous work may be cited regarding experimental procedures, apparatus, goals, theses, and prior theoretical results upon which the new work builds. Such citations establish the framework of influences and the mindset of the research, situate it within a particular branch of science, and can help determine who conducts the peer review.

Patent references

In patent law, the citation of previous works, or prior art, helps establish the uniqueness of the invention being described. Because the aim of this practice is to claim originality for commercial purposes, the author is motivated to avoid citing works that cast doubt on that originality, so it is not "scientific" citation in the sense above. Inventors and their lawyers nonetheless have a legal obligation to cite all relevant art; failing to do so risks invalidating the patent. The patent examiner is in turn obliged to list any further prior art found in searches.

Digital object identifier (DOI)

A digital object identifier (DOI) is a persistent identifier or handle used to uniquely identify various objects, standardized by the International Organization for Standardization (ISO).[1] DOIs are an implementation of the Handle System;[2][3] they also fit within the Uniform Resource Identifier (URI) system. They are widely used to identify academic, professional, and government information, such as journal articles, research reports, data sets, and official publications.

A DOI aims to resolve to its target, the information object to which the DOI refers. This is achieved by binding the DOI to metadata about the object, such as a URL where the object is located. Thus, by being actionable and interoperable, a DOI differs from ISBNs or ISRCs which are identifiers only. The DOI system uses the indecs Content Model for representing metadata.
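As a sketch of how this resolution works in practice, a DOI can be made actionable through the doi.org resolver, which redirects to the URL registered for the object. The helper names below are ours, not part of any DOI library; the example DOI belongs to the Price (1965) article cited later in this article.

```python
# Minimal sketch of DOI resolution via the doi.org proxy, using only the
# standard library. Function names are illustrative, not an official API.
import urllib.request

def doi_to_url(doi: str) -> str:
    """Build the actionable resolver URL for a DOI."""
    return f"https://doi.org/{doi}"

def resolve_doi(doi: str) -> str:
    """Follow the resolver's redirect and return the current target URL.
    Requires network access; the binding can change as metadata is updated."""
    req = urllib.request.Request(doi_to_url(doi), method="HEAD")
    with urllib.request.urlopen(req) as resp:
        return resp.url

print(doi_to_url("10.1126/science.149.3683.510"))
# https://doi.org/10.1126/science.149.3683.510
```

Because the DOI is bound to metadata rather than to a fixed location, the same identifier keeps working when the publisher moves the object; only the registered URL changes.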

Research and development

Citation analysis is a method widely used in metascience.

Citation analysis

Citation analysis is the examination of the frequency, patterns, and graphs of citations in documents. It uses the directed graph of citation links from one document to another to reveal properties of the documents. A typical aim is to identify the most important documents in a collection. A classic example is the network of citations between academic articles and books.[4][5] For another example, judges of law support their judgements by referring back to judgements made in earlier cases (see citation analysis in a legal context). A further example is provided by patents, which contain prior art: citations of earlier patents relevant to the current claim. The digitization of patent data and increasing computing power have led to a community of practice that uses these citation data to measure innovation attributes, trace knowledge flows, and map innovation networks.[6]
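The directed-graph view can be sketched in a few lines: represent each citation as an edge from the citing document to the cited one, and rank documents by in-degree, i.e. how often they are cited. The document labels below are hypothetical.

```python
# Sketch: citations as a directed graph (citing -> cited); the most
# important documents, in the simplest analysis, are those with the
# highest in-degree (most incoming citation links).
from collections import Counter

citations = [
    ("B", "A"), ("C", "A"), ("D", "A"),  # A is cited three times
    ("C", "B"), ("D", "B"),
    ("E", "D"),
]

in_degree = Counter(cited for _citing, cited in citations)
most_cited = in_degree.most_common()
print(most_cited)  # [('A', 3), ('B', 2), ('D', 1)]
```

Real analyses apply richer graph measures (e.g. PageRank-style scores) to the same structure, but in-degree is the basic citation count underlying most indicators.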

Documents can be associated with many features in addition to citations, such as authors, publishers, and journals, as well as their actual texts. The general analysis of collections of documents is known as bibliometrics, and citation analysis is a key part of that field. For example, bibliographic coupling and co-citation are association measures based on citation analysis (shared citations or shared references). The citations in a collection of documents can also be represented in forms such as a citation graph, as pointed out by Derek J. de Solla Price in his 1965 article "Networks of Scientific Papers".[7] Citation analysis thus draws on aspects of social network analysis and network science.

An early example of automated citation indexing was CiteSeer, which indexed citations between academic papers, while Web of Science is an example of a modern system that covers more than academic books and articles, reflecting a wider range of information sources. Today, automated citation indexing[8] has changed the nature of citation analysis research, allowing millions of citations to be analyzed for large-scale patterns and knowledge discovery. Citation analysis tools can be used to compute various impact measures for scholars based on data from citation indices.[9][10][note 1] These have various applications, from identifying expert referees for papers and grant proposals, to providing transparent data in support of academic merit review, tenure, and promotion decisions. This competition for limited resources may lead to ethically questionable behavior aimed at increasing citations.[11][12]

The practice of naively using citation analysis to compare the impact of different scholarly articles, without taking into account other factors that may affect citation patterns, has drawn a great deal of criticism.[13] A recurrent criticism concerns "field-dependent factors": citation practices vary from one area of science to another, and even between fields of research within a discipline.[14]

Citation frequency

Modern scientists are sometimes judged by the number of times their work is cited by others; this is a key indicator of the relative importance of a work in science. Individual scientists are accordingly motivated to have their own work cited early, often, and as widely as possible, while scientists collectively have an interest in eliminating unnecessary citations so as not to devalue this means of judgment.[15] A formal citation index tracks which refereed and reviewed papers have cited which other such papers. Baruch Lev and other advocates of accounting reform consider the number of times a patent is cited to be a significant metric of its quality, and thus of innovation. Reviews often replace citations to primary studies.[16]

Citation frequency is one indicator used in scientometrics.
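One widely used indicator built on citation frequency is the h-index: the largest h such that h of an author's papers each have at least h citations. A minimal sketch, using hypothetical citation counts:

```python
# Sketch of the h-index computation from a list of per-paper citation
# counts. The input lists below are made-up examples.
def h_index(citation_counts):
    """Largest h such that h papers each have at least h citations."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4: four papers have >= 4 citations
print(h_index([25, 8, 5, 3, 3]))  # 3: one huge count cannot raise h alone
```

The second example illustrates why such indicators are only rough proxies: a single very highly cited paper moves the h-index no more than a moderately cited one.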

Replication crisis

Some studies have explored citations and citation frequencies in relation to reproducibility. Researchers found that papers in leading journals with findings that cannot be replicated tend to be cited more than reproducible science. Results that are published non-replicably, or not in a sufficiently transparent and replicable way, are more likely to be wrong and may slow progress; according to one author, "a simple way to check how often studies have been repeated, and whether or not the original findings are confirmed" is needed. The authors also put forward possible explanations for this state of affairs.[17][18]

Progress and citation consolidation

Various results from scientific citation analysis, including the finding that papers and patents use narrower portions of existing knowledge.

Two metascientists reported that, in growing scientific fields, citations disproportionately go to already well-cited papers, possibly slowing or inhibiting canonical progress to some degree in some cases. They found that "structures fostering disruptive scholarship and focusing attention on novel ideas" could be important.[20][21][22]
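The dynamic they describe can be illustrated with a toy cumulative-advantage simulation (our construction for illustration, not the authors' model): each new citation goes to a paper with probability proportional to the citations it already has, so early leads compound.

```python
# Toy "rich get richer" simulation: citations preferentially attach to
# already well-cited papers. All quantities here are made up.
import random

random.seed(0)
counts = [0] * 100             # 100 hypothetical papers

for _ in range(10_000):        # allocate 10,000 citations one at a time
    weights = [c + 1 for c in counts]   # +1 so new papers can be picked
    winner = random.choices(range(len(counts)), weights=weights)[0]
    counts[winner] += 1

counts.sort(reverse=True)
top10_share = sum(counts[:10]) / sum(counts)
print(f"top 10% of papers receive {top10_share:.0%} of citations")
```

Under uniform allocation the top 10% of papers would receive about 10% of citations; under preferential attachment their share is far larger, matching the skewed citation distributions observed empirically.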

Other metascientists introduced the 'CD index', intended to characterize "how papers and patents change networks of citations in science and technology", and reported that it has declined, which they interpret as "slowing rates of disruption". They link this to changes in three indicators of the use of previous knowledge, which they interpret as showing that "contemporary discovery and invention" are informed by "a narrower scope of existing knowledge". The overall number of papers has risen while the number of "highly disruptive" papers has not; notably, the 1998 discovery of the accelerating expansion of the universe has a CD index of 0. Their results also suggest that scientists and inventors "may be struggling to keep up with the pace of knowledge expansion".[23][21][19]
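A hedged sketch of a CD-style calculation, in a commonly used simplified form (the paper labels are hypothetical): among later papers that cite the focal paper and/or the focal paper's own references, those citing only the focal paper count as disruptive use, those citing the focal paper together with its references count as consolidating use, and the index is their normalized difference.

```python
# Simplified CD-style disruption measure:
#   CD = (N_f - N_b) / (N_f + N_b + N_r)
# N_f: later papers citing the focal paper but none of its references
# N_b: later papers citing both the focal paper and its references
# N_r: later papers citing only the focal paper's references
def cd_index(focal_refs, later_papers):
    """later_papers: iterable of (cites_focal: bool, refs: set) pairs."""
    n_f = n_b = n_r = 0
    for cites_focal, paper_refs in later_papers:
        cites_refs = bool(paper_refs & focal_refs)
        if cites_focal and not cites_refs:
            n_f += 1                     # disruptive use
        elif cites_focal and cites_refs:
            n_b += 1                     # consolidating use
        elif cites_refs:
            n_r += 1                     # bypasses the focal paper
    total = n_f + n_b + n_r
    return (n_f - n_b) / total if total else 0.0

focal_refs = {"X", "Y"}                  # the focal paper cites X and Y
later = [
    (True,  set()),    # cites focal only        -> disruptive
    (True,  {"X"}),    # cites focal and X       -> consolidating
    (False, {"Y"}),    # cites Y only
    (True,  set()),    # cites focal only        -> disruptive
]
print(cd_index(focal_refs, later))  # (2 - 1) / (2 + 1 + 1) = 0.25
```

A value near +1 indicates later work cites the focal paper while ignoring its predecessors (disruption); a value near -1 indicates the focal paper is cited alongside its predecessors (consolidation); 0 is balanced.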

IT systems

Research discovery

Stages of the research and publication processes and their metadata, including citation metadata.

Recommendation systems sometimes use citations to find studies similar to the one the user is currently reading, or studies the user may be interested in and find useful.[25] Better availability of integrable open citation information could help address the "overwhelming amount of scientific literature".[24]

Q&A agents

Knowledge agents may use citations to find studies relevant to a user's query; in particular, citation statements are used by scite.ai to answer a question while providing the associated reference(s).[26]

Wikipedia

Years of publication of a set of analyzed scientific articles referenced in Wikipedia (box and violin plots; outliers shown in red).

There has also been analysis of citations of science information on Wikipedia and of scientific citations on the site, e.g. enabling lists of the most relevant or most-cited scientific journals, categories, and dominant domains.[27] Since 2015, the altmetrics platform Altmetric.com has shown the citing English Wikipedia articles for a given study, later adding other language editions.[27][28] The Wikimedia platform under development, Scholia, also shows "Wikipedia mentions" of scientific works.[29]

A study suggests a citation on Wikipedia "could be considered a public parallel to scholarly citation".[30] A scientific publication being "cited in a Wikipedia article is considered an indicator of some form of impact for this publication", and it may be possible to detect certain publications through changes to Wikipedia articles.[31] Wikimedia Research's Cite-o-Meter tool showed a league table of which academic publishers are most cited on Wikipedia,[30] as does a page maintained by the Academic Journals WikiProject.[32][33] Research indicates that a large share of academic citations on the platform are paywalled and hence inaccessible to many readers.[34][35]

"[citation needed]" is a tag added by Wikipedia editors to unsourced statements in articles, requesting that citations be added.[36] The phrase reflects Wikipedia's policies of verifiability and no original research and has become a general Internet meme.[37]

Differentiation of semantic citation contexts

Percent of all citances in each field that contain signals of disagreement. [38]

The tool scite.ai tracks and links citations of papers, classifying each as 'Supporting', 'Mentioning', or 'Contrasting' the cited study. Differentiating these citation contexts, even imperfectly, may be useful for evaluation and metrics, and for, e.g., discovering studies or statements that contrast with statements within a specific study.[39][40][41]

Retractions

The Scite Reference Check bot is an extension of scite.ai that scans new article PDFs "for references to retracted papers, and posts both the citing and retracted papers on Twitter" and also "flags when new studies cite older ones that have issued corrections, errata, withdrawals, or expressions of concern". [41] Studies have suggested as few as 4% of citations to retracted papers clearly recognize the retraction. [41] Research found "that authors tend to keep citing retracted papers long after they have been red flagged, although at a lower rate". [42]

Notes

  1. Examples include subscription-based tools based on proprietary data, such as Web of Science and Scopus, and free tools based on open data, such as Scholarometer by Filippo Menczer and his team.


    References

    1. "ISO 26324:2012(en), Information and documentation – Digital object identifier system". ISO. Archived from the original on 17 June 2016. Retrieved 20 April 2016.
    2. "The Handle System". Handle.Net Registry. Archived from the original on Jan 7, 2023.
    3. "Factsheets". DOI. Archived from the original on Dec 25, 2022.
    4. Rubin, Richard (2010). Foundations of library and information science (3rd ed.). New York: Neal-Schuman Publishers. ISBN   978-1-55570-690-6.
    5. Garfield, E. (1983). Citation Indexing: Its Theory and Application in Science, Technology and Humanities. Philadelphia: ISI Press.
    6. Jaffe, Adam; de Rassenfosse, Gaétan (2017). "Patent citation data in social science research: Overview and best practices". Journal of the Association for Information Science and Technology. 68: 1360–1374.
    7. Derek J. de Solla Price (July 30, 1965). "Networks of Scientific Papers" (PDF). Science . 149 (3683): 510–515. Bibcode:1965Sci...149..510D. doi:10.1126/science.149.3683.510. PMID   14325149.
    8. Giles, C. Lee; Bollacker, Kurt D.; Lawrence, Steve (1998), "CiteSeer", Proceedings of the third ACM conference on Digital libraries - DL '98, New York: Association for Computing Machinery, pp. 89–98, doi:10.1145/276675.276685, ISBN   978-0-89791-965-4, S2CID   514080
    9. Kaur, Jasleen; Diep Thi Hoang; Xiaoling Sun; Lino Possamai; Mohsen JafariAsbagh; Snehal Patil; Filippo Menczer (2012). "Scholarometer: A Social Framework for Analyzing Impact across Disciplines". PLOS ONE. 7 (9): e43235. Bibcode:2012PLoSO...743235K. doi: 10.1371/journal.pone.0043235 . PMC   3440403 . PMID   22984414.
    10. Hoang, D.; Kaur, J.; Menczer, F. (2010), "Crowdsourcing Scholarly Data", Proceedings of the WebSci10: Extending the Frontiers of Society On-Line, April 26-27th, 2010, Raleigh, NC: US, archived from the original on 2015-04-17, retrieved 2015-08-09
    11. Anderson, M.S.; Ronning, E.A.; de Vries, R.; Martinson, B.C. (2007). "The perverse effects of competition on scientists' work and relationships". Science and Engineering Ethics. 13 (4): 437–461. doi:10.1007/s11948-007-9042-5. PMID 18030595. S2CID 2994701.
    12. Wesel, M. van (2016). "Evaluation by Citation: Trends in Publication Behavior, Evaluation Criteria, and the Strive for High Impact Publications". Science and Engineering Ethics. 22 (1): 199–225. doi:10.1007/s11948-015-9638-0. PMC   4750571 . PMID   25742806.
    13. Bornmann, L.; Daniel, H. D. (2008). "What do citation counts measure? A review of studies on citing behavior". Journal of Documentation. 64 (1): 45–80. doi:10.1108/00220410810844150. hdl: 11858/00-001M-0000-0013-7A94-3 . S2CID   17260826.
    14. Anauati, Maria Victoria; Galiani, Sebastian; Gálvez, Ramiro H. (11 November 2014). "Quantifying the Life Cycle of Scholarly Articles Across Fields of Economic Research". Available at SSRN: https://ssrn.com/abstract=2523078
    15. Aksnes, Dag W.; Langfeldt, Liv; Wouters, Paul (2019-01-01). "Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories". SAGE Open. 9. doi: 10.1177/2158244019829575 . hdl: 1887/78034 . S2CID   150974941.
    16. Gurevitch, Jessica; Koricheva, Julia; Nakagawa, Shinichi; Stewart, Gavin (March 2018). "Meta-analysis and the science of research synthesis". Nature. 555 (7695): 175–182. Bibcode:2018Natur.555..175G. doi:10.1038/nature25753. ISSN   1476-4687. PMID   29517004. S2CID   3761687.
    17. "A new replication crisis: Research that is less likely to be true is cited more". phys.org. Retrieved 14 June 2021.
    18. Serra-Garcia, Marta; Gneezy, Uri (2021-05-01). "Nonreplicable publications are cited more than replicable ones". Science Advances. 7 (21): eabd1705. Bibcode:2021SciA....7.1705S. doi:10.1126/sciadv.abd1705. ISSN   2375-2548. PMC   8139580 . PMID   34020944.
    19. Park, Michael; Leahey, Erin; Funk, Russell J. (January 2023). "Papers and patents are becoming less disruptive over time". Nature. 613 (7942): 138–144. arXiv: 2106.11184 . Bibcode:2023Natur.613..138P. doi:10.1038/s41586-022-05543-x. ISSN   1476-4687. PMID   36600070. S2CID   255466666.
    20. Snyder, Alison. "New ideas are struggling to emerge from the sea of science". Axios. Retrieved 15 November 2021.
    21. Thompson, Derek (11 January 2023). "The Consolidation-Disruption Index Is Alarming". The Atlantic. Retrieved 25 February 2023.
    22. Chu, Johan S. G.; Evans, James A. (12 October 2021). "Slowed canonical progress in large fields of science". Proceedings of the National Academy of Sciences. 118 (41). Bibcode:2021PNAS..11821636C. doi: 10.1073/pnas.2021636118 . ISSN   0027-8424. PMC   8522281 . PMID   34607941.
    23. Tejada, Patricia Contreras (13 January 2023). "With fewer disruptive studies, is science becoming an echo chamber?". Advanced Science News. Archived from the original on 15 February 2023. Retrieved 15 February 2023.
    24. Nüst, Daniel; Yücel, Gazi; Cordts, Anette; Hauschke, Christian (4 January 2023). "Enriching the scholarly metadata commons with citation metadata and spatio-temporal metadata to support responsible research assessment and research discovery". arXiv: 2301.01502 [cs.DL].
    25. Beel, Joeran; Gipp, Bela; Langer, Stefan; Breitinger, Corinna (1 November 2016). "Research-paper recommender systems: a literature survey". International Journal on Digital Libraries. 17 (4): 305–338. doi:10.1007/s00799-015-0156-0. ISSN   1432-1300. S2CID   254074596.
    26. "How does ask a question work?". scite.ai. Retrieved 25 February 2023.
    27. Arroyo-Machado, Wenceslao; Torres-Salinas, Daniel; Herrera-Viedma, Enrique; Romero-Frías, Esteban (10 February 2020). "Science through Wikipedia: A novel representation of open knowledge through co-citation networks". PLOS ONE. 15 (2): e0228713. arXiv: 2002.04347 . Bibcode:2020PLoSO..1528713A. doi: 10.1371/journal.pone.0228713 . ISSN   1932-6203. PMC   7010282 . PMID   32040488.
    28. "New Source Alert: Wikipedia". Altmetric. 4 February 2015. Retrieved 25 February 2023.
    29. Arroyo-Machado, Wenceslao; Torres-Salinas, Daniel; Costas, Rodrigo (20 December 2022). "Wikinformetrics: Construction and description of an open Wikipedia knowledge graph data set for informetric purposes". Quantitative Science Studies. 3 (4): 931–952. doi:10.1162/qss_a_00226. hdl: 10481/80532 . S2CID   253107766.
    30. Priem, Jason (6 July 2015). "Altmetrics (Chapter from Beyond Bibliometrics: Harnessing Multidimensional Indicators of Scholarly Impact)". arXiv: 1507.01328 [cs.DL].
    31. Zagorova, Olga; Ulloa, Roberto; Weller, Katrin; Flöck, Fabian (12 April 2022). ""I updated the <ref>": The evolution of references in the English Wikipedia and the implications for altmetrics" (PDF). Quantitative Science Studies. 3 (1): 147–173. doi: 10.1162/qss_a_00171 .
    32. Katz, Gilad; Rokach, Lior (8 January 2016). "Wikiometrics: A Wikipedia Based Ranking System". arXiv: 1601.01058 [cs.DL].
    33. "Wikipedia:WikiProject Academic Journals/Journals cited by Wikipedia". Wikipedia. 15 September 2022. Retrieved 25 February 2023.
    34. Leva, Federico (21 February 2022). "Wikipedia is open to all, the research underpinning it should be too". Impact of Social Sciences. Retrieved 25 February 2023.
    35. Tattersall, Andy; Sheppard, Nick; Blake, Thom; O'Neill, Kate; Carroll, Chris (2 February 2022). "Exploring open access coverage of Wikipedia-cited research across the White Rose Universities" (PDF). Insights: The UKSG Journal. 35: 3. doi: 10.1629/uksg.559 . S2CID   246504456.
    36. Redi, Miriam; Fetahu, Besnik; Morgan, Jonathan; Taraborelli, Dario (13 May 2019). "Citation Needed: A Taxonomy and Algorithmic Assessment of Wikipedia's Verifiability". The World Wide Web Conference. WWW '19. San Francisco, CA, USA: Association for Computing Machinery. pp. 1567–1578. doi:10.1145/3308558.3313618. ISBN   978-1-4503-6674-8. S2CID   67856117.
    37. McDowell, Zachary J.; Vetter, Matthew A. (2022). "What Counts as Information: The Construction of Reliability and Verifability". Wikipedia and the Representation of Reality. Routledge, Taylor & Francis. p. 34. doi: 10.4324/9781003094081 . hdl:20.500.12657/50520. ISBN   978-1-000-47427-5.
    38. Lamers, Wout S; Boyack, Kevin; Larivière, Vincent; Sugimoto, Cassidy R; van Eck, Nees Jan; Waltman, Ludo; Murray, Dakota (24 December 2021). "Investigating disagreement in the scientific literature". eLife. 10: e72737. doi: 10.7554/eLife.72737 . ISSN   2050-084X. PMC   8709576 . PMID   34951588.
    39. Khamsi, Roxanne (1 May 2020). "Coronavirus in context: Scite.ai tracks positive and negative citations for COVID-19 literature". Nature. doi:10.1038/d41586-020-01324-6 . Retrieved 19 February 2022.
    40. Nicholson, Josh M.; Mordaunt, Milo; Lopez, Patrice; Uppala, Ashish; Rosati, Domenic; Rodrigues, Neves P.; Grabitz, Peter; Rife, Sean C. (5 November 2021). "scite: A smart citation index that displays the context of citations and classifies their intent using deep learning" (PDF). Quantitative Science Studies. 2 (3): 882–898. doi: 10.1162/qss_a_00146 . S2CID   232283218.
    41. "New bot flags scientific studies that cite retracted papers". Nature Index. 2 February 2021. Retrieved 25 January 2023.
    42. Peng, Hao; Romero, Daniel M.; Horvát, Emőke-Ágnes (21 June 2022). "Dynamics of cross-platform attention to retracted papers". Proceedings of the National Academy of Sciences. 119 (25): e2119086119. arXiv: 2110.07798 . Bibcode:2022PNAS..11919086P. doi: 10.1073/pnas.2119086119 . ISSN   0027-8424. PMC   9231484 . PMID   35700358.

    Further reading