Open science

Pillars of open science according to UNESCO's 2021 Open Science Recommendation. [1]

Open science is the movement to make scientific research (including publications, data, physical samples, and software) and its dissemination accessible to all levels of society, amateur or professional. [2] [3] Open science is transparent and accessible knowledge that is shared and developed through collaborative networks. [4] It encompasses practices such as publishing open research, campaigning for open access, encouraging scientists to practice open-notebook science (such as openly sharing data and code [5] ), broader dissemination and engagement in science [6] and generally making it easier to publish, access and communicate scientific knowledge.

Usage of the term varies substantially across disciplines, with a notable prevalence in the STEM disciplines. Open research is often used quasi-synonymously to address the gap that the denotation of "science" might leave regarding the inclusion of the Arts, Humanities and Social Sciences. The primary focus connecting all disciplines is the widespread uptake of new technologies and tools, and the underlying ecology of the production, dissemination and reception of knowledge from a research-based point of view. [7] [8]

As Tennant et al. (2020) [9] note, the term open science "implicitly seems only to regard ‘scientific’ disciplines, whereas open scholarship can be considered to include research from the Arts and Humanities, [10] [11] as well as the different roles and practices that researchers perform as educators and communicators, and an underlying open philosophy of sharing knowledge beyond research communities."

Open science can be seen as a continuation of, rather than a revolution in, practices begun in the 17th century with the advent of the academic journal, when the societal demand for access to scientific knowledge reached a point at which it became necessary for groups of scientists to share resources [12] with each other. [13] In modern times there is debate about the extent to which scientific information should be shared. [14] [5] The conflict that led to the Open Science movement is between the desire of scientists to have access to shared resources versus the desire of individual entities to profit when other entities partake of their resources. [15] Additionally, the status of open access and resources that are available for its promotion are likely to differ from one field of academic inquiry to another. [16]

Principles

Open science elements based on a UNESCO presentation of 17 February 2021. This depiction includes indigenous science.

The six principles of open science are: [17]

  • Open methodology
  • Open source
  • Open data
  • Open access
  • Open peer review
  • Open educational resources

Background

Science is broadly understood as collecting, analyzing, publishing, reanalyzing, criticizing, and reusing data. Proponents of open science identify a number of barriers that impede or dissuade the broad dissemination of scientific data. [5] [18] These include financial paywalls of for-profit research publishers, restrictions on usage applied by publishers of data, poor formatting of data or use of proprietary software that makes it difficult to re-purpose, and cultural reluctance to publish data for fears of losing control of how the information is used. [5] [18] [19]

According to the FOSTER taxonomy, [20] open science can often include aspects of Open access, Open data and the open source movement, whereby modern science requires software to process data and information. [21] [22] [23] Open research computation also addresses the problem of reproducibility of scientific results.

Types

The term "open science" does not have any one fixed definition or operationalization. On the one hand, it has been referred to as a "puzzling phenomenon". [24] On the other hand, the term has been used to encapsulate a series of principles that aim to foster scientific growth and its complementary access to the public. Two influential sociologists, Benedikt Fecher and Sascha Friesike, have created multiple "schools of thought" that describe the different interpretations of the term. [25]

According to Fecher and Friesike, ‘Open Science’ is an umbrella term for various assumptions about the development and dissemination of knowledge. To show the term's multitudinous perceptions, they differentiate between five Open Science schools of thought:

Infrastructure School

The infrastructure school is founded on the assumption that "efficient" research depends on the availability of tools and applications. Therefore, the "goal" of the school is to promote the creation of openly available platforms, tools, and services for scientists. Hence, the infrastructure school is concerned with the technical infrastructure that promotes the development of emerging and developing research practices through the use of the internet, including the use of software and applications, in addition to conventional computing networks. In that sense, the infrastructure school regards open science as a technological challenge. The infrastructure school is tied closely to the notion of "cyberscience", which describes the trend of applying information and communication technologies to scientific research, and which has spurred the development of the infrastructure school. Specific elements of this development include increasing collaboration and interaction between scientists, as well as the growth of "open-source science" practices. The sociologists discuss two central trends in the infrastructure school:

1. Distributed computing: This trend encapsulates practices that outsource complex, process-heavy scientific computing to a network of volunteer computers around the world. The example that the sociologists cite in their paper is that of the Open Science Grid, which enables the development of large-scale projects that require high-volume data management and processing through a distributed computer network. Moreover, the grid provides the necessary tools that scientists can use to facilitate this process. [26] (A minimal illustrative sketch of this task-farming pattern appears after the list below.)

2. Social and Collaboration Networks of Scientists: This trend encapsulates the development of software that makes interaction with other researchers and scientific collaboration much easier than traditional, non-digital practices. Specifically, the trend is focused on implementing newer Web 2.0 tools to facilitate research-related activities on the internet. De Roure and colleagues (2008) [27] list four key capabilities which they believe define a Social Virtual Research Environment (SVRE):

  • The SVRE should primarily aid the management and sharing of research objects. The authors define these to be a variety of digital commodities that are used repeatedly by researchers.
  • Second, the SVRE should have inbuilt incentives for researchers to make their research objects available on the online platform.
  • Third, the SVRE should be "open" as well as "extensible", implying that different types of digital artifacts composing the SVRE can be easily integrated.
  • Fourth, the authors propose that the SVRE is more than a simple storage tool for research information. Instead, the researchers propose that the platform should be "actionable". That is, the platform should be built in such a way that research objects can be used in the conduct of research as opposed to simply being stored.
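
As a rough, purely illustrative sketch of the distributed-computing trend described above (it does not use the actual Open Science Grid software; the task and worker count are hypothetical), the following Python snippet farms independent work units out to a pool of local worker processes, standing in for the volunteer machines a real grid would coordinate over the network:

```python
# Minimal sketch of volunteer-style distributed computing (hypothetical task).
# Real grids distribute work units to many remote machines; here a local
# process pool stands in for those workers.
from multiprocessing import Pool

def process_work_unit(unit_id: int) -> tuple[int, float]:
    """Pretend analysis of one chunk of data; returns (unit id, partial result)."""
    result = sum(i * i for i in range(unit_id * 1000, (unit_id + 1) * 1000))
    return unit_id, float(result)

if __name__ == "__main__":
    work_units = range(100)            # the large job, split into small chunks
    with Pool(processes=4) as pool:    # four "volunteer" workers
        results = pool.map(process_work_unit, work_units)
    # The coordinating server would aggregate the partial results like this.
    total = sum(value for _, value in results)
    print(f"aggregated result from {len(results)} work units: {total}")
```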

Measurement school

The measurement school, in the view of the authors, deals with developing alternative methods to determine scientific impact. This school acknowledges that measurements of scientific impact are crucial to a researcher's reputation, funding opportunities, and career development. Hence, the authors argue that any discourse about Open Science pivots on developing a robust measure of scientific impact in the digital age. The authors then discuss other research indicating support for the measurement school. The three key currents of previous literature discussed by the authors are:

  • The peer-review process is described as time-consuming.
  • The impact of an article, tied to the names of its authors, is related more to the circulation of the journal than to the overall quality of the article itself.
  • New publishing formats that are closely aligned with the philosophy of Open Science are rarely found in the format of a journal that allows for the assignment of the impact factor.

Hence, this school argues that there are faster impact-measurement technologies that can account for a range of publication types as well as social media coverage of a scientific contribution to arrive at a complete evaluation of how impactful that contribution was. The gist of the argument for this school is that hidden uses like reading, bookmarking, sharing, discussing and rating are traceable activities, and these traces can and should be used to develop a newer measure of scientific impact. The umbrella term for this new type of impact measurement is altmetrics, coined in a 2011 article by Priem et al. [28] Notably, the authors discuss evidence that altmetrics differ from traditional webometrics, which are slow and unstructured. Altmetrics are proposed to rely upon a greater set of measures that account for tweets, blogs, discussions, and bookmarks. The authors claim that the existing literature has often proposed that altmetrics should also encapsulate the scientific process, and measure the process of research and collaboration to create an overall metric. However, the authors are explicit in their assessment that few papers offer methodological details as to how to accomplish this. The authors use this and the general dearth of evidence to conclude that research in the area of altmetrics is still in its infancy.
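
As a purely hypothetical illustration of how such traceable events might be folded into a single altmetric-style score (the event types and weights below are assumptions, not a standard proposed by Priem et al.), consider:

```python
# Hypothetical altmetric-style score: a weighted sum of traceable online events
# around one publication. The weights are illustrative assumptions, not a standard.
ASSUMED_WEIGHTS = {
    "tweets": 0.5,
    "blog_mentions": 3.0,
    "bookmarks": 1.0,
    "peer_discussions": 2.0,
}

def altmetric_score(events: dict[str, int]) -> float:
    """Combine counts of online traces into one illustrative score."""
    return sum(ASSUMED_WEIGHTS.get(kind, 0.0) * count
               for kind, count in events.items())

# Example: an article tweeted 40 times, covered by 2 blogs, bookmarked 15 times.
print(altmetric_score({"tweets": 40, "blog_mentions": 2, "bookmarks": 15}))  # 41.0
```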

Public School

According to the authors, the central concern of the school is to make science accessible to a wider audience. The inherent assumption of this school, as described by the authors, is that newer communication technologies such as Web 2.0 allow scientists to open up the research process and also allow scientists to better prepare their "products of research" for interested non-experts. Hence, the school is characterized by two broad streams: one argues for opening the research process to the masses, whereas the other argues for increased public access to the products of research.

  • Accessibility to the Research Process: Communication technology allows not only for the constant documentation of research but also promotes the inclusion of many different external individuals in the process itself. The authors cite citizen science – the participation of non-scientists and amateurs in research. The authors discuss instances in which gaming tools allow scientists to harness the brain power of a volunteer workforce to run through several permutations of plausible protein-folding structures. This allows scientists to eliminate many more plausible protein structures while also "enriching" citizens' understanding of science. The authors also discuss a common criticism of this approach: the amateur nature of the participants threatens to undermine the scientific rigor of experimentation.
  • Comprehensibility of the Research Result: This stream of research concerns itself with making research understandable for a wider audience. The authors describe a host of writers who promote the use of specific tools for scientific communication, such as microblogging services, to direct users to relevant literature. The authors claim that this school proposes that it is the obligation of every researcher to make their research accessible to the public. The authors then discuss whether there is an emerging market for brokers and mediators of knowledge that is otherwise too complicated for the public to grasp.

Democratic school

The democratic school concerns itself with the concept of access to knowledge. As opposed to focusing on the accessibility of research and its understandability, advocates of this school focus on public access to the products of research. The central concern of the school is with the legal and other obstacles that hinder public access to research publications and scientific data. Proponents assert that any research product should be freely available, and that everyone has the same, equal right of access to knowledge, especially in the instances of state-funded experiments and data. Two central currents characterize this school: Open Access and Open Data.

  • Open Data: Opposition to the notion that publishing journals should claim copyright over experimental data, which prevents the re-use of data and therefore lowers the overall efficiency of science in general. The claim is that journals have no use for the experimental data and that allowing other researchers to use this data will be fruitful. Only a quarter of researchers agree to share their data with other researchers because of the effort required for compliance.
  • Open Access to Research Publication: According to this school, there is a gap between the creation and sharing of knowledge. Proponents argue that even though scientific knowledge doubles every 5 years, access to this knowledge remains limited. These proponents consider access to knowledge as a necessity for human development, especially in the economic sense.

Pragmatic School

The pragmatic school considers Open Science as the possibility to make knowledge creation and dissemination more efficient by increasing collaboration throughout the research process. Proponents argue that science could be optimized by modularizing the process and opening up the scientific value chain. 'Open' in this sense follows very much the concept of open innovation. [29] Open innovation, for instance, transfers the outside-in (including external knowledge in the production process) and inside-out (spillovers from the formerly closed production process) principles to science. [30] Web 2.0 is considered a set of helpful tools that can foster collaboration (sometimes also referred to as Science 2.0). Further, citizen science is seen as a form of collaboration that includes knowledge and information from non-scientists. Fecher and Friesike describe data sharing as an example of the pragmatic school, as it enables researchers to use other researchers' data to pursue new research questions or to conduct data-driven replications.

History

The widespread adoption of the institution of the scientific journal marks the beginning of the modern concept of open science. Before this time, societies pressured scientists into secretive behaviors.

Before journals

Before the advent of scientific journals, scientists had little to gain and much to lose by publicizing scientific discoveries. [31] Many scientists, including Galileo, Kepler, Isaac Newton, Christiaan Huygens, and Robert Hooke, made claims to their discoveries by describing them in papers coded in anagrams or cyphers and then distributing the coded text. [31] Their intent was to develop their discovery into something from which they could profit, and then reveal it to prove ownership when they were prepared to make a claim on it. [31]

The system of not publicizing discoveries caused problems because discoveries were not shared quickly and because it sometimes was difficult for the discoverer to prove priority. Newton and Gottfried Leibniz both claimed priority in discovering calculus. [31] Newton said that he wrote about calculus in the 1660s and 1670s, but did not publish until 1693. [31] Leibniz published "Nova Methodus pro Maximis et Minimis", a treatise on calculus, in 1684. Debates over priority are inherent in systems where science is not published openly, and this was problematic for scientists who wanted to benefit from priority.[ citation needed ]

These cases are representative of a system of aristocratic patronage in which scientists received funding either to develop immediately useful things or to entertain. [13] In this sense, funding of science gave prestige to the patron in the same way that funding of artists, writers, architects, and philosophers did. [13] Because of this, scientists were under pressure to satisfy the desires of their patrons, and discouraged from being open with research which would bring prestige to persons other than their patrons. [13]

Emergence of academies and journals

Eventually the individual patronage system ceased to provide the scientific output which society began to demand. [13] Single patrons could not sufficiently fund scientists, who had unstable careers and needed consistent funding. [13] The development which changed this was a trend to pool research by multiple scientists into an academy funded by multiple patrons. [13] In 1660 England established the Royal Society and in 1666 the French established the French Academy of Sciences. [13] Between the 1660s and 1793, governments gave official recognition to 70 other scientific organizations modeled after those two academies. [13] [32] In 1665, Henry Oldenburg became the editor of Philosophical Transactions of the Royal Society, the first academic journal devoted to science, and the foundation for the growth of scientific publishing. [33] By 1699 there were 30 scientific journals; by 1790 there were 1052. [34] Since then publishing has expanded at even greater rates. [35]

The first popular-science periodical of its kind was published in 1872, under a suggestive name that still serves as a modern portal for science journalism: Popular Science. The magazine claims to have documented the invention of the telephone, the phonograph, the electric light and the onset of automobile technology. The magazine goes so far as to claim that the "history of Popular Science is a true reflection of humankind's progress over the past 129+ years". [36] Discussions of popular science writing most often center their arguments on some type of "science boom". A recent historiographic account of popular science traces mentions of the term "science boom" to Daniel Greenberg's Science and Government Reports in 1979, which posited that "Scientific magazines are bursting out all over". Similarly, this account discusses the magazine Time and its 1980 cover story on Carl Sagan as propagating the claim that popular science had "turned into enthusiasm". [37] Crucially, this secondary account asks the important question of what was considered popular "science" to begin with. The paper claims that any account of how popular science writing bridged the gap between the informed masses and the expert scientists must first consider who was considered a scientist to begin with.

Collaboration among academies

In modern times many academies have pressured researchers at publicly funded universities and research institutions to engage in a mix of sharing research and making some technological developments proprietary. [15] Some research products have the potential to generate commercial revenue, and in hope of capitalizing on these products, many research institutions withhold information and technology which otherwise would lead to overall scientific advancement if other research institutions had access to these resources. [15] It is difficult to predict the potential payouts of technology or to assess the costs of withholding it, but there is general agreement that the benefit to any single institution of holding technology is not as great as the cost of withholding it from all other research institutions. [15]

Coining of term "Open Science"

Steve Mann claimed to have coined the term "Open Science" in 1998. [38] He also registered the domain names openscience.com and openscience.org in 1998, which he sold to degruyter.com in 2011.[ citation needed ] The term had previously been used in a manner closer to today's 'open science' norms by Daryl E. Chubin in his 1985 essay "Open Science and Closed Science: Tradeoffs in a Democracy". [39] Chubin's essay cited Robert K. Merton's 1942 proposal of what we now refer to as Mertonian Norms for ideal science practices and scientific modes of communication. [40] The term was also used sporadically in the 1970s and 1980s in various scholarly works to refer to different things.

Internet and the free access to scientific documents

The open science movement, as presented in activist and institutional discourses at the beginning of the 21st century, refers to different ways of opening up science, especially in the Internet age. Its first pillar is free access to scientific publications. The Budapest conference organised by the Open Society Foundations in 2001 was decisive in imposing this issue on the political landscape. The resulting declaration calls for the use of digital tools such as open archives and open access journals, free of charge for the reader. [41]

The idea of open access to scientific publications quickly became inseparable from the question of free licenses to guarantee the right to disseminate and possibly modify shared documents, such as the Creative Commons licenses created in 2002. In 2011, a new text from the Budapest Open Access Initiative explicitly referred to the relevance of the CC-BY license to guarantee free dissemination of a scientific document, and not only free access to it. [42]

The promise of openness offered by the Internet was then extended to research data, which underpin scientific studies in different disciplines, as already mentioned in the Berlin Declaration of 2003. In 2007, the Organisation for Economic Co-operation and Development (OECD) published a report on access to publicly funded research data, defining such data as the data that validate research results. [43]

Beyond its democratic virtues, open science also aims to respond to the replication crisis, notably by generalizing the opening of the data and source code used to produce research results and by disseminating methodological articles. [44]

The open science movement inspired several regulatory and legislative measures. In 2007, the University of Liège made the deposit of its researchers' publications in its institutional open repository (Orbi) compulsory. The next year, the NIH Public Access Policy adopted a similar mandate for every paper funded by the National Institutes of Health. In France, the Law for a Digital Republic, enacted in 2016, created the right to deposit the validated manuscript of a scientific article in an open archive after an embargo period following the date of publication in the journal. The law also created the principle of reuse of public data by default. [45]

Politics

In many countries, governments fund some science research. Scientists often publish the results of their research by writing articles and donating them to be published in scholarly journals, which frequently are commercial. Public entities such as universities and libraries subscribe to these journals. Michael Eisen, a founder of the Public Library of Science, has described this system by saying that "taxpayers who already paid for the research would have to pay again to read the results." [46]

In December 2011, some United States legislators introduced a bill called the Research Works Act, which would prohibit federal agencies from issuing grants with any provision requiring that articles reporting on taxpayer-funded research be published for free to the public online. [47] Darrell Issa, a co-sponsor of the bill, explained the bill by saying that "Publicly funded research is and must continue to be absolutely available to the public. We must also protect the value added to publicly funded research by the private sector and ensure that there is still an active commercial and non-profit research community." [48] One response to this bill was protests from various researchers; among them was a boycott of commercial publisher Elsevier called The Cost of Knowledge. [49]

The Dutch Presidency of the Council of the European Union called for action in April 2016 to migrate European Commission-funded research to Open Science. European Commissioner Carlos Moedas introduced the Open Science Cloud at the Open Science Conference in Amsterdam on 4–5 April. [50] During this meeting, the Amsterdam Call for Action on Open Science was also presented, a living document outlining concrete actions for the European Community to move to Open Science. The European Commission continues to be committed to an Open Science policy, including developing a repository for research digital objects, the European Open Science Cloud (EOSC), and metrics for evaluating quality and impact. [51]

In October 2021, the French Ministry of Higher Education, Research and Innovation released an official translation of its second plan for open science spanning the years 2021–2024. [52]

Standard setting instruments

There is currently no global normative framework covering all aspects of Open Science. In November 2019, UNESCO was tasked by its 193 Member States, during their 40th General Conference, with leading a global dialogue on Open Science to identify globally-agreed norms and to create a standard-setting instrument. [53] [54] The multistakeholder, consultative, inclusive and participatory process to define a new global normative instrument on Open Science is expected to take two years and to lead to the adoption of a UNESCO Recommendation on Open Science by Member States in 2021. [55]

Two UN frameworks set out some common global standards for application of Open Science and closely related concepts: the UNESCO Recommendation on Science and Scientific Researchers, [56] approved by the General Conference at its 39th session in 2017, and the UNESCO Strategy on Open Access to scientific information and research, [57] approved by the General Conference at its 36th session in 2011.

Advantages and disadvantages

Arguments in favor of open science generally focus on the value of increased transparency in research, and in the public ownership of science, particularly that which is publicly funded. In January 2014 J. Christopher Bare published a comprehensive "Guide to Open Science". [58] Likewise, in 2017, a group of scholars known for advocating open science published a "manifesto" for open science in the journal Nature. [59]

Advantages

Open access publication of research reports and data allows for rigorous peer-review

An article published by a team of NASA astrobiologists in 2010 in Science reported a bacterium known as GFAJ-1 that could purportedly metabolize arsenic (unlike any previously known species of lifeform). [60] This finding, along with NASA's claim that the paper "will impact the search for evidence of extraterrestrial life", met with criticism within the scientific community. Much of the scientific commentary and critique around this issue took place in public forums, most notably on Twitter, where hundreds of scientists and non-scientists created a hashtag community around the hashtag #arseniclife. [61] University of British Columbia astrobiologist Rosie Redfield, one of the most vocal critics of the NASA team's research, also submitted a draft of a research report of a study that she and colleagues conducted which contradicted the NASA team's findings; the draft report appeared in arXiv, [62] an open-research repository, and Redfield called in her lab's research blog for peer review both of their research and of the NASA team's original paper. [63] Researcher Jeff Rouder defined Open Science as "endeavoring to preserve the rights of others to reach independent conclusions about your data and work". [64]

Publicly funded science will be publicly available

Public funding of research has long been cited as one of the primary reasons for providing Open Access to research articles. [65] [66] Since there is significant value in other parts of the research process, such as code, data, protocols, and research proposals, a similar argument is made that, because these are also publicly funded, they should be publicly available under a Creative Commons licence.

Open science will make science more reproducible and transparent

Increasingly, the reproducibility of science is being questioned, and for many papers and multiple fields of research [67] [68] it has been shown to be lacking. This problem has been described as a "reproducibility crisis". [69] For example, psychologist Stuart Vyse notes that "(r)ecent research aimed at previously published psychology studies has demonstrated – shockingly – that a large number of classic phenomena cannot be reproduced, and the popularity of p-hacking is thought to be one of the culprits." [70] Open Science approaches are proposed as one way to help increase the reproducibility of work [71] as well as to help mitigate against manipulation of data.

Open science has more impact

There are several components to impact in research, many of which are hotly debated. [72] However, under traditional scientific metrics, parts of Open science such as Open Access and Open Data have been shown to outperform their traditional closed counterparts. [73] [74] [75]

Open Science can provide learning opportunities

Because open science needs to acknowledge and accommodate the heterogeneity of science, it provides opportunities for different communities to learn from one another. [76] For example, preregistration, common in the quantitative sciences, can benefit qualitative researchers by reducing researcher degrees of freedom, [77] whereas positionality statements, which have been used in qualitative research to contextualize the researcher and the research environment, could be adopted to help combat the reproducibility crisis in quantitative research. [78] In addition, journals should be open to publishing these practices, using a guide to ease journal editors into open science. [79]

Open science will help answer uniquely complex questions

Recent arguments in favor of Open Science have maintained that Open Science is a necessary tool to begin answering immensely complex questions, such as the neural basis of consciousness, [80] or pandemics such as the COVID-19 pandemic. [81] The typical argument holds that these types of investigations are too complex to be carried out by any one individual and must therefore rely on a network of open scientists to be accomplished. By default, the nature of these investigations also makes this "open science" "big science". [82] It is thought that open science could support innovation and societal benefits, supporting and reinforcing research activities by enabling digital resources that could, for example, use or provide structured open data. [6]

Disadvantages

The open sharing of research data is not widely practiced.

Arguments against open science tend to focus on the advantages of data ownership and concerns about the misuse of data [83] [84] (but see [5] ).

Potential misuse

In 2011, Dutch researchers announced their intention to publish a research paper in the journal Science describing the creation of a strain of H5N1 influenza which can be easily passed between ferrets, the mammals which most closely mimic the human response to the flu. [85] The announcement triggered a controversy in both political [86] and scientific [87] circles about the ethical implications of publishing scientific data which could be used to create biological weapons. These events are examples of how science data could potentially be misused. [88] It has been argued that constraining the dissemination of dual-use knowledge can in certain cases be justified [89] because, for example, "scientists have a responsibility for potentially harmful consequences of their research; the public need not always know of all scientific discoveries [or all its details]; uncertainty about the risks of harm may warrant precaution; and expected benefits do not always outweigh potential harm". [90]

Scientists have collaboratively agreed to limit their own fields of inquiry on occasions such as the Asilomar conference on recombinant DNA in 1975, [91] :111 and a proposed 2015 worldwide moratorium on a human-genome-editing technique. [92] Differential technological development aims to decrease risks by influencing the sequence in which technologies are developed. Relying only on the established form of legislation and incentives to ensure the right outcomes may not be adequate as these may often be too slow. [93]

The public may misunderstand science data

In 2009, NASA launched the Kepler spacecraft and promised to release the collected data in June 2010. Later, the agency decided to postpone release so that its own scientists could look at the data first. Their rationale was that non-scientists might unintentionally misinterpret the data, and NASA scientists thought it would be preferable for them to be familiar with the data in advance so that they could report on it with an appropriate level of accuracy. [94]

Low-quality science

Post-publication peer review, a staple of open science, has been criticized as promoting the production of lower-quality papers that are extremely voluminous. [95] Specifically, critics assert that because quality is not guaranteed by preprint servers, the veracity of papers will be difficult for individual readers to assess. This will lead to rippling effects of false science, akin to the recent epidemic of false news, propagated with ease on social media websites. [96] Commonly cited solutions to this problem include the adoption of a new format in which everything is allowed to be published but a subsequent filter-curator model is imposed to ensure that some basic quality standards are met by all publications. [97]

Entrapment by platform capitalism

For Philip Mirowski, open science runs the risk of continuing a trend of commodification of science [98] that ultimately serves the interests of capital in the guise of platform capitalism. [99]

WEIRD-focus

Open Science is primarily driven by Western, Educated, Industrialized, Rich and Democratic (WEIRD) [100] societies, which makes it challenging for people from the Global South to implement or follow these changes. [101] As a result, it perpetuates inequalities found across cultures. However, journal editors have taken note of guidelines for change (e.g. [102] ) in order to make sure Open Science is more inclusive, with a focus on multi-site studies and the value of diversity within the Open Science discussion.

Actions and initiatives

Open-science projects

Different projects conduct, advocate, develop tools for, or fund open science.

The Allen Institute for Brain Science [103] conducts numerous open science projects, while the Center for Open Science has projects to conduct, advocate, and create tools for open science. Other workgroups have been created in different fields, such as the Decision Analysis in R for Technologies in Health (DARTH) workgroup, [104] a multi-institutional, multi-university collaborative effort by researchers who share the goal of developing transparent and open-source solutions to decision analysis in health.

Organizations have extremely diverse sizes and structures. The Open Knowledge Foundation (OKF) is a global organization sharing large data catalogs, running face-to-face conferences, and supporting open source software projects. In contrast, Blue Obelisk is an informal group of chemists and associated cheminformatics projects. The tableau of organizations is dynamic, with some organizations becoming defunct, e.g., Science Commons, and new organizations trying to grow, e.g., the Self-Journal of Science. [105] Common organizing forces include the knowledge domain, type of service provided, and even geography, e.g., OCSDNet's [106] concentration on the developing world.

The Allen Brain Atlas maps gene expression in human and mouse brains; the Encyclopedia of Life documents all the terrestrial species; the Galaxy Zoo classifies galaxies; the International HapMap Project maps the haplotypes of the human genome; the Monarch Initiative makes available integrated public model organism and clinical data; and the Sloan Digital Sky Survey regularizes and publishes data sets from many sources. All these projects accrete information provided by many different researchers with different standards of curation and contribution.

Mathematician Timothy Gowers launched the open science journal Discrete Analysis in 2016 to demonstrate that a high-quality mathematics journal could be produced outside the traditional academic publishing industry. [107] The launch followed a boycott of scientific journals that he initiated. [108] The journal is published by a nonprofit organization owned and run by a team of scholars.

Other projects are organized around completion of projects that require extensive collaboration. For example, OpenWorm seeks to make a cellular level simulation of a roundworm, a multidisciplinary project. The Polymath Project seeks to solve difficult mathematical problems by enabling faster communications within the discipline of mathematics. The Collaborative Replications and Education project recruits undergraduate students as citizen scientists by offering funding. Each project defines its needs for contributors and collaboration.

Another practical example of an open science project was the first "open" doctoral thesis, started in 2012. It was made publicly available as a self-experiment right from the start to examine whether this kind of dissemination is even possible during the productive stage of scientific studies. [109] [110] The goal of the dissertation project was to publish everything related to the doctoral study and research process as soon as possible, as comprehensively as possible and under an open license, available online at all times for everyone. [111] At the end of 2017, the experiment was successfully completed and published in early 2018 as an open access book. [112]

An example promoting accessibility of open-source code for research papers is CatalyzeX, [113] which finds and links both official implementations by authors and source code independently replicated by other researchers. These code implementations are also surfaced on the preprint server arXiv [114] and open peer-review platform OpenReview. [115] [116]

The ideas of open science have also been applied to recruitment with jobRxiv, a free and international job board that aims to mitigate imbalances in what different labs can afford to spend on hiring. [117] [118] [ non-primary source needed ]

Advocacy

Numerous documents, organizations, and social movements advocate wider adoption of open science. Statements of principles include the Budapest Open Access Initiative from a December 2001 conference [119] and the Panton Principles. New statements are constantly developed, such as the Amsterdam Call for Action on Open Science to be presented to the Dutch Presidency of the Council of the European Union in late May 2016. These statements often try to regularize licenses and disclosure for data and scientific literature.

Other advocates concentrate on educating scientists about appropriate open science software tools. Education is available as training seminars, e.g., the Software Carpentry project; as domain specific training materials, e.g., the Data Carpentry project; and as materials for teaching graduate classes, e.g., the Open Science Training Initiative. Many organizations also provide education in the general principles of open science.

Within scholarly societies there are also sections and interest groups that promote open science practices. The Ecological Society of America has an Open Science Section. Similarly, the Society for American Archaeology has an Open Science Interest Group. [23]

Journal support

Many individual journals are experimenting with the open access model: the Public Library of Science, or PLOS, is creating a library of open access journals and scientific literature. Other publishing experiments include delayed and hybrid models. There are experiments in different fields.

Journal support for open science does not conflict with preprint servers: figshare archives and shares images, readings, and other data; and Open Science Framework preprints, arXiv, and HAL Archives Ouvertes provide electronic preprints across many fields.

Software

A variety of computer resources support open science. These include software like the Open Science Framework from the Center for Open Science to manage project information, data archiving and team coordination; distributed computing services like Ibercivis to use unused CPU time for computationally intensive tasks; and services like Experiment.com to provide crowdsourced funding for research projects.

Blockchain platforms for open science have been proposed. The first such platform is the Open Science Organization, which aims to solve urgent problems with fragmentation of the scientific ecosystem and difficulties of producing validated, quality science. The initiatives of the Open Science Organization include the Interplanetary Idea System (IPIS), the Researcher Index (RR-index), the Unique Researcher Identity (URI), and the Research Network. The Interplanetary Idea System is a blockchain-based system that tracks the evolution of scientific ideas over time. It serves to quantify ideas based on uniqueness and importance, thus allowing the scientific community to identify pain points with current scientific topics and preventing unnecessary re-invention of previously conducted science. The Researcher Index aims to establish a data-driven statistical metric for quantifying researcher impact. The Unique Researcher Identity is a blockchain-based solution for creating a single unifying identity for each researcher, which is connected to the researcher's profile, research activities, and publications. The Research Network is a social networking platform for researchers. A scientific paper from November 2019 examined the suitability of blockchain technology to support open science. [122]
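
The sources cited here do not describe these systems' internals. As a generic, hedged sketch of the basic mechanism that blockchain-style provenance tracking relies on (not the actual IPIS implementation; the record fields are assumptions), each new record can be linked to the hash of the previous one, so that any later alteration of an earlier entry is detectable:

```python
# Generic sketch of a hash-linked chain of records, the basic mechanism behind
# blockchain-style provenance tracking. Not the actual IPIS implementation.
import hashlib
import json
import time

def make_record(idea_text: str, prev_hash: str) -> dict:
    """Create a record that commits to its content and to the previous record's hash."""
    record = {
        "idea": idea_text,
        "timestamp": time.time(),
        "prev_hash": prev_hash,
    }
    payload = json.dumps(record, sort_keys=True).encode("utf-8")
    record["hash"] = hashlib.sha256(payload).hexdigest()
    return record

def verify_chain(chain: list[dict]) -> bool:
    """Recompute each hash and check the links; any tampering breaks verification."""
    for i, record in enumerate(chain):
        body = {k: v for k, v in record.items() if k != "hash"}
        payload = json.dumps(body, sort_keys=True).encode("utf-8")
        if hashlib.sha256(payload).hexdigest() != record["hash"]:
            return False
        if i > 0 and record["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = [make_record("initial hypothesis", prev_hash="0" * 64)]
chain.append(make_record("refined hypothesis after new data", chain[-1]["hash"]))
print(verify_chain(chain))  # True; editing any earlier record would make this False
```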

Preprint servers

Preprint servers come in many varieties, but their standard traits are stable: they seek to create a quick, free mode of communicating scientific knowledge to the public. Preprint servers act as a venue to quickly disseminate research and vary in their policies concerning when articles may be submitted relative to journal acceptance. [123] [124] Also typical of preprint servers is their lack of a peer-review process – typically, preprint servers have some type of quality check in place to ensure a minimum standard of publication, but this mechanism is not the same as a peer-review mechanism. Some preprint servers have explicitly partnered with the broader open science movement. [125] Preprint servers can offer services similar to those of journals, [126] and Google Scholar indexes many preprint servers and collects information about citations to preprints. [127]

The case for preprint servers is often made based on the slow pace of conventional publication formats. [128] The motivation to start SocArXiv, an open-access preprint server for social science research, was the claim that valuable research published in traditional venues often takes several months to years to appear, which slows down the process of science significantly. Another argument made in favor of preprint servers like SocArXiv is the quality and quickness of feedback offered to scientists on their pre-published work. [129] The founders of SocArXiv claim that their platform allows researchers to gain easy feedback from their colleagues on the platform, thereby allowing scientists to develop their work into the highest possible quality before formal publication and circulation. The founders of SocArXiv further claim that their platform affords authors the greatest level of flexibility in updating and editing their work to ensure that the latest version is available for rapid dissemination. The founders claim that this is not traditionally the case with formal journals, which impose formal procedures for making updates to published articles[ citation needed ]. Perhaps the strongest advantage of some preprint servers is their seamless compatibility with open science software such as the Open Science Framework. The founders of SocArXiv claim that their preprint server connects all aspects of the research life cycle in OSF with the article being published on the preprint server. According to the founders, this allows for greater transparency and minimal work on the authors' part. [125]

One criticism of preprint servers is their potential to foster a culture of plagiarism. For example, the popular physics preprint server arXiv had to withdraw 22 papers when it came to light that they were plagiarized. In June 2002, a high-energy physicist in Japan, Watanabe, was contacted by a man called Ramy Naboulsi, a non-institutionally affiliated mathematical physicist. Naboulsi asked Watanabe to upload his papers to arXiv on his behalf, which he could not do himself because he lacked an institutional affiliation. The papers were later found to have been copied from the proceedings of a physics conference. [130] Preprint servers are increasingly developing measures to circumvent this plagiarism problem. In developing nations like India and China, explicit measures are being taken to combat it. [131] These measures usually involve creating some type of central repository for all available preprints, allowing the use of traditional plagiarism-detecting algorithms to detect the fraud[ citation needed ]. Nonetheless, this is a pressing issue in the discussion of preprint servers, and consequently for open science.
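
As a minimal, hedged illustration of the kind of traditional plagiarism-detecting algorithm such a central preprint repository might run (the shingle size and threshold below are assumptions, not a description of any particular server's pipeline), one common building block compares overlapping word n-grams between a new submission and previously deposited texts:

```python
# Minimal sketch of shingle-based text similarity, one traditional building block
# of plagiarism detection. Shingle size and threshold are illustrative assumptions.
def shingles(text: str, n: int = 5) -> set[tuple[str, ...]]:
    """Return the set of overlapping word n-grams in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard_similarity(a: str, b: str, n: int = 5) -> float:
    """Jaccard overlap between the two texts' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

def flag_if_similar(submission: str, corpus: list[str], threshold: float = 0.3) -> list[int]:
    """Compare a new preprint against a repository and return indices of close matches."""
    return [i for i, existing in enumerate(corpus)
            if jaccard_similarity(submission, existing) >= threshold]
```

A real pipeline would add indexing, text normalization, and manual review before any paper is publicly flagged; this sketch only shows the core comparison step.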

Related Research Articles

<span class="mw-page-title-main">Scientific journal</span> Periodical journal publishing scientific research

In academic publishing, a scientific journal is a periodical publication designed to further the progress of science by disseminating new research findings to the scientific community. These journals serve as a platform for researchers, scholars, and scientists to share their latest discoveries, insights, and methodologies across a multitude of scientific disciplines. Unlike professional or trade magazines, scientific journals are characterized by their rigorous peer review process, which aims to ensure the validity, reliability, and quality of the published content. With origins dating back to the 17th century, the publication of scientific journals has evolved significantly, playing a pivotal role in the advancement of scientific knowledge, fostering academic discourse, and facilitating collaboration within the scientific community.

arXiv

arXiv is an open-access repository of electronic preprints and postprints approved for posting after moderation, but not peer review. It consists of scientific papers in the fields of mathematics, physics, astronomy, electrical engineering, computer science, quantitative biology, statistics, mathematical finance and economics, which can be accessed online. In many fields of mathematics and physics, almost all scientific papers are self-archived on the arXiv repository before publication in a peer-reviewed journal. Some publishers also grant permission for authors to archive the peer-reviewed postprint. Begun on August 14, 1991, arXiv.org passed the half-million-article milestone on October 3, 2008, had hit a million by the end of 2014 and two million by the end of 2021. As of November 2024, the submission rate is about 24,000 articles per month.

<span class="mw-page-title-main">Preprint</span> Academic paper prior to journal publication

In academic publishing, a preprint is a version of a scholarly or scientific paper that precedes formal peer review and publication in a peer-reviewed scholarly or scientific journal. The preprint may be available, often as a non-typeset version available free, before or after a paper is published in a journal.

<span class="mw-page-title-main">Scientific citation</span>

Scientific citation is providing detailed reference in a scientific publication, typically a paper or book, to previous published communications which have a bearing on the subject of the new publication. The purpose of citations in original work is to allow readers of the paper to refer to cited work to assist them in judging the new work, source background information vital for future development, and acknowledge the contributions of earlier workers. Citations in, say, a review paper bring together many sources, often recent, in one place.

<span class="mw-page-title-main">Open access</span> Research publications distributed freely online

Open access (OA) is a set of principles and a range of practices through which nominally copyrightable publications are delivered to readers free of access charges or other barriers. With open access strictly defined, or libre open access, barriers to copying or reuse are also reduced or removed by applying an open license for copyright, which regulates post-publication uses of the work.

<span class="mw-page-title-main">Bibliometrics</span> Statistical analysis of written publications

Bibliometrics is the application of statistical methods to the study of bibliographic data, especially in scientific and library and information science contexts, and is closely associated with scientometrics to the point that both fields largely overlap.

Scientometrics is a subfield of informetrics that studies quantitative aspects of scholarly literature. Major research issues include the measurement of the impact of research papers and academic journals, the understanding of scientific citations, and the use of such measurements in policy and management contexts. In practice there is a significant overlap between scientometrics and other scientific fields such as information systems, information science, science of science policy, sociology of science, and metascience. Critics have argued that overreliance on scientometrics has created a system of perverse incentives, producing a publish or perish environment that leads to low-quality research.

PubMed Central (PMC) is a free digital repository that archives open access full-text scholarly articles that have been published in biomedical and life sciences journals. As one of the major research databases developed by the National Center for Biotechnology Information (NCBI), PubMed Central is more than a document repository. Submissions to PMC are indexed and formatted for enhanced metadata, medical ontology, and unique identifiers which enrich the XML structured data for each article. Content within PMC can be linked to other NCBI databases and accessed via Entrez search and retrieval systems, further enhancing the public's ability to discover, read and build upon its biomedical knowledge.

<span class="mw-page-title-main">Self-archiving</span> Authorial deposit of documents to provide open access

Self-archiving is the act of depositing a free copy of an electronic document online in order to provide open access to it. The term usually refers to the self-archiving of peer-reviewed research journal and conference articles, as well as theses and book chapters, deposited in the author's own institutional repository or open archive for the purpose of maximizing its accessibility, usage and citation impact. The term green open access has become common in recent years, distinguishing this approach from gold open access, where the journal itself makes the articles publicly available without charge to the reader.

Nature Precedings was an open access electronic preprint repository of scholarly work in the fields of biomedical sciences, chemistry, and earth sciences. It ceased accepting new submissions as of April 3, 2012.

Open peer review is the various possible modifications of the traditional scholarly peer review process. The three most common modifications to which the term is applied are:

  1. Open identities: Authors and reviewers are aware of each other's identity.
  2. Open reports: Review reports are published alongside the relevant article.
  3. Open participation: The wider community are able to contribute to the review process.
<span class="mw-page-title-main">Altmetrics</span> Alternative metrics for analyzing scholarship

In scholarly and scientific publishing, altmetrics are non-traditional bibliometrics proposed as an alternative or complement to more traditional citation impact metrics, such as impact factor and h-index. The term altmetrics was proposed in 2010, as a generalization of article level metrics, and has its roots in the #altmetrics hashtag. Although altmetrics are often thought of as metrics about articles, they can be applied to people, journals, books, data sets, presentations, videos, source code repositories, web pages, etc.

<span class="mw-page-title-main">Center for Open Science</span> American nonprofit organization

The Center for Open Science is a non-profit technology organization based in Charlottesville, Virginia with a mission to "increase the openness, integrity, and reproducibility of scientific research." Brian Nosek and Jeffrey Spies founded the organization in January 2013, funded mainly by the Laura and John Arnold Foundation and others.

bioRxiv

bioRxiv is an open access preprint repository for the biological sciences co-founded by John Inglis and Richard Sever in November 2013. It is hosted by the Cold Spring Harbor Laboratory (CSHL).

Metascience is the use of scientific methodology to study science itself. Metascience seeks to increase the quality of scientific research while reducing inefficiency. It is also known as "research on research" and "the science of science", as it uses research methods to study how research is done and find where improvements can be made. Metascience concerns itself with all fields of research and has been described as "a bird's eye view of science". In the words of John Ioannidis, "Science is the best thing that has happened to human beings ... but we can do it better."

<span class="mw-page-title-main">Open access in India</span> Overview of the culture and regulation of open access in India

In India, the Open Access movement started in May 2004, when two workshops were organized by the M S Swaminathan Research Foundation, Chennai. In 2006, the National Knowledge Commission in its recommendations proposed that "access to knowledge is the most fundamental way of increasing the opportunities and reach of individuals and groups". In 2011, the Council of Scientific & Industrial Research (CSIR) began requiring that its grantees provide open access to funded research, and the Open Access India forum formulated a draft policy on Open Access for India. The Shodhganga, a digital repository for theses, was also established in 2011 with the aim of promoting and preserving academic research. The University Grants Commission (UGC) made it mandatory for scholars to deposit their theses in Shodhganga, as per the Minimum Standards and Procedure for Award of M. Phil./Ph.D. Degrees Regulations, 2016. Currently, the Directory of Open Access Journals lists 326 open access journals published in India, of which 233 have no fees.

<span class="mw-page-title-main">History of open access</span>

The idea and practice of providing free online access to journal articles began at least a decade before the term "open access" was formally coined. Computer scientists had been self-archiving in anonymous ftp archives since the 1970s and physicists had been self-archiving in arXiv since the 1990s. The Subversive Proposal to generalize the practice was posted in 1994.

<span class="mw-page-title-main">EarthArXiv</span>

EarthArXiv is both a preprint server and a volunteer community devoted to open scholarly communication. As a preprint server, EarthArXiv publishes articles from all subdomains of Earth Science and related domains of planetary science. These publications are versions of scholarly papers that precede publication in peer-reviewed scientific journals. EarthArXiv is not itself a journal and does not evaluate the scientific quality of a paper. Instead, EarthArXiv serves as a platform for free hosting and rapid dissemination of scientific results. The EarthArXiv platform assigns each submission a Digital Object Identifier (DOI), therefore assigning provenance and making it citable in other scholarly works. EarthArXiv's mission is to promote open access, share open access and preprint resources, and participate in shared governance of the preprint server and its policies. EarthArXiv was launched on October 23, 2017.

The open science movement has expanded the use of scientific output beyond specialized academic circles.

References

  1. UNESCO (2022). Understanding open science — Factsheet — SC-PBS-STIP/2022/OST/1. Paris, France: UNESCO. doi:10.54677/UTCD9302. Retrieved 8 July 2024. 6 pages. Available in other languages.
  2. Woelfle, M.; Olliaro, P.; Todd, M. H. (2011). "Open science is a research accelerator". Nature Chemistry. 3 (10): 745–748. Bibcode:2011NatCh...3..745W. doi: 10.1038/nchem.1149 . PMID   21941234.
  3. Parsons, Sam; Azevedo, Flávio; Elsherif, Mahmoud M.; Guay, Samuel; Shahim, Owen N.; Govaart, Gisela H.; Norris, Emma; O’Mahony, Aoife; Parker, Adam J.; Todorovic, Ana; Pennington, Charlotte R. (March 2022). "A community-sourced glossary of open scholarship terms". Nature Human Behaviour. 6 (3): 312–318. doi:10.1038/s41562-021-01269-4. hdl: 2292/62865 . ISSN   2397-3374. PMID   35190714. S2CID   247025114.
  4. Vicente-Saez, Ruben; Martinez-Fuentes, Clara (2018). "Open Science now: A systematic literature review for an integrated definition". Journal of Business Research. 88: 428–436. doi:10.1016/j.jbusres.2017.12.043. S2CID   158229869.
  5. Gomes, Dylan G. E.; Pottier, Patrice; Crystal-Ornelas, Robert; Hudgins, Emma J.; Foroughirad, Vivienne; Sánchez-Reyes, Luna L.; Turba, Rachel; Martinez, Paula Andrea; Moreau, David; Bertram, Michael G.; Smout, Cooper A.; Gaynor, Kaitlyn M. (30 November 2022). "Why don't we share data and code? Perceived barriers and benefits to public archiving practices". Proceedings of the Royal Society B: Biological Sciences. 289 (1987): 1–11. doi:10.1098/rspb.2022.1113. PMC 9682438. PMID 36416041. S2CID 253761876.
  6. Hou, Jianhua; Wang, Yuanyuan; Zhang, Yang; Wang, Dongyi (1 February 2022). "How do scholars and non-scholars participate in dataset dissemination on Twitter". Journal of Informetrics. 16 (1): 101223. doi:10.1016/j.joi.2021.101223. ISSN 1751-1577. S2CID 245114882. many believe that broader dissemination and public engagement in science are vital elements of open science [...] In the context of open access and open science, it is envisaged that digital resources would be extensively used to support and reinforce research activities (Araujo, 2020). Remarkably, open data are considered as the basis of innovation (Duus & Cooray, 2016). The propagation of publicly available datasets can offer an opportunity for governments, businesses, and entrepreneurs to obtain economic, social, and scientific benefits (Sadiq & Indulska, 2017; Tennant et al., 2016). To ensure the authenticity and repeatability of science, several fund projects and journals require
  7. FOSTER Consortium (26 November 2018). "What is Open Science?". Zenodo. doi:10.5281/zenodo.2629946 . Retrieved 13 August 2020.
  8. Tennant, J; Beamer, J E; Bosman, J; Brembs, B; Chung, N C; Clement, G; Crick, T; Dugan, J; Dunning, A; Eccles, D; Enkhbayar, A; Graziotin, D; Harding, R; Havemann, J; Katz, D; Khanal, K; Norgaard Kjaer, J; Koder, T; Macklin, P; Madan, C; Masuzzo, P; Matthias, L; Mayer, K; Nichols, D; Papadopoulou, E; Pasquier, T; Ross-Hellauer, T; Schulte-Mecklenbeck, M; Sholler, D; Steiner, T; Szczesny, P; Turner, A. "Foundations for Open Scholarship Strategy Development". MetaArXiv. doi:10.31222/osf.io/b4v8p. S2CID   159417649 . Retrieved 13 August 2020.
  9. Tennant, Jon; Argawal, Ritwik; Baždarić, Ksenija; Brassard, David; Crick, Tom; Dunleavy, Daniel; Evans, Thomas; Garnder, Nicholas; Gonzalez-Marquez, Monica; Graziotin, Daniel; Greshake Tzovaras, Bastian; Gunnarsson, Daniel; Havemann, Johanna; Hosseini, Mohammad; Katz, Daniel; Madan, Christopher; Manghi, Paolo; Marocchino, Alberto; Masuzzo, Paolo; Murray-Rust, Peter; Narayanaswamy, Sanjay; Nilsonne, Gustav; Pacheco-Mendoza, Josmel; Penders, Bart; Pourret, Olivier; Rera, Michael; Samuel, John; Steiner, Tobias; Stojanovski, Jadranka; Uribe-Tirado, Alejandro; Vos, Rutger; Worthington, Simon; Yarkoni, Tal (4 March 2020). "A tale of two 'opens': intersections between Free and Open Source Software and Open Scholarship". SocArXiv. OSF. doi:10.31235/osf.io/2kxq8. S2CID   215878907 . Retrieved 15 February 2021.
  10. Eve, Martin (2014). Open Access and the Humanities Contexts, Controversies and the Future. Cambridge University Press. doi:10.1017/CBO9781316161012. ISBN   978-1316161012.
  11. Knöchelmann, Marcel (19 November 2019). "Open Science in the Humanities, or: Open Humanities?". Publications. 7 (4): 65. doi: 10.3390/publications7040065 .
  12. "Machado, J. "Open data and open science". In Albagli, Maciel & Abdo. "Open Science, Open Questions", 2015".
  13. David, P. A. (2004). "Understanding the emergence of 'open science' institutions: Functionalist economics in historical context". Industrial and Corporate Change. 13 (4): 571–589. doi:10.1093/icc/dth023.
  14. Nielsen 2011, pp. 198–202.
  15. David, Paul A. (March 2004). "Can "Open Science" be Protected from the Evolving Regime of IPR Protections?". Journal of Institutional and Theoretical Economics. 160 (1): 9–34. doi:10.1628/093245604773861069. JSTOR 40752435.
  16. "Open Science | A Guide to Open Access, Publishing Market and Recent Developments".
  17. "Was ist Open Science?" [What is Open Science?] (in German). OpenScience ASAP. Published online 23 June 2014.
  18. Molloy, J. C. (2011). "The Open Knowledge Foundation: Open Data Means Better Science". PLOS Biology. 9 (12): e1001195. doi:10.1371/journal.pbio.1001195. PMC 3232214. PMID 22162946.
  19. Bosman, Jeroen (2 March 2017). "Defining Open Science Definitions". I&M / I&O 2.0. Retrieved 27 March 2017.
  20. Nancy Pontika; Petr Knoth; Matteo Cancellieri; Samuel Pearce (2015). Fostering Open Science to Research using a Taxonomy and an eLearning Portal. i-KNOW '15: 15th International Conference on Knowledge Technologies and Data-Driven Business, Graz Austria, 21–22 October 2015. Association for Computing Machinery. pp. 1–8. doi:10.1145/2809563.2809571. ISBN   978-1-4503-3721-2.
  21. Glyn Moody (26 October 2011). "Open Source, Open Science, Open Source Science" . Retrieved 3 January 2012.
  22. Rocchini, D.; Neteler, M. (2012). "Let the four freedoms paradigm apply to ecology". Trends in Ecology & Evolution. 27 (6): 310–311. Bibcode:2012TEcoE..27..310R. CiteSeerX   10.1.1.296.8255 . doi:10.1016/j.tree.2012.03.009. PMID   22521137.
  23. Marwick, Ben; d’Alpoim Guedes, Jade; Barton, Michael (2017). "Open science in archaeology" (PDF). SAA Archaeological Record. 17 (4): 8–14.
  24. David, P.A. (2008). "The historical origins of 'Open Science': An essay on patronage, reputation and common agency contracting in the scientific revolution". Capitalism and Society. 3 (2): 5. doi:10.2202/1932-0213.1040. S2CID   41478207. SSRN   2209188.
  25. Fecher, Benedikt; Friesike, Sascha (2014). "Open Science: One Term, Five Schools of Thought". Opening Science. pp. 17–47. doi:10.1007/978-3-319-00026-8_2. ISBN   978-3319000251.
  26. Altunay, M.; et al. (2010). "A science-driven production Cyberinfrastructure—the Open Science grid". Journal of Grid Computing. 9 (2): 201–218. doi:10.1007/s10723-010-9176-6. OSTI   1975710. S2CID   1636510.
  27. Roure, David De; Goble, Carole; Bhagat, Jiten; Cruickshank, Don; Goderis, Antoon; Michaelides, Danius; Newman, David (2008). "My Experiment: Defining the Social Virtual Research Environment" (PDF). 2008 IEEE Fourth International Conference on e Science. pp. 182–189. doi:10.1109/eScience.2008.86. ISBN   978-1424433803. S2CID   11104419.
  28. Priem, J., et al. (2011). Uncovering impacts: CitedIn and total-impact, two new tools for gathering altmetrics (pp. 9–11). In iConference 2012. Available at: https://jasonpriem.org/self-archived/two-altmetrics-tools.pdf
  29. Friesike, S.; et al. (2015). "Opening science: towards an agenda of open science in academia and industry". The Journal of Technology Transfer. 40 (4): 581–601. doi: 10.1007/s10961-014-9375-6 .
  30. Tacke, O., 2010. Open Science 2.0: How Research and Education Can Benefit from Open Innovation and Web 2.0. In T. J. Bastiaens, U. Baumöl, & B. J. Krämer, eds. On Collective Intelligence. Berlin, Heidelberg: Springer Berlin Heidelberg, pp. 37–48.
  31. Nielsen 2011, pp. 172–175.
  32. McClellan III, James E. (1985). Science reorganized : scientific societies in the eighteenth century. New York: Columbia University Press. ISBN   978-0231059961.
  33. Groen 2007, pp. 215–216.
  34. Kronick 1976, p. 78.
  35. Price 1986.
  36. "The History of Popular Science". Popular Science. 18 March 2019.
  37. Lewenstein, Bruce V. "Was there really a popular science "boom"?." Science, Technology, & Human Values 12.2 (1987): 29–41.
  38. Mann, Steve (2016). "Surveillance (Oversight), Sousveillance (Undersight), and Metaveillance (Seeing Sight Itself)". 2016 IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW). pp. 1408–1417. doi:10.1109/CVPRW.2016.177. ISBN   978-1-5090-1437-8.
  39. Chubin, Daryl E. (1 April 1985). "Open Science and Closed Science: Tradeoffs in a Democracy". Science, Technology, & Human Values. 10 (2): 73–80. doi:10.1177/016224398501000211. ISSN   0162-2439. S2CID   145631585.
  40. Merton, Robert K. (1942). "Science and technology in a democratic order". Journal of Legal and Political Sociology. 1: 115–126.
  41. "Read". Budapest Open Access Initiative. Retrieved 7 June 2021.
  42. "Ten years on from the Budapest Open Access Initiative: setting the default to open". Budapest Open Access Initiative. Retrieved 7 June 2021.
  43. "OECD Principles and Guidelines for Access to Research Data from Public Funding" (PDF). OECD Journal on Development. 2007. doi:10.1787/journal_dev-v8-2-en. ISBN   978-9264019652.
  44. European Commission. Directorate General for Research and Innovation. (1 December 2020). Reproducibility of scientific results in the EU : scoping report. Publications Office of the European Union. doi:10.2777/341654. ISBN   978-9276198888 . Retrieved 29 October 2023 via op.europa.eu.
  45. "LOI n° 2016-1321 du 7 octobre 2016 pour une République numérique – Dossiers législatifs – Légifrance". legifrance.gouv.fr. Retrieved 7 June 2021.
  46. Eisen, Michael (10 January 2012). "Research Bought, Then Paid For". The New York Times . New York City. ISSN   0362-4331 . Retrieved 12 February 2012.
  47. Howard, Jennifer (22 January 2012). "Who Gets to See Published Research?". The Chronicle of Higher Education . Retrieved 12 February 2012.
  48. Rosen, Rebecca J. (5 January 2012). "Why Is Open-Internet Champion Darrell Issa Supporting an Attack on Open Science? – Rebecca J. Rosen". The Atlantic . Retrieved 12 February 2012.
  49. Dobbs, David (30 January 2012). "Testify: The Open-Science Movement Catches Fire". Wired. Retrieved 12 February 2012.
  50. Van Calmthout, Martijn (5 April 2016). "EU wil dat onderzoekers gegevens meer gaan delen in eigen datacloud". De Volkskrant. Retrieved 8 April 2016.
  51. "Open Science". European Commission - European Commission. Retrieved 18 April 2021.
  52. Ministry of Higher Education, Research and Innovation (July 2021). Second French Plan for Open Science: Generalising open science in France 2021–2024 (PDF). Paris, France: Ministry of Higher Education, Research and Innovation. Retrieved 12 October 2021. Publication date refers to French language version.
  53. "Press release: UNESCO Takes the Lead in Developing a New Global Standard-setting Instrument on Open Science". UNESCO. 28 November 2019. Retrieved 6 January 2020.
  54. "Press release: Outcomes of the 40th General Conference". UNESCO. 27 November 2019. Retrieved 6 January 2020.
  55. "Resolution 40 C/63 on the desirability of a recommendation on Open Science". UNESCO . Retrieved 6 January 2020.
  56. "UNESCO Recommendation on Science and Scientific Researchers". UNESCO. 21 May 2019. Retrieved 6 January 2020.
  57. "UNESCO Strategy on Open Access to scientific information and research". UNESCO . Retrieved 6 January 2020.
  58. "Guide to Open Science". 9 January 2014.
  59. Munafò, Marcus R.; Nosek, Brian A.; Bishop, Dorothy V. M.; Button, Katherine S.; Chambers, Christopher D.; Sert, Nathalie Percie du; Simonsohn, Uri; Wagenmakers, Eric-Jan; Ware, Jennifer J. (1 January 2017). "A manifesto for reproducible science". Nature Human Behaviour. 1 (1): 0021. doi: 10.1038/s41562-016-0021 . hdl:11245.1/3534b98f-a374-496b-9ad1-e61539477d66. ISSN   2397-3374. PMC   7610724 . PMID   33954258.
  60. Wolfe-Simon, Felisa; Blum, Jodi Switzer; Kulp, Thomas R.; Gordon, Gwyneth W.; Hoeft, Shelley E.; Pett-Ridge, Jennifer; Stolz, John F.; Webb, Samuel M.; et al. (2 December 2010). "A bacterium that can grow by using arsenic instead of phosphorus" (PDF). Science . 332 (6034): 1163–1166. Bibcode:2011Sci...332.1163W. doi:10.1126/science.1197258. PMID   21127214. S2CID   51834091.
  61. Zimmer, Carl (27 May 2011). "The Discovery of Arsenic-Based Twitter". Slate. Retrieved 19 April 2012.
  62. M. L. Reaves; S. Sinha; J. D. Rabinowitz; L. Kruglyak; R. J. Redfield (31 January 2012). "Absence of arsenate in DNA from arsenate-grown GFAJ-1 cells". Science. 337 (6093): 470–473. arXiv: 1201.6643 . Bibcode:2012Sci...337..470R. doi:10.1126/science.1219861. PMC   3845625 . PMID   22773140.
  63. Redfield, Rosie (1 February 2012). "Open peer review of our arseniclife submission please". RRResearch – the Redfield Lab, University of British Columbia. Retrieved 19 April 2012.
  64. Jeff Rouder Twitter, 6 December 2017
  65. "Academic Publishing: Survey of funders supports the benign Open Access outcome priced into shares" (PDF). HSBC. Retrieved 22 October 2015.
  66. Albert, Karen M. (1 July 2006). "Open access: implications for scholarly publishing and medical libraries". Journal of the Medical Library Association. 94 (3): 253–262. ISSN   1536-5050. PMC   1525322 . PMID   16888657.
  67. "Dozens of major cancer studies can't be replicated". Science News. 7 December 2021. Retrieved 19 January 2022.
  68. "Reproducibility Project: Cancer Biology". cos.io. Center for Open Science . Retrieved 19 January 2022.
  69. Couchman, John R. (1 January 2014). "Peer Review and Reproducibility. Crisis or Time for Course Correction?". Journal of Histochemistry and Cytochemistry. 62 (1): 9–10. doi:10.1369/0022155413513462. ISSN   0022-1554. PMC   3873808 . PMID   24217925.
  70. Vyse, Stuart (2017). "P-Hacker Confessions: Daryl Bem and Me". Skeptical Inquirer . 41 (5): 25–27. Archived from the original on 5 August 2018. Retrieved 5 August 2018.
  71. Collaboration, Open Science (1 November 2012). "An Open, Large-Scale, Collaborative Effort to Estimate the Reproducibility of Psychological Science". Perspectives on Psychological Science. 7 (6): 657–660. doi: 10.1177/1745691612462588 . hdl: 10211.3/195814 . ISSN   1745-6916. PMID   26168127.
  72. "Specials : Nature". Nature. Retrieved 22 October 2015.
  73. Vuong, Quan-Hoang (2018). "The (ir)rational consideration of the cost of science in transition economies". Nature Human Behaviour. 2 (1): 5. doi: 10.1038/s41562-017-0281-4 . PMID   30980055. S2CID   46878093.
  74. Piwowar, Heather A.; Day, Roger S.; Fridsma, Douglas B. (2 March 2007). "Sharing Detailed Research Data Is Associated with Increased Citation Rate". PLOS ONE. 2 (3): e308. Bibcode:2007PLoSO...2..308P. doi: 10.1371/journal.pone.0000308 . PMC   1817752 . PMID   17375194.
  75. Swan, Alma. "The Open Access citation advantage: Studies and results to date." (2010).
  76. Steltenpohl, Crystal N.; Lustick, Hilary; Meyer, Melanie S.; Lee, Lindsay Ellis; Stegenga, Sondra M.; Reyes, Laurel Standiford; Renbarger, Rachel L. (17 May 2023). "Rethinking Transparency and Rigor from a Qualitative Open Science Perspective". Journal of Trial & Error. 4. doi:10.36850/mr7.
  77. Jacobs, Alan M. (31 March 2020), "Pre-registration and Results-Free Review in Observational and Qualitative Research", The Production of Knowledge, Cambridge University Press, pp. 221–264, doi:10.1017/9781108762519.009, ISBN   978-1-108-76251-9 , retrieved 16 May 2024
  78. Jamieson, Michelle K.; Govaart, Gisela H.; Pownall, Madeleine (2 February 2023). "Reflexivity in quantitative research: A rationale and beginner's guide". Social and Personality Psychology Compass. 17 (4). doi:10.1111/spc3.12735. ISSN   1751-9004.
  79. Silverstein, Priya; Elman, Colin; Montoya, Amanda; McGillivray, Barbara; Pennington, Charlotte R.; Harrison, Chase H.; Steltenpohl, Crystal N.; Röer, Jan Philipp; Corker, Katherine S.; Charron, Lisa M.; Elsherif, Mahmoud; Malicki, Mario; Hayes-Harb, Rachel; Grinschgl, Sandra; Neal, Tess (16 February 2024). "A guide for social science journal editors on easing into open science". Research Integrity and Peer Review. 9 (1): 2. doi: 10.1186/s41073-023-00141-5 . ISSN   2058-8615. PMC   10870631 . PMID   38360805.
  80. Center for Brains, Minds and Machines (CBMM) (19 August 2016). "Big Science, Team Science & Open Science to Understand Neocortex". Archived from the original on 19 December 2021 via YouTube.
  81. Besançon, Lonni; Peiffer-Smadja, Nathan; Segalas, Corentin; Jiang, Haiting; Masuzzo, Paola; Smout, Cooper; Billy, Eric; Deforet, Maxime; Leyrat, Clémence (2020). "Open Science Saves Lives: Lessons from the COVID-19 Pandemic". BMC Medical Research Methodology. 21 (1): 117. doi: 10.1186/s12874-021-01304-y . PMC   8179078 . PMID   34090351.
  82. "Brain Research through Advancing Innovative Neurotechnologies (BRAIN) – National Institutes of Health (NIH)". braininitiative.nih.gov.
  83. Osborne, Robin (8 July 2013). "Why open access makes no sense". The Guardian. ISSN   0261-3077 . Retrieved 11 January 2017.
  84. Eveleth, Rose. "Free Access to Science Research Doesn't Benefit Everyone". The Atlantic. Retrieved 11 January 2017.
  85. Enserink, Martin (23 November 2011). "Scientists Brace for Media Storm Around Controversial Flu Studies". Archived from the original on 16 April 2013. Retrieved 19 April 2012.
  86. Malakoff, David (4 March 2012). "Senior U.S. Lawmaker Leaps Into H5N1 Flu Controversy". Science Insider – AAAS.ORG. Archived from the original on 1 May 2013. Retrieved 19 April 2012.
  87. Cohen, Jon (25 January 2012). "A Central Researcher in the H5N1 Flu Debate Breaks His Silence". Science Insider – AAAS.ORG. Archived from the original on 11 May 2013. Retrieved 19 April 2012.
  88. Nielsen 2011, p. 200.
  89. Musunuri, Sriharshita; Sandbrink, Jonas B.; Monrad, Joshua Teperowski; Palmer, Megan J.; Koblentz, Gregory D. (October 2021). "Rapid Proliferation of Pandemic Research: Implications for Dual-Use Risks". mBio. 12 (5): e0186421. doi:10.1128/mBio.01864-21. PMC   8524337 . PMID   34663091.
  90. Kuhlau, Frida; Höglund, Anna T; Eriksson, Stefan; Evers, Kathinka (March 2013). "The ethics of disseminating dual-use knowledge". Research Ethics. 9 (1): 6–19. doi:10.1177/1747016113478517. ISSN   1747-0161. S2CID   153462235.
  91. Crotty, Shane (2003). Ahead of the curve : David Baltimore's life in science. Berkeley: University of California Press. ISBN   978-0520239043 . Retrieved 23 May 2015.
  92. Wade, Nicholas (19 March 2015). "Scientists Seek Ban on Method of Editing the Human Genome". The New York Times. Retrieved 25 May 2015.
  93. "Technology is changing faster than regulators can keep up – here's how to close the gap". World Economic Forum. 21 June 2018. Retrieved 27 January 2022.
  94. Nielsen 2011, p. 201.
  95. "Open Science and its Discontents – Ronin Institute". ronininstitute.org. 28 June 2016.
  96. "Fake news". NPR .
  97. "The Winnower – Open Scholarly Publishing". thewinnower.com.
  98. P. Mirowski, Science-Mart, Privatizing American Science. Harvard University Press, 2011.
  99. P. Mirowski, "The future(s) of open science," Soc. Stud. Sci., vol. 48, no. 2, pp. 171–203, April 2018.
  100. Henrich, Joseph; Heine, Steven J.; Norenzayan, Ara (June 2010). "The weirdest people in the world?". Behavioral and Brain Sciences. 33 (2–3): 61–83. doi:10.1017/S0140525X0999152X. ISSN   0140-525X. PMID   20550733. S2CID   263512337.
  101. Jin, Haiyang; Wang, Qing; Yang, Yu-Fang; Zhang, Han; Gao, Mengyu (Miranda); Jin, Shuxian; Chen, Yanxiu (Sharon); Xu, Ting; Zheng, Yuan-Rui; Chen, Ji; Xiao, Qinyu; Yang, Jinbiao; Wang, Xindi; Geng, Haiyang; Ge, Jianqiao (January 2023). "The Chinese Open Science Network (COSN): Building an Open Science Community From Scratch". Advances in Methods and Practices in Psychological Science. 6 (1): 251524592211449. doi: 10.1177/25152459221144986 . ISSN   2515-2459.
  102. Puthillam, Arathy; Montilla Doble, Lysander James; Delos Santos, Junix Jerald I.; Elsherif, Mahmoud Medhat; Steltenpohl, Crystal N.; Moreau, David; Pownall, Madeleine; Silverstein, Priya; Anand-Vembar, Shaakya; Kapoor, Hansika (January 2024). "Guidelines to improve internationalization in the psychological sciences". Social and Personality Psychology Compass. 18 (1). doi: 10.1111/spc3.12847 . ISSN   1751-9004.
  103. Allen, Paul (30 November 2011). "Why We Chose 'Open Science'". The Wall Street Journal . Retrieved 6 January 2012.
  104. "DARTH – Decision Analysis in R for Technologies in Health".
  105. http://www.sjscience.org
  106. "OCSDNET". OCSDNET.
  107. "Discrete Analysis launched". Gowers's Weblog. 1 March 2016. Retrieved 8 December 2019.
  108. "Discrete Analysis". discreteanalysisjournal.com. Retrieved 8 December 2019.
  109. "A discussion about transparency". helmholtz.de. Helmholtz Association. 30 November 2016. Retrieved 20 October 2018.
  110. Heise, Christian; Pearce, Joshua M. (10 May 2020). "From Open Access to Open Science: The Path From Scientific Reality to Open Scientific Communication". SAGE Open. 10 (2). doi: 10.1177/2158244020915900 . ISSN   2158-2440.
  111. "About the first open PhD thesis". offene-doktorarbeit.de. Retrieved 20 October 2018.
  112. Heise, Christian (2018). Von Open Access zu Open Science (in German). Lüneburg, Germany: meson press e.G. doi:10.14619/1303. ISBN   978-3957961303.
  113. "CatalyzeX". catalyzex.com.
  114. @gragtah (22 June 2023). "✅ CatalyzeX 🤝 ArXiv Now you can see open-source code implementations for AI research papers (and more!) via @catalyzeX directly on @arxiv…" (Tweet). Retrieved 9 July 2024 via Twitter.
  115. "OpenReview". openreview.net.
  116. @gragtah (3 May 2023). "🎉 CatalyzeX 🤝 OpenReview now you can see code implementations via @catalyzeX directly on @openreviewnet…" (Tweet). Retrieved 9 July 2024 via Twitter.
  117. "Browse Jobs". jobRxiv. Retrieved 26 June 2020.
  118. @jobRxiv (25 June 2020). "We want to change the way recruitment for #ScienceJobs is done, to open up a way for all labs to find the best cand…" (Tweet). Retrieved 26 June 2020 via Twitter.
  119. Noble, Ivan (14 February 2002). "Boost for research paper access". BBC News . London. Retrieved 12 February 2012.
  120. Wright, David; Williams, Elaine; Bryce, Colin; le May, Andrée; Stein, Ken; Milne, Ruairidh; Walley, Tom (31 July 2018). "A novel approach to sharing all available information from funded health research: the NIHR Journals Library". Health Research Policy and Systems. 16 (1): 70. doi: 10.1186/s12961-018-0339-4 . ISSN   1478-4505. PMC   6069813 . PMID   30064444.
  121. "About". NIHR Journals Library. Retrieved 14 January 2022.
  122. Leible, Stephan; Schlager, Steffen; Schubotz, Moritz; Gipp, Bela (2019). "A Review on Blockchain Technology and Blockchain Projects Fostering Open Science". Frontiers in Blockchain. 2: 1–28. doi: 10.3389/fbloc.2019.00016 .
  123. "Advancing the sharing of research results for the life sciences". biorxiv.org. Retrieved 17 February 2018.
  124. "Copyright Training Resources | MarXiv". marxivinfo.org. Archived from the original on 18 February 2018. Retrieved 17 February 2018.
  125. 1 2 "Announcing the development of SocArXiv, an open social science archive". 9 July 2016.
  126. Tierney, H. L., Hammond, P., Nordlander, P., & Weiss, P. S. (2012). Prior Publication: Extended Abstracts, Proceedings Articles, Preprint Servers, and the Like.
  127. "Accelerating Your Science with arXiv and Google Scholar". An Assembly of Fragments. 2 November 2012. Retrieved 17 February 2018.
  128. Moed, H. F. (2007). "The effect of "open access" on citation impact: An analysis of ArXiv's Condensed matter section". Journal of the American Society for Information Science and Technology. 58 (13): 2047–2054. arXiv: cs/0611060 . Bibcode:2007JASIS..58.2047M. doi:10.1002/asi.20663. S2CID   1060908.
  129. Binfield, P. (2014). Novel scholarly journal concepts. In Opening science (pp. 155–163). Springer International Publishing.
  130. Giles, Jim (2003). "Preprint server seeks way to halt plagiarists". Nature. 426 (6962): 7. Bibcode:2003Natur.426Q...7G. doi: 10.1038/426007a . PMID   14603280.
  131. Chaddah, P. (2016). On the need for a National Preprint Repository. Proceedings of the Indian National Science Academy, 82(4), 1167–1170.

Sources