Berkeley Initiative for Transparency in the Social Sciences

Abbreviation: BITSS
Formation: 2012
Founder: Edward Miguel
Type: Initiative
Purpose: Promoting open research in the social sciences
Headquarters: Berkeley, California
Faculty Director: Edward Miguel
Parent organization: Center for Effective Global Action
Website: www.bitss.org

The Berkeley Initiative for Transparency in the Social Sciences, abbreviated BITSS, is an academic initiative dedicated to advancing transparency, reproducibility, and openness in social science research. It was established in 2012 by the University of California, Berkeley's Center for Effective Global Action.[1] It has worked with the Center for Open Science to define and promote a set of best practices for social scientists to maximize transparency in their research.[2] BITSS has also worked to promote registered reports, supporting journals such as the Journal of Development Economics in adopting this review track.

In 2015, BITSS began awarding the annual Leamer-Rosenthal Prizes for Open Social Science to honor outstanding achievements and emerging leaders in promoting transparency in social science.[3] Through its Catalyst program, the initiative also supports and empowers over 150 graduate students, faculty, librarians, and early-career researchers to advance open science around the world.[4] Its annual Research Transparency and Reproducibility Training (RT2) provides an overview of, and hands-on practice with, tools and methods for transparent and reproducible social science research. Its Massive Open Online Course "Transparent and Open Social Science Research," based on a UC Berkeley course taught by Edward Miguel, is available on the FutureLearn platform. In 2019, BITSS also began distributing copies of "Transparent and Reproducible Social Science Research," a textbook written by former BITSS Scientist Garret Christensen, Jeremy Freese, and Edward Miguel with support from BITSS, at its trainings and events.

BITSS has supported or led several metascience research projects, including the State of Social Science (3S) study and the Social Science Meta-Analysis and Research Transparency (SSMART) portfolio.[5] BITSS also manages MetaArXiv, an interdisciplinary archive of articles on metascience, research transparency, and reproducibility hosted on OSF Preprints.

In recent years, BITSS has begun developing digital infrastructure to enable open science practices. The Social Science Prediction Platform (SSPP), launched in 2020, enables the systematic collection and assessment of expert forecasts of research results and the effects of untested social programs. [6] The Social Science Reproduction Platform (SSRP) crowdsources and catalogs attempts to assess and improve the computational reproducibility of social science research. The accompanying Guide for Accelerating Computational Reproducibility in the Social Sciences elucidates a common approach, terminology, and standards for conducting reproductions. [7] These platforms are part of a growing ecosystem of tools that expand opportunities to participate in the scientific endeavor.
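
As a rough illustration of the assessment step, the minimal Python sketch below (not the SSPP's actual code or API) scores a set of hypothetical expert forecasts of a treatment effect against the effect size eventually estimated by the completed study, using absolute error; every name and number in it is invented for illustration.

    # Hypothetical expert forecasts of a program's treatment effect,
    # in standard deviation units (all values invented for illustration).
    forecasts = {
        "expert_1": 0.15,
        "expert_2": 0.05,
        "expert_3": 0.30,
    }
    realized_effect = 0.12  # effect size later estimated by the completed study

    # Score each forecast by its absolute error against the realized effect.
    errors = {name: abs(pred - realized_effect) for name, pred in forecasts.items()}

    # The pooled (mean) forecast is often compared with individual experts.
    mean_forecast = sum(forecasts.values()) / len(forecasts)

    for name, err in sorted(errors.items(), key=lambda kv: kv[1]):
        print(f"{name}: absolute error = {err:.3f}")
    print(f"pooled forecast = {mean_forecast:.3f}, "
          f"absolute error = {abs(mean_forecast - realized_effect):.3f}")

A platform like the SSPP can aggregate such scores across many studies to track how well expert predictions anticipate research results.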

BITSS has also incubated an initiative on Open Policy Analysis (OPA), [8] which seeks to strengthen connections between research and policy and reduce political polarization by translating open science practices to policy analysis. Led by Fernando Hoces de la Guardia, the OPA initiative has developed tools for US Senator Elizabeth Warren's wealth tax proposal and Evidence Action's Deworm the World program.

Related Research Articles

Reproducibility, also known as replicability and repeatability, is a major principle underpinning the scientific method. For the findings of a study to be reproducible means that results obtained by an experiment or an observational study or in a statistical analysis of a data set should be achieved again with a high degree of reliability when the study is replicated. There are different kinds of replication but typically replication studies involve different researchers using the same methodology. Only after one or several such successful replications should a result be recognized as scientific knowledge.

E-Science or eScience is computationally intensive science that is carried out in highly distributed network environments, or science that uses immense data sets that require grid computing; the term sometimes includes technologies that enable distributed collaboration, such as the Access Grid. The term was created by John Taylor, the Director General of the United Kingdom's Office of Science and Technology in 1999 and was used to describe a large funding initiative starting in November 2000. E-science has been more broadly interpreted since then, as "the application of computer technology to the undertaking of modern scientific investigation, including the preparation, experimentation, data collection, results dissemination, and long-term storage and accessibility of all materials generated through the scientific process. These may include data modeling and analysis, electronic/digitized laboratory notebooks, raw and fitted data sets, manuscript production and draft versions, pre-prints, and print and/or electronic publications." In 2014, IEEE eScience Conference Series condensed the definition to "eScience promotes innovation in collaborative, computationally- or data-intensive research across all disciplines, throughout the research lifecycle" in one of the working definitions used by the organizers. E-science encompasses "what is often referred to as big data [which] has revolutionized science... [such as] the Large Hadron Collider (LHC) at CERN... [that] generates around 780 terabytes per year... highly data intensive modern fields of science...that generate large amounts of E-science data include: computational biology, bioinformatics, genomics" and the human digital footprint for the social sciences.

Bibliometrics (statistical analysis of written publications)

Bibliometrics is the use of statistical methods to analyse books, articles, and other publications, especially with regard to scientific content. Bibliometric methods are frequently used in the field of library and information science. Bibliometrics is closely associated with scientometrics, the analysis of scientific metrics and indicators, to the point that the two fields largely overlap.

American Psychologist (academic journal)

American Psychologist is the flagship peer-reviewed academic journal of the American Psychological Association. The journal publishes timely, high-impact articles of broad interest. Papers include empirical reports and scholarly reviews covering science, practice, education, and policy. The current editor-in-chief is Harris Cooper.

Open science is the movement to make scientific research and its dissemination accessible to all levels of society, amateur or professional. Open science is transparent and accessible knowledge that is shared and developed through collaborative networks. It encompasses practices such as publishing open research, campaigning for open access, encouraging scientists to practice open-notebook science, broadening dissemination of and engagement with science, and generally making it easier to publish, access, and communicate scientific knowledge.

Open data (data that anyone can access, use or share)

Open data is data that is openly accessible, exploitable, editable and shared by anyone for any purpose, even commercially. Open data is licensed under an open license.

Data sharing

Data sharing is the practice of making data used for scholarly research available to other investigators. Many funding agencies, institutions, and publication venues have policies regarding data sharing because transparency and openness are considered by many to be part of the scientific method.

Galaxy (computational biology)

Galaxy is a scientific workflow, data integration, and data and analysis persistence and publishing platform that aims to make computational biology accessible to research scientists who do not have computer programming or systems administration experience. Although it was initially developed for genomics research, it is largely domain agnostic and is now used as a general bioinformatics workflow management system.

Arthur Lupia is an American political scientist. He is the Gerald R. Ford University Professor at the University of Michigan and Assistant Director of the National Science Foundation. Prior to joining the NSF, he was Chairperson of the Board of the Center for Open Science and Chair of the National Research Council's Roundtable on the Application of Behavioral and Social Science. His research concerns how information and institutions affect policy and politics, with a focus on how people make decisions when they lack information. He draws from multiple scientific and philosophical disciplines and uses multiple research methods. His topics of expertise include information processing, persuasion, strategic communication, and civic competence.

Why Most Published Research Findings Are False

"Why Most Published Research Findings Are False" is a 2005 essay written by John Ioannidis, a professor at the Stanford School of Medicine, and published in PLOS Medicine. It is considered foundational to the field of metascience.

Edward Miguel (American economist)

Edward "Ted" Andrew Miguel is the Oxfam Professor of Environmental and Resource Economics in the Department of Economics at University of California, Berkeley, US. He is the founder and faculty director of the Center for Effective Global Action (CEGA) at U.C. Berkeley.

Journal of Experimental Psychology: General (academic journal)

The Journal of Experimental Psychology: General is a peer-reviewed academic journal published by the American Psychological Association. It was established in 1975 as an independent section of the Journal of Experimental Psychology and covers research in experimental psychology.

Center for Open Science

The Center for Open Science is a non-profit technology organization based in Charlottesville, Virginia, with a mission to "increase the openness, integrity, and reproducibility of scientific research." Brian Nosek and Jeffrey Spies founded the organization in January 2013, with funding mainly from the Laura and John Arnold Foundation and other donors.

Replication crisis (ongoing methodological crisis in science stemming from the failure to replicate many studies)

The replication crisis is an ongoing methodological crisis in which it has been found that the results of many scientific studies are difficult or impossible to reproduce. Because the reproducibility of empirical results is an essential part of the scientific method, such failures undermine the credibility of theories building on them and potentially call into question substantial parts of scientific knowledge.

The Open Energy Modelling Initiative (openmod) is a grassroots community of energy system modellers from universities and research institutes across Europe and elsewhere. The initiative promotes the use of open-source software and open data in energy system modelling for research and policy advice. The Open Energy Modelling Initiative documents a variety of open-source energy models and addresses practical and conceptual issues regarding their development and application. The initiative runs an email list, an internet forum, and a wiki and hosts occasional academic workshops. A statement of aims is available.

Metascience is the use of scientific methodology to study science itself. Metascience seeks to increase the quality of scientific research while reducing inefficiency. It is also known as "research on research" and "the science of science", as it uses research methods to study how research is done and find where improvements can be made. Metascience concerns itself with all fields of research and has been described as "a bird's eye view of science". In the words of John Ioannidis, "Science is the best thing that has happened to human beings ... but we can do it better."

The Meta-Research Center at Tilburg University is a metascience research center within the School of Social and Behavioral Sciences at Tilburg University in the Netherlands. The center was profiled in a September 2018 article in Science.

Statcheck is an R package designed to detect statistical errors in peer-reviewed psychology articles by searching papers for statistical results, redoing the calculations described in each paper, and comparing the two values to see if they match. It takes advantage of the fact that psychological research papers tend to report their results in accordance with the guidelines published by the American Psychological Association (APA). This leads to several disadvantages: it can only detect results reported completely and in exact accordance with the APA's guidelines, and it cannot detect statistics that are only included in tables in the paper. Another limitation is that Statcheck cannot deal with statistical corrections to test statistics, like Greenhouse–Geisser or Bonferroni corrections, which actually make tests more conservative. Some journals have begun piloting Statcheck as part of their peer review process. Statcheck is free software published under the GNU GPL v3.
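
The core consistency check behind this is easy to illustrate. The following minimal Python sketch (statcheck itself is an R package, so this is not its code) extracts a single APA-style t-test report with a simplified regular expression, recomputes the two-tailed p-value implied by the reported statistic and degrees of freedom, and flags a discrepancy larger than rounding; the example sentence and the 0.005 tolerance are invented for illustration.

    import re
    from scipy import stats

    # A sentence as it might appear in a psychology paper, with an
    # APA-formatted t-test result (invented example).
    sentence = "The groups differed significantly, t(28) = 2.20, p = .036."

    # Simplified pattern for "t(df) = value, p = value"; statcheck's own
    # patterns cover more test types and reporting variants.
    match = re.search(r"t\((\d+)\)\s*=\s*(-?[\d.]+),\s*p\s*=\s*(\.?\d+)", sentence)
    if match:
        df = int(match.group(1))
        t_value = float(match.group(2))
        reported_p = float(match.group(3))
        # Recompute the two-tailed p-value from the reported t and df.
        recomputed_p = 2 * stats.t.sf(abs(t_value), df)
        # Flag the result if the reported and recomputed p-values differ
        # by more than rounding error (0.005 is an illustrative tolerance).
        if abs(recomputed_p - reported_p) > 0.005:
            print(f"Inconsistent: reported p = {reported_p}, recomputed p = {recomputed_p:.3f}")
        else:
            print(f"Consistent: recomputed p = {recomputed_p:.3f}")

Because the check depends on recognizing the reported numbers in this rigid format, results reported incompletely or only in tables are invisible to it, which is the limitation described above.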

Preregistration (science)

Preregistration is the practice of registering the hypotheses, methods, and/or analyses of a scientific study before it is conducted. This can include analyzing primary data or secondary data. Clinical trial registration is similar, although it may not require the registration of a study's analysis protocol. Finally, registered reports include the peer review and in principle acceptance of a study protocol prior to data collection.

Crowdsourced science refers to the collaborative contribution of a large group of people to the different steps of the research process in science. In psychology, the nature and scope of such collaborations can vary in their application and in the benefits they offer.

References

  1. "About". Berkeley Initiative for Transparency in the Social Sciences. 2015-10-08. Retrieved 2018-10-19.
  2. Miguel, E.; Camerer, C.; Casey, K.; Cohen, J.; Esterling, K. M.; Gerber, A.; Glennerster, R.; Green, D. P.; Humphreys, M. (2014-01-03). "Promoting Transparency in Social Science Research". Science. 343 (6166): 30–31. Bibcode:2014Sci...343...30M. doi:10.1126/science.1245317. ISSN 0036-8075. PMC 4103621. PMID 24385620.
  3. Yong, Ed (2015-12-10). "Make Science More Reliable, Win Cash Prizes". The Atlantic. Retrieved 2018-10-19.
  4. "Catalysts". Berkeley Initiative for Transparency in the Social Sciences. 2015-12-03. Retrieved 2018-10-19.
  5. Christensen, Garret; Wang, Zenan; Paluck, Elizabeth Levy; Swanson, Nicholas; Birke, David J.; Miguel, Edward; Littman, Rebecca (2019-10-18). "Open Science Practices are on the Rise: The State of Social Science (3S) Survey". doi:10.31222/osf.io/5rksu.
  6. DellaVigna, Stefano; Pope, Devin; Vivalt, Eva (2019-10-25). "Predict science to improve science". Science. 366 (6464): 428–429. doi:10.1126/science.aaz1704. ISSN 0036-8075.
  7. ACRE Team. Guide for Accelerating Computational Reproducibility in the Social Sciences.
  8. Hoces de la Guardia, Fernando; Grant, Sean; Miguel, Edward (2021-04-01). "A framework for open policy analysis". Science and Public Policy. 48 (2): 154–163. doi:10.1093/scipol/scaa067. hdl:1805/24824. ISSN 0302-3427.