Open peer review

Open peer review refers to various possible modifications of the traditional scholarly peer review process. The three most common modifications to which the term is applied are: [1]

  1. Open identities: Authors and reviewers are aware of each other's identity. [2] [3]
  2. Open reports: Review reports are published alongside the relevant article (rather than being kept confidential).
  3. Open participation: The wider community (and not just invited reviewers) is able to contribute to the review process.

These modifications are intended to address various perceived shortcomings of the traditional scholarly peer review process, in particular its lack of transparency, lack of incentives, wastefulness, [1] and bullying and harassment. [4]

Definitions

Open identities
Open peer review may be defined as "any scholarly review mechanism providing disclosure of author and referee identities to one another at any point during the peer review or publication process". [5] The reviewers' identities may or may not also be disclosed to the public. This is in contrast to the traditional peer review process, in which reviewers remain anonymous to everyone but the journal's editors, while the authors' names are disclosed from the beginning.
Open reports
Open peer review may be defined as making the reviewers' reports public, instead of disclosing them to the article's authors only. This may include publishing the rest of the peer review history, i.e. the authors' replies and editors' recommendations. Most often, this concerns only articles that are accepted for publication, and not those that are rejected.
Open participation
Open peer review may be defined as allowing self-selected reviewers to comment on an article, rather than (or in addition to) having reviewers who are selected by the editors. This assumes that the text of the article is openly accessible. The self-selected reviewers may or may not be screened for their basic credentials, and they may contribute either short comments or full reviews. [1]
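
The three dimensions are independent of one another, so a journal may adopt any combination of them. As a minimal, purely illustrative sketch (the names below are invented and do not reflect any publisher's actual data model), a review policy can be described along these three boolean axes:

```python
from dataclasses import dataclass

# Illustrative only: a hypothetical way to describe a journal's review policy
# along the three axes defined above. The field names are invented.

@dataclass(frozen=True)
class ReviewPolicy:
    open_identities: bool     # authors and reviewers know each other's names
    open_reports: bool        # review reports are published with the article
    open_participation: bool  # self-selected community members may contribute reviews

    def is_open(self) -> bool:
        """Falls under the open peer review umbrella if at least one axis is opened."""
        return self.open_identities or self.open_reports or self.open_participation

# A traditional journal versus a hypothetical fully open one.
traditional = ReviewPolicy(open_identities=False, open_reports=False, open_participation=False)
fully_open = ReviewPolicy(open_identities=True, open_reports=True, open_participation=True)
print(traditional.is_open(), fully_open.is_open())  # False True
```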

History

In 1999, the open access journal Journal of Medical Internet Research [6] was launched, which from its inception published the names of the reviewers at the bottom of each published article. Also in 1999, the British Medical Journal moved to an open peer review system, revealing reviewers' identities to the authors but not to the readers, [7] and in 2000, the medical journals in the open access BMC series [8] published by BioMed Central launched using open peer review. As with the BMJ, the reviewers' names are included on the peer review reports. In addition, if the article is published, the reports are made available online as part of the "pre-publication history".[ citation needed ]

Several other journals published by the BMJ Group allow optional open peer review, [7] as does PLoS Medicine, published by the Public Library of Science. [9] The BMJ's Rapid Responses allows ongoing debate and criticism following publication. [10]

In June 2006, Nature launched an experiment in parallel open peer review: some articles that had been submitted to the regular anonymous process were also available online for open, identified public comment. The results were less than encouraging – only 5% of authors agreed to participate in the experiment, and only 54% of those articles received comments. [11] [12] The editors have suggested that researchers may have been too busy to take part and were reluctant to make their names public. The knowledge that articles were simultaneously being subjected to anonymous peer review may also have affected the uptake.

In February 2006, the journal Biology Direct was launched by BioMed Central, adding another alternative to the traditional model of peer review. If authors can find three members of the Editorial Board who will each return a report or will themselves solicit an external review, the article will be published. As with Philica, reviewers cannot suppress publication, but in contrast to Philica, no reviews are anonymous and no article is published without being reviewed. Authors have the opportunity to withdraw their article, to revise it in response to the reviews, or to publish it without revision. If the authors proceed with publication of their article despite critical comments, readers can clearly see any negative comments along with the names of the reviewers. [13] [ third-party source needed ] In the social sciences, there have been experiments with wiki-style, signed peer reviews, for example in an issue of the Shakespeare Quarterly. [14]

In 2010, the BMJ began publishing signed reviewers' reports alongside accepted papers, after determining that telling reviewers that their signed reviews might be posted publicly did not significantly affect the quality of the reviews. [15]

In 2011, Peerage of Science, an independent peer review service, was launched with several non-traditional approaches to academic peer review. Most prominently, these include the judging and scoring of the accuracy and justifiability of peer reviews, and the concurrent use of a single peer review round by several participating journals.[ citation needed ] Peerage of Science went out of business only a few years after it was founded, because it could attract neither enough publishers nor enough reviewers.

Starting in 2013 with the launch of F1000Research, some publishers have combined open peer review with post-publication peer review by using a versioned article system. At F1000Research, articles are published before review, and invited peer review reports (and reviewer names) are published with the article as they come in. [16] Author-revised versions of the article are then linked to the original. A similar post-publication review system with versioned articles is used by ScienceOpen, launched in 2014. [17]
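
A minimal sketch of such a versioned workflow, with invented names rather than the actual data model of F1000Research or ScienceOpen, might represent an article as a sequence of versions, each carrying its own signed, published review reports:

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical sketch of a versioned-article record: the article is published
# first, signed review reports are attached to a version as they arrive, and
# each author revision becomes a new version linked to the history. All names
# here are invented for illustration.

@dataclass
class Review:
    reviewer_name: str   # open identities: the report is signed
    report: str          # open reports: published alongside the article
    approved: bool

@dataclass
class ArticleVersion:
    number: int
    text: str
    reviews: List[Review] = field(default_factory=list)

@dataclass
class Article:
    title: str
    versions: List[ArticleVersion] = field(default_factory=list)

    def publish_revision(self, text: str) -> ArticleVersion:
        """Publish a new version; earlier versions and their reviews stay visible."""
        version = ArticleVersion(number=len(self.versions) + 1, text=text)
        self.versions.append(version)
        return version

# Publish first, attach invited reviews as they come in, then revise.
article = Article(title="Example article")
v1 = article.publish_revision("Initial manuscript text")
v1.reviews.append(Review("Reviewer A", "The methods section needs more detail.", approved=False))
v2 = article.publish_revision("Revised manuscript text")
```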

Also in 2013, researchers from the College of Information and Computer Sciences at the University of Massachusetts Amherst founded the OpenReview website [18] to host anonymized review reports together with articles; as of 2023, it is popular among computer scientists.

In 2014, Life implemented an open peer review system, [19] under which the peer review reports and authors' responses are published as an integral part of the final version of each article.

Since 2016, Synlett has been experimenting with closed crowd peer review: the article under review is sent to a pool of more than 80 expert reviewers, who then collaboratively comment on the manuscript. [20]

In an effort to address issues with the reproducibility of research results, some scholars are asking that authors agree to share their raw data as part of the peer review process. [21] As far back as 1962, for example, psychologists have attempted to obtain raw data sets from other researchers in order to reanalyze them, with mixed results; one recent attempt yielded only seven data sets out of fifty requests. The notion of obtaining, let alone requiring, open data as a condition of peer review remains controversial. [22] In 2020, the lack of access to raw data during peer review led to article retractions in The New England Journal of Medicine and The Lancet. Many journals now require that access to raw data be part of the peer review process. [23]

Adoption

Adoption by publishers

These publishers and journals operate various types of open peer review:

Peer review at The BMJ, [30] BioMed Central, [31] EMBO, [32] eLife, [33] ReScience C, [28] and the Semantic Web journal [34] involves posting the entire pre-publication history of the article online, including not only signed reviews of the article, but also its previous versions and in some cases names of handling editors and author responses to the reviewers. Furthermore, the Semantic Web journal publishes reviews of all submissions, including rejected ones, on its website, while eLife plans to publish the reviews not only for published articles, but also for rejected articles. [35]

The European Geosciences Union operates public discussions where open peer review is conducted before suitable articles are accepted for publication in the journal. [36]

Sci, an open access journal covering all research fields, adopted a post-publication public peer review (P4R) model, in which authors are promised immediate visibility of their manuscripts on the journal's online platform after a brief check for scientific soundness, proper reporting, plagiarism and offensive material; the manuscript is then open for public review by the entire community. [37] [38] [39] [40]

In 2021, the authors of nearly half of the articles published by Nature chose to publish the reviewer reports as well. The journal considers this an encouraging trial of transparent peer review. [41]

Open peer review of preprints

Some platforms, including some preprint servers, facilitate open peer review of preprints.

Advantages and disadvantages

Argued

Open identities have been argued to encourage reviewers to be "more tactful and constructive" than they would be if they could remain anonymous, while also allowing authors to accumulate enemies who try to keep their papers from being published or their grant applications from being successful. [46]

Open peer review in all its forms has been argued to favour more honest reviewing, and to prevent reviewers from following their individual agendas. [47]

An article by Lonni Besançon et al. has also argued that open peer review helps evaluate the legitimacy of manuscripts that contain editorial conflicts of interest; the authors argue that the COVID-19 pandemic has spurred many publishers to open up their review process, increasing transparency in the process. [48]

Observed

In an experiment with 56 research articles accepted by the Medical Journal of Australia in 1996–1997, the articles were published online together with the peer reviewers' comments; readers could email their comments and the authors could amend their articles further before print publication. [49] The investigators concluded that the process had modest benefits for authors, editors and readers.

Some studies have found that open identities lead to an increase in the quality of reviews, while other studies find no significant effect. [50]

Open peer review at BMJ journals has lent itself to randomized trials to study open identity and open report reviews. These studies did not find that open identities and open reports significantly affected the quality of review or the rate of acceptance of articles for publication, and there was only one reported instance of a conflict between authors and reviewers ("adverse event"). The only significant negative effect of open peer review was "increasing the likelihood of reviewers declining to review". [3] [51]

In some cases, open identities have helped detect reviewers' conflicts of interests. [52]

Open participation has been criticised as a form of popularity contest in which well-known authors are more likely to get their manuscripts reviewed than others. [53] However, even with this implementation of open peer review, both authors and reviewers acknowledged that open reviews could lead to higher-quality reviews, foster collaborations and reduce the "cite-me" effect.

According to a 2020 Nature editorial, [25] experience from Nature Communications negates the concerns that open reports would be less critical, or would require an excessive amount of work from reviewers.

Thanks to published reviewer comments, it is possible to conduct quantitative studies of the peer review process. For example, a 2021 study found that scrutiny by more reviewers mostly does not correlate with more impactful papers. [54]
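
As a rough illustration of such an analysis (using made-up numbers, not the data from the cited study), correlating per-article reviewer counts with later citation counts reduces to a rank-correlation test:

```python
# Sketch of the kind of quantitative analysis that published review reports
# make possible: does the number of reviewers per article correlate with how
# often the article is later cited? The numbers below are invented.
from scipy.stats import spearmanr

reviewer_counts = [2, 3, 2, 4, 3, 2, 5, 3]      # hypothetical reviewers per article
citation_counts = [10, 7, 14, 9, 12, 8, 11, 6]  # hypothetical citations received

rho, p_value = spearmanr(reviewer_counts, citation_counts)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.2f}")
```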

See also


References

  1. Ross-Hellauer, Tony (2017-08-31). "What is open peer review? A systematic review". F1000Research. F1000 Research Ltd. 6: 588. doi: 10.12688/f1000research.11369.2 . ISSN   2046-1402. PMC   5437951 . PMID   28580134.
  2. Walsh E, Rooney M, Appleby L, Wilkinson G (January 2000). "Open peer review: a randomised controlled trial". The British Journal of Psychiatry. 176 (1): 47–51. doi: 10.1192/bjp.176.1.47 . PMID   10789326.
  3. van Rooyen S, Godlee F, Evans S, Black N, Smith R (January 1999). "Effect of open peer review on quality of reviews and on reviewers' recommendations: a randomised trial". BMJ. 318 (7175): 23–7. doi:10.1136/bmj.318.7175.23. PMC   27670 . PMID   9872878.
  4. Sanders, Jeremy K. M. (January 2020). "Editorial 2020: Changing publishing and academic culture". Royal Society Open Science. 7 (1): 192197. Bibcode:2020RSOS....792197S. doi:10.1098/rsos.192197. ISSN   2054-5703. PMC   7029889 . PMID   32218987.
  5. Ford E (2015-07-20). "Open peer review at four STEM journals: an observational overview". F1000Research. 4: 6. doi: 10.12688/f1000research.6005.2 . PMC   4350441 . PMID   25767695.
  6. "JMIR Home". JMIR.org. Retrieved 4 January 2012.
  7. Smith, R. (January 1999). "Opening up BMJ peer review". BMJ. 318 (7175): 4–5. doi:10.1136/bmj.318.7175.4. PMC   1114535 . PMID   9872861.
  8. "BMC series". Biomedcentral.com. Retrieved 4 January 2012.
  9. Mathers, Colin D; Loncar, Dejan (27 March 2009). "PLoS Medicine: A Peer-Reviewed, Open-Access Journal". PLOS Medicine. 3 (11): e442. doi: 10.1371/journal.pmed.0030442 . PMC   1664601 . PMID   17132052 . Retrieved 4 January 2012.
  10. Delamothe, T.; Smith, R. (May 2002). "Twenty thousand conversations". BMJ. 324 (7347): 1171–2. doi:10.1136/bmj.324.7347.1171. PMC   1123149 . PMID   12016170.
  11. "Overview: Nature's peer review trial". Nature. December 2006. doi:10.1038/nature05535.
  12. "Peer review and fraud". Nature. 444 (7122): 971–972. 2006. Bibcode:2006Natur.444R.971.. doi: 10.1038/444971b . PMID   17183274. S2CID   27163842.
  13. "Aims and scope". Biology Direct.
  14. Cohen, Patricia (August 23, 2010). "For Scholars, Web Changes Sacred Rite of Peer Review". The New York Times.
  15. van Rooyen, S.; Delamothe, T.; Evans, S. J. (November 2010). "Effect on peer review of telling reviewers that their signed reviews might be posted on the web: randomised controlled trial". BMJ. 341: c5729. doi:10.1136/bmj.c5729. PMC   2982798 . PMID   21081600.
  16. Jeffrey Marlow (July 23, 2013). "Publish First, Ask Questions Later". Wired. Retrieved 2015-01-13.
  17. Elizabeth Allen (September 29, 2017) [December 8, 2014]. "The recipe for our (not so) secret Post-Publication Peer Review sauce!". ScienceOpen.com. Retrieved 2015-01-13.
  18. "OpenReview".
  19. Rampelotto, Pabulo (2014). "Editorial". Life. 4 (2): 225–226. Bibcode:2014Life....4..225R. doi: 10.3390/life4020225 . PMC   4187159 . PMID   25370195.
  20. "The case for crowd peer review". Chemical & Engineering News. 2018-11-26. Retrieved 2020-03-10.
  21. "The PRO Initiative for Open Science". Peer Reviewers' Openness Initiative. 2014-09-13. Retrieved 15 September 2018.
  22. Witkowski, Tomasz (2017). "A Scientist Pushes Psychology Journals toward Open Data". Skeptical Inquirer . 41 (4): 6–7. Archived from the original on 2018-09-15.
  23. "Covid-19 studies based on flawed Surgisphere data force medical journals to review processes". The Guardian, 2020.
  24. "MDPI | The Editorial Process". www.mdpi.com. Retrieved 2022-03-16.
  25. 1 2 "Nature will publish peer review reports as a trial". Nature. 578 (7793): 8. 2020. Bibcode:2020Natur.578....8.. doi: 10.1038/d41586-020-00309-9 . PMID   32025024.
  26. "Increasing the diversity and depth of the peer review pool through embracing identity". OUPblog. 2021-09-21. Retrieved 2022-01-27.
  27. "Open Peer Review". PLOS . 2020. Archived from the original on 2021-09-02. Retrieved 2021-09-02.
  28. Perkel, Jeffrey M. (2020-08-24). "Challenge to scientists: does your ten-year-old code still run?". Nature . 584 (7822): 656–658. Bibcode:2020Natur.584..656P. doi: 10.1038/d41586-020-02462-7 . PMID   32839567.
  29. "Refereeing Procedure". SciPost. Retrieved 22 August 2021.
  30. Groves T, Loder E (September 2014). "Prepublication histories and open peer review at the BMJ". BMJ. 349 (sep03 13): g5394. doi: 10.1136/bmj.g5394 . PMID   25186622.
  31. "What is 'open peer review', as operated by the medical journals in the BMC series?". BioMed Central. Retrieved 31 July 2015.
  32. Pulverer B (November 2010). "Transparency showcases strength of peer review". Nature. 468 (7320): 29–31. Bibcode:2010Natur.468...29P. doi: 10.1038/468029a . PMID   21048742.
  33. "Peer review". eLife. Retrieved 30 December 2019.
  34. Janowicz, Krzysztof; Hitzler, Pascal (January 2012). "Open and transparent: the review process of the Semantic Web journal". Learned Publishing . 25 (1): 48–55. doi: 10.1087/20120107 .
  35. Kwon, Diana (2020-12-15). "Open-access journal eLife announces 'preprint first' publishing model". Nature. Springer Science and Business Media LLC. doi:10.1038/d41586-020-03541-5. ISSN   0028-0836. PMID   33319829. S2CID   229172479.
  36. "Online + Open Access Publishing". European Geosciences Union. Retrieved 30 December 2019.
  37. Rittman, Martyn; Vazquez, Franck (June 2019). "Sci—An Open Access Journal with Post-Publication Peer Review". Sci. 1 (1): 1. doi: 10.3390/sci1010001 .
  38. Jacob, Claus; Rittman, Martyn; Vazquez, Franck; Abdin, Ahmad Yaman (June 2019). "Evolution of Sci's Community-Driven Post-Publication Peer-Review". Sci. 1 (1): 16. doi: 10.3390/sci1010016.v1 .
  39. Vazquez, Franck; Lin, Shu-Kun; Jacob, Claus (December 2020). "Changing Sci from Post-Publication Peer-Review to Single-Blind Peer-Review". Sci. 2 (4): 82. doi: 10.3390/sci2040082 .
  40. Abdin, Ahmad Yaman; Nasim, Muhammad Jawad; Ney, Yannick; Jacob, Claus (March 2021). "The Pioneering Role of Sci in Post Publication Public Peer Review (P4R)". Publications. 9 (1): 13. doi: 10.3390/publications9010013 .
  41. "Nature is trialling transparent peer review — the early results are encouraging". Nature. 603 (7899): 8. 2022-03-01. Bibcode:2022Natur.603....8.. doi: 10.1038/d41586-022-00493-w . ISSN   0028-0836. PMID   35233099. S2CID   247189806.
  42. Soergel, David; Saunders, Adam; McCallum, Andrew (2013-05-14). "Open Scholarship and Peer Review: a Time for Experimentation". OpenReview. Retrieved 2023-12-05.
  43. Brainard, Jeffrey (2019-10-10). "In bid to boost transparency, bioRxiv begins posting peer reviews next to preprints". Science. American Association for the Advancement of Science (AAAS). doi:10.1126/science.aaz8160. ISSN   0036-8075. S2CID   211766434.
  44. Coy, Peter (2022-01-28). "Opinion | How to Disseminate Science Quickly". The New York Times. ISSN   0362-4331 . Retrieved 2022-10-04.
  45. Johansson, Michael A.; Saderi, Daniela (2020). "Open peer-review platform for COVID-19 preprints". Nature. Springer Science and Business Media LLC. 579 (7797): 29. Bibcode:2020Natur.579...29J. doi: 10.1038/d41586-020-00613-4 . ISSN   0028-0836. PMID   32127711.
  46. Decoursey, Thomas (March 1999). "Pros and cons of open peer review". Nature Neuroscience. 2 (3): 197–8. doi:10.1038/nature04991. PMID   10195206.
  47. "What is peer review?". Elsevier. Retrieved 31 July 2015.
  48. Besançon, Lonni; Peiffer-Smadja, Nathan; Segalas, Corentin; Jiang, Haiting; Masuzzo, Paola; Smout, Cooper; Billy, Eric; Deforet, Maxime; Leyrat, Clémence (2020). "Open Science Saves Lives: Lessons from the COVID-19 Pandemic". BMC Medical Research Methodology. 21 (1): 117. doi: 10.1186/s12874-021-01304-y . PMC   8179078 . PMID   34090351.
  49. Bingham, Craig M.; Higgins, Gail; Coleman, Ross; Van Der Weyden, Martin B. (1998). "The Medical Journal of Australia internet peer-review study". The Lancet. 352 (9126): 441–445. doi:10.1016/S0140-6736(97)11510-0. PMID   9708752. S2CID   34493476.
  50. Lee CJ, Sugimoto CR, Zhang G, Cronin B (January 2013). "Bias in peer review". Journal of the American Society for Information Science and Technology. 64 (1): 2–17. doi:10.1002/asi.22784.
  51. van Rooyen S, Delamothe T, Evans SJ (November 2010). "Effect on peer review of telling reviewers that their signed reviews might be posted on the web: randomised controlled trial". BMJ. 341: c5729. doi:10.1136/bmj.c5729. PMC   2982798 . PMID   21081600.
  52. Benos DJ, Bashari E, Chaves JM, Gaggar A, Kapoor N, LaFrance M, Mans R, Mayhew D, McGowan S, Polter A, Qadri Y, Sarfare S, Schultz K, Splittgerber R, Stephenson J, Tower C, Walton RG, Zotov A (June 2007). "The ups and downs of peer review". Advances in Physiology Education. 31 (2): 145–52. doi:10.1152/advan.00104.2006. PMID   17562902.
  53. Besançon, Lonni; Rönnberg, Niklas; Löwgren, Jonas; Tennant, Jonathan P.; Cooper, Matthew (2020). "Open up: a survey on open and non-anonymized peer reviewing". Research Integrity and Peer Review. 5 (1): 8. doi: 10.1186/s41073-020-00094-z . ISSN   2058-8615. PMC   7318523 . PMID   32607252.
  54. Wolfram, Dietmar; Wang, Peiling; Abuzahra, Fuad (2021-03-13). "An exploration of referees' comments published in open peer review journals: The characteristics of review language and the association between review scrutiny and citations". Research Evaluation. Oxford University Press (OUP). 30 (3): 314–322. doi:10.1093/reseval/rvab005. ISSN   0958-2029.