PubPeer

URL: pubpeer.com
Launched: 2012

PubPeer is a website that allows users to discuss and review scientific research after publication, i.e. post-publication peer review, established in 2012.


The site has served as a whistleblowing platform, highlighting shortcomings in several high-profile papers, in some cases leading to retractions and to accusations of scientific fraud, [1] [2] [3] [4] [5] [6] [7] as noted by Retraction Watch. [8] Unlike most platforms, it allows anonymous post-publication commenting, a controversial feature that is the main factor in its success. [9] [10] Consequently, accusations of libel have been levelled at some of PubPeer's users; [11] [12] since 2016 the website has accordingly instructed commenters to use only facts that can be publicly verified. [13]

Questions have been raised about the copyright ownership of PubPeer's often anonymous content. [14]

In 2021 a study found that "more than two-thirds of comments [on PubPeer] are posted to report some type of misconduct, mainly about image manipulation". The health and life sciences attracted the most comments, and most of the comments reporting publishing fraud and data manipulation. The social sciences and humanities drew fewer comments, but the highest percentage of comments offering critical reviews of theory and highlighting methodological flaws. The study concluded that "while biochemists access the site to report misconduct... social scientists and humanists use it to discuss conclusions and detect methodological errors". It also reported that 85.6% of comments are anonymous and that "only 31.5% of publications received more than three comments, and the response rate of authors is very low (7.5%)." [15]
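The shares reported by the study are simple descriptive tallies. As a rough illustration only, the short Python sketch below counts comment categories and the proportion of anonymous comments in a small invented sample; the records and field names are hypothetical and not taken from the study's data.

  from collections import Counter

  # Invented, labelled sample of PubPeer-style comments (not real data).
  comments = [
      {"category": "image manipulation", "anonymous": True},
      {"category": "image manipulation", "anonymous": True},
      {"category": "publishing fraud", "anonymous": True},
      {"category": "critical review of theory", "anonymous": False},
  ]

  by_category = Counter(c["category"] for c in comments)
  anonymous_share = sum(c["anonymous"] for c in comments) / len(comments)

  # Print the share of each category and the share of anonymous comments.
  for category, n in by_category.most_common():
      print(f"{category}: {n / len(comments):.0%}")
  print(f"anonymous: {anonymous_share:.0%}")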

In 2023 a study found that "only 21.5% of the articles [flagged on PubPeer] that deserve an editorial notice (i.e., honest errors, methodological flaws, publishing fraud, manipulation) were corrected by the [relevant] journal". [16]


Related Research Articles

Scientific misconduct – Violation of codes of scholarly conduct and ethical behavior in scientific research

Scientific misconduct is the violation of the standard codes of scholarly conduct and ethical behavior in the publication of professional scientific research. It is a violation of scientific integrity: a violation of the scientific method and of research ethics in science, including in the design, conduct, and reporting of research.

In academic publishing, a retraction is a mechanism by which a published paper in an academic journal is flagged as being so seriously flawed that its results and conclusions can no longer be relied upon. Retracted articles are not removed from the published literature but marked as retracted. In some cases it may be necessary to remove an article from publication, such as when the article is clearly defamatory, violates personal privacy, is the subject of a court order, or might pose a serious health risk to the general public.

"Severe dopaminergic neurotoxicity in primates after a common recreational dose regimen of MDMA ("ecstasy")", is an article by George A. Ricaurte that was published in September 2002 in the peer-reviewed journal Science, one of the world's top academic journals. It was later retracted; instead of using MDMA, methamphetamine had been used in the test.

In scientific inquiry and academic research, data fabrication is the intentional misrepresentation of research results. As with other forms of scientific misconduct, it is the intent to deceive that marks fabrication as unethical, and thus different from scientists deceiving themselves. There are many ways data can be fabricated. Experimental data can be fabricated by reporting experiments that were never conducted, and accurate data can be manipulated or misrepresented to suit a desired outcome. One of the biggest problems with this form of scientific fraud is that "university investigations into research misconduct are often inadequate, opaque and poorly conducted. They challenge the idea that institutions can police themselves on research integrity."

In academic publishing, the least publishable unit (LPU), also smallest publishable unit (SPU), minimum publishable unit (MPU), loot, or publon, is the minimum amount of information that can be used to generate a publication in a peer-reviewed venue, such as a journal or a conference. (Maximum publishable unit and optimum publishable unit are also used.) The term is often used as a joking, ironic, or derogatory reference to the strategy of artificially inflating quantity of publications.

PLOS One – Peer-reviewed open-access scientific journal

PLOS One is a peer-reviewed open access mega journal published by the Public Library of Science (PLOS) since 2006. The journal covers primary research from any discipline within science and medicine. The Public Library of Science began in 2000 with an online petition initiative by Nobel Prize winner Harold Varmus, formerly director of the National Institutes of Health and at that time director of Memorial Sloan–Kettering Cancer Center; Patrick O. Brown, a biochemist at Stanford University; and Michael Eisen, a computational biologist at the University of California, Berkeley, and the Lawrence Berkeley National Laboratory.

Trinity Southwest University (TSU) is an unaccredited evangelical Christian institution of higher education with an office in Albuquerque, New Mexico. Principally a theological school that encompasses both the Bible college and theological seminary concepts of Christian education, it offers distance education programs and degrees in Biblical Studies, Theological Studies, Archaeology & Biblical History, Biblical Counseling, Biblical Representational Research, and University Studies.

Open peer review refers to the various possible modifications of the traditional scholarly peer review process. The three most common modifications to which the term is applied are:

  1. Open identities: Authors and reviewers are aware of each other's identity.
  2. Open reports: Review reports are published alongside the relevant article.
  3. Open participation: The wider community are able to contribute to the review process.

Scientific Reports is a peer-reviewed open-access scientific mega journal published by Nature Portfolio, covering all areas of the natural sciences. The journal was established in 2011. The journal states that its aim is to assess solely the scientific validity of a submitted paper, rather than its perceived importance, significance, or impact.

Scholarly peer review or academic peer review is the process of having a draft version of a researcher's methods and findings reviewed by experts in the same field. Peer review is widely used for helping the academic publisher decide whether the work should be accepted, considered acceptable with revisions, or rejected for official publication in an academic journal, a monograph or in the proceedings of an academic conference. If the identities of authors are not revealed to each other, the procedure is called dual-anonymous peer review.

Retraction Watch – Blog covering scientific paper retractions

Retraction Watch is a blog that reports on retractions of scientific papers and on related topics. The blog was launched in August 2010 and is produced by science writers Ivan Oransky and Adam Marcus. Its parent organization is the Center for Scientific Integrity, a US 501(c)(3) nonprofit organization.

Invalid science consists of scientific claims based on experiments that cannot be reproduced or that are contradicted by experiments that can be reproduced. Recent analyses indicate that the proportion of retracted claims in the scientific literature is steadily increasing. The number of retractions has grown tenfold over the past decade, but retracted papers still make up approximately 0.2% of the 1.4 million papers published annually in scholarly journals.

Supportive Care in Cancer is a monthly peer-reviewed medical journal covering research on cancer care. It is published by Springer Science+Business Media on behalf of the Multinational Association of Supportive Care in Cancer.

The Meta-Research Center at Tilburg University is a metascience research center within the School of Social and Behavioral Sciences at the Dutch Tilburg University. They were profiled in a September 2018 article in Science.

Tumor Biology is a bimonthly peer-reviewed open access medical journal covering clinical and experimental oncology. It was established in 1980 as Oncodevelopmental Biology and Medicine, obtaining its current name in 1984. It is owned by the International Society of Oncology and BioMarkers, of which it is the official journal. Originally published by Karger Publishers, it moved to Springer Science+Business Media beginning in 2010. In December 2016, the journal moved again, this time to SAGE Publications. The editor-in-chief is Magdalena Chechlinska. According to the Journal Citation Reports, the journal has a 2016 impact factor of 3.650.

Conflicts of interest in academic publishing

Conflicts of interest (COIs) often arise in academic publishing. Such conflicts may cause wrongdoing and make it more likely. Ethical standards in academic publishing exist to avoid and deal with conflicts of interest, and the field continues to develop new standards. Standards vary between journals and are unevenly applied. According to the International Committee of Medical Journal Editors, "[a]uthors have a responsibility to evaluate the integrity, history, practices and reputation of the journals to which they submit manuscripts".

Statcheck is an R package designed to detect statistical errors in peer-reviewed psychology articles by searching papers for statistical results, redoing the calculations described in each paper, and comparing the two values to see if they match. It takes advantage of the fact that psychological research papers tend to report their results in accordance with the guidelines published by the American Psychological Association (APA). This reliance leads to several limitations: it can only detect results reported completely and in exact accordance with the APA's guidelines, and it cannot detect statistics that are only included in tables in the paper. Another limitation is that Statcheck cannot handle statistical corrections to test statistics, such as Greenhouse–Geisser or Bonferroni corrections, which make tests more conservative. Some journals have begun piloting Statcheck as part of their peer review process. Statcheck is free software published under the GNU GPL v3.
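The core consistency check can be sketched in a few lines. The hypothetical example below (written in Python rather than R, purely for illustration) extracts an APA-formatted t-test result with a regular expression, recomputes the two-tailed p-value from the reported statistic and degrees of freedom, and flags a mismatch; the function name, regular expression, and rounding tolerance are assumptions, not part of Statcheck itself.

  import re
  from scipy import stats

  # Matches APA-style results such as "t(28) = 2.20, p = .03".
  APA_T = re.compile(r"t\((\d+)\)\s*=\s*(-?\d+\.?\d*)\s*,\s*p\s*=\s*(\.\d+)")

  def check_t_results(text, tol=0.005):
      """Return (reported_p, recomputed_p, consistent) for each t-test found."""
      results = []
      for df, t_val, p_rep in APA_T.findall(text):
          # Two-tailed p-value implied by the reported t statistic and df.
          p_calc = 2 * stats.t.sf(abs(float(t_val)), int(df))
          results.append((float(p_rep), p_calc, abs(p_calc - float(p_rep)) <= tol))
      return results

  print(check_t_results("The effect was significant, t(28) = 2.20, p = .03."))
  # Reported p = .03 vs a recomputed p of roughly .036: flagged as inconsistent.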

Elisabeth Bik – Dutch microbiologist (born 1966)

Elisabeth Margaretha Harbers-Bik is a Dutch microbiologist and scientific integrity consultant. Bik is known for her work detecting photo manipulation in scientific publications, and identifying over 4,000 potential cases of improper research conduct. Bik is the founder of Microbiome Digest, a blog with daily updates on microbiome research, and the Science Integrity Digest blog.

In research, a paper mill is a business that publishes poor or fake journal papers that seem to resemble genuine research, as well as sells authorship.

References

  1. "Researcher admits mistakes in stem cell study". Phys.org. 23 May 2013.
  2. Sven Stockrahm; Lydia Klöckner; Dagny Lüdemann (2013-05-23). "Zellbiologe gibt Fehler in Klonstudie zu". Zeit.
  3. Cyranoski, David; Check Hayden, Erika (2013-05-23). "Stem-cell cloner acknowledges errors in groundbreaking paper". Nature. doi:10.1038/nature.2013.13060. ISSN 1476-4687.
  4. Otake, Tomoko (2014-04-20). "'STAPgate' shows Japan must get back to basics in science". The Japan Times. Retrieved 2024-08-31.
  5. Singh Chawla, Dalmeet (2024-04-29). "How reliable is this research? Tool flags papers discussed on PubPeer". Nature. 629 (8011): 271–272. doi:10.1038/d41586-024-01247-6.
  6. Ordway, Denise-Marie (2023-08-01). "5 tips for using PubPeer to report on research and the scientific community". The Journalist's Resource. Retrieved 2024-08-31.
  7. Barbour, Boris; Stell, Brandon M. (2020-01-28), Biagioli, Mario; Lippman, Alexandra (eds.), "PubPeer: Scientific Assessment Without Metrics", Gaming the Metrics, The MIT Press, pp. 149–156, doi:10.7551/mitpress/11087.003.0015, ISBN 978-0-262-35656-5, retrieved 2024-08-31.
  8. "Leading diabetes researcher corrects paper as more than a dozen studies are questioned on PubPeer". Retraction Watch. 12 January 2015. Retrieved 17 May 2017.
  9. Torny, Didier (February 2018). Pubpeer: vigilante science, journal club or alarm raiser? The controversies over anonymity in post-publication peer review. International Conference on Peer Review.
  10. Teixeira da Silva, Jaime A. (2018-01-01). "The opacity of the PubPeer Foundation: what PubPeer's "About" page tells us". Online Information Review. 42 (2): 282–287. doi:10.1108/OIR-06-2017-0191. ISSN 1468-4527.
  11. Paul Jump (13 November 2014). "Can post-publication peer review endure?". Times Higher Education. Retrieved 5 December 2014.
  12. "PubPeer's first legal threat" (blog). 24 August 2014. Retrieved 5 December 2014.
  13. "PubPeer - How to comment on PubPeer". pubpeer. Archived from the original on 15 November 2016. Retrieved 17 May 2017.
  14. Teixeira da Silva, Jaime A. (2018-07-01). "The Issue of Comment Ownership and Copyright at PubPeer". Journal of Educational Media & Library Sciences. 55 (2): 227–237. doi:10.6120/JoEMLS.201807_55(2).e001.BC.BE.
  15. Ortega, José Luis (May 2022). "Classification and analysis of PubPeer comments: How a web journal club is used". Journal of the Association for Information Science and Technology. 73 (5): 655–670. doi:10.1002/asi.24568. ISSN 2330-1635.
  16. Ortega, José-Luis; Delgado-Quirós, Lorena (2023-01-23). "How do journals deal with problematic articles. Editorial response of journals to articles commented in PubPeer". Profesional de la información. 32 (1). doi:10.3145/epi.2023.ene.18. hdl:10261/362437. ISSN 1699-2407.
