Scientific misconduct

Scientific misconduct is the violation of the standard codes of scholarly conduct and ethical behavior in the publication of professional scientific research. It is violation of scientific integrity: violation of the scientific method and of research ethics in science, including in the design, conduct, and reporting of research.

A Lancet review on the handling of scientific dishonesty in the Nordic countries provides sample definitions, [1] reproduced in The COPE Report 1999. [2]

The consequences of scientific misconduct can be damaging for perpetrators, for the journal's audience, [3] [4] and for any individual who exposes it. [5] In addition, there are public health implications attached to the promotion of medical or other interventions based on false or fabricated research findings.

Three percent of the 3,475 research institutions that report to the US Department of Health and Human Services' Office of Research Integrity (ORI) indicate some form of scientific misconduct. [6] However, the ORI will only investigate allegations of impropriety where research was funded by federal grants. It routinely monitors such research publications for red flags, and its investigations are subject to a statute of limitations. Other private organizations, such as the Committee of Medical Journal Editors (COJE), can only police their own members. [7]

Motivation

According to David Goodstein of Caltech, there are motivators for scientists to commit misconduct, which are briefly summarised here. [8]

Career pressure
Science is still a very strongly career-driven discipline. Scientists depend on a good reputation to receive ongoing support and funding, and a good reputation relies largely on the publication of high-profile scientific papers. Hence, there is a strong imperative to "publish or perish". Clearly, this may motivate desperate (or fame-hungry) scientists to fabricate results.
Ease of fabrication
In many scientific fields, results are often difficult to reproduce accurately, being obscured by noise, artifacts, and other extraneous data. That means that even if a scientist does falsify data, they can expect to get away with it – or at least claim innocence if their results conflict with others in the same field. There are few strongly backed systems to investigate possible violations, attempt to press charges, or punish deliberate misconduct. It is relatively easy to cheat, although it is difficult to know exactly how many scientists fabricate data. [9]
Monetary gain
In many scientific fields, the most lucrative options for professionals often involve selling expert opinions. Corporations can pay experts to support products directly, or indirectly via conferences. Psychologists can make money by repeatedly acting as an expert witness in custody proceedings for the same law firms.

Forms

The U.S. National Science Foundation defines three types of research misconduct: fabrication, falsification, and plagiarism. [10] [11]

Other types of research misconduct are also recognized:

Photo manipulation

Compared to other forms of scientific misconduct, image fraud (manipulation of images to distort their meaning) is of particular interest since it can frequently be detected by external parties. In 2006, the Journal of Cell Biology gained publicity for instituting tests to detect photo manipulation in papers that were being considered for publication. [28] This was in response to the increased usage of programs such as Adobe Photoshop by scientists, which facilitate photo manipulation. Since then more publishers, including the Nature Publishing Group, have instituted similar tests and require authors to minimize and specify the extent of photo manipulation when a manuscript is submitted for publication. However, there is little evidence to indicate that such tests are applied rigorously. One Nature paper published in 2009 [29] has subsequently been reported to contain around 20 separate instances [30] of image fraud.

Although the type of manipulation that is allowed can depend greatly on the type of experiment that is presented and also differ from one journal to another, in general the following manipulations are not allowed: [31] [32]

Image manipulations are typically done on visually repetitive images such as those of blots and microscope images. [33]
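
As a rough illustration of why such repetitive images lend themselves to automated screening, the sketch below flags exactly duplicated tiles within a single grayscale image. It is a minimal, assumed example (the function name, the tile size, and the use of NumPy are illustrative choices), not the procedure used by any journal's actual screening workflow, which involves far more sophisticated forensic analysis.

```python
# A minimal, illustrative screen (not any journal's actual workflow):
# flag exactly duplicated tiles within a single grayscale image array,
# the kind of repetition sometimes seen in manipulated blot panels.
import hashlib

import numpy as np


def find_duplicate_tiles(image: np.ndarray, tile: int = 16):
    """Return groups of (row, col) tile origins whose pixel content is identical."""
    seen = {}
    height, width = image.shape  # assumes a 2-D grayscale array
    for y in range(0, height - tile + 1, tile):
        for x in range(0, width - tile + 1, tile):
            block = image[y:y + tile, x:x + tile]
            key = hashlib.sha1(block.tobytes()).hexdigest()
            seen.setdefault(key, []).append((y, x))
    # Only tiles that appear in more than one place are worth a closer look.
    return [locations for locations in seen.values() if len(locations) > 1]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.integers(0, 256, size=(128, 128), dtype=np.uint8)
    img[64:80, 64:80] = img[0:16, 0:16]  # simulate a copied-and-pasted band
    print(find_duplicate_tiles(img))     # e.g. [[(0, 0), (64, 64)]]
```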

Helicopter research

Neo-colonial research or neo-colonial science, [34] [35] frequently described as helicopter research, [34] parachute science [36] [37] or parachute research, [38] parasitic research, [39] [40] or safari study, [41] occurs when researchers from wealthier countries travel to a developing country, collect information, return to their own country, analyze the data and samples, and publish the results with little or no involvement of local researchers. A 2003 study by the Hungarian Academy of Sciences found that 70% of articles in a random sample of publications about least-developed countries did not include a local research co-author. [35]

Frequently, during this kind of research, local colleagues might be used to provide logistical support as fixers but are not engaged for their expertise or given credit for their participation in the research. Scientific publications resulting from parachute science frequently contribute only to the careers of the scientists from rich countries, thus limiting the development of local science capacity (such as funded research centers) and the careers of local scientists. [34] This form of "colonial" science has reverberations of 19th-century scientific practices of treating non-Western participants as "others" in order to advance colonialism – and critics call for the end of these extractivist practices in order to decolonize knowledge. [42] [43]

This kind of research approach reduces the quality of research because international researchers may not ask the right questions or draw connections to local issues. [44] The result of this approach is that local communities are unable to leverage the research to their own advantage. [37] Ultimately, especially for fields dealing with global issues like conservation biology which rely on local communities to implement solutions, neo-colonial science prevents institutionalization of the findings in local communities in order to address issues being studied by scientists. [37] [42]

Responsibilities

Authorship responsibility

All authors of a scientific publication are expected to have made reasonable attempts to check findings submitted to academic journals for publication.

Simultaneous submission of scientific findings to more than one journal or duplicate publication of findings is usually regarded as misconduct, under what is known as the Ingelfinger rule, named after the editor of The New England Journal of Medicine 1967–1977, Franz Ingelfinger. [45]

Guest authorship (where there is stated authorship in the absence of involvement, also known as gift authorship) and ghost authorship (where the real author is not listed as an author) are commonly regarded as forms of research misconduct. In some cases coauthors of faked research have been accused of inappropriate behavior or research misconduct for failing to verify reports authored by others or by a commercial sponsor. Examples include the case of Gerald Schatten, who co-authored with Hwang Woo-Suk; the case of Professor Geoffrey Chamberlain, named as guest author of papers fabricated by Malcolm Pearce [46] (Chamberlain was exonerated of collusion in Pearce's deception); [47] and Jan Hendrik Schön's coauthors at Bell Laboratories. More recent cases include that of Charles Nemeroff, [48] then the editor-in-chief of Neuropsychopharmacology, and a well-documented case involving the drug Actonel. [49]

Authors are expected to keep all study data for later examination, even after publication. The failure to keep data may be regarded as misconduct. Some scientific journals require that authors provide information to allow readers to determine whether the authors might have commercial or non-commercial conflicts of interest. Authors are also commonly required to provide information about the ethical aspects of research, particularly where research involves human or animal participants or the use of biological material. Provision of incorrect information to journals may be regarded as misconduct. Financial pressures on universities have encouraged this type of misconduct. The majority of recent cases of alleged misconduct involving undisclosed conflicts of interest, or failure of authors to have seen the underlying scientific data, involve collaborative research between scientists and biotechnology companies. [48] [50]

Research institution responsibility

In general, defining whether an individual is guilty of misconduct requires a detailed investigation by the individual's employing academic institution. Such investigations require detailed and rigorous processes and can be extremely costly. Furthermore, the more senior the individual under suspicion, the more likely it is that conflicts of interest will compromise the investigation. In many countries (with the notable exception of the United States) acquisition of funds on the basis of fraudulent data is not a legal offence and there is consequently no regulator to oversee investigations into alleged research misconduct. Universities therefore have few incentives to investigate allegations in a robust manner, or act on the findings of such investigations if they vindicate the allegation.

Well-publicised cases illustrate the potential role that senior academics in research institutions play in concealing scientific misconduct. An internal investigation at King's College London showed research findings from one of their researchers to be 'at best unreliable, and in many cases spurious', [51] but the college took no action, such as retracting relevant published research or preventing further episodes from occurring.

In a more recent case, [52] an internal investigation at the National Centre for Cell Science (NCCS), Pune, determined that there was evidence of misconduct by Gopal Kundu, but an external committee was then organised which dismissed the allegation, and the NCCS issued a memorandum exonerating the authors of all charges of misconduct. Undeterred by the NCCS exoneration, the relevant journal (Journal of Biological Chemistry) withdrew the paper based on its own analysis.

Scientific peer responsibility

Some academics believe that scientific colleagues who suspect scientific misconduct should consider taking informal action themselves, or reporting their concerns. [53] This question is of great importance since much research suggests that it is very difficult for people to act or come forward when they see unacceptable behavior, unless they have help from their organizations. A "User-friendly Guide," and the existence of a confidential organizational ombudsman may help people who are uncertain about what to do, or afraid of bad consequences for their speaking up. [54]

Responsibility of journals

Journals are responsible for safeguarding the research record and hence have a critical role in dealing with suspected misconduct. This is recognised by the Committee on Publication Ethics (COPE), which has issued clear guidelines [55] on the form (e.g. retraction) that concerns over the research record should take.

Evidence emerged in 2012 that journals which learn of cases where there is strong evidence of possible misconduct, with issues potentially affecting a large portion of the findings, frequently fail to issue an expression of concern or to correspond with the host institution so that an investigation can be undertaken. In one case, the Journal of Clinical Oncology issued a correction despite strong evidence that the original paper was invalid. [56] In another case, [29] Nature allowed a corrigendum to be published despite clear evidence of image fraud. Subsequent retraction of the paper required the actions of an independent whistleblower. [57]

The cases of Joachim Boldt and Yoshitaka Fujii [58] in anaesthesiology focussed attention on the role that journals play in perpetuating scientific fraud as well as how they can deal with it. In the Boldt case, the editors-in-chief of 18 specialist journals (generally anaesthesia and intensive care) made a joint statement regarding 88 published clinical trials conducted without Ethics Committee approval. In the Fujii case, involving nearly 200 papers, the journal Anesthesia & Analgesia, which published 24 of Fujii's papers, has accepted that its handling of the issue was inadequate. Following publication of a letter to the editor from Kranke and colleagues in April 2000, [59] along with a non-specific response from Dr. Fujii, there was no follow-up on the allegation of data manipulation and no request for an institutional review of Dr. Fujii's research. Anesthesia & Analgesia went on to publish 11 additional manuscripts by Dr. Fujii following the 2000 allegations of research fraud, with Editor Steven Shafer stating [60] in March 2012 that subsequent submissions to the Journal by Dr. Fujii should not have been published without first vetting the allegations of fraud. In April 2012 Shafer led a group of editors to write a joint statement, [61] in the form of an ultimatum made available to the public, to a large number of academic institutions where Fujii had been employed, offering these institutions the chance to attest to the integrity of the bulk of the allegedly fraudulent papers.

Consequences of scientific misconduct

Consequences for science

The consequences of scientific fraud vary based on the severity of the fraud, the level of notice it receives, and how long it goes undetected. For cases of fabricated evidence, the consequences can be wide-ranging, with others working to confirm (or refute) the false finding, or with research agendas being distorted to address the fraudulent evidence. The Piltdown Man fraud is a case in point: The significance of the bona-fide fossils that were being found was muted for decades because they disagreed with Piltdown Man and the preconceived notions that those faked fossils supported. In addition, the prominent paleontologist Arthur Smith Woodward spent time at Piltdown each year until he died, trying to find more Piltdown Man remains. The misdirection of resources kept others from taking the real fossils more seriously and delayed the reaching of a correct understanding of human evolution. (The Taung Child, which should have been the death knell for the view that the human brain evolved first, was instead treated very critically because of its disagreement with the Piltdown Man evidence.)

In the case of Prof Don Poldermans, the misconduct occurred in reports of trials of treatment to prevent death and myocardial infarction in patients undergoing operations. [62] The trial reports were relied upon to issue guidelines that applied for many years across North America and Europe. [63]

In the case of Dr Alfred Steinschneider, two decades and tens of millions of research dollars were lost trying to find the elusive link between infant sleep apnea, which Steinschneider said he had observed and recorded in his laboratory, and sudden infant death syndrome (SIDS), of which he claimed apnea was a precursor. The cover was blown in 1994, 22 years after Steinschneider's 1972 Pediatrics paper claiming such an association, [64] when Waneta Hoyt, the mother of the patients in the paper, was arrested, indicted and convicted on five counts of second-degree murder for the smothering deaths of her five children. [65] The paper, presumably written as an attempt to save infants' lives, was ironically used as a defense by parents suspected in the deaths of multiple of their own children in cases of Münchausen syndrome by proxy. The 1972 Pediatrics paper was cited in 404 papers in the interim and is still listed on PubMed without comment. [66]

Consequences for those who expose misconduct

The potentially severe consequences for individuals who are found to have engaged in misconduct also reflect on the institutions that host or employ them, and on the participants in any peer review process that has allowed the publication of questionable research. This means that a range of actors in any case may have a motivation to suppress any evidence or suggestion of misconduct. Persons who expose such cases, commonly called whistleblowers, find themselves open to retaliation by a number of different means. [46] These negative consequences for exposers of misconduct have driven the development of whistleblower charters, designed to protect those who raise concerns (for more details, refer to retaliation (law)).

Regulatory violations and consequences (example)

Under the U.S. Nuclear Regulatory Commission (NRC) regulations, 10 CFR 50.5, Deliberate Misconduct, prohibits certain activities by individuals involved in NRC-licensed activities and is designed to ensure the safety and integrity of nuclear operations. 10 CFR 50.9, Completeness and Accuracy of Information, sets out the requirements for providing information and data to the NRC. The intent of 10 CFR 50.5 is to deter and penalize intentional wrongdoing, while 10 CFR 50.9 is crucial in maintaining transparency and reliability in the nuclear industry, emphasizing honesty and integrity in maintaining the safety and security of nuclear operations. Providing false or misleading information or data to the NRC is therefore a violation of 10 CFR 50.9.

Violation of any of these rules can lead to severe penalties, including termination, fines and criminal prosecution. It can also result in the revocation of licenses or certifications, thereby barring individuals or entities from participating in any NRC-licensed activities in the future.

Data issues

Exposure of fraudulent data

With the advancement of the internet, there are now several tools available to aid in the detection of plagiarism and multiple publication within the biomedical literature. One tool, developed in 2006 by researchers in Dr. Harold Garner's laboratory at the University of Texas Southwestern Medical Center at Dallas, is Déjà vu, [67] an open-access database containing several thousand instances of duplicate publication. All of the entries in the database were discovered through the use of the text data-mining algorithm eTBLAST, also created in Dr. Garner's laboratory. The creation of Déjà vu [68] and the subsequent classification of several hundred articles contained therein have ignited much discussion in the scientific community concerning issues such as ethical behavior, journal standards, and intellectual copyright. Studies on this database have been published in journals such as Nature and Science, among others. [69] [70]
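
The exact algorithm behind eTBLAST is not described here; the following is only a generic, assumed sketch of the kind of text-similarity screening that duplicate-publication tools rely on. It compares word-frequency vectors of abstracts using cosine similarity and flags pairs above a threshold; the function names, the cutoff value, and the toy abstracts are all illustrative.

```python
# Illustrative only (this is not eTBLAST): flag pairs of abstracts whose
# word-frequency vectors are nearly identical, a crude stand-in for the
# similarity search used to populate duplicate-publication databases.
import math
import re
from collections import Counter
from itertools import combinations


def cosine_similarity(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(count * b[term] for term, count in a.items())
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0


def flag_possible_duplicates(abstracts: dict, threshold: float = 0.9):
    """Return (id_a, id_b, similarity) for every pair of abstracts above the threshold."""
    vectors = {key: Counter(re.findall(r"[a-z]+", text.lower())) for key, text in abstracts.items()}
    flagged = []
    for a, b in combinations(vectors, 2):
        similarity = cosine_similarity(vectors[a], vectors[b])
        if similarity >= threshold:
            flagged.append((a, b, round(similarity, 3)))
    return flagged


if __name__ == "__main__":
    sample = {
        "paper_A": "We measured the effect of drug X on cell growth in vitro.",
        "paper_B": "We measured the effect of drug X on cell growth in vitro and in vivo.",
        "paper_C": "A survey of helicopter research practices in conservation biology.",
    }
    print(flag_possible_duplicates(sample, threshold=0.8))  # flags the A/B pair only
```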

Other tools which may be used to detect fraudulent data include error analysis. Measurements generally have a small amount of error, and repeated measurements of the same item will generally result in slight differences in readings. These differences can be analyzed, and they follow certain known mathematical and statistical properties. Should a set of data appear to be too faithful to the hypothesis, that is, if the amount of error that would normally be present in such measurements does not appear, a conclusion can be drawn that the data may have been forged. Error analysis alone is typically not sufficient to prove that data have been falsified or fabricated, but it may provide the supporting evidence necessary to confirm suspicions of misconduct.
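
A minimal sketch of this idea follows, assuming SciPy is available; the function name, significance level, and example numbers are illustrative assumptions rather than a method taken from the text above. It asks whether the variance of a set of replicate measurements is implausibly small given a known measurement error: under honest noise, (n - 1)s²/σ² follows a chi-square distribution with n - 1 degrees of freedom, so a very small value of that statistic is a warning sign.

```python
# A minimal sketch, assuming SciPy is available: test whether replicate
# measurements show suspiciously little scatter relative to a known
# measurement error sigma. Under honest noise, (n - 1) * s^2 / sigma^2
# follows a chi-square distribution with n - 1 degrees of freedom.
import numpy as np
from scipy import stats


def too_good_to_be_true(readings, sigma, alpha=0.01):
    """Return (flagged, p_low): flagged is True when the observed variance
    is improbably small at significance level alpha."""
    readings = np.asarray(readings, dtype=float)
    n = readings.size
    statistic = (n - 1) * readings.var(ddof=1) / sigma**2
    p_low = stats.chi2.cdf(statistic, df=n - 1)  # chance of scatter this small or smaller
    return p_low < alpha, p_low


if __name__ == "__main__":
    sigma = 0.5                                             # assumed known instrument error
    honest = [9.8, 10.6, 9.4, 10.3, 10.1, 9.5]              # plausible scatter
    suspicious = [10.01, 10.02, 9.99, 10.00, 10.01, 9.98]   # implausibly tight
    print(too_good_to_be_true(honest, sigma))      # (False, p around 0.5)
    print(too_good_to_be_true(suspicious, sigma))  # (True, p far below 0.01)
```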

Data sharing

Kirby Lee and Lisa Bero suggest, "Although reviewing raw data can be difficult, time-consuming and expensive, having such a policy would hold authors more accountable for the accuracy of their data and potentially reduce scientific fraud or misconduct." [71]

Underreporting

The vast majority of cases of scientific misconduct may not be reported. The number of article retractions in 2022 was nearly 5,500, but Ivan Oransky and Adam Marcus, co-founders of Retraction Watch, estimate that at least 100,000 retractions should occur every year, with only about one in five being due to "honest error". [72]

Some notable cases

In 1998 Andrew Wakefield published a fraudulent research paper in The Lancet claiming links between the MMR vaccine, autism, and inflammatory bowel disease. In 2010 he was found guilty of dishonesty in his research and banned from practising medicine by the UK General Medical Council following an investigation by Brian Deer of the London Sunday Times. [73]

The claims in Wakefield's paper were widely reported, [74] leading to a sharp drop in vaccination rates in the UK and Ireland and outbreaks of mumps and measles. Promotion of the claimed link continues to fuel the anti-vaccination movement.

In 2011 Diederik Stapel, a highly regarded Dutch social psychologist, turned out to have fabricated data in dozens of studies on human behaviour. [75] He has been called "the biggest con man in academic science". [76]

In 2020, early in the COVID-19 pandemic, Sapan Desai and his coauthors published two papers in the prestigious medical journals The Lancet and The New England Journal of Medicine. The papers were based on a very large dataset provided by Surgisphere, a company owned by Desai. The dataset was exposed as a fabrication, and the papers were soon retracted. [77] [78]

Solutions

Changing research assessment

Since 2012, the San Francisco Declaration on Research Assessment (DORA) has gathered many institutions, publishers, and individuals committed to improving the metrics used to assess research and to ending the focus on the journal impact factor. [79]

See also

Related Research Articles

Scientific literature

Scientific literature comprises academic papers that report original empirical and theoretical work in the natural and social sciences. Within a field of research, relevant papers are often referred to as "the literature". Academic publishing is the process of contributing the results of one's research into the literature, which often requires a peer-review process.

In academic publishing, a retraction is a mechanism by which a published paper in an academic journal is flagged for being seriously flawed to the extent that its results and conclusions can no longer be relied upon. Retracted articles are not removed from the published literature but are marked as retracted. In some cases it may be necessary to remove an article from publication, such as when the article is clearly defamatory, violates personal privacy, is the subject of a court order, or might pose a serious health risk to the general public.

John Roland Darsee is an American physician and former medical researcher. After compiling an impressive list of publications in reputable scientific journals, he was found to have fabricated data for his publications.

In scientific inquiry and academic research, data fabrication is the intentional misrepresentation of research results. As with other forms of scientific misconduct, it is the intent to deceive that marks fabrication as unethical, and thus different from scientists deceiving themselves. There are many ways data can be fabricated. Experimental data can be fabricated by reporting experiments that were never conducted, and accurate data can be manipulated or misrepresented to suit a desired outcome. One of the biggest problems with this form of scientific fraud is that "university investigations into research misconduct are often inadequate, opaque and poorly conducted. They challenge the idea that institutions can police themselves on research integrity."

Research ethics is a discipline within the study of applied ethics. Its scope ranges from general scientific integrity and misconduct to the treatment of human and animal subjects. The societal responsibilities of science and research are not traditionally included and are less well defined.

Jon Sudbø is a Norwegian dentist, physician, and former medical researcher, who was exposed as a scientific fraudster in 2006. Over a period of several years, he fabricated results in the field of oncology which he published in leading medical journals. The article that led to his downfall, which was published in The Lancet, was based on 900 patients Sudbø had fabricated entirely. The editor of The Lancet described this as the biggest scientific fraud conducted by a single researcher ever.

Scientific writing is writing about science, with an implication that the writing is by scientists and for an audience that primarily includes peers – those with sufficient expertise to follow in detail. Scientific writing is a specialized form of technical writing, and a prominent genre of it involves reporting about scientific studies, such as in articles for a scientific journal. Other scientific writing genres include writing literature-review articles, which summarize the existing state of a given aspect of a scientific field, and writing grant proposals, which are a common means of obtaining funding to support scientific research. Scientific writing is more likely to focus on the pure sciences compared to other aspects of technical communication that are more applied, although there is overlap. There is no single style for citations and references in scientific writing; whether for a grant proposal, a literature-review article, or a journal submission, the citation system to be used depends on the publication to which the work will be submitted.

Academic authorship of journal articles, books, and other original works is a means by which academics communicate the results of their scholarly work, establish priority for their discoveries, and build their reputation among their peers.

A lack of oversight and a lack of proper training for scientists have led to the rise of plagiarism and research misconduct in India. India does not have a statutory body to deal with scientific misconduct in academia, like the Office of Research Integrity in the US, and hence cases of plagiarism are often dealt with in an ad-hoc fashion, with different routes being followed in different cases. In most cases, a public and media outcry leads to an investigation either by institutional authorities or by independent enquiry committees. Plagiarists have in some cases been suspended, removed or demoted. However, no fixed route has been prescribed to monitor such activities. This has led to calls for the establishment of an independent ethics body.

Medical ghostwriters are employed by pharmaceutical companies and medical-device manufacturers to produce apparently independent manuscripts for peer-reviewed journals, conference presentations and other communications. Physicians and other scientists are paid to attach their names to the manuscripts as though they had authored them. The named authors may have had little or no involvement in the research or writing process.

Paolo Macchiarini is a disgraced thoracic surgeon and former regenerative medicine researcher who became known for research fraud and manipulative behavior. He was convicted of research-related crimes in Italy and Sweden.

Carlo Maria Croce is an Italian-American professor of medicine at Ohio State University, specializing in oncology and the molecular mechanisms underlying cancer. Croce and his research have attracted public attention because of multiple allegations of scientific misconduct.

Research integrity or scientific integrity is an aspect of research ethics that deals with best practice or rules of professional practice of scientists.

Yoshitaka Fujii is a Japanese researcher in anesthesiology, who in 2012 was found to have fabricated data in at least 219 scientific papers, of which 183 have been retracted.

Annarosa Leri is a medical doctor and former associate professor at Harvard University. Along with former professor Piero Anversa, Leri was engaged in biomedical research at Brigham and Women’s Hospital in Boston, an affiliate of Harvard Medical School. Since at least 2003 Anversa and Leri had investigated the ability of the heart to regenerate damaged cells using cardiac stem cells.

Conflicts of interest in academic publishing

Conflicts of interest (COIs) often arise in academic publishing. Such conflicts may cause wrongdoing and make it more likely. Ethical standards in academic publishing exist to avoid and deal with conflicts of interest, and the field continues to develop new standards. Standards vary between journals and are unevenly applied. According to the International Committee of Medical Journal Editors, "[a]uthors have a responsibility to evaluate the integrity, history, practices and reputation of the journals to which they submit manuscripts".

The Lancet MMR autism fraud centered on the publication in February 1998 of a fraudulent research paper titled "Ileal-lymphoid-nodular hyperplasia, non-specific colitis, and pervasive developmental disorder in children" in The Lancet. The paper, authored by the now discredited and deregistered Andrew Wakefield and twelve coauthors, falsely claimed causative links between the measles, mumps, and rubella (MMR) vaccine and colitis, and between colitis and autism. The fraud was exposed in a lengthy Sunday Times investigation by reporter Brian Deer, resulting in the paper's retraction in February 2010 and Wakefield being struck off the UK medical register three months later. Wakefield reportedly stood to earn up to US$43 million per year selling diagnostic kits for a non-existent syndrome he claimed to have discovered. He also held a patent on a rival vaccine at the time, and he had been employed by a lawyer representing parents in lawsuits against vaccine producers.

Abida Sophie Jamal is a Canadian endocrinologist and former osteoporosis researcher who was at the centre of a scientific misconduct case in the mid-to-late 2010s. Jamal published a high-profile paper suggesting that the heart medication nitroglycerin was a treatment for osteoporosis, and was later demonstrated to have misrepresented her results. She received a lifetime ban from receiving funding from the Canadian Institutes of Health Research and was named directly in their disclosure report, becoming the first person mentioned by name by the institute for scientific misconduct. Jamal was later stripped of her medical license for two years, regaining it in a controversial 3–2 decision.

In research, a paper mill is a business that publishes poor or fake journal papers that seem to resemble genuine research, as well as sells authorship.

References

  1. Nylenna, M.; Andersen, D.; Dahlquist, G.; Sarvas, M.; Aakvaag, A. (1999). "Handling of scientific dishonesty in the Nordic countries. National Committees on Scientific Dishonesty in the Nordic Countries". Lancet. 354 (9172): 57–61. doi:10.1016/S0140-6736(98)07133-5. PMID   10406378. S2CID   36326829.
  2. "Coping with fraud" (PDF). The COPE Report 1999: 11–18. Archived from the original (PDF) on 2007-09-28. Retrieved 2006-09-02. It is 10 years, to the month, since Stephen Lock ... Reproduced with kind permission of the Editor, The Lancet.
  3. Xie, Yun (2008-08-12). "What are the consequences of scientific misconduct?". Ars Technica. Retrieved 2013-03-01.
  4. Redman, B. K.; Merz, J. F. (2008). "SOCIOLOGY: Scientific Misconduct: Do the Punishments Fit the Crime?" (PDF). Science. 321 (5890): 775. doi:10.1126/science.1158052. PMID   18687942. S2CID   206512870.
  5. "Consequences of Whistleblowing for the Whistleblower in Misconduct in Science Cases". Research Triangle Institute. 1995. Archived from the original (PDF) on 2017-08-24. Retrieved 2012-05-24.
  6. Singh, Dr. Yatendra Kumar; Kumar Dubey, Bipin (2021). Introduction of Research Methods and Publication Ethics. New Delhi: Friends Publications (India). p. 90. ISBN   978-93-90649-38-9.
  7. Part III. Department of Health and Human Services Archived 2021-10-22 at the Wayback Machine
  8. Goodstein, David (January–February 2002). "Scientific misconduct". Academe. 88 (1): 28–31. doi:10.2307/40252116. JSTOR   40252116.
  9. Fanelli, D. (2009). Tregenza, Tom (ed.). "How Many Scientists Fabricate and Falsify Research? A Systematic Review and Meta-Analysis of Survey Data". PLOS ONE. 4 (5): e5738. Bibcode:2009PLoSO...4.5738F. doi: 10.1371/journal.pone.0005738 . PMC   2685008 . PMID   19478950.
  10. "New Research Misconduct Policies" (PDF). NSF. Archived from the original (PDF) on 2012-09-10. Retrieved 2013-03-01.
  11. 45 CFR Part 689 Archived 2008-10-23 at the Wayback Machine
  12. Shapiro, M.F. (1992). "Data audit by a regulatory agency: Its effect and implication for others". Accountability in Research. 2 (3): 219–229. doi:10.1080/08989629208573818. PMID   11653981.
  13. Emmeche, slide 5
  14. Garfield, Eugene (January 21, 2002). "Demand Citation Vigilance". The Scientist. 16 (2): 6. Retrieved 2009-07-30.
  15. Emmeche, slide 3, who refers to the phenomenon as Dulbecco's law.
  16. "Publication Ethics Policies for Medical Journals". The World Association of Medical Editors. Archived from the original on 2009-07-31. Retrieved 2009-07-30.
  17. "ICMJE – Home". www.icmje.org. Retrieved 3 April 2018.
  18. "Publication Ethics Policies for Medical Journals". The World Association of Medical Editors. Archived from the original on 2009-07-31. Retrieved 2009-07-30.
  19. Kwok, L. S. (2005). "The White Bull effect: Abusive coauthorship and publication parasitism". Journal of Medical Ethics. 31 (9): 554–556. doi:10.1136/jme.2004.010553. PMC   1734216 . PMID   16131560.
  20. Bates, T.; Anić, A.; Marusić, M.; Marusić, A. (2004). "Authorship Criteria and Disclosure of Contributions: Comparison of 3 General Medical Journals with Different Author Contribution Forms". JAMA. 292 (1): 86–88. doi:10.1001/jama.292.1.86. PMID   15238595.
  21. Bhopal, R.; Rankin, J.; McColl, E.; Thomas, L.; Kaner, E.; Stacy, R.; Pearson, P.; Vernon, B.; Rodgers, H. (1997). "The vexed question of authorship: Views of researchers in a British medical faculty". BMJ. 314 (7086): 1009–1012. doi:10.1136/bmj.314.7086.1009. PMC   2126416 . PMID   9112845.
  22. Wager, E. (2007). "Do medical journals provide clear and consistent guidelines on authorship?". MedGenMed. 9 (3): 16. PMC   2100079 . PMID   18092023.
  23. Wren, Jonathan D; Valencia, Alfonso; Kelso, Janet (15 September 2019). "Reviewer-coerced citation: case report, update on journal policy and suggestions for future prevention". Bioinformatics. 35 (18): 3217–3218. doi:10.1093/bioinformatics/btz071. PMC   6748764 . PMID   30698640.
  24. Chaplain, Mark; Kirschner, Denise; Iwasa, Yoh (March 2020). "JTB Editorial Malpractice: A Case Report". Journal of Theoretical Biology. 488: 110171. Bibcode:2020JThBi.48810171C. doi:10.1016/j.jtbi.2020.110171. PMID   32007131.
  25. de La Blanchardière A, Barde F, Peiffer-Smadja N, Maisonneuve H (June 2021). "Revues prédatrices: une vraie menace pour la recherche médicale. 2 Evaluer leurs conséquences et engager une riposte" [Predatory journals: A real threat for medical research. 2 Assess their consequences and initiate a response]. Rev Med Interne (in French). 42 (6): 427–433. doi:10.1016/j.revmed.2021.03.327. PMID   33836895. S2CID   241560050.
  26. Yeo-Teh NS, Tang BL (October 2021). "Wilfully submitting to and publishing in predatory journals – a covert form of research misconduct?". Biochem Med (Zagreb). 31 (3): 395–402. doi:10.11613/BM.2021.030201. PMC   8340504 . PMID   34393593.
  27. Brown, C. (2005) Overcoming Barriers to Use of Promising Research Among Elite Middle East Policy Groups, Journal of Social Behaviour and Personality, Select Press.
  28. Nicholas Wade (2006-01-24). "It May Look Authentic; Here's How to Tell It Isn't". New York Times . Retrieved 2010-04-01.
  29. Kim, M. S.; Kondo, T.; Takada, I.; Youn, M. Y.; Yamamoto, Y.; Takahashi, S.; Matsumoto, T.; Fujiyama, S.; Shirode, Y.; Yamaoka, I.; Kitagawa, H.; Takeyama, K. I.; Shibuya, H.; Ohtake, F.; Kato, S. (2009). "DNA demethylation in hormone-induced transcriptional derepression". Nature. 461 (7266): 1007–1012. Bibcode:2009Natur.461.1007K. doi:10.1038/nature08456. PMID   19829383. S2CID   4426439. (Retracted, see doi:10.1038/nature11164)
  30. 11jigen (2012-01-15). "Shigeaki Kato (the University of Tokyo): DNA demethylation in hormone-induced transcriptional derepression". Katolab-imagefraud.blogspot.co.uk. Retrieved 2013-08-04.
  31. "Restrictions of Image Manipulation" (PDF). AMED. Retrieved April 22, 2024.
  32. "Editorial Policies". rupress.org. Retrieved 2024-04-22.
  33. Ritchie, Stuart (2021-07-02). "Why Are Gamers So Much Better Than Scientists at Catching Fraud?". The Atlantic. Retrieved 2021-07-19.
  34. Minasny, Budiman; Fiantis, Dian; Mulyanto, Budi; Sulaeman, Yiyi; Widyatmanti, Wirastuti (2020-08-15). "Global soil science research collaboration in the 21st century: Time to end helicopter research". Geoderma. 373: 114299. Bibcode:2020Geode.373k4299M. doi:10.1016/j.geoderma.2020.114299. ISSN   0016-7061.
  35. Dahdouh-Guebas, Farid; Ahimbisibwe, J.; Van Moll, Rita; Koedam, Nico (2003-03-01). "Neo-colonial science by the most industrialised upon the least developed countries in peer-reviewed publishing". Scientometrics. 56 (3): 329–343. doi:10.1023/A:1022374703178. ISSN   1588-2861. S2CID   18463459.
  36. "Q&A: Parachute Science in Coral Reef Research". The Scientist Magazine®. Retrieved 2021-03-24.
  37. "The Problem With 'Parachute Science'". Science Friday. Retrieved 2021-03-24.
  38. "Scientists Say It's Time To End 'Parachute Research'". NPR.org. Retrieved 2021-03-24.
  39. Health, The Lancet Global (2018-06-01). "Closing the door on parachutes and parasites". The Lancet Global Health. 6 (6): e593. doi: 10.1016/S2214-109X(18)30239-0 . ISSN   2214-109X. PMID   29773111. S2CID   21725769.
  40. Smith, James (2018-08-01). "Parasitic and parachute research in global health". The Lancet Global Health. 6 (8): e838. doi: 10.1016/S2214-109X(18)30315-2 . ISSN   2214-109X. PMID   30012263. S2CID   51630341.
  41. "Helicopter Research". TheFreeDictionary.com. Retrieved 2021-03-24.
  42. Vos, Asha de. "The Problem of 'Colonial Science'". Scientific American. Retrieved 2021-03-24.
  43. "The Traces of Colonialism in Science". Observatory of Educational Innovation. Retrieved 2021-03-24.
  44. Stefanoudis, Paris V.; Licuanan, Wilfredo Y.; Morrison, Tiffany H.; Talma, Sheena; Veitayaki, Joeli; Woodall, Lucy C. (2021-02-22). "Turning the tide of parachute science". Current Biology. 31 (4): R184–R185. doi: 10.1016/j.cub.2021.01.029 . ISSN   0960-9822. PMID   33621503.
  45. Toy, Jennifer (2002). "The Ingelfinger Rule: Franz Ingelfinger at The New England Journal of Medicine 1967–77" (PDF). Science Editor . 25 (6): 195–198.
  46. Lock, S (June 17, 1995). "Lessons from the Pearce affair: handling scientific fraud". BMJ. 310 (6994): 1547–148. doi:10.1136/bmj.310.6994.1547. PMC   2549935. PMID   7787632. (registration required)
  47. "Independent Committee of Inquiry into the publication of articles in the British Journal of Obstetrics and Gynaecology (1994–1995)" . Retrieved 2011-08-26.
  48. "Journal editor quits in conflict scandal". The Scientist. Retrieved 3 April 2018.
  49. "Actonel Case Media Reports - Scientific Misconduct Wiki". Archived from the original on 2009-02-02. Retrieved 2008-03-22.
  50. Dickerson, John (2005-12-22). "Did a British university sell out to P&G?". Slate. Retrieved 2013-08-04.
  51. Wilmshurst P (2002). "Institutional corruption in medicine (2002)". British Medical Journal. 325 (7374): 1232–1235. doi:10.1136/bmj.325.7374.1232. PMC   1124696 . PMID   12446544.
  52. Jayaraman, K. S. (June 14, 2007). "Indian scientists battle journal retraction". Nature. 447 (7146): 764. Bibcode:2007Natur.447..764J. doi: 10.1038/447764a . PMID   17568715.
  53. See Gerald Koocher & Patricia Keith-Spiegel (22 July 2010). "Peers Nip Misconduct in the Bud". Nature. 466 (7305): 438–440. Bibcode:2010Natur.466..438K. doi:10.1038/466438a. PMID   20651674. S2CID   4396687. and (with Joan Sieber) Responding to Research Wrongdoing: A User Friendly Guide, July 2010.
  54. Rowe, Mary; Wilcox, Linda; Gadlin, Howard (2009). "Dealing with or Reporting 'Unacceptable' Behavior with additional thoughts about the 'Bystander Effect'" (PDF). Journal of the International Ombudsman Association. 2 (1): 52–64.
  55. Retraction Guidelines Archived 2020-03-26 at the Wayback Machine (PDF)
  56. Roman-Gomez, J.; Jimenez-Velasco, A.; Agirre, X.; Prosper, F.; Heiniger, A.; Torres, A. (2005). "Lack of CpG Island Methylator Phenotype Defines a Clinical Subtype of T-Cell Acute Lymphoblastic Leukemia Associated with Good Prognosis" (PDF). Journal of Clinical Oncology. 23 (28): 7043–7049. doi:10.1200/JCO.2005.01.4944. hdl: 10171/17316 . PMID   16192589.
  57. "Shikeagi Kato, who resigned post in March, retracts Nature paper". RetractionWatch . 2012-06-13. Retrieved 2013-03-01.
  58. "Major fraud probe of Japanese anesthesiologist Yoshitaka Fujii may challenge retraction record". RetractionWatch . 2012-03-08. Retrieved 2013-08-04.
  59. Kranke, P.; Apfel, C. C.; Roewer, N.; Fujii, Y. (2000). "Reported data on granisetron and postoperative nausea and vomiting by Fujii et al. Are incredibly nice!". Anesthesia and Analgesia. 90 (4): 1004–1007. doi: 10.1213/00000539-200004000-00053 . PMID   10735823.
  60. Fujii Statement of Concern Archived 2016-03-04 at the Wayback Machine (PDF)
  61. Fujii Join EIC Statement Archived 2016-03-04 at the Wayback Machine (PDF)
  62. Vogel, G. (30 January 2014). "Suspect Drug Research Blamed for Massive Death Toll". Science. 343 (6170): 473–474. Bibcode:2014Sci...343..473V. doi:10.1126/science.343.6170.473. PMID   24482457.
  63. Cole, G. D.; Francis, D. P. (29 August 2014). "Perioperative beta blockade: guidelines do not reflect the problems with the evidence from the DECREASE trials". BMJ. 349 (aug29 8): g5210. doi:10.1136/bmj.g5210. PMID   25172044. S2CID   13845087.
  64. Steinschneider A (October 1972). "Prolonged apnea and the sudden infant death syndrome: clinical and laboratory observations". Pediatrics . 50 (4): 646–654. doi:10.1542/peds.50.4.646. PMID   4342142. S2CID   8561269.
  65. Talan, Jamie; Firstman, Richard (1997). The death of innocents. New York: Bantam Books. ISBN   978-0553100136.
  66. Steinschneider, A (2013-03-25). "Prolonged apnea and the sudden infant death syndrome: clinical and laboratory observations". Pediatrics. 50 (4): 646–654. doi:10.1542/peds.50.4.646. PMID   4342142. S2CID   8561269.
  67. "Déjà vu: Medline duplicate publication database". dejavu.vbi.vt.edu. Archived from the original on 2015-04-25. Retrieved 2013-08-04.
  68. "Deja vu: Medline duplicate publication database". dejavu.vbi.vt.edu. Archived from the original on 2014-07-22. Retrieved 2013-08-04.
  69. Errami M; Garner HR (2008-01-23). "A tale of two citations". Nature . 451 (7177): 397–399. Bibcode:2008Natur.451..397E. doi: 10.1038/451397a . PMID   18216832. S2CID   4358525.
  70. Long TC; Errami M; George AC; Sun Z; Garner HR (2009-03-06). "Scientific Integrity: Responding to Possible Plagiarism". Science . 323 (5919): 1293–1294. doi:10.1126/science.1167408. PMID   19265004. S2CID   28467385.
  71. Lee, Kirby (2006). "Ethics: Increasing accountability". Nature . doi:10.1038/nature05007. Archived from the original on 2012-09-12. Retrieved 2010-08-16.
  72. Oransky, Ivan; Marcus, Adam (August 9, 2023). "There's far more scientific fraud than anyone wants to admit". The Guardian . Retrieved August 12, 2023.
  73. "Dr. Andrew Jeremy Wakefield: Determination on Serious Professional Misconduct (SPM) and Sanction" (PDF). General Medical Council. 24 May 2010. Archived from the original (PDF) on 9 August 2011. Retrieved 10 August 2011.
  74. Goldacre, B. (30 August 2008). "The MMR hoax". The Guardian. London. Retrieved 30 August 2008.
  75. Gretchen Vogel (October 31, 2011). "Report: Dutch 'Lord of the Data' Forged Dozens of Studies (UPDATE)". Science .
  76. Bhattacharjee, Yudhijit (2013-04-26). "The Mind of a Con Man". The New York Times.
  77. Mehra, Mandeep R.; Desai, Sapan S.; Kuy, SreyRam; Henry, Timothy D.; Patel, Amit N. (4 June 2020). "Retraction: Cardiovascular Disease, Drug Therapy, and Mortality in Covid-19. N Engl J Med. DOI: 10.1056/NEJMoa2007621". The New England Journal of Medicine. 382 (26): 2582. doi:10.1056/NEJMc2021225. PMC   7274164 . PMID   32501665.
  78. Mehra, Mandeep R; Ruschitzka, Frank; Patel, Amit N (5 June 2020). "Retraction—Hydroxychloroquine or chloroquine with or without a macrolide for treatment of COVID-19: a multinational registry analysis". The Lancet. 395 (10240): 1820. doi:10.1016/S0140-6736(20)31324-6. PMC   7274621 . PMID   32511943.
  79. "Read the Declaration". DORA. Retrieved 2022-06-07.

Further reading