This article lists notable open letters that were initiated by scientists or other academics or have a substantial share of academic signatories.
Open letters that are not open for signing by other academics or the general public and have not received both a large number of signatures – specifically, no fewer than 10 before 2000 and no fewer than 40 after 2010 – and substantial media attention are not included, nor are petitions. With the advent of the Internet and World Wide Web, such open letters may have become far more frequent.
Open letters targeting or defending individual academics or small groups of scholars [1] [2] as well as letters calling for retractions of specific studies are not included.
Name | Year | Signatures | Scope / topic | Demands | Organized by | Demand achieved | Criticized |
---|---|---|---|---|---|---|---|
Elsevier: NeuroImage transition – all editors have resigned over the high publication fee and are starting a new non-profit journal, Imaging Neuroscience [3] [4] [5] [6] [7] | 2023 | 42 academics | Global | All editors of NeuroImage and NeuroImage: Reports resigned because they opposed the high fee ($3,450) that the journal's owner, publishing giant Elsevier, charges authors to make their scientific papers open access (OA). After Elsevier refused to reduce the fee, they launched a new OA journal, Imaging Neuroscience, hosted by the non-profit publisher MIT Press. [8] They committed to making the new journal replace the one they had abandoned as "the top journal in [their] field" and hoped to demonstrate "the way forward in non-profit publishing". [9] [8] | (Group action of the full editorial team) | DA | No |
Pause Giant AI Experiments: An Open Letter [10] [11] [12] [13] | 2023 | >1,000 incl. prominent figures in software & tech and risk researchers | Global | "AI labs to immediately pause for at least 6 months the training of AI systems more powerful than GPT-4" due to "profound risks to society and humanity". [14] | Future of Life Institute | Yes [15] [16] [17] [18] | |
Open Letter to the European Commission on the Cyber Resilience Act [19] [20] | 2023 | Many open source software organizations incl. OSI and TDF | European Union / Global | Changes to the proposed Cyber Resilience Act due to "unnecessary economic and technological risk to the EU" and improving engagement with the under-represented open source software community as "more than 70% of the software in Europe [open source/FOSS] is about to be regulated without an in-depth consultation" and potential chilling effects of OSS development. | The Eclipse Foundation | ||
An open letter regarding research on reflecting sunlight to reduce the risks of climate change [21] | 2023 | >110 academics | Global | Acceleration of research on "atmospheric aerosols and their potential to increase the reflection of sunlight from the atmosphere to address climate risk". | | Yes | |
Open Letter calling for an International Non-Use Agreement on Solar Geoengineering [22] [23] | 2022 | >380 academics | Global | "[I]mmediate political action from governments, the United Nations, and other actors to prevent the normalization of solar geoengineering as a climate policy option". | 16 scholars [24] | No | Yes |
Fossil Free Research Open Letter [25] [26] [27] | 2022 | >800 incl. Nobel laureates & IPCC authors | Global | Universities should stop accepting funding from fossil fuel companies to conduct climate research. | Fossil Free Research [28] | No | |
2022 open letter from Nobel laureates in support of Ukraine | 2022 | >200 Nobel laureates | Russia, Ukraine | Independence of the Ukrainian people and freedom of the Ukrainian state (in face of the 2022 Russian invasion of the country). | Roald Hoffmann, Richard J. Roberts | No | |
An open letter of Russian scientists and science journalists against the war in Ukraine [29] [30] [31] | 2022 | >8,000 academics and science journalists of Russia | Russia, Ukraine | Condemnation of the war against Ukraine with "responsibility for unleashing a new war in Europe [lying] entirely with Russia". [32] | NA | ||
Montreal Declaration on Animal Exploitation [33] | 2022 | >500 academics (moral and political philosophy) | Global | "Insofar as it involves unnecessary violence and harm, [they] declare that animal exploitation is unjust and morally indefensible". | NA | Yes | |
Fulfil the NPT: From nuclear threats to human security | 2022 | >1,000 incl. many academics | Global | Four steps towards a nuclear-weapon-free world to take by the Tenth Review Conference of the Non-Proliferation Treaty in 2022. | NoFirstUse Global | ||
Statement by other Academics in support of Bill No. 28 of 2022 [34] | 2022 | >100 academics | Malta | Passage of abortion law changes Bill No. 28 of 2022 as is. | |||
Scientists to President Biden: Follow the Science, Stop Fossil Fuels [35] | 2022 | 275 academics | USA / Global | Halting recent moves towards increasing fossil fuel production and instead take bold action to rapidly reduce fossil fuel extraction and infrastructure by President Biden. | Scientists Bob Howarth, Mark Jacobson, Michael Mann, Sandra Steingraber, and Peter Kalmus | No | |
Scientists & Experts Want Climate Action – An Open Letter to the White House [36] | 2021 | >1,500 academics incl. 5 senior/lead IPCC authors | USA / Global | "President Joe Biden and his administration to commit to reducing U.S. heat-trapping emissions by at least 50 percent below 2005 levels by 2030". | Union of Concerned Scientists | ||
It is essential to Remove Climate-Harming Logging and Fossil Fuel Provisions from Reconciliation and Infrastructure Bills [37] [38] | 2021 | >100 academics | USA / Global | Removal of logging and fossil fuel provisions from reconciliation and infrastructure bills by President Biden and members of Congress. [39] | |||
Scientists and Engineers Letter to President Biden on the Nuclear Posture Review [40] [41] | 2021 | ~700 scientists and engineers incl. 21 Nobel laureates | USA / Global | President Biden to "use his forthcoming declaration of a new national strategy for managing nuclear weapons as a chance to cut the U.S. arsenal by a third, and to declare, for the first time, that the United States would never be the first to use nuclear weapons in a conflict". | |||
An Open Letter from U.S. Scientists Imploring President Biden to End the Fossil Fuel Era [42] | 2021 | >330 academics | USA / Global | | Biologist Dr. Sandra Steingraber, climate scientist Dr. Peter Kalmus, advocacy groups Center for Biological Diversity and Food & Water Watch | | |
Stop attempts to criminalise nonviolent climate protest [46] | 2021 | >400 academics | Global | End of the criminalization of non-violent civil disobedience from direct action climate activist groups. | Oscar Berglund, others | ||
Academic Open Letter in Support of the TRIPS Intellectual Property Waiver Proposal [47] | 2021 | >100 academics (international intellectual property) | Global | A "temporary TRIPS waiver – as proposed by India and South Africa and supported by more than 100 countries – [as a] necessary and proportionate legal measure towards the clearing of existing intellectual property barriers to scaling up of production of COVID-19 health technologies in a direct, consistent and effective fashion". | | No | |
WTO must ban harmful fisheries subsidies [48] [49] [50] | 2021 | 296 academics [51] | Global | WTO to eliminate harmful fisheries subsidies, which have been increasing. | | | |
It Is Time to Address Airborne Transmission of Coronavirus Disease 2019 (COVID-19) [52] | 2020 | >239 academics [53] | Global | The WHO, the medical community, and relevant national and international bodies to promptly "recognize the potential for airborne spread" of COVID-19. | Lidia Morawska, Donald K Milton | No | |
Call to stop "Chain of Killings" [54] | 2020 | >100 academics | Philippines | End to President Rodrigo Duterte's "bloody war on drugs" and the creation of a Truth Commission. | | No | |
A Letter on Justice and Open Debate | 2020 | 153; mostly scholars and writers | Global | Condemnation of cancel culture and liberal intolerance and defending free speech, making an argument that hostility to free speech was becoming widespread on what could be described as "the political left" as well. | Thomas Chatterton Williams, Robert Worth, George Packer, David Greenberg, Mark Lilla | NA | Yes [55] [56] |
A warning on climate and the risk of societal collapse [57] | 2020 | 258 mostly academics | Global | Engagement with the risk of disruption and even collapse of societies due to climate change by policymakers. | Gesa Weyhenmeyer, Will Steffen | NA | |
We declare our support for Extinction Rebellion [58] | 2019 | >250 academics at Australian universities | Global | Declaration of support for Extinction Rebellion and sufficient change of Australian government's inaction on the climate crisis. It is based on a 2018 open letter. | NA | ||
School climate strike children's brave stand has our support [59] | 2019 | 224 academics | Global | Declaration of support for School Strikes for Climate. | NA | ||
The EU needs a stability and wellbeing pact, not more growth [60] [61] | 2018 | 238 academics | Global | "Plan[ning]" for a post-growth future in which human and ecological wellbeing is prioritised over GDP by the European Union and its member states. | Wellbeing Economy Alliance and others | NA | |
An Open Letter from Scientists to President-Elect Trump on Climate Change [62] | 2016 | >800 academics (Earth science and energy) | USA / Global | Six steps of immediate and sustained action to be taken by Donald Trump against human-caused climate change. | | No | |
Science and the Public Interest – An Open Letter to President-Elect Trump and the 115th Congress [63] | 2016 | >2,000 academics | USA / Global | Sufficient funding of scientific research as well as "support[ing] and rely[ing] on science as a key input for crafting public policy" by the incoming administration. | Union of Concerned Scientists | No | |
Open Letter to Google From 80 Internet Scholars: Release RTBF Compliance Data [64] | 2015 | 80 academics | Global – Google | More transparency of Google's right to be forgotten (RTBF) processes. | |||
Open Letter on Artificial Intelligence | 2015 | >150 incl. Stephen Hawking, Elon Musk, and many AI experts | Global | The letter highlights both the positive and negative effects of artificial intelligence. [65] According to Bloomberg Business, Professor Max Tegmark of MIT circulated the letter in order to find common ground between signatories who consider superintelligent AI a significant existential risk, and signatories such as Professor Oren Etzioni, who believe the AI field was being "impugned" by a one-sided media focus on the alleged risks. [66] The letter calls for concrete research on ensuring AI's societal benefits while avoiding potential pitfalls. | Future of Life Institute | NA | |
An Open Letter From Internet Engineers to the U.S. Congress [67] | 2011 | 83 prominent Internet inventors and engineers | USA, Canada / Global | Stoppage of SOPA and PIPA Internet blacklist bills. [68] | Electronic Frontier Foundation | Yes | No |
Open letter from ten Members of the Russian Academy of Sciences to the President | 2007 | 10 members of the Russian Academy of Sciences | Russia | The letter was intended to warn both society and the government about the growing influence of the Russian Orthodox Church and its expansion into many fields of social life, particularly into the state education system, which is strictly prohibited under the Russian Constitution. | | No | Yes [69] |
Open letter to Gorbachev | 1990 | 30 academics incl. Nobel laureates | Russia | Preventing privatisation of land itself, in favour of a Georgist system of common ownership and the collection of public revenue through land-value taxation. | | | |