This article lists notable open letters that were initiated by scientists or other academics or have a substantial share of academic signees.
Open letters are excluded unless they were open for signing by other academics or the public, received a large number of signatures (specifically, no fewer than 10 before 2000 and no fewer than 40 after 2010), and attracted substantial media attention; petitions are likewise excluded. With the advent of the Internet and World Wide Web, such open letters may have become far more frequent.
Open letters targeting or defending individual academics or small groups of scholars [1] [2] as well as letters calling for retractions of specific studies are not included.
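The numeric inclusion criteria above can be sketched as a simple predicate. This is purely illustrative: the field names are hypothetical, and the source does not specify a threshold for letters between 2000 and 2010, so the later (stricter) threshold is assumed for that gap.

```python
from dataclasses import dataclass

@dataclass
class OpenLetter:
    year: int
    signatures: int
    open_for_signing: bool   # open to other academics or the public
    media_attention: bool    # received substantial media coverage
    is_petition: bool

def min_signatures(year: int) -> int:
    # Thresholds from the inclusion criteria: no fewer than 10 signatures
    # before 2000, no fewer than 40 after 2010. The source leaves 2000-2010
    # unspecified; the stricter threshold is assumed here.
    return 10 if year < 2000 else 40

def is_listed(letter: OpenLetter) -> bool:
    # A letter is listed only if it is open for signing, is not a petition,
    # and received both enough signatures and substantial media attention.
    return (letter.open_for_signing
            and not letter.is_petition
            and letter.signatures >= min_signatures(letter.year)
            and letter.media_attention)
```

For example, a 2022 letter with 500 signatures that was open for signing and widely covered would qualify, while a 1990 letter with only 5 signatures would not.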
Name | Year | Signatures | Scope / topic | Demands | Organized by | Demand achieved | Criticized |
---|---|---|---|---|---|---|---|
Elsevier: NeuroImage transition - all editors have resigned over the high publication fee, and are starting a new non-profit journal, Imaging Neuroscience [3] [4] [5] [6] [7] | 2023 | 42 academics | Global | All editors at NeuroImage and NeuroImage: Reports resigned in opposition to the large fee ($3,450) that the journal's owner, publishing giant Elsevier, charges authors to make their scientific papers open access (OA). After Elsevier refused to reduce the fee, they launched a new OA journal, Imaging Neuroscience, hosted by the non-profit publisher MIT Press. [8] They committed to making the new journal replace the one they abandoned as "the top journal in [their] field" and hope to demonstrate "the way forward in non-profit publishing". [9] [8] | (Group action of full editorial team) | DA | No |
Pause Giant AI Experiments [10] [11] [12] [13] | 2023 | >1,000 incl. prominent figures in software & tech and risk researchers | Global | "AI labs to immediately pause for at least 6 months the training of AI systems more powerful than GPT-4" due to "profound risks to society and humanity". [14] | Future of Life Institute | Yes [15] [16] [17] [18] | |
Open Letter to the European Commission on the Cyber Resilience Act [19] [20] | 2023 | Many open source software organizations incl. OSI and TDF | European Union / Global | Changes to the proposed Cyber Resilience Act due to "unnecessary economic and technological risk to the EU", and improved engagement with the under-represented open source software community, as "more than 70% of the software in Europe [open source/FOSS] is about to be regulated without an in-depth consultation", with potential chilling effects on OSS development. | The Eclipse Foundation | ||
An open letter regarding research on reflecting sunlight to reduce the risks of climate change [21] | 2023 | >110 academics | Global | Acceleration of research on "atmospheric aerosols and their potential to increase the reflection of sunlight from the atmosphere to address climate risk". | | Yes | |
Open Letter calling for an International Non-Use Agreement on Solar Geoengineering [22] [23] | 2022 | >380 academics | Global | "[I]mmediate political action from governments, the United Nations, and other actors to prevent the normalization of solar geoengineering as a climate policy option". | 16 scholars [24] | No | Yes |
Fossil Free Research Open Letter [25] [26] [27] | 2022 | >800 incl. Nobel laureates & IPCC authors | Global | Universities should stop accepting funding from fossil fuel companies to conduct climate research. | Fossil Free Research [28] | No | |
2022 open letter from Nobel laureates in support of Ukraine | 2022 | >200 Nobel laureates | Russia, Ukraine | Independence of the Ukrainian people and freedom of the Ukrainian state (in face of the 2022 Russian invasion of the country). | Roald Hoffmann, Richard J. Roberts | No | |
An open letter of Russian scientists and science journalists against the war in Ukraine [29] [30] [31] | 2022 | >8,000 Russian academics and science journalists | Russia, Ukraine | Condemnation of the war against Ukraine, with "responsibility for unleashing a new war in Europe [lying] entirely with Russia". [32] | NA | ||
Montreal Declaration on Animal Exploitation [33] | 2022 | >500 academics (moral and political philosophy) | Global | "Insofar as it involves unnecessary violence and harm, [they] declare that animal exploitation is unjust and morally indefensible". | NA | Yes | |
Fulfil the NPT: From nuclear threats to human security | 2022 | >1,000 incl. many academics | Global | Four steps towards a nuclear-weapon-free world to take by the Tenth Review Conference of the Non-Proliferation Treaty in 2022. | NoFirstUse Global | ||
Statement by other Academics in support of Bill No. 28 of 2022 [34] | 2022 | >100 academics | Malta | Passage of abortion law changes Bill No. 28 of 2022 as is. | |||
Scientists to President Biden: Follow the Science, Stop Fossil Fuels [35] | 2022 | 275 academics | USA / Global | Halting recent moves towards increasing fossil fuel production and instead taking bold action to rapidly reduce fossil fuel extraction and infrastructure, by President Biden. | Scientists Bob Howarth, Mark Jacobson, Michael Mann, Sandra Steingraber, and Peter Kalmus | No | |
Scientists & Experts Want Climate Action – An Open Letter to the White House [36] | 2021 | >1,500 academics incl. 5 senior/lead IPCC authors | USA / Global | "President Joe Biden and his administration to commit to reducing U.S. heat-trapping emissions by at least 50 percent below 2005 levels by 2030". | Union of Concerned Scientists | ||
It is essential to Remove Climate-Harming Logging and Fossil Fuel Provisions from Reconciliation and Infrastructure Bills [37] [38] | 2021 | >100 academics | USA / Global | Removal of logging and fossil fuel provisions from reconciliation and infrastructure bills by President Biden and members of Congress. [39] | |||
Scientists and Engineers Letter to President Biden on the Nuclear Posture Review [40] [41] | 2021 | ~700 scientists and engineers incl. 21 Nobel laureates | USA / Global | President Biden to "use his forthcoming declaration of a new national strategy for managing nuclear weapons as a chance to cut the U.S. arsenal by a third, and to declare, for the first time, that the United States would never be the first to use nuclear weapons in a conflict". | |||
An Open Letter from U.S. Scientists Imploring President Biden to End the Fossil Fuel Era [42] | 2021 | >330 academics | USA / Global | | Biologist Dr. Sandra Steingraber, climate scientist Dr. Peter Kalmus, advocacy groups Center for Biological Diversity and Food & Water Watch | ||
Stop attempts to criminalise nonviolent climate protest [46] | 2021 | >400 academics | Global | End of the criminalization of non-violent civil disobedience from direct action climate activist groups. | Oscar Berglund, others | ||
Academic Open Letter in Support of the TRIPS Intellectual Property Waiver Proposal [47] | 2021 | >100 academics (international intellectual property) | Global | A "temporary TRIPS waiver – as proposed by India and South Africa and supported by more than 100 countries – [as a] necessary and proportionate legal measure towards the clearing of existing intellectual property barriers to scaling up of production of COVID-19 health technologies in a direct, consistent and effective fashion". | | No | |
WTO must ban harmful fisheries subsidies [48] [49] [50] | 2021 | 296 academics [51] | Global | The WTO to eliminate harmful fisheries subsidies, which have been increasing. | |||
It Is Time to Address Airborne Transmission of Coronavirus Disease 2019 (COVID-19) [52] | 2020 | >239 academics [53] | Global | The WHO, the medical community, and relevant national and international bodies to "recognize the potential for airborne spread" of COVID-19. | Lidia Morawska, Donald K Milton | No | |
Call to stop "Chain of Killings" [54] | 2020 | >100 academics | Philippines | End to President Rodrigo Duterte's "bloody war on drugs" and the creation of a Truth Commission. | | No | |
A Letter on Justice and Open Debate | 2020 | 153; mostly scholars and writers | Global | Condemnation of cancel culture and illiberal intolerance, and a defence of free speech, arguing that hostility to free speech was also becoming widespread on what could be described as "the political left". | Thomas Chatterton Williams, Robert Worth, George Packer, David Greenberg, Mark Lilla | NA | Yes [55] [56] |
A warning on climate and the risk of societal collapse [57] | 2020 | 258 mostly academics | Global | Engagement with the risk of disruption and even collapse of societies due to climate change by policymakers. | Gesa Weyhenmeyer, Will Steffen | NA | |
We declare our support for Extinction Rebellion [58] | 2019 | >250 academics at Australian universities | Global | Declaration of support for Extinction Rebellion and an end to the Australian government's inaction on the climate crisis. It is based on a 2018 open letter. | NA | ||
School climate strike children’s brave stand has our support [59] | 2019 | 224 academics | Global | Declaration of support for School Strikes for Climate. | NA | ||
The EU needs a stability and wellbeing pact, not more growth [60] [61] | 2018 | 238 academics | Global | "Plan[ning]" for a post-growth future in which human and ecological wellbeing is prioritised over GDP by the European Union and its member states. | Wellbeing Economy Alliance and others | NA | |
An Open Letter from Scientists to President-Elect Trump on Climate Change [62] | 2016 | >800 academics (Earth science and energy) | USA / Global | Six steps of immediate and sustained action to take by Donald Trump against human-caused climate change. | | No | |
Science and the Public Interest – An Open Letter to President-Elect Trump and the 115th Congress [63] | 2016 | >2,000 academics | USA / Global | Sufficient funding of scientific research as well as "support[ing] and rely[ing] on science as a key input for crafting public policy" by the incoming administration. | Union of Concerned Scientists | No | |
Open Letter to Google From 80 Internet Scholars: Release RTBF Compliance Data [64] | 2015 | 80 academics | Global – Google | More transparency of Google's right to be forgotten (RTBF) processes. | |||
Open Letter on Artificial Intelligence | 2015 | >150 incl. Stephen Hawking, Elon Musk, and many AI experts | Global | The letter highlights both the positive and negative effects of artificial intelligence. [65] According to Bloomberg Business, Professor Max Tegmark of MIT circulated the letter in order to find common ground between signatories who consider superintelligent AI a significant existential risk, and signatories such as Professor Oren Etzioni, who believe the AI field was being "impugned" by a one-sided media focus on the alleged risks. [66] | Future of Life Institute | NA | |
An Open Letter From Internet Engineers to the U.S. Congress [67] | 2011 | 83 prominent Internet inventors and engineers | USA, Canada / Global | Stoppage of SOPA and PIPA Internet blacklist bills. [68] | Electronic Frontier Foundation | Yes | No |
Open letter from ten Members of the Russian Academy of Sciences to the President | 2007 | 10 members of the Russian Academy of Sciences | Russia | The letter was intended to warn both society and the government about the growing influence of the Russian Orthodox Church and its expansion into many fields of social life, particularly the state education system, where such influence is strictly prohibited under the Russian Constitution. | | No | Yes [69] |
Open letter to Gorbachev | 1990 | 30 academics incl. Nobel laureates | Russia | Prevention of the privatisation of land itself, in favour of a Georgist system of common ownership with public revenue collected through land-value taxation. | |||