The Stanford Internet Observatory (SIO) is a multidisciplinary program for the study of abuse in information technologies, with a focus on social media, established in 2019. It is part of the Stanford Cyber Policy Center, a joint initiative of the Freeman Spogli Institute for International Studies and Stanford Law School. [1]
Alex Stamos founded the Stanford Internet Observatory in 2019, after leaving Facebook the year before over frustration that he was not allowed to publish a full account of Russia's influence operations on the platform during the 2016 US presidential election. [2]
According to Lauren Coffey of Inside Higher Ed, by 2024 the Stanford Internet Observatory had "published 15 white paper reports, 10 journal articles and garnered more than 5,000 media mentions". [3]
The SIO was the first to report Russian online support for Trump in 2016, [4] [5] raised concerns about Chinese spying via the Clubhouse app in a 2021 report, [6] partnered with the Wall Street Journal on a 2023 report about Instagram and online child sexual abuse material, [7] and developed a curriculum for teaching college students how to handle trust and safety issues on social media platforms. [8]
The Stanford Internet Observatory participated in pre-2020 election research focusing on misinformation about election processes and procedures, resulting in a 2021 report that concluded "The 2020 election demonstrated that actors—both foreign and domestic—remain committed to weaponizing viral false and misleading narratives to undermine confidence in the US electoral system and erode Americans’ faith in our democracy". [9]
SIO co-founded the Election Integrity Partnership with the University of Washington Center for an Informed Public to identify viral falsehoods about election procedures and outcomes in real time. The partnership operated during the 2020 and 2022 election cycles and has since concluded its work. [10] [3] Lawsuits over the research, which were eventually dismissed, together with hostile rhetoric about the work, nevertheless led SIO to scale back or shut down its election research by 2024. Researchers also received threats and online harassment stemming from disinformation about their work. [11] [12] [13] As of 2024, the Center for an Informed Public continued to work on election misinformation at the University of Washington. [14]
In 2021, SIO launched the Journal of Online Trust and Safety, an open access peer-reviewed journal covering research on how consumer internet services are abused to cause harm and how to prevent those harms. [8] [15]
Thom Hartmann of The New Republic praised SIO as "the gold-standard organization for determining the veracity of political information circulating online". [4]
In 2024, Lauren Coffey of Inside Higher Ed wrote that SIO "served as a research powerhouse with a focus on social media amid growing misinformation". [3]
Joseph Menn of The Washington Post wrote "The Stanford Internet Observatory [...] published some of the most influential analysis of the spread of false information on social media during elections." [8]
Disinformation research groups, including the Stanford Internet Observatory, which had reported on topics such as 2020 stolen-election claims and COVID-19 vaccine misinformation, came under attack from Republicans who accused them of colluding with the US government and social media platforms to censor conservative voices. The GOP-led House Judiciary Committee subpoenaed Stanford University for any records of or emails with government officials and social media platforms. In May 2023, America First Legal sued SIO and other researchers in Louisiana, aiming to bring down what it called the "censorship-industrial complex". A Texas lawsuit filed by anti-vaccine advocates alleged their social media posts were flagged or removed in what it called mass censorship. [16] [5] [17]
These legal cases have cost Stanford millions of dollars in legal expenses and have distracted researchers from their work. A Stanford spokesperson said: "Stanford remains deeply concerned about efforts, including lawsuits and congressional investigations, that chill freedom of inquiry and undermine legitimate and much needed academic research — both at Stanford and across academia." [10]
On June 26, 2024, the US Supreme Court ruled in favor of the Stanford Internet Observatory and other groups that had communicated with the government in an effort to stem the spread of falsehoods online. [18]
In June 2024, the Stanford Internet Observatory cut several jobs, and multiple news outlets reported on its dismantling. [8] [19] [20] [10] Alex Stamos, the organization's main fundraiser, had left in November 2023 citing the toll of the political pressure, while Renée DiResta's contract was not renewed in June 2024. [19] SIO's closure would mark a significant setback for misinformation researchers. Conservative lawmakers had also threatened to cut federal funding to any university that studied propaganda, and the Washington Post theorized that the university might not want to alienate conservative donors. [8] The New Republic critiqued the Republican efforts as an attempt to prevent fact-checking of GOP lies. [4]
In a statement to Platformer on June 13, Stanford denied that SIO was being dismantled:
The important work of SIO continues under new leadership... Stanford remains deeply concerned about efforts, including lawsuits and congressional investigations, that chill freedom of inquiry and undermine legitimate and much needed academic research – both at Stanford and across academia. [19]
On June 18, the school said it was not shutting down the project but said that its founding grants were running out and they were seeking new funding. [21]
Some SIO work will continue under new leadership at Stanford, including its work on child safety, the Journal of Online Trust and Safety, and the Trust and Safety Research Conference. [8] [19]
On June 25, 2024, Renée DiResta penned an op-ed in the New York Times warning about the vulnerabilities in the upcoming US elections without the Election Integrity Partnership and other research by SIO focused on election misinformation and with greatly reduced trust and safety teams at many social media companies. [22]
Disinformation is false information deliberately spread to deceive people. It is an orchestrated adversarial activity in which actors employ strategic deception and media manipulation tactics to advance political, military, or commercial goals, implemented through attacks that "weaponize multiple rhetorical strategies and forms of knowing—including not only falsehoods but also truths, half-truths, and value judgements—to exploit and amplify culture wars and other identity-driven controversies."
Fact-checking is the process of verifying the factual accuracy of questioned reporting and statements. Fact-checking can be conducted before or after the text or content is published or otherwise disseminated. Internal fact-checking is such checking done in-house by the publisher to prevent inaccurate content from being published; when the text is analyzed by a third party, the process is called external fact-checking.
Misinformation is incorrect or misleading information. Misinformation can exist without specific malicious intent; disinformation is distinct in that it is deliberately deceptive and propagated. Misinformation can include inaccurate, incomplete, misleading, or false information as well as selective or half-truths. In January 2024, the World Economic Forum identified misinformation and disinformation, propagated by both internal and external interests to "widen societal and political divides", as the most severe global risks over the next two years.
The Center for Countering Digital Hate (CCDH), formerly Brixton Endeavors, is a British-American non-profit organisation with offices in London and Washington, D.C., whose stated purpose is stopping the spread of online hate speech and disinformation. It campaigns to deplatform people it believes promote hate or misinformation, and to restrict media organisations such as The Daily Wire from advertising. CCDH is a member of the Stop Hate For Profit coalition.
The Institute for Strategic Dialogue (ISD) is a political advocacy organization founded in 2006 by Sasha Havlicek and George Weidenfeld and headquartered in London, United Kingdom.
A troll farm or troll factory is an institutionalised group of internet trolls that seeks to interfere in political opinions and decision-making.
Fake news or information disorder is false or misleading information claiming the aesthetics and legitimacy of news. Fake news often has the aim of damaging the reputation of a person or entity, or making money through advertising revenue. Although false news has always been spread throughout history, the term fake news was first used in the 1890s when sensational reports in newspapers were common. Nevertheless, the term does not have a fixed definition and has been applied broadly to any type of false information presented as news. It has also been used by high-profile people to apply to any news unfavorable to them. Further, disinformation involves spreading false information with harmful intent and is sometimes generated and propagated by hostile foreign actors, particularly during elections. In some definitions, fake news includes satirical articles misinterpreted as genuine, and articles that employ sensationalist or clickbait headlines that are not supported in the text. Because of this diversity of types of false news, researchers are beginning to favour information disorder as a more neutral and informative term.
Alex Stamos is an American computer scientist and adjunct professor at Stanford University's Center for International Security and Cooperation. He is the former chief security officer (CSO) at Facebook. His planned departure from the company, following disagreement with other executives about how to address the Russian government's use of its platform to spread disinformation during the 2016 U.S. presidential election, was reported in March 2018.
Social media was used extensively in the 2020 United States presidential election. Both incumbent president Donald Trump and Democratic Party nominee Joe Biden's campaigns employed digital-first advertising strategies, prioritizing digital advertising over print advertising in the wake of the pandemic. Trump had previously utilized his Twitter account to reach his voters and make announcements, both during and after the 2016 election. The Democratic Party nominee Joe Biden also made use of social media networks to express his views and opinions on important events such as the Trump administration's response to the COVID-19 pandemic, the protests following the murder of George Floyd, and the controversial appointment of Amy Coney Barrett to the Supreme Court.
Yonder, formerly named New Knowledge, was a company from Austin, Texas, that specialized in information integrity. It is most widely known for supporting the Senate Select Committee on Intelligence in its investigation of Russian interference in the 2016 US presidential election. The company was also involved in a disinformation operation during the 2017 US Senate special election in Alabama, though the company denied any political motivation behind its research. More recently, Yonder's CEO and researchers have provided expert commentary to the New York Times, Fast Company, and Axios about 5G and COVID-19 misinformation.
Disinformation attacks are strategic deception campaigns involving media manipulation and internet manipulation, to disseminate misleading information, aiming to confuse, paralyze, and polarize an audience. Disinformation can be considered an attack when it occurs as an adversarial narrative campaign that weaponizes multiple rhetorical strategies and forms of knowing—including not only falsehoods but also truths, half-truths, and value-laden judgements—to exploit and amplify identity-driven controversies. Disinformation attacks use media manipulation to target broadcast media like state-sponsored TV channels and radios. Due to the increasing use of internet manipulation on social media, they can be considered a cyber threat. Digital tools such as bots, algorithms, and AI technology, along with human agents including influencers, spread and amplify disinformation to micro-target populations on online platforms like Instagram, Twitter, Google, Facebook, and YouTube.
Russian disinformation campaigns have occurred in many countries. For example, disinformation campaigns led by Yevgeny Prigozhin have been reported in several African countries. Russia, however, denies that it uses disinformation to influence public opinion.
This timeline includes entries on the spread of COVID-19 misinformation and conspiracy theories related to the COVID-19 pandemic in Canada. This includes investigations into the origin of COVID-19, and the prevention and treatment of COVID-19, which is caused by the virus SARS-CoV-2. Social media apps and platforms, including Facebook, TikTok, Telegram, and YouTube, have contributed to the spread of misinformation. The Canadian Anti-Hate Network (CAHN) reported that conspiracy theories related to COVID-19 began on "day one". CAHN reported on March 16, 2020, that far-right groups in Canada were taking advantage of the climate of anxiety and fear surrounding COVID to recycle variations of conspiracy theories from the 1990s that people had shared over shortwave radio. COVID-19 disinformation is intentional and seeks to create uncertainty and confusion, but most misinformation is shared online unintentionally by enthusiastic participants who are politically active.
Renée DiResta is a professor, writer and former research manager at Stanford Internet Observatory (SIO). DiResta has written about pseudoscience, conspiracy theories, terrorism, and state-sponsored information warfare. She has also served as an advisor to the U.S. Congress on ongoing efforts to prevent online and social media disinformation.
Debunk.org is an independent technology think tank and non-governmental organisation based in Vilnius, Lithuania. Founded in 2018, the organisation was developed to counter online disinformation and state-sponsored internet propaganda. It researches and analyses disinformation within the Baltic states, Poland, Georgia, Montenegro, North Macedonia and the United States. It also aims to improve societal resilience to disinformation through educational courses and media literacy campaigns.
The Disinformation Project is a research group studying the effects of disinformation in the context of the COVID-19 pandemic in New Zealand. The research group was established in 2020 to combat disinformation during the COVID-19 pandemic but subsequently expanded its scope to cover other "conspiracy theory beliefs" including anti-vaccine, climate change denial, anti-immigration, the anti-gender movement, anti-Māori racism and hatred towards the LGBTQ+ community. The Disinformation Project also took an interest in monitoring neo-Nazism, far right activism, antisemitism and Islamophobia.
The Twitter Files are a series of releases of select internal Twitter, Inc. documents published from December 2022 through March 2023 on Twitter. CEO Elon Musk gave the documents to journalists Matt Taibbi, Bari Weiss, Lee Fang, and authors Michael Shellenberger, David Zweig and Alex Berenson shortly after he acquired Twitter on October 27, 2022. Taibbi and Weiss coordinated the publication of the documents with Musk, releasing details of the files as a series of Twitter threads.
Murthy v. Missouri was a case in the Supreme Court of the United States involving the First Amendment, the federal government, and social media. The states of Missouri and Louisiana, led by Missouri's then Attorney General Eric Schmitt, filed suit against the U.S. government in the Western District of Louisiana. They claimed that the federal government pressured social media companies to censor conservative views and criticism of the Biden administration in violation of the right to freedom of expression. The government said it had only made requests, not demands, that social media operators remove misinformation.
African Stream is a Nairobi-based online media outlet that is described as a front for Russian disinformation operations, though it presents itself as a "Pan-African digital media platform covering affairs concerning Africans at home and in the diaspora".
In advance of the 2020 election, the Stanford Internet Observatory, a CITR member, partnered with three other research organizations to form the Election Integrity Partnership. The partnership's purpose was to create "a coalition of research entities who would focus on supporting real-time information exchange" regarding the spread of potentially misleading claims about election processes and procedures.
With misinformation research under fire and social media platforms less willing to fact-check viral posts, 2024 could see a flood of voter-fraud lies, making for an even more contentious election than 2020.