Purpose | "… promot[ing] free expression by making principled, independent decisions … issuing recommendations on the relevant Facebook company content policy." [1]
---|---
Funding | Meta Platforms
Website | oversightboard
The Oversight Board is a body that makes consequential, precedent-setting content moderation decisions (see the table of decisions below) on the social media platforms Facebook and Instagram, as a form of "platform self-governance". [3]
Meta (then Facebook) CEO Mark Zuckerberg approved the creation of the board in November 2018, shortly after a meeting with Harvard Law School professor Noah Feldman, who had proposed the creation of a quasi-judiciary on Facebook. [4] Zuckerberg originally described it as a kind of "Supreme Court", given its role in settlement, negotiation, and mediation, including the power to override the company's decisions. [5]
Zuckerberg first announced the idea in November 2018, and, after a period of public consultation, the board's 20 founding members were announced in May 2020. The board officially began its work on October 22, 2020, [6] and issued its first five decisions on January 28, 2021, with four out of the five overturning Facebook's actions with respect to the matters appealed. [7] It has been subject to substantial media speculation and coverage since its announcement, and has remained so following the referral of Facebook's decision to suspend Donald Trump after the 2021 United States Capitol attack. [8]
In November 2018, after meeting with Harvard Law School professor Noah Feldman, who had proposed the creation of a quasi-judiciary to oversee content moderation on Facebook, CEO Mark Zuckerberg approved the creation of the board. [9] [7] [10] Among the board's goals were improving the fairness of the appeals process, providing outside oversight and accountability, and increasing transparency. [10] The board was modeled after the United States federal judicial system, in that it gives precedential value to previous board decisions. [11]
Between late 2017 and early 2018, Facebook had hired Brent C. Harris, who had previously worked on the National Commission on the BP Deepwater Horizon Oil Spill and Offshore Drilling, and as an advisor to non-profits, to become the company's Director of Global Affairs. [12] [4] [13] Harris led the effort to create the board, reporting to Nick Clegg, who reported directly to Zuckerberg. [14] Harris also credited Clegg's involvement, saying that efforts to establish the board "wouldn't have moved absent Nick's sponsorship", and that it was "stalled within the company until Nick really took it on". [15]
In January 2019, Facebook released a draft charter for the board [16] and began a period of public consultations and workshops with experts, institutions, and people around the world. [17] [18] In June 2019, Facebook released a 250-page report summarizing its findings and announced that it was looking for people to serve on a 40-person board (the board ultimately launched with 20 members). [19]
In January 2020, it appointed British human rights expert and former Article 19 Executive Director Thomas Hughes as Director of Oversight Board Administration. [20] It also said that board members would be named "in the coming months". [21]
On May 6, 2020, Facebook announced the 20 members that would make up the Oversight Board. [22] Facebook's VP of Global Affairs and Communications Nick Clegg described the group as having a "wide range of views and experiences", collectively living in "over 27 countries" and speaking "at least 29 languages", [23] but a quarter of the group and two of the four co-chairs are from the United States, a concentration that some free speech and internet governance experts expressed concerns about. [22] In July 2020 it was announced that the board would not start work until "later in the year". [24] It started accepting cases on October 22, 2020. [6] Members of the board have noted that it will take several years for the full impact of the board and its decisions to be understood. [7] [25] The board officially began to cover cases related to Threads in May 2024. [26]
On January 28, 2021, the board ruled on five moderation decisions made by Facebook, overturning four of them and upholding one. [27] [7] [28] All but one were unanimous. [8] Each ruling was decided by a majority vote of a panel of five members of the board, including at least one member from the region where the moderated post originated. [7]
In October 2020, a Facebook user in Myanmar posted images of photographs taken by Turkish photojournalist Nilüfer Demir of the corpse of Kurdish Syrian toddler Alan Kurdi, accompanied by text in Burmese to the effect that there was "something wrong" with the psychology or the mindset of Muslims or Muslim men. [29] The text further contrasted terrorist attacks in France in response to depictions of Muhammad with an asserted relative silence by Muslims in response to the persecution of Uyghurs in China, [7] [29] and asserted that this conduct had led to a loss of sympathy for those like the child in the photograph. [29] The post, which Facebook had removed, was later reinstated. [30]
In reviewing Facebook's decision to remove the post, the board sought a re-translation of the post [7] and noted that while it could be read as an insult directed towards Muslims, it could also be read as commentary on a perceived inconsistency between Muslims' reactions to the events in France and to those in China. [7] [29]
A post showing churches in Baku, Azerbaijan, was captioned with a statement in Russian that "asserted that Armenians had historical ties with Baku that Azerbaijanis didn't", referring to Azerbaijanis with the ethnic slur taziks. The board found that the post was harmful to the safety and dignity of Azerbaijanis and therefore upheld its removal. [7]
In October 2020, a Brazilian woman posted a series of images on Facebook subsidiary Instagram, including uncovered breasts with a visible nipple, as part of an international campaign to raise breast cancer awareness. [31] [29] The post asserted that the photographs showed breast cancer symptoms and indicated this in Portuguese text, which the website's automated review system failed to understand. [7]
The images were removed and then later restored. [7] [29] Facebook asked that the review be dropped as moot, but the board chose to review the action nonetheless, finding that the importance of the issue made it more beneficial for the board to render a judgment on the underlying question. [7] The board further held that removal of the post was improper, as it impacted the human rights of women, and recommended improvements to the decision-making process for the removal of such posts. [7] In particular, the board recommended that users be informed of the use of automated content review mechanisms, that Instagram community standards be revised to expressly permit images with female nipples in breast cancer awareness posts, and that Facebook should clarify that its community standards take precedence over those of Instagram. [31]
In October 2020, a Facebook user posted a quote incorrectly attributed to Nazi propagandist Joseph Goebbels, stating that appeals to emotion and instinct are more important than appeals to truth. [7] The post contained no images or symbols. Facebook took down the post under its policy prohibiting the promotion of dangerous individuals and organizations, a list that includes Goebbels. The account user appealed, asserting that the post was intended as a commentary on Donald Trump. The board found that the evidence supported this assertion, held that the post did not indicate support for Goebbels, and ordered that it be restored, recommending that Facebook indicate to users posting about such persons that "the user must make clear that they are not praising or supporting them". [7]
In October 2020, a French user posted a French-language video in a Facebook group criticizing the Agence nationale de sécurité du médicament for its refusal to authorize hydroxychloroquine and azithromycin to treat COVID-19. [28] Facebook removed the post for spreading COVID-19 misinformation, a decision the board reversed, in part because the drugs mentioned are prescription-only in France, so individuals seeking them would have to interact with a physician. The board recommended that Facebook correct such misinformation rather than removing it. [7]
Although Facebook restored the post, it also noted that its approach to COVID-19 misinformation reflects the guidance of the U.S. Centers for Disease Control and Prevention and the World Health Organization, and that it would therefore not change its approach to such matters. [7]
On February 12, 2021, the Board overturned the removal of a Facebook forum post made in October 2020, containing an image of a TV character holding a sheathed sword, with Hindi text translated as stating "if the tongue of the kafir starts against the Prophet, then the sword should be taken out of the sheath", with hashtags equating French President Emmanuel Macron to the devil, and calling for a boycott of products from France. The board found that the post was not likely to cause harm. [32]
On April 13, 2021, the board upheld the removal of a Facebook post by a Dutch user containing a 17-second video of a child and three adults wearing traditional Dutch "Sinterklaas" costumes, including two white adults dressed as Zwarte Piet (Black Pete), with faces painted black and wearing Afro wigs. The board found that although the cultural tradition is not intentionally racist, the use of blackface is a common racist trope. [33]
Facebook's deplatforming of U.S. President Donald Trump was not among the board's initial decisions, as the board was still collecting comments from the public on the case. [34] [35]
On January 6, 2021, amid the attack on the United States Capitol while Congress was counting the electoral votes, Trump posted a short video to social media in which he praised the rioters even as he urged them to end the violence, and reiterated his baseless claim that the 2020 presidential election was fraudulent. [36] Several platforms, including Facebook, removed it, with Facebook's vice president of integrity, Guy Rosen, explaining that the video "contributes to rather than diminishes the risk of ongoing violence". [37] That day, Facebook also blocked Trump's ability to post new content; the next day, Facebook said the block would remain in place at least until the end of Trump's term on January 20. [38]
On April 16, 2021, the board announced that it was delaying the decision on whether to overturn Trump's suspensions on Facebook and Instagram to sometime "in the coming weeks" in order to review the more than 9,000 public comments it had received. [39] Notably, on January 27, 2021, incoming board member Suzanne Nossel had published an op-ed in the Los Angeles Times titled "Banning Trump from Facebook may feel good. Here's why it might be wrong", [40] but a spokesperson announced that she would not participate in the deliberations over Trump's case and would spend the coming weeks in training. [41] On the same day Nossel's appointment was announced, the board also announced a new case.
On May 5, 2021, the board announced its decision to uphold Trump's account suspension, but instructed Facebook to reassess its decision to indefinitely ban Trump within six months. [42] The board noted that Facebook's standard procedures involve either a time-limited ban or a complete removal of the offending account, and stated that Facebook must follow a "clear, published procedure" in the matter. [43]
On June 4, 2021, Facebook announced that it had changed the indefinite ban to a two-year suspension, ending on January 7, 2023. [44] Trump's Facebook account was later reinstated in March 2023, with Meta saying the public should be allowed to hear from politicians, but that Trump would be subject to "heightened penalties" for repeated violations of its rules. [45]
In September 2021, the board announced it would review Facebook's internal XCheck system, which exempted some high-profile users from the platform's rules entirely and routed other high-profile users' posts into a separate review queue rather than the standard enforcement process; the program covered around 5.8 million users. [46] The board's quarterly report, issued on October 21, 2021, stated that the company had not been transparent about the XCheck program and had not provided the board with complete information upon which to conduct a review. [47] The board also noted that the company's lack of transparency with users about the reasons for content deletion was unfair. [48] In response, the company stated that it would aim for greater clarity in the future. [48]
In October 2021, the board announced that it would be meeting with former Facebook employee and whistleblower, Frances Haugen, to discuss her statements about the company that she previously shared with The Wall Street Journal and United States Senate Commerce Committee's Sub-Committee on Consumer Protection, Product Safety, and Data Security. [49] [50]
As the Oversight Board is not a tribunal, court of law, or quasi-judicial body, it is not guided by enabling legislation created by any government. Instead, a corporate charter, bylaws, and a series of governing documents set out the scope and powers of the board. [3] Opinions written by the board reference Meta's corporate human rights policy, which "voluntarily incorporates the United Nations Guiding Principles on Business and Human Rights, the International Bill of Human Rights, and numerous international human rights treaties". [51]
In order to ensure the board's independence, Facebook established an irrevocable trust with $130 million in initial funding, expected to cover operational costs for over half a decade. [52] [53] The board is able to hear appeals submitted by both Facebook and its users, and Facebook "will be required to respond publicly to any recommendations". [52] Notably, while the board's initial remit gave it broad scope to hear anything that could be appealed on Facebook, the company stated that technical infrastructure would have to be built before this could extend beyond appeals of content removals. [54] [55] The entire Oversight Board is overseen by the Oversight Board Trust, which has the power to confirm or remove board appointees and to ensure that the board is operating in accordance with its stated purpose. [52] [53]
In legal terms, the Oversight Board is organized as a Delaware-based LLC, with the Oversight Board Trust as its sole member. [51]
Board members indicated that the board would begin its work slowly and deliberately, with a focus on producing meaningful opinions in cases carefully selected to be representative of substantial issues. [56] Facebook also developed software to enable it to transfer cases to the board without compromising user privacy. [56] On April 13, 2021, the Oversight Board announced that it would start accepting appeals from users seeking the removal of other people's content that Facebook had declined to take down after an objection. [57]
The charter provides for future candidates to be nominated for board membership, through a recommendations portal operated by the U.S. law firm Baker McKenzie. [58]
The 20 members of the Oversight Board were announced on May 6, 2020. [59] The co-chairs, who selected the other members jointly with Facebook, are former U.S. federal circuit judge and religious freedom expert Michael McConnell, constitutional law expert Jamal Greene, Colombian attorney Catalina Botero-Marino, and former Danish Prime Minister Helle Thorning-Schmidt. [59] Among the initial cohort were former European Court of Human Rights judge András Sajó, Internet Sans Frontières Executive Director Julie Owono, Yemeni activist and Nobel Peace Prize laureate Tawakkol Karman, former editor-in-chief of The Guardian Alan Rusbridger, Pakistani digital rights advocate Nighat Dad, and Ronaldo Lemos, a lawyer who helped create the Brazilian Civil Rights Framework for the Internet. [60]
On April 20, 2021, its newest board member, PEN America CEO Suzanne Nossel, was appointed to replace Pamela S. Karlan, who had resigned in February 2021 to join the Biden administration. [41] As of 2021, the United States has the most substantial representation, with five members, including two of the four co-chairs of the board. Two board members come from South America, six from countries across Asia, three from Africa (one of whom, with both African and European ties, also counts towards the three from Europe), and one from Australia.
Name | Country | Term | Details |
---|---|---|---|
Helle Thorning-Schmidt, Co-chair | Denmark | 2020–Present | Former Prime Minister of Denmark |
Catalina Botero Marino, Co-chair | Colombia | 2020–Present | Dean of Law Faculty at Universidad de los Andes |
Michael W. McConnell, Co-chair | United States | 2020–Present | Former Judge of the U.S. Court of Appeals for the 10th Circuit |
Evelyn Aswad, Co-chair | United States | 2020–Present | University of Oklahoma College of Law Professor |
Afia Asantewaa Asare-Kyei | Ghana, South Africa | 2020–Present | Human rights lawyer |
Endy Bayuni | Indonesia | 2020–Present | Journalist |
Katherine Chen | Taiwan | 2020–Present | Public relations and statistics professor at National Chengchi University |
Nighat Dad | Pakistan | 2020–Present | Lawyer and internet activist |
Tawakkol Karman | Yemen | 2020–Present | Journalist and human rights activist |
Sudhir Krishnaswamy | India | 2020–Present | Vice-Chancellor of the National Law School of India University |
Ronaldo Lemos | Brazil | 2020–Present | Lawyer and academic |
Julie Owono | Cameroon, France | 2020–Present | Lawyer and executive director of Internet Sans Frontières |
Emi Palmor | Israel | 2020–Present | Former Director General of Israeli Ministry of Justice |
Alan Rusbridger | United Kingdom | 2020–Present | Journalist |
András Sajó | Hungary | 2020–Present | Legal Scholar |
John Samples | United States | 2020–Present | Vice President of the Cato Institute |
Nicolas Suzor | Australia | 2020–Present | Queensland University of Technology Law Professor |
Suzanne Nossel | United States | 2021–Present | CEO of PEN America |
Khaled Mansour | Egypt | 2022–Present | Journalist |
Pamela San Martin | Mexico | 2022–Present | Lawyer, former National Electoral Institute Councilor |
Paolo Carozza | United States | 2022–Present | University of Notre Dame Law and Political Science Professor |
Kenji Yoshino | United States | 2023–Present | New York University School of Law Professor of Constitutional Law |
Name | Country | Term | Details |
---|---|---|---|
Pamela S. Karlan | United States | 2020–2021 | Stanford Law School Professor |
Jamal Greene, Co-chair | United States | 2020–2023 | Columbia Law School Professor |
Maina Kiai | Kenya | 2020–2023 | Lawyer and human rights activist |
Name | Country | Term | Details |
---|---|---|---|
Stephen Neal, Chair | United States | 2021–Present | Chairman Emeritus and Senior Counsel at the law firm Cooley LLP, former Board Chairperson of Levi Strauss & Co. |
Robert Post | United States | 2020–Present | Professor and former Dean of Yale Law School |
Kate O'Regan | South Africa | 2020–Present | Former Deputy Chief Justice of South Africa |
Kristina Arriaga [61] | United States | 2020–Present | Former Vice-Chair of the U.S. Commission on International Religious Freedom |
Cherine Chalaby [62] | United Kingdom | 2020–Present | Former Chairman of the Board of the Internet Corporation for Assigned Names & Numbers (ICANN) |
Marie Wieck [63] | United States | 2022–Present | Former General Manager for Blockchain for IBM Industry Platform |
Name | Country | Term | Details |
---|---|---|---|
Paul G. Haaga Jr., Inaugural Chairperson | United States | 2020–2021 | Former Chairman of the Capital Group |
Wanda Felton [64] | United States | 2020–2021 | Former Vice-Chair of the Export–Import Bank of the United States |
Decision date | Appeal type | Ruling | Countries | Relevant community standard | Case |
---|---|---|---|---|---|
January 28, 2021 | Removal | n/a | Malaysia | Hate speech | 2020-001-FB-UA |
January 28, 2021 | Removal | Overturn | Myanmar, France, China | Hate speech | 2020-002-FB-UA |
January 28, 2021 | Removal | Uphold | Armenia, Azerbaijan | Hate speech | 2020-003-FB-UA |
January 28, 2021 | Removal | Overturn | Brazil | Adult nudity and sexual activity | 2020-004-IG-UA |
January 28, 2021 | Removal | Overturn | United States | Dangerous individuals and organizations | 2020-005-FB-UA |
January 28, 2021 | Removal | Overturn | France | Violence and incitement | 2020-006-FB-FBR |
February 12, 2021 | Removal | Overturn | France, India | Violence and incitement | 2020-007-FB-FBR |
April 13, 2021 | Removal | Uphold | Netherlands | Hate speech | 2021-002-FB-UA |
April 29, 2021 | Removal | Overturn | India | Dangerous individuals and organizations | 2021-003-FB-UA |
May 5, 2021 | Account suspension | Uphold | United States | Dangerous individuals and organizations | 2021-001-FB-FBR |
July 10, 2021 | Removal | Overturn | Russia | Bullying and harassment | 2021-004-FB-UA |
Facebook's introduction of the Oversight Board elicited a variety of responses, with St. John's University law professor Kate Klonick describing its creation as an historic endeavor, [65] and technology news website The Verge deeming it "a wild new experiment in platform governance". [56] Politico described it as "an unapologetically globalist mix of academic experts, journalists and political figures". [15]
Even before the board made its first decisions, critics speculated that the board would be too strict, too lenient, or otherwise ineffective. In May 2020, Republican Senator Josh Hawley described the board as a "special censorship committee". [66] Other critics expressed doubts that it would be effective, leading to the creation of an unrelated and unaffiliated group of "vocal Facebook critics" calling itself the "Real Facebook Oversight Board". [56] Facebook issued no official comment on the effort, while Slate described it as "a citizen campaign against the board". [7]
Legal affairs blogger Evelyn Douek noted that the board's initial decisions "strike at matters fundamental to the way Facebook designs its content moderation system and clearly signal that the FOB does not intend to play mere occasional pitstop on Facebook's journey to connect the world". [66]