Gonzalez v. Google LLC

Gonzalez v. Google LLC
Argued February 21, 2023
Decided May 18, 2023
Full case name: Reynaldo Gonzalez, et al., v. Google LLC
Docket no. 21-1333
Citations: 598 U.S. 617
Holding
The Ninth Circuit’s judgment—which held that plaintiffs’ complaint was barred by §230 of the Communications Decency Act—is vacated, and the case is remanded for reconsideration in light of the Court’s decision in Twitter, Inc. v. Taamneh.
Court membership
Chief Justice
John Roberts
Associate Justices
Clarence Thomas  · Samuel Alito
Sonia Sotomayor  · Elena Kagan
Neil Gorsuch  · Brett Kavanaugh
Amy Coney Barrett  · Ketanji Brown Jackson
Case opinion
Per curiam
Laws applied
Justice Against Sponsors of Terrorism Act

Gonzalez v. Google LLC, 598 U.S. 617 (2023), was a case at the Supreme Court of the United States which dealt with the question of whether recommender systems are covered by the liability exemption that Section 230 of the Communications Act of 1934 (established by section 509 of the Telecommunications Act of 1996) provides to Internet service providers (ISPs) in dealing with terrorism-related content posted by users and hosted on their servers. [1] [2] The case was granted certiorari alongside another case involving Section 230 and terrorism-related content, Twitter, Inc. v. Taamneh.

In May 2023, the Court ruled unanimously in Twitter that the claims against the social media companies were not permissible under antiterrorism law. Gonzalez was sent back to the lower courts in a per curiam decision with instructions to consider the Court's ruling in Twitter. [3]

Background

In November 2015, a series of coordinated terrorist attacks occurred in Paris. At least 130 people were killed, and the Islamic State claimed responsibility for the attacks.

Among those killed was the sole American victim, 23-year-old student Nohemi Gonzalez, who was on an exchange program with California State University, Long Beach. Her family sought legal remedies against Google, the parent company of YouTube. Their suit argued that, through its recommendation system that tailors content based on user profiles, YouTube led users toward recruitment videos for the Islamic State and was therefore partially responsible for Nohemi's death. [4] Google defended itself by relying on Section 230, passed as part of the Telecommunications Act of 1996, which provides immunity from liability for content published on an Internet service provider's platform by third-party users. A lower court ruled in favor of Google, and the decision was upheld by the Ninth Circuit Court of Appeals. [5]

In their appeal to the Supreme Court, the family focused on YouTube's algorithm, which is tailored to deliver content believed to be of interest to the end user, arguing that although this tailoring is automatic, it is a form of moderation that Section 230 does not fully cover. They wrote in their petition to the Supreme Court, "Whether Section 230 applies to these algorithm-generated recommendations is of enormous practical importance. Interactive computer services constantly direct such recommendations, in one form or another, at virtually every adult and child in the United States who uses social media." [6]

Some leading figures in both major U.S. political parties want the law changed, but for different reasons. Many Democrats point to concerns about the proliferation of content that is harmful to children, while many Republicans worry about conservative viewpoints being blocked on certain sites. [7]

Supreme Court

The Supreme Court granted certiorari in October 2022, along with the related case Twitter, Inc. v. Taamneh, which also dealt with Section 230 and terrorism-related content. They were the first cases the Court heard concerning Section 230, which since around 2015 had come under increasing partisan criticism amid broader scrutiny of Big Tech. Justice Clarence Thomas had spoken of a need to review Section 230 in previous dissenting statements to court orders, arguing that social media companies should be regulated like "common carriers", which would prohibit content-based discrimination. [5]

Many of the Big Tech companies filed amicus curiae briefs supporting Google's recommender system, as did smaller sites such as Reddit and the Wikimedia Foundation, which rely on moderation systems that partially incorporate user moderation. While the briefs expressed general support for updating Section 230 to reflect modern concerns, they broadly stressed the need to let Congress pass legislation rather than having the Supreme Court issue its own judgment. [8] This position was also endorsed by Ron Wyden and Christopher Cox, the lawmakers behind Section 230; by law professor Eric Goldman, who has written extensively about Section 230; and by platforms such as Yelp and Craigslist and free speech advocacy groups such as the ACLU and the Electronic Frontier Foundation. [9] [7]

Briefs in support of Gonzalez's position came from several Republican members of Congress, including Ted Cruz, Mike Johnson, and Josh Hawley. Some advocacy groups, such as the Anti-Defamation League, argued that Google and other Big Tech companies have used Section 230 to remain immune for their own conduct, while also defending robust Section 230 protection for moderation decisions. Groups that support child protections on the Internet also filed briefs for Gonzalez. [9] [7]

Oral arguments in Gonzalez were held on February 21, 2023. Observers found Justices on both the liberal and conservative sides questioning the issues around algorithms, noting that most Internet services are based on them. The Justices also questioned whether YouTube's algorithm was specifically tailored to promote terrorism-related content. [10] The Justices were unsure whether it would be possible to delineate content further, and raised the potential for a flood of lawsuits and economic harm should Section 230 be changed. Justice Amy Coney Barrett suggested that the result of the related Twitter case might help resolve the case against Google. [11]

The Court issued decisions in both Gonzalez and Twitter on May 18, 2023. In Twitter, the Court unanimously held that the families' claims against the social media companies were not allowable under the Antiterrorism Act, and it made no ruling related to Section 230. Subsequently, in the per curiam order issued in Gonzalez, the Court vacated the Ninth Circuit's decision and remanded the case for reconsideration in light of the Twitter decision. [12]


References

  1. "In Gonzalez v. Google, SCOTUS Has Chance To Clarify Section 230's Meaning". Newsweek. January 27, 2023.
  2. "Israeli NGO gets US Supreme Court nod in bid to hold social media accountable for terror". Israel Hayom. October 9, 2022. Retrieved February 21, 2023.
  3. "Supreme Court sidesteps ruling on scope of internet companies' immunity from lawsuits over user content". NBC News. May 18, 2023.
  4. Liptak, Adam; McCabe, Dave (October 3, 2022). "Supreme Court Takes Up Challenge to Social Media Platforms' Shield". The New York Times. Retrieved October 3, 2022.
  5. Kern, Rebecca (October 3, 2022). "SCOTUS to hear challenge to Section 230 protections". Politico. Retrieved October 3, 2022.
  6. "Social Media Company Liability Draws Supreme Court Scrutiny". MSN. Retrieved October 4, 2022.
  7. McKinnon, John D. (February 20, 2023). "Google Case Heads to Supreme Court With Powerful Internet Shield Law at Stake". Wall Street Journal. Retrieved February 21, 2023.
  8. Ryan-Mosley, Tate (February 1, 2023). "How the Supreme Court ruling on Section 230 could end Reddit as we know it". MIT Technology Review. Retrieved February 1, 2023.
  9. Barr, Kyle (January 23, 2023). "Tech Groups, Politicians, and Reddit Evangelize Section 230 to Supreme Court". Gizmodo. Retrieved February 5, 2023.
  10. Chung, Andrew; Kruzel, John (February 21, 2023). "U.S. Supreme Court torn over challenge to internet firms' legal shield". Reuters. Retrieved February 21, 2023.
  11. Fung, Brian; Sneed, Tierney (February 21, 2023). "Takeaways from the Supreme Court's hearing in blockbuster internet speech case". CNN. Retrieved February 21, 2023.
  12. Hurley, Lawrence (May 18, 2023). "Supreme Court sidesteps ruling on scope of internet companies' immunity from lawsuits over user content". NBC News. Retrieved May 18, 2023.