Algorithms of Oppression

Author: Safiya Noble
Country: United States
Language: English
Subject: Racism, algorithms
Genre: Non-fiction
Published: February 2018
Publisher: NYU Press
Pages: 256 pp
ISBN: 978-1-4798-4994-9 (hardcover)

Algorithms of Oppression: How Search Engines Reinforce Racism is a 2018 book by Safiya Umoja Noble in the fields of information science, machine learning, and human-computer interaction.[1][2][3][4]

Background

Noble earned an undergraduate degree in sociology from California State University, Fresno in the 1990s, then worked in advertising and marketing for fifteen years before going to the University of Illinois Urbana-Champaign for a Master of Library and Information Science degree in the early 2000s.[5] The book's first inspiration came in 2011, when Noble Googled the phrase "black girls" and saw results for pornography on the first page.[5] Noble's doctoral thesis, completed in 2012, was titled "Searching for Black girls: Old traditions in new media."[6] At this time, Noble thought of the title "Algorithms of Oppression" for the eventual book.[7] By this time, changes to Google's algorithm had changed the most common results for a search of "black girls," though the underlying biases remained influential.[8] Noble became an assistant professor at the University of California, Los Angeles in 2014.[9] In 2017, she published an article on racist and sexist bias in search engines in The Chronicle of Higher Education.[9][10] The book was published on February 20, 2018.[11]

Overview

Algorithms of Oppression is based on more than six years of academic research on Google search algorithms, examining search results from 2009 to 2015.[12] The book addresses the relationship between search engines and discriminatory biases. Noble argues that search algorithms are racist and perpetuate societal problems because they reflect the negative biases of society and of the people who create them.[13][14][15] Noble dismantles the idea that search engines are inherently neutral by explaining how their algorithms privilege whiteness, returning positive cues for keywords like "white" as opposed to "Asian," "Hispanic," or "Black." Her central example contrasts the search results for "Black girls" with those for "white girls" and the biases depicted in each.[16] These algorithms can embed negative biases against women of color and other marginalized populations, while also affecting Internet users in general by leading to "racial and gender profiling, misrepresentation, and even economic redlining." The book argues that algorithms perpetuate oppression and discriminate against people of color, specifically women of color.

Noble takes a Black intersectional feminist approach to studying how Google's algorithms affect people differently by race and gender. Intersectional feminism takes into account the diverse experiences of women of different races and sexualities when discussing their oppression in society, and how their distinct backgrounds shape their struggles. Noble's argument also addresses how racism infiltrates the Google algorithm itself, as it does many other computational systems, including facial recognition and medical care programs.[17] While many new technological systems promote themselves as progressive and unbiased, Noble argues that many technologies, including Google's algorithm, "reflect and reproduce existing inequities."[18]

Chapter Summaries

Chapter 1

In Chapter 1 of Algorithms of Oppression, Safiya Noble explores how Google Search's results and autosuggestions can be degrading. On September 18, 2011, a mother googled "black girls" while trying to find fun activities to show her stepdaughter and nieces. To her surprise, the results were dominated by websites and images of pornography. This result exemplifies the data failures specific to people of color and women, which Noble terms "algorithmic oppression." Noble adds that, as a society, we must bring a feminist lens with racial awareness to understand the "problematic positions about the benign instrumentality of technologies."[19]

Noble also discusses how Google could apply human curation to the first page of results to eliminate potential racial slurs or inappropriate imagery. Another example discussed is the public dispute over the results returned when "Jew" was searched on Google. The results included a number of anti-Semitic pages, and Google claimed little ownership for the way it provided these identities. Google instead encouraged people to search "Jews" or "Jewish people" and claimed that the actions of white supremacist groups were beyond its control.[16] Unless pages are unlawful, Google allows its algorithm to continue to act without removing them.

Noble then reflects on AdWords, Google's advertising tool, and how it can add to the biases on Google. AdWords allows anyone to advertise on Google's search pages and is highly customizable.[20] Google first ranks ads on relevance and then displays them on pages it believes are relevant to the search query taking place. An advertiser can also set a maximum amount of money to spend on advertising per day. The more an advertiser spends, the higher the probability that its ad will appear near the top. Consequently, a controversial ad from a well-funded, motivated advertiser may be the first result a Google search displays.
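The ranking dynamic described above, where both spending and perceived relevance push an ad toward the top, can be sketched as a toy auction. This is an illustrative simplification assuming a "bid times relevance" score (the ad names, bids, and scores below are invented for the example), not Google's actual implementation:

```python
def ad_rank(bid: float, relevance: float) -> float:
    """Toy ad rank: a higher bid and a higher relevance score both raise placement."""
    return bid * relevance

# Hypothetical advertisers competing for the same search query.
ads = [
    {"name": "A", "bid": 2.00, "relevance": 0.4},
    {"name": "B", "bid": 0.50, "relevance": 0.9},
    {"name": "C", "bid": 3.00, "relevance": 0.8},
]

# Sort by descending rank; the top slot goes to the highest bid x relevance.
ranked = sorted(ads, key=lambda a: ad_rank(a["bid"], a["relevance"]), reverse=True)
```

In this sketch, advertiser C wins the top slot (rank 2.4) despite B having the most relevant ad, which captures Noble's point that a well-funded advertiser can outrank more relevant content.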

Chapter 2

In Chapter 2 of Algorithms of Oppression, Noble argues that Google has exacerbated racism while continuing to deny responsibility for it. Google puts the blame on those who create the content and on those who actively seek it out. Google's algorithm has maintained social inequalities and stereotypes for Black, Latina, and Asian women, largely because Google's design and infrastructure normalize whiteness and men. She explains that the algorithm categorizes information in ways that exacerbate stereotypes while encouraging white hegemonic norms. Noble found that searching for "black girls" first returned common stereotypes of Black girls, categories Google created based on its own idea of a Black girl. Google hides behind an algorithm that has been shown to perpetuate inequalities.

Chapter 3

In Chapter 3 of Algorithms of Oppression, Safiya Noble discusses how Google's search engine combines multiple sources to create threatening narratives about minorities. She presents a case study in which she searched "black on white crimes" on Google.[21] Noble highlights that the results pointed to conservative sources that skewed information, displaying racist and anti-Black material from white supremacist sources. Ultimately, she argues that this readily available false information helped fuel the actions of white supremacist Dylann Roof, who committed the 2015 Charleston church massacre.

Chapter 4

In Chapter 4 of Algorithms of Oppression, Noble furthers her argument by discussing the ways in which Google exerts oppressive control over identity. The chapter highlights multiple examples of women being shamed for their activity in the porn industry, whether or not it was consensual. She critiques the Internet's ability to influence one's future because of its permanent nature, and compares U.S. privacy laws to those of the European Union, which provide citizens with "the right to forget or be forgotten."[22] When using search engines such as Google, these breaches of privacy disproportionately affect women and people of color. Google claims it safeguards our data to protect us from losing our information, but fails to address what happens when a user wants that data deleted.

Chapter 5

In Chapter 5 of Algorithms of Oppression, Noble moves the discussion away from Google and onto other information sources deemed credible and neutral. Noble argues that prominent libraries, including the Library of Congress, encode whiteness, heteronormativity, patriarchy, and other societal standards as correct, and alternatives as problematic. She illustrates this problem with a case between Dartmouth College and the Library of Congress in which the "student-led organization the Coalition for Immigration Reform, Equality (CoFired) and DREAMers" engaged in a two-year battle to change the Library's terminology from 'illegal aliens' to 'noncitizen' or 'unauthorised immigrants.'[20] Noble then discusses the problems that ensue from misrepresentation and classification, which allows her to underscore the importance of contextualisation. Noble argues that it is not just Google but all digital search engines that reinforce societal structures and discriminatory biases, pointing out how interconnected technology and society are.[23]

Chapter 6

In Chapter 6 of Algorithms of Oppression, Safiya Noble discusses possible solutions for the problem of algorithmic bias. She first argues that public policies enacted by local and federal governments will reduce Google's “information monopoly” and regulate the ways in which search engines filter their results. She insists that governments and corporations bear the most responsibility to reform the systemic issues leading to algorithmic bias.

At the same time, Noble condemns the common neoliberal argument that algorithmic biases will disappear if more women and racial minorities enter the industry as software engineers. She calls this argument "complacent" because it places responsibility on individuals, who have less power than media companies, and because it indulges a mindset she calls "big-data optimism": a failure to recognize that institutions do not always solve inequalities, and sometimes perpetuate them. To illustrate this point, she uses the example of Kandis, a Black hairdresser whose business faces setbacks because the review site Yelp has used biased advertising practices and search strategies against her.

She closes the chapter by calling upon the Federal Communications Commission (FCC) and the Federal Trade Commission (FTC) to "regulate decency," that is, to limit the amount of racist, homophobic, or prejudiced rhetoric on the Internet. She urges the public to shy away from "colorblind" ideologies of race because they have historically erased the struggles faced by racial minorities. Lastly, she points out that big-data optimism leaves out discussion of the harms that big data can disproportionately enact upon minority communities.

Conclusion

In Algorithms of Oppression, Safiya Noble explores the social and political implications of the results of our Google searches and of our search patterns online. Noble challenges the idea of the Internet as a fully democratic or post-racial environment. Each chapter examines a different layer of the algorithmic biases formed by search engines. By outlining crucial points and theories throughout, the book remains accessible beyond academic readers, allowing Noble's writing to reach a wider and more inclusive audience.

Critical reception

Critical reception for Algorithms of Oppression has been largely positive. In the Los Angeles Review of Books, Emily Drabinski writes, "What emerges from these pages is the sense that Google’s algorithms of oppression comprise just one of the hidden infrastructures that govern our daily lives, and that the others are likely just as hard-coded with white supremacy and misogyny as the one that Noble explores."[24] In PopMatters, Hans Rollman writes that Algorithms of Oppression "demonstrate[s] that search engines, and in particular Google, are not simply imperfect machines, but systems designed by humans in ways that replicate the power structures of the western countries where they are built, complete with all the sexism and racism that are built into those structures."[1] In Booklist, reviewer Lesley Williams states, "Noble’s study should prompt some soul-searching about our reliance on commercial search engines and about digital social equity."[25]

In early February 2018, Algorithms of Oppression received press attention when the official Twitter account for the Institute of Electrical and Electronics Engineers expressed criticism of the book, saying that the results of a Google search suggested in its blurb did not match Noble's predictions. IEEE's outreach historian, Alexander Magoun, later revealed that he had not read the book, and issued an apology. [14]


References

  1. "Don't Google It! How Search Engines Reinforce Racism". PopMatters. 2018-01-30. Retrieved 2018-03-24.
  2. Fine, Cordelia (7 March 2018). "Coded prejudice: how algorithms fuel injustice". Financial Times. Retrieved 2018-05-10.
  3. "Opinion | Noah Berlatsky: How search algorithms reinforce racism and sexism". NBC News. Retrieved 2018-05-10.
  4. "How search engines are making us more racist". Vox. Retrieved 2018-05-10.
  5. Munro, Donald (2018-04-19). "When Google gets it wrong". The Munro Review. Retrieved 2021-10-05.
  6. Daniels, Jessie; Gregory, Karen; Cottom, Tressie McMillan (2017). Digital Sociologies. Policy Press. p. 420. ISBN 978-1-4473-2901-5.
  7. "In 'Algorithms of Oppression,' Safiya Noble finds old stereotypes persist in new media". annenberg.usc.edu. Retrieved 2021-10-05.
  8. "a book review by Robert Fantina: Algorithms of Oppression: How Search Engines Reinforce Racism". www.nyjournalofbooks.com. Retrieved 2021-10-05.
  9. "Safiya Umoja Noble Receives Top Honor from Fresno State | UCLA GSE&IS Ampersand". 2019-02-07. Archived from the original on 2019-02-07. Retrieved 2021-10-05.
  10. Noble, Safiya U. (2017-01-15). "Google and the Misinformed Public". www.chronicle.com. Archived from the original on 2020-07-23. Retrieved 2021-10-05.
  11. "Algorithms of Oppression". Kirkus Reviews.
  12. Erigha, Maryann (2019-07-01). "Algorithms of Oppression: How Search Engines Reinforce Racism by Safiya Umoja Noble". American Journal of Sociology. 125 (1): 305–307. doi:10.1086/703431. ISSN   0002-9602. S2CID   198603932.
  13. Noble's main focus is on Google's algorithms, although she also discusses Amazon, Facebook, Twitter, and WordPress. She examines the control these platforms exert over what users see and don't see. "Search results reflects the values and norms of the search companies commercial partners and advertisers and often reflect our lowest and most demeaning beliefs, because these ideas circulate so freely and so often that they are normalized and extremely profitable." (Noble, 36)
  14. "Scholar sets off Twitter furor by critiquing a book he hasn't read". Retrieved 2018-02-08.
  15. "Can an algorithm be racist? Spotting systemic oppression in the age of Google". Digital Trends. 2018-03-03. Retrieved 2018-03-24.
  16. Noble, Safiya (2018). Algorithms of Oppression: How Search Engines Reinforce Racism. New York: New York University Press. ch. 2. ISBN 978-1-4798-3364-1.
  17. D’Ignazio, C.; Klein, L. (2019). Data Feminism. MIT Press. pp. 21–47 ("The Power Chapter").
  18. Benjamin, R. (2019). Race After Technology: Abolitionist Tools for the New Jim Code. Medford, MA: Polity. p. 3.
  19. Noble, Safiya (2018). Algorithms of oppression: How search engines reinforce racism. New York, NY, US: New York University Press. p. 230. ISBN   978-1-4798-3364-1.
  20. Noble, Safiya Umoja (20 February 2018). Algorithms of Oppression: How Search Engines Reinforce Racism. New York. pp. 134–135. ISBN 9781479837243. OCLC 987591529.
  21. Noble, Safiya Umoja (20 February 2018). Algorithms of oppression: how search engines reinforce racism. New York. p. 112. ISBN   978-1-4798-3724-3. OCLC   987591529.
  22. Noble, Safiya Umoja (2018). Algorithms of oppression : how search engines reinforce racism. New York. p. 121. ISBN   978-1-4798-3364-1. OCLC   1017736697.
  23. Noble, Safiya Umoja (20 February 2018). Algorithms of oppression : how search engines reinforce racism. New York. ISBN   978-1-4798-3724-3. OCLC   987591529.
  24. "Ideologies of Boring Things: The Internet and Infrastructures of Race - Los Angeles Review of Books". Los Angeles Review of Books. Retrieved 2018-03-24.
  25. Algorithms of Oppression: How Search Engines Reinforce Racism, by Safiya Umoja Noble | Booklist Online.