Safiya Noble

Selected publications

  • Noble, Safiya Umoja (2018). Algorithms of Oppression: How Search Engines Reinforce Racism. New York University Press. ISBN 9781479837243. OCLC 1029007986.
  • Noble, Safiya U.; Austin, Jeanie; Sweeney, Miriam E.; McKeever, Lucas; Sullivan, Elizabeth (2013). "Changing Course: Collaborative Reflections of Teaching/Taking "Race, Gender, and Sexuality in the Information Professions"". Journal of Education for Library and Information Science. 55 (3): 212–222. ISSN 0748-5786.
  • Noble, Safiya U. (15 January 2017). "Google and the Misinformed Public". The Chronicle of Higher Education. Retrieved 6 February 2019.

Edited volumes

  • Noble, Safiya Umoja; Tynes, Brendesha M., eds. (2016). The Intersectional Internet: Race, Sex, Class and Culture Online. New York: Peter Lang. ISBN 9781433130007. OCLC 918150002.
  • Tettegah, Sharon; Noble, Safiya, eds. (28 December 2015). Emotions, Technology, and Design. Academic Press. ISBN 9780128018392.

    Related Research Articles

    Class discrimination, also known as classism, is prejudice or discrimination on the basis of social class. It includes individual attitudes and behaviors, as well as systems of policies and practices that are set up to benefit the upper class at the expense of the lower class.

    Triple oppression, also called double jeopardy, Jane Crow, or triple exploitation, is a theory developed by black socialists in the United States, such as Claudia Jones. The theory states that a connection exists between various types of oppression, specifically classism, racism, and sexism. It hypothesizes that all three types of oppression need to be overcome at once.

    Robert Epstein: American psychologist and journalist (born 1953)

    Robert Epstein is an American psychologist, professor, author, and journalist. He was awarded a Ph.D. in psychology by Harvard University in 1981, was editor-in-chief of Psychology Today, and has held positions at several universities including Boston University, University of California, San Diego, and Harvard University. He is also the founder and director emeritus of the Cambridge Center for Behavioral Studies in Concord, MA. In 2012, he founded the American Institute for Behavioral Research and Technology (AIBRT), a nonprofit organization that conducts research to promote the well-being and functioning of people worldwide.

    Kimberlé Crenshaw: American academic and lawyer (born 1959)

    Kimberlé Williams Crenshaw is an American civil rights advocate and a scholar of critical race theory. She is a professor at the UCLA School of Law and Columbia Law School, where she specializes in race and gender issues.

    Racism on the Internet, sometimes also referred to as cyber-racism and more broadly considered a form of online or internet hate crime, consists of racist rhetoric or bullying distributed through computer-mediated means. It includes some or all of the following characteristics: ideas of racial uniqueness, racist attitudes towards specific social categories, racist stereotypes, hate speech, nationalism and common destiny, racial supremacy, superiority and separation, conceptions of racial otherness, and an anti-establishment worldview. Racism online can have the same effects as offensive remarks made face-to-face.

    Filter bubble: Intellectual isolation through internet algorithms

    A filter bubble or ideological frame is a state of intellectual isolation that can result from personalized searches, recommendation systems, and algorithmic curation. The search results are based on information about the user, such as their location, past click behavior, and search history. Consequently, users become separated from information that disagrees with their viewpoints, effectively isolating them in their own cultural or ideological bubbles and giving them a limited and customized view of the world. The choices made by these algorithms are not always transparent. Prime examples include Google Personalized Search results and Facebook's personalized news stream.
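    As a minimal sketch of the mechanism described above, the following Python fragment re-ranks a set of search results by a hypothetical user-affinity boost; the topics, scores, and boost rule are invented for illustration and do not describe any real search engine's ranking function.

        # Hypothetical personalized re-ranking: items similar to what the user
        # clicked before are boosted, so disagreeing items sink in the results.
        from collections import Counter

        def personalized_rank(results, click_history):
            affinity = Counter(click_history)          # topics the user clicked before
            def score(result):
                base = result["relevance"]             # query-only relevance
                boost = affinity[result["topic"]]      # past-behavior boost
                return base + boost
            return sorted(results, key=score, reverse=True)

        results = [
            {"title": "Opposing view", "topic": "politics/right", "relevance": 0.9},
            {"title": "Familiar view", "topic": "politics/left",  "relevance": 0.7},
        ]
        # A user who only ever clicked "politics/left" now sees the familiar view
        # first, even though the opposing view scored higher on relevance alone.
        print(personalized_rank(results, ["politics/left"] * 5))

    Repeated over many queries, this kind of feedback loop is what progressively narrows the range of viewpoints a user encounters.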

    Timeline of women in computing

    This is a timeline of women in computing. It covers the period when women worked as "human computers" and then as programmers of physical computers. Eventually, women programmers went on to write software, develop Internet technologies, and work in other areas of programming. Women have also been involved in computer science, various related types of engineering, and computer hardware.

    Ruha Benjamin: American sociologist

    Ruha Benjamin is a sociologist and a professor in the Department of African American Studies at Princeton University. The primary focus of her work is the relationship between innovation and equity, particularly the intersection of race, justice, and technology. Benjamin is the author of numerous publications, including the books People's Science: Bodies and Rights on the Stem Cell Frontier (2013), Race After Technology: Abolitionist Tools for the New Jim Code (2019), and Viral Justice: How We Grow the World We Want (2022).

    Algorithmic bias: Technological phenomenon with social implications

    Algorithmic bias describes systematic and repeatable errors in a computer system that create "unfair" outcomes, such as "privileging" one category over another in ways different from the intended function of the algorithm.
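    One common way such unequal outcomes are quantified is by comparing selection rates across groups. The sketch below uses made-up decisions and the illustrative four-fifths (0.8) threshold from US employment-discrimination guidance; it is not drawn from any particular system.

        # Toy audit of an automated decision: compare approval rates by group.
        decisions = [
            {"group": "A", "approved": True},
            {"group": "A", "approved": True},
            {"group": "A", "approved": False},
            {"group": "B", "approved": True},
            {"group": "B", "approved": False},
            {"group": "B", "approved": False},
        ]

        def selection_rate(group):
            rows = [d for d in decisions if d["group"] == group]
            return sum(d["approved"] for d in rows) / len(rows)

        rate_a, rate_b = selection_rate("A"), selection_rate("B")
        print(rate_a, rate_b, rate_b / rate_a)   # 0.67, 0.33, 0.5
        if rate_b / rate_a < 0.8:                # illustrative four-fifths rule
            print("Group B is approved far less often: a systematic, repeatable skew.")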

    Algorithms of Oppression: 2018 book by Safiya Umoja Noble

    Algorithms of Oppression: How Search Engines Reinforce Racism is a 2018 book by Safiya Umoja Noble in the fields of information science, machine learning, and human-computer interaction.

    Meredith Broussard: Data journalism professor

    Meredith Broussard is a data journalism professor at the Arthur L. Carter Journalism Institute at New York University. Her research focuses on the role of artificial intelligence in journalism.

    Joy Buolamwini: Computer scientist and digital activist

    Joy Adowaa Buolamwini is a Canadian-American computer scientist and digital activist formerly based at the MIT Media Lab. She founded the Algorithmic Justice League (AJL), an organization that works to challenge bias in decision-making software, using art, advocacy, and research to highlight the social implications and harms of artificial intelligence (AI).

    Julia Angwin: American investigative journalist

    Julia Angwin is an American investigative journalist, author, and entrepreneur. She co-founded and was editor-in-chief of The Markup, a nonprofit newsroom that investigates the impact of technology on society. She was a staff reporter at the New York bureau of The Wall Street Journal from 2000 to 2013, during which time she was on a team that won the Pulitzer Prize in journalism. She worked as a senior reporter at ProPublica from 2014 to April 2018, during which time she was a finalist for the Pulitzer Prize.

    Mary Chayko: American sociologist

    Mary Chayko is an American sociologist and Distinguished Teaching Professor of Communication and Information at Rutgers University. She is the director of Undergraduate Interdisciplinary Studies at Rutgers University's School of Communication and Information and she was a six-year Faculty Fellow in Residence at the Rutgers-New Brunswick Honors College (2017-2023). She is an affiliated faculty member of the Sociology Department and Women's, Gender, and Sexuality Studies Department at Rutgers.

    Sarah T. Roberts: Professor of Library & Information Science, author, and scholar

    Sarah T. Roberts is a professor, author, and scholar who specializes in content moderation of social media. She is an expert in the areas of internet culture, social media, digital labor, and the intersections of media and technology. She coined the term "commercial content moderation" (CCM) to describe the work that paid content moderators do to enforce platforms' legal guidelines and content standards. Roberts wrote the book Behind the Screen: Content Moderation in the Shadows of Social Media.

    Coded Bias: 2020 American documentary film

    Coded Bias is an American documentary film directed by Shalini Kantayya that premiered at the 2020 Sundance Film Festival. The film includes contributions from researchers Joy Buolamwini, Deborah Raji, Meredith Broussard, Cathy O'Neil, Zeynep Tufekci, Safiya Noble, Timnit Gebru, Virginia Eubanks, Silkie Carlo, and others.

    Algorithmic Justice League: Digital advocacy non-profit organization

    The Algorithmic Justice League (AJL) is a digital advocacy non-profit organization based in Cambridge, Massachusetts. Founded in 2016 by computer scientist Joy Buolamwini, the AJL uses research, artwork, and policy advocacy to increase societal awareness of how artificial intelligence (AI) is used and of the harms and biases it can pose. The AJL has engaged in a variety of open online seminars, media appearances, and tech advocacy initiatives to communicate information about bias in AI systems and to promote industry and government action to mitigate the creation and deployment of biased AI systems. In 2021, Fast Company named the AJL one of the 10 most innovative AI companies in the world.

    Virginia Eubanks: American political scientist and author

    Virginia Eubanks is an American political scientist, professor, and author studying technology and social justice. She is an associate professor in the Department of Political Science at the University at Albany, SUNY. Previously Eubanks was a Fellow at New America researching digital privacy, economic inequality, and data-based discrimination.

    Kishonna Gray: American communication, gender, and Black studies researcher

    Kishonna L. Gray is an American communication and gender studies researcher based at the University of Michigan School of Information. Gray is best known for her research on technology, gaming, race, and gender. As an expert in Women's and Communication Studies, she has written several articles for publications such as the New York Times. In the academic year 2016–2017, she was a Visiting Assistant Professor in the Martin Luther King, Jr. Visiting Professors and Scholars Program at the Massachusetts Institute of Technology, hosted by the Department of Women's and Gender Studies and the MIT Comparative Media Studies/Writing Program. She has also been a faculty visitor at the Berkman Klein Center for Internet & Society at Harvard University and at Microsoft Research.

    Automated decision-making (ADM) involves the use of data, machines, and algorithms to make decisions in a range of contexts, including public administration, business, health, education, law, employment, transport, media, and entertainment, with varying degrees of human oversight or intervention. ADM draws on large-scale data from a range of sources, such as databases, text, social media, sensors, images, or speech, processed using technologies including computer software, algorithms, machine learning, natural language processing, artificial intelligence, augmented intelligence, and robotics. The increasing use of automated decision-making systems (ADMS) across a range of contexts presents many benefits and challenges to human society, requiring consideration of their technical, legal, ethical, societal, educational, economic, and health consequences.
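    A minimal sketch of the "varying degrees of human oversight" idea: fully automated approvals and denials at the extremes, with borderline cases routed to a human reviewer. The scoring rule and thresholds are invented for illustration and are not taken from any real ADM system.

        # Automated decision pipeline with a human-in-the-loop band (illustrative).
        def score_application(app):
            # Stand-in for a trained model: a simple weighted sum of two features.
            return 0.6 * app["income_ratio"] + 0.4 * app["history_score"]

        def decide(app, auto_approve=0.75, auto_deny=0.40):
            s = score_application(app)
            if s >= auto_approve:
                return "approved (no human review)"
            if s < auto_deny:
                return "denied (no human review)"
            return "routed to human reviewer"      # the varying-oversight band

        print(decide({"income_ratio": 0.9, "history_score": 0.8}))  # approved
        print(decide({"income_ratio": 0.5, "history_score": 0.5}))  # human review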

    Safiya Noble
    Known for: Algorithms of Oppression
    Awards: MacArthur Fellow
    Academic background
    Alma mater: California State University, Fresno; University of Illinois at Urbana-Champaign
    Thesis: Searching for black girls: old traditions in new media (2012)