Sara Wachter-Boettcher


Sara Wachter-Boettcher is an author, consultant, and speaker. [1] She is the author of Technically Wrong [2] and Content Everywhere, and the co-author, with Eric Meyer, of Design for Real Life. [3] Her book Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech was recommended by Wired magazine as one of the best tech books of 2017 [4] and by Fast Company as one of the best business and leadership books of 2017. [5]

She has written for various newspapers and magazines, including Slate [6] and The Guardian. [7] [8] [9]

Wachter-Boettcher is considered an expert on FemTech and the lack of diversity in technology in general. [10] [11] [12] Her works have also received attention in academic literature on technology and algorithms. [13] [14] [15]

Related Research Articles

Eric A. Meyer: Web design consultant and author

Eric A. Meyer is an American web design consultant and author. He is best known for his advocacy work on behalf of web standards, most notably CSS, a technique for managing how HTML is displayed. Meyer has written a number of books and articles on CSS and given many presentations promoting its use. Meyer currently works for Igalia.

Technological fix: Attempt at using engineering or technology to solve a problem

A technological fix, technical fix, technological shortcut or (techno-)solutionism refers to attempts to use engineering or technology to solve a problem.

The ethics of artificial intelligence is the branch of the ethics of technology specific to artificially intelligent systems. It is sometimes divided into a concern with the moral behavior of humans as they design, make, use and treat artificially intelligent systems, and a concern with the behavior of machines, in machine ethics.

Sexism in the technology industry is overt, subtle, or covert occupational sexism which makes the technology industry less friendly, less accessible, and less profitable for women. While the participation of women in the tech industry varies by region, it is generally around 4% to 20% depending on the measure used. Possible causes that have been studied by researchers include gender stereotypes, investment influenced by those beliefs, a male-dominated environment, a lack of awareness about sexual harassment, and the culture of the industry itself. Margaret O'Mara, a professor of history at the University of Washington, concluded in 2019 that Silicon Valley is a uniquely influential locale that is shaping our world, but she pointed to problematic failures regarding diversity: male oligopolies of high-tech power have recreated traditional environments that repress the talents and ambitions of women, people of color, and other minorities, to the benefit of white and Asian males.

Ruha Benjamin: American sociologist

Ruha Benjamin is a sociologist and a professor in the Department of African American Studies at Princeton University. The primary focus of her work is the relationship between innovation and equity, particularly focusing on the intersection of race, justice and technology. Benjamin is the author of numerous publications, including the books People's Science: Bodies and Rights on the Stem Cell Frontier (2013), Race After Technology: Abolitionist Tools for the New Jim Code (2019) and Viral Justice: How We Grow the World We Want (2022).

The comments section is a feature on most online blogs, news websites, and other websites in which the publishers invite the audience to comment on the published content. This is a continuation of the older practice of publishing letters to the editor; unlike that practice, however, comments sections can be used for more direct discussion between readers.

Naomi Wu: Chinese DIY maker and internet personality

Naomi Wu, also known as Sexy Cyborg, is a Chinese DIY maker and internet personality. As an advocate of women in STEM, transhumanism, open source hardware, and body modification, she attempts to challenge gender and tech stereotypes with a flamboyant public persona, using objectification of her appearance to inspire women.

Algorithmic bias: Technological phenomenon with social implications

Algorithmic bias describes systematic and repeatable errors in a computer system that create "unfair" outcomes, such as "privileging" one category over another in ways different from the intended function of the algorithm.

Algorithms of Oppression: 2018 book by Safiya Umoja Noble

Algorithms of Oppression: How Search Engines Reinforce Racism is a 2018 book by Safiya Umoja Noble in the fields of information science, machine learning, and human-computer interaction.

Meredith Broussard: Data journalism professor

Meredith Broussard is a data journalism professor at the Arthur L. Carter Journalism Institute at New York University. Her research focuses on the role of artificial intelligence in journalism.

Tabitha Goldstaub: British tech entrepreneur

Tabitha Goldstaub is a British tech entrepreneur who specialises in communicating the impact of artificial intelligence. She is the co-founder of CogX, a festival and online platform. She is also the chair of the UK government's AI Council, a member of the DCMS Digital Economy Council, and on the TechUK board. A serial entrepreneur, she was the co-founder of the video distribution company Rightster. Goldstaub is the author of How To Talk To Robots - A Girls' Guide to a World Dominated by AI. She is also an advisor to Tortoise Media, Raspberry Pi, CarbonRe, Monumo, Cambridge Innovation Capital and The Alan Turing Institute.

Joy Buolamwini: Computer scientist and digital activist

Joy Adowaa Buolamwini is a Ghanaian-American-Canadian computer scientist and digital activist based at the MIT Media Lab. Buolamwini introduces herself as a poet of code, daughter of art and science. She founded the Algorithmic Justice League, an organization that works to challenge bias in decision-making software, using art, advocacy, and research to highlight the social implications and harms of artificial intelligence (AI).

Hinge (app): American online dating app

Hinge is an online dating application. Using an algorithm, the app displays potential matches, allowing the user to dismiss or attempt to match by responding to a specific piece of content on their profile. The service emphasizes uploading user-generated content in a variety of formats, such as photos, videos, and "prompts" as a way to express personality and appearance. The app is fully owned by Match Group as of February 2019.

Meredith Whittaker: American artificial intelligence research scientist

Meredith Whittaker is the president of the Signal Foundation and serves on their board of directors. She was formerly the Minderoo Research Professor at New York University (NYU), and the co-founder and faculty director of the AI Now Institute. She also served as a senior advisor on AI to Chair Lina Khan at the Federal Trade Commission. Whittaker was employed at Google for 13 years, where she founded Google's Open Research group and co-founded the M-Lab. In 2018, she was a core organizer of the Google Walkouts and resigned from the company in July 2019.

Julia Angwin: American investigative journalist

Julia Angwin is a Pulitzer Prize-winning American investigative journalist, New York Times bestselling author, and entrepreneur. She was a co-founder and editor-in-chief of The Markup, a nonprofit newsroom that investigates the impact of technology on society. She was a senior reporter at ProPublica from 2014 to April 2018 and a staff reporter at the New York bureau of The Wall Street Journal from 2000 to 2013. Angwin is the author of the non-fiction books Stealing MySpace: The Battle to Control the Most Popular Website in America (2009) and Dragnet Nation (2014). She is a winner of and a two-time finalist for the Pulitzer Prize in journalism.

Sandra Wachter: Researcher in data ethics, artificial intelligence, and robotics

Sandra Wachter is a professor and senior researcher in data ethics, artificial intelligence, robotics, algorithms and regulation at the Oxford Internet Institute. She is a former Fellow of The Alan Turing Institute.

Rachel Thomas (academic): American computer scientist

Rachel Thomas is an American computer scientist and founding Director of the Center for Applied Data Ethics at the University of San Francisco. Together with Jeremy Howard, she is co-founder of fast.ai. Thomas was selected by Forbes magazine as one of the 20 most incredible women in artificial intelligence.

Government by algorithm is an alternative form of government or social ordering where the usage of computer algorithms, especially of artificial intelligence and blockchain, is applied to regulations, law enforcement, and generally any aspect of everyday life such as transportation or land registration. The term "government by algorithm" appeared in academic literature as an alternative for "algorithmic governance" in 2013. A related term, algorithmic regulation, is defined as setting the standard, monitoring and modifying behaviour by means of computational algorithms – automation of judiciary is in its scope. In the context of blockchain, it is also known as blockchain governance.

Regulation of algorithms, or algorithmic regulation, is the creation of laws, rules, and public sector policies for the promotion and regulation of algorithms, particularly in artificial intelligence and machine learning. For the subset of AI algorithms, the term regulation of artificial intelligence is used. The regulatory and policy landscape for artificial intelligence (AI) is an emerging issue in jurisdictions globally, including in the European Union. Regulation of AI is considered necessary to both encourage AI and manage associated risks, but it remains challenging. Regulation of blockchain algorithms is another emerging topic, often discussed alongside regulation of AI algorithms. Many countries have enacted regulations of high-frequency trading, which is shifting into the realm of AI algorithms as technology progresses.

Black in AI: Technology research organization

Black in AI, formally called the Black in AI Workshop, is a technology research organization and affinity group, founded by computer scientists Timnit Gebru and Rediet Abebe in 2017. It started as a conference workshop, later pivoting into an organization. Black in AI increases the presence and inclusion of Black people in the field of artificial intelligence (AI) by creating space for sharing ideas, fostering collaborations, mentorship, and advocacy.

References

  1. "Sara Wachter-Boettcher". Sara Wachter-Boettcher. Retrieved 2019-03-10.
  2. Wachter-Boettcher, Sara (2017). Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech (First ed.). New York, NY. ISBN 9780393634631. OCLC 993134234.
  3. Meyer, Eric A.; Wachter-Boettcher, Sara (2016). Design for Real Life. New York. ISBN 9781937557409. OCLC 946884891.
  4. "The Top Tech Books of 2017: Part I". Wired. 2017-12-27. ISSN 1059-1028. Retrieved 2019-03-10.
  5. Horton, Anisa Purbasari (2017-12-20). "The Best Business And Leadership Books Of 2017". Fast Company. Retrieved 2019-03-10.
  6. Wachter-Boettcher, Sara (2017-11-14). "Technology Should Stop Catering to the "Average" User. We're All Edge Cases". Slate Magazine. Retrieved 2019-03-10.
  7. Wachter-Boettcher, Sara (2017-11-18). "How algorithms are pushing the tech giants into the danger zone". The Observer. ISSN 0029-7712. Retrieved 2019-03-10.
  8. Kuchler, Hannah (2018-03-09). "Tech's sexist algorithms and how to fix them". Financial Times. Retrieved 2019-03-10.
  9. Hoffmann, Anna Lauren (2017-10-17). "Biased tech design prompts a writer to call for resistance". Books, Et Al. Retrieved 2019-03-10.
  10. Tiffany, Kaitlyn (2018-11-13). "Who are period-tracking apps really built for?". Vox. Retrieved 2019-03-10.
  11. Sankaran, Vishwam (2018-11-27). "Google disables some Gmail smart suggestions because it can't fix AI gender bias". The Next Web. Retrieved 2019-03-10.
  12. "'Toxic tech': How Silicon Valley's lack of diversity leaks into its products". WTOP. 2017-10-10. Retrieved 2019-03-10.
  13. Zambonelli, Franco; Salim, Flora; Loke, Seng W.; De Meuter, Wolfgang; Kanhere, Salil (June 2018). "Algorithmic Governance in Smart Cities: The Conundrum and the Potential of Pervasive Computing Solutions". IEEE Technology and Society Magazine. 37 (2): 80–87. doi:10.1109/MTS.2018.2826080. hdl:11380/1167772. ISSN 0278-0097. S2CID 46937194.
  14. Jandrić, Petar; Ryberg, Thomas; Knox, Jeremy; Lacković, Nataša; Hayes, Sarah; Suoranta, Juha; Smith, Mark; Steketee, Anne; Peters, Michael (2018-10-27). "Postdigital Dialogue". Postdigital Science and Education. 1: 163–189. doi:10.1007/s42438-018-0011-x. ISSN 2524-485X.
  15. Pleasants, Jacob; Olson, Joanne K. (January 2019). "What is engineering? Elaborating the nature of engineering for K-12 education". Science Education. 103 (1): 145–166. doi:10.1002/sce.21483.