Design justice

Design justice is an ethical and inclusive approach to designing products and systems that addresses and mitigates historical inequalities in order to ensure fair and equitable outcomes for all users. This article covers an overview of design justice, the ten principles of design justice, challenges in equitable design, and applications of design justice in artificial intelligence and human-computer interaction.

Overview

Design justice is defined by experts across interdisciplinary fields as a theory and practice of designing with action to mitigate risks and harms toward marginalized and oppressed communities. This action is taken by moving away from the idea of “designing for good” and toward designing with input from the affected communities. [1] [2] Design justice aims to balance the distribution of risks, harms, and benefits among users by including people whose lives will be impacted by the product or system, in addition to technical experts, in the design process. [3] [4] Design justice is built upon ten principles intended to ensure that communities can come together to create more equitable products.

Below are the ten design justice principles: [5]

  1. Use design to sustain, heal and empower communities, as well as to seek liberation from exploitative and oppressive systems.
  2. Center the voices of those who are directly impacted by the outcome of the design process.
  3. Prioritize design’s impact on the community over the intentions of the designer.
  4. View change as emergent from an accountable, accessible, and collaborative process rather than as a point at the end of a process.
  5. See the role of the designer as a facilitator rather than an expert.
  6. Believe that everyone is an expert based on their own lived experience and that all people have unique and brilliant contributions to bring to a design process.
  7. Share design knowledge and tools with communities.
  8. Work towards sustainable, community-led and -controlled outcomes.
  9. Work towards non-exploitative solutions that reconnect to the earth and each other.
  10. Before seeking new design solutions, look for what is already working at the community level. Honor and uplift traditional, indigenous, and local knowledge and practices.

Challenges

Equitable design in technology has been hard to achieve because of the overwhelming presence of heterosexual white men in product design leadership. [6] Designers in the tech industry stand in for the promise of innovation and entrepreneurialism, yet they work in one of the most unequal industries, with Black women making up only 1.7% of the tech workforce. [7] In an industry where meaningful efforts to diversify are not made and structural inequality is rarely mentioned or challenged, design justice is an important call to rupture current design practices.

Artificial intelligence and machine learning

Design justice in artificial intelligence is an ethical and inclusive approach to designing and building artificial intelligence technology that aligns with the ten principles outlined above. It emphasizes the need to consider the ethical, cultural, and social dimensions of designing technology in order to create a more equitable and positive impact on society. [8]

Designing artificial intelligence products

Design justice in artificial intelligence aims to transform the design process of artificial intelligence products. Currently, the design of artificial intelligence is inequitable because of who holds power in the design process, who the users of the technology are imagined to be, and where the design takes place. [8] The benefits of artificial intelligence technologies flow to individuals who traditionally hold more power in the design process, while the harms fall on individuals who are already marginalized. [9] Design justice in artificial intelligence aims to bridge the gap between these two groups and deconstruct the power dynamic in the artificial intelligence design process. [3]

Artificial intelligence products are built from training data, which depends on human effort to teach models and deliver algorithmic results. [10] A specific case is resume screening. Artificial intelligence has been used to automate and speed up resume screening, helping employers find candidates faster, ensuring qualified candidates are notified, and making the hiring process smoother on both sides. [11] However, if the resume dataset reflects a company's past biases, those biases will continue to be reproduced by the artificial intelligence screening tool. Amazon used a machine learning resume tool until its machine learning specialists found that it was discriminating against women, because the models were trained on resumes submitted to the company over a 10-year period, most of which came from men. [12] Design justice could be used in this context to evaluate the dataset and ensure that it aligns with Amazon's stated goal of having a diverse workforce.
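
The kind of dataset evaluation described above can be made concrete with a simple audit of selection rates before any model is trained. The following sketch uses hypothetical hiring records (the data, field names, and groups are illustrative and not drawn from the Amazon case) to compare hire rates across groups and compute a disparate-impact ratio; a value well below 1.0 signals the sort of historical skew that a design justice review would flag before the data is reused for screening.

```python
# Minimal sketch with hypothetical data: auditing a historical hiring dataset
# for skew before training a resume-screening model on it.
from collections import Counter

# Each record: (applicant_group, was_hired) -- a stand-in for years of past decisions.
historical_records = [
    ("men", True), ("men", True), ("men", False), ("men", True),
    ("women", False), ("women", True), ("women", False), ("men", True),
]

def selection_rates(records):
    """Hire rate per group; a model trained on these labels will tend to
    reproduce whatever disparity appears here."""
    totals, hires = Counter(), Counter()
    for group, hired in records:
        totals[group] += 1
        hires[group] += int(hired)
    return {group: hires[group] / totals[group] for group in totals}

rates = selection_rates(historical_records)
print(rates)                                       # e.g. {'men': 0.8, 'women': 0.33}
print(min(rates.values()) / max(rates.values()))   # disparate-impact ratio
```

A check like this does not by itself make a design just; under design justice principles it would be paired with input from the communities the screening tool affects.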

Transition to artificial intelligence products

In addition to shaping the design process of artificial intelligence products, design justice principles can be applied to how these products are used, as many industries transition to artificial intelligence. [13] [14] One example is the transition to artificial intelligence products in mental healthcare. [13] Researchers advocate for a more ethical and well-rounded approach to the implementation of artificial intelligence in healthcare, one that ensures that the people who ultimately use the product, the patients, are part of the decision-making process. [13] As the number and use of artificial intelligence products increase, experts in the field call for applying design justice principles not only to the design of these products but to their implementation as well. [15]

Human-computer interaction

Design justice in human-computer interaction (HCI) is a branch of design justice focused on design principles for digital interfaces. [16] Considering issues of power, privilege, and access in the design of digital products, interfaces, and systems is the cornerstone of HCI-centric design justice. [16] Design justice in HCI includes not only the development of accessible websites and interfaces but also a reexamination of the core principles and practices that HCI designers rely on. [4]

User interface and experience

Design justice has become increasingly prevalent in the user interface/user experience (UI/UX) field in recent years. [17] The goal of design justice in UI/UX is to increase the usability of interfaces and products by focusing on the needs of marginalized communities in the design process. [18]

Design justice principles encourage designers to involve users more in the design process in order to work towards this goal. One such strategy of increased user involvement is participatory design. This entails recruiting a set of users to test the product, surveying them on their experience with it, and adjusting the product design to accommodate their feedback. For this strategy to follow design justice principles, the pool of users should include representation from various marginalized communities. [17] This approach can help identify areas of a design where inclusivity can be improved. In a case study from the University of Illinois Urbana-Champaign, designers were able to create an interface inclusive of color-blind individuals after receiving feedback from their user pool. [19]

Another strategy, co-design, involves users from marginalized communities throughout the development process. In contrast to participatory design, users actively participate in the design lifecycle and directly contribute their experiences to the product. This lessens the divide between designers and users, allowing both to have equal ownership of and contribution to the product. Co-design also encourages designers and users alike to recognize and understand their own identities and how those identities may shape the product outcome. At the core of both participatory design and co-design is the aim for designers to be empathetic and educated in order to empower all users through interfaces and technological experiences. [17]
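
As an illustration of the kind of concrete accessibility check that feedback such as the color-blindness finding can prompt, the sketch below computes the WCAG 2.x contrast ratio between a foreground and background color. The hex values are hypothetical, and the cited case study did not publish code; this is only an example of turning participant feedback into a testable design criterion.

```python
# Minimal sketch: WCAG 2.x contrast check between two sRGB colors (hex values are hypothetical).
def relative_luminance(hex_color: str) -> float:
    """WCAG relative luminance of a color given as '#RRGGBB'."""
    channels = []
    for i in (1, 3, 5):
        c = int(hex_color[i:i + 2], 16) / 255
        channels.append(c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4)
    r, g, b = channels
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: str, bg: str) -> float:
    """Contrast ratio from 1:1 to 21:1; WCAG AA expects >= 4.5 for normal text."""
    lighter, darker = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

print(round(contrast_ratio("#777777", "#FFFFFF"), 2))  # ~4.48, just below the 4.5 AA threshold
```

Automated checks like this complement, rather than replace, the participatory feedback loop described above.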

Principles

By combining design justice with HCI principles, designers and researchers can contribute to technology that is not only usable but also promotes social justice, inclusivity, and intersectionality. [4] The goal of design justice within the HCI field is to generate social change through digital platforms. [4] Sasha Costanza-Chock outlines the problems within the field of HCI, and how designers can address them, in their article "Design Justice: towards an intersectional feminist framework for design theory and practice". One of the problems in interface design is inclusivity. Most designers make an assumption when creating interfaces: that their users have privileges such as U.S. citizenship, English language proficiency, access to broadband internet, a smartphone, no disabilities, and more. [4] HCI principles promote the continuous iteration and testing of designs with real people, and design justice suggests that including the intended audience of an interface in the design process will aid in the creation of inclusive interfaces. [20]

Another issue that Costanza-Chock highlights is intersectionality. When designers consider equality during development, most employ a single-axis framework, leading most interfaces to ignore the complex identities of marginalized peoples. [4] HCI principles encourage determining early on the users and tasks that a user interface will support, and design justice principles, such as principles 2 and 6, place the unique identities of individuals at the forefront of creating user interfaces. Costanza-Chock also emphasizes the effect that design justice has on the empowerment and participation of marginalized peoples. Many organizations have been created to support gender equality in STEM fields. [4] Design justice recognizes that employment in paid design fields for marginalized people is important, but that other aspects of design, such as who the intended beneficiaries of a design are, are equally important. [4] [16]
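
The difference between a single-axis evaluation and an intersectional one can be shown in a few lines of code. The sketch below uses hypothetical accuracy counts for an imagined classifier (the numbers are illustrative and not taken from any cited study) to show how aggregating along one attribute at a time can hide how poorly a system serves people at a particular intersection of attributes.

```python
# Minimal sketch with hypothetical counts: a single-axis audit can hide a disparity
# that only appears at the intersection of two attributes.
from itertools import product

# (gender, skin_type) -> (correct, total) for an imagined classifier.
results = {
    ("man", "lighter"): (98, 100), ("man", "darker"): (94, 100),
    ("woman", "lighter"): (96, 100), ("woman", "darker"): (70, 100),
}

def accuracy(groups):
    """Accuracy pooled over the given subgroup keys."""
    correct = sum(results[g][0] for g in groups)
    total = sum(results[g][1] for g in groups)
    return correct / total

# Single-axis view: each gender on its own looks only moderately unequal.
print("men:  ", accuracy([g for g in results if g[0] == "man"]))    # 0.96
print("women:", accuracy([g for g in results if g[0] == "woman"]))  # 0.83

# Intersectional view: the worst-served subgroup falls well below every single-axis number.
for subgroup in product(("man", "woman"), ("lighter", "darker")):
    print(subgroup, accuracy([subgroup]))
```

Reporting the worst-served subgroup, rather than only per-axis averages, is one way an intersectional framing changes what counts as an acceptable result.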

References

  1. Costanza-Chock, Sasha (March 2020). Design Justice. The MIT Press.
  2. Zielke, Julia; Morawe, Jan Marc; Aktan, Alev Nazli; Miani, Céline (2023-11-24). "'That Sounds to Me Like You Are Making This Too Complicated …': Reflections on a Social Media Recruitment Effort for a Study on Masculinities and Contraception". Qualitative Health Research. 34 (4): 280–286. doi:10.1177/10497323231203631. ISSN 1049-7323. PMID 37997352.
  3. Katyal, Sonia; Jung, Jessica (Jan 3, 2022). "The Gender Panopticon: AI, Gender, and Design Justice". UCLA Law Review.
  4. Costanza-Chock, Sasha (28 June 2018). "Design Justice: towards an intersectional feminist framework for design theory and practice". Design Research Society. DRS2018: Catalyst. 2. doi:10.21606/drs.2018.679. ISBN 978-1-912294-27-5.
  5. "Design Justice Network". Design Justice Network.
  6. "Diversity in High Tech". www.eeoc.gov. Retrieved 5 December 2023.
  7. Jackson, Ashton (24 February 2022). "Black employees make up just 7.4% of the tech workforce—these nonprofits are working to change that". CNBC. Retrieved 5 December 2023.
  8. Costanza-Chock, Sasha (27 July 2018). "Design Justice, A.I., and Escape from the Matrix of Domination". Journal of Design and Science. 3 (5). doi:10.21428/96c8d426. Retrieved 5 December 2023.
  9. Kalluri, Pratyusha (2020). "Don't ask if artificial intelligence is good or fair, ask how it shifts power". Nature. 583 (7815): 169. Bibcode:2020Natur.583..169K. doi:10.1038/d41586-020-02003-2. PMID 32636520.
  10. Megorskaya, Olga. "Council Post: Training Data: The Overlooked Problem Of Modern AI". Forbes. Retrieved 6 December 2023.
  11. Hu, James. "Over 98% of Fortune 500 Companies Use Applicant Tracking Systems (ATS) - Jobscan Blog". Jobscan Blog. Retrieved 6 December 2023.
  12. Dastin, Jeffrey. "Amazon scraps secret AI recruiting tool that showed bias against women". Reuters. Retrieved 6 December 2023.
  13. Zidaru, Theodor; Morrow, Elizabeth (August 2021). "Ensuring patient and public involvement in the transition to AI-assisted mental health care: A systematic scoping review and agenda for design justice". Health Expectations. 24 (4): 1072–1124. doi:10.1111/hex.13299. PMC 8369091. PMID 34118185.
  14. Obermeyer, Ziad; Emanuel, Ezekiel J. (2016-09-29). "Predicting the Future — Big Data, Machine Learning, and Clinical Medicine". New England Journal of Medicine. 375 (13): 1216–1219. doi:10.1056/NEJMp1606181. ISSN 0028-4793. PMC 5070532. PMID 27682033.
  15. Bohr, Adam; Memarzadeh, Kaveh (2020-01-01). "Chapter 2 - The rise of artificial intelligence in healthcare applications". In Bohr, Adam; Memarzadeh, Kaveh (eds.). Artificial Intelligence in Healthcare. Academic Press: 25–60. doi:10.1016/b978-0-12-818438-7.00002-2. ISBN 978-0-12-818438-7. PMC 7325854.
  16. Dombrowski, Lynn; Harmon, Ellie; Fox, Sarah (2016-06-04). "Social Justice-Oriented Interaction Design: Outlining Key Design Strategies and Commitments". Proceedings of the 2016 ACM Conference on Designing Interactive Systems. ACM. pp. 656–671. doi:10.1145/2901790.2901861. hdl:1805/12029. ISBN 978-1-4503-4031-1.
  17. Caruso, Christine; Frankel, Lois (2010). "Everyday People: Enabling User Expertise in Socially Responsible Design" (PDF).
  18. Rose, Emma J.; Edenfield, Avery; Walton, Rebecca; Gonzales, Laura; McNair, Ann Shivers; Zhvotovska, Tetyana; Jones, Natasha; de Mueller, Genevieve I. Garcia; Moore, Kristen (2018-08-03). "Social Justice in UX: Centering Marginalized Users". SIGDOC '18: Proceedings of the 36th ACM International Conference on the Design of Communication. ACM. pp. 1–2. doi:10.1145/3233756.3233931. ISBN 978-1-4503-5935-1.
  19. Han, Yingying; Markazi, Daniela M.; Narang, Samual (2021-11-08). "Outlining a design justice-based social media website for university students in the age of COVID-19". 19th CIRN Conference 2021: Communities, Technology and This Moment. hdl:2142/114351.
  20. Nkonde, Mutale. "Automated Anti-Blackness: Facial Recognition in Brooklyn, New York". Harvard Kennedy School Journal of African American Policy (2019–20): 30–36.