Critical data studies

Critical data studies is the exploration of and engagement with the social, cultural, and ethical challenges that arise when working with big data. It is practiced by bringing a range of critical perspectives to bear on data and its uses. [1] As its name implies, critical data studies draws heavily on critical theory, with its strong focus on the organization of power structures, and applies that lens to the study of data.

History

Interest in the field began in 2011, when the scholars danah boyd and Kate Crawford posed a series of questions for the critical study of big data and warned of its potentially harmful impacts on society and culture. [2] It was not until 2014, after further exploration and conversation, that the term 'critical data studies' was coined by the scholars Craig Dalton and Jim Thatcher. [1] They placed a strong emphasis on understanding the context of big data in order to approach it more critically. Researchers such as David Ribes, Robert Soden, Seyram Avle, Sarah E. Fox, and Phoebe Sengers focus on understanding data as a historical artifact and take an interdisciplinary approach to the field. [3] Other key scholars in the discipline include Rob Kitchin and Tracey P. Lauriault, who focus on reevaluating data through different spheres. [4]

Critical frameworks that can be applied to analyze big data include feminist, anti-racist, queer, Indigenous, decolonial, and anti-ableist approaches, as well as symbolic and synthetic data science. These frameworks help make sense of data by addressing the power relations, biases, privacy and consent issues, and underrepresentation or misrepresentation present in datasets, and by suggesting how to approach and analyze data with a more equitable mindset.

Motivation

In the article in which they coin the term 'critical data studies,' Dalton and Thatcher also provide several justifications for why data studies is a discipline worthy of a critical approach. [5] First, 'big data' is an important aspect of twenty-first-century society, and its analysis allows for a deeper understanding of what is happening and why. [1] Big data is central to critical data studies because it is the type of data the field examines. The term does not necessarily refer to sheer size: a big dataset may have millions of rows, but it may also be a smaller dataset with a wide variety and expansive scope, or one that covers whole populations rather than samples. Furthermore, according to Dalton and Thatcher, big data as a technological tool, and the information it yields, are not neutral, making them worthy of critical analysis in order to identify and address their biases. Building off this idea, another justification for a critical approach is that the relationship between big data and society is itself important, and therefore worthy of study. [1]

Ribes et al. argue that the need for an interdisciplinary understanding of data as a historical artifact is a motivating aspect of critical data studies. The overarching consensus in the Computer-Supported Cooperative Work (CSCW) field is that people should speak for the data, not let the data speak for itself.

The sources of big data and their relationship to varied metadata can be complicated, which leads to data disorder and a need for ethical analysis. [6] Additionally, Iliadis and Russo (2016) have called for studying data assemblages: data has innate technological, political, social, and economic histories that should be taken into consideration. [6] Kitchin argues that data is almost never 'raw' and almost always 'cooked,' meaning that it is always spoken for by the data scientists utilizing it. Big data should therefore be open to a variety of perspectives, especially those of a cultural and philosophical nature. Further, data contains hidden histories, ideologies, and philosophies. [6]

Big data technology can cause significant changes in society's structure and in the everyday lives of people, [1] and, being a product of society, it is worthy of sociological investigation. [1] Moreover, datasets are almost never free of influence. Rather, data are shaped by the vision or goals of those gathering them, and during collection certain things are quantified, stored, sorted, and even discarded by the research team. [7] A critical approach is thus necessary in order to understand and reveal the intent behind the information being presented.

One such critical approach is feminist data studies, which applies feminist principles to the collection and analysis of data with the goal of addressing power imbalances in data science and society. According to Catherine D'Ignazio and Lauren F. Klein, a power analysis can be performed by examining power, challenging power, elevating emotion and embodiment, rethinking binaries and hierarchies, embracing pluralism, considering context, and making labor visible. [8] Feminist data studies is part of a movement toward making data benefit everyone rather than entrenching existing inequalities. Moreover, data alone cannot speak for themselves; to possess any concrete meaning, data must be accompanied by theoretical insight or by additional quantitative or qualitative research. [1] [9]

Critical data studies also draws on other social perspectives, such as anti-racist data studies, which uses classification approaches to secure representation for affected communities. Desmond Upton Patton and colleagues, for example, developed their own classification system in Chicago communities to help reduce violence among young teens on Twitter: they had students from those communities help decipher the terminology and emojis in tweets whose language preceded violence beyond the screen. [10] This is one real-world example of critical data studies in application.

Dalton and Thatcher further argue that if one were to think of data only in terms of its exploitative power, there would be no possibility of using data for revolutionary, liberatory purposes. [1] Finally, they propose that a critical approach allows 'big data' to be combined with older 'small data,' creating more thorough research and opening up more opportunities, questions, and topics to explore. [1] [11]

Issues and concerns for critical data scholars

Data plays a pivotal role in the emerging knowledge economy, driving productivity, competitiveness, efficiency, sustainability, and capital accumulation. The ethical, political, and economic dimensions of data dynamically evolve across space and time, influenced by changing regimes, technologies, and priorities. Technically, the focus lies on handling, storing, and analyzing vast data sets, utilizing machine learning-based data mining and analytics. This technological advancement raises concerns about data quality, encompassing validity, reliability, authenticity, usability, and lineage. [12]

The use of data in modern society brings about new ways of understanding and measuring the world, but also brings with it certain concerns or issues. [13] Data scholars attempt to bring some of these issues to light in their quest to be critical of data.

Technical and organizational issues include the scope of the dataset: too little or too much data to work with can lead to inaccurate results, so critical data scholars must carefully consider whether the volume of data is adequate for their analyses.

The quality of the data itself is another concern. A dataset may be incomplete or messy, with missing or inaccurate values, forcing researchers to make edits and assumptions about the data to ensure its reliability and relevance.
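As a minimal illustration of this kind of quality concern, the sketch below audits missing values in a toy record set before any analysis; the field names and records are invented for the example:

```python
# Hypothetical records with gaps, of the kind scholars must document
# before making imputation or deletion decisions.
records = [
    {"age": 34, "income": 52000, "zip": "60601"},
    {"age": None, "income": 48000, "zip": "60601"},  # missing age
    {"age": 29, "income": None, "zip": None},        # missing income and zip
]

def audit_missingness(rows):
    """Return, per field, the fraction of records with a missing value."""
    fields = rows[0].keys()
    return {
        f: sum(1 for r in rows if r[f] is None) / len(rows)
        for f in fields
    }

# Every field here is missing in one of three records.
report = audit_missingness(records)
```

Any assumption made to fill these gaps shapes downstream results, which is why critical data scholars argue such audits should be made explicit.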

Data scientists may also lack adequate access to the actual dataset, limiting their ability to analyze it. Linnet Taylor explains how gaps in data can arise when people with varying levels of power hold different rights over data sources: those in power can control what data is collected, how it is displayed, and how it is analyzed. [14]

The capabilities of the research team also play a crucial role in the quality of data analytics. A team with inadequate skills or organizational capabilities may produce biased analyses. This can also lead to ecological fallacies, in which an assumption is made about an individual based on data or results from a larger group. [13]
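The ecological fallacy can be shown with a small numeric sketch (the scores below are invented): a group-level aggregate says nothing reliable about any one member of the group.

```python
# Two hypothetical groups of individual scores.
group_scores = {
    "group_a": [10, 90, 95],  # high mean, but very uneven
    "group_b": [60, 60, 60],
}

mean_a = sum(group_scores["group_a"]) / len(group_scores["group_a"])  # 65.0
mean_b = sum(group_scores["group_b"]) / len(group_scores["group_b"])  # 60.0

# Group A's mean exceeds group B's, yet group A's first member scores
# far below every member of group B. Concluding "a member of A outscores
# a member of B" from the aggregates alone is the ecological fallacy.
individual = group_scores["group_a"][0]  # 10
```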

These technical and organizational challenges highlight the complexity of working with data and emphasize the need for scholars to navigate a landscape where issues related to data scope, quality, access, and team capabilities are intricately interwoven.

Some of the normative and ethical concerns addressed by Kitchin include surveillance through one's data, or 'dataveillance'; [7] the privacy of one's data, which the National Cyber Security Alliance notes is under growing pressure as companies recognize data as an asset and realize the potential value in collecting, using, and sharing it; the ownership of one's data, about which Scassa writes that debates over ownership rights have been heating up, with European policymakers raising the possibility of creating sui generis ownership rights in data; the security of one's data, since data breaches pose a threat to both individuals and organizations; anticipatory or corporate governance; and the profiling of individuals by their data. [5] Profiling is heavily emphasized in work on data colonialism, where data sovereignty is encouraged for individuals being harmed, because control over data can be a powerful tool for those whom the data represents. A common theme across approaches to data sovereignty is deciding when and how to collect, protect, and share data with only those who have a legitimate or appropriate need to access it. As Vallor puts it: "The labels that we attach to the data are always going to be cruder and less representative of what they describe than what we would like them to be. Treating candidates under a single label, whether it's a gender label, an age group, consumers of a particular product, or people suffering from a particular disease, can cause people to be treated as interchangeable and fungible data points. Every one of those individuals with that label is unique and has the right to be respected as a person." All of these concerns must be taken into account by scholars of data in their objective to be critical.

Following in the tradition of critical urban studies, [15] other scholars have raised similar concerns around data and digital information technologies in the urban context. [16] [17] [18] For example, Joe Shaw and Mark Graham have examined these in light of Henri Lefebvre's 'right to the city'. [19]

Practical applications of critical data studies

Among the most practical and pressing applications of critical data studies is the intersection of ethics and privacy. Tendler, Hong, Kane, Kopaczynski, Terry, and Emanuel explain that in an age when private institutions use customer data for marketing, research into customer wants and needs, and more, it is vital to protect the data collected. In medical studies, one small step toward protecting participants is informed consent. [20]

Algorithmic bias and discrimination pervade data. Many scholars emphasize their importance in healthcare because of the gravity of data-driven decisions for patient care, and because of how such data is used and why it is collected. Institutions and companies can promote fairness and fight systemic racism by using critical data studies to highlight algorithmic bias in data-driven decision making. Nong describes a well-known example involving insurance algorithms and access to healthcare: insurance companies use algorithms to allocate care resources across clients, and the algorithms examined demonstrated "a clear racial bias against Black patients," which caused estimated "health expenditures [to be] based on historical data structured by systemic racism and perpetuating that bias in access to care management". [21]
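The proxy problem behind this kind of bias can be sketched with a hypothetical simulation (all numbers invented): an algorithm that ranks patients by predicted cost rather than health need under-serves a group whose historical spending was suppressed by unequal access to care.

```python
# (group, health_need, historical_spend): equal needs across groups,
# but group B's historical spending is systematically lower.
patients = [
    ("A", 8, 8000), ("A", 6, 6000), ("A", 4, 4000),
    ("B", 8, 4000), ("B", 6, 3000), ("B", 4, 2000),
]

def top_k_by(rows, key_index, k):
    """Select the k records ranked highest by the chosen column."""
    return sorted(rows, key=lambda p: p[key_index], reverse=True)[:k]

by_spend = top_k_by(patients, 2, k=2)  # proxy target: historical cost
by_need = top_k_by(patients, 1, k=2)   # ground truth: health need

# Ranking by the cost proxy selects only group A, even though the two
# neediest patients are split evenly across the groups.
groups_by_spend = [p[0] for p in by_spend]
```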

Many trained machine learning and artificial intelligence models lack a standard reporting procedure for documenting their performance characteristics. [22] When such models are applied to real-life scenarios, the consequences can be major, most notably in healthcare, education, and law enforcement. Timnit Gebru explains that the lack of sufficient documentation makes it challenging for users to assess a model's suitability for specific contexts, which is where model cards come into play. Model cards are short records that accompany machine learning models and provide information about a model's characteristics, intended uses, potential biases, and measures of performance. They aim to give users important information about the capabilities and limitations of machine learning systems and to promote fair and inclusive outcomes from machine learning technology. [23]
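A minimal sketch of the model-card idea is shown below; the fields loosely follow the section headings proposed by Mitchell et al., while the model name and metric values are entirely hypothetical:

```python
from dataclasses import dataclass

@dataclass
class ModelCard:
    """A short structured record shipped alongside a trained model."""
    model_name: str
    intended_use: str
    out_of_scope_uses: list
    evaluation_metrics: dict  # metric -> score per subgroup
    known_biases: str

card = ModelCard(
    model_name="readmission-risk-v1",  # hypothetical model
    intended_use="Flag patients for follow-up outreach.",
    out_of_scope_uses=["Denying care", "Insurance pricing"],
    evaluation_metrics={"recall": {"overall": 0.81, "group_b": 0.64}},
    known_biases="Recall is markedly lower for group B; see metrics.",
)

# Disaggregated metrics make the performance gap visible to users
# before deployment, which is the point of the card.
recall = card.evaluation_metrics["recall"]
gap = recall["overall"] - recall["group_b"]
```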

Theoretical frameworks of critical data studies

The data feminism framework promotes thinking about data and ethics guided by the ideas of intersectional feminism. It emphasizes the practices through which data science reinforces power inequalities in the world, and how users can instead use data to challenge existing power and commit to creating more balanced data. According to D'Ignazio and Klein, the intersectionality of data feminism acknowledges that data must account for intersecting factors such as identity, race, and class to provide a complete and accurate representation of individuals' experiences. The framework also highlights ethical considerations by advocating for informed consent, privacy, and the responsibility data collectors bear toward the individuals from whom data is collected. [8]

Dataveillance is the monitoring of people through their online data. Unlike surveillance, dataveillance goes far beyond monitoring people for specific reasons: it infiltrates people's lives through constant tracking for blanket, generalized purposes. According to Raley, it has become the preferred way of monitoring people across their various online presences. This framework offers ways to approach and understand how data is collected, processed, and used, emphasizing ethical perspectives and the protection of individuals' information. [24]

Datafication focuses on understanding the processes associated with the emergence and use of big data. According to van Dijck, it highlights the transformation of social actions into digital data, allowing real-time tracking and predictive analysis. Datafication emphasizes the interest-driven nature of data collection, since social activities change while their transformation into data does not. It also examines how society changes as digital data becomes more prevalent in everyday life. Datafication stresses the complicated relationship between data and society and goes hand in hand with dataveillance. [25]

The algorithmic biases framework addresses systematic and unjust biases against certain groups or outcomes in algorithmic decision making. Häußler notes that its users focus on how algorithms can produce discriminatory outcomes, particularly with respect to race, gender, age, and other characteristics, and can reinforce social inequities and unjust practices. Key components of the framework generally include bias identification, data quality, impact assessment, fairness and equity, transparency, remediation, and implications. [26]
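One concrete instance of the "bias identification" component is checking demographic parity, i.e. whether positive-decision rates are equal across groups. The decisions below are invented for the illustration:

```python
# Hypothetical (group, decision) pairs from some automated system.
decisions = [
    ("group_x", True), ("group_x", True), ("group_x", False), ("group_x", True),
    ("group_y", True), ("group_y", False), ("group_y", False), ("group_y", False),
]

def positive_rate(pairs, group):
    """Fraction of positive decisions received by one group."""
    outcomes = [d for g, d in pairs if g == group]
    return sum(outcomes) / len(outcomes)

rate_x = positive_rate(decisions, "group_x")  # 0.75
rate_y = positive_rate(decisions, "group_y")  # 0.25

# A large gap between the rates flags a disparity to investigate;
# it does not by itself prove the cause of the bias.
parity_gap = abs(rate_x - rate_y)
```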


References

  1. Dalton, Craig; Thatcher, Jim (12 May 2014). "What does a Critical Data Studies look like and why do we care?". Society + Space. Retrieved 17 January 2018.
  2. boyd, danah; Crawford, Kate (10 May 2012). "Critical questions for Big Data". Information, Communication & Society. 15 (5): 662–679. doi:10.1080/1369118X.2012.678878. hdl:10983/1320. S2CID 51843165.
  3. Soden, Robert; Ribes, David; Avle, Seyram; Fox, Sarah E; Sengers, Phoebe; Paudel, Shreyasha; Marathe, Megh (2023-10-14). "Historicism in/As CSCW Method: Research, Sensibilities, and Design". Computer Supported Cooperative Work and Social Computing. New York, NY, USA: ACM. pp. 497–500. doi:10.1145/3584931.3611288. ISBN 979-8-4007-0129-0.
  4. Kitchin, Rob; Lauriault, Tracey P., "Toward Critical Data Studies", Thinking Big Data in Geography, UNP - Nebraska, pp. 3–20, doi:10.2307/j.ctt21h4z6m.6. Retrieved 2023-11-09.
  5. Dalton, Craig, and Jim Thatcher, 2014.
  6. Iliadis, Andrew; Russo, Federica (December 2016). "Critical data studies: An introduction". Big Data & Society. 3 (2): 205395171667423. doi:10.1177/2053951716674238. hdl:20.500.12613/162. ISSN 2053-9517.
  7. Michael, Mike; Lupton, Deborah (2015-10-13). "Toward a manifesto for the 'public understanding of big data'". Public Understanding of Science. 25 (1): 104–116. doi:10.1177/0963662515609005. hdl:10871/26112. PMID 26468128. S2CID 206607967.
  8. D'Ignazio, Catherine; Klein, Lauren (2020). "Seven intersectional feminist principles for equitable and actionable COVID-19 data". Big Data & Society. 7 (2). doi:10.1177/2053951720942544. hdl:1721.1/126699. Retrieved 7 November 2023.
  9. Bar-Yam, Yaneer (2016-11-01). "From big data to important information". Complexity. 21 (S2): 73–98. arXiv:1604.00976. Bibcode:2016Cmplx..21S..73B. doi:10.1002/cplx.21785. ISSN 1099-0526. S2CID 14419066.
  10. Patton, Desmond Upton; Hong, Jun Sung; Ranney, Megan; Patel, Sadiq; Kelley, Caitlin; Eschmann, Rob; Washington, Tyreasa (2014-06-01). "Social media as a vector for youth violence: A review of the literature". Computers in Human Behavior. 35: 548–553. doi:10.1016/j.chb.2014.02.043. ISSN 0747-5632.
  11. Abreu, Amelia; Acker, Amelia (2013). "Context and collection: A research agenda for small data". IConference 2013 Proceedings: 549–554. doi:10.9776/13275 (inactive 31 January 2024). hdl:2142/39750. Retrieved 17 January 2018.
  12. Kitchin, Rob; Lauriault, Tracey (July 30, 2014). Eckert, J.; Shears, A.; Thatcher, J. (eds.). "Towards Critical Data Studies: Charting and Unpacking Data Assemblages and Their Work". The Programmable City Working Paper 2. University of Nebraska Press.
  13. Kitchin, Rob, 2014.
  14. "Towards A Contextual And Inclusive Data Studies". www.societyandspace.org. Retrieved 2023-11-07.
  15. Brenner, Neil (2009). "What is critical urban theory?". City. 13 (2–3): 198–207. doi:10.1080/13604810902996466. S2CID 22041642.
  16. Joe Shaw and Mark Graham (15 February 2017). "Our Digital Rights to the City". meatspacepress.org. Meatspace Press.
  17. Crawford, Kate; Finn, Megan (2015-08-01). "The limits of crisis data: analytical and ethical challenges of using social and mobile data to understand disasters". GeoJournal. 80 (4): 491–502. doi:10.1007/s10708-014-9597-z. ISSN 0343-2521. S2CID 153865729.
  18. Pickren, Graham (2016-10-12). "The global assemblage of digital flow". Progress in Human Geography. 42 (2): 225–243. doi:10.1177/0309132516673241. S2CID 152088341.
  19. Shaw, Joe; Graham, Mark (February 2017). "An Informational Right to the City? Code, Content, Control, and the Urbanization of Information". Antipode. 49 (4): 907. doi:10.1111/anti.12312.
  20. Tendler, C., Hong, P. S., Kane, C., Kopaczynski, C., Terry, W., & Emanuel, E. J. (2023). "Academic and Private Partnership to Improve Informed Consent Forms Using a Data Driven Approach". The American Journal of Bioethics, 1–3.
  21. Nong, P. (2023). Predictive Technologies in Healthcare: Public Perspectives and Health System Governance in the Context of Structural Inequity (Doctoral dissertation).
  22. Mitchell, M., Wu, S., Zaldivar, A., Barnes, P., Vasserman, L., Hutchinson, B., ... & Gebru, T. (2019, January). "Model cards for model reporting". In Proceedings of the Conference on Fairness, Accountability, and Transparency (pp. 220–229).
  23. Gebru, T., Morgenstern, J., Vecchione, B., Vaughan, J. W., Wallach, H., Daumé III, H., & Crawford, K. (2021). "Datasheets for datasets". Communications of the ACM, 64(12), 86–92.
  24. Van Dijck, J. (2014). "Datafication, dataism and dataveillance: Big Data between scientific paradigm and ideology". Surveillance & Society, 12(2), 197–208.
  25. Hepp, A., Jarke, J., & Kramp, L. (2022). New Perspectives in Critical Data Studies: The Ambivalences of Data Power (p. 473). Springer Nature.
  26. Häußler, H. (2021). "The underlying values of data ethics frameworks: a critical analysis of discourses and power structures". Libri, 71(4), 307–319.
