Computer ethics is a part of practical philosophy concerned with how computing professionals should make decisions regarding professional and social conduct. [1]
Margaret Anne Pierce, a professor in the Department of Mathematics and Computers at Georgia Southern University, has categorized the ethical decisions related to computer technology and usage into three primary influences: the individual's own personal code, any informal code of ethical conduct that exists in the workplace, and exposure to formal codes of ethics. [2]
The term "computer ethics" was first coined by Walter Maner, [1] a professor at Bowling Green State University. Maner noticed that ethical concerns raised during his Medical Ethics course at Old Dominion University became more complex and difficult when computers and technology became involved. [3] The conceptual foundations of computer ethics are investigated by information ethics, a branch of philosophical ethics promoted by, among others, Luciano Floridi. [4]
The concept of computer ethics originated in the 1940s with the American mathematician and philosopher Norbert Wiener, a professor at MIT. While working on anti-aircraft artillery during World War II, Wiener and his fellow engineers developed a system of communication between the part of a cannon that tracked a warplane, the part that performed calculations to estimate a trajectory, and the part responsible for firing. [1] Wiener termed the science of such information feedback systems "cybernetics," and he discussed this new field and its related ethical concerns in his 1948 book, Cybernetics. [1] [5] In 1950, Wiener's second book, The Human Use of Human Beings, delved deeper into the ethical issues surrounding information technology and laid out the basic foundations of computer ethics. [5]
In the mid-1960s, the world's first computer crime was committed: a programmer used a small piece of code to stop his bank account from being flagged as overdrawn. Because there were no laws in place at that time to prohibit this, he was not charged. [6] To discourage others from following suit, an ethics code for computing was needed.
In 1973, the Association for Computing Machinery (ACM) adopted its first code of ethics. [1] SRI International's Donn Parker, [7] an author on computer crimes, led the committee that developed the code. [1]
In 1976, medical teacher and researcher Walter Maner noticed that ethical decisions are much harder to make when computers are involved. He saw the need for a separate branch of ethics to deal with such problems, and thus coined the term "computer ethics". [1] [5]
In 1976, Joseph Weizenbaum made his second significant addition to the field of computer ethics. He published a book titled Computer Power and Human Reason, [8] which argued that while artificial intelligence can benefit the world, it should never be allowed to make the most important decisions because it lacks human qualities such as wisdom. The book's most important point is the distinction between choosing and deciding: Weizenbaum argued that deciding is a computational activity while making choices is not, and that the ability to make choices is what makes us human.
Later that same year, Abbe Mowshowitz, a professor of computer science at the City College of New York, published an article titled "On approaches to the study of social issues in computing," which identified and analyzed technical and non-technical biases in research on social issues in computing.
During 1978, the Right to Financial Privacy Act was adopted by the United States Congress, drastically limiting the government's ability to search bank records. [9]
The following year, Terrell Ward Bynum, a professor of philosophy at Southern Connecticut State University and director of its Research Center on Computing and Society, developed a curriculum for a university course on computer ethics. [10] Bynum was also editor of the journal Metaphilosophy. [1] In 1983 the journal held an essay contest on the topic of computer ethics and published the winning essays in its best-selling 1985 special issue, "Computers and Ethics." [1]
In 1984, the United States Congress passed the Small Business Computer Security and Education Act, which created a Small Business Administration advisory council to focus on computer security related to small businesses. [11]
In 1985, James Moor, professor of philosophy at Dartmouth College in New Hampshire, published an essay called "What is Computer Ethics?" [5] In this essay Moor states that computer ethics includes the following: "(1) identification of computer-generated policy vacuums, (2) clarification of conceptual muddles, (3) formulation of policies for the use of computer technology, and (4) ethical justification of such policies." [1]
During the same year, Deborah Johnson, professor of applied ethics and chair of the Department of Science, Technology, and Society in the School of Engineering and Applied Sciences of the University of Virginia, published the first major computer ethics textbook. [5] Johnson's textbook identified the major issues for research in computer ethics for more than ten years after publication of the first edition. [5]
In 1988, Robert Hauptman, a librarian at St. Cloud State University, coined the term "information ethics" to describe the storage, production, access and dissemination of information. [12] Around the same time, the Computer Matching and Privacy Protection Act was adopted, restricting United States government programs that identify debtors. [13]
In 1992, the ACM adopted a new set of ethical rules, the "ACM Code of Ethics and Professional Conduct," [14] which consisted of 24 statements of personal responsibility.
Three years later, in 1995, Krystyna Górniak-Kocikowska, a professor of philosophy at Southern Connecticut State University, coordinator of its Religious Studies Program, and a senior research associate in the Research Center on Computing and Society, proposed that computer ethics would eventually become a global ethical system and, soon after, would replace ethics altogether as the standard ethics of the information age. [5]
In 1999, Deborah Johnson took the contrary view, arguing that computer ethics would not evolve into a new ethics but would instead remain our old ethics with a slight twist. [12]
Since the turn of the 21st century, following much debate over ethical guidelines, organizations such as ABET [15] have offered ethics-related accreditation of university and college programs in "Applied and Natural Science, Computing, Engineering and Engineering Technology at the associate, bachelor, and master levels" to promote quality work that follows sound ethical and moral guidelines.
In 2018, The Guardian and The New York Times reported that the personal data of 87 million Facebook users had been harvested and shared with Cambridge Analytica. [16]
In 2019, Facebook established a fund to build an ethics center at the Technical University of Munich in Germany. This was the first time Facebook had funded an academic institute for matters of computer ethics. [17]
Computer crime, privacy, anonymity, freedom, and intellectual property are among the topics expected to shape the future of computer ethics. [18]
Ethical considerations have been linked to the Internet of Things (IoT), as many physical devices are now connected to the internet. [18]
Virtual cryptocurrencies raise questions about the balance of the purchasing relationship between buyer and seller. [18]
Autonomous technologies such as self-driving cars may be forced to make human-like decisions, and there is also concern over how autonomous vehicles would behave in countries with different cultural values. [19]
Security risks have been identified with cloud-based technology, as every user interaction is sent to and analyzed at central computing hubs. [20] Artificial intelligence devices like the Amazon Alexa and Google Home collect personal data from users at home and upload it to the cloud. Apple's Siri and Microsoft's Cortana smartphone assistants likewise collect user information, analyze it, and send results back to the user.
Computers and information technology have caused privacy concerns surrounding the collection and use of personal data. [21] For example, Google was sued in 2018 for tracking user location without permission. [22] In July 2019, Facebook reached a $5 billion settlement with the U.S. Federal Trade Commission for violating an agreement with the agency to protect user privacy. [23]
A whole industry of privacy and ethics tools has grown over time, giving people the choice not to share their data online. These tools are often open-source software, which allows users to verify that their data is not saved or used without their consent. [24]
The ethics of artificial intelligence covers a broad range of topics within the field that are considered to have particular ethical stakes. [25] This includes algorithmic biases, fairness, automated decision-making, accountability, privacy, and regulation. It also covers various emerging or potential future challenges such as machine ethics (how to make machines that behave ethically), lethal autonomous weapon systems, arms race dynamics, AI safety and alignment, technological unemployment, AI-enabled misinformation, how to treat certain AI systems if they have a moral status (AI welfare and rights), artificial superintelligence and existential risks. [25]
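To make one of these concerns concrete, the following minimal Python sketch shows what a simple algorithmic-fairness check can look like: it computes group-wise approval rates for a set of automated decisions and reports the demographic-parity gap. The decisions, group labels, and numbers are invented for illustration and are not drawn from any cited study.

    # Hypothetical audit of automated decisions for demographic parity.
    # Each record pairs a protected-group label with a binary decision (1 = approved).
    decisions = [
        ("group_a", 1), ("group_a", 0), ("group_a", 1), ("group_a", 1),
        ("group_b", 0), ("group_b", 1), ("group_b", 0), ("group_b", 0),
    ]

    def approval_rate(records, group):
        # Fraction of records in `group` that received a positive decision.
        outcomes = [d for g, d in records if g == group]
        return sum(outcomes) / len(outcomes)

    rate_a = approval_rate(decisions, "group_a")
    rate_b = approval_rate(decisions, "group_b")

    # Demographic-parity gap: 0 means equal approval rates across groups.
    print(f"approval rates: {rate_a:.2f} vs {rate_b:.2f}; gap: {abs(rate_a - rate_b):.2f}")

A large gap does not by itself prove unfairness, but auditing decisions in this way is one of the basic tools used to surface algorithmic bias.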
Some application areas may also have particularly important ethical implications, such as healthcare, education, criminal justice, or the military.

The effects of infringing copying in the digital realm, studied in particular in the computer software and recorded music industries, have raised significant concerns among empirically oriented economists. While the software industry has managed to thrive despite digital copying, the recorded music sector has seen a sharp decline in revenues, especially with the rise of file-sharing of MP3 files. As simple as the central question seems, the extent to which unpaid consumption of recorded music cannibalizes paid consumption is difficult to establish empirically, because data on unpaid consumption are hard to obtain and causal inferences are hard to draw. [26] Empirical studies nevertheless consistently suggest a depressing effect on paid music consumption, indicating a likely contribution to the downturn in recorded music sales. The emergence of cyberlockers and rapid technological change further complicate the analysis of revenue impacts on content industries, highlighting the need for ongoing research and a nuanced approach to copyright policy that considers effects on user welfare and the distribution of rewards to artists and creators.
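To illustrate why the displacement question matters, the following back-of-the-envelope Python sketch shows how strongly any revenue estimate depends on the assumed displacement rate, that is, the share of unpaid copies that would otherwise have been purchases. All figures are hypothetical and are not taken from the studies cited above.

    # Hypothetical illustration: estimated revenue loss from unpaid copying
    # depends heavily on the assumed displacement rate. All numbers are invented.
    unpaid_copies = 1_000_000   # estimated unpaid downloads
    unit_price = 1.20           # average revenue per paid track, in dollars

    for displacement_rate in (0.05, 0.20, 0.50):
        lost_sales = unpaid_copies * displacement_rate
        lost_revenue = lost_sales * unit_price
        print(f"displacement {displacement_rate:.0%}: "
              f"{lost_sales:,.0f} lost sales, ${lost_revenue:,.0f} lost revenue")

The same volume of unpaid consumption implies anywhere from $60,000 to $600,000 in lost revenue in this toy example, which is why empirical estimates of the displacement rate are central to the policy debate.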
Various national and international professional societies and organizations have produced codes of ethics to give basic behavioral guidelines to computing professionals and users. Examples include the ACM Code of Ethics and Professional Conduct, the British Computer Society's Code of Conduct, and the IEEE Code of Ethics.
Software engineering professionalism is a movement to make software engineering a profession, with aspects such as degree and certification programs, professional associations, professional ethics, and government licensing. The field is a licensed discipline in Texas in the United States, in Australia through Engineers Australia (course accreditation since 2001, not licensing), and in many provinces in Canada.
The ethics of technology is a sub-field of ethics addressing the ethical questions specific to the Technology Age, the transitional shift in society wherein personal computers and subsequent devices provide for the quick and easy transfer of information. Technology ethics is the application of ethical thinking to the growing concerns of technology as new technologies continue to rise in prominence.
Eugene Howard Spafford, known as Spaf, is an American professor of computer science at Purdue University and a computer security expert.
Information ethics has been defined as "the branch of ethics that focuses on the relationship between the creation, organization, dissemination, and use of information, and the ethical standards and moral codes governing human conduct in society". It examines the morality that comes from information as a resource, a product, or a target, and provides a critical framework for considering moral issues concerning informational privacy, moral agency, new environmental issues, and problems arising from the life-cycle of information. It is vital that librarians, archivists, information professionals and others understand the importance of disseminating accurate information and of acting responsibly when handling it.
Luciano Floridi is an Italian and British philosopher. He is the director of the Digital Ethics Center at Yale University. He is also a Professor of Sociology of Culture and Communication at the University of Bologna, Department of Legal Studies, where he is the director of the Centre for Digital Ethics. Furthermore, he is adjunct professor at the Department of Economics, American University, Washington D.C. He is married to the neuroscientist Anna Christina Nobre.
The philosophy of information (PI) is a branch of philosophy that studies topics relevant to information processing, representational systems and consciousness, cognitive science, computer science, information science and information technology.
The International Association for Computing and Philosophy (IACAP) is a professional, philosophical association emerging from a history of conferences that began in 1986. Adopting its mission from these conferences, the IACAP exists in order to promote scholarly dialogue on all aspects of the computational/informational turn and the use of computers in the service of philosophy.
The K. Jon Barwise Prize was established in 2002 by the American Philosophical Association (APA), in conjunction with the APA Committee on Philosophy and Computers, on the basis of a proposal from the International Association for Computing and Philosophy. It recognizes significant and sustained contributions to areas relevant to philosophy and computing.
The following outline is provided as an overview of and topical guide to ethics.
Cyberethics is "a branch of ethics concerned with behavior in an online environment". In another definition, it is the "exploration of the entire range of ethical and moral issues that arise in cyberspace" while cyberspace is understood to be "the electronic worlds made visible by the Internet." For years, various governments have enacted regulations while organizations have defined policies about cyberethics.
The philosophy of computer science is concerned with the philosophical questions that arise within the study of computer science. There is still no common understanding of the content, aims, focus, or topics of the philosophy of computer science, despite some attempts to develop a philosophy of computer science like the philosophy of physics or the philosophy of mathematics. Due to the abstract nature of computer programs and the technological ambitions of computer science, many of the conceptual questions of the philosophy of computer science are also comparable to the philosophy of science, philosophy of mathematics, and the philosophy of technology.
Value sensitive design (VSD) is a theoretically grounded approach to the design of technology that accounts for human values in a principled and comprehensive manner. VSD originated within the fields of information systems design and human-computer interaction to address design issues within those fields by emphasizing the ethical values of direct and indirect stakeholders. It was developed by Batya Friedman and Peter Kahn at the University of Washington starting in the late 1980s and early 1990s. Later, in 2019, Batya Friedman and David Hendry wrote a book on this topic called "Value Sensitive Design: Shaping Technology with Moral Imagination". Value sensitive design takes human values into account in a well-defined manner throughout the whole process. Designs are developed using an investigation consisting of three phases: conceptual, empirical and technological. These investigations are intended to be iterative, allowing the designer to modify the design continuously.
James H. Moor is the Daniel P. Stone Professor of Intellectual and Moral Philosophy at Dartmouth College. He earned his Ph.D. in 1972 from Indiana University. Moor's 1985 paper entitled "What is Computer Ethics?" established him as one of the pioneering theoreticians in the field of computer ethics. He has also written extensively on the Turing Test. His research includes study in philosophy of artificial intelligence, philosophy of mind, philosophy of science, and logic.
Terrell Ward Bynum is an American philosopher, writer and editor. Bynum is currently director of the Research Center on Computing and Society at Southern Connecticut State University, where he is also a professor of philosophy, and visiting professor in the Centre for Computing and Social Responsibility at De Montfort University, Leicester, England. He is best known as a pioneer and historian in the field of computer and information ethics; for his achievements in that field, he was awarded the Barwise Prize of the American Philosophical Association, the Weizenbaum Award of the International Society for Ethics and Information Technology, and the 2011 Covey Award of the International Association for Computing and Philosophy. In addition, Bynum was the founder and longtime editor-in-chief of the philosophy journal Metaphilosophy; a key founding figure (1974–1980) and the first executive director (1980–1982) of the American Association of Philosophy Teachers; and biographer of the philosopher and mathematician Gottlob Frege, as well as a translator of Frege's early works in logic. Bynum's most recent research and publications concern the ultimate nature of the universe and the impact of the information revolution upon philosophy.
The Covey Award was established in 2008 by the International Association for Computing and Philosophy, to recognise "accomplished innovative research, and possibly teaching that flows from that research, in the field of computing and philosophy broadly conceived".
This article gives an overview of professional ethics as applied to computer programming and software development, in particular the ethical guidelines that developers are expected to follow when writing code and when they are part of a programmer-customer or employee-employer relationship. These rules distinguish good practices and attitudes from bad ones when creating software or when making decisions on crucial or delicate issues in a programming project. They are also the basis for ethical decision-making in the conduct of professional work.
Simon Rogerson is lifetime Professor Emeritus in Computer Ethics at the Centre for Computing and Social Responsibility (CCSR), De Montfort University. He was the founder and editor for 19 volumes of the Journal of Information, Communication and Ethics in Society. He has had two careers: first as a technical software developer, and then in academia as a reformer. He was the founding Director of CCSR, launching it in 1995 at the first ETHICOMP conference, which he conceived and co-directed until 2013. He became Europe's first Professor in Computer Ethics in 1998. His most important research focuses on providing rigorously grounded practical tools and guidance to computing practitioners. For his leadership and research achievements in the interdisciplinary field of computer and information ethics he was awarded the fifth IFIP-WG9.2 Namur Award in 2000 and the SIGCAS Making a Difference Award in 2005.
Contextual integrity is a theory of privacy developed by Helen Nissenbaum and presented in her book Privacy in Context: Technology, Policy, and the Integrity of Social Life. It comprises four essential descriptive claims: privacy is provided by appropriate flows of information; appropriate information flows are those that conform with contextual informational norms; contextual informational norms refer to five independent parameters (data subject, sender, recipient, information type, and transmission principle); and conceptions of privacy are based on ethical concerns that evolve over time.
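A rough way to see the structure of the theory is to model an information flow as a tuple of the five parameters and compare it against the entrenched norms of a context. The following Python sketch is purely illustrative: the norms, contexts, and field names are invented for the example and are not taken from Nissenbaum's book.

    # Illustrative model of contextual integrity: a flow is described by five
    # parameters and violates contextual integrity if it matches no entrenched
    # informational norm for its context. The norms below are invented.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Flow:
        subject: str                 # data subject
        sender: str
        recipient: str
        info_type: str
        transmission_principle: str

    # Hypothetical norms for a healthcare context.
    healthcare_norms = {
        Flow("patient", "patient", "physician", "medical history", "confidentiality"),
        Flow("patient", "physician", "specialist", "medical history", "with consent"),
    }

    def respects_contextual_integrity(flow, norms):
        # A flow respects contextual integrity if it conforms to some norm.
        return flow in norms

    ad_flow = Flow("patient", "physician", "advertiser", "medical history", "sale")
    print(respects_contextual_integrity(ad_flow, healthcare_norms))  # False

Real analyses are of course richer than a set-membership test, but the sketch captures the theory's core move of evaluating flows against context-relative norms rather than against a single global standard.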
Mariarosaria Taddeo is an Italian philosopher working on the ethics of digital technologies. She is Professor of Digital Ethics and Defence Technologies at the Oxford Internet Institute, University of Oxford, and Dstl Ethics Fellow at the Alan Turing Institute, London.