Moral blindness


Moral blindness, also known as ethical blindness, is a person's temporary inability to see the ethical dimension of a decision they are making. It is often caused by external factors that prevent an individual from recognising the immoral aspect of their behavior in a particular situation. [1]


While the concept of moral blindness (and, more broadly, of immorality) has its roots in ancient philosophy, [2] [3] the idea gained prominence after the events of World War II, particularly the Holocaust. [4] This led to further research by psychologists and some surprising findings (notably by Stanley Milgram and Philip Zimbardo) on human behavior in the context of obedience and authority bias. [1]

Moral blindness has been identified as being a concern in areas such as business organisation and legal systems. [5] [6]

Overview

Moral blindness is a phenomenon in which people with sufficient moral reasoning abilities are temporarily unable to perceive the ethical dimension of a decision, causing them to behave in ways counter to their actual moral values. This behaviour can be due to situational or other factors. The idea of moral blindness usually requires the following: people deviate from their intrinsic moral beliefs, and this deviation is temporary and unconscious, i.e. people are unaware of their unethical behaviour at the time. [1] [7]

Interest in the idea of moral blindness increased after Hannah Arendt's Eichmann in Jerusalem: A Report on the Banality of Evil, [4] which focused on Adolf Eichmann, the German-Austrian Nazi official who organised the deportation of Jews to extermination camps and thus played a major role in the Holocaust. [8]

The ideas of moral blindness and the "banality of evil" also influenced the field of psychology and led to some notable studies in the 1960s and 1970s, such as the obedience studies by Stanley Milgram and the Stanford Prison Experiment by Philip Zimbardo. These studies looked at the impact of authority on obedience and individual behaviour. [1]

Subsequent research has looked at moral blindness in contexts beyond war crimes and genocide. The idea has been expanded to study people's behaviour in areas as diverse as organisational behavior and mental health. [5] [9] [10]

Origins and early theories

Roots in philosophy

The origins of the concept of moral blindness lie in philosophy and can be traced to ancient Greek philosophers such as Socrates, who spoke of moral intellectualism, Plato, who spoke about emotions clouding moral judgements, and Aristotle, who first used the term "ethics" for the field of moral philosophy. [2] Early spiritual leaders such as the Buddha and Confucius also spoke about moral behaviour in their discourses, although these were more prescriptive in nature. [3] Modern contributions to the study of moral judgement came from Western philosophers such as Descartes, Locke, Hume, and Kant in the 17th and 18th centuries, [11] [12] [13] and from more contemporary philosophers such as G. E. Moore, who in his book Principia Ethica discusses the "indefinability of good". [14]

Normative ethics seeks to define the rightness or wrongness of an action. Two opposing views that have developed in this area are deontology, where the morality of an action depends on its conformity to rules, and consequentialism, where an action's morality depends on its results. These views are often reflected in responses to the trolley problem, as studied by Joshua Greene and colleagues. [15]

In psychology

Moral blindness has been studied jointly across philosophy and psychology, with empirical studies of morality going back to the 1890s. The focus on a normative approach to moral behaviour led to research on its cognitive and developmental context. Piaget put forth his prominent theory of cognitive development in 1936, which Kohlberg built on to propose his three levels of moral development in 1958. [16] Later, in 1982, James Rest published his influential Four Component Model of Morality (FCM), which identified four distinct components from which immoral behaviour could arise: moral sensitivity, moral judgment, moral motivation, and moral implementation. [15] The model was meant to convey the complexity behind moral behaviour: competence in one component does not imply competence in another, so immoral behaviour can result from a failure at any of them. [17] This cognitive focus was found to be at odds with some observed behavior, and the field of behavioral ethics eventually emerged to study how people actually react to moral dilemmas. [15]

Theoretical and experimental research in psychology

A major driver of modern research on moral blindness was the post-World War II reaction to figures such as Adolf Eichmann, who was responsible for genocide under the Nazi regime during the Holocaust. His capture and subsequent trial in 1961 led many observers to comment on his ordinary nature and appearance, which seemed at odds with his 'evil' behaviour. Hannah Arendt, who covered the trial for The New Yorker, coined the phrase the "banality of evil" in reference to Eichmann: during the trial, Eichmann showed no remorse and did not accept responsibility, claiming only to have done what he was told. This is believed to have influenced researchers such as Milgram to study individual behaviour in response to authority. [1] [18] [19]

In his obedience studies of 1961-62, Milgram led subjects to believe they were administering electric shocks to another participant, who was in fact a confederate of the experimenters. The studies were designed to answer questions such as: "Could it be that Eichmann and his million accomplices in the Holocaust were just following orders? Could we call them all accomplices?" [20] To most people's surprise, 65% of the subjects in the original study went as far as pulling the switch that would have administered the maximum of 450 volts. [21]

Later, in 1971, Zimbardo's Stanford Prison Experiment showed how "good people behave in pathological ways that are alien to their nature". [1] Male undergraduate students at Stanford were assigned to be guards or prisoners in a simulated prison setting. The experiment was designed to see how far subjects would go in internalising their roles and obeying external orders; it later raised ethical concerns about the nature of the study itself. [22]

Following these findings, researchers began to study moral agency, its exercise, and the drivers of moral blindness. Bandura argued that moral disengagement could arise from various forces (individual, situational, or institutional), and that mechanisms such as diffusion of responsibility and a disconnected division of tasks could lead to immoral behaviour. [23] [1] [24]

More recent research has led to the concept of "bounded ethicality": the idea that people can be unintentionally unethical, both in their own behaviour and in judging others' behaviour, something they may realise only on further reflection. [25] [26] Studies of individual unethicality have also looked at the role of social norms as well as how we view others' unethical behaviour. [27] [28]

Moral blindness has been studied and applied in a range of domains beyond war crimes, politics, and administration. A major area of application has been management and organisational behaviour, with research looking at topics such as corporate transgressions, business ethics, and moral disengagement at work. [9] [5] Law and justice is another area where moral blindness, especially among lawyers, is seen as a concern. [29] [6] Some research has also described psychopathy as a specific kind of moral blindness, although the findings are not conclusive. [10]

The field has also been expanded to study broader ideas such as moral blind spots (overestimating ability to act ethically), [30] ethical erosion (gradual decline of ethics over time), [28] and ethical fading (when ethical concerns around a situation 'fade' during decision making). [31]

See also

Ethics
Milgram experiment
Stanley Milgram
Morality
Stanford prison experiment
Obedience (human behavior)
Moral character
Human subject research
Moral agency
Moral psychology
Evolutionary ethics
Sexual ethics
Situationism (psychology)
Moral disengagement
Animal ethics
Radical evil
Machine ethics
Behavioural ethics
Social experiment
Social determinism

References

  1. Palazzo, Guido; Krings, Franciska; Hoffrage, Ulrich (2012-09-01). "Ethical Blindness". Journal of Business Ethics. 109 (3): 323–338. doi:10.1007/s10551-011-1130-4. ISSN 1573-0697. S2CID 254381575.
  2. Oberhelman, David D. (2001-06-01). "Stanford Encyclopedia of Philosophy". Reference Reviews. Emerald Group Publishing Limited. 15 (6): 9. doi:10.1108/rr.2001.15.6.9.311. ISSN 0950-4125.
  3. Tucker, John A. (2015-02-03), Davis, Bret W. (ed.), "Japanese Neo-Confucian Philosophy", The Oxford Handbook of Japanese Philosophy, Oxford University Press, pp. 272–290, doi:10.1093/oxfordhb/9780199945726.013.16, ISBN 978-0-19-994572-6, retrieved 2020-11-30.
  4. Burin, Frederic S.; Arendt, Hannah (March 1964). "Eichmann in Jerusalem: A Report on the Banality of Evil". Political Science Quarterly. 79 (1): 122. doi:10.2307/2146583. ISSN 0032-3195. JSTOR 2146583.
  5. Barsky, Adam (2011-06-16). "Investigating the Effects of Moral Disengagement and Participation on Unethical Work Behavior". Journal of Business Ethics. 104 (1): 59–75. doi:10.1007/s10551-011-0889-7. ISSN 1573-0697. S2CID 144577232.
  6. Eldred, Tigran (2012-09-28). "Prescriptions for Ethical Blindness: Improving Advocacy for Indigent Defendants in Criminal Cases". Rutgers Law Review. Rochester, NY. SSRN 2153869.
  7. de Klerk, J. J. (2017-04-01). "Nobody is as Blind as Those Who Cannot Bear to See: Psychoanalytic Perspectives on the Management of Emotions and Moral Blindness". Journal of Business Ethics. 141 (4): 745–761. doi:10.1007/s10551-016-3114-x. ISSN 1573-0697. S2CID 147226367.
  8. Becoming Eichmann: rethinking the life, crimes, and trial of a "desk murderer". 2006-10-01.
  9. Bandura, Albert; Caprara, Gian-Vittorio; Zsolnai, Laszlo (2000). "Corporate Transgressions through Moral Disengagement". Journal of Human Values. 6 (1): 57–64. doi:10.1177/097168580000600106. S2CID 143829357.
  10. Larsen, Rasmus Rosenberg (2020-09-01). "Psychopathy as moral blindness: a qualifying exploration of the blindness-analogy in psychopathy theory and research". Philosophical Explorations. 23 (3): 214–233. doi:10.1080/13869795.2020.1799662. ISSN 1386-9795. S2CID 221361039.
  11. Cohon, Rachel (2018), "Hume's Moral Philosophy", in Zalta, Edward N. (ed.), The Stanford Encyclopedia of Philosophy (Fall 2018 ed.), Metaphysics Research Lab, Stanford University, retrieved 2020-11-29.
  12. García Moriyon (2011). Moral Blindness. 15th ICPIC International Conference, Gyeongsang National University, Jinju, South Korea. doi:10.13140/2.1.1717.0885.
  13. Hare, John (2019), "Religion and Morality", in Zalta, Edward N. (ed.), The Stanford Encyclopedia of Philosophy (Fall 2019 ed.), Metaphysics Research Lab, Stanford University, retrieved 2020-11-29.
  14. Cooper, Barton C. (1959-01-01). "The Alleged Indefinability of Good". The Journal of Philosophy. 56 (25): 977–985. doi:10.2307/2022719. JSTOR 2022719. Retrieved 2020-11-29.
  15. Bazerman, Max H.; Gino, Francesca (December 2012). "Behavioral Ethics: Toward a Deeper Understanding of Moral Judgment and Dishonesty". Annual Review of Law and Social Science. 8 (1): 85–104. doi:10.1146/annurev-lawsocsci-102811-173815. ISSN 1550-3585. S2CID 14311511.
  16. Hallpike, C. R. (2004). The Evolution of Moral Understanding. Alton: Prometheus Research Group. ISBN 0-9542168-4-9. OCLC 56463709.
  17. You, Di; Bebeau, Muriel J. (2013-11-01). "The independence of James Rest's components of morality: evidence from a professional ethics curriculum study". Ethics and Education. 8 (3): 202–216. doi:10.1080/17449642.2013.846059. ISSN 1744-9642. S2CID 144861318.
  18. "Eichmann Trial". encyclopedia.ushmm.org. Retrieved 2020-11-30.
  19. Russell, Nestar John Charles (2011). "Milgram's obedience to authority experiments: Origins and early evolution". British Journal of Social Psychology. 50 (1): 140–162. doi:10.1348/014466610X492205. ISSN 2044-8309. PMID 21366616.
  20. Schulweis, Harold M. (2009). Conscience: The Duty to Obey and the Duty to Disobey. Woodstock, Vt.: Jewish Lights Publishing. ISBN 978-1-58023-419-1. OCLC 731340449.
  21. Blass, Thomas (March 1991). "Understanding behavior in the Milgram obedience experiment: The role of personality, situations, and their interactions". Journal of Personality and Social Psychology. 60 (3): 398–413. doi:10.1037/0022-3514.60.3.398. ISSN 1939-1315.
  22. Bartels, Jared (2019-11-02). "Revisiting the Stanford prison experiment, again: Examining demand characteristics in the guard orientation". The Journal of Social Psychology. 159 (6): 780–790. doi:10.1080/00224545.2019.1596058. ISSN 0022-4545. PMID 30961456. S2CID 104295568.
  23. Bandura, Albert (1999-08-01). "Moral Disengagement in the Perpetration of Inhumanities". Personality and Social Psychology Review. 3 (3): 193–209. doi:10.1207/s15327957pspr0303_3. ISSN 1088-8683. PMID 15661671. S2CID 1589183.
  24. Bandura, Albert (2002-06-01). "Selective Moral Disengagement in the Exercise of Moral Agency". Journal of Moral Education. 31 (2): 101–119. doi:10.1080/0305724022014322. ISSN 0305-7240. S2CID 146449693.
  25. Gino, Francesca (2015-06-01). "Understanding ordinary unethical behavior: why people who value morality act immorally". Current Opinion in Behavioral Sciences. 3: 107–111. doi:10.1016/j.cobeha.2015.03.001. ISSN 2352-1546. S2CID 53205769.
  26. Chugh, Dolly; Bazerman, Max H.; Banaji, Mahzarin R. (2005-04-18), "Bounded Ethicality as a Psychological Barrier to Recognizing Conflicts of Interest", Conflicts of Interest, Cambridge University Press, pp. 74–95, doi:10.1017/cbo9780511610332.006, ISBN 978-0-521-84439-0, retrieved 2020-11-30.
  27. Gino, Francesca; Ayal, Shahar; Ariely, Dan (2009-03-01). "Contagion and Differentiation in Unethical Behavior: The Effect of One Bad Apple on the Barrel". Psychological Science. 20 (3): 393–398. doi:10.1111/j.1467-9280.2009.02306.x. ISSN 1467-9280. PMID 19254236. S2CID 10456659.
  28. Gino, Francesca; Moore, Don A.; Bazerman, Max H. (2008). "See No Evil: When We Overlook Other People's Unethical Behavior". SSRN Electronic Journal. doi:10.2139/ssrn.1079969. ISSN 1556-5068. S2CID 145409936.
  29. Hall, Katherine (2010). Why Good Intentions Are Often Not Enough: The Potential for Ethical Blindness in Legal Decision-Making. Routledge. ISBN 978-0-415-54653-9. Retrieved 2020-11-30.
  30. Bazerman, Max H.; Tenbrunsel, Ann E. (2011-12-31). Blind Spots. Princeton: Princeton University Press. doi:10.1515/9781400837991. ISBN 978-1-4008-3799-1.
  31. Tenbrunsel, Ann E.; Messick, David M. (June 2004). "Ethical Fading: The Role of Self-Deception in Unethical Behavior". Social Justice Research. 17 (2): 223–236. doi:10.1023/B:SORE.0000027411.35832.53. ISSN 0885-7466. S2CID 26603323.