Social intuitionism

In moral psychology, social intuitionism is a model proposing that moral positions are often non-verbal and behavioral. [1] The model draws support from studies of "moral dumbfounding", in which people have strong moral reactions but fail to establish any kind of rational principle to explain their reaction. [2]

Overview

Social intuitionism proposes four main claims about moral positions, namely that they are primarily

  1. intuitive ("intuitions come first")
  2. rationalized, justified, or otherwise explained after the fact
  3. taken mainly to influence other people
  4. often influenced and sometimes changed by discussing such positions with others. [3]

This model diverges from earlier rationalist theories of morality, such as Lawrence Kohlberg's stage theory of moral reasoning. [4] Inspired in part by work on motivated reasoning, automaticity, and Antonio Damasio's somatic marker hypothesis, Jonathan Haidt's (2001) social intuitionist model [1] de-emphasized the role of reasoning in reaching moral conclusions. Haidt asserts that moral judgment arises primarily from intuition, with reasoning playing a smaller role in most of our moral decision-making; conscious thought processes serve largely as post hoc justifications of decisions already made.

His main evidence comes from studies of "moral dumbfounding" [5] [6] in which people have strong moral reactions but fail to establish any kind of rational principle to explain their reaction. [7] An example situation in which moral intuitions are activated is as follows: Imagine that a brother and sister sleep together once. No one else knows, no harm befalls either one, and both feel it brought them closer as siblings. Most people imagining this incest scenario have a very strong negative reaction, yet cannot explain why. [8] Referring to earlier studies by Howard Margolis [9] and others, Haidt suggests that we have unconscious intuitive heuristics that generate our reactions to morally charged situations and underlie our moral behavior. He suggests that when people explain their moral positions, they often miss, if not hide, the core premises and processes that actually led to those conclusions. [10]

Haidt's model also states that moral reasoning is more likely to be interpersonal than private, reflecting social motives (reputation, alliance-building) rather than abstract principles. He does grant that interpersonal discussion (and, on very rare occasions, private reflection) can activate new intuitions which will then be carried forward into future judgments.

Reasons to doubt the role of cognition

Haidt (2001) lists four reasons to doubt the cognitive primacy model championed by Kohlberg and others. [1]

  1. There is considerable evidence that many evaluations, including moral judgments, take place automatically, at least in their initial stages (and these initial judgments anchor subsequent judgments).
  2. The moral reasoning process is highly biased by two sets of motives, which Haidt labels "relatedness" motives (relating to managing impressions and having smooth interactions with others) and "coherence" motives (preserving a coherent identity and worldview).
  3. The reasoning process has repeatedly been shown to create convincing post hoc justifications for behavior that are believed by people despite not actually correctly describing the reason underlying the choice. [11]
  4. According to Haidt, moral action covaries more with moral emotion than with moral reasoning.

These four arguments led Haidt to propose a major reinterpretation of decades of existing work on moral reasoning:

Because the justifications that people give are closely related to the moral judgments that they make, prior researchers have assumed that the justificatory reasons caused the judgments. But if people lack access to their automatic judgment processes then the reverse causal path becomes more plausible. If this reverse path is common, then the enormous literature on moral reasoning can be reinterpreted as a kind of ethnography of the a priori moral theories held by various communities and age groups. [1] :822

Objections to Haidt's model

One of the main criticisms of Haidt's model is that it underemphasizes the role of reasoning. [12] For example, Joseph Paxton and Joshua Greene (2010) review evidence suggesting that moral reasoning plays a significant role in moral judgment, including counteracting automatic tendencies toward bias. [13] Greene and colleagues have proposed an alternative to the social intuitionist model – the dual process model [14] – which suggests that deontological moral judgments, which involve rights and duties, are driven primarily by intuition, while utilitarian judgments aimed at promoting the greater good arise from controlled cognitive reasoning processes. In his 2008 article "The Secret Joke of Kant's Soul", [15] Greene argues that Kantian/deontological ethics tends to be driven by emotional responses and is best understood as rationalization rather than rationalism, an attempt to justify intuitive moral judgments post hoc, though he acknowledges that his argument is speculative and not conclusive. Several philosophers have written critical responses. [16] [17] [18] [19] Paul Bloom similarly criticizes Haidt's model on the grounds that intuition alone cannot account for historical changes in moral values; moral change, he argues, is largely a product of rational deliberation. [20]

Augusto Blasi emphasizes the importance of moral responsibility and of reflecting on one's intuitions. [21] His main argument is that some, if not most, intuitions tend to be self-centered and self-seeking. [22] Blasi questions whether the sequence Haidt describes for the average person (having an intuition, acting on it, and then justifying it) always occurs, concluding that not everyone follows this model. He also examines five default positions on intuition that he attributes to Haidt.

On Haidt's view, because such are the empirical facts, the "rationalistic" theories and methods of Piaget and Kohlberg are to be rejected; Blasi argues that Haidt does not provide adequate evidence to support this position. [23]

Other researchers have criticized the moral-dumbfounding evidence cited in support of social intuitionism, [2] arguing that these findings rely on a misinterpretation of participants' responses. [24] [25]


References

  1. Haidt, Jonathan (2001). "The emotional dog and its rational tail: A social intuitionist approach to moral judgment". Psychological Review. 108 (4): 814–834. doi:10.1037/0033-295X.108.4.814. PMID 11699120.
  2. Haidt, Jonathan; Björklund, Fredrik; Murphy, Scott (August 10, 2000). Moral Dumbfounding: When Intuition Finds No Reason (PDF). Unpublished manuscript.
  3. Haidt, Jonathan (2012). The Righteous Mind: Why Good People Are Divided by Politics and Religion. Pantheon. p. 913, Kindle ed. ISBN 978-0307377906.
  4. Levine, Charles; Kohlberg, Lawrence; Hewer, Alexandra (1985). "The Current Formulation of Kohlberg's Theory and a Response to Critics". Human Development. 28 (2): 94–100. doi:10.1159/000272945.
  5. McHugh, Cillian; McGann, Marek; Igou, Eric R.; Kinsella, Elaine L. (2017). "Searching for Moral Dumbfounding: Identifying Measurable Indicators of Moral Dumbfounding". Collabra: Psychology. 3 (1). doi:10.1525/collabra.79. hdl:10344/6306. ISSN 2474-7394.
  6. McHugh, Cillian; McGann, Marek; Igou, Eric R.; Kinsella, Elaine L. (2020). "Reasons or rationalizations: The role of principles in the moral dumbfounding paradigm". Journal of Behavioral Decision Making. x (x): 376–392. doi:10.1002/bdm.2167. hdl:10344/10452. ISSN 1099-0771. S2CID 214515549.
  7. Haidt, Jonathan (2012). The Righteous Mind. Pantheon. Loc. 539, Kindle ed. In footnote 29, Haidt credits the coining of the term "moral dumbfounding" to social/experimental psychologist Daniel Wegner.
  8. Haidt, Jonathan (2012). The Righteous Mind. Pantheon. Loc. 763, Kindle ed.
  9. Grover, Burton L. (1989). Review of Patterns, Thinking, and Cognition: A Theory of Judgment by Howard Margolis (Chicago: University of Chicago Press, 1987). The Educational Forum. 53 (2): 199–202. doi:10.1080/00131728909335595. ISSN 0013-1725.
  10. Haidt, Jonathan (2012). The Righteous Mind. Pantheon. Loc. 1160, Kindle ed.
  11. This was demonstrated in a classic paper: Nisbett, Richard E.; Wilson, Timothy D. (1977). "Telling more than we can know: Verbal reports on mental processes". Psychological Review. 84 (3): 231–259. doi:10.1037/0033-295X.84.3.231.
  12. LaFollette, Hugh; Woodruff, Michael L. (2013). "The limits of Haidt: How his explanation of political animosity fails". Philosophical Psychology. 28 (3): 452–465. doi:10.1080/09515089.2013.838752. S2CID 142745897.
  13. Paxton, Joseph M.; Greene, Joshua D. (2010). "Moral Reasoning: Hints and Allegations". Topics in Cognitive Science. 2 (3): 511–527. doi:10.1111/j.1756-8765.2010.01096.x. PMID 25163874.
  14. Greene, J. D. (2001). "An fMRI Investigation of Emotional Engagement in Moral Judgment". Science. 293 (5537): 2105–2108. Bibcode:2001Sci...293.2105G. doi:10.1126/science.1062872. PMID 11557895. S2CID 1437941.
  15. Greene, Joshua D. (2008). "The Secret Joke of Kant's Soul". In Sinnott-Armstrong, Walter (ed.). Moral Psychology, Vol. 3: The Neuroscience of Morality. MIT Press. https://psycnet.apa.org/record/2007-14534-005
  16. Lott, Micah (2016). "Moral Implications from Cognitive (Neuro)Science? No Clear Route". Ethics. 127 (1): 241–256. doi:10.1086/687337. S2CID 151940241.
  17. Königs, Peter (2018). "Two types of debunking arguments". Philosophical Psychology. 31 (3): 383–402. doi:10.1080/09515089.2018.1426100. S2CID 148678250.
  18. Meyers, C. D. (2015). "Brains, trolleys, and intuitions: Defending deontology from the Greene/Singer argument". Philosophical Psychology. 28 (4): 466–486. doi:10.1080/09515089.2013.849381. S2CID 146547149.
  19. Kahane, Guy (2012). "On the Wrong Track: Process and Content in Moral Psychology". Mind & Language. 27 (5): 519–545. doi:10.1111/mila.12001. PMC 3546390. PMID 23335831. S2CID 184105.
  20. Bloom, Paul (2010). "How do morals change?". Nature. 464 (7288): 490. Bibcode:2010Natur.464..490B. doi:10.1038/464490a. PMID 20336117.
  21. Narvaez, Darcia; Lapsley, Daniel K. (2009). Personality, Identity, and Character: Explorations in Moral Psychology. Cambridge University Press. p. 423. ISBN 978-0-521-89507-1.
  22. Narvaez & Lapsley 2009, p. 397.
  23. Narvaez & Lapsley 2009, p. 412.
  24. Guglielmo, Steve (2018). "Unfounded dumbfounding: How harm and purity undermine evidence for moral dumbfounding". Cognition. 170: 334–337. doi:10.1016/j.cognition.2017.08.002. PMID 28803616. S2CID 46809661.
  25. Royzman, Edward B.; Kim, Kwanwoo; Leeman, Robert F. (2015). "The curious tale of Julie and Mark: Unraveling the moral dumbfounding effect". Judgment and Decision Making. 10 (4): 296–313. doi:10.1017/S193029750000512X. S2CID 55658416.