Moral psychology

Moral psychology is the study of human thought and behavior in ethical contexts. [1] Historically, the term "moral psychology" was used relatively narrowly to refer to the study of moral development. [2] [3] The field is interdisciplinary, drawing on both philosophy and psychology. "Moral psychology" eventually came to refer more broadly to various topics at the intersection of ethics, psychology, and philosophy of mind. [4] [5] [6] Some of the main topics of the field are moral judgment, moral reasoning, moral satisficing, moral sensitivity, moral responsibility, moral motivation, moral identity, moral action, moral development, moral diversity, moral character (especially as related to virtue ethics), altruism, psychological egoism, moral luck, moral forecasting, moral emotion, affective forecasting, and moral disagreement. [7] [8]

Today, moral psychology is a thriving area of research spanning many disciplines, [9] with major bodies of research on the biological, [10] [11] cognitive/computational, [12] [13] [14] and cultural [15] [16] bases of moral judgment and behavior, and a growing body of research on moral judgment in the context of artificial intelligence. [17] [18]

History

The origins of moral psychology can be traced back to early philosophical works, largely concerned with moral education, such as those by Plato and Aristotle in Ancient Greece, [19] [20] as well as to the Buddhist [21] and Confucian traditions. [22] [23] [24] Empirical studies of moral judgment go back at least as far as the 1890s with the work of Frank Chapman Sharp, [25] coinciding with the development of psychology as a discipline separate from philosophy. Since at least 1894, philosophers and psychologists attempted to empirically evaluate the morality of an individual, [26] [27] especially attempting to distinguish adults from children in terms of their judgment. Unfortunately, these efforts failed because they "attempted to quantify how much morality an individual had—a notably contentious idea—rather than understand the individual's psychological representation of morality". [28] :284

[I]f you said that you studied moral psychology in the 1980s, then you probably studied the development of moral reasoning. You didn't need to agree with Kohlberg on any particular claim, but you lived and worked on land that Kohlberg had cleared.

Jonathan Haidt [29] :282

In most introductory psychology courses, students learn about moral psychology by studying the psychologist Lawrence Kohlberg, [30] [31] [32] who proposed a highly influential theory of moral development, developed throughout the 1950s and 1960s. This theory was built on Piaget's observation that children develop intuitions about justice that they can later articulate.[ citation needed ] Kohlberg proposed six stages broken into three categories of moral reasoning that he believed to be universal to all people in all cultures. [33] The increasing sophistication of justice-based reasoning was taken as a sign of development. Moral cognitive development, in turn, was assumed to be a necessary (but not sufficient) condition for moral action. [34]

Researchers using the Kohlberg model, however, found a gap between what people said was most moral and the actions they took. In response, Augusto Blasi proposed his self-model, [35] which links moral judgment and action through moral commitment. Those with moral goals central to the self-concept are more likely to take moral action, as they feel a greater obligation to do so; those so motivated develop a distinct moral identity. [36]

Following the independent publication of a pair of landmark papers in 2001 (respectively led by Jonathan Haidt and Joshua Greene), [37] [38] there was a surge in interest in moral psychology across a broad range of subfields of psychology, with interest shifting away from developmental processes towards a greater emphasis on social, cognitive, affective and neural processes involved in moral judgment. [2] [6] [39]

Methods

The trolley problem, a commonly used moral dilemma in psychological research

Philosophers, psychologists and researchers from other fields have created various methods for studying topics in moral psychology, with empirical studies dating back to at least the 1890s. [28] The methods used in these studies include moral dilemmas such as the trolley problem, [25] [38] structured interviews and surveys as a means to study moral psychology and its development, as well as the use of economic games, [40] neuroimaging, [41] and studies of natural language use. [42]

Interview techniques

In 1963, Lawrence Kohlberg presented an approach to studying differences in moral judgment by modeling evaluative diversity as reflecting a series of developmental stages (à la Jean Piaget). Lawrence Kohlberg's stages of moral development are: [43]

  1. Obedience and punishment orientation
  2. Self-interest orientation
  3. Interpersonal accord and conformity
  4. Authority and social-order maintaining orientation
  5. Social contract orientation
  6. Universal ethical principles

Stages 1 and 2 are combined into a single level labeled "pre-conventional", and stages 5 and 6 are combined into a single level labeled "post-conventional"; psychologists can consistently categorize subjects into the resulting four stages using the "Moral Judgment Interview", which asks subjects why they endorse the answers they give to a standard set of moral dilemmas. [31]

Survey instruments

Between 1910 and 1930, in the United States and Europe, several morality tests were developed to classify subjects as either fit or unfit to make moral judgments. [28] [44] Test-takers would classify or rank standardized lists of personality traits, hypothetical actions, or pictures of hypothetical scenes. As early as 1926, catalogs of personality tests included sections specifically for morality tests, though critics persuasively argued that they merely measured intelligence or awareness of social expectations. [28]

Meanwhile, Kohlberg inspired a new series of morality tests. The Defining Issues Test (dubbed "Neo-Kohlbergian" by its proponents) scores relative preference for post-conventional justifications, [45] [46] and the Moral Judgment Test scores the consistency of one's preferred justifications. [47] [48] Both treat evaluative ability as similar to IQ (hence the single score), allowing categorization by high score vs. low score.

Among the more recently developed survey measures, the Moral Foundations Questionnaire [49] is a widely used survey measure of the five moral intuitions proposed by Moral Foundations Theory: care/harm, fairness/cheating, loyalty/betrayal, authority/subversion, and sanctity/degradation. The questions ask respondents to rate various considerations in terms of how relevant they are to the respondent's moral judgments. The purpose of the questionnaire is to measure the degree to which people rely upon each of the five moral intuitions (which may coexist). A revised version of this instrument, the Moral Foundations Questionnaire-2 (MFQ-2), was developed in 2023. In this version, Fairness was split into Equality and Proportionality; the MFQ-2 thus measures Care, Equality, Proportionality, Loyalty, Authority, and Purity. [50] In addition to survey instruments measuring endorsement of moral foundations, a number of other contemporary survey measures exist relating to other broad taxonomies of moral values, [51] [52] [53] as well as more specific moral beliefs, [54] [55] or concerns. [56] [57]
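For illustration, foundation scores on questionnaires of this kind are typically computed by averaging a respondent's ratings of the items keyed to each foundation. The sketch below assumes hypothetical item ratings and groupings; it is not the actual MFQ-2 item key or scoring procedure.

```python
# Minimal sketch of foundation-level scoring: average each respondent's
# item ratings (e.g., 0-5 relevance/agreement) within each foundation.
# The item groupings and ratings below are illustrative assumptions.

from statistics import mean

# Hypothetical mapping from foundation to one respondent's item ratings.
ratings = {
    "care":            [4, 5, 4],
    "equality":        [3, 4, 4],
    "proportionality": [5, 4, 4],
    "loyalty":         [2, 1, 2],
    "authority":       [1, 2, 2],
    "purity":          [0, 1, 1],
}

def foundation_scores(ratings):
    """Return the mean item rating per foundation for one respondent."""
    return {foundation: mean(items) for foundation, items in ratings.items()}

scores = foundation_scores(ratings)
```

Because the foundations may coexist, each respondent receives a profile of six scores rather than a single summary score.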

Evolutionary origins

Cooperative behavior has been observed in many nonhuman animals.

According to Haidt, [29] the belief that morality is not innate was one of the few theoretical commitments uniting many of the prominent psychologists studying morality in the twentieth century (with some exceptions [58] [59] ). A substantial amount of research in recent decades has focused on the evolutionary origins of various aspects of morality. [60] [61] [62] [63]

In Unto Others: The Evolution and Psychology of Unselfish Behavior (1998), Elliott Sober and David Sloan Wilson argued that diverse moralities could evolve through group selection. [64] In particular, they challenged the idea that natural selection favors a homogeneous population in which all creatures care only about their own personal welfare or behave only in ways that advance their own personal reproduction. [64]

Tim Dean has advanced the more general claim that moral diversity could evolve through frequency-dependent selection, because each moral approach is vulnerable to a different set of situations that threatened our ancestors. [65]
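Frequency-dependent selection means that a strategy's fitness falls as it becomes more common, which can stabilize a mixture of types rather than a homogeneous population. The toy replicator-dynamics sketch below illustrates the mechanism with a hawk-dove-style payoff matrix; the payoffs and setup are illustrative assumptions, not a model taken from Dean's work.

```python
# Toy illustration of frequency-dependent selection: in a hawk-dove-style
# game, each strategy earns less as it becomes more common, so selection
# stabilizes a mixed population rather than driving one type to fixation.
# Payoffs are arbitrary positive numbers chosen for illustration.

# Payoff to the row strategy when meeting the column strategy.
PAYOFF = {
    ("hawk", "hawk"): 1.0, ("hawk", "dove"): 4.0,
    ("dove", "hawk"): 2.0, ("dove", "dove"): 3.0,
}

def step(p_hawk):
    """One generation of discrete replicator dynamics.

    p_hawk is the current frequency of hawks in the population.
    """
    f_hawk = p_hawk * PAYOFF[("hawk", "hawk")] + (1 - p_hawk) * PAYOFF[("hawk", "dove")]
    f_dove = p_hawk * PAYOFF[("dove", "hawk")] + (1 - p_hawk) * PAYOFF[("dove", "dove")]
    f_mean = p_hawk * f_hawk + (1 - p_hawk) * f_dove
    return p_hawk * f_hawk / f_mean  # types reproduce in proportion to fitness

p = 0.9  # start with hawks near fixation
for _ in range(200):
    p = step(p)
# p converges toward the mixed equilibrium (0.5 with these payoffs),
# where both strategies have equal fitness.
```

With these payoffs, whichever type is rarer earns the higher average payoff, so neither can be eliminated; the population settles at the frequency where fitnesses are equal.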

Topics and theories

Moral identity

Moral identity refers to the importance of morality to a person's identity, typically construed either as a trait-like individual difference or as a set of chronically accessible schemas. [36] [66] There are two main perspectives on moral identity. The first is the trait-based perspective, in which certain personality traits are triggered during moral situations. The second is the socio-cognitive perspective, in which moral identity is a "self-schema" formed through the social environment. [67] Moral identity is theorized to be one of the key motivational forces connecting moral reasoning to moral behavior, [66] as suggested by a 2016 meta-analysis reporting that moral identity is positively (albeit only modestly) associated with moral behavior. [68] Although moral identity mainly concerns moral action, "moral disengagement" can also take place, reducing the perceived negative consequences of an action or a failure to act. [69]

Moral satisficing

The theory of moral satisficing applies the study of ecological rationality to moral behavior. [70] [71] In this view, much of moral behavior is based on social heuristics rather than traits, virtues, or utilitarian calculations. Social heuristics are a form of satisficing, a term coined by Nobel laureate Herbert Simon. [72] Social heuristics are not good or bad, or beneficial or harmful, per se, but only in relation to the environments in which they are used. For instance, an adolescent may commit a crime not because of an evil character or a utilitarian calculation but because of following the social heuristic "do what your peers do." After shifting to a different peer group, the same person's behavior may shift to a more socially desirable outcome – by relying on the very same heuristic. From this perspective, moral behavior is thus not simply a consequence of inner virtue or traits, but a function of both the mind and the environment, a view based on Simon's scissors analogy. [73] Many other moral theories, in contrast, consider the mind alone, such as Kohlberg's stage theory, identity theories, virtue theories, and willpower theories.

The ecological perspective has methodological implications for the study of morality: According to it, behavior needs to be studied in social groups and not only in individuals, in natural environments and not only in labs. Both principles are violated, for instance, by the study of how individuals respond to artificial trolley problems. The theory of moral satisficing also has implications for moral policy, implying that problematic behavior can be changed by changing the environment, not only the individual.

Darwin argued that one original function of morality was the coherence and coordination of groups. [74] This suggests that social heuristics that generate coherence and coordination are also those that guide moral behavior. These social heuristics include imitate-your-peers, equality (divide a resource equally), and tit-for-tat (be kind first, then imitate your partner’s behavior). [75] [76] In general, the social heuristics of individuals or institutions shape their moral fabric.
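The social heuristics named above can be written down as simple decision rules. The sketch below is a hypothetical rendering for illustration, not an implementation from the cited sources; the function names and inputs are assumptions.

```python
# Sketch of two of the social heuristics described above, expressed as
# simple decision rules (illustrative only, not from the cited work).

from collections import Counter

def tit_for_tat(partner_history):
    """Be kind first, then imitate the partner's previous behavior."""
    if not partner_history:      # first encounter: cooperate
        return "cooperate"
    return partner_history[-1]   # afterwards, mirror the partner's last move

def imitate_your_peers(peer_behaviors):
    """Do whatever the majority of one's peer group does."""
    return Counter(peer_behaviors).most_common(1)[0][0]

# The same heuristic yields different behavior in different environments:
assert imitate_your_peers(["defect", "defect", "cooperate"]) == "defect"
assert imitate_your_peers(["cooperate", "cooperate", "defect"]) == "cooperate"
```

The two assertions illustrate the ecological point made above: an identical rule produces antisocial behavior in one peer group and prosocial behavior in another.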

Moral satisficing explains two phenomena that pose a puzzle for virtue and trait theories: moral luck and systematic inconsistencies, as when teens who voluntarily made a virginity pledge were just as likely to have premarital sex as their peers who did not. [77] From an ecological view of morality, such inconsistencies are to be expected when individuals move from one environment to another.

Nagel (1979, p. 59) defines moral luck as follows: "Where a significant aspect of what someone does depends on factors beyond his control, yet we continue to treat him in that respect as an object of moral judgment, it can be called moral luck." [78] Others have voiced concerns that moral luck poses a limit to improving our moral behavior and makes it difficult to evaluate behavior as right or wrong. [79] Yet this concern is based on an internal view of the causes of moral behavior; from an ecological view, moral luck is an inevitable consequence of the interaction between mind and environment. A teen is morally lucky not to have grown up in a criminal peer group, and an adult is morally lucky not to have been conscripted into an army.

Moral satisficing postulates that behavior is guided by social heuristics, not by moral rules such as “don’t kill”, as assumed in theories of moral heuristics [80] or in Hauser’s “moral grammar” with hard-wired moral rules. [81] Moral satisficing postulates that moral rules are essentially social heuristics that ultimately serve the coordination and cooperation of social groups.

Moral values

Psychologist Shalom Schwartz defines individual values as "conceptions of the desirable that guide the way social actors (e.g. organisational leaders, policymakers, individual persons) select actions, evaluate people and events, and explain their actions and evaluations." [82] Cultural values form the basis for social norms, laws, customs, and practices. While individual values vary case by case (a result of unique life experience), the average of these values points to widely held cultural beliefs (a result of shared cultural values).

Kristiansen and Hotte [83] reviewed many research articles regarding people's values and attitudes and whether these guide behavior. Based on the research they reviewed and their own extension of Ajzen and Fishbein's theory of reasoned action, they conclude that the value-attitude-behavior relation depends on the individual and their moral reasoning. Another issue that Kristiansen and Hotte discovered through their research was that individuals tended to "create" values to justify their reactions to certain situations, which they called the "value justification hypothesis". [83] Their theory is comparable to Jonathan Haidt's social intuitionist theory, [37] in which individuals justify their intuitive emotions and actions through post-hoc moral reasoning.

Kristiansen and Hotte also found that independent selves' actions and behaviors are influenced by their own thoughts and feelings, whereas interdependent selves' actions, behaviors, and self-concepts are based on the thoughts and feelings of others. Westerners have two dimensions of emotion, activation and pleasantness; the Japanese have a third, the range of their interdependent relationships. Markus and Kitayama found that these two types of values have different motives: Westerners, in their explanations, show self-bettering biases, while Easterners tend to show "other-oriented" biases. [83]

Moral foundations theory

Moral foundations theory, first proposed in 2004 by Jonathan Haidt and Craig Joseph, [84] attempts to explain the origins of and variation in human moral reasoning on the basis of innate, modular foundations. [85] Notably, moral foundations theory has been used to describe the difference between the moral foundations of political liberals and political conservatives. [86] [87] Haidt and Joseph expanded on previous research by Shweder and his three-ethics theory. [84] Shweder's theory consisted of three moral ethics: the ethics of community, autonomy, and divinity. [88] Haidt and Graham took this theory and extended it by describing the five psychological systems that more specifically make up the three moral ethics. The importance of each of these five foundations of morality varies across cultures, each of which constructs virtues based on its emphasized foundations. The five psychological foundations are:

  • Harm/care, which starts with the sensitivity to signs of suffering in offspring and develops into a general dislike of seeing suffering in others and the potential to feel compassion in response.
  • Fairness/reciprocity, which is developed when someone observes or engages in reciprocal interactions. This foundation is concerned with virtues related to fairness and justice.
  • Ingroup/loyalty, which constitutes recognizing, trusting, and cooperating with members of one's ingroup as well as being wary of members of other groups.
  • Authority/respect, which is how someone navigates hierarchical ingroups and communities.
  • Purity/sanctity, which stems from the emotion of disgust that guards the body by responding to elicitors that are biologically or culturally linked to disease transmission.

The five foundations theory is both a nativist and a cultural-psychological theory. Modern moral psychology holds that "morality is about protecting individuals" and focuses primarily on issues of justice (harm/care and fairness/reciprocity). [86] :99 Haidt and Graham's research found that "justice and related virtues...make up half of the moral world for liberals, while justice-related concerns make up only one fifth of the moral world for conservatives". [86] :99 Liberals value harm/care and fairness/reciprocity significantly more than the other foundations, while conservatives value all five equally. Ownership has also been argued to be a strong candidate for a moral foundation. [89]

Moral virtues

In 2004, D. Lapsley and D. Narvaez outlined how social cognition explains aspects of moral functioning. [90] Their social cognitive approach to personality has six critical resources of moral personality: cognition, self-processes, affective elements of personality, changing social context, lawful situational variability, and the integration of other literature. Lapsley and Narvaez suggest that moral values and actions stem from more than our virtues and are controlled by a set of self-created schemas (cognitive structures that organize related concepts and integrate past events). They claim that schemas are "fundamental to our very ability to notice dilemmas as we appraise the moral landscape" and that over time, people develop greater "moral expertise". [91]

Triune ethics theory

The triune ethics meta-theory (TEM) has been proposed by Darcia Narvaez as a metatheory that highlights the relative contributions to moral development of biological inheritance (including human evolutionary adaptations), environmental influences on neurobiology, and the role of culture. [92] TEM proposes three basic mindsets that shape ethical behavior: self-protectionism (of various types), engagement, and imagination (of various types, fueled by either protectionism or engagement). A mindset influences perception, affordances, and rhetorical preferences. Actions taken within a mindset become an ethic when they trump other values. Engagement and communal imagination represent optimal human functioning, shaped by the evolved developmental niche (evolved nest) that supports optimal psychosocial neurobiological development. [93] Based on worldwide anthropological research (e.g., Hewlett and Lamb's Hunter-Gatherer Childhoods), Narvaez uses small-band hunter-gatherers as a baseline for the evolved nest and its effects.

Moral reasoning and development

Moral development and reasoning are two overlapping topics of study in moral psychology that have historically received a great amount of attention, even preceding the influential work of Piaget and Kohlberg. [28] Moral reasoning refers specifically to the study of how people think about right and wrong and how they acquire and apply moral rules. [94] Moral development refers more broadly to age-related changes in thoughts and emotions that guide moral beliefs, judgments and behaviors. [95]

Kohlberg's stage theory

Jean Piaget, in watching children play games, noted how their rationales for cooperation changed with experience and maturation. [96] He identified two stages, heteronomous (morality centered outside the self) and autonomous (internalized morality). Lawrence Kohlberg sought to expand Piaget's work. His cognitive developmental theory of moral reasoning dominated the field for decades. He focused on moral development as one's progression in the capacity to reason about justice. Kohlberg's interview method included hypothetical moral dilemmas or conflicts of interest (most notably, the Heinz dilemma). He proposed six stages and three levels of development, claiming that "anyone who interviewed children about dilemmas and who followed them longitudinally in time would come to our six stages and no others". [97] At the preconventional level, the first two stages included the punishment-and-obedience orientation and the instrumental-relativist orientation. The next level, the conventional level, included the interpersonal concordance or "good boy – nice girl" orientation, along with the "law and order" orientation. Lastly, the postconventional level consisted of the social-contract, legalistic orientation and the universal-ethical-principle orientation. [98] According to Kohlberg, an individual is considered more cognitively mature depending on their stage of moral reasoning, which grows as they advance in education and world experience.

Critics of Kohlberg's approach (such as Carol Gilligan and Jane Attanucci) argue that it over-emphasizes justice and under-emphasizes an additional perspective on moral reasoning, known as the care perspective. The justice perspective draws attention to inequality and oppression, while striving for reciprocal rights and equal respect for all. The care perspective draws attention to the ideas of detachment and abandonment, while striving for attention and response to people who need it. The care orientation is relationally based: it has a more situational focus that depends on the needs of others, as opposed to the justice orientation's objectivity. [99] However, reviews by others have found that Gilligan's theory was not supported by empirical studies, since moral orientation depends on the individual rather than on gender. [100] [101] In fact, in neo-Kohlbergian studies with the Defining Issues Test, females tend to get slightly higher scores than males. [102] [ page needed ]

The attachment approach to moral judgment

Aner Govrin's attachment approach to moral judgment [103] [104] [105] proposes that, through early interactions with the caregiver, the child acquires an internal representation of a system of rules that determine how right/wrong judgments are to be construed, used, and understood. By breaking moral situations down into their defining features, the attachment model of moral judgment outlines a framework for a universal moral faculty based on a universal, innate, deep structure that appears uniformly in the structure of almost all moral judgments regardless of their content.

Moral behaviour

Historically, major topics of study in the domain of moral behavior have included violence and altruism, [106] [107] bystander intervention, and obedience to authority (e.g., the Milgram experiment [108] and Stanford prison experiment [109] ). [2] [110] Recent research on moral behavior uses a wide range of methods, including experience sampling to estimate the actual prevalence of various kinds of moral behavior in everyday life. [111] [112] Research has also focused on variation in moral behavior over time, through studies of phenomena such as moral licensing. [113] [114] Yet other studies focusing on social preferences examine various kinds of resource allocation decisions, [15] [115] or use incentivized behavioral experiments to investigate how people weigh their own interests against other people's when deciding whether to harm others, for example by examining how willing people are to administer electric shocks to themselves vs. others in exchange for money. [116]

James Rest reviewed the literature on moral functioning and identified at least four components necessary for a moral behavior to take place: [117] [118]

  1. Moral sensitivity – noticing and interpreting the situation as one involving a moral issue
  2. Moral judgment – reasoning about the possible courses of action and deciding which is morally right
  3. Moral motivation – prioritizing the moral course of action over competing values and interests
  4. Moral character – having the persistence and strength to follow through on the chosen moral action

Reynolds and Ceranic researched the effects of social consensus on one's moral behavior. Depending on the level of social consensus (high vs. low), moral behaviors will require greater or lesser degrees of moral identity to motivate an individual to make a choice and endorse a behavior. Also, depending on social consensus, particular behaviors may require different levels of moral reasoning. [119]

More recent attempts to develop an integrated model of moral motivation [120] have identified at least six different levels of moral functioning, each of which has been shown to predict some type of moral or pro-social behavior: moral intuitions, moral emotions, moral virtues/vices (behavioral capacities), moral values, moral reasoning, and moral willpower. This social intuitionist model of moral motivation [121] suggests that moral behaviors are typically the product of multiple levels of moral functioning, and are usually energized by the "hotter" levels of intuition, emotion, and behavioral virtue/vice. The "cooler" levels of values, reasoning, and willpower, while still important, are proposed to be secondary to the more affect-intensive processes.

Moral behavior is also studied under the umbrella of personality psychology. Topics within personality psychology include the traits or individual differences underlying moral behavior, such as generativity, self-control, agreeableness, cooperativeness and honesty/humility, [122] [123] [124] as well as moral change goals, [125] among many other topics.

Regarding interventions aimed at shaping moral behavior, a 2009 meta-analysis of business ethics instruction programs found that such programs have only "a minimal impact on increasing outcomes related to ethical perceptions, behavior, or awareness." [126] A 2005 meta-analysis [127] suggested that positive affect can at least momentarily increase prosocial behavior (with subsequent meta-analyses also showing that prosocial behavior reciprocally increases positive affect in the actor [128] [129]).

Value-behavior consistency

In looking at the relations between moral values, attitudes, and behaviors, previous research suggests that there is less correspondence among these three aspects than one might assume. [130] Indeed, it appears more common for people to label their behaviors with a justifying value than to hold a value beforehand and then act on it. Some people are more likely to act on their personal values: those low in self-monitoring and high in self-consciousness, because they are more aware of themselves and less concerned with how others may perceive them. (Self-consciousness here means literally being more conscious of oneself, not fearing judgment or feeling anxiety around others.) Social situations and the different categories of norms can indicate when people are likely to act in accordance with their values, but even this is not definitive. People typically act in accordance with social, contextual, and personal norms, and these norms may in turn align with one's moral values. Although certain assumptions and situations suggest a strong value-attitude-behavior relation, there is not enough research to confirm this phenomenon.

Moral willpower

Building on earlier work by Metcalfe and Mischel on delayed gratification, [131] Baumeister, Miller, and Delaney explored the notion of willpower by first defining the self as being made up of three parts: reflexive consciousness, or the person's awareness of their environment and of themselves as an individual; interpersonal being, which seeks to mold the self into one that will be accepted by others; and executive function. [132] They stated, "[T]he self can free its actions from being determined by particular influences, especially those of which it is aware". [133] The three prevalent theories of willpower describe it as a limited supply of energy, as a cognitive process, and as a skill that is developed over time. Research has largely supported that willpower works like a "moral muscle" with a limited supply of strength that may be depleted (a process referred to as ego depletion), conserved, or replenished, and that a single act requiring much self-control can significantly deplete the "supply" of willpower. [132] While exertion reduces the ability to engage in further acts of willpower in the short term, such exertions actually improve a person's ability to exert willpower for extended periods in the long run. [134] Additional research has been conducted that may cast doubt on the idea of ego depletion. [135]

Moral intuitions

In 2001, Jonathan Haidt introduced his social intuitionist model, which claimed that, with few exceptions, moral judgments are made on the basis of socially derived intuitions. Moral intuitions happen immediately, automatically, and unconsciously, with reasoning largely serving to generate post-hoc rationalizations to justify one's instinctual reactions. [37] He provides four arguments for doubting the causal importance of reason. First, Haidt argues that since the brain makes automatic evaluations and assessments through a dual-process system, this same process must be applicable to moral judgment as well. The second argument, based on research on motivated reasoning, claims that people behave like "intuitive lawyers", searching primarily for evidence that will serve motives for social relatedness and attitudinal coherence. Third, Haidt found that people engage in post hoc reasoning when faced with a moral situation; this a posteriori (after-the-fact) explanation gives the illusion of objective moral judgment but is in reality subject to one's gut feeling. Finally, research has shown that moral emotion has a stronger link to moral action than moral reasoning does, citing Damasio's research on the somatic marker hypothesis and Batson's empathy-altruism hypothesis. [37]

Similarly, in his theory of moral satisficing, Gerd Gigerenzer argues that moral behavior is not solely a result of deliberate reasoning but also of social heuristics that are embedded in social environments. In other words, intuitionist theories can use heuristics to explain intuition. He emphasizes that these heuristics are key to understanding moral behavior. Modifying moral behavior therefore entails changing heuristics and/or modifying environments rather than focusing on individuals. In this way, moral satisficing extends social intuitionism by adding both concrete heuristics and a focus on the environments with which the heuristics interact to produce behavior. [136] [137]

Following the publication of a landmark fMRI study in 2001, [38] Joshua Greene separately proposed his dual process theory of moral judgment, according to which intuitive/emotional and deliberative processes respectively give rise to characteristically deontological and consequentialist moral judgments. A "deontologist" is someone who has rule-based morality that is mainly focused on duties and rights; in contrast, a "consequentialist" is someone who believes that only the best overall consequences ultimately matter. [138]

Moral emotions

Moral emotions are a variety of social emotions that are involved in forming and communicating moral judgments and decisions, and in motivating behavioral responses to one's own and others' moral behavior. [139] [140] [141] While moral reasoning has been the focus of most study of morality dating back to Plato and Aristotle, the emotive side of morality was historically regarded with disdain in early moral psychology research. [139] However, in the last 30–40 years, a new front of research has emerged: moral emotions as the basis for moral behavior. [141] This development began with a focus on empathy and guilt, but has since moved on to encompass new scholarship on emotions such as anger, shame, disgust, awe, and elevation. While different moral transgressions have been linked to different emotional reactions, bodily reactions to such transgressions are not very different and can be characterized by felt activations in the gut and head areas. [142]

Moralization and moral conviction

Moralization, a term introduced to moral psychology by Paul Rozin, refers to the process through which preferences are converted into values. [143] [144] [145] Relatedly, Linda Skitka and colleagues have introduced the concept of moral conviction, which refers to a "strong and absolute belief that something is right or wrong, moral or immoral." [146] [147] According to Skitka's integrated theory of moral conviction (ITMC), attitudes held with moral conviction, known as moral mandates, differ from strong but non-moral attitudes in a number of important ways. Namely, moral mandates derive their motivational force from their perceived universality, perceived objectivity, and strong ties to emotion. [148] Perceived universality refers to the notion that individuals experience moral mandates as transcending persons and cultures; additionally, they are regarded as matters of fact. Regarding association with emotion, ITMC is consistent with Jonathan Haidt's social intuitionist model in stating that moral judgments are accompanied by discrete moral emotions (i.e., disgust, shame, guilt). Importantly, Skitka maintains that moral mandates are not the same thing as moral values. Whether an issue will be associated with moral conviction varies across persons.

One of the main lines of ITMC research addresses the behavioral implications of moral mandates. Individuals prefer greater social and physical distance from attitudinally dissimilar others when moral conviction is high, an effect that cannot be explained by traditional measures of attitude strength, extremity, or centrality. Skitka, Bauman, and Sargis placed participants in either attitudinally heterogeneous or homogeneous groups to discuss procedures regarding two morally mandated issues, abortion and capital punishment. Those in attitudinally heterogeneous groups demonstrated the least goodwill towards other group members, the least cooperation, and the most tension and defensiveness. Furthermore, individuals discussing a morally mandated issue were less likely to reach a consensus than those discussing non-moral issues. [149]

Moral enhancement

Main article: Moral enhancement

Moral enhancement (abbreviated ME), also called moral bioenhancement (abbreviated MBE), is the use of biomedical technology to morally improve individuals. [150] A related subdiscipline is known as traditional moral enhancement (TME). [151] Moral enhancement has also been understood as the altering of moral behavior, moral traits, moral decision making, and/or cognitive abilities, though there is no clear-cut definition of all that it entails. The scientist J. B. S. Haldane remarked that the prospect "is only hopeful if mankind can adjust its morality to its powers"; his focus was on using moral enhancement to improve the functioning of society as a whole by bringing its morality into balance with its use of science. [152]

Intersections with other fields

Sociological applications

Some research shows that people tend to self-segregate based on moral and political views, [153] [154] exaggerate the magnitude of moral disagreements across political divides, [155] and avoid exposure to the opinions of those with opposing political views. [156]

Normative implications

Researchers have begun to debate the implications (if any) that moral psychology research has for other subfields of ethics, such as normative ethics and meta-ethics. [157] [158] [159] [160] [161] For example, Peter Singer, citing Haidt's work on social intuitionism and Greene's dual process theory, presented an "evolutionary debunking argument" suggesting that the normative force of our moral intuitions is undermined by their being the "biological residue of our evolutionary history." [162] John Michael Doris discusses the way in which social psychological experiments—such as the Stanford prison experiment, involving the idea of situationism—call into question a key component of virtue ethics: the idea that individuals have a single, environment-independent moral character. [163] [ page needed ] As a further example, Shaun Nichols (2004) examines how empirical data on psychopathology suggests that moral rationalism is false. [164] [ page needed ]

Additionally, research in moral psychology is being used to inform debates in applied ethics around moral enhancement. [165] [166]

Robotics and artificial intelligence

At the intersection of moral psychology and machine ethics, researchers have begun to study people's views regarding the potentially ethically significant decisions that will be made by self-driving cars. [18] [17] [167] [168]

Mohammad Atari and his colleagues recently examined the moral psychology of the chatbot ChatGPT. The authors ask in their title, "which humans?" — rhetorically pointing out that people should not ask how "human-like" machine morality is, but which humans it resembles. [169] They found that large language models (LLMs), especially ChatGPT, tend to echo moral values endorsed by Westerners, as their training datasets originate predominantly from Western, Educated, Industrialized, Rich, and Democratic (WEIRD) societies. [170] The study points out that, compared to the global average, people from WEIRD societies are more inclined toward individualism and impersonal prosocial behaviors while showing less traditionalism and group loyalty. The authors further highlighted that societies less aligned with these WEIRD moral values tend to experience greater misalignment with the moral values and outputs of ChatGPT. [171]

Gerd Gigerenzer argued that the focus of AI ethics should reach far beyond the question of whether an AI system has a moral bias or can exhibit human-like moral responses. It also needs to investigate the actual motives and ethical behavior of the people behind the AI. Contrary to the 1990s dream of an egalitarian internet providing honest and accurate information to all, various tech billionaires and politicians have largely succeeded in leveraging AI for their own purposes of surveillance and control, tolerating systematic misinformation for profit, and increasing their individual power to the detriment of democracy. [172]

See also

Notes

  1. Moral Psychology: Empirical Approaches. Metaphysics Research Lab, Stanford University. April 19, 2006.
  2. Haidt, Jonathan; Kesebir, Selin (2010). "Morality". In Fiske, S; Gilbert, D; Lindzey, G (eds.). Handbook of Social Psychology (PDF) (5 ed.). Hoboken NJ: Wiley. pp. 797–832.
  3. Lapsley, Daniel K. (1996). Moral Psychology . Developmental psychology series. Boulder, Colorado: Westview Press. ISBN   978-0-8133-3032-7.
  4. Doris, John; Stich, Stephen (2008), Zalta, Edward N. (ed.), "Moral Psychology: Empirical Approaches", The Stanford Encyclopedia of Philosophy, Metaphysics Research Lab, Stanford University
  5. Wallace, R. Jay (November 29, 2007). "Moral Psychology". In Jackson, Frank; Smith, Michael (eds.). The Oxford Handbook of Contemporary Philosophy. OUP Oxford. pp. 86–113. ISBN   978-0-19-923476-9. Moral psychology is the study of morality in its psychological dimensions
  6. Ellemers, Naomi; van der Toorn, Jojanneke; Paunov, Yavor; van Leeuwen, Thed (18 January 2019). "The Psychology of Morality: A Review and Analysis of Empirical Studies Published From 1940 Through 2017". Personality and Social Psychology Review. 23 (4): 332–366. doi:10.1177/1088868318811759. ISSN 1088-8683. PMC 6791030. PMID 30658545.
  7. Doris & Stich 2008, §1.
  8. Teper, R.; Inzlicht, M.; Page-Gould, E. (2011). "Are we more moral than we think?: Exploring the role of affect in moral behavior and moral forecasting". Psychological Science. 22 (4): 553–558. CiteSeerX   10.1.1.1033.5192 . doi:10.1177/0956797611402513. PMID   21415242. S2CID   206585532.
  9. Doris & Stich (2008), §1.
  10. Sevinc, Gunes; Spreng, R. Nathan; Soriano-Mas, Carles (4 February 2014). "Contextual and Perceptual Brain Processes Underlying Moral Cognition: A Quantitative Meta-Analysis of Moral Reasoning and Moral Emotions". PLOS ONE. 9 (2): e87427. Bibcode:2014PLoSO...987427S. doi: 10.1371/journal.pone.0087427 . PMC   3913597 . PMID   24503959.
  11. Moll, Jorge; Zahn, Roland; de Oliveira-Souza, Ricardo; Krueger, Frank; Grafman, Jordan (October 2005). "The neural basis of human moral cognition". Nature Reviews Neuroscience. 6 (10): 799–809. doi:10.1038/nrn1768. PMID   16276356. S2CID   2915834.
  12. Kleiman-Weiner, Max; Saxe, Rebecca; Tenenbaum, Joshua B. (October 2017). "Learning a commonsense moral theory". Cognition. 167: 107–123. doi:10.1016/j.cognition.2017.03.005. hdl: 1721.1/118457 . PMID   28351662. S2CID   3184506.
  13. Cushman, Fiery (16 July 2013). "Action, Outcome, and Value". Personality and Social Psychology Review. 17 (3): 273–292. doi:10.1177/1088868313495594. PMID   23861355. S2CID   18501147.
  14. Crockett, Molly J. (August 2013). "Models of morality". Trends in Cognitive Sciences. 17 (8): 363–366. doi:10.1016/j.tics.2013.06.005. PMC   3925799 . PMID   23845564.
  15. Henrich, Joseph; Boyd, Robert; Bowles, Samuel; Camerer, Colin; Fehr, Ernst; Gintis, Herbert; McElreath, Richard; Alvard, Michael; Barr, Abigail; Ensminger, Jean; Henrich, Natalie Smith; Hill, Kim; Gil-White, Francisco; Gurven, Michael; Marlowe, Frank W.; Patton, John Q.; Tracer, David (22 December 2005). ""Economic man" in cross-cultural perspective: Behavioral experiments in 15 small-scale societies" (PDF). Behavioral and Brain Sciences. 28 (6): 795–815. doi:10.1017/S0140525X05000142. PMID 16372952. S2CID 3194574.
  16. Purzycki, Benjamin Grant; Apicella, Coren; Atkinson, Quentin D.; Cohen, Emma; McNamara, Rita Anne; Willard, Aiyana K.; Xygalatas, Dimitris; Norenzayan, Ara; Henrich, Joseph (10 February 2016). "Moralistic gods, supernatural punishment and the expansion of human sociality" (PDF). Nature. 530 (7590): 327–330. Bibcode:2016Natur.530..327P. doi:10.1038/nature16980. PMID   26863190. S2CID   205247725.
  17. Awad, Edmond; Dsouza, Sohan; Kim, Richard; Schulz, Jonathan; Henrich, Joseph; Shariff, Azim; Bonnefon, Jean-François; Rahwan, Iyad (24 October 2018). "The Moral Machine experiment". Nature. 563 (7729): 59–64. Bibcode:2018Natur.563...59A. doi:10.1038/s41586-018-0637-6. hdl:10871/39187. PMID 30356211. S2CID 53029241.
  18. Bonnefon, J.-F.; Shariff, A.; Rahwan, I. (23 June 2016). "The social dilemma of autonomous vehicles". Science. 352 (6293): 1573–1576. arXiv:1510.03346. Bibcode:2016Sci...352.1573B. doi:10.1126/science.aaf2654. PMID 27339987. S2CID 35400794.
  19. Carr, David (26 August 2014). "Metaphysics and methods in moral enquiry and education: Some old philosophical wine for new theoretical bottles". Journal of Moral Education. 43 (4): 500–515. doi:10.1080/03057240.2014.943167. S2CID   145588696.
  20. Lewis, Paul (June 2012). "In defence of Aristotle on character: toward a synthesis of recent psychology, neuroscience and the thought of Michael Polanyi". Journal of Moral Education. 41 (2): 155–170. doi:10.1080/03057240.2012.668005. S2CID   146755766.
  21. Goodman, Charles, "Ethics in Indian and Tibetan Buddhism", The Stanford Encyclopedia of Philosophy (Summer 2021 Edition), Edward N. Zalta (ed.), URL = <https://plato.stanford.edu/archives/sum2021/entries/ethics-indian-buddhism/>.
  22. Fengyan, Wang (December 2004). "Confucian thinking in traditional moral education: key ideas and fundamental features". Journal of Moral Education. 33 (4): 429–447. doi:10.1080/0305724042000327984. S2CID 216114943.
  23. Murray, Judson B. (December 2012). "Educating human nature: 'nature' and 'nurture' in early Confucian moral education". Journal of Moral Education. 41 (4): 509–527. doi:10.1080/03057240.2012.721759. S2CID   144739301.
  24. Wei, Tan Tai (January 1990). "Some Confucian Insights and Moral Education". Journal of Moral Education. 19 (1): 33–37. doi:10.1080/0305724900190104.
  25. Sharp, Frank Chapman (January 1898). "An Objective Study of Some Moral Judgments". The American Journal of Psychology. 9 (2): 198–234. doi:10.2307/1411759. JSTOR 1411759.
  26. May, Mark A.; Hartshorne, Hugh (March 1925). "Objective Methods of Measuring Character". The Pedagogical Seminary and Journal of Genetic Psychology. 32 (1): 45–67. doi:10.1080/08856559.1925.10532317.
  27. Pittel, Stephen M.; Mendelsohn, Gerald A. (1966). "Measurement of moral values: A review and critique". Psychological Bulletin. 66 (1): 22–35. doi:10.1037/h0023425. PMID   5329602.
  28. Wendorf, Craig A (2001). "History of American morality research, 1894–1932". History of Psychology. 4 (3): 272–288. doi:10.1037/1093-4510.4.3.272.
  29. Haidt, Jonathan (September 2013). "Moral psychology for the twenty-first century". Journal of Moral Education. 42 (3): 281–297. doi:10.1080/03057240.2013.817327. S2CID 144638008.
  30. Kohlberg, Lawrence (1958). The development of modes of moral thinking and choice in the years 10 to 16 (PhD thesis). Chicago. OCLC   1165315.
  31. Colby, Anne; Kohlberg, Lawrence (1987). The Measurement of Moral Judgment. Standard Issue Scoring Manual. Vol. 2. Cambridge: Cambridge University Press. ISBN 978-0-521-32565-3.
  32. Kohlberg, L. (1969). "Stage and sequence: The cognitive development approach to socialization" . In Goslin, David (ed.). Handbook of Socialization Theory and Research. Chicago: Rand McNally. pp.  347–480.
  33. Kohlberg, Lawrence (1971-01-31), "1. Stages of moral development as a basis for moral education", in Beck, Clive M; Crittenden, Brian S; Sullivan, Edmund (eds.), Moral Education, University of Toronto Press, pp. 23–92, doi:10.3138/9781442656758-004, ISBN   9781442656758
  34. Kohlberg, Lawrence; Hersh, Richard H. (1977). "Moral development: A review of the theory". Theory into Practice. 16 (2): 53–59. doi:10.1080/00405847709542675.
  35. Lapsley, Daniel (2016-09-22). "Moral Self-Identity and the Social-Cognitive Theory of Virtue". Developing the Virtues. pp. 34–68. doi:10.1093/acprof:oso/9780190271466.003.0003. ISBN   978-0-19-027146-6.
  36. Hardy, S. A.; Carlo, G. (2011). "Moral identity: What is it, how does it develop, and is it linked to moral action?". Child Development Perspectives. 5 (3): 212–218. doi:10.1111/j.1750-8606.2011.00189.x.
  37. Haidt, Jonathan (October 2001). "The Emotional Dog and Its Rational Tail" (PDF). Psychological Review. 108 (4): 814–834. CiteSeerX 10.1.1.620.5536. doi:10.1037/0033-295X.108.4.814. PMID 11699120.
  38. Greene, J. D.; Sommerville, R Brian; Nystrom, Leigh E; Darley, John M.; Cohen, Jonathan D (14 September 2001). "An fMRI Investigation of Emotional Engagement in Moral Judgment". Science. 293 (5537): 2105–2108. Bibcode:2001Sci...293.2105G. doi:10.1126/science.1062872. PMID 11557895. S2CID 1437941.
  39. Cohen Priva, Uriel; Austerweil, Joseph L. (February 2015). "Analyzing the history of Cognition using Topic Models". Cognition. 135: 4–9. doi:10.1016/j.cognition.2014.11.006. PMID   25497481. S2CID   37146919.
  40. Story, Giles W.; Vlaev, Ivo; Metcalfe, Robert D.; Crockett, Molly J.; Kurth-Nelson, Zeb; Darzi, Ara; Dolan, Raymond J. (30 October 2015). "Social redistribution of pain and money". Scientific Reports. 5 (1): 15389. Bibcode:2015NatSR...515389S. doi:10.1038/srep15389. PMC   4626774 . PMID   26515529.
  41. Moll, Jorge; de Oliveira-Souza, Ricardo; Eslinger, Paul J.; Bramati, Ivanei E.; Mourão-Miranda, Janaı́na; Andreiuolo, Pedro Angelo; Pessoa, Luiz (1 April 2002). "The Neural Correlates of Moral Sensitivity: A Functional Magnetic Resonance Imaging Investigation of Basic and Moral Emotions". The Journal of Neuroscience. 22 (7): 2730–2736. doi:10.1523/JNEUROSCI.22-07-02730.2002. PMC   6758288 . PMID   11923438.
  42. Sagi, Eyal; Dehghani, Morteza (31 October 2013). "Measuring Moral Rhetoric in Text". Social Science Computer Review. 32 (2): 132–144. doi:10.1177/0894439313506837. S2CID   62259852.
  43. Kohlberg, Lawrence (1973). "The Claim to Moral Adequacy of a Highest Stage of Moral Judgment". Journal of Philosophy. 70 (18): 630–646. doi:10.2307/2025030. JSTOR   2025030.
  44. Verplaetse, Jan (2008). "Measuring the moral sense: morality tests in continental Europe between 1910 and 1930". Paedagogica Historica. 44 (3): 265–286. doi:10.1080/00309230701722721. S2CID   143771452.
  45. Rest, James R. (1979). Development in Judging Moral Issues. Minneapolis: University of Minnesota Press. ISBN   978-0-8166-0891-1.
  46. Rest, James; Narvaez, Darcia; Bebeau, Muriel; Thoma, Stephen (1999). "A Neo-Kohlbergian Approach: The DIT and Schema Theory". Educational Psychology Review. 11 (4): 291–324. doi:10.1023/A:1022053215271. S2CID   14483253.
  47. Lind, Georg (1978). "Wie misst man moralisches Urteil? Probleme und alternative Möglichkeiten der Messung eines komplexen Konstrukts" [How do you measure moral judgment? Problems and alternative ways of measuring a complex construct]. In Portele, G. (ed.). Sozialisation und Moral[Socialization and Morality] (in German). Weinheim: Beltz. pp. 171–201. ISBN   9783407511348. OCLC   715635639.
  48. Lind, Georg (2008). "The meaning and measurement of moral judgment competence: A dual-aspect model". In Fasko, Daniel Jr; Willis, Wayne (eds.). Contemporary Philosophical and Psychological Perspectives on Moral Development and Education. Hampton Press. pp. 185–220.
  49. Graham, Jesse; Nosek, Brian A.; Haidt, Jonathan; Iyer, Ravi; Koleva, Spassena; Ditto, Peter H. (2011). "Mapping the moral domain". Journal of Personality and Social Psychology. 101 (2): 366–385. doi:10.1037/a0021847. PMC   3116962 . PMID   21244182.
  50. Atari, Mohammad; Haidt, Jonathan; Graham, Jesse; Koleva, Sena; Stevens, Sean T.; Dehghani, Morteza (November 2023). "Morality beyond the WEIRD: How the nomological network of morality varies across cultures". Journal of Personality and Social Psychology. 125 (5): 1157–1188. doi:10.1037/pspp0000470. ISSN   1939-1315. PMID   37589704.
  51. Curry, Oliver Scott; Jones Chesters, Matthew; Van Lissa, Caspar J. (February 2019). "Mapping morality with a compass: Testing the theory of 'morality-as-cooperation' with a new questionnaire". Journal of Research in Personality. 78: 106–124. doi: 10.1016/j.jrp.2018.10.008 .
  52. Janoff-Bulman, Ronnie; Carnes, Nate C. (31 March 2016). "Social Justice and Social Order: Binding Moralities across the Political Spectrum". PLOS ONE. 11 (3): e0152479. Bibcode:2016PLoSO..1152479J. doi: 10.1371/journal.pone.0152479 . PMC   4816418 . PMID   27031103.
  53. Guerra, Valeschka M.; Giner-Sorolla, Roger (January 2010). "The Community, Autonomy, and Divinity Scale (CADS): A New Tool for the Cross-Cultural Study of Morality" (PDF). Journal of Cross-Cultural Psychology. 41 (1): 35–50. doi:10.1177/0022022109348919. S2CID   145410595.
  54. Bastian, Brock; Bain, Paul; Buhrmester, Michael D.; Gómez, Ángel; Vázquez, Alexandra; Knight, Clinton G.; Swann, William B. (August 2015). "Moral Vitalism: Seeing Good and Evil as Real, Agentic Forces". Personality and Social Psychology Bulletin. 41 (8): 1069–1081. doi:10.1177/0146167215589819. PMID   26089349. S2CID   11280774.
  55. Ståhl, Tomas; Zaal, Maarten P.; Skitka, Linda J. (16 November 2016). "Moralized Rationality: Relying on Logic and Evidence in the Formation and Evaluation of Belief Can Be Seen as a Moral Issue". PLOS ONE. 11 (11): e0166332. Bibcode:2016PLoSO..1166332S. doi: 10.1371/journal.pone.0166332 . PMC   5112873 . PMID   27851777.
  56. Ho, Arnold K.; Sidanius, Jim; Kteily, Nour; Sheehy-Skeffington, Jennifer; Pratto, Felicia; Henkel, Kristin E.; Foels, Rob; Stewart, Andrew L. (December 2015). "The nature of social dominance orientation: Theorizing and measuring preferences for intergroup inequality using the new SDO7 scale" (PDF). Journal of Personality and Social Psychology. 109 (6): 1003–1028. doi:10.1037/pspi0000033. PMID   26479362.
  57. Duckitt, John; Bizumic, Boris (December 2013). "Multidimensionality of Right-Wing Authoritarian Attitudes: Authoritarianism-Conservatism-Traditionalism: Authoritarianism-Conservatism-Traditionalism". Political Psychology. 34 (6): 841–862. doi:10.1111/pops.12022.
  58. Petrinovich, Lewis; O'Neill, Patricia; Jorgensen, Matthew (1993). "An empirical study of moral intuitions: Toward an evolutionary ethics". Journal of Personality and Social Psychology. 64 (3): 467–478. doi:10.1037/0022-3514.64.3.467.
  59. Krebs, Dennis L.; Denton, Kathy; Wark, Gillian (June 1997). "The Forms and Functions of Real-life Moral Decision-making". Journal of Moral Education. 26 (2): 131–145. doi:10.1080/0305724970260202.
  60. Sinnott-Armstrong, Walter, ed. (2007). Moral Psychology, Volume 1: The Evolution of Morality: Adaptations and Innateness. Vol. 1. A Bradford Book. ISBN   9780262693547.
  61. Brosnan, S. F.; de Waal, F. B. M. (18 September 2014). "Evolution of responses to (un)fairness". Science. 346 (6207): 1251776. doi:10.1126/science.1251776. PMC   4451566 . PMID   25324394.
  62. Tomasello, Michael; Vaish, Amrisha (3 January 2013). "Origins of Human Cooperation and Morality". Annual Review of Psychology . 64 (1): 231–255. doi:10.1146/annurev-psych-113011-143812. hdl: 10161/13649 . PMID   22804772.
  63. Hare, Brian (3 January 2017). "Survival of the Friendliest: Evolved via Selection for Prosociality". Annual Review of Psychology . 68 (1): 155–186. doi:10.1146/annurev-psych-010416-044201. PMID   27732802. S2CID   3387266.
  64. Sober, Elliott; Wilson, David Sloan (1998). Unto Others: The Evolution and Psychology of Unselfish Behavior. Cambridge: Harvard University Press. ISBN 9780674930469. OCLC 37761960.
  65. Dean, Tim (2012). "Evolution and moral diversity". Baltic International Yearbook of Cognition, Logic and Communication. 7. doi: 10.4148/biyclc.v7i0.1775 .
  66. Hardy, Sam A.; Carlo, Gustavo (2011). "Moral Identity". In Schwartz, Seth J.; Luyckx, Koen; Vignoles, Vivian L. (eds.). Handbook of identity theory and research. Springer. pp. 495–513. ISBN 978-1-4419-7988-9.
  67. Krettenauer, Tobias (2020-07-02). "Moral identity as a goal of moral action: A Self-Determination Theory perspective". Journal of Moral Education. 49 (3): 330–345. doi:10.1080/03057240.2019.1698414. ISSN   0305-7240.
  68. Hertz, Steven G.; Krettenauer, Tobias (1 June 2016). "Does Moral Identity Effectively Predict Moral Behavior?: A Meta-Analysis". Review of General Psychology. 20 (2): 129–140. doi: 10.1037/gpr0000062 . S2CID   148276515.
  69. Krettenauer, Tobias (2020-07-02). "Moral identity as a goal of moral action: A Self-Determination Theory perspective". Journal of Moral Education. 49 (3): 330–345. doi:10.1080/03057240.2019.1698414. ISSN   0305-7240.
  70. "Moral Intuition = Fast and Frugal Heuristics?", Moral Psychology, The MIT Press, 2007, doi:10.7551/mitpress/7573.003.0003, hdl: 11858/00-001M-0000-0024-FB30-8 , ISBN   978-0-262-30301-9 , retrieved 2024-10-14
  71. Gigerenzer, Gerd (2010-05-12). "Moral Satisficing: Rethinking Moral Behavior as Bounded Rationality". Topics in Cognitive Science. 2 (3): 528–554. doi:10.1111/j.1756-8765.2010.01094.x. hdl: 11858/00-001M-0000-0024-F5F4-4 . ISSN   1756-8757. PMID   25163875.
  72. Simon, H. A. (1956). "Rational choice and the structure of the environment". Psychological Review. 63 (2): 129–138. doi:10.1037/h0042769. ISSN   1939-1471. PMID   13310708.
  73. Simon, H. (1990-01-01). "Invariants Of Human Behavior". Annual Review of Psychology. 41 (1): 1–19. doi:10.1146/annurev.psych.41.1.1. ISSN   0066-4308. PMID   18331187.
  74. Darwin, Charles (1871). The descent of man, and selection in relation to sex. London: J. Murray. doi:10.5962/bhl.title.2092.
  75. Gigerenzer, Gerd; Gaissmaier, Wolfgang (2011-01-10). "Heuristic Decision Making". Annual Review of Psychology. 62 (1): 451–482. doi:10.1146/annurev-psych-120709-145346. hdl: 11858/00-001M-0000-0024-F16D-5 . ISSN   0066-4308. PMID   21126183.
  76. Hertwig, Ralph; Hoffrage, Ulrich; Research Group, ABC, eds. (2012-11-29). Simple Heuristics in a Social World. Oxford University Press. doi:10.1093/acprof:oso/9780195388435.001.0001. ISBN   978-0-19-538843-5.
  77. Rosenbaum, Janet Elise (2009-01-01). "Patient Teenagers? A Comparison of the Sexual Behavior of Virginity Pledgers and Matched Nonpledgers". Pediatrics. 123 (1): e110–e120. doi:10.1542/peds.2008-0407. hdl:20.500.12648/8444. ISSN   0031-4005. PMC   2768056 . PMID   19117832.
  78. Nagel, T. (1979). "Moral Luck". Mortal Questions. New York: Cambridge University Press. pp. 31–32.
  79. Matheson, David S; Kissoon, Niranjan (2006). "A comparison of decision-making by physicians and administrators in healthcare settings". Critical Care. 10 (5): 163. doi: 10.1186/cc5028 . ISSN   1364-8535. PMC   1751052 . PMID   16959045.
  80. Sunstein, Cass R. (August 2005). "Moral heuristics". Behavioral and Brain Sciences. 28 (4): 531–542. doi:10.1017/s0140525x05000099. ISSN 0140-525X. PMID 16209802.
  81. Hauser, Marc D. (2006-12-01). "The liver and the moral organ". Social Cognitive and Affective Neuroscience. 1 (3): 214–220. doi:10.1093/scan/nsl026. ISSN   1749-5024. PMC   2555424 . PMID   18985108.
  82. Schwartz, S. H. (1999). "A Theory of Cultural Values and Some Implications for Work" (PDF). Applied Psychology: An International Review. 48 (1): 23–47. doi:10.1080/026999499377655.
  83. Kristiansen, Connie M; Hotte, Alan M (1996). Morality and the self: Implications for the when and how of value-attitude-behavior relations. The Psychology of Values: The Ontario Symposium on Personality and Social Psychology. Vol. 8. Erlbaum Hillsdale, NJ. pp. 77–105.
  84. Haidt, Jonathan; Craig Joseph (Fall 2004). "Intuitive ethics: how innately prepared intuitions generate culturally variable virtues" (PDF). Daedalus. 133 (4): 55–66. doi:10.1162/0011526042365555. S2CID 1574243. Archived from the original (PDF) on 2016-09-09. Retrieved 2012-10-13.
  85. Graham, J.; Haidt, J.; Koleva, S.; Motyl, M.; Iyer, R.; Wojcik, S.; Ditto, P.H. (2013). Moral Foundations Theory: The pragmatic validity of moral pluralism (PDF). Vol. 47. pp. 55–130. doi:10.1016/b978-0-12-407236-7.00002-4. ISBN 9780124072367. S2CID 2570757. Archived from the original (PDF) on 2017-07-31. Retrieved 2015-01-11.
  86. Haidt, Jonathan; Graham, Jesse (23 May 2007). "When Morality Opposes Justice: Conservatives Have Moral Intuitions that Liberals may not Recognize". Social Justice Research. 20 (1): 98–116. doi:10.1007/s11211-007-0034-z. S2CID 6824095.
  87. Graham, Jesse; Haidt, Jonathan; Nosek, Brian A. (2009). "Liberals and conservatives rely on different sets of moral foundations" (PDF). Journal of Personality and Social Psychology. 96 (5): 1029–1046. doi:10.1037/a0015141. PMID   19379034.
  88. Shweder, Richard; Much, Nancy; Mahapatra, Manamohan; Park, Lawrence (1997). "The "big three" of morality (autonomy, community, divinity) and the "big three" explanations of suffering.". In Brandt, Allan; Rozin, Paul (eds.). Morality and Health. Routledge. pp. 119–169.
  89. Atari, Mohammad; Haidt, Jonathan (2023-10-10). "Ownership is (likely to be) a moral foundation". The Behavioral and Brain Sciences. 46: e326. doi:10.1017/S0140525X2300119X. ISSN   1469-1825. PMID   37813408.
  90. Lapsley, Daniel K.; Narvaez, Darcia (2004). "A social-cognitive approach to the moral personality". Moral Development, Self, and Identity. Psychology Press. pp. 189–212. ISBN   978-1-135-63233-5.
  91. Lapsley & Narvaez 2004, p. 197.
  92. Narvaez, Darcia (March 1, 2008). "Triune ethics: The neurobiological roots of our multiple moralities". New Ideas in Psychology. 26 (1): 95–119. CiteSeerX   10.1.1.152.4926 . doi:10.1016/j.newideapsych.2007.07.008. ISSN   0732-118X.
  93. Narvaez, Darcia (2014). Neurobiology and the Development of Human Morality: Evolution, Culture and Wisdom. WWNorton. ISBN   978-0393706550.
  94. Pizarro, David A. (2007). "Moral Reasoning". In Baumeister, Roy F; Vohs, Kathleen F (eds.). Encyclopedia of Social Psychology . SAGE Publications, Inc. pp.  591–592. doi:10.4135/9781412956253.n352. ISBN   9781412956253.
  95. Barnett, Mark A. (2007). "Moral Development". In Baumeister, Roy F; Vohs, Kathleen D (eds.). Encyclopedia of Social Psychology . SAGE Publications, Inc. pp.  587. doi:10.4135/9781412956253.n349. ISBN   9781412956253.
  96. Piaget, Jean (1948). The Moral Judgment of the Child (PDF). Free Press.
  97. Kohlberg, Lawrence (1984). The Psychology of Moral Development: The Nature and Validity of Moral Stages. Essays on Moral Development. Vol. 2. Harper & Row. p. 195. ISBN   978-0-06-064761-2.
  98. Crain, W.C. "Kohlberg's Stages of Moral Development". Theories of Development. Prentice-Hall. Archived from the original on October 4, 2011. Retrieved October 3, 2011.
  99. Gilligan, Carol; Attanucci, Jane (1994). "Two Moral Orientations: Gender Differences and Similarities". In Puka, Bill (ed.). Moral Development: Caring Voices and Women's Moral Frames. Vol. 34. Taylor & Francis. pp. 123–237. ISBN   978-0-8153-1553-7.
  100. Walker, Lawrence J.; Smetana, Judith (2005). "Gender and Morality". In Killen, Melanie (ed.). Handbook of Moral Development. Psychology Press. pp. 93–115. ISBN   978-1-135-61917-6.
  101. Jaffee and Hyde (2001)[ full citation needed ]
  102. Rest, James R.; Narvaez, Darcia; Thoma, Stephen J.; Bebeau, Muriel J. (1999). Postconventional Moral Thinking: A Neo-Kohlbergian Approach. Psychology Press. ISBN   978-1-135-70561-9.
  103. Govrin, A (2014). "The ABC of moral development: an attachment approach to moral judgment". Frontiers in Psychology. 5 (6): 1–14. doi:10.3389/fpsyg.2014.00006. PMC 3901400. PMID 24478739. This article contains quotations from this source, which is available under the Creative Commons Attribution 3.0 Unported (CC BY 3.0) license.
  104. Govrin, A. (2019). Ethics and attachment - How we make moral judgments. London: Routledge
  105. Govrin, A. (2014) "From Ethics of Care to Psychology of Care: Reconnecting Ethics of Care to Contemporary Moral Psychology", Frontiers in Psychology pp. 1-10
  106. Staub, Ervin (2003). Psychology of good and evil: why children, adults, and groups help and harm others. Cambridge University Press. ISBN   978-0-511-07031-0.
  107. Baumeister, Roy F. (1997). Evil : inside human cruelty and violence. New York: W.H. Freeman. ISBN   9780716735670.
  108. Milgram, Stanley (1963). "Behavioral Study of Obedience". Journal of Abnormal and Social Psychology. 67 (4): 371–8. CiteSeerX   10.1.1.599.92 . doi:10.1037/h0040525. PMID   14049516.
  109. Haney, Craig; Banks, Curtis; Zimbardo, Philip (1972). "Interpersonal Dynamics in a Simulated Prison" (PDF). Archived from the original (PDF) on March 26, 2020.
  110. Martin, Jack (2016). "Ernest Becker and Stanley Milgram: Twentieth-century students of evil". History of Psychology. 19 (1): 3–21. doi:10.1037/hop0000016. PMID   26640976.
  111. Hofmann, W.; Wisneski, D. C.; Brandt, M. J.; Skitka, L. J. (11 September 2014). "Morality in everyday life". Science. 345 (6202): 1340–1343. Bibcode:2014Sci...345.1340H. doi:10.1126/science.1251560. PMID   25214626. S2CID   31731176.
  112. Hofmann, Wilhelm; Brandt, Mark J.; Wisneski, Daniel C.; Rockenbach, Bettina; Skitka, Linda J. (30 May 2018). "Moral Punishment in Everyday Life" (PDF). Personality and Social Psychology Bulletin. 44 (12): 1697–1711. doi:10.1177/0146167218775075. PMID   29848212. S2CID   44154039.
  113. Monin, B; Miller, D. T. (2001). "Moral credentials and the expression of prejudice" (PDF). Journal of Personality and Social Psychology. 81 (1): 33–43. doi:10.1037/0022-3514.81.1.33. PMID 11474723. Archived from the original on 2015-04-30.
  114. Blanken, Irene; van de Ven, Niels; Zeelenberg, Marcel (25 February 2015). "A Meta-Analytic Review of Moral Licensing". Personality and Social Psychology Bulletin. 41 (4): 540–558. doi:10.1177/0146167215572134. PMID   25716992. S2CID   65216.
  115. Peysakhovich, Alexander; Nowak, Martin A.; Rand, David G. (16 September 2014). "Humans display a 'cooperative phenotype' that is domain general and temporally stable". Nature Communications. 5 (1): 4939. Bibcode:2014NatCo...5.4939P. doi:10.1038/ncomms5939. PMID   25225950.
  116. Crockett, Molly J.; Kurth-Nelson, Zeb; Siegel, Jenifer Z.; Dayan, Peter; Dolan, Raymond J. (2 December 2014). "Harm to others outweighs harm to self in moral decision making". Proceedings of the National Academy of Sciences. 111 (48): 17320–17325. Bibcode:2014PNAS..11117320C. doi:10.1073/pnas.1408988111. PMC   4260587. PMID   25404350.
  117. Rest, James R (1983). "Morality". Handbook of Child Psychology. 3: 556–629.
  118. Narváez, Darcia; Rest, James (1995). "The four components of acting morally" (PDF). Moral Behavior and Moral Development: An Introduction: 385–400.
  119. Reynolds, Scott J.; Ceranic, Tara L. (2007). "The effects of moral judgment and moral identity on moral behavior: An empirical examination of the moral individual" (PDF). Journal of Applied Psychology. 92 (6): 1610–1624. doi:10.1037/0021-9010.92.6.1610. ISSN   1939-1854. PMID   18020800.
  120. Leffel, G. M. (2008). "Who cares? Generativity and the moral emotions, part 2: A "social intuitionist model" of moral motivation". Journal of Psychology and Theology. 36 (3): 182–201. doi:10.1177/009164710803600303. S2CID   149360947.
  121. Leffel 2008's model draws heavily on Haidt 2001's social intuitionist model of moral judgment.
  122. Baumeister, Roy F.; Juola Exline, Julie (December 1999). "Virtue, Personality, and Social Relations: Self-Control as the Moral Muscle". Journal of Personality. 67 (6): 1165–1194. doi:10.1111/1467-6494.00086. PMID   10637991.
  123. Thielmann, Isabel; Spadaro, Giuliana; Balliet, Daniel (January 2020). "Personality and prosocial behavior: A theoretical framework and meta-analysis". Psychological Bulletin. 146 (1): 30–90. doi:10.1037/bul0000217. PMID   31841013. S2CID   209384267.
  124. McAdams, Dan P.; Mayukha, Ananya (2023-01-28). "Hiding in plain view: An historical perspective on the study of morality in personality psychology". Journal of Personality. 92 (3): 666–682. doi:10.1111/jopy.12808. ISSN   0022-3506. PMID   36648361.
  125. Sun, Jessie; Wilt, Joshua; Meindl, Peter; Watkins, Hanne M.; Goodwin, Geoffrey P. (2023-01-18). "How and Why People Want to Be More Moral". Journal of Personality. 92 (3): 907–925. doi:10.1111/jopy.12812. ISSN   0022-3506. PMID   36652292.
  126. Waples, Ethan P.; Antes, Alison L.; Murphy, Stephen T.; Connelly, Shane; Mumford, Michael D. (June 2009). "A Meta-Analytic Investigation of Business Ethics Instruction". Journal of Business Ethics. 87 (1): 133–151. doi:10.1007/s10551-008-9875-0. S2CID   153414285.
  127. Lyubomirsky, Sonja; King, Laura; Diener, Ed (2005). "The Benefits of Frequent Positive Affect: Does Happiness Lead to Success?". Psychological Bulletin. 131 (6): 803–855. doi:10.1037/0033-2909.131.6.803. PMID   16351326. S2CID   684129.
  128. Curry, Oliver Scott; Rowland, Lee A.; Van Lissa, Caspar J.; Zlotowitz, Sally; McAlaney, John; Whitehouse, Harvey (May 2018). "Happy to help? A systematic review and meta-analysis of the effects of performing acts of kindness on the well-being of the actor". Journal of Experimental Social Psychology. 76: 320–329. doi:10.1016/j.jesp.2018.02.014.
  129. Hui, Bryant P. H.; Ng, Jacky C. K.; Berzaghi, Erica; Cunningham-Amos, Lauren A.; Kogan, Aleksandr (3 September 2020). "Rewards of kindness? A meta-analysis of the link between prosociality and well-being". Psychological Bulletin. 146 (12): 1084–1116. doi:10.1037/bul0000298. PMID   32881540. S2CID   221497259.
  130. Darnell, Catherine; Gulliford, Liz; Kristjánsson, Kristján; Paris, Panos (2019). "Phronesis and the Knowledge-Action Gap in Moral Psychology and Moral Education: A New Synthesis?" (PDF). Human Development. 62 (3): 101–129. doi:10.1159/000496136. S2CID   150535431.
  131. Metcalfe, J.; Mischel, W. (1999). "A hot/cool-system analysis of delay of gratification: Dynamics of willpower". Psychological Review. 106 (1): 3–19. doi:10.1037/0033-295x.106.1.3. PMID   10197361.
  132. Baumeister (2005). "Self and volition". In Miller, William; Delaney, Harold (eds.). Judeo-Christian Perspectives on Psychology: Human Nature, Motivation, and Change. Washington, DC: American Psychological Association. pp. 57–72. ISBN   978-1-59147-161-5.
  133. Baumeister 2005, p. 68.
  134. Muraven, Mark; Baumeister, Roy F.; Tice, Dianne M. (August 1, 1999). "Longitudinal Improvement of Self-Regulation Through Practice: Building Self-Control Strength Through Repeated Exercise" (PDF). The Journal of Social Psychology. 139 (4): 446–457. doi:10.1080/00224549909598404. ISSN   0022-4545. PMID   10457761.
  135. Hagger, Martin S.; et al. (2016). "A Multilab Preregistered Replication of the Ego-Depletion Effect" (PDF). Perspectives on Psychological Science.
  136. Gigerenzer, Gerd (2008). "Moral Intuition = Fast and Frugal Heuristics?". In Sinnott-Armstrong, Walter (ed.). Moral Psychology. Vol. 2. Cambridge, Massachusetts: MIT Press. pp. 1–26.
  137. Gigerenzer, Gerd (2010). "Moral Satisficing: Rethinking Moral Behavior as Bounded Rationality". Topics in Cognitive Science. 2 (3): 528–554.
  138. Greene, Joshua (2008). "The secret joke of Kant's Soul". In Sinnott-Armstrong, Walter (ed.). Moral Psychology. Vol. 3. Cambridge, Massachusetts: MIT Press. pp. 35–80. ISBN   978-0-262-69355-4. OCLC   750463100.
  139. Pizarro, David A. (2007). "Moral Emotions". In Baumeister, Roy F.; Vohs, Kathleen D. (eds.). Encyclopedia of Social Psychology. SAGE Publications, Inc. pp. 588–589. doi:10.4135/9781412956253.n350. ISBN   9781412956253.
  140. Haidt, Jonathan (2003). "The Moral Emotions" (PDF). In Davidson, Richard; Scherer, Klaus; Goldsmith, H. (eds.). Handbook of Affective Sciences. Oxford University Press. p. 855. ISBN   978-0-19-512601-3.
  141. Tangney, June Price; Stuewig, Jeff; Mashek, Debra J. (January 2007). "Moral Emotions and Moral Behavior" (PDF). Annual Review of Psychology. 58 (1): 345–372. doi:10.1146/annurev.psych.56.091103.070145. PMC   3083636. PMID   16953797.
  142. Atari, Mohammad; Mostafazadeh Davani, Aida; Dehghani, Morteza (February 2020). "Body Maps of Moral Concerns". Psychological Science. 31 (2): 160–169. doi:10.1177/0956797619895284. ISSN   0956-7976. PMID   31913779.
  143. Rozin, Paul (1999). "The Process of Moralization". Psychological Science. 10 (3): 218–221. doi:10.1111/1467-9280.00139. S2CID   145121850.
  144. Rozin, Paul; Markwith, Maureen; Stoess, Caryn (6 May 2016). "Moralization and Becoming a Vegetarian: The Transformation of Preferences Into Values and the Recruitment of Disgust". Psychological Science. 8 (2): 67–73. doi:10.1111/j.1467-9280.1997.tb00685.x. S2CID   22267477.
  145. Rhee, Joshua J.; Schein, Chelsea; Bastian, Brock (25 November 2019). "The what, how, and why of moralization: A review of current definitions, methods, and evidence in moralization research". Social and Personality Psychology Compass. 13 (12). doi:10.1111/spc3.12511. S2CID   212770791.
  146. Skitka, Linda (2002). "Do the means always justify the ends or do the ends sometimes justify the means? A value protection model of justice". Personality and Social Psychology Bulletin. 28 (5): 452–461. doi:10.1177/0146167202288003. S2CID   145542300.
  147. Skitka, Linda J.; Hanson, Brittany E.; Scott Morgan, G.; Wisneski, Daniel C. (4 January 2021). "The Psychology of Moral Conviction". Annual Review of Psychology . 72 (1): annurev–psych–063020-030612. doi:10.1146/annurev-psych-063020-030612. PMID   32886586. S2CID   221504252.
  148. Morgan, G. S.; Skitka, L. J. (2011). "Moral conviction". In Christie, Daniel J. (ed.). Encyclopedia of Peace Psychology. Wiley-Blackwell. ISBN   978-1-4051-9644-4.
  149. Skitka, L. J.; Bauman, C.; Sargis, E. (2005). "Moral conviction: Another contributor to attitude strength or something more?" (PDF). Journal of Personality and Social Psychology. 88 (6): 895–917. doi:10.1037/0022-3514.88.6.895. PMID   15982112. S2CID   14291970. Archived from the original (PDF) on 2018-02-09.
  150. "Moral enhancement", Wikipedia, 2024-07-08, retrieved 2024-11-10
  151. Zarpentine, Chris (April 2013). "'The Thorny and Arduous Path of Moral Progress': Moral Psychology and Moral Enhancement". Neuroethics. 6 (1): 141–153. doi:10.1007/s12152-012-9166-4. ISSN   1874-5490.
  152. Zarpentine, Chris (April 2013). "'The Thorny and Arduous Path of Moral Progress': Moral Psychology and Moral Enhancement". Neuroethics. 6 (1): 141–153. doi:10.1007/s12152-012-9166-4. ISSN   1874-5490.
  153. Haidt, Jonathan; Rosenberg, Evan; Hom, Holly (2003). "Differentiating Diversities: Moral Diversity Is Not Like Other Kinds". Journal of Applied Social Psychology. 33 (1): 1–36. doi:10.1111/j.1559-1816.2003.tb02071.x.
  154. Motyl, Matt; Iyer, Ravi; Oishi, Shigehiro; Trawalterl, Sophie; Nosek, Brian A. (2014). "How ideological migration geographically segregates groups". Journal of Experimental Social Psychology. 51: 1–14. doi:10.1016/j.jesp.2013.10.010.
  155. Graham, Jesse; Nosek, Brian A.; Haidt, Jonathan; Young, Liane (12 December 2012). "The Moral Stereotypes of Liberals and Conservatives: Exaggeration of Differences across the Political Spectrum". PLOS ONE. 7 (12): e50092. Bibcode:2012PLoSO...750092G. doi: 10.1371/journal.pone.0050092 . PMC   3520939 . PMID   23251357.
  156. Frimer, Jeremy A.; Skitka, Linda J.; Motyl, Matt (September 2017). "Liberals and conservatives are similarly motivated to avoid exposure to one another's opinions". Journal of Experimental Social Psychology. 72: 1–12. doi:10.1016/j.jesp.2017.04.003.
  157. Kahane, Guy (March 2011). "Evolutionary Debunking Arguments". Noûs. 45 (1): 103–125. doi:10.1111/j.1468-0068.2010.00770.x. PMC   3175808 . PMID   21949447.
  158. Greene, Joshua (October 2003). "From neural 'is' to moral 'ought': what are the moral implications of neuroscientific moral psychology?". Nature Reviews Neuroscience. 4 (10): 846–850. doi:10.1038/nrn1224. PMID   14523384. S2CID   14438498.
  159. Greene, Joshua; Cohen, Jonathan (29 November 2004). "For the law, neuroscience changes nothing and everything". Philosophical Transactions of the Royal Society of London. Series B: Biological Sciences. 359 (1451): 1775–1785. doi:10.1098/rstb.2004.1546. PMC   1693457 . PMID   15590618.
  160. Berker, Selim (September 2009). "The Normative Insignificance of Neuroscience". Philosophy & Public Affairs. 37 (4): 293–329. doi:10.1111/j.1088-4963.2009.01164.x. S2CID   5952062.
  161. Nagel, Thomas (2 November 2013). "You Can't Learn About Morality from Brain Scans". The New Republic.
  162. Singer, Peter (October 2005). "Ethics and Intuitions". The Journal of Ethics. 9 (3–4): 331–352. doi:10.1007/s10892-005-3508-y. S2CID   49914215.
  163. Doris, John M. (2002). Lack of Character: Personality and Moral Behavior. Cambridge University Press. ISBN   978-1-316-02549-9.
  164. Nichols, Shaun (2004). Sentimental Rules: On the Natural Foundations of Moral Judgment. Oxford University Press. ISBN   978-0-19-988347-9.
  165. Darby, R. Ryan; Pascual-Leone, Alvaro (22 February 2017). "Moral Enhancement Using Non-invasive Brain Stimulation". Frontiers in Human Neuroscience. 11: 77. doi: 10.3389/fnhum.2017.00077 . PMC   5319982 . PMID   28275345.
  166. Levy, Neil; Douglas, Thomas; Kahane, Guy; Terbeck, Sylvia; Cowen, Philip J.; Hewstone, Miles; Savulescu, Julian (2014). "Are You Morally Modified?: The Moral Effects of Widely Used Pharmaceuticals". Philosophy, Psychiatry, & Psychology. 21 (2): 111–125. doi:10.1353/ppp.2014.0023. PMC   4398979 . PMID   25892904.
  167. Awad, Edmond; Dsouza, Sohan; Bonnefon, Jean-François; Shariff, Azim; Rahwan, Iyad (24 February 2020). "Crowdsourcing moral machines". Communications of the ACM. 63 (3): 48–55. doi: 10.1145/3339904 . hdl: 21.11116/0000-0007-4771-A .
  168. De Freitas, Julian; Anthony, Sam E.; Censi, Andrea; Alvarez, George A. (31 July 2020). "Doubting Driverless Dilemmas". Perspectives on Psychological Science. 15 (5): 1284–1288. doi:10.1177/1745691620922201. PMID   32735472. S2CID   220908883.
  169. Atari, Mohammad; Xue, Mona J.; Park, Peter S.; Blasi, Damián Ezequiel; Henrich, Joseph (2023). "Which Humans?" (preprint). PsyArXiv. doi:10.31234/osf.io/5b26t. Retrieved 2023-12-10.
  170. Henrich, Joseph (2020-09-08). The WEIRDest People in the World: How the West Became Psychologically Peculiar and Particularly Prosperous. New York: Farrar, Straus and Giroux. ISBN   978-0-374-17322-7.
  171. Bloom, Paul (2023-11-29). "How Moral Can A.I. Really Be?". The New Yorker. ISSN   0028-792X . Retrieved 2023-12-10.
  172. Gigerenzer, Gerd (2022). How to Stay Smart in a Smart World: Why Human Intelligence Still Beats Algorithms. MIT Press.

Related Research Articles

Morality – Differentiation between right and wrong

Morality is the categorization of intentions, decisions and actions into those that are proper, or right, and those that are improper, or wrong. Morality can be a body of standards or principles derived from a code of conduct from a particular philosophy, religion or culture, or it can derive from a standard that is understood to be universal. Morality may also be specifically synonymous with "goodness", "appropriateness" or "rightness".

Cognitive bias – Systematic pattern of deviation from norm or rationality in judgment

A cognitive bias is a systematic pattern of deviation from norm or rationality in judgment. Individuals create their own "subjective reality" from their perception of the input. An individual's construction of reality, not the objective input, may dictate their behavior in the world. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, and irrationality.

Lawrence Kohlberg was an American psychologist best known for his theory of stages of moral development.

Moral reasoning is the study of how people think about right and wrong and how they acquire and apply moral rules. It is a subdiscipline of moral psychology that overlaps with moral philosophy, and is the foundation of descriptive ethics.

Lawrence Kohlberg's stages of moral development constitute an adaptation of a psychological theory originally conceived by the Swiss psychologist Jean Piaget. Kohlberg began work on this topic as a psychology graduate student at the University of Chicago in 1958 and expanded upon the theory throughout his life.

In moral psychology, social intuitionism is a model that proposes that moral positions are often non-verbal and behavioral. Often such social intuitionism is based on "moral dumbfounding" where people have strong moral reactions but fail to establish any kind of rational principle to explain their reaction.

Jonathan Haidt – American social psychologist (born 1963)

Jonathan David Haidt is an American social psychologist and author. He is the Thomas Cooley Professor of Ethical Leadership at the New York University Stern School of Business. His main areas of study are the psychology of morality and moral emotions.

Philip E. Tetlock – Canadian-American political scientist

Philip E. Tetlock is a Canadian-American political scientist, currently the Annenberg University Professor at the University of Pennsylvania, where he is cross-appointed at the Wharton School and the School of Arts and Sciences. He was elected a member of the American Philosophical Society in 2019.

Heuristics are the mental shortcuts humans use to arrive at decisions. They are simple strategies that humans, animals, organizations, and even machines use to quickly form judgments, make decisions, and find solutions to complex problems, often by focusing on the most relevant aspects of a problem or situation. While heuristics are used to find answers and solutions that are likely to work or be correct, they are not always right or the most accurate. Judgments and decisions based on heuristics are simply good enough to satisfy a pressing need in situations of uncertainty, where information is incomplete; in that sense they can differ from answers given by logic and probability theory.
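The "good enough" decision rule described above is Herbert Simon's notion of satisficing, which Gigerenzer's work on moral satisficing builds on. A minimal sketch of the contrast between satisficing and maximizing, assuming a hypothetical scoring function and aspiration level (the names and values below are illustrative, not taken from any published model):

```python
def satisfice(options, evaluate, aspiration):
    """Return the first option whose score meets the aspiration level.

    Search stops as soon as one option is 'good enough'; returns None
    if no option clears the threshold.
    """
    for option in options:
        if evaluate(option) >= aspiration:
            return option
    return None


def maximize(options, evaluate):
    """Return the best option; must evaluate every option to do so."""
    return max(options, key=evaluate)


if __name__ == "__main__":
    # Hypothetical moral 'acceptability' scores for three courses of action.
    scores = {"a": 0.2, "b": 0.7, "c": 0.9}
    options = ["a", "b", "c"]
    print(satisfice(options, scores.get, 0.6))  # "b": first good-enough option
    print(maximize(options, scores.get))        # "c": the global optimum
```

The point of the sketch is that the satisficer inspects options only until one clears its aspiration level, while the maximizer must score them all; under uncertainty and incomplete information, the first strategy is often the only feasible one.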

Moral development focuses on the emergence, change, and understanding of morality from infancy through adulthood. The theory states that morality develops across the lifespan in a variety of ways: it is influenced by an individual's experiences and behavior, and by the moral issues they face during different periods of physical and cognitive development. Morality concerns an individual's developing sense of what is right and wrong; it is for this reason that young children have different moral judgment and character than grown adults. Morality in itself is often a synonym for "rightness" or "goodness". It also refers to a specific code of conduct derived from one's culture, religion, or personal philosophy that guides one's actions, behaviors, and thoughts.

Political ethics is the practice of making moral judgments about political action and political agents. It covers two areas: the ethics of process, which covers public officials and their methods, and the ethics of policy, which concerns judgments surrounding policies and laws.

Moral foundations theory is a social psychological theory intended to explain the origins of and variation in human moral reasoning on the basis of innate, modular foundations. It was first proposed by the psychologists Jonathan Haidt, Craig Joseph, and Jesse Graham, building on the work of cultural anthropologist Richard Shweder. More recently, Mohammad Atari, Jesse Graham, and Jonathan Haidt have revised some aspects of the theory and developed new measurement tools. The theory has been developed by a diverse group of collaborators and popularized in Haidt's book The Righteous Mind. The theory proposes that morality is "more than one thing", first arguing for five foundations and later expanding to six.

Role-taking theory is the social-psychological concept that one of the most important factors in facilitating social cognition in children is the growing ability to understand others' feelings and perspectives, an ability that emerges as a result of general cognitive growth. Part of this process requires that children come to realize that others' views may differ from their own. Role-taking ability involves understanding the cognitive and affective aspects of another person's point of view, and differs from perceptual perspective taking, which is the ability to recognize another person's visual point of view of the environment. Furthermore, despite some mixed evidence on the issue, role taking and perceptual perspective taking appear to be functionally and developmentally independent of each other.

The Righteous Mind – 2012 book by Jonathan Haidt

The Righteous Mind: Why Good People are Divided by Politics and Religion is a 2012 social psychology book by Jonathan Haidt, in which the author describes human morality as it relates to politics and religion.

Elevation is an emotion elicited by witnessing actual or imagined virtuous acts of remarkable moral goodness. It is experienced as a distinct feeling of warmth and expansion that is accompanied by appreciation and affection for the individual whose exceptional conduct is being observed. Elevation motivates those who experience it to open up to, affiliate with, and assist others. Elevation makes an individual feel lifted up and optimistic about humanity.

Moral blindness, also known as ethical blindness, is defined as a person's temporary inability to see the ethical aspect of a decision they are making. It is often caused by external factors due to which an individual is unable to see the immoral aspect of their behavior in that particular situation.

Relational models theory (RMT) is a theory of interpersonal relationships, authored by anthropologist Alan Fiske and initially developed from his fieldwork in Burkina Faso. RMT proposes that all human interactions can be described in terms of just four "relational models", or elementary forms of human relations: communal sharing, authority ranking, equality matching and market pricing.

Moral emotions are a variety of social emotions that are involved in forming and communicating moral judgments and decisions, and in motivating behavioral responses to one's own and others' moral behavior. As defined by Jonathan Haidt, moral emotions "are linked to the interests or welfare either of a society as a whole or at least of persons other than the judge or agent". A person may lack clear words to articulate a moral emotion yet still feel it with conviction.

Moral identity is a concept within moral psychology concerning the importance of morality to a person's identity, typically construed either as a trait-like individual difference or as a set of chronically accessible schemas.

References

From the Stanford Encyclopedia of Philosophy
From the Internet Encyclopedia of Philosophy