The bias blind spot is the cognitive bias of recognizing the impact of biases on the judgment of others while failing to see the impact of biases on one's own judgment. [1] The term was coined by Emily Pronin, a social psychologist in Princeton University's Department of Psychology, together with colleagues Daniel Lin and Lee Ross. [2] The bias blind spot is named after the visual blind spot.

Most people appear to exhibit the bias blind spot. In a sample of more than 600 residents of the United States, more than 85% believed they were less biased than the average American; only one participant believed that they were more biased than average. People do vary in the extent to which they exhibit the bias blind spot. The phenomenon has been successfully replicated, and stronger beliefs in personal free will appear to be associated with a stronger bias blind spot. [3] It appears to be a stable, measurable individual difference. [5]
The bias blind spot appears to be a true blind spot in that it is unrelated to actual decision-making ability: performance on indices of decision-making competence is not related to individual differences in bias blind spot. In other words, most people appear to believe that they are less biased than others, regardless of their actual decision-making ability. [4]
Bias blind spots may be caused by a variety of other biases and self-deceptions. [6]
Self-enhancement biases may play a role, in that people are motivated to view themselves in a positive light. Because biases are generally seen as undesirable, [7] people tend to think of their own perceptions and judgments as rational, accurate, and free of bias. Self-enhancement also applies when people analyze their own decisions: they are likely to think of themselves as better decision-makers than others. [6]
People also tend to believe they are aware of "how" and "why" they make their decisions, and therefore conclude that bias played no role. Yet many decisions are shaped by biases and cognitive shortcuts, which are unconscious processes. By definition, people are unaware of unconscious processes and therefore cannot see their influence on decision making. [6]
Research has shown that even when people are made aware of various biases acting on their perception, decisions, or judgments, they remain unable to control them. This contributes to the bias blind spot: even a person who is told that they are biased is unable to correct their biased perception. [6]
Emily Pronin and Matthew Kugler have argued that this phenomenon is due to the introspection illusion. [8] In their experiments, subjects had to make judgments about themselves and about other subjects. [9] They displayed standard biases, for example rating themselves above the others on desirable qualities (demonstrating illusory superiority). The experimenters then explained cognitive biases and asked the subjects how these might have affected their judgments. The subjects rated themselves as less susceptible to bias than others in the experiment (confirming the bias blind spot). When asked to explain their judgments, the subjects used different strategies for assessing their own bias and that of others.
Pronin and Kugler's interpretation is that, when people decide whether someone else is biased, they use overt behaviour. On the other hand, when assessing whether they themselves are biased, people look inward, searching their own thoughts and feelings for biased motives. [8] Since biases operate unconsciously, these introspections are not informative, but people wrongly treat them as a reliable indication that they themselves, unlike other people, are immune to bias. [9]
Pronin and Kugler tried to give their subjects access to others' introspections. To do this, they made audio recordings of subjects who had been told to say whatever came into their heads as they decided whether their answer to a previous question might have been affected by bias. [9] Although subjects persuaded themselves they were unlikely to be biased, their introspective reports did not sway the assessments of observers.
People tend to attribute bias unevenly. When people's perceptions differ, they tend to label one another as biased while labelling themselves as accurate and unbiased. Pronin hypothesizes that this misattribution of bias may be a source of conflict and misunderstanding between people. In labeling another person as biased, one may also interpret that person's intentions cynically; when examining their own cognitions, however, people judge themselves by their good intentions. As a result, one may attribute another's bias to "intentional malice" rather than to an unconscious process. [10]
Pronin also hypothesizes that awareness of the bias blind spot can be used to reduce conflict and to think in a more "scientifically informed" way. Although people cannot control biases acting on their own cognition, [6] they can keep in mind that biases act on everyone. Pronin suggests that people might use this knowledge to separate others' intentions from their actions. [10]
Initial evidence suggests that the bias blind spot is not related to actual decision-making ability. [4] Participants who scored better or worse on tasks measuring decision-making competence were no more or less susceptible to the bias blind spot. The bias blind spot does, however, appear to increase susceptibility to related biases.
A cognitive bias is a systematic pattern of deviation from norm or rationality in judgment. Individuals create their own "subjective reality" from their perception of the input. An individual's construction of reality, not the objective input, may dictate their behavior in the world. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, or what is broadly called irrationality.
Introspection is the examination of one's own conscious thoughts and feelings. In psychology, the process of introspection relies on the observation of one's mental state, while in a spiritual context it may refer to the examination of one's soul. Introspection is closely related to human self-reflection and self-discovery and is contrasted with external observation.
Egocentric bias is the tendency to rely too heavily on one's own perspective and/or have a higher opinion of oneself than reality. It appears to be the result of the psychological need to satisfy one's ego and to be advantageous for memory consolidation. Research has shown that experiences, ideas, and beliefs are more easily recalled when they match one's own, causing an egocentric outlook. Michael Ross and Fiore Sicoly first identified this cognitive bias in their 1979 paper, "Egocentric biases in availability and attribution". Egocentric bias is referred to by most psychologists as a general umbrella term under which other related phenomena fall.
Trait ascription bias is the tendency for people to view themselves as relatively variable in terms of personality, behavior and mood while viewing others as much more predictable in their personal traits across different situations. More specifically, it is a tendency to describe one's own behaviour in terms of situational factors while preferring to describe another's behaviour by ascribing fixed dispositions to their personality. This may occur because people's own internal states are more readily observable and available to them than those of others.
In psychology, the false consensus effect, also known as consensus bias, is a pervasive cognitive bias that causes people to “see their own behavioral choices and judgments as relatively common and appropriate to existing circumstances”. In other words, they assume that their personal qualities, characteristics, beliefs, and actions are relatively widespread through the general population.
The illusion of control is the tendency for people to overestimate their ability to control events. It was named by U.S. psychologist Ellen Langer and is thought to influence gambling behavior and belief in the paranormal. Along with illusory superiority and optimism bias, the illusion of control is one of the positive illusions.
Thomas Dashiff Gilovich is an American psychologist who is the Irene Blecker Rosenfeld Professor of Psychology at Cornell University. He has conducted research in social psychology, decision making, and behavioral economics, and has written popular books on these subjects. Gilovich has collaborated with Daniel Kahneman, Richard Nisbett, Lee Ross and Amos Tversky. His articles in peer-reviewed journals on subjects such as cognitive biases have been widely cited. In addition, Gilovich has been quoted in the media on subjects ranging from the effect of purchases on happiness to perception of judgment in social situations. Gilovich is a fellow of the Committee for Skeptical Inquiry.
Lee David Ross was a Canadian-American professor. He held the title of the Stanford Federal Credit Union Professor of Humanities and Sciences at Stanford University and was an influential social psychologist who studied attributional biases, shortcomings in judgment and decision making, and barriers to conflict resolution, often with longtime collaborator Mark Lepper. Ross was known for his identification and explication of the fundamental attribution error and for the demonstration and analysis of other phenomena and shortcomings that have become standard topics in textbooks and in some cases, even popular media. His interests included ongoing societal problems, in particular protracted inter-group conflicts, the individual and collective rationalization of evil, and the psychological processes that make it difficult to confront societal challenges. Ross went beyond the laboratory to involve himself in conflict resolution and public peace processes in the Middle East, Northern Ireland, and other areas of the world.
Selective perception is the tendency not to notice, and to more quickly forget, stimuli that cause emotional discomfort or contradict one's prior beliefs. For example, a teacher who has a favorite student because of in-group favoritism may ignore that student's poor attainment while failing to notice the progress of their least favorite student.
The adaptive unconscious, a term first coined by social psychologist Daniel Wegner in 2002, is a set of mental processes that affects judgement and decision-making but is out of reach of the conscious mind. It is thought to be adaptive because it helps to keep the organism alive. Architecturally, the adaptive unconscious is said to be unreachable because it is buried in an unknown part of the brain. This type of thinking evolved earlier than the conscious mind, enabling the mind to transform information and think in ways that enhance an organism's survival. It can be described as a quick sizing-up of the world that interprets information and decides how to act very quickly, outside of conscious view.

The adaptive unconscious is active in everyday activities such as learning new material, detecting patterns, and filtering information. It is characterized as unconscious, unintentional, uncontrollable, and efficient, requiring no cognitive tools. This does not make it any less useful than the conscious mind, since it enables processes such as memory formation, physical balancing, language, learning, and some emotional and personality processes, including judgement, decision making, impression formation, evaluation, and goal pursuit. Despite its usefulness, the adaptive unconscious does not always produce accurate or correct decisions. Because it is affected by emotional reactions, estimations, and experience, it is prone to stereotyping and schemas, which can lead to inaccurate decision making. Its very lack of cognitive tools, however, is also said to help decision making avoid some cognitive biases, such as prejudice.
The overconfidence effect is a well-established bias in which a person's subjective confidence in his or her judgments is reliably greater than the objective accuracy of those judgments, especially when confidence is relatively high. Overconfidence is one example of a miscalibration of subjective probabilities. Throughout the research literature, overconfidence has been defined in three distinct ways: (1) overestimation of one's actual performance; (2) overplacement of one's performance relative to others; and (3) overprecision in expressing unwarranted certainty in the accuracy of one's beliefs.
The illusion of asymmetric insight is a cognitive bias whereby people perceive their knowledge of others to surpass other people's knowledge of them. This bias "has been traced to people's tendency to view their own spontaneous or off-the-cuff responses to others' questions as relatively unrevealing even though they view others' similar responses as meaningful".
Optimism bias is a cognitive bias that causes someone to believe that they themselves are less likely to experience a negative event. It is also known as unrealistic optimism or comparative optimism.
In the field of social psychology, illusory superiority is a cognitive bias wherein a person overestimates their own qualities and abilities in relation to the same qualities and abilities of other people. Illusory superiority is one of many positive illusions relating to the self that are evident in the study of intelligence, the effective performance of tasks and tests, and the possession of desirable personal characteristics and personality traits. Overestimation of abilities compared to an objective measure is known as the overconfidence effect.
The introspection illusion is a cognitive bias in which people wrongly think they have direct insight into the origins of their mental states, while treating others' introspections as unreliable. The illusion has been examined in psychological experiments, and suggested as a basis for biases in how people compare themselves to others. These experiments have been interpreted as suggesting that, rather than offering direct access to the processes underlying mental states, introspection is a process of construction and inference, much as people indirectly infer others' mental states from their behaviour.
In social psychology, naïve realism is the human tendency to believe that we see the world around us objectively, and that people who disagree with us must be uninformed, irrational, or biased.
Intuition in the context of decision-making is defined as a "non-sequential information-processing mode." It is distinct from insight and can be contrasted with the deliberative style of decision-making. Intuition can influence judgment through either emotion or cognition, and there has been some suggestion that it may be a means of bridging the two. Individuals use intuition and more deliberative decision-making styles interchangeably, but there has been some evidence that people tend to gravitate to one or the other style more naturally. People in a good mood gravitate toward intuitive styles, while people in a bad mood tend to become more deliberative. The specific ways in which intuition actually influences decisions remain poorly understood.
Debiasing is the reduction of bias, particularly with respect to judgment and decision making. Biased judgment and decision making is that which systematically deviates from the prescriptions of objective standards such as facts, logic, and rational behavior or prescriptive norms. Biased judgment and decision making exists in consequential domains such as medicine, law, policy, and business, as well as in everyday life. Investors, for example, tend to hold onto falling stocks too long and sell rising stocks too quickly. Employers exhibit considerable discrimination in hiring and employment practices, and some parents continue to believe that vaccinations cause autism despite knowing that this link is based on falsified evidence. At an individual level, people who exhibit less decision bias have more intact social environments, reduced risk of alcohol and drug use, lower childhood delinquency rates, and superior planning and problem solving abilities.
The false-uniqueness effect is an attributional type of cognitive bias in social psychology that describes how people tend to view their qualities, traits, and personal attributes as unique when in reality they are not. This bias is often measured by looking at the difference between estimates that people make about how many of their peers share a certain trait or behaviour and the actual number of peers who report these traits and behaviours.