Introspection illusion

The surface appearance of an iceberg is often used to illustrate the human conscious and unconscious mind; the visible portions are easily noticed, and yet their shape depends on the much larger portions that are out of view.

The introspection illusion is a cognitive bias in which people wrongly think they have direct insight into the origins of their mental states, while treating others' introspections as unreliable. The illusion has been examined in psychological experiments, and suggested as a basis for biases in how people compare themselves to others. These experiments have been interpreted as suggesting that, rather than offering direct access to the processes underlying mental states, introspection is a process of construction and inference, much as people indirectly infer others' mental states from their behaviour. [1]

When people mistake unreliable introspection for genuine self-knowledge, the result can be an illusion of superiority over other people, for example when each person thinks they are less biased and less conformist than the rest of the group. Even when experimental subjects are provided with reports of other subjects' introspections, in as detailed a form as possible, they still rate those other introspections as unreliable while treating their own as reliable. Although the hypothesis of an introspection illusion informs some psychological research, the existing evidence is arguably inadequate to decide how reliable introspection is in normal circumstances. [2]

In certain situations, this illusion leads people to make confident but false explanations of their own behaviour (called "causal theories" [3] ) or inaccurate predictions of their future mental states.

Correction for the bias may be possible through education about the bias and its unconscious nature. [4]

Components

The phrase "introspection illusion" was coined by Emily Pronin. [5] Pronin describes the illusion as having four components:

  1. People give a strong weight to introspective evidence when assessing themselves.
  2. They do not give such a strong weight to introspection when assessing others.
  3. People disregard their own behaviour when assessing themselves (but not when assessing others).
  4. People weight their own introspections more heavily than those of others. It is not just that people lack access to each other's introspections: they regard only their own as reliable. [6]

Unreliability of introspection

[I]ntrospection does not provide a direct pipeline to nonconscious mental processes. Instead, it is best thought of as a process whereby people use the contents of consciousness to construct a personal narrative that may or may not correspond to their nonconscious states.

Timothy D. Wilson and Elizabeth W. Dunn (2004) [7]

The idea that people can be mistaken about their inner functioning is one applied by eliminative materialists. These philosophers suggest that some concepts, including "belief" or "pain", will turn out to be quite different from what is commonly expected as science advances. The faulty guesses that people make to explain their thought processes have been called "causal theories". [3] Causal theories provided after an action often serve only to justify the person's behaviour and relieve cognitive dissonance. That is, a person may not have noticed the real reasons for their behaviour, even when trying to explain it; the result is an explanation that mostly just makes them feel better. An example might be a man who mistreats others who have a specific quality because he is embarrassed that he himself has that quality. He may not admit this to himself, instead claiming that his prejudice stems from having concluded that the quality is bad.

A 1977 paper by psychologists Richard Nisbett and Timothy D. Wilson challenged the directness and reliability of introspection, thereby becoming one of the most cited papers in the science of consciousness. [8] [9] Nisbett and Wilson reported on experiments in which subjects verbally explained why they had a particular preference, or how they arrived at a particular idea. On the basis of these studies and existing attribution research, they concluded that reports on mental processes are confabulated. They wrote that subjects had "little or no introspective access to higher order cognitive processes". [10] They distinguished between mental contents (such as feelings) and mental processes, arguing that while introspection gives us access to contents, processes remain hidden. [8]

Research continues to find that humans evolved only limited abilities to introspect.

Although some other experimental work followed from the Nisbett and Wilson paper, difficulties with testing the hypothesis of introspective access meant that research on the topic generally stagnated. [9] A ten-year-anniversary review of the paper raised several objections, questioning the idea of "process" they had used and arguing that unambiguous tests of introspective access are hard to achieve. [2] Updating the theory in 2002, Wilson admitted that the 1977 claims had been too far-reaching. [10] He instead relied on the theory that the adaptive unconscious does much of the moment-to-moment work of perception and behaviour. When people are asked to report on their mental processes, they cannot access this unconscious activity. [7] However, rather than acknowledge their lack of insight, they confabulate a plausible explanation, and "seem" to be "unaware of their unawareness". [11]

A study conducted by philosopher Eric Schwitzgebel and psychologist Russell T. Hurlburt set out to measure the accuracy of introspection by gathering introspective reports from a single individual, who was given the pseudonym "Melanie". Melanie carried a beeper that sounded at random moments; when it did, she had to note what she was currently feeling and thinking. After analyzing the reports, the authors had mixed views about the results, the correct interpretation of Melanie's claims, and her introspective accuracy. Even after long discussion, the two authors disagreed in their closing remarks, Schwitzgebel being pessimistic and Hurlburt optimistic about the reliability of introspection. [12]

Factors in accuracy

Nisbett and Wilson identified several factors that, they argued, affect the accuracy of introspective self-reports on cognition. [8]

Unawareness of error

Nisbett and Wilson offered several hypotheses to explain people's unawareness of their inaccuracies in introspection. [8]

Criticisms

Some evolutionary biologists criticize the claim that confabulating justifications evolved to relieve cognitive dissonance, because it presupposes the prior evolution of a mechanism for feeling dissonance at a lack of justification. These evolutionary biologists argue that if causal theories had no higher predictive accuracy than the prejudices that would have been in place even without them, there would have been no evolutionary selection for experiencing any form of discomfort from lacking causal theories. [14] [15] The similar claim that the apparent link between homophobia and homosexuality found in the U.S. can be explained by an actual link between the two is criticized by many scholars. Since much homophobia in the United States stems from religious indoctrination and is therefore unrelated to personal sexual preferences, they argue that the appearance of a link is due to volunteer bias in erotica research: religious homophobes fear God's judgment but not being recorded as "homosexual" by earthly psychologists, while most non-homophobes are misled by false dichotomies into assuming that the notion that men can be sexually fluid is somehow "homophobic" and "unethical". [16]

Choice blindness

Inspired by the Nisbett and Wilson paper, Petter Johansson and colleagues investigated subjects' insight into their own preferences using a new technique. Subjects saw two photographs of people and were asked which they found more attractive. They were given a closer look at their "chosen" photograph and asked to verbally explain their choice. However, in some trials, the experimenter had slipped them the other photograph rather than the one they had chosen, using sleight of hand. [17] A majority of subjects failed to notice that the picture they were looking at did not match the one they had chosen just seconds before. Many subjects confabulated explanations of their preference. For example, a man might say "I preferred this one because I prefer blondes" when he had in fact pointed to the dark-haired woman, but had been handed a blonde. [9] These must have been confabulated because they explain a choice that was never made. [17] The large proportion of subjects who were taken in by the deception contrasts with the 84% who, in post-test interviews, said that hypothetically they would have detected a switch if it had been made in front of them. The researchers coined the phrase "choice blindness" for this failure to detect a mismatch. [18]

A follow-up experiment involved shoppers in a supermarket tasting two different kinds of jam, then verbally explaining their preferred choice while taking further spoonfuls from the "chosen" pot. However, the pots were rigged so that, when explaining their choice, the subjects were tasting the jam they had actually rejected. A similar experiment was conducted with tea. [19] Another variation involved subjects choosing between two objects displayed on PowerPoint slides, then explaining their choice when the description of what they chose had been altered. [20]

Research by relationship psychologists Paul Eastwick and Eli Finkel [21] at Northwestern University also undermined the idea that subjects have direct introspective awareness of what attracts them to other people. These researchers examined male and female subjects' reports of what they found attractive. Men typically reported that physical attractiveness was crucial, while women identified earning potential as most important. These subjective reports did not predict their actual choices in a speed-dating context, or their dating behaviour in a one-month follow-up. [22]

Consistent with choice blindness, Henkel and Mather found that people are easily convinced by false reminders that they chose different options than they actually chose and that they show greater choice-supportive bias in memory for whichever option they believe they chose. [23]

Criticisms

It is not clear, however, to what extent these findings apply to real-life experience, when we have more time to reflect or are viewing actual faces (as opposed to gray-scale photos). [24] As Prof. Kaszniak points out: "although a priori theories are an important component of people's causal explanations, they are not the sole influence, as originally hypothesized by Nisbett & Wilson. Actors also have privileged information access that includes some degree of introspective access to pertinent causal stimuli and thought processes, as well as better access (than observers) to stimulus-response covariation data about their own behaviour". [25] Other criticisms point out that people who volunteer for psychology lab studies are not representative of the general population and behave in ways that do not reflect how they would behave in real life. Examples include people of many different non-open political ideologies who, despite their enmity toward each other, share the belief that it is "ethical" to give an appearance of justifying one's beliefs and "unethical" to admit that humans are open-minded in the absence of threats that inhibit critical thinking, leading them to fake justifications. [26] [27]

Attitude change

Studies that ask participants to introspect on their reasoning (for liking, choosing, or believing something, etc.) tend to find a subsequent decrease in the correspondence between participants' attitudes and behaviour. [28] For example, in a study by Wilson et al., participants rated their interest in puzzles that they had been given. Before rating, one group had been instructed to contemplate and write down their reasons for liking or disliking the puzzles, while the control group was given no such task. The amount of time participants spent playing with each puzzle was then recorded. The correlation between the ratings and the time spent playing each puzzle was much smaller for the introspection group than for the control group. [29]

A subsequent study tested whether these results generalized to more "realistic" circumstances. Its participants were all in steady romantic relationships. All were asked to rate how well-adjusted their relationship was; one group was first asked to list all the reasons behind their feelings for their partner, while the control group was not. Six months later, the experimenters followed up with participants to check whether they were still in the same relationship. Those who had been asked to introspect showed much less attitude-behaviour consistency, based on correlations between the earlier relationship ratings and whether they were still dating their partners. This shows that introspection was not predictive, though it also probably means that introspecting changed the course of the relationship. [29]

The authors theorize that these effects arise because participants change their attitudes, when confronted with a need for justification, without changing their corresponding behaviours. They hypothesize that this attitude shift results from a combination of factors: a desire to avoid feeling foolish for simply not knowing why one feels a certain way; a tendency to make justifications based on cognitive reasons, despite the large influence of emotion; ignorance of mental biases (e.g., halo effects); and self-persuasion that the reasons one has come up with must be representative of one's attitude. In effect, people attempt to supply a "good story" to explain their reasoning, which often leads them to convince themselves that they actually hold a different belief. [28] In studies in which participants chose an item to keep, their subsequent reports of satisfaction with the item decreased, suggesting that their attitude changes were temporary, returning to the original attitude over time. [30]

Introspection by focusing on feelings

In contrast with introspection focused on reasoning, introspection that focuses on feelings has actually been shown to increase attitude-behaviour correlations. [28] This finding suggests that introspecting on one's feelings is not a maladaptive process.

Criticisms

The theory that mental processes acting as justifications do not make behaviour more adaptive is criticized by some biologists, who argue that the nutrient cost of brain function selects against any brain mechanism that does not make behaviour better adapted to the environment. They argue that the cost in essential nutrients causes even more difficulty than the cost in calories, especially in social groups of many individuals needing the same scarce nutrients, which imposes substantial difficulty on feeding the group and lowers its potential size. These biologists argue that the evolution of argumentation was driven by the effectiveness of arguments in shifting risk perceptions, attitudes, and life-and-death decisions to a more adaptive state, since "luxury functions" that did not enhance survival would lose the evolutionary "tug of war" against selection for nutritional thrift. While there have been claims of non-adaptive brain functions being selected by sexual selection, these biologists deny its applicability to the introspection illusion's causal theories, because sexually selected traits are most disabling as a fitness signal during or after puberty, whereas human brains require the highest amount of nutrients before puberty (building the nerve connections that make adult brains capable of faster and more nutrient-efficient firing). [31] [32]

A priori causal theories

In their classic paper, Nisbett and Wilson proposed that introspective confabulations result from a priori theories, for which they put forth four possible origins. [8]

The authors note that the use of these theories does not necessarily lead to inaccurate assumptions, but that this frequently occurs because the theories are improperly applied.

Explaining biases

Pronin argues that over-reliance on intentions is a factor in a number of different biases. For example, by focusing on their current good intentions, people can overestimate their likelihood of behaving virtuously. [33]

In perceptions of bias

The bias blind spot is an established phenomenon in which people rate themselves as less susceptible to bias than their peers. Emily Pronin and Matthew Kugler argue that this phenomenon is due to the introspection illusion. [34] In their interpretation, when people decide whether someone else is biased, they use overt behaviour; when assessing whether they themselves are biased, they look inward, searching their own thoughts and feelings for biased motives. Since biases operate unconsciously, these introspections are uninformative, but people wrongly treat them as a reliable indication that they themselves, unlike other people, are immune to bias. [34]

In their experiments, subjects had to make judgments about themselves and about other subjects. [35] They displayed standard biases, for example rating themselves above the others on desirable qualities (demonstrating illusory superiority). The experimenters explained cognitive bias, and asked the subjects how it might have affected their judgment. The subjects rated themselves as less susceptible to bias than others in the experiment (confirming the bias blind spot). When they had to explain their judgments, they used different strategies for assessing their own and others' bias. [35]

Pronin and Kugler tried to give their subjects access to others' introspections. To do this, they made audio recordings of subjects who had been told to say whatever came into their heads as they decided whether their answer to a previous question might have been affected by bias. Although subjects persuaded themselves they were unlikely to be biased, their introspective reports did not sway the assessments of observers. [35]

When asked what it would mean to be biased, subjects were more likely to define bias in terms of introspected thoughts and motives when it applied to themselves, but in terms of overt behaviour when it applied to other people. When subjects were explicitly told to avoid relying on introspection, their assessments of their own bias became more realistic. [35]

Additionally, Nisbett and Wilson found that asking participants whether biases (such as the position effect in their stocking study) had affected their decisions elicited denials, in contradiction with the data. [8]

In perceptions of conformity

Another series of studies by Pronin and colleagues examined perceptions of conformity. Subjects reported being more immune to social conformity than their peers, in effect seeing themselves as "alone in a crowd of sheep". The introspection illusion appeared to contribute to this effect. When deciding whether others respond to social influence, subjects mainly looked at behaviour, for example explaining other students' political opinions in terms of following the group. When assessing their own conformity, subjects treated their own introspections as reliable: in their own minds they found no motive to conform, and so decided that they had not been influenced. [36]

In perceptions of control and free will

Psychologist Daniel Wegner has argued that an introspection illusion contributes to belief in paranormal phenomena such as psychokinesis. [37] He observes that in everyday experience, intention (such as wanting to turn on a light) is followed by action (such as flicking a light switch) in a reliable way, but the processes connecting the two are not consciously accessible. Hence though subjects may feel that they directly introspect their own free will, the experience of control is actually inferred from relations between the thought and the action. This theory, called "apparent mental causation", acknowledges the influence of David Hume's view of the mind. [37] This process for detecting when one is responsible for an action is not totally reliable, and when it goes wrong there can be an illusion of control. This could happen when an external event follows, and is congruent with, a thought in someone's mind, without an actual causal link. [37]

As evidence, Wegner cites a series of experiments on magical thinking in which subjects were induced to think they had influenced external events. In one experiment, subjects watched a basketball player taking a series of free throws. When they were instructed to visualise him making his shots, they felt that they had contributed to his success. [38]

If the introspection illusion contributes to the subjective feeling of free will, then it follows that people will more readily attribute free will to themselves than to others. This prediction has been confirmed by three of Pronin and Kugler's experiments. When college students were asked about personal decisions in their own and their roommate's lives, they regarded their own choices as less predictable. Staff at a restaurant described their co-workers' lives as more determined (having fewer future possibilities) than their own. When weighing the influence of different factors on behaviour, students gave desires and intentions the strongest weight for their own behaviour, but rated personality traits as most predictive of other people's. [39]

However, criticism of Wegner's claims regarding the significance of introspection illusion for the notion of free will has been published. [40]

Criticisms

Research shows that human volunteers can estimate their response times accurately, in fact knowing their "mental processes" well, but only with substantial demands made on their attention and cognitive resources (i.e. they are distracted while estimating). Such estimation is likely more than post hoc interpretation and may incorporate privileged information. [41] [42] Mindfulness training can also increase introspective accuracy in some instances. [43] [44] [45] Nisbett and Wilson's findings were criticized by psychologists Ericsson and Simon, among others. [46]

Correction

A study that investigated the effect of educating people about unconscious biases on their subsequent self-ratings of susceptibility to bias showed that those who were educated did not exhibit the bias blind spot, in contrast with the control group. This finding offers hope that being informed about unconscious biases such as the introspection illusion may help people to avoid making biased judgments, or at least make them aware that they are biased. Findings from other studies on correction of the bias have been mixed. In a later review of the introspection illusion, Pronin suggests the distinction that studies which merely warn of unconscious biases show no correction effect, whereas those that inform about the bias and emphasize its unconscious nature do yield corrections. Thus, knowledge that bias can operate outside conscious awareness appears to be the defining factor in leading people to correct for it. [4]

Timothy Wilson has tried to find a way out of the introspection illusion, as recounted in his book Strangers to Ourselves. He suggests that observing our own behaviour, more than our thoughts, can be one of the keys to clearer introspective knowledge. [47]

Criticisms

Some 21st-century critical rationalists argue that claims of correcting for the introspection illusion or other cognitive biases risk immunizing themselves against criticism, by alleging that any criticism of psychological theories positing cognitive bias is itself a "justification" of that bias; this makes the theories non-falsifiable through the labelling of critics, and is potentially totalitarian. These modern critical rationalists argue that defending a theory by claiming that it overcomes bias, while alleging that its critics are biased, could defend any pseudoscience from criticism; that the claim "criticism of A is a defense of B" is inherently incapable of being evidence-based; and that any actual "most humans" bias, if it existed, would be shared by most psychologists, making psychological claims of bias a way of accusing unbiased criticism of being biased while marketing the biases themselves as an overcoming of bias. [48] [49]

See also

Notes

  1. Wilson 2002 , p. 167
  2. White, Peter A. (1988). "Knowing more about what we can tell: 'Introspective access' and causal report accuracy 10 years later". British Journal of Psychology. 79 (1): 13–45. doi:10.1111/j.2044-8295.1988.tb02271.x.
  3. Aronson, Elliot; Wilson, Timothy D.; Akert, Robin M.; Sommers, Samuel R. (2015). Social Psychology (9th ed.). Pearson Education. p. 128. ISBN 9780133936544.
  4. Pronin 2009, pp. 52–53
  5. Shermer, Michael (2007). The Mind of the Market: Compassionate Apes, Competitive Humans, and Other Tales from Evolutionary Economics. Times Books. p. 72. ISBN 978-0-8050-7832-9.
  6. Pronin 2009, p. 5
  7. Wilson, Timothy D.; Dunn, Elizabeth W. (2004). "Self-Knowledge: Its Limits, Value, and Potential for Improvement". Annual Review of Psychology. 55 (1): 493–518. doi:10.1146/annurev.psych.55.090902.141954. PMID 14744224.
  8. Nisbett, Richard E.; Wilson, Timothy D. (1977). "Telling more than we can know: Verbal reports on mental processes". Psychological Review. 84 (3): 231–259. doi:10.1037/0033-295x.84.3.231. hdl:2027.42/92167. S2CID 7742203. Reprinted in David Lewis Hamilton, ed. (2005). Social Cognition: Key Readings. Psychology Press. ISBN 978-0-86377-591-8.
  9. Johansson, P; Hall, L; Sikström, S; Tärning, B; Lind, A (2006). "How something can be said about telling more than we can know: On choice blindness and introspection" (PDF). Consciousness and Cognition. 15 (4): 673–692. doi:10.1016/j.concog.2006.09.004. PMID 17049881. S2CID 14863202. Archived from the original on 2016-06-05.
  10. Wilson 2002, pp. 104–106
  11. Wilson, T. D.; Bar-Anan, Y (August 22, 2008). "The Unseen Mind". Science. 321 (5892): 1046–1047. doi:10.1126/science.1163029. PMID   18719269. S2CID   11434647.
  12. Schwitzgebel, Eric; Hurlburt, Russell T. (2007). Describing Inner Experience?. MIT Press. ISBN   978-0-262-08366-9. Archived from the original on 2012-10-12. Retrieved 2011-03-17.
  13. Petitmengin, Claire; Remillieux, Anne; Cahour, Béatrice; Carter-Thomas, Shirley (June 2013). "A gap in Nisbett and Wilson's findings? A first-person access to our cognitive processes" (PDF). Consciousness and Cognition. 22 (2): 654–669. doi:10.1016/j.concog.2013.02.004. PMID   23719334. S2CID   29907087.
  14. Relethford, John H. (2017). 50 Great Myths of Human Evolution. doi:10.1002/9781119308058. ISBN   9780470673911.
  15. Zilhão, António (2010). Evolution, Rationality and Cognition. Taylor & Francis. ISBN   9780415591607.
  16. Nestor, Paul G.; Schutt, Russell K. (2014). Research Methods in Psychology: Investigating Human Behavior. SAGE Publications. ISBN   9781483369150.
  17. Johansson, P; Hall, L; Sikström, S; Olsson, A (October 7, 2005). "Failure to Detect Mismatches Between Intention and Outcome in a Simple Decision Task" (PDF). Science. 310 (5745): 116–119. Bibcode:2005Sci...310..116J. doi:10.1126/science.1111709. PMID 16210542. S2CID 16249982. Archived from the original on December 22, 2014.
  18. Hall, Lars; Johansson, Petter; Sikström, Sverker; Tärning, Betty; Lind, Andreas (2008). "Reply to commentary by Moore and Haggard". Consciousness and Cognition. 15 (4): 697–699. doi:10.1016/j.concog.2006.10.001. S2CID   54399436.
  19. Hall, L.; Johansson, P.; Tärning, B.; Sikström, S.; Deutgen, T. (2010). "Magic at the marketplace: Choice blindness for the taste of jam and the smell of tea". Cognition. 117 (1): 54–61. doi:10.1016/j.cognition.2010.06.010. PMID   20637455. S2CID   14872715.
  20. Hall, Lars; Petter Johansson. "Using choice blindness to study decision making and introspection" (PDF). Archived from the original (PDF) on 2016-03-04. Retrieved 2009-07-02. In P. Gärdenfors & A. Wallin (Eds.) (2008). Cognition – A Smorgasbord. pp. 267-283.
  21. "Unorthodox advice for rescuing a marriage". The Economist . 12 October 2017.
  22. Eastwick, P. W.; Finkel, E. J. (February 2008). "Sex differences in mate preferences revisited: Do people know what they initially desire in a romantic partner?". Journal of Personality and Social Psychology. 94 (2): 245–264. doi:10.1037/0022-3514.94.2.245. PMID   18211175.
  23. Henkel, L; Mather, M (2007). "Memory attributions for choices: How beliefs shape our memories". Journal of Memory and Language. 57 (2): 163–176. doi:10.1016/j.jml.2006.08.012.
  24. Johansson, Petter; Hall, Lars; Sikstrom, Sverker (2008). "From Change Blindness to Choice Blindness". Psychologia. 51 (2): 142–155. doi: 10.2117/psysoc.2008.142 .
  25. Kaszniak, A. W. (2002). "How well can we know ourselves? — Further Exploration of Introspection". Psychology of Consciousness Class Notes. University of Arizona. Archived from the original on 2009-02-04.
  26. Swatridge, Colin (2014). Oxford Guide to Effective Argument and Critical Thinking. Oxford University Press. ISBN   9780199671724.
  27. Speelman, Craig P.; McGann, Marek (2016). Challenges to Mean-Based Analysis in Psychology: The Contrast between Individual People and General Science. Frontiers Research Topics. doi: 10.3389/978-2-88945-043-5 . ISBN   9782889450435.
  28. Wilson, Timothy D.; Dunn, Dana S.; Kraft, Dolores; Lisle, Douglas J. (1989). "Introspection, attitude change, and attitude-behavior consistency: The disruptive effects of explaining why we feel the way we do". Advances in Experimental Social Psychology. Vol. 22. pp. 287–343. doi:10.1016/S0065-2601(08)60311-1. ISBN 9780120152223.
  29. Wilson, Timothy; D. Dunn; J. Bybee; D. Hyman; J. Rotondo (1984). "Effects of analyzing reasons on attitude-behavior consistency". Journal of Personality and Social Psychology. 47: 5–16. doi:10.1037/0022-3514.47.1.5.
  30. Wilson, Timothy; D. Lisle; J. Schooler; S. Hodges; K. Klaaren; S. LaFleur (1993). "Introspecting about reasons can reduce post-choice satisfaction". Personality and Social Psychology Bulletin. 19 (3): 331–339. doi:10.1177/0146167293193010. S2CID   145374820.
  31. Martínez-García, Fernando; Puelles, Luis; Ten Donkelaar, Hans J.; González, Agustín (2014). Adaptive Function and Brain Evolution. Frontiers Research Topic. ISBN   978-2-88919-306-6.
  32. National Academy of Sciences; Striedter, G. F.; Avise, J. C.; Ayala, F. J. (2013). In the Light of Evolution. Volume VI: Brain and Behavior. National Academies Press (US). doi:10.17226/13462. ISBN   978-0-309-26175-3. PMID   24901185.
  33. Pronin, Emily (January 2007). "Perception and misperception of bias in human judgment". Trends in Cognitive Sciences. 11 (1): 37–43. doi:10.1016/j.tics.2006.11.001. PMID   17129749. S2CID   2754235.
  34. 1 2 Gilovich, Thomas; Nicholas Epley; Karlene Hanko (2005). "Shallow Thoughts About the Self: The Automatic Components of Self-Assessment". In Mark D. Alicke; David A. Dunning; Joachim I. Krueger (eds.). The Self in Social Judgment. Studies in Self and Identity. New York: Psychology Press. p. 77. ISBN   978-1-84169-418-4.
  35. 1 2 3 4 Pronin, Emily; Kugler, Matthew B. (July 2007). "Valuing thoughts, ignoring behavior: The introspection illusion as a source of the bias blind spot". Journal of Experimental Social Psychology. 43 (4): 565–578. doi:10.1016/j.jesp.2006.05.011.
  36. Pronin, E; Berger, J; Molouki, S (2007). "Alone in a Crowd of Sheep: Asymmetric Perceptions of Conformity and Their Roots in an Introspection Illusion". Journal of Personality and Social Psychology. 92 (4): 585–595. doi:10.1037/0022-3514.92.4.585. PMID   17469946.
  37. 1 2 3 Wegner, Daniel M. (2008). "Self is Magic" (PDF). In John Baer; James C. Kaufman; Roy F. Baumeister (eds.). Are we free? Psychology and Free Will. New York: Oxford University Press. ISBN   978-0-19-518963-6. Archived from the original (PDF) on 2017-01-20.
  38. Pronin, E; Wegner, D. M.; McCarthy, K; Rodriguez, S (2006). "Everyday Magical Powers: The Role of Apparent Mental Causation in the Overestimation of Personal Influence" (PDF). Journal of Personality and Social Psychology. 91 (2): 218–231. CiteSeerX   10.1.1.405.3118 . doi:10.1037/0022-3514.91.2.218. PMID   16881760. Archived from the original (PDF) on 2011-01-05. Retrieved 2009-07-03.
  39. Pronin 2009 , pp. 42–43
  40. e.g. criticism by H. Andersen in his paper with the title 'Two Causal Mistakes in Wegner's Illusion of Conscious Will'; Also as a criticism, read "On the alleged illusion of conscious will' by Van Duijn and Sacha Bem. Other papers can be found).
  41. Marti, Sébastien; Sackur, Jérôme; Sigman, Mariano; Dehaene, Stanislas (2010). "Mapping introspection's blind spot: Reconstruction of dual-task phenomenology using quantified introspection". Cognition. 115 (2): 303–313. doi:10.1016/j.cognition.2010.01.003. PMID   20129603. S2CID   8912503.
  42. Guggisberg, Adrian G.; Dalal, Sarang S.; Schnider, Armin; Nagarajan, Srikantan S. (2011). "The neural basis of event-time introspection". Consciousness and Cognition. 20 (4): 1899–1915. doi:10.1016/j.concog.2011.03.008. PMC   3161169 . PMID   21498087.
  43. Djikic, Maja; Langer, Ellen J.; Fulton Stapleton, Sarah (June 2008). "Reducing Stereotyping Through Mindfulness: Effects on Automatic Stereotype-Activated Behaviors" (PDF). Journal of Adult Development. 15 (2): 106–111. doi:10.1007/s10804-008-9040-0. S2CID   53626094. Archived from the original (PDF) on 2012-07-29.
  44. Roberts-Wolfe, D; Sacchet, M. D.; Hastings, E; Roth, H; Britton, W (2012). "Mindfulness training alters emotional memory recall compared to active controls: Support for an emotional information processing model of mindfulness". Frontiers in Human Neuroscience. 6: 15. doi: 10.3389/fnhum.2012.00015 . PMC   3277910 . PMID   22347856.
  45. Chiesa, Alberto; Calati, Raffaella; Serretti, Alessandro (April 2011). "Does mindfulness training improve cognitive abilities? A systematic review of neuropsychological findings" (PDF). Clinical Psychology Review. 31 (3): 449–464. doi:10.1016/j.cpr.2010.11.003. PMID   21183265. S2CID   33953894. Archived from the original (PDF) on 2013-10-29. Retrieved 2012-04-03.
  46. Ericsson, K. Anders; Simon, Herbert A. (May 1980). "Verbal reports as data". Psychological Review. 87 (3): 215–251. doi:10.1037/0033-295X.87.3.215.
  47. Wilson 2002.
  48. Nola, R.; Sankey, H. (2012). After Popper, Kuhn and Feyerabend: Recent Issues in Theories of Scientific Method. doi:10.1007/978-94-011-3935-9. ISBN   9789401139359.
  49. Sassower, Raphael (2014). Popper's Legacy: Rethinking Politics, Economics and Science. Routledge. ISBN   9781317493723.

