Illusion of explanatory depth

The illusion of explanatory depth (IOED) is a cognitive bias or illusion in which people tend to believe they understand a topic better than they actually do. [1] [2] [3] The term was coined by Yale researchers Leonid Rozenblit and Frank Keil in 2002. [1] [4] The effect has been observed only for explanatory knowledge, defined in this context as "knowledge that involves complex causal patterns" (see causal reasoning); it has not been observed for procedural, narrative, or factual (descriptive) knowledge. [2] [5] Evidence of the IOED has been found for everyday mechanical and electrical devices such as bicycles, as well as for mental disorders, natural phenomena, folk theories, and politics; its most studied manifestation is political polarization. [6] [2]

The illusion is related to the Dunning–Kruger effect, differing in that the IOED concerns explanatory knowledge rather than ability. [1] [3] Limited evidence suggests that the effects of the IOED are weaker in subject-matter experts, [7] but the illusion is believed to affect almost everyone, whereas the Dunning–Kruger effect is usually defined as applying only to those of low to moderate competence. [3] [8] The IOED is more pronounced for historical knowledge when knowing about the topic is perceived as socially desirable. [9]

Another description of the IOED is that "we mistake our familiarity with a situation for an understanding of how it works". [10] The IOED has also been suggested as an explanation for the perception that psychology as a field is "simple" or "obvious". [10] [non-primary source needed]

In politics

There is evidence that the IOED is a contributing factor to increased political polarization in the United States. [11] A 2018 study, with participants recruited in the context of the 2016 United States presidential election, found that a stronger IOED for political topics was associated with greater endorsement of conspiracy theories. [12]

Management

It is thought that the effects of the IOED, especially in politics, can be reduced by asking people to explain the topic rather than only asking them to provide reasons for their beliefs. [1] [11] The specific way in which people are asked to explain matters, because some framings backfire: research found that when people are asked to "justify their position", their beliefs become more extreme. Asking for "reasons" may lead people to strengthen their beliefs by selectively recalling support for their position, while asking for "explanations" may lead them to confront their lack of knowledge. [11]

Original experiment

Rozenblit and Keil introduced the term in a 2002 paper. [2] One inspiration for the IOED concept was research on change blindness suggesting at the time that people grossly overestimate their own change-detection abilities. [13]

In their experiment, Rozenblit and Keil asked 16 Yale undergraduate students to rate their understanding of everyday devices and simple items. Participants were then asked to generate a detailed explanation of how each item worked and to re-rate their understanding of it. Ratings were consistently lower after generating an explanation, suggesting that participants recognized their lack of understanding only after attempting to explain. Rozenblit and Keil concluded that having to explain basic concepts or mechanisms confronts people with the reality that they may not understand the subject as well as they think they do.
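
The paradigm can be summarized as a pre-rating, an explanation attempt, and a re-rating. As a minimal sketch, the Python snippet below scores hypothetical pre- and post-explanation self-ratings on a 1–7 scale; the items and numbers are invented placeholders, not data from the original study.

    # Scoring an IOED-style rating task (illustrative data only).
    # Each participant rates their understanding of an item on a 1-7 scale,
    # writes a detailed causal explanation, then re-rates the same item;
    # a positive pre-minus-post drop is the signature of the illusion.
    from statistics import mean

    # Hypothetical (pre-explanation, post-explanation) self-ratings per item.
    ratings = {
        "zipper": (6, 3),
        "flush toilet": (5, 3),
        "speedometer": (4, 2),
    }

    drops = {item: pre - post for item, (pre, post) in ratings.items()}
    for item, drop in drops.items():
        print(f"{item}: self-rating fell by {drop}")
    print(f"mean drop: {mean(drops.values()):.2f}")  # > 0 suggests an IOED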

Related Research Articles

Cognitive science is the interdisciplinary, scientific study of the mind and its processes with input from linguistics, psychology, neuroscience, philosophy, computer science/artificial intelligence, and anthropology. It examines the nature, the tasks, and the functions of cognition. Cognitive scientists study intelligence and behavior, with a focus on how nervous systems represent, process, and transform information. Mental faculties of concern to cognitive scientists include language, perception, memory, attention, reasoning, and emotion; to understand these faculties, cognitive scientists borrow from fields such as linguistics, psychology, artificial intelligence, philosophy, neuroscience, and anthropology. The typical analysis of cognitive science spans many levels of organization, from learning and decision-making to logic and planning; from neural circuitry to modular brain organization. One of the fundamental concepts of cognitive science is that "thinking can best be understood in terms of representational structures in the mind and computational procedures that operate on those structures."

Cognitive psychology is the scientific study of mental processes such as attention, language use, memory, perception, problem solving, creativity, and reasoning.

Confirmation bias is the tendency to search for, interpret, favor, and recall information in a way that confirms or supports one's prior beliefs or values. People display this bias when they select information that supports their views, ignoring contrary information, or when they interpret ambiguous evidence as supporting their existing attitudes. The effect is strongest for desired outcomes, for emotionally charged issues, and for deeply entrenched beliefs. Confirmation bias is insuperable for most people, but they can manage it, for example, by education and training in critical thinking skills.

Cognition is the "mental action or process of acquiring knowledge and understanding through thought, experience, and the senses". It encompasses all aspects of intellectual functions and processes such as: perception, attention, thought, imagination, intelligence, the formation of knowledge, memory and working memory, judgment and evaluation, reasoning and computation, problem-solving and decision-making, comprehension and production of language. Cognitive processes use existing knowledge and discover new knowledge.

Understanding is a cognitive process related to an abstract or physical object, such as a person, situation, or message whereby one is able to use concepts to model that object. Understanding is a relation between the knower and an object of understanding. Understanding implies abilities and dispositions with respect to an object of knowledge that are sufficient to support intelligent behavior.

Ignorance is a lack of knowledge or understanding. Deliberate ignorance is a culturally-induced phenomenon, the study of which is called agnotology.

Introspection is the examination of one's own conscious thoughts and feelings. In psychology, the process of introspection relies on the observation of one's mental state, while in a spiritual context it may refer to the examination of one's soul. Introspection is closely related to human self-reflection and self-discovery and is contrasted with external observation.

In psychology, theory of mind refers to the capacity to understand other people by ascribing mental states to them. A theory of mind includes the knowledge that others' beliefs, desires, intentions, emotions, and thoughts may be different from one's own. Possessing a functional theory of mind is crucial for success in everyday human social interactions. People utilise a theory of mind when analyzing, judging, and inferring others' behaviors. The discovery and development of theory of mind primarily came from studies done with animals and infants. Factors including drug and alcohol consumption, language development, cognitive delays, age, and culture can affect a person's capacity to display theory of mind. Having a theory of mind is similar to but not identical with having the capacity for empathy or sympathy.

Metacognition is an awareness of one's thought processes and an understanding of the patterns behind them. The term comes from the root word meta, meaning "beyond", or "on top of". Metacognition can take many forms, such as reflecting on one's ways of thinking and knowing when and how to use particular strategies for problem-solving. There are generally two components of metacognition: (1) knowledge about cognition and (2) regulation of cognition. A metacognitive model differs from other scientific models in that its creator is by definition also enclosed within it. Scientific models are often prone to distancing the observer from the object or field of study, whereas a metacognitive model generally tries to include the observer in the model.

The Dunning–Kruger effect is a cognitive bias in which people with limited competence in a particular domain overestimate their abilities. Some researchers also include the opposite effect for high performers: their tendency to underestimate their skills. In popular culture, the Dunning–Kruger effect is often misunderstood as a claim about general overconfidence of people with low intelligence instead of specific overconfidence of people unskilled at a particular task.

The overconfidence effect is a well-established bias in which a person's subjective confidence in their judgments is reliably greater than the objective accuracy of those judgments, especially when confidence is relatively high. Overconfidence is one example of a miscalibration of subjective probabilities. Throughout the research literature, overconfidence has been defined in three distinct ways: (1) overestimation of one's actual performance; (2) overplacement of one's performance relative to others; and (3) overprecision in expressing unwarranted certainty in the accuracy of one's beliefs.

Positive illusions are unrealistically favorable attitudes that people have towards themselves or towards people close to them. Positive illusions are a form of self-deception or self-enhancement that feels good, maintains self-esteem, or avoids discomfort, at least in the short term. There are three general forms: inflated assessment of one's own abilities, unrealistic optimism about the future, and an illusion of control. The term "positive illusions" originates in a 1988 paper by Taylor and Brown. "Taylor and Brown's (1988) model of mental health maintains that certain positive illusions are highly prevalent in normal thought and predictive of criteria traditionally associated with mental health."

In social psychology, illusory superiority is a cognitive bias wherein a person overestimates their own qualities and abilities compared to other people. Illusory superiority is one of many positive illusions, relating to the self, that are evident in the study of intelligence, the effective performance of tasks and tests, and the possession of desirable personal characteristics and personality traits. Overestimation of abilities compared to an objective measure is known as the overconfidence effect.

Causal reasoning is the process of identifying causality: the relationship between a cause and its effect. The study of causality extends from ancient philosophy to contemporary neuropsychology; assumptions about the nature of causality may be shown to be functions of a previous event preceding a later one. The first known protoscientific study of cause and effect occurred in Aristotle's Physics. Causal inference is an example of causal reasoning.

The introspection illusion is a cognitive bias in which people wrongly think they have direct insight into the origins of their mental states, while treating others' introspections as unreliable. The illusion has been examined in psychological experiments, and suggested as a basis for biases in how people compare themselves to others. These experiments have been interpreted as suggesting that, rather than offering direct access to the processes underlying mental states, introspection is a process of construction and inference, much as people indirectly infer others' mental states from their behaviour.

The curse of knowledge is a cognitive bias that occurs when a person communicating with others assumes that those others share the background knowledge and information that is in fact available only to the communicator. Some authors also call this bias the curse of expertise.

Metacognitive therapy (MCT) is a psychotherapy focused on modifying metacognitive beliefs that perpetuate states of worry, rumination and attention fixation. It was created by Adrian Wells based on an information processing model by Wells and Gerald Matthews. It is supported by scientific evidence from a large number of studies.

Hypercorrection is the increased likelihood of correcting a general-knowledge error when a person was originally certain, rather than unsure, that their incorrect answer was accurate. The phenomenon suggests that once someone confidently misremembers a piece of general knowledge and then learns the correct version, they are more likely to remember it than someone who was unsure of their initial answer. It refers to the finding that, when corrective feedback is given, errors committed with high confidence are easier to correct than errors committed with low confidence.

The false-uniqueness effect is an attributional type of cognitive bias in social psychology that describes how people tend to view their qualities, traits, and personal attributes as unique when in reality they are not. This bias is often measured by looking at the difference between estimates that people make about how many of their peers share a certain trait or behaviour and the actual number of peers who report these traits and behaviours.

References

  1. Waytz, Adam (26 January 2022). "2017: What scientific term or concept ought to be more widely known?". Edge.org. Retrieved 26 January 2022.
  2. Rozenblit, Leonid; Keil, Frank (2002). "The misunderstood limits of folk science: an illusion of explanatory depth". Cognitive Science. 26 (5): 521–562. doi:10.1207/s15516709cog2605_1. ISSN 0364-0213. PMC 3062901. PMID 21442007.
  3. Chromik, Michael; Eiband, Malin; Buchner, Felicitas; Krüger, Adrian; Butz, Andreas (13 April 2021). "I Think I Get Your Point, AI! The Illusion of Explanatory Depth in Explainable AI". 26th International Conference on Intelligent User Interfaces. New York, NY, USA: ACM. pp. 307–317. doi:10.1145/3397481.3450644. ISBN 9781450380171.
  4. "The Illusion of Explanatory Depth". The Decision Lab. Retrieved 26 January 2022.
  5. Mills, Candice M.; Keil, Frank C. (2004). "Knowing the limits of one's understanding: The development of an awareness of an illusion of explanatory depth". Journal of Experimental Child Psychology. 87 (1): 1–32. doi:10.1016/j.jecp.2003.09.003. ISSN 0022-0965. PMID 14698687.
  6. Zeveney, Andrew; Marsh, Jessecae (2016). "The Illusion of Explanatory Depth in a Misunderstood Field: The IOED in Mental Disorders" (PDF). Proceedings of the Cognitive Science Society: 1020.
  7. Lawson, Rebecca (2006). "The science of cycology: Failures to understand how everyday objects work". Memory & Cognition. 34 (8): 1667–1675. doi:10.3758/bf03195929. ISSN 0090-502X. PMID 17489293. S2CID 4998257.
  8. McIntosh, Robert D.; Fowler, Elizabeth A.; Lyu, Tianjiao; Della Sala, Sergio (November 2019). "Wise up: Clarifying the role of metacognition in the Dunning-Kruger effect". Journal of Experimental Psychology: General. 148 (11): 1882–1897. doi:10.1037/xge0000579. hdl:20.500.11820/b5c09c5f-d2f2-4f46-b533-9e826ab85585. ISSN 1939-2222. PMID 30802096. S2CID 73460013.
  9. Gaviria, Christian; Corredor, Javier (23 June 2021). "Illusion of explanatory depth and social desirability of historical knowledge". Metacognition and Learning. 16 (3): 801–832. doi:10.1007/s11409-021-09267-7. ISSN 1556-1623. S2CID 237878736.
  10. Stafford, Tom (February 2007). "Isn't it all just obvious?". The Psychologist. Retrieved 28 January 2022.
  11. Fernbach, Philip M.; Rogers, Todd; Fox, Craig R.; Sloman, Steven A. (25 April 2013). "Political Extremism Is Supported by an Illusion of Understanding". Psychological Science. 24 (6): 939–946. doi:10.1177/0956797612464058. ISSN 0956-7976. PMID 23620547. S2CID 6173291.
  12. Vitriol, Joseph A.; Marsh, Jessecae K. (15 June 2018). "The illusion of explanatory depth and endorsement of conspiracy beliefs". European Journal of Social Psychology. 48 (7): 955–969. doi:10.1002/ejsp.2504. ISSN 0046-2772. S2CID 149811872.
  13. Levin, Daniel T.; Momen, Nausheen; Drivdahl, Sarah B.; Simons, Daniel J. (January 2000). "Change Blindness Blindness: The Metacognitive Error of Overestimating Change-detection Ability". Visual Cognition. 7 (1–3): 397–412. doi:10.1080/135062800394865. ISSN 1350-6285. S2CID 14623812.