Dysrationalia is defined as the inability to think and behave rationally despite adequate intelligence. [1] It is a concept in educational psychology and is not a clinical disorder such as a thought disorder. The concept can help explain why otherwise intelligent people fall for Ponzi schemes and other frauds.
The concept of dysrationalia was first proposed by psychologist Keith Stanovich in the early 1990s. Stanovich originally classified dysrationalia as a learning disability and characterized it as a difficulty in belief formation, in assessing belief consistency, or in the determination of action to achieve one's goals. [2] However, special education researcher Kenneth Kavale noted that dysrationalia may be more aptly categorized as a thinking disorder, rather than a learning disability, because it does not have a direct impact upon academic performance. [3]
Psychologist Robert Sternberg argued that the construct of dysrationalia needed to be better conceptualized since it lacked a theoretical framework (explaining why people are dysrational and how they become this way) and operationalization (how dysrationalia could be measured). [4] [5] Sternberg also noted that the concept had the potential for misuse, as one may label another as dysrational simply because he or she does not agree with the other person's view: "I am afraid that Stanovich has fallen into a trap—that of labeling people as 'dysrational' who have beliefs that he does not accept. And therein lies frightening potential for misuse." [4] : 23
Stanovich then replied to both Kavale [6] and Sternberg. [7] In response to Sternberg's concern about the construct's potential for misuse, Stanovich said that in that respect it is no different from other constructs such as intelligence, which is a construct that Sternberg himself uses. [7] Stanovich emphasized that use of the dysrationalia construct should be carefully based on rigorous standards of epistemic justification that do not depend solely on social agreement or disagreement and that refer to the process of justifying beliefs, not to the content of beliefs themselves. [7] Stanovich and his colleagues further developed the theoretical framework for, and operationalization of, dysrationalia in later books.
In 2002 Sternberg edited a book, Why Smart People Can Be So Stupid, in which the dysrationalia concept was extensively discussed. [8] In his 2009 book What Intelligence Tests Miss, Stanovich provided the detailed conceptualization that Sternberg called for in his earlier critique. [9] In that book, Stanovich showed that variation in rational thinking skills is surprisingly independent of intelligence. One implication of this finding is that dysrationalia should not be rare.
Stanovich proposed two concepts related to dysrationalia: mindware gap and contaminated mindware. [10]
A mindware gap results from gaps in education and experience: a person lacks, or has only limited, knowledge of logic, probability theory, or scientific method to draw on when forming beliefs or making decisions. Because of these gaps, intelligent people can make seemingly irrational decisions.
Contaminated mindware describes how intelligent people come to believe irrational ideologies, conspiracy theories, pseudosciences, or get-rich-quick schemes. A person can acquire contaminated mindware through misplaced trust in heuristics or through fallacious reasoning.
One example that Stanovich related to dysrationalia centers on two former Illinois schoolteachers who pulled their children from the local public school because discussions of the Holocaust are part of the school's history curriculum. [1] : 503 These parents, presumably competent given their college education, believed that the Holocaust is a myth and should not be taught to their children. This is an example of a problem in belief formation regardless of intelligence.
A survey on paranormal belief was given to Canadian Mensa club members, whose membership is granted solely on the basis of high IQ scores. The results showed that 44% of the members believed in astrology, 51% believed in biorhythms, and 56% believed in the existence of extraterrestrial visitors. Stanovich argued that these beliefs have no valid evidence and thus might be an example of dysrationalia. [1] : 503 Sternberg countered that "No one has yet conclusively proven any of these beliefs to be false", so endorsement of the beliefs should not be considered evidence of dysrationalia. [5] Stanovich's rebuttal explained that the purpose of the example was to question the epistemic rationality of the process by which people arrived at their unlikely conclusions, that is, how they evaluated the quality of arguments and evidence for and against each conclusion, not to assume irrationality from the content of the conclusions alone. [7]
There are many examples of people who are famous for their intelligence but who nonetheless display irrational behavior. Two examples cited by Stanovich were Martin Heidegger and William Crookes. Heidegger, a renowned philosopher, was also a Nazi apologist and "used the most specious of arguments to justify his beliefs". [1] : 503 Crookes, a famous scientist who discovered the element thallium and was a Fellow of the Royal Society, "was repeatedly duped by spiritualist 'mediums' but never gave up his belief in spiritualism". [1] : 503 Science journalist David Robson cited the example of Kary Mullis, an American biochemist and 1993 Nobel Prize winner who was also an astrology supporter and a climate change and HIV/AIDS denier. [11]
Reason is the capacity of consciously applying logic by drawing conclusions from new or existing information, with the aim of seeking the truth. It is closely associated with such characteristically human activities as philosophy, science, language, mathematics, and art, and is normally considered to be a distinguishing ability possessed by humans. Reason is sometimes referred to as rationality.
A cognitive bias is a systematic pattern of deviation from norm or rationality in judgment. Individuals create their own "subjective reality" from their perception of the input. An individual's construction of reality, not the objective input, may dictate their behavior in the world. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, or what is broadly called irrationality.
Confirmation bias is the tendency to search for, interpret, favor, and recall information in a way that confirms or supports one's prior beliefs or values. People display this bias when they select information that supports their views, ignoring contrary information, or when they interpret ambiguous evidence as supporting their existing attitudes. The effect is strongest for desired outcomes, for emotionally charged issues, and for deeply entrenched beliefs. Confirmation bias cannot be eliminated, but it can be managed, for example, by education and training in critical thinking skills.
Rationality is the quality of being guided by or based on reasons. In this regard, a person acts rationally if they have a good reason for what they do or a belief is rational if it is based on strong evidence. This quality can apply to an ability, as in rational animal, to a psychological process, like reasoning, to mental states, such as beliefs and intentions, or to persons who possess these other forms of rationality. A thing that lacks rationality is either arational, if it is outside the domain of rational evaluation, or irrational, if it belongs to this domain but does not fulfill its standards.
Bounded rationality is the idea that rationality is limited when individuals make decisions, and under these limitations, rational individuals will select a decision that is satisfactory rather than optimal.
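The satisfactory-rather-than-optimal choice rule described above is often called "satisficing". As an illustrative sketch (not from the article, with made-up option values and a hypothetical aspiration threshold), the contrast between a satisficer and an optimizer can be expressed as:

```python
# Illustrative sketch of satisficing vs. optimizing under bounded rationality.
# Option names, values, and the aspiration threshold are invented examples.

def satisfice(options, aspiration):
    """Return the first option whose value meets the aspiration level,
    stopping the search as soon as a 'good enough' option is found."""
    for name, value in options:
        if value >= aspiration:
            return name
    return None  # no option was good enough

def optimize(options):
    """Return the best option, which requires examining every option."""
    best_name, _ = max(options, key=lambda item: item[1])
    return best_name

options = [("apartment A", 6), ("apartment B", 8), ("apartment C", 9)]

print(satisfice(options, aspiration=7))  # stops at apartment B
print(optimize(options))                 # scans all and picks apartment C
```

The satisficer accepts the first acceptable option and so can stop early, while the optimizer must evaluate every alternative, which is the cognitive cost that bounded rationality says real decision-makers often avoid.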
The theory of multiple intelligences proposes the differentiation of human intelligence into specific modalities of intelligence, rather than defining intelligence as a single, general ability. The theory has been criticized by mainstream psychology for its lack of empirical evidence, and its dependence on subjective judgement.
Critical thinking is the analysis of available facts, evidence, observations, and arguments to form a judgement. The subject is complex; several different definitions exist, which generally include the rational, skeptical, and unbiased analysis or evaluation of factual evidence. Critical thinking is self-directed, self-disciplined, self-monitored, and self-corrective thinking. It presupposes assent to rigorous standards of excellence and mindful command of their use. It entails effective communication and problem-solving abilities as well as a commitment to overcome native egocentrism and sociocentrism.
Intelligence has been defined in many ways: the capacity for abstraction, logic, understanding, self-awareness, learning, emotional knowledge, reasoning, planning, creativity, critical thinking, and problem-solving. More generally, it can be described as the ability to perceive or infer information, and to retain it as knowledge to be applied towards adaptive behaviors within an environment or context.
Irrationality is cognition, thinking, talking, or acting without the inclusion of rationality. It is more specifically described as an action or opinion arrived at through inadequate use of reason, or through emotional distress or cognitive deficiency. The term is used, usually pejoratively, to describe thinking and actions that are, or appear to be, less useful or more illogical than other, more rational alternatives.
Keith E. Stanovich is a Canadian psychologist. He is an Emeritus Professor of Applied Psychology and Human Development at the University of Toronto and former Canada Research Chair of Applied Cognitive Science. His research areas are the psychology of reasoning and the psychology of reading. His research in the field of reading was fundamental to the emergence of today's scientific consensus about what reading is, how it works, and what it does for the mind. His research on the cognitive basis of rationality has been featured in the journal Behavioral and Brain Sciences and in recent books by Yale University Press and University of Chicago Press. His book What Intelligence Tests Miss won the 2010 Grawemeyer Award in Education. He received the 2012 E. L. Thorndike Career Achievement Award from the American Psychological Association.
Belief bias is the tendency to judge the strength of arguments based on the plausibility of their conclusion rather than on how strongly they support that conclusion. A person is more likely to accept an argument that supports a conclusion aligned with their values, beliefs, and prior knowledge, while rejecting counterarguments to that conclusion. Belief bias is an extremely common and therefore significant form of error; we can easily be blinded by our beliefs and reach the wrong conclusion. Belief bias has been found to influence various reasoning tasks, including conditional reasoning, relational reasoning, and transitive reasoning.
In psychology, a dual process theory provides an account of how thought can arise in two different ways, or as a result of two different processes. Often, the two processes consist of an implicit (automatic), unconscious process and an explicit (controlled), conscious process. Verbalized explicit processes or attitudes and actions may change with persuasion or education, though implicit processes or attitudes usually take a long time to change, requiring the formation of new habits. Dual process theories can be found in social, personality, cognitive, and clinical psychology. They have also been linked with economics via prospect theory and behavioral economics, and increasingly with sociology through cultural analysis.
In psychology, the human mind is considered to be a cognitive miser due to the tendency of humans to think and solve problems in simpler and less effortful ways rather than in more sophisticated and effortful ways, regardless of intelligence. Just as a miser seeks to avoid spending money, the human mind often seeks to avoid spending cognitive effort. The cognitive miser theory is an umbrella theory of cognition that brings together previous research on heuristics and attributional biases to explain when and why people are cognitive misers.
Stupidity is a lack of intelligence, understanding, reason, or wit. It may be innate, assumed or reactive. The word stupid comes from the Latin word stupere. Stupid characters are often used for comedy in fictional stories. Walter B. Pitkin called stupidity "evil", but in a more Romantic spirit William Blake and Carl Jung believed stupidity can be the mother of wisdom.
The psychology of reasoning is the study of how people reason, often broadly defined as the process of drawing conclusions to inform how people solve problems and make decisions. It overlaps with psychology, philosophy, linguistics, cognitive science, artificial intelligence, logic, and probability theory.
Neurath's boat is a simile used in anti-foundational accounts of knowledge, especially in the philosophy of science. It was first formulated by Otto Neurath. It is based in part on the Ship of Theseus, which, however, is standardly used to illustrate other philosophical questions concerning problems of identity. It was popularised by Willard Van Orman Quine in Word and Object (1960).
The Rationality Debate—also called the Great Rationality Debate—is the question of whether humans are rational or not. This issue is a topic in the study of cognition and is important in fields such as economics where it is relevant to the theories of market efficiency.