Peter Cathcart Wason

Peter Wason
Born: 22 April 1924
Died: 17 April 2003 (aged 78)
Resting place: Highgate Cemetery
Nationality: English
Citizenship: British
Alma mater: Oxford; University College, London
Known for: Psychology of reasoning
Relatives: Sydney Rigby Wason (uncle)
Scientific career
Fields: Psychology
Institutions: University of Aberdeen; University College, London
Influences: Karl Popper, Jean Piaget

Peter Cathcart Wason (22 April 1924 – 17 April 2003) was a cognitive psychologist at University College, London, who pioneered the psychology of reasoning. He advanced explanations for why people make certain consistent mistakes in logical reasoning, and he designed problems and tests to demonstrate these processes, such as the Wason selection task, the THOG problem and the 2-4-6 problem. He also coined the term "confirmation bias" [1] to describe the tendency to favor information that validates one's preconceptions, hypotheses and personal beliefs regardless of whether they are true.


Grave of Peter Cathcart Wason in Highgate Cemetery (east side)

Personal life

Wason was born in Bath, Somerset, on 22 April 1924, and died at seventy-eight in Wallingford, Oxfordshire, on 17 April 2003. He was the grandson of Eugene Wason [2] and the son of Eugene Monier and Kathleen (Woodhouse) Wason. [3] Wason married Marjorie Vera Salberg in 1951, and the couple had two children, Armorer and Sarah. [3] His uncle was Lieutenant General Sydney Rigby Wason.

Wason endured his schooling, which was marked by consistent failure. [2] At the outbreak of World War II, he completed officer training at Sandhurst and then served as a liaison officer for the 8th Armoured Brigade, by then an independent brigade. [3] He returned home in 1945, discharged from service because of serious injuries. Wason then turned to academic life, studying English at Oxford in 1948, [3] and went on to become a lecturer at the University of Aberdeen. After realizing that English did not really interest him, Wason returned to Oxford to take a master's degree in psychology in 1953, followed by a doctorate from University College London in 1956. [3] He taught at University College London until his retirement in the early 1980s. [2]

Early studies

Much of Peter Wason’s earliest experimentation was not in the psychology of reasoning but in language and psycholinguistics. Wason and Jones performed an experiment in which subjects evaluated numerical statements, such as “7 is even” and “9 is not odd”, and stated whether each was true or false. The results revealed that affirmative assertions were evaluated faster when true than when false, whereas negative assertions were evaluated faster when false than when true. [4] From these results, Wason concluded that negatives are used in everyday discourse to correct common misconceptions, as in “The chair is not here”. Wason continued to explore and experiment in psycholinguistics. Working alongside Susan Carey [5] at the Harvard Center for Cognitive Studies, he found that context affects the comprehension of an utterance, as measured by response time: participants responded more quickly to the statement “Circle number 4 is not blue” in a context in which all of the other circles were red. [6]

The beginning of the psychology of reasoning

Before the emergence of the psychology of reasoning, it was commonly held that humans reason by logical analysis. Wason argued against this logicism, holding that humans frequently fail to reason logically and fall prey to biases. Wason thought that many things in his own life were inconsistent and therefore unreasonable. [7] In designing his experiments, Wason's goal was to examine the illogical nature of human thought. He also wanted to look further into confirmation bias, the tendency to seek to prove one's hypothesis rather than to disprove it.

Wason and the 2-4-6 task

In 1960 Wason developed the first of many tasks he would devise to reveal the failures of human reasoning. The “2-4-6” task was the first of his experiments to show people reasoning illogically. Subjects were told that the experimenter had in mind a rule that applied to triples of numbers, and that the triple 2-4-6 conformed to it; they were to discover the rule by proposing triples of their own. The rule was simply “any ascending sequence”. In most cases, subjects not only formed hypotheses that were more specific than necessary, but also tested only positive examples of their hypotheses. Wason was surprised by how many subjects failed the task. By not testing instances inconsistent with their own hypotheses, the subjects lent further support to Wason’s notion of confirmation bias. [8]
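The dynamic of the task can be sketched in a few lines of code. The hypothesis "numbers increasing by two" below is an illustrative assumption (a typical over-specific guess, not one attributed to any particular subject): every triple chosen to confirm it also satisfies the hidden rule, so positive testing alone never exposes the error.

```python
def rule(triple):
    """The experimenter's hidden rule: any strictly ascending sequence."""
    a, b, c = triple
    return a < b < c

def hypothesis(triple):
    """An over-specific hypothesis a subject might form: 'increasing by 2'."""
    a, b, c = triple
    return b - a == 2 and c - b == 2

# Positive tests: triples chosen to CONFIRM the hypothesis.
positive_tests = [(2, 4, 6), (10, 12, 14), (1, 3, 5)]

# Every positive test agrees with the hidden rule, so the subject receives
# only "yes" feedback and never learns the hypothesis is too narrow.
assert all(rule(t) and hypothesis(t) for t in positive_tests)

# A negative test -- one the hypothesis says should fail -- exposes the gap:
# (1, 2, 3) violates "increasing by 2" yet still fits the hidden rule.
assert not hypothesis((1, 2, 3)) and rule((1, 2, 3))
```

Only the negative test discriminates between the subject's hypothesis and the experimenter's rule, which is precisely the test most subjects never ran.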

The four-card task

Wason created the selection task, also known as the four-card task, in 1966. Participants were shown four cards on a table and given a rule by the experimenter. They were then asked to choose just those cards that had to be turned over to determine whether the rule was true or false. As Wason expected, the majority of participants failed; only ten percent solved the task correctly. [9] Confirmation bias played a large part in this result, as participants usually chose cards that could confirm their hypothesis rather than eliminate it.


The THOG task

Wason devised yet another task, the THOG task, to further his studies in the psychology of reasoning. Participants were shown cards bearing a white diamond, a black diamond, a white circle and a black circle. They were then given a rule and instructed to decide which of the cards was a THOG, which were not, and which could not be classified. The THOG task requires a combinatorial analysis, a feat an adult should be able to accomplish using reason and logic; nevertheless, half of the participants answered the problem incorrectly. [7]

Approach to experimentation

Peter Wason took a rather unconventional, hands-on approach to his studies. Although he had laboratory assistants, he insisted on being present when experiments were run so that he could watch his subjects’ behavior throughout the process. Wason also brought a clinical-psychology atmosphere to his studies by asking subjects how they felt about the experiment itself and about the results. These evaluations were recorded in his papers, giving them a more personal and distinctive feel than many academic papers of the time. Wason’s goal was to discover new psychological phenomena and new aspects of human behavior, not only to test his own hypotheses.


Related Research Articles

Social psychology

Social psychology is the scientific study of how thoughts, feelings, and behaviors are influenced by the real or imagined presence of other people or by social norms. Social psychologists typically explain human behavior as a result of the relationship between mental states and social situations, studying the social conditions under which thoughts, feelings, and behaviors occur, and how these variables influence social interactions.

Confirmation bias is the tendency to search for, interpret, favor, and recall information in a way that confirms or supports one's prior beliefs or values. People display this bias when they select information that supports their views, ignoring contrary information, or when they interpret ambiguous evidence as supporting their existing attitudes. The effect is strongest for desired outcomes, for emotionally charged issues, and for deeply entrenched beliefs. Confirmation bias cannot be eliminated, but it can be managed, for example, by education and training in critical thinking skills.

Deductive reasoning is the mental process of drawing deductive inferences. An inference is deductively valid if its conclusion follows logically from its premises, i.e. if it is impossible for the premises to be true and the conclusion to be false. For example, the inference from the premises "all men are mortal" and "Socrates is a man" to the conclusion "Socrates is mortal" is deductively valid. An argument is sound if it is valid and all its premises are true. Some theorists define deduction in terms of the intentions of the author: they have to intend for the premises to offer deductive support to the conclusion. With the help of this modification, it is possible to distinguish valid from invalid deductive reasoning: it is invalid if the author's belief about the deductive support is false, but even invalid deductive reasoning is a form of deductive reasoning.

Wishful thinking

Wishful thinking is the formation of beliefs based on what might be pleasing to imagine, rather than on evidence, rationality, or reality. It is a product of resolving conflicts between belief and desire.

Hindsight bias, also known as the knew-it-all-along phenomenon or creeping determinism, is the common tendency for people to perceive past events as having been more predictable than they actually were. People often believe that after an event has occurred, they would have predicted or perhaps even would have known with a high degree of certainty what the outcome of the event would have been before the event occurred. Hindsight bias may cause distortions of memories of what was known or believed before an event occurred, and is a significant source of overconfidence regarding an individual's ability to predict the outcomes of future events. Examples of hindsight bias can be seen in the writings of historians describing outcomes of battles, physicians recalling clinical trials, and in judicial systems as individuals attribute responsibility on the basis of the supposed predictability of accidents.

In statistics, hypotheses suggested by a given dataset, when tested with the same dataset that suggested them, are likely to be accepted even when they are not true. This is because circular reasoning would be involved: something seems true in the limited data set; therefore we hypothesize that it is true in general; therefore we wrongly test it on the same, limited data set, which seems to confirm that it is true. Generating hypotheses based on data already observed, in the absence of testing them on new data, is referred to as post hoc theorizing.

Egocentric bias is the tendency to rely too heavily on one's own perspective and/or have a higher opinion of oneself than reality. It appears to be the result of the psychological need to satisfy one's ego and to be advantageous for memory consolidation. Research has shown that experiences, ideas, and beliefs are more easily recalled when they match one's own, causing an egocentric outlook. Michael Ross and Fiore Sicoly first identified this cognitive bias in their 1979 paper, "Egocentric biases in availability and attribution". Egocentric bias is referred to by most psychologists as a general umbrella term under which other related phenomena fall.

Wason selection task

The Wason selection task is a logic puzzle devised by Peter Cathcart Wason in 1966. It is one of the most famous tasks in the study of deductive reasoning. An example of the puzzle is:

You are shown a set of four cards placed on a table, each of which has a number on one side and a colored patch on the other side. The visible faces of the cards show 3, 8, red and brown. Which card(s) must you turn over in order to test the truth of the proposition that if a card shows an even number on one face, then its opposite face is red?
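The correct choice can be derived mechanically: a card must be turned over only if its hidden face could falsify the conditional, i.e. only a visible even number (which might hide a non-red patch) or a visible non-red patch (which might hide an even number). A minimal sketch, with card faces represented as strings for illustration:

```python
# Each card shows one face; the other side is hidden.
visible = ["3", "8", "red", "brown"]

def must_turn(face):
    """True if the hidden side of this card could falsify the rule
    'if a card shows an even number, then its opposite face is red'."""
    if face.isdigit():
        # An even number could conceal a non-red patch.
        return int(face) % 2 == 0
    # A non-red patch could conceal an even number; red proves nothing.
    return face != "red"

print([f for f in visible if must_turn(f)])  # ['8', 'brown']
```

The red card, which most participants pick, is logically irrelevant: whatever number is on its back, the conditional is not violated.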

Observer-expectancy effect

The observer-expectancy effect is a form of reactivity in which a researcher's cognitive bias causes them to subconsciously influence the participants of an experiment. Confirmation bias can lead to the experimenter interpreting results incorrectly because of the tendency to look for information that conforms to their hypothesis, and overlook information that argues against it. It is a significant threat to a study's internal validity, and is therefore typically controlled using a double-blind experimental design.

Problem solving

Problem solving is the process of achieving a goal by overcoming obstacles, a frequent part of most activities. Problems in need of solutions range from simple personal tasks to complex issues in business and technical fields. The former is an example of simple problem solving (SPS) addressing one issue, whereas the latter is complex problem solving (CPS) with multiple interrelated obstacles. Another classification is into well-defined problems with specific obstacles and goals, and ill-defined problems in which the current situation is troublesome but it is not clear what kind of resolution to aim for. Similarly, one may distinguish formal or fact-based problems requiring psychometric intelligence, versus socio-emotional problems which depend on the changeable emotions of individuals or groups, such as tactful behavior, fashion, or gift choices.

Belief bias is the tendency to judge the strength of arguments based on the plausibility of their conclusion rather than how strongly they support that conclusion. A person is more likely to accept an argument that supports a conclusion that aligns with their values, beliefs and prior knowledge, while rejecting counter arguments to the conclusion. Belief bias is an extremely common and therefore significant form of error; we can easily be blinded by our beliefs and reach the wrong conclusion. Belief bias has been found to influence various reasoning tasks, including conditional reasoning, relation reasoning and transitive reasoning.

Congruence bias is the tendency of people to over-rely on testing their initial hypothesis while neglecting to test alternative hypotheses. That is, people rarely try experiments that could disprove their initial belief, but rather try to repeat their initial results. It is a special case of the confirmation bias.

Demand characteristics

In social research, particularly in psychology, the term demand characteristic refers to an experimental artifact where participants form an interpretation of the experiment's purpose and subconsciously change their behavior to fit that interpretation. Typically, demand characteristics are considered an extraneous variable, exerting an effect on behavior other than that intended by the experimenter. Pioneering research was conducted on demand characteristics by Martin Orne.

In psychology, a dual process theory provides an account of how thought can arise in two different ways, or as a result of two different processes. Often, the two processes consist of an implicit (automatic), unconscious process and an explicit (controlled), conscious process. Verbalized explicit processes or attitudes may change with persuasion or education, whereas implicit processes or attitudes usually take a long time to change through the forming of new habits. Dual process theories can be found in social, personality, cognitive, and clinical psychology. They have also been linked with economics via prospect theory and behavioral economics, and increasingly with sociology through cultural analysis.

Psychology of reasoning

The psychology of reasoning is the study of how people reason, often broadly defined as the process of drawing conclusions to inform how people solve problems and make decisions. It overlaps with psychology, philosophy, linguistics, cognitive science, artificial intelligence, logic, and probability theory.

Hypothesis

A hypothesis is a proposed explanation for a phenomenon. For a hypothesis to be a scientific hypothesis, the scientific method requires that one can test it. Scientists generally base scientific hypotheses on previous observations that cannot satisfactorily be explained with the available scientific theories. Even though the words "hypothesis" and "theory" are often used interchangeably, a scientific hypothesis is not the same as a scientific theory. A working hypothesis is a provisionally accepted hypothesis proposed for further research in a process beginning with an educated guess or thought.

Introspection illusion

The introspection illusion is a cognitive bias in which people wrongly think they have direct insight into the origins of their mental states, while treating others' introspections as unreliable. The illusion has been examined in psychological experiments, and suggested as a basis for biases in how people compare themselves to others. These experiments have been interpreted as suggesting that, rather than offering direct access to the processes underlying mental states, introspection is a process of construction and inference, much as people indirectly infer others' mental states from their behaviour.

Raymond S. Nickerson is an American psychologist and author. He was a senior vice president at BBN Technologies, from which he is retired. He is now a research professor in the Psychology Department at Tufts University. He has authored several books and is the founding editor of the Journal of Experimental Psychology: Applied.

Collective induction is a task developed by Steiner and used in research on group problem solving. Broadly, the method entails "the cooperative search for descriptive, predictive, and explanatory generalizations, rules, and principles" among members of a group working on the same task. James Larson further defined collective induction tasks as "[tasks] in which problem solvers work cooperatively to induce a general rule or principle that can account parsimoniously for a given set of facts or observations". This process has been used to determine whether groups are better problem solvers than individuals.

Jonathan St B. T. Evans is a British cognitive psychologist, currently Emeritus Professor of Psychology at the University of Plymouth. In 1975, with Peter Wason, Evans proposed one of the first dual-process theories of reasoning, an idea later developed and popularized by Daniel Kahneman. In a 2011 Festschrift, Evans' peers described him as "one of the most influential figures in the psychology of human reasoning".


  1. "Peter Wason". The Telegraph. Retrieved 24 November 2014.
  2. "Peter Wason". The Guardian. 25 April 2003. Retrieved 24 November 2014.
  3. "Wason, Peter Cathcart". Gale Literature: Contemporary Authors. Gale. 2003. Retrieved 8 November 2022.
  4. Wason, Peter; Jones, Sheila (1963). "Negatives: Denotation and Connotation". British Journal of Psychology. 54 (4): 299–307. doi:10.1111/j.2044-8295.1963.tb00885.x. PMID 14079021.
  5. Wason, Peter; Carey, Susan (February 1965). "The contexts of plausible denial". Journal of Verbal Learning and Verbal Behavior. 4 (1): 7–11. doi:10.1016/s0022-5371(65)80060-3.
  6. Newstead, S (2003). "Peter Wason (1924–2003)". Thinking and Reasoning. 9 (3): 177–184. doi:10.1080/13546780244000141. S2CID 144447692.
  7. Newstead, Stephen; Evans, Jonathan St. B.T. (1 July 1995). Perspectives on Thinking and Reasoning: Essays in Honor of Peter Wason. Sussex, UK: Lawrence Erlbaum Associates Ltd. ISBN 978-0863773587.
  8. Wason, Peter (1960). "On the Failure to Eliminate Hypotheses in a Conceptual Task". Quarterly Journal of Experimental Psychology. 12 (3): 129–140. doi:10.1080/17470216008416717. S2CID 19237642.
  9. Chater, N; Oaksford, M (2001). "Human rationality and the psychology of reasoning: Where do we go from here?". British Journal of Psychology. 92: 193–216. doi:10.1348/000712601162031.