Three men make a tiger

"Three men make a tiger" (Chinese: 三人成虎; pinyin: sān rén chéng hǔ) is a Chinese proverb or chengyu (four-character idiom). It refers to an individual's tendency to accept absurd information as long as it is repeated by enough people: if an unfounded premise or urban legend is mentioned and repeated by many individuals, the premise will be erroneously accepted as truth. This concept is related to communal reinforcement and to the fallacies of argumentum ad populum and argumentum ad nauseam.



The proverb came from the story of an alleged speech by Pang Cong (龐蔥), an official of the state of Wei in the Warring States period (475 BC – 221 BC) of Chinese history. According to the Warring States Records, or Zhan Guo Ce, before he left on a trip to the state of Zhao, Pang Cong asked the King of Wei whether he would hypothetically believe one civilian's report that a tiger was roaming the markets in the capital city, to which the King replied no. Pang Cong asked what the King would think if two people reported the same thing, and the King said he would begin to wonder. Pang Cong then asked, "What if three people all claimed to have seen a tiger?" The King replied that he would believe it. Pang Cong reminded the King that the notion of a live tiger in a crowded market was absurd, yet when repeated by numerous people, it seemed real.

Since Pang Cong, as a high-ranking official, had more than three opponents and critics, he was in fact urging the King to pay no attention to those who would spread rumors about him (Pang Cong) while he was away. "I understand", the King replied, and Pang Cong left for Zhao. Yet slanderous talk did take place, and when Pang Cong returned to Wei, the King indeed refused to see him. [1]

Cognitive biases

The tendency to accept absurd information stems from certain cognitive biases. The first is motivated reasoning, an emotion-biased decision-making phenomenon: humans are motivated to believe whatever confirms their existing opinions, and over time motivated reasoning can produce a false social consensus. The second is social consensus reality: beliefs with high societal consensus are treated like facts, whereas beliefs with relatively low consensus are more susceptible to persuasion and attitude change. The latter is most likely a product of the social consensus of the specific community in which one lives. [2]

Examples from economics

One economic application of the cognitive biases highlighted by the anecdote concerns market efficiency. Investors often jump on a bandwagon, buying or shorting a certain stock or index mainly because many other investors are behaving the same way. In the short term, when many investors buy a certain stock, the market experiences a self-fulfilling prophecy and the stock actually gains value, even though the company may be underperforming and merely benefiting from current market trends. Investors who make such decisions base their justification not on fundamental analysis or other concrete information, but mainly on an investment trend exhibited by a large number of other investors. [3]

Related Research Articles

Ad hominem: argumentative strategies, usually fallacious

Ad hominem, short for argumentum ad hominem, refers to several types of arguments, most of which are fallacious.

Cognitive bias: systematic pattern of deviation from norm or rationality in judgment

A cognitive bias is a systematic pattern of deviation from norm or rationality in judgment. Individuals create their own "subjective reality" from their perception of the input. An individual's construction of reality, not the objective input, may dictate their behavior in the world. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, or what is broadly called irrationality.

A fallacy is the use of invalid or otherwise faulty reasoning, or "wrong moves" in the construction of an argument which may appear stronger than it really is if the fallacy is not spotted. The term in the Western intellectual tradition was introduced in the Aristotelian De Sophisticis Elenchis.

Sunk cost: cost that has already been incurred and cannot be recovered

In economics and business decision-making, a sunk cost is a cost that has already been incurred and cannot be recovered. Sunk costs are contrasted with prospective costs, which are future costs that may be avoided if action is taken. In other words, a sunk cost is a sum paid in the past that is no longer relevant to decisions about the future. Even though economists argue that sunk costs are no longer relevant to future rational decision-making, people in everyday life often take previous expenditures in situations, such as repairing a car or house, into their future decisions regarding those properties.

Behavioral economics: academic discipline

Behavioral economics studies the effects of psychological, cognitive, emotional, cultural and social factors on the decisions of individuals or institutions, such as how those decisions vary from those implied by classical economic theory.

The conventional wisdom or received opinion is the body of ideas or explanations generally accepted by the public and/or by experts in a field. In religion, this is known as orthodoxy.

Communal reinforcement is a social phenomenon in which a concept or idea is repeatedly asserted in a community, regardless of whether sufficient empirical evidence has been presented to support it. Over time, the concept or idea is reinforced to become a strong belief in many people's minds, and may be regarded by the members of the community as fact. Often, the concept or idea may be further reinforced by publications in the mass media, books, or other means of communication. The phrase "millions of people can't all be wrong" is indicative of the common tendency to accept a communally reinforced idea without question, which often aids in the widespread acceptance of factoids. A similarly named but distinct term is community reinforcement, a behavioral method for treating drug addiction.

Wishful thinking: formation of beliefs based on what might be pleasing to imagine

Wishful thinking is the formation of beliefs based on what might be pleasing to imagine, rather than on evidence, rationality, or reality. It is a product of resolving conflicts between belief and desire.

In psychology, an attribution bias or attributional bias is a cognitive bias that refers to the systematic errors made when people evaluate or try to find reasons for their own and others' behaviors. People constantly make attributions—judgements and assumptions about why people behave in certain ways. However, attributions do not always accurately reflect reality. Rather than operating as objective perceivers, people are prone to perceptual errors that lead to biased interpretations of their social world. Attribution biases are present in everyday life. For example, when a driver cuts someone off, the person who has been cut off is often more likely to attribute blame to the reckless driver's inherent personality traits rather than situational circumstances. Additionally, there are many different types of attribution biases, such as the ultimate attribution error, fundamental attribution error, actor-observer bias, and hostile attribution bias. Each of these biases describes a specific tendency that people exhibit when reasoning about the cause of different behaviors.

Herd, mob, or pack mentality describes how people can be influenced by their peers to adopt certain behaviors on a largely emotional, rather than rational, basis. When individuals are affected by mob mentality, they may make different decisions than they would have individually.

The overconfidence effect is a well-established bias in which a person's subjective confidence in his or her judgments is reliably greater than the objective accuracy of those judgments, especially when confidence is relatively high. Overconfidence is one example of a miscalibration of subjective probabilities. Throughout the research literature, overconfidence has been defined in three distinct ways: (1) overestimation of one's actual performance; (2) overplacement of one's performance relative to others; and (3) overprecision in expressing unwarranted certainty in the accuracy of one's beliefs.

The wisdom of the crowd is the collective opinion of a diverse independent group of individuals rather than that of a single expert. This process, while not new to the Information Age, has been pushed into the mainstream spotlight by social information sites such as Quora, Reddit, Stack Exchange, Wikipedia, Yahoo! Answers, and other web resources which rely on collective human knowledge. An explanation for this phenomenon is that there is idiosyncratic noise associated with each individual judgment, and taking the average over a large number of responses will go some way toward canceling the effect of this noise.
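The noise-cancellation argument above can be illustrated with a minimal simulation. This sketch is not from the article; the true value, noise level, and crowd size are hypothetical, chosen only to show that averaging many independent, unbiased guesses yields an estimate far closer to the truth than a typical single guess.

```python
import random

random.seed(42)

TRUE_VALUE = 100.0  # hypothetical quantity being estimated (e.g., beans in a jar)

def individual_guess():
    # Each person's guess equals the true value plus idiosyncratic noise.
    return TRUE_VALUE + random.gauss(0, 20)

def crowd_estimate(n):
    # Averaging n independent guesses shrinks the noise by roughly sqrt(n).
    guesses = [individual_guess() for _ in range(n)]
    return sum(guesses) / n

single_error = abs(individual_guess() - TRUE_VALUE)
crowd_error = abs(crowd_estimate(1000) - TRUE_VALUE)
print(f"typical single-guess error: {single_error:.1f}")
print(f"crowd-average error:        {crowd_error:.1f}")
```

The key assumption is independence: the errors cancel only if guesses are not correlated. When individuals copy one another, as in "three men make a tiger", the shared error is reinforced rather than averaged away.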

The "hot hand" is a phenomenon, previously considered a cognitive social bias, that a person who experiences a successful outcome has a greater chance of success in further attempts. The concept is often applied to sports and skill-based tasks in general and originates from basketball, where a shooter is more likely to score if their previous attempts were successful; i.e., while having the "hot hand". While previous success at a task can indeed change the psychological attitude and subsequent success rate of a player, researchers for many years did not find evidence for a "hot hand" in practice, dismissing it as fallacious. However, later research questioned whether the belief is indeed a fallacy. Some recent studies using modern statistical analysis have observed evidence for the "hot hand" in some sporting activities; however, other recent studies have not observed evidence of the "hot hand". Moreover, evidence suggests that only a small subset of players may show a "hot hand" and, among those who do, the magnitude of the "hot hand" tends to be small.

In argumentation theory, an argumentum ad populum is a fallacious argument which is based on claiming a truth or affirming something is good because the majority thinks so.

An argument from authority, also called an appeal to authority, or argumentum ad verecundiam, is a form of argument in which the opinion of an authority on a topic is used as evidence to support an argument. Some consider that it is used in a cogent form if all sides of a discussion agree on the reliability of the authority in the given context, and others consider it to be a fallacy to cite the views of an authority on the discussed topic as a means of supporting an argument.

Woozle effect: false credibility due to quantity of citations

The Woozle effect, also known as evidence by citation, occurs when a source is widely cited for a claim it does not adequately support, giving said claim undeserved credibility. If replication studies are not done and no one notices that a key claim was never well-supported in its original publication, faulty assumptions may affect further research.


  1. "It is clear that there is no tiger in the market-place, and yet three men's words would make a tiger" (PDF). Archived from the original (PDF) on 2014-01-12. Retrieved 2012-09-26.
  2. Waytz, Adam (March 6, 2017). "The Psychology Behind Fake News". KelloggInsight.
  3. M. E. Landry. "Investors Must Watch Out for Self-Fulfilling Prophecies" (PDF). Retrieved 25 Jun 2017.