Thinking, Fast and Slow

Hardcover edition
Author: Daniel Kahneman
Language: English
Subject: Psychology
Genre: Non-fiction
Publisher: Farrar, Straus and Giroux
Publication date: 2011
Publication place: United States
Media type: Print (hardcover, paperback), audio
Pages: 499
ISBN: 978-0374275631
OCLC: 706020998

Thinking, Fast and Slow is a 2011 popular science book by psychologist Daniel Kahneman. The book's main thesis is a differentiation between two modes of thought: "System 1" is fast, instinctive and emotional; "System 2" is slower, more deliberative, and more logical.

Summary

The book delineates rational and non-rational motivations or triggers associated with each type of thinking process, and how they complement each other, starting with Kahneman's own research on loss aversion. From framing choices to people's tendency to replace a difficult question with one that is easy to answer, the book summarizes several decades of research to suggest that people place too much confidence in human judgment. [1] Much of that research was carried out by Kahneman himself, often in collaboration with Amos Tversky, and it informs the book throughout. [2] [3] The book covers different phases of his career: his early work on cognitive biases, his work on prospect theory and happiness, and his work with the Israel Defense Forces.

The book was a New York Times bestseller [4] and was the 2012 winner of the National Academies Communication Award for best creative work that helps the public understanding of topics in behavioral science, engineering and medicine. [5] The integrity of some priming studies cited in the book has been called into question in the midst of the psychological replication crisis. [6]

Two systems

In the book's first section, Kahneman describes two different ways the brain forms thoughts: System 1, which is fast, automatic, frequent, emotional, stereotypic, and unconscious; and System 2, which is slow, effortful, infrequent, logical, calculating, and conscious.

Kahneman describes a number of experiments which purport to examine the differences between these two thought systems and how they arrive at different results even given the same inputs. Terms and concepts include coherence, attention, laziness, association, jumping to conclusions, WYSIATI (What you see is all there is), and how one forms judgments. The System 1 vs. System 2 debate concerns how much reasoning, or lack thereof, underlies human decision making, with big implications for many areas including law and market research. [7]

Heuristics and biases

The second section offers explanations for why humans struggle to think statistically. It begins by documenting a variety of situations in which we either arrive at binary decisions or fail to assign reasonable probabilities to outcomes. Kahneman explains this phenomenon using the theory of heuristics. Kahneman and Tversky originally discussed this topic in their 1974 article "Judgment Under Uncertainty: Heuristics and Biases". [8]

Kahneman uses heuristics to assert that System 1 thinking involves associating new information with existing patterns, or thoughts, rather than creating new patterns for each new experience. For example, a child who has only seen shapes with straight edges might perceive an octagon when first viewing a circle. As a legal metaphor, a judge limited to heuristic thinking would only be able to think of similar historical cases when presented with a new dispute, rather than considering the unique aspects of that case. In addition to offering an explanation for the statistical problem, the theory also offers an explanation for human biases.

Anchoring

The "anchoring effect" names a tendency to be influenced by irrelevant numbers. Shown greater/lesser numbers, experimental subjects gave greater/lesser responses. [2] As an example, most people, when asked whether Gandhi was more than 114 years old when he died, will provide a much greater estimate of his age at death than others who were asked whether Gandhi was more or less than 35 years old. Experiments show that people's behavior is influenced, much more than they are aware, by irrelevant information.

Availability

The availability heuristic is a mental shortcut that occurs when people make judgments about the probability of events on the basis of how easy it is to think of examples. The availability heuristic operates on the notion that, "if you can think of it, it must be important". The availability of consequences associated with an action is related positively to perceptions of the magnitude of the consequences of that action. In other words, the easier it is to recall the consequences of something, the greater we perceive these consequences to be. Sometimes, this heuristic is beneficial, but the frequencies at which events come to mind are usually not accurate representations of the probabilities of such events in real life. [9] [10]

Conjunction fallacy

System 1 is prone to substituting a simpler question for a difficult one. In what Kahneman terms their "best-known and most controversial" experiment, "the Linda problem," subjects were told about an imaginary Linda, young, single, outspoken, and intelligent, who, as a student, was very concerned with discrimination and social justice. Subjects were then asked whether it was more probable that Linda is a bank teller or that she is a bank teller and an active feminist. The overwhelming response was that "feminist bank teller" was more likely than "bank teller," violating the laws of probability (all feminist bank tellers are bank tellers, so the former cannot be more likely). In this case System 1 substituted the easier question, "Is Linda a feminist?", neglecting the occupation qualifier. An alternative interpretation is that the subjects added an unstated cultural implicature to the effect that the other answer implied an exclusive or, that Linda was not a feminist. [2]
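The probability law being violated can be stated in one line (notation added here only for illustration): $P(\text{bank teller} \wedge \text{feminist}) = P(\text{bank teller}) \cdot P(\text{feminist} \mid \text{bank teller}) \le P(\text{bank teller})$, since a conditional probability can never exceed 1, so the conjunction can be at most as probable as either of its parts.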

Optimism and loss aversion

Kahneman writes of a "pervasive optimistic bias", which "may well be the most significant of the cognitive biases." This bias generates the illusion of control: the illusion that we have substantial control of our lives.

A natural experiment reveals the prevalence of one kind of unwarranted optimism. The planning fallacy is the tendency to overestimate benefits and underestimate costs, impelling people to begin risky projects. In 2002, American kitchen remodeling was expected on average to cost $18,658, but actually cost $38,769. [2]
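Worked out, the overrun in that example is roughly a factor of two: $38{,}769 / 18{,}658 \approx 2.08$, i.e. actual costs exceeded the average estimate by about 108 percent (arithmetic added here for illustration).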

To explain overconfidence, Kahneman introduces the concept he terms What You See Is All There Is (WYSIATI). This theory states that when the mind makes decisions, it deals primarily with Known Knowns, phenomena it has observed already. It rarely considers Known Unknowns, phenomena that it knows to be relevant but about which it does not have information. Finally it appears oblivious to the possibility of Unknown Unknowns, unknown phenomena of unknown relevance.

He explains that humans fail to take into account complexity and that their understanding of the world consists of a small and necessarily unrepresentative set of observations. Furthermore, the mind generally does not account for the role of chance and therefore falsely assumes that a future event will be similar to a past event.

Framing

Framing is the context in which choices are presented. In one experiment, some subjects were asked whether they would opt for surgery if the "survival" rate was 90 percent, while others were told that the mortality rate was 10 percent. The first framing increased acceptance, even though the situation was no different. [11]
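The two framings describe the same odds: $P(\text{death}) = 1 - P(\text{survival}) = 1 - 0.90 = 0.10$, so a 90 percent survival rate and a 10 percent mortality rate are logically equivalent (equation added here for illustration).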

Sunk cost

Rather than consider the odds that an incremental investment would produce a positive return, people tend to "throw good money after bad" and continue investing in projects with poor prospects that have already consumed significant resources. In part this is to avoid feelings of regret. [11]

Overconfidence

This part (part III, sections 19–24) of the book is dedicated to the undue confidence in what the mind believes it knows. It suggests that people often overestimate how much they understand about the world and underestimate the role of chance in particular. This is related to the excessive certainty of hindsight, when an event seems to be understood after it has occurred or developed. Kahneman's opinions concerning overconfidence are influenced by Nassim Nicholas Taleb. [12]

Choices

In this section Kahneman returns to economics and expands his seminal work on prospect theory. He discusses the tendency for problems to be addressed in isolation and how, when other reference points are considered, the choice of that reference point (called a frame) has a disproportionate effect on the outcome. This section also offers advice on how some of the shortcomings of System 1 thinking can be avoided.

Prospect theory

Kahneman developed prospect theory, the basis for his Nobel prize, to account for experimental errors he noticed in Daniel Bernoulli's traditional utility theory. [13] According to Kahneman, utility theory makes logical assumptions of economic rationality that do not represent people's actual choices, and it does not take cognitive biases into account.

One example is that people are loss-averse: they are more likely to act to avert a loss than to achieve a gain. Another example is that the value people place on a change in probability (e.g., of winning something) depends on the reference point: people seem to place greater value on a change from 0% to 10% (going from impossibility to possibility) than on a change from, say, 45% to 55%, and they place the greatest value of all on a change from 90% to 100% (going from possibility to certainty). This occurs despite the fact that under traditional utility theory all three changes give the same increase in utility. Consistent with loss aversion, the ordering of the first and third changes is reversed when the event is presented as losing rather than winning something: there, the greatest value is placed on eliminating the probability of a loss entirely, reducing it to 0.
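The asymmetries described above can be sketched numerically. The functional forms and parameter values below (alpha, lambda, gamma) are taken from Tversky and Kahneman's later cumulative prospect theory work, not from the book itself, and are included only as an illustrative assumption:

```python
# Illustrative sketch of a prospect-theory-style value and probability-weighting
# function. Functional forms and parameters follow Tversky & Kahneman (1992)
# as an assumption; they are not specified in Thinking, Fast and Slow itself.

ALPHA = 0.88   # diminishing sensitivity to the size of gains and losses
LAMBDA = 2.25  # loss aversion: losses loom roughly 2.25x larger than gains
GAMMA = 0.61   # curvature of the probability-weighting function (for gains)

def value(x: float) -> float:
    """Subjective value of a gain (x > 0) or loss (x < 0) relative to a reference point."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** ALPHA)

def weight(p: float) -> float:
    """Decision weight for probability p: overweights small p, underweights large p."""
    return p ** GAMMA / ((p ** GAMMA + (1 - p) ** GAMMA) ** (1 / GAMMA))

if __name__ == "__main__":
    # Loss aversion: a $100 loss hurts more than a $100 gain pleases.
    print(value(100), value(-100))   # about 57.5 vs about -129.4

    # Equal 10-point probability changes receive unequal decision weights:
    for lo, hi in [(0.00, 0.10), (0.45, 0.55), (0.90, 1.00)]:
        print(f"{lo:.2f} -> {hi:.2f}: weight change = {weight(hi) - weight(lo):.3f}")
    # The change is largest for 0 -> 0.10 and 0.90 -> 1.00 and smallest for
    # 0.45 -> 0.55, matching the possibility and certainty effects described above.
```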

After the book's publication, the Journal of Economic Literature published a discussion of its parts concerning prospect theory, [14] as well as an analysis of the four fundamental factors on which it is based. [15]

Two selves

The fifth part of the book describes recent evidence that introduces a distinction between two selves, the 'experiencing self' and the 'remembering self'. [16] Kahneman proposed an alternative measure of well-being that assessed pleasure or pain sampled from moment to moment and then summed over time. Kahneman termed this "experienced" well-being and attached it to a separate "self." He distinguished this from the "remembered" well-being that the polls had attempted to measure. He found that these two measures of happiness diverged. [17]
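"Summed over time" can be written compactly (notation introduced here only for illustration): if $u(t)$ denotes the momentary pleasure or pain reported at time $t$, experienced well-being over an episode is approximately $\int u(t)\,dt$, the area under the moment-by-moment affect curve, rather than a single retrospective rating.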

Life as a story

The author's significant discovery was that the remembering self does not care about the duration of a pleasant or unpleasant experience. Instead, it retrospectively rates an experience by its peak (or trough) and by the way it ends. In the medical experiments Kahneman describes, the remembering self dominated patients' ultimate conclusions about the experience.
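This is the pattern Kahneman calls the peak-end rule: the remembered rating of an episode is well approximated by the average of its most intense moment and its final moment, $\text{remembered} \approx (\text{peak} + \text{end})/2$, with the episode's duration contributing little (formula stated here as a summary of that rule, not as an exact model).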

"Odd as it may seem," Kahneman writes, "I am my remembering self, and the experiencing self, who does my living, is like a stranger to me." [3]

Experienced well-being

Kahneman first began the study of well-being in the 1990s. At the time most happiness research relied on polls about life satisfaction. Having previously studied unreliable memories, the author was doubtful that life satisfaction was a good indicator of happiness. He designed a question that emphasized instead the well-being of the experiencing self. The author proposed that "Helen was happy in the month of March" if she spent most of her time engaged in activities that she would rather continue than stop, little time in situations that she wished to escape, and not too much time in a neutral state in which she would not have preferred either continuing or stopping the activity.

Thinking about life

Kahneman suggests that focusing on a life event such as a marriage or a new car can provide a distorted impression of its true value. This "focusing illusion" revisits earlier ideas of substituting difficult questions and WYSIATI.

Awards and honors

The book received a number of awards and year-end honors, including the 2011 Los Angeles Times Book Prize for Current Interest, [18] selection among The New York Times' 10 Best Books of 2011, [19] The Globe and Mail's Globe 100 best books of 2011, [20] The Economist's Books of the Year 2011, [21] and The Wall Street Journal's Best Nonfiction of 2011, [22] in addition to the 2012 National Academies Communication Award noted above. [5]

Reception

As of 2012 the book had sold over one million copies. [23] In the year of its publication, it was on the New York Times Best Seller list. [4] The book was reviewed in media including the Huffington Post, [24] The Guardian, [25] The New York Times, [2] the Financial Times, [26] The Independent, [27] Bloomberg [11] and The New York Review of Books. [28] On Book Marks, the book received a "rave" consensus, based on eight critic reviews: six "rave" and two "positive". [29] In its March/April 2012 issue, Bookmarks, a magazine that aggregates critic reviews of books, gave the book a rating of 4.00 out of 5 stars, with the critical summary stating, "Either way, it's an enlightening tome on how--fast or slow--we make decisions". [30]

The book was also widely reviewed in academic journals, including the Journal of Economic Literature, [14] American Journal of Education, [31] The American Journal of Psychology, [32] Planning Theory, [33] The American Economist, [34] The Journal of Risk and Insurance, [35] the Michigan Law Review, [36] American Scientist, [37] Contemporary Sociology, [38] Science, [39] Contexts, [40] The Wilson Quarterly, [41] Technical Communication, [42] The University of Toronto Law Journal, [43] ETC: A Review of General Semantics [44] and Scientific American Mind. [45] The book was also reviewed in the Observer, a monthly magazine published by the Association for Psychological Science. [46]

The book has achieved a large following among baseball scouts and baseball executives. The ways of thinking described in the book are believed to help scouts, who have to make major judgments from little information and can easily fall into prescriptive yet inaccurate patterns of analysis. [47]

The last chapter of Paul Bloom's Against Empathy discusses concepts also touched on in Thinking, Fast and Slow, namely the idea that people make a series of rational and irrational decisions. [48] :214 Bloom criticizes the argument that "regardless of reason's virtues, we just aren't any good at it," countering that people are not as "stupid as scholars think they are." [48] :216 He explains that people are rational because they make thoughtful decisions in their everyday lives; for example, when someone has to make a big life decision, they critically assess the outcomes, consequences, and alternative options. [48] :230

Author Nassim Nicholas Taleb has equated the book's importance to that of Adam Smith's The Wealth of Nations and Sigmund Freud's The Interpretation of Dreams. [49]

Replication crisis

Part of the book has been swept up in the replication crisis facing psychology and the social sciences. It was discovered that many prominent research findings were difficult or impossible for others to replicate, and the original findings were therefore called into question. An analysis [50] of the studies cited in chapter 4, "The Associative Machine", found that their replicability index (R-index) [51] is 14, indicating essentially low to no reliability. Kahneman himself responded to the analysis in blog comments and acknowledged the chapter's shortcomings: "I placed too much faith in underpowered studies." [52] Others have noted the irony in the fact that Kahneman made a mistake in judgment similar to the ones he studied. [53]
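As a rough sketch of what such an index measures: the R-index penalizes the gap between a set of studies' reported success rate and their estimated statistical power. The definition below (R-index = observed power minus the inflation rate, where inflation = success rate minus observed power) follows Schimmack's published description and is included here only as an assumption for illustration:

```python
# Hedged sketch of an R-index-style calculation. Assumes Schimmack's
# description: R-index = observed power - inflation,
# where inflation = success rate - observed power.

from statistics import median

def r_index(observed_powers: list[float], significant: list[bool]) -> float:
    """Estimate replicability from per-study observed power and significance outcomes."""
    obs_power = median(observed_powers)                  # typical observed power
    success_rate = sum(significant) / len(significant)   # share of studies reported significant
    inflation = success_rate - obs_power                 # excess success beyond what power predicts
    return obs_power - inflation                         # low values suggest poor replicability

# Toy example: a literature in which nearly every study is "significant" despite
# modest observed power yields a very low index, as reported for the priming chapter.
powers = [0.55, 0.60, 0.50, 0.58, 0.52]
sig = [True, True, True, True, True]
print(round(r_index(powers, sig), 2))  # 0.1, i.e. an R-index of about 10
```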

A later analysis [54] made a bolder claim that, despite Kahneman's previous contributions to the field of decision making, most of the book's ideas are based on 'scientific literature with shaky foundations'. A general lack of replication in the empirical studies cited in the book was given as a justification.

Related Research Articles

Cognitive bias – Systematic pattern of deviation from norm or rationality in judgment

A cognitive bias is a systematic pattern of deviation from norm or rationality in judgment. Individuals create their own "subjective reality" from their perception of the input. An individual's construction of reality, not the objective input, may dictate their behavior in the world. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, and irrationality.

A heuristic or heuristic technique is any approach to problem solving that employs a pragmatic method that is not fully optimized, perfected, or rationalized, but is nevertheless "good enough" as an approximation or attribute substitution. Where finding an optimal solution is impossible or impractical, heuristic methods can be used to speed up the process of finding a satisfactory solution. Heuristics can be mental shortcuts that ease the cognitive load of making a decision.

Heuristic reasoning is often based on induction, or on analogy. [...] Induction is the process of discovering general laws [...] Induction tries to find regularity and coherence [...] Its most conspicuous instruments are generalization, specialization, analogy. [...] Heuristic discusses human behavior in the face of problems [...that have been] preserved in the wisdom of proverbs.

Bounded rationality is the idea that rationality is limited when individuals make decisions, and under these limitations, rational individuals will select a decision that is satisfactory rather than optimal.

Daniel Kahneman – Israeli-American psychologist and economist (1934–2024)

Daniel Kahneman was an Israeli-American psychologist best known for his work on the psychology of judgment and decision-making as well as behavioral economics, for which he was awarded the 2002 Nobel Memorial Prize in Economic Sciences together with Vernon L. Smith. Kahneman's published empirical findings challenge the assumption of human rationality prevailing in modern economic theory. Kahneman became known as the "grandfather of behavioral economics."

Amos Tversky – Israeli psychologist (1937–1996)

Amos Nathan Tversky was an Israeli cognitive and mathematical psychologist and a key figure in the discovery of systematic human cognitive bias and handling of risk.

Behavioral economics – Academic discipline

Behavioral economics is the study of the psychological, cognitive, emotional, cultural and social factors involved in the decisions of individuals or institutions, and how these decisions deviate from those implied by classical economic theory.

The availability heuristic, also known as availability bias, is a mental shortcut that relies on immediate examples that come to a given person's mind when evaluating a specific topic, concept, method, or decision. This heuristic, operating on the notion that, if something can be recalled, it must be important, or at least more important than alternative solutions not as readily recalled, is inherently biased toward recently acquired information.

Decision theory – Branch of applied probability theory

Decision theory or the theory of rational choice is a branch of probability, economics, and analytic philosophy that uses the tools of expected utility and probability to model how individuals should behave rationally under uncertainty. It differs from the cognitive and behavioral sciences in that it is prescriptive and concerned with identifying optimal decisions for a rational agent, rather than describing how people really do make decisions. Despite this, the field is extremely important to the study of real human behavior by social scientists, as it lays the foundations for the rational agent models used to mathematically model and analyze individuals in fields such as sociology, economics, criminology, cognitive science, and political science.

The representativeness heuristic is used when making judgments about the probability of an event being representative in character and essence of a known prototypical event. It is one of a group of heuristics proposed by psychologists Amos Tversky and Daniel Kahneman in the early 1970s as "the degree to which [an event] (i) is similar in essential characteristics to its parent population, and (ii) reflects the salient features of the process by which it is generated". The representativeness heuristic works by comparing an event to a prototype or stereotype that we already have in mind. For example, if we see a person who is dressed in eccentric clothes and reading a poetry book, we might be more likely to think that they are a poet than an accountant. This is because the person's appearance and behavior are more representative of the stereotype of a poet than of an accountant.

The conjunction fallacy is an inference that a conjoint set of two or more specific conclusions is likelier than any single member of that same set, in violation of the laws of probability. It is a type of formal fallacy.

Gerd Gigerenzer – German cognitive psychologist

Gerd Gigerenzer is a German psychologist who has studied the use of bounded rationality and heuristics in decision making. Gigerenzer is director emeritus of the Center for Adaptive Behavior and Cognition (ABC) at the Max Planck Institute for Human Development, Berlin, director of the Harding Center for Risk Literacy, University of Potsdam, and vice president of the European Research Council (ERC).

Simulation heuristic – Mental strategy

The simulation heuristic is a psychological heuristic, or simplified mental strategy, according to which people determine the likelihood of an event based on how easy it is to picture the event mentally. Partially as a result, people experience more regret over outcomes that are easier to imagine, such as "near misses". The simulation heuristic was first theorized by psychologists Daniel Kahneman and Amos Tversky as a specialized adaptation of the availability heuristic to explain counterfactual thinking and regret. However, it is not the same as the availability heuristic. Specifically the simulation heuristic is defined as "how perceivers tend to substitute normal antecedent events for exceptional ones in psychologically 'undoing' this specific outcome."

The overconfidence effect is a well-established bias in which a person's subjective confidence in their judgments is reliably greater than the objective accuracy of those judgments, especially when confidence is relatively high. Overconfidence is one example of a miscalibration of subjective probabilities. Throughout the research literature, overconfidence has been defined in three distinct ways: (1) overestimation of one's actual performance; (2) overplacement of one's performance relative to others; and (3) overprecision in expressing unwarranted certainty in the accuracy of one's beliefs.

In psychology, the human mind is considered to be a cognitive miser due to the tendency of humans to think and solve problems in simpler and less effortful ways rather than in more sophisticated and effortful ways, regardless of intelligence. Just as a miser seeks to avoid spending money, the human mind often seeks to avoid spending cognitive effort. The cognitive miser theory is an umbrella theory of cognition that brings together previous research on heuristics and attributional biases to explain when and why people are cognitive misers.

Attribute substitution is a psychological process thought to underlie a number of cognitive biases and perceptual illusions. It occurs when an individual has to make a judgment that is computationally complex, and instead substitutes a more easily calculated heuristic attribute. This substitution is thought of as taking place in the automatic intuitive judgment system, rather than the more self-aware reflective system. Hence, when someone tries to answer a difficult question, they may actually answer a related but different question, without realizing that a substitution has taken place. This explains why individuals can be unaware of their own biases, and why biases persist even when the subject is made aware of them. It also explains why human judgments often fail to show regression toward the mean.

Heuristics is the process by which humans use mental shortcuts to arrive at decisions. Heuristics are simple strategies that humans, animals, organizations, and even machines use to quickly form judgments, make decisions, and find solutions to complex problems. Often this involves focusing on the most relevant aspects of a problem or situation to formulate a solution. While heuristic processes are used to find the answers and solutions that are most likely to work or be correct, they are not always right or the most accurate. Judgments and decisions based on heuristics are simply good enough to satisfy a pressing need in situations of uncertainty, where information is incomplete. In that sense they can differ from answers given by logic and probability.

Illusion of validity is a cognitive bias in which a person overestimates their ability to interpret and predict accurately the outcome when analyzing a set of data, in particular when the data analyzed show a very consistent pattern—that is, when the data "tell" a coherent story.

Cognitive bias mitigation is the prevention and reduction of the negative effects of cognitive biases – unconscious, automatic influences on human judgment and decision making that reliably produce reasoning errors.

Social heuristics are simple decision making strategies that guide people's behavior and decisions in the social environment when time, information, or cognitive resources are scarce. Social environments tend to be characterised by complexity and uncertainty, and in order to simplify the decision-making process, people may use heuristics, which are decision making strategies that involve ignoring some information or relying on simple rules of thumb.

Intuitive statistics, or folk statistics, is the cognitive phenomenon where organisms use data to make generalizations and predictions about the world. This can be a small amount of sample data or training instances, which in turn contribute to inductive inferences about either population-level properties, future data, or both. Inferences can involve revising hypotheses, or beliefs, in light of probabilistic data that inform and motivate future predictions. The informal tendency for cognitive animals to intuitively generate statistical inferences, when formalized with certain axioms of probability theory, constitutes statistics as an academic discipline.

References

  1. Shaw, Tamsin (April 20, 2017). "Invisible Manipulators of Your Mind". New York Review of Books. ISSN   0028-7504 . Retrieved August 10, 2020.
  2. 1 2 3 4 5 Holt, Jim (November 27, 2011). "Two Brains Running". The New York Times. p. 16.
  3. 1 2 Daniel Kahneman (2011). Thinking, Fast and Slow. Macmillan. ISBN   978-1-4299-6935-2 . Retrieved April 8, 2012.
  4. 1 2 "The New York Times Best Seller List – December 25, 2011" (PDF). www.hawes.com. Retrieved August 17, 2014.
  5. 1 2 "Daniel Kahneman's Thinking, Fast and Slow Wins Best Book Award From Academies; Milwaukee Journal Sentinel, Slate Magazine, and WGBH/NOVA Also Take Top Prizes in Awards' 10th Year" . Retrieved March 10, 2018.
  6. Schimmack, Ulrich (December 30, 2020). "A Meta-Scientific Perspective on 'Thinking: Fast and Slow'". Replicability-Index. Retrieved February 11, 2023.
  7. "Web Page Under Construction". www.upfrontanalytics.com.
  8. Tversky, Amos; Kahneman, Daniel (1974). "Judgment under Uncertainty: Heuristics and Biases" (PDF). Science. 185 (4157): 1124–31. Bibcode:1974Sci...185.1124T. doi:10.1126/science.185.4157.1124. PMID 17835457. S2CID 143452957. Archived from the original on March 18, 2012.
  9. Tversky, Amos (1982). "11 – Availability: A heuristic for judging frequency and probability" (PDF). In Kahneman, Daniel (ed.). Judgment under Uncertainty: Heuristics and Biases. Cambridge: Cambridge University Press. ISBN 9780521240642.
  10. Tversky, Amos; Kahneman, Daniel (September 1973). "Availability: A heuristic for judging frequency and probability". Cognitive Psychology. 5 (2): 207–232. doi:10.1016/0010-0285(73)90033-9.(subscription required)
  11. 1 2 3 Lowenstein, Roger (October 28, 2011). "Book Review: Thinking, Fast and Slow by Daniel Kahneman". Bloomberg.com. Retrieved May 27, 2016.
  12. Kahneman, Daniel (2011). Thinking, fast and slow. London: Penguin Books. pp.  14. ISBN   9780141033570. OCLC   781497062.
  13. Kahneman, Daniel; Tversky, Amos (March 1979). "Prospect Theory: An Analysis of Decision under Risk" (PDF). Econometrica. 47 (2): 263–291. CiteSeerX   10.1.1.407.1910 . doi:10.2307/1914185. JSTOR   1914185. Archived from the original on November 17, 2014.{{cite journal}}: CS1 maint: unfit URL (link)
  14. 1 2 Psychologists at the Gate: A Review of Daniel Kahneman's Thinking, Fast and Slow (PDF). 2012.
  15. Psychologists at the Gate: A Review of Daniel Kahneman's Thinking, Fast and Slow (PDF). 2012. pp. 7–9.
  16. Lazari-Radek, Katarzyna de; Singer, Peter (2014). The Point of View of the Universe: Sidgwick and Contemporary Ethics. Oxford University Press. p. 276.
  17. Kahneman, Daniel (2011). "35. Two Selves". Thinking, Fast and Slow. New York: Farrar, Straus & Giroux.
  18. "2011 Los Angeles Times Book Prize Winners & Finalists". Los Angeles Times. Archived from the original on April 16, 2016.
  19. "10 Best Books of 2011". The New York Times. November 30, 2011. ISSN   0362-4331 . Retrieved March 10, 2018.
  20. Stein, Janice Gross; et al. "The Globe 100: The very best books of 2011" . Retrieved March 10, 2018.
  21. "The Economist - Books of the Year 2011 (50 books)". www.goodreads.com.
  22. "The Best Nonfiction of 2011". Wall Street Journal. December 17, 2011.
  23. Cooper, Glenda (July 14, 2012). "Thinking, Fast and Slow: the 'landmark in social thought' going head to head with Fifty Shades of Grey". Daily Telegraph. ISSN   0307-1235 . Retrieved February 17, 2018.
  24. Levine, David K. (September 22, 2012). "Thinking Fast and Slow and Poorly and Well". Huffington Post. Retrieved February 17, 2018.
  25. Strawson, Galen (December 13, 2011). "Thinking, Fast and Slow by Daniel Kahneman – review". the Guardian. Retrieved February 17, 2018.
  26. "Thinking, Fast and Slow". Financial Times. November 5, 2011. Retrieved February 17, 2018.
  27. "Thinking, Fast and Slow, By Daniel Kahneman" . The Independent. November 18, 2011. Archived from the original on May 7, 2022. Retrieved February 17, 2018.
  28. Dyson, Freeman (December 22, 2011). "How to Dispel Your Illusions". The New York Review of Books. ISSN   0028-7504 . Retrieved February 17, 2018.
  29. "Thinking, Fast and Slow". Book Marks . Retrieved January 16, 2024.
  30. "Thinking, Fast and Slow By Daniel Kahneman". Bookmarks Magazine . Archived from the original on September 5, 2015. Retrieved January 14, 2023.
  31. Durr, Tony (February 1, 2014). "Thinking, Fast and Slow by Daniel Kahneman". American Journal of Education. 120 (2): 287–291. doi:10.1086/674372. ISSN   0195-6744.
  32. Krueger, Joachim I. (2012). Kahneman, Daniel (ed.). "Reviewing, Fast and Slow". The American Journal of Psychology. 125 (3): 382–385. doi:10.5406/amerjpsyc.125.3.0382. JSTOR   10.5406/amerjpsyc.125.3.0382.
  33. Baum, Howell (2013). "Review of Thinking, fast and slow". Planning Theory. 12 (4): 442–446. doi:10.1177/1473095213486667. JSTOR   26166233. S2CID   149027956.
  34. Brock, John R. (2012). "Review of Thinking, Fast and Slow". The American Economist. 57 (2): 259–261. doi:10.1177/056943451205700211. JSTOR   43664727. S2CID   149090700.
  35. Gardner, Lisa A. (2012). "Review of Thinking, Fast and Slow". The Journal of Risk and Insurance. 79 (4): 1143–1145. doi:10.1111/j.1539-6975.2012.01494.x. JSTOR   23354961.
  36. Stein, Alex (2013). "Are People Probabilistically Challenged?". Michigan Law Review. 111 (6): 855–875. JSTOR   23812713.
  37. Sloman, Steven (2012). "The Battle Between Intuition and Deliberation". American Scientist. 100 (1): 73–75. JSTOR   23222820.
  38. Etzioni, Amitai (2012). Kahneman, Daniel (ed.). "The End of Rationality?". Contemporary Sociology. 41 (5): 594–597. doi:10.1177/0094306112457657b. JSTOR   41722908. S2CID   143107781.
  39. Sherman, Steven J. (2011). "Blink with Muscles". Science. 334 (6059): 1062–1064. Bibcode:2011Sci...334.1062S. doi:10.1126/science.1214243. JSTOR   41351778. S2CID   145337277.
  40. jasper, james m. (2012). "thinking in context". Contexts. 11 (2): 70–71. doi: 10.1177/1536504212446467 . JSTOR   41960818.
  41. Akst, Daniel (2011). "Rushing to Judgment". The Wilson Quarterly. 35 (4): 97–98. JSTOR   41484407.
  42. Harrison, Kelly A. (2012). "Review of Thinking, Fast and Slow". Technical Communication. 59 (4): 342–343. JSTOR   43093040.
  43. Richardson, Megan Lloyd (2012). "Review of Thinking, Fast and Slow [sic, included in a set of reviews]". The University of Toronto Law Journal. 62 (3): 453–457. doi:10.1353/tlj.2012.0013. JSTOR   23263811. S2CID   144044453.
  44. Vassallo, Philip (2012). "Review of Thinking, Fast and Slow". ETC: A Review of General Semantics. 69 (4): 480. JSTOR   42579224.
  45. Upson, Sandra (2012). "Cognitive Illusions". Scientific American Mind. 22 (6): 68–69. JSTOR   24943506.
  46. Bazerman, Max H. (October 21, 2011). "Review of Thinking, Fast and Slow by Daniel Kahneman". APS Observer. 24 (10).
  47. This Book Is Not About Baseball. But Baseball Teams Swear by It.
  48. 1 2 3 4 Bloom, Paul (2016). Against Empathy: The Case for Rational Compassion. Ecco. ISBN   978-0-06-233935-5.
  49. Hershey, Robert D. Jr. (March 27, 2024). "Daniel Kahneman, Who Plumbed the Psychology of Economics, Dies at 90". The New York Times. ISSN 0362-4331. Retrieved March 29, 2024.
  50. R, Dr (February 2, 2017). "Reconstruction of a Train Wreck: How Priming Research Went off the Rails". Replicability-Index. Retrieved April 30, 2019.
  51. R, Dr (January 31, 2016). "A Revised Introduction to the R-Index". Replicability-Index. Retrieved April 30, 2019.
  52. McCook, Alison (February 20, 2017). ""I placed too much faith in underpowered studies:" Nobel Prize winner admits mistakes". Retraction Watch. Retrieved April 30, 2019.
  53. Engber, Daniel (December 21, 2016). "How a Pioneer in the Science of Mistakes Ended Up Mistaken". Slate Magazine. Retrieved April 30, 2019.
  54. Schimmack, Ulrich (December 30, 2020). "A Meta-Scientific Perspective on 'Thinking: Fast and Slow'". Replicability-Index. Retrieved February 21, 2021.