Thomas Gilovich

Born: January 16, 1954
Nationality: American
Alma mater: University of California, Santa Barbara (BA); Stanford University (PhD)
Known for: Research in heuristics and cognitive biases
Fields: Psychology
Institutions: Cornell University
Thesis: Biased evaluation and persistence in gambling (1981)
Doctoral advisors: Lee Ross, Mark Lepper
Doctoral students: Justin Kruger

Thomas Dashiff Gilovich (born January 16, 1954) is an American psychologist who is the Irene Blecker Rosenfeld Professor of Psychology at Cornell University. He has conducted research in social psychology, decision making, and behavioral economics, and has written popular books on these subjects. Gilovich has collaborated with Daniel Kahneman, Richard Nisbett, Lee Ross, and Amos Tversky. His articles in peer-reviewed journals on subjects such as cognitive biases have been widely cited. In addition, Gilovich has been quoted in the media on subjects ranging from the effect of purchases on happiness [1] to people's most common regrets to perceptions of people and social groups. [2] Gilovich is a fellow of the Committee for Skeptical Inquiry.

Early history and education

Gilovich earned his B.A. from the University of California, Santa Barbara. After hearing Amos Tversky and Daniel Kahneman lecture on judgment and decision making in his very first class at Stanford, Gilovich changed his program of research to focus on the intersection of social psychology and judgment and decision making. [3] He went on to earn his Ph.D. in psychology from Stanford in 1981.

Research in social and cognitive psychology

Gilovich is best known for his research on heuristics and biases in the field of social psychology. He describes his research as dealing with "how people evaluate the evidence of their everyday experience to make judgments, form beliefs, and decide on courses of action, and how they sometimes misevaluate that evidence and make faulty judgments, form dubious beliefs, and embark on counterproductive courses of action." [4] According to Google Scholar, he has an h-index of 77 across his published academic papers, which is considered exceptional. [5] [6] The focus of Gilovich's work is reflected in two influential texts, Heuristics and Biases: The Psychology of Intuitive Judgment [7] (with Dale Griffin and Daniel Kahneman) and Social Psychology [8] (with Serena Chen, Dacher Keltner, and Richard Nisbett), both of which are used as textbooks in psychology and social psychology courses throughout the United States. Asked in an interview about the benefits of this research, he responded, "I think that field has an enormous amount to offer, because we make consequential decisions all the time, and they aren't always easy, we don't always do them well," and said that his research program is about trying to figure out how the mind works so we "understand why some decisions are easy, and we tend to do certain things very well, and why some decisions are difficult, and we tend to do them poorly." He further explained that his hope is that he and his colleagues are "providing lots of information to help us understand those difficult decisions, and give people the tools so that they can make better decisions so they less often in life are going down paths that don't serve them well." [9]
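The h-index mentioned above is straightforward to compute from a list of per-paper citation counts: it is the largest h such that at least h papers each have h or more citations. A minimal sketch (the citation counts here are made up for illustration, not Gilovich's actual record):

```python
def h_index(citations):
    """Largest h such that at least h papers have h or more citations each."""
    h = 0
    for i, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= i:
            h = i  # the i most-cited papers each have at least i citations
        else:
            break
    return h

# Toy example: five papers with these citation counts
print(h_index([10, 8, 5, 4, 3]))  # 4: four papers have at least 4 citations each
```

An h-index of 77 thus means at least 77 of his papers have each been cited 77 or more times.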

Gilovich condensed his academic research in judgment and decision making into a popular book, How We Know What Isn't So. Writing in Skeptical Inquirer, Carl Sagan called it "a most illuminating book" that "shows how people systematically err in understanding numbers, in rejecting unpleasant evidence, in being influenced by the opinions of others. We're good in some things, but not in everything. Wisdom lies in understanding our limitations." [10] Reviewing the book for The New York Times, George Johnson wrote, "Over time, the ability to infer rules about the way the world works from skimpy evidence confers a survival advantage, even if much of the time the lessons are wrong. From evolution's standpoint, it is better to be safe than sorry." [11] In an interview, Gilovich summarized the thesis of How We Know What Isn't So as people "thinking we really have the evidence for things, [that] the world is telling us something, but in fact the world is telling us something a little more complicated, and how is it that we can misread the evidence of our everyday experience, and be convinced that something is true when it really isn't." He further elaborated on some of the erroneous beliefs his book discusses, including the sophomore jinx, the idea that things such as natural disasters come in threes, and the belief that the lines we are in slow down but the lines we leave speed up. [12] In the same interview he called confirmation bias the "mother of all biases."

Notable contributions in biases and heuristics research

Through his published work on biases and heuristics, Gilovich has contributed the following notable concepts to the field:

Hot hands

Gilovich's research on the alleged "hot hand" effect, the belief that success in a particular endeavor, usually sports, will likely be followed by further success, has been particularly influential. A paper he wrote with Robert Vallone and Amos Tversky in 1985 became the benchmark on the subject for years. [13] Some of the research from the 1985 paper has recently been contested, with a newer journal article arguing that Gilovich and his coauthors themselves fell victim to a cognitive bias in interpreting the data from the original study: in a finite sequence of truly random shots, the proportion of hits that immediately follow a hit is expected to fall below 50 percent, so a player whose hits followed hits at a 50 percent rate would in fact be showing evidence of a hot hand. [14]

Spotlight effect

The spotlight effect, the phenomenon whereby people tend to believe they are noticed more than they really are, is a term Gilovich coined. In a paper he wrote with two graduate students in 1999, he explained that "because we are so focused on our own behavior, it can be difficult to arrive at an accurate assessment of how much–or how little–our behavior is noticed by others. Indeed, close inspection reveals frequent disparities between the way we view our performance (and think others will view it) and the way it is actually seen by others." [15] For the paper, Gilovich and his coauthors conducted an experiment in which college students put on a Barry Manilow T-shirt and walked into a room of strangers seated facing the door. The researchers predicted that the students would assume more people had noticed the T-shirt than actually had. The results were as predicted: participants thought roughly half the strangers would have recognized the Barry Manilow shirt, when in fact the number was closer to 20 percent. [15] [16]

Bias blind spot

Gilovich has contributed to the understanding of the bias blind spot, the tendency to recognize biases in other people but not in ourselves. Several studies he coauthored found that people tend to believe that their own personal connection to a given issue is a source of accuracy and enlightenment, but that the same personal connections in others who hold different views are a source of bias. [17] Similarly, he has found that people look to external behavior when evaluating biases in others but engage in introspection when evaluating their own. [18] Two examples he gave in a talk: both older and younger siblings felt the other was held to a higher standard, and Democrats and Republicans both felt that the electoral college helped the other side more than their own party. [19]

Clustering illusion

Gilovich was an early author on the clustering illusion, which is closely related to the "hot hand" fallacy: the tendency to see "clusters" in a random sequence of data as nonrandom. In How We Know What Isn't So, Gilovich explains how people tend to see a sequence such as xoooxoooxooxxxoxxoo as planned, even though it was generated arbitrarily. He also noted that people tend to misjudge randomness, judging, for example, that rolling the same number on a die four times in a row cannot be the product of chance, when in fact such streaks arise naturally in random sequences. [20]
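That last point is easy to demonstrate by simulation: streaks of four identical faces, which people read as nonrandom, show up routinely in genuinely random rolls. A minimal sketch (the choice of 100 rolls and a run length of 4 is illustrative, not from the book):

```python
import random

def has_run(rolls, length=4):
    """True if the sequence contains `length` identical values in a row."""
    run = 1
    for prev, cur in zip(rolls, rolls[1:]):
        run = run + 1 if cur == prev else 1
        if run >= length:
            return True
    return False

random.seed(1)
trials = 20_000
hits = sum(has_run([random.randint(1, 6) for _ in range(100)])
           for _ in range(trials))
print(f"P(some face repeats 4+ times in a row in 100 rolls) ≈ {hits / trials:.2f}")
```

With these parameters the estimated probability comes out around a third: far from being evidence of design, such streaks are an ordinary feature of random sequences.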

Illusion of transparency

Building on his research on the spotlight effect, Gilovich helped to discover the illusion of transparency, the tendency to overestimate the extent to which people telegraph their inner thoughts and emotions. In a study he conducted with two coauthors in 1998, individuals read questions from index cards and answered them out loud, either lying or telling the truth according to an instruction on a label only they could see. Half of the liars thought they had been caught, but in fact only a quarter were; hence the illusion of transparency. The same study also found that in an emergency situation, people assumed their alarm and concern would show in their expressions and behavior, but it did not, which the authors believe partially explains the bystander effect: "When confronted with a potential emergency, people typically play it cool, adopt a look of nonchalance, and monitor the reactions of others to determine if a crisis is really at hand. No one wants to overreact, after all, if it might not be a true emergency. However, because each individual holds back, looks nonchalant, and monitors the reactions of others, sometimes everyone concludes (perhaps erroneously) that the situation is not an emergency and hence does not require intervention." [21]

Regret

"We evolved to be goal-striving creatures. You'll regret more the things that you didn't do than the things you did."

Thomas Gilovich [22]

Gilovich has researched the causes of regret. A study he conducted in 1994 found that specific actions people wish they had not taken are regretted more in the short run, but that inactions are regretted more in the long run. He has continued to emphasize that people tend to regret the things they did not do more than the things they did. [22] [23]

Anchoring

Following Amos Tversky and Daniel Kahneman, Gilovich and his colleagues have conducted research on anchoring, the tendency, when making decisions, to anchor on information that comes to mind and adjust from it until a plausible estimate is reached. A study he co-authored with Nicholas Epley found that anchoring is actually several different effects with multiple causes at play. [24] Another study the two coauthored found that once an anchor is set, people adjust away from it, but their adjustments tend to be insufficient, so their final guess remains close to the initial anchor. [25]

Self-handicapping

In his social psychology research, Gilovich has studied the phenomenon of self-handicapping, which he described as "attempts to manage how others perceive us by controlling the attributions they make for our performance." An example of self-handicapping, according to Gilovich, would be drawing attention to elements that inhibit performance, so as to discount failure in others' eyes or to make success look like the result of overcoming formidable odds. The self-handicapping can be either real (failing to study or drinking excessively) or feigned (merely claiming that difficult obstacles were present). Gilovich has stated that the strategy is most common in sports and undergraduate academics, but that it often backfires. [20]

Research in behavioral economics

Besides his contributions to the field of social psychology, Gilovich's research in cognitive psychology has influenced the field of behavioral economics. He condensed his academic research in the field into a popular book, The Wisest One in the Room: How You Can Benefit from Social Psychology's Most Powerful Insights, co-authored with Lee Ross, which touches on many of the topics in How We Know What Isn't So. In an interview with Brian Lehrer, Gilovich discussed the book and the subjects it covers, such as the difference between intelligence and wisdom (the latter being knowledge of other people and how to connect with them), the negative impact of income inequality on happiness, motivation, and what can create "virtuous cycles" in a university environment. [26] Kirkus Reviews gave it a positive review, writing, "The authors leap from personal behavior and motivation in the first half into societal, cultural, and even international change in the second, offering suggestions, if not necessarily a working blueprint, for how to achieve goals such as global environmental responsibility. None of this is riveting reading, but it rarely lapses into academic jargon." [27]

Experiential purchases

A major recurring theme in Gilovich's work in behavioral economics is the importance of experience over ownership of material things. For instance, a paper he co-authored with Leaf Van Boven found that people overwhelmingly preferred "experiential purchases" to "material purchases." [28] Writing for The Atlantic, James Hamblin noted the growing body of research, pioneered by Gilovich, showing that experiences tend to bring people more happiness than possessions: "It's the fleetingness of experiential purchases that endears us to them. Either they're not around long enough to become imperfect, or they are imperfect, but our memories and stories of them get sweet with time. Even a bad experience becomes a good story." [29] In a talk about barriers to gratitude, Gilovich further noted that a survey of his students at Cornell found that they enjoyed conversations about their experiences more than conversations about their material purchases, and that happiness from experiential purchases is more enduring than that from material purchases. This is because experiences make for better stories, do more to cultivate personal identity, and connect people to each other. Gilovich explained that the implication is that experiential purchases lead to more gratitude and thus to more pro-social behavior. [30] In addition, Gilovich has emphasized the importance of being active and seeking goals: "We evolved to be goal-striving creatures. You'll regret more the things that you didn't do rather than the things you did." Along similar lines, in one talk he urged his audience, "mind your peaks and ends. You won't remember the length of your vacation experience, but you'll remember the intensity. And do something special at the end." [22]

Personal life

Thomas Gilovich is married to Karen Dashiff Gilovich, with whom he has two daughters, Ilana and Rebecca. [20] Gilovich said in an interview that the best part of being a scientist is going to work every day asking "what do I want to do today?" rather than "what do I have to do today?", and that the best quality in a scientist is knowing how to respond to failure.


References

  1. Almendrala, Anna (3 September 2014). "More Evidence Happiness Doesn't Come From Buying New Things". Huffington Post Australia. Retrieved September 3, 2014.
  2. Lombrozo, Tania (June 30, 2014). "3 Things Everyone Should Know Before Growing Up". 13.7: Cosmos & Culture (blog). Retrieved June 30, 2014.
  3. "Psychology professor Tom Gilovich – ScienceLives". Archived from the original on 2021-12-22. Retrieved January 13, 2016 via YouTube.
  4. "Tom Gilovich". gilovich.socialpsychology.org. Social Psychology Network . Retrieved January 18, 2016.
  5. "Thomas Gilovich". Google Scholar. Citation indices. Retrieved March 25, 2015.
  6. Oswald, Nick (2 April 2009). "Does Your h-index Measure Up?". bitesizebio.com. Science Squared. Retrieved January 14, 2016.
  7. Gilovich, Thomas; Griffin, Dale; Kahneman, Daniel (2002-07-08). Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge University Press. ISBN   9780521796798.
  8. Chen, Serena; Gilovich, Thomas; Keltner, Dacher; Nisbett, Richard (2015-09-08). Social Psychology (4th ed.). W.W. Norton & Company. ISBN 978-0-393-93896-8.
  9. "ScienceLives: Tom Gilovich: Do What You Find Interesting". Archived from the original on 2021-12-22. Retrieved September 27, 2015 via YouTube.
  10. Sagan, Carl (March 1996). "Does Truth Matter? Science, Pseudoscience, and Civilization". Skeptical Inquirer. 20.2 March/April 1996. Retrieved April 1, 2015.
  11. Johnson, George (August 26, 1991). "Books of The Times; Why Unshakable Belief Isn't the Same as Truth". New York Times . Retrieved April 1, 2015.
  12. "Episode 8 − Extraordinary Claims: Uncut conversation with Tom Gilovich". Archived from the original on 2021-12-22. Retrieved October 6, 2015 via YouTube.
  13. Gilovich, Thomas; Vallone, Robert; Tversky, Amos (1985). "The Hot Hand in Basketball: On the Misperception of Random Sequences". Cognitive Psychology. 17 (3): 295–314. doi:10.1016/0010-0285(85)90010-6. S2CID 317235.
  14. Miller, Joshua Benjamin; Sanjurjo, A. (2015). "Surprised by the Gambler's and Hot Hand Fallacies? A Truth in the Law of Small Numbers". Social Science Research Network, IGIER Working Paper #552. doi:10.2139/ssrn.2627354. S2CID 17952286.
  15. Gilovich, T.; Medvec, V. H.; Savitsky, K. (2000). "The spotlight effect in social judgment: An egocentric bias in estimates of the salience of one's own actions and appearance". Journal of Personality and Social Psychology. 78 (2): 211–222. doi:10.1037/0022-3514.78.2.211. PMID 10707330. S2CID 12809711.
  16. Morfitt, Russ (May 19, 2014). "The Spotlight Effect: or why Barry Manilow is still relevant". learntolive.com. Retrieved January 13, 2016.
  17. Ehrlinger, J.; Gilovich, T.; Ross, L. (2005). "Peering Into the Bias Blind Spot: People's Assessments of Bias in Themselves and Others". Personality and Social Psychology Bulletin. 31 (5): 680–692. doi:10.1177/0146167204271570. PMID 15802662. S2CID 1210432.
  18. Gilovich, Thomas; Epley, Nicholas; Hanko, Karlene (2005). "Shallow Thoughts About the Self: The Automatic Components of Self-Assessment". In Mark D. Alicke; David A. Dunning; Joachim I. Krueger (eds.). The Self in Social Judgment. Studies in Self and Identity. New York: Psychology Press. p. 77. ISBN   978-1-84169-418-4.
  19. "Cultivating Gratitude in a Consumerist Society". YouTube . Retrieved 2016-01-20.
  20. Gilovich, Thomas (1993-03-05). How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life. Free Press. ISBN 9780029117064.
  21. Gilovich, T.; Savitsky, K.; Medvec, V. H. (1998). "The Illusion of Transparency: Biased Assessments of Others' Ability to Read One's Emotional States" (PDF). Journal of Personality and Social Psychology. 75 (2): 332–346. doi:10.1037/0022-3514.75.2.332. PMID   9731312 . Retrieved 20 July 2011.
  22. Glaser, Linda B. (April 28, 2015). "Behavioral economists discuss their emerging field". Cornell Chronicle. Cornell University. Retrieved April 28, 2015.
  23. Gilovich, Thomas; Medvec, Victoria Husted (1995). "The experience of regret: What, when, and why". Psychological Review . 102 (2): 379–395. doi:10.1037/0033-295X.102.2.379. PMID   7740094.
  24. Epley, Nicholas; Gilovich, Thomas (2005). "When effortful thinking influences judgmental anchoring: differential effects of forewarning and incentives on self-generated and externally provided anchors". Journal of Behavioral Decision Making. 18 (3): 199–212. doi:10.1002/bdm.495. S2CID   14747114.
  25. Epley, Nicholas; Gilovich, Thomas (2006). "The Anchoring-and-Adjustment Heuristic: Why the Adjustments Are Insufficient". Psychological Science . 17 (4): 311–318. doi:10.1111/j.1467-9280.2006.01704.x. PMID   16623688. S2CID   10279390.
  26. A Wise Guy's Guide to Happiness. The Brian Lehrer Show. December 10, 2015. Retrieved January 18, 2016.
  27. "The Wisest One in the Room". Kirkus Reviews . Retrieved January 18, 2016.
  28. Postrel, Virginia (September 9, 2004). "In New Age economics, it's more about the experience than about just owning stuff". New York Times. Economic Scene. cited in "In New Age economics, it's more about the experience than about just owning stuff". vpostrel.com. Retrieved April 13, 2015.
  29. Hamblin, James (October 7, 2014). "Buy Experiences, Not Things". Atlantic Magazine. Retrieved December 31, 2015.
  30. "Cultivating Gratitude in a Consumerist Society" . Retrieved January 18, 2016 via YouTube.
  31. "CSI Fellows and Staff". Committee for Skeptical Inquiry. Retrieved August 7, 2011.
  32. "Russell Distinguished Teaching Award". College of Arts & Sciences, Cornell University. Retrieved 19 February 2016.
  33. "Tom Gilovich". W. W. Norton & Company, publishers. Retrieved January 18, 2016.