Sunk cost

In economics and business decision-making, a sunk cost (also known as retrospective cost) is a cost that has already been incurred and cannot be recovered. [1] [2] Sunk costs are contrasted with prospective costs, which are future costs that may be avoided if action is taken. [3] In other words, a sunk cost is a sum paid in the past that is no longer relevant to decisions about the future. Even though economists argue that sunk costs are no longer relevant to rational decision-making about the future, people in everyday life often factor previous expenditures, such as money already spent repairing a car or house, into decisions about those properties.

Bygones principle

According to classical economics and standard microeconomic theory, only prospective (future) costs are relevant to a rational decision. [4] At any moment in time, the best thing to do depends only on current alternatives. [5] The only things that matter are the future consequences. [6] Past mistakes are irrelevant. [5] Any costs incurred prior to making the decision have already been incurred no matter what decision is made. They may be described as "water under the bridge", [7] and making decisions on their basis may be described as "crying over spilt milk". [8] In other words, people should not let sunk costs influence their decisions; sunk costs are irrelevant to rational decisions. Thus, if a new factory was originally projected to cost $100 million, and yield $120 million in value, and after $30 million is spent on it the value projection falls to $65 million, the company should abandon the project rather than spending an additional $70 million to complete it. Conversely, if the value projection falls to $75 million, the company, as a rational actor, should continue the project. This is known as the bygones principle [6] [9] or the marginal principle. [10]
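The arithmetic behind the bygones principle reduces to comparing remaining (future) cost against projected (future) value, with money already spent left out entirely. A minimal sketch, using the figures from the factory example above:

```python
def should_continue(remaining_cost, projected_value):
    """Bygones principle: only future cost and future value enter the
    decision. Money already spent is sunk and is deliberately ignored."""
    return projected_value > remaining_cost

# Factory example: $100M budget, $30M already spent (sunk), $70M remains.
sunk = 30              # plays no role in the decision below
remaining = 100 - sunk  # 70

# Value projection falls to $65M: paying $70M for $65M of value -> abandon.
print(should_continue(remaining, 65))  # False

# Value projection falls to $75M: $75M of value exceeds $70M of cost -> continue.
print(should_continue(remaining, 75))  # True
```

Note that the decision is the same whether the sunk $30 million had been $3 million or $300 million; only the $70 million still to be spent matters.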

The bygones principle is grounded in the branch of normative decision theory known as rational choice theory, particularly in the expected utility hypothesis. Expected utility theory relies on a property known as cancellation, which says that it is rational in decision-making to disregard (cancel) any state of the world that yields the same outcome regardless of one's choice. [11] Past decisions, including sunk costs, meet that criterion.

The bygones principle can also be formalised as the notion of "separability". Separability requires agents to take decisions by comparing the available options in eventualities that can still occur, uninfluenced by how the current situation was reached or by eventualities that are precluded by that history. In the language of decision trees, it requires the agent's choice at a particular choice node to be independent of unreachable parts of the tree. This formulation makes clear how central the principle is to standard economic theory by, for example, founding the folding-back algorithm for individual sequential decisions and game-theoretical concepts such as sub-game perfection. [12]

Until a decision-maker irreversibly commits resources, the prospective cost is an avoidable future cost and is properly included in any decision-making process. [9] For instance, if someone is considering pre-ordering movie tickets, but has not actually purchased them yet, the cost remains avoidable.

Both retrospective and prospective costs could be either fixed costs (continuous for as long as the business is operating and unaffected by output volume) or variable costs (dependent on volume). [13] However, many economists consider it a mistake to classify sunk costs as "fixed" or "variable". For example, if a firm sinks $400 million on an enterprise software installation, that cost is "sunk" because it was a one-time expense and cannot be recovered once spent. A "fixed" cost would be monthly payments made as part of a service contract or licensing deal with the company that set up the software. The upfront irretrievable payment for the installation should not be deemed a "fixed" cost, with its cost spread out over time. Sunk costs should be kept separate. The "variable costs" for this project might include data centre power usage, for example.

There are cases in which taking sunk costs into account in decision-making, violating the bygones principle, is rational. [14] For example, a manager who wishes to be perceived as persevering in the face of adversity, or who wishes to avoid blame for earlier mistakes, may rationally persist with a project for personal reasons even if doing so is not to the benefit of their company. Or, if the manager holds private information about the undesirability of abandoning a project, it is fully rational to persist with a project that outsiders believe exhibits the sunk cost fallacy. [15]

Fallacy effect

The bygones principle does not always accord with real-world behavior. Sunk costs often influence people's decisions, [7] [14] with people believing that investments (i.e., sunk costs) justify further expenditures. [16] People demonstrate "a greater tendency to continue an endeavor once an investment in money, effort, or time has been made". [17] [18] This is the sunk cost fallacy, and such behavior may be described as "throwing good money after bad", [19] [14] while refusing to succumb to what may be described as "cutting one's losses". [14] People can remain in failing relationships because they "have already invested too much to leave". Other people are swayed by arguments that a war must continue because lives will have been sacrificed in vain unless victory is achieved. Individuals caught up in psychologically manipulative scams will continue investing time, money and emotional energy into the project, despite doubts or suspicions that something is not right. [20] These types of behaviour do not seem to accord with rational choice theory and are often classified as behavioural errors. [21]

Rego, Arantes, and Magalhães point out that the sunk cost effect also exists in committed relationships. They devised two experiments: the first showed that people who had invested money and effort in a relationship were more likely to keep it going than to end it; the second showed that people who had already invested substantial time in a relationship tended to devote still more time to it. [22] In other words, people fall into the sunk cost fallacy in relationships as well. Although people should ignore sunk costs and make rational decisions when planning for the future, the time, money, and effort already spent often lead them to continue maintaining the relationship, which is equivalent to continuing to invest in a failed project.

According to evidence reported by De Bondt and Makhija (1988) [full citation needed], managers of many utility companies in the United States have been overly reluctant to terminate economically unviable nuclear plant projects. In the 1960s, the nuclear power industry promised "energy too cheap to meter". Nuclear power lost public support in the 1970s and 1980s, when public service commissions around the nation ordered prudency reviews. From these reviews, De Bondt and Makhija found evidence that the commissions denied many utility companies even partial recovery of nuclear construction costs on the grounds that they had mismanaged the nuclear construction projects in ways consistent with throwing good money after bad. [23]

The sunk cost fallacy has also been called the "Concorde fallacy": the British and French governments took their past expenses on the costly supersonic jet as a rationale for continuing the project, as opposed to "cutting their losses".

There is also evidence of government representatives failing to ignore sunk costs. [21] The term "Concorde fallacy" [24] derives from the fact that the British and French governments continued to fund the joint development of the costly Concorde supersonic airplane even after it became apparent that there was no longer an economic case for the aircraft. The British government privately regarded the project as a commercial disaster that should never have been started. Political and legal issues made it impossible for either government to pull out. [9]

The idea of sunk costs is often employed when analyzing business decisions. A common example of a sunk cost for a business is the promotion of a brand name. This type of marketing incurs costs that cannot normally be recovered. It is not typically possible to later "demote" one's brand names in exchange for cash. A second example is research and development (R&D) costs. Once spent, such costs are sunk and should have no effect on future pricing decisions[ citation needed ]. A pharmaceutical company's attempt to justify high prices because of the need to recoup R&D expenses would be fallacious. The company would charge a high price whether R&D cost one dollar or one million. R&D costs and the ability to recoup those costs are a factor in deciding whether to spend the money on R&D in the first place. [25]

Dijkstra and Hong proposed that part of a person's behavior is influenced by their current emotions. Their experiments showed that affective reactions underlie the sunk cost fallacy, with negative affect making it more likely. For example, anxious people experience stress, and when stressed they are more motivated to keep investing in failed projects than to explore alternative approaches. Their report shows that the sunk cost fallacy has a greater impact on people under high-load conditions, and that a person's psychological state and external environment are key influencing factors. [26]

The sunk cost effect may cause cost overruns. In business, an example of sunk costs may be an investment in a factory or research project that now has a lower value or none at all. For example, $20 million has been spent on building a power plant; the value now is zero because it is incomplete (and no sale or recovery is feasible). The plant can be completed for an additional $10 million, or abandoned and a different but equally valuable facility built for $5 million. Abandonment and construction of the alternative facility is the more rational decision, even though it represents a total loss of the original expenditure: the original sum invested is a sunk cost. If decision-makers are irrational or have the "wrong" (different) incentives, the completion of the project may be chosen. For example, politicians or managers may have more incentive to avoid the appearance of a total loss. In practice, there is considerable ambiguity and uncertainty in such cases, and decisions that seemed reasonable at the time to the economic actors involved, in the context of their incentives, may in retrospect appear irrational. A decision-maker might make rational decisions according to their incentives, outside of efficiency or profitability; this is considered an incentive problem and is distinct from a sunk cost problem. Some research has also noted circumstances where the sunk cost effect is reversed; that is, where individuals appear irrationally eager to write off earlier investments in order to take up a new endeavor. [27]
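The power-plant example reduces to the same future-cost comparison: since both options are assumed to yield an equally valuable facility, the rational choice simply minimises the money still to be spent. A brief illustrative sketch using the figures from the example:

```python
# The $20M already spent is sunk; it appears in neither option's future cost.
complete_existing = 10   # $M still needed to finish the original plant
build_alternative = 5    # $M for a different but equally valuable facility

# With equal future value, the rational actor minimises future cost.
best = min(complete_existing, build_alternative)
choice = "abandon and build alternative" if best == build_alternative else "complete existing plant"
print(choice)  # abandon and build alternative
```

The $20 million loss is painful but irrelevant: it is incurred under either option, so it cancels out of the comparison.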

Plan continuation bias

A related phenomenon is plan continuation bias, [28] [29] [30] [31] [32] which is recognised as a subtle cognitive bias that tends to force the continuation of a plan or course of action even in the face of changing conditions. In the field of aerospace it has been recognised as a significant causal factor in accidents, with a 2004 NASA study finding that in 9 out of the 19 accidents studied, aircrew exhibited this behavioural bias. [28]

This is a hazard for ships' captains or aircraft pilots who may stick to a planned course even when it is leading to fatal disaster and they should abort instead. A famous example is the Torrey Canyon oil spill in which a tanker ran aground when its captain persisted with a risky course rather than accepting a delay. [33] It has been a factor in numerous air crashes and an analysis of 279 approach and landing accidents (ALAs) found that it was the fourth most common cause, occurring in 11% of cases. [34] Another analysis of 76 accidents found that it was a contributory factor in 42% of cases. [35]

There are two predominant factors that characterise the bias. The first is an overly optimistic estimate of the probability of success, possibly adopted to reduce the cognitive dissonance of having made a decision. The second is personal responsibility: when people are personally accountable, it is difficult for them to admit that they were wrong. [28]

Projects often suffer cost overruns and delays due to the planning fallacy and related factors including excessive optimism, an unwillingness to admit failure, groupthink and aversion to loss of sunk costs. [36]

Psychological factors

Daniel Kahneman, an Israeli psychologist known for his work in behavioral economics and studies of rationality in economics

Evidence from behavioral economics suggests that there are at least four specific psychological factors underlying the sunk cost effect:

  1. Framing effects, in which logically equivalent presentations of a choice bias the decision differently
  2. An overoptimistic estimate of the probability of success
  3. A sense of personal responsibility for the initial investment
  4. A desire not to appear wasteful

Taken together, these results suggest that the sunk cost effect may reflect non-standard measures of utility, which is ultimately subjective and unique to the individual.

Framing effect

The framing effect which underlies the sunk cost effect builds upon the concept of extensionality, under which the outcome is the same regardless of how the information is framed. This stands in contrast to the concept of intensionality, which is concerned with whether the presentation of information changes the situation in question.

Take two mathematical functions:

  1. f(x) = 2x + 10
  2. f(x) = 2 · (x + 5)

While these functions are framed differently, they are analytically equivalent: for any input 'x', they yield the same outcome. Therefore, if a rational decision maker were to choose between these two functions, the likelihood of each function being chosen should be the same. However, a framing effect places unequal biases towards preferences that are otherwise equal.
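The extensional equivalence of the two framings can be checked mechanically; a trivial sketch:

```python
# Two framings of the same function, as in the text.
f1 = lambda x: 2 * x + 10       # framing 1
f2 = lambda x: 2 * (x + 5)      # framing 2

# Extensionality: identical output for every input, so a rational
# decision maker should be indifferent between the two presentations.
assert all(f1(x) == f2(x) for x in range(-1000, 1000))
```

The framing effect describes the empirical finding that, despite such equivalence, the presentation still shifts people's preferences.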

The most common type of framing effect was theorised by Kahneman and Tversky (1979) in the form of valence framing effects, which come in two forms. [39] In the positive form, the 'sure thing' option highlights the positive outcome; in the negative form, the 'sure thing' option highlights the negative outcome, even though the two are analytically identical. For example, saving 200 people from a sinking ship of 600 is equivalent to letting 400 people drown; the former framing is positive and the latter negative.

Ellingsen, Johannesson, Möllerström and Munkammar [40] have categorised framing effects in social and economic contexts into three broad classes of theories. Firstly, the framing of the options presented can affect internalised social norms or social preferences; this is called the variable sociality hypothesis. Secondly, the social image hypothesis suggests that the frame in which the options are presented affects how the decision maker is viewed, which in turn affects their behaviour. Lastly, the frame may affect the expectations that people have about each other's behaviour, which in turn affects their own behaviour.

Overoptimistic probability bias

In 1968, Knox and Inkster [41] approached 141 horse bettors: 72 had just finished placing a $2.00 bet within the previous 30 seconds, and 69 were about to place a $2.00 bet within the next 30 seconds. Their hypothesis was that people who had just committed themselves to a course of action (betting $2.00) would reduce post-decision dissonance by believing more strongly than ever that they had picked a winner. Knox and Inkster asked the bettors to rate their horse's chances of winning on a 7-point scale. They found that people who were about to place a bet rated the chance that their horse would win at an average of 3.48, which corresponded to a "fair chance of winning", whereas people who had just finished betting gave an average rating of 4.81, which corresponded to a "good chance of winning". Their hypothesis was confirmed: after making a $2.00 commitment, people became more confident that their bet would pay off. Knox and Inkster performed an ancillary test on the patrons of the horses themselves and managed (after normalization) to repeat their finding almost identically. Other researchers have also found evidence of inflated probability estimations. [42] [43]

Sense of personal responsibility

In a study of 96 business students, Staw and Fox [44] gave the subjects a choice between making an R&D investment either in an underperforming company department, or in other sections of the hypothetical company. Staw and Fox divided the participants into two groups: a low responsibility condition and a high responsibility condition. In the high responsibility condition, the participants were told that they, as manager, had made an earlier, disappointing R&D investment. In the low responsibility condition, subjects were told that a former manager had made a previous R&D investment in the underperforming division and were given the same profit data as the other group. In both cases, subjects were then asked to make a new $20 million investment. There was a significant interaction between assumed responsibility and average investment, with the high responsibility condition averaging $12.97 million and the low condition averaging $9.43 million. Similar results have been obtained in other studies. [45] [42] [46]

Desire not to appear wasteful

A ticket buyer who purchases a ticket in advance to an event they turn out not to enjoy has made a semi-public commitment to watching it. To leave early is to make this lapse of judgment manifest to strangers, an appearance they might otherwise choose to avoid. In addition, the person may not want to leave the event because they have already paid, so they may feel that leaving would waste their expenditure. Alternatively, they may take a sense of pride in having recognised the opportunity cost of the alternative use of time.


References

  1. Mankiw, N. Gregory (2009). Principles of Microeconomics (5th ed.). Mason, OH: Cengage Learning. pp. 296–297. ISBN   978-1-111-80697-2.
  2. Mankiw, N. Gregory (2018). Principles of Economics (8th ed.). Boston, MA: Cengage Learning. pp. 274–276. ISBN   978-1-305-58512-6.
  3. Warnacut, Joyce I. (2017). The Monetary Value of Time: Why Traditional Accounting Systems Make Customers Wait. Taylor & Francis. ISBN   978-1-4987-4967-1.
  4. Sharma, Sanjay; Sharma, Pramodita (2019). Patient Capital. Cambridge University Press. ISBN   978-1-107-12366-3.
  5. Lipsey, Richard G.; Harbury, Colin (1992). First Principles of Economics. Oxford University Press. p. 143. ISBN 978-0-297-82120-5.
  6. Ryan, Bob (2004). Finance and Accounting for Business. Cengage Learning EMEA. pp. 229–230. ISBN 978-1-86152-993-0.
  7. Bernheim, B. Douglas; Whinston, Michael Dennis (2008). Microeconomics. McGraw-Hill Irwin. ISBN 978-0-07-721199-8.
  8. Jain, P. K. (2000). Cost Accounting. Tata McGraw-Hill Education. ISBN   978-0-07-040224-9.
  9. Gupta, K. P. (2009). Cost Management: Measuring, Monitoring & Motivating Performance. Global India Publications. ISBN 978-93-80228-02-0.
  10. Samuelson, Paul A. (2010). Economics. Tata McGraw-Hill Education. ISBN   978-0-07-070071-0.
  11. Tversky, Amos; Kahneman, Daniel (1986). "Rational choice and the framing of decisions". The Journal of Business. 59 (4): S251–S278. doi:10.1086/296365. ISSN   0021-9398. JSTOR   2352759.
  12. Cubitt, Robin; Ruiz-Martos, Maria; Starmer, Chris (2012). "Are bygones bygones?". Theory and Decision. 73 (2): S185–S202. doi:10.1007/s11238-010-9233-4. S2CID   5051889.
  13. Sherman, Roger (2008). Market Regulation. Pearson / Addison Wesley. ISBN   978-0-321-32232-6.
  14. Parayre, Roch (1995). "The strategic implications of sunk costs: A behavioral perspective". Journal of Economic Behavior & Organization. 28 (3): 417–442. doi:10.1016/0167-2681(95)00045-3. ISSN 0167-2681.
  15. Staw, Barry M.; Ross, Jerry (1987). "Knowing When to Pull the Plug". Harvard Business Review. No. March 1987. ISSN   0017-8012 . Retrieved 2019-08-09.
  16. Arkes, Hal (2000). "Think Like a Dog". Psychology Today. 33 (1): 10. ISSN   0033-3107 . Retrieved 2019-08-05.
  17. Arkes, Hal R.; Ayton, Peter (1999). "The sunk cost and Concorde effects: Are humans less rational than lower animals?". Psychological Bulletin. 125 (5): 591–600. doi:10.1037/0033-2909.125.5.591. ISSN   1939-1455. S2CID   10296273.
  18. Arkes, Hal R.; Blumer, Catherine (1985). "The psychology of sunk cost". Organizational Behavior and Human Decision Processes. 35 (1): 124–140. doi:10.1016/0749-5978(85)90049-4. ISSN 0749-5978.
  19. "sunk cost fallacy". Cambridge English Dictionary. Cambridge University Press. 2019. Retrieved 2019-08-07.
  20. Radford, Benjamin (January 2017). "Psychic Arrested in Exorcism Scam". Skeptical Inquirer. 41 (1): 12–13. Retrieved 18 April 2021.
  21. McAfee, Preston; Mialon, Hugo; Mialon, Sue (2010). "Do Sunk Costs Matter?". Economic Inquiry. 48 (2): 323–336. doi:10.1111/j.1465-7295.2008.00184.x. S2CID 154805248.
  22. Rego, Sara; Arantes, Joana; Magalhães, Paula (2016-11-29). "Is there a Sunk Cost Effect in Committed Relationships?". Current Psychology. 37 (3): 508–519. doi:10.1007/s12144-016-9529-9. ISSN   1046-1310. S2CID   152208754.
  23. Roth, Stefan; Robbert, Thomas; Straus, Lennart (2014). "On the sunk-cost effect in economic decision-making: a meta-analytic review". Business Research (Göttingen). 8 (1): 99–138. doi: 10.1007/s40685-014-0014-8 . hdl: 10419/156273 . S2CID   154851729.
  24. Weatherhead, P.J. (1979). "Do Savannah Sparrows Commit the Concorde Fallacy?". Behav. Ecol. Sociobiol. 5 (4). Springer Berlin: 373–381. doi:10.1007/BF00292525. S2CID   6144898.
  25. Yoram, Bauman; Klein, Grady (2010). The Cartoon Introduction to Economics. Vol. One: Microeconomics (1st ed.). New York: Hill and Wang. pp. 24–25. ISBN   978-0-8090-9481-3.
  26. Dijkstra, Koen A.; Hong, Ying-yi (2019-01-08). "The feeling of throwing good money after bad: The role of affective reaction in the sunk-cost fallacy". PLOS ONE. 14 (1): e0209900. Bibcode:2019PLoSO..1409900D. doi: 10.1371/journal.pone.0209900 . ISSN   1932-6203. PMC   6324799 . PMID   30620741.
  27. Heath, Chip. "Escalation and de-escalation of commitment in response to sunk costs: The role of budgeting in mental accounting." Organizational Behavior and Human Decision Processes 62 (1995): 38-38.
  28. "Flying in the rear view mirror". Critical Uncertainties. 2011-06-26. Retrieved 2019-12-28.
  29. "Safety and The Sunk Cost Fallacy". SafetyRisk.net. 2015-06-20. Retrieved 2019-12-28.
  30. "17 Cognitive Biases which Contribute to Diving Accidents". www.thehumandiver.com. Retrieved 2019-12-28.
  31. Winter, Scott R.; Rice, Stephen; Capps, John; Trombley, Justin; Milner, Mattie N.; Anania, Emily C.; Walters, Nathan W.; Baugh, Bradley S. (2020-03-01). "An analysis of a pilot's adherence to their personal weather minimums". Safety Science. 123: 104576. doi:10.1016/j.ssci.2019.104576. ISSN   0925-7535. S2CID   212959377.
  32. "FAA Safety Briefing – July August 2018" (PDF). FAA.
  33. Harford, Tim (18 January 2019), "Brexit lessons from the wreck of the Torrey Canyon" , Financial Times, archived from the original on 2022-12-10
  34. Khatwa, Ratan; Helmreich, Robert (November 1998 – February 1999), "Analysis of Critical Factors During Approach and Landing in Accidents and Normal Flight" (PDF), Flight Safety Digest, pp. 1–77
  35. Bermin, Benjamin A.; Dismukes, R. Key (December 2006), "Pressing the Approach" (PDF), Aviation Safety World, pp. 28–33
  36. Behavioural Insights Team (July 2017). "A review of optimism bias, planning fallacy, sunk cost bias and groupthink in project delivery and organisational decision making" (PDF). An Exploration of Behavioural Biases in Project Delivery at the Department for Transport. GOV.UK.
  37. Plous, Scott (1993). The psychology of judgment and decision making. McGraw-Hill. ISBN   978-0-07-050477-6.
  38. Tversky, Amos; Kahneman, Daniel (1981). "The Framing of decisions and the psychology of choice". Science. 211 (4481): 453–58. Bibcode:1981Sci...211..453T. doi:10.1126/science.7455683. PMID   7455683. S2CID   5643902.
  39. Levin, Irwin P.; Schneider, Sandra L.; Gaeth, Gary J. (2 November 1998). "All Frames Are Not Created Equal: A Typology and Critical Analysis of Framing Effects". Organizational Behavior and Human Decision Processes. 76 (2): 149–188. doi:10.1006/obhd.1998.2804. PMID   9831520.
  40. Ellingsen, Tore; Johannesson, Magnus; Mollerstrom, Johanna; Munkhammar, Sara (17 May 2012). "Social framing effects: Preferences or beliefs?". Games and Economic Behavior. 76: 117–130. doi:10.1016/j.geb.2012.05.007.
  41. Knox, RE; Inkster, JA (1968). "Postdecision dissonance at post time". Journal of Personality and Social Psychology. 8 (4): 319–323. doi:10.1037/h0025528. PMID   5645589.
  42. Arkes, Hal; Blumer, Catherine (1985). "The Psychology of Sunk Cost". Organizational Behavior and Human Decision Processes. 35: 124–140. doi:10.1016/0749-5978(85)90049-4.
  43. Arkes, Hal; Hutzel, Laura (2000). "The Role of Probability of Success Estimates in the Sunk Cost Effect". Journal of Behavioral Decision Making. 13 (3): 295–306. doi:10.1002/1099-0771(200007/09)13:3<295::AID-BDM353>3.0.CO;2-6.
  44. Staw, Barry M.; Fox, Frederick V. (1977). "Escalation: The Determinants of Commitment to a Chosen Course of Action". Human Relations. 30 (5): 431–450. doi:10.1177/001872677703000503. S2CID   146542771 . Retrieved 2019-08-06.
  45. Staw, Barry M. (1976). "Knee-deep in the big muddy: A study of escalating commitment to a chosen course of action" (PDF). Organizational Behavior and Human Performance. 16 (1): 27–44. doi:10.1016/0030-5073(76)90005-2. ISSN   0030-5073 . Retrieved 2019-08-05.
  46. Whyte, Glen (1986). "Escalating Commitment to a Course of Action: A Reinterpretation". The Academy of Management Review. 11 (2): 311–321. doi:10.2307/258462. ISSN   0363-7425. JSTOR   258462.
