Cognitive miser


In psychology, the human mind is considered to be a cognitive miser due to the tendency of humans to think and solve problems in simpler and less effortful ways rather than in more sophisticated and effortful ways, regardless of intelligence. [1] Just as a miser seeks to avoid spending money, the human mind often seeks to avoid spending cognitive effort. The cognitive miser theory is an umbrella theory of cognition that brings together previous research on heuristics and attributional biases to explain when and why people are cognitive misers. [2] [3]


The term cognitive miser was first introduced by Susan Fiske and Shelley Taylor in 1984. It is an important concept in social cognition theory and has been influential in other social sciences such as economics and political science. [2]

People are limited in their capacity to process information, so they take shortcuts whenever they can. [2]

Assumption

The metaphor of the cognitive miser assumes that the human mind is limited in time, knowledge, attention, and cognitive resources. [4] Usually people do not think rationally or cautiously, but use cognitive shortcuts to make inferences and form judgments. [5] [6] These shortcuts include the use of schemas, scripts, stereotypes, and other simplified perceptual strategies instead of careful thinking. For example, people tend to draw correspondent inferences, assuming that a person's behavior is correlated with, or representative of, stable underlying characteristics. [7]

Background

The naïve scientist and attribution theory

Before Fiske and Taylor's cognitive miser theory, the predominant model of social cognition was the naïve scientist. First proposed in 1958 by Fritz Heider in The Psychology of Interpersonal Relations, this theory holds that humans think and act with dispassionate rationality, engaging in detailed and nuanced thought processes for both complex and routine actions. [8] In this way, humans were thought to think like scientists, albeit naïve ones, measuring and analyzing the world around them. Within this framework, naïve scientists seek the consistency and stability that come from a coherent view of the world and a sense of environmental control. [9] [ page needed ]

In order to meet these needs, naïve scientists make attributions. [10] [ page needed ] Thus, attribution theory emerged from the study of the ways in which individuals assess causal relationships and mechanisms. [11] Through the study of causal attributions, led by Harold Kelley and Bernard Weiner amongst others, social psychologists began to observe that subjects regularly demonstrate several attributional biases including but not limited to the fundamental attribution error. [12]

The study of attributions had two effects: it created further interest in testing the naïve scientist model and opened up a new wave of social psychology research that questioned its explanatory power. This second effect helped to lay the foundation for Fiske and Taylor's cognitive miser. [9] [ page needed ]

Stereotypes

According to Walter Lippmann's arguments in his classic book Public Opinion, [13] people are not equipped to deal with complexity. Attempting to observe things freshly and in detail is mentally exhausting, especially amid busy affairs. The term stereotype is thus introduced: people must reconstruct a complex situation on a simpler model before they can cope with it, and that simpler model can be regarded as a stereotype. Stereotypes are formed from outside sources identified with people's own interests, and they can be reinforced because people tend to be impressed by facts that fit their philosophy.

On the other hand, in Lippmann's view, people are told about the world before they see it. [13] People's behavior is based not on direct and certain knowledge, but on pictures made by or given to them. Hence, external influences cannot be neglected in shaping people's stereotypes. "The subtlest and most pervasive of all influences are those which create and maintain the repertory of stereotypes." [13] That is to say, people live in a second-hand world of mediated reality, in which the simplified models for thinking (i.e., stereotypes) can be created and maintained by external forces. Lippmann therefore suggested that the public "cannot be wise", since people are easily misled by an oversimplified reality that is consistent with their pre-existing pictures in mind, and any disturbance of existing stereotypes will seem like "an attack upon the foundation of the universe". [13]

Although Lippmann did not directly define the term cognitive miser, stereotypes serve an important function in simplifying people's thinking processes. As a form of cognitive simplification, stereotyping economizes mental effort; without it, people would be overwhelmed by the complexity of the situations they must manage. The stereotype, as a phenomenon, has become a standard topic in sociology and social psychology. [14]

Heuristics

Much of the cognitive miser theory is built upon work on heuristics in judgment and decision-making, [15] [ page needed ] most notably the results of Amos Tversky and Daniel Kahneman published in a series of influential articles. [16] [17] [18] Heuristics can be defined as the "judgmental shortcuts that generally get us where we need to go—and quickly—but at the cost of occasionally sending us off course." [19] In their work, Kahneman and Tversky demonstrated that people rely upon different types of heuristics, or mental shortcuts, in order to save time and mental energy. [18] However, relying upon heuristics instead of detailed analysis, such as the information processing employed by Heider's naïve scientist, makes biased information processing more likely. [9] [ page needed ] These heuristics include the representativeness heuristic [16] and the availability heuristic. [17]

The frequency with which Kahneman, Tversky, and other attribution researchers found that individuals employed mental shortcuts to make decisions and assessments laid important groundwork for the overarching idea that individuals and their minds act efficiently rather than analytically. [15] [ page needed ]

Cognitive miser theory

The wave of research on attributional biases done by Kahneman, Tversky, and others effectively ended the dominance of Heider's naïve scientist within social psychology. [15] Fiske and Taylor, building upon the prevalence of heuristics in human cognition, offered their theory of the cognitive miser. It is, in many ways, a unifying theory of ad hoc decision-making which suggests that humans engage in economically prudent thought processes instead of acting like scientists who rationally weigh costs and benefits, test hypotheses, and update expectations based upon the results of the discrete experiments that are our everyday actions. [2] In other words, humans are more inclined to act as cognitive misers, using mental shortcuts to make assessments and decisions regarding issues and ideas about which they know very little, including issues of great salience. Fiske and Taylor argue that it is rational to act as a cognitive miser given the sheer volume and intensity of information and stimuli humans take in. [2] [20] Given the limited information-processing capabilities of individuals, people adopt strategies that simplify complex problems. Cognitive misers usually do so in two ways: by disregarding part of the information to reduce their cognitive load, or by overusing some kinds of information to avoid the burden of finding and processing more.

Other psychologists also argue that the cognitively miserly tendency of humans is a primary reason why "humans are often less than rational". [3] On this view, evolution has made the brain's allocation and use of cognitive resources extremely stingy. The basic principle is to save mental energy as much as possible, even when one is required to "use your head". [21] Unless the cognitive environment meets certain criteria, people will, by default, try to avoid thinking as much as possible.

Implications

The implications of this theory raise important questions about both cognition and human behavior. In addition to streamlining cognition in complicated, analytical tasks, the cognitive miser approach is also used when dealing with unfamiliar issues and issues of great importance. [2] [20]

Politics

Voting behavior in democracies is an arena in which the cognitive miser is at work. Acting as a cognitive miser should lead those with expertise in an area to more efficient information processing and streamlined decision making. [22] However, as Lau and Redlawsk note, acting as a cognitive miser who employs heuristics can have very different results for high-information and low-information voters. They write, "...cognitive heuristics are at times employed by almost all voters, and that they are particularly likely to be used when the choice situation facing voters is complex... heuristic use generally increases the probability of a correct vote by political experts but decreases the probability of a correct vote by novices." [22] In democracies, where no vote is weighted more or less because of the expertise behind its casting, low-information voters acting as cognitive misers can make choices with broad and potentially deleterious consequences for a society. [22]

Samuel Popkin argues that voters make rational choices by using information shortcuts that they receive during campaigns, usually using something akin to a drunkard's search. Voters use small amounts of personal information to construct a narrative about candidates. Essentially, they ask themselves this: "Based on what I know about the candidate personally, what is the probability that this presidential candidate was a good governor? What is the probability that he will be a good president?" Popkin's analysis is based on one main premise: voters use low information rationality gained in their daily lives, through the media and through personal interactions, to evaluate candidates and facilitate electoral choices. [23]
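
Because Popkin frames the shortcut in explicitly probabilistic terms, it can be illustrated with a simple Bayesian update. The sketch below is purely illustrative: the prior, the cue, and the likelihood values are assumptions invented here, not figures from Popkin's analysis.

```python
# Illustrative only: a voter updating a belief about candidate quality from a single
# personal cue via Bayes' rule. All numbers below are made up for this sketch.

prior_good = 0.5        # prior belief that the candidate would be a good president
p_cue_if_good = 0.7     # chance of seeing the favorable personal cue if they are good
p_cue_if_bad = 0.3      # chance of seeing the same cue if they are not

# Bayes' rule: P(good | cue) = P(cue | good) * P(good) / P(cue)
p_cue = p_cue_if_good * prior_good + p_cue_if_bad * (1 - prior_good)
posterior_good = p_cue_if_good * prior_good / p_cue

print(round(posterior_good, 2))  # 0.7: one cheap cue shifts the judgment substantially
```

Even a single inexpensive cue moves the judgment from 0.5 to 0.7, which is the sense in which "low information rationality" can still count as rational.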

Economics

Cognitive miserliness could also be a contributing factor in the prisoner's dilemma of game theory. To save cognitive energy, cognitive misers tend to assume that other people are similar to themselves: habitual cooperators assume that most others are cooperators, and habitual defectors assume that most others are defectors. Experimental research has shown that, because cooperators offer to play more often and fellow cooperators more often accept their offers, cooperators obtain a higher expected payoff than defectors when certain boundary conditions are met. [24]
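
A minimal sketch of this projection shortcut is given below. The payoffs, beliefs, exit value, and population share are made-up numbers chosen for illustration, not the parameters of the cited experiments; the point is only to show how projecting one's own disposition can leave cooperators better off in an optional-play prisoner's dilemma.

```python
# Illustrative sketch of the projection idea behind the cooperators' advantage.
# All parameters are assumptions for this example.

R, S, T, P = 3, 0, 5, 1      # reward, sucker, temptation, punishment payoffs
EXIT = 2                     # payoff for declining to play at all
coop_share = 0.6             # actual fraction of cooperators in the population

def expected(own_move, partner_coop_rate):
    """Expected game payoff given a belief about how likely the partner cooperates."""
    if own_move == "cooperate":
        return partner_coop_rate * R + (1 - partner_coop_rate) * S
    return partner_coop_rate * T + (1 - partner_coop_rate) * P

# The miserly shortcut: each type projects its own disposition onto others.
belief = {"cooperate": 0.9, "defect": 0.1}
enters = {move: expected(move, belief[move]) > EXIT for move in belief}
# cooperators expect 2.7 > 2, so they enter; defectors expect 1.4 < 2, so they stay out

# Only those who enter form the actual playing pool.
playing_coops = coop_share if enters["cooperate"] else 0.0
playing_defs = (1 - coop_share) if enters["defect"] else 0.0
pool = playing_coops + playing_defs
pool_coop_rate = playing_coops / pool if pool else 0.0

payoff = {move: expected(move, pool_coop_rate) if enters[move] else EXIT
          for move in belief}
print(payoff)   # {'cooperate': 3.0, 'defect': 2}: cooperators come out ahead
```

With these illustrative numbers, defectors opt out and collect the exit payoff, while cooperators end up playing mostly one another and earn more, which mirrors the boundary-condition argument in the text.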

Mass communication

Lack of public support for emerging technologies is commonly attributed to a lack of relevant information and to low scientific literacy among the public. Known as the knowledge deficit model, this point of view rests on the idealistic assumptions that educating for science literacy will increase public support of science, and that the focus of science communication should therefore be increasing scientific understanding among the lay public. [25] [26] However, the relationship between information and attitudes towards scientific issues is not empirically supported. [27] [28]

Based on the assumption that human beings are cognitive misers who tend to minimize cognitive costs, low-information rationality was introduced as an empirically grounded alternative for explaining decision making and attitude formation. Rather than relying on an in-depth understanding of scientific topics, people make decisions based on other shortcuts or heuristics, such as ideological predispositions or cues from mass media, owing to the subconscious compulsion to use only as much information as necessary. [29] [30] The less expertise citizens have on an issue initially, the more likely they are to rely on these shortcuts. [30] Further, people spend less cognitive effort in buying toothpaste than they do when picking a new car, and that difference in information-seeking is largely a function of the costs. [30]

The cognitive miser theory thus has implications for persuading the public: attitude formation is a competition between people's value systems and predispositions (or their own interpretive schemata) on a certain issue and the way public discourse frames it. [30] Framing theory suggests that the same topic will result in different interpretations among audiences if the information is presented in different ways. [31] Audiences' attitude change is closely connected with relabeling or re-framing the issue in question. In this sense, effective communication can be achieved if media provide audiences with cognitive shortcuts or heuristics that resonate with underlying audience schemata.

Risk assessment

The metaphor of the cognitive miser can assist people in drawing lessons from risks, that is, from the possibility that an undesirable state of reality may occur. [32] People apply a number of shortcuts or heuristics in making judgements about the likelihood of an event, because the rapid answers provided by heuristics are often right. [2] [33] Yet certain pitfalls may be neglected in these shortcuts; David Brooks's discussion of the risk assessment surrounding the Deepwater Horizon explosion offers a practical example of this cognitively miserly way of thinking. [34]

Psychology

The theory that human beings are cognitive misers also sheds light on dual process theory in psychology. Dual process theory proposes that there are two types of cognitive processes in the human mind. Daniel Kahneman described these as intuitive (System 1) and reasoning (System 2), respectively. [35]

When processing with System 1, which starts automatically and without control, people expend little to no effort, but can generate complex patterns of ideas. When processing with System 2, people actively consider how best to distribute mental effort to accurately process data, and can construct thoughts in an orderly series of steps. [36] These two cognitive processing systems are not separate and can have interactions with each other. Here is an example of how people's beliefs are formed under the dual process model:

  1. System 1 generates suggestions for System 2, with impressions, intuitions, intentions or feelings;
  2. If System 1's proposal is endorsed by System 2, those impressions and intuitions will turn into beliefs, and the sudden inspiration generated by System 1 will turn into voluntary actions;
  3. When everything goes smoothly (as is often the case), System 2 adopts the suggestions of System 1 with little or no modification. Herein lies a window for bias to form, as System 2 may uncritically accept the accuracy of the impressions delivered by System 1.

The reasoning process (System 2) can be activated to assist intuition when System 1 runs into difficulty, for example when a question arises to which System 1 offers no answer, or when an event is detected that violates the model of the world that System 1 maintains. [36]

Conflict also exists within this dual-process system. A brief example provided by Kahneman: when we try not to stare at an oddly dressed couple at the neighboring table in a restaurant, our automatic reaction (System 1) makes us stare at them, and conflict emerges as System 2 tries to control this behavior. [36]

The dual processing system can produce cognitive illusions. System 1 always operates automatically, taking the easiest shortcut, and it is often in error; System 2 may have no clue that an error has occurred. Errors can be prevented only by enhanced monitoring by System 2, which costs considerable cognitive effort. [36]
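
This sequence can be caricatured with the well-known "bat and ball" problem from the Cognitive Reflection Test: System 1 volunteers a plausible but wrong answer, and unless System 2 spends effort checking it, that answer is endorsed as a belief. The sketch below is a deliberately crude illustration under assumptions made for this example, not a formal model from the cited sources.

```python
# Toy model of the dual-process sequence described above, using the "bat and ball"
# problem (together they cost 110 cents; the bat costs 100 cents more than the ball).
# The division of labor and the "spend_effort" flag are illustrative assumptions.

def system_1(total_cents, difference_cents):
    """Fast and automatic: anchors on the salient numbers and just subtracts."""
    return total_cents - difference_cents         # intuitive answer: 10 cents (wrong)

def system_2(total_cents, difference_cents):
    """Slow and effortful: solves ball + (ball + difference) = total."""
    return (total_cents - difference_cents) // 2  # correct answer: 5 cents

def judged_price(total_cents, difference_cents, spend_effort):
    proposal = system_1(total_cents, difference_cents)   # System 1 always proposes first
    if not spend_effort:
        return proposal                                  # miserly default: endorse unchecked
    return system_2(total_cents, difference_cents)       # monitoring costs effort

print(judged_price(110, 100, spend_effort=False))  # 10 -> the intuitive error slips through
print(judged_price(110, 100, spend_effort=True))   # 5  -> System 2 corrects the proposal
```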

Limitations

Omission of motivation

The cognitive miser theory did not originally specify the role of motivation. [37] In Fiske's subsequent research, the omission of the role of intent from the cognitive miser metaphor is acknowledged. Motivation does affect the activation and use of stereotypes and prejudices. [38]

Updates and later research

Motivated tactician

People tend to use heuristic shortcuts when making decisions. But although these shortcuts cannot match effortful thought in accuracy, the problem remains that people need some parameter to help them choose the most adequate shortcut. [39] Kruglanski proposed that people are a combination of naïve scientists and cognitive misers: they are flexible social thinkers who choose between multiple cognitive strategies (i.e., speed/ease vs. accuracy/logic) based on their current goals, motives, and needs. [39]

Later models suggest that the cognitive miser and the naïve scientist form two poles of social cognition that are too monolithic. Instead, Fiske, Taylor, Arie W. Kruglanski, and other social psychologists offer an alternative explanation of social cognition: the motivated tactician. [2] According to this theory, people employ either shortcuts or thoughtful analysis based upon the context and salience of a particular issue. In other words, this theory suggests that humans are, in fact, both naïve scientists and cognitive misers. [9] [ page needed ] In this sense, people allocate their cognitive effort strategically rather than passively choosing the most effortless shortcuts, and can therefore decide to be naïve scientists or cognitive misers depending on their goals.


Related Research Articles


A cognitive bias is a systematic pattern of deviation from norm or rationality in judgment. Individuals create their own "subjective reality" from their perception of the input. An individual's construction of reality, not the objective input, may dictate their behavior in the world. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, and irrationality.

A heuristic or heuristic technique is any approach to problem solving that employs a pragmatic method that is not fully optimized, perfected, or rationalized, but is nevertheless "good enough" as an approximation or attribute substitution. Where finding an optimal solution is impossible or impractical, heuristic methods can be used to speed up the process of finding a satisfactory solution. Heuristics can be mental shortcuts that ease the cognitive load of making a decision.

Heuristic reasoning is often based on induction, or on analogy[.] [...] Induction is the process of discovering general laws [...] Induction tries to find regularity and coherence [...] Its most conspicuous instruments are generalization, specialization, analogy. [...] Heuristic discusses human behavior in the face of problems [...that have been] preserved in the wisdom of proverbs.

Bounded rationality is the idea that rationality is limited when individuals make decisions, and under these limitations, rational individuals will select a decision that is satisfactory rather than optimal.


Behavioral economics is the study of the psychological factors involved in the decisions of individuals or institutions, and how these decisions deviate from those implied by traditional economic theory.


Decision theory or the theory of rational choice is a branch of probability, economics, and analytic philosophy that uses the tools of expected utility and probability to model how individuals should behave rationally under uncertainty. It differs from the cognitive and behavioral sciences in that it is prescriptive and concerned with identifying optimal decisions for a rational agent, rather than describing how people really do make decisions. Despite this, the field is important to the study of real human behavior by social scientists, as it lays the foundations for the rational agent models used to mathematically model and analyze individuals in fields such as sociology, economics, criminology, cognitive science, and political science.

The representativeness heuristic is used when making judgments about the probability of an event being representative in character and essence of a known prototypical event. It is one of a group of heuristics proposed by psychologists Amos Tversky and Daniel Kahneman in the early 1970s as "the degree to which [an event] (i) is similar in essential characteristics to its parent population, and (ii) reflects the salient features of the process by which it is generated". The representativeness heuristic works by comparing an event to a prototype or stereotype that we already have in mind. For example, if we see a person who is dressed in eccentric clothes and reading a poetry book, we might be more likely to think that they are a poet than an accountant. This is because the person's appearance and behavior are more representative of the stereotype of a poet than an accountant.


Gerd Gigerenzer is a German psychologist who has studied the use of bounded rationality and heuristics in decision making. Gigerenzer is director emeritus of the Center for Adaptive Behavior and Cognition (ABC) at the Max Planck Institute for Human Development, Berlin, director of the Harding Center for Risk Literacy, University of Potsdam, and vice president of the European Research Council (ERC).


Risk perception is the subjective judgement that people make about the characteristics and severity of a risk. Risk perceptions often differ from statistical assessments of risk since they are affected by a wide range of affective, cognitive, contextual, and individual factors. Several theories have been proposed to explain why different people make different estimates of the dangerousness of risks. Three major families of theory have been developed: psychology approaches, anthropology/sociology approaches and interdisciplinary approaches.

In psychology, a dual process theory provides an account of how thought can arise in two different ways, or as a result of two different processes. Often, the two processes consist of an implicit (automatic), unconscious process and an explicit (controlled), conscious process. Verbalized explicit processes or attitudes and actions may change with persuasion or education, though implicit processes or attitudes usually take a long time to change, through the forming of new habits. Dual process theories can be found in social, personality, cognitive, and clinical psychology. They have also been linked with economics via prospect theory and behavioral economics, and increasingly with sociology through cultural analysis.

In social psychology, a motivated tactician is someone who shifts between quick-and-dirty cognitively economical tactics and more thoughtful, thorough strategies when processing information, depending on the type and degree of motivation. Such behavior is a type of motivated reasoning. The idea has been used to explain why people use stereotyping, biases and categorization in some situations, and more analytical thinking in others.

Attribute substitution is a psychological process thought to underlie a number of cognitive biases and perceptual illusions. It occurs when an individual has to make a judgment that is computationally complex, and instead substitutes a more easily calculated heuristic attribute. This substitution is thought of as taking place in the automatic intuitive judgment system, rather than the more self-aware reflective system. Hence, when someone tries to answer a difficult question, they may actually answer a related but different question, without realizing that a substitution has taken place. This explains why individuals can be unaware of their own biases, and why biases persist even when the subject is made aware of them. It also explains why human judgments often fail to show regression toward the mean.

Heuristics is the process by which humans use mental shortcuts to arrive at decisions. Heuristics are simple strategies that humans, animals, organizations, and even machines use to quickly form judgments, make decisions, and find solutions to complex problems. Often this involves focusing on the most relevant aspects of a problem or situation to formulate a solution. While heuristic processes are used to find the answers and solutions that are most likely to work or be correct, they are not always right or the most accurate. Judgments and decisions based on heuristics are simply good enough to satisfy a pressing need in situations of uncertainty, where information is incomplete. In that sense they can differ from answers given by logic and probability.


Thinking, Fast and Slow is a 2011 popular science book by psychologist Daniel Kahneman. The book's main thesis is a differentiation between two modes of thought: "System 1" is fast, instinctive and emotional; "System 2" is slower, more deliberative, and more logical.

Cognitive-experiential self-theory (CEST) is a dual-process model of perception developed by Seymour Epstein. CEST is based around the idea that people operate using two separate systems for information processing: analytical-rational and intuitive-experiential. The analytical-rational system is deliberate, slow, and logical. The intuitive-experiential system is fast, automatic, and emotionally driven. These are independent systems that operate in parallel and interact to produce behavior and conscious thought.

Cognitive bias mitigation is the prevention and reduction of the negative effects of cognitive biases – unconscious, automatic influences on human judgment and decision making that reliably produce reasoning errors.

Social heuristics are simple decision making strategies that guide people's behavior and decisions in the social environment when time, information, or cognitive resources are scarce. Social environments tend to be characterised by complexity and uncertainty, and in order to simplify the decision-making process, people may use heuristics, which are decision making strategies that involve ignoring some information or relying on simple rules of thumb.

Ecological rationality is a particular account of practical rationality, which in turn specifies the norms of rational action – what one ought to do in order to act rationally. The presently dominant account of practical rationality in the social and behavioral sciences such as economics and psychology, rational choice theory, maintains that practical rationality consists in making decisions in accordance with some fixed rules, irrespective of context. Ecological rationality, in contrast, claims that the rationality of a decision depends on the circumstances in which it takes place, so as to achieve one's goals in this particular context. What is considered rational under the rational choice account thus might not always be considered rational under the ecological rationality account. Overall, rational choice theory puts a premium on internal logical consistency whereas ecological rationality targets external performance in the world. The term ecologically rational is only etymologically similar to the biological science of ecology.

Intuitive statistics, or folk statistics, is the cognitive phenomenon where organisms use data to make generalizations and predictions about the world. This can be a small amount of sample data or training instances, which in turn contribute to inductive inferences about either population-level properties, future data, or both. Inferences can involve revising hypotheses, or beliefs, in light of probabilistic data that inform and motivate future predictions. The informal tendency for cognitive animals to intuitively generate statistical inferences, when formalized with certain axioms of probability theory, constitutes statistics as an academic discipline.

Political cognition refers to the study of how individuals come to understand the political world, and how this understanding leads to political behavior. Some of the processes studied under the umbrella of political cognition include attention, interpretation, judgment, and memory. Most of the advancements in the area have been made by scholars in the fields of social psychology, political science, and communication studies.

References

  1. Stanovich, Keith E. (2009). "The cognitive miser: ways to avoid thinking". What intelligence tests miss: the psychology of rational thought. New Haven: Yale University Press. pp.  70–85. ISBN   9780300123852. OCLC   216936066. See also other chapters in the same book: "Framing and the cognitive miser" (chapter 7); "A different pitfall of the cognitive miser: thinking a lot, but losing" (chapter 9).
  2. Fiske, Susan T.; Taylor, Shelley E. (1991) [1984]. Social cognition (2nd ed.). New York: McGraw-Hill. ISBN 978-0070211919. OCLC 22810253.
  3. Toplak, Maggie E.; West, Richard F.; Stanovich, Keith E. (April 2014). "Assessing miserly information processing: an expansion of the Cognitive Reflection Test". Thinking & Reasoning. 20 (2): 147–168. doi:10.1080/13546783.2013.844729. S2CID 53340418.
  4. Simon, H. A. (1956). "Rational choice and the structure of the environment". Psychological Review. 63 (2). American Psychological Association (APA): 129–138. doi:10.1037/h0042769. ISSN   1939-1471. PMID   13310708. S2CID   8503301.
  5. Gilovich, Thomas. (2008). How we know what isn't so: the fallibility of human reason in everyday life. Free Press. OCLC   700511906.
  6. Nisbett, Richard E. (c. 1985). Human inference: strategies and shortcomings of social judgment . Prentice-Hall. ISBN   0134451309. OCLC   899043502.
  7. Jones, Edward E.; Davis, Keith E. (1965). "From Acts To Dispositions The Attribution Process In Person Perception". Advances in Experimental Social Psychology. Vol. 2. Elsevier. pp. 219–266. doi:10.1016/s0065-2601(08)60107-0. ISBN   9780120152025. ISSN   0065-2601.
  8. Heider, Fritz (1958). The psychology of interpersonal relations (1st ed.). New York: John Wiley & Sons. ISBN   978-0898592825. OCLC   225326.
  9. Crisp, Richard J.; Turner, Rhiannon N. (2014). Essential social psychology (3rd ed.). New York: SAGE Publications. ISBN 9781446270769. OCLC 873005953.
  10. Kassin, Saul; Fein, Steven; Markus, Hazel Rose (2016). Social psychology (10th ed.). Cengage Learning. ISBN   9781305580220. OCLC   952391832.
  11. Ross, Lee (1977). "The intuitive psychologist and his shortcomings: distortions in the attribution process". In Berkowitz, Leonard (ed.). Advances in experimental social psychology. Vol. 10. New York: Academic Press. pp. 173–220. ISBN   978-0120152100. OCLC   1283539.
  12. Jones, Edward E.; Harris, Victor A. (1967). "The attribution of attitudes". Journal of Experimental Social Psychology . 3 (1): 1–24. doi:10.1016/0022-1031(67)90034-0.
  13. Lippmann, W. (1922). Public Opinion (PDF). Harcourt, Brace & Co.
  14. Jones, E. E.; Colman, A. M. (1996). "Stereotypes". In A. Kuper; J. Kuper (eds.). The social science encyclopedia (2nd ed.). London: Routledge. pp. 843–844.
  15. Barone, David F.; Maddux, James E.; Snyder, Charles R. (1997). Social cognitive psychology: history and current domains (1st ed.). New York: Plenum Press. ISBN 978-0306454752. OCLC 36330837.
  16. Kahneman, Daniel; Tversky, Amos (1973). "On the psychology of prediction". Psychological Review. 80 (4): 237–251. doi:10.1037/h0034747. S2CID 17786757.
  17. Tversky, Amos; Kahneman, Daniel (1973). "Availability: a heuristic for judging frequency and probability". Cognitive Psychology. 5 (2): 207–232. doi:10.1016/0010-0285(73)90033-9. S2CID   260634872.
  18. Tversky, Amos; Kahneman, Daniel (1974). "Judgment under uncertainty: heuristics and biases". Science. 185 (4157): 1124–1131. Bibcode:1974Sci...185.1124T. doi:10.1126/science.185.4157.1124. PMID 17835457. S2CID 143452957.
  19. Gilovich, Thomas; Savitsky, Kenneth (1996). "Like goes with like: the role of representativeness in erroneous and pseudoscientific beliefs" (PDF). The Skeptical Inquirer. 20 (2): 34–40. Archived from the original (PDF) on 2016-03-07.
  20. Scheufele, Dietram A.; Lewenstein, Bruce V. (17 May 2005). "The public and nanotechnology: how citizens make sense of emerging technologies". Journal of Nanoparticle Research. 7 (6): 659–667 [660]. Bibcode:2005JNR.....7..659S. doi:10.1007/s11051-005-7526-2. S2CID 136549696.
  21. Hull, David L. (2001). Science and selection: essays on biological evolution and the philosophy of science . Cambridge University Press. ISBN   0521643392. OCLC   876723188.
  22. Lau, Richard R.; Redlawsk, David P. (4 Oct 2001). "Advantages and disadvantages of cognitive heuristics in political decision making". American Journal of Political Science. 45 (4): 951–971. CiteSeerX 10.1.1.609.340. doi:10.2307/2669334. JSTOR 2669334.
  23. Popkin, Samuel (1991). The Reasoning Voter. Chicago, IL: The University of Chicago Press. ISBN   0226675440.
  24. Orbell, John; Dawes, Robyn M. (June 1991). "A 'Cognitive Miser' Theory of Cooperators Advantage". American Political Science Review. 85 (2): 515–528. doi:10.2307/1963172. ISSN   0003-0554. JSTOR   1963172. S2CID   145799460.
  25. Irwin, Alan; Wynne, Brian, eds. (1996). Misunderstanding science?. Cambridge University Press. doi:10.1017/cbo9780511563737. ISBN   9780521432689.
  26. Marks, Nicola J (2016-11-15), "Public Understanding of Genetics: The Deficit Model", eLS, John Wiley & Sons, Ltd, pp. 1–5, doi:10.1002/9780470015902.a0005862.pub3, ISBN   9780470015902
  27. Kellstedt, Paul M.; Zahran, Sammy; Vedlitz, Arnold (February 2008). "Personal Efficacy, the Information Environment, and Attitudes Toward Global Warming and Climate Change in the United States". Risk Analysis. 28 (1): 113–126. Bibcode:2008RiskA..28..113K. doi:10.1111/j.1539-6924.2008.01010.x. ISSN   0272-4332. PMID   18304110. S2CID   8606155.
  28. Scheufele, Dietram A. (12 August 2013). "Communicating science in social settings". Proceedings of the National Academy of Sciences. 110 (Suppl 3): 14040–14047. doi: 10.1073/pnas.1213275110 . ISSN   0027-8424. PMC   3752169 . PMID   23940341.
  29. Popkin, Samuel (October 1991). The reasoning voter: communication and persuasion in presidential campaigns. Chicago: The University of Chicago Press. ISBN   0-226-67544-0. OCLC   23082066.
  30. Scheufele, Dietram. "Messages and Heuristics: How audiences form attitudes about emerging technologies" (PDF). The Consortium for Science, Policy & Outcomes. Retrieved 24 August 2023.
  31. Scheufele, Dietram A.; Tewksbury, David (2007). "Cadrage, programmes d'action et préparation: l'évolution de ces trois modèles d'effets des médias" [Framing, agenda setting, and priming: The evolution of three media effects models]. Journal of Communication (in French). 57 (1). Oxford University Press (OUP): 9–20. doi:10.1111/j.1460-2466.2006.00326_5.x. ISSN   0021-9916.
  32. US National Research Council Committee on the Institutional Means for Assessment of Risks to Public Health (1983-01-01). Risk Assessment in the Federal Government. Washington, D.C.: National Academies Press. doi:10.17226/366. ISBN   978-0-309-03349-7. PMID   25032414.
  33. Marteau, T. M (1999-01-01). "Communicating genetic risk information". British Medical Bulletin. 55 (2): 414–428. doi: 10.1258/0007142991902466 . ISSN   0007-1420. PMID   10723866.
  34. Brooks, David (May 27, 2010). "Drilling for Certainty". The New York Times (Opinion). Retrieved Sep 16, 2019.
  35. Kahneman, Daniel (2003). "A perspective on judgment and choice: Mapping bounded rationality". American Psychologist. 58 (9). American Psychological Association (APA): 697–720. CiteSeerX   10.1.1.186.3636 . doi:10.1037/0003-066x.58.9.697. ISSN   1935-990X. PMID   14584987. S2CID   16994141.
  36. Kahneman, D. (2011). Thinking, Fast and Slow. Penguin Books Limited. ISBN 978-0-14-191892-1.
  37. Fiske, Susan T. (2017-03-15). Social cognition: from brains to culture. SAGE Publications. ISBN   978-1473969292. OCLC   968775128.
  38. Fiske, Susan T. (2004). "Intent and Ordinary Bias: Unintended Thought and Social Motivation Create Casual Prejudice". Social Justice Research. 17 (2). Springer Science and Business Media LLC: 117–127. doi:10.1023/b:sore.0000027405.94966.23. ISSN   0885-7466. S2CID   144716889.
  39. Kruglanski, A. W. (1994). "The social-cognitive bases of scientific knowledge". In Shadish, W. R.; Fuller, S. (eds.). The Social Psychology of Science. Conduct of science series. Guilford Publications. pp. 197–213. ISBN 978-0-89862-021-4.
