The intentional stance is a term coined by philosopher Daniel Dennett for the level of abstraction in which we view the behavior of an entity in terms of mental properties. It is part of a theory of mental content proposed by Dennett, which provides the underpinnings of his later works on free will, consciousness, folk psychology, and evolution.
Here is how it works: first you decide to treat the object whose behavior is to be predicted as a rational agent; then you figure out what beliefs that agent ought to have, given its place in the world and its purpose. Then you figure out what desires it ought to have, on the same considerations, and finally you predict that this rational agent will act to further its goals in the light of its beliefs. A little practical reasoning from the chosen set of beliefs and desires will in most instances yield a decision about what the agent ought to do; that is what you predict the agent will do.
— Daniel Dennett, The Intentional Stance, p. 17
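Read as a procedure, the passage above is almost algorithmic. The following toy sketch (a minimal illustration only: the `ascribe` and `predict` functions, the `IntentionalProfile` structure, and the string-encoded beliefs are all invented for this example, not drawn from Dennett) makes the steps explicit: ascribe the beliefs and desires the agent ought to have, then predict the action a rational agent with those states would take.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class IntentionalProfile:
    """The belief/desire profile ascribed from the intentional stance."""
    beliefs: frozenset  # what the agent ought to believe, given its situation
    desires: frozenset  # what it ought to want, given its purpose and needs

def ascribe(situation: dict) -> IntentionalProfile:
    """Steps 1-3: treat the system as a rational agent and ascribe the
    beliefs and desires it *ought* to have, given its place in the world
    and its purpose."""
    beliefs = frozenset(f"{k}:{v}" for k, v in situation["observable_facts"].items())
    desires = frozenset(situation["needs"])
    return IntentionalProfile(beliefs, desires)

def predict(profile: IntentionalProfile, options: dict) -> str:
    """Step 4: practical reasoning. Predict the act that furthers the
    ascribed desires in the light of the ascribed beliefs: here, any
    option whose precondition is believed and whose outcome is desired."""
    for action, (precondition, outcome) in options.items():
        if precondition in profile.beliefs and outcome in profile.desires:
            return action
    return "no rational action identified"

# Toy usage: a thirsty agent standing near a water fountain.
profile = ascribe({
    "observable_facts": {"fountain": "nearby"},
    "needs": ["quench thirst"],
})
print(predict(profile, {
    "walk to fountain and drink": ("fountain:nearby", "quench thirst"),
    "keep working": ("desk:nearby", "finish report"),
}))  # -> walk to fountain and drink
```

The point of the sketch is only that the intentional stance is a predictive strategy: nothing in it depends on how the target system is physically built.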
Dennett (1971, p. 87) states that he took the concept of "intentionality" from the work of the German philosopher Franz Brentano. [1] When clarifying the distinction between mental phenomena (viz., mental activity) and physical phenomena, Brentano (p. 97) argued that, in contrast with physical phenomena, [2] the "distinguishing characteristic of all mental phenomena" [3] was "the reference to something as an object" – a characteristic he called "intentional inexistence". [4] Dennett constantly speaks of the "aboutness" of intentionality; for example: "the aboutness of the pencil marks composing a shopping list is derived from the intentions of the person whose list it is" (Dennett, 1995, p. 240).
John Searle (1999, p. 85) stresses that "competence" in predicting/explaining human behaviour involves being able both to recognize others as "intentional" beings and to interpret others' minds as having "intentional states" (e.g., beliefs and desires):
According to Dennett (1987, pp. 48–49), folk psychology provides a systematic, "reason-giving explanation" for a particular action, and an account of the historical origins of that action, based on deeply embedded assumptions about the agent; [6] namely that:
This approach is also consistent with the earlier work of Fritz Heider and Marianne Simmel, whose joint study revealed that, when subjects were presented with an animated display of 2-dimensional shapes, they were inclined to ascribe intentions to the shapes. [9]
Further, Dennett (1987, p. 52) argues that, based on our fixed personal views of what all humans ought to believe, desire and do, we predict (or explain) the beliefs, desires and actions of others "by calculating in a normative system"; [10] and, driven by the reasonable assumption that all humans are rational beings – who do have specific beliefs and desires and do act on the basis of those beliefs and desires in order to get what they want – these predictions/explanations are based on four simple rules:
The core idea is that, when understanding, explaining, and/or predicting the behavior of an object, we can choose to view it at varying levels of abstraction. The more concrete the level, the more accurate in principle our predictions are; the more abstract, the greater the computational power we gain by zooming out and skipping over the irrelevant details.
Dennett defines three levels of abstraction, attained by adopting one of three entirely different "stances", or intellectual strategies: the physical stance, the design stance, and the intentional stance. [14]
A key point is that switching to a higher level of abstraction has its risks as well as its benefits. For example, when we view both a bimetallic strip and a tube of mercury as thermometers, we can lose track of the fact that they differ in accuracy and temperature range, leading to false predictions as soon as the thermometer is used outside the circumstances for which it was designed. The actions of a mercury thermometer heated to 500 °C can no longer be predicted on the basis of treating it as a thermometer; we have to sink down to the physical stance to understand it as a melted and boiled piece of junk. For that matter, the "actions" of a dead bird are not predictable in terms of beliefs or desires.
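A minimal sketch of this breakdown, assuming only the boiling point of mercury (roughly 357 °C; the function names and the toy numeric model are invented for illustration): the design-stance predictor treats the artifact as a thermometer and is cheap but range-bound, while the physical-stance predictor models the mercury itself and keeps working outside the design envelope.

```python
MERCURY_BOILS_C = 357  # mercury boils at roughly 357 °C

def design_stance_prediction(ambient_c: float) -> str:
    """Design stance: treat the artifact simply as a thermometer.
    Cheap and accurate, but only inside its designed range."""
    return f"reads {ambient_c} °C"

def physical_stance_prediction(ambient_c: float) -> str:
    """Physical stance: model the glass and the mercury themselves.
    More work, but still correct outside the design envelope."""
    if ambient_c >= MERCURY_BOILS_C:
        return "melted and boiled junk, no reading"
    return f"reads {ambient_c} °C"

for t in (25.0, 500.0):
    print(t, "|", design_stance_prediction(t), "|", physical_stance_prediction(t))
# At 25 °C the two stances agree; at 500 °C the design stance still
# predicts a reading, while the physical stance correctly predicts none.
```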
Even when there is no immediate error, a higher-level stance can simply fail to be useful. If we were to try to understand the thermostat at the level of the intentional stance, ascribing to it beliefs about how hot it is and a desire to keep the temperature just right, we would gain no traction over the problem as compared to staying at the design stance, but we would generate theoretical commitments that expose us to absurdities, such as the possibility of the thermostat not being in the mood to work today because the weather is so nice. Whether to take a particular stance, then, is determined by how successful that stance is when applied.
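The thermostat case can be given the same treatment. In the hypothetical sketch below (the setpoint logic is a stand-in, not a real device model), the intentional-stance predictor merely dresses the design-stance rule in belief/desire vocabulary, and the two never disagree: the extra mentalistic commitments buy no predictive traction, which is exactly why the design stance suffices here.

```python
def design_stance(temp_c: float, setpoint_c: float = 20.0) -> str:
    """Design stance: the device is built to close its switch below
    the setpoint and open it above."""
    return "heating on" if temp_c < setpoint_c else "heating off"

def intentional_stance(temp_c: float, setpoint_c: float = 20.0) -> str:
    """Intentional stance: the thermostat 'believes' it is too cold and
    'desires' a comfortable temperature. The prediction is identical,
    so the mentalistic vocabulary adds nothing here."""
    believes_too_cold = temp_c < setpoint_c  # ascribed belief
    desires_comfort = True                   # ascribed standing desire
    return "heating on" if (believes_too_cold and desires_comfort) else "heating off"

# The two stances never disagree about a simple thermostat:
assert all(design_stance(t) == intentional_stance(t) for t in range(-10, 40))
```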
Dennett argues that it is best to understand human behavior at the level of the intentional stance, without making any specific commitments to any deeper reality of the artifacts of folk psychology. In addition to the controversy inherent in this, there is also some dispute about the extent to which Dennett is committed to realism about mental properties. Initially, Dennett's interpretation was seen as leaning more towards instrumentalism, [19] but over the years, as this idea has been used to support more extensive theories of consciousness, it has been taken as being closer to realism. His own words hint at something in the middle, as he suggests that the self is as real as a center of gravity: "an abstract object, a theorist's fiction", but operationally valid. [20]
As a way of thinking about things, Dennett's intentional stance is entirely consistent with everyday commonsense understanding; thus, it meets Eleanor Rosch's (1978, p. 28) criterion of "maximum information with the least cognitive effort". Rosch argues that implicit within any system of categorization are the assumptions that:
Also, the intentional stance meets the criteria Dennett specified (1995, pp. 50–51) for algorithms:
The general notion of a three-level system was widespread in the late 1970s and early 1980s; for example, when discussing the mental representation of information from a cognitive psychology perspective, Glass and his colleagues (1979, p. 24) distinguished three important aspects of representation:
Other significant cognitive scientists who also advocated a three-level system were Allen Newell, Zenon Pylyshyn, and David Marr. The parallels between the four frameworks (each of which implicitly assumed that computers and human minds display the same three distinct levels) are detailed in the following table:
Daniel Dennett "Stances" | Zenon Pylyshyn "Levels of Organization" | Allen Newell "Levels of Description" | David Marr "Levels of Analysis" |
---|---|---|---|
Physical Stance. [23] | Physical Level, or Biological Level. [24] | Physical Level, or Device Level. [25] | Hardware Implementation Level. [26] |
Design Stance. [27] | Symbol Level. [28] | Program Level, or Symbol Level. [29] | Representation and Algorithm Level. [30] |
Intentional Stance. [31] | Semantic, [32] or Knowledge Level. [33] | Knowledge Level. [34] [35] | Computational Theory Level. [36] |
The most obvious objection to Dennett is the intuition that it "matters" to us whether an object has an inner life or not. The claim is that we don't just imagine the intentional states of other people in order to predict their behaviour; the fact that they have thoughts and feelings just as we do is central to notions such as trust, friendship and love. The Blockhead argument proposes that someone, Jones, has a twin who is in fact not a person but a very sophisticated robot which looks and acts like Jones in every way, but who (it is claimed) somehow does not have any thoughts or feelings at all, just a chip which controls his behaviour; in other words, "the lights are on but no one's home". According to intentional systems theory (IST), Jones and the robot have precisely the same beliefs and desires, and the objection holds this to be false: the IST expert assigns the same mental states to Blockhead as to Jones, "whereas in fact [Blockhead] has not a thought in his head." Dennett argues against this objection by denying its premise, on the basis that the robot is a philosophical zombie and therefore metaphysically impossible. In other words, if something acts in all ways conscious, it necessarily is conscious, since consciousness is defined in terms of behavioral capacity rather than ineffable qualia. [37]
Another objection attacks the premise that treating people as ideally rational creatures will yield the best predictions. Stephen Stich argues that people often have beliefs or desires which are irrational or bizarre, and IST doesn't allow us to say anything about these. If the person's "environmental niche" is examined closely enough, and the possibility of malfunction in their brain (which might affect their reasoning capacities) is looked into, it may be possible to formulate a predictive strategy specific to that person. Indeed, this is what we often do when someone is behaving unpredictably: we look for the reasons why. In other words, we can only deal with irrationality by contrasting it against the background assumption of rationality. This development significantly undermines the claims of the intentional stance argument, since it replaces the blanket assumption of ideal rationality with strategies tailored to the individual.
The rationale behind the intentional stance is based on evolutionary theory, particularly the notion that the ability to make quick predictions of a system's behaviour based on what we think it might be thinking conferred an evolutionary advantage. The fact that our predictive powers are not perfect is a further result of the advantages sometimes accrued by acting contrary to expectations.
Philip Robbins and Anthony I. Jack suggest that "Dennett's philosophical distinction between the physical and intentional stances has a lot going for it" from the perspective of psychology and neuroscience. They review studies of the ability to adopt an intentional stance (variously called "mindreading", "mentalizing", or "theory of mind") as distinct from the ability to adopt a physical stance ("folk physics", "intuitive physics", or "theory of body"). Autism seems to involve a deficit in the intentional stance with preservation of the physical stance, while Williams syndrome can involve deficits in the physical stance with preservation of the intentional stance. This tentatively suggests a double dissociation of intentional and physical stances in the brain. [38] However, most studies have found no evidence of impairment in autistic individuals' ability to understand other people's basic intentions or goals; instead, the data suggest that impairments are found in understanding more complex social emotions or in considering others' viewpoints. [39]
Robbins and Jack point to a 2003 study [40] in which participants viewed animated geometric shapes in different "vignettes," some of which could be interpreted as constituting social interaction, while others suggested mechanical behavior. Viewing social interactions elicited activity in brain regions associated with identifying faces and biological objects (posterior temporal cortex), as well as emotion processing (right amygdala and ventromedial prefrontal cortex). Meanwhile, the mechanical interactions activated regions related to identifying objects like tools that can be manipulated (posterior temporal lobe). The authors suggest "that these findings reveal putative 'core systems' for social and mechanical understanding that are divisible into constituent parts or elements with distinct processing and storage capabilities." [40]
Robbins and Jack argue for an additional stance beyond the three that Dennett outlined. They call it the phenomenal stance: attributing consciousness, emotions, and inner experience to a mind. The explanatory gap of the hard problem of consciousness illustrates this tendency of people to see phenomenal experience as different from physical processes. The authors suggest that psychopathy may represent a deficit in the phenomenal but not the intentional stance, while people with autism appear to have intact moral sensibilities, just not mind-reading abilities. These examples suggest a double dissociation between the intentional and phenomenal stances. [38] [41]
In a follow-up paper, Robbins and Jack describe four experiments on how the intentional and phenomenal stances relate to feelings of moral concern. The first two experiments showed that talking about lobsters as strongly emotional led to a much greater sentiment that lobsters deserved welfare protections than did talking about lobsters as highly intelligent. The third and fourth studies found that perceiving an agent as vulnerable led to greater attributions of phenomenal experience. In addition, people who scored higher on the empathic-concern subscale of the Interpersonal Reactivity Index showed generally higher absolute attributions of mental experience. [42]
Bryce Huebner (2010) performed two experimental philosophy studies to test students' ascriptions of various mental states to humans compared with cyborgs and robots. Experiment 1 showed that while students attributed both beliefs and pains most strongly to humans, they were more willing to attribute beliefs than pains to robots and cyborgs. [43] : 138 "[T]hese data seem to confirm that commonsense psychology does draw a distinction between phenomenal and non-phenomenal states—and this distinction seems to be dependent on the structural properties of an entity in a way that ascriptions of non-phenomenal states are not." [43] : 138–39 However, this conclusion is only tentative in view of the high variance among participants. [43] : 139 Experiment 2 showed analogous results: Both beliefs and happiness were ascribed most strongly to biological humans, and ascriptions of happiness to robots or cyborgs were less common than ascriptions of beliefs. [43] : 142
Daniel Clement Dennett III is an American philosopher, writer, and cognitive scientist whose research centers on the philosophy of mind, philosophy of science, and philosophy of biology, particularly as those fields relate to evolutionary biology and cognitive science.
Epiphenomenalism is a position on the mind–body problem which holds that subjective mental events are completely dependent for their existence on corresponding physical and biochemical events within the human body, yet themselves have no influence over physical events. The appearance that subjective mental states influence physical events is merely an illusion, consciousness being a by-product of physical states of the world. For instance, fear seems to make the heart beat faster, but according to epiphenomenalism the biochemical secretions of the brain and nervous system, not the experience of fear, are what raise the heart rate. Because mental events are a kind of overflow that cannot cause anything physical, yet have non-physical properties, epiphenomenalism is viewed as a form of property dualism.
The mind is that which thinks, imagines, remembers, wills, and senses, or is the set of faculties responsible for such phenomena. The mind is also associated with experiencing perception, pleasure and pain, belief, desire, intention, and emotion. The mind can include conscious and non-conscious states as well as sensory and non-sensory experiences.
Free will is the capacity or ability to choose between different possible courses of action unimpeded.
Rationality is the quality of being guided by or based on reason. In this regard, a person acts rationally if they have a good reason for what they do or a belief is rational if it is based on strong evidence. This quality can apply to an ability, as in a rational animal, to a psychological process, like reasoning, to mental states, such as beliefs and intentions, or to persons who possess these other forms of rationality. A thing that lacks rationality is either arational, if it is outside the domain of rational evaluation, or irrational, if it belongs to this domain but does not fulfill its standards.
Intentionality is the power of minds to be about something: to represent or to stand for things, properties and states of affairs. Intentionality is primarily ascribed to mental states, like perceptions, beliefs or desires, which is why it has been regarded as the characteristic mark of the mental by many philosophers. A central issue for theories of intentionality has been the problem of intentional inexistence: to determine the ontological status of the entities which are the objects of intentional states.
In the philosophy of mind, functionalism is the thesis that each and every mental state is constituted solely by its functional role, which means its causal relation to other mental states, sensory inputs, and behavioral outputs. Functionalism developed largely as an alternative to the identity theory of mind and behaviorism.
Eliminative materialism is a materialist position in the philosophy of mind. It is the idea that the majority of mental states in folk psychology do not exist. Some supporters of eliminativism argue that no coherent neural basis will be found for many everyday psychological concepts such as belief or desire, since they are poorly defined. The argument is that psychological concepts of behavior and experience should be judged by how well they reduce to the biological level. Other versions entail the nonexistence of conscious mental states such as pain and visual perceptions.
In philosophy of mind and cognitive science, folk psychology, or commonsense psychology, is the human capacity to explain and predict the behavior and mental states of other people. Processes and states encountered in daily life, such as pain, pleasure, excitement, and anxiety, are described in common linguistic terms rather than technical or scientific jargon. Folk psychology allows for insight into social interactions and communication, thus underscoring the importance of connection and how it is experienced.
Jerry Alan Fodor was an American philosopher and the author of many crucial works in the fields of philosophy of mind and cognitive science. His writings in these fields laid the groundwork for the modularity of mind and the language of thought hypotheses, and he is recognized as having had "an enormous influence on virtually every portion of the philosophy of mind literature since 1960." At the time of his death in 2017, he held the position of State of New Jersey Professor of Philosophy, Emeritus, at Rutgers University, and had taught previously at the City University of New York Graduate Center and MIT.
In psychology, theory of mind refers to the capacity to understand other people by ascribing mental states to them. A theory of mind includes the knowledge that others' beliefs, desires, intentions, emotions, and thoughts may be different from one's own. Possessing a functional theory of mind is crucial for success in everyday human social interactions. People utilize a theory of mind when analyzing, judging, and inferring others' behaviors. The discovery and development of theory of mind primarily came from studies done with animals and infants. Factors including drug and alcohol consumption, language development, cognitive delays, age, and culture can affect a person's capacity to display theory of mind. Having a theory of mind is similar to but not identical with having the capacity for empathy or sympathy.
The language of thought hypothesis (LOTH), sometimes known as thought ordered mental expression (TOME), is a view in linguistics, philosophy of mind and cognitive science, advanced by American philosopher Jerry Fodor. It describes the nature of thought as possessing "language-like" or compositional structure. On this view, simple concepts combine in systematic ways to build thoughts. In its most basic form, the theory states that thought, like language, has syntax.
A mental state, or a mental property, is a state of mind of a person. Mental states comprise a diverse class, including perception, pain/pleasure experience, belief, desire, intention, emotion, and memory. There is controversy concerning the exact definition of the term. According to epistemic approaches, the essential mark of mental states is that their subject has privileged epistemic access while others can only infer their existence from outward signs. Consciousness-based approaches hold that all mental states are either conscious themselves or stand in the right relation to conscious states. Intentionality-based approaches, on the other hand, see the power of minds to refer to objects and represent the world as the mark of the mental. According to functionalist approaches, mental states are defined in terms of their role in the causal network independent of their intrinsic properties. Some philosophers deny all the aforementioned approaches by holding that the term "mental" refers to a cluster of loosely related ideas without an underlying unifying feature shared by all.

Various overlapping classifications of mental states have been proposed. Important distinctions group mental phenomena together according to whether they are sensory, propositional, intentional, conscious or occurrent. Sensory states involve sense impressions like visual perceptions or bodily pains. Propositional attitudes, like beliefs and desires, are relations a subject has to a proposition. The characteristic of intentional states is that they refer to or are about objects or states of affairs. Conscious states are part of the phenomenal experience while occurrent states are causally efficacious within the owner's mind, with or without consciousness. An influential classification of mental states is due to Franz Brentano, who argues that there are only three basic kinds: presentations, judgments, and phenomena of love and hate.
A mental representation, in philosophy of mind, cognitive psychology, neuroscience, and cognitive science, is a hypothetical internal cognitive symbol that represents external reality or its abstractions.
Philosophy of mind is a branch of philosophy that deals with the nature of the mind and its relation to the body and the external world.
According to Franz Brentano, intentionality refers to the "aboutness of mental states that cannot be a physical relation between a mental state and what it is about because in a physical relation each of the relata must exist whereas the objects of mental states might not."
Externalism is a group of positions in the philosophy of mind which argue that the conscious mind is not only the result of what is going on inside the nervous system, but also of what occurs or exists outside the subject. It is contrasted with internalism, which holds that the mind emerges from neural activity alone. On the externalist view, the mind is not just the brain or the functions of the brain.
An intention is a mental state in which the agent commits themselves to a course of action. Having the plan to visit the zoo tomorrow is an example of an intention. The action plan is the content of the intention while the commitment is the attitude towards this content. Other mental states can have action plans as their content, as when one admires a plan, but differ from intentions since they do not involve a practical commitment to realizing this plan. Successful intentions bring about the intended course of action while unsuccessful intentions fail to do so. Intentions, like many other mental states, have intentionality: they represent possible states of affairs.
The first half of the topic of agency deals with its behavioral sense, or the outward expressive evidence thereof. In behavioral psychology, agents are goal-directed entities that are able to monitor their environment to select and perform efficient means-ends actions that are available in a given situation to achieve an intended goal. Behavioral agency therefore implies the ability to perceive and to change the environment of the agent. Crucially, it also entails intentionality, to represent the goal-state in the future; equifinal variability, to be able to achieve the intended goal-state with different actions in different contexts; and rationality of actions in relation to their goal, to produce the most efficient action available. Cognitive scientists and behavioral psychologists have thoroughly investigated agency attribution in humans and non-human animals, since social cognitive mechanisms such as communication, social learning, imitation, and theory of mind presuppose the ability to identify agents and differentiate them from inanimate, non-agentive objects. This ability has also been assumed to have a major effect on the inferential and predictive processes of observers, because agentive entities are expected to perform autonomous behavior based on their current and previous knowledge and intentions. Inanimate objects, on the other hand, are supposed to react only to external physical forces.
Keith Frankish is a British philosopher specializing in philosophy of mind, philosophy of psychology, and philosophy of cognitive science. He is an Honorary Reader at the University of Sheffield, UK, Visiting Research Fellow with The Open University, and adjunct Professor with the Brain and Mind Programme at the University of Crete. He is known for his "illusionist" stance in the theory of consciousness. He holds that the conscious mind is a virtual system, a trick of the biological mind. In other words, phenomenality is an introspective illusion. This position is in opposition to dualist theories, reductive realist theories, and panpsychism.