Sentience

A cat in an affectionate frame of mind, by T. W. Wood (1872), Figure 10 from The Expression of the Emotions

Sentience is the simplest or most primitive form of cognition, consisting of a conscious awareness of stimuli without association or interpretation. [1] The word was coined by philosophers in the 1630s for the concept of an ability to feel, derived from Latin sentiens (feeling), [2] to distinguish it from the ability to think (reason).[citation needed]

In modern Western philosophy, sentience is the ability to experience sensations. In different Asian religions, the word "sentience" has been used to translate a variety of concepts. In science fiction, the word "sentience" is sometimes used interchangeably with "sapience", "self-awareness", or "consciousness". [3]

Some writers differentiate between the mere ability to perceive sensations, such as light or pain, and the ability to perceive emotions, such as fear or grief. The subjective awareness of experiences by a conscious individual is known as qualia in Western philosophy. [3]

Philosophy and sentience

In philosophy, different authors draw different distinctions between consciousness and sentience. According to Antonio Damasio, sentience is a minimalistic way of defining consciousness, which otherwise commonly and collectively describes sentience plus further features of the mind and consciousness, such as creativity, intelligence, sapience, self-awareness, and intentionality (the ability to have thoughts about something). These further features of consciousness may not be necessary for sentience, which is the capacity to feel sensations and emotions. [4]

Consciousness

According to Thomas Nagel in his paper "What Is It Like to Be a Bat?", consciousness can refer to the ability of any entity to have subjective perceptual experiences, or as some philosophers refer to them, "qualia"—in other words, the ability to have states that it feels like something to be in. [5] Some philosophers, notably Colin McGinn, believe that the physical process causing consciousness will never be understood, a position known as "new mysterianism". They do not deny that most other aspects of consciousness are subject to scientific investigation, but they argue that qualia will never be explained. [6] Other philosophers, such as Daniel Dennett, argue that the concept of qualia is not meaningful. [7]

Regarding animal consciousness, the Cambridge Declaration on Consciousness, publicly proclaimed on 7 July 2012 at Cambridge University, holds that consciousness requires specialized neural substrates, chiefly neuroanatomical, neurochemical, and neurophysiological ones, which in more complex organisms manifest as a central nervous system. [a] Accordingly, only organisms that possess these substrates, all of which belong to the animal kingdom, are said to be conscious. [8]

Phenomenal vs. affective consciousness

David Chalmers argues that sentience is sometimes used as shorthand for phenomenal consciousness, the capacity to have any subjective experience at all, but sometimes refers to the narrower concept of affective consciousness, the capacity to experience subjective states that have affective valence (i.e., a positive or negative character), such as pain and pleasure. [9]

Recognition paradox and relation to sapience

Chimps in a playful mood

While sentience and sapience have traditionally been assumed to be, in principle, independent of each other, this assumption has been criticized. One criticism concerns recognition paradoxes: for example, an entity that cannot distinguish a spider from a non-spider cannot be arachnophobic. More generally, it is argued that since an emotional response cannot be attached to stimuli that cannot be recognized, emotions cannot exist independently of a cognition capable of recognition. The claim that precise recognition exists as specific attention to some details in a modular mind is criticized on two grounds: data loss, since a small system of disambiguating synapses in a module physically cannot make distinctions as precise as a larger synaptic system encompassing the whole brain; and energy loss, since maintaining one motivational system with enough built-in cognition to recognize anything, alongside a separate cognitive system for making strategies, would cost more energy than integrating both in one system that uses the same synapses. Data losses inherent in any information transfer from a more precise system to a less precise one are also argued to make it impossible for an imprecise system to use a more precise system as an "emissary", since the less precise system could not tell whether the output of the more precise system served its interests or not. [10] [11]

Empirical data on conditioned reflex precision

Ivan Pavlov's original studies showed that conditioned reflexes in human children are more discriminating than those in dogs: the children salivated only at ticking frequencies very close to those at which food had been served, while the dogs drooled at a wider range of frequencies. These studies have been followed up in recent years with comparative research on more species. It has been shown that both brain size and brain-wide connectivity contribute to making perception more discriminating, as predicted by the theory of a brain-wide perception system but not by the theory of separate systems for emotion and cognition. [12]

Eastern religions

Eastern religions including Hinduism, Buddhism, Sikhism, and Jainism recognise non-humans as sentient beings. [13] The term sentient beings is translated from various Sanskrit terms (jantu, bahu jana, jagat, sattva) and "conventionally refers to the mass of living things subject to illusion, suffering, and rebirth (Saṃsāra)". [14] It is related to the concept of ahimsa, non-violence toward other beings. [15] In some forms of Buddhism, plants, stones and other inanimate objects are considered to be 'sentient'. [16] [17] In Jainism many things are endowed with a soul, jīva, which is sometimes translated as 'sentience'. [18] [19] Some things are without a soul, ajīva, such as a chair or spoon. [20] There are different rankings of jīva based on the number of senses it has. Water, for example, is a sentient being of the first order, as it is considered to possess only one sense, that of touch. [21]

Sentience in Buddhism is the state of having senses. In Buddhism, there are six senses, the sixth being the subjective experience of the mind. Sentience is simply awareness prior to the arising of Skandha. Thus, an animal qualifies as a sentient being. According to Buddhism, sentient beings made of pure consciousness are possible. In Mahayana Buddhism, which includes Zen and Tibetan Buddhism, the concept is related to the Bodhisattva, an enlightened being devoted to the liberation of others. The first vow of a Bodhisattva states, "Sentient beings are numberless; I vow to free them."

Animal welfare, rights, and sentience

Sentience has been a central concept in the animal rights movement, tracing back to the well-known writing of Jeremy Bentham in An Introduction to the Principles of Morals and Legislation: "The question is not, Can they reason? nor, Can they talk? but, Can they suffer?"

Richard D. Ryder defines sentientism broadly as the position according to which an entity has moral status if and only if it is sentient. [22] In David Chalmers's more specific terminology, Bentham is a narrow sentientist, since his criterion for moral status is not only the ability to experience any phenomenal consciousness at all, but specifically the ability to experience conscious states with negative affective valence (i.e. suffering). [9] Animal welfare and rights advocates often invoke similar capacities. For example, the documentary Earthlings argues that while animals do not have all the desires or the capacity to comprehend that humans do, they share the desires for food and water, shelter and companionship, freedom of movement, and avoidance of pain. [23] [b]

Animal-welfare advocates typically argue that any sentient being is entitled, at a minimum, to protection from unnecessary suffering[ citation needed ], though animal-rights advocates may differ on what rights (e.g., the right to life) may be entailed by simple sentience. Sentiocentrism describes the theory that sentient individuals are the center of moral concern.

Gary Francione also bases his abolitionist theory of animal rights, which differs significantly from Peter Singer's, on sentience. He asserts that, "All sentient beings, humans or nonhuman, have one right: the basic right not to be treated as the property of others." [24]

Andrew Linzey, a British theologian, considers that Christianity should regard sentient animals according to their intrinsic worth, rather than their utility to humans. [25]

In 1997 the concept of animal sentience was written into the basic law of the European Union. The legally binding protocol annexed to the Treaty of Amsterdam recognises that animals are "sentient beings", and requires the EU and its member states to "pay full regard to the welfare requirements of animals". [26]

Digital sentience

Digital sentience (or artificial sentience) means the sentience of artificial intelligences. The question of whether artificial intelligences can be sentient is controversial. [27]

The AI research community does not consider sentience (that is, the "ability to feel sensations") as an important research goal, unless it can be shown that consciously "feeling" a sensation can make a machine more intelligent than just receiving input from sensors and processing it as information. Stuart Russell and Peter Norvig wrote in 2021: "We are interested in programs that behave intelligently. Individual aspects of consciousness -- awareness, self-awareness, attention -- can be programmed and can be part of an intelligent machine. The additional project making a machine conscious in exactly the way humans are is not one that we are equipped to take on." [28] Indeed, leading AI textbooks do not mention "sentience" at all. [29]

Digital sentience is of considerable interest to the philosophy of mind. Functionalist philosophers hold that sentience is a matter of the "causal roles" played by mental states, which involve information processing. On this view, the physical substrate of that information processing need not be biological, so there is no theoretical barrier to sentient machines. [30] According to type physicalism, however, the physical constitution does matter; depending on the types of physical systems required for sentience, it may or may not be possible for certain kinds of machines (such as electronic computing devices) to be sentient. [31]

The discussion of the alleged sentience of artificial intelligence was reignited in 2022 by claims that Google's LaMDA (Language Model for Dialogue Applications) artificial intelligence system was "sentient" and had a "soul." [32] LaMDA is an artificial intelligence system that creates chatbots, AI programs designed to converse with humans, by gathering vast amounts of text from the internet and using algorithms to respond to queries in the most fluid and natural way possible. Transcripts of conversations between scientists and LaMDA reveal that the system excels at this, answering challenging questions about the nature of emotions, generating Aesop-style fables on cue, and even describing its alleged fears. [33]

In 2022, philosopher David Chalmers gave a talk on whether large language models (LLMs) can be conscious, encouraging more research on the subject. He said it is very plausible that training AI models can cause a world model to emerge in them. He personally estimated the chance that the most advanced LLMs are conscious at less than 10% in 2022, rising to more than 20% by 2032 and to around 50% if they attain "virtual perception, language, action, unified agents" exceeding the cognitive level of a fish. He stated that "If you see conscious A.I. coming somewhere down the line, then that's going to raise a whole new important group of extremely snarly ethical challenges with, you know, the potential for new forms of injustice". [34]

Nick Bostrom considers that while LaMDA is probably not sentient, being very sure of it would require understanding how consciousness works, having access to unpublished information about LaMDA's architecture, and finding how to apply the philosophical theory to the machine. [35] He also said about LLMs that "it's not doing them justice to say they're simply regurgitating text", noting that they "exhibit glimpses of creativity, insight and understanding that are quite impressive and may show the rudiments of reasoning". He thinks that "sentience is a matter of degree". [27]

Sentience quotient

The sentience quotient concept was introduced by Robert A. Freitas Jr. in the late 1970s. [36] It defines sentience as the relationship between the information processing rate of each individual processing unit (neuron), the weight/size of a single unit, and the total number of processing units (expressed as mass). It was proposed as a measure for the sentience of all living beings and computers from a single neuron up to a hypothetical being at the theoretical computational limit of the entire universe. On a logarithmic scale it runs from −70 up to +50.
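The quotient described above reduces to a base-10 logarithm of information-processing rate over mass. A minimal sketch in Python, using hypothetical order-of-magnitude input figures for illustration rather than Freitas's published estimates:

```python
import math

def sentience_quotient(bits_per_second: float, mass_kg: float) -> float:
    # SQ = log10(I / M): information-processing rate I (bits/s)
    # divided by processor mass M (kg), on a base-10 logarithmic scale.
    return math.log10(bits_per_second / mass_kg)

# Hypothetical inputs for a human brain (assumed figures, not Freitas's):
# roughly 10^14 bits/s of processing in about 1.4 kg of tissue.
human_sq = sentience_quotient(1e14, 1.4)  # roughly +13.9
```

Because the scale is logarithmic, each whole-number step in SQ corresponds to a tenfold change in processing rate per unit mass, which is how the measure can span entities from a single neuron to a hypothetical universe-scale computer within the −70 to +50 range.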

Notes

a. ^ Quote: "The absence of a neocortex does not appear to preclude an organism from experiencing affective states. Convergent evidence indicates that non-human animals have the neuroanatomical, neurochemical, and neurophysiological substrates of conscious states along with the capacity to exhibit intentional behaviors. Consequently, the weight of evidence indicates that humans are not unique in possessing the neurological substrates that generate consciousness. Non-human animals, including all mammals and birds, and many other creatures, including octopuses, also possess these neurological substrates." [8]

b. ^ Quote: "Granted, these animals do not have all the desires we humans have; granted, they do not comprehend everything we humans comprehend; nevertheless, we and they do have some of the same desires and do comprehend some of the same things. The desires for food and water, shelter and companionship, freedom of movement and avoidance of pain." [23]

References

  1. "Dictionary definition of sentience". American Psychological Association. Retrieved 31 January 2024.
  2. "Sentient". Etymology Online. Douglas Harper. Retrieved 31 January 2021.
  3. Scerri, Mariella; Grech, Victor E. (2016). "Sentience in science fiction 101". SFRA Review. 315: 14–18. Retrieved 31 January 2021.
  4. Damasio, Antonio (October 2001). "Fundamental feelings". Nature. 413 (6858): 781. Bibcode:2001Natur.413..781D. doi: 10.1038/35101669 . ISSN   1476-4687. PMID   11677584. S2CID   226085.
  5. Nagel, Thomas (1974). "What Is It Like to Be a Bat?". The Philosophical Review. 83 (4): 435–450. doi:10.2307/2183914. JSTOR   2183914.
  6. Shermer, Michael (2018-07-01). "Will Science Ever Solve the Mysteries of Consciousness, Free Will and God?". Scientific American. Retrieved 2024-03-10.
  7. Ramsey, William (2013). "Eliminative Materialism". In Zalta, Edward N. (ed.). The Stanford Encyclopedia of Philosophy (Summer 2013 ed.). Stanford University. Retrieved 19 June 2014.
  8. Low, Philip (7 July 2012). "The Cambridge Declaration on Consciousness" (PDF). FCM Conference. Cambridge University. Retrieved 5 August 2020.
  9. Pigliucci, Massimo; Chalmers, David (18 December 2020). Philosophy Day 2020: David Chalmers - Consciousness and moral status (YouTube). Figs in Winter. Archived from the original on 2021-10-31. Retrieved 12 September 2021.
  10. A. D. Milner, M. D. Rugg (2013). "The Neuropsychology of Consciousness"
  11. E T Mullin (2007). "The Creation of Sensation and the Evolution of Consciousness"
  12. Catania, A.C. (June 7, 1994). "Query: Did Pavlov's research ring a bell?". Psycoloquy Newsletter
  13. Shanta, Bhakti Niskama (September–October 2015). "Life and consciousness – The Vedāntic view". Communicative & Integrative Biology. 8 (5): e1085138. doi: 10.1080/19420889.2015.1085138 . PMC 4802748. PMID 27066168.
  14. Getz, Daniel A. (2004). "Sentient beings"; cited in Buswell, Robert E. (2004). Encyclopedia of Buddhism. Volume 2. New York, USA: Macmillan Reference USA. ISBN   0-02-865720-9 (Volume 2): pp.760
  15. "ahimsa". Britannica. Retrieved 2024-03-10.
  16. Keiji, Nishitani (ed.)(1976). The Eastern Buddhist. 9.2: p.72. Kyoto: Eastern Buddhist Society; cited in Dumoulin, Henrich (author); Heisig, James (translator); and Knitter, Paul (translator)(2005). Zen Buddhism: A History ~ Volume 2: Japan. With an Introduction by Victor Sogen Hori. Bloomington, Indiana, USA: World Wisdom, Inc. ISBN   978-0-941532-90-7
  17. Ray, Reginald A. (2002). Indestructible truth: the living spirituality of Tibetan Buddhism. World of Tibetan Buddhism. Boston: Shambhala. pp. 26–27. ISBN   978-1-57062-910-5.
  18. Nemicandra, Acarya; Balbir, Nalini (2010), Dravyasamgrha: Exposition of the Six Substances, (in Prakrit and English) Pandit Nathuram Premi Research Series (vol-19), Mumbai: Hindi Granth Karyalay, pp. 1 of Introduction, ISBN   978-81-88769-30-8
  19. Grimes, John (1996), A Concise Dictionary of Indian Philosophy: Sanskrit Terms Defined in English, New York: SUNY Press, pp. 118–119, ISBN   0-7914-3068-5
  20. Shah, Natubhai (November 1998), Jainism : The World of Conquerors, Sussex Academic Press, p. 50, ISBN   1-898723-30-3
  21. Doniger, Wendy, ed. (1993), Purana Perennis: Reciprocity and Transformation in Hindu and Jaina Texts, State University of New York Press, ISBN   0-7914-1381-0
  22. Ryder, Richard D. (1991). "Souls and Sentientism". Between the Species. 7 (1): Article 3. doi: 10.15368/bts.1991v7n1.1 .
  23. Monson, S. (2005). Earthlings (documentary film).
  24. Francione, Gary. Official blog
  25. "BBC - Religions - Christianity: Animal rights". www.bbc.co.uk. 2009-08-03. Retrieved 2024-03-10.
  26. "The Lisbon Treaty: recognising animal sentience". CIWF. 1 December 2009. Retrieved 2024-03-10.
  27. Jackson, Lauren (2023-04-12). "What if A.I. Sentience Is a Question of Degree?". The New York Times. ISSN 0362-4331. Retrieved 2023-06-23.
  28. Russell & Norvig 2021, p. 986.
  29. Leading AI textbooks in 2023:
  30. Manzotti, Riccardo; Chella, Antonio (2018). "Good Old-Fashioned Artificial Consciousness and the Intermediate Level Fallacy". Frontiers in Robotics and AI. 5: 39. doi: 10.3389/frobt.2018.00039 . ISSN   2296-9144. PMC   7805708 . PMID   33500925.
  31. Searle, John R. (1980). "Minds, brains, and programs". Behavioral and Brain Sciences. 3 (3): 417–424. doi:10.1017/S0140525X00005756. ISSN   1469-1825. S2CID   55303721. Archived from the original on 2007-12-10.
  32. Brandon Specktor published (2022-06-13). "Google AI 'is sentient,' software engineer claims before being suspended". livescience.com. Retrieved 2022-06-14.
  33. Lemoine, Blake (2022-06-11). "Is LaMDA Sentient? — an Interview". Medium. Retrieved 2022-06-14.
  34. "AI could have 20% chance of sentience in 10 years, says philosopher David Chalmers". ZDNET. Retrieved 2023-06-22.
  35. Leith, Sam (2022-07-07). "Nick Bostrom: How can we be certain a machine isn't conscious?". The Spectator. Retrieved 2023-06-23.
  36. Freitas, R.A. Jr. (April 1984). "Xenopsychology". Analog Science Fiction/Science Fact . 104: 41–53.
