China brain

A "China brain". The population of China is 1.4×10⁹ people, while a human brain is estimated to have ≈10¹¹ neurons.

In the philosophy of mind, the China brain thought experiment (also known as the Chinese Nation or Chinese Gym) considers what would happen if each member of the Chinese nation were asked to simulate the action of one neuron in the brain, using telephones or walkie-talkies to simulate the axons and dendrites that connect neurons. Would this arrangement have a mind or consciousness in the same way that brains do?


Early versions of this scenario were put forward in 1961 by Anatoly Dneprov, [1] [2] [3] in 1974 by Lawrence Davis, [4] and again in 1978 by Ned Block. [5] Block argues that the China brain would not have a mind, whereas Daniel Dennett argues that it would. [6] The China brain problem is a special case of the more general problem of whether minds could exist within other, larger minds. [7]

The Chinese room scenario, analyzed by John Searle, [8] is a similar thought experiment in the philosophy of mind that relates to artificial intelligence. Instead of people who each model a single neuron of the brain, in the Chinese room, clerks who do not speak Chinese accept notes in Chinese and return an answer in Chinese according to a set of rules, without the people in the room ever understanding what those notes mean.

In fact, the original short story The Game (1961) by the Soviet physicist and writer Anatoly Dneprov contains both the China brain and the Chinese room scenarios. All 1400 delegates of the Soviet Congress of Young Mathematicians willingly agree to take part in a "purely mathematical game" proposed by Professor Zarubin. The game requires the execution of a certain set of rules given to the participants, who communicate with each other using sentences composed only of the words "zero" and "one". After several hours of playing, the participants have no idea what is going on and grow progressively tired. A young woman becomes too dizzy and leaves the game just before it ends. On the next day, Professor Zarubin reveals, to everyone's excitement, that the participants were simulating a computer that translated a sentence from Portuguese, "Os maiores resultados são produzidos por – pequenos mas contínuos esforços", a language that none of the participants understood, into the Russian sentence "The greatest goals are achieved through minor but continuous ekkedt", a language that all of the participants understood. It becomes clear that the last word, which should have been "efforts", was mistranslated because the young woman who had become dizzy left the simulation. [1] [2] [3]

Background

Many theories of mental states are materialist, that is, they describe the mind as the behavior of a physical object like the brain. One formerly prominent example is the identity theory, which says that mental states are brain states. One criticism is the problem of multiple realizability. The physicalist theory that responds to this is functionalism, which states that a mental state can be whatever functions as a mental state. That is, the mind can be composed of neurons, or it could be composed of wood, rocks or toilet paper, as long as it provides mental functionality.

The thought experiment

Suppose that the whole nation of China were reordered to simulate the workings of a single brain (that is, to act as a mind according to functionalism). Each Chinese person acts as (say) a neuron, communicating with other people by special two-way radio in a way that corresponds to the connections between neurons. The current mental state of the China brain is displayed on satellites that can be seen from anywhere in China. The China brain would then be connected via radio to a body, one that provides the sensory inputs and behavioral outputs of the China brain.

Thus, the China brain possesses all the elements of a functional description of mind: sensory inputs, behavioral outputs, and internal mental states causally connected to other mental states. If the nation of China can be made to act in this way, then, according to functionalism, this system would have a mind. Block's goal is to show how unintuitive it is to think that such an arrangement could create a mind capable of thoughts and feelings.
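The functionalist premise behind the thought experiment, that only the pattern of states and connections matters, not what implements them, can be illustrated with a toy program. The following sketch is purely illustrative (it is not from Block, Dennett, or any cited source): a simple threshold-neuron update rule that is indifferent to whether its "units" are neurons or radio-equipped people.

```python
# Illustrative sketch only: a threshold-unit network whose update rule
# never mentions its substrate. Whether each unit is a neuron or a person
# with a two-way radio is invisible to the computation, which is the
# multiple-realizability point the thought experiment trades on.

def step(states, weights, threshold=1):
    """One synchronous update: unit i fires (1) if its weighted input
    from the currently firing units meets the threshold, else stays 0."""
    n = len(states)
    return [
        1 if sum(weights[j][i] * states[j] for j in range(n)) >= threshold else 0
        for i in range(n)
    ]

# A 3-unit chain: unit 0 excites unit 1, unit 1 excites unit 2.
weights = [
    [0, 1, 0],
    [0, 0, 1],
    [0, 0, 0],
]

states = [1, 0, 0]               # unit 0 starts firing
states = step(states, weights)   # the signal reaches unit 1
states = step(states, weights)   # the signal reaches unit 2
print(states)                    # -> [0, 0, 1]
```

The functionalist reads this substrate-indifference as showing that a nation-sized realization would have a mind; Block's point is precisely that this conclusion seems unintuitive when the units are people.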

Consciousness

The China brain thought experiment suggests that consciousness is a problem for functionalism. Block's Chinese nation presents a version of what is known as the absent qualia objection to functionalism: it purports to show that something could be functionally equivalent to a human being and yet have no conscious experience. A creature that functions like a human being but feels nothing is known as a "philosophical zombie". The absent qualia objection to functionalism could therefore also be called the "zombie objection".

Criticisms

Some philosophers, like Daniel Dennett, have concluded that the China brain does create a mental state. [6] Functionalist philosophers of mind endorse the idea that something like the China brain can realise a mind, and that neurons are not, in principle, the only material that can create a mental state. [9]


References

  1. Dneprov, Anatoly (1961). "The Game" (PDF). Knowledge–Power (in Russian). 1961 (5): 39–41.
  2. Vadim Vasiliev, Dmitry Volkov, Robert Howell (15 June 2018). "A Russian Chinese Room story antedating Searle's 1980 discussion". hardproblem.ru. Moscow Center for Consciousness Studies. Retrieved 13 July 2021.
  3. Dneprov, Anatoly (1985). "The Game (1961)". The Clay God. Stories and Short Stories. Series "Galaxy" (in Bulgarian). Vol. 66. Varna: Georgi Bakalov.
  4. David Cole (2009). "Section 2.3 The Chinese Nation". The Chinese Room Argument. Stanford Encyclopedia of Philosophy.
  5. Ned Block (1978). "Troubles with functionalism". Minnesota Studies in the Philosophy of Science. 9: 261–325. Archived from the original on 2011-09-27. Retrieved 2011-06-23.
  6. Daniel Dennett (1991). "Chapter 14. Consciousness Imagined". Consciousness Explained. Back Bay Books. pp. 431–455.
  7. Georgiev, Danko D. (2017-12-06). Quantum Information and Consciousness: A Gentle Introduction (1st ed.). Boca Raton: CRC Press. p. 362. doi:10.1201/9780203732519. ISBN 9781138104488. OCLC 1003273264. Zbl 1390.81001.
  8. John R. Searle (1980). "Minds, brains, and programs" (PDF). Behavioral and Brain Sciences. 3 (3): 417–457. doi:10.1017/S0140525X00005756. S2CID 55303721.
  9. Edward Feser (2006). "The "Chinese nation" argument". Philosophy of Mind: A Beginner's Guide. Oxford: Oneworld. pp. 89–93.