Susan Schneider

ISBN 9780470674079 [7]
  • Science Fiction and Philosophy, Oxford: Wiley-Blackwell, 2009. ISBN 9781118922613 [7]
  • The Language of Thought: A New Philosophical Direction, MIT Press, 2011. ISBN 9780262527453 [7]
  • Artificial You: AI and the Future of Your Mind, Princeton University Press, 2019. ISBN 9780691180144 [19]
  • Amy Kind, The Philosophy of Mind, "Machine Minds" (pp. 106–107)
    Related Research Articles

    Cognitive science: Interdisciplinary scientific study of cognitive processes

    Cognitive science is the interdisciplinary, scientific study of the mind and its processes with input from linguistics, psychology, neuroscience, philosophy, computer science/artificial intelligence, and anthropology. It examines the nature, the tasks, and the functions of cognition. Cognitive scientists study intelligence and behavior, with a focus on how nervous systems represent, process, and transform information. Mental faculties of concern to cognitive scientists include language, perception, memory, attention, reasoning, and emotion; to understand these faculties, cognitive scientists borrow from fields such as linguistics, psychology, artificial intelligence, philosophy, neuroscience, and anthropology. The typical analysis of cognitive science spans many levels of organization, from learning and decision-making to logic and planning; from neural circuitry to modular brain organization. One of the fundamental concepts of cognitive science is that "thinking can best be understood in terms of representational structures in the mind and computational procedures that operate on those structures."

    Consciousness: Awareness of internal and external existence

    Consciousness, at its simplest, is awareness of internal and external existence. However, its nature has led to millennia of analyses, explanations and debate by philosophers, theologians, and scientists. Opinions differ about what exactly needs to be studied or even considered consciousness. In some explanations, it is synonymous with the mind, and at other times, an aspect of mind. In the past, it was one's "inner life", the world of introspection, of private thought, imagination and volition. Today, it often includes any kind of cognition, experience, feeling or perception. It may be awareness, awareness of awareness, or self-awareness, either continuously changing or not. The disparate range of research, notions and speculations raises a curiosity about whether the right questions are being asked.

    The Chinese room argument holds that a digital computer executing a program cannot have a "mind", "understanding", or "consciousness", regardless of how intelligently or human-like the program may make the computer behave. The argument was presented by philosopher John Searle in his paper "Minds, Brains, and Programs", published in Behavioral and Brain Sciences in 1980. Similar arguments were presented by Gottfried Leibniz (1714), Anatoly Dneprov (1961), Lawrence Davis (1974) and Ned Block (1978). Searle's version has been widely discussed in the years since. The centerpiece of Searle's argument is a thought experiment known as the Chinese room.

    Mind: Faculties responsible for mental phenomena

    The mind is that which thinks, imagines, remembers, wills, and senses, or is the set of faculties responsible for such phenomena. The mind is also associated with experiencing perception, pleasure and pain, belief, desire, intention, and emotion. The mind can include conscious and non-conscious states as well as sensory and non-sensory experiences.

    Mind uploading: Hypothetical process of digitally emulating a brain

    Mind uploading is a speculative process of whole brain emulation in which a brain scan is used to completely emulate the mental state of the individual in a digital computer. The computer would then run a simulation of the brain's information processing, such that it would respond in essentially the same way as the original brain and experience having a sentient conscious mind.
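
    As a rough illustration of that description (not an actual emulation method; the toy network, random "scan", and update rule below are assumptions made purely for exposition), a scan would capture a network's connectivity and state, and a simulator would then step that state forward on a computer:

```python
# Illustrative sketch only: a fake "scan" and a toy network update rule.
import numpy as np

rng = np.random.default_rng(0)
N = 100  # toy "brain" of 100 units

# Stand-in for a brain scan: recorded connectivity (weights) and current activity.
scanned_weights = rng.normal(scale=0.1, size=(N, N))
scanned_state = rng.random(N)

def step(state: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """Advance the emulated network one time step (a simple rate-model update)."""
    return np.tanh(weights @ state)

# The digital copy evolves from the scanned state, standing in for the idea that
# the emulation would respond as the original network would.
state = scanned_state.copy()
for _ in range(10):
    state = step(state, scanned_weights)
print("activity of the first five units after 10 steps:", state[:5])
```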

    Sentience: Ability to be aware of feelings and sensations

    Sentience is the ability to experience feelings and sensations. The word was first coined by philosophers in the 1630s for the concept of an ability to feel, derived from Latin sentiens (feeling), to distinguish it from the ability to think (reason). In modern Western philosophy, sentience is the ability to experience sensations. In different Asian religions, the word "sentience" has been used to translate a variety of concepts. In science fiction, the word "sentience" is sometimes used interchangeably with "sapience", "self-awareness", or "consciousness".

    Artificial consciousness (AC), also known as machine consciousness (MC), synthetic consciousness or digital consciousness, is the consciousness hypothesized to be possible in artificial intelligence. It is also the corresponding field of study, which draws insights from philosophy of mind, philosophy of artificial intelligence, cognitive science and neuroscience. The same terminology can be used with the term "sentience" instead of "consciousness" when specifically designating phenomenal consciousness.

    In the philosophy of mind and consciousness, the explanatory gap is the difficulty that physicalist philosophies have in explaining how physical properties give rise to the way things feel subjectively when they are experienced. It is a term introduced by philosopher Joseph Levine. In the 1983 paper in which he first used the term, he used as an example the sentence, "Pain is the firing of C fibers", pointing out that while it might be valid in a physiological sense, it does not help us to understand how pain feels.

    "Computing Machinery and Intelligence" is a seminal paper written by Alan Turing on the topic of artificial intelligence. The paper, published in 1950 in Mind, was the first to introduce his concept of what is now known as the Turing test to the general public.

    An artificial general intelligence (AGI) is a hypothetical type of intelligent agent. If realized, an AGI could learn to accomplish any intellectual task that human beings or animals can perform. Alternatively, AGI has been defined as an autonomous system that surpasses human capabilities in the majority of economically valuable tasks. Creating AGI is a primary goal of some artificial intelligence research and of companies such as OpenAI, DeepMind, and Anthropic. AGI is a common topic in science fiction and futures studies.

    A philosophical zombie is a being in a thought experiment in philosophy of mind that is physically identical to a normal person but does not have conscious experience.

    Ned Block: American philosopher

    Ned Joel Block is an American philosopher working in philosophy of mind who has made important contributions to the understanding of consciousness and the philosophy of cognitive science. He has been professor of philosophy and psychology at New York University since 1996.

    An artificial brain is software and hardware with cognitive abilities similar to those of the animal or human brain.

    A physical symbol system takes physical patterns (symbols), combines them into structures (expressions), and manipulates them to produce new expressions.
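
    A minimal sketch of that idea (illustrative names and a single made-up rewrite rule, not drawn from any particular formalism) treats symbols as atomic tokens, expressions as nested structures of symbols, and manipulation as rewriting one expression into another:

```python
# A toy physical symbol system (illustrative only): symbols are atomic tokens,
# expressions are nested structures of symbols, and a process rewrites
# expressions into new expressions.
from typing import Tuple, Union

Symbol = str
Expression = Union[Symbol, Tuple["Expression", ...]]

def render(expr: Expression) -> str:
    """Print an expression in a readable, LISP-like form."""
    if isinstance(expr, str):
        return expr
    return "(" + " ".join(render(e) for e in expr) + ")"

def rewrite(expr: Expression) -> Expression:
    """One illustrative manipulation rule: eliminate double negation, (not (not X)) -> X."""
    if isinstance(expr, tuple):
        if (len(expr) == 2 and expr[0] == "not"
                and isinstance(expr[1], tuple)
                and len(expr[1]) == 2 and expr[1][0] == "not"):
            return rewrite(expr[1][1])
        return tuple(rewrite(e) for e in expr)
    return expr

expression: Expression = ("not", ("not", ("likes", "alice", "logic")))
print(render(expression))           # (not (not (likes alice logic)))
print(render(rewrite(expression)))  # (likes alice logic)
```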

    The philosophy of artificial intelligence is a branch of the philosophy of mind and the philosophy of computer science that explores artificial intelligence and its implications for knowledge and understanding of intelligence, ethics, consciousness, epistemology, and free will. Furthermore, the field is concerned with the creation of artificial animals or artificial people, so the discipline is of considerable interest to philosophers. These factors contributed to the emergence of the philosophy of artificial intelligence.

    The symbol grounding problem is a concept in the fields of artificial intelligence, cognitive science, philosophy of mind, and semantics. It addresses the challenge of connecting symbols, such as words or abstract representations, to the real-world objects or concepts they refer to. In essence, it is about how symbols acquire meaning in a way that is tied to the physical world. It is concerned with how it is that words get their meanings, and hence is closely related to the problem of what meaning itself really is. The problem of meaning is in turn related to the problem of how it is that mental states are meaningful, and hence to the problem of consciousness: what is the connection between certain physical systems and the contents of subjective experiences.

    In philosophy of mind, the computational theory of mind (CTM), also known as computationalism, is a family of views that hold that the human mind is an information processing system and that cognition and consciousness together are a form of computation. Warren McCulloch and Walter Pitts (1943) were the first to suggest that neural activity is computational. They argued that neural computations explain cognition. The theory was proposed in its modern form by Hilary Putnam in 1967, and developed by his PhD student, philosopher, and cognitive scientist Jerry Fodor in the 1960s, 1970s, and 1980s. It was vigorously disputed in analytic philosophy in the 1990s due to work by Putnam himself, John Searle, and others.
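
    To make the McCulloch and Pitts idea concrete, a threshold unit can be sketched as follows (an illustration of threshold logic, not their original formalism); with suitable weights and thresholds, such units compute Boolean functions, which is the sense in which neural activity can be described as computational:

```python
# A McCulloch-Pitts style threshold unit (illustrative sketch):
# the unit fires (outputs 1) exactly when the weighted sum of its binary
# inputs reaches a threshold.
def mp_neuron(inputs, weights, threshold):
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Suitable weights and thresholds make the unit compute Boolean functions.
AND = lambda a, b: mp_neuron([a, b], [1, 1], threshold=2)
OR  = lambda a, b: mp_neuron([a, b], [1, 1], threshold=1)
NOT = lambda a:    mp_neuron([a], [-1], threshold=0)

assert AND(1, 1) == 1 and AND(1, 0) == 0
assert OR(0, 1) == 1 and OR(0, 0) == 0
assert NOT(0) == 1 and NOT(1) == 0
```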

    Philosophy of mind is a branch of philosophy that studies the ontology and nature of the mind and its relationship with the body. The mind–body problem is a paradigmatic issue in philosophy of mind, although a number of other issues are addressed, such as the hard problem of consciousness and the nature of particular mental states. Aspects of the mind that are studied include mental events, mental functions, mental properties, consciousness and its neural correlates, the ontology of the mind, the nature of cognition and of thought, and the relationship of the mind to the body.

    Aaron Sloman

    Aaron Sloman is a philosopher and researcher on artificial intelligence and cognitive science. He held the Chair in Artificial Intelligence and Cognitive Science at the School of Computer Science at the University of Birmingham, and before that a chair with the same title at the University of Sussex. Since retiring he has been Honorary Professor of Artificial Intelligence and Cognitive Science at Birmingham. He has published widely on philosophy of mathematics, epistemology, cognitive science, and artificial intelligence; he has also collaborated widely, e.g. with biologist Jackie Chappell on the evolution of intelligence.

    Turing test: Test of a machine's ability to imitate human intelligence

    The Turing test, originally called the imitation game by Alan Turing in 1950, is a test of a machine's ability to exhibit intelligent behaviour equivalent to, or indistinguishable from, that of a human. Turing proposed that a human evaluator would judge natural language conversations between a human and a machine designed to generate human-like responses. The evaluator would be aware that one of the two partners in conversation was a machine, and all participants would be separated from one another. The conversation would be limited to a text-only channel, such as a computer keyboard and screen, so the result would not depend on the machine's ability to render words as speech. If the evaluator could not reliably tell the machine from the human, the machine would be said to have passed the test. The test results would not depend on the machine's ability to give correct answers to questions, only on how closely its answers resembled those a human would give. Since the Turing test is a test of indistinguishability in performance capacity, the verbal version generalizes naturally to all of human performance capacity, verbal as well as nonverbal (robotic).
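
    The text-only protocol can be sketched in a few lines (the respondent functions and canned reply below are placeholders assumed for illustration, not an actual conversational system): the evaluator exchanges typed messages with two hidden parties and then guesses which is the machine.

```python
# Illustrative sketch of the imitation game's text-only protocol.
import random

def machine_respondent(prompt: str) -> str:
    # Stand-in for a conversational program; a real candidate would generate human-like replies.
    return "That is a thoughtful question; let me consider it."

def human_respondent(prompt: str) -> str:
    # A hidden person types a reply over the text-only channel.
    return input(f"[hidden human, asked: {prompt!r}] > ")

def imitation_game(questions: list[str]) -> bool:
    """Run the text-only protocol: the evaluator questions respondents A and B,
    then guesses which one is the machine. Returns True if the machine goes undetected."""
    assignment = {"A": human_respondent, "B": machine_respondent}
    if random.random() < 0.5:  # hide which label the machine sits behind
        assignment = {"A": machine_respondent, "B": human_respondent}
    for q in questions:
        print(f"Evaluator asks: {q}")
        for label in ("A", "B"):
            print(f"  {label}: {assignment[label](q)}")
    guess = input("Which respondent is the machine (A/B)? ").strip().upper()
    return assignment.get(guess) is not machine_respondent

if __name__ == "__main__":
    imitation_game(["What does a summer morning smell like?"])
```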

    References

    1. "Public Scholars 2018". National Endowment for the Humanities (NEH). Retrieved 2020-08-12.
    2. "Advisory Board: Dr. Susan Schneider". Lifeboat Foundation. Retrieved 31 August 2020.
    3. "SUSAN SCHNEIDER" (PDF). Florida Atlantic University. Retrieved 31 August 2020.
    4. Figdor, Carrie (August 15, 2011). "Susan Schneider, "The Language of Thought: A New Philosophical Direction" (MIT Press, 2011)". New Books Network.
    5. "Susan Schneider, Ph.D." Nour Foundation. Retrieved 31 August 2020.
    6. Salisbury, Jenelle (2018-02-16). "Susan Schneider | AI, Mind and Society ("AIMS") Group". Retrieved 2020-08-12.
    7. "Susan Schneider Cognitive Philosopher". TEDx Cambridge. Retrieved 31 August 2020.
    8. "Susan Schneider". Institute for Advanced Study. 9 December 2019. Retrieved 31 August 2020.
    9. "The Future of the Mind: How AI Technology Could Reshape the Human Mind and Create Alternate Synthetic Minds, A Conversation with Susan Schneider". Edge. January 28, 2019. Retrieved 31 August 2020.
    10. "Baruch S. Blumberg NASA/Library of Congress Chair in Astrobiology, Exploration, and Scientific Innovation". Library of Congress. Retrieved 31 August 2020.
    11. Burks, Polly (July 14, 2020). "FAU Hires Leading Philosopher and Futurist". Florida Atlantic University. Retrieved 31 August 2020.
    12. Weinberg, Justin (2020-07-17). "Schneider from Connecticut to Florida Atlantic". Daily Nous. Retrieved 2020-08-12.
    13. Stone, Maddie (2014-12-19). "The Dominant Life Form in the Cosmos Is Probably Superintelligent Robots". Motherboard. Vice. Retrieved 26 June 2015.
    14. Kastrup, Bernardo. "Idealism and Emergent Spacetime". Science and Nonduality (SAND). Retrieved 1 September 2020.
    15. Dickson, Ben (August 5, 2020). "The complicated world of AI consciousness". TechTalks. Retrieved 1 September 2020.
    16. "Schneider BBC World News America 10 1 19". Retrieved 2019-10-23.
    17. Turello, Dan (October 2019). "Will AI Become Conscious? A Conversation with Susan Schneider October 1, 2019". Insights: Scholarly Work at the John W. Kluge Center. Retrieved 1 September 2020.
    18. Moring, Mark (2019-10-01). "Your Brain, AI, and the Future". ORBITER. Retrieved 2019-10-23.
    19. McLemee, Scott (October 18, 2019). "Deletable You: Scott McLemee reviews Susan Schneider's Artificial You: AI and the Future of Your Mind". Inside Higher Ed. Retrieved 2020-08-31.
    20. Cocking, Simon (October 10, 2019). "Artificial You, AI And The Future of Your Mind, October, 2019, reviewed". Irish Tech News. Retrieved 2019-10-23.
    21. Goff, P. (2017). "Is it a Problem that Physics is Mathematical?". Journal of Consciousness Studies. 24 (9): 50–58. Retrieved 31 August 2020.
    22. Montero, B. G. (2017). "Should Physicalists Fear Abstracta?". Journal of Consciousness Studies. 24 (9–10): 40–49. Retrieved 2019-10-23.
    23. Vision, G. (2017). "On Physics' Faustian Bargain with Mathematics". Journal of Consciousness Studies. 24 (9–10): 59–71. Retrieved 2019-10-23.
    24. Dick, Steven J. (October 26, 2015). The Impact of Discovering Life Beyond Earth. Cambridge University Press. p. 156.
    25. Rescorla, Michael (May 28, 2019). "The Language of Thought Hypothesis". Stanford Encyclopedia of Philosophy. Retrieved 1 September 2020.
    26. Sprevak, Mark (1 April 2019). "The Language of Thought: A New Philosophical Direction, by Susan Schneider". Mind. 128 (510): 555–564. doi:10.1093/mind/fzy031. hdl:20.500.11820/31cca3b7-f6b6-4388-b76f-6b3a58dbb94d. Retrieved 31 August 2020.
    27. Rupert, Robert D. (2008-03-01). "Frege's puzzle and Frege cases: Defending a quasi-syntactic solution". Cognitive Systems Research. Perspectives on Social Cognition. 9 (1–2): 76–91. doi:10.1016/j.cogsys.2007.07.003. S2CID 15273514.
    28. "Articles, TV Interviews, Lectures and Podcasts". Susan Schneider. Retrieved 1 September 2020.
    29. Schneider, Susan (2019-06-10). "Opinion | Should You Add a Microchip to Your Brain?". The New York Times. ISSN 0362-4331. Retrieved 2019-08-10.
    30. "Susan Schneider - Opinionator - The New York Times". opinionator.blogs.nytimes.com. 3 March 2014. Retrieved 2019-08-10.
    31. Schneider, Susan (13 August 2019). "Merging with AI would be suicide for the human mind". Financial Times. Retrieved 2019-08-14.
    32. Schneider, Susan. "Spacetime Emergence, Panpsychism and the Nature of Consciousness". Scientific American Blog Network. Retrieved 2019-08-10.
    33. "Susan Schneider". Muck Rack. Retrieved 1 September 2020.
    34. Zeeberg, Amos (October 22, 2009). "I Compute, Therefore I Am - Science Not Fiction". Discover Magazine. Retrieved 2015-10-17.
    35. Leser, Eric (2014-12-21). "La forme dominante de vie dans le cosmos est probablement celle de super robots". Slate.fr (in French). Retrieved 2015-10-17.
    36. Schneider, Susan (2019-10-23). "Conscious machines: How will we test artificial intelligence for feeling?". Big Think. Retrieved 2019-10-23.
    37. Little, Cole (2016-06-22). "Would You Have Any Cosmetic Neurology Done?". Nautilus. Retrieved 2019-08-14.
    38. Naff, Clay Farris (27 August 2014). "Can Humanism Survive the Coming Transhumanist Revolution?". thehumanist.com. Retrieved 2015-10-17.
    39. Naff, Clay Farris (27 August 2014). "Mind & Self in the Transhumanist Age". thehumanist.com. Retrieved 2015-10-17.
    40. Smerd, Georgina (May 20, 2018). "Film Review: SuperSapiens. SuperSapiens will leave its audience in contemplation about humankind's future and what sort of world we may be creating". GLAM Adelaide. Retrieved 1 September 2020.
    41. "Supersapiens, the Rise of the Mind". Kurzweil Accelerating Intelligence. July 21, 2017. Retrieved 1 September 2020.
    Susan Lynn Schneider [1]
    Schneider on AI panel, 2019
    Nationality: American
    Occupation: Philosopher
    Academic background
    Education: University of California, Berkeley; Rutgers University
    Doctoral advisor: Jerry Fodor