Echoborg

An echoborg is a person whose words and actions are determined, in whole or in part, by an artificial intelligence (AI). [1]

The term "echoborg" was coined by social psychologists Kevin Corti and Alex Gillespie, whose research at the London School of Economics explored unscripted face-to-face social encounters between research participants and confederates whose words were covertly supplied by rudimentary AIs known as “chat bots" and vocalized via speech shadowing. [2] [3] The idea is derivative of the cyranoid concept that originated with Stanley Milgram. [4] [5] [6]

The "echoborg method" allows researchers to investigate how people behave toward, and make attributions about, an AI (or, more precisely, a human-AI "hybrid") when their psychological state is fully primed for human-human interaction. Other forms of human-AI interaction (e.g., computer-mediated conversation) involve a machine interface, an anthropomorphic analog, or a virtual-reality layer through which a person communicates with an AI, and these forms of mediation fundamentally alter the intersubjective relationship between the human and artificial agents party to an interaction. [7]

The echoborg concept has been explored in performance art as commentary on the increasing ubiquity of AI and its contribution to human culture, as well as on people's dependency on various forms of AI (e.g., GPS navigation systems) for carrying out mundane social tasks. [8] [9]

Related Research Articles

Artificial intelligence (AI), in its broadest sense, is intelligence exhibited by machines, particularly computer systems, as opposed to the natural intelligence of living beings. As a field of research in computer science concerned with automating intelligent behavior, often through machine learning, it develops and studies methods and software that enable machines to perceive their environment and take actions that maximize their chances of achieving defined goals, with the aim of performing tasks typically associated with human intelligence. Such machines may be called AIs.

<span class="mw-page-title-main">ELIZA</span> Early natural language processing computer program

ELIZA is an early natural language processing computer program developed from 1964 to 1967 at MIT by Joseph Weizenbaum. Created to explore communication between humans and machines, ELIZA simulated conversation using a pattern-matching and substitution methodology that gave users an illusion of understanding on the part of the program, though it had no representation of the conversation that amounted to genuinely understanding what was being said by either party. Whereas the ELIZA program itself was originally written in MAD-SLIP, the pattern-matching directives that contained most of its language capability were provided in separate "scripts", represented in a Lisp-like notation. The most famous script, DOCTOR, simulated a psychotherapist of the Rogerian school and used rules, dictated in the script, to respond to user inputs with non-directional questions. As such, ELIZA was one of the first chatterbots and one of the first programs capable of attempting the Turing test.
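
The following toy sketch illustrates the pattern-matching-and-substitution style described above. The rules and reflections are invented for illustration; they are not taken from Weizenbaum's MAD-SLIP code or the actual DOCTOR script.

    import re

    # Toy illustration of ELIZA-style pattern matching and substitution.
    # These rules are invented examples, not the DOCTOR script.
    RULES = [
        (re.compile(r"\bI need (.+)", re.I), "Why do you need {0}?"),
        (re.compile(r"\bI am (.+)", re.I), "How long have you been {0}?"),
        (re.compile(r"\bmy (\w+)", re.I), "Tell me more about your {0}."),
    ]
    # First-person fragments are reflected into second person before substitution.
    REFLECTIONS = {"my": "your", "me": "you", "i": "you", "am": "are"}

    def reflect(fragment: str) -> str:
        return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

    def respond(utterance: str) -> str:
        for pattern, template in RULES:
            match = pattern.search(utterance)
            if match:
                return template.format(reflect(match.group(1)))
        return "Please go on."  # non-directional fallback, in the Rogerian spirit

    print(respond("I am worried about my exams"))
    # -> "How long have you been worried about your exams?"

The illusion of understanding comes entirely from matching surface patterns and echoing reflected fragments back to the user; no meaning is represented anywhere in the program.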

<span class="mw-page-title-main">Stanley Milgram</span> American social psychologist

Stanley Milgram was an American social psychologist, best known for his controversial experiments on obedience conducted in the 1960s during his professorship at Yale.

<span class="mw-page-title-main">Chatbot</span> Program that simulates conversation

A chatbot is a software application or web interface that is designed to mimic human conversation through text or voice interactions. Modern chatbots are typically online and use generative artificial intelligence systems that are capable of maintaining a conversation with a user in natural language and simulating the way a human would behave as a conversational partner. Such chatbots often use deep learning and natural language processing, but simpler chatbots have existed for decades.

<span class="mw-page-title-main">Conversation</span> Interactive communication between two or more people

Conversation is interactive communication between two or more people. The development of conversational skills and etiquette is an important part of socialization. The development of conversational skills in a new language is a frequent focus of language teaching and learning. Conversation analysis is a branch of sociology which studies the structure and organization of human interaction, with a more specific focus on conversational interaction.

Artificial human companions may be any kind of hardware or software creation designed to give companionship to a person. These can include digital pets, such as the popular Tamagotchi, or robots, such as the Sony AIBO. Virtual companions can be used as a form of entertainment, or they can be medical or functional, to assist the elderly in maintaining an acceptable standard of life.

<span class="mw-page-title-main">Customer service</span> Provision of service to customers

Customer service is the assistance and advice provided by a company to those people who buy or use its products or services. Each industry requires different levels of customer service, but ultimately the aim of a well-performed service is to increase revenues. The perception of success of customer service interactions depends on employees "who can adjust themselves to the personality of the customer". Customer service is often practiced in a way that reflects the strategies and values of a firm. Good quality customer service is usually measured through customer retention. For some firms, customer service is part of the firm's intangible assets and can differentiate it from others in the industry. One good customer service experience can change the entire perception a customer holds towards the organization.

<span class="mw-page-title-main">NICE Ltd.</span> Surveillence and data analytics company in Israel

NICE is an Israeli technology company specializing in customer relations management software, artificial intelligence, and digital and workforce engagement management. The company serves various industries, such as financial services, telecommunications, healthcare, outsourcers, retail, media, travel, service providers, and utilities.

In philosophy, psychology, sociology, and anthropology, intersubjectivity is the relation or intersection between people's cognitive perspectives.

In artificial intelligence, an embodied agent, also sometimes referred to as an interface agent, is an intelligent agent that interacts with the environment through a physical body within that environment. Agents that are represented graphically with a body, for example a human or a cartoon animal, are also called embodied agents, although they have only virtual, not physical, embodiment. A branch of artificial intelligence focuses on empowering such agents to interact autonomously with human beings and the environment. Mobile robots are one example of physically embodied agents; Ananova and Microsoft Agent are examples of graphically embodied agents. Embodied conversational agents are embodied agents that are capable of engaging in conversation with one another and with humans employing the same verbal and nonverbal means that humans do.

Cyranoids are "people who do not speak thoughts originating in their own central nervous system: Rather, the words they speak originate in the mind of another person who transmits these words to the cyranoid by radio transmission".

<span class="mw-page-title-main">Justine Cassell</span> American linguist, professor and human-computer interaction researcher

Justine M. Cassell is an American professor and researcher interested in human-human conversation, human-computer interaction, and storytelling. Since August 2010, she has been on the faculty of the Carnegie Mellon Human Computer Interaction Institute (HCII) and the Language Technologies Institute, with courtesy appointments in Psychology, and the Center for Neural Bases of Cognition. Cassell has served as the chair of the HCII, as associate vice-provost, and as Associate Dean of Technology Strategy and Impact for the School of Computer Science. She currently divides her time between Carnegie Mellon, where she now holds the Dean's Professorship in Language Technologies, and PRAIRIE, the Paris Institute on Interdisciplinary Research in AI, where she also holds the position of senior researcher at Inria Paris.

<span class="mw-page-title-main">Virtual assistant</span> Software agent

A virtual assistant (VA) is a software agent that can perform a range of tasks or services for a user based on user input such as commands or questions, including verbal ones. Such technologies often incorporate chatbot capabilities to simulate human conversation, such as via online chat, to facilitate interaction with their users. The interaction may be via text, graphical interface, or voice, as some virtual assistants are able to interpret human speech and respond via synthesized voices.

<span class="mw-page-title-main">Turing test</span> Test of a machines ability to imitate human intelligence

The Turing test, originally called the imitation game by Alan Turing in 1950, is a test of a machine's ability to exhibit intelligent behaviour equivalent to, or indistinguishable from, that of a human. Turing proposed that a human evaluator would judge natural language conversations between a human and a machine designed to generate human-like responses. The evaluator would be aware that one of the two partners in conversation was a machine, and all participants would be separated from one another. The conversation would be limited to a text-only channel, such as a computer keyboard and screen, so the result would not depend on the machine's ability to render words as speech. If the evaluator could not reliably tell the machine from the human, the machine would be said to have passed the test. The test results would not depend on the machine's ability to give correct answers to questions, only on how closely its answers resembled those a human would give. Since the Turing test is a test of indistinguishability in performance capacity, the verbal version generalizes naturally to all of human performance capacity, verbal as well as nonverbal (robotic).
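
A minimal sketch of that protocol, under the assumption that one hidden partner relays a human's typed answers while the other calls a placeholder machine responder, might look as follows; both partner functions are illustrative stand-ins rather than any standard implementation.

    import random

    # Minimal sketch of the imitation-game protocol described above. The two
    # hidden partners are placeholders: one relays a human's typed answers,
    # the other stands in for any machine responder.

    def human_partner(question: str) -> str:
        return input(f"[hidden human answers '{question}'] > ")

    def machine_partner(question: str) -> str:
        return "I would rather not say."  # placeholder for any chatbot

    def imitation_game(num_questions: int = 3) -> bool:
        """Returns True if the evaluator fails to identify the machine."""
        responders = [("human", human_partner), ("machine", machine_partner)]
        random.shuffle(responders)
        labels = dict(zip("AB", responders))  # evaluator sees only labels A and B
        for _ in range(num_questions):
            question = input("[evaluator asks] > ")
            for label, (_, responder) in labels.items():
                print(f"{label}: {responder(question)}")  # text-only channel
        guess = input("Which partner is the machine, A or B? > ").strip().upper()
        machine_label = next(l for l, (kind, _) in labels.items() if kind == "machine")
        return guess != machine_label  # "passes" only if it is not identified

    if __name__ == "__main__":
        passed = imitation_game()
        print("Machine passed this round." if passed else "Machine was identified.")

The evaluator sees only the labels A and B over a text-only exchange, mirroring the separation and text-only channel described above; in this toy version the machine "passes" a round simply by not being identified.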

Virtual intelligence (VI) is the term given to artificial intelligence that exists within a virtual world. Many virtual worlds have options for persistent avatars that provide information, training, role-playing, and social interactions.

<span class="mw-page-title-main">Eric Horvitz</span> American computer scientist, and Technical Fellow at Microsoft

Eric Joel Horvitz is an American computer scientist and Technical Fellow at Microsoft, where he serves as the company's first Chief Scientific Officer. He was previously the director of Microsoft Research Labs, including research centers in Redmond, WA; Cambridge, MA; New York, NY; Montreal, Canada; Cambridge, UK; and Bangalore, India.

Computers are social actors (CASA) is a paradigm which states that humans unthinkingly apply the same social heuristics used for human interactions to computers, because computers call to mind social attributes similar to those of humans.

Emotion recognition is the process of identifying human emotion. People vary widely in their accuracy at recognizing the emotions of others. Use of technology to help people with emotion recognition is a relatively nascent research area. Generally, the technology works best if it uses multiple modalities in context. To date, the most work has been conducted on automating the recognition of facial expressions from video, spoken expressions from audio, written expressions from text, and physiology as measured by wearables.

Anne Harper Anderson OBE FRSE is a former University of Glasgow Vice Principal and Head of the College of Social Sciences, and Gender Champion, specialising in communications, including machine-human interaction. She served on the Engineering and Physical Sciences Research Council (EPSRC), which allocated £800 million per annum for research. She was appointed OBE for services to social science (2002) and elected a Fellow of the Royal Society of Edinburgh (2015).

Alex Gillespie is a professor of Psychological and Behavioral Science at the London School of Economics and a researcher at Oslo New University.

References

  1. Corti, Kevin; Gillespie, Alex (2015). "A truly human interface: interacting face-to-face with someone whose words are determined by a computer program". Frontiers in Psychology. 6: 634. doi:10.3389/fpsyg.2015.00634. PMC 4434916. PMID 26042066.
  2. Corti, Kevin; Gillespie, Alex (2016). "Co-constructing intersubjectivity with artificial conversational agents: People are more likely to initiate repairs of misunderstandings with agents represented as human". Computers in Human Behavior. 58: 431–442. doi:10.1016/j.chb.2015.12.039.
  3. O'Grady, C. (2015). "Human-AI echoborgs make chatbots more real, but still fail Turing test". Ars Technica, 28 May 2015.
  4. Milgram, S. (1984). Cyranoids. In S. Milgram (Ed.), The Individual in a Social World. New York: McGraw-Hill.
  5. Robson, D. (2015). "The people 'possessed' by computers". BBC Future, 20 July 2015.
  6. "How to build an echoborg: PhD researcher Kevin Corti featured on the BBC". LSE Psychological & Behavioural Science. 2015-08-18. Retrieved 2024-03-26.
  7. Corti, Kevin; Gillespie, Alex (2016). "Co-constructing intersubjectivity with artificial conversational agents: People are more likely to initiate repairs of misunderstandings with agents represented as human". Computers in Human Behavior. 58: 431–442. doi:10.1016/j.chb.2015.12.039.
  8. Lander, Rik; Hall, Phil D. I am Echoborg. Retrieved 16 February 2021.
  9. Copestake, J.; Gillespie, A.; Corti, K. (2016). How artificial intelligence will change humanity [performance art]. Archived 2016-11-26 at the Wayback Machine. BBC Future World Changing Ideas Summit 2016, Sydney, Australia.