Feelix Growing is a research project, started on December 1, 2006, [1] that is working to design robots that can detect and respond to human emotional cues. The project involves six countries and 25 roboticists, developmental psychologists and neuroscientists. [2]
The aim of the project is to build robots that "learn from humans and respond in a socially and emotionally appropriate manner". [3] The robots are designed to respond to emotional cues from humans and use them to adapt their own behavior. The project designers wanted to facilitate the integration of robots into human society so that they could more easily provide services. The project aims to create robots that can "recognize" a given emotion, such as anger or fear, in a human and adapt their behavior to the most appropriate response after repeated interactions. [4] Thus the project emphasizes development over time.
Robots are expected to be able to read emotions by picking up on physical cues like movement of body and facial muscles, posture, speed of movement, eyebrow position, [4] and distance between the human and the robot. [2] Project participants want to design the robots to detect those emotional cues that are universal to people, rather than those specific to individuals and cultures. [3]
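The cue-to-emotion mapping described above can be sketched as a toy rule-based classifier. This is purely illustrative and not the project's software; the cue names, thresholds, and emotion labels are hypothetical.

```python
# Illustrative sketch: mapping hand-picked physical cues (eyebrow position,
# movement speed, human-robot distance) to a coarse emotion label.
from dataclasses import dataclass

@dataclass
class Cues:
    eyebrow_raise: float   # 0.0 (lowered) .. 1.0 (fully raised)
    movement_speed: float  # gross body movement, metres/second
    distance_m: float      # distance between the human and the robot

def classify(cues: Cues) -> str:
    """Very coarse, hypothetical thresholds for demonstration only."""
    if cues.distance_m < 0.5 and cues.movement_speed > 1.5:
        return "anger"     # fast approach into personal space
    if cues.eyebrow_raise > 0.7 and cues.movement_speed > 1.0:
        return "fear"      # raised brows plus rapid movement
    if cues.eyebrow_raise > 0.7:
        return "surprise"
    return "neutral"

print(classify(Cues(eyebrow_raise=0.9, movement_speed=0.2, distance_m=2.0)))  # surprise
```

A real system would replace these fixed thresholds with models learned from repeated interactions, as the project's emphasis on development over time suggests.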
The robots are made not only to detect emotions in people but also to have their own. According to Dr. Lola Cañamero, who is running the project, "Emotions foster adaptation to environment, so robots would be better at learning things. For example, anything that damages the body would be painful, so a robot would learn not to do it again." [4] Cañamero says that the robots will be given the equivalent of a system of pleasure and pain. [2]
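Cañamero's example, learning to avoid actions that cause "pain", is essentially reward-driven value learning. The following is a minimal sketch under assumed conditions (a scalar pleasure/pain signal and two made-up actions), not the project's actual mechanism.

```python
# Minimal sketch: a scalar "pleasure/pain" signal shapes action values,
# so the agent learns to avoid the action that damages it.
import random

values = {"touch_hot_surface": 0.0, "approach_human": 0.0}

def feedback(action: str) -> float:
    # Hypothetical environment: damage is painful (negative signal).
    return -1.0 if action == "touch_hot_surface" else +0.5

alpha = 0.5  # learning rate
for _ in range(20):
    # Mostly pick the currently best-valued action; sometimes explore.
    if random.random() > 0.3:
        action = max(values, key=values.get)
    else:
        action = random.choice(list(values))
    r = feedback(action)
    values[action] += alpha * (r - values[action])

# The painful action ends up valued lower than the pleasant one.
print(values["touch_hot_surface"] < values["approach_human"])  # True
```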
The robots will have artificial neural networks. Rather than building complex hardware, the project coordinators plan to focus on designing software and to use mostly "off the shelf" hardware that is already available. The only parts they plan to build themselves are heads with artificial faces capable of forming facial expressions. [3]
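The simplest artificial neural network is a single neuron trained with the perceptron rule. The sketch below, with made-up cue features and labels, shows the idea of learning a mapping from physical cues to a judgment; it is an illustration, not the project's software.

```python
# Toy sketch: a single artificial neuron (perceptron) learning to separate
# two cue patterns. Feature vector: [movement_speed, proximity] (hypothetical).
def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Label 1 = "threatening", 0 = "calm" (made-up training data).
data = [([0.1, 0.2], 0), ([0.2, 0.1], 0), ([0.9, 0.8], 1), ([0.8, 0.9], 1)]

w, b = [0.0, 0.0], 0.0
for _ in range(20):                       # a few epochs over the data
    for x, y in data:
        err = y - predict(w, b, x)        # perceptron update rule
        w = [wi + 0.1 * err * xi for wi, xi in zip(w, x)]
        b += 0.1 * err

print(predict(w, b, [0.95, 0.9]))  # 1 (threatening)
print(predict(w, b, [0.1, 0.1]))   # 0 (calm)
```

Real systems stack many such units into multi-layer networks, but the principle of adjusting weights from feedback is the same.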
The 2.5-million-euro scheme is financed by the European Commission [2] (the executive body of the European Union) and is set to last for three years. Project participants hope to have a robot model that can be used in homes and hospitals by the project's scheduled end date.
The name Feelix is derived from the words feel, interact, and express.
An android is a humanoid robot or other artificial being often made from a flesh-like material. Historically, androids existed only in the domain of science fiction and were frequently seen in film and television, but advances in robot technology have allowed the design of functional and realistic humanoid robots.
Affective computing is the study and development of systems and devices that can recognize, interpret, process, and simulate human affects. It is an interdisciplinary field spanning computer science, psychology, and cognitive science. While some core ideas in the field may be traced as far back as early philosophical inquiries into emotion, the more modern branch of computer science originated with Rosalind Picard's 1995 paper on affective computing and her book Affective Computing published by MIT Press. One of the motivations for the research is the ability to give machines emotional intelligence, including the ability to simulate empathy. The machine should interpret the emotional state of humans and adapt its behavior to them, giving an appropriate response to those emotions.
Pathognomy is "a 'semiotik' of the transient features of someone's face or body, be it voluntary or involuntary". Examples range from voluntary acts such as laughter and winking to involuntary ones such as sneezing or coughing. By studying these features or expressions, there is then an attempt to infer the mental state and emotion felt by the individual.
A microexpression is a facial expression that lasts only a brief moment. It is the innate result of a voluntary and an involuntary emotional response occurring simultaneously and conflicting with one another: the amygdala responds to a stimulus the individual experiences, but the individual wishes to conceal the resulting emotion. As a result, the individual very briefly displays their true emotion, followed by a false emotional reaction.
Nonverbal communication is the transmission of messages or signals through a nonverbal platform such as eye contact (oculesics), body language (kinesics), social distance (proxemics), touch (haptics), voice (paralanguage), physical environments/appearance, and use of objects. When communicating, we utilize nonverbal channels as a means to convey different messages or signals, which others can then interpret. The study of nonverbal communication started in 1872 with the publication of The Expression of the Emotions in Man and Animals by Charles Darwin. Darwin began to study nonverbal communication when he noticed the interactions between animals such as lions, tigers, and dogs, and realized they also communicated by gestures and expressions. For the first time, nonverbal communication was studied and its relevance questioned. Today, scholars argue that nonverbal communication can convey more meaning than verbal communication.
Face detection is a computer technology, used in a variety of applications, that identifies human faces in digital images. Face detection also refers to the psychological process by which humans locate and attend to faces in a visual scene.
Developmental robotics (DevRob), sometimes called epigenetic robotics, is a scientific field which aims at studying the developmental mechanisms, architectures and constraints that allow lifelong and open-ended learning of new skills and new knowledge in embodied machines. As in human children, learning is expected to be cumulative and of progressively increasing complexity, and to result from self-exploration of the world in combination with social interaction. The typical methodological approach consists of starting from theories of human and animal development elaborated in fields such as developmental psychology, neuroscience, developmental and evolutionary biology, and linguistics, then formalizing and implementing them in robots, sometimes exploring extensions or variants of them. Experimenting with these models in robots allows researchers to confront them with reality, and as a consequence, developmental robotics also provides feedback and novel hypotheses on theories of human and animal development.
Rosalind Wright Picard is an American scholar and inventor who is Professor of Media Arts and Sciences at MIT, founder and director of the Affective Computing Research Group at the MIT Media Lab, and co-founder of the startups Affectiva and Empatica.
iCub is a one-metre-tall open-source humanoid robot testbed for research into human cognition and artificial intelligence.
Display rules are a social group or culture's informal norms that distinguish how one should express oneself. They function as a way to maintain the social order of a given culture, creating an expected standard of behaviour to guide people in their interactions. Display rules can help to decrease situational ambiguity, help individuals to be accepted by their social groups, and help groups to increase their group efficacy. They can be described as culturally prescribed rules that people learn early in their lives through interactions and socialization with other people. Members of a social group learn these cultural standards at a young age, and the standards determine when, where, and to what extent one may express certain emotions.
Emotional responsivity is the ability to acknowledge an affective stimulus by exhibiting emotion, a sharp change of emotion according to a person's emotional state. Increased emotional responsivity refers to demonstrating more response to a stimulus; reduced emotional responsivity refers to demonstrating less response to a stimulus. Any response exhibited after exposure to the stimulus, whether appropriate or not, would be considered an emotional response. Although emotional responsivity applies to nonclinical populations, it is more typically associated with individuals with schizophrenia and autism.
Robotics is the interdisciplinary study and practice of the design, construction, operation, and use of robots.
Non-verbal leakage is a form of non-verbal behavior that occurs when a person verbalizes one thing but their body language indicates another; common forms include facial movements and hand-to-face gestures. The term "non-verbal leakage" first appeared in the literature in 1968, prompting many subsequent studies on the topic throughout the 1970s, with related research continuing today.
Affective haptics is an area of research which focuses on the study and design of devices and systems that can elicit, enhance, or influence the emotional state of a human by means of the sense of touch. The field originated with the Dzmitry Tsetserukou and Alena Neviarouskaya papers on affective haptics and a real-time communication system with rich emotional and haptic channels. Motivated by the goal of enhancing social interactivity and the emotionally immersive experience of users of real-time messaging and virtual and augmented realities, they proposed reinforcing (intensifying) one's own feelings and reproducing (simulating) the emotions felt by a partner. Four basic haptic (tactile) channels governing our emotions can be distinguished.
Social cues are verbal or non-verbal signals expressed through the face, body, voice, and motion that guide conversations and other social interactions by influencing our impressions of and responses to others. These percepts are important communicative tools as they convey important social and contextual information and therefore facilitate social understanding.
Emotions in virtual communication are expressed and understood in a variety of different ways from those in face-to-face interactions. Virtual communication continues to evolve as technological advances emerge that give way to new possibilities in computer-mediated communication (CMC). The lack of typical auditory and visual cues associated with human emotion gives rise to alternative forms of emotional expression that are cohesive with many different virtual environments. Some environments provide only space for text based communication, where emotions can only be expressed using words. More newly developed forms of expression provide users the opportunity to portray their emotions using images.
Emotion perception refers to the capacities and abilities of recognizing and identifying emotions in others, in addition to the biological and physiological processes involved. Emotions are typically viewed as having three components: subjective experience, physical changes, and cognitive appraisal; emotion perception is the ability to make accurate judgments about another's subjective experience by interpreting their physical changes through sensory systems responsible for converting these observed changes into mental representations. The ability to perceive emotion is believed to be both innate and subject to environmental influence, and is also a critical component of social interactions. How emotion is experienced and interpreted depends on how it is perceived; likewise, how emotion is perceived depends on past experiences and interpretations. Emotions can be perceived visually, audibly, through smell, and through bodily sensations, and this process is believed to differ from the perception of non-emotional material.
Affectiva is a software development company that built artificial intelligence software. In 2021, the company was acquired by SmartEye. The company claimed its AI understood human emotions, cognitive states, activities and the objects people use, by analyzing facial and vocal expressions. An offshoot of the MIT Media Lab, Affectiva created a new technological category of Artificial Emotional Intelligence, namely, Emotion AI.
Artificial empathy or computational empathy is the development of AI systems—such as companion robots or virtual agents—that can detect emotions and respond to them in an empathic way.
A companion robot is a robot designed to provide real or apparent companionship for human beings. Target markets for companion robots include the elderly and single children. Companion robots are expected to communicate with non-experts in a natural and intuitive way. They offer a variety of functions, such as monitoring the home remotely, communicating with people, or waking people up in the morning. Their aim is to perform a wide array of tasks including educational functions, home security, diary duties, entertainment, and message delivery services.