Affective haptics

Affective haptics is an area of research which focuses on the study and design of devices and systems that can elicit, enhance, or influence the emotional state of a human by means of the sense of touch. The field originated with papers by Dzmitry Tsetserukou and Alena Neviarouskaya [1] [2] on affective haptics and a real-time communication system with rich emotional and haptic channels. Motivated by the goal of enhancing the social interactivity and emotional immersion of users of real-time messaging and of virtual and augmented realities, they proposed the idea of reinforcing (intensifying) one's own feelings and reproducing (simulating) the emotions felt by a communication partner. Four basic haptic (tactile) channels governing our emotions can be distinguished (a brief illustrative sketch follows the list):

  1. physiological changes (e.g., heart beat rate, body temperature, etc.)
  2. physical stimulation (e.g., tickling)
  3. social touch (e.g., hug, handshake)
  4. emotional haptic design (e.g., shape of device, material, texture).
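
The taxonomy above can be captured in a small data structure. The following Python sketch is purely illustrative; the enum and its example strings are not drawn from any published implementation.

```python
# Purely illustrative: the four affective haptic channels as an enum,
# using the examples listed above. Not a published API.
from enum import Enum

class HapticChannel(Enum):
    PHYSIOLOGICAL_CHANGES = "heart beat rate, body temperature"
    PHYSICAL_STIMULATION = "tickling"
    SOCIAL_TOUCH = "hug, handshake"
    EMOTIONAL_HAPTIC_DESIGN = "shape of device, material, texture"

for channel in HapticChannel:
    print(channel.name.replace("_", " ").capitalize(), "- e.g.,", channel.value)
```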

Emotion theories

Figure: Affective haptic devices worn on a human body

According to the James–Lange theory, [3] the conscious experience of emotion occurs after the cortex receives signals about changes in physiological state. Researchers argued that feelings are preceded by certain physiological changes. Thus, when we see a venomous snake, we feel fear because our cortex has received signals about our racing heart, knocking knees, etc. Damasio [4] distinguishes primary and secondary emotions. Both involve changes in bodily states, but secondary emotions are evoked by thoughts. Recent empirical studies support non-cognitive theories of the nature of emotions. It has been shown that emotions can be evoked by something as simple as a change of facial expression (e.g., a smile brings on a feeling of happiness). [5]

Sense of touch in affective haptics

Human emotions can be readily evoked by different cues, and the sense of touch is one of the most emotionally charged channels. Affective haptic devices stimulate touch through both kinesthetic and cutaneous channels. Kinesthetic stimulation, produced by forces exerted on the body, is sensed by mechanoreceptors in the tendons and muscles. Mechanoreceptors in the skin layers, by contrast, are responsible for the perception of cutaneous stimulation. Different types of tactile corpuscles allow us to sense the thermal properties of an object, pressure, vibration frequency, and stimulus location. [6]

Technologies of affective haptics

Social touch

Figure: HaptiHug, a haptic display for communication of a hug over a distance
Figure: Structure of HaptiHug

Online interactions rely heavily on vision and hearing, so there is a substantial need for mediated social touch. [7] Of the forms of physical contact, hugging is particularly emotionally charged; it conveys warmth, love, and affiliation. Researchers have recently made several attempts to create a hugging device that provides some sense of physical co-presence over a distance. [8] [9] The key feature of HaptiHug [10] is that it physically reproduces the human hug pattern, generating pressure simultaneously on each user's chest and back. Realistic reproduction of hugging is achieved by integrating the active haptic device HaptiHug with pseudo-haptic touch simulated by a hugging animation, yielding a high level of immersion in the partners' physical contact while hugging.
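
A simultaneous squeeze on chest and back can be approximated by ramping actuator pressure in step with the hugging animation. The sketch below is a minimal illustration; the HugBelt class and set_tension method are assumptions, not HaptiHug's actual interface.

```python
# Minimal illustration: ramp a wearable belt's tension up and down so the
# squeeze coincides with the hugging animation. The `HugBelt` class and
# `set_tension` method are assumptions, not HaptiHug's actual interface.
import time

class HugBelt:
    """Stand-in for an actuator that presses simultaneously on chest and back."""
    def set_tension(self, level: float) -> None:
        # 0.0 = released, 1.0 = maximum comfortable pressure
        print(f"belt tension -> {level:.2f}")

def play_hug(belt: HugBelt, duration_s: float = 2.0, peak: float = 0.8) -> None:
    """Triangular pressure envelope: rise to `peak`, then release."""
    steps = 20
    for i in range(steps + 1):
        phase = i / steps
        belt.set_tension(peak * (1.0 - abs(2.0 * phase - 1.0)))
        time.sleep(duration_s / steps)

play_hug(HugBelt())
```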

Intimate touch

Affection goes further than hugging alone. People in long-distance relationships face a lack of physical intimacy on a day-to-day basis. Haptic technology allows for kinesthetic and tactile interface design, and the field of digital co-presence also encompasses teledildonics. The Kiiroo SVir is an example of an adult cybertoy that combines a touch-capacitive surface for tactile input with a kinesthetic interior: twelve rings contract, pulse, and vibrate according to the movements one's partner makes in real time. The SVir mimics the actual motion of its OPue counterpart and is also compatible with other SVir models, enabling women to have intercourse with their partner through the OPue interactive vibrator no matter how great the distance that separates the couple.
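
The real-time mapping from a partner's motion to ring actuation can be pictured as follows. This is a hypothetical sketch; the function and the twelve-ring drive levels are illustrative only and do not reflect the actual Kiiroo SVir/OPue API or protocol.

```python
# Hypothetical illustration only: mapping a remote partner's stroke position
# (a value in [0, 1] streamed in real time) onto 12 contracting rings.
NUM_RINGS = 12

def ring_levels(position: float, width: float = 3.0) -> list[float]:
    """Return per-ring drive levels, with a small active band centred on `position`."""
    centre = position * (NUM_RINGS - 1)
    return [round(max(0.0, 1.0 - abs(ring - centre) / width), 2)
            for ring in range(NUM_RINGS)]

# Three samples of the partner's position arriving over the network.
for pos in (0.1, 0.5, 0.9):
    print(ring_levels(pos))
```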

Implicit emotion elicitation

Different types of devices can be used to produce physiological changes. Among the bodily organs, the heart plays a particularly important role in our emotional experience. The heart imitator HaptiHeart [2] produces specific heartbeat patterns according to the emotion to be conveyed or elicited (sadness is associated with a slightly intense heartbeat, anger with a quick and violent heartbeat, fear with an intense heart rate). False heartbeat feedback can be interpreted as a real heartbeat, so it can change emotional perception. HaptiButterfly [2] reproduces the “butterflies in your stomach” (the fluttery or tickling feeling felt by people experiencing love) through arrays of vibration motors attached to the user's abdomen. HaptiShiver [2] sends “shivers up and down your spine” through a row of vibration motors. HaptiTemper [2] sends “chills up and down your spine” through both cold airflow from a fan and the cold side of a Peltier element. HaptiTemper is also intended to simulate warmth on the human skin in order to evoke either a pleasant feeling or aggression.
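
The emotion-specific heartbeat patterns described above can be thought of as a small lookup table of rate and intensity. The following sketch is illustrative; the numeric values and names are invented and are not taken from HaptiHeart.

```python
# Illustrative mapping from target emotion to a heartbeat-like vibration pattern,
# following the associations described above (sadness: slightly intense beat,
# anger: quick and violent, fear: intense rate). Values are invented placeholders.
from dataclasses import dataclass

@dataclass
class HeartbeatPattern:
    rate_bpm: int        # beats per minute
    intensity: float     # 0.0-1.0 drive level for the actuator

EMOTION_TO_HEARTBEAT = {
    "sadness": HeartbeatPattern(rate_bpm=55, intensity=0.6),
    "anger":   HeartbeatPattern(rate_bpm=110, intensity=1.0),
    "fear":    HeartbeatPattern(rate_bpm=130, intensity=0.8),
}

def beat_times(pattern: HeartbeatPattern, duration_s: float) -> list[float]:
    """Timestamps (in seconds) at which the actuator should pulse."""
    interval = 60.0 / pattern.rate_bpm
    t, times = 0.0, []
    while t < duration_s:
        times.append(round(t, 3))
        t += interval
    return times

print(beat_times(EMOTION_TO_HEARTBEAT["fear"], duration_s=3.0))
```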

Explicit emotion elicitation

HaptiTickler [2] directly evokes joy by tickling the user's ribs. It includes four vibration motors that reproduce stimuli similar to human finger movements.
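
A finger-like tickling stimulus can be approximated by firing short pulses on randomly chosen motors. The sketch below assumes a generic activate(motor, duration) driver and is not the HaptiTickler firmware.

```python
# Sketch of a tickling pattern over four vibration motors, assuming a generic
# `activate(motor_index, duration_s)` driver; not the HaptiTickler firmware.
import random
import time

def tickle(activate, motors=(0, 1, 2, 3), pulses=12, pulse_s=0.08) -> None:
    """Fire short pulses on randomly chosen motors, like drumming fingertips."""
    for _ in range(pulses):
        activate(random.choice(motors), pulse_s)
        time.sleep(pulse_s)

tickle(lambda motor, duration: print(f"motor {motor} on for {duration:.2f} s"))
```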

Affective (emotional) haptic design

Recent findings show that attractive things make people feel good, which in turn makes them think more creatively. [11] On this basis, the concept of emotional haptic design was proposed. [2] The core idea is to make the user feel an affinity for the device through its physical design, for example its shape, material, and texture.

Affective computing

Affective computing can be used to measure and recognize emotional information in systems and devices employing affective haptics. Emotional information is extracted using techniques such as speech recognition, natural language processing, facial expression detection, and the measurement of physiological data.
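
One simple way to combine such multi-modal estimates is to sum per-emotion confidence scores across modalities and pick the maximum. The sketch below is a minimal illustration under that assumption; real affective computing systems use trained recognizers rather than fixed scores.

```python
# Minimal fusion sketch: sum per-emotion confidences across modalities and take
# the maximum. The scores here are placeholders, not real recognizer outputs.
def fuse_emotion(estimates: dict[str, dict[str, float]]) -> str:
    """estimates maps modality name -> {emotion label: confidence}."""
    totals: dict[str, float] = {}
    for scores in estimates.values():
        for emotion, confidence in scores.items():
            totals[emotion] = totals.get(emotion, 0.0) + confidence
    return max(totals, key=totals.get)

print(fuse_emotion({
    "text":       {"joy": 0.7, "anger": 0.1},
    "face":       {"joy": 0.5, "anger": 0.2},
    "physiology": {"joy": 0.2, "anger": 0.4},
}))   # -> "joy"
```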

Potential applications

Figure: The iFeel_IM! architecture

Possible applications range from real-time messaging and virtual or augmented reality to movie viewing (see the examples below).

Affective haptics is in the vanguard of emotional telepresence, [12] technology that lets users feel emotionally as if they were present and communicating at a remote physical location. The remote environment can be real, virtual, or augmented.

Application examples

Figure: The EmoHeart object on the avatars' chest vividly and expressively represents the communicated emotions

The philosophy behind iFeel_IM! (intelligent system for Feeling enhancement powered by affect-sensitive Instant Messenger) is “I feel [therefore] I am!”. The iFeel_IM! system places great importance on the automatic sensing of emotions conveyed through textual messages in the 3D virtual world Second Life, the visualization of the detected emotions by avatars in the virtual environment, the enhancement of the user's affective state, and the reproduction of the feeling of social touch (e.g., a hug) by means of haptic stimulation in the real world. Conversation control is implemented through a Second Life object called EmoHeart [13] attached to the avatar's chest. In addition to communicating with the system for textual affect sensing (the Affect Analysis Model), [14] EmoHeart is responsible for sensing symbolic cues or keywords of the ‘hug’ communicative function conveyed by text, and for visualizing ‘hugging’ in Second Life. The iFeel_IM! system considerably enhances the emotionally immersive experience of real-time messaging.
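
The message flow can be summarized as: analyze the text, then drive the avatar visualization and the haptic devices. The sketch below is a simplified illustration; the cue lists and dispatch calls are assumptions and do not reproduce the Affect Analysis Model or the iFeel_IM! protocol.

```python
# Simplified illustration of the flow: detect a 'hug' cue or an emotion in the
# text, then trigger the avatar visualization and a haptic device. The cue
# lists and dispatch calls are assumptions, not the Affect Analysis Model.
HUG_CUES = {"hug", "hugs", "*hug*"}
EMOTION_CUES = {"joy": {"happy", "great", ":)"}, "sadness": {"sad", ":("}}

def analyze(message: str):
    """Return ('hug', None), ('emotion', label) or ('neutral', None)."""
    words = set(message.lower().split())
    if words & HUG_CUES:
        return ("hug", None)
    for emotion, cues in EMOTION_CUES.items():
        if words & cues:
            return ("emotion", emotion)
    return ("neutral", None)

def dispatch(event, show_emoheart, play_haptics):
    kind, value = event
    if kind == "hug":
        show_emoheart("hug")          # visualize hugging in the virtual world
        play_haptics("HaptiHug")      # reproduce the hug on the body
    elif kind == "emotion":
        show_emoheart(value)          # show the detected emotion on the avatar
        play_haptics("HaptiHeart", value)

dispatch(analyze("so happy to see you, *hug*"), print, print)
```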

In order to build a social interface, Réhman et al. developed two tactile systems (one mobile-phone based and one chair based) that could render human facial expressions for visually impaired persons. [15] [16] These systems conveyed human emotions to visually impaired persons using vibrotactile stimuli.
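
Such vibrotactile rendering amounts to mapping a recognized expression to a vibration pattern. The following sketch illustrates the idea with invented pattern parameters; it is not the interface of the systems by Réhman et al.

```python
# Invented pattern parameters for illustration; not the interface of the
# mobile-phone or chair systems by Réhman et al.
EXPRESSION_PATTERNS = {
    "happy":     {"pulses": 3, "pulse_ms": 80,  "gap_ms": 80},
    "sad":       {"pulses": 1, "pulse_ms": 600, "gap_ms": 0},
    "surprised": {"pulses": 5, "pulse_ms": 40,  "gap_ms": 40},
}

def render_expression(expression: str, vibrate) -> None:
    """Convey a recognized facial expression as a train of vibration pulses."""
    pattern = EXPRESSION_PATTERNS.get(expression)
    if pattern is None:
        return  # unknown expression: stay silent
    for _ in range(pattern["pulses"]):
        vibrate(pattern["pulse_ms"], pattern["gap_ms"])

render_expression("happy", lambda on_ms, off_ms: print(f"vibrate {on_ms} ms, pause {off_ms} ms"))
```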

To deliver movie-specific tactile stimuli to the viewer's body and thereby influence the viewer's emotions, Philips researchers developed a wearable tactile jacket. [17] The motivation was to increase emotional immersion during movie viewing. The jacket contains 64 vibration motors that produce specially designed tactile patterns on the human torso.
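
Driving such a jacket amounts to streaming timed intensity frames to a grid of motors. The sketch below shows a hypothetical "shiver" pattern sweeping down an 8 x 8 grid; the frame format and send_frame callback are assumptions, not the Philips jacket's protocol.

```python
# Illustrative "shiver" sweeping down the torso on an 8x8 motor grid.
# The frame format and `send_frame` callback are assumptions, not the
# protocol of the Philips tactile jacket.
import time

ROWS, COLS = 8, 8

def shiver_frames(intensity: float = 0.7):
    """Yield 8x8 intensity frames with one active row sweeping top to bottom."""
    for active_row in range(ROWS):
        yield [[intensity if row == active_row else 0.0 for _ in range(COLS)]
               for row in range(ROWS)]

def play(send_frame, frame_s: float = 0.1) -> None:
    for frame in shiver_frames():
        send_frame(frame)          # push one actuation frame to the jacket
        time.sleep(frame_s)

# Print the first column of each frame to show the sweep.
play(lambda frame: print([row[0] for row in frame]))
```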

Related Research Articles

Emotion: conscious subjective experience of humans

Emotions are physical and mental states brought on by neurophysiological changes, variously associated with thoughts, feelings, behavioral responses, and a degree of pleasure or displeasure. There is no scientific consensus on a definition. Emotions are often intertwined with mood, temperament, personality, disposition, or creativity.

A tactile illusion is an illusion that affects the sense of touch. Some tactile illusions require active touch, whereas others can be evoked passively. In recent years, a growing interest among perceptual researchers has led to the discovery of new tactile illusions and to the celebration of tactile illusions in the popular science press. Some tactile illusions are analogous to visual and auditory illusions, suggesting that these sensory systems may process information in similar ways; other tactile illusions don't have obvious visual or auditory analogs.

Affective computing is the study and development of systems and devices that can recognize, interpret, process, and simulate human affects. It is an interdisciplinary field spanning computer science, psychology, and cognitive science. While some core ideas in the field may be traced as far back as early philosophical inquiries into emotion, the more modern branch of computer science originated with Rosalind Picard's 1995 paper on affective computing and her book Affective Computing published by MIT Press. One of the motivations for the research is the ability to give machines emotional intelligence, including to simulate empathy. The machine should interpret the emotional state of humans and adapt its behavior to them, giving an appropriate response to those emotions.

Haptic technology is technology that can create an experience of touch by applying forces, vibrations, or motions to the user. These technologies can be used to create virtual objects in a computer simulation, to control virtual objects, and to enhance remote control of machines and devices (telerobotics). Haptic devices may incorporate tactile sensors that measure forces exerted by the user on the interface. The word haptic, from the Greek: ἁπτικός (haptikos), means "tactile, pertaining to the sense of touch". Simple haptic devices are common in the form of game controllers, joysticks, and steering wheels.

Telehaptic is the term for computer generated tactile sensations (haptics) over a network, between physically distant human beings, or between a local user and a remote location, using sensors and effectors. Microcontrollers input information from sensors, and control effectors to create human sensations as outputs.

Haptic communication: communication via touch

Haptic communication is a branch of nonverbal communication that refers to the ways in which people and animals communicate and interact via the sense of touch. Touch is the most sophisticated and intimate of the five senses. Touch or haptics, from the ancient Greek word haptikos, is extremely important for communication; it is vital for survival.

Emotional Design: book by American writer Donald Norman

Emotional Design is both the title of a book by Donald Norman and of the concept it represents.

Sensory substitution is a change of the characteristics of one sensory modality into stimuli of another sensory modality.

Rosalind Picard: American computer scientist

Rosalind Wright Picard is an American scholar and inventor who is Professor of Media Arts and Sciences at MIT, founder and director of the Affective Computing Research Group at the MIT Media Lab, and co-founder of the startups Affectiva and Empatica.

Affective design describes the design of products, services, and user interfaces that aim to evoke intended emotional responses from consumers, ultimately improving customer satisfaction. It is often regarded within the domain of technology interaction and computing, in which emotional information is communicated to the computer from the user in a natural and comfortable way. The computer processes the emotional information and adapts or responds to try to improve the interaction in some way. The notion of affective design emerged from the field of human–computer interaction (HCI), specifically from the developing area of affective computing. Affective design serves an important role in user experience (UX) as it contributes to the improvement of the user's personal condition in relation to the computing system. Decision-making, brand loyalty, and consumer connections have all been associated with the integration of affective design. The goals of affective design focus on providing users with an optimal, proactive experience. Amongst overlap with several fields, applications of affective design include ambient intelligence, human–robot interaction, and video games.

Haptic perception means literally the ability "to grasp something". Perception in this case is achieved through the active exploration of surfaces and objects by a moving subject, as opposed to passive contact by a static subject during tactile perception.

In computing, 3D interaction is a form of human-machine interaction where users are able to move and perform interaction in 3D space. Both human and machine process information where the physical position of elements in the 3D space is relevant.

Somatosensory system: nerve system for sensing touch, temperature, body position, and pain

Touch is perceiving the environment using skin. Specialized receptors in the skin send signals to the brain indicating light and soft pressure, hot and cold, body position and pain. It is a subset of the sensory nervous system, which also includes the visual, auditory, olfactory, gustatory and vestibular senses.

Sonic interaction design is the study and exploitation of sound as one of the principal channels conveying information, meaning, and aesthetic/emotional qualities in interactive contexts. Sonic interaction design is at the intersection of interaction design and sound and music computing. If interaction design is about designing objects people interact with, and such interactions are facilitated by computational means, in sonic interaction design, sound is mediating interaction either as a display of processes or as an input medium.

Tactile sensor

A tactile sensor is a device that measures information arising from physical interaction with its environment. Tactile sensors are generally modeled after the biological sense of cutaneous touch which is capable of detecting stimuli resulting from mechanical stimulation, temperature, and pain. Tactile sensors are used in robotics, computer hardware and security systems. A common application of tactile sensors is in touchscreen devices on mobile phones and computing.

Robotic sensing is a subarea of robotics science intended to provide sensing capabilities to robots. Robotic sensing provides robots with the ability to sense their environments and is typically used as feedback to enable robots to adjust their behavior based on sensed input. Robot sensing includes the ability to see, touch, hear and move and associated algorithms to process and make use of environmental feedback and sensory data. Robot sensing is important in applications such as vehicular automation, robotic prosthetics, and for industrial, medical, entertainment and educational robots.

Sensory branding is a type of marketing that appeals to all the senses in relation to the brand. It uses the senses to relate with customers on an emotional level. It is believed that the difference between an ordinary product and a captivating product is emotion. When emotion flows in the marketplace, your product shines. When there is no emotion from the product, customers lack the enthusiasm and passion that launches a product to success. Brands can forge emotional associations in the customers' minds by appealing to their senses. A multi-sensory brand experience generates certain beliefs, feelings, thoughts and opinions to create a brand image in the consumer's mind.

Emotions in virtual communication are expressed and understood in a variety of different ways from those in face-to-face interactions. Virtual communication continues to evolve as technological advances emerge that give way to new possibilities in computer-mediated communication (CMC). The lack of typical auditory and visual cues associated with human emotion gives rise to alternative forms of emotional expression that are cohesive with many different virtual environments. Some environments provide only space for text based communication, where emotions can only be expressed using words. More newly developed forms of expression provide users the opportunity to portray their emotions using images.

Visuo-haptic mixed reality (VHMR) is a branch of mixed reality that merges visual and tactile perceptions of both virtual and real objects with a collocated approach. The first known system to overlay augmented haptic perceptions on direct views of the real world was the Virtual Fixtures system developed in 1992 at the US Air Force Research Laboratories. Like any emerging technology, the development of VHMR systems is accompanied by challenges, which in this case concern enhancing multi-modal human perception with the user–computer interfaces and interaction devices currently available. VHMR consists of adding to a real scene the ability to see and touch virtual objects. It requires see-through display technology for visually mixing real and virtual objects and haptic devices to provide haptic stimuli to the user while interacting with the virtual objects. A VHMR setup allows the user to perceive visual and kinesthetic stimuli in a co-located manner, i.e., the user can see and touch virtual objects at the same spatial location. This setup overcomes the limits of the traditional arrangement, i.e., a separate display and haptic device, because the visuo-haptic co-location of the user's hand and a virtual tool improves the sensory integration of multimodal cues and makes the interaction more natural; but it also poses technological challenges in improving the naturalness of the perceptual experience.

Emotion recognition is the process of identifying human emotion. People vary widely in their accuracy at recognizing the emotions of others. Use of technology to help people with emotion recognition is a relatively nascent research area. Generally, the technology works best if it uses multiple modalities in context. To date, the most work has been conducted on automating the recognition of facial expressions from video, spoken expressions from audio, written expressions from text, and physiology as measured by wearables.

References

  1. Tsetserukou, Dzmitry; Alena Neviarouskaya; Helmut Prendinger; Naoki Kawakami; Mitsuru Ishizuka; Susumu Tachi (2009). "Enhancing Mediated Interpersonal Communication through Affective Haptics". Intelligent Technologies for Interactive Entertainment. INTETAIN: International Conference on Intelligent Technologies for Interactive Entertainment. Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering. Vol. 9. Amsterdam: Springer. pp. 246–251. CiteSeerX   10.1.1.674.243 . doi:10.1007/978-3-642-02315-6_27. ISBN   978-3-642-02314-9.
  2. 1 2 3 4 5 6 7 Tsetserukou, Dzmitry; Alena Neviarouskaya; Helmut Prendinger; Naoki Kawakami; Susumu Tachi (2009). "Affective Haptics in Emotional Communication" (PDF). 2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops. Amsterdam, the Netherlands: IEEE Press. pp. 181–186. doi:10.1109/ACII.2009.5349516. ISBN   978-1-4244-4800-5.
  3. James, William (1884). "What is an Emotion?" (PDF). Mind. 9 (34): 188–205. doi:10.1093/mind/os-IX.34.188.
  4. Antonio, Damasio (2000). The Feeling of What Happens: Body, Emotion and the Making of Consciousness. Vintage. ISBN   978-0-09-928876-3.
  5. Zajonc, Robert B.; Sheila T. Murphy; Marita Inglehart (1989). "Feeling and Facial Efference: Implication of the Vascular Theory of Emotion" (PDF). Psychological Review. 96 (3): 395–416. doi:10.1037/0033-295X.96.3.395. PMID   2756066. S2CID   8690629. Archived from the original (PDF) on April 16, 2022.
  6. Kandel, Eric R.; James H. Schwartz; Thomas M. Jessell (2000). Principles of Neural Science . McGraw-Hill. ISBN   978-0-8385-7701-1.
  7. Haans, Antal; Wijnand I. Ijsselsteijn (2006). "Mediated Social Touch: a Review of Current Research and Future Directions". Virtual Reality. 9 (2–3): 149–159. doi:10.1007/s10055-005-0014-2. PMID   24807416. S2CID   17809554.
  8. DiSalvo, Carl; F. Gemperle; J. Forlizzi; E. Montgomery (2003). "The Hug: An exploration of robotic form for intimate communication". The 12th IEEE International Workshop on Robot and Human Interactive Communication, 2003. Proceedings. ROMAN 2003. Millbrae: IEEE Press. pp. 403–408. doi:10.1109/ROMAN.2003.1251879. ISBN   0-7803-8136-X. S2CID   380315.
  9. Mueller, Florian; F. Vetere; M.R. Gibbs; J. Kjeldskov; S. Pedell; S. Howard (2005). "Hug over a Distance" (PDF). CHI '05 Extended Abstracts on Human Factors in Computing Systems. Portland, USA: ACM Press. pp. 1673–1676. Archived from the original (PDF) on 2010-09-23.
  10. Tsetserukou, Dzmitry (2009). "HaptiHug: a Novel Haptic Display for Communication of Hug over a Distance". Haptics: Generating and Perceiving Tangible Sensations. EuroHaptics: International Conference on Human Haptic Sensing and Touch Enabled Computer Applications. Lecture Notes in Computer Science. Vol. 6191. Amsterdam: Springer. pp. 340–347. doi:10.1007/978-3-642-14064-8_49. ISBN   978-3-642-14063-1.
  11. Norman, Donald A. (2004). Emotional Design: Why We Love (or Hate) Everyday Things. Basic Books. ISBN   978-0-465-05135-9.
  12. Tsetserukou, Dzmitry; Alena Neviarouskaya (September–October 2010). "iFeel_IM!: augmenting emotions during online communication" (PDF). IEEE Computer Graphics and Applications. 30 (5): 72–80.
  13. Neviarouskaya, Alena; Helmut Prendinger; Mitsuru Ishizuka (2010). "EmoHeart: Conveying Emotions in Second Life Based on Affect Sensing from Text". Advances in Human-Computer Interaction. 2010: 1–13. doi: 10.1155/2010/209801 .
  14. Neviarouskaya, Alena; Helmut Prendinger; Mitsuru Ishizuka (2010). "Recognition of Fine-Grained Emotions from Text: an Approach Based on the Compositionality Principle". In Nishida, T.; Jain, L.; Faucher, C. (eds.). Modelling Machine Emotions for Realizing Intelligence: Foundations and Applications. Smart Innovation, Systems and Technologies. Vol. 1. Springer. pp. 179–207. doi:10.1007/978-3-642-12604-8_9. ISBN   978-3-642-12603-1.
  15. Réhman, Shafiq; Li Liu; Haibo Li (October 2007). Manifold of Facial Expressions for Tactile Perception. IEEE 9th Workshop on Multimedia Signal Processing, 2007 (PDF). pp. 239–242. doi:10.1109/MMSP.2007.4412862. ISBN   978-1-4244-1273-0. S2CID   17661808.
  16. Réhman, Shafiq; Li Liu (July 2008). "Vibrotactile Rendering of Human Emotions on the Manifold of Facial Expressions". Journal of Multimedia (PDF). 3 (3): 18–25. CiteSeerX   10.1.1.408.3683 . doi:10.4304/jmm.3.3.18-25.
  17. Lemmens, Paul; F. Crompvoets; D. Brokken; J.V.D. Eerenbeemd; G.-J.D. Vries (2009). "A Body-Conforming Tactile Jacket to Enrich Movie Viewing". World Haptics 2009 - Third Joint Euro Haptics conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems. in Proc. the Third Joint EuroHaptics Conf. and Symp. on Haptic Interfaces for Virtual Environment and Teleoperator Systems, Salt Lake City, USA. pp. 7–12. doi:10.1109/WHC.2009.4810832. ISBN   978-1-4244-3858-7. S2CID   28364374.