Telehaptics


Telehaptics is the term for computer-generated tactile (tangible or touch) sensations (haptics) delivered over a network, either between physically distant human beings or between a local user and a remote location, using sensors and effectors. Microcontrollers read information from the sensors and drive the effectors to produce touch sensations as output.
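
As a minimal illustration of this sensor-to-effector loop, the sketch below simulates a pressure sensor on one node that streams readings over UDP to a second node, which maps them to a vibration intensity. The read_pressure and set_vibration functions, the address, and the update rate are hypothetical stand-ins for real microcontroller I/O, not taken from any particular system.

```python
# Minimal sketch of a one-way telehaptic link: a simulated pressure sensor on
# the sending side drives a simulated vibration motor on the receiving side.
import socket
import struct
import threading
import time
import random

REMOTE = ("127.0.0.1", 9999)   # address of the remote effector node (assumed)

def read_pressure() -> float:
    """Pretend ADC read of a pressure sensor, normalized to 0.0-1.0."""
    return random.random()

def set_vibration(intensity: float) -> None:
    """Pretend PWM command to a vibration motor (0.0 = off, 1.0 = full)."""
    print(f"vibration duty cycle: {intensity:.2f}")

def sender(iterations: int = 50, rate_hz: float = 50.0) -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for _ in range(iterations):
        sock.sendto(struct.pack("!f", read_pressure()), REMOTE)
        time.sleep(1.0 / rate_hz)          # fixed sampling rate

def receiver(iterations: int = 50) -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(REMOTE)
    for _ in range(iterations):
        data, _ = sock.recvfrom(4)
        (pressure,) = struct.unpack("!f", data)
        set_vibration(pressure)            # map sensed pressure to vibration

if __name__ == "__main__":
    threading.Thread(target=receiver, daemon=True).start()
    sender()
```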


Sensors range from pressure, temperature, and kinesthetic sensing devices to biofeedback equipment. Haptic effectors, which evoke precise, perceivable sensations, range from small motors, fans, heating elements, and vibrators to micro-voltage electrodes that gently stimulate areas of the skin, creating subtle, localized "tingling" electrotactile sensations. Telehaptic interactivity, a form of assistive technology, may involve synesthesia: sensed inputs such as breathing, brain activity, or heartbeats can be presented as gentle, precisely variable bodily sensations in any combination, including warmth, cold, vibration, and pressure. This opens possibilities for levels of awareness and interpersonal communication that were difficult or impossible to attain before telehaptic (and biofeedback) technologies.
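
A synesthetic mapping of this kind can be as simple as a table of scaling rules from biosignals to effector channels. The sketch below is an illustrative example only: the channel names and the linear scalings from heart rate and breathing rate to vibration and warmth are assumptions, not drawn from any particular system.

```python
# Illustrative sketch of a synesthetic mapping from sensed biosignals to
# effector commands. Channel names and scalings are assumed for illustration.

def clamp(x: float) -> float:
    """Limit a value to the 0.0-1.0 range expected by the effectors."""
    return max(0.0, min(1.0, x))

def map_biosignals(heart_rate_bpm: float, breath_rate_bpm: float) -> dict:
    return {
        # Faster heartbeat -> stronger vibration, pulsed at the heart rate
        "vibration_intensity": clamp((heart_rate_bpm - 40.0) / 140.0),
        "vibration_rate_hz": heart_rate_bpm / 60.0,
        # Slower breathing -> gentler, warmer sensation
        "warmth_level": clamp((20.0 - breath_rate_bpm) / 20.0),
    }

print(map_biosignals(heart_rate_bpm=72, breath_rate_bpm=12))
# -> {'vibration_intensity': ~0.23, 'vibration_rate_hz': 1.2, 'warmth_level': 0.4}
```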

One of the challenges in telehaptic applications is the need for stability and for the synchronized execution of multiple tasks so that the system can operate effectively in a real-time environment. [1]
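
The sketch below illustrates, under assumed rates, the kind of multi-rate scheduling this implies: a fast haptic-update loop and a slower network loop run concurrently, and missed deadlines are counted as a simple stability indicator.

```python
# Sketch of multi-rate, synchronized scheduling for a telehaptic node:
# a fast haptic-update loop and a slower network loop run concurrently,
# and missed deadlines are counted. Rates and run time are illustrative.
import asyncio
import time

missed = {"haptic": 0, "network": 0}    # deadline misses per loop

async def periodic(name: str, rate_hz: float, work) -> None:
    period = 1.0 / rate_hz
    next_tick = time.monotonic()
    while True:
        work()                                   # e.g. update forces / exchange state
        next_tick += period
        delay = next_tick - time.monotonic()
        if delay < 0:                            # running late
            missed[name] += 1
            next_tick = time.monotonic()         # resynchronize
        else:
            await asyncio.sleep(delay)

async def main() -> None:
    haptic = asyncio.create_task(periodic("haptic", 500.0, lambda: None))
    network = asyncio.create_task(periodic("network", 50.0, lambda: None))
    await asyncio.sleep(1.0)                     # run both loops briefly
    haptic.cancel()
    network.cancel()
    print("missed deadlines:", missed)

asyncio.run(main())
```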

Applications

Interaction using telehaptic technology offers a new approach to communication, including interpersonal interaction. It can be deployed for telehealth encounters and for telerehabilitation. [2] Because the technology transmits sensed inputs such as brain activity, heartbeats, and breathing, it can support telehealth encounters in which a health professional examines a patient in real time. Telehaptic interactivity is also being integrated into therapy, for example by sampling the position of body parts and providing resistive forces through telehaptic virtual gloves. [2] Once challenges arising from irregularities of the communication network, such as delay, jitter, and packet loss, are addressed, telesurgery is also expected to advance. [3]
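
As a hedged illustration of how resistive forces might be rendered by such a glove, the snippet below applies a simple virtual spring-damper that pushes the sampled finger position back toward a therapist-defined target. The gains, names, and values are illustrative assumptions, not taken from the cited work.

```python
# Minimal sketch of resistive-force rendering for a telehaptic glove using a
# virtual spring-damper model. Gains and values are illustrative.

def resistive_force(position: float, velocity: float, target: float,
                    stiffness: float = 200.0, damping: float = 5.0) -> float:
    """Return the force (N) the glove actuator should apply to resist motion."""
    return -stiffness * (position - target) - damping * velocity

# Example: finger flexed 2 cm past the target, still moving away at 0.1 m/s
print(resistive_force(position=0.02, velocity=0.1, target=0.0))  # -4.5 N
```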

Telehaptic systems are also useful in handling applications. Researchers have used telehaptic interfaces to perform actions such as pushing, as well as more complicated tasks that require dexterity, such as grasping micro-objects. [4] Although the technology has not yet been perfected, scientists are optimistic about its potential, and telehaptic applications could play an important role in the future of microassembly and in biological applications such as handling cells and tissues. [4] Teleoperation, the remote operation of a machine, demonstrates the potential of telehaptic technology for risky activities and tasks such as nuclear waste disposal and wreckage exploration. [5]
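
For micro-object handling, one common approach (not necessarily the one used in the cited work) is scaled bilateral teleoperation: the operator's motion is scaled down for the micro-gripper, and the tiny measured contact forces are scaled up so the operator can feel them. The scale factors below are illustrative.

```python
# Sketch of position/force scaling in micro-scale teleoperation, assuming a
# position-forward, force-feedback scheme. Scale factors are illustrative.

POSITION_SCALE = 1e-3   # 1 mm of hand motion -> 1 micrometre at the tool
FORCE_SCALE = 1e4       # 0.1 mN at the tool -> 1 N at the operator's hand

def master_to_slave(master_position_m: float) -> float:
    """Position command sent to the remote micro-manipulator."""
    return master_position_m * POSITION_SCALE

def slave_to_master(measured_force_n: float) -> float:
    """Force fed back to the operator's haptic device."""
    return measured_force_n * FORCE_SCALE

print(master_to_slave(0.005))   # 5 mm hand motion -> 5 um tool motion
print(slave_to_master(2e-4))    # 0.2 mN contact force -> 2 N felt by operator
```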


Related Research Articles

Haptic technology – Any form of interaction involving touch

Haptic technology, also known as kinaesthetic communication or 3D touch, refers to any technology that can create an experience of touch by applying forces, vibrations, or motions to the user. These technologies can be used to create virtual objects in a computer simulation, to control virtual objects, and to enhance remote control of machines and devices (telerobotics). Haptic devices may incorporate tactile sensors that measure forces exerted by the user on the interface. The word haptic, from the Greek: ἁπτικός (haptikos), means "tactile, pertaining to the sense of touch". Simple haptic devices are common in the form of game controllers, joysticks, and steering wheels.

Telepresence refers to a set of technologies which allow a person to feel as if they were present, or to give the appearance or effect of being present via telerobotics, at a place other than their true location.

Interactivity – Interaction between users and computers

Across the many fields concerned with interactivity, including information science, computer science, human-computer interaction, communication, and industrial design, there is little agreement over the meaning of the term "interactivity", but most definitions are related to interaction between users and computers and other machines through a user interface. Interactivity can however also refer to interaction between people. It nevertheless usually refers to interaction between people and computers – and sometimes to interaction between computers – through software, hardware, and networks.

Touchscreen – Input and output device

A touchscreen or touch screen is the assembly of both an input and output ('display') device. The touch panel is normally layered on the top of an electronic visual display of an information processing system. The display is often an LCD, AMOLED or OLED display while the system is usually a laptop, tablet, or smartphone. A user can give input or control the information processing system through simple or multi-touch gestures by touching the screen with a special stylus or one or more fingers. Some touchscreens use ordinary or specially coated gloves to work while others may only work using a special stylus or pen. The user can use the touchscreen to react to what is displayed and, if the software allows, to control how it is displayed; for example, zooming to increase the text size.

Haptic communication – Communication via touch

Haptic communication is a branch of nonverbal communication that refers to the ways in which people and animals communicate and interact via the sense of touch. Touch is the most sophisticated and intimate of the five senses. Touch, or haptics, from the ancient Greek word haptikos, is extremely important for communication; it is vital for survival.

Gesture recognition – Topic in computer science and language technology

Gesture recognition is a topic in computer science and language technology with the goal of interpreting human gestures via mathematical algorithms. It is a subdiscipline of computer vision. Gestures can originate from any bodily motion or state but commonly originate from the face or hand. Current focuses in the field include emotion recognition from the face and hand gesture recognition. Users can use simple gestures to control or interact with devices without physically touching them. Many approaches have been made using cameras and computer vision algorithms to interpret sign language. However, the identification and recognition of posture, gait, proxemics, and human behaviors is also the subject of gesture recognition techniques. Gesture recognition can be seen as a way for computers to begin to understand human body language, building a better bridge between machines and humans than older text user interfaces or even GUIs, which still limit the majority of input to keyboard and mouse, and allowing people to interact naturally without mechanical devices.

The "sensor web" is a type of sensor network that is especially well suited for environmental monitoring. The phrase is also associated with sensing systems that heavily utilize the World Wide Web. OGC's Sensor Web Enablement (SWE) framework defines a suite of web service interfaces and communication protocols abstracting from the heterogeneity of sensor (network) communication.

Sensory substitution is a change of the characteristics of one sensory modality into stimuli of another sensory modality.

Multimodal interaction provides the user with multiple modes of interacting with a system. A multimodal interface provides several distinct tools for input and output of data.

Actuator Sensor Interface is an industrial networking solution used in PLC, DCS and PC-based automation systems. It is designed for connecting simple field I/O devices in discrete manufacturing and process applications using a single two-conductor cable.

Lighting control system – Intelligent network-based lighting control solution

A lighting control system is an intelligent network-based lighting control solution that incorporates communication between various system inputs and outputs related to lighting control with the use of one or more central computing devices. Lighting control systems are widely used for both indoor and outdoor lighting of commercial, industrial, and residential spaces. Lighting control systems are sometimes referred to under the term smart lighting. Lighting control systems serve to provide the right amount of light where and when it is needed.

Biosignal – Any signal in living beings that can be continually measured and monitored

A biosignal is any signal in living beings that can be continually measured and monitored. The term biosignal is often used to refer to bioelectrical signals, but it may refer to both electrical and non-electrical signals. The usual understanding is to refer only to time-varying signals, although spatial parameter variations are sometimes subsumed as well.

Amigdalae is a biofeedback-based art project by the artist Massimiliano Peretti. The project was first presented in a scientific environment at the CNRS, the French National Center for Scientific Research, in Paris in November 2005. The project explores how emotional reactions filter and distort human perception and observation.

Haptic perception means literally the ability "to grasp something". Perception in this case is achieved through the active exploration of surfaces and objects by a moving subject, as opposed to passive contact by a static subject during tactile perception.

In computing, 3D interaction is a form of human-machine interaction where users are able to move and perform interaction in 3D space. Both human and machine process information where the physical position of elements in the 3D space is relevant.

Somatosensory system – Widely distributed parts of the sensory nervous system

The somatosensory system is the network of neural structures in the brain and body that produce the perception of touch, as well as temperature, body position (proprioception), and pain. It is a subset of the sensory nervous system, which also represents visual, auditory, olfactory, and gustatory stimuli. Somatosensation begins when mechano- and thermosensitive structures in the skin or internal organs sense physical stimuli such as pressure on the skin. Activation of these structures, or receptors, leads to activation of peripheral sensory neurons that convey signals to the spinal cord as patterns of action potentials. Sensory information is then processed locally in the spinal cord to drive reflexes, and is also conveyed to the brain for conscious perception of touch and proprioception. Notably, somatosensory information from the face and head enters the brain through peripheral sensory neurons in the cranial nerves, such as the trigeminal nerve.

Tactile sensor

A tactile sensor is a device that measures information arising from physical interaction with its environment. Tactile sensors are generally modeled after the biological sense of cutaneous touch which is capable of detecting stimuli resulting from mechanical stimulation, temperature, and pain. Tactile sensors are used in robotics, computer hardware and security systems. A common application of tactile sensors is in touchscreen devices on mobile phones and computing.

Affective haptics is the emerging area of research which focuses on the study and design of devices and systems that can elicit, enhance, or influence the emotional state of a human by means of the sense of touch. The research field originated with the papers of Dzmitry Tsetserukou and Alena Neviarouskaya on affective haptics and real-time communication systems with rich emotional and haptic channels. Driven by the motivation to enhance the social interactivity and emotional immersion of users of real-time messaging and virtual and augmented realities, they proposed reinforcing (intensifying) one's own feelings and reproducing (simulating) the emotions felt by one's partner. Four basic haptic (tactile) channels governing our emotions can be distinguished: (1) physiological changes, (2) physical stimulation, (3) social touch, and (4) emotional haptic design.

Force Touch – Force-sensing touch technology developed by Apple Inc.

Force Touch is a haptic technology developed by Apple Inc. that enables trackpads and touchscreens to distinguish between various levels of force being applied to their surfaces. It uses pressure sensors to add another method of input to Apple's devices. The technology was first unveiled on September 9, 2014, during the introduction of the Apple Watch. Starting with the Apple Watch, Force Touch has been incorporated into many products in Apple's lineup, notably MacBooks and the Magic Trackpad 2. The technology is known as 3D Touch on iPhone models. It brings usability enhancements by offering a third dimension of input: accessing shortcuts, previewing details, drawing, and system-wide features let users interact further with the displayed content by applying force to the input surface.

Asynchronous multi-body framework – Robotic simulator

Asynchronous multi-body framework (AMBF) is an open-source, versatile 3D simulator for robots developed in April 2019. This multi-body framework provides real-time dynamic simulation of multi-bodies such as robots, free bodies, and multi-link puzzles, paired with real-time haptic interaction through various input devices. The framework can integrate a real surgeon master console, haptic or not, to control simulated robots in real time, which allows the simulator to be used for real-time training in surgical and non-surgical tasks. It offers the possibility of interacting with soft bodies to simulate surgical tasks in which tissues undergo deformation. It also provides a Python client for interacting easily with the simulated bodies and for training neural networks on real-time data with in-loop simulation. It includes a wide range of robots, grippers, sensors, puzzles, and soft bodies. Each simulated object is represented as an afObject; likewise, the simulation world is represented as an afWorld. Both use two communication interfaces: state and command. Through the state interface an object can send data out of the simulation environment, while the command interface allows commands to be applied to the underlying afObject.
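
The snippet below is not AMBF's actual API; it is a hypothetical miniature of the state/command pattern described above, showing how an afObject-like handle might expose read-only state while accepting commands.

```python
# Hypothetical illustration of a state/command object interface; the class
# and method names are invented for this sketch and do not belong to AMBF.

class SimulatedObject:
    """Stand-in for an afObject-like handle with separate state and command channels."""

    def __init__(self, name: str):
        self.name = name
        self._state = {"position": [0.0, 0.0, 0.0], "joint_angles": [0.0]}

    # --- state interface: data flows out of the simulation ---
    def get_state(self) -> dict:
        return dict(self._state)

    # --- command interface: data flows into the simulation ---
    def command_joint(self, index: int, angle: float) -> None:
        self._state["joint_angles"][index] = angle


gripper = SimulatedObject("gripper")
gripper.command_joint(0, 0.35)   # send a command into the simulation
print(gripper.get_state())       # read the resulting state back out
```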

References

  1. Furht, Borko (2008). Encyclopedia of Multimedia. Berlin: Springer. p. 847. ISBN 9780387747248.
  2. Charness, Neil; Demiris, George; Krupinski, Elizabeth (2011). Designing Telehealth for an Aging Population: A Human Factors Perspective. Boca Raton: CRC Press. p. 46. ISBN 978-1-4398-2529-7.
  3. Prattichizzo, Domenico; Shinoda, Hiroyuki; Tan, Hong; Ruffaldi, Emanuele; Frisoli, Antonio (2018). Haptics: Science, Technology, and Applications: 11th International ..., Part 2. Berlin: Springer. p. 660. ISBN 9783319933986.
  4. Ratchev, Svetan (2010). Precision Assembly Technologies and Systems: 5th IFIP WG 5.5 International Precision Assembly Seminar, IPAS 2010, Chamonix, France, February 14-17, 2010, Proceedings. Berlin: Springer. p. 14. ISBN 9783642115974.
  5. Zin, Thi Thi; Lin, Jerry; Pan, Jeng-Shyang; Tin, Pyke; Yokota, Mitsuhiro (2015). Genetic and Evolutionary Computing. Heidelberg: Springer. p. 92. ISBN 9783319232065.