Visuo-haptic mixed reality

Visuo-haptic mixed reality (VHMR) is a branch of mixed reality that merges visual and tactile perceptions of both virtual and real objects in a collocated manner: the user can see and touch virtual objects at the same spatial location. [2] The first known system to overlay augmented haptic perceptions on direct views of the real world was the Virtual Fixtures system, developed in 1992 at the U.S. Air Force Research Laboratory. [1]

A VHMR setup requires see-through display technology for visually mixing real and virtual objects, and haptic devices to provide haptic stimuli to the user while interacting with the virtual objects. Because the user's hand and the virtual tool are visually and kinesthetically co-located, this setup overcomes the limits of the traditional arrangement of a separate display and haptic device: the co-location improves the sensory integration of multimodal cues and makes the interaction more natural. Like any emerging technology, however, VHMR development is accompanied by challenges, chiefly the effort to enhance multimodal human perception with the user-computer interfaces and interaction devices currently available. [3]
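
Co-location presupposes a calibration step that registers the haptic device's coordinate frame with the visual (camera) frame, so that a point felt by the hand and the same point seen on the display coincide. The following sketch is a minimal, hypothetical example of that registration using the standard Kabsch least-squares method; it assumes corresponding 3D points have already been collected in both frames (e.g., by touching a tracked stylus tip to known targets), and all names are illustrative rather than taken from any VHMR system.

    import numpy as np

    def register_frames(p_haptic, p_camera):
        """Least-squares rigid transform (Kabsch): p_camera ~= R @ p_haptic + t.

        Both inputs are (N, 3) arrays of corresponding points measured in the
        haptic-device frame and the camera frame, respectively.
        """
        c_h = p_haptic.mean(axis=0)                 # centroid, haptic frame
        c_c = p_camera.mean(axis=0)                 # centroid, camera frame
        H = (p_haptic - c_h).T @ (p_camera - c_c)   # 3x3 cross-covariance
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T     # best-fit rotation
        t = c_c - R @ c_h                           # best-fit translation
        return R, t

    # Once calibrated, every stylus position read from the device is mapped into
    # the visual scene (p_scene = R @ p_device + t), so the user sees the virtual
    # tool exactly where they feel it.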

Related Research Articles

<span class="mw-page-title-main">Virtual reality</span> Computer-simulated experience

Virtual reality (VR) is a simulated experience that employs pose tracking and 3D near-eye displays to give the user an immersive feel of a virtual world. Applications of virtual reality include entertainment, education and business. Other distinct types of VR-style technology include augmented reality and mixed reality, sometimes referred to as extended reality or XR, although definitions are currently changing due to the nascence of the industry.

<span class="mw-page-title-main">User interface</span> Means by which a user interacts with and controls a machine

In the industrial design field of human–computer interaction, a user interface (UI) is the space where interactions between humans and machines occur. The goal of this interaction is to allow effective operation and control of the machine from the human end, while the machine simultaneously feeds back information that aids the operators' decision-making process. Examples of this broad concept of user interfaces include the interactive aspects of computer operating systems, hand tools, heavy machinery operator controls and process controls. The design considerations applicable when creating user interfaces are related to, or involve such disciplines as, ergonomics and psychology.

<span class="mw-page-title-main">Augmented reality</span> View of the real world with computer-generated supplementary features

Augmented reality (AR) is an interactive experience that combines the real world and computer-generated content. The content can span multiple sensory modalities, including visual, auditory, haptic, somatosensory and olfactory. AR can be defined as a system that incorporates three basic features: a combination of real and virtual worlds, real-time interaction, and accurate 3D registration of virtual and real objects. The overlaid sensory information can be constructive (adding to the natural environment) or destructive (masking it). This experience is seamlessly interwoven with the physical world such that it is perceived as an immersive aspect of the real environment. In this way, augmented reality alters one's ongoing perception of a real-world environment, whereas virtual reality completely replaces the user's real-world environment with a simulated one.
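
The "accurate 3D registration" feature is, at its core, a camera-projection problem: a virtual object anchored in world coordinates must be drawn through the tracked camera's pose and intrinsics every frame. The sketch below reduces this to projecting a single virtual 3D point with a pinhole-camera model; the intrinsic values are illustrative placeholders, not parameters of any particular AR toolkit.

    import numpy as np

    # Hypothetical pinhole intrinsics: focal lengths fx, fy and principal
    # point (cx, cy), all in pixels.
    K = np.array([[800.0,   0.0, 320.0],
                  [  0.0, 800.0, 240.0],
                  [  0.0,   0.0,   1.0]])

    def project(point_world, R, t):
        """Project a world-space 3D point to pixel coordinates.

        R (3x3) and t (3,) are the camera extrinsics supplied by the tracker,
        mapping world coordinates into the camera frame.
        """
        p_cam = R @ point_world + t        # world frame -> camera frame
        u, v, w = K @ p_cam                # camera frame -> homogeneous pixels
        return np.array([u / w, v / w])    # perspective divide

    # A virtual point anchored 2 m in front of an identity-pose camera lands
    # at the image centre, as expected:
    print(project(np.array([0.0, 0.0, 2.0]), np.eye(3), np.zeros(3)))  # [320. 240.]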

<span class="mw-page-title-main">Haptic technology</span> Any form of interaction involving touch

Haptic technology is technology that can create an experience of touch by applying forces, vibrations, or motions to the user. These technologies can be used to create virtual objects in a computer simulation, to control virtual objects, and to enhance remote control of machines and devices (telerobotics). Haptic devices may incorporate tactile sensors that measure forces exerted by the user on the interface. The word haptic, from the Greek: ἁπτικός (haptikos), means "tactile, pertaining to the sense of touch". Simple haptic devices are common in the form of game controllers, joysticks, and steering wheels.
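
A common way such devices render contact is penalty-based force feedback: inside a fast control loop (typically around 1 kHz), the force commanded to the motors grows with how far the stylus penetrates a virtual surface. The sketch below shows this for a one-dimensional virtual wall modeled as a stiff spring with damping; the gains and wall position are illustrative values, not settings of any real device.

    K_WALL = 1500.0   # spring stiffness, N/m
    B_WALL = 5.0      # damping, N*s/m
    WALL_X = 0.0      # wall surface along the x axis, m (wall occupies x < 0)

    def wall_force(x, v):
        """Force (N) to command, given stylus position x (m) and velocity v (m/s)."""
        penetration = WALL_X - x       # positive once the stylus is inside the wall
        if penetration <= 0.0:
            return 0.0                 # free space: the device renders no force
        # The spring pushes the stylus back out; damping opposes motion and
        # helps keep the stiff contact stable at the servo rate.
        return K_WALL * penetration - B_WALL * v

    # One servo tick: read (x, v) from the encoders, command the force.
    for x, v in [(0.010, 0.00), (-0.002, -0.05)]:
        print(f"x={x:+.3f} m, v={v:+.2f} m/s -> F={wall_force(x, v):.2f} N")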

Telepresence refers to a set of technologies that allow a person to feel as if they were present, or to give the appearance or effect of being present via telerobotics, at a place other than their true location.

<span class="mw-page-title-main">Computer-mediated reality</span> Ability to manipulate ones perception of reality through the use of a computer

Computer-mediated reality refers to the ability to add to, subtract information from, or otherwise manipulate one's perception of reality through the use of a wearable computer or hand-held device such as a smartphone.

<span class="mw-page-title-main">Mixed reality</span> Merging of real and virtual worlds to produce new environments

Mixed reality (MR) is a term used to describe the merging of a real-world environment and a computer-generated one. Physical and virtual objects may co-exist in mixed reality environments and interact in real time.

<span class="mw-page-title-main">Tangible user interface</span>

A tangible user interface (TUI) is a user interface in which a person interacts with digital information through the physical environment. The initial name was Graspable User Interface, which is no longer used. The purpose of TUI development is to empower collaboration, learning, and design by giving physical forms to digital information, thus taking advantage of the human ability to grasp and manipulate physical objects and materials.

Sensory substitution is a change of the characteristics of one sensory modality into stimuli of another sensory modality.

<span class="mw-page-title-main">Enactive interfaces</span>

Enactive interfaces are interactive systems that allow organization and transmission of knowledge obtained through action. Examples are interfaces that couple a human with a machine to do things usually done unaided, such as shaping a three-dimensional object using multiple modality interactions with a database, or using interactive video to allow a student to visually engage with mathematical concepts. Enactive interface design can be approached through the idea of raising awareness of affordances, that is, optimization of the awareness of possible actions available to someone using the enactive interface. This optimization involves visibility, affordance, and feedback.

A virtual fixture is an overlay of augmented sensory information upon a user's perception of a real environment in order to improve human performance in both direct and remotely manipulated tasks. Developed in the early 1990s by Louis Rosenberg at the U.S. Air Force Research Laboratory (AFRL), Virtual Fixtures was a pioneering platform in virtual reality and augmented reality technologies.

<span class="mw-page-title-main">Immersion (virtual reality)</span> Perception of being physically present in a non-physical world

Immersion into virtual reality (VR) is a perception of being physically present in a non-physical world. The perception is created by surrounding the user of the VR system in images, sound or other stimuli that provide an engrossing total environment.

A projection augmented model is an element sometimes employed in virtual reality systems. It consists of a physical three-dimensional model onto which a computer image is projected to create a realistic looking object. Importantly, the physical model is the same geometric shape as the object that the PA model depicts.

Haptic perception means literally the ability "to grasp something". Perception in this case is achieved through the active exploration of surfaces and objects by a moving subject, as opposed to passive contact by a static subject during tactile perception.

In computing, 3D interaction is a form of human-machine interaction in which users are able to move and perform interactions in 3D space. Both the human and the machine process information in which the physical position of elements in 3D space is relevant.

Affective haptics is an emerging area of research that focuses on the study and design of devices and systems that can elicit, enhance, or influence the emotional state of a human by means of the sense of touch. The field originated with the papers of Dzmitry Tsetserukou and Alena Neviarouskaya on affective haptics and real-time communication systems with rich emotional and haptic channels. Driven by the motivation to enhance the social interactivity and emotional immersion of users of real-time messaging and virtual and augmented realities, the idea was proposed of reinforcing (intensifying) one's own feelings and reproducing (simulating) the emotions felt by one's partner. Four basic haptic (tactile) channels governing our emotions can be distinguished:

  1. physiological changes
  2. physical stimulation
  3. social touch
  4. emotional haptic design

Haptic memory is the form of sensory memory specific to touch stimuli. Haptic memory is used regularly when assessing the necessary forces for gripping and interacting with familiar objects. It may also influence one's interactions with novel objects of an apparently similar size and density. Similar to visual iconic memory, traces of haptically acquired information are short-lived and prone to decay after approximately two seconds. Haptic memory is best for stimuli applied to areas of the skin that are more sensitive to touch. Haptics involves at least two subsystems: cutaneous (everything skin-related) and kinesthetic (joint angles and the relative location of body parts). Haptics generally involves active, manual examination and is quite capable of processing the physical traits of objects and surfaces.

<span class="mw-page-title-main">Peripheral head-mounted display</span>

A peripheral head-mounted display (PHMD) is a visual display mounted to the user's head that sits in the periphery of the user's field of view (FOV). The actual position of the mounting is considered irrelevant as long as it does not cover the entire FOV. While a PHMD provides an additional, always-available visual output channel, it does not limit the user in performing real-world tasks.

Ken Hinckley is an American computer scientist and inventor. He is a senior principal research manager at Microsoft Research. He is known for his research in human-computer interaction, specifically on sensing techniques, pen computing, and cross-device interaction.

Vincent Hayward is an engineer specializing in touch and haptics. He is a professor at Sorbonne University's Institute of Intelligent Systems and Robotics (ISIR), where since 2008 he has led a team dedicated to the study of haptic perception and the creation of tactile stimulation devices. In 2020, he was elected to the French Academy of Sciences.

References

  1. Rosenberg, L.B. (1992). "The Use of Virtual Fixtures As Perceptual Overlays to Enhance Operator Performance in Remote Environments". Technical Report AL-TR-0089, USAF Armstrong Laboratory, Wright-Patterson AFB, OH.
  2. Cosco, F.; Garre, C.; Bruno, F.; Muzzupappa, M.; Otaduy, M. A. (January 2013). "Visuo-Haptic Mixed Reality with Unobstructed Tool-Hand Integration" (PDF). IEEE Transactions on Visualization and Computer Graphics. 19 (1): 159–172. doi:10.1109/TVCG.2012.107. PMID 22508901. S2CID 2894269. Retrieved October 21, 2014.
  3. Barbieri, L.; Bruno, F.; Cosco, F.; Muzzupappa, M. (December 2014). "Effects of device obtrusion and tool-hand misalignment on user performance and stiffness perception in visuo-haptic mixed reality". International Journal of Human-Computer Studies. 72 (12): 846–859. doi:10.1016/j.ijhcs.2014.07.006.