Virtual fixture

A virtual fixture is an overlay of augmented sensory information upon a user's perception of a real environment in order to improve human performance in both direct and remotely manipulated tasks.[1] Developed in the early 1990s by Louis Rosenberg at the U.S. Air Force Research Laboratory (AFRL), Virtual Fixtures was a pioneering platform in virtual reality and augmented reality technologies.

History

Virtual Fixtures was first developed by Louis Rosenberg in 1992 at the USAF Armstrong Labs, resulting in the first immersive augmented reality system ever built.[2][3][4][5][6] Because 3D graphics were too slow in the early 1990s to present a photorealistic and spatially registered augmented reality, Virtual Fixtures used two real physical robots, controlled by a full upper-body exoskeleton worn by the user. To create the immersive experience, a unique optics configuration was employed: a pair of binocular magnifiers was aligned so that the user's view of the robot arms was brought forward, making the arms appear registered in the exact location of the user's real physical arms.[2][7][5] The result was a spatially registered immersive experience in which the user moved his or her arms while seeing robot arms in the place where his or her arms should be. The system also employed computer-generated virtual overlays in the form of simulated physical barriers, fields, and guides, designed to assist the user while performing real physical tasks.[8][9][3][10][11][12]

Fitts' law performance testing was conducted on batteries of human test subjects, demonstrating for the first time that a significant enhancement in human performance of real-world dexterous tasks could be achieved by providing immersive augmented reality overlays to users.[5][13]

Concept

Figure: Virtual fixtures used to enhance operator performance in telerobotic control of a Fitts' law peg-board task.

The concept of virtual fixtures was first introduced[2] as an overlay of virtual sensory information on a workspace in order to improve human performance in direct and remotely manipulated tasks. The virtual sensory overlays can be presented as physically realistic structures, registered in space such that they are perceived by the user to be fully present in the real workspace environment. They can also be abstractions with properties that no real physical structure can have. Because the concept of sensory overlays is difficult to visualize and talk about, the virtual fixture metaphor was introduced. To understand what a virtual fixture is, an analogy with a real physical fixture such as a ruler is often used: drawing a straight line on a piece of paper free-hand is a task that most humans are unable to perform with good accuracy and high speed, yet a simple device such as a ruler allows the task to be carried out quickly and accurately. The ruler guides the pen, reducing the user's tremor and mental load and thus increasing the quality of the result.

Figure: Virtual fixtures used for augmented reality surgery (medical concept, 1991), enabling enhanced surgical dexterity.

When the Virtual Fixture concept was proposed to the U.S. Air Force in 1991, augmented surgery was an example use case, expanding the idea from a virtual ruler guiding a real pencil to a virtual medical fixture guiding a real physical scalpel manipulated by a real surgeon.[2] The objective was to overlay virtual content upon the surgeon's direct perception of the real workspace with sufficient realism that it would be perceived as an authentic addition to the surgical environment, thereby enhancing surgical skill, dexterity, and performance. A proposed benefit of virtual medical fixtures over real hardware was that, being virtual additions to the ambient reality, they could be partially submerged within real patients, providing guidance and/or barriers within unexposed tissues.[14][2][15]

The definition of virtual fixtures[2][7][9] is much broader than simply providing guidance of the end-effector. For example, auditory virtual fixtures increase the user's awareness by providing multi-modal audio cues that help localize the end-effector. However, in the context of human-machine collaborative systems, the term virtual fixtures often refers to a task-dependent virtual aid that is overlaid upon a real environment and guides the user's motion along desired directions while preventing motion in undesired directions or regions of the workspace.

Virtual fixtures can be either guiding virtual fixtures or forbidden-regions virtual fixtures. A forbidden-regions virtual fixture could be used, for example, in a teleoperated setting where the operator has to drive a vehicle at a remote site to accomplish an objective. If there are pits at the remote site that would be harmful for the vehicle to fall into, forbidden regions could be defined at the pits' locations, thus preventing the operator from issuing commands that would result in the vehicle ending up in such a pit.[16][17][18]

Figure: Example of a forbidden-regions virtual fixture.

Such illegal commands could easily be sent by an operator because of delays in the teleoperation loop, poor telepresence, or a number of other reasons.
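To make the idea concrete, here is a minimal sketch of a hard forbidden-regions fixture for a planar vehicle, assuming circular forbidden regions and a one-step Euler look-ahead; the function name, arguments, and numeric values are illustrative, not taken from the cited systems:

```python
import numpy as np

def clamp_forbidden(pos, v_cmd, centers, radii, dt):
    """Hard forbidden-regions fixture: strip any velocity component that
    would carry the vehicle inside a circular forbidden region."""
    v = np.asarray(v_cmd, dtype=float).copy()
    for c, r in zip(centers, radii):
        c = np.asarray(c, dtype=float)
        if np.linalg.norm(pos + v * dt - c) < r:   # command would enter the region
            n = pos - c
            n /= np.linalg.norm(n)                 # outward normal at the vehicle
            v -= min(v @ n, 0.0) * n               # remove only the inward component
    return v

# The commanded motion points straight into the pit, so it is fully blocked:
v_safe = clamp_forbidden(pos=np.array([0.0, 0.0]),
                         v_cmd=np.array([1.0, 0.0]),
                         centers=[np.array([1.0, 0.0])], radii=[0.6], dt=0.5)
# -> [0., 0.]
```

Tangential commands pass through unchanged, so the operator keeps full control along the boundary of the region.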

An example of a guiding virtual fixture could be when the vehicle must follow a certain trajectory:

Figure: Example of a guiding virtual fixture.

The operator is then able to control the progress along the preferred direction while motion along the non-preferred direction is constrained.

With both forbidden-regions and guiding virtual fixtures, the stiffness, or its inverse the compliance, of the fixture can be adjusted. If the compliance is high (low stiffness), the fixture is soft; when the compliance is zero (maximum stiffness), the fixture is hard.

Figure: The stiffness of a virtual fixture can be soft or hard. A hard fixture completely constrains the motion to the fixture, while a softer fixture allows some deviation from it.

Virtual fixture control law

This section describes how a control law that implements virtual fixtures can be derived. It is assumed that the robot is a purely kinematic device with end-effector position $\mathbf{p} = (x, y, z)$ and end-effector orientation $\mathbf{r} = (r_x, r_y, r_z)$ expressed in the robot's base frame $F_r$. The input control signal to the robot is assumed to be a desired end-effector velocity $\mathbf{v} = \dot{\mathbf{x}} = (\dot{\mathbf{p}}, \dot{\mathbf{r}})$. In a tele-operated system it is often useful to scale the input velocity from the operator, $\mathbf{v}_{\mathrm{op}}$, before feeding it to the robot controller. If the input from the user is of another form, such as a force or position, it must first be transformed to an input velocity, for example by scaling or differentiating.

Thus the control signal $\mathbf{v}$ would be computed from the operator's input velocity $\mathbf{v}_{\mathrm{op}}$ as:

$$\mathbf{v} = c \cdot \mathbf{v}_{\mathrm{op}}$$

If $c = 1$, there exists a one-to-one mapping between the operator and the slave robot.

If the constant $c$ is replaced by a diagonal matrix $\mathbf{C}$, it is possible to adjust the compliance independently for different dimensions of $\mathbf{v}_{\mathrm{op}}$. For example, setting the first three elements on the diagonal of $\mathbf{C}$ to $c$ and all other elements to zero would result in a system that only permits translational motion and not rotation. This would be an example of a hard virtual fixture that constrains the motion from $\mathbb{R}^6$ to $\mathbb{R}^3$. If the rest of the elements on the diagonal were set to a small value $\epsilon$ instead of zero, the fixture would be soft, allowing some motion in the rotational directions.
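As an illustration, a minimal numpy sketch of this diagonal-compliance scaling; the specific values of $c$ and $\epsilon$ and the 6-DOF velocity layout $(\dot{\mathbf{p}}, \dot{\mathbf{r}})$ are assumptions for the example:

```python
import numpy as np

c, eps = 1.0, 0.05   # illustrative compliances: full in translation, small in rotation

# 6-DOF input velocity from the operator: (vx, vy, vz, wx, wy, wz)
v_op = np.array([0.2, 0.0, 0.1, 0.3, 0.0, 0.1])

# Hard fixture: translation passes through, rotation is blocked entirely.
C_hard = np.diag([c, c, c, 0.0, 0.0, 0.0])

# Soft fixture: rotation is attenuated rather than blocked.
C_soft = np.diag([c, c, c, eps, eps, eps])

v_hard = C_hard @ v_op   # -> [0.2, 0., 0.1, 0.,    0., 0.   ]
v_soft = C_soft @ v_op   # -> [0.2, 0., 0.1, 0.015, 0., 0.005]
```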

To express more general constraints, assume a time-varying matrix $\mathbf{D}(t) \in \mathbb{R}^{6 \times n}$, $0 < n < 6$, which represents the preferred direction at time $t$. Thus if $n = 1$ the preferred direction is along a curve in $\mathbb{R}^6$. Likewise, $n = 2$ would give preferred directions that span a surface. From $\mathbf{D}$, two projection operators can be defined, [19] the span and the kernel of the column space:

$$\mathrm{Span}(\mathbf{D}) \equiv [\mathbf{D}] = \mathbf{D}(\mathbf{D}^{T}\mathbf{D})^{-1}\mathbf{D}^{T}$$

$$\mathrm{Kernel}(\mathbf{D}) \equiv \langle \mathbf{D} \rangle = \mathbf{I} - [\mathbf{D}]$$

If $\mathbf{D}$ does not have full column rank, the span cannot be computed this way; consequently, it is better to compute the span using the pseudo-inverse, [19] thus in practice the span is computed as:

$$[\mathbf{D}] = \mathbf{D}(\mathbf{D}^{T}\mathbf{D})^{\dagger}\mathbf{D}^{T}$$

where $(\cdot)^{\dagger}$ denotes the pseudo-inverse.
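In code, the two projectors are a few lines of numpy; this sketch assumes a 6-DOF velocity vector and an illustrative one-column $\mathbf{D}$ (pure x-translation, $n = 1$):

```python
import numpy as np

def span_proj(D):
    """Projection onto the column space of D: [D] = D (D^T D)^+ D^T."""
    return D @ np.linalg.pinv(D.T @ D) @ D.T

def kernel_proj(D):
    """Projection onto the complement: <D> = I - [D]."""
    return np.eye(D.shape[0]) - span_proj(D)

# Preferred direction: pure x-translation of the 6-DOF end-effector.
D = np.array([[1.0, 0, 0, 0, 0, 0]]).T        # shape (6, 1)

v_op  = np.array([0.2, 0.1, 0.0, 0.0, 0.3, 0.0])
v_D   = span_proj(D)   @ v_op                 # component along the fixture
v_tau = kernel_proj(D) @ v_op                 # component off the fixture
```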

If the input velocity is split into two components as:

$$\mathbf{v}_{\mathrm{op}} = \mathbf{v}_{D} + \mathbf{v}_{\tau} \equiv [\mathbf{D}]\,\mathbf{v}_{\mathrm{op}} + \langle \mathbf{D} \rangle\,\mathbf{v}_{\mathrm{op}}$$

it is possible to rewrite the control law as:

$$\mathbf{v} = c \cdot \mathbf{v}_{\mathrm{op}} = c\,(\mathbf{v}_{D} + \mathbf{v}_{\tau})$$

Next, introduce a new compliance $c_{\tau} \in [0, 1]$ that affects only the non-preferred component of the velocity input and write the final control law as:

$$\mathbf{v} = c\,(\mathbf{v}_{D} + c_{\tau}\,\mathbf{v}_{\tau})$$
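A compact sketch tying the pieces together into the final control law; the function name and numeric values are illustrative assumptions, not from the source:

```python
import numpy as np

def virtual_fixture_control(v_op, D, c=1.0, c_tau=0.0):
    """Final control law v = c (v_D + c_tau * v_tau).

    c_tau = 0 gives a hard guiding fixture (motion only along D);
    0 < c_tau < 1 gives a soft fixture; c_tau = 1 removes the fixture."""
    span = D @ np.linalg.pinv(D.T @ D) @ D.T   # [D]
    kern = np.eye(D.shape[0]) - span           # <D>
    v_D, v_tau = span @ v_op, kern @ v_op
    return c * (v_D + c_tau * v_tau)

# Guide the end-effector along the x-axis, attenuating all other motion.
D = np.array([[1.0, 0, 0, 0, 0, 0]]).T
v = virtual_fixture_control(np.array([0.2, 0.1, 0, 0, 0.3, 0]), D, c=1.0, c_tau=0.1)
# -> [0.2, 0.01, 0., 0., 0.03, 0.]
```

Setting `c_tau` per task recovers both regimes discussed above: a hard fixture that fully constrains motion to the preferred directions, or a soft one that merely resists deviation.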


References

  1. Rosenberg, Louis B. (2022). Arai, Kohei (ed.). "Augmented Reality: Reflections at Thirty Years". Proceedings of the Future Technologies Conference (FTC) 2021, Volume 1. Lecture Notes in Networks and Systems. Cham: Springer International Publishing: 1–11. doi:10.1007/978-3-030-89906-6_1. ISBN 978-3-030-89906-6.
  2. Rosenberg, L. B. (1992). "The Use of Virtual Fixtures As Perceptual Overlays to Enhance Operator Performance in Remote Environments" (PDF). Technical Report AL-TR-0089. Wright-Patterson AFB OH: USAF Armstrong Laboratory. Archived (PDF) from the original on July 10, 2019.
  3. Rosenberg, L. B. (1993). "Virtual fixtures: Perceptual tools for telerobotic manipulation". Proceedings of IEEE Virtual Reality Annual International Symposium. IEEE. pp. 76–82. doi:10.1109/vrais.1993.380795. ISBN 0-7803-1363-1.
  4. Rosenberg, Louis (1993). "The use of virtual fixtures to enhance telemanipulation with time delay". Proceedings of the ASME Winter Annual Meeting on Advances in Robotics, Mechatronics, and Haptic Interfaces. New Orleans, LA. 49: 29–36.
  5. Rosenberg, Louis (1993). "The use of virtual fixtures to enhance operator performance in time delayed teleoperation" (PDF). J. Dyn. Syst. Control. 49: 29–36. Archived (PDF) from the original on July 10, 2019.
  6. Noer, Michael (1998-09-21). "Desktop fingerprints". Forbes. Retrieved 22 April 2014.
  7. Rosenberg, L. (1993). Kim, Won S. (ed.). "Virtual fixtures as tools to enhance operator performance in telepresence environments". Telemanipulator Technology and Space Telerobotics. 2057: 10. Bibcode:1993SPIE.2057...10R. doi:10.1117/12.164901. S2CID 111277519.
  8. Abbott, Jake J.; Marayong, Panadda; Okamura, Allison M. (2007). "Haptic Virtual Fixtures for Robot-Assisted Manipulation". In Thrun, Sebastian; Brooks, Rodney; Durrant-Whyte, Hugh (eds.). Robotics Research. Springer Tracts in Advanced Robotics. Vol. 28. Berlin, Heidelberg: Springer. pp. 49–64. doi:10.1007/978-3-540-48113-3_5. ISBN 978-3-540-48113-3.
  9. Rosenberg, L. (1994). Das, Hari (ed.). "Virtual Haptic Overlays Enhance Performance in Telepresence Tasks". Telemanipulator and Telepresence Technologies. 2351: 99–108. doi:10.1117/12.197302. S2CID 110971407.
  10. Makhataeva, Zhanat; Varol, Huseyin Atakan (2020). "Augmented Reality for Robotics: A Review". Robotics. 9 (2): 21. doi:10.3390/robotics9020021. ISSN 2218-6581.
  11. Leonard, Simon (2015). "Registration of planar virtual fixtures by using augmented reality with dynamic textures". 2015 IEEE International Conference on Robotics and Automation (ICRA). Seattle, WA, USA: IEEE. pp. 4418–4423. doi:10.1109/ICRA.2015.7139810. ISBN 978-1-4799-6923-4. S2CID 16744811.
  12. Xia, Tian; Léonard, Simon; Deguet, Anton; Whitcomb, Louis; Kazanzides, Peter (2012). "Augmented reality environment with virtual fixtures for robotic telemanipulation in space". 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems. pp. 5059–5064. doi:10.1109/IROS.2012.6386169. ISBN 978-1-4673-1736-8. S2CID 2708501.
  13. Rosenberg, Louis B. (1993). Kim, Won S. (ed.). "Virtual fixtures as tools to enhance operator performance in telepresence environments". Telemanipulator Technology and Space Telerobotics. 2057: 10–21. Bibcode:1993SPIE.2057...10R. doi:10.1117/12.164901. S2CID 111277519.
  14. Rosenberg, L. B. (1992). "The Use of Virtual Fixtures as Perceptual Overlays to Enhance Operator Performance". Stanford, CA: Stanford University, Center for Design Research (CDR).
  15. Yamamoto, Tomonori; Abolhassani, Niki; Jung, Sung; Okamura, Allison M.; Judkins, Timothy N. (2011-11-08). "Augmented reality and haptic interfaces for robot-assisted surgery". The International Journal of Medical Robotics and Computer Assisted Surgery. 8 (1): 45–56. doi:10.1002/rcs.421. ISSN 1478-5951. PMID 22069247. S2CID 1603125.
  16. Abbott, J.J.; Okamura, A.M. (2003). "Virtual fixture architectures for telemanipulation". 2003 IEEE International Conference on Robotics and Automation (Cat. No.03CH37422). Vol. 2. Taipei, Taiwan: IEEE. pp. 2798–2805. doi:10.1109/ROBOT.2003.1242016. ISBN 978-0-7803-7736-3. S2CID 8678829.
  17. Marayong, Panadda; Hager, Gregory D.; Okamura, Allison M. (2008). "Control methods for guidance virtual fixtures in compliant human-machine interfaces". 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems. pp. 1166–1172. doi:10.1109/IROS.2008.4650838. ISBN 978-1-4244-2057-5. S2CID 6828466.
  18. Marayong, P.; Hager, G.D.; Okamura, A.M. (2006). "Effect of Hand Dynamics on Virtual Fixtures for Compliant Human-Machine Interfaces". 2006 14th Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems. Alexandria, VA, USA: IEEE. pp. 109–115. doi:10.1109/HAPTIC.2006.1627075. ISBN 978-1-4244-0226-7.
  19. Marayong, P.; Okamura, A.M.; Hager, G.D. (2003). "Spatial motion constraints: theory and demonstrations for robot guidance using virtual fixtures". 2003 IEEE International Conference on Robotics and Automation (Cat. No.03CH37422). IEEE. pp. 1270–1275. doi:10.1109/robot.2003.1241880. ISBN 0-7803-7736-2.