Augmented Reality Sandtable

The Augmented Reality Sandtable (ARES) is an interactive, digital sand table that uses augmented reality (AR) technology to create a 3D battlespace map. It was developed by the Human Research and Engineering Directorate (HRED) at the Army Research Laboratory (ARL) to combine the positive aspects of traditional military sand tables with the latest digital technologies, both to better support soldier training and to open new possibilities for learning. [1] ARES uses a projector to display a topographical map on the sand in a regular sandbox, along with a motion sensor that tracks changes in the layout of the sand and adjusts the computer-generated terrain display accordingly. [2] [3]

Evaluation

An ARL study conducted in 2017 with 52 active-duty military personnel (36 males and 16 females) found that participants who used ARES spent less time setting up the table than participants who used a traditional sand table. Participants also reported a lower perceived workload for ARES, as measured with the NASA Task Load Index (NASA-TLX), than for the traditional sand table. However, there was no significant difference between the two groups' post-test scores for recreating the visual map. [4]
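The NASA-TLX workload measure mentioned above combines six subscale ratings (each 0–100); in the weighted variant, each rating is multiplied by the number of times the participant chose that subscale in 15 pairwise comparisons, and the weighted sum is divided by 15. A minimal sketch of that arithmetic follows; the example ratings and weights are illustrative only and are not data from the ARL study:

```python
# Weighted NASA Task Load Index (NASA-TLX) score.
# Each of the six subscales is rated 0-100; weights come from 15
# pairwise comparisons, so they sum to 15 and the score stays in 0-100.

SUBSCALES = ["mental", "physical", "temporal",
             "performance", "effort", "frustration"]

def tlx_score(ratings, weights):
    """Weighted NASA-TLX: sum(rating * weight) / 15."""
    assert set(ratings) == set(SUBSCALES) == set(weights)
    assert sum(weights.values()) == 15, "weights must sum to 15"
    return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15.0

# Illustrative inputs (not from the study):
ratings = {"mental": 60, "physical": 20, "temporal": 40,
           "performance": 30, "effort": 50, "frustration": 25}
weights = {"mental": 5, "physical": 1, "temporal": 3,
           "performance": 2, "effort": 3, "frustration": 1}
print(tlx_score(ratings, weights))  # -> 45.0
```

A lower score indicates lower perceived workload, which is the direction of the ARES result reported above.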

Development

The ARES project was one of 25 ARL initiatives in development from 1995 to 2015 that focused on visualizing spatial data on virtual or sand table interfaces. [1] [5] It was developed by HRED's Simulation and Training Technology Center (STTC) with Charles Amburn as the principal investigator. [1] Collaborators on ARES included Dignitas Technologies, Design Interactive (DI), the University of Central Florida's Institute for Simulation and Training, and the U.S. Military Academy at West Point. [6]

ARES was largely designed to be a tangible user interface (TUI), in which digital information can be manipulated using physical objects such as a person's hand. It was constructed using commercial off-the-shelf components, including a projector, a laptop, an LCD monitor, Microsoft's Xbox Kinect sensor, and government-developed ARES software. With the projector and Kinect sensor both facing down on the surface of the sandbox, the projector provides a digital overlay over the sand and the Kinect sensor scans the surface of the map to detect any user gestures inside the boundaries of the sandbox. [1]
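The core ARES loop described above (a downward-facing depth sensor scans the sand surface while a projector paints a color-coded terrain map back onto it) can be illustrated in outline. The sketch below is hypothetical, not taken from the government ARES software: it assumes a NumPy array of depth readings, and the floor distance and elevation color bands are invented for illustration.

```python
import numpy as np

# Hypothetical sketch of ARES-style terrain coloring: convert a depth
# frame (distance from a downward-facing sensor, in millimeters) into
# elevation above the sandbox floor, then map elevation bands to colors.

FLOOR_DEPTH_MM = 1200  # assumed sensor-to-floor distance for an empty box

# (upper elevation threshold in mm, RGB color), from low to high terrain
COLOR_BANDS = [
    (40,  (46, 110, 210)),   # "water": 40 mm and below
    (120, (70, 160, 70)),    # lowland
    (220, (170, 140, 90)),   # hills
    (1e9, (240, 240, 240)),  # peaks ("snow")
]

def depth_to_overlay(depth_mm: np.ndarray) -> np.ndarray:
    """Return an (H, W, 3) uint8 image to project onto the sand."""
    elevation = FLOOR_DEPTH_MM - depth_mm      # higher sand -> shorter depth
    overlay = np.zeros(depth_mm.shape + (3,), dtype=np.uint8)
    lower = -np.inf
    for upper, color in COLOR_BANDS:
        overlay[(elevation > lower) & (elevation <= upper)] = color
        lower = upper
    return overlay

# Toy 2x2 depth frame: bottom-right pixel is a tall mound of sand,
# so the sensor reads a much shorter distance there.
frame = np.array([[1190.0, 1100.0],
                  [1190.0, 900.0]])
print(depth_to_overlay(frame)[1, 1])  # mound falls in the "peak" band
```

In the real system the same depth stream also drives gesture detection, but that step is omitted here; the point is only that each new depth frame can be re-colored and re-projected, so reshaping the sand immediately updates the displayed terrain.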

During development, researchers explored the possibility of incorporating ideas such as multi-touch surfaces, 3D holographic displays, and virtual environments. However, budget restrictions limited the implementation of such ideas. [5]

In September 2014, researchers from ARL showcased ARES for the first time at the Modern Day Marine exhibition in Quantico, Virginia. [7]

Uses

A 2015 technical report by ARL scientists describes the capabilities of ARES. [1]

Related Research Articles

Virtual reality: Computer-simulated environment simulating physical presence in real or imagined worlds

Virtual reality (VR) is a simulated experience that can be similar to or completely different from the real world. Applications of virtual reality include entertainment, education and business. Other distinct types of VR-style technology include augmented reality and mixed reality, sometimes referred to as extended reality or XR.

Augmented reality: View of the real world with computer-generated supplementary features

Augmented reality (AR) is an interactive experience of a real-world environment where the objects that reside in the real world are enhanced by computer-generated perceptual information, sometimes across multiple sensory modalities, including visual, auditory, haptic, somatosensory and olfactory. AR can be defined as a system that incorporates three basic features: a combination of real and virtual worlds, real-time interaction, and accurate 3D registration of virtual and real objects. The overlaid sensory information can be constructive or destructive. This experience is seamlessly interwoven with the physical world such that it is perceived as an immersive aspect of the real environment. In this way, augmented reality alters one's ongoing perception of a real-world environment, whereas virtual reality completely replaces the user's real-world environment with a simulated one. Augmented reality is related to two largely synonymous terms: mixed reality and computer-mediated reality.

Motion capture: Process of recording the movement of objects or people

Motion capture is the process of recording the movement of objects or people. It is used in military, entertainment, sports, medical applications, and for validation of computer vision and robots. In filmmaking and video game development, it refers to recording actions of human actors, and using that information to animate digital character models in 2-D or 3-D computer animation. When it includes face and fingers or captures subtle expressions, it is often referred to as performance capture. In many fields, motion capture is sometimes called motion tracking, but in filmmaking and games, motion tracking usually refers more to match moving.

Mixed reality: Merging of real and virtual worlds to produce new environments

Mixed reality (MR) is the merging of real and virtual worlds to produce new environments and visualizations, where physical and digital objects co-exist and interact in real time. Mixed reality does not take place exclusively in either the physical world or the virtual world, but is a hybrid of augmented reality and virtual reality. To mark the difference: augmented reality takes place in the physical world, with information or objects added virtually as an overlay, whereas virtual reality immerses the user in a fully virtual world without the intervention of the physical world.

Tangible user interface

A tangible user interface (TUI) is a user interface in which a person interacts with digital information through the physical environment. The initial name was Graspable User Interface, which is no longer used. The purpose of TUI development is to empower collaboration, learning, and design by giving physical forms to digital information, thus taking advantage of the human ability to grasp and manipulate physical objects and materials.

United States Army Research Laboratory: Research facility of the United States Army

The Army Research Laboratory (ARL) is the U.S. Army's corporate research laboratory. ARL is headquartered at the Adelphi Laboratory Center (ALC) in Adelphi, Maryland. Its largest single site is at Aberdeen Proving Ground, Maryland. Other major ARL locations include Research Triangle Park, North Carolina; White Sands Missile Range, New Mexico; Orlando, Florida; and NASA's Glenn Research Center in Ohio and Langley Research Center in Virginia. ARL also has regional sites in Los Angeles, Chicago, Austin, and Boston.

Sand table: Table using constrained sand for modelling or educational purposes

A sand table uses constrained sand for modelling or educational purposes. The original version of a sand table may be the abax used by early Greek students. In the modern era, one common use for a sand table is to make terrain models for military planning and wargaming.

Radar MASINT is a subdiscipline of measurement and signature intelligence (MASINT) and refers to intelligence gathering activities that bring together disparate elements that do not fit within the definitions of signals intelligence (SIGINT), imagery intelligence (IMINT), or human intelligence (HUMINT).

Dynamic terrain is the representation of terrain together with the capability for modification during a simulation.

Wearable technology: Clothing and accessories incorporating computer and advanced electronic technologies

Wearable technology, also known as wearables, fashion technology, smartwear, tech togs, streetwear tech, skin electronics or fashion electronics, refers to smart electronic devices worn on or near the surface of the skin, where they detect, analyze, and transmit information such as body signals (e.g. vital signs) or ambient data, and in some cases provide immediate biofeedback to the wearer.

A virtual touch screen (VTS) is a user interface system that augments virtual objects into reality, either through a projector or an optical display, using sensors to track a person's interaction with the object. For instance, using a display and a rear projector system, a person could create images that look three-dimensional and appear to float in midair. Some systems utilize an optical head-mounted display to augment the virtual objects onto the transparent display, utilizing sensors to determine visual and physical interactions with the virtual objects projected.

Military Open Simulator Enterprise Strategy (MOSES) is a U.S. Army project evaluating the ability of OpenSimulator to provide independent and secured access to a virtual world.

IllumiRoom

IllumiRoom is a Microsoft Research project that augments a television screen with images projected onto the wall and surrounding objects. The current proof-of-concept uses a Kinect sensor and video projector. The Kinect sensor captures the geometry and colors of the area of the room that surrounds the television, and the projector displays video around the television that corresponds to a video source on the television, such as a video game or movie.

Pointman is a seated user interface for controlling one's avatar in a 3D virtual environment. It combines head tracking, a gamepad, and sliding foot pedals to provide positional control over many aspects of the avatar's posture. Pointman was developed by the US Naval Research Laboratory (NRL) to support the use of dismounted infantry simulation for USMC training and mission rehearsal. NRL's goal in developing Pointman was to extend the range and precision of actions supported by virtual simulators, to better represent what infantrymen can do.

Windows Mixed Reality: Mixed reality platform

Windows Mixed Reality is a platform introduced as part of the Windows 10 and Windows 11 operating systems, which provides augmented reality and mixed reality experiences with compatible head-mounted displays.

Microsoft HoloLens: Mixed reality smartglasses

Microsoft HoloLens, known under development as Project Baraboo, is a pair of mixed reality smartglasses developed and manufactured by Microsoft. HoloLens was the first head-mounted display running the Windows Mixed Reality platform under the Windows 10 computer operating system. The tracking technology used in HoloLens can trace its lineage to Kinect, an add-on for Microsoft's Xbox game console that was introduced in 2010.

Virtual reality applications: Overview of the various applications that make use of virtual reality

Virtual reality applications are applications that make use of virtual reality (VR), an immersive sensory experience that digitally simulates a virtual environment. Applications have been developed in a variety of domains, such as education, architectural and urban design, digital marketing and activism, engineering and robotics, entertainment, virtual communities, fine arts, healthcare and clinical therapies, heritage and archaeology, occupational safety, social science and psychology.

The Buckeye system is an operational airborne surveying system that provides high-resolution spatial imagery over an area of interest to support military operations involving intelligence, surveillance, and reconnaissance. Mounted on a helicopter or an unmanned aerial vehicle (UAV), it combines visual information from a digital camera with elevation data from a Light Detection and Ranging (LIDAR) system to create two- and three-dimensional colored maps with orthorectified, 4-to-6-inch resolution.

The Synchronous Impulse Reconstruction (SIRE) radar is a multiple-input, multiple-output (MIMO) radar system designed to detect landmines and improvised explosive devices (IEDs). It consists of a low-frequency, impulse-based ultra-wideband (UWB) radar that uses 16 receivers, with 2 transmitters at the ends of the 2-meter-wide receive array that send alternating, orthogonal waveforms into the ground and collect signals reflected from targets in a given area. The SIRE radar system is mounted on top of a vehicle and forms images covering up to 33 meters in the direction the transmitters face. It is able to collect and process data as part of an affordable and lightweight package because it uses slow (40 MHz) yet inexpensive analog-to-digital (A/D) converters to sample the wide bandwidth of the radar signals. It uses GPS and augmented reality (AR) technology in conjunction with a camera to create a live video stream with a more comprehensive visual display of the targets.

The Spectrally Agile Frequency-Incrementing Reconfigurable (SAFIRE) radar is a vehicle-mounted, forward-looking ground-penetrating radar (FLGPR) system designed to detect buried or hidden explosive hazards. It was developed by the U.S. Army Research Laboratory (ARL) in 2016 as part of a long line of ultra-wideband (UWB) and synthetic aperture radar (SAR) systems created to combat buried landmines and IEDs. Past iterations include the railSAR, the boomSAR, and the SIRE radar.

References

  1. Amburn, Charles; Vey, Nathan; Boyce, Michael; Mize, Jerry (October 2015). "The Augmented REality Sandtable (ARES)". US Army Research Laboratory.
  2. "Microsoft's Kinect aids in 'augmented reality sand' mapping tool for Marines, Army". Marine Corps Times. September 23, 2014. Retrieved August 2, 2018.
  3. Mufson, Beckett (November 5, 2014). "Design Digital Terrain with the Army's Projection-Mapped Sandtable". Vice Creators. Retrieved August 2, 2018.
  4. Hale, Kelly; Riley, Jennifer; Amburn, Charles; Vey, Nathan (January 1, 2018). "Evaluation of Augmented REality Sandtable (ARES) during Sand Table Construction". US Army Research Laboratory via Defense Technical Information Center. [dead link]
  5. Garneau, Christopher; Boyce, Michael; Shorter, Paul; Vey, Nathan; Amburn, Charles (February 1, 2018). "The Augmented Reality Sandtable (ARES) Research Strategy". US Army Research Laboratory via Defense Technical Information Center. [dead link]
  6. Glass, Dolly (December 18, 2014). "Army and Marines research sand table technology". Team Orlando. Retrieved August 2, 2018.
  7. Hedelt, Carden (September 24, 2014). "New Sand Table Technology Featured at Modern Day Marine". CHIPS. Retrieved August 2, 2018.