Acoustic wayfinding

Acoustic wayfinding is the practice of using the auditory system to orient oneself and navigate physical space. It is commonly used by the visually impaired, allowing them to retain their mobility without relying on visual cues from their environment.

Method

Acoustic wayfinding involves using a variety of auditory cues to build a mental map of the surrounding environment. Techniques include navigating by ambient sounds, such as pedestrian crossing signals; echolocation, or creating sound waves (by tapping a cane or making clicking noises) to determine the location and size of surrounding objects; and memorizing the unique sounds of a given space in order to recognize it later. For the visually impaired, these auditory cues become the primary substitute for visual information about the direction and distance of people and objects in their environment. [1]

Visually impaired person using a cane to navigate a city street

However, there are a number of common obstacles to acoustic wayfinding techniques: noisy outdoor environments can challenge an individual's ability to identify useful sounds, while indoors, the architecture may not produce an acoustic response that is useful for orientation or for finding one's destination. Among the most difficult environments to navigate for individuals who rely on acoustic wayfinding are crowded places like department stores, transit stations, and hotel lobbies, or open spaces like parking lots and parks, where distinct sound cues are lacking. In practice, therefore, individuals who navigate primarily by acoustic wayfinding must also rely on other senses – including touch, smell, and residual sight – to supplement auditory cues. [2] These methods can be used in tandem. For example, visually impaired individuals often use a white cane not only to physically locate obstacles in front of them, but also to get an acoustic sense of what those obstacles may be. [3] By tapping the cane, they create sound waves that help them gauge the location and size of nearby objects.

Importance in architecture

In recent years, architects and acousticians have begun to address the problems faced by people who rely primarily on acoustic wayfinding to navigate urban spaces. [4] The primary work on the architectural implications of acoustic wayfinding comes from a collaboration between Christopher Downey, an architect who went blind in 2008 and has since worked to improve architectural design for the visually impaired, [5] and Joshua Cushner, who leads the acoustics consulting practice for the engineering design firm Arup in San Francisco. Their work focuses on planning new facilities to include sensible systems of sound markers and architectural spaces that provide orientation through acoustic cues. On 20 September 2011, the San Francisco chapter of the American Institute of Architects organized an acoustic wayfinding discussion and walking tour, [6] led by Downey and Cushner. The purpose of the tour was to highlight the ways that visually impaired people associate sounds with particular buildings and locations, creating "sound markers" that help them find their way on the street or indoors, and to discuss incorporating more distinctive sound markers into urban design projects. [7]

See also

Animal echolocation
Human echolocation
Acoustic location
Sound localization
Wayfinding
Sensory substitution
Sensory cue
Visual impairment
Orientation and Mobility
Audio game
Auditory feedback

References

  1. Reginald G. Golledge; Robert John Stimson (1997). Spatial behavior: a geographic perspective. Guilford Press. p. 508. ISBN 978-1-57230-050-7.
  2. Juval Portugali (1996). The construction of cognitive maps. Springer. p. 230. ISBN 978-0-7923-3949-6.
  3. Barry Truax (2001). Acoustic communication. Greenwood Publishing Group. p. 21. ISBN 978-1-56750-536-8.
  4. Arthur, Paul; Romedi Passini (1992). Wayfinding: people, signs, and architecture. McGraw-Hill Book Co. ISBN 0-07-551016-2.
  5. McGray, Douglas (October 2010). "Design Within Reach: A blind architect relearns his craft". The Atlantic Monthly Group. Retrieved 24 September 2011.
  6. Britt, Aaron (5 September 2011). "SF's Architecture and the City". Dwell Media LLC. Retrieved 24 September 2011.
  7. "Architecture and the city festival guide" (Press release). AIASF. September 2011.