Acoustic location is a method of determining the position of an object or sound source by using sound waves. Location can take place in gases (such as the atmosphere), liquids (such as water), and in solids (such as in the earth).
Location can be done actively or passively: active location involves emitting sound and analyzing the returning echo, while passive location involves detecting the sound or vibration produced by the object itself.
Both of these techniques, when used in water, are known as sonar; passive sonar and active sonar are both widely used.
Acoustic mirrors and dishes, when using microphones, are a means of passive acoustic localization, but when using speakers are a means of active localization. Typically, more than one device is used, and the location is then triangulated between the several devices.
As a military air defense tool, passive acoustic location was used from mid-World War I [1] to the early years of World War II to detect enemy aircraft by picking up the noise of their engines. It was rendered obsolete before and during World War II by the introduction of radar, which was far more effective (though its transmissions, unlike passive listening, could be intercepted). Acoustic techniques had the advantage that they could 'see' around corners and over hills, due to sound diffraction.
Civilian uses include locating wildlife [2] and locating the shooting position of a firearm. [3]
Acoustic source localization [4] is the task of locating a sound source given measurements of the sound field. The sound field can be described using physical quantities like sound pressure and particle velocity. By measuring these properties it is (indirectly) possible to obtain a source direction.
Traditionally, sound pressure is measured using microphones. Microphones have a polar pattern describing their sensitivity as a function of the direction of the incident sound. Many microphones have an omnidirectional polar pattern, which means their sensitivity is independent of the direction of the incident sound. Microphones with other polar patterns exist that are more sensitive in a certain direction. This, however, still does not solve the sound localization problem, since one tries to determine either an exact direction or a point of origin. Besides microphones that measure sound pressure, it is also possible to use a particle velocity probe to measure the acoustic particle velocity directly. Particle velocity is another quantity related to acoustic waves; however, unlike sound pressure, particle velocity is a vector, so measuring it yields a source direction directly. Other, more complicated methods using multiple sensors are also possible. Many of these methods use the time difference of arrival (TDOA) technique.
Some have termed acoustic source localization an "inverse problem" in that the measured sound field is translated to the position of the sound source.
Different methods for obtaining either source direction or source location are possible.
The traditional method to obtain the source direction is using the time difference of arrival (TDOA) method. This method can be used with pressure microphones as well as with particle velocity probes.
With a sensor array (for instance a microphone array) consisting of at least two probes, it is possible to obtain the source direction using the cross-correlation function between each probe's signal. The cross-correlation function between two microphones is defined as

$$R_{x_1 x_2}(\tau) = \sum_{n=-\infty}^{\infty} x_1(n)\, x_2(n+\tau),$$

which defines the level of correlation between the outputs of the two sensors $x_1$ and $x_2$. In general, a higher level of correlation means that the argument $\tau$ is relatively close to the actual time difference of arrival. For two sensors a distance $d_{\text{spacing}}$ apart, the largest TDOA that can occur (for sound arriving along the axis through both sensors) is

$$\tau = \frac{d_{\text{spacing}}}{c},$$

where $c$ is the speed of sound in the medium surrounding the sensors and the source.
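As a minimal illustration of TDOA estimation by cross-correlation (a sketch, not a description of any specific system above), the Python snippet below finds the lag that maximizes the correlation between two microphone signals and converts it to a bearing; the 48 kHz sampling rate, 0.2 m microphone spacing, and synthetic signals are all assumed values for the example.

```python
import numpy as np

def estimate_tdoa(x1, x2, fs):
    """Return the delay of x2 relative to x1, in seconds, estimated as
    the lag that maximizes the cross-correlation between the signals."""
    corr = np.correlate(x2, x1, mode="full")        # sum_n x2[n + k] * x1[n] for each lag k
    lags = np.arange(-(len(x1) - 1), len(x2))       # lag axis in samples
    return lags[np.argmax(corr)] / fs

# Synthetic example: x2 is x1 delayed by 25 samples at an assumed 48 kHz rate.
fs = 48_000
rng = np.random.default_rng(0)
x1 = rng.standard_normal(4096)
x2 = np.concatenate([np.zeros(25), x1[:-25]])

tau = estimate_tdoa(x1, x2, fs)                     # about 25 / 48000 s
d, c = 0.2, 343.0                                   # assumed mic spacing (m) and speed of sound (m/s)
theta = np.degrees(np.arcsin(np.clip(tau * c / d, -1.0, 1.0)))
print(f"TDOA = {tau * 1e6:.0f} us, bearing ~ {theta:.1f} deg from broadside")
```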
A well-known example of TDOA is the interaural time difference, the difference in arrival time of a sound between the two ears. It is given by

$$\Delta t = \frac{x \sin \theta}{c},$$

where $\Delta t$ is the interaural time difference in seconds, $x$ is the distance between the two ears in metres, $\theta$ is the angle between the direction of the incident sound and the perpendicular to the line joining the ears ($\theta = 0$ for a sound arriving from straight ahead), and $c$ is the speed of sound.
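For example, assuming an ear spacing of roughly $x \approx 0.2$ m and a sound arriving from directly to one side ($\theta = 90^\circ$), the formula gives $\Delta t \approx 0.2\ \text{m} / 343\ \text{m/s} \approx 0.6$ ms, approximately the largest interaural delay a human listener encounters.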
In trigonometry and geometry, triangulation is the process of determining the location of a point by measuring angles to it from known points at either end of a fixed baseline, rather than measuring distances to the point directly (trilateration). The point can then be fixed as the third point of a triangle with one known side and two known angles.
For acoustic localization this means that if the source direction is measured at two or more locations in space, it is possible to triangulate its location.
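As a minimal sketch of this idea, assuming two sensors at known positions and error-free bearing estimates (all coordinates and angles below are invented for the example), the Python snippet intersects two bearing lines to recover a source position:

```python
import numpy as np

def triangulate(p1, bearing1, p2, bearing2):
    """Intersect two bearing lines (angles in radians, measured
    counter-clockwise from the x-axis) observed from the known points
    p1 and p2, and return the estimated source position."""
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    d1 = np.array([np.cos(bearing1), np.sin(bearing1)])   # unit vector along bearing 1
    d2 = np.array([np.cos(bearing2), np.sin(bearing2)])   # unit vector along bearing 2
    # Solve p1 + t1 * d1 = p2 + t2 * d2 for the scalars t1 and t2.
    t1, _ = np.linalg.solve(np.column_stack([d1, -d2]), p2 - p1)
    return p1 + t1 * d1

# Invented example: two sensors on a 100 m baseline, source actually at (40, 30).
src = triangulate((0.0, 0.0), np.arctan2(30, 40),
                  (100.0, 0.0), np.arctan2(30, -60))
print(src)   # ~ [40. 30.]
```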
Steered response power (SRP) methods are a class of indirect acoustic source localization methods. Instead of estimating a set of time-differences of arrival (TDOAs) between pairs of microphones and combining the acquired estimates to find the source location, indirect methods search for a candidate source location over a grid of spatial points. In this context, methods such as the steered-response power with phase transform (SRP-PHAT) [5] are usually interpreted as finding the candidate location that maximizes the output of a delay-and-sum beamformer. The method has been shown to be very robust to noise and reverberation, motivating the development of modified approaches aimed at increasing its performance in real-time acoustic processing applications. [6]
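The sketch below is a bare-bones, frequency-domain version of such a grid search with PHAT weighting; it is only a schematic illustration of the idea under assumed inputs (array layout, candidate grid, and sampling rate would all come from the application), not the algorithm of any cited work.

```python
import numpy as np

def srp_phat(signals, mic_pos, grid, fs, c=343.0):
    """Return the candidate grid point with the largest steered response
    power, using PHAT-weighted microphone spectra.

    signals : (M, N) array of simultaneously recorded microphone signals
    mic_pos : (M, 3) microphone coordinates in metres
    grid    : (G, 3) candidate source positions in metres
    """
    M, N = signals.shape
    X = np.fft.rfft(signals, axis=1)
    X = X / (np.abs(X) + 1e-12)                    # PHAT weighting: keep only the phase
    freqs = np.fft.rfftfreq(N, d=1.0 / fs)
    power = np.zeros(len(grid))
    for g, p in enumerate(grid):
        delays = np.linalg.norm(mic_pos - p, axis=1) / c           # propagation delay to each mic (s)
        steer = np.exp(2j * np.pi * freqs[None, :] * delays[:, None])
        power[g] = np.sum(np.abs(np.sum(X * steer, axis=0)) ** 2)  # delay-and-sum output power
    return grid[np.argmax(power)], power
```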
Military uses have included locating submarines [7] and aircraft. [8] The first use of this type of equipment was claimed by Commander Alfred Rawlinson of the Royal Naval Volunteer Reserve, who in the autumn of 1916 was commanding a mobile anti-aircraft battery on the east coast of England. He needed a means of locating Zeppelins during cloudy conditions and improvised an apparatus from a pair of gramophone horns mounted on a rotating pole. Several of these instruments were able to give a fairly accurate fix on the approaching airships, allowing the guns to be directed at them despite their being out of sight. [9] Although no hits were obtained by this method, Rawlinson claimed to have forced a Zeppelin to jettison its bombs on one occasion. [10]
The air-defense instruments usually consisted of large horns or microphones connected to the operators' ears using tubing, much like a very large stethoscope. [11] [12]
At the end of the 1920s, an operational comparison of large acoustic listening devices from several nations, carried out at the Meetgebouw in the Netherlands, revealed their drawbacks. Fundamental research showed that the human ear performs better than was understood in the 1920s and 1930s, and new listening devices with airtight connections placed closer to the ears were developed. Moreover, because sound is slow compared with the increasingly fast aircraft of the period, mechanical prediction equipment and height corrections were needed to tell the searchlight operators and the anti-aircraft gunners where the detected aircraft would be. Since searchlights and guns had to be sited at a distance from the listening device, electric direction indicator devices were developed. [13]
Most of the work on anti-aircraft sound ranging was done by the British. They developed an extensive network of sound mirrors that were used from World War I through World War II. [14] [15] Sound mirrors normally work by using moveable microphones to find the angle that maximizes the amplitude of sound received, which is also the bearing angle to the target. Two sound mirrors at different positions will generate two different bearings, which allows the use of triangulation to determine a sound source's position.
As World War II neared, radar began to become a credible alternative to the sound location of aircraft. For typical aircraft speeds of that time, sound location only gave a few minutes of warning. [8] The acoustic location stations were left in operation as a backup to radar, as exemplified during the Battle of Britain. [16] Today, the abandoned sites are still in existence and are readily accessible. [14]
After World War II, sound ranging played no further role in anti-aircraft operations.
Active locators have some sort of signal generation device, in addition to a listening device. The two devices do not have to be located together.
Sonar (sound navigation and ranging) is a technique that uses sound propagation under water (or occasionally in air) to navigate, communicate or to detect other vessels. There are two kinds of sonar – active and passive. A single active sonar can localize in range and bearing as well as measuring radial speed. However, a single passive sonar can only localize in bearing directly, though Target Motion Analysis can be used to localize in range, given time. Multiple passive sonars can be used for range localization by triangulation or correlation, directly.
Dolphins, whales and bats use echolocation to detect prey and avoid obstacles.
Having speakers/ultrasonic transmitters emitting sound at known positions and time, the position of a target equipped with a microphone/ultrasonic receiver can be estimated based on the time of arrival of the sound. The accuracy is usually poor under non-line-of-sight conditions, where there are blockages in between the transmitters and the receivers. [17]
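A minimal sketch of this kind of time-of-arrival positioning, assuming synchronized clocks, known transmitter coordinates, and line-of-sight paths (all numbers below are invented), can be written as a small nonlinear least-squares problem:

```python
import numpy as np
from scipy.optimize import least_squares

def locate_from_toa(tx_pos, toa, c=343.0):
    """Estimate the receiver position from times of arrival of sounds
    emitted at t = 0 by transmitters at the known positions tx_pos
    (synchronized clocks assumed); each TOA gives a range c * t_i."""
    ranges = c * np.asarray(toa)
    def residuals(p):
        return np.linalg.norm(tx_pos - p, axis=1) - ranges
    return least_squares(residuals, x0=tx_pos.mean(axis=0)).x

# Invented example: four transmitters and a receiver at (2, 3, 1) metres.
tx = np.array([[0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 5]], dtype=float)
rx_true = np.array([2.0, 3.0, 1.0])
toa = np.linalg.norm(tx - rx_true, axis=1) / 343.0
print(locate_from_toa(tx, toa))        # ~ [2. 3. 1.]
```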
Seismic surveys involve the generation of sound waves to measure underground structures. Source waves are generally created by percussion mechanisms located near the ground or water surface, typically dropped weights, vibroseis trucks, or explosives. Data are collected with geophones, then stored and processed by computer. Current technology allows the generation of 3D images of underground rock structures using such equipment.
Because the cost of the associated sensors and electronics is dropping, the use of sound ranging technology is becoming accessible for other uses, such as for locating wildlife. [18]
Sonar is a technique that uses sound propagation to navigate, measure distances (ranging), communicate with or detect objects on or under the surface of the water, such as other vessels.
Flow measurement is the quantification of bulk fluid movement. Flow can be measured in various ways using devices called flowmeters; several common types are used in industrial applications.
Time of flight (ToF) is the measurement of the time taken by an object, particle or wave to travel a distance through a medium. This information can then be used to measure velocity or path length, or as a way to learn about the particle or medium's properties. The traveling object may be detected directly or indirectly. Time of flight technology has found valuable applications in the monitoring and characterization of material and biomaterials, hydrogels included.
Sound intensity, also known as acoustic intensity, is defined as the power carried by sound waves per unit area in a direction perpendicular to that area. The SI unit of intensity, which includes sound intensity, is the watt per square meter (W/m²). One application is the measurement of sound intensity in the air at a listener's location as a sound energy quantity in noise assessment.
Simultaneous localization and mapping (SLAM) is the computational problem of constructing or updating a map of an unknown environment while simultaneously keeping track of an agent's location within it. While this initially appears to be a chicken or the egg problem, there are several algorithms known to solve it in, at least approximately, tractable time for certain environments. Popular approximate solution methods include the particle filter, extended Kalman filter, covariance intersection, and GraphSLAM. SLAM algorithms are based on concepts in computational geometry and computer vision, and are used in robot navigation, robotic mapping and odometry for virtual reality or augmented reality.
An acoustic mirror is a passive device used to reflect and focus (concentrate) sound waves. Parabolic acoustic mirrors are widely used in parabolic microphones to pick up sound from great distances, employed in surveillance and reporting of outdoor sporting events. Pairs of large parabolic acoustic mirrors which function as "whisper galleries" are displayed in science museums to demonstrate sound focusing.
Sound localization is a listener's ability to identify the location or origin of a detected sound in direction and distance.
Acoustic homing is the process in which a system uses the sound or acoustic signals of a target or destination to guide a moving object. There are two types of acoustic homing: passive acoustic homing and active acoustic homing. Objects using passive acoustic homing rely on detecting acoustic emissions produced by the target. Conversely, objects using active acoustic homing make use of sonar to emit a signal and detect its reflection off the target. The signal detected is then processed by the system to determine the proper response for the object. Acoustic homing is useful for applications where other forms of navigation and tracking can be ineffective. It is commonly used in environments where radio or GPS signals can not be detected, such as underwater.
A sonobuoy is a small expendable sonar buoy dropped from aircraft or ships for anti-submarine warfare or underwater acoustic research. Sonobuoys are typically around 13 cm (5 in) in diameter and 91 cm (3 ft) long. When floating on the water, sonobuoys have both a radio transmitter above the surface and hydrophone sensors underwater.
Array processing is a wide area of research in the field of signal processing that extends from the simplest form of one-dimensional line arrays to two- and three-dimensional array geometries. An array structure can be defined as a set of sensors that are spatially separated, e.g. radio antenna arrays and seismic arrays. The sensors used for a specific problem may vary widely, for example microphones, accelerometers and telescopes. However, many similarities exist, the most fundamental of which may be an assumption of wave propagation. Wave propagation means there is a systematic relationship between the signals received on spatially separated sensors. By creating a physical model of the wave propagation, or in machine learning applications a training data set, the relationships between the signals received on spatially separated sensors can be leveraged for many applications.
Beamforming or spatial filtering is a signal processing technique used in sensor arrays for directional signal transmission or reception. This is achieved by combining elements in an antenna array in such a way that signals at particular angles experience constructive interference while others experience destructive interference. Beamforming can be used at both the transmitting and receiving ends in order to achieve spatial selectivity. The improvement compared with omnidirectional reception/transmission is known as the directivity of the array.
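As a rough sketch of receive-side delay-and-sum beamforming for a uniform linear microphone array (the array geometry, steering angle, and sampling rate are assumptions of the example, not parameters from the text), the channels can be aligned with fractional delays in the frequency domain and averaged:

```python
import numpy as np

def delay_and_sum(signals, mic_x, angle_deg, fs, c=343.0):
    """Steer a uniform linear array toward a far-field source at
    angle_deg degrees from broadside by delaying each channel in the
    frequency domain and averaging the aligned channels."""
    M, N = signals.shape
    delays = np.asarray(mic_x) * np.sin(np.radians(angle_deg)) / c    # per-channel steering delay (s)
    X = np.fft.rfft(signals, axis=1)
    freqs = np.fft.rfftfreq(N, d=1.0 / fs)
    X = X * np.exp(2j * np.pi * freqs[None, :] * delays[:, None])     # undo the assumed propagation delays
    return np.fft.irfft(X.sum(axis=0) / M, n=N)                       # beamformer output signal
```

Sweeping angle_deg over a range of directions and comparing the output power at each steering angle is one simple way such an array can be used to estimate a source bearing.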
Pseudo-range multilateration, often simply multilateration (MLAT) when in context, is a technique for determining the position of an unknown point, such as a vehicle, based on measurement of the times of arrival (TOAs) of energy waves traveling between the unknown point and multiple stations at known locations. When the waves are transmitted by the vehicle, MLAT is used for surveillance; when the waves are transmitted by the stations, MLAT is used for navigation. In either case, the stations' clocks are assumed synchronized but the vehicle's clock is not.
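A minimal numerical sketch of the surveillance case, with the first station taken as the time reference and all station coordinates invented for the example, can again be phrased as a least-squares problem over range differences:

```python
import numpy as np
from scipy.optimize import least_squares

def locate_from_tdoa(station_pos, tdoa, c=343.0):
    """Pseudo-range multilateration sketch for the surveillance case:
    station 0 is the time reference and tdoa[i] is the arrival-time
    difference between station i + 1 and station 0, so only range
    differences constrain the solution."""
    range_diff = c * np.asarray(tdoa)
    def residuals(p):
        r = np.linalg.norm(station_pos - p, axis=1)
        return (r[1:] - r[0]) - range_diff
    return least_squares(residuals, x0=station_pos.mean(axis=0)).x

# Invented geometry: four stations and an acoustic emitter at (12, 20, 5) metres.
stations = np.array([[0, 0, 0], [50, 0, 0], [0, 50, 0], [0, 0, 30]], dtype=float)
target = np.array([12.0, 20.0, 5.0])
t = np.linalg.norm(stations - target, axis=1) / 343.0   # absolute arrival times (unknown in practice)
print(locate_from_tdoa(stations, t[1:] - t[0]))          # ~ [12. 20. 5.]
```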
In land warfare, artillery sound ranging is a method of determining the coordinates of a hostile battery using data derived from the sound of its guns firing, a process known as target acquisition.
Analysis of sound and acoustics plays a role in such engineering tasks as product design, production test, machine performance, and process control. For instance, product design can require modification of sound level or noise for compliance with standards from ANSI, IEC, and ISO. The work might also involve design fine-tuning to meet market expectations. Here, examples include tweaking an automobile door latching mechanism to impress a consumer with a satisfying click or modifying an exhaust manifold to change the tone of an engine's rumble. Aircraft designers are also using acoustic instrumentation to reduce the noise generated on takeoff and landing.
Underwater acoustics is the study of the propagation of sound in water and the interaction of the mechanical waves that constitute sound with the water, its contents and its boundaries. The water may be in the ocean, a lake, a river or a tank. Typical frequencies associated with underwater acoustics are between 10 Hz and 1 MHz. The propagation of sound in the ocean at frequencies lower than 10 Hz is usually not possible without penetrating deep into the seabed, whereas frequencies above 1 MHz are rarely used because they are absorbed very quickly.
Geophysical MASINT is a branch of Measurement and Signature Intelligence (MASINT) that involves phenomena transmitted through the earth and manmade structures including emitted or reflected sounds, pressure waves, vibrations, and magnetic field or ionosphere disturbances.
An acoustic camera is an imaging device used to locate sound sources and to characterize them. It consists of a group of microphones, also called a microphone array, from which signals are simultaneously collected and processed to form a representation of the location of the sound sources.
3D sound localization refers to an acoustic technology that is used to locate the source of a sound in a three-dimensional space. The source location is usually determined by the direction of the incoming sound waves and the distance between the source and sensors. It involves the structure arrangement design of the sensors and signal processing techniques.
Sonar systems are generally used underwater for range finding and detection. Active sonar emits an acoustic signal, or pulse of sound, into the water. The sound bounces off the target object and returns an echo to the sonar transducer. Unlike active sonar, passive sonar does not emit its own signal, which is an advantage for military vessels. But passive sonar cannot measure the range of an object unless it is used in conjunction with other passive listening devices; multiple passive sonar devices must be used for triangulation of a sound source. Whether the sonar is active or passive, the information contained in the received signal cannot be used without signal processing: a series of processing steps is required to extract the useful information from the raw acoustic data.
In virtual reality (VR) and augmented reality (AR), a pose tracking system detects the precise pose of head-mounted displays, controllers, other objects or body parts within Euclidean space. Pose tracking is often referred to as 6DOF tracking, for the six degrees of freedom in which the pose is often tracked.