Auditory event

An auditory event describes the subjective perception of a listener exposed to a certain sound situation. The term was introduced by Jens Blauert (Ruhr-University Bochum) in 1966 in order to distinguish clearly between the physical sound field and the auditory perception of that sound field. [1]

Auditory events are the central objects of psychoacoustical investigations. These investigations focus on the relationship between the characteristics of a physical sound field and the corresponding perception of listeners. From this relationship, conclusions can be drawn about the processing mechanisms of the human auditory system.

Investigations of auditory events can address several aspects; a central one is the relationship between the sound field and the resulting auditory events, described below.

Relationships between sound field and auditory events

The sound field is described by physical quantities, while auditory events are described by psychoacoustical (perceptual) quantities. The table below lists physical sound field quantities together with the related perceptual quantities of the corresponding auditory events; a short numerical sketch follows the table. In most cases there is no simple or proportional relationship between sound field characteristics and auditory events. For example, the loudness of an auditory event depends not only on the physical sound pressure but also on the spectral characteristics of the sound and on its temporal history.

sound field characteristic    auditory event
sound pressure level          loudness
frequency                     pitch
spectrum                      timbre
position of a sound source    sound localization
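
The gap between the physical and the perceptual scale can be made concrete with a rough calculation. The sketch below (Python; the function names are chosen here for illustration) converts a sound pressure into a sound pressure level and then, for a 1 kHz pure tone only, into an approximate loudness in sone using the common rule of thumb that loudness doubles for every 10 phon above 40 phon. Real loudness additionally depends on spectrum and temporal history, as noted above.

    import math

    P_REF = 20e-6  # reference sound pressure, 20 micropascal

    def sound_pressure_level(p_rms):
        """Physical quantity: sound pressure level in dB re 20 uPa."""
        return 20.0 * math.log10(p_rms / P_REF)

    def loudness_sone_at_1khz(spl_db):
        """Perceptual quantity (rough): loudness in sone for a 1 kHz tone.
        At 1 kHz the loudness level in phon equals the SPL in dB by definition;
        above roughly 40 phon, loudness in sone doubles per 10 phon increase."""
        phon = spl_db  # valid only for a 1 kHz pure tone
        return 2.0 ** ((phon - 40.0) / 10.0)

    for p in (0.002, 0.02, 0.2):  # each step is a tenfold pressure increase
        spl = sound_pressure_level(p)
        print(f"{p:5.3f} Pa -> {spl:5.1f} dB SPL -> about {loudness_sone_at_1khz(spl):5.1f} sone")

In this simplified model each tenfold increase in sound pressure (20 dB) only quadruples the estimated loudness, which already illustrates why the mapping from sound field quantities to auditory event quantities is not proportional.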

Related Research Articles

Pitch (music): perceptual property in music ordering sounds from low to high

Pitch is a perceptual property of sounds that allows their ordering on a frequency-related scale, or more commonly, pitch is the quality that makes it possible to judge sounds as "higher" and "lower" in the sense associated with musical melodies. Pitch can be determined only in sounds that have a frequency that is clear and stable enough to distinguish from noise. Pitch is a major auditory attribute of musical tones, along with duration, loudness, and timbre.
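
As a small illustration of the frequency-to-pitch relationship for musical tones, the sketch below assumes 12-tone equal temperament with A4 = 440 Hz and uses the MIDI note-number convention; perceived pitch of real sounds can deviate from this idealization.

    import math

    def frequency_to_midi_note(f_hz, a4_hz=440.0):
        """Equal-tempered pitch number in the MIDI convention (A4 = 440 Hz = note 69).
        Pitch steps are logarithmic in frequency: one octave = 12 notes."""
        return 69.0 + 12.0 * math.log2(f_hz / a4_hz)

    for f in (220.0, 440.0, 880.0):  # successive octaves
        print(f"{f:6.1f} Hz -> MIDI note {frequency_to_midi_note(f):5.1f}")

Doubling the frequency always adds the same number of pitch steps (one octave, i.e. 12 notes), whereas adding a fixed number of hertz does not; pitch follows frequency ratios rather than frequency differences.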

Head-related transfer function

A head-related transfer function (HRTF), also sometimes known as the anatomical transfer function (ATF), is a response that characterizes how an ear receives a sound from a point in space. As sound strikes the listener, the size and shape of the head, ears, ear canal, density of the head, size and shape of nasal and oral cavities, all transform the sound and affect how it is perceived, boosting some frequencies and attenuating others. Generally speaking, the HRTF boosts frequencies from 2–5 kHz with a primary resonance of +17 dB at 2,700 Hz. But the response curve is more complex than a single bump, affects a broad frequency spectrum, and varies significantly from person to person.
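
In practice an HRTF is usually applied by convolving a mono source signal with a measured pair of head-related impulse responses (HRIRs), one per ear. The sketch below is a minimal illustration with NumPy; the placeholder "HRIRs" encode only an interaural delay and level difference, whereas real HRIRs come from measurements or public databases.

    import numpy as np

    def render_binaural(mono, hrir_left, hrir_right):
        """Convolve a mono signal with a left/right HRIR pair to obtain a
        two-channel binaural signal (plain time-domain convolution; real-time
        systems usually use block-based FFT convolution instead)."""
        left = np.convolve(mono, hrir_left)
        right = np.convolve(mono, hrir_right)
        return np.stack([left, right], axis=-1)

    fs = 48000
    delay = int(0.0004 * fs)   # ~0.4 ms interaural delay
    hrir_l = np.zeros(256)
    hrir_l[0] = 1.0            # near ear: earlier and louder
    hrir_r = np.zeros(256)
    hrir_r[delay] = 0.5        # far ear: later and attenuated

    source = np.random.default_rng(0).standard_normal(fs // 10)  # 100 ms of noise
    binaural = render_binaural(source, hrir_l, hrir_r)
    print(binaural.shape)  # (number of samples + 255, 2)

Played over headphones, such a signal is heard toward one side; with individually measured HRIRs the spectral cues described above are reproduced as well.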

Absolute threshold of hearing: minimum sound level that an average human can hear

The absolute threshold of hearing (ATH) is the minimum sound level of a pure tone that an average human ear with normal hearing can hear with no other sound present. The absolute threshold relates to the sound that can just be heard by the organism. The absolute threshold is not a discrete point, and is therefore classed as the point at which a sound elicits a response a specified percentage of the time. This is also known as the auditory threshold.
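
A widely used analytic approximation of this threshold curve, due to Terhardt and adopted in many psychoacoustic models, is easy to evaluate. The sketch below reproduces it as a rough average for young listeners with normal hearing, not as the standardized ISO curve.

    import math

    def threshold_in_quiet_db(f_hz):
        """Terhardt's approximation of the absolute threshold of hearing
        (threshold in quiet) in dB SPL as a function of frequency in Hz."""
        khz = f_hz / 1000.0
        return (3.64 * khz ** -0.8
                - 6.5 * math.exp(-0.6 * (khz - 3.3) ** 2)
                + 1e-3 * khz ** 4)

    for f in (125, 500, 1000, 3000, 8000, 16000):
        print(f"{f:6d} Hz -> threshold about {threshold_in_quiet_db(f):6.1f} dB SPL")

The curve reaches its minimum of a few dB below 0 dB SPL between roughly 3 and 4 kHz and rises steeply toward the low and high ends of the hearing range.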

Psychophysics quantitatively investigates the relationship between physical stimuli and the sensations and perceptions they produce. Psychophysics has been described as "the scientific study of the relation between stimulus and sensation" or, more completely, as "the analysis of perceptual processes by studying the effect on a subject's experience or behaviour of systematically varying the properties of a stimulus along one or more physical dimensions".
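
The best-known quantitative results of this program are simple stimulus-sensation laws. The sketch below illustrates Stevens' power law, S = k * I^a, using approximate textbook exponents that depend on the exact stimulus conditions.

    def stevens_power_law(intensity, exponent, k=1.0):
        """Stevens' power law: sensation magnitude S = k * I ** a."""
        return k * intensity ** exponent

    # Approximate textbook exponents (they vary with stimulus conditions):
    examples = (("loudness (sound intensity)", 0.3),
                ("brightness", 0.33),
                ("electric shock", 3.5))
    for name, a in examples:
        ratio = stevens_power_law(10.0, a) / stevens_power_law(1.0, a)
        print(f"{name:28s}: tenfold stimulus -> sensation x {ratio:8.2f}")

A tenfold increase of the stimulus roughly doubles perceived loudness but multiplies the sensation of electric shock several thousandfold, showing how strongly the stimulus-sensation mapping depends on the modality.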

Loudness: subjective perception of sound pressure

In acoustics, loudness is the subjective perception of sound pressure. More formally, it is defined as, "That attribute of auditory sensation in terms of which sounds can be ordered on a scale extending from quiet to loud". The relation of physical attributes of sound to perceived loudness consists of physical, physiological and psychological components. The study of apparent loudness is included in the topic of psychoacoustics and employs methods of psychophysics.

Stimulus modality, also called sensory modality, is one aspect of a stimulus or what is perceived after a stimulus. For example, the temperature modality is registered after heat or cold stimulate a receptor. Some sensory modalities include: light, sound, temperature, taste, pressure, and smell. The type and location of the sensory receptor activated by the stimulus plays the primary role in coding the sensation. All sensory modalities work together to heighten stimuli sensation when necessary.

The precedence effect or law of the first wavefront is a binaural psychoacoustical effect. When a sound is followed by another sound separated by a sufficiently short time delay, listeners perceive a single auditory event; its perceived spatial location is dominated by the location of the first-arriving sound. The lagging sound also affects the perceived location. However, its effect is suppressed by the first-arriving sound.
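
The effect is easy to demonstrate with a synthetic stimulus. The sketch below builds a two-channel click pair with a few milliseconds of lag; the parameter values are illustrative, and the exact delay at which the lagging click breaks apart into a separate echo depends on the signal and the listener.

    import numpy as np

    def lead_lag_clicks(fs=48000, lag_ms=3.0, lag_gain=1.0, length_s=0.1):
        """Two-channel precedence-effect stimulus: an identical click in the
        left channel and, lag_ms later, in the right channel. For lags of a
        few milliseconds listeners typically report a single fused click
        located toward the leading (left) side."""
        n = int(length_s * fs)
        lag = int(lag_ms * 1e-3 * fs)
        stereo = np.zeros((n, 2))
        stereo[100, 0] = 1.0             # leading click, left channel
        stereo[100 + lag, 1] = lag_gain  # lagging click, right channel
        return stereo

    stimulus = lead_lag_clicks(lag_ms=3.0)  # fused; heard from the left
    # Much longer lags are instead heard as a separate echo from the right.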

Sound localization is a listener's ability to identify the location or origin of a detected sound in direction and distance.
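
For localization in the horizontal plane, one classic binaural cue is the interaural time difference (ITD). The sketch below evaluates Woodworth's spherical-head approximation; the head radius is a typical assumed value, and the formula ignores frequency dependence and near-field effects.

    import math

    def itd_woodworth(azimuth_deg, head_radius_m=0.0875, c=343.0):
        """Woodworth's spherical-head approximation of the interaural time
        difference for a distant source (0 deg = straight ahead,
        90 deg = fully to one side)."""
        theta = math.radians(azimuth_deg)
        return (head_radius_m / c) * (theta + math.sin(theta))

    for az in (0, 30, 60, 90):
        print(f"azimuth {az:3d} deg -> ITD about {itd_woodworth(az) * 1e6:6.1f} microseconds")

The maximum ITD of roughly 0.6 to 0.7 ms at 90 degrees is only one of several cues; interaural level differences and the spectral filtering described by the HRTF contribute as well, especially for elevation and front-back decisions.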

Audiometry is a branch of audiology and the science of measuring hearing acuity for variations in sound intensity and pitch and for tonal purity, involving thresholds and differing frequencies. Typically, audiometric tests determine a subject's hearing levels with the help of an audiometer, but may also measure ability to discriminate between different sound intensities, recognize pitch, or distinguish speech from background noise. Acoustic reflex and otoacoustic emissions may also be measured. Results of audiometric tests are used to diagnose hearing loss or diseases of the ear, and often make use of an audiogram.

Auditory imagery is a form of mental imagery that is used to organize and analyze sounds when no external auditory stimulus is present. This form of imagery is divided into several auditory modalities, such as verbal imagery or musical imagery, and differs from other sensory imagery such as motor imagery or visual imagery. The vividness and detail of auditory imagery can vary from person to person depending on their background and the condition of their brain. Through research on auditory imagery, behavioral neuroscientists have found that the auditory images developed in subjects' minds are generated in real time and contain fairly precise information about quantifiable auditory properties as well as melodic and harmonic relationships. These findings have more recently gained confirmation and recognition with the arrival of positron emission tomography (PET) and fMRI scans, which can confirm the correlation between physiological activity and psychological experience.

In audiology and psychoacoustics the concept of critical bands, introduced by Harvey Fletcher in 1933 and refined in 1940, describes the frequency bandwidth of the "auditory filter" created by the cochlea, the sense organ of hearing within the inner ear. Roughly, the critical band is the band of audio frequencies within which a second tone will interfere with the perception of the first tone by auditory masking.
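
Later measurements refined Fletcher's estimates into simple bandwidth formulas. The sketch below evaluates the equivalent rectangular bandwidth (ERB) of Glasberg and Moore, a common modern stand-in for the critical bandwidth; treat it as an approximation for moderate levels and normal hearing.

    def erb_bandwidth_hz(f_hz):
        """Glasberg & Moore's equivalent rectangular bandwidth (ERB) of the
        auditory filter centred at f_hz, in Hz."""
        return 24.7 * (4.37 * f_hz / 1000.0 + 1.0)

    for f in (100, 500, 1000, 4000, 10000):
        print(f"{f:6d} Hz -> ERB about {erb_bandwidth_hz(f):7.1f} Hz")

At 1 kHz the ERB is roughly 130 Hz and it grows approximately proportionally with frequency above about 1 kHz, which is why tones separated by a few hundred hertz fall within a single auditory filter at high centre frequencies but not at low ones.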

Hearing range: range of frequencies that can be heard by humans or other animals

Hearing range describes the range of frequencies that can be heard by humans or other animals, though it can also refer to the range of levels. The human range is commonly given as 20 to 20,000 Hz, although there is considerable variation between individuals, especially at high frequencies, and a gradual loss of sensitivity to higher frequencies with age is considered normal. Sensitivity also varies with frequency, as shown by equal-loudness contours. Routine investigation for hearing loss usually involves an audiogram, which shows threshold levels relative to a standardized norm.

Perceptual Evaluation of Audio Quality (PEAQ) is a standardized algorithm for objectively measuring perceived audio quality, developed in 1994-1998 by a joint venture of experts within Task Group 6Q of the International Telecommunication Union's Radiocommunication Sector (ITU-R). It was originally released as ITU-R Recommendation BS.1387 in 1998 and last updated in 2001. It utilizes software to simulate perceptual properties of the human ear and then integrates multiple model output variables into a single metric. PEAQ characterizes the perceived audio quality as subjects would do in a listening test according to ITU-R BS.1116. PEAQ results principally model mean opinion scores that cover a scale from 1 (bad) to 5 (excellent).

Auditory masking occurs when the perception of one sound is affected by the presence of another sound.

Sound: vibration that propagates as an acoustic wave

In physics, sound is a vibration that propagates as an acoustic wave, through a transmission medium such as a gas, liquid or solid.
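
Two basic quantities of such a wave are its frequency and its wavelength, related through the speed of sound in the medium. The small sketch below assumes dry air at about 20 °C (c ≈ 343 m/s).

    def wavelength_m(frequency_hz, speed_of_sound=343.0):
        """Acoustic wavelength: lambda = c / f, with c ~ 343 m/s in air at 20 C."""
        return speed_of_sound / frequency_hz

    for f in (20, 1000, 20000):  # roughly the limits and middle of human hearing
        print(f"{f:6d} Hz -> wavelength about {wavelength_m(f):7.3f} m")

Audible wavelengths in air thus range from about 17 m down to under 2 cm, which is why head-sized obstacles shadow high frequencies far more than low ones.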

William M. Hartmann

William M. Hartmann is a noted physicist, psychoacoustician, author, and former president of the Acoustical Society of America. His major contributions in psychoacoustics are in pitch perception, binaural hearing, and sound localization. Working with junior colleagues, he discovered several major pitch effects: the binaural edge pitch, the binaural coherence edge pitch, the pitch shifts of mistuned harmonics, and the harmonic unmasking effect. His textbook, Signals, Sound and Sensation, is widely used in courses on psychoacoustics. He is currently a professor of physics at Michigan State University.

Psychoacoustics is the branch of psychophysics involving the scientific study of sound perception and audiology—how humans perceive various sounds. More specifically, it is the branch of science studying the psychological responses associated with sound. Psychoacoustics is an interdisciplinary field of many areas, including psychology, acoustics, electronic engineering, physics, biology, physiology, and computer science.


A mixing engineer is responsible for combining ("mixing") different sonic elements of an auditory piece into a complete rendition, whether in music, film, or any other content of auditory nature. The finished piece, recorded or live, must achieve a good balance of properties, such as volume, pan positioning, and other effects, while resolving any arising frequency conflicts from various sound sources. These sound sources can comprise the different musical instruments or vocals in a band or orchestra, dialogue or foley in a film, and more.
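
As one small, concrete example of such a balancing decision, the sketch below implements a constant-power ("-3 dB") pan law, a common way to place a mono source between the left and right channels without changing its overall perceived level; it is only one of several pan laws in use.

    import math

    def constant_power_pan(sample, pan):
        """Constant-power pan law: pan in [-1 (left), +1 (right)] is mapped to
        left/right gains cos/sin of a quarter-circle angle, so the summed
        power L**2 + R**2 stays constant across pan positions."""
        angle = (pan + 1.0) * math.pi / 4.0  # -1 -> 0 rad, +1 -> pi/2 rad
        return sample * math.cos(angle), sample * math.sin(angle)

    for pan in (-1.0, 0.0, 1.0):
        l, r = constant_power_pan(1.0, pan)
        print(f"pan {pan:+.1f}: L = {l:.3f}, R = {r:.3f}, power = {l * l + r * r:.3f}")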

Ernst Terhardt is a German engineer and psychoacoustician who made significant contributions in diverse areas of audio communication including pitch perception, music cognition, and Fourier transformation. He was professor in the area of acoustic communication at the Institute of Electroacoustics, Technical University of Munich, Germany.

Auditory science or hearing science is a field of research and education concerning the perception of sounds by humans, animals, or machines. It is a heavily interdisciplinary field at the crossroad between acoustics, neuroscience, and psychology. It is often related to one or many of these other fields: psychophysics, psychoacoustics, audiology, physiology, otorhinolaryngology, speech science, automatic speech recognition, music psychology, linguistics, and psycholinguistics.

References

  1. Blauert, J. (1983). Spatial Hearing: The Psychophysics of Human Sound Localization. MIT Press, Cambridge, Massachusetts. Chapter 1.