Stimulus filtering

Stimulus filtering occurs when an animal's nervous system fails to respond to stimuli that would otherwise cause a reaction. [1] The nervous system has evolved the capacity to perceive and distinguish minute differences among stimuli, allowing the animal to react only to significant ones. [2] This enables the animal to conserve energy, since it does not respond to unimportant signals.

Adaptive value

The proximate causes of stimulus filtering can be many things in and around an animal's environment, but the ultimate cause of the response may be the evolutionary advantage that filtering offers. An animal that saves energy by not responding to unnecessary stimuli may have increased fitness and so produce more offspring, whereas an animal that does not filter stimuli may have reduced fitness because of depleted energy stores. [3] An animal that practices stimulus filtering may also be more likely to respond appropriately to serious threats than an animal that is distracted by unimportant stimuli.

Physiological mechanism

When signals reach the animal, higher-order neurons determine which are important enough to preserve and which are insignificant and can be ignored. [2] The process works as a filter: the synapses of the neural network enhance certain signals and repress others, with simple stimuli handled by lower-level neurons and more complicated stimuli by higher-level neurons. [2]
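As a deliberately simplified illustration of such a filter (not a model of any specific circuit), a single stage can be sketched as a weighted sum with a firing threshold, where positive synaptic weights enhance inputs and negative weights repress them:

```python
def neuron_output(inputs, weights, threshold=1.0):
    """Toy neural filter: positive weights enhance a signal,
    negative weights repress it; only net drive at or above the
    threshold is passed on to higher-level neurons."""
    drive = sum(x * w for x, w in zip(inputs, weights))
    return drive if drive >= threshold else 0.0

# An enhanced signal passes through; a repressed one is filtered out
strong = neuron_output([1.0, 1.0], [2.0, -0.5])   # drive 1.5 -> transmitted
weak = neuron_output([1.0, 1.0], [0.3, 0.3])      # drive 0.6 -> filtered
```

Chaining such stages, with later stages taking earlier outputs as inputs, gives the hierarchy described above: simple features filtered early, complex combinations filtered later.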

Relation to humans

Stimulus filtering is also seen in humans on a day-to-day basis. The cocktail party effect refers to the tendency of people in a crowded room to ignore surrounding conversations and focus on the one they are participating in. The filter is not absolute, however: when individuals hear their own name in another conversation, their attention immediately shifts to it.

Examples

Moths

The evolution of the moth's auditory system has helped it escape bats' echolocation. A moth has an ear on each side of its thorax; ultrasonic bat vocalizations vibrate the tympanal membrane of each ear, which is attached to two auditory receptor cells, A1 and A2. [4] Intense sound pressure waves sweeping over the moth's body cause the tympanum to vibrate, deforming these receptor cells. The deformation opens stretch-sensitive channels in the cell membrane, providing the effective stimulus for a moth auditory receptor. These receptors work the way most sensory neurons do, responding to the energy contained in selected stimuli by changing the permeability of their cell membranes to positively charged ions.

Although the A1 and A2 receptors operate in a similar fashion, there are significant differences between them. The A1 receptor is the main bat detector and is sensitive to low-intensity sounds, so it fires while the bat is still distant; as its firing rate increases, the moth turns away from the bat to reduce its sonar echo. The relative position of the bat is encoded in the differential firing rates of the A1 cells on either side of the moth's body: the cells on the side farther from the bat receive a weaker signal and fire at a slower rate. The A2 receptor is the emergency back-up system, responding only to the intense sounds of a nearby bat and initiating erratic flight movements as a last-ditch effort to evade capture. [3]

This differential sensitivity of the A1 and A2 sensory neurons produces stimulus filtering of bat vocalizations: long-distance evasion tactics are engaged when the bat is far away and only the A1 sensory neurons fire, while short-distance evasion tactics are engaged when the bat is in extremely close range and the A2 sensory neurons fire. [4] Two distinct receptors thus provide a physiological mechanism whose adaptive value lies in evading capture by bats.
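The division of labor between the two receptors can be sketched as a toy threshold model (all thresholds, units, and behavior labels here are illustrative assumptions, not measured values):

```python
def moth_response(intensity_left, intensity_right,
                  a1_threshold=10.0, a2_threshold=80.0):
    """Toy model of A1/A2 stimulus filtering in a moth ear.

    A1 cells fire at low intensities (distant bat), and the left/right
    difference in their drive encodes bat direction; A2 cells fire only
    at high intensities (bat very close) and trigger erratic flight.
    Thresholds are arbitrary illustrative numbers.
    """
    loudest = max(intensity_left, intensity_right)
    if loudest >= a2_threshold:
        return "erratic flight"            # A2: last-ditch evasion
    if loudest >= a1_threshold:
        # A1: turn away from the side receiving the stronger signal
        if intensity_left > intensity_right:
            return "turn right"
        if intensity_right > intensity_left:
            return "turn left"
        return "fly straight away"
    return "no response"                   # faint stimulus filtered out
```

A faint sound below both thresholds is simply ignored, which is the filtering step; only as the bat closes does the response escalate from directed turning to erratic flight.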

Parasitoid flies

Female flies of the species Ormia ochracea possess organs that can detect the frequencies of cricket songs from meters away. This ability is important for the survival of the species because females deposit their first-instar larvae on the body of a cricket, where the larvae feed and molt for approximately seven days. After this period, the larvae emerge to develop into adult flies, and the cricket usually perishes.

Researchers were puzzled about how such precise hearing could arise from so small an ear structure. Animals typically detect and localize sounds using the interaural time difference (ITD) and the interaural level difference (ILD). [5] The ITD is the difference in the time it takes a sound to reach each ear; the ILD is the difference in sound intensity measured between the two ears. In a fly this small, the ITD would reach at most about 1.5 microseconds and the ILD would be less than one decibel. [5] Such small values are very difficult to sense. To see how the fly solves this problem, researchers studied the mechanical properties of its ears. They found a presternum structure linking the two tympanal membranes that is critical for sound detection and localization. The structure acts as a lever, transferring and amplifying vibrational energy between the membranes. [5] When sound strikes the membranes at different amplitudes, the presternum sets up coupled vibration modes through bending and rocking. [5] This mechanical coupling helps the nervous system distinguish which side the sound is coming from. Because the presternum acts as an intertympanal bridge, the effective ITD is increased from 1.5 μs to 55 μs and the effective ILD from less than one decibel to more than 10 decibels. [5]
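The scale of the problem, and the mechanical gain attributed to the intertympanal bridge, can be checked with a few lines of arithmetic (the ~0.5 mm ear separation is an assumed order-of-magnitude figure, chosen to be consistent with the ~1.5 microsecond acoustic ITD quoted above):

```python
SPEED_OF_SOUND = 343.0      # m/s in air
EAR_SEPARATION = 0.00052    # metres between the tympana (assumed figure)

# Maximum acoustic ITD occurs when sound arrives from directly to one side
max_itd = EAR_SEPARATION / SPEED_OF_SOUND          # ~1.5e-6 s

# Effective ITD after mechanical coupling by the presternum (from the text)
mechanical_itd = 55e-6                             # seconds

gain = mechanical_itd / max_itd
print(f"acoustic ITD: {max_itd * 1e6:.2f} microseconds")
print(f"mechanical gain: ~{gain:.0f}x")
```

The roughly thirty-fold gain is what turns a sub-microsecond timing cue, far below what neurons can resolve directly, into one the fly's afferents can use.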

Looking at the fly's nervous system, researchers found three types of auditory afferents. Type 1 fires only one spike at stimulus onset, has low jitter (variability in spike timing across stimulus presentations), shows no spontaneous activity, and is the most common type. [6] Type 2 fires two to four spikes at stimulus onset, with jitter increasing on subsequent spikes, and has low spontaneous activity. [6] Type 3 spikes tonically throughout the stimulus, has low jitter only on its first spikes, has low spontaneous activity, and is the least common type. Neurons responded most strongly to sound frequencies between 4 and 9 kHz, the range that includes the frequencies present in cricket songs, [5] and strongest of all at 4.5 kHz, the frequency of the Gryllus song. [5] Regardless of afferent type, all observed neurons showed an inverse intensity–latency relationship: the stronger the stimulus, the shorter the time before the neuron begins to respond. The difference between the two sides of the animal in the number of afferents firing above threshold is called a population code and can account for sound localization. [6]
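A minimal sketch of the population code described above, with hypothetical thresholds: each afferent is counted as active if the stimulus exceeds its threshold, latency falls with intensity, and the left–right difference in active-afferent counts indicates the side of the sound source.

```python
def active_afferents(intensity, thresholds):
    """Number of afferents whose threshold the stimulus reaches."""
    return sum(1 for t in thresholds if intensity >= t)

def latency_ms(intensity, base=10.0, k=5.0):
    """Toy inverse intensity-latency relation: louder -> faster response.
    Constants are arbitrary illustrative values."""
    return base + k / intensity

# Hypothetical afferent thresholds (arbitrary units), same on both sides
thresholds = [1, 2, 4, 8, 16]

left, right = 10.0, 3.0        # sound source closer to the left side
population_code = (active_afferents(left, thresholds)
                   - active_afferents(right, thresholds))
# positive -> source on the left; magnitude grows with the level difference
```

Note how the code needs no precise spike timing at all: a simple count of which afferents cleared threshold on each side is enough to lateralize the cricket song.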

Midshipman fish

Female midshipman fish undergo stimulus filtering when it comes time to mate. Midshipman fish use stimulus filtering when listening to sounds produced underwater. [7] Dominant underwater signals range between 60 and 120 Hz, the band to which the fish's auditory receptors are normally most sensitive. [3] The female auditory system, however, changes seasonally in its response to the acoustical stimuli in the songs of male midshipman fish. In the summer, when female midshipman fish are reproductive, they listen for the male humming song, which can reach frequencies of 400 Hz. [3] Because summer is the females' reproductive season, their hearing becomes more sensitive to the high frequencies of the male hum.
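The seasonal shift can be caricatured as a change in the passband of a simple filter; the cutoff values come from the ranges quoted above, and the sharp band edges are purely illustrative:

```python
def female_responds(frequency_hz, season):
    """Toy seasonal stimulus filter for female midshipman hearing.

    Outside the breeding season, sensitivity covers the ambient
    60-120 Hz band; in summer it extends toward the ~400 Hz
    components of the male hum. Band edges are illustrative.
    """
    upper = 400 if season == "summer" else 120
    return 60 <= frequency_hz <= upper
```

The same stimulus (a 400 Hz component of the hum) is filtered out of the female's behavioral world for most of the year and admitted only when responding to it matters for reproduction.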

References

  1. "Stimulus filtering". Oxford Reference. 2012-02-17. Retrieved 2015-02-25.
  2. "5 – Stimulus filtering: vision and motion detection". University Publishing Online. Ebooks.cambridge.org. Retrieved 2015-02-25.
  3. Alcock, J. (2009). Animal Behavior (9th ed., Vol. 1). Sunderland, MA: Sinauer Associates.
  4. "Predator and Prey Interactions" (Sinervo, 1997). Bio.research.ucsc.edu. Retrieved 2015-02-25.
  5. "Neuroethology: Fly Hearing". Nelson.beckman.illinois.edu. 2003-04-29. Retrieved 2015-02-25.
  6. Oshinsky, M. L., & Hoy, R. R. (2002-08-15). "Physiology of the Auditory Afferents in an Acoustic Parasitoid Fly". Jneurosci.org. Retrieved 2015-02-25.
  7. Alderks, P. W., & Sisneros, J. A. (2011). "Ontogeny of auditory saccular sensitivity in the plainfin midshipman fish, Porichthys notatus". J Comp Physiol A, 127, 387–398.
