Crossmodal attention

Crossmodal attention refers to the distribution of attention across different senses. Attention is the cognitive process of selectively emphasizing some sensory stimuli while ignoring others. According to the crossmodal attention perspective, attention often operates simultaneously across multiple sensory modalities.[1] These modalities process information from different sensory fields, such as the visual, auditory, spatial, and tactile fields.[2] While each is specialized for a particular type of sensory information, there is considerable overlap between them, which has led researchers to question whether attention is modality-specific or draws on shared "cross-modal" resources.[1] Crossmodal attention is considered to be the overlap between modalities that can both enhance and limit attentional processing. The most commonly cited example of crossmodal attention is the cocktail party effect: a person's ability to focus on one important stimulus instead of other, less important stimuli. This allows deeper processing of the attended stimulus while the others are ignored.

A primary concern for cognitive psychologists researching attention is whether directing attention to one sensory modality comes at the expense of others.[3] Previous research has often examined how directing attention to different modalities affects the efficiency of performance in various tasks.[3][4][5][6] Studies have found that the interplay between attentional modalities exists at the neurological level,[7][8] providing evidence for the influence of crossmodal attention. However, a greater number of studies have emphasized the deficits in attention caused by shifting between modalities.[1][3][4][5]

Deficits caused by crossmodal attention

Because crossmodal attention requires attending to two or more types of sensory information simultaneously, attentional resources are typically divided unequally. Most research suggests that this divided attention produces more attentional deficits than benefits, raising questions about the effectiveness of multitasking and the dangers associated with it. Reaction times slow significantly when distractions occur across modalities,[9] and in real-life situations these slower reactions can create dangerous situations. Recent media concern on this topic has centered on cellphone use while driving. Studies have found that processing, and therefore attending to, auditory information can impair the simultaneous processing of visual information.[10] This suggests that attending to the auditory information of a cellphone conversation while driving will impair the driver's visual attention and ability to drive, endangering the driver, passengers, pedestrians, and other drivers and their passengers. Similar studies have examined how visual attention is affected by auditory stimuli in relation to hemispatial neglect,[4] responses to cuing,[5] and general spatial processing.[2] The majority of this research suggests that multitasking and divided attention, while possible, degrade the quality of the directed attention, and that attention is a limited resource that cannot be infinitely divided between modalities and tasks.

Benefits

While research on crossmodal attention has found that deficits in attending often occur, this research has led to a better understanding of attentional processing. Some studies have used positron emission tomography (PET) to examine the neurological basis of how we selectively attend to information through different sensory modalities.[2] Event-related potentials (ERPs) have also been used to measure how humans encode and process attended information in the brain.[10] By increasing our understanding of modality-specific and crossmodal attention, we are better able to understand how we think and direct our attention.

In addition to a greater general understanding of attention, other benefits of crossmodal attention have been found. Studies show that reinforcing information through more than one modality can increase learning.[11] This supports the traditional theory that pairing auditory and visual stimuli that communicate the same information improves processing and memory.

Related Research Articles

<span class="mw-page-title-main">Attention</span> Psychological process of selectively perceiving and prioritising discrete aspects of information

Attention is the concentration of awareness on some phenomenon to the exclusion of other stimuli. It is a process of selectively concentrating on a discrete aspect of information, whether considered subjective or objective. William James (1890) wrote that "Attention is the taking possession by the mind, in clear and vivid form, of one out of what seem several simultaneously possible objects or trains of thought. Focalization, concentration, of consciousness are of its essence." Attention has also been described as the allocation of limited cognitive processing resources. Attention is manifested by an attentional bottleneck, in terms of the amount of data the brain can process each second; for example, in human vision, less than 1% of the visual input data can enter the bottleneck, leading to inattentional blindness.

<span class="mw-page-title-main">McGurk effect</span> Perceptual illusion

The McGurk effect is a perceptual phenomenon that demonstrates an interaction between hearing and vision in speech perception. The illusion occurs when the auditory component of one sound is paired with the visual component of another sound, leading to the perception of a third sound. The visual information a person gets from seeing a person speak changes the way they hear the sound. If a person is getting poor-quality auditory information but good-quality visual information, they may be more likely to experience the McGurk effect. Integration abilities for audio and visual information may also influence whether a person will experience the effect; people who are better at sensory integration have been shown to be more susceptible to it. People are affected differently by the McGurk effect depending on many factors, including brain damage and other disorders.

<span class="mw-page-title-main">Hemispatial neglect</span> Medical condition

Hemispatial neglect is a neuropsychological condition in which, after damage to one hemisphere of the brain, a deficit in attention to and awareness of the side of space opposite the damaged hemisphere is observed. It is defined by the inability of a person to process and perceive stimuli on the contralesional side of the body or environment. Hemispatial neglect is very commonly contralateral to the damaged hemisphere, but instances of ipsilesional neglect have been reported.

Multisensory integration, also known as multimodal integration, is the study of how information from the different sensory modalities may be integrated by the nervous system. A coherent representation of objects combining modalities enables animals to have meaningful perceptual experiences. Indeed, multisensory integration is central to adaptive behavior because it allows animals to perceive a world of coherent perceptual entities. Multisensory integration also deals with how different sensory modalities interact with one another and alter each other's processing.

<span class="mw-page-title-main">Cocktail party effect</span> Ability of the brain to focus on a single auditory stimulus by filtering out background noise

The cocktail party effect refers to the phenomenon wherein the brain focuses a person's attention on a particular stimulus, usually auditory. This focus excludes a range of other stimuli from conscious awareness, as when a partygoer follows a single conversation in a noisy room. This ability is widely distributed among humans, with most listeners more or less easily able to portion the totality of sound detected by the ears into distinct streams, and subsequently to decide which streams are most pertinent, excluding all or most others.

<span class="mw-page-title-main">Associative visual agnosia</span> Medical condition

Associative visual agnosia is a form of visual agnosia. It is an impairment in recognition or assigning meaning to a stimulus that is accurately perceived and not associated with a generalized deficit in intelligence, memory, language or attention. The disorder appears to be very uncommon in a "pure" or uncomplicated form and is usually accompanied by other complex neuropsychological problems due to the nature of the etiology. Affected individuals can accurately distinguish the object, as demonstrated by the ability to draw a picture of it or categorize accurately, yet they are unable to identify the object, its features or its functions.

Sensory gating describes neural processes of filtering out redundant or irrelevant stimuli from all possible environmental stimuli reaching the brain. Also referred to as gating or filtering, sensory gating prevents an overload of information in the higher cortical centers of the brain. Sensory gating can also occur in different forms through changes in both perception and sensation, affected by various factors such as "arousal, recent stimulus exposure, and selective attention."

The mismatch negativity (MMN) or mismatch field (MMF) is a component of the event-related potential (ERP) to an odd stimulus in a sequence of stimuli. It arises from electrical activity in the brain and is studied within the field of cognitive neuroscience and psychology. It can occur in any sensory system, but has most frequently been studied for hearing and for vision, in which case it is abbreviated to vMMN. The (v)MMN occurs after an infrequent change in a repetitive sequence of stimuli. For example, a rare deviant (d) stimulus can be interspersed among a series of frequent standard (s) stimuli. In hearing, a deviant sound can differ from the standards in one or more perceptual features such as pitch, duration, loudness, or location. The MMN can be elicited regardless of whether someone is paying attention to the sequence. During auditory sequences, a person can be reading or watching a silent subtitled movie, yet still show a clear MMN. In the case of visual stimuli, the MMN occurs after an infrequent change in a repetitive sequence of images.
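As a rough illustration of the oddball paradigm described above, the following minimal Python sketch generates a standard/deviant trial sequence. The trial count, deviant probability, and minimum-gap constraint are illustrative assumptions, not parameters taken from any cited study.

    import random

    def oddball_sequence(n_trials=200, p_deviant=0.1, min_gap=2, seed=0):
        # Generate a standard ('s') / deviant ('d') trial sequence.
        # Deviants are interspersed among standards, with at least `min_gap`
        # standards between successive deviants so that each deviant breaks
        # an established repetition (the condition that elicits the MMN).
        # All parameter values here are illustrative assumptions.
        rng = random.Random(seed)
        seq, gap = [], min_gap  # start eligible for a deviant
        for _ in range(n_trials):
            if gap >= min_gap and rng.random() < p_deviant:
                seq.append("d")
                gap = 0
            else:
                seq.append("s")
                gap += 1
        return seq

    seq = oddball_sequence()
    print("".join(seq))               # e.g. sssdssssssd...
    print(seq.count("d") / len(seq))  # observed deviant rate

In an actual experiment, such a sequence would then be mapped to concrete stimuli, for example standard and deviant tones differing in pitch.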

The cutaneous rabbit illusion is a tactile illusion evoked by tapping two or more separate regions of the skin in rapid succession. The illusion is most readily evoked on regions of the body surface that have relatively poor spatial acuity, such as the forearm. A rapid sequence of taps delivered first near the wrist and then near the elbow creates the sensation of sequential taps hopping up the arm from the wrist towards the elbow, although no physical stimulus was applied between the two actual stimulus locations. Similarly, stimuli delivered first near the elbow then near the wrist evoke the illusory perception of taps hopping from elbow towards wrist. The illusion was discovered by Frank Geldard and Carl Sherrick of Princeton University, in the early 1970s, and further characterized by Geldard (1982) and in many subsequent studies. Geldard and Sherrick likened the perception to that of a rabbit hopping along the skin, giving the phenomenon its name. While the rabbit illusion has been most extensively studied in the tactile domain, analogous sensory saltation illusions have been observed in audition and vision. The word "saltation" refers to the leaping or jumping nature of the percept.

Echoic memory is the sensory memory register specific to auditory information (sounds). Once an auditory stimulus is heard, it is stored in memory so that it can be processed and understood. Unlike most visual memory, where a person can choose how long to view the stimulus and can reassess it repeatedly, auditory stimuli are usually transient and cannot be reassessed. Since echoic memories are heard only once, they are stored for slightly longer periods of time than iconic memories. Auditory stimuli are received by the ear one at a time before they can be processed and understood.

Extinction is a neurological disorder that impairs the ability to perceive multiple stimuli of the same type simultaneously. Extinction is usually caused by damage resulting in lesions on one side of the brain. Those who are affected by extinction have a lack of awareness in the contralesional side of space and a loss of exploratory search and other actions normally directed toward that side.

Perceptual learning is the learning of better perceptual skills, such as differentiating two musical tones from one another or categorizing spatial and temporal patterns relevant to real-world expertise. Examples of this may include reading, seeing relations among chess pieces, and knowing whether or not an X-ray image shows a tumor.

<span class="mw-page-title-main">Cross modal plasticity</span> Reorganization of neurons in the brain to integrate the function of two or more sensory systems

Cross modal plasticity is the adaptive reorganization of neurons to integrate the function of two or more sensory systems. Cross modal plasticity is a type of neuroplasticity and often occurs after sensory deprivation due to disease or brain damage. The reorganization of the neural network is greatest following long-term sensory deprivation, such as congenital blindness or pre-lingual deafness. In these instances, cross modal plasticity can strengthen other sensory systems to compensate for the lack of vision or hearing. This strengthening is due to new connections that are formed to brain cortices that no longer receive sensory input.

Auditory spatial attention is a specific form of attention, involving the focusing of auditory perception to a location in space.

<span class="mw-page-title-main">Visual N1</span>

The visual N1 is a visual evoked potential, a type of event-related electrical potential (ERP), that is produced in the brain and recorded on the scalp. The N1 is so named to reflect the polarity and typical timing of the component. The "N" indicates that the polarity of the component is negative with respect to an average mastoid reference. The "1" originally indicated that it was the first negative-going component, but it now better indexes the typical peak of this component, which is around 150 to 200 milliseconds post-stimulus. The N1 deflection may be detected at most recording sites, including the occipital, parietal, central, and frontal electrode sites. Although the visual N1 is widely distributed over the entire scalp, it peaks earlier over frontal than posterior regions of the scalp, suggestive of distinct neural and/or cognitive correlates. The N1 is elicited by visual stimuli, and is part of the visual evoked potential – a series of voltage deflections observed in response to visual onsets, offsets, and changes. Both the right and left hemispheres generate an N1, but the laterality of the N1 depends on whether a stimulus is presented centrally, laterally, or bilaterally. When a stimulus is presented centrally, the N1 is bilateral. When presented laterally, the N1 is larger, earlier, and contralateral to the visual field of the stimulus. When two visual stimuli are presented, one in each visual field, the N1 is bilateral. In the latter case, the N1's asymmetry is modulated by attention. Additionally, its amplitude is influenced by selective attention, and thus it has been used to study a variety of attentional processes.

Broadbent's filter model is an early selection theory of attention.

The Colavita visual dominance effect refers to the phenomenon in which study participants respond more often to the visual component of an audiovisual stimulus when presented with bimodal stimuli.

Selective auditory attention, or selective hearing, is a process of the auditory system in which an individual selects or focuses on certain stimuli for auditory information processing while other stimuli are disregarded. This selection is very important, as human processing and memory capabilities have limited capacity. When people use selective hearing, noise from the surrounding environment is heard by the auditory system, but only certain parts of the auditory information are chosen to be processed by the brain.

Visual selective attention is a brain function that controls the processing of retinal input based on whether it is relevant or important. It selects particular representations to enter perceptual awareness and therefore guide behaviour. Through this process, less relevant information is suppressed.

<span class="mw-page-title-main">Laura Busse</span> German neuroscientist

Laura Busse is a German neuroscientist and professor of Systemic Neuroscience within the Division of Neurobiology at the Ludwig Maximilian University of Munich. Busse's lab studies context-dependent visual processing in mouse models by performing large scale in vivo electrophysiological recordings in the thalamic and cortical circuits of awake and behaving mice.

References

  1. Rapp, B.; Hendel, S. K. (2003). "Principles of cross-modal competition: Evidence from deficits of attention". Psychonomic Bulletin & Review. 10 (1): 210–219. doi:10.3758/BF03196487. PMID 12747510.
  2. Macaluso, E.; Frith, C. D.; Driver, J. (2002). "Directing Attention to Locations and to Sensory Modalities: Multiple Levels of Selective Processing Revealed with PET". Cerebral Cortex. 12 (4): 357–368. doi:10.1093/cercor/12.4.357. PMID 11884351.
  3. Driver, J.; Spence, C. (1998). "Crossmodal attention". Current Opinion in Neurobiology. 8 (2): 245–253. doi:10.1016/S0959-4388(98)80147-5. PMID 9635209.
  4. Van Vleet, T. M.; Robertson, L. C. (2006). "Cross-modal interactions in time and space: Auditory influence on visual attention in hemispatial neglect". Journal of Cognitive Neuroscience. 18 (8): 1368–1379. doi:10.1162/jocn.2006.18.8.1368. PMID 16859421.
  5. Prime, D. J.; McDonald, J. J.; Green, J.; Ward, L. M. (2008). "When cross-modal spatial attention fails". Canadian Journal of Experimental Psychology. 62 (3): 192–197. doi:10.1037/1196-1961.62.3.192. PMID 18778148.
  6. Driver, J.; Spence, C. (1998). "Cross-modal links in spatial attention". Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences. 353 (1373): 1319–1331. doi:10.1098/rstb.1998.0286. PMC 1692335. PMID 9770225.
  7. Kida, T.; Inui, K.; Tanaka, E.; Kakigi, R. (2010). "Dynamics of Within-, Inter-, and Cross-Modal Attentional Modulation". Journal of Neurophysiology. 105 (2): 674–686. doi:10.1152/jn.00807.2009. PMID 21148089.
  8. Herdman, C. M.; Friedman, A. (1985). "Multiple resources in divided attention: A cross-modal test of the independence of hemispheric resources". Journal of Experimental Psychology: Human Perception and Performance. 11 (1): 40–49. doi:10.1037/0096-1523.11.1.40. PMID 3156957.
  9. Chen, X. (2012). "Interaction between endogenous and exogenous orienting in crossmodal attention". Scandinavian Journal of Psychology. 53 (4): 303–308.
  10. Gherri, E.; Eimer, M. (2011). "Active listening impairs visual perception and selectivity: An ERP study of auditory dual-task costs on visual attention". Journal of Cognitive Neuroscience. 23 (4): 832–844. doi:10.1162/jocn.2010.21468. PMID 20465407.
  11. Robinson, C. W.; Sloutsky, V. M. (2013). "When audition dominates vision: Evidence from cross-modal statistical learning". Experimental Psychology. 60 (2): 113–121. doi:10.1027/1618-3169/a000177. PMID 23047918.