Sensory processing

Sensory processing is the process that organizes and distinguishes sensation (sensory information) from one's own body and the environment, making it possible to use the body effectively within that environment. Specifically, it deals with how the brain processes input from multiple sensory modalities, [1] [2] such as proprioception, vision, audition, touch, smell, the vestibular sense, interoception, and taste, into usable functional outputs.

It has been believed for some time that inputs from different sensory organs are processed in different areas in the brain. The communication within and among these specialized areas of the brain is known as functional integration. [3] [4] [5] Newer research has shown that these different regions of the brain may not be solely responsible for only one sensory modality, but could use multiple inputs to perceive what the body senses about its environment. Multisensory integration is necessary for almost every activity that we perform because the combination of multiple sensory inputs is essential for us to comprehend our surroundings.

Overview

It has been believed for some time that inputs from different sensory organs are processed in different areas of the brain, a view associated with systems neuroscience. Functional neuroimaging shows that sensory-specific cortices are activated by different inputs: for example, regions in the occipital cortex are tied to vision, while those on the superior temporal gyrus receive auditory input. Some studies, however, suggest multisensory convergence at deeper levels than these sensory-specific cortices. This convergence of multiple sensory modalities is known as multisensory integration.

Sensory processing deals with how the brain processes sensory input from multiple sensory modalities. These include the five classic senses of vision (sight), audition (hearing), tactile stimulation (touch), olfaction (smell), and gustation (taste). Other sensory modalities also exist, such as the vestibular sense (balance and the sense of movement) and proprioception (the sense of knowing one's position in space), along with the sense of time (knowing where one is in time or within a sequence of activities). It is important that the information from these different sensory modalities be relatable. The sensory inputs themselves arrive as different electrical signals and in different contexts. [6] Through sensory processing, the brain relates all sensory inputs into a coherent percept, upon which our interaction with the environment is ultimately based.

Basic structures involved

The different senses were long thought to be controlled by separate lobes of the brain, [7] called projection areas. The lobes of the brain are classifications that divide the brain both anatomically and functionally. [8] These are the frontal lobe, responsible for conscious thought; the parietal lobe, responsible for visuospatial processing; the occipital lobe, responsible for the sense of sight; and the temporal lobe, responsible for the senses of smell and sound. From the earliest days of neurology, each lobe was thought to be solely responsible for a single sensory modality. [9] Newer research, however, has shown that this may not entirely be the case. Notably, in the mid-20th century, Gonzalo's research led him to propose cortical functional gradients, in which functional specificity varies gradually across the cortex. [10]

Problems

Sometimes there can be a problem with the encoding of sensory information. This is known as sensory processing disorder (SPD), which can be further classified into three main types. [11]

There are several therapies used to treat SPD. Anna Jean Ayres claimed that a child needs a healthy "sensory diet": the full set of activities a child engages in that supplies the sensory inputs the brain needs to improve its sensory processing.

History

In the 1930s, Wilder Penfield was conducting highly unusual operations at the Montreal Neurological Institute. [12] Penfield "pioneered the incorporation of neurophysiological principles in the practice of neurosurgery". [4] [13] Penfield was interested in solving the epileptic seizure problems his patients were having. He used an electrode to stimulate different regions of the brain's cortex and would ask his still-conscious patient what he or she felt. This process led to the publication of his book, The Cerebral Cortex of Man. The "mapping" of the sensations his patients felt led Penfield to chart the sensations triggered by stimulating different cortical regions. [14] Mrs. H. P. Cantlie was the artist Penfield hired to illustrate his findings. The result was the conception of the first sensory homunculus.

The homunculus is a visual representation of the intensity of sensations derived from different parts of the body. Wilder Penfield and his colleague Herbert Jasper developed the Montreal procedure, using an electrode to stimulate different parts of the brain to determine which were the source of the epilepsy. That tissue could then be surgically removed or altered to restore optimal brain performance. While performing these tests, they discovered that the functional maps of the sensory and motor cortices were similar in all patients. Because of their novelty at the time, these homunculi were hailed as the "E=mc² of neuroscience". [12]

Current research

There are still no definitive answers to questions about the relationship between functional and structural asymmetries in the brain. [15] The human brain has a number of asymmetries, including the processing of language mainly in the left hemisphere. There have been cases, however, in which individuals have language skills comparable to those of typical left-hemisphere language processors, yet mainly use their right hemisphere or both hemispheres. These cases raise the possibility that function may not follow structure in some cognitive tasks. [15] Current research in sensory processing and multisensory integration aims to unlock the mysteries behind brain lateralization.

Research on sensory processing has much to offer toward understanding the function of the brain as a whole. The primary task of multisensory integration is to sort the vast quantity of sensory information arriving through multiple sensory modalities. These modalities are not independent; they are complementary. Where one sensory modality gives information about one part of a situation, another can pick up other necessary information. Bringing this information together facilitates a better understanding of the physical world around us.

It may seem redundant that we receive multiple sensory inputs about the same object, but this apparent redundancy serves a purpose: it verifies that what we are experiencing is actually happening. Perceptions of the world are based on models that we build of the world. Sensory information informs these models, but it can also confuse them; sensory illusions occur when the models do not match up. Where our visual system may fool us in one case, our auditory system can bring us back to a ground reality. Combining multiple sensory modalities thus prevents sensory misrepresentation: the resulting model is much more robust and gives a better assessment of the situation. Logically, it is far easier to fool one sense than to simultaneously fool two or more senses.
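The point about fooling several senses at once can be made concrete with a toy probability sketch. This is illustrative only: the 10% figure below is a made-up assumption, not a measured value, and real senses are never fully independent.

```python
# Toy illustration: if each sense, acting independently, can be fooled
# with probability p, the chance that k senses are all fooled at once
# falls off multiplicatively, as p ** k.
def p_all_fooled(p: float, k: int) -> float:
    """Probability that k independent senses are simultaneously fooled."""
    return p ** k

# Hypothetical numbers: a sense fooled 10% of the time on its own is
# falsely corroborated by a second independent sense only ~1% of the time.
single = p_all_fooled(0.10, 1)
double = p_all_fooled(0.10, 2)
```

Because cross-modal noise is partly correlated in practice, this multiplicative drop is an upper bound on the benefit, but it captures why agreement between modalities makes a percept more trustworthy.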

Examples

One of the earliest sensations is olfaction. Evolutionarily, gustation and olfaction developed together. This multisensory integration was necessary for early humans to ensure that they were receiving proper nutrition from their food, and to make sure that they were not consuming poisonous materials.[ citation needed ] Several other sensory integrations developed early in the human evolutionary timeline. The integration of vision and audition was necessary for spatial mapping. Integration between vision and tactile sensation developed along with finer motor skills, including better hand-eye coordination. As humans became bipedal, balance became exponentially more essential to survival. The multisensory integration of visual, vestibular (balance), and proprioceptive inputs played an important role in our development into upright walkers.

Audiovisual system

Perhaps one of the most studied sensory integrations is the relationship between vision and audition. [16] These two senses perceive the same objects in the world in different ways, and combining them helps us understand this information better. [17] Vision dominates our perception of the world around us because visual spatial information is one of the most reliable sensory modalities: visual stimuli are recorded directly onto the retina, and there are few, if any, external distortions that give the brain incorrect information about an object's true location. [18] Other spatial information is not as reliable. Consider auditory spatial input, for example: an object's location can sometimes be determined solely from its sound, but that input is easily modified or altered, giving a less reliable spatial representation of the object. [19] Auditory information is therefore not spatially represented as precisely as visual stimuli. But once spatial mapping from visual information is available, multisensory integration brings the visual and auditory information together into a more robust mapping.
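This reliability-weighted combination is often modeled as maximum-likelihood ("near-optimal bimodal") integration, the account tested in the ventriloquist study cited in this article [22]. A minimal sketch, with hypothetical numbers: each cue is treated as a Gaussian estimate of an object's location, and the fused estimate weights each cue by its inverse variance, so the precise visual cue dominates the noisy auditory one.

```python
def fuse_cues(mu_v: float, var_v: float, mu_a: float, var_a: float):
    """Inverse-variance (maximum-likelihood) fusion of two Gaussian cues.

    mu_*  : each cue's location estimate
    var_* : each cue's variance (lower = more reliable)
    """
    w_v = (1.0 / var_v) / (1.0 / var_v + 1.0 / var_a)  # visual weight
    mu = w_v * mu_v + (1.0 - w_v) * mu_a
    var = 1.0 / (1.0 / var_v + 1.0 / var_a)  # always below either input variance
    return mu, var

# Hypothetical cues: a sharp visual estimate at 0 and a blurry auditory
# estimate at 10 fuse to a location near the visual cue, with reduced variance.
mu, var = fuse_cues(mu_v=0.0, var_v=1.0, mu_a=10.0, var_a=9.0)
# mu ≈ 1.0 (pulled only slightly toward the sound), var ≈ 0.9
```

This linear weighting describes small conflicts such as the ventriloquism effect; with large discrepancies, observers tend to discount one cue rather than fuse fully.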

Studies have shown that a dynamic neural mechanism exists for matching the auditory and visual inputs from an event that stimulates multiple senses. [20] One observed example is how the brain compensates for target distance. When you are speaking with someone or watching something happen, the auditory and visual signals are not processed concurrently, yet they are perceived as simultaneous. [21] This kind of multisensory integration can lead to slight misperceptions in the visual-auditory system, as in the ventriloquism effect. [22] An example of the ventriloquism effect is when a person on television appears to have his voice coming from his mouth rather than from the television's speakers. This occurs because of a pre-existing spatial representation within the brain that expects voices to come from a human's mouth. The perceived location of the auditory input is thereby captured by the visual input, and so spatially misattributed.

Sensorimotor system

Hand–eye coordination is one example of sensory integration. It requires a tight integration of what we perceive visually about an object and what we perceive tactilely about that same object. If these two senses were not combined within the brain, one would have less ability to manipulate an object. Eye–hand coordination places tactile sensation in the context of the visual system. The visual system is relatively static, in that it does not move around much, but the hands and other parts used in tactile sensory collection can move freely. This movement of the hands must be included in the mapping of both tactile and visual sensations; otherwise one would not be able to comprehend where one's hands were moving, or what one was touching and looking at. Consider an infant: it picks up objects and puts them in its mouth, or touches them to its feet or face. All of these actions culminate in the formation of spatial maps in the brain and the realization that "Hey, that thing that's moving this object is actually a part of me." Seeing the same thing they are feeling is a major step in the mapping infants need before they realize they can move their arms and interact with objects. This is the earliest and most explicit way of experiencing sensory integration.
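The mapping problem described above can be illustrated with a coordinate-frame sketch. The numbers and function name are hypothetical, and this is a drastic simplification of what neural circuits actually compute: proprioception reports the hand in a body-centered frame, while vision reports targets in an eye-centered frame, so comparing them requires a transform between frames.

```python
import math

def hand_in_eye_frame(hand_body, eye_body, gaze_angle):
    """Express a body-centered hand position in an eye-centered frame.

    hand_body, eye_body : (x, y) positions in the body-centered frame
    gaze_angle          : rotation of the eye frame, in radians
    """
    dx = hand_body[0] - eye_body[0]
    dy = hand_body[1] - eye_body[1]
    c, s = math.cos(gaze_angle), math.sin(gaze_angle)
    # The inverse rotation carries body-frame offsets into the eye's frame.
    return (c * dx + s * dy, -s * dx + c * dy)

# With the gaze straight ahead (angle 0), only the offset from the eye matters:
x, y = hand_in_eye_frame(hand_body=(0.3, 0.5), eye_body=(0.0, 0.4), gaze_angle=0.0)
# x ≈ 0.3, y ≈ 0.1 — the hand's seen location lines up with its felt location
```

Only when felt and seen positions are expressed in a common frame like this can the brain confirm that the moving hand and the seen hand are the same object, which is the mapping the infant example describes.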

Further research

In the future, research on sensory integration will be used to better understand how different sensory modalities are incorporated within the brain to help us perform even the simplest of tasks. For example, we do not yet understand how neural circuits transform sensory cues into changes in motor activity. More research on the sensorimotor system can help us understand how these movements are controlled, [23] an understanding that could potentially be used to build better prosthetics and eventually help patients who have lost the use of a limb. Learning more about how different sensory inputs combine could also have profound effects on new engineering approaches in robotics. A robot's sensory devices may take in inputs of different modalities, but with a better understanding of multisensory integration, we might be able to program robots to convert these data into useful output that better serves our purposes.

See also

Related Research Articles

Perception – Interpretation of sensory information

Perception is the organization, identification, and interpretation of sensory information in order to represent and understand the presented information or environment. All perception involves signals that go through the nervous system, which in turn result from physical or chemical stimulation of the sensory system. Vision involves light striking the retina of the eye; smell is mediated by odor molecules; and hearing involves pressure waves.

Hallucination – Perception in the absence of external stimulation that has the qualities of real perception

A hallucination is a perception in the absence of an external stimulus that has the qualities of a real perception. Hallucinations are vivid, substantial, and perceived to be located in external objective space. Hallucination has been described as a combination of two conscious states: brain wakefulness and REM sleep. Hallucinations are distinguishable from several related phenomena: dreaming, which does not involve wakefulness; pseudohallucination, which does not mimic real perception and is accurately perceived as unreal; illusion, which involves distorted or misinterpreted real perception; and mental imagery, which does not mimic real perception and is under voluntary control. Hallucinations also differ from "delusional perceptions", in which a correctly sensed and interpreted stimulus is given some additional significance.

An illusion is a distortion of the senses, which can reveal how the mind normally organizes and interprets sensory stimulation. Although illusions distort the human perception of reality, they are generally shared by most people.

Sensory nervous system – Part of the nervous system

The sensory nervous system is a part of the nervous system responsible for processing sensory information. A sensory system consists of sensory neurons, neural pathways, and parts of the brain involved in sensory perception and interoception. Commonly recognized sensory systems are those for vision, hearing, touch, taste, smell, balance and visceral sensation. Sense organs are transducers that convert data from the outer physical world to the realm of the mind where people interpret the information, creating their perception of the world around them.

Parietal lobe – Part of the brain responsible for sensory input and some language processing

The parietal lobe is one of the four major lobes of the cerebral cortex in the brain of mammals. The parietal lobe is positioned above the temporal lobe and behind the frontal lobe and central sulcus.

Claustrum – Structure in the brain

The claustrum is a thin sheet of neurons and supporting glial cells, that connects to the cerebral cortex and subcortical regions including the amygdala, hippocampus and thalamus of the brain. It is located between the insular cortex laterally and the putamen medially, encased by the extreme and external capsules respectively. Blood to the claustrum is supplied by the middle cerebral artery. It is considered to be the most densely connected structure in the brain, and thus hypothesized to allow for the integration of various cortical inputs such as vision, sound and touch, into one experience. Other hypotheses suggest that the claustrum plays a role in salience processing, to direct attention towards the most behaviorally relevant stimuli amongst the background noise. The claustrum is difficult to study given the limited number of individuals with claustral lesions and the poor resolution of neuroimaging.

Stimulus modality, also called sensory modality, is one aspect of a stimulus or what is perceived after a stimulus. For example, the temperature modality is registered after heat or cold stimulate a receptor. Some sensory modalities include: light, sound, temperature, taste, pressure, and smell. The type and location of the sensory receptor activated by the stimulus plays the primary role in coding the sensation. All sensory modalities work together to heighten stimuli sensation when necessary.

Multisensory integration, also known as multimodal integration, is the study of how information from the different sensory modalities may be integrated by the nervous system. A coherent representation of objects combining modalities enables animals to have meaningful perceptual experiences. Indeed, multisensory integration is central to adaptive behavior because it allows animals to perceive a world of coherent perceptual entities. Multisensory integration also deals with how different sensory modalities interact with one another and alter each other's processing.

Sensory substitution is a change of the characteristics of one sensory modality into stimuli of another sensory modality.

Associative visual agnosia – Medical condition

Associative visual agnosia is a form of visual agnosia. It is an impairment in recognition or assigning meaning to a stimulus that is accurately perceived and not associated with a generalized deficit in intelligence, memory, language or attention. The disorder appears to be very uncommon in a "pure" or uncomplicated form and is usually accompanied by other complex neuropsychological problems due to the nature of the etiology. Affected individuals can accurately distinguish the object, as demonstrated by the ability to draw a picture of it or categorize accurately, yet they are unable to identify the object, its features or its functions.

The two-streams hypothesis is a model of the neural processing of vision as well as hearing. The hypothesis, given its initial characterisation in a paper by David Milner and Melvyn A. Goodale in 1992, argues that humans possess two distinct visual systems. Recently there seems to be evidence of two distinct auditory systems as well. As visual information exits the occipital lobe, and as sound leaves the phonological network, it follows two main pathways, or "streams". The ventral stream leads to the temporal lobe, which is involved with object and visual identification and recognition. The dorsal stream leads to the parietal lobe, which is involved with processing the object's spatial location relative to the viewer and with speech repetition.


Sensory integration therapy (SIT) was originally developed by occupational therapist A. Jean Ayres in the 1970s to help children with sensory-processing difficulties. It was specifically designed to treat Sensory Processing Disorder. Sensory Integration Therapy is based on A. Jean Ayres's Sensory Integration Theory, which proposes that sensory-processing is linked to emotional regulation, learning, behavior, and participation in daily life. Sensory integration is the process of organizing sensations from the body and from environmental stimuli.

Body schema is an organism's internal model of its own body, including the position of its limbs. The neurologist Sir Henry Head originally defined it as a postural model of the body that actively organizes and modifies 'the impressions produced by incoming sensory impulses in such a way that the final sensation of body position, or of locality, rises into consciousness charged with a relation to something that has happened before'. As a postural model that keeps track of limb position, it plays an important role in control of action.

Apperceptive agnosia is a failure in recognition that is due to a failure of perception. In contrast, associative agnosia is a type of agnosia where perception occurs but recognition still does not occur. When referring to apperceptive agnosia, visual and object agnosia are most commonly discussed; this occurs because apperceptive agnosia is most likely to present visual impairments. However, in addition to visual apperceptive agnosia there are also cases of apperceptive agnosia in other sensory areas.

Somatosensory system – Nerve system for sensing touch, temperature, body position, and pain

In physiology, the somatosensory system is the network of neural structures in the brain and body that produce the perception of touch, as well as temperature (thermoception), body position (proprioception), and pain (nociception). It is a subset of the sensory nervous system, which also represents visual, auditory, olfactory, gustatory and vestibular stimuli.

Extinction is a neurological disorder that impairs the ability to perceive multiple stimuli of the same type simultaneously. Extinction is usually caused by damage resulting in lesions on one side of the brain. Those who are affected by extinction have a lack of awareness in the contralesional side of space and a loss of exploratory search and other actions normally directed toward that side.

A sense is a biological system used by an organism for sensation, the process of gathering information about the surroundings through the detection of stimuli. Although in some cultures five human senses were traditionally identified as such, many more are now recognized. Senses used by non-human organisms are even greater in variety and number. During sensation, sense organs collect various stimuli for transduction, meaning transformation into a form that can be understood by the brain. Sensation and perception are fundamental to nearly every aspect of an organism's cognition, behavior and thought.

Cross modal plasticity – Reorganization of neurons in the brain to integrate the function of two or more sensory systems

Cross modal plasticity is the adaptive reorganization of neurons to integrate the function of two or more sensory systems. Cross modal plasticity is a type of neuroplasticity and often occurs after sensory deprivation due to disease or brain damage. The reorganization of the neural network is greatest following long-term sensory deprivation, such as congenital blindness or pre-lingual deafness. In these instances, cross modal plasticity can strengthen other sensory systems to compensate for the lack of vision or hearing. This strengthening is due to new connections that are formed to brain cortices that no longer receive sensory input.

Sensory processing disorder – Medical condition

Sensory processing disorder is a condition in which multisensory input is not adequately processed in order to provide appropriate responses to the demands of the environment. Sensory processing disorder is present in many people with dyspraxia, autism spectrum disorder and attention deficit hyperactivity disorder. Individuals with SPD may inadequately process visual, auditory, olfactory (smell), gustatory (taste), tactile (touch), vestibular (balance), proprioceptive, and interoceptive sensory stimuli.

Multisensory learning is the assumption that individuals learn better if they are taught using more than one sense (modality). The senses usually employed in multisensory learning are visual, auditory, kinesthetic, and tactile – VAKT. Other senses might include smell, taste and balance.

References

  1. Stein BE, Stanford TR, Rowland BA (December 2009). "The neural basis of multisensory integration in the midbrain: its organization and maturation". Hear. Res. 258 (1–2): 4–15. doi:10.1016/j.heares.2009.03.012. PMC   2787841 . PMID   19345256.
  2. Stein BE, Rowland BA (2011). "Organization and plasticity in multisensory integration". Enhancing Performance for Action and Perception - Multisensory Integration, Neuroplasticity and Neuroprosthetics, Part I. Progress in Brain Research. Vol. 191. pp. 145–63. doi:10.1016/B978-0-444-53752-2.00007-2. ISBN   9780444537522. PMC   3245961 . PMID   21741550.
  3. Macaluso E, Driver J (May 2005). "Multisensory spatial interactions: a window onto functional integration in the human brain". Trends Neurosci. 28 (5): 264–271. doi:10.1016/j.tins.2005.03.008. PMID   15866201. S2CID   5685282.
  4. Todman D. (2008). "Wilder Penfield (1891-1976)". Journal of Neurology. 255 (7): 1104–1105. doi:10.1007/s00415-008-0915-6. PMID   18500490. S2CID   36953396.
  5. Harrison BJ, Pujol J, Lopez-Sola M, Hernandez-Ribas R, Deus J, et al. (2008). "Consistency and functional specialization in the default mode brain network". Proceedings of the National Academy of Sciences of the United States of America. 105 (28): 9781–9786. Bibcode:2008PNAS..105.9781H. doi: 10.1073/pnas.0711791105 . PMC   2474491 . PMID   18621692.
  6. Vanzetta I, Grinvald A (2008). "Coupling between neuronal activity and microcirculation: implications for functional brain imaging". HFSP Journal. 2 (2): 79–98. doi:10.2976/1.2889618. PMC   2645573 . PMID   19404475.
  7. Pirotte B, Voordecker P, Neugroschl C, et al. (June 2008). "Combination of functional magnetic resonance imaging-guided neuronavigation and intraoperative cortical brain mapping improves targeting of motor cortex stimulation in neuropathic pain". Neurosurgery. 62 (6 Suppl 3): 941–56. doi: 10.1227/01.neu.0000333762.38500.ac . PMID   18695580. S2CID   207141116.
  8. Hagmann P, Cammoun L, Gigandet X, Meuli R, Honey CJ, et al. (2008). Friston, Karl J. (ed.). "Mapping the Structural Core of Human Cerebral Cortex". PLOS Biology. 6 (7): 1479–1493. doi: 10.1371/journal.pbio.0060159 . PMC   2443193 . PMID   18597554.
  9. Marrelec G, Bellec P, Krainik A, Duffau H, Pelegrini-Isaac M, et al. (2008). "Multisensory Regions, systems, and the brain: Hierarchical measures of functional integration in fMRI". Medical Image Analysis. 12 (4): 484–496. doi:10.1016/j.media.2008.02.002. PMID   18396441.
  10. Gonzalo, J. (1945, 1950, 1952, 2010, 2023). Dinámica Cerebral, Open Access. Edición facsímil 2010 del Vol. 1 1945, Vol. 2 1950 (Madrid: Inst. S. Ramón y Cajal, CSIC), Suplemento I 1952 (Trab. Inst. Cajal Invest. Biol.) y 1ª ed. Suplemento II. Red Temática en Tecnologías de Computación Artificial/Natural (RTNAC) y Universidad de Santiago de Compostela (USC). ISBN 978-84-9887-458-7. English edition 2023 Brain Dynamics (Vols.1 and 2, Supplements I and II), Ediciones CSIC, Open Access.
  11. Miller LJ, Nielsen DM, Schoen SA, Brett-Green BA (2009). "Perspectives on sensory processing disorder: a call for translational research". Front Integr Neurosci. 3: 22. doi: 10.3389/neuro.07.022.2009 . PMC   2759332 . PMID   19826493.
  12. Blakeslee, Sandra; Blakeslee, Matthew (2007). The Body has a Mind of its Own. Random House. p. 440. ISBN   978-1-4000-6469-4.
  13. Yang F, Kruggel F (2008). "Automatic segmentation of human brain sulci". Medical Image Analysis. 12 (4): 442–451. doi:10.1016/j.media.2008.01.003. PMID   18325826.
  14. Seth AK, Dienes Z, Cleeremans A, Overgaard M, Pessoa L (2008). "Measuring consciousness: relating behavioural and neurophysiological approaches". Trends in Cognitive Sciences. 12 (8): 314–321. doi:10.1016/j.tics.2008.04.008. PMC   2767381 . PMID   18606562.
  15. Lin SY, Burdine RD (2005). "Brain asymmetry: Switching from left to right". Current Biology. 15 (9): R343–R345. doi: 10.1016/j.cub.2005.04.026 . PMID   15886094.
  16. Witten IB, Knudsen EI (2005). "Why Seeing Is Believing: Merging Auditory and Visual Worlds". Neuron. 48 (3): 489–496. doi: 10.1016/j.neuron.2005.10.020 . PMID   16269365.
  17. Burr D; Alais D; S. Martinez-Conde (2006). "Chapter 14 Combining visual and auditory information". Visual Perception - Fundamentals of Awareness: Multi-Sensory Integration and High-Order Perception. Progress in Brain Research. Vol. 155. pp. 243–258. doi:10.1016/S0079-6123(06)55014-9. ISBN   9780444519276. PMID   17027392.
  18. Huddleston WE, Lewis JW, Phinney RE, DeYoe EA (2008). "Mapping Auditory and visual attention-based apparent motion share functional parallels". Perception & Psychophysics. 70 (7): 1207–1216. doi: 10.3758/PP.70.7.1207 . PMID   18927004.
  19. Jaekl PM; Harris, LR (2007). "Auditory-visual temporal integration measured by shifts in perceived temporal location". Neuroscience Letters. 417 (3): 219–224. CiteSeerX   10.1.1.519.7743 . doi:10.1016/j.neulet.2007.02.029. PMID   17428607. S2CID   7420746.
  20. King, AJ (2005). "Multisensory Integration: Strategies for Synchronization". Current Biology. 15 (9): R339–R341. doi: 10.1016/j.cub.2005.04.022 . PMID   15886092.
  21. Bulkin DA, Groh JM (2006). "Seeing sounds: visual and auditory interactions in the brain". Current Opinion in Neurobiology. 16 (4): 415–419. doi:10.1016/j.conb.2006.06.008. PMID   16837186. S2CID   11042371.
  22. Alais D, Burr D (2004). "The ventriloquist effect results from near-optimal bimodal integration". Current Biology. 14 (3): 257–262. doi: 10.1016/j.cub.2004.01.029 . hdl: 2158/202581 . PMID   14761661. S2CID   3125842.
  23. Samuel AD, Sengupta P (2005). "Sensorimotor integration: Locating locomotion in neural circuits". Current Biology. 15 (9): R341–R353. doi: 10.1016/j.cub.2005.04.021 . PMID   15886093.