Multisensory integration

Multisensory integration, also known as multimodal integration, is the study of how information from the different sensory modalities (such as sight, sound, touch, smell, self-motion, and taste) may be integrated by the nervous system. [1] A coherent representation of objects combining modalities enables animals to have meaningful perceptual experiences. Indeed, multisensory integration is central to adaptive behavior because it allows animals to perceive a world of coherent perceptual entities. [2] Multisensory integration also deals with how different sensory modalities interact with one another and alter each other's processing.

General introduction

Multimodal perception is how animals form coherent, valid, and robust perception by processing sensory stimuli from various modalities. Surrounded by multiple objects and receiving multiple sensory stimulations, the brain is faced with the decision of how to categorize the stimuli resulting from different objects or events in the physical world. The nervous system is thus responsible for whether to integrate or segregate certain groups of signals. Multimodal perception has been widely studied in cognitive science, behavioral science, and neuroscience.

Stimuli and sensory modalities

There are four basic attributes of a stimulus: modality, intensity, location, and duration. The neocortex in the mammalian brain has parcellations that primarily process sensory input from one modality, for example the primary visual area (V1) or the primary somatosensory area (S1). These areas mostly deal with low-level stimulus features such as brightness, orientation, and intensity. They have extensive connections to each other, as well as to higher association areas that further process the stimuli and are believed to integrate sensory input from various modalities. However, multisensory effects have been shown to occur in primary sensory areas as well. [3]

Binding problem

The relationship between the binding problem and multisensory perception can be thought of as a question – the binding problem – and its potential solution – multisensory perception. The binding problem stems from unanswered questions about how mammals (particularly higher primates) generate a unified, coherent perception of their surroundings from the cacophony of electromagnetic waves, chemical interactions, and pressure fluctuations that forms the physical basis of the world around us. It was investigated initially in the visual domain (colour, motion, depth, and form), then in the auditory domain, and more recently in multisensory domains. It can therefore be said that the binding problem is central to multisensory perception. [4]

However, considerations of how unified conscious representations are formed are not the full focus of multisensory integration research. It is obviously important for the senses to interact in order to maximize how efficiently people interact with the environment. For perceptual experience and behavior to benefit from the simultaneous stimulation of multiple sensory modalities, integration of the information from these modalities is necessary. Some of the mechanisms mediating this phenomenon and its subsequent effects on cognitive and behavioural processes are examined hereafter. Perception is often defined as one's conscious experience, and thereby combines inputs from all relevant senses and prior knowledge. Perception is also defined and studied in terms of feature extraction, which occurs several hundred milliseconds before conscious experience. Notwithstanding the existence of Gestalt psychology schools that advocate a holistic approach to the operation of the brain, [5] [6] the physiological processes underlying the formation of percepts and conscious experience have been vastly understudied. Nevertheless, burgeoning neuroscience research continues to enrich our understanding of the many details of the brain, including neural structures implicated in multisensory integration such as the superior colliculus (SC) [7] and various cortical structures such as the superior temporal gyrus (STG) and the visual and auditory association areas. Although the structure and function of the SC are well known, the cortex and the relationship between its constituent parts are presently the subject of much investigation. Concurrently, the recent impetus on integration has enabled investigation into perceptual phenomena such as the ventriloquism effect, [8] rapid localization of stimuli, and the McGurk effect, [9] culminating in a more thorough understanding of the human brain and its functions.

History

Studies of sensory processing in humans and other animals have traditionally been performed one sense at a time, [10] and to the present day, numerous academic societies and journals are largely restricted to considering sensory modalities separately ('Vision Research', 'Hearing Research', etc.). However, there is also a long and parallel history of multisensory research; an example is Stratton's (1896) experiments on the somatosensory effects of wearing vision-distorting prism glasses. [11] [12] Multisensory interactions or crossmodal effects, in which the perception of a stimulus is influenced by the presence of another type of stimulus, have been reported since very early on. They were reviewed by Hartmann [13] in a fundamental book in which, among several references to different types of multisensory interactions, reference is made to the work of Urbantschitsch in 1888, [14] who reported an improvement of visual acuity by auditory stimuli in subjects with damaged brains. This effect was later also found in individuals with undamaged brains by Kravkov [15] and Hartmann, [16] along with the fact that visual acuity could be improved by other types of stimuli. [16] Also noteworthy is the body of work in the early 1930s on intersensory relations in the Soviet Union, reviewed by London. [17] A remarkable example of multisensory research is the extensive and pioneering work of Gonzalo [18] in the mid-20th century on the characterization of a multisensory syndrome in patients with parieto-occipital cortical lesions. In this syndrome all the sensory functions are affected, with symmetric bilaterality, in spite of the lesion being unilateral and sparing the primary areas. A feature of this syndrome is the great permeability to crossmodal effects between visual, tactile, and auditory stimuli, as well as to muscular effort, which improve perception and also decrease reaction times. The improvement by crossmodal effect was found to be greater the weaker the primary stimulus to be perceived, and the larger the cortical lesion (Vols. 1 and 2 of reference [18] ). This author interpreted these phenomena under a dynamic physiological concept, and from a model based on functional gradients through the cortex and scaling laws of dynamical systems, thus highlighting the functional unity of the cortex. According to the functional cortical gradients, the specificity of the cortex would be distributed in gradation, and the overlap of different specific gradients would be related to multisensory interactions. [19] Multisensory research has recently gained enormous interest and popularity.

Example of spatial and structural congruence

When we hear a car honk, we determine which car triggered the honk by which car we see is spatially closest to the honk; this is an example of spatial congruence between visual and auditory stimuli. On the other hand, the sound and the pictures of a TV program are integrated because they are structurally congruent combinations of visual and auditory stimuli. However, if the sound and the pictures did not meaningfully fit together, we would segregate the two stimuli. Spatial or structural congruence therefore depends not only on the combination of the stimuli but also on our understanding of them.

Theories and approaches

Visual dominance

Literature on spatial crossmodal biases suggests that the visual modality often influences information from other senses. [20] Some research indicates that vision dominates what we hear when the degree of spatial congruency is varied; this is known as the ventriloquist effect. [21] In cases of visual and haptic integration, children younger than 8 years of age show visual dominance when required to identify object orientation; however, haptic dominance occurs when the factor to identify is object size. [22] [23]

Modality appropriateness

According to Welch and Warren (1980), the modality appropriateness hypothesis states that the influence each modality has on multisensory integration depends on that modality's appropriateness for the given task. Thus, vision has a greater influence on integrated localization than hearing, while hearing and touch have a greater bearing on timing estimates than vision. [24] [25]

More recent studies refine this early qualitative account of multisensory integration. Alais and Burr (2004) found that following progressive degradation in the quality of a visual stimulus, participants' perception of spatial location was determined progressively more by a simultaneous auditory cue. [26] They also progressively varied the temporal uncertainty of the auditory cue, eventually concluding that it is the uncertainty of the individual modalities that determines to what extent information from each modality is considered when forming a percept. [26] This conclusion is similar in some respects to the 'inverse effectiveness rule': the extent to which multisensory integration occurs may vary according to the ambiguity of the relevant stimuli. In support of this notion, a recent study showed that even a weak sense such as olfaction can modulate the perception of visual information, provided the reliability of the visual signals is sufficiently compromised. [27]
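
This reliability-weighted account can be made concrete with a short sketch. The following Python fragment (the function name and the example numbers are illustrative assumptions, not data from the cited studies) combines a visual and an auditory location estimate by inverse-variance weighting, the standard maximum-likelihood formulation of optimal cue combination:

```python
def mle_combine(est_v, var_v, est_a, var_a):
    """Inverse-variance (reliability-weighted) combination of two cues."""
    w_v = (1.0 / var_v) / (1.0 / var_v + 1.0 / var_a)  # visual weight grows with visual reliability
    est = w_v * est_v + (1.0 - w_v) * est_a            # combined location estimate
    var = 1.0 / (1.0 / var_v + 1.0 / var_a)            # combined variance never exceeds either cue's
    return est, var

# A sharp visual cue dominates; degrading it (larger var_v) hands control to audition.
print(mle_combine(est_v=0.0, var_v=1.0, est_a=5.0, var_a=4.0))   # (1.0, 0.8): vision-dominated
print(mle_combine(est_v=0.0, var_v=16.0, est_a=5.0, var_a=4.0))  # (4.0, 3.2): audition-dominated
```

Raising var_v shifts the combined estimate toward the auditory cue, mirroring the transition from visual capture toward auditory dominance that Alais and Burr observed.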

Bayesian integration

The theory of Bayesian integration is based on the fact that the brain must deal with a number of inputs that vary in reliability. [28] In dealing with these inputs, it must construct a coherent representation of the world that corresponds to reality. The Bayesian integration view is that the brain uses a form of Bayesian inference. [29] This view has been backed up by computational modeling of such Bayesian inference from signals to a coherent representation, which shows characteristics similar to integration in the brain. [29]
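
A toy example illustrates the inference itself. The sketch below (all numbers are illustrative assumptions) computes a posterior over candidate source locations on a grid by multiplying a prior, standing in for the statistics of the environment, with the likelihood of one noisy sensory measurement:

```python
import numpy as np

# Minimal grid-based Bayesian estimate: posterior ∝ likelihood × prior.
s = np.linspace(-20.0, 20.0, 4001)                 # candidate source locations (deg)
prior = np.exp(-s**2 / (2 * 10.0**2))              # environment: sources cluster near straight ahead
likelihood = np.exp(-(s - 6.0)**2 / (2 * 3.0**2))  # noisy measurement registered at +6 deg
posterior = prior * likelihood
posterior /= posterior.sum()                       # normalize to a probability distribution
print(s[np.argmax(posterior)])                     # MAP estimate: pulled from +6 deg toward 0
```

The width of the likelihood encodes the input's reliability: the noisier the measurement, the more the resulting percept is drawn toward the prior.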

Cue combination vs. causal inference models

With the assumption of independence between the various sources, the traditional cue combination model is successful at modality integration. However, depending on the discrepancies between modalities, there may be different forms of stimulus fusion: integration, partial integration, and segregation. To fully understand the latter two types, we must use a causal inference model that does not make the independence assumption of the cue combination model. This freedom gives us a general combination of any number of signals and modalities, using Bayes' rule to make causal inferences about sensory signals. [30]
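
A minimal sketch of such a model is given below, using one standard Gaussian formulation of Bayesian causal inference; the function name, parameter values, and distributional assumptions are illustrative, not taken from reference [30]. It computes the posterior probability that a visual and an auditory measurement share a single cause by integrating numerically over candidate source locations:

```python
import numpy as np

def posterior_common_cause(x_v, x_a, sigma_v, sigma_a, sigma_p, prior_common=0.5):
    """Posterior probability that two noisy measurements share one cause."""
    s = np.linspace(-60.0, 60.0, 4001)  # candidate source locations
    ds = s[1] - s[0]

    def gauss(x, mu, sd):
        return np.exp(-(x - mu) ** 2 / (2 * sd**2)) / (sd * np.sqrt(2 * np.pi))

    prior_s = gauss(s, 0.0, sigma_p)
    # C = 1: a single source s generates both measurements.
    like_c1 = np.sum(gauss(x_v, s, sigma_v) * gauss(x_a, s, sigma_a) * prior_s) * ds
    # C = 2: two independent sources generate the measurements separately.
    like_c2 = (np.sum(gauss(x_v, s, sigma_v) * prior_s) * ds
               * np.sum(gauss(x_a, s, sigma_a) * prior_s) * ds)
    p1 = like_c1 * prior_common
    p2 = like_c2 * (1.0 - prior_common)
    return p1 / (p1 + p2)

# Nearby cues favour fusion; widely separated cues favour segregation.
print(posterior_common_cause(0.0, 3.0, sigma_v=2.0, sigma_a=5.0, sigma_p=15.0))
print(posterior_common_cause(0.0, 30.0, sigma_v=2.0, sigma_a=5.0, sigma_p=15.0))
```

Weighting the fully integrated and fully segregated estimates by this posterior yields the intermediate "partial integration" regime described above.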

The hierarchical vs. non-hierarchical models

The difference between the two models is that the hierarchical model can explicitly make causal inferences to predict a certain stimulus, while the non-hierarchical model can only predict the joint probability of stimuli. However, the hierarchical model is actually a special case of the non-hierarchical model, obtained by setting the joint prior to a weighted average of the priors for common and independent causes, each weighted by its prior probability. Based on this correspondence, we can also say that the hierarchical model is a mixture-prior special case of the non-hierarchical model.
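
In symbols (our notation, not drawn from a specific source), write s_V and s_A for the putative visual and auditory sources and C for the causal structure. The hierarchical model's joint prior is then exactly such a mixture:

$$p(s_V, s_A) \;=\; p(C{=}1)\,\delta(s_V - s_A)\,p(s_V) \;+\; p(C{=}2)\,p(s_V)\,p(s_A),$$

where the Dirac delta enforces a single shared source under C = 1 and the second term lets the sources vary independently; marginalizing over C recovers the non-hierarchical joint prior.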

Independence of likelihoods and priors

In a Bayesian model, the prior and the likelihood generally represent the statistics of the environment and the sensory representations, respectively. The independence of priors and likelihoods is not assured, since the prior may vary with the likelihood through the representations alone. However, this independence has been demonstrated by Shams through a series of parameter-controlled multisensory perception experiments. [31]

Shared intentionality

The shared intentionality approach proposes a holistic explanation of the neurophysiological processes underlying the formation of percepts and conscious experience. The psychological construct of shared intentionality was introduced at the end of the 20th century. [32] [33] [34] Michael Tomasello developed it to explain cognition beginning at the earliest developmental stage, through unaware collaboration in mother-child dyads. [35] [36] Over the past twenty years, knowledge of this notion has evolved through observing shared intentionality from different perspectives, e.g., psychophysiology [37] [38] and neurobiology. [39] According to Igor Val Danilov, shared intentionality enables the mother-child pair to share the essential sensory stimulus of the actual cognitive problem. [40] The hypothesis of neurophysiological processes occurring during shared intentionality explains its integrative complexity from the neuronal to the interpersonal-dynamics level. [41] This collaborative interaction provides environmental learning for the immature organism, starting at the reflex stage of development, for processing the organization, identification, and interpretation of sensory information in developing perception. [42] From this perspective, shared intentionality contributes to the formation of percepts and conscious experiences, solving the binding problem, or at least complementing one of the mechanisms noted above. [43]

Principles

The contributions of Barry Stein, Alex Meredith, and their colleagues (e.g. "The Merging of the Senses", 1993 [44] ) are widely considered to be the groundbreaking work in the modern field of multisensory integration. Through detailed long-term study of the neurophysiology of the superior colliculus, they distilled three general principles by which multisensory integration may best be described: the spatial rule, the temporal rule, and the principle of inverse effectiveness.

Perceptual and behavioral consequences

A unimodal approach dominated the scientific literature until the beginning of this century. Although this enabled rapid progress in neural mapping and an improved understanding of neural structures, the investigation of perception remained relatively stagnant, with a few exceptions. The recent revitalized enthusiasm for perceptual research is indicative of a substantial shift away from reductionism and toward gestalt methodologies. Gestalt theory, dominant in the late 19th and early 20th centuries, espoused two general principles: the 'principle of totality', in which conscious experience must be considered globally, and the 'principle of psychophysical isomorphism', which states that perceptual phenomena are correlated with cerebral activity. These ideas were already applied by Justo Gonzalo in his work on brain dynamics, where a sensory-cerebral correspondence is considered in the formulation of the "development of the sensory field due to a psychophysical isomorphism" (p. 23 of the English translation of ref. [19] ). Both the 'principle of totality' and 'psychophysical isomorphism' are particularly relevant in the current climate and have driven researchers to investigate the behavioural benefits of multisensory integration.

Decreasing sensory uncertainty

It has been widely acknowledged that uncertainty in sensory domains results in an increased dependence on multisensory integration. [26] Hence, it follows that cues from multiple modalities that are both temporally and spatially synchronous are viewed neurally and perceptually as emanating from the same source. The degree of synchrony that is required for this 'binding' to occur is currently being investigated by a variety of approaches. [50] Integration occurs only up to the point at which the subject can differentiate the cues as two distinct stimuli. Concurrently, a significant intermediate conclusion can be drawn from the research thus far: multisensory stimuli that are bound into a single percept are also bound onto the same receptive fields of multisensory neurons in the SC and cortex. [26]

Decreasing reaction time

Responses to multiple simultaneous sensory stimuli can be faster than responses to the same stimuli presented in isolation. Hershenson (1962) presented a light and a tone simultaneously and separately, and asked human participants to respond as rapidly as possible to them. As the asynchrony between the onsets of the two stimuli was varied, it was observed that for certain degrees of asynchrony, reaction times were decreased. [51] These levels of asynchrony were quite small, perhaps reflecting the temporal window that exists in multisensory neurons of the SC. Further studies have analysed the reaction times of saccadic eye movements [52] and, more recently, correlated these findings with neural phenomena. [53] In patients studied by Gonzalo, [18] with lesions in the parieto-occipital cortex, the decrease in the reaction time to a given stimulus by means of intersensory facilitation was shown to be very remarkable.

Redundant target effects

The redundant target effect is the observation that people typically respond faster to double targets (two targets presented simultaneously) than to either of the targets presented alone. This difference in latency is termed the redundancy gain (RG). [54]

In a study done by Forster, Cavina-Pratesi, Aglioti, and Berlucchi (2001), normal observers responded faster to simultaneous visual and tactile stimuli than to single visual or tactile stimuli. Reaction time (RT) to simultaneous visual and tactile stimuli was also faster than RT to simultaneous double visual or double tactile stimuli. The advantage of RT to combined visual-tactile stimuli over RT to the other types of stimulation could be accounted for by intersensory neural facilitation rather than by probability summation. These effects can be ascribed to the convergence of tactile and visual inputs onto neural centers which contain flexible multisensory representations of body parts. [55]
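
The contrast between probability summation and genuine neural facilitation is conventionally tested with the race-model inequality: if the bimodal response were merely a race between independent unimodal channels, the bimodal RT distribution could never exceed the sum of the two unimodal distributions at any time point. The sketch below checks that bound on simulated, purely illustrative RT samples (the distributions and their parameters are assumptions, not data from the cited study):

```python
import numpy as np

def race_model_violations(rt_v, rt_t, rt_vt, t_grid):
    """Return the time points at which the race-model inequality is violated.

    The inequality states P(RT_VT <= t) <= P(RT_V <= t) + P(RT_T <= t);
    violations rule out probability summation and point to intersensory
    neural facilitation.
    """
    def cdf(rts, t):
        return np.mean(rts[:, None] <= t[None, :], axis=0)

    bound = np.minimum(cdf(rt_v, t_grid) + cdf(rt_t, t_grid), 1.0)
    violation = cdf(rt_vt, t_grid) - bound   # positive where the race model fails
    return t_grid[violation > 0]

rng = np.random.default_rng(0)
rt_v = rng.normal(350, 40, 10_000)    # simulated visual-only RTs (ms)
rt_t = rng.normal(360, 40, 10_000)    # simulated tactile-only RTs (ms)
rt_vt = rng.normal(300, 35, 10_000)   # simulated bimodal RTs, faster than a race allows
print(race_model_violations(rt_v, rt_t, rt_vt, np.linspace(150, 600, 200))[:5])
```

Time points at which the bimodal distribution beats the bound are exactly the violations that exclude probability summation, which is the logic behind the facilitation account of Forster et al.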

Multisensory illusions

McGurk effect

It has been found that two converging bimodal stimuli can produce a perception that is not only different in magnitude from the sum of its parts, but also quite different in quality. In a classic study labeled the McGurk effect, [56] a person's phoneme production was dubbed onto a video of that person speaking a different phoneme. [57] The result was the perception of a third, different phoneme. McGurk and MacDonald (1976) explained that phonemes such as ba, da, ka, ta, ga and pa can be divided into four groups: those that can be visually confused, i.e. (da, ga, ka, ta) and (ba, pa), and those that can be audibly confused. Hence, when a 'ba' voice and 'ga' lips are processed together, the visual modality sees ga or da, and the auditory modality hears ba or da, combining to form the percept da. [56]

Ventriloquism

Ventriloquism has been used as evidence for the modality appropriateness hypothesis. The ventriloquism effect is the situation in which auditory location perception is shifted toward a visual cue. The original study describing this phenomenon was conducted by Howard and Templeton (1966), after which several studies replicated and built upon their conclusions. [58] In conditions in which the visual cue is unambiguous, visual capture reliably occurs. Thus, to test the influence of sound on perceived location, the visual stimulus must be progressively degraded. [26] Furthermore, given that auditory stimuli are more attuned to temporal changes, recent studies have tested the ability of temporal characteristics to influence the spatial location of visual stimuli. Some types of electronic voice phenomenon (EVP), mainly those using sound bubbles, are considered a kind of modern ventriloquism technique, performed with sophisticated software, computers, and sound equipment.

Double-flash illusion

The double-flash illusion was the first reported illusion to show that visual stimuli can be qualitatively altered by audio stimuli. [59] In the standard paradigm, participants are presented with combinations of one to four flashes accompanied by zero to four beeps, and are then asked to report how many flashes they perceived. Participants perceived illusory flashes when there were more beeps than flashes. fMRI studies have shown that there is crossmodal activation in early, low-level visual areas which is qualitatively similar to the perception of a real flash, suggesting that the illusion reflects the subjective perception of the extra flash. [60] Further studies suggest that the timing of multisensory activation in unisensory cortices is too fast to be mediated by higher-order integration, pointing instead to feed-forward or lateral connections. [61] One study revealed the same effect from vision to audition, as well as fission rather than fusion effects, although the level of the auditory stimulus was reduced to make it less salient for those illusions affecting audition. [62]

Rubber hand illusion

Schematic diagram of the experimental set-up in the rubber hand illusion task.

In the rubber hand illusion (RHI), [63] human participants view a dummy hand being stroked with a paintbrush, while they feel a series of identical brushstrokes applied to their own hand, which is hidden from view. If this visual and tactile information is applied synchronously, and if the visual appearance and position of the dummy hand is similar to one's own hand, then people may feel that the touches on their own hand are coming from the dummy hand, and even that the dummy hand is, in some way, their own hand. [63] This is an early form of body transfer illusion.

The RHI is an illusion of vision, touch, and posture (proprioception), but a similar illusion can also be induced with touch and proprioception. [64] It has also been found that the illusion may not require tactile stimulation at all, but can be completely induced using mere vision of the rubber hand being in a congruent posture with the hidden real hand. [65]

The first report of this kind of illusion may have been as early as 1937 (Tastevin, 1937). [66] [67] [68]

Body transfer illusion

Body transfer illusion typically involves the use of virtual reality devices to induce the illusion in the subject that the body of another person or being is the subject's own body.

Neural mechanisms

Subcortical areas

Superior colliculus


The superior colliculus (SC), or optic tectum (OT), is part of the tectum, located in the midbrain, superior to the brainstem and inferior to the thalamus. It contains seven layers of alternating white and grey matter, of which the superficial layers contain topographic maps of the visual field, and the deeper layers contain overlapping spatial maps of the visual, auditory and somatosensory modalities. [69] The structure receives afferents directly from the retina, as well as from various regions of the cortex (primarily the occipital lobe), the spinal cord and the inferior colliculus. It sends efferents to the spinal cord, cerebellum, thalamus and occipital lobe via the lateral geniculate nucleus (LGN). The structure contains a high proportion of multisensory neurons and plays a role in the motor control of orientation behaviours of the eyes, ears and head. [53]

Receptive fields from the somatosensory, visual and auditory modalities converge in the deeper layers to form a two-dimensional multisensory map of the external world. Here, objects straight ahead are represented rostrally and objects in the periphery are represented caudally. Similarly, locations in superior sensory space are represented medially, and inferior locations are represented laterally. [44]

However, in contrast to simple convergence, the SC integrates information to create an output that differs from the sum of its inputs. Following a phenomenon labelled the 'spatial rule', neurons are excited if stimuli from multiple modalities fall on the same or adjacent receptive fields, but are inhibited if the stimuli fall on disparate fields. [70] Excited neurons may then proceed to innervate various muscles and neural structures to orient an individual's behaviour and attention toward the stimulus. Neurons in the SC also adhere to the 'temporal rule', in which stimulation must occur within close temporal proximity to excite neurons. However, due to the varying processing times between modalities and the relatively slower speed of sound compared to light, it has been found that neurons may be optimally excited when stimulated some time apart. [71]
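
The strength of this integration is conventionally quantified as the percentage gain of the multisensory response over the best unisensory response. A minimal sketch follows (the impulse counts are made up for illustration):

```python
def enhancement_index(multisensory, best_unisensory):
    """Percent multisensory enhancement relative to the best unisensory response.

    Positive values indicate response enhancement; negative values indicate
    the response depression produced, for example, by spatially disparate
    stimuli under the spatial rule.
    """
    return 100.0 * (multisensory - best_unisensory) / best_unisensory

# Illustrative impulse counts for a single SC neuron:
print(enhancement_index(18, 6))  # +200.0: enhancement beyond the best single input
print(enhancement_index(3, 6))   # -50.0: depression from spatially disparate stimuli
```

Consistent with the 'inverse effectiveness rule' mentioned earlier, this index tends to be largest when the unisensory responses themselves are weakest.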

Putamen

Single neurons in the macaque putamen have been shown to have visual and somatosensory responses closely related to those in the polysensory zone of the premotor cortex and area 7b in the parietal lobe. [72] [73]

Cortical areas

Multisensory neurons exist in a large number of locations, often integrated with unimodal neurons. They have recently been discovered in areas previously thought to be modality-specific, such as the somatosensory cortex, as well as in clusters at the borders between the major cerebral lobes, such as the occipito-parietal space and the occipito-temporal space. [74] [53] [75]

However, in order for such physiological changes to occur, there must exist continuous connectivity between these multisensory structures. It is generally agreed that information flow within the cortex follows a hierarchical configuration. [76] Hubel and Wiesel showed that receptive fields, and thus the function of cortical structures, become increasingly complex and specialized as one proceeds out from V1 along the visual pathways. [76] From this it was postulated that information flows outwards in a feed-forward fashion, the complex end products eventually binding to form a percept. However, via fMRI and intracranial recording technologies, it has been observed that the activation times of successive levels of the hierarchy do not correlate with a feed-forward structure. That is, late activation has been observed in the striate cortex, markedly after activation of the prefrontal cortex in response to the same stimulus. [77]

Complementing this, afferent nerve fibres have been found that project to early visual areas such as the lingual gyrus from late stages of the dorsal (action) and ventral (perception) visual streams, as well as from the auditory association cortex. [78] Feedback projections have also been observed in the opossum directly from the auditory association cortex to V1. [76] This last observation highlights a point of controversy within the neuroscientific community. Sadato et al. (2004) concluded, in line with Bernstein et al. (2002), that the primary auditory cortex (A1) was functionally distinct from the auditory association cortex, in that it was devoid of any interaction with the visual modality. They hence concluded that A1 would not be affected at all by cross-modal plasticity. [79] [80] This concurs with Jones and Powell's (1970) contention that primary sensory areas are connected only to other areas of the same modality. [81]

In contrast, the dorsal auditory pathway, projecting from the temporal lobe, is largely concerned with processing spatial information, and contains receptive fields that are topographically organized. Fibers from this region project directly to neurons governing corresponding receptive fields in V1. [76] The perceptual consequences of this have not yet been empirically established. However, it can be hypothesized that these projections may be the precursors of increased acuity and emphasis of visual stimuli in relevant areas of perceptual space. Consequently, this finding contradicts Jones and Powell's (1970) hypothesis [81] and is thus in conflict with Sadato et al.'s (2004) findings. [79] A possible resolution to this discrepancy is that primary sensory areas cannot be classified as a single group, and thus may be far more varied than previously thought.

The multisensory syndrome with symmetric bilaterality, characterized by Gonzalo and called by this author the 'central syndrome of the cortex', [18] [19] originated from a unilateral parieto-occipital cortical lesion equidistant from the visual, tactile, and auditory projection areas (the middle of area 19, the anterior part of area 18 and the most posterior part of area 39, in Brodmann terminology), a region he called the 'central zone'. The gradation observed between syndromes led this author to propose a functional gradient scheme in which the specificity of the cortex is distributed with continuous variation; [19] the overlap of the specific gradients would be high or maximal in that 'central zone'.

Frontal lobe

Area F4 in macaques

Area F5 in macaques [82] [83]

Polysensory zone of premotor cortex (PZ) in macaques [84]

Occipital lobe

Primary visual cortex (V1) [85]

Lingual gyrus in humans

Lateral occipital complex (LOC), including lateral occipital tactile visual area (LOtv) [86]

Parietal lobe

Ventral intraparietal sulcus (VIP) in macaques [82]

Lateral intraparietal sulcus (LIP) in macaques [82]

Area 7b in macaques [87]

Second somatosensory cortex (SII) [88]

Temporal lobe

Primary auditory cortex (A1)

Superior temporal cortex (STG/STS/PT)

Audiovisual crossmodal interactions are known to occur in the auditory association cortex, which lies directly inferior to the Sylvian fissure in the temporal lobe. [79] Plasticity was observed in the superior temporal gyrus (STG) by Petitto et al. (2000). [89] Here, it was found that the STG was more active during stimulation in native deaf signers compared to hearing non-signers. Concurrently, further research has revealed differences in the activation of the planum temporale (PT) in response to non-linguistic lip movements between the hearing and the deaf, as well as progressively increasing activation of the auditory association cortex as previously deaf participants gain hearing experience via a cochlear implant. [79]

Anterior ectosylvian sulcus (AES) in cats [90] [91] [92]

Rostral lateral suprasylvian sulcus (rLS) in cats [91]

Cortical-subcortical interactions

The most significant interaction between these two systems (corticotectal interactions) is the connection between the anterior ectosylvian sulcus (AES), which lies at the junction of the parietal, temporal and frontal lobes, and the SC. The AES is divided into three unimodal regions with multisensory neurons at the junctions between these sections. [93] Neurons from the unimodal regions project to the deep layers of the SC and influence the multiplicative integration effect. That is, although the SC can receive inputs from all modalities as normal, it cannot enhance or depress the effect of multisensory stimulation without input from the AES. [93]

Concurrently, the multisensory neurons of the AES, although integrally connected to unimodal AES neurons, are not directly connected to the SC. This pattern of division is reflected in other areas of the cortex, resulting in the observation that cortical and tectal multisensory systems are somewhat dissociated. [94] Stein, London, Wilkinson and Price (1996) analysed the perceived luminance of an LED in the context of spatially disparate auditory distractors of various types. A significant finding was that a sound increased the perceived brightness of the light, regardless of their relative spatial locations, provided the light's image was projected onto the fovea. [95] Here, the apparent absence of the spatial rule further differentiates cortical and tectal multisensory neurons. Little empirical evidence exists to justify this dichotomy. Nevertheless, cortical neurons governing perception, alongside a separate subcortical system governing action (orientation behavior), is consonant with the perception-action hypothesis of the visual streams. [96] Further investigation into this field is necessary before any substantial claims can be made.

Dual "what" and "where" multisensory routes

Research suggests the existence of two multisensory routes, one for "what" and one for "where". The "what" route, which identifies the identity of things, involves Brodmann area 9 in the right inferior frontal gyrus and right middle frontal gyrus, Brodmann areas 13 and 45 in the right insula-inferior frontal gyrus area, and Brodmann area 13 bilaterally in the insula. The "where" route, which detects spatial attributes, involves Brodmann area 40 in the right and left inferior parietal lobules, Brodmann area 7 in the right precuneus-superior parietal lobule, and Brodmann area 7 in the left superior parietal lobule. [97]

Development of multisensory operations

Theories of development

All species equipped with multiple sensory systems utilize them in an integrative manner to achieve action and perception. [44] However, in most species, especially higher mammals and humans, the ability to integrate develops in parallel with physical and cognitive maturity. Children up to certain ages do not show mature integration patterns. [98] [99] Classically, two opposing views, which are principally modern manifestations of the nativist/empiricist dichotomy, have been put forth. The integration (empiricist) view states that at birth the sensory modalities are not connected at all; hence, it is only through active exploration that plastic changes can occur in the nervous system to initiate holistic perceptions and actions. Conversely, the differentiation (nativist) perspective asserts that the young nervous system is highly interconnected, and that during development modalities are gradually differentiated as relevant connections are rehearsed and irrelevant ones are discarded. [100]

Using the SC as a model, the nature of this dichotomy can be analysed. In the newborn cat, deep layers of the SC contain only neurons responding to the somatosensory modality. Within a week, auditory neurons begin to appear, but it is not until two weeks after birth that the first multisensory neurons arise. Further changes continue, with the arrival of visual neurons after three weeks, until the SC has achieved its fully mature structure after three to four months. Concurrently, in monkey species, newborns are endowed with a significant complement of multisensory cells; however, as in cats, no integration effect is apparent until much later. [53] This delay is thought to be the result of the relatively slower development of cortical structures including the AES, which, as stated above, is essential for the existence of the integration effect. [93]

Furthermore, Wallace (2004) found that cats raised in a light-deprived environment had severely underdeveloped visual receptive fields in the deep layers of the SC. [53] Although receptive field size has been shown to decrease with maturity, this finding suggests that integration in the SC is a function of experience. Nevertheless, the existence of visual multisensory neurons despite a complete lack of visual experience highlights the apparent relevance of nativist viewpoints. Multisensory development in the cortex has been studied to a lesser extent; however, a similar study to that presented above was performed on cats whose optic nerves had been severed. These cats displayed a marked improvement in their ability to localize stimuli through audition, and consequently also showed increased neural connectivity between V1 and the auditory cortex. [76] Such plasticity in early childhood allows for greater adaptability, and thus more normal development in other areas for those with a sensory deficit.

In contrast, following the initial formative period, the SC does not appear to display any neural plasticity. Despite this, long-term habituation and sensitisation are known to exist in orientation behaviors. This apparent plasticity in function has been attributed to the adaptability of the AES. That is, although neurons in the SC have a fixed magnitude of output per unit input and essentially operate in an all-or-nothing fashion, the level of neural firing can be more finely tuned by variations in input from the AES.

Although there is evidence for either perspective of the integration/differentiation dichotomy, a significant body of evidence also exists for a combination of factors from either view. Thus, analogous to the broader nativist/empiricist argument, it is apparent that rather than a dichotomy, there exists a continuum, such that the integration and differentiation hypotheses are extremes at either end.

Psychophysical development of integration

Not much is known about the development of the ability to integrate multiple estimates such as vision and touch. [98] Some multisensory abilities are present from early infancy, but it is not until children are eight years or older that they use multiple modalities to reduce sensory uncertainty. [98]

One study demonstrated that cross-modal visual and auditory integration is present within the first year of life. [101] This study measured response times for orienting towards a source. Infants who were 8–10 months old showed significantly decreased response times when the source was presented through both visual and auditory information compared to a single modality. Younger infants, however, showed no such change in response times between these conditions. Indeed, the results of the study indicate that children potentially have the capacity to integrate sensory sources at any age; however, in certain cases, for example with visual cues, intermodal integration is avoided. [98]

Another study found that cross-modal integration of touch and vision for distinguishing size and orientation is available from at least 8 years of age. [99] For pre-integration age groups, one sense dominates depending on the characteristic discerned (see visual dominance). [99]

A study investigating sensory integration within a single modality (vision) found that it cannot be established until age 12 and above. [98] This particular study assessed the integration of disparity and texture cues to resolve surface slant. Although younger age groups showed somewhat better performance when combining disparity and texture cues compared to using only disparity or texture cues, this difference was not statistically significant. [98] In adults, sensory integration can be mandatory, meaning that they no longer have access to the individual sensory sources. [102]

Acknowledging these variations, many hypotheses have been proposed to explain why these observations are task-dependent. Given that different senses develop at different rates, it has been proposed that cross-modal integration does not appear until both modalities have reached maturity. [99] [103] The human body undergoes significant physical transformation throughout childhood: not only is there growth in size and stature (affecting viewing height), but there is also change in inter-ocular distance and eyeball length. Therefore, sensory signals need to be constantly re-evaluated to account for these various physiological changes. [99] Some support comes from animal studies that explore the neurobiology behind integration. Adult monkeys have deep inter-neuronal connections within the superior colliculus, providing strong, accelerated visuo-auditory integration. [104] Young animals, conversely, do not have this enhancement until unimodal properties are fully developed. [105] [106]

Additionally, to rationalize sensory dominance, Gori et al. (2008) advocate that the brain utilises the most direct source of information during sensory immaturity. [99] In this case, orientation is primarily a visual characteristic: it can be derived directly from the object image that forms on the retina, irrespective of other visual factors. In fact, a functional property of neurons within primate visual cortices is their selectivity for orientation. [107] In contrast, haptic orientation judgements are recovered through integrated patterns of stimulation, evidently an indirect source susceptible to interference. Likewise, where size is concerned, haptic information coming from the positions of the fingers is more immediate, whereas visual size perceptions have to be computed using parameters such as slant and distance. Considering this, sensory dominance is a useful instinct to assist with calibration: during sensory immaturity, the simpler and more robust information source can be used to tweak the accuracy of the alternative source. [99] Follow-up work by Gori et al. (2012) showed that, at all ages, visual size perceptions are near perfect when viewing objects within the haptic workspace (i.e. at arm's reach). [108] However, systematic errors in perception appeared when the object was positioned beyond this zone. [109] Children younger than 14 years tend to underestimate object size, whereas adults overestimate it. However, if the object was returned to the haptic workspace, these visual biases disappeared. [108] These results support the hypothesis that haptic information may educate visual perceptions. If sources are used for cross-calibration, they cannot, therefore, be combined (integrated). Maintaining access to individual estimates trades accuracy for extra plasticity, which could be beneficial for the developing body. [99] [103]

Alternatively, Ernst (2008) advocates that efficient integration initially relies upon establishing correspondence – determining which sensory signals belong together. [103] Indeed, studies have shown that visuo-haptic integration fails in adults when there is a perceived spatial separation, suggesting that the sensory information is coming from different targets. [110] Furthermore, if the separation can be explained, for example when viewing an object through a mirror, integration is re-established and can even be optimal. [111] [112] Ernst (2008) suggests that adults can draw on previous experience to quickly determine which sensory sources depict the same target, but that young children may be deficient in this area. [103] Once there is a sufficient bank of experience, confidence in correctly integrating sensory signals can then be introduced into their behaviour.

Lastly, Nardini et al. (2010) recently hypothesised that young children have optimized their sensory appreciation for speed over accuracy. [98] When information is presented in two forms, children may derive an estimate from the fastest available source, subsequently ignoring the alternative even if it contains redundant information. Nardini et al. (2010) provide evidence that children's (aged 6 years) response latencies are significantly lower when stimuli are presented in multi-cue rather than single-cue conditions. [98] Conversely, adults showed no change between these conditions. Indeed, adults display mandatory fusion of signals, and therefore they can only ever aim for maximum accuracy. [98] [102] However, the overall mean latencies of children were not faster than those of adults, which suggests that speed optimization merely enables them to keep up with the mature pace. Considering the haste of real-world events, this strategy may prove necessary to counteract children's generally slower processing and maintain effective vision-action coupling. [113] [114] [115] Ultimately, the developing sensory system may preferentially adapt to different goals – speed and the detection of sensory conflicts – those typical of objective learning.

The late development of efficient integration has also been investigated from a computational point of view. [116] Daee et al. (2014) showed that having one dominant sensory source at an early age, rather than integrating all sources, facilitates the overall development of cross-modal integration.

Applications

Prosthesis

Prosthetics designers should carefully consider the nature of the dimensionality changes in sensorimotor signaling to and from the CNS when designing prosthetic devices. As reported in the literature, neural signaling from the CNS to the muscles is organized such that the dimensionality of the signals gradually increases as they approach the muscles; this organization is referred to as muscle synergies. By the same principle, but in the opposite order, signals from the sensory receptors are gradually integrated (so-called sensory synergies) as they approach the CNS. This bow-tie-like signaling formation enables the CNS to process only abstract yet valuable information. Such a process decreases the complexity of the data, handles the noise, and keeps the CNS's energy consumption optimal. Current commercially available prosthetic devices mainly focus on implementing the motor side, simply using EMG sensors to switch between different activation states of the prosthesis; very few works have proposed systems that also integrate the sensory side. The integration of the tactile sense and proprioception is regarded as essential for implementing the ability to perceive environmental input. [117]
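
A minimal sketch of the sensory-synergy side of this bow-tie is given below, with entirely synthetic data; the channel count, number of latent synergies, and noise level are assumptions for illustration. It shows how dozens of receptor channels can be compressed into a few latent components before further processing:

```python
import numpy as np

rng = np.random.default_rng(1)
latent = rng.random((3, 500))        # 3 underlying "synergy" signals over 500 time steps
mixing = rng.random((64, 3))         # 64 simulated tactile/proprioceptive channels
sensors = mixing @ latent + 0.01 * rng.standard_normal((64, 500))  # noisy recordings

# PCA via SVD: project the 64 noisy channels onto a compact synergy basis.
centered = sensors - sensors.mean(axis=1, keepdims=True)
u, s, vt = np.linalg.svd(centered, full_matrices=False)
explained = s**2 / np.sum(s**2)
print(np.round(explained[:5], 3))    # nearly all variance lies in the first 3 components
synergies = u[:, :3].T @ centered    # the low-dimensional signal a controller would use
```

Compressing the sensor array in this way mirrors the gradual integration of receptor signals described above: the downstream controller sees a low-dimensional, denoised summary rather than every raw channel.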

Visual rehabilitation

Multisensory integration has also been shown to ameliorate visual hemianopia. Through the repeated presentation of multisensory stimuli in the blind hemifield, the ability to respond to purely visual stimuli gradually returns to that hemifield in a central to peripheral manner. These benefits persist even after the explicit multisensory training ceases. [118]


References

  1. Stein, BE.; Stanford, TR.; Rowland, BA. (Dec 2009). "The neural basis of multisensory integration in the midbrain: its organization and maturation". Hear Res. 258 (1–2): 4–15. doi:10.1016/j.heares.2009.03.012. PMC   2787841 . PMID   19345256.
  2. Lewkowicz DJ, Ghazanfar AA (November 2009). "The emergence of multisensory systems through perceptual narrowing" (PDF). Trends Cogn. Sci. (Regul. Ed.). 13 (11): 470–8. CiteSeerX   10.1.1.554.4323 . doi:10.1016/j.tics.2009.08.004. PMID   19748305. S2CID   14289579.
  3. Lemus L, Hernández A, Luna R, Zainos A, Romo R (July 2010). "Do sensory cortices process more than one sensory modality during perceptual judgements?". Neuron. 67 (2): 335–48. doi: 10.1016/j.neuron.2010.06.015 . PMID   20670839. S2CID   16043442.
  4. Zmigrod, S.; Hommel, B. (Jan 2010). "Temporal dynamics of unimodal and multimodal feature binding" (PDF). Atten Percept Psychophys. 72 (1): 142–52. doi: 10.3758/APP.72.1.142 . PMID   20045885. S2CID   7055915.
  5. Wagemans, J.; Elder, JH.; Kubovy, M.; Palmer, SE.; Peterson, MA.; Singh, M.; von der Heydt, R. (Nov 2012). "A century of Gestalt psychology in visual perception: I. Perceptual grouping and figure-ground organization". Psychol Bull. 138 (6): 1172–217. CiteSeerX   10.1.1.452.8394 . doi:10.1037/a0029333. PMC   3482144 . PMID   22845751.
  6. Wagemans, J.; Feldman, J.; Gepshtein, S.; Kimchi, R.; Pomerantz, JR.; van der Helm, PA.; van Leeuwen, C. (Nov 2012). "A century of Gestalt psychology in visual perception: II. Conceptual and theoretical foundations". Psychol Bull. 138 (6): 1218–52. doi:10.1037/a0029334. PMC   3728284 . PMID   22845750.
  7. Stein, BE.; Rowland, BA. (2011). "Organization and plasticity in multisensory integration". Enhancing Performance for Action and Perception - Multisensory Integration, Neuroplasticity and Neuroprosthetics, Part I. Progress in Brain Research. Vol. 191. pp. 145–63. doi:10.1016/B978-0-444-53752-2.00007-2. ISBN 9780444537522. PMC 3245961. PMID 21741550.
  8. Recanzone, GH. (Dec 2009). "Interactions of auditory and visual stimuli in space and time". Hear Res. 258 (1–2): 89–99. doi:10.1016/j.heares.2009.04.009. PMC   2787663 . PMID   19393306.
  9. Smith, E.; Duede, S.; Hanrahan, S.; Davis, T.; House, P.; Greger, B. (2013). "Seeing is believing: neural representations of visual stimuli in human auditory cortex correlate with illusory auditory perceptions". PLOS ONE. 8 (9): e73148. Bibcode:2013PLoSO...873148S. doi: 10.1371/journal.pone.0073148 . PMC   3762867 . PMID   24023823.
  10. Fodor, Jerry A. (1983). Modularity of mind: an essay on faculty psychology. Cambridge, Mass: MIT Press. ISBN   978-0-262-06084-4. OCLC   551957787.
  11. Stratton, George M. (1896). "Some preliminary experiments on vision without inversion of the retinal image". Psychological Review. 3 (6): 611–617. doi:10.1037/h0072918.
  12. Stratton, George M. (1897). "Vision without inversion of the retinal image". Psychological Review. 4 (4): 341–360, 463–481. doi:10.1037/h0075482.
  13. Hartmann, G.M. (1935). Gestalt Psychology. New York: The Ronald Press.
  14. Urbantschitsch, V. (1888). "Über den Einfluss einer Sinneserregung auf die übrigen Sinnesempfindungen" [On the influence of one sensory excitation on the other sensory sensations]. Pflügers Archiv (in German). 42: 154–182. doi:10.1007/bf01669354. S2CID 42136599.
  15. Kravkov, S.V. (1930). "Über die Abhängigkeit der Sehschärfe vom Schallreiz" [On the dependence of visual acuity on auditory stimulation]. Arch. Ophthalmol. (in German). 124 (2): 334–338. doi:10.1007/bf01853661. S2CID 30040170.
  16. Hartmann, G.W. (1933). "Changes in Visual Acuity through Simultaneous Stimulation of Other Sense Organs". J. Exp. Psychol. 16 (3): 393–407. doi:10.1037/h0074549.
  17. London, I.D. (1954). "Research of sensory interaction in the Soviet Union". Psychol. Bull. 51 (6): 531–568. doi:10.1037/h0056730. PMID 13215683.
  18. Gonzalo, J. (1945, 1950, 1952, 2010, 2023). Dinámica Cerebral, open access. 2010 facsimile edition of Vol. 1 (1945) and Vol. 2 (1950) (Madrid: Inst. S. Ramón y Cajal, CSIC), Supplement I (1952) (Trab. Inst. Cajal Invest. Biol.), and 1st ed. of Supplement II (2010). Red Temática en Tecnologías de Computación Artificial/Natural (RTNAC) and Universidad de Santiago de Compostela (USC). ISBN 978-84-9887-458-7. Brain Dynamics, English edition 2023 (Vols. 1 and 2, Supplements I and II), I. Gonzalo-Fonrodona (ed.), Editorial CSIC, open access.
  19. Gonzalo, J. (1952). "Las funciones cerebrales humanas según nuevos datos y bases fisiológicas. Una introducción a los estudios de Dinámica Cerebral" [Human cerebral functions according to new data and physiological bases. An introduction to the studies of Brain Dynamics]. Trabajos del Inst. Cajal de Investigaciones Biológicas (in Spanish). XLIV: 95–157. Published as Supplement I of Brain Dynamics, English edition 2023 (Vols. 1 and 2, Supplements I and II), I. Gonzalo-Fonrodona (ed.), Editorial CSIC, open access.
  20. Witten, IB.; Knudsen, EI. (Nov 2005). "Why seeing is believing: merging auditory and visual worlds". Neuron. 48 (3): 489–96. doi:10.1016/j.neuron.2005.10.020. PMID 16269365. S2CID 17244783.
  21. Shams, L.; Beierholm, UR. (Sep 2010). "Causal inference in perception". Trends Cogn Sci. 14 (9): 425–32. doi:10.1016/j.tics.2010.07.001. PMID 20705502. S2CID 7750709.
  22. Gori, M.; Del Viva, M.; Sandini, G.; Burr, DC. (May 2008). "Young children do not integrate visual and haptic form information" (PDF). Curr Biol. 18 (9): 694–8. Bibcode:2008CBio...18..694G. doi:10.1016/j.cub.2008.04.036. PMID 18450446. S2CID 13899031.
  23. Gori, M.; Sandini, G.; Burr, D. (2012). "Development of visuo-auditory integration in space and time". Front Integr Neurosci. 6: 77. doi:10.3389/fnint.2012.00077. PMC 3443931. PMID 23060759.
  24. Welch RB, Warren DH (November 1980). "Immediate perceptual response to intersensory discrepancy". Psychol Bull. 88 (3): 638–67. doi:10.1037/0033-2909.88.3.638. PMID 7003641.
  25. Lederman, Susan J.; Klatzky, Roberta L. (2004). "Multisensory Texture Perception". In Calvert, Gemma A.; Spence, Charles; Stein, Barry E. (eds.). The Handbook of Multisensory Processing. Cambridge, MA: MIT Press. pp. 107–122. ISBN 978-0-262-03321-3.
  26. Alais D, Burr D (February 2004). "The ventriloquist effect results from near-optimal bimodal integration". Curr. Biol. 14 (3): 257–62. Bibcode:2004CBio...14..257A. CiteSeerX 10.1.1.220.4159. doi:10.1016/j.cub.2004.01.029. PMID 14761661. S2CID 3125842.
  27. Kuang, S.; Zhang, T. (2014). "Smelling directions: Olfaction modulates ambiguous visual motion perception". Scientific Reports. 4: 5796. Bibcode:2014NatSR...4E5796K. doi:10.1038/srep05796. PMC 4107342. PMID 25052162.
  28. Deneve S, Pouget A (2004). "Bayesian multisensory integration and cross-modal spatial links" (PDF). J. Physiol. Paris. 98 (1–3): 249–58. CiteSeerX 10.1.1.133.7694. doi:10.1016/j.jphysparis.2004.03.011. PMID 15477036. S2CID 9831111.
  29. Pouget A, Deneve S, Duhamel JR (September 2002). "A computational perspective on the neural basis of multisensory spatial representations". Nature Reviews Neuroscience. 3 (9): 741–7. doi:10.1038/nrn914. PMID 12209122. S2CID 1035721.
  30. Vilares, I.; Kording, K. (Apr 2011). "Bayesian models: the structure of the world, uncertainty, behavior, and the brain". Annals of the New York Academy of Sciences. 1224 (1): 22–39. Bibcode:2011NYASA1224...22V. doi:10.1111/j.1749-6632.2011.05965.x. PMC 3079291. PMID 21486294.
  31. Beierholm, UR.; Quartz, SR.; Shams, L. (2009). "Bayesian priors are encoded independently from likelihoods in human multisensory perception". J Vis. 9 (5): 23.1–9. doi:10.1167/9.5.23. PMID 19757901.
  32. Gilbert, M. (1989). On social facts. London: Routledge.
  33. Searle, JR. (1992). The rediscovery of the mind. London: MIT Press.
  34. Tuomela, R. (1995). The importance of us. Stanford, CA: Stanford University Press.
  35. Tomasello, M. (1999). The Cultural Origins of Human Cognition. Cambridge, Massachusetts: Harvard University Press.
  36. Tomasello, M. (2019). Becoming Human: A Theory of Ontogeny. Cambridge, Massachusetts: Harvard University Press.
  37. McClung, J. S.; Placì, S.; Bangerter, A.; Clément, F.; Bshary, R. (2017). "The language of cooperation: shared intentionality drives variation in helping as a function of group membership". Proceedings of the Royal Society B: Biological Sciences. 284 (1863): 20171682. doi:10.1098/rspb.2017.1682.
  38. Shteynberg, G.; Galinsky, A. D. (2011). "Implicit coordination: Sharing goals with similar others intensifies goal pursuit". Journal of Experimental Social Psychology. 47 (6): 1291–1294. doi:10.1016/j.jesp.2011.04.012.
  39. Fishburn, F. A.; Murty, V. P.; Hlutkowsky, C. O.; MacGillivray, C. E.; Bemis, L. M.; Murphy, M. E.; ...; Perlman, S. B. (2018). "Putting our heads together: interpersonal neural synchronization as a biological mechanism for shared intentionality". Social Cognitive and Affective Neuroscience. 13 (8): 841–849.
  40. Val Danilov, I. (2023). "Theoretical Grounds of Shared Intentionality for Neuroscience in Developing Bioengineering Systems". OBM Neurobiology. 7 (1): 156. doi:10.21926/obm.neurobiol.2301156.
  41. Val Danilov, I.; Mihailova, S. (2021). "Neuronal Coherence Agent for Shared Intentionality: A Hypothesis of Neurobiological Processes Occurring during Social Interaction". OBM Neurobiology. 5 (4): 26. doi:10.21926/obm.neurobiol.2104113.
  42. Val Danilov, I. (2023). "Low-Frequency Oscillations for Nonlocal Neuronal Coupling in Shared Intentionality Before and After Birth: Toward the Origin of Perception". OBM Neurobiology. 7 (4): 192. doi:10.21926/obm.neurobiol.2304192. https://www.lidsen.com/journals/neurobiology/neurobiology-07-04-192.
  43. Val Danilov, I. (2023). "Shared Intentionality Modulation at the Cell Level: Low-Frequency Oscillations for Temporal Coordination in Bioengineering Systems". OBM Neurobiology. 7 (4): 185. doi:10.21926/obm.neurobiol.2304185. https://www.lidsen.com/journals/neurobiology/neurobiology-07-04-185.
  44. Stein, Barry; Meredith, M. Alex (1993). The merging of the senses. Cambridge, Mass: MIT Press. ISBN 978-0-262-19331-3. OCLC 25869284.
  45. Meredith, MA.; Stein, BE. (Feb 1986). "Spatial factors determine the activity of multisensory neurons in cat superior colliculus". Brain Res. 365 (2): 350–4. doi:10.1016/0006-8993(86)91648-3. PMID 3947999. S2CID 12807282.
  46. King AJ, Palmer AR (1985). "Integration of visual and auditory information in bimodal neurones in the guinea-pig superior colliculus". Exp Brain Res. 60 (3): 492–500. doi:10.1007/bf00236934. PMID 4076371. S2CID 25796198.
  47. Meredith, MA.; Nemitz, JW.; Stein, BE. (Oct 1987). "Determinants of multisensory integration in superior colliculus neurons. I. Temporal factors". J Neurosci. 7 (10): 3215–29. doi:10.1523/JNEUROSCI.07-10-03215.1987. PMC 6569162. PMID 3668625.
  48. Meredith MA, Stein BE (July 1983). "Interactions among converging sensory inputs in the superior colliculus". Science. 221 (4608): 389–91. Bibcode:1983Sci...221..389M. doi:10.1126/science.6867718. PMID 6867718.
  49. Meredith, MA.; Stein, BE. (Sep 1986). "Visual, auditory, and somatosensory convergence on cells in superior colliculus results in multisensory integration". J Neurophysiol. 56 (3): 640–62. doi:10.1152/jn.1986.56.3.640. PMID 3537225.
  50. Cervantes Constantino, F.; Sánchez-Costa, T.; Cipriani, G.A.; Carboni, A. (2023). "Visuospatial attention revamps cortical processing of sound amid audiovisual uncertainty". Psychophysiology. 60 (10): e14329. doi:10.1111/psyp.14329. PMID 37166096. S2CID 258617930.
  51. Hershenson M (March 1962). "Reaction time as a measure of intersensory facilitation". J Exp Psychol. 63 (3): 289–93. doi:10.1037/h0039516. PMID 13906889.
  52. Hughes, HC.; Reuter-Lorenz, PA.; Nozawa, G.; Fendrich, R. (Feb 1994). "Visual-auditory interactions in sensorimotor processing: saccades versus manual responses". J Exp Psychol Hum Percept Perform. 20 (1): 131–53. doi:10.1037/0096-1523.20.1.131. PMID 8133219.
  53. Wallace, Mark T. (2004). "The development of multisensory processes". Cognitive Processing. 5 (2): 69–83. doi:10.1007/s10339-004-0017-z. ISSN 1612-4782. S2CID 16710851.
  54. Ridgway N, Milders M, Sahraie A (May 2008). "Redundant target effect and the processing of colour and luminance". Exp Brain Res. 187 (1): 153–60. doi:10.1007/s00221-008-1293-0. PMID 18264703. S2CID 23092762.
  55. Forster B, Cavina-Pratesi C, Aglioti SM, Berlucchi G (April 2002). "Redundant target effect and intersensory facilitation from visual-tactile interactions in simple reaction time". Exp Brain Res. 143 (4): 480–7. doi:10.1007/s00221-002-1017-9. PMID 11914794. S2CID 115844.
  56. McGurk H, MacDonald J (1976). "Hearing lips and seeing voices". Nature. 264 (5588): 746–8. Bibcode:1976Natur.264..746M. doi:10.1038/264746a0. PMID 1012311. S2CID 4171157.
  57. Nath, AR.; Beauchamp, MS. (Jan 2012). "A neural basis for interindividual differences in the McGurk effect, a multisensory speech illusion". NeuroImage. 59 (1): 781–7. doi:10.1016/j.neuroimage.2011.07.024. PMC 3196040. PMID 21787869.
  58. Hairston WD, Wallace MT, Vaughan JW, Stein BE, Norris JL, Schirillo JA (January 2003). "Visual localization ability influences cross-modal bias". J Cogn Neurosci. 15 (1): 20–9. doi:10.1162/089892903321107792. PMID 12590840. S2CID 13636325.
  59. Shams L, Kamitani Y, Shimojo S (December 2000). "Illusions. What you see is what you hear". Nature. 408 (6814): 788. Bibcode:2000Natur.408..788S. doi:10.1038/35048669. PMID 11130706. S2CID 205012107.
  60. Watkins S, Shams L, Josephs O, Rees G (August 2007). "Activity in human V1 follows multisensory perception". NeuroImage. 37 (2): 572–8. doi:10.1016/j.neuroimage.2007.05.027. PMID 17604652. S2CID 17477883.
  61. Shams L, Iwaki S, Chawla A, Bhattacharya J (April 2005). "Early modulation of visual cortex by sound: an MEG study". Neurosci. Lett. 378 (2): 76–81. doi:10.1016/j.neulet.2004.12.035. PMID 15774261. S2CID 4675944.
  62. Andersen TS, Tiippana K, Sams M (November 2004). "Factors influencing audiovisual fission and fusion illusions". Brain Res Cogn Brain Res. 21 (3): 301–308. doi:10.1016/j.cogbrainres.2004.06.004. PMID 15511646.
  63. Botvinick M, Cohen J (February 1998). "Rubber hands 'feel' touch that eyes see" (PDF). Nature. 391 (6669): 756. Bibcode:1998Natur.391..756B. doi:10.1038/35784. PMID 9486643. S2CID 205024516.
  64. Ehrsson HH, Holmes NP, Passingham RE (November 2005). "Touching a rubber hand: feeling of body ownership is associated with activity in multisensory brain areas". J. Neurosci. 25 (45): 10564–73. doi:10.1523/JNEUROSCI.0800-05.2005. PMC 1395356. PMID 16280594.
  65. Samad M, Chung A, Shams L (February 2015). "Perception of Body Ownership is Driven by Bayesian Sensory Inference". PLOS ONE. 10 (2): e0117178. Bibcode:2015PLoSO..1017178S. doi:10.1371/journal.pone.0117178. PMC 4320053. PMID 25658822.
  66. Holmes NP, Crozier G, Spence C (June 2004). "When mirrors lie: "visual capture" of arm position impairs reaching performance" (PDF). Cogn Affect Behav Neurosci. 4 (2): 193–200. doi:10.3758/CABN.4.2.193. PMC 1314973. PMID 15460925.
  67. Tastevin, J. (Feb 1937). "En partant de l'expérience d'Aristote: Les déplacements artificiels des parties du corps ne sont pas suivis par le sentiment de ces parties ni par les sensations qu'on peut y produire" [Starting from Aristotle's experiment: The artificial displacements of parts of the body are not followed by feeling in these parts or by the sensations which can be produced there]. L'Encéphale (in French). 32 (2): 57–84. (English abstract).
  68. Tastevin, J. (Mar 1937). "En partant de l'expérience d'Aristote" [Starting from Aristotle's experiment]. L'Encéphale (in French). 32 (3): 140–158.
  69. Bergman, Ronald A.; Afifi, Adel K. (2005). Functional neuroanatomy: text and atlas. New York: McGraw-Hill. ISBN 978-0-07-140812-7. OCLC 475017241.
  70. Giard MH, Peronnet F (September 1999). "Auditory-visual integration during multimodal object recognition in humans: a behavioral and electrophysiological study". J Cogn Neurosci. 11 (5): 473–90. doi:10.1162/089892999563544. PMID 10511637. S2CID 5735865.
  71. Miller LM, D'Esposito M (June 2005). "Perceptual fusion and stimulus coincidence in the cross-modal integration of speech". J. Neurosci. 25 (25): 5884–93. doi:10.1523/JNEUROSCI.0896-05.2005. PMC 6724802. PMID 15976077.
  72. Graziano MS, Gross CG (1993). "A bimodal map of space: somatosensory receptive fields in the macaque putamen with corresponding visual receptive fields" (PDF). Exp Brain Res. 97 (1): 96–109. doi:10.1007/BF00228820. PMID 8131835. S2CID 9387557.
  73. Gentile, G.; Petkova, VI.; Ehrsson, HH. (Feb 2011). "Integration of visual and tactile signals from the hand in the human brain: an FMRI study". J Neurophysiol. 105 (2): 910–22. doi:10.1152/jn.00840.2010. PMC 3059180. PMID 21148091.
  74. Thesen, T.; Vibell, J.; Calvert, G.A.; Osterbauer, R. (2004). "Neuroimaging of multisensory processing in vision, audition, touch and olfaction". Cognitive Processing. 5: 84–93.
  75. Wallace MT, Ramachandran R, Stein BE (February 2004). "A revised view of sensory cortical parcellation". Proc. Natl. Acad. Sci. U.S.A. 101 (7): 2167–72. Bibcode:2004PNAS..101.2167W. doi:10.1073/pnas.0305697101. PMC 357070. PMID 14766982.
  76. Clavagnier S, Falchier A, Kennedy H (June 2004). "Long-distance feedback projections to area V1: implications for multisensory integration, spatial awareness, and visual consciousness" (PDF). Cogn Affect Behav Neurosci. 4 (2): 117–26. doi:10.3758/CABN.4.2.117. PMID 15460918. S2CID 6907281.
  77. Foxe JJ, Simpson GV (January 2002). "Flow of activation from V1 to frontal cortex in humans. A framework for defining "early" visual processing". Exp Brain Res. 142 (1): 139–50. doi:10.1007/s00221-001-0906-7. PMID 11797091. S2CID 25506401.
  78. Macaluso E, Frith CD, Driver J (August 2000). "Modulation of human visual cortex by crossmodal spatial attention". Science. 289 (5482): 1206–8. Bibcode:2000Sci...289.1206M. CiteSeerX 10.1.1.420.5403. doi:10.1126/science.289.5482.1206. PMID 10947990.
  79. Sadato N, Yamada H, Okada T, et al. (December 2004). "Age-dependent plasticity in the superior temporal sulcus in deaf humans: a functional MRI study". BMC Neurosci. 5: 56. doi:10.1186/1471-2202-5-56. PMC 539237. PMID 15588277.
  80. Bernstein LE, Auer ET, Moore JK, Ponton CW, Don M, Singh M (March 2002). "Visual speech perception without primary auditory cortex activation". NeuroReport. 13 (3): 311–5. doi:10.1097/00001756-200203040-00013. PMID 11930129. S2CID 44484836.
  81. Jones EG, Powell TP (1970). "An anatomical study of converging sensory pathways within the cerebral cortex of the monkey". Brain. 93 (4): 793–820. doi:10.1093/brain/93.4.793. PMID 4992433.
  82. Grefkes, C.; Fink, GR. (Jul 2005). "The functional organization of the intraparietal sulcus in humans and monkeys". J Anat. 207 (1): 3–17. doi:10.1111/j.1469-7580.2005.00426.x. PMC 1571496. PMID 16011542.
  83. Murata, A.; Fadiga, L.; Fogassi, L.; Gallese, V.; Raos, V.; Rizzolatti, G. (Oct 1997). "Object representation in the ventral premotor cortex (area F5) of the monkey". J Neurophysiol. 78 (4): 2226–30. doi:10.1152/jn.1997.78.4.2226. PMID 9325390.
  84. Smiley, JF.; Falchier, A. (Dec 2009). "Multisensory connections of monkey auditory cerebral cortex". Hear Res. 258 (1–2): 37–46. doi:10.1016/j.heares.2009.06.019. PMC 2788085. PMID 19619628.
  85. Sharma, J.; Dragoi, V.; Tenenbaum, JB.; Miller, EK.; Sur, M. (Jun 2003). "V1 neurons signal acquisition of an internal representation of stimulus location". Science. 300 (5626): 1758–63. Bibcode:2003Sci...300.1758S. doi:10.1126/science.1081721. PMID 12805552. S2CID 14716502.
  86. Lacey, S.; Tal, N.; Amedi, A.; Sathian, K. (May 2009). "A putative model of multisensory object representation". Brain Topogr. 21 (3–4): 269–74. doi:10.1007/s10548-009-0087-4. PMC 3156680. PMID 19330441.
  87. Neal, JW.; Pearson, RC.; Powell, TP. (Jul 1990). "The ipsilateral cortico-cortical connections of area 7b, PF, in the parietal and temporal lobes of the monkey". Brain Res. 524 (1): 119–32. doi:10.1016/0006-8993(90)90500-B. PMID 1698108. S2CID 24535669.
  88. Eickhoff, SB.; Schleicher, A.; Zilles, K.; Amunts, K. (Feb 2006). "The human parietal operculum. I. Cytoarchitectonic mapping of subdivisions". Cereb Cortex. 16 (2): 254–67. doi:10.1093/cercor/bhi105. PMID 15888607.
  89. Petitto LA, Zatorre RJ, Gauna K, Nikelski EJ, Dostie D, Evans AC (December 2000). "Speech-like cerebral activity in profoundly deaf people processing signed languages: implications for the neural basis of human language". Proc. Natl. Acad. Sci. U.S.A. 97 (25): 13961–6. doi:10.1073/pnas.97.25.13961. PMC 17683. PMID 11106400.
  90. Meredith, MA.; Clemo, HR. (Nov 1989). "Auditory cortical projection from the anterior ectosylvian sulcus (Field AES) to the superior colliculus in the cat: an anatomical and electrophysiological study". J Comp Neurol. 289 (4): 687–707. doi:10.1002/cne.902890412. PMID 2592605. S2CID 221577963.
  91. Jiang, W.; Wallace, MT.; Jiang, H.; Vaughan, JW.; Stein, BE. (Feb 2001). "Two cortical areas mediate multisensory integration in superior colliculus neurons". J Neurophysiol. 85 (2): 506–22. doi:10.1152/jn.2001.85.2.506. PMID 11160489. S2CID 2499047.
  92. Wallace, MT.; Carriere, BN.; Perrault, TJ.; Vaughan, JW.; Stein, BE. (Nov 2006). "The development of cortical multisensory integration". Journal of Neuroscience. 26 (46): 11844–9. doi:10.1523/JNEUROSCI.3295-06.2006. PMC 6674880. PMID 17108157.
  93. Jiang W, Stein BE (October 2003). "Cortex controls multisensory depression in superior colliculus". J. Neurophysiol. 90 (4): 2123–35. doi:10.1152/jn.00369.2003. PMID 14534263.
  94. Wallace MT, Meredith MA, Stein BE (June 1993). "Converging influences from visual, auditory, and somatosensory cortices onto output neurons of the superior colliculus". J. Neurophysiol. 69 (6): 1797–809. doi:10.1152/jn.1993.69.6.1797. PMID 8350124.
  95. Stein, Barry E.; London, Nancy; Wilkinson, Lee K.; Price, Donald D. (1996). "Enhancement of Perceived Visual Intensity by Auditory Stimuli: A Psychophysical Analysis". Journal of Cognitive Neuroscience. 8 (6): 497–506. doi:10.1162/jocn.1996.8.6.497. PMID 23961981. S2CID 43705477.
  96. Goodale MA, Milner AD (January 1992). "Separate visual pathways for perception and action" (PDF). Trends Neurosci. 15 (1): 20–5. CiteSeerX 10.1.1.207.6873. doi:10.1016/0166-2236(92)90344-8. PMID 1374953. S2CID 793980.
  97. Renier LA, Anurova I, De Volder AG, Carlson S, VanMeter J, Rauschecker JP (September 2009). "Multisensory integration of sounds and vibrotactile stimuli in processing streams for "what" and "where"". J. Neurosci. 29 (35): 10950–60. doi:10.1523/JNEUROSCI.0910-09.2009. PMC 3343457. PMID 19726653.
  98. Nardini, M; Bedford, R; Mareschal, D (Sep 28, 2010). "Fusion of visual cues is not mandatory in children". Proceedings of the National Academy of Sciences of the United States of America. 107 (39): 17041–6. Bibcode:2010PNAS..10717041N. doi:10.1073/pnas.1001699107. PMC 2947870. PMID 20837526.
  99. Gori, M; Del Viva, M; Sandini, G; Burr, DC (May 6, 2008). "Young children do not integrate visual and haptic form information" (PDF). Current Biology. 18 (9): 694–8. Bibcode:2008CBio...18..694G. doi:10.1016/j.cub.2008.04.036. PMID 18450446. S2CID 13899031.
  100. Lewkowicz, D; Kraebel, K (2004). "The value of multisensory redundancy in the development of intersensory perception". In Calvert, Gemma; Spence, Charles; Stein, Barry E (eds.). The Handbook of Multisensory Processing. Cambridge, MA: MIT Press. pp. 655–78. ISBN 9780262033213. OCLC 803222288.
  101. Neil, PA; Chee-Ruiter, C; Scheier, C; Lewkowicz, DJ; Shimojo, S (Sep 2006). "Development of multisensory spatial integration and perception in humans". Developmental Science. 9 (5): 454–64. doi:10.1111/j.1467-7687.2006.00512.x. PMID 16911447. S2CID 25690976.
  102. Hillis, JM; Ernst, MO; Banks, MS; Landy, MS (Nov 22, 2002). "Combining sensory information: mandatory fusion within, but not between, senses". Science. 298 (5598): 1627–30. Bibcode:2002Sci...298.1627H. CiteSeerX 10.1.1.278.6134. doi:10.1126/science.1075396. PMID 12446912. S2CID 15607270.
  103. Ernst, MO (Jun 24, 2008). "Multisensory integration: a late bloomer". Current Biology. 18 (12): R519–21. Bibcode:2008CBio...18.R519E. doi:10.1016/j.cub.2008.05.002. PMID 18579094. S2CID 130911.
  104. Stein, BE; Meredith, MA; Wallace, MT (1993). "Chapter 8: The visually responsive neuron and beyond: multisensory integration in cat and monkey". Progress in Brain Research. Vol. 95. pp. 79–90. doi:10.1016/s0079-6123(08)60359-3. ISBN 9780444894922. PMID 8493355.
  105. Stein, BE; Labos, E; Kruger, L (Jul 1973). "Sequence of changes in properties of neurons of superior colliculus of the kitten during maturation". Journal of Neurophysiology. 36 (4): 667–79. doi:10.1152/jn.1973.36.4.667. PMID 4713313.
  106. Wallace, MT; Stein, BE (Nov 15, 2001). "Sensory and multisensory responses in the newborn monkey superior colliculus". The Journal of Neuroscience. 21 (22): 8886–94. doi:10.1523/JNEUROSCI.21-22-08886.2001. PMC 6762279. PMID 11698600.
  107. Tootell, RB; Hadjikhani, NK; Vanduffel, W; Liu, AK; Mendola, JD; Sereno, MI; Dale, AM (Feb 3, 1998). "Functional analysis of primary visual cortex (V1) in humans". Proceedings of the National Academy of Sciences of the United States of America. 95 (3): 811–7. Bibcode:1998PNAS...95..811T. doi:10.1073/pnas.95.3.811. PMC 33802. PMID 9448245.
  108. Gori, M; Giuliana, L; Sandini, G; Burr, D (Nov 2012). "Visual size perception and haptic calibration during development". Developmental Science. 15 (6): 854–62. doi:10.1111/j.1467-7687.2012.01183.x. PMID 23106739.
  109. Granrud, CE; Schmechel, TT (Nov 2006). "Development of size constancy in children: a test of the proximal mode sensitivity hypothesis". Perception & Psychophysics. 68 (8): 1372–81. doi:10.3758/bf03193736. PMID 17378423.
  110. Gepshtein, S; Burge, J; Ernst, MO; Banks, MS (Dec 28, 2005). "The combination of vision and touch depends on spatial proximity". Journal of Vision. 5 (11): 1013–23. doi:10.1167/5.11.7. PMC 2632311. PMID 16441199.
  111. Helbig, HB; Ernst, MO (Jun 2007). "Optimal integration of shape information from vision and touch". Experimental Brain Research. 179 (4): 595–606. doi:10.1007/s00221-006-0814-y. PMID 17225091. S2CID 12049308.
  112. Helbig, HB; Ernst, MO (2007). "Knowledge about a common source can promote visual-haptic integration". Perception. 36 (10): 1523–33. doi:10.1068/p5851. PMID 18265835. S2CID 14884284.
  113. Kail, RV; Ferrer, E (Nov–Dec 2007). "Processing speed in childhood and adolescence: longitudinal models for examining developmental change". Child Development. 78 (6): 1760–70. doi:10.1111/j.1467-8624.2007.01088.x. PMID 17988319.
  114. Kail, R (May 1991). "Developmental change in speed of processing during childhood and adolescence". Psychological Bulletin. 109 (3): 490–501. doi:10.1037/0033-2909.109.3.490. PMID 2062981.
  115. Ballard, DH; Hayhoe, MM; Pook, PK; Rao, RP (Dec 1997). "Deictic codes for the embodiment of cognition". The Behavioral and Brain Sciences. 20 (4): 723–42, discussion 743–67. CiteSeerX 10.1.1.49.3813. doi:10.1017/s0140525x97001611. PMID 10097009. S2CID 1961389.
  116. Daee, Pedram; Mirian, Maryam S.; Ahmadabadi, Majid Nili (2014). "Reward Maximization Justifies the Transition from Sensory Selection at Childhood to Sensory Integration at Adulthood". PLOS ONE. 9 (7): e103143. Bibcode:2014PLoSO...9j3143D. doi:10.1371/journal.pone.0103143. PMC 4110011. PMID 25058591.
  117. Rincon-Gonzalez L, Warren JP (2011). "Haptic interaction of touch and proprioception: implications for neuroprosthetics". IEEE Trans. Neural Syst. Rehabil. Eng. 19 (5): 490–500. doi:10.1109/tnsre.2011.2166808. PMID 21984518. S2CID 20575638.
  118. Jiang, Huai; Stein, Barry E.; McHaffie, John G. (May 2015). "Multisensory training reverses midbrain lesion-induced changes and ameliorates haemianopia". Nature Communications. 6: 7263. Bibcode:2015NatCo...6.7263J. doi:10.1038/ncomms8263. ISSN 2041-1723. PMC 6193257. PMID 26021613.

Further reading