Perceptual narrowing

Perceptual narrowing is a developmental process through which the brain uses environmental experience to shape perceptual abilities. It improves the perception of things that people encounter often and produces a decline in the ability to perceive some things to which they are rarely exposed. [1] [2] [3] The phenomenon is a result of neuroplasticity, including Hebbian learning [4] [5] and synaptic pruning. [2] Through these mechanisms, neural pathways that are used more consistently are strengthened, making them more efficient, while pathways that go unused become less efficient. The process is most evident during sensitive periods of development. [6] The prevailing theory is that human infants are born with the ability to sense a wide variety of stimuli and, as they age, selectively narrow these perceptions by categorizing them in a more socio-culturally relevant way. Most of the research in this area focuses on facial discrimination and phoneme distinction in human infants, but other work has found that perceptual narrowing also occurs for music [7] and sign language [8] perception. Perceptual narrowing has also been implicated in synaesthesia.
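
The use-dependent strengthening and weakening described above can be illustrated with a minimal Hebbian sketch. The learning rate, decay term, and two-pathway setup below are illustrative assumptions rather than values taken from the cited studies; the point is only that a frequently co-activated connection grows while a rarely used one fades.

```python
# Illustrative-only Hebbian sketch (all values are assumptions, not from the cited studies).
LEARNING_RATE = 0.1   # strengthening applied when pre- and postsynaptic units co-activate
DECAY = 0.02          # passive weakening of a connection on every step

def hebbian_step(weight, pre, post):
    """Strengthen the connection when both units are active; otherwise let it decay."""
    return weight + LEARNING_RATE * pre * post - DECAY * weight

native_pathway = 0.5      # driven on nearly every step (frequent experience)
non_native_pathway = 0.5  # driven only occasionally (rare experience)

for step in range(100):
    native_pathway = hebbian_step(native_pathway, pre=1.0, post=1.0)
    rare_input = 1.0 if step % 20 == 0 else 0.0
    non_native_pathway = hebbian_step(non_native_pathway, pre=rare_input, post=rare_input)

print(f"frequently used pathway strength: {native_pathway:.2f}")
print(f"rarely used pathway strength:     {non_native_pathway:.2f}")
```

After many steps the frequently driven connection ends up substantially stronger than the rarely driven one, mirroring the narrowing pattern described above.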

Facial discrimination

Cross racial

Most of the research done to date on perceptual narrowing involves facial processing studies conducted with infants. Using a preferential looking procedure in cross-racial studies, Caucasian infants were tested on their ability to distinguish two faces from each of four racial groups: their own group, as well as African, Middle Eastern, and Chinese. At three months of age, infants showed recognition of familiar faces from all racial groups, but by six months a pattern began to emerge in which infants recognized only Caucasian and Chinese faces, the groups with which they had more familiarity. At nine months, recognition occurred only for own-race faces. These cross-race studies provide strong evidence that children start out with broad cross-racial recognition abilities but, as they age, quickly begin to organize this information and favor the stimuli that are most familiar to them, typically own-race faces. [9]

Cross species

Cross-species studies have been conducted in which human infants at six months of age were familiarized with individual monkeys. When the monkey faces were paired with unique proper-name labels, the infants maintained their ability to discriminate between them when retested at nine months of age. If the exposure was to monkey faces in general, without name labels, the infants were unable to discriminate between them when retested at nine months. This research shows that the individuation process helps shape and maintain discrimination abilities for familiar categories and is instrumental in the recognition of familiar faces later in life. It also highlights the importance of experience in perceptual narrowing. [10]

Infants aged three to four months showed similar results with faces of non-primate species such as cats and dogs, although it has not been confirmed whether the ability to discriminate these faces declines after nine months of age. [11]

Phoneme distinction

At birth, infants have broad abilities to detect similarities and differences among languages. The phonemes of different languages sound distinct to infants younger than six months, but as infants grow and their brains develop, they become less able to distinguish the phonemes of non-native languages and more responsive to their native language. [12] This is presumably because infants hear their native language often while getting little experience with non-native languages. [13] Research suggests that this narrowing occurs within the first year of life: infants aged 6–8 months are better able to distinguish non-native speech sounds than infants aged 8–10 months. Near the end of the first year, infants begin to understand and produce speech in their native language, and by twelve months they detect non-native phonemic distinctions at low levels similar to those of adults. [12]

Neural mechanisms

Brain plasticity is responsible for this "tuning" of infants' perceptual abilities. While plasticity is evident throughout the human lifespan, it is greatest at younger ages, during sensitive periods of development. [6] This tuning is a function of synaptic pruning, a mechanism of plasticity in which the overall number of neurons and neural pathways is reduced, leaving only the most commonly used, and most efficient, pathways. These pathways are also more heavily myelinated, which increases the speed at which processing occurs. [14] Evidence suggests that perceptual narrowing, especially phoneme distinction, relies heavily on infants' social interaction with the adults in their environment; this is referred to as the "social gating hypothesis". The social gating hypothesis suggests that social interaction creates an optimal learning environment for infants, one that introduces learning through social context. Social gating might function in a number of ways: for example, by increasing infants' attention or arousal, increasing their sense of relationship, and strengthening the link between perception and action. [15]
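
As a rough, purely illustrative sketch of the pruning idea (the dimensions, learning rate, decay, and threshold below are assumptions, not parameters from the cited work), the following toy model strengthens frequently co-activated connections, lets unused ones decay, and then removes those that remain below a strength cutoff.

```python
import numpy as np

# Toy pruning sketch; all parameters are arbitrary assumptions, not taken from the cited work.
rng = np.random.default_rng(0)
n_inputs, n_outputs = 8, 8

weights = np.full((n_inputs, n_outputs), 0.1)          # start densely connected
often_used = rng.random((n_inputs, n_outputs)) < 0.2   # only ~20% of connections co-activate often

# Use-dependent strengthening with mild passive decay over many "experiences"
for _ in range(200):
    weights += 0.05 * often_used   # Hebbian-style strengthening of frequently used connections
    weights *= 0.99                # slight weakening of every connection on each step

# Pruning: connections that stayed weak are removed outright
threshold = 0.5
pruned = np.where(weights >= threshold, weights, 0.0)

print("connections before pruning:", int(np.count_nonzero(weights)))
print("connections after pruning: ", int(np.count_nonzero(pruned)))
```

In this sketch the surviving connections are exactly the frequently co-activated ones, which captures the intuition that pruning leaves behind the most used, most efficient pathways.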

Synaesthesia

Synaesthesia is a condition in which the stimulation of one sense evokes an additional sensation in another sense, [16] such as a tone (auditory stimulation) evoking the experience of color or shapes (visual stimulation). Some research suggests that infants universally have this experience because of the greater number of functional synaptic connections across sensory domains compared to adults, and that over the course of normal development the experience dissipates through perceptual narrowing. There is evidence that a failure in the perceptual narrowing process can, in rare cases, lead to adult synaesthesia. [17]

Related Research Articles

Developmental psychology is the scientific study of how and why humans grow, change, and adapt across the course of their lives. Originally concerned with infants and children, the field has expanded to include adolescence, adult development, aging, and the entire lifespan. Developmental psychologists aim to explain how thinking, feeling, and behavior change throughout life. The field examines change across three major dimensions: physical development, cognitive development, and social-emotional development. Within these three dimensions is a broad range of topics, including motor skills, executive functions, moral understanding, language acquisition, social change, personality, emotional development, self-concept, and identity formation.

Lip reading, also known as speechreading, is a technique of understanding a limited range of speech by visually interpreting the movements of the lips, face and tongue without sound. Estimates of the range of lip reading vary, with some figures as low as 30% because lip reading relies on context, language knowledge, and any residual hearing. Although lip reading is used most extensively by deaf and hard-of-hearing people, most people with normal hearing process some speech information from sight of the moving mouth.

Facial perception is an individual's understanding and interpretation of the face. Here, perception implies the presence of consciousness and hence excludes automated facial recognition systems. Although facial recognition is found in other species, this article focuses on facial perception in humans.

Cognitive development is a field of study in neuroscience and psychology focusing on a child's development in terms of information processing, conceptual resources, perceptual skill, language learning, and other aspects of the developed adult brain and cognitive psychology. Qualitative differences between how a child and an adult process their waking experience are acknowledged. Cognitive development is defined as the emergence of the ability to consciously cognize, understand, and articulate that understanding in adult terms; it is how a person perceives, thinks, and gains understanding of their world through the interaction of genetic and learning factors. There are four stages of cognitive information development: reasoning, intelligence, language, and memory. These stages begin when a baby is about 18 months old; playing with toys, listening to parents speak, watching television, and anything else that catches their attention helps build cognitive development.

Multisensory integration, also known as multimodal integration, is the study of how information from the different sensory modalities may be integrated by the nervous system. A coherent representation of objects combining modalities enables animals to have meaningful perceptual experiences. Indeed, multisensory integration is central to adaptive behavior because it allows animals to perceive a world of coherent perceptual entities. Multisensory integration also deals with how different sensory modalities interact with one another and alter each other's processing.

In developmental psychology and developmental biology, a critical period is a maturational stage in the lifespan of an organism during which the nervous system is especially sensitive to certain environmental stimuli. If, for some reason, the organism does not receive the appropriate stimulus during this "critical period" to learn a given skill or trait, it may be difficult, ultimately less successful, or even impossible, to develop certain associated functions later in life. Functions that are indispensable to an organism's survival, such as vision, are particularly likely to develop during critical periods. "Critical period" also relates to the ability to acquire one's first language. Researchers found that people who passed the "critical period" would not acquire their first language fluently.

Visual learning is a learning style among the learning styles of Neil Fleming's VARK model in which information is presented to a learner in a visual format. Visual learners can utilize graphs, charts, maps, diagrams, and other forms of visual stimulation to effectively interpret information. The Fleming VARK model also includes Kinesthetic Learning and Auditory learning. There is no evidence that providing visual materials to students identified as having a visual style improves learning.

Infant vision concerns the development of visual ability in human infants from birth through the first years of life. The aspects of human vision which develop following birth include visual acuity, tracking, color perception, depth perception, and object recognition.

Speech perception is the process by which the sounds of language are heard, interpreted, and understood. The study of speech perception is closely linked to the fields of phonology and phonetics in linguistics and cognitive psychology and perception in psychology. Research in speech perception seeks to understand how human listeners recognize speech sounds and use this information to understand spoken language. Speech perception research has applications in building computer systems that can recognize speech, in improving speech recognition for hearing- and language-impaired listeners, and in foreign-language teaching.

In psychology and cognitive neuroscience, pattern recognition describes a cognitive process that matches information from a stimulus with information retrieved from memory.

Phonological development refers to how children learn to organize sounds into meaning or language (phonology) during their stages of growth.

Infant cognitive development is the first stage of human cognitive development, in the youngest children. The academic field of infant cognitive development studies how psychological processes involved in thinking and knowing develop in young children. Information is acquired in a number of ways, including through sight, sound, touch, taste, smell, and language, all of which require processing by our cognitive system. However, cognition begins through social bonds between children and caregivers, which gradually increase through the essential motive force of shared intentionality. The notion of shared intentionality describes processes that occur without awareness during social learning at the onset of life, when organisms in the simple-reflexes substage of the sensorimotor stage of cognitive development do not yet maintain communication via the sensory system.

Associative sequence learning (ASL) is a neuroscientific theory that attempts to explain how mirror neurons are able to match observed and performed actions, and how individuals are able to imitate body movements. The theory was proposed by Cecilia Heyes in 2000. A conceptually similar model, proposed by Christian Keysers and David Perrett and based on what is known about the neural properties of mirror neurons and spike-timing-dependent plasticity, is the Hebbian learning account of mirror neurons.

Cognitive musicology is a branch of cognitive science concerned with computationally modeling musical knowledge with the goal of understanding both music and cognition.

Perceptual learning is the learning of better perception skills, such as differentiating two musical tones from one another or categorizing spatial and temporal patterns relevant to real-world expertise. Examples include reading, seeing relations among chess pieces, and knowing whether or not an X-ray image shows a tumor.

The Troland Research Awards are an annual prize given by the United States National Academy of Sciences to two researchers in recognition of psychological research on the relationship between consciousness and the physical world. The areas where these award funds are to be spent include but are not limited to areas of experimental psychology, the topics of sensation, perception, motivation, emotion, learning, memory, cognition, language, and action. The award preference is given to experimental work with a quantitative approach or experimental research seeking physiological explanations.

Speech acquisition focuses on the development of vocal, acoustic and oral language by a child. This includes motor planning and execution, pronunciation, phonological and articulation patterns.

Statistical language acquisition, a branch of developmental psycholinguistics, studies the process by which humans develop the ability to perceive, produce, comprehend, and communicate with natural language in all of its aspects through the use of general learning mechanisms operating on statistical patterns in the linguistic input. Statistical learning accounts claim that infants' language learning is based on pattern perception rather than an innate biological grammar. Several statistical cues, such as the frequency of words, frequent frames, phonotactic patterns, and other regularities, provide information on language structure and meaning that facilitates language acquisition.

Many experiments have been done to find out how the brain interprets stimuli and how animals develop fear responses. Fear is hard-wired into almost every individual because of its vital role in the survival of the individual. Researchers have found that fear is established unconsciously and that the amygdala is involved in fear conditioning.

April A. Benasich is an American neuroscientist. She is the Elizabeth H. Solomon Professor of Developmental Cognitive Neuroscience, director of the Infancy Studies Laboratory at the Center for Molecular and Behavioral Neuroscience, and director of the Carter Center for Neurocognitive Research and Professor of Neuroscience at Rutgers University. She is also a principal investigator within the National Science Foundation-funded Temporal Dynamics of Learning Center headquartered at the University of California, San Diego’s Institute for Neural Computation.

References

  1. Lewkowicz, D. J., & Ghazanfar, A. A. (2009). The emergence of multisensory systems through perceptual narrowing. Trends in Cognitive Sciences, 13(11), 470–478. doi:10.1016/j.tics.2009.08.004
  2. Scott, L. S., Pascalis, O., & Nelson, C. A. (2007). A domain-general theory of the development of perceptual discrimination. Current Directions in Psychological Science, 16(4), 197–201. doi:10.1111/j.1467-8721.2007.00503.x
  3. Scott, L. S., & Monesson, A. (2010). Experience-dependent neural specialization during infancy. Neuropsychologia, 48(6), 1857–1861. doi:10.1016/j.neuropsychologia.2010.02.008
  4. Tichko, P., & Large, E. W. (2019). Modeling infants' perceptual narrowing to musical rhythms: Neural oscillation and Hebbian plasticity. Annals of the New York Academy of Sciences, 1453(1), 125–139. doi:10.1111/nyas.14050
  5. McClelland, J. L., Thomas, A. G., McCandliss, B. D., & Fiez, J. A. (1999). Understanding failures of learning: Hebbian learning, competition for representational space, and some preliminary experimental data. In Progress in Brain Research (Vol. 6, pp. 75–80). doi:10.1016/S0079-6123(08)63068-X
  6. Tierney, A. L., & Nelson III, C. A. (2009). Brain development and the role of experience in the early years. Zero to Three, 30(2), 9–13.
  7. Hannon, E. E., & Trehub, S. E. (2005). Tuning in to musical rhythms: Infants learn more readily than adults. Proceedings of the National Academy of Sciences, 102(35), 12639–12643. doi:10.1073/pnas.0504254102
  8. Palmer, S. B., Fais, L., Golinkoff, R. M., & Werker, J. F. (2012). Perceptual narrowing of linguistic sign occurs in the 1st year of life. Child Development, 83(2), 543–553. doi:10.1111/j.1467-8624.2011.01715.x
  9. Kelly, D. J., Quinn, P. C., Slater, A. M., Lee, K., Ge, L., & Pascalis, O. (2007). The other-race effect develops during infancy: Evidence of perceptual narrowing. Psychological Science, 18(12), 1084–1089.
  10. Scott, L. S., & Monesson, A. (2009). The origin of biases in face perception. Psychological Science, 20(6), 676–680.
  11. Simpson, E. A., Varga, K., Frick, J. E., & Fragaszy, D. (2011). Infants experience perceptual narrowing for nonprimate faces. Infancy, 16, 318–328. doi:10.1111/j.1532-7078.2010.00052.x
  12. Werker, J. F., & Tees, R. C. (2002). Cross-language speech perception: Evidence for perceptual reorganization during the first year of life. Infant Behavior and Development, 25(1), 121–133. doi:10.1016/S0163-6383(02)00093-0
  13. Pons, F., Lewkowicz, D. J., Soto-Faraco, S., & Sebastián-Gallés, N. (2009). Narrowing of intersensory speech perception in infancy. Proceedings of the National Academy of Sciences, 106(26), 10598–10602. doi:10.1073/pnas.0904134106
  14. Gerrard, S., & Rugg, G. (2009). Sensory impairment and autism: A re-examination of causal modelling. Journal of Autism and Developmental Disorders, 39(10), 1449–1463. doi:10.1007/s10803-009-0773-9
  15. Kuhl, P. K. (2010). Brain mechanisms in early language acquisition. Neuron, 67(5), 713–727. doi:10.1016/j.neuron.2010.08.038
  16. Wagner, K., & Dobkins, K. R. (2011). Synaesthetic associations decrease during infancy. Psychological Science, 22(8), 1067–1072. doi:10.1177/0956797611416250
  17. Spector, F., & Maurer, D. (2009). Synesthesia: A new approach to understanding the development of perception. Developmental Psychology, 45(1), 175–189. doi:10.1037/a0014171