Manual babbling

Manual babbling is a linguistic phenomenon that has been observed in deaf children and in hearing children born to deaf parents who have been exposed to sign language. Manual babbles are characterized by repetitive movements that are confined to a limited area in front of the body, similar to the sign-phonetic space used in sign languages. In their 1991 paper, Petitto and Marentette concluded that between 40% and 70% of deaf children's manual activity can be classified as manual babbling, whereas manual babbling accounts for less than 10% of hearing children's manual activity. Manual babbling appears in both deaf and hearing children learning American Sign Language from 6 to 14 months old (Marschark, 2003). [1] [2] [3]

Manual babbling is not to be confused with movement that is motor-driven and non-communicative, or with common communicative gestures. Babbling occurs during the same period of development in which an infant is also trying to establish a sense of spatial orientation and cognition. This results in arm and hand movements outside of what could be categorized as manual babbling. For example, when an infant waves their arm back and forth, they may be transitioning between uncoordinated behaviors and intentional, voluntary behaviors such as reaching. The frequency of these arm and hand gestures peaks between 5.5 and 9.5 months of age, around the same time that babbling begins (6 to 9 months). [4]

Language development

Babbling is an important step in the language acquisition of infants (Chamberlain et al., 1998). Before an infant is able to form their first words, they produce phonological tokens that, while meaningless, conform to the broad rules for syllable structure. Children who have access to spoken language produce vocal babbles, while children who have access to signed language produce manual babbles. In other words, "vocal babbling is 'triggered' by the patterned input of a spoken linguistic environment while manual babbling is triggered by the patterned input of a signed linguistic environment" (Cormier et al., 1998, p. 55). [4]

All infants are equipped to detect rhythmic patterns and properties of the linguistic input they receive. Non-hearing infants explore manual gestures (like those made in sign language) in the same way that hearing children explore the phonemes of a spoken language. Where hearing children are triggered by the sound patterns they hear, deaf children are more attentive to the movement patterns they see. [2] In their studies, Petitto and Marentette compared the manual babbling of hearing and non-hearing infants and found that non-hearing babies produce more tokens of manual babbles than hearing infants do. However, they did not find a significant difference in the frequency of communicative gestures (such as waving, reaching, and pointing) between hearing and non-hearing infants. [2]

Prelinguistic signing

In 1995, Meier and Willerman defined the three primary manual gestures as pointing, reaching, and waving. These common communicative gestures differ from babbles because they carry meaning (whereas babbles are meaningless). [2] Petitto and Marentette, referencing American Sign Language phonology, defined language-driven manual babbling as signed symbols that have a hand shape, a location, and a movement that must be realized as a change in location, hand shape, or palm orientation (Marentette, 1989). [2]

Comparisons to vocal babbling

Differences in the vocal behavior of deaf and hearing children do not appear in the first three stages of vocal development: the phonation stage (0–1 month), the GOO stage (2–3 months), and the expansion stage (4–6 months). The most significant differences begin to appear in the reduplicative babbling stage (7–10 months), during which a hearing infant will begin producing marginal babbling and canonical babbling (repetitive consonant-vowel syllables). Deaf infants, like hearing infants, will begin producing marginal babbling, but they will rarely proceed to canonical babbling. During the typical reduplicative babbling stage, the vocal activity of deaf children decreases dramatically. This decrease indicates that a lack of auditory feedback significantly inhibits deaf children's vocal development (Chamberlain, 1999). [5] It is important to note that while auditory feedback has been shown to be crucial to the development and self-monitoring of spoken language, the role of visual feedback in the development of signed language has yet to be explored (Cutler, 2017). [6]

Furthermore, there is evidence that manual babbling resembles the vocal babbling of hearing children and allows for further communication through sign. As the deaf infant develops, the cyclical nature of their manual babbling increases, leading to the production of different hand shapes used in sign language, such as the 5 hand, C hand, and S hand. These hand shapes become more significant if a caretaker receives the infant's babbling and reinforces it using child-directed signing (i.e., sign motherese). Child-directed signing is similar to vocal motherese ("baby talk"). A caretaker's reinforcement conveys to a deaf child that they are producing sign, in a way analogous to how hearing parents reinforce infants learning spoken language. [7] [8] [5]

Petitto and Marentette found that children learning sign language may produce their first sign around 8 to 10 months old, while hearing children typically produce their first words around 12 to 13 months old. [9] Despite this slight difference in the onset of language, very few differences have been found between deaf and hearing children in the acquisition of vocabulary and language. Both hearing and deaf children produce babbles in rhythmic, temporally oscillating bundles that are syllabically organized and share phonological properties with the language of fluent adults. In fact, manual babbling is "characterized by identical timing, patterning, structure, and use" when compared with vocal babbling (Chamberlain, 1999, p. 18). [5]

Summary Table: Vocal Babbling vs. Manual Babbling

| Vocal babbling | Manual babbling |
| --- | --- |
| Meaningless phonological productions that conform to broad rules for syllable structure of spoken language | Meaningless phonological productions that conform to broad rules for syllable structure of signed language |
| Begins between 6 and 9 months old | Begins between 6 and 9 months old |
| First spoken words around 12–13 months | First signs around 8–10 months |
| Produced in rhythmic, temporally oscillating bundles | Produced in rhythmic, temporally oscillating bundles |
| Shares phonological properties with spoken language | Shares phonological properties with signed language |
| Produced only by hearing children (dependent upon auditory feedback) | Produced by both deaf and hearing children (40–70% of manual activity in deaf children; less than 10% in hearing children) |
| Improved by vocal motherese | Improved by sign motherese |
| Triggered by patterned input of a spoken linguistic environment | Triggered by patterned input of a signed linguistic environment |
| Reinforced by auditory feedback | Role of visual feedback in manual babbling and sign language acquisition is currently unknown |

Supporting evidence

In a study by Adrianne Cheek, Kearsy Cormier, Christian Rathmann, Ann Repp, and Richard Meier, the authors found similarities between babbles and first signs. Their analysis of the properties of babbles and signs showed that all infants produced a relaxed hand with all fingers extended more often than any other hand shape; the same held for deaf infants' first signs. Infants also displayed downward movements more often in babbles and signs than any other movement category. [10] Finally, babies demonstrated a preference for one-handed babbles over two-handed ones. Deaf babies maintained this preference by producing more one-handed signs than two-handed signs. For palm orientation, subjects predominantly babbled or signed with palms down. [10]

Kearsy Cormier, Claude Mauk, and Ann Repp conducted an observational study of the natural behaviors of hearing and deaf infants. They utilized a global approach to manual babbling in their coding, as suggested by Meier and Willerman. The two goals of their study were: "(1) Specify the time course of manual babbling in deaf and hearing infants; and (2) Examine the relationship between manual babbling and the onset of communicative gestures" (Cormier, 1998, p. 57). The results support the predictions and claims made previously by Meier and Willerman, who found that most early gestural behavior is a result of motor development in addition to linguistic influences. Moreover, deaf children usually produced more referential gestures, specifically referential pointing, than hearing children, which could be the result of their distinct linguistic environments. For example, pointing becomes essential for deaf children learning sign language; while hearing children also engage in pointing behavior, for them it remains an added gesture that is not required by the spoken language. The study concludes that, while a child's early communicative gestures are partially determined by linguistic environment, manual babbling is mainly influenced by motor development, which occurs in both deaf and hearing children. [11] This finding differs from the initial research conducted by Petitto and Marentette, which found that manual babbling is mainly influenced by linguistic development. Petitto addressed this in a subsequent study and concluded that the differences were derived from the different coding methods used: Meier and Willerman's method used a more general definition of manual babbling than Petitto and Marentette had developed (Petitto, 2004). [12]

Related Research Articles

Language acquisition is the process by which humans acquire the capacity to perceive and comprehend language. In other words, it is how human beings gain the ability to be aware of language, to understand it, and to produce and use words and sentences to communicate.

<span class="mw-page-title-main">Sign language</span> Language that uses manual communication and body language to convey meaning

Sign languages are languages that use the visual-manual modality to convey meaning, instead of spoken words. Sign languages are expressed through manual articulation in combination with non-manual markers. Sign languages are full-fledged natural languages with their own grammar and lexicon. Sign languages are not universal and are usually not mutually intelligible, although there are also similarities among different sign languages.

<span class="mw-page-title-main">Baby sign language</span> Signed language systems used with hearing infants/toddlers

Baby sign language is the use of manual signing allowing infants and toddlers to communicate emotions, desires, and objects prior to spoken language development. With guidance and encouragement, signing develops from a natural stage in infant development known as gesture. These gestures are taught in conjunction with speech to hearing children, and are not the same as a sign language. Some common benefits that have been found through the use of baby sign programs include an increased parent-child bond and communication, decreased frustration, and improved self-esteem for both the parent and child. Researchers have found that baby sign neither benefits nor harms the language development of infants. Promotional products and ease of information access have increased the attention that baby sign receives, making it pertinent that caregivers become educated before making the decision to use baby sign.

<span class="mw-page-title-main">Gesture</span> Form of non-verbal/non-vocal communication

A gesture is a form of non-verbal communication or non-vocal communication in which visible bodily actions communicate particular messages, either in place of, or in conjunction with, speech. Gestures include movement of the hands, face, or other parts of the body. Gestures differ from physical non-verbal communication that does not communicate specific messages, such as purely expressive displays, proxemics, or displays of joint attention. Gestures allow individuals to communicate a variety of feelings and thoughts, from contempt and hostility to approval and affection, often together with body language in addition to words when they speak. Gesticulation and speech work independently of each other, but join to provide emphasis and meaning.

<span class="mw-page-title-main">Babbling</span> Stage in child development and language acquisition

Babbling is a stage in child development and a state in language acquisition during which an infant appears to be experimenting with uttering articulate sounds, but does not yet produce any recognizable words. Babbling begins shortly after birth and progresses through several stages as the infant's repertoire of sounds expands and vocalizations become more speech-like. Infants typically begin to produce recognizable words when they are around 12 months of age, though babbling may continue for some time afterward.

<span class="mw-page-title-main">Great ape language</span> Efforts to teach non-human primates to communicate with humans

Research into great ape language has involved teaching chimpanzees, bonobos, gorillas and orangutans to communicate with humans and each other using sign language, physical tokens, lexigrams, and imitative human speech. Some primatologists argue that the use of these communication methods indicate primate "language" ability, though this depends on one's definition of language. The cognitive tradeoff hypothesis suggests that human language skills evolved at the expense of the short-term and working memory capabilities observed in other hominids.

A language delay is a language disorder in which a child fails to develop language abilities at the usual age-appropriate period in their developmental timetable. It is most commonly seen in children ages two to seven and can continue into adulthood. The reported prevalence of language delay ranges from 2.3 to 19 percent.

Home sign is a gestural communication system, often invented spontaneously by a deaf child who lacks accessible linguistic input. Home sign systems often arise in families where a deaf child is raised by hearing parents and is isolated from the Deaf community. Because such children do not receive signed or spoken language input, they are referred to as linguistically isolated.

Developmental linguistics is the study of the development of linguistic ability in an individual, particularly the acquisition of language in childhood. It involves research into the different stages in language acquisition, language retention, and language loss in both first and second languages, in addition to the area of bilingualism. Before infants can speak, the neural circuits in their brains are constantly being influenced by exposure to language. Developmental linguistics supports the idea that linguistic analysis is not timeless, as claimed in other approaches, but time-sensitive, and is not autonomous – social-communicative as well as bio-neurological aspects have to be taken into account in determining the causes of linguistic developments.

Gestures in language acquisition are a form of non-verbal communication involving movements of the hands, arms, and/or other parts of the body. Children can use gesture to communicate before they have the ability to use spoken words and phrases. In this way gestures can prepare children to learn a spoken language, creating a bridge from pre-verbal communication to speech. The onset of gesture has also been shown to predict and facilitate children's spoken language acquisition. Once children begin to use spoken words their gestures can be used in conjunction with these words to form phrases and eventually to express thoughts and complement vocalized ideas.

<span class="mw-page-title-main">Laura-Ann Petitto</span> American psychologist and neuroscientist (born c. 1954)

Laura-Ann Petitto is a cognitive neuroscientist and developmental cognitive neuroscientist known for her research and scientific discoveries involving the language capacity of chimpanzees; the biological bases of language in humans, especially early language acquisition; early reading; and bilingualism, bilingual reading, and the bilingual brain. Her significant scientific discoveries include the existence of linguistic babbling on the hands of deaf babies and the equivalent neural processing of signed and spoken languages in the human brain. She is recognized for her contributions to the creation of the new scientific discipline of educational neuroscience. Petitto chaired a new undergraduate department at Dartmouth College called "Educational Neuroscience and Human Development" (2002–2007), and was a Co-Principal Investigator in the National Science Foundation and Dartmouth's Science of Learning Center, called the "Center for Cognitive and Educational Neuroscience" (2004–2007). At Gallaudet University (2011–present), Petitto led a team in the creation of the first PhD in Educational Neuroscience program in the United States. Petitto is the Co-Principal Investigator as well as Science Director of the National Science Foundation and Gallaudet University's Science of Learning Center, called the "Visual Language and Visual Learning Center (VL2)". She is also founder and Scientific Director of the Brain and Language Laboratory for Neuroimaging ("BL2") at Gallaudet University.

John D. Bonvillian (1948–2018) was a psychologist and associate professor emeritus in the Department of Psychology and the Interdepartmental Program in Linguistics at the University of Virginia in Charlottesville, Virginia. He was the principal developer of Simplified Signs, a manual sign communication system designed to be easy to form, easy to understand, and easy to remember. He is also known for his research contributions to the study of sign language, child development, psycholinguistics, and language acquisition.

<span class="mw-page-title-main">John L. Locke</span> American biolinguist

John L. Locke is an American biolinguist who has contributed to the understanding of language development and the evolution of language. His work has focused on how language emerges in the social context of interaction between infants, children, and caregivers; how speech and language disorders can shed light on the normal developmental process and vice versa; how brain and cognitive science can help illuminate language capability and learning; and how the special life history of humans offers perspectives on why humans are so much more intensely social and vocally communicative than their primate relatives. More recently, he has authored widely accessible volumes designed for the general public on the nature of human communication and its origins.

Prelingual deafness refers to deafness that occurs before learning speech or language. Speech and language typically begin to develop very early with infants saying their first words by age one. Therefore, prelingual deafness is considered to occur before the age of one, where a baby is either born deaf or loses hearing before the age of one. This hearing loss may occur for a variety of reasons and impacts cognitive, social, and language development.

In sign languages, the term classifier construction refers to a morphological system that can express events and states. They use handshape classifiers to represent movement, location, and shape. Classifiers differ from signs in their morphology, namely that signs consist of a single morpheme. Signs are composed of three meaningless phonological features: handshape, location, and movement. Classifiers, on the other hand, consist of many morphemes. Specifically, the handshape, location, and movement are all meaningful on their own. The handshape represents an entity and the hand's movement iconically represents the movement of that entity. The relative location of multiple entities can be represented iconically in two-handed constructions.

Language acquisition is a natural process in which infants and children develop proficiency in the first language or languages that they are exposed to. The process of language acquisition is varied among deaf children. Deaf children born to deaf parents are typically exposed to a sign language at birth, and their language acquisition follows a typical developmental timeline. However, at least 90% of deaf children are born to hearing parents who use a spoken language at home. Hearing loss prevents many deaf children from hearing spoken language to the degree necessary for language acquisition. For many deaf children, language acquisition is delayed until the time that they are exposed to a sign language or until they begin using amplification devices such as hearing aids or cochlear implants. Deaf children who experience delayed language acquisition, sometimes called language deprivation, are at risk for lower language and cognitive outcomes. However, profoundly deaf children who receive cochlear implants and auditory habilitation early in life often achieve expressive and receptive language skills within the norms of their hearing peers; age at implantation is strongly and positively correlated with speech recognition ability. Early access to language, whether through signed language or technology, has been shown to prepare children who are deaf to achieve fluency in literacy skills.

Language deprivation in deaf and hard-of-hearing children is a delay in language development that occurs when sufficient exposure to language, spoken or signed, is not provided in the first few years of a deaf or hard-of-hearing child's life, often called the critical or sensitive period. Early intervention, parental involvement, and other resources all work to prevent language deprivation. Children who experience limited access to language, spoken or signed, may not develop the necessary skills to successfully assimilate into the academic learning environment. There are various educational approaches for teaching deaf and hard-of-hearing individuals. Decisions about language instruction depend upon a number of factors, including the extent of hearing loss, the availability of programs, and family dynamics.

Language exposure for children is the act of making language readily available and accessible during the critical period for language acquisition. Deaf and hard-of-hearing children, when compared to their hearing peers, tend to face more hardships when it comes to ensuring that they will receive accessible language during their formative years. Therefore, deaf and hard-of-hearing children are more likely to experience language deprivation, which causes cognitive delays. Early exposure to language enables the brain to fully develop cognitive and linguistic skills, as well as language fluency and comprehension, later in life. Hearing parents of deaf and hard-of-hearing children face unique barriers when it comes to providing language exposure for their children. Yet a great deal of research, advice, and support is available to parents of deaf and hard-of-hearing children who may not know where to start in providing language.

<span class="mw-page-title-main">Pointing</span> Gesture

Pointing is a gesture specifying a direction from a person's body, usually indicating a location, person, event, thing or idea. It typically is formed by extending the arm, hand, and index finger, although it may be functionally similar to other hand gestures. Types of pointing may be subdivided according to the intention of the person, as well as by the linguistic function it serves.

Jana Marie Iverson is a developmental psychologist known for her research on the development of gestures and motor skills in relation to communicative development. She has worked with various populations including children at high risk of autism spectrum disorder (ASD), blind individuals, and preterm infants. She is currently a professor of psychology at Boston University.

References

  1. Marschark, Marc; Spencer, Patricia Elizabeth (11 January 2011). The Oxford Handbook of Deaf Studies, Language, and Education. Oxford University Press. p. 230. ISBN 978-0-19-975098-6. Retrieved 13 April 2012.
  2. Petitto, L.; Marentette, P. (1991). "Babbling in the manual mode: evidence for the ontogeny of language" (PDF). Science. 251 (5000): 1493–1496. doi:10.1126/science.2006424. ISSN 0036-8075. PMID 2006424. S2CID 9812277. Archived from the original (PDF) on July 25, 2010. Retrieved April 13, 2012.
  3. Marschark, Marc; Spencer, Patricia Elizabeth. Oxford Handbook of Deaf Studies, Language, and Education. Oxford University Press, USA, 2003. 219–231.
  4. Cormier, Kearsy; Mauk, Claude; Repp, Ann (1998). "Manual Babbling in Deaf and Hearing Infants: A Longitudinal Study" (PDF). Proceedings of the Twenty-Ninth Annual Child Language Research Forum: 55–61. Retrieved 20 November 2015.
  5. Chamberlain, Charlene; Morford, Jill P.; Mayberry, Rachel I. Language Acquisition by Eye. Psychology Press, 1999. 14; 26; 41–48.
  6. Cutler, Anne. Twenty-First Century Psycholinguistics: Four Cornerstones. Routledge, 2017. 294–296.
  7. Seal, Brenda C.; DePaolis, Rory A. (2014-09-05). "Manual Activity and Onset of First Words in Babies Exposed and Not Exposed to Baby Signing". Sign Language Studies. 14 (4): 444–465. doi:10.1353/sls.2014.0015. ISSN 1533-6263. S2CID 144534523.
  8. Swanwick, Ruth. Issues in Deaf Education. Routledge, 2012. 59–61.
  9. Marentette, Paula F. “Babbling in Sign Language: Implications for Maturational Processes of Language in the Developing Brain.” McGill University, 1989.
  10. Cheek, Adrianne; Cormier, Kearsy; Rathmann, Christian; Repp, Ann; Meier, Richard (April 1998). "Motoric Constraints Link Manual Babbling and Early Signs". Infant Behavior and Development. 21: 340. doi:10.1016/s0163-6383(98)91553-3.
  11. Cormier, Kearsy; Mauk, Claude; Repp, Ann. "Manual Babbling in Deaf and Hearing Infants: A Longitudinal Study" (PDF). Proceedings of the Twenty-Ninth Annual Child Language Research Forum: 55–61.
  12. Petitto, Laura Ann; Holowka, Siobhan; Sergio, Lauren E.; Levy, Bronna; Ostry, David J. (August 2004). "Baby Hands That Move to the Rhythm of Language: Hearing Babies Acquiring Sign Languages Babble Silently on the Hands". Cognition. 93 (1): 43–73. doi:10.1016/j.cognition.2003.10.007.