Signalong

Signalong is an alternative and augmentative key-word signing communication method used by individuals with speech, language and communication needs. The Signalong methodology has been used effectively with individuals who have cognitive impairments, autism, Down's syndrome, specific language impairment, multisensory impairment, and acquired neurological disorders that have negatively affected the ability to communicate, including stroke patients, and with people who speak English as a second or third language.

The name "Signalong" derives from the principle that, wherever possible, the sign is accompanied by speech: you "sign along with speech".[citation needed] The programme was devised in 1991 by Gill Kennard, a language teacher; Linda Hall, a science teacher who produced the illustrations; and Thelma Grove, a speech and language therapist from the Royal College of Speech & Language Therapists.[citation needed]

Signalong is a registered trade mark of The Signalong Group, a charity established in 1994. The trademark application was filed in the UK on 30 April 2001 and registered, effective from that date, under UK trademark registration no. 2268715.[citation needed]

Programme

Signalong uses a total communication approach to teach language and literacy skills through speech, signs and descriptions at the level appropriate to the child's or adult's needs. Because its methodology describes every sign in terms of handshape, orientation, placement and movement, idiosyncratic signs can be decoded and recorded in a consistent, transferable format.

Signalong consists of a Core Vocabulary of approximately 1,726 concepts, which is not taught in a specific order. Vocabulary should be introduced as required and, where possible, with real objects and in real situations to help reinforce the link between the sign and the spoken word.

Although Signalong is a key-word signing system, once initial communication has been established the learner can develop their language to two-, three- or four-word level as appropriate. Beyond the Core Vocabulary, a further 7,000 concepts are available.

Development

In 1983 Kent County Council adopted the Derbyshire Language Scheme (DLS), a flexible framework following typical language development and used to help children develop their language skills. The DLS vocabulary is based on research into the types of objects and activities children experience as they develop. In 1991, because existing signing systems lacked the necessary vocabulary, Kennard and Grove developed vocabulary covering the single-word level of the DLS together with vocabulary requested by parents and carers at Abbey Court School.

It has remained one of Signalong's most dearly held principles that the vocabulary should be led by the needs of the user of the resource and not dictated by others.

The first “pilot” copies of Phase 1 were published in April 1992, and as word of Signalong's existence spread, the team came under heavy pressure to produce signs for sex education. Although they were already working on Phase 2, this was set aside and, in June 1993, “Personal and Social Education” became the next manual to be published. This brought Signalong to the attention of a much wider user group, the vocabulary being seen as particularly relevant to older learners and adults, and in 1994 Signalong achieved charitable status. Signalong has published over 70 titles to date.

Signalong resources are designed to be self-explanatory and accessible; however, by popular demand, the first Signalong training in sign-supported communication was offered in May 1992, followed by tutor training in 1995.

Use

Signalong is based on British Sign Language, adapted for the needs and abilities of children and adults with verbal communication difficulties. It uses one sign per concept and one concept per sign. Signalong is a sign-supporting system used in spoken word order, taking a total communication approach to reinforce the links between signs and words. It signs only the key words, i.e. the essential words in any sentence, and uses signs at the partner's level and moderated language to ensure the message is understood. It is best to start with real objects and real experiences, and to generalise concepts before moving on to more abstract representations. Vocabulary is needs-led, and feedback from users helps Signalong decide where to direct vocabulary research.

When a sign has been selected, a description is worked out. This follows a consistent method of four elements: handshape, how the hands are formed; orientation, how the hands are held; placement, where the hands are held; and movement, any changes in the first three elements.

Signalong is used extensively throughout the UK and has also been adapted for use in other countries, including France, Germany, Indonesia, Italy, and Romania.[citation needed]

Criticism

Signalong has been critiqued by some members of the Deaf community, including the British Deaf Association. In a 2022 statement, the British Deaf Association expressed serious concerns about the growth of social media posts promoting "language programmes" that incorporate sign, such as Signalong and Makaton.[1] The statement argued that schools and nurseries teaching children Signalong "give the misleading impression that they are teaching these children something useful, a skill for life" and emphasised the limited nature of Signalong, suggesting that it would make more sense to teach children British Sign Language.[1] The National Deaf Children's Society points out that sign systems such as Signalong are designed to support speech, and that communication between BSL users and people using sign systems can be very difficult.[2]

References

  1. "BDA statement on "sign systems" and the oppression of BSL". British Deaf Association. 13 June 2022. Retrieved 5 December 2024.
  2. "What is a sign system?". National Deaf Children's Society. Retrieved 5 December 2024.