Protactile

Protactile is a language used by DeafBlind people that relies on tactile rather than visual channels. Unlike sign languages, which depend heavily on visual information, protactile is oriented toward touch and is practiced on the body. It originated among DeafBlind people in Seattle in 2007 and incorporates signs from American Sign Language. Protactile is an emerging system of communication in the United States, with users relying on shared principles such as contact space, tactile imagery, and reciprocity.

History

In 2007, a group of three DeafBlind women working at the Deaf-Blind Service Center in Seattle, aj granda, Jelica Nuccio, and Jackie Engler, communicated with each other in American Sign Language (ASL) through interpreters.[1] Using ASL, the group either had to rely on interpreters to communicate simultaneously or was limited to two people conversing at a time (using hand-over-hand signing).[1] The three worked together to devise ways of talking with each other directly, using their sense of touch as the primary source of information.[2] They began inviting other DeafBlind people into their conversations and interacting using these new communication practices.[2]

In describing the origin of protactile, granda and Nuccio write:[1]

It happened organically. We didn't "invent" [protactile]. What we did was use our positions at the DeafBlind Service Center to set up programs and events that would put DeafBlind people in a teaching role more often. And then when practices started really changing, we created a politics around it. We labeled things, and tried to document what was happening.

Description

Protactile has emerged in communities of people who were born deaf, learned ASL as children, and then gradually lost their sight over decades, as is common in Usher syndrome.[3] Leaders and educators granda and Nuccio describe a "protactile movement" as giving the DeafBlind community a sense of belonging, with a language in DeafBlind people's preferred modality providing a remedy to the isolation imposed by hearing and sighted culture.[4] They describe protactile philosophy as supporting DeafBlind culture, relationships, and politics.[4] Helen Keller Services for the Blind describes protactile as "much more than a system of touch signals," calling it instead "a philosophy and a movement which focuses on autonomy and equality for people who are deaf-blind."[5]

In protactile, communication takes place through touch and movement, focused primarily on the hands, wrists, elbows, arms, upper back, and, when seated, the knees and tops of the thighs.[6] In formal instruction of protactile while sitting and facing a conversation partner, the "listening hand" rests on the other participant's thigh with the thumb, index finger, and pinky extended.[7] For example, several rapid taps on the thigh with all four fingers indicate "yes," whereas a rapid back-and-forth brushing movement with the fingers indicates "no."[7]

Tactile maps are used in protactile to communicate spatial information about the environment to the DeafBlind person.[6] A map can be drawn on the recipient's hand, arm, or back to describe surroundings or give directions.[6]

Instead of the "air space" used in visual sign languages, that is, the space around a signer's body, protactile is rooted in "contact space."[8] While ASL and other sign languages rely on handshape as one of the core components distinguishing one sign from another, in protactile the handshape is less important than the sensation received; for example, a series of tapped signs using different handshapes would all be received simply as taps, with the handshapes indistinguishable.[9]

Reciprocity

A significant innovation of protactile is the concept of reciprocity.[10] Communication partners are encouraged to use the same communication method (rather than using signed or spoken language alongside protactile) to ensure that vision is not unduly privileged.[1] Shared experience is a core principle of protactile, with tactile imagery evoking sensations in storytelling in the same way that facial expressions do in a conversation between sighted people.[1]

Serving the same function as body language or verbal acknowledgments (such as "mm-hmm" or "yeah"), tactile backchanneling allows for smoother protactile conversations. Tapping the partner's arm or leg during pauses, or as confirmation of understanding, provides a continuous loop of backchannel feedback.[6] Agreement, disagreement, laughter, and other responses are signaled using manual cues.[6] These cues are not standardized but are developed according to the needs of the individual and the specific situation.[5]

Education and impact

The DeafBlind Interpreting National Training and Resource Center was launched in 2017 as a resource for DeafBlind people.[11] The Center's staff work to train protactile interpreters; as DeafBlind author John Lee Clark writes, "instead of providing 'accurate and objective information' in a way that unsuccessfully attempts to create a replica of how they're experiencing the world, Protactile interpreters must be our informants, our partners, our accomplices."[11]

A grant from the National Science Foundation led to the creation of a hybrid learning environment for young DeafBlind children.[12] The DeafBlind Kids! website provides parents and caretakers with information about protactile concepts such as tactile exploration, backchanneling, and co-presence.[12]

Protactile communication fosters inclusion and autonomy by providing DeafBlind people with more information about their environment.[13] More robust communication leads to fewer misunderstandings and a greater sense of involvement and connection.[13]

References

  1. granda, aj; Nuccio, Jelica. "Protactile Principles" (PDF). World Association of Sign Language Interpreters. Tactile Communications. Retrieved February 12, 2022.
  2. Van Wing, Sage (January 5, 2022). "New Protactile language emerges in Oregon". Oregon Public Broadcasting. Retrieved February 12, 2022.
  3. Edwards, Terra; Brentari, Diane (2021). "The Grammatical Incorporation of Demonstratives in an Emerging Tactile Language". Frontiers in Psychology. 11: 579992. doi:10.3389/fpsyg.2020.579992. ISSN 1664-1078. PMC 7838441. PMID 33519599.
  4. granda, aj; Nuccio, Jelica (March 2016). "Pro-Tactile Vlog #5". Pro-Tactile: The DeafBlind Way. Retrieved February 12, 2022.
  5. "Touch Signals Terminology & Signs". Helen Keller Services. Retrieved February 12, 2022.
  6. Collins, Steven D. "Pro-Tactile: Empowering Deaf-Blind People" (PDF). Human Development Center. Louisiana State University Health Sciences Center New Orleans. Retrieved February 12, 2022.
  7. "Unit 2: Proper Hand Placement and Use". DeafBlind Kids. Retrieved February 12, 2022.
  8. Edwards, Terra; Brentari, Diane (2020). "Feeling Phonology: The conventionalization of phonology in protactile communities in the United States". DeafBlind Culture and Community. Retrieved February 12, 2022.
  9. Nuccio, Jelica; Clark, John Lee (2020). "Protactile Linguistics: Discussing recent research findings". Journal of American Sign Languages and Literatures. Retrieved February 12, 2022.
  10. Yeh, James (December 1, 2020). ""New kinds of contact": A DeafBlind poet's push for a radical language of touch". Inverse. Retrieved February 12, 2022.
  11. Clark, John Lee (2021). "Against Access". McSweeney's Quarterly Concern. 64. Retrieved February 12, 2022.
  12. "DeafBlind Kids!". DeafBlind Kids!. Retrieved February 12, 2022.
  13. "Q&A: How Pro-Tactile American Sign Language — PTASL — is changing the conversation". Perkins School for the Blind. October 2018. Retrieved February 12, 2022.