Input Processing theory


The Input Processing theory, put forth by Bill VanPatten in 1993, [1] describes the strategies and mechanisms that learners use to link linguistic form with its meaning or function. [2] Input Processing is a theory in second language acquisition that focuses on how learners process linguistic data in spoken or written language. [3] [2]


The theory comprises two key principles, each with multiple sub-principles. [3] [2]

The first principle, the Primacy of Meaning Principle, has the following sub-principles: the Primacy of Content Words principle, the Lexical Preference principle, the Preference for Non-redundancy principle, the Meaning-Before-Non-Meaning principle, the Availability of Resources principle, and the Sentence Location principle.[citation needed]

The second principle, the First Noun Principle, has the following sub-principles: the Lexical Semantics principle, the Event Probabilities principle, and the Contextual Constraint principle.[citation needed]

The Input Processing theory has faced criticism. Opponents reject the ‘acquisition-by-comprehension’ claim, arguing that various processes may determine comprehension and production of language, [4] and there is disagreement about how to distinguish input from intake. [4] Some researchers claim that VanPatten's model ignores output. [4]

Overview

Input Processing (IP) was first proposed by VanPatten in 1993, [1] and the theory has since been updated several times. IP addresses how learners initially perceive and process linguistic data in spoken or written language. [3] [2] It describes the psycholinguistic strategies and mechanisms that learners use to derive intake from input, and asks which of these strategies second language (L2) learners tend to rely on during input processing. [3] [2] For example, it examines how learners extract form from input and how they assign grammatical roles to nouns while their primary attention is on meaning. [5]

In 2003, VanPatten proposed that IP consists of two sub-processes: (1) making form-meaning connections; and (2) parsing. [6] Making form-meaning connections refers to linking a form in the input, for example the -s suffix, to its meaning, in this case third person singular. [6] In an earlier version of the theory, making form-meaning connections comprised four principles: 1) the Primacy of Meaning principle; 2) the Availability of Resources principle; 3) the First Noun principle; and 4) the Sentence Location principle. [6]
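As a rough illustration of a form-meaning connection, the following Python sketch links the verbal -s suffix to the meaning "third person singular, present". The code is hypothetical and purely expository; the function name and return format are invented for the example and are not part of VanPatten's model.

```python
# Illustrative sketch only (hypothetical names, not part of VanPatten's model):
# a toy "form-meaning connection" linking the verbal -s suffix in the input
# to the grammatical meaning "third person singular, present".

def connect_form_to_meaning(verb_form: str) -> dict:
    """Pair a single verb form with a crude grammatical meaning."""
    if verb_form.endswith("s"):
        # The learner notices the -s ending and maps it onto its function.
        return {"form": verb_form, "meaning": "3rd person singular, present"}
    return {"form": verb_form, "meaning": "unmarked"}

print(connect_form_to_meaning("walks"))  # {'form': 'walks', 'meaning': '3rd person singular, present'}
print(connect_form_to_meaning("walk"))   # {'form': 'walk', 'meaning': 'unmarked'}
```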

By contrast, parsing refers to the mapping of syntactic structure onto the utterance, for example, how a listener determines which noun is the subject and which is the object when hearing a sentence. [6] Since 2003, VanPatten's theory [2] has been updated and modified.

Processing Instruction

Processing instruction is a particular type of pedagogical intervention, focused on form, that derives from insights on input processing. Unlike other techniques, it is not concerned with the teaching of rules but with the processing of morpho-lexical units in the input. Processing instruction consists of referential and affective activities that manipulate input in particular ways to push learners away from less-than-optimal processing strategies. To date, dozens of studies have examined a variety of factors and issues, and their findings consistently point to positive effects of processing instruction.

Key principles

VanPatten's modified theory [2] is explained in the form of two principles, each with multiple sub-principles.[citation needed]

The first principle is the Primacy of Meaning Principle. [7] This principle maintains that a learner processes input for meaning before they process input for form. It has six sub-principles: the Primacy of Content Words principle, the Lexical Preference principle, the Preference for Non-redundancy principle, the Meaning-Before-Non-Meaning principle, the Availability of Resources principle, and the Sentence Location principle.[citation needed]

The second principle is the First Noun Principle.[citation needed] This principle states that learners tend to process the first noun or pronoun they encounter in a sentence as the subject or agent.[citation needed] It has three sub-principles: the Lexical Semantics principle, the Event Probabilities principle, and the Contextual Constraint principle.[citation needed]
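As a rough illustration of the First Noun Principle treated as a processing heuristic, the following Python sketch assigns the agent role to the first noun or pronoun it encounters, regardless of the actual syntax. The lexicon and function name are hypothetical; the sketch is expository only and not part of VanPatten's formal apparatus.

```python
# Illustrative sketch only (hypothetical lexicon and function name): a naive
# parser that follows the First Noun Principle, assigning the agent/subject
# role to the first noun or pronoun it encounters, regardless of syntax.

NOUNS_AND_PRONOUNS = {"dog", "cat", "boy", "girl", "he", "she", "it"}

def first_noun_agent(sentence: str):
    """Return the word a learner applying this heuristic would treat as the agent."""
    for token in sentence.lower().rstrip(".!?").split():
        if token in NOUNS_AND_PRONOUNS:
            return token
    return None

# In this passive sentence the real agent is "dog", but the heuristic,
# like an early-stage learner, misassigns the agent role to "cat".
print(first_noun_agent("The cat was chased by the dog."))  # -> cat
```

On the passive sentence in the example, the heuristic picks "cat" as agent even though the actual agent is "dog", mirroring the misinterpretation the principle predicts for early-stage learners.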

Opposition

In earlier versions of the Input Processing theory, several researchers disagreed with VanPatten's [1] claims. These opponents do not accept the ‘acquisition-by-comprehension’ claim, since various processes may determine comprehension and production. [4] Salaberry states that the Input Processing theory blurs the distinction between input and intake, that is, between the language data the environment makes available to the learner and the portion of those data the learner actually takes in. [4] [8] Some researchers also claim that VanPatten's model ignores output. [4]

VanPatten's model does not exclude the role of output, but assigns it a different status from input in the process of language development. [4]


References

  1. VanPatten, Bill (1993). "Grammar Teaching for the Acquisition-Rich Classroom". Foreign Language Annals. 26 (4): 435–450. doi:10.1111/j.1944-9720.1993.tb01179.x. ISSN 1944-9720.
  2. ocul-crl.primo.exlibrisgroup.com. https://ocul-crl.primo.exlibrisgroup.com/discovery/fulldisplay?&context=PC&vid=01OCUL_CRL:CRL_DEFAULT&search_scope=MyInst_and_CI&tab=Everything&docid=cdi_proquest_miscellaneous_2131991457 . Retrieved 2020-12-01.
  3. VanPatten, Bill (1996). Input Processing and Grammar Instruction in Second Language Acquisition. Greenwood Publishing Group. ISBN 978-1-56750-237-4.
  4. Salaberry, M. (January 1998). "On Input Processing, True Language Competence, and Pedagogical Bandwagons: A Reply to Sanz and VanPatten". Canadian Modern Language Review. 54 (2): 274–285. doi:10.3138/cmlr.54.2.274. ISSN 0008-4506.
  5. Sanz, Cristina; VanPatten, Bill (January 1998). "On Input Processing, Processing Instruction, and the Nature of Replication Tasks: A Response to Salaberry". Canadian Modern Language Review. 54 (2): 263–272. doi:10.3138/cmlr.54.2.263. ISSN 0008-4506.
  6. VanPatten, B. (2003). From Input to Output: A Teacher's Guide to Second Language Acquisition. S2CID 60218470. Retrieved 2020-12-01.
  7. "The first principle is the primacy of learning". 29 August 2008.
  8. Acquiring Second Language (2012). "Difference between 'Intake' and 'Input' in L2 (Second language) learning". difference-between-intake-and-input-in-second-language-learning/.