Arc pair grammar

In linguistics, arc pair grammar (APG) is a theory of syntax that aims to formalize and expand upon relational grammar. It primarily builds upon the relational grammar concept of an arc, but also makes use of more formally stated ideas from model theory and graph theory. It was developed in the late 1970s by David E. Johnson and Paul Postal, and formalized in 1980 in the eponymous book Arc Pair Grammar.

History

Early syntactic theory concerned itself primarily with grammatical relations. This concern was largely abandoned by proponents of transformational grammar, except in semantic interpretation.[1] In the early 1970s, some linguists, such as Edward Keenan, began to challenge this neglect from within the transformationalist perspective, noting for instance the formation of relative clauses in Malagasy[1] and English passivization (see chômeur). Relational grammar (RG) itself was never formalized in one place; instead, Keenan, Johnson, and others set out aspects of the framework in a series of dissertations around this time. Dissatisfied with the results and the lack of formalization in RG, David Johnson and Paul Postal attempted to lay down a version of it using mathematical logic. This attempt grew into a new theory, now known as APG. APG itself was developed by Johnson and Postal in the late 1970s, but was not published until 1980.[2] APG takes grammatical relations, the graph-theoretic notion of an arc, and two operations (Sponsor and Erase) as primitives, with all other rules being derived (many of them mathematically, rather than empirically).

Postulates

In contrast to the generative-enumerative (proof-theoretic) approach to syntax assumed by transformational grammar, arc pair grammar takes a model-theoretic approach. In arc pair grammar, linguistic laws and language-specific rules of grammar are formalized as axiomatic logical statements. The sentences of a language, understood as structures of a certain type, are those structures that satisfy the set of linguistic laws and language-specific statements. This reduces grammaticality to the model-theoretic notion of satisfaction: a structure is grammatical if and only if it satisfies all of the applicable axioms.
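
The contrast can be illustrated with a small sketch in Python (the data layout and constraint names below are invented for illustration and are not taken from Johnson and Postal's axioms): rather than generating structures by applying rules, grammaticality is decided by checking whether a candidate structure satisfies every constraint.

```python
# Minimal sketch of grammaticality as constraint satisfaction. The
# structure encoding and the constraint names are illustrative
# assumptions, not definitions from Johnson & Postal (1980).

def every_arc_is_anchored(structure):
    """Each arc must connect some element (tail) to some head node."""
    return all(arc["head"] is not None and arc["tail"] is not None
               for arc in structure["arcs"])

def every_arc_is_labelled(structure):
    """Each arc must carry a grammatical-relation label (an R-sign)."""
    return all(arc.get("r_sign") for arc in structure["arcs"])

CONSTRAINTS = [every_arc_is_anchored, every_arc_is_labelled]

def is_grammatical(structure, constraints=CONSTRAINTS):
    # No derivation is performed: the structure either satisfies every
    # axiom (it is a model of the grammar) or it is ill-formed.
    return all(constraint(structure) for constraint in constraints)

toy = {"arcs": [{"r_sign": "1", "head": "c1", "tail": "Mary"},
                {"r_sign": "P", "head": "c1", "tail": "slept"}]}
print(is_grammatical(toy))  # True
```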

Pair Network and RLS-graphs

The pair network (PN) is the main means of representing sentences in APG. It is a mathematical model consisting of nodes (one for each word and one for the clause as a whole) and arcs connecting them, with operations acting on arcs rather than nodes. This is somewhat analogous to the TG idea of a tree, but with a few major differences. First, PNs are formally defined mathematical objects, whereas the trees of TG are not given a comparable formal definition. Second, the idea of branches in trees does not carry over: whereas branches attach nodes at varying levels of structure, with the overall configuration of branches determining grammatical roles, arcs encode grammatical roles to their heads, their only structural role being to attach words to their clause. Third, operations between arcs (i.e. intra-structural operations) create word order, whereas TG structure encodes word order at all levels.[2]
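
As a rough illustration (the tuple encoding, the choice of treating the clause node as the head of each arc, and the R-signs "1" for subject and "P" for predicate follow the informal description above rather than the book's formal definitions), the clause "Mary slept" can be written as a set of labelled arcs rather than as a branching tree:

```python
from collections import namedtuple

# Toy encoding of a pair-network fragment for "Mary slept".
# Arc = (R-sign, head, tail): the R-sign names the grammatical relation,
# the head is taken here to be the clause node the arc attaches to, and
# the tail is the word bearing the relation. This orientation and the
# R-signs "1" (subject) and "P" (predicate) are used for illustration.
Arc = namedtuple("Arc", ["r_sign", "head", "tail"])

nodes = {"c1", "Mary", "slept"}     # one node per word, plus the clause node c1
arcs = {
    Arc("1", "c1", "Mary"),         # "Mary" bears the subject (1) relation in c1
    Arc("P", "c1", "slept"),        # "slept" bears the predicate relation in c1
}

# Grammatical roles are read directly off the arcs, with no appeal to
# tree configuration:
subjects = [a.tail for a in arcs if a.r_sign == "1"]
print(subjects)  # ['Mary']
```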

A pair network consists of four components: the 'relational graph,' the 'logical graph,' the 'surface graph' (the R-, L-, and S-graphs), and the two operations Sponsor and Erase. The R-graph is simply the set of all items in the pair network, i.e., the structure as a whole of all arcs, labels (R-signs), and operations between them. The S-graph consists of those members of the R-graph which are actually spoken. Single phrasal elements and words are treated as having a single root for the purposes of the S-graph, although the APG framework is in principle applicable to lexical entries as well.[2] The L-graph represents the semantics of a pair network and the logical relations between its elements. L-graphs also include 'logical arcs', which are precisely those arcs in the R-graph that terminate in nodes labeled with logical and semantic relations.
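
One way to sketch how these components fit together (an object-oriented simplification with invented field names; the L-graph is omitted, and the actual definitions are stated relationally rather than as a data structure) is:

```python
from dataclasses import dataclass, field

# Illustrative container for a pair network: the R-graph holds every arc,
# Sponsor and Erase are relations (sets of ordered arc pairs), and the
# S-graph is recovered as whatever the Erase relation leaves untouched.
# The L-graph is omitted for brevity; all field names are assumptions.
@dataclass
class PairNetwork:
    r_graph: set = field(default_factory=set)   # all arcs in the structure
    sponsor: set = field(default_factory=set)   # pairs (sponsoring_arc, sponsored_arc)
    erase: set = field(default_factory=set)     # pairs (erasing_arc, erased_arc)

    def s_graph(self):
        """Surface graph: the arcs of the R-graph that no arc erases."""
        erased = {b for (_, b) in self.erase}
        return self.r_graph - erased

pn = PairNetwork(r_graph={("1", "c1", "Mary"), ("P", "c1", "slept")})
print(pn.s_graph() == pn.r_graph)   # True: nothing is erased, so every arc is spoken
```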

Sponsor operations are used between levels in the R-graph to establish different linguistic states (that is, particular sets of grammatical relations). Generally speaking, lower levels sponsor higher levels, and higher levels erase lower levels. Sponsor can be broken into two cases: Replace and Succeed. Succeed is the more basic of the two: an arc A is the successor of another arc B if and only if B sponsors A, A and B overlap, and B ≠ A. That is, every arc that is sponsored by an arc other than itself is the successor of that arc. Replace, meanwhile, occurs exclusively between arcs that are neighbors (arcs that share a head but have distinct tails). As a consequence, Replace can only occur between arcs with an identical R-sign. Arcs that take part in Replace operations cannot also take part in Succeed operations.[2] Replace is thus significantly more restricted than Succeed. Although the two ultimately have the same effect of establishing sponsorship, Replace and Succeed are subject to different rules and laws.[2] The distinction helps define exactly when sponsorship can occur, and so, while not technically necessary, it is useful for brevity.
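
The two cases can be sketched as predicates over the toy arc encoding used above ("overlap", which is not defined in this article, is approximated here as sharing a tail, i.e. the same element bearing both relations; the book's definition differs in detail):

```python
# Toy versions of the two Sponsor cases, over arcs written as
# (r_sign, head, tail) tuples as in the sketches above. "Overlap" is
# approximated as sharing a tail (the same element bearing both
# relations); the book's definition differs in detail. The requirement
# that Replace pairs stay out of Succeed pairs is omitted for brevity.

def overlap(a, b):
    return a[2] == b[2]                       # same element (tail)

def neighbors(a, b):
    return a[1] == b[1] and a[2] != b[2]      # same head, distinct tails

def is_successor(a, b, sponsor):
    """a succeeds b: b sponsors a, a and b overlap, and a != b."""
    return (b, a) in sponsor and overlap(a, b) and a != b

def is_replacer(a, b, sponsor):
    """a replaces b: sponsorship between neighbors with the same R-sign."""
    return (b, a) in sponsor and neighbors(a, b) and a[0] == b[0]

# Succeed: a 2-arc sponsors an overlapping 1-arc for the same nominal.
obj_arc, subj_arc = ("2", "c1", "ball"), ("1", "c1", "ball")
print(is_successor(subj_arc, obj_arc, {(obj_arc, subj_arc)}))    # True

# Replace: two 1-arcs on distinct elements of the same clause
# (an invented scenario, purely for illustration).
orig_arc, dummy_arc = ("1", "c2", "that-clause"), ("1", "c2", "it")
print(is_replacer(dummy_arc, orig_arc, {(orig_arc, dummy_arc)}))  # True
```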

Erase operations occur between arcs when it becomes necessary to specify which linguistic level is phonologically attested. Where two arcs share the same grammatical relation to the same root node, the one at the surface level erases the one at lower levels. Successors always erase their predecessors, except in one case.
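
A worked toy example, continuing the encoding above and using a passive-style advancement of a 2-arc (direct object) to a 1-arc (subject) in the spirit of the relational grammar analyses mentioned earlier:

```python
# The 2-arc (direct object) sponsors an overlapping 1-arc (subject) for
# the same nominal; the successor then erases its predecessor, so only
# the 1-arc belongs to the S-graph and is phonologically attested.
a2 = ("2", "c1", "ball")        # initial level: "ball" as direct object
a1 = ("1", "c1", "ball")        # successor: "ball" advanced to subject

r_graph = {a1, a2}
sponsor = {(a2, a1)}            # predecessor sponsors successor
erase = {(a1, a2)}              # successor erases predecessor

def successors_erase_predecessors(sponsor, erase):
    """Check, for this toy network, that every successor erases its sponsor."""
    return all((sponsored, sponsoring) in erase
               for (sponsoring, sponsored) in sponsor)

surface = r_graph - {erased for (_, erased) in erase}
print(successors_erase_predecessors(sponsor, erase))   # True
print(surface)                                          # {('1', 'c1', 'ball')}
```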

Limitations

Very few syntacticians today would consider themselves practitioners of APG or its descendants. There are a few reasons for this. First, although it attempts to handle all aspects of language using pair networks and arc pairs, there is no suitable APG account of phonology.[2] Second, the complexity of an APG structure generally increases rapidly with sentence complexity. For instance, in sentences with to-complements, nodes in the complement bear arc relations with nodes outside of it, yielding structures that are mathematically explicit but difficult to follow.

References

  1. Johnson, David (1974). Toward a Theory of Relationally-Based Grammar. Doctoral dissertation, University of Illinois at Urbana-Champaign. ProQuest 302701744.
  2. Johnson, David; Postal, Paul (1980). Arc Pair Grammar. Princeton University Press. JSTOR j.ctt7ztvk9.

Further reading

  1. Postal, Paul M. (1982). "Some arc pair grammar descriptions". In P. Jacobson & G. K. Pullum (eds.), The Nature of Syntactic Representation (pp. 341–425). Dordrecht: D. Reidel. ISBN 978-90-277-1290-5.
  2. Newmeyer, Frederick (1980). Linguistic Theory in America. New York: Academic Press.
  3. Pullum, Geoffrey K. and Barbara C. Scholz (2005). "Contrasting applications of logic in natural language syntactic description". In Petr Hájek, Luis Valdés-Villanueva, and Dag Westerståhl (eds.), Logic, Methodology and Philosophy of Science: Proceedings of the Twelfth International Congress, pp. 481–503. ISBN 978-1-904987-21-5.
  4. Pullum, Geoffrey K. (2007). "The evolution of model-theoretic frameworks in linguistics". In Proceedings of the Model-Theoretic Syntax at 10 workshop, ESSLLI 2007, Trinity College Dublin.