Dynamic Syntax


Dynamic Syntax (DS) is a grammar formalism and linguistic theory whose overall aim is to explain the real-time processes of language understanding and production, and to describe linguistic structures as they unfold step by step over time. Under the DS approach, syntactic knowledge is understood as the ability to incrementally analyse the structure and content of spoken and written language in context and in real time. While it posits representations similar to those used in Combinatory Categorial Grammar (CCG), it builds those representations left to right, word by word. It thereby differs from other syntactic models, which generally abstract away from features of everyday conversation such as interruption, backtracking, and self-correction. It also differs from other approaches in that it does not postulate an independent level of syntactic structure over words.

History

DS emerged in the late 1990s and early 2000s through the work of Ruth Kempson, Ronnie Cann, Wilfried Meyer-Viol and Dov Gabbay. The first monograph-length work in the framework, Dynamic Syntax: The Flow of Language Understanding, was published in 2001. It was embedded in wider trends in linguistic thinking of the 20th century, especially in syntax, semantics, pragmatics and phonology. The Dynamics of Language (2005) by Ronnie Cann, Ruth Kempson and Lutz Marten followed on from the 2001 title and expanded the discussion and empirical coverage of the framework.

Subsequent years saw an expansion of the framework's empirical coverage to structures in Japanese, Korean, dialects of Modern Greek, Medieval Spanish and a variety of Bantu languages including Swahili, Rangi and siSwati. More recent work has explored how the framework can naturally be extended to model dialogue.

Theoretical assumptions

While most grammar formalisms characterise properties of strings of words, Dynamic Syntax characterises propositional structure. Propositional structure is modelled by means of binary semantic trees and is built up in a strictly incremental manner on a left-to-right basis, represented through processes of tree growth. Under this framework, syntactic knowledge is taken to be the ability to parse/process strings in context. [1] :ix A consequence of this is that it is not just the final tree that matters for representational purposes, but all of the intermediate stages of the parsing/production process. Similarly, since the trees represent propositions, the same tree represents the same proposition across different languages, even though the sentences expressing it differ.
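As a rough illustration of this incremental perspective, the following Python sketch models a parse as a sequence of partial semantic states, one per word. It is a toy under simplifying assumptions; the two-word lexicon and the function parse_incrementally are invented for illustration and are not part of the DS formalism itself.

    # A minimal sketch of word-by-word incremental analysis, assuming a toy
    # lexicon of typed formulae; all names here are illustrative, not DS primitives.
    lexicon = {
        "john": ("e", "John'"),          # a type-e term
        "runs": ("e>t", "λx.Run'(x)"),   # a one-place predicate
    }

    def parse_incrementally(words):
        """Yield the growing semantic state after each word is processed."""
        state = ["?Ty(t)"]               # the initial requirement: build a proposition
        for word in words:
            typ, formula = lexicon[word]
            state.append(f"{formula} : Ty({typ})")
            yield word, list(state)

    for word, state in parse_incrementally(["john", "runs"]):
        print(word, "->", state)

Each intermediate state, not just the final one, counts as a representation on the DS view.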

The framework assumes a single level of representation. This contrasts with frameworks such as Lexical Functional Grammar, in which multiple structures are posited. Similarly, no movement operations are considered necessary, unlike in Minimalism and other generative approaches.

Parts of the formalism

Dynamic Syntax comprises several core components: semantic formulae and a composition calculus (epsilon calculus embedded within a typed lambda calculus), trees (which order lambda applications), and tree-building actions (lexical and computational actions).

Semantic formulae and compositional calculus

The semantic formulae which classical Dynamic Syntax generates are a combination of Epsilon calculus formulae and Lambda calculus terms. In recent years, DS-TTR has been developed alongside DS; it uses Record Types from the formalism Type Theory with Records (TTR) – see Purver et al. (2011). [2]

The formulae are either simple first-order logic constants such as John', predicate terms such as Run'(John'), or functions such as λx.Run'(x). Normal lambda calculus substitution (β-reduction) means a function can be applied to a simple term to return a predicate, such that (λx.Run'(x))(John') = Run'(John'). The Epsilon calculus extension to first-order logic is implemented in quantifiers, where ∃x.φ(x) is equivalent to φ(εx.φ(x)); e.g. the string "a boy" may result in the formula (ε, x, Boy'(x)) being generated.
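The following Python fragment sketches this compositional step with naive string substitution. It is an assumption-laden toy (the helper beta_reduce is invented for illustration), not the DS implementation.

    # A toy β-reduction by textual substitution, plus an epsilon term for an
    # indefinite; beta_reduce is an illustrative helper, not a DS primitive.
    def beta_reduce(var, body, arg):
        """Apply λvar.body to arg: (λx.Run'(x))(John') reduces to Run'(John')."""
        return body.replace(var, arg)

    print(beta_reduce("x", "Run'(x)", "John'"))   # -> Run'(John')

    # The indefinite "a boy" as an epsilon term: a witness for ∃x.Boy'(x).
    a_boy = "(ε, x, Boy'(x))"
    print(a_boy)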

Tree growth

One of the basic assumptions behind DS is that natural language syntax can be seen as the progressive accumulation of transparent semantic representations, with the overall goal being the construction of a logical propositional formula (a formula of type t). This process proceeds by monotonic tree growth, modelling the way information is processed in a time-linear, incremental, word-by-word manner. Tree growth is driven by requirements (indicated by a question mark, ?). [3]

Tree growth can take place in three ways: through computational rules, lexical input and pragmatic enrichment.

Computational rules map an input tree state to an output tree state and are considered to be universally available across languages. Given the right input, the corresponding computational rule can – although need not – apply. This contrasts with lexical actions, which are supplied by individual lexical items and are therefore language-specific. A schematic example of a computational rule is sketched below.
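The sketch below renders a computational rule in Python as a partial function from tree states to tree states, under a minimal dict-based encoding. The rule shown loosely mirrors the DS Introduction rule; the encoding and all names are illustrative assumptions, not the official formalism.

    # A computational rule as input -> output over a toy tree state; the
    # encoding (a dict from node addresses to label sets) is an assumption.
    def introduction(tree):
        """If the pointed node carries ?Ty(t), add daughters requiring a
        subject (?Ty(e)) and a predicate (?Ty(e>t))."""
        node = tree["pointer"]
        if "?Ty(t)" in tree["nodes"][node]:
            tree["nodes"][node + "0"] = {"?Ty(e)"}     # argument daughter
            tree["nodes"][node + "1"] = {"?Ty(e>t)"}   # functor daughter
        return tree

    state = {"pointer": "0", "nodes": {"0": {"?Ty(t)"}}}
    print(introduction(state)["nodes"])
    # {'0': {'?Ty(t)'}, '00': {'?Ty(e)'}, '01': {'?Ty(e>t)'}}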

The language of trees

The language of representation in Dynamic Syntax consists of binary trees. These trees are underpinned by the Logic of Finite Trees (LOFT; Blackburn & Meyer-Viol 1994). LOFT is an expressive modal language that allows statements to be made about any tree node from the perspective of any other tree node. It uses two basic tree modalities, the up-arrow and down-arrow relations, which correspond to the mother and daughter relations respectively. Left daughters are addressed as 0 nodes and right daughters as 1 nodes. By convention, nodes on the left are argument nodes, i.e. nodes in which arguments are represented, whereas nodes on the right are functor nodes, i.e. nodes in which the various types of predicates are represented. The root node is given the tree-node address 0 and is defined as the sole node that has no mother node. [3]
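The following Python snippet sketches these addresses and modalities over a toy decorated tree. The helpers down and up are illustrative stand-ins for the LOFT modalities ⟨↓⟩ and ⟨↑⟩, not an official implementation.

    # Node addresses as 0/1 strings, root "0"; left daughter appends 0,
    # right daughter appends 1, matching the conventions described above.
    tree = {"0": "Ty(t)", "00": "Ty(e)", "01": "Ty(e>t)"}

    def down(addr, i):               # ⟨↓i⟩: move to daughter i
        return addr + str(i)

    def up(addr):                    # ⟨↑⟩: move to the mother node
        return addr[:-1] if len(addr) > 1 else None

    # ⟨↓0⟩Ty(e) holds at the root: its argument daughter is of type e.
    print(tree[down("0", 0)] == "Ty(e)")    # True
    # ⟨↑⟩Ty(t) holds at node 01: its mother is of type t.
    print(tree[up("01")] == "Ty(t)")        # True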

Conference series

The first Dynamic Syntax conference was held at SOAS University of London in April 2017. [4] Prior to this there had been a meeting of mainly Dynamic Syntax practitioners at Ghent University in Belgium. The second Dynamic Syntax conference was held at the University of Edinburgh in 2018. [5] The third was held at the University of Malta in May 2019. [6] The fourth was originally planned to be held at the University of Oxford in May 2020 but was postponed and converted into an online event held on June 1, 2021. [7] [8]

A Dynamic Syntax course was held at ESSLLI in 2019. A PhD course was scheduled to take place at the University of Bergen in May 2020 but was converted into an online course that took place in 2021. [9] [10] Dynamic Syntax has been taught at institutions around the world, including SOAS, King's College London and the University of Essex in the UK, as well as at institutions in China. Abralin also offered a course in Dynamic Syntax in 2022. [11]


References

  1. Cann, Ronnie; Kempson, Ruth; Marten, Lutz (2005). The Dynamics of Language: An Introduction. Amsterdam: Elsevier. ISBN 978-1-84950-873-5.
  2. Purver, M., Eshghi, A., & Hough, J. (2011, January). Incremental semantic construction in a dialogue system. In Proceedings of the Ninth International Conference on Computational Semantics (pp. 365-369). Association for Computational Linguistics.
  3. Chatzikyriakidis, Stergios; Gibson, Hannah (3 February 2017). "The Bantu-Romance-Greek connection revisited: Processing constraints in auxiliary and clitic placement from a cross-linguistic perspective". Glossa: A Journal of General Linguistics. 2 (1): 4. doi:10.5334/gjgl.135. Material was copied from this source, which is available under a Creative Commons Attribution 4.0 International License.
  4. "The first Dynamic Syntax conference | SOAS University of London". www.soas.ac.uk. Retrieved 9 May 2022.
  5. The Second Dynamic Syntax Conference, archived from the original on 2021-02-27, retrieved 2022-05-09
  6. "The 3rd Dynamic Syntax Conference". sites.google.com. Retrieved 9 May 2022.
  7. "Dynamic Syntax Day, June 1st 2021". www.dynamicsyntax.org.
  8. "POSTPONED Fourth Dynamic Syntax Conference, Oxford, May 22-23 2020". www.dynamicsyntax.org. Retrieved 9 May 2022.
  9. "CANCELLED PhD course on Dynamic Syntax in Bergen (May 4-7 2020)". www.dynamicsyntax.org. Retrieved 9 May 2022.
  10. "PhD course on Dynamic Syntax via Zoom from May 21- June 1, 2021". www.dynamicsyntax.org. Retrieved 9 May 2022.
  11. "ABRALIN EAD". ead.abralin.org. Retrieved 9 May 2022.
