Symbol (formal)

[Figure: Formal languages.svg] This diagram shows the syntactic entities that may be constructed from formal languages. The symbols and strings of symbols may be broadly divided into nonsense and well-formed formulas. A formal language can be thought of as identical to the set of its well-formed formulas. The set of well-formed formulas may be broadly divided into theorems and non-theorems.

A logical symbol is a fundamental concept in logic, tokens of which may be marks or a configuration of marks which form a particular pattern.[citation needed] Although the term symbol in common use sometimes refers to the idea being symbolized, and at other times to the marks on a piece of paper or chalkboard being used to express that idea, in the formal languages studied in mathematics and logic the term symbol refers to the idea, and the marks are considered to be a token instance of the symbol.[dubious] In logic, symbols provide the basic notation used to express ideas.

Overview

Symbols of a formal language need not be symbols of anything. For instance, there are logical constants which do not refer to any idea but rather serve as a form of punctuation in the language (e.g., parentheses). Symbols of a formal language must be capable of being specified without any reference to any interpretation of them.

A symbol or string of symbols may constitute a well-formed formula if it conforms to the formation rules of the language.
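To make the formation rules concrete, here is a minimal sketch in Python of a recognizer for a toy propositional language; the particular grammar (variables p, q, r, a prefix negation sign, and fully parenthesized conjunction) is an assumption chosen for illustration, not a grammar the article prescribes.

```python
# A minimal sketch of formation rules for a toy propositional language.
# Grammar (an illustrative assumption, not taken from the article):
#   wff := "p" | "q" | "r"            (propositional variables)
#        | "~" wff                    (negation)
#        | "(" wff "&" wff ")"        (conjunction)

def is_wff(s: str) -> bool:
    """Return True if s conforms to the formation rules above."""
    ok, rest = _parse(s)
    return ok and rest == ""

def _parse(s: str):
    """Try to consume one wff from the front of s; return (success, remainder)."""
    if s[:1] in ("p", "q", "r"):
        return True, s[1:]
    if s[:1] == "~":
        return _parse(s[1:])
    if s[:1] == "(":
        ok, rest = _parse(s[1:])
        if ok and rest[:1] == "&":
            ok2, rest2 = _parse(rest[1:])
            if ok2 and rest2[:1] == ")":
                return True, rest2[1:]
    return False, s

assert is_wff("(p&~q)")      # well-formed
assert not is_wff("p&q)")    # nonsense: violates the formation rules
```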

In a formal system a symbol may be used as a token in formal operations. The set of formal symbols in a formal language is referred to as an alphabet (hence each symbol may be called a "letter").[1][page needed]

A formal symbol as used in first-order logic may be a variable (ranging over the universe of discourse), a constant (naming a fixed member of the universe), a function symbol (mapping members of the universe to a member of the universe), or a predicate symbol (mapping members of the universe to a truth value).
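The four kinds of first-order symbols can be modeled directly as a small datatype. The following Python sketch is illustrative only; the class names and the explicit arity fields are assumptions made for the example.

```python
from dataclasses import dataclass

# The four symbol kinds of a first-order signature, modeled as plain data.
# Names and arities here are assumptions made for illustration.

@dataclass(frozen=True)
class Variable:          # ranges over the universe of discourse
    name: str

@dataclass(frozen=True)
class Constant:          # names a fixed member of the universe
    name: str

@dataclass(frozen=True)
class FunctionSymbol:    # maps members of the universe to a member
    name: str
    arity: int

@dataclass(frozen=True)
class PredicateSymbol:   # maps members of the universe to true/false
    name: str
    arity: int

x = Variable("x")
zero = Constant("0")
succ = FunctionSymbol("succ", 1)
less_than = PredicateSymbol("<", 2)
```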

Formal symbols are usually thought of as purely syntactic structures, composed into larger structures using a formal grammar, though sometimes they may be associated with an interpretation or model (a formal semantics).
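Continuing the sketch above, an interpretation can be attached to those symbols after the fact, which illustrates the separation just described: the symbols themselves stay purely syntactic, and a model supplies their meaning. The domain and the particular mappings below are assumptions chosen for illustration.

```python
# An interpretation assigns meaning to the symbols defined above; the
# symbols themselves remain purely syntactic objects. The domain and
# mappings are toy assumptions: natural numbers 0..9.

domain = range(10)                                  # universe of discourse
interpret_constant = {zero: 0}
interpret_function = {succ: lambda n: n + 1}
interpret_predicate = {less_than: lambda a, b: a < b}

# Evaluate the atomic formula  succ(0) < succ(succ(0))  under this model.
v1 = interpret_function[succ](interpret_constant[zero])                          # 1
v2 = interpret_function[succ](interpret_function[succ](interpret_constant[zero]))  # 2
print(interpret_predicate[less_than](v1, v2))       # True
```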

Can words be modeled as formal symbols?

The move to view units of natural language (e.g., English) as formal symbols was initiated by Noam Chomsky; it was this work that led to the Chomsky hierarchy of formal languages. The generative grammar model treated syntax as autonomous from semantics. Building on these models, the logician Richard Montague proposed that semantics, too, could be constructed on top of the formal structure:

There is in my opinion no important theoretical difference between natural languages and the artificial languages of logicians; indeed, I consider it possible to comprehend the syntax and semantics of both kinds of language within a single natural and mathematically precise theory. On this point I differ from a number of philosophers, but agree, I believe, with Chomsky and his associates.[2][page needed]

This is the philosophical premise underlying Montague grammar.

However, this attempt to equate linguistic symbols with formal symbols has been widely challenged, particularly in the tradition of cognitive linguistics, by philosophers such as Stevan Harnad and linguists such as George Lakoff and Ronald Langacker.

Related Research Articles

<span class="mw-page-title-main">Formal language</span> Sequence of words formed by specific rules

In logic, mathematics, computer science, and linguistics, a formal language consists of words whose letters are taken from an alphabet and are well-formed according to a specific set of rules called a formal grammar.

The outline of linguistics provides an overview and topical guide to the field.

In linguistics, transformational grammar (TG) or transformational-generative grammar (TGG) was the earliest model of grammar proposed within the research tradition of generative grammar. Like current generative theories, it treated grammar as a system of formal rules that generate all and only grammatical sentences of a given language. What was distinctive about transformational grammar was that it posited transformation rules that mapped a sentence's deep structure to its pronounced form. For example, in many variants of transformational grammar, the English active voice sentence "Emma saw Daisy" and its passive counterpart "Daisy was seen by Emma" share a common deep structure generated by phrase structure rules, differing only in that the latter's structure is modified by a passivization transformation rule.

Metalogic is the metatheory of logic. Whereas logic studies how logical systems can be used to construct valid and sound arguments, metalogic studies the properties of logical systems. Logic concerns the truths that may be derived using a logical system; metalogic concerns the truths that may be derived about the languages and systems that are used to express truths.

A formal system is an abstract structure and formalization of an axiomatic system used for deducing theorems from axioms by a set of inference rules.
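As a toy illustration of that idea, the following Python sketch closes a set of axioms under a single inference rule, modus ponens; the string encoding of formulas and the particular axioms are assumptions made for the example.

```python
# A toy formal system: formulas are strings, axioms are given, and the
# only inference rule is modus ponens: from A and "A->B", derive B.
# The specific axioms below are assumptions chosen for illustration.

axioms = {"p", "p->q", "q->r"}

def theorems(axioms: set[str]) -> set[str]:
    """Close the axiom set under modus ponens."""
    derived = set(axioms)
    changed = True
    while changed:
        changed = False
        for formula in list(derived):
            if "->" in formula:
                antecedent, consequent = formula.split("->", 1)
                if antecedent in derived and consequent not in derived:
                    derived.add(consequent)
                    changed = True
    return derived

print(theorems(axioms))  # {'p', 'p->q', 'q->r', 'q', 'r'}
```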

<span class="mw-page-title-main">Generative grammar</span> Research tradition in linguistics

Generative grammar is a research tradition in linguistics that aims to explain the cognitive basis of language by formulating and testing explicit models of humans' subconscious grammatical knowledge. Generative linguists, or generativists, tend to share certain working assumptions such as the competence–performance distinction and the notion that some domain-specific aspects of grammar are partly innate in humans. These assumptions are rejected in non-generative approaches such as usage-based models of language. Generative linguistics includes work in core areas such as syntax, semantics, phonology, psycholinguistics, and language acquisition, with additional extensions to topics including biolinguistics and music cognition.

<span class="mw-page-title-main">Syntax (logic)</span> Rules used for constructing, or transforming the symbols and words of a language

In logic, syntax is anything having to do with formal languages or formal systems without regard to any interpretation or meaning given to them. Syntax is concerned with the rules used for constructing or transforming the symbols and words of a language, as contrasted with the semantics of a language, which is concerned with its meaning.

<span class="mw-page-title-main">Richard Montague</span> American mathematician (1930–1971)

Richard Merritt Montague was an American mathematician and philosopher who made contributions to mathematical logic and the philosophy of language. He is known for proposing Montague grammar to formalize the semantics of natural language. As a student of Alfred Tarski, he also contributed early developments to axiomatic set theory (ZFC). For the latter half of his life, he was a professor at the University of California, Los Angeles until his early death, believed to be a homicide, at age 40.

Montague grammar is an approach to natural language semantics, named after the American logician Richard Montague. It is based on mathematical logic, especially higher-order predicate logic and lambda calculus, and makes use of the notions of intensional logic via Kripke models. Montague pioneered this approach in the 1960s and early 1970s.
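A flavor of the lambda-calculus machinery can be shown in a few lines of Python. Treating a transitive verb as a curried function is the standard Montague-style move, but the toy denotations below are assumptions for the example; real Montague grammar uses typed intensional logic rather than bare Python functions.

```python
# Montague-style composition sketch: word meanings are functions, and
# sentence meaning is computed by function application. The model of
# "who saw whom" is a toy assumption for illustration.

emma, daisy = "Emma", "Daisy"
seeing_pairs = {("Emma", "Daisy")}          # who saw whom, in the model

# A transitive verb denotes a curried function: object first, then subject.
saw = lambda obj: lambda subj: (subj, obj) in seeing_pairs

# "Emma saw Daisy" composes by two function applications.
print(saw(daisy)(emma))   # True
print(saw(emma)(daisy))   # False
```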

Categorial grammar is a family of formalisms in natural language syntax that share the central assumption that syntactic constituents combine as functions and arguments. Categorial grammar posits a close relationship between syntax and semantic composition, since it typically treats syntactic categories as corresponding to semantic types. Categorial grammars were developed in the 1930s by Kazimierz Ajdukiewicz and in the 1950s by Yehoshua Bar-Hillel and Joachim Lambek. The approach saw a surge of interest in the 1970s following the work of Richard Montague, whose Montague grammar assumed a similar view of syntax. It continues to be a major paradigm, particularly within formal semantics.

In logic, the semantics of logic or formal semantics is the study of the semantics, or interpretations, of formal and natural languages, usually with the aim of capturing the pre-theoretic notion of logical consequence.

Syntactic Structures: Book by Noam Chomsky

Syntactic Structures is an important work in linguistics by American linguist Noam Chomsky, originally published in 1957. A short monograph of about a hundred pages, it is recognized as one of the most significant and influential linguistic studies of the 20th century. It contains the now-famous sentence "Colorless green ideas sleep furiously", which Chomsky offered as an example of a grammatically correct sentence that has no discernible meaning, thus arguing for the independence of syntax from semantics.

In generative grammar and related approaches, the logical form (LF) of a linguistic expression is the variant of its syntactic structure which undergoes semantic interpretation. It is distinguished from phonetic form, the structure which corresponds to a sentence's pronunciation. These separate representations are postulated in order to explain the ways in which an expression's meaning can be partially independent of its pronunciation, e.g. scope ambiguities.

Logic is the formal science of using reason and is considered a branch of both philosophy and mathematics, and to a lesser extent of computer science. Logic investigates and classifies the structure of statements and arguments, both through the study of formal systems of inference and the study of arguments in natural language. The scope of logic can therefore be very large, ranging from core topics such as the study of fallacies and paradoxes to specialized analyses of reasoning involving probability and causality. One of the aims of logic is to distinguish correct from incorrect inferences. Logicians study the criteria for the evaluation of arguments.

Philosophy of language investigates the nature of language and the relations between language, language users, and the world. Investigations may include inquiry into the nature of meaning, intentionality, reference, the constitution of sentences, concepts, learning, and thought.

The linguistics wars were extended disputes among American theoretical linguists that occurred mostly during the 1960s and 1970s, stemming from a disagreement between Noam Chomsky and several of his associates and students. The debates started in 1967 when the linguists Paul Postal, John R. Ross, George Lakoff, and James D. McCawley, self-dubbed the "Four Horsemen of the Apocalypse", proposed an alternative approach that treated deep structures as meanings rather than syntactic objects. While Chomsky and other generative grammarians argued that meaning is driven by an underlying syntax, generative semanticists posited that syntax is shaped by an underlying meaning. This intellectual divergence led to two competing frameworks: generative semantics and interpretive semantics.

Formal semantics is the study of grammatical meaning in natural languages using formal concepts from logic, mathematics and theoretical computer science. It is an interdisciplinary field, sometimes regarded as a subfield of both linguistics and philosophy of language. It provides accounts of what linguistic expressions mean and how their meanings are composed from the meanings of their parts. The enterprise of formal semantics can be thought of as that of reverse-engineering the semantic components of natural languages' grammars.

<span class="mw-page-title-main">Formalism (linguistics)</span> Concept in linguistics

In linguistics, the term formalism is used with a variety of meanings which relate to formal linguistics in different ways. In common usage, it is merely synonymous with a grammatical model or a syntactic model: a method for analyzing sentence structures. Such formalisms include the various methodologies of generative grammar, which are especially designed to produce grammatically correct strings of words, as well as the likes of Functional Discourse Grammar, which builds on predicate logic.

Logical grammar or rational grammar is a term used in the history and philosophy of linguistics to refer to certain linguistic and grammatical theories that were prominent until the early 19th century and later influenced 20th-century linguistic thought. These theories were developed by scholars and philosophers who sought to establish a logical and rational basis for understanding the relationship between reality, meaning, cognition, and language. Examples from the classical and modern period represent a realistic approach to linguistics, while accounts written during the Age of Enlightenment represent rationalism, focusing on human thought.

References
