Symbol (formal)

This diagram shows the syntactic entities that may be constructed from formal languages. The symbols and strings of symbols may be broadly divided into nonsense and well-formed formulas. A formal language can be thought of as identical to the set of its well-formed formulas. The set of well-formed formulas may be broadly divided into theorems and non-theorems.

A logical symbol is a fundamental concept in logic, tokens of which may be marks or a configuration of marks which form a particular pattern. Although the term "symbol" in common use refers at some times to the idea being symbolized, and at other times to the marks on a piece of paper or chalkboard which are being used to express that idea, in the formal languages studied in mathematics and logic the term "symbol" refers to the idea, and the marks are considered to be a token instance of the symbol. In logic, symbols thus serve as the concrete vehicles by which ideas are stated and manipulated.


Overview

Symbols of a formal language need not be symbols of anything. For instance there are logical constants which do not refer to any idea, but rather serve as a form of punctuation in the language (e.g. parentheses). Symbols of a formal language must be capable of being specified without any reference to any interpretation of them.

A symbol or string of symbols may comprise a well-formed formula if it is consistent with the formation rules of the language.
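
As an illustration, the following is a minimal sketch in Python of formation rules for a small propositional language. The alphabet (the variables p, q, r, the connectives ~ and &, and the parentheses used purely as punctuation) is invented for the example.

```python
# Formation rules for an illustrative propositional language.
# The alphabet and the rules below are assumptions made for the example.

VARIABLES = {"p", "q", "r"}

def is_wff(s: str) -> bool:
    """A variable is a wff; if A is a wff, so is ~A; if A and B are wffs, so is (A&B)."""
    s = s.replace(" ", "")                       # spaces are not symbols of the language
    if s in VARIABLES:                           # rule 1: a variable is a wff
        return True
    if s.startswith("~"):                        # rule 2: negation of a wff
        return is_wff(s[1:])
    if s.startswith("(") and s.endswith(")"):    # rule 3: conjunction, parentheses as punctuation
        depth = 0
        for i, ch in enumerate(s[1:-1], start=1):
            if ch == "(":
                depth += 1
            elif ch == ")":
                depth -= 1
            elif ch == "&" and depth == 0:       # main connective found at the top level
                return is_wff(s[1:i]) and is_wff(s[i + 1:-1])
    return False                                 # everything else is a nonsense string

print(is_wff("(p & ~q)"))   # True: constructed by the formation rules
print(is_wff("p & q)"))     # False: a string over the same symbols, but not well formed
```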

In a formal system a symbol may be used as a token in formal operations. The set of formal symbols in a formal language is referred to as an alphabet (hence each symbol may be referred to as a "letter").[1]

A formal symbol as used in first-order logic may be a variable (member from a universe of discourse), a constant, a function (mapping to another member of universe) or a predicate (mapping to T/F).
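
As a sketch, these four kinds of symbol can be modelled as simple tagged structures; the particular names used here (Socrates, father_of, mortal) are assumptions of the example rather than part of any fixed signature.

```python
# A sketch of the symbol kinds of a first-order signature.
# The example names below are invented for illustration.

from dataclasses import dataclass

@dataclass(frozen=True)
class Variable:      # ranges over the universe of discourse
    name: str

@dataclass(frozen=True)
class Constant:      # names a fixed member of the universe
    name: str

@dataclass(frozen=True)
class Function:      # maps members of the universe to another member
    name: str
    arity: int

@dataclass(frozen=True)
class Predicate:     # maps members of the universe to true/false
    name: str
    arity: int

x = Variable("x")
socrates = Constant("Socrates")
father_of = Function("father_of", arity=1)
mortal = Predicate("mortal", arity=1)
```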

Formal symbols are usually thought of as purely syntactic structures, composed into larger structures using a formal grammar, though sometimes they may be associated with an interpretation or model (a formal semantics).
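
Continuing the sketch above, an interpretation can be attached to the purely syntactic formulas by mapping each variable symbol to a truth value; the particular valuation shown is, again, only an illustration.

```python
# A sketch of pairing syntax with a formal semantics: a valuation interprets
# the variable symbols, and the connectives are interpreted recursively.

def evaluate(s: str, valuation: dict[str, bool]) -> bool:
    s = s.replace(" ", "")
    if s in valuation:                           # a variable symbol: read off its value
        return valuation[s]
    if s.startswith("~"):                        # negation
        return not evaluate(s[1:], valuation)
    if s.startswith("(") and s.endswith(")"):    # conjunction
        depth = 0
        for i, ch in enumerate(s[1:-1], start=1):
            if ch == "(":
                depth += 1
            elif ch == ")":
                depth -= 1
            elif ch == "&" and depth == 0:
                return evaluate(s[1:i], valuation) and evaluate(s[i + 1:-1], valuation)
    raise ValueError(f"not a well-formed formula: {s!r}")

print(evaluate("(p & ~q)", {"p": True, "q": False}))   # True under this interpretation
```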

Can words be modeled as formal symbols?

The move to view units in natural language (e.g. English) as formal symbols was initiated by Noam Chomsky (it was this work that resulted in the Chomsky hierarchy in formal languages). The generative grammar model looked upon syntax as autonomous from semantics. Building on these models, the logician Richard Montague proposed that semantics could also be constructed on top of the formal structure:

There is in my opinion no important theoretical difference between natural languages and the artificial languages of logicians; indeed, I consider it possible to comprehend the syntax and semantics of both kinds of language within a single natural and mathematically precise theory. On this point I differ from a number of philosophers, but agree, I believe, with Chomsky and his associates. [2]

This is the philosophical premise underlying Montague grammar.

However, this attempt to equate linguistic symbols with formal symbols has been challenged widely, particularly in the tradition of cognitive linguistics, by philosophers like Stevan Harnad, and linguists like George Lakoff and Ronald Langacker.

Related Research Articles

Formal language: sequence of words formed by specific rules

In logic, mathematics, computer science, and linguistics, a formal language consists of words whose letters are taken from an alphabet and are well-formed according to a specific set of rules.
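
As a toy illustration (both the alphabet and the formation rule are invented for the example), a formal language can be presented as exactly the set of strings over an alphabet that satisfy a rule:

```python
# A toy formal language: words over the alphabet {a, b} consisting of
# some number of a's followed by the same number of b's.

ALPHABET = {"a", "b"}

def in_language(word: str) -> bool:
    if any(letter not in ALPHABET for letter in word):
        return False                       # uses a symbol outside the alphabet
    n = len(word) // 2
    return word == "a" * n + "b" * n       # the formation rule

print(in_language("aabb"))   # True: a well-formed word of the language
print(in_language("abba"))   # False: a string over the alphabet, but not in the language
```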

First-order logic—also known as predicate logic, quantificational logic, and first-order predicate calculus—is a collection of formal systems used in mathematics, philosophy, linguistics, and computer science. First-order logic uses quantified variables over non-logical objects, and allows the use of sentences that contain variables, so that rather than propositions such as "Socrates is a man", one can have expressions in the form "there exists x such that x is Socrates and x is a man", where "there exists" is a quantifier, while x is a variable. This distinguishes it from propositional logic, which does not use quantifiers or relations; in this sense, propositional logic is the foundation of first-order logic.
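
As a small sketch, the quantified reading above can be evaluated over a finite domain; the domain and the predicates are invented for the example.

```python
# "There exists x such that x is Socrates and x is a man",
# evaluated over an invented three-element domain.

domain = {"Socrates", "Plato", "Fido"}

def is_socrates(x):   # a one-place predicate: maps a member of the domain to true/false
    return x == "Socrates"

def is_man(x):        # another one-place predicate
    return x in {"Socrates", "Plato"}

# the existential quantifier ranges the variable x over the domain
exists = any(is_socrates(x) and is_man(x) for x in domain)
print(exists)         # True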

The following outline is provided as an overview and topical guide to linguistics:

Semantics is the study of reference, meaning, or truth. The term can be used to refer to subfields of several distinct disciplines, including philosophy, linguistics and computer science.

In linguistics, transformational grammar (TG) or transformational-generative grammar (TGG) is part of the theory of generative grammar, especially of natural languages. It considers grammar to be a system of rules that generate exactly those combinations of words that form grammatical sentences in a given language and involves the use of defined operations to produce new sentences from existing ones. The method is commonly associated with American linguist Noam Chomsky.
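
A toy sketch of such a rule system is given below; the rules and vocabulary are invented for the example, and the sketch shows only phrase-structure rewriting, not Chomsky's transformations themselves.

```python
# A toy generative rewrite system: rules expand symbols until only words remain.

import random

RULES = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"]],
    "VP": [["V", "NP"], ["V"]],
    "N":  [["linguist"], ["sentence"]],
    "V":  [["parses"], ["sleeps"]],
}

def generate(symbol: str = "S") -> list[str]:
    if symbol not in RULES:                 # a terminal word: no further rewriting
        return [symbol]
    expansion = random.choice(RULES[symbol])
    return [word for part in expansion for word in generate(part)]

print(" ".join(generate()))   # e.g. "the linguist parses the sentence"
```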

Metalogic is the study of the metatheory of logic. Whereas logic studies how logical systems can be used to construct valid and sound arguments, metalogic studies the properties of logical systems. Logic concerns the truths that may be derived using a logical system; metalogic concerns the truths that may be derived about the languages and systems that are used to express truths.

A formal system is an abstract structure used for inferring theorems from axioms according to a set of rules. These rules, which are used for carrying out the inference of theorems from axioms, are the logical calculus of the formal system. A formal system is essentially an "axiomatic system".
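
As a toy illustration, a formal system with a single axiom and a single rule of inference might look like the following; the system is invented for the example.

```python
# A toy formal system: the axiom is the string "I", and the only rule of
# inference appends "O" to any theorem already derived.

AXIOMS = {"I"}

def apply_rule(theorem: str) -> str:
    return theorem + "O"                   # the system's single rule of inference

def derive(steps: int) -> set[str]:
    theorems = set(AXIOMS)
    frontier = set(AXIOMS)
    for _ in range(steps):
        frontier = {apply_rule(t) for t in frontier}
        theorems |= frontier
    return theorems

# after three rule applications the theorems are I, IO, IOO and IOOO
print(sorted(derive(3)))   # ['I', 'IO', 'IOO', 'IOOO']
```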

Syntax (logic): rules used for constructing, or transforming the symbols and words of a language

In logic, syntax is anything having to do with formal languages or formal systems without regard to any interpretation or meaning given to them. Syntax is concerned with the rules used for constructing, or transforming the symbols and words of a language, as contrasted with the semantics of a language which is concerned with its meaning.

Richard Montague: American mathematician

Richard Merritt Montague was an American mathematician and philosopher who made contributions to mathematical logic and the philosophy of language. He is known for proposing Montague grammar to formalize the semantics of natural language. As a student of Alfred Tarski, he also contributed early developments to axiomatic set theory (ZFC). For the latter half of his life, he was a professor at the University of California, Los Angeles until his early death, believed to be a homicide, at age 40.

Montague grammar is an approach to natural language semantics, named after American logician Richard Montague. It is based on mathematical logic, especially higher-order predicate logic and lambda calculus, and makes use of the notions of intensional logic, via Kripke models. Montague pioneered this approach in the 1960s and early 1970s.

Categorial grammar is a family of formalisms in natural language syntax that share the central assumption that syntactic constituents combine as functions and arguments. Categorial grammar posits a close relationship between syntax and semantic composition, since it typically treats syntactic categories as corresponding to semantic types. Categorial grammar was developed in the 1930s by Kazimierz Ajdukiewicz and in the 1950s by Yehoshua Bar-Hillel and Joachim Lambek. It saw a surge of interest in the 1970s following the work of Richard Montague, whose Montague grammar assumed a similar view of syntax. It continues to be a major paradigm, particularly within formal semantics.
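
A minimal sketch of the function-argument idea is shown below; the lexicon and the category names are invented for the example.

```python
# Categorial-style combination: "sleeps" is assigned a function category that
# takes an NP argument and returns an S, so "John" + "sleeps" yields a sentence.

lexicon = {
    "John":   "NP",                  # an argument category
    "sleeps": ("NP", "S"),           # a function category: NP -> S
}

def combine(argument: str, function: str) -> str:
    arg_cat = lexicon[argument]
    expected, result = lexicon[function]
    if arg_cat != expected:
        raise ValueError("categories do not combine")
    return result                    # function application succeeds

print(combine("John", "sleeps"))     # 'S': the combination is a sentence
```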

In logic, the semantics of logic or formal semantics is the study of the semantics, or interpretations, of formal and natural languages usually trying to capture the pre-theoretic notion of entailment.

Syntactic Structures: book by Noam Chomsky

Syntactic Structures is an influential work in linguistics by American linguist Noam Chomsky, originally published in 1957. It is an elaboration of his teacher Zellig Harris's model of transformational generative grammar. A short monograph of about a hundred pages, Chomsky's presentation is recognized as one of the most significant studies of the 20th century. It contains the now-famous sentence "Colorless green ideas sleep furiously", which Chomsky offered as an example of a grammatically correct sentence that has no discernible meaning. Thus, Chomsky argued for the independence of syntax from semantics.

In generative grammar and related approaches, the logical form (LF) of a linguistic expression is the variant of its syntactic structure which undergoes semantic interpretation. It is distinguished from phonetic form, the structure which corresponds to a sentence's pronunciation. These separate representations are postulated in order to explain the ways in which an expression's meaning can be partially independent of its pronunciation, e.g. scope ambiguities.

Logic is the formal science of using reason and is considered a branch of both philosophy and mathematics and, to a lesser extent, computer science. Logic investigates and classifies the structure of statements and arguments, both through the study of formal systems of inference and the study of arguments in natural language. The scope of logic can therefore be very large, ranging from core topics such as the study of fallacies and paradoxes, to specialized analyses of reasoning such as probability, correct reasoning, and arguments involving causality. One of the aims of logic is to identify correct and incorrect inferences. Logicians study the criteria for the evaluation of arguments.

<span class="mw-page-title-main">Philosophy of language</span> Discipline of philosophy that deals with language and meaning

In analytic philosophy, philosophy of language investigates the nature of language and the relations between language, language users, and the world. Investigations may include inquiry into the nature of meaning, intentionality, reference, the constitution of sentences, concepts, learning, and thought.

Donkey sentences are sentences that contain a pronoun with clear meaning but whose syntactic role in the sentence poses challenges to grammarians; the classic example is "Every farmer who owns a donkey beats it". Such sentences defy straightforward attempts to generate their formal language equivalents. The difficulty is with understanding how English speakers parse such sentences.

Formal semantics is the study of grammatical meaning in natural languages using formal tools from logic and theoretical computer science. It is an interdisciplinary field, sometimes regarded as a subfield of both linguistics and philosophy of language. It provides accounts of what linguistic expressions mean and how their meanings are composed from the meanings of their parts. The enterprise of formal semantics can be thought of as that of reverse-engineering the semantic components of natural languages' grammars.

Formalism (linguistics)

In linguistics, the term formalism is used with a variety of meanings which relate to formal linguistics in different ways. In common usage, it is merely synonymous with a grammatical model or a syntactic model: a method for analyzing sentence structures. Such formalisms include different methodologies of generative grammar, which are especially designed to produce grammatically correct strings of words, or the likes of Functional Discourse Grammar, which builds on predicate logic.

References

See also