Syntax (logic)

This diagram shows the syntactic entities which may be constructed from formal languages. The symbols and strings of symbols may be broadly divided into nonsense and well-formed formulas. A formal language is identical to the set of its well-formed formulas. The set of well-formed formulas may be broadly divided into theorems and non-theorems.

In logic, syntax is anything having to do with formal languages or formal systems without regard to any interpretation or meaning given to them. Syntax is concerned with the rules used for constructing or transforming the symbols and words of a language, as contrasted with the semantics of a language, which is concerned with its meaning.


The symbols, formulas, systems, theorems and proofs expressed in formal languages are syntactic entities whose properties may be studied without regard to any meaning they may be given; indeed, they need not be given any meaning at all.

Syntax is usually associated with the rules (or grammar) governing the composition of texts in a formal language that constitute the well-formed formulas of a formal system.

In computer science, the term syntax refers to the rules governing the composition of well-formed expressions in a programming language. As in mathematical logic, it is independent of semantics and interpretation.

Syntactic entities

Symbols

A symbol is an idea, abstraction or concept, tokens of which may be marks or a configuration of marks which form a particular pattern. Symbols of a formal language need not be symbols of anything. For instance there are logical constants which do not refer to any idea, but rather serve as a form of punctuation in the language (e.g. parentheses). A symbol or string of symbols may constitute a well-formed formula if it is consistent with the formation rules of the language. Symbols of a formal language must be capable of being specified without any reference to any interpretation of them.

Formal language

A formal language is a syntactic entity which consists of a set of finite strings of symbols which are its words (usually called its well-formed formulas). Which strings of symbols are words is determined by the creator of the language, usually by specifying a set of formation rules. Such a language can be defined without reference to any meanings of any of its expressions; it can exist before any interpretation is assigned to it – that is, before it has any meaning.
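
To make this concrete, here is a minimal sketch (an illustration, not something taken from the article): a toy formal language over the alphabet {a, b} whose words are exactly the strings of the form a^n b^n. Membership in the language is decided purely by the shape of a string, with no meaning assigned to it; the function name is_word and the choice of language are hypothetical.

```python
# A toy formal language, defined purely syntactically as a set of finite
# strings over the alphabet {"a", "b"}. Its "words" are the strings a^n b^n
# with n >= 1; no meaning is assigned to them.

ALPHABET = {"a", "b"}

def is_word(s: str) -> bool:
    """Return True if s is a word of the toy language (a^n b^n, n >= 1)."""
    if not s or any(ch not in ALPHABET for ch in s):
        return False
    if len(s) % 2 != 0:
        return False
    half = len(s) // 2
    return s[:half] == "a" * half and s[half:] == "b" * half

# The language itself can be viewed extensionally as the set of all such words:
print([w for w in ("ab", "aabb", "abab", "ba") if is_word(w)])  # ['ab', 'aabb']
```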

Formation rules

Formation rules are a precise description of which strings of symbols are the well-formed formulas of a formal language. They determine, in effect, the set of strings over the alphabet of the formal language that constitute its well-formed formulas. However, they do not describe the semantics of those formulas (i.e. what they mean).
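
As an illustration (a sketch using an assumed encoding, not the article's own notation), the formation rules of a small propositional language can be stated recursively and checked mechanically; the variable set, tuple encoding, and function name below are invented for this example.

```python
# Formation rules for a small propositional language, checked recursively:
#   1. every propositional variable p, q, r is a well-formed formula (wff);
#   2. if A is a wff, then ("¬", A) is a wff;
#   3. if A and B are wffs, then ("→", A, B) is a wff.
# Formulas are encoded as nested tuples; variables as strings.

VARIABLES = {"p", "q", "r"}

def is_wff(f) -> bool:
    """Decide well-formedness purely syntactically, without appeal to meaning."""
    if isinstance(f, str):
        return f in VARIABLES                        # rule 1
    if isinstance(f, tuple):
        if len(f) == 2 and f[0] == "¬":
            return is_wff(f[1])                      # rule 2
        if len(f) == 3 and f[0] == "→":
            return is_wff(f[1]) and is_wff(f[2])     # rule 3
    return False

print(is_wff(("→", "p", ("¬", "q"))))   # True:  p → ¬q is well formed
print(is_wff(("→", "p")))               # False: a subformula is missing
```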

Propositions

A proposition is a sentence expressing something true or false. A proposition is identified ontologically as an idea, concept or abstraction whose token instances are patterns of symbols, marks, sounds, or strings of words.[2] Propositions are considered to be syntactic entities and also truthbearers.

Formal theories

A formal theory is a set of sentences in a formal language.

Formal systems

A formal system (also called a logical calculus, or a logical system) consists of a formal language together with a deductive apparatus (also called a deductive system). The deductive apparatus may consist of a set of transformation rules (also called inference rules) or a set of axioms, or have both. A formal system is used to derive one expression from one or more other expressions. Formal systems, like other syntactic entities, may be defined without any interpretation given to them (for instance, as being a system of arithmetic).
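
The following sketch is illustrative only (the schema, rule, and names are assumptions, not a particular system from the literature): it treats a deductive apparatus as an axiom schema plus a single transformation rule, modus ponens, both operating solely on the shapes of formulas in the tuple encoding used above.

```python
# A tiny deductive apparatus: one axiom schema and one transformation rule,
# stated purely in terms of the syntactic shape of formulas.

def axiom_schema_1(a, b):
    """Return the instance of the schema A → (B → A) for the wffs a and b."""
    return ("→", a, ("→", b, a))

def modus_ponens(premise_1, premise_2):
    """From A and A → B, return B; return None if the shapes do not match."""
    if (isinstance(premise_2, tuple) and len(premise_2) == 3
            and premise_2[0] == "→" and premise_2[1] == premise_1):
        return premise_2[2]
    return None

# Deriving one expression from others is just symbol manipulation:
ax = axiom_schema_1("p", "q")     # p → (q → p)
print(modus_ponens("p", ax))      # ('→', 'q', 'p'), i.e. q → p
```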

Syntactic consequence within a formal system

A formula A is a syntactic consequence[3][4][5][6] within some formal system of a set Γ of formulas if there is a derivation in that formal system of A from the set Γ; in that case A is said to be derivable from Γ, commonly written Γ ⊢ A.

Syntactic consequence does not depend on any interpretation of the formal system. [7]
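
As a rough, self-contained illustration (the encoding and the restriction to a single rule are simplifying assumptions), syntactic consequence can be tested by repeatedly applying the transformation rule to the formulas of Γ and checking whether A turns up among the results; at no point is an interpretation consulted.

```python
# Checking syntactic consequence Γ ⊢ A in a toy system whose only
# transformation rule is modus ponens: saturate Γ under the rule and
# look for the goal formula.

def modus_ponens(p, q):
    """From A and A → B infer B, if the shapes match; otherwise None."""
    if isinstance(q, tuple) and len(q) == 3 and q[0] == "→" and q[1] == p:
        return q[2]
    return None

def derives(gamma, goal, max_rounds=10):
    """Return True if goal is derivable from gamma by modus ponens alone."""
    derived = set(gamma)
    for _ in range(max_rounds):
        new = {c for p in derived for q in derived
               if (c := modus_ponens(p, q)) is not None} - derived
        if not new:
            break
        derived |= new
    return goal in derived

gamma = {"p", ("→", "p", "q"), ("→", "q", "r")}
print(derives(gamma, "r"))   # True:  from p and p → q infer q; from q and q → r infer r
print(derives(gamma, "s"))   # False: there is no derivation of s from Γ
```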

Syntactic completeness of a formal system

A formal system is syntactically complete[8][9][10][11] (also deductively complete, maximally complete, negation complete or simply complete) iff for each formula A of the language of the system either A or ¬A is a theorem of the system. In another sense, a formal system is syntactically complete iff no unprovable sentence can be added to it as an axiom without introducing an inconsistency. Truth-functional propositional logic and first-order predicate logic are semantically complete, but not syntactically complete (for example, the propositional logic statement consisting of a single variable "a" is not a theorem, and neither is its negation, because neither is a tautology). Gödel's incompleteness theorem shows that no recursive system that is sufficiently powerful, such as the Peano axioms, can be both consistent and complete.
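
The propositional example in the paragraph above can be checked with a small truth-table evaluator; the encoding and function names are illustrative assumptions. Neither the single variable "a" nor its negation is a tautology, so by the soundness and semantic completeness of propositional logic neither is a theorem, and the system is not syntactically complete.

```python
# Truth-table check: neither "a" nor ("¬", "a") is true under every valuation.

from itertools import product

def evaluate(f, valuation):
    """Evaluate a formula (tuple encoding) under a valuation of its variables."""
    if isinstance(f, str):
        return valuation[f]
    if f[0] == "¬":
        return not evaluate(f[1], valuation)
    if f[0] == "→":
        return (not evaluate(f[1], valuation)) or evaluate(f[2], valuation)
    raise ValueError(f"unknown formula shape: {f!r}")

def is_tautology(f, variables):
    return all(evaluate(f, dict(zip(variables, row)))
               for row in product([True, False], repeat=len(variables)))

print(is_tautology("a", ["a"]))              # False
print(is_tautology(("¬", "a"), ["a"]))       # False
print(is_tautology(("→", "a", "a"), ["a"]))  # True, shown for contrast
```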

Interpretations

An interpretation of a formal system is the assignment of meanings to the symbols, and truth values to the sentences, of a formal system. The study of interpretations is called formal semantics. Giving an interpretation is synonymous with constructing a model. An interpretation is expressed in a metalanguage, which may itself be a formal language, and as such is itself a syntactic entity.
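
A brief sketch of the idea (using the hypothetical tuple encoding from the earlier examples): the same uninterpreted sentence receives different truth values under different interpretations, and the interpretations themselves are stated in the metalanguage, here ordinary Python.

```python
# One syntactic object, two interpretations.

sentence = ("→", "p", "q")    # an uninterpreted string of symbols

def value_of(implication, interpretation):
    """Material-implication reading of ("→", A, B) for variable-only A and B."""
    _, a, b = implication
    return (not interpretation[a]) or interpretation[b]

print(value_of(sentence, {"p": True,  "q": False}))   # False under this interpretation
print(value_of(sentence, {"p": False, "q": False}))   # True under this one
```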

Related Research Articles

An axiom, postulate, or assumption is a statement that is taken to be true, to serve as a premise or starting point for further reasoning and arguments. The word comes from the Ancient Greek word ἀξίωμα (axíōma), meaning 'that which is thought worthy or fit' or 'that which commends itself as evident'.

<span class="mw-page-title-main">Formal language</span> Sequence of words formed by specific rules

In logic, mathematics, computer science, and linguistics, a formal language consists of words whose letters are taken from an alphabet and are well-formed according to a specific set of rules called a formal grammar.

First-order logic—also known as predicate logic, quantificational logic, and first-order predicate calculus—is a collection of formal systems used in mathematics, philosophy, linguistics, and computer science. First-order logic uses quantified variables over non-logical objects, and allows the use of sentences that contain variables, so that rather than propositions such as "Socrates is a man", one can have expressions in the form "there exists x such that x is Socrates and x is a man", where "there exists" is a quantifier, while x is a variable. This distinguishes it from propositional logic, which does not use quantifiers or relations; in this sense, propositional logic is the foundation of first-order logic.

<span class="mw-page-title-main">Gödel's completeness theorem</span> Fundamental theorem in mathematical logic

Gödel's completeness theorem is a fundamental theorem in mathematical logic that establishes a correspondence between semantic truth and syntactic provability in first-order logic.

The propositional calculus is a branch of logic. It is also called propositional logic, statement logic, sentential calculus, sentential logic, or sometimes zeroth-order logic. It deals with propositions and relations between propositions, including the construction of arguments based on them. Compound propositions are formed by connecting propositions by logical connectives representing the truth functions of conjunction, disjunction, implication, equivalence, and negation. Some sources include other connectives as well.

In logic and deductive reasoning, an argument is sound if it is both valid in form and its premises are true. Soundness has a related meaning in mathematical logic, wherein a formal system of logic is sound if and only if every well-formed formula that can be proven in the system is logically valid with respect to the logical semantics of the system.

<span class="mw-page-title-main">Theorem</span> In mathematics, a statement that has been proved

In mathematics, a theorem is a statement that has been proved, or can be proved. The proof of a theorem is a logical argument that uses the inference rules of a deductive system to establish that the theorem is a logical consequence of the axioms and previously proved theorems.

In philosophy of logic and logic, a rule of inference, inference rule or transformation rule is a logical form consisting of a function which takes premises, analyzes their syntax, and returns a conclusion.

In logic and mathematics, second-order logic is an extension of first-order logic, which itself is an extension of propositional logic. Second-order logic is in turn extended by higher-order logic and type theory.

Metalogic is the study of the metatheory of logic. Whereas logic studies how logical systems can be used to construct valid and sound arguments, metalogic studies the properties of logical systems. Logic concerns the truths that may be derived using a logical system; metalogic concerns the truths that may be derived about the languages and systems that are used to express truths.

A formal system is an abstract structure and formalization of an axiomatic system used for inferring theorems from axioms by a set of inference rules.

In mathematical logic, propositional logic and predicate logic, a well-formed formula, abbreviated WFF or wff, often simply formula, is a finite sequence of symbols from a given alphabet that is part of a formal language. A formal language can be identified with the set of formulas in the language.

In logic and mathematics, a formal proof or derivation is a finite sequence of sentences, each of which is an axiom, an assumption, or follows from the preceding sentences in the sequence by a rule of inference. It differs from a natural language argument in that it is rigorous, unambiguous and mechanically verifiable. If the set of assumptions is empty, then the last sentence in a formal proof is called a theorem of the formal system. The notion of theorem is not in general effective, therefore there may be no method by which we can always find a proof of a given sentence or determine that none exists. The concepts of Fitch-style proof, sequent calculus and natural deduction are generalizations of the concept of proof.

In mathematical logic, a theory is a set of sentences in a formal language. In most scenarios a deductive system is first understood from context, after which an element of a deductively closed theory is then called a theorem of the theory. In many deductive systems there is usually a subset that is called "the set of axioms" of the theory, in which case the deductive system is also called an "axiomatic system". By definition, every axiom is automatically a theorem. A first-order theory is a set of first-order sentences (theorems) recursively obtained by the inference rules of the system applied to the set of axioms.

Logic is the formal science of using reason and is considered a branch of both philosophy and mathematics, and, to a lesser extent, of computer science. Logic investigates and classifies the structure of statements and arguments, both through the study of formal systems of inference and the study of arguments in natural language. The scope of logic can therefore be very large, ranging from core topics such as the study of fallacies and paradoxes, to specialized analyses of reasoning such as probability, correct reasoning, and arguments involving causality. One of the aims of logic is to identify correct and incorrect inferences. Logicians study the criteria for the evaluation of arguments.

An interpretation is an assignment of meaning to the symbols of a formal language. Many formal languages used in mathematics, logic, and theoretical computer science are defined in solely syntactic terms, and as such do not have any meaning until they are given some interpretation. The general study of interpretations of formal languages is called formal semantics.

In mathematical logic, formation rules are rules for describing which strings of symbols formed from the alphabet of a formal language are syntactically valid within the language. These rules only address the location and manipulation of the strings of the language. They do not describe anything else about a language, such as its semantics.

Philosophy of logic is the area of philosophy that studies the scope and nature of logic. It investigates the philosophical problems raised by logic, such as the presuppositions often implicitly at work in theories of logic and in their application. This involves questions about how logic is to be defined and how different logical systems are connected to each other. It includes the study of the nature of the fundamental concepts used by logic and the relation of logic to other disciplines. According to a common characterisation, philosophical logic is the part of the philosophy of logic that studies the application of logical methods to philosophical problems, often in the form of extended logical systems like modal logic. But other theorists draw the distinction between the philosophy of logic and philosophical logic differently or not at all. Metalogic is closely related to the philosophy of logic as the discipline investigating the properties of formal logical systems, like consistency and completeness.

In mathematical logic and metalogic, a formal system is called complete with respect to a particular property if every formula having the property can be derived using that system, i.e. is one of its theorems; otherwise the system is said to be incomplete. The term "complete" is also used without qualification, with differing meanings depending on the context, mostly referring to the property of semantic validity. Intuitively, a system is called complete in this particular sense if it can derive every formula that is true.

Logical consequence is a fundamental concept in logic which describes the relationship between statements that hold true when one statement logically follows from one or more statements. A valid logical argument is one in which the conclusion is entailed by the premises, because the conclusion is the consequence of the premises. The philosophical analysis of logical consequence involves the questions: In what sense does a conclusion follow from its premises? and What does it mean for a conclusion to be a consequence of premises? All of philosophical logic is meant to provide accounts of the nature of logical consequence and the nature of logical truth.

References

  1. Dictionary definition.
  2. Hunter, Geoffrey. Metalogic: An Introduction to the Metatheory of Standard First-Order Logic. University of California Press, 1971.
  3. Dummett, M. (1981). Frege: Philosophy of Language. Harvard University Press. p. 82. ISBN 9780674319318. Retrieved 2014-10-15.
  4. Lear, J. (1986). Aristotle and Logical Theory. Cambridge University Press. p. 1. ISBN 9780521311786. Retrieved 2014-10-15.
  5. Creath, R.; Friedman, M. (2007). The Cambridge Companion to Carnap. Cambridge University Press. p. 189. ISBN 9780521840156. Retrieved 2014-10-15.
  6. "Syntactic consequence from FOLDOC". swif.uniba.it. Archived from the original on 2013-04-03. Retrieved 2014-10-15.
  7. Hunter, Geoffrey. Metalogic: An Introduction to the Metatheory of Standard First-Order Logic. University of California Press, 1971, p. 75.
  8. "A Note on Interaction and Incompleteness" (PDF). Retrieved 2014-10-15.
  9. Wijesekera, Duminda; Ganesh, M.; Srivastava, Jaideep; Nerode, Anil (2001). "Normal forms and syntactic completeness proofs for functional independencies". Theoretical Computer Science. 266 (1–2): 365–405. doi:10.1016/S0304-3975(00)00195-X.
  10. Barwise, J. (1982). Handbook of Mathematical Logic. Elsevier Science. p. 236. ISBN 9780080933641. Retrieved 2014-10-15.
  11. "Syntactic completeness from FOLDOC". swif.uniba.it. Archived from the original on 2001-05-02. Retrieved 2014-10-15.

Media related to Syntax (logic) at Wikimedia Commons