Logical grammar or rational grammar is a term used in the history and philosophy of linguistics to refer to certain linguistic and grammatical theories that were prominent until the early 19th century and later influenced 20th-century linguistic thought. These theories were developed by scholars and philosophers who sought to establish a logical and rational basis for understanding the relationship between reality, meaning, cognition, and language. Examples from the classical and medieval periods represent a realist approach to linguistics, while accounts written during the Age of Enlightenment represent rationalism, focusing on human thought. [1][2]
Logical, rational or general grammar was the dominant approach to language until it was supplanted by romanticism. [3] Since then, there have been attempts to revive logical grammar. The idea is today at least partially represented by categorial grammar, formal semantics, and transcendental phenomenology.
Logical grammar consists of the analysis of the sentence into a predicate-argument structure and of a commutation test, which breaks the form down paradigmatically into layers of syntactic categories. Through this procedure, a formal grammar is extracted from the linguistic material. Applying the rules of the grammar, which may be recursive, produces grammatical sentences.
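The procedure can be illustrated with a minimal sketch in Python (the lexicon, category labels, and rule format below are hypothetical toy entries, not drawn from the cited sources):

```python
# A minimal sketch of the two analytic steps described above.

# Step 1: predicate-argument analysis of "Theaetetus is sitting".
analysis = {"argument": "Theaetetus", "predicate": "is sitting"}

# Step 2: paradigmatic commutation. Expressions that can replace one
# another while preserving grammaticality are grouped into one layer,
# i.e. one syntactic category.
categories = {
    "N": {"Theaetetus", "Socrates"},
    "V": {"is sitting", "is walking"},
}

# The formal grammar extracted from the material: a sentence (S)
# consists of an argument (N) followed by a predicate (V). Rules of
# this form may also be recursive (see the generation sketch below).
rules = {"S": [["N", "V"]]}
```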
The foundation of logical grammar was laid out by the Greek philosophers. According to Plato, the task of the sentence is to make a statement about the subject by means of predication. In the Sophist, he uses the example of "Theaetetus is sitting" to illustrate the idea of predication. This statement involves the subject "Theaetetus" and the predicate "is sitting". Plato then delves into questions about the relationship between these two elements and the nature of being and non-being. [4]
In the Parmenides, Plato uses examples like "Theaetetus is a man" and "Theaetetus is not a man" to illustrate the complexities and challenges of predication, particularly concerning the relationship between particulars and universal concepts. Plato's discussions of predication in these dialogues are part of his broader exploration of metaphysics, epistemology, and the nature of reality.
After Plato, Aristotle's syllogistic relies on the concept of predication, as it forms the basis for his system of deductive reasoning. In the Aristotelian syllogism, predication plays a central role in establishing the relationships between different terms within categorical statements. Syllogistic reasoning consists of a series of subjects (S) and predicates (P).
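As a standard illustration (the textbook mood Barbara, not an example taken from the cited sources), the subject and predicate terms line up as follows:

```latex
% Barbara, a first-figure syllogism in subject-predicate form
\begin{array}{ll}
  \text{All } M \text{ are } P & \text{(major premise)} \\
  \text{All } S \text{ are } M & \text{(minor premise)} \\
  \therefore\ \text{All } S \text{ are } P & \text{(conclusion)}
\end{array}
```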
Following these philosophers, the analysis of the sentence into a subject-predicate structure became the cornerstone of classical grammar. Building on the Greek classics, Thomas of Erfurt's 14th-century Latin grammar expounds the role of linguistics within the natural sciences. The task of language is to make statements concerning reality by means of predication. Erfurt's Modist grammar also analyzes transitive sentences. In his example "Plato strikes Socrates," Plato is the subject and "strikes Socrates" is the predicate relating to Plato. [5]
More examples of predication are found in the rational grammars of the Age of Enlightenment, such as the Port-Royal grammar. This approach is also elaborated by Edmund Husserl in the second edition of his Logical Investigations (1921). Husserl's phenomenological 'pure logical grammar' entails the study of the interconnectedness of language and the structures of consciousness. [6] It influenced Rudolf Carnap's 1935 logical syntax, which later formed the basis of categorial grammar. Such logical concepts of language, constructed by mathematicians and philosophers, represent the first approaches to generative grammar. [7] Linguists, however, adopted the technique while replacing the logical and rational concept with biologism and psychologism. [8]
In the 20th century, the subject-centered view was supplanted in mathematical logic by predicate-argument structure, which focuses on the event (cf. predicate) and the relationship between the arguments, whose number is in principle unlimited: P(x,y,...). In modern linguistics, the mechanism familiar from classical predication often goes under the name of information structure but is considered as part of innate syntax in generative grammar. Formal semantics, as well as dependency grammar, employs transitive or n-ary predicates, but categorial grammar remains based on the unary predicate. Predicate-argument structure has been proposed for phenomenological linguistics, but such an enterprise is yet to materialize. [9]
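The contrast can be made concrete with the earlier Modistae example; the notation below is standard first-order style, not taken from the cited source:

```latex
% Classical subject-predicate analysis vs. predicate-argument structure
\underbrace{\text{Plato}}_{\text{subject}}\ \underbrace{\text{strikes Socrates}}_{\text{predicate}}
\qquad \text{vs.} \qquad
\mathit{strike}(\mathit{plato}, \mathit{socrates})

% In general, the number of arguments is in principle unbounded:
P(x_1, x_2, \ldots, x_n)
```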
The first philosopher to discuss categories (or "predicables") extensively in Western philosophy was Aristotle. In his work "Categories", he systematically examined the different types of predicables, fundamental concepts for understanding the nature of reality and how language represents it. He identified ten basic types for understanding and classifying things in the world. [10]
The concept of syntactic categories, also known as parts of speech or word classes (e.g., nouns, verbs, adjectives, adverbs, etc.), is related to but separable from the categories of being in the study of Ancient Greek grammar. Dionysius Thrax's work "Art of Grammar" is one of the earliest systematic grammatical treatises in the Western tradition. Thrax classified words into eight parts of speech: noun, verb, participle, article, pronoun, preposition, adverb, and conjunction. [11]
The substitution of one element with another of the same syntactic category is discussed in the general and rational grammar of Port-Royal (1660) and elaborated by Husserl in his Logical Investigations, which introduces the commutation test (see also constituent test) based on such substitution. The identification of the elements belonging to a category rests on their grammaticality. For example, the adjective white in the statement 'This paper is white' can be substituted with another adjective such as green or careless. In Husserl's taxonomy, a statement like 'This paper is careless' has a structured meaning but is "nonsense". By contrast, the statement 'This careless is green' violates the laws of structured meaning and is therefore "senseless". In modern terminology, the first statement is grammatical while the second is ungrammatical. [12]
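A minimal sketch of the commutation test in Python (with a hypothetical toy lexicon and a single sentence pattern) might look as follows; since it checks category membership only, the semantically odd example still counts as grammatical:

```python
# Hypothetical toy lexicon mapping words to syntactic categories.
CATEGORIES = {
    "this": "Det", "paper": "N", "is": "Cop",
    "white": "Adj", "green": "Adj", "careless": "Adj",
}

# The pattern extracted from 'This paper is white'.
PATTERN = ["Det", "N", "Cop", "Adj"]

def is_grammatical(sentence: str) -> bool:
    """True if every word fills the category slot it occupies."""
    words = sentence.lower().rstrip(".").split()
    return (len(words) == len(PATTERN) and
            all(CATEGORIES.get(w) == c for w, c in zip(words, PATTERN)))

print(is_grammatical("This paper is white"))     # True
print(is_grammatical("This paper is careless"))  # True  ("nonsense", yet grammatical)
print(is_grammatical("This careless is green"))  # False ("senseless", ungrammatical)
```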
From another angle, pure logical grammar constitutes phrases, which represent higher-level syntactic categories, by employing predication. The underlying logical proposition 'This paper is white' is transformed into the phrase white paper. The whole sentence is constituted according to the principle of predication, and phrases are identified by means of substitution.
This insight led to the development of categorial and type-logical grammar. [13] Sentences, whether acquired via empirical or introspective inquiry, are analyzed and synthesized into syntactic categories of different levels to build a formal grammar. When the acquired rewrite rules are employed in reverse (i.e. starting from the sentence level and proceeding to clauses, phrases, single elements, and terminals), the grammar generates all the grammatical sentences of the language, an unrestricted (or "infinite") number of them. [14]
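A minimal generative sketch in Python (the rewrite rules and vocabulary are hypothetical toy entries, not a grammar from the cited sources) shows how a recursive rule yields an unbounded set of grammatical sentences:

```python
import random

RULES = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N'"]],
    "N'":  [["N"], ["Adj", "N'"]],   # recursive rule: N' -> Adj N'
    "VP":  [["V"]],
    "Det": [["this"], ["the"]],
    "N":   [["paper"], ["philosopher"]],
    "Adj": [["white"], ["careless"]],
    "V":   [["sits"], ["writes"]],
}

def generate(symbol: str = "S") -> str:
    """Expand a symbol by a randomly chosen rule until only terminals remain."""
    if symbol not in RULES:          # a terminal word
        return symbol
    expansion = random.choice(RULES[symbol])
    return " ".join(generate(s) for s in expansion)

print(generate())  # e.g. "the careless white philosopher writes"
```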
In general linguistics, logical and rational grammar was supplanted by romanticism at the beginning of the 19th century. One prominent figure who critiqued Enlightenment grammar during the Romantic era was Friedrich Schlegel. In his work Über die Sprache und Weisheit der Indier ('On the Language and Wisdom of the Indians'), Schlegel advocated a more flexible and organic approach to language. He argued that language should be seen as a living and evolving entity, rather than a fixed set of rules. [15]
Another key figure was Novalis (Friedrich von Hardenberg), who expressed the idea that language was a dynamic and creative force and that it should reflect the richness of human experience and emotions. Novalis wrote about the importance of poetic language and the need for language to capture the depths of the soul. However, the most influential figure in linguistic romanticism was Wilhelm von Humboldt, who argued that each language has its own logic, or 'inner form', rather than all languages being based on a universal logic. [16]
Romanticism followed a period when language education became politicized as education became accessible to a larger demographic, and language standardization came under the influence of nationalism. Discussing language and authority from modern and historical viewpoints, James Milroy and Lesley Milroy argue that logical explanations of linguistic phenomena (alongside mathematical, functional, and aesthetic considerations) have no place in descriptive linguistics, whose purpose is to help linguists guide education authorities toward more scientifically grounded policies. According to Milroy and Milroy, more appropriate theories for this purpose include those proposed by Ferdinand de Saussure, Noam Chomsky, and David Crystal. [17] Modern theorists, including Chomsky and George Lakoff, have opposed contemporary efforts to revive logicism in linguistics, especially Montague grammar and formal semantics. [18]
In linguistics, syntax is the study of how words and morphemes combine to form larger units such as phrases and sentences. Central concerns of syntax include word order, grammatical relations, hierarchical sentence structure (constituency), agreement, the nature of crosslinguistic variation, and the relationship between form and meaning (semantics). There are numerous approaches to syntax that differ in their central assumptions and goals.
In grammar, a part of speech or part-of-speech is a category of words that have similar grammatical properties. Words that are assigned to the same part of speech generally display similar syntactic behavior, sometimes similar morphological behavior in that they undergo inflection for similar properties, and even similar semantic behavior. Commonly listed English parts of speech are noun, verb, adjective, adverb, pronoun, preposition, conjunction, interjection, numeral, article, and determiner.
In linguistics, transformational grammar (TG) or transformational-generative grammar (TGG) is part of the theory of generative grammar, especially of natural languages. It considers grammar to be a system of rules that generate exactly those combinations of words that form grammatical sentences in a given language and involves the use of defined operations to produce new sentences from existing ones.
A proposition is a central concept in the philosophy of language, semantics, logic, and related fields, often characterized as the primary bearer of truth or falsity. Propositions are also often characterized as the kind of thing that declarative sentences denote. For instance, the sentence "The sky is blue" denotes the proposition that the sky is blue. Crucially, however, propositions are not themselves linguistic expressions. For instance, the English sentence "Snow is white" denotes the same proposition as the German sentence "Schnee ist weiß" even though the two sentences are not the same. Similarly, propositions can also be characterized as the objects of belief and other propositional attitudes. For instance, if one believes that the sky is blue, what one believes is the proposition that the sky is blue. A proposition can also be thought of as a kind of idea: Collins Dictionary defines a proposition as "a statement or an idea that people can consider or discuss whether it is true."
In linguistics, X-bar theory is a model of phrase-structure grammar and a theory of syntactic category formation that was first proposed by Noam Chomsky in 1970, reformulating the ideas of Zellig Harris (1951), and further developed by Ray Jackendoff, along the lines of the theory of generative grammar put forth in the 1950s by Chomsky. It attempts to capture the structure of phrasal categories with a single uniform structure called the X-bar schema, based on the assumption that any phrase in natural language is an XP headed by a given syntactic category X. It played a significant role in resolving issues that phrase structure rules had, a representative example being the proliferation of grammatical rules, which runs against the thesis of generative grammar.
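Schematically, in one common textbook presentation (details vary between versions of the theory):

```latex
% The X-bar schema: every phrase XP is the maximal projection of a head X
\begin{aligned}
  XP &\rightarrow \text{(Specifier)}\ X' \\
  X' &\rightarrow X'\ \text{(Adjunct)} \\
  X' &\rightarrow X\ \text{(Complement)}
\end{aligned}
```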
In logic and formal semantics, term logic, also known as traditional logic, syllogistic logic or Aristotelian logic, is a loose name for an approach to formal logic that began with Aristotle and was developed further in antiquity, mostly by his followers, the Peripatetics. It was revived after the third century CE by Porphyry's Isagoge.
Montague grammar is an approach to natural language semantics, named after American logician Richard Montague. The Montague grammar is based on mathematical logic, especially higher-order predicate logic and lambda calculus, and makes use of the notions of intensional logic, via Kripke models. Montague pioneered this approach in the 1960s and early 1970s.
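A standard textbook illustration of the approach (not an example from Montague's own papers) composes a quantified sentence with typed lambda terms:

```latex
% "Every man walks" in the Montagovian style
\mathrm{every} \;=\; \lambda P.\,\lambda Q.\,\forall x\,\bigl(P(x) \rightarrow Q(x)\bigr) \\
\mathrm{every}(\mathit{man})(\mathit{walk}) \;=\; \forall x\,\bigl(\mathit{man}(x) \rightarrow \mathit{walk}(x)\bigr)
```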
Categorial grammar is a family of formalisms in natural language syntax that share the central assumption that syntactic constituents combine as functions and arguments. Categorial grammar posits a close relationship between syntax and semantic composition, since it typically treats syntactic categories as corresponding to semantic types. Categorial grammars were developed in the 1930s by Kazimierz Ajdukiewicz and in the 1950s by Yehoshua Bar-Hillel and Joachim Lambek. The approach saw a surge of interest in the 1970s following the work of Richard Montague, whose Montague grammar assumed a similar view of syntax. It continues to be a major paradigm, particularly within formal semantics.
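The function-argument idea can be sketched in a few lines of Python (AB-style application only; the three-word lexicon and the tuple encoding of categories are hypothetical illustrations):

```python
# Categories are either basic strings ("NP", "S") or functor tuples
# (result, slash, argument): X/Y takes a Y on its right, Y\X on its left.
LEXICON = {
    "Plato":    "NP",
    "Socrates": "NP",
    "sits":     ("S", "\\", "NP"),               # NP\S
    "strikes":  (("S", "\\", "NP"), "/", "NP"),  # (NP\S)/NP
}

def combine(left, right):
    """Forward (X/Y + Y -> X) and backward (Y + Y\\X -> X) application."""
    if isinstance(left, tuple) and left[1] == "/" and left[2] == right:
        return left[0]
    if isinstance(right, tuple) and right[1] == "\\" and right[2] == left:
        return right[0]
    return None  # the constituents do not combine

# "Plato strikes Socrates": the verb consumes its object, and the
# resulting predicate (NP\S) consumes the subject, yielding a sentence.
vp = combine(LEXICON["strikes"], LEXICON["Socrates"])  # ("S", "\\", "NP")
s = combine(LEXICON["Plato"], vp)                      # "S"
print(s)
```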
The term phrase structure grammar was originally introduced by Noam Chomsky as the term for grammars studied previously by Emil Post and Axel Thue. Some authors, however, reserve the term for more restricted grammars in the Chomsky hierarchy: context-sensitive grammars or context-free grammars. In a broader sense, phrase structure grammars are also known as constituency grammars. The defining trait of phrase structure grammars is thus their adherence to the constituency relation, as opposed to the dependency relation of dependency grammars.
The term predicate is used in two ways in linguistics and its subfields. The first defines a predicate as everything in a standard declarative sentence except the subject, and the other defines it as only the main content verb or associated predicative expression of a clause. Thus, by the first definition, the predicate of the sentence Frank likes cake is likes cake, while by the second definition, it is only the content verb likes, and Frank and cake are the arguments of this predicate. The conflict between these two definitions can lead to confusion.
In logic, the logical form of a statement is a precisely-specified semantic version of that statement in a formal system. Informally, the logical form attempts to formalize a possibly ambiguous statement into a statement with a precise, unambiguous logical interpretation with respect to a formal system. In an ideal formal language, the meaning of a logical form can be determined unambiguously from syntax alone. Logical forms are semantic, not syntactic constructs; therefore, there may be more than one string that represents the same logical form in a given language.
In generative grammar and related approaches, the logical form (LF) of a linguistic expression is the variant of its syntactic structure which undergoes semantic interpretation. It is distinguished from phonetic form, the structure which corresponds to a sentence's pronunciation. These separate representations are postulated in order to explain the ways in which an expression's meaning can be partially independent of its pronunciation, e.g. scope ambiguities.
In linguistics, an argument is an expression that helps complete the meaning of a predicate, the latter referring in this context to a main verb and its auxiliaries. In this regard, the complement is a closely related concept. Most predicates take one, two, or three arguments. A predicate and its arguments form a predicate-argument structure. The discussion of predicates and arguments is associated most with (content) verbs and noun phrases (NPs), although other syntactic categories can also be construed as predicates and as arguments. Arguments must be distinguished from adjuncts. While a predicate needs its arguments to complete its meaning, the adjuncts that appear with a predicate are optional; they are not necessary to complete the meaning of the predicate. Most theories of syntax and semantics acknowledge arguments and adjuncts, although the terminology varies, and the distinction is generally believed to exist in all languages. Dependency grammars sometimes call arguments actants, following Lucien Tesnière (1959).
Logic is the formal science of using reason and is considered a branch of both philosophy and mathematics, and to a lesser extent computer science. Logic investigates and classifies the structure of statements and arguments, both through the study of formal systems of inference and the study of arguments in natural language. The scope of logic can therefore be very large, ranging from core topics such as the study of fallacies and paradoxes to specialized analyses of reasoning such as probability, correct reasoning, and arguments involving causality. One of the aims of logic is to identify correct and incorrect inferences. Logicians study the criteria for the evaluation of arguments.
In analytic philosophy, philosophy of language investigates the nature of language and the relations between language, language users, and the world. Investigations may include inquiry into the nature of meaning, intentionality, reference, the constitution of sentences, concepts, learning, and thought.
The Port-Royal Grammar was a milestone in the analysis and philosophy of language. Published in 1660 by Antoine Arnauld and Claude Lancelot, it was the linguistic counterpart to the Port-Royal Logic (1662), both named after the Jansenist monastery of Port-Royal-des-Champs where their authors worked. The Port-Royal Grammar became used as a standard textbook in the study of language until the early nineteenth century, and it has been reproduced in several editions and translations. In the twentieth century, scholars including Edmund Husserl and Noam Chomsky maintained academic interest in the book.
Rasmus Viggo Brøndal was a Danish philologist and professor of Romance languages and literature at Copenhagen University.
Formal semantics is the study of grammatical meaning in natural languages using formal tools from logic, mathematics and theoretical computer science. It is an interdisciplinary field, sometimes regarded as a subfield of both linguistics and philosophy of language. It provides accounts of what linguistic expressions mean and how their meanings are composed from the meanings of their parts. The enterprise of formal semantics can be thought of as that of reverse-engineering the semantic components of natural languages' grammars.
In linguistics, the term formalism is used in a variety of senses that relate to formal linguistics in different ways. In common usage, it is merely synonymous with a grammatical model or a syntactic model: a method for analyzing sentence structures. Such formalisms include the various methodologies of generative grammar, which are especially designed to produce grammatically correct strings of words, as well as frameworks such as Functional Discourse Grammar, which builds on predicate logic.