Alphabet of human thought


The alphabet of human thought (Latin: alphabetum cogitationum humanarum) is a concept originally proposed by Gottfried Wilhelm Leibniz that provides a universal way to represent and analyze ideas and relationships by breaking them down into their component pieces. [1] On this view, all ideas are compounded from a very small number of simple ideas, each of which can be represented by a unique character. [2] [3]


Overview

Logic was Leibniz's earliest philosophic interest, going back to his teens. René Descartes had suggested that the lexicon of a universal language should consist of primitive elements. [4] The systematic combination of these elements, according to syntactical rules, would generate the infinite combinations of computational structures required to represent human language. In this way Descartes and Leibniz were precursors to computational linguistics as defined by Noam Chomsky. [5]

In the early 18th century, Leibniz outlined his characteristica universalis, an artificial language in which grammatical and logical structure would coincide, allowing reasoning to be reduced to calculation. Leibniz acknowledged the work of Ramon Llull, particularly the Ars generalis ultima (1305), as one of the inspirations for this idea. The basic elements of his characteristica would be pictographic characters unambiguously representing a limited number of elementary concepts. Leibniz called the inventory of these concepts "the alphabet of human thought." There are quite a few mentions of the characteristica in Leibniz's writings, but he never set out any details save for a brief outline of some possible sentences in his Dissertation on the Art of Combinations.

His main interest was what is known in modern logic as classification and composition. In modern terminology, Leibniz's alphabet was a proposal for an automated theorem prover or ontology classification reasoner written centuries before the technology to implement them. [6]
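Leibniz experimented with one concrete version of this calculus in 1679: assign "characteristic numbers" to concepts, with primitive concepts receiving primes and composite concepts the product of their primitives, so that the proposition "every A is B" reduces to a divisibility test. A minimal sketch of that scheme (the particular concept-to-number assignments here are illustrative, not Leibniz's own):

```python
# Sketch of Leibniz's "characteristic numbers" (1679). Primitive concepts
# get distinct primes; a composite concept is the product of its primitives.
# "Every A is B" holds exactly when B's number divides A's number.

PRIMITIVES = {"animal": 2, "rational": 3, "mortal": 5}  # illustrative mapping

def characteristic(*concepts):
    """Characteristic number of a concept composed of the given primitives."""
    n = 1
    for c in concepts:
        n *= PRIMITIVES[c]
    return n

def entails(subject, predicate):
    """True if the predicate concept is contained in the subject concept."""
    return subject % predicate == 0

human = characteristic("animal", "rational")  # 2 * 3 = 6
animal = characteristic("animal")             # 2

print(entails(human, animal))  # every human is an animal -> True
print(entails(animal, human))  # every animal is a human  -> False
```

The divisibility test works because multiplication of primes mirrors the composition of concepts: a predicate is "contained in" a subject exactly when all of its prime factors are. Leibniz abandoned this particular arithmetic encoding, but it anticipates the classification reasoning mentioned above.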

Semantic web implementation

John Giannandrea, co-founder and CTO of Metaweb Technologies, acknowledged in a 2008 speech that Freebase was at least linked to the alphabet of human thought, if not an implementation of it. [7]


Related Research Articles

Gottfried Wilhelm Leibniz: German mathematician and philosopher (1646–1716)

Gottfried Wilhelm Leibniz was a German polymath active as a mathematician, philosopher, scientist and diplomat who invented calculus independently of Isaac Newton, in addition to contributing to many other branches of mathematics and statistics. Leibniz has been called the "last universal genius" due to his knowledge and skills in different fields and because such people became less common after his lifetime, with the Industrial Revolution and the spread of specialized labor. He is a prominent figure in both the history of philosophy and the history of mathematics. He wrote works on philosophy, theology, ethics, politics, law, history, philology, games, music, and other studies. Leibniz also made major contributions to physics and technology, and anticipated notions that surfaced much later in probability theory, biology, medicine, geology, psychology, linguistics and computer science. In addition, he contributed to the field of library science: while working at the Herzog August Library in Wolfenbüttel, Germany, he devised a cataloguing system that would have served as a guide for many of Europe's largest libraries. Leibniz's contributions to a wide range of subjects were scattered in various learned journals, in tens of thousands of letters and in unpublished manuscripts. He wrote in several languages, primarily in Latin, French and German.

The outline of linguistics is an overview of and topical guide to the field of linguistics.

Ontology: Philosophical study of being and existence

In metaphysics, ontology is the philosophical study of being. It investigates what types of entities exist, how they are grouped into categories, and how they are related to one another on the most fundamental level. Ontologists often try to determine what the categories or highest kinds are and how they form a system of categories that encompasses the classification of all entities. Commonly proposed categories include substances, properties, relations, states of affairs, and events. These categories are characterized by fundamental ontological concepts, including particularity and universality, abstractness and concreteness, or possibility and necessity. Of special interest is the concept of ontological dependence, which determines whether the entities of a category exist on the most fundamental level. Disagreements within ontology are often about whether entities belonging to a certain category exist and, if so, how they are related to other entities.

In philosophy, rationalism is the epistemological view that "regards reason as the chief source and test of knowledge" or "any view appealing to reason as a source of knowledge or justification", often in contrast to other possible sources of knowledge such as faith, tradition, or sensory experience. More formally, rationalism is defined as a methodology or a theory "in which the criterion of truth is not sensory but intellectual and deductive".

Mathesis universalis: Philosophy that mathematics can be used to define all aspects of the universe

Mathesis universalis is a hypothetical universal science modelled on mathematics envisaged by Descartes and Leibniz, among a number of other 16th- and 17th-century philosophers and mathematicians. For Leibniz, it would be supported by a calculus ratiocinator. John Wallis invokes the name as title in his Opera Mathematica, a textbook on arithmetic, algebra, and Cartesian geometry.

Universal language may refer to a hypothetical or historical language spoken and understood by all or most of the world's people. In some contexts, it refers to a means of communication said to be understood by all humans. It may be the idea of an international auxiliary language for communication between groups speaking different primary languages. In other conceptions, it may be the primary language of all speakers, or the only existing language. Some religious and mythological traditions state that there was once a single universal language among all people, or shared by humans and supernatural beings.

Universal science is a branch of metaphysics. In the work of Gottfried Wilhelm Leibniz, the universal science is the true logic. The idea of establishing a universal science originated in the seventeenth century with the philosophers Francis Bacon and René Descartes. Bacon and Descartes conceived of universal science as a unified approach to collecting scientific information, similar to encyclopedias of universal knowledge, but were unsuccessful. Leibniz extended their ideas, using logic as an "index" to order universal scientific and mathematical information as an operational system with a universal language. Plato's idealism, formulated from the teachings of Socrates, is a predecessor of the concept of universal science and influenced Leibniz's views against materialism and in favor of logic. It emphasizes first principles, which appear to underlie the reasoning behind everything. This mode of reasoning had a supporting influence on later scientists such as Boole, Frege, Cantor, Hilbert, Gödel, and Turing, who shared a similar vision of a future in which universal computing would change everything.

In the philosophy of mind, innatism is the view that the mind is born with already-formed ideas, knowledge, and beliefs. The opposing doctrine, that the mind is a tabula rasa at birth and all knowledge is gained from experience and the senses, is called empiricism.

aUI is a philosophical, a priori language created in the 1950s by W. John Weilgart, a philosopher and psychoanalyst originally from Vienna, Austria. He described it as "the Language of Space", connoting universal communication. The fourth edition of the textbook was published in 1979; a philosophic description of each semantic element of the language was published in 1975.

The cognitive revolution was an intellectual movement that began in the 1950s as an interdisciplinary study of the mind and its processes, from which emerged a new field known as cognitive science. The preexisting relevant fields were psychology, linguistics, computer science, anthropology, neuroscience, and philosophy. The approaches used were developed within the then-nascent fields of artificial intelligence, computer science, and neuroscience. In the 1960s, the Harvard Center for Cognitive Studies and the Center for Human Information Processing at the University of California, San Diego were influential in developing the academic study of cognitive science. By the early 1970s, the cognitive movement had surpassed behaviorism as a psychological paradigm. Furthermore, by the early 1980s the cognitive approach had become the dominant line of research inquiry across most branches in the field of psychology.

De Arte Combinatoria

The Dissertatio de arte combinatoria is an early work by Gottfried Leibniz published in 1666 in Leipzig. It is an extended version of his first doctoral dissertation, written before the author had seriously undertaken the study of mathematics. The booklet was reissued without Leibniz's consent in 1690, which prompted him to publish a brief explanatory notice in the Acta Eruditorum. During the following years he repeatedly expressed regret about its being circulated, as he considered it immature. Nevertheless, it was a very original work and it gave the author his first glimpse of fame among the scholars of his time.

The Latin term characteristica universalis, commonly rendered as universal characteristic or universal character in English, is a universal and formal language imagined by Gottfried Leibniz that would be able to express mathematical, scientific, and metaphysical concepts. Leibniz thus hoped to create a language usable within the framework of a universal logical calculation or calculus ratiocinator.

Cartesian linguistics

The term Cartesian linguistics was coined by Noam Chomsky in his book Cartesian Linguistics: A Chapter in the History of Rationalist Thought (1966). The adjective "Cartesian" pertains to René Descartes, a prominent 17th-century philosopher. As well as Descartes, Chomsky surveys other examples of rationalist thought in 17th-century linguistics, in particular the Port-Royal Grammar (1660), which foreshadows some of his own ideas concerning universal grammar.

The Port-Royal Grammar was a milestone in the analysis and philosophy of language. Published in 1660 by Antoine Arnauld and Claude Lancelot, it was the linguistic counterpart to the Port-Royal Logic (1662), both named after the Jansenist monastery of Port-Royal-des-Champs where their authors worked. The Port-Royal Grammar became used as a standard textbook in the study of language until the early nineteenth century, and it has been reproduced in several editions and translations. In the twentieth century, scholars including Edmund Husserl and Noam Chomsky maintained academic interest in the book.

Diagrammatic reasoning

Diagrammatic reasoning is reasoning by means of visual representations. The study of diagrammatic reasoning is about the understanding of concepts and ideas, visualized with the use of diagrams and imagery instead of by linguistic or algebraic means.

Symbol (formal): Token in a mathematical or logical formula

A logical symbol is a fundamental concept in logic; its tokens may be marks, or configurations of marks, that form a particular pattern. In common use, the term "symbol" refers sometimes to the idea being symbolized and at other times to the marks on a piece of paper or chalkboard used to express that idea. In the formal languages studied in mathematics and logic, however, the term "symbol" refers to the idea, and the marks are considered token instances of the symbol.

This is an index of Wikipedia articles in philosophy of language.

Digital infinity: Term in theoretical linguistics

Digital infinity is a technical term in theoretical linguistics. Alternative formulations are "discrete infinity" and "the infinite use of finite means". The idea is that all human languages follow a simple logical principle, according to which a limited set of digits—irreducible atomic sound elements—are combined to produce an infinite range of potentially meaningful expressions.

Formalism (linguistics): Concept in linguistics

In linguistics, the term formalism is used in a variety of meanings which relate to formal linguistics in different ways. In common usage, it is merely synonymous with a grammatical model or a syntactic model: a method for analyzing sentence structures. Such formalisms include various methodologies of generative grammar, which are especially designed to produce grammatically correct strings of words, and the likes of Functional Discourse Grammar, which builds on predicate logic.

Mathematicism is "the effort to employ the formal structure and rigorous method of mathematics as a model for the conduct of philosophy", or else the epistemological view that reality is fundamentally mathematical. The term has been applied to a number of philosophers, including Pythagoras and René Descartes, although they did not use the term themselves.

References

  1. Leibniz, De alphabeto cogitationum humanarum (April 1679 to April 1681?), Akademie VI.4, p. 270.
  2. Geiger, Richard A.; Rudzka-Ostyn, Brygida, eds. (1993). Conceptualizations and mental processing in language. International Cognitive Linguistics Conference (1: 1989: Duisburg). Walter de Gruyter. pp. 25–26. ISBN 978-3-11-012714-0.
  3. Bunnin, Nicholas; Jiyuan Yu (2004). The Blackwell Dictionary of Western Philosophy. Blackwell Publishing. p. 715. ISBN 978-1-4051-0679-5.
  4. Hatfield, Gary (3 December 2008). "René Descartes, The Stanford Encyclopedia of Philosophy (Summer 2014 Edition)". plato.stanford.edu. Stanford University. Retrieved 12 July 2014. "he offered a new vision of the natural world that continues to shape our thought today: a world of matter possessing a few fundamental properties and interacting according to a few universal laws."
  5. Chomsky, Noam (13 April 2000). New Horizons in the Study of Language and Mind (Kindle ed.). Cambridge University Press. pp. 425–428. ISBN 0521658225. "I mentioned that modern generative grammar has sought to address concerns that animated the tradition; in particular, the Cartesian idea that 'the true distinction' (Descartes 1649/1927: 360) between humans and other creatures or machines is the ability to act in the manner they took to be most clearly illustrated in the ordinary use of language: without any finite limits, influenced but not determined by internal state, appropriate to situations but not caused by them, coherent and evoking thoughts that the hearer might have expressed, and so on. The goal of the work I have been discussing is to unearth some of the factors that enter into such normal practice."
  6. Russell, L.J. (1985). "Leibniz, Gottfried Wilhelm". In Paul Edwards (ed.). The Encyclopedia of Philosophy, Volumes 3 and 4. Macmillan Publishing. pp. 422–423. ASIN B0017IMQME. "his main emphasis... was on classification, deduction was a natural consequence of combining classified items into new classes."
  7. "PARCForum Presentation by Giannandrea, J." YouTube. min 37+. Retrieved 2015-10-30.