Alphabet of human thought

The alphabet of human thought (Latin: alphabetum cogitationum humanarum) is a concept originally proposed by Gottfried Wilhelm Leibniz that provides a universal way to represent and analyze ideas and relationships by breaking them down into their constituent parts. [1] On this view, all ideas are compounded from a very small number of simple ideas, each of which can be represented by a unique character. [2] [3]

Overview

Logic was Leibniz's earliest philosophical interest, going back to his teens. René Descartes had suggested that the lexicon of a universal language should consist of primitive elements. [4] The systematic combination of these elements, according to syntactical rules, would generate the infinite range of structures required to represent human language. In this way Descartes and Leibniz were precursors to computational linguistics as defined by Noam Chomsky. [5]

In the early 18th century, Leibniz outlined his characteristica universalis, an artificial language in which grammatical and logical structure would coincide, allowing reasoning to be reduced to calculation. Leibniz acknowledged the work of Ramon Llull, particularly the Ars generalis ultima (1305), as one of the inspirations for this idea. The basic elements of his characteristica would be pictographic characters unambiguously representing a limited number of elementary concepts. Leibniz called the inventory of these concepts "the alphabet of human thought." There are quite a few mentions of the characteristica in Leibniz's writings, but he never set out any details save for a brief outline of some possible sentences in his Dissertation on the Art of Combinations.

His main interest was in what is known in modern logic as classification and composition. In modern terminology, Leibniz's alphabet was a proposal for an automated theorem prover or ontology classification reasoner, conceived centuries before the technology existed to implement either. [6]
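
As an illustration only (the concept names and numbers below are invented for this sketch, not taken from Leibniz), the following Python sketch follows the "characteristic number" scheme Leibniz later experimented with in related writings: each primitive concept is assigned a distinct prime, a compound concept is assigned the product of the numbers of its parts, and the categorical claim "every A is a B" is checked by asking whether B's number divides A's.

```python
# Toy classification reasoner in the spirit of Leibniz's characteristic
# numbers. Primitive concepts get distinct primes; a compound concept gets
# the product of the numbers of its parts; "every A is a B" holds exactly
# when B's number divides A's. All concept names here are illustrative.

PRIMITIVES = {"animal": 2, "rational": 3, "winged": 5}

def characteristic(parts):
    """Characteristic number of a concept compounded from primitive parts."""
    n = 1
    for part in parts:
        n *= PRIMITIVES[part]
    return n

CONCEPTS = {
    "animal": characteristic(["animal"]),              # 2
    "human":  characteristic(["animal", "rational"]),  # 2 * 3 = 6
    "bird":   characteristic(["animal", "winged"]),    # 2 * 5 = 10
}

def is_a(subject, predicate):
    """'Every subject is a predicate' iff the predicate's number divides the subject's."""
    return CONCEPTS[subject] % CONCEPTS[predicate] == 0

print(is_a("human", "animal"))   # True: 2 divides 6
print(is_a("animal", "human"))   # False: 6 does not divide 2
```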

Semantic web implementation

John Giannandrea, co-founder and CTO of Metaweb Technologies, acknowledged in a 2008 speech that Freebase was at least linked to the alphabet of human thought, if not an implementation of it. [7]

Related Research Articles

Formal language: Sequence of words formed by specific rules

In logic, mathematics, computer science, and linguistics, a formal language consists of words whose letters are taken from an alphabet and are well-formed according to a specific set of rules called a formal grammar.
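
As a concrete illustration (the grammar below is a made-up example, not one mentioned in this article), the sketch takes the alphabet {a, b} and the grammar S → aSb | ε, whose language consists of n a's followed by n b's, and decides whether a given string is a well-formed word of that language.

```python
# A tiny formal language over the alphabet {'a', 'b'}: the words generated
# by the grammar S -> 'a' S 'b' | '' (n a's followed by n b's). The grammar
# is a hypothetical example chosen only to illustrate well-formedness with
# respect to a formal grammar.

def well_formed(word: str) -> bool:
    """Return True if `word` is derivable from S -> 'a' S 'b' | ''."""
    if word == "":
        return True
    if len(word) >= 2 and word[0] == "a" and word[-1] == "b":
        return well_formed(word[1:-1])
    return False

for w in ["", "ab", "aabb", "aba", "abb"]:
    print(repr(w), well_formed(w))
# '' True, 'ab' True, 'aabb' True, 'aba' False, 'abb' False
```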

Gottfried Wilhelm Leibniz: German mathematician and philosopher (1646–1716)

Gottfried Wilhelm Leibniz or Leibnitz was a German polymath active as a mathematician, philosopher, scientist and diplomat who is credited, alongside Sir Isaac Newton, with the invention of calculus in addition to many other branches of mathematics, such as binary arithmetic and statistics. Leibniz has been called the "last universal genius" due to his knowledge and skills in different fields, and because such people became much less common after his lifetime with the coming of the Industrial Revolution and the spread of specialized labor. He is a prominent figure in both the history of philosophy and the history of mathematics. He wrote works on philosophy, theology, ethics, politics, law, history, philology, games, music, and other studies. Leibniz also made major contributions to physics and technology, and anticipated notions that surfaced much later in probability theory, biology, medicine, geology, psychology, linguistics and computer science.

The following outline is provided as an overview of and topical guide to linguistics.

In philosophy, rationalism is the epistemological view that "regards reason as the chief source and test of knowledge" or "any view appealing to reason as a source of knowledge or justification", often in contrast to other possible sources of knowledge such as faith, tradition, or sensory experience. More formally, rationalism is defined as a methodology or a theory "in which the criterion of truth is not sensory but intellectual and deductive".

Mathesis universalis: Philosophy that mathematics can be used to define all aspects of the universe

Mathesis universalis is a hypothetical universal science modelled on mathematics, envisaged by Descartes and Leibniz among a number of other 16th- and 17th-century philosophers and mathematicians. For Leibniz, it would be supported by a calculus ratiocinator. John Wallis used the name as a title in his Opera Mathematica, a textbook on arithmetic, algebra, and Cartesian geometry.

Universal language may refer to a hypothetical or historical language spoken and understood by all or most of the world's people. In some contexts, it refers to a means of communication said to be understood by all humans. It may be the idea of an international auxiliary language for communication between groups speaking different primary languages. A similar concept can be found in pidgin language, which is actually used to facilitate understanding between two or more people with no common language. In other conceptions, it may be the primary language of all speakers, or the only existing language. Some religious and mythological traditions state that there was once a single universal language among all people, or shared by humans and supernatural beings.

Universal science is a branch of metaphysics, dedicated to the study of the underlying principles of all science. Instead of viewing knowledge as being separated into branches, Universalists view all knowledge as being part of a single category. Universal science is related to, but distinct from, universal language.

In the philosophy of mind, innatism is the view that the mind is born with already-formed ideas, knowledge, and beliefs. The opposing doctrine, that the mind is a tabula rasa at birth and all knowledge is gained from experience and the senses, is called empiricism.

aUI is a philosophical, a priori language created in the 1950s by W. John Weilgart, Ph.D., a philosopher and psychoanalyst originally from Vienna, Austria. He described it as "the Language of Space", connoting universal communication, and published the fourth edition of the textbook in 1979; a philosophic description of each semantic element of the language was published in 1975.

A philosophical language is any constructed language designed from first principles, sometimes following a classification. It is considered a type of engineered language. Philosophical languages were popular in Early Modern times, partly motivated by the goal of revising normal language for philosophical purposes. The term ideal language is sometimes used near-synonymously, though more modern philosophical languages such as Toki Pona are less likely to involve such an exalted claim of perfection. Their axioms and grammars differ from those of commonly spoken languages.

The cognitive revolution was an intellectual movement that began in the 1950s as an interdisciplinary study of the mind and its processes, from which emerged a new field known as cognitive science. The preexisting relevant fields were psychology, linguistics, computer science, anthropology, neuroscience, and philosophy. The approaches used were developed within the then-nascent fields of artificial intelligence, computer science, and neuroscience. In the 1960s, the Harvard Center for Cognitive Studies and the Center for Human Information Processing at the University of California, San Diego were influential in developing the academic study of cognitive science. By the early 1970s, the cognitive movement had surpassed behaviorism as a psychological paradigm. Furthermore, by the early 1980s the cognitive approach had become the dominant line of research inquiry across most branches in the field of psychology.

De Arte Combinatoria

The Dissertatio de arte combinatoria is an early work by Gottfried Leibniz published in 1666 in Leipzig. It is an extended version of his first doctoral dissertation, written before the author had seriously undertaken the study of mathematics. The booklet was reissued without Leibniz's consent in 1690, which prompted him to publish a brief explanatory notice in the Acta Eruditorum. During the following years he repeatedly expressed regret that it was being circulated, as he considered it immature. Nevertheless, it was a highly original work, and it gave the author his first glimpse of fame among the scholars of his time.

The Latin term characteristica universalis, commonly interpreted as universal characteristic or universal character in English, refers to a universal and formal language imagined by Gottfried Leibniz that would be able to express mathematical, scientific, and metaphysical concepts. Leibniz thus hoped to create a language usable within the framework of a universal logical calculation, or calculus ratiocinator.

Cartesian linguistics

The term Cartesian linguistics was coined by Noam Chomsky in his book Cartesian Linguistics: A Chapter in the History of Rationalist Thought (1966). The adjective "Cartesian" pertains to René Descartes, a prominent 17th-century philosopher. As well as Descartes, Chomsky surveys other examples of rationalist thought in 17th-century linguistics, in particular the Port-Royal Grammar (1660), which foreshadows some of his own ideas concerning universal grammar.

The Port-Royal Grammar was a milestone in the analysis and philosophy of language. Published in 1660 by Antoine Arnauld and Claude Lancelot, it was the linguistic counterpart to the Port-Royal Logic (1662), both named after the Jansenist monastery of Port-Royal-des-Champs where their authors worked. The Port-Royal Grammar became used as a standard textbook in the study of language until the early nineteenth century, and it has been reproduced in several editions and translations. In the twentieth century, scholars including Edmund Husserl and Noam Chomsky maintained academic interest in the book.

Diagrammatic reasoning

Diagrammatic reasoning is reasoning by means of visual representations. The study of diagrammatic reasoning is about the understanding of concepts and ideas, visualized with the use of diagrams and imagery instead of by linguistic or algebraic means.

Symbol (formal): Token in a mathematical or logical formula

A logical symbol is a fundamental concept in logic, tokens of which may be marks or a configuration of marks that form a particular pattern. In common use the term "symbol" sometimes refers to the idea being symbolized and at other times to the marks on a piece of paper or chalkboard used to express that idea; in the formal languages studied in mathematics and logic, however, the term "symbol" refers to the idea, and the marks are considered to be a token instance of the symbol. In logic, symbols are the literal building blocks used to express ideas.

This is an index of Wikipedia articles in the philosophy of language.

Digital infinity: Term in theoretical linguistics

Digital infinity is a technical term in theoretical linguistics. Alternative formulations are "discrete infinity" and "the infinite use of finite means". The idea is that all human languages follow a simple logical principle, according to which a limited set of digits—irreducible atomic sound elements—are combined to produce an infinite range of potentially meaningful expressions.
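
A minimal sketch of the idea (the atoms and the bracketing rule below are invented placeholders, not actual linguistic data): a finite inventory of elements plus a single recursive rule of combination already yields an unbounded supply of distinct expressions.

```python
# "Infinite use of finite means": three atoms and one recursive combination
# rule generate an unbounded set of distinct expressions. The atoms and the
# bracketing rule are illustrative placeholders only.

ATOMS = ["da", "ki", "mu"]  # a deliberately tiny, finite inventory

def expressions():
    """Yield ever longer expressions built by recursively pairing smaller ones."""
    level = list(ATOMS)
    while True:
        yield from level
        # combine every expression from the previous level with every atom
        level = [f"({expr} {atom})" for expr in level for atom in ATOMS]

gen = expressions()
for _ in range(9):
    print(next(gen))
# da, ki, mu, (da da), (da ki), (da mu), (ki da), (ki ki), (ki mu), ...
```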

Mathematicism is 'the effort to employ the formal structure and rigorous method of mathematics as a model for the conduct of philosophy', or the epistemological view that reality is fundamentally mathematical. The term has been applied to a number of philosophers, including Pythagoras and René Descartes, although they did not use the term themselves.

References

  1. Leibniz, De alphabeto cogitationum humanarum (April 1679 to April 1681 (?)), Akademie VI.4, p. 270.
  2. Geiger, Richard A.; Rudzka-Ostyn, Brygida, eds. (1993). Conceptualizations and Mental Processing in Language. International Cognitive Linguistics Conference (1: 1989: Duisburg). Walter de Gruyter. pp. 25–26. ISBN 978-3-11-012714-0.
  3. Bunnin, Nicholas; Yu, Jiyuan (2004). The Blackwell Dictionary of Western Philosophy. Blackwell Publishing. p. 715. ISBN 978-1-4051-0679-5.
  4. Hatfield, Gary (3 December 2008). "René Descartes". The Stanford Encyclopedia of Philosophy (Summer 2014 Edition). plato.stanford.edu. Stanford University. Retrieved 12 July 2014. "he offered a new vision of the natural world that continues to shape our thought today: a world of matter possessing a few fundamental properties and interacting according to a few universal laws."
  5. Chomsky, Noam (13 April 2000). New Horizons in the Study of Language and Mind (Kindle ed.). Cambridge University Press. pp. 425–428. ISBN 0521658225. "I mentioned that modern generative grammar has sought to address concerns that animated the tradition; in particular, the Cartesian idea that "the true distinction" (Descartes 1649/1927: 360) between humans and other creatures or machines is the ability to act in the manner they took to be most clearly illustrated in the ordinary use of language: without any finite limits, influenced but not determined by internal state, appropriate to situations but not caused by them, coherent and evoking thoughts that the hearer might have expressed, and so on. The goal of the work I have been discussing is to unearth some of the factors that enter into such normal practice."
  6. Russell, L. J. (1985). "Leibniz, Gottfried Wilhelm". In Paul Edwards (ed.). The Encyclopedia of Philosophy, Volumes 3 and 4. Macmillan Publishing. pp. 422–423. ASIN B0017IMQME. "his main emphasis... was on classification, deduction was a natural consequence of combining classified items into new classes."
  7. "PARCForum Presentation by Giannandrea, J." YouTube. min. 37+. Retrieved 2015-10-30.