Pragmatic theory of information

The pragmatic theory of information is derived from Charles Sanders Peirce's general theory of signs and inquiry. Peirce explored a number of ideas about information throughout his career. One set of ideas concerns the "laws of information", having to do with the logical properties of information. Another set of ideas about "time and thought" concerns the dynamic properties of inquiry. All of these ideas contribute to the pragmatic theory of inquiry. Peirce set forth many of them very early in his career, returning to them periodically until the end, and they appear to be implicit in much of his later work on the logic of science and the theory of signs, but he never developed their implications to the fullest extent. The 20th-century thinker Ernst Ulrich von Weizsäcker and his wife Christine von Weizsäcker reviewed the pragmatics of information; [1] their work is reviewed by Gernert. [2]

Overview

The pragmatic information content is the information content as received by a recipient; it focuses on the recipient and is defined in contrast to Claude Shannon's definition of information, which focuses on the message. Pragmatic information measures the information received, not the information contained in the message. Pragmatic information theory therefore requires not only a model of the sender and how it encodes information, but also a model of the receiver and how it acts on the information received. Determining the pragmatic information content is a precondition for determining the value of information.
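The contrast between the message-focused and the recipient-focused view can be sketched in a few lines of Python. This is an illustrative toy model, not a definition from the literature: it assumes a message is a list of tokens, measures its Shannon information from empirical token frequencies, and models the recipient simply as a set of tokens it already knows.

```python
import math
from collections import Counter

def shannon_bits(tokens):
    """Shannon information of a token sequence, estimated from the
    empirical symbol frequencies within the sequence (message-focused)."""
    counts = Counter(tokens)
    total = len(tokens)
    return -sum(c * math.log2(c / total) for c in counts.values())

def received_bits(tokens, known):
    """Toy recipient-focused measure: tokens the receiver already
    knows carry no new information for it and are dropped first."""
    novel = [t for t in tokens if t not in known]
    return shannon_bits(novel) if novel else 0.0

message = ["turn", "left", "turn", "left", "turn", "right"]
print(shannon_bits(message))             # content of the message itself
print(received_bits(message, {"turn"}))  # content for a receiver who knows "turn"
```

The same message thus yields a lower value for a receiver with more prior knowledge, which is the point of the recipient-focused view.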

Claude Shannon's seminal paper A Mathematical Theory of Communication [3] established the technical viewpoint on information encoding (level A); Warren Weaver, in the book version co-published with Shannon, identified two additional viewpoints: the semantic level (B) and the effectiveness level (C), the latter concerning the effect of a message on the recipient's conduct. [4]

The pragmatics of communication is the observable effect a communication act (here, receiving a message) has on the actions of the recipient. The pragmatic information content of a message may differ between recipients, and different messages may have the same content for one recipient. Weizsäcker used the concepts of novelty and irrelevance to separate information that is pragmatically useful from information that is not. [1] Algebraically, the pragmatic information content must satisfy three rules, referred to below as EQ, SAME, and DIFF.

More recently, Weinberger formulated a quantitative theory of pragmatic information. In contrast to standard information theory, which says nothing about the semantic content of information, Weinberger's theory attempts to measure the amount of information actually used in making a decision. Weinberger's paper includes a demonstration that his version of pragmatic information increases over the course of time in a simple model of evolution known as the quasispecies model; this is demonstrably not true for the standard measure of information. [5]
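One way to read a decision-oriented measure of this kind is as the expected information gain in the distribution over outcomes when a message is received, which equals the mutual information between messages and outcomes. The sketch below computes that quantity; it is a hedged illustration of the general idea, not Weinberger's exact formalism, and the message and outcome names are invented.

```python
import math

def pragmatic_information(p_m, p_o_given_m):
    """Expected information gain over outcomes on receiving a message:
    I = sum_m p(m) * sum_o p(o|m) * log2(p(o|m) / p(o)),
    where p(o) = sum_m p(m) * p(o|m).  This is the mutual
    information between messages and outcomes, in bits."""
    outcomes = range(len(next(iter(p_o_given_m.values()))))
    p_o = [sum(p_m[m] * p_o_given_m[m][o] for m in p_m) for o in outcomes]
    total = 0.0
    for m, pm in p_m.items():
        for o in outcomes:
            pom = p_o_given_m[m][o]
            if pom > 0:
                total += pm * pom * math.log2(pom / p_o[o])
    return total

# Two equally likely messages that fully determine a binary decision:
p_m = {"go_left": 0.5, "go_right": 0.5}
p_o_given_m = {"go_left": [1.0, 0.0], "go_right": [0.0, 1.0]}
print(pragmatic_information(p_m, p_o_given_m))  # 1.0 bit: the message decides
```

A message that leaves the decision distribution unchanged scores zero, capturing the idea that information which does not influence the decision has no pragmatic content.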

The acquisition of information and its use in decision making can be separated. Using acquired information to make a decision is, in the general case, an optimization in an uncertain situation (which is included in Weinberger's theory). For deterministic, rule-based decisions, the agent can be formalized as an algebra with a set of operations and the state changes produced when these operations are executed (no optimization applied). The pragmatic information such an agent picks up from a message is the transformation of the tokens in the message into operations the recipient is capable of executing.
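Such an agent-as-algebra can be sketched directly in code. The following is a minimal, hypothetical model (the operation names and state are invented): the agent holds a mapping from tokens to state-transforming operations, and only tokens it can map to an operation have any effect on it.

```python
# A deterministic rule-based agent modeled as an algebra of operations.
# Tokens the agent can map to an operation change its state; tokens
# outside its repertoire carry no pragmatic information for it.
class Agent:
    def __init__(self, operations):
        self.operations = operations   # token -> state-transforming function
        self.state = {"position": 0}

    def receive(self, message):
        """Execute the operations named in the message; return the
        tokens that were actually turned into operations."""
        executed = []
        for token in message:
            op = self.operations.get(token)
            if op is not None:
                op(self.state)
                executed.append(token)
        return executed

agent = Agent({
    "forward": lambda s: s.update(position=s["position"] + 1),
    "back":    lambda s: s.update(position=s["position"] - 1),
})
executed = agent.receive(["forward", "forward", "sing", "back"])
print(executed)                 # ['forward', 'forward', 'back']
print(agent.state["position"])  # 1
```

The token "sing" is outside the agent's repertoire, so it is dropped: for this agent it contributes nothing to the pragmatic content of the message.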

Measuring the pragmatic information content with an agent model of the receiver

Frank used agent-based models and the theory of autonomous agents with cognitive abilities (see multi-agent system) to operationalize the measurement of pragmatic information content. The transformation between the received message and the executed message is defined by the agent's rules; the pragmatic information content is the information in the transformed message, measured by the methods given by Shannon. The general case can be split into (deterministic) actions that change the information the agent already has, and the optimal decision using this information. Measuring the pragmatic information content is relevant for assessing the value of information received by an agent, and it influences the agent's willingness to pay for information, which is determined not by the Shannon information content of the communication but by the pragmatic information content received.

The rules for the transformation of a received message to its pragmatic content drop information already available:

(1) information already known is ignored, and

(2) elaborate messages can be reduced to the agent's actions, reducing the information content when the receiver understands and can execute actions more powerful than the encoding calls for.

The transformation achieves the three rules mentioned above (EQ, SAME, DIFF). [6]
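The two reduction rules can be made concrete in a short sketch. This is an illustrative implementation under assumed conventions (token and macro names are invented, and the macro table is a stand-in for "more powerful actions"), not the procedure from the cited paper: rule (1) drops known tokens, rule (2) collapses token sequences the receiver can execute as one action, and the result is measured with Shannon's method.

```python
import math
from collections import Counter

def shannon_bits(tokens):
    """Shannon information of a token list from empirical frequencies."""
    counts = Counter(tokens)
    n = len(tokens)
    return -sum(c * math.log2(c / n) for c in counts.values())

def pragmatic_tokens(message, known, macros):
    """Apply the two reduction rules:
    (1) drop tokens the receiver already knows;
    (2) collapse token sequences the receiver can execute as a single,
        more powerful action (a 'macro')."""
    tokens = [t for t in message if t not in known]          # rule (1)
    out, i = [], 0
    while i < len(tokens):                                   # rule (2)
        for seq, action in macros.items():
            if tuple(tokens[i:i + len(seq)]) == seq:
                out.append(action)
                i += len(seq)
                break
        else:
            out.append(tokens[i])
            i += 1
    return out

message = ["straight", "straight", "left", "left", "straight", "right"]
reduced = pragmatic_tokens(message, known={"straight"},
                           macros={("left", "left"): "u_turn"})
print(reduced)                                        # ['u_turn', 'right']
print(shannon_bits(message) > shannon_bits(reduced))  # True
```

The pragmatic content, measured on the reduced message, is smaller than the Shannon content of the original encoding whenever either rule fires.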

The action of the agent can be taken as just "updating its knowledge store", while the actual decision by the agent, optimizing the result, is modeled separately, as done, for example, in Weinberger's approach.

Example: car navigation

A familiar application may clarify the approach. Different car navigation systems produce different instructions, but if they manage to guide you to the same location, their pragmatic information content must be the same, despite different information content when measured with Shannon's measure (SAME). A novice in the area may need all instructions received; for such a driver, the pragmatic information content and the (minimally encoded) information content of the message are the same. An experienced driver will ignore all "follow the road" and "go straight" instructions, so the pragmatic information content is lower. For a driver with knowledge of the area, large parts of the instructions may be subsumed by simple instructions such as "drive to X"; typically, only the last part ("the last mile") of the instructions is meaningful, so the pragmatic information content is smaller because much knowledge is already available (DIFF). Messages with more or less verbiage have, for this user, the same pragmatic content (SAME). [6]
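The navigation example can be checked numerically with a deliberately simple count: measure the pragmatic content of a route as the number of instruction tokens that are new to a given recipient. This is a toy illustration (route and token names are invented) of the DIFF and SAME rules, not a measurement from the cited work.

```python
def pragmatic_length(message, known):
    """Number of instruction tokens that are new to this recipient."""
    return len([t for t in message if t not in known])

route = ["straight", "straight", "left", "straight", "right"]

novice = set()                 # needs every instruction
experienced = {"straight"}     # ignores "follow the road" steps

print(pragmatic_length(route, novice))       # 5
print(pragmatic_length(route, experienced))  # 2

# DIFF: the same message has less pragmatic content for the
# experienced driver than for the novice.
# SAME: a terser encoding of the same route carries the same
# pragmatic content for the experienced driver.
terse = ["left", "right"]
print(pragmatic_length(terse, experienced) ==
      pragmatic_length(route, experienced))  # True
```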


References

  1. von Weizsäcker, Ernst Ulrich; von Weizsäcker, Christine. "Wiederaufnahme der begrifflichen Frage: Was ist Information?". pp. 25–555.
  2. Gernert, Dieter. "Pragmatic Information: Historical Exposition and General Overview". Mind & Matter. 4 (2): 141–167.
  3. Shannon, Claude E. "A Mathematical Theory of Communication". Bell System Technical Journal. Vol. 27, pp. 379–423, 623–656, 1948.
  4. Shannon, Claude E.; Weaver, Warren. The Mathematical Theory of Communication. The University of Illinois Press, Urbana, Illinois, 1949. ISBN 0-252-72548-4.
  5. Weinberger, Edward D. "A Theory of Pragmatic Information and Its Application to the Quasispecies Model of Biological Evolution". BioSystems. 66 (3): 105–119, 2002.
  6. Frank, Andrew (2003). "Pragmatic Information Content—How to Measure the Information in a Route Description". In M. Duckham; M. Goodchild; M. Worboys (eds.), Foundations of Geographic Information Science. London: Taylor & Francis. pp. 47–68. ISBN 0-415-30726-0.