Good regulator

The good regulator is a theorem conceived by Roger C. Conant and W. Ross Ashby that is central to cybernetics. It was originally stated as "every good regulator of a system must be a model of that system", [1] but more precisely, every good regulator must contain a model of the system. That is, any regulator that is maximally simple among optimal regulators must behave as an image of the regulated system under a homomorphism; while the authors sometimes say 'isomorphism', the mapping they construct is only a homomorphism.

Theorem

This theorem is obtained by considering the entropy of the variation of the output of the controlled system, and shows that, under very general conditions, the entropy is minimized when there is a (deterministic) mapping from the states of the system to the states of the regulator. The authors view this map as making the regulator a 'model' of the system.
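The entropy argument can be illustrated with a toy simulation. This is a sketch, not the authors' construction: the ternary outcome table, the modular arithmetic, and the sample size are invented for illustration. A regulator that deterministically models the disturbance drives the outcome entropy to zero, while a regulator that acts independently of the disturbance leaves it high:

```python
import math
import random
from collections import Counter

def entropy(samples):
    """Shannon entropy (bits) of the empirical distribution of samples."""
    counts = Counter(samples)
    n = len(samples)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

# Hypothetical outcome table: the outcome depends jointly on the
# disturbance d and the regulator's response r.
def outcome(d, r):
    return (d + r) % 3

random.seed(0)
disturbances = [random.randrange(3) for _ in range(10_000)]

# A regulator that is a deterministic mapping (a "model") of the
# disturbance: it picks r = -d mod 3, so the outcome is always 0.
model_outcomes = [outcome(d, (-d) % 3) for d in disturbances]

# A regulator that ignores the disturbance and acts at random.
random_outcomes = [outcome(d, random.randrange(3)) for d in disturbances]

print(entropy(model_outcomes))   # 0 bits: the outcome is constant
print(entropy(random_outcomes))  # close to log2(3), about 1.585 bits
```

The deterministic regulator achieves the minimum possible output entropy precisely because its behaviour is a function of the system's state, which is the sense in which it "models" the system.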

With regard to the brain, insofar as it is successful and efficient as a regulator for survival, it must proceed, in learning, by the formation of a model (or models) of its environment.

The theorem is general enough to apply to all regulating and self-regulating or homeostatic systems.

Figure (Good regulator.png): Variables involved in good regulation, according to the authors.

Five variables are defined by the authors as involved in the process of system regulation: the primary disturbers, the set of events in the regulator, the set of events in the rest of the system outside the regulator, the total set of events (or outcomes) that may occur, and the subset of events (or outcomes) that are desirable to the system. [1]

The principal point the authors make with this figure is that regulation requires the regulator to take account of all variables in the set of events concerning the system to be regulated in order to produce satisfactory outcomes. If the regulator cannot take account of every such variable outside itself, then the set of events in the regulator may fail to compensate for the total disturbances, causing errors that lead to outcomes unsatisfactory to the system (illustrated in the figure by the outcomes that fall outside the subset of desirable outcomes).
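This failure mode can be sketched numerically. The setup is hypothetical, not from the paper: the two disturbance variables, the outcome table, and the desirable set are invented for illustration. A regulator that observes only one of two disturbance variables lands outside the desirable set a predictable fraction of the time:

```python
import random

# Hypothetical setup: two disturbance variables d1, d2; the desirable
# set G contains the single outcome 0. The outcome cancels only if
# the regulator responds to both disturbances.
G = {0}

def outcome(d1, d2, r):
    return (d1 + d2 + r) % 4

random.seed(1)
trials = [(random.randrange(4), random.randrange(4)) for _ in range(10_000)]

# Full regulator: sees both disturbances, so every outcome lands in G.
full = [outcome(d1, d2, (-(d1 + d2)) % 4) for d1, d2 in trials]

# Partial regulator: unaware of d2, so its outcome is d2 mod 4 and
# only falls in G when d2 happens to be 0.
partial = [outcome(d1, d2, (-d1) % 4) for d1, d2 in trials]

print(all(z in G for z in full))                        # True
print(sum(z not in G for z in partial) / len(partial))  # roughly 0.75
```

The unmodelled disturbance variable is uniform over four values, so the partial regulator misses the desirable set about three quarters of the time.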

The theorem does not explain what it takes for the system to become a good regulator. In cybernetics, the problem of creating good regulators is addressed by the ethical regulator theorem, [2] and by the theory of practopoiesis. [3] The construction of good regulators is a general problem for any system (e.g., an automated information system) that regulates some domain of application.

When restricted to the ordinary differential equation (ODE) subset of control theory, it is referred to as the internal model principle, which was first articulated in 1976 by B. A. Francis and W. M. Wonham. [4] In this form, it stands in contrast to classical control, in that the classical feedback loop fails to explicitly model the controlled system (although the classical controller may contain an implicit model). [5]
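A standard illustration of the internal model principle is disturbance rejection by integral action: to regulate against a constant disturbance, the controller must contain an integrator, i.e. a model capable of generating a constant signal. The following sketch (the plant dynamics and gains are invented for illustration) compares proportional-only and proportional-integral control:

```python
def simulate(ki, steps=2000, kp=0.5, disturbance=1.0, setpoint=0.0):
    """Return the final tracking error of a first-order plant under
    PI control with integral gain ki (ki=0 gives pure P control)."""
    x = 0.0          # plant state
    integral = 0.0   # the controller's internal model of a constant signal
    for _ in range(steps):
        error = setpoint - x
        integral += error
        u = kp * error + ki * integral
        # simple stable first-order plant with additive constant disturbance
        x = 0.9 * x + 0.1 * (u + disturbance)
    return abs(setpoint - x)

print(simulate(ki=0.0))  # proportional only: nonzero steady-state error
print(simulate(ki=0.1))  # with integrator: error driven near zero
```

Only the controller containing a model of the disturbance class (here, the integrator as a generator of constants) eliminates the steady-state error, in line with Francis and Wonham's principle.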

Related Research Articles

Control theory is a field of control engineering and applied mathematics that deals with the control of dynamical systems in engineered processes and machines. The objective is to develop a model or algorithm governing the application of system inputs to drive the system to a desired state, while minimizing any delay, overshoot, or steady-state error and ensuring a level of control stability; often with the aim to achieve a degree of optimality.

Information theory is the mathematical study of the quantification, storage, and communication of information. The field was originally established by the works of Harry Nyquist and Ralph Hartley, in the 1920s, and Claude Shannon in the 1940s. The field, in applied mathematics, is at the intersection of probability theory, statistics, computer science, statistical mechanics, information engineering, and electrical engineering.

Isomorphism

In mathematics, an isomorphism is a structure-preserving mapping between two structures of the same type that can be reversed by an inverse mapping. Two mathematical structures are isomorphic if an isomorphism exists between them. The word isomorphism is derived from the Ancient Greek ἴσος (isos) "equal" and μορφή (morphē) "form" or "shape".

Negative feedback

Negative feedback occurs when some function of the output of a system, process, or mechanism is fed back in a manner that tends to reduce the fluctuations in the output, whether caused by changes in the input or by other disturbances. A classic example of negative feedback is a heating system thermostat — when the temperature gets high enough, the heater is turned OFF. When the temperature gets too cold, the heat is turned back ON. In each case the "feedback" generated by the thermostat "negates" the trend.

Random walk

In mathematics, a random walk, sometimes known as a drunkard's walk, is a random process that describes a path that consists of a succession of random steps on some mathematical space.

W. Ross Ashby (1903–1972)

William Ross Ashby was an English psychiatrist and a pioneer in cybernetics, the science of communications and automatic control systems in both machines and living things. His first name was not used: he was known as Ross Ashby.

In classical statistical mechanics, the H-theorem, introduced by Ludwig Boltzmann in 1872, describes the tendency of the quantity H to decrease in a nearly-ideal gas of molecules. As this quantity H was meant to represent the entropy of thermodynamics, the H-theorem was an early demonstration of the power of statistical mechanics, as it claimed to derive the second law of thermodynamics—a statement about fundamentally irreversible processes—from reversible microscopic mechanics. It is thought to prove the second law of thermodynamics, albeit under the assumption of low-entropy initial conditions.

Decision tree learning

Decision tree learning is a supervised learning approach used in statistics, data mining and machine learning. In this formalism, a classification or regression decision tree is used as a predictive model to draw conclusions about a set of observations.

Complex dynamics, or holomorphic dynamics, is the study of dynamical systems obtained by iterating a complex analytic mapping. This article focuses on the case of algebraic dynamics, where a polynomial or rational function is iterated. In geometric terms, that amounts to iterating a mapping from some algebraic variety to itself. The related theory of arithmetic dynamics studies iteration over the rational numbers or the p-adic numbers instead of the complex numbers.

In universal algebra, a variety of algebras or equational class is the class of all algebraic structures of a given signature satisfying a given set of identities. For example, the groups form a variety of algebras, as do the abelian groups, the rings, the monoids etc. According to Birkhoff's theorem, a class of algebraic structures of the same signature is a variety if and only if it is closed under the taking of homomorphic images, subalgebras, and (direct) products. In the context of category theory, a variety of algebras, together with its homomorphisms, forms a category; these are usually called finitary algebraic categories.

Setpoint (control system)

In cybernetics and control theory, a setpoint is the desired or target value for an essential variable, or process value (PV) of a control system, which may differ from the actual measured value of the variable. Departure of such a variable from its setpoint is one basis for error-controlled regulation using negative feedback for automatic control. A setpoint can be any physical quantity or parameter that a control system seeks to regulate, such as temperature, pressure, flow rate, position, speed, or any other measurable attribute.

In physics, maximum entropy thermodynamics views equilibrium thermodynamics and statistical mechanics as inference processes. More specifically, MaxEnt applies inference techniques rooted in Shannon information theory, Bayesian probability, and the principle of maximum entropy. These techniques are relevant to any situation requiring prediction from incomplete or insufficient data. MaxEnt thermodynamics began with two papers by Edwin T. Jaynes published in the 1957 Physical Review.

Causal model

In the philosophy of science, a causal model is a conceptual model that describes the causal mechanisms of a system. Several types of causal notation may be used in the development of a causal model. Causal models can improve study designs by providing clear rules for deciding which independent variables need to be included/controlled for.

In cybernetics, the term variety denotes the total number of distinguishable elements of a set, most often the set of states, inputs, or outputs of a finite-state machine or transformation, or the binary logarithm of the same quantity. Variety is used in cybernetics as an information-theoretic measure that is easily related to deterministic finite automata, and less formally as a conceptual tool for thinking about organization, regulation, and stability. It is an early theory of complexity in automata, complex systems, and operations research.

Internal model (motor control)

In the subject area of control theory, an internal model is a process that simulates the response of the system in order to estimate the outcome of a system disturbance. The internal model principle was first articulated in 1976 by B. A. Francis and W. M. Wonham as an explicit formulation of the Conant and Ashby good regulator theorem. It stands in contrast to classical control, in that the classical feedback loop fails to explicitly model the controlled system.

The free energy principle is a theoretical framework suggesting that the brain reduces surprise or uncertainty by making predictions based on internal models and updating them using sensory input. It highlights the brain's objective of aligning its internal model with the external world to enhance prediction accuracy. This principle integrates Bayesian inference with active inference, where actions are guided by predictions and sensory feedback refines them. It has wide-ranging implications for comprehending brain function, perception, and action.

Self-organization, a process where some form of overall order arises out of the local interactions between parts of an initially disordered system, was discovered in cybernetics by William Ross Ashby in 1947. It states that any deterministic dynamic system automatically evolves towards a state of equilibrium that can be described in terms of an attractor in a basin of surrounding states. Once there, the further evolution of the system is constrained to remain in the attractor. This constraint implies a form of mutual dependency or coordination between its constituent components or subsystems. In Ashby's terms, each subsystem has adapted to the environment formed by all other subsystems.

Ethical regulator theorem

Mick Ashby's ethical regulator theorem builds upon the Conant-Ashby good regulator theorem, which is ambiguous because being good at regulating does not imply being good ethically.

Walter Murray Wonham (1934–2023)

Walter Murray Wonham was a Canadian control theorist and professor at the University of Toronto. He focused on multi-variable geometric control theory, stochastic control and stochastic filters, and the control of discrete event systems from the standpoint of mathematical logic and formal languages.

An Introduction to Cybernetics is a book by W. Ross Ashby, first published in 1956 in London by Chapman and Hall. An Introduction is considered the first textbook on cybernetics, where the basic principles of the new field were first rigorously laid out. It was intended to serve as an elementary introduction to cybernetic principles of homeostasis, primarily for an audience of physiologists, psychologists, and sociologists. Ashby addressed adjacent topics in addition to cybernetics such as information theory, communications theory, control theory, game theory and systems theory.

References

  1. R. C. Conant and W. R. Ashby, "Every good regulator of a system must be a model of that system", Int. J. Systems Sci., 1970, vol. 1, no. 2, pp. 89–97.
  2. M. Ashby, "Ethical Regulators and Super-Ethical Systems". Systems, 2020; 8(4):53.
  3. Nikolić, D. (2015). Practopoiesis: Or how life fosters a mind. Journal of Theoretical Biology, 373, 40–61.
  4. B. A. Francis and W. M. Wonham, "The internal model principle of control theory", Automatica, 12 (1976), pp. 457–465.
  5. Jan Swevers, "Internal model control (IMC)", 2006 (archived 2017-08-30 at the Wayback Machine).