Good regulator


The good regulator theorem, conceived by Roger C. Conant and W. Ross Ashby, is central to cybernetics. It was originally stated as "every good regulator of a system must be a model of that system", [1] but, more accurately, every good regulator must contain a model of the system. That is, any regulator that is maximally simple among optimal regulators must behave as an image of that system under a homomorphism; although the authors sometimes say 'isomorphism', the mapping they construct is only a homomorphism.


Theorem

This theorem is obtained by considering the entropy of the variation of the output of the controlled system, and shows that, under very general conditions, the entropy is minimized when there is a (deterministic) mapping from the states of the system to the states of the regulator. The authors view this map as making the regulator a 'model' of the system.
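The effect can be seen in a toy numerical sketch (the disturbance set, outcome rule, and regulator policies below are illustrative assumptions, not taken from the paper): a regulator whose response is a deterministic function of the disturbance can drive the outcome entropy to zero, while a regulator that ignores the disturbance cannot.

```python
# Toy illustration (not from the paper): outcome Z = (D + R) mod 3, with the
# disturbance D uniform over {0, 1, 2}.  A regulator that responds to D
# deterministically can make H(Z) zero; a randomising regulator cannot.
import math
import random
from collections import Counter

def entropy(samples):
    """Shannon entropy (in bits) of an empirical distribution."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def simulate(policy, trials=100_000, seed=0):
    rng = random.Random(seed)
    outcomes = []
    for _ in range(trials):
        d = rng.randrange(3)           # primary disturbance
        r = policy(d, rng)             # regulator's response
        outcomes.append((d + r) % 3)   # outcome of the regulated system
    return entropy(outcomes)

deterministic = lambda d, rng: (-d) % 3          # r = h(d): a "model" of the disturbance
stochastic = lambda d, rng: rng.randrange(3)     # ignores the disturbance entirely

print(f"H(Z), deterministic regulator: {simulate(deterministic):.3f} bits")  # about 0.000
print(f"H(Z), randomising regulator:   {simulate(stochastic):.3f} bits")     # about 1.585
```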

With regard to the brain, insofar as it is successful and efficient as a regulator for survival, it must proceed, in learning, by the formation of a model (or models) of its environment.

The theorem is general enough to apply to all regulating and self-regulating or homeostatic systems.

[Figure (Good regulator.png): Variables involved in good regulation, according to the authors.]

Five variables are defined by the authors as involved in the process of system regulation: D as the primary disturbers, R as a set of events in the regulator, S as a set of events in the rest of the system outside of the regulator, Z as the total set of events (or outcomes) that may occur, and G as the subset of events (or outcomes) in Z that are desirable to the system. [1]

The principal point the authors make with this figure is that good regulation requires the regulator to account for every variable in the set of events concerning the system being regulated in order to produce satisfactory outcomes. If the regulator cannot account for all of the variables in the set of events concerning the system that lie outside the regulator, then the set of events in the regulator may fail to cover the full range of disturbances, which in turn may cause errors that lead to outcomes that are not satisfactory to the system (as illustrated by the events in the set Z that are not elements of the set G).
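A minimal sketch of this point, using the variable names from the figure but an assumed outcome table and goal set, is the following: a regulator whose model covers every disturbance keeps all outcomes inside G, while one that omits a disturbance produces outcomes outside G.

```python
# Illustrative sketch (the outcome table and goal set are assumptions made
# for the example; only the variable names D, Z, G follow the figure).
D = {"cold", "hot", "humid"}     # primary disturbances
G = {"comfortable"}              # outcomes desirable to the system

def outcome(disturbance, response):
    """Toy outcome table Z = f(D, R)."""
    corrections = {"cold": "heat", "hot": "cool", "humid": "dehumidify"}
    return "comfortable" if response == corrections[disturbance] else "uncomfortable"

# A regulator whose events cover every disturbance ...
full_model = {"cold": "heat", "hot": "cool", "humid": "dehumidify"}
# ... and one whose model omits a disturbance it cannot conceive of.
partial_model = {"cold": "heat", "hot": "cool"}

for name, model in [("full model", full_model), ("partial model", partial_model)]:
    Z = {outcome(d, model.get(d, "do nothing")) for d in D}
    print(f"{name}: outcomes {Z}, all in G? {Z <= G}")
# The full model keeps every outcome in G; the partial model does not.
```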

The theorem does not explain what it takes for the system to become a good regulator. Moreover, although the paper is highly cited, concerns have been raised that the formal proof does not actually fully support the statement in its title. [2]

In cybernetics, the problem of creating good regulators is addressed by the ethical regulator theorem. [3] The construction of good regulators is a general problem for any system (e.g., an automated information system) that regulates some domain of application.

When restricted to the ordinary differential equation (ODE) subset of control theory, it is referred to as the internal model principle, which was first articulated in 1976 by B. A. Francis and W. M. Wonham. [4] In this form, it stands in contrast to classical control, in that the classical feedback loop fails to explicitly model the controlled system (although the classical controller may contain an implicit model). [5]
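A minimal simulation sketch of the internal model principle, with assumed plant and controller parameters (none taken from Francis and Wonham), is rejection of a constant disturbance: only a controller that contains a model of the disturbance generator, here an integrator, drives the steady-state error to zero.

```python
# Sketch under assumed parameters: a first-order plant with a constant
# disturbance d, regulated by P-only vs. PI control.  The integrator is the
# controller's internal model of a constant signal.
def simulate(ki, kp=2.0, a=1.0, d=0.5, r=1.0, dt=0.01, steps=5000):
    x, integral = 0.0, 0.0
    for _ in range(steps):
        e = r - x                   # tracking error
        integral += e * dt          # internal model of a constant disturbance
        u = kp * e + ki * integral  # PI control (ki = 0 gives P-only)
        x += dt * (-a * x + u + d)  # plant dynamics with constant disturbance d
    return r - x                    # remaining steady-state error

print(f"P-only controller, steady-state error: {simulate(ki=0.0):+.3f}")  # about +0.167
print(f"PI controller,     steady-state error: {simulate(ki=1.0):+.3f}")  # about  0.000
```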


Related Research Articles

Control theory is a field of control engineering and applied mathematics that deals with the control of dynamical systems in engineered processes and machines. The objective is to develop a model or algorithm governing the application of system inputs to drive the system to a desired state, while minimizing any delay, overshoot, or steady-state error and ensuring a level of control stability; often with the aim to achieve a degree of optimality.

<span class="mw-page-title-main">Isomorphism</span> Inversible mapping (mathematics)

In mathematics, an isomorphism is a structure-preserving mapping between two structures of the same type that can be reversed by an inverse mapping. Two mathematical structures are isomorphic if an isomorphism exists between them. The word is derived from Ancient Greek ἴσος (isos) 'equal' and μορφή (morphe) 'form, shape'.

<span class="mw-page-title-main">Probability theory</span> Branch of mathematics concerning probability

Probability theory or probability calculus is the branch of mathematics concerned with probability. Although there are several different probability interpretations, probability theory treats the concept in a rigorous mathematical manner by expressing it through a set of axioms. Typically these axioms formalise probability in terms of a probability space, which assigns a measure taking values between 0 and 1, termed the probability measure, to a set of outcomes called the sample space. Any specified subset of the sample space is called an event.

In mathematics, rings are algebraic structures that generalize fields: multiplication need not be commutative and multiplicative inverses need not exist. Informally, a ring is a set equipped with two binary operations satisfying properties analogous to those of addition and multiplication of integers. Ring elements may be numbers such as integers or complex numbers, but they may also be non-numerical objects such as polynomials, square matrices, functions, and power series.

Bell's theorem is a term encompassing a number of closely related results in physics, all of which determine that quantum mechanics is incompatible with local hidden-variable theories, given some basic assumptions about the nature of measurement. "Local" here refers to the principle of locality, the idea that a particle can only be influenced by its immediate surroundings, and that interactions mediated by physical fields cannot propagate faster than the speed of light. "Hidden variables" are supposed properties of quantum particles that are not included in quantum theory but nevertheless affect the outcome of experiments. In the words of physicist John Stewart Bell, for whom this family of results is named, "If [a hidden-variable theory] is local it will not agree with quantum mechanics, and if it agrees with quantum mechanics it will not be local."

<span class="mw-page-title-main">Negative feedback</span> Reuse of output to stabilize a system

Negative feedback occurs when some function of the output of a system, process, or mechanism is fed back in a manner that tends to reduce the fluctuations in the output, whether caused by changes in the input or by other disturbances. A classic example of negative feedback is a heating system thermostat — when the temperature gets high enough, the heater is turned OFF. When the temperature gets too cold, the heat is turned back ON. In each case the "feedback" generated by the thermostat "negates" the trend.
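The thermostat example can be sketched as a small bang-bang simulation (the temperature band and thermal constants are illustrative assumptions):

```python
# Bang-bang negative feedback: the heater switches at the edges of an assumed
# comfort band, so the feedback "negates" whichever way the temperature drifts.
def thermostat_step(temp, heater_on, low=19.0, high=21.0):
    if temp <= low:
        heater_on = True
    elif temp >= high:
        heater_on = False
    return heater_on

temp, heater_on = 15.0, False
for minute in range(120):
    heater_on = thermostat_step(temp, heater_on)
    temp += 0.2 if heater_on else -0.1   # crude room model: heating vs. losses
    if minute % 20 == 0:
        print(f"t={minute:3d} min  temp={temp:5.2f} C  heater={'ON' if heater_on else 'OFF'}")
```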

In electronics, a linear regulator is a voltage regulator used to maintain a steady voltage. The resistance of the regulator varies in accordance with both the input voltage and the load, resulting in a constant voltage output. The regulating circuit varies its resistance, continuously adjusting a voltage divider network to maintain a constant output voltage and continually dissipating the difference between the input and regulated voltages as waste heat. By contrast, a switching regulator uses an active device that switches on and off to maintain an average value of output. Because the regulated voltage of a linear regulator must always be lower than input voltage, efficiency is limited and the input voltage must be high enough to always allow the active device to reduce the voltage by some amount.

<span class="mw-page-title-main">W. Ross Ashby</span> English psychiatrist (1903–1972)

William Ross Ashby was an English psychiatrist and a pioneer in cybernetics, the study of the science of communications and automatic control systems in both machines and living things. His first name was not used: he was known as Ross Ashby.

<span class="mw-page-title-main">Free product</span> Operation that combines groups

In mathematics, specifically group theory, the free product is an operation that takes two groups G and H and constructs a new group G ∗ H. The result contains both G and H as subgroups, is generated by the elements of these subgroups, and is the "universal" group having these properties, in the sense that any two homomorphisms from G and H into a group K factor uniquely through a homomorphism from G ∗ H to K. Unless one of the groups G and H is trivial, the free product is always infinite. The construction of a free product is similar in spirit to the construction of a free group.

In universal algebra, a variety of algebras or equational class is the class of all algebraic structures of a given signature satisfying a given set of identities. For example, the groups form a variety of algebras, as do the abelian groups, the rings, the monoids etc. According to Birkhoff's theorem, a class of algebraic structures of the same signature is a variety if and only if it is closed under the taking of homomorphic images, subalgebras, and (direct) products. In the context of category theory, a variety of algebras, together with its homomorphisms, forms a category; these are usually called finitary algebraic categories.

<span class="mw-page-title-main">Bellman equation</span> Necessary condition for optimality associated with dynamic programming

A Bellman equation, named after Richard E. Bellman, is a necessary condition for optimality associated with the mathematical optimization method known as dynamic programming. It writes the "value" of a decision problem at a certain point in time in terms of the payoff from some initial choices and the "value" of the remaining decision problem that results from those initial choices. This breaks a dynamic optimization problem into a sequence of simpler subproblems, as Bellman's "principle of optimality" prescribes. The equation applies to algebraic structures with a total ordering; for algebraic structures with a partial ordering, the generic Bellman's equation can be used.
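In one common discrete-time, deterministic form (the notation here is generic, not taken from this article), the equation reads:

```latex
V(x) \;=\; \max_{a \in A(x)} \Big\{ r(x, a) + \gamma \, V\big(f(x, a)\big) \Big\}
```

where x is the current state, a an available action, r(x, a) the immediate payoff, f(x, a) the resulting state, and γ a discount factor.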

<span class="mw-page-title-main">Setpoint (control system)</span> Target value for the process variable of a control system

In cybernetics and control theory, a setpoint is the desired or target value for an essential variable, or process value (PV) of a control system, which may differ from the actual measured value of the variable. Departure of such a variable from its setpoint is one basis for error-controlled regulation using negative feedback for automatic control. A setpoint can be any physical quantity or parameter that a control system seeks to regulate, such as temperature, pressure, flow rate, position, speed, or any other measurable attribute.

<span class="mw-page-title-main">Causal model</span> Conceptual model in philosophy of science

In metaphysics, a causal model is a conceptual model that describes the causal mechanisms of a system. Several types of causal notation may be used in the development of a causal model. Causal models can improve study designs by providing clear rules for deciding which independent variables need to be included/controlled for.

In cybernetics, the term variety denotes the total number of distinguishable elements of a set, most often the set of states, inputs, or outputs of a finite-state machine or transformation, or the binary logarithm of the same quantity. Variety is used in cybernetics as an information-theoretic measure that is easily related to deterministic finite automata, and less formally as a conceptual tool for thinking about organization, regulation, and stability. It is an early theory of complexity in automata, complex systems, and operations research.

<span class="mw-page-title-main">Internal model (motor control)</span>

In the subject area of control theory, an internal model is a process that simulates the response of the system in order to estimate the outcome of a system disturbance. The internal model principle was first articulated in 1976 by B. A. Francis and W. M. Wonham as an explicit formulation of the Conant and Ashby good regulator theorem. It stands in contrast to classical control, in that the classical feedback loop fails to explicitly model the controlled system.

The free energy principle is a theoretical framework suggesting that the brain reduces surprise or uncertainty by making predictions based on internal models and updating them using sensory input. It highlights the brain's objective of aligning its internal model and the external world to enhance prediction accuracy. This principle integrates Bayesian inference with active inference, where actions are guided by predictions and sensory feedback refines them. It has wide-ranging implications for comprehending brain function, perception, and action.

Self-organization, a process where some form of overall order arises out of the local interactions between parts of an initially disordered system, was discovered in cybernetics by William Ross Ashby in 1947. It states that any deterministic dynamic system automatically evolves towards a state of equilibrium that can be described in terms of an attractor in a basin of surrounding states. Once there, the further evolution of the system is constrained to remain in the attractor. This constraint implies a form of mutual dependency or coordination between its constituent components or subsystems. In Ashby's terms, each subsystem has adapted to the environment formed by all other subsystems.

<span class="mw-page-title-main">Ethical regulator theorem</span>

Mick Ashby's ethical regulator theorem builds upon the Conant-Ashby good regulator theorem, which is ambiguous because being good at regulating does not imply being good ethically.

<span class="mw-page-title-main">Walter Murray Wonham</span> Canadian physicist (1934–2023)

Walter Murray Wonham was a Canadian control theorist and professor at the University of Toronto. He focused on multi-variable geometric control theory, stochastic control and stochastic filters, and the control of discrete event systems from the standpoint of mathematical logic and formal languages.

An Introduction to Cybernetics is a book by W. Ross Ashby, first published in 1956 in London by Chapman and Hall. An Introduction is considered the first textbook on cybernetics, where the basic principles of the new field were first rigorously laid out. It was intended to serve as an elementary introduction to cybernetic principles of homeostasis, primarily for an audience of physiologists, psychologists, and sociologists. Ashby addressed adjacent topics in addition to cybernetics such as information theory, communications theory, control theory, game theory and systems theory.

References

  1. R. C. Conant and W. R. Ashby, "Every good regulator of a system must be a model of that system", Int. J. Systems Sci., 1970, vol. 1, no. 2, pp. 89–97.
  2. Baez, John (27 January 2016). "The Internal Model Principle". Azimuth. Archived from the original on 5 October 2023. Retrieved 6 June 2024.
  3. M. Ashby, "Ethical Regulators and Super-Ethical Systems", Systems, 2020, 8(4): 53.
  4. B. A. Francis and W. M. Wonham, "The internal model principle of control theory", Automatica, 1976, vol. 12, pp. 457–465.
  5. Jan Swevers, "Internal model control (IMC)", 2006. Archived 2017-08-30 at the Wayback Machine.