In information theory, Interactions of actors theory is a theory developed by Gordon Pask and Gerard de Zeeuw. It is a generalisation of Pask's earlier conversation theory: the chief distinction is that conversation theory focuses on analysing the specific features that allow a conversation to emerge between two participants, whereas interactions of actors theory focuses on the broader domain of conversation in which conversations may appear, disappear, and reappear over time. [1]
Interactions of actors theory was developed late in Pask's career. It is reminiscent of Freud's psychodynamics and of Bateson's panpsychism (see Mind and Nature: A Necessary Unity, 1979). Pask's nexus of analogy, dependence and mechanical spin produces the differences that are central to cybernetics.
While working with clients in the last years of his life, Pask produced an axiomatic scheme [2] for his interactions of actors theory, which is less well known than his conversation theory. Interactions of Actors, Theory and Some Applications, as the manuscript is entitled, is essentially a concurrent spin calculus applied to the living environment, with strict topological constraints. [3] One of Gordon Pask's most notable associates, Gerard de Zeeuw, was a key contributor to the development of interactions of actors theory.
Interactions of actors theory is a process theory. [6] To convey the interdisciplinary nature of his work, Pask drew analogies with physical theories in the classic positivist enterprises of the social sciences. He sought to apply the axiomatic properties of agreement, or epistemological dependence, to produce a "sharp-valued" social science with precision comparable to the results of the hard sciences, and it was out of this inclination that he developed interactions of actors theory. Pask held that his concepts produce relations in all media, and in his complementarity principle he stated: "Processes produce products and all products (finite, bounded coherences) are produced by processes". [7]
Most importantly, Pask also had his exclusion principle. He proved that no two concepts or products could be the same because of their different histories, calling this the "No Doppelgangers" clause or edict. [6] Later he reflected, "Time is incommensurable for Actors". [8] He saw these properties as necessary to produce differentiation and innovation, or new coherences, in physical nature and, indeed, in minds.
In 1995, Pask stated what he called his Last Theorem: "Like concepts repel and unlike concepts attract". For ease of application, Pask stated that the differences and similarities of descriptions (the products of processes) are context- and perspective-dependent. In the last three years of his life he presented models based on knot theory which described minimal persisting concepts. He interpreted these knots as computing elements that exert repulsive forces in order to interact and persist in filling the space. The knots, links and braids of his entailment mesh models of concepts, which could include tangle-like processes seeking "tail-eating" closure, Pask called "tapestries".
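Pask left no computational statement of the Last Theorem, but a toy sketch suggests one hedged reading of it: concepts are represented as feature vectors, and the sign of their similarity is taken as a repulsive (like) or attractive (unlike) interaction. The vector representation, the cosine measure and all names below are illustrative assumptions, not Pask's own formalism.

```python
import math

def similarity(c1, c2):
    """Cosine similarity between two concept feature vectors."""
    dot = sum(a * b for a, b in zip(c1, c2))
    norm1 = math.sqrt(sum(a * a for a in c1))
    norm2 = math.sqrt(sum(b * b for b in c2))
    return dot / (norm1 * norm2)

def interaction(c1, c2):
    """Signed force between two concepts: positive repels (like concepts),
    negative attracts (unlike concepts) -- a toy reading of the Last Theorem."""
    return similarity(c1, c2)

print(interaction([1.0, 0.0], [1.0, 0.0]))   #  1.0: like concepts repel
print(interaction([1.0, 0.0], [-1.0, 0.0]))  # -1.0: unlike concepts attract
```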
His analysis proceeded with like-seeming concepts repelling or unfolding, but after a sufficient duration of interaction (a duration he called "faith") a pair of similar or like-seeming concepts will always produce a difference, and thus an attraction. Amity (availability for interaction), respectability (observability), responsibility (the ability to respond to stimulus) and unity (not uniformity) were necessary properties to produce agreement (or dependence) and agreement-to-disagree (or relative independence) when actors interact. Concepts could be applied imperatively or permissively when a Petri condition (see Petri net) for the synchronous transfer of meaningful information occurred. Extending his physical analogy, Pask associated the interactions of thought generation with radiation: "operations generating thoughts and penetrating conceptual boundaries within participants, excite the concepts bounded as oscillators, which, in ridding themselves of this surplus excitation, produce radiation" [9]
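The Petri condition can be made concrete with a minimal sketch, offered here as an assumption-laden illustration rather than Pask's own construction: in a Petri net a transition fires only when every input place holds a token, read here, loosely, as both actors being simultaneously ready for a synchronous transfer of meaningful information. The class and place names are invented for the example.

```python
class Transition:
    """A Petri-net transition: fires only when all input places hold tokens."""
    def __init__(self, inputs, outputs):
        self.inputs = inputs    # names of input places
        self.outputs = outputs  # names of output places

    def enabled(self, marking):
        return all(marking.get(p, 0) > 0 for p in self.inputs)

    def fire(self, marking):
        if not self.enabled(marking):
            return False        # no synchronous transfer: a participant is absent
        for p in self.inputs:
            marking[p] -= 1
        for p in self.outputs:
            marking[p] = marking.get(p, 0) + 1
        return True

# Both actors must be ready before the shared transition can fire.
marking = {"A_ready": 1, "B_ready": 1}
transfer = Transition(inputs=["A_ready", "B_ready"], outputs=["shared_concept"])
print(transfer.fire(marking), marking)
# True {'A_ready': 0, 'B_ready': 0, 'shared_concept': 1}
```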
In sum, IA supports the earlier kinematic conversation theory work, in which minimally two concurrent concepts were required to produce a non-trivial third. One distinction separated the similarity and difference of any pair in the minimum triple. However, his formal methods denied the competence of mathematics, and of digital serial and parallel processes, to produce applicable descriptions, because of their innate pathologies in locating the infinitesimals of dynamic equilibria (Stafford Beer's "Point of Calm"). He dismissed the digital computer as a kind of kinematic "magic lantern" and saw mechanical models as the future for the concurrent kinetic computers required to describe natural processes. He believed this implied the need to extend quantum computing to emulate true field concurrency rather than the current von Neumann architecture.
Reviewing IA [8] he said:
Interaction of actors has no specific beginning or end. It goes on forever. Since it does so it has very peculiar properties. Whereas a conversation is mapped (due to a possibility of obtaining a vague kinematic, perhaps picture-frame, image of it) onto Newtonian time, precisely because it has a beginning and end, an interaction, in general, cannot be treated in this manner. Kinematics are inadequate to deal with life: we need kinetics. Even so as in the minimal case of a strict conversation we cannot construct the truth value, metaphor or analogy of A and B. The A, B differences are generalizations about a coalescence of concepts on the part of A and B; their commonality and coherence is the similarity. The difference (reiterated) is the differentiation of A and B (their agreements to disagree, their incoherences). Truth value in this case meaning the coherence between all of the interacting actors.
He added:
It is essential to postulate vectorial times (where components of the vectors are incommensurate) and furthermore times which interact with each other in the manner of Louis Kauffman's knots and tangles.
In experimental epistemology Pask, the "philosopher mechanic", produced a tool kit for analysing the basis of knowledge and for criticising the teaching and application of knowledge in all fields: from law and the social and system sciences to mathematics, physics and biology. When, in the course of establishing the vacuity of invariance, Pask was challenged with the invariance of atomic number, "Ah", he said, "the atomic hypothesis". He rejected this, preferring instead the infinite nature of the productions of waves.
Pask held that concurrence is a necessary condition for modelling brain functions, and he remarked that IA was meant to stand AI, artificial intelligence, on its head. Pask believed it was the job of cybernetics to compare and contrast, and his IA theory showed how to do this. Heinz von Foerster called him a genius, [10] "Mr. Cybernetics", the "cybernetician's cybernetician".
The Hewitt, Bishop and Steiger approach concerns sequential processing and inter-process communication in digital, serial, kinematic computers. It is a parallel, or pseudo-concurrent, theory, as is the theory of concurrency (see Concurrency). In Pask's true field-concurrent theory, kinetic processes can interrupt (or, indeed, interact with) each other, simply reproducing or producing a new resultant force within a coherence (of concepts), but without buffering delays or priority. [11]
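To make the contrast concrete, here is a minimal sketch of the buffered, serial message handling that characterises the Hewitt-style actor model Pask is contrasting with; the names are invented for the example. In Pask's kinetic picture there is no such mailbox: processes act on one another directly, without buffering delays or priority.

```python
from queue import Queue

class MailboxActor:
    """A Hewitt-style actor: messages queue in a mailbox and are
    processed one at a time, in arrival order."""
    def __init__(self, name):
        self.name = name
        self.mailbox = Queue()  # buffering: messages wait their turn

    def send(self, msg):
        self.mailbox.put(msg)

    def step(self):
        if not self.mailbox.empty():
            msg = self.mailbox.get()  # strictly serial processing
            print(f"{self.name} processes {msg!r}")

a = MailboxActor("A")
a.send("hello")
a.send("world")
a.step()  # A processes 'hello'
a.step()  # A processes 'world': delayed behind the first message
```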
"There are no Doppelgangers" is a fundamental theorem, edict or clause of cybernetics due to Pask in support of his theories of learning and interaction in all media: conversation theory and interactions of actors theory. It accounts for physical differentiation and is Pask's exclusion principle. [12] It states no two products of concurrent interaction can be the same because of their different dynamic contexts and perspectives. No Doppelgangers is necessary to account for the production by interaction and intermodulation (c.f. beats) of different, evolving, persisting and coherent forms. Two proofs are presented both due to Pask.
Consider a pair of moving, dynamic participants, A and B, producing an interaction I. Their separation will vary during I. The duration of I observed from A will be different from the duration of I observed from B. [8] [13]
Let Ts and Tf denote the start and finish times for the transfer of meaningful information. For participants A and B we can write:

TsA ≠ TfA, TsA ≠ TsB, TsA ≠ TfB,
TfA ≠ TsB, TfA ≠ TfB, TsB ≠ TfB
Thus
A ≠ B
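A hedged way to see why these four times cannot coincide in pairs is to borrow the relativistic reading this article itself invokes below (Doppler shifts and differences in proper time): if B moves at speed v relative to A, time dilation alone separates the observed durations.

```latex
\Delta T^{A} = T_f^{A} - T_s^{A}, \qquad
\Delta T^{B} = T_f^{B} - T_s^{B} = \frac{\Delta T^{A}}{\gamma}, \qquad
\gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}} > 1 \quad (v \neq 0)
```

Hence ΔTA ≠ ΔTB whenever there is relative motion: one concrete sense in which the durations, and so A and B, must differ.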
Pask remarked: [8]
Conversation is defined as having a beginning and an end and time is vectorial. The components of the vector are commensurable (in duration). On the other hand actor interaction time is vectorial with components that are incommensurable. In the general case there is no well-defined beginning and interaction goes on indefinitely. As a result the time vector has incommensurable components. Both the quantity and quality differ.
No Doppelgangers applies both in conversation theory's kinematic domain (bounded by beginnings and ends), where times are commensurable, and in the eternal kinetic domain of interactions of actors, where times are incommensurable.
The second proof [6] is more reminiscent of R. D. Laing: [14] your concept of your concept is not my concept of your concept; a reproduced concept is not the same as the original concept. Pask defined concepts as persisting, countably infinite, recursively packed spin processes (like a many-cored cable, or the skins of an onion) in any medium (stars, liquids, gases, solids, machines and, of course, brains) that produce relations.
Here we prove A(T) ≠ B(T).

D means "description of", and ⟨ConA(T), DA(T)⟩ reads "A's concept of T produces A's description of T", evoking Dirac notation (required for the production of the quanta of thought: the transfer of "set-theoretic tokens", as Pask puts it in 1996 [8] ). Thus:

Ap(ConA(T)) => DA(T)

or, in general,

Ap(ConZ(T)) => DZ(T).

Also, in general,

DA(T) ≠ DB(T)

and ConA(T) ≠ ConB(T), and vice versa; or, in general terms,

ConZ(T) ≠ ConW(T) for any two distinct actors Z and W,

given that, for all Z and all T, the concepts ConZ(T) carry the distinct histories of their producers (the exclusion principle), and that a reproduced concept is not the same as the original. In particular AA = A(A), A's concept of A, is not equal to BA = B(A), B's concept of A, and vice versa; hence, there are no Doppelgangers.

Q.E.D.
Pask attached to a bar a piece of string with three knots in it. [15] He then attached to the same bar a piece of elastic, also with three knots in it. An observing actor, A, on the string would see the knotted intervals of the other actor, B, vary as the elastic was stretched and relaxed, corresponding to the relative motion of B as seen from A. The knots correspond to the beginning of the experiment and to the start and finish of the A/B interaction. Referring to the three intervals, where x, y, z are the separation distances of the knots from the bar and from each other, he noted that x > y > z on the string for participant A does not imply x > z for participant B on the elastic. A change of separation between A and B producing Doppler shifts during interaction, recoil, or differences in relativistic proper time for A and B would, for example, account for this. On occasion a second knotted string was tied to the bar, representing coordinate time.
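A small numerical sketch of this demonstration, with invented values and an assumed nonuniform stretch rule (nothing below comes from Pask's notes), shows that the interval ordering holding on A's string need not hold on B's elastic.

```python
string_knots = [3.0, 5.0, 6.0]        # knot distances from the bar, actor A

def stretch(x, k=0.8):
    """An assumed nonuniform stretch of the elastic: far knots stretch more."""
    return x + k * x * x

elastic_knots = [stretch(x) for x in string_knots]

def intervals(knots):
    """Successive separations, counting from the bar at position 0."""
    pts = [0.0] + knots
    return [b - a for a, b in zip(pts, pts[1:])]

x, y, z = intervals(string_knots)
print((x, y, z), x > y > z)           # (3.0, 2.0, 1.0) True: A's ordering holds
u, v, w = intervals(elastic_knots)
print((u, v, w), u > v > w)           # ~(10.2, 14.8, 9.8) False: ordering lost
```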
To set this in further context, Pask won a prize from Old Dominion University for his complementarity principle: "All processes produce products and all products are produced by processes". This can be written:
Ap(ConZ(T)) => DZ(T), where => means "produces", Ap means the "application of", D means "description of", and Z is the concept mesh or coherence of which T is part. In the bra-ket-like notation of the second proof this can also be written ⟨ConZ(T), DZ(T)⟩.
Pask distinguished imperative application (written &Ap or IM) from permissive application (written Ap), [16] where information is transferred in the Petri-net manner, the token appearing as a hole in a torus, producing a Klein bottle containing recursively packed concepts. [6]
Pask's "hard" or "repulsive" [6] carapace was a condition he required for the persistence of concepts. He endorsed Nicholas Rescher's coherence theory of truth approach where a set membership criterion of similarity also permitted differences amongst set or coherence members, but he insisted repulsive force was exerted at set and members' coherence boundaries. He said of G. Spencer Brown's Laws of Form that distinctions must exert repulsive forces. This is not yet accepted by Spencer Brown and others. Without a repulsion, or Newtonian reaction at the boundary, sets, their members or interacting participants would diffuse away forming a "smudge"; Hilbertian marks on paper would not be preserved. Pask, the mechanical philosopher, wanted to apply these ideas to bring a new kind of rigour to cybernetic models.
Some followers of Pask emphasise his late work, done in the closing chapter of his life, which is neither as clear nor as grounded as his prior decades of research and machine- and theory-building. This tends to skew the impression researchers glean of Pask's contribution, or even of his lucidity. [citation needed]