Effective complexity

Effective complexity is a measure of complexity defined in a 1996 paper by Murray Gell-Mann and Seth Lloyd that attempts to quantify the amount of non-random information in a system.[1][2] It has been criticised as depending on subjective decisions about which parts of the information in the system are to be discounted as random.[3]
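
In rough terms (a sketch of the definition given by Gell-Mann and Lloyd, not a full formal statement), the description of an entity is split into a part that captures its regularities and a part that is treated as incidental or random. The effective complexity is the algorithmic information content of the regularities alone, and the paper's "total information" adds the entropy of the ensemble of entities sharing those regularities:

    \Sigma(E) \;=\; \underbrace{K(E)}_{\text{effective complexity}} \;+\; \underbrace{H(E)}_{\text{entropy of the ensemble } E}

Here E is the ensemble singled out by the regularities, K denotes (approximate) algorithmic information content, and H denotes Shannon entropy; how E is chosen is exactly the judgement that the criticism above targets.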

Related Research Articles

Complexity characterizes the behavior of a system or model whose components interact in multiple ways and follow local rules, leading to non-linearity, randomness, collective dynamics, hierarchy, and emergence.

<span class="mw-page-title-main">Many-worlds interpretation</span> Interpretation of quantum mechanics

The many-worlds interpretation (MWI) is an interpretation of quantum mechanics that asserts that the universal wavefunction is objectively real, and that there is no wave function collapse. This implies that all possible outcomes of quantum measurements are physically realized in different "worlds". The evolution of reality as a whole in MWI is rigidly deterministic and local. Many-worlds is also called the relative state formulation or the Everett interpretation, after physicist Hugh Everett, who first proposed it in 1957. Bryce DeWitt popularized the formulation and named it many-worlds in the 1970s.

<span class="mw-page-title-main">Murray Gell-Mann</span> American theoretical physicist (1929–2019)

Murray Gell-Mann was an American theoretical physicist who played a preeminent role in the development of the theory of elementary particles. Gell-Mann introduced the concept of quarks as the fundamental building blocks of the strongly interacting particles, and the renormalization group as a foundational element of quantum field theory and statistical mechanics. He played key roles in developing the concept of chirality in the theory of the weak interactions and spontaneous chiral symmetry breaking in the strong interactions, which controls the physics of the light mesons. In the 1970s he was a co-inventor of quantum chromodynamics (QCD) which explains the confinement of quarks in mesons and baryons and forms a large part of the Standard Model of elementary particles and forces.

<span class="mw-page-title-main">Quantum chromodynamics</span> Theory of the strong nuclear interactions

In theoretical physics, quantum chromodynamics (QCD) is the study of the strong interaction between quarks mediated by gluons. Quarks are fundamental particles that make up composite hadrons such as the proton, neutron and pion. QCD is a type of quantum field theory called a non-abelian gauge theory, with symmetry group SU(3). The QCD analog of electric charge is a property called color. Gluons are the force carriers of the theory, just as photons are for the electromagnetic force in quantum electrodynamics. The theory is an important part of the Standard Model of particle physics. A large body of experimental evidence for QCD has been gathered over the years.

A complex system is a system composed of many components which may interact with each other. Examples of complex systems are Earth's global climate, organisms, the human brain, infrastructure such as power grids, transportation or communication systems, complex software and electronic systems, social and economic organizations, an ecosystem, a living cell, and, ultimately, for some authors, the entire universe.

The strange quark or s quark is the third lightest of all quarks, a type of elementary particle. Strange quarks are found in subatomic particles called hadrons. Examples of hadrons containing strange quarks include kaons, strange D mesons, Sigma baryons, and other strange particles.

The up quark or u quark is the lightest of all quarks, a type of elementary particle, and a significant constituent of matter. It, along with the down quark, forms the neutrons and protons of atomic nuclei. It is part of the first generation of matter, has an electric charge of +2/3 e and a bare mass of 2.2 (+0.5/−0.4) MeV/c2. Like all quarks, the up quark is an elementary fermion with spin 1/2, and experiences all four fundamental interactions: gravitation, electromagnetism, weak interactions, and strong interactions. The antiparticle of the up quark is the up antiquark, which differs from it only in that some of its properties, such as charge, have equal magnitude but opposite sign.

The down quark is a type of elementary particle, and a major constituent of matter. The down quark is the second-lightest of all quarks, and combines with other quarks to form composite particles called hadrons. Down quarks are most commonly found in atomic nuclei, where they combine with up quarks to form protons and neutrons. The proton is made of one down quark with two up quarks, and the neutron is made up of two down quarks with one up quark. Because they are found in every single known atom, down quarks are present in all everyday matter that we interact with.
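
As a quick consistency check on these compositions (using the standard quark charges of +2/3 e for the up quark and −1/3 e for the down quark):

    proton (uud):   +2/3 + 2/3 − 1/3 = +1 e
    neutron (udd):  +2/3 − 1/3 − 1/3 =  0 e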

Econophysics is a non-orthodox interdisciplinary research field, applying theories and methods originally developed by physicists in order to solve problems in economics, usually those involving uncertainty or stochastic processes and nonlinear dynamics. Some of its applications to the study of financial markets have also been termed statistical finance, referring to the field's roots in statistical physics. Econophysics is closely related to social physics.

In quantum mechanics, the totalitarian principle states: "Everything not forbidden is compulsory." Physicists including Murray Gell-Mann borrowed this expression, and its satirical reference to totalitarianism, from the popular culture of the early twentieth century.

A complex adaptive system is a system that is complex in that it is a dynamic network of interactions, but whose behavior as an ensemble may not be predictable from the behavior of the components. It is adaptive in that the individual and collective behavior mutate and self-organize in response to the change-initiating micro-event or collection of events. It is a "complex macroscopic collection" of relatively "similar and partially connected micro-structures" formed in order to adapt to the changing environment and increase their survivability as a macro-structure. The complex adaptive systems approach builds on replicator dynamics.
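
For reference, replicator dynamics is usually written as the following differential equation (a standard textbook form, not one specific to complex adaptive systems): the share x_i of type i in a population grows exactly when its fitness exceeds the population average,

    \dot{x}_i \;=\; x_i \left( f_i(x) - \bar{f}(x) \right), \qquad \bar{f}(x) \;=\; \sum_j x_j\, f_j(x).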

<span class="mw-page-title-main">Seth Lloyd</span> American mechanical engineer and physicist

Seth Lloyd is a professor of mechanical engineering and physics at the Massachusetts Institute of Technology.

A universal probability bound is a probabilistic threshold whose existence is asserted by William A. Dembski and is used by him in his works promoting intelligent design. It is defined as

A degree of improbability below which a specified event of that probability cannot reasonably be attributed to chance regardless of whatever probabilistic resources from the known universe are factored in.
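
The value Dembski usually attaches to this bound is 10^-150, obtained from a rough count of the probabilistic resources of the observable universe (a sketch of his arithmetic, as commonly quoted):

    10^{80} \text{ (elementary particles)} \times 10^{45} \text{ (state changes per second, roughly the inverse Planck time)} \times 10^{25} \text{ (seconds available)} \;=\; 10^{150},

so an event with probability below 1/10^{150} is, on this argument, not reasonably attributed to chance.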

<span class="mw-page-title-main">George Johnson (writer)</span> American journalist and science writer (born 1952)

George Johnson is an American journalist and science writer.

In logic and theoretical computer science, and specifically proof theory and computational complexity theory, proof complexity is the field aiming to understand and analyse the computational resources that are required to prove or refute statements. Research in proof complexity is predominantly concerned with proving proof-length lower and upper bounds in various propositional proof systems. For example, among the major challenges of proof complexity is showing that the Frege system, the usual propositional calculus, does not admit polynomial-size proofs of all tautologies. Here the size of the proof is simply the number of symbols in it, and a proof is said to be of polynomial size if it is polynomial in the size of the tautology it proves.

Algorithmic information theory (AIT) is a branch of theoretical computer science that concerns itself with the relationship between computation and information of computably generated objects, such as strings or any other data structure. In other words, it is shown within algorithmic information theory that computational incompressibility "mimics" the relations or inequalities found in information theory. According to Gregory Chaitin, it is "the result of putting Shannon's information theory and Turing's computability theory into a cocktail shaker and shaking vigorously."
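
Algorithmic information content (Kolmogorov complexity) is uncomputable, but a general-purpose compressor gives a crude upper bound on it. The following minimal Python sketch (illustrative only, not drawn from any referenced work) shows the sense in which regular data is algorithmically "cheap" while random data is incompressible:

    import os
    import zlib

    def compressed_size(data: bytes) -> int:
        # Length of the zlib-compressed data: a rough upper bound on its
        # algorithmic information content, measured in bytes.
        return len(zlib.compress(data, level=9))

    regular = b"ab" * 50_000          # highly regular: a short program could print it
    noise = os.urandom(100_000)       # random bytes: incompressible with overwhelming probability

    print(compressed_size(regular))   # small: the repetition is captured by the compressor
    print(compressed_size(noise))     # close to 100000: almost no structure to exploit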

Logical depth is a measure of complexity for individual strings devised by Charles H. Bennett based on the computational complexity of an algorithm that can recreate a given piece of information. It differs from Kolmogorov complexity in that it considers the computation time of the algorithm with nearly minimal length, rather than the length of the minimal algorithm.
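
As a toy illustration of the distinction (a sketch only, not Bennett's formal definition), the same long string can be produced either by a very short program that runs for a long time or by storing it verbatim and reading it back instantly; logical depth attends to the running time of the near-shortest description:

    import hashlib
    import time

    def hash_chain(iterations: int = 200_000) -> bytes:
        # A tiny program with a long output: concatenate successive SHA-256
        # digests starting from a fixed seed. The description is short, but
        # reconstructing the output requires many sequential steps.
        h, chunks = b"seed", []
        for _ in range(iterations):
            h = hashlib.sha256(h).digest()
            chunks.append(h)
        return b"".join(chunks)

    start = time.perf_counter()
    data = hash_chain()               # several megabytes from a few lines of code
    print("short description, slow to reconstruct:", time.perf_counter() - start, "s")

    start = time.perf_counter()
    copy = bytes(data)                # the verbatim description: long but essentially instant
    print("verbatim description, fast to reconstruct:", time.perf_counter() - start, "s")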

In algorithmic information theory, sophistication is a measure of complexity related to algorithmic entropy.

In the mathematical fields of graph theory and finite model theory, the logic of graphs deals with formal specifications of graph properties using sentences of mathematical logic. There are several variations in the types of logical operation that can be used in these sentences. The first-order logic of graphs concerns sentences in which the variables and predicates concern individual vertices and edges of a graph, while monadic second-order graph logic allows quantification over sets of vertices or edges. Logics based on least fixed point operators allow more general predicates over tuples of vertices, but these predicates can only be constructed through fixed-point operators, restricting their power.
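
Two standard examples (not tied to any particular text) illustrate the difference in expressive power. The first-order sentence below says that every vertex has at least one neighbour; the monadic second-order sentence expresses 2-colourability by quantifying over a set S of vertices:

    \text{first order:}\qquad \forall u\, \exists v\; E(u, v)
    \text{monadic second order:}\qquad \exists S\; \forall u\, \forall v\; \bigl( E(u,v) \rightarrow (u \in S \leftrightarrow v \notin S) \bigr)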

Continuous-variable (CV) quantum information is the area of quantum information science that makes use of physical observables, like the strength of an electromagnetic field, whose numerical values belong to continuous intervals. One primary application is quantum computing. In a sense, continuous-variable quantum computation is "analog", while quantum computation using qubits is "digital." In more technical terms, the former makes use of Hilbert spaces that are infinite-dimensional, while the Hilbert spaces for systems comprising collections of qubits are finite-dimensional. One motivation for studying continuous-variable quantum computation is to understand what resources are necessary to make quantum computers more powerful than classical ones.
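
Concretely (a standard textbook contrast, not specific to any one platform): a qubit is described by the two-dimensional space C^2, while a single continuous-variable mode, such as one mode of the electromagnetic field, lives in the infinite-dimensional space spanned by the number (Fock) states:

    \text{qubit:}\qquad \mathcal{H} = \mathbb{C}^2 = \operatorname{span}\{\, |0\rangle, |1\rangle \,\}
    \text{one CV mode:}\qquad \mathcal{H} = \operatorname{span}\{\, |n\rangle : n = 0, 1, 2, \dots \,\} \cong L^2(\mathbb{R})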

References

  1. Gell-Mann, Murray; Lloyd, Seth (1996). "Information Measures, Effective Complexity, and Total Information". Complexity. 2 (1): 44–52. Bibcode:1996Cmplx...2a..44G. doi:10.1002/(SICI)1099-0526(199609/10)2:1<44::AID-CPLX10>3.0.CO;2-X.
  2. Ay, Nihat; Müller, Markus; Szkoła, Arleta (2010). "Effective Complexity and Its Relation to Logical Depth". IEEE Transactions on Information Theory. 56 (9): 4593–4607. arXiv:0810.5663. doi:10.1109/TIT.2010.2053892. S2CID 2217934.
  3. McAllister, James W. (2003). "Effective Complexity as a Measure of Information Content". Philosophy of Science. 70 (2): 302–307. doi:10.1086/375469. S2CID 120267550.