Hilbert's sixth problem is to axiomatize those branches of physics in which mathematics is prevalent. It appears on the widely cited list of Hilbert's problems in mathematics that he presented in 1900. [1] In its common English translation, the problem calls for treating, by means of axioms, those physical sciences in which mathematics plays an important part, foremost among them the theory of probabilities and mechanics.
Hilbert gave further explanation of this problem and its possible specific forms, mentioning in particular the theory of probabilities and the limiting processes that lead from the atomistic view to the laws of motion of continua.
David Hilbert himself devoted much of his research to the sixth problem; [3] in particular, he worked in those fields of physics that arose after he stated the problem.
In the 1910s, celestial mechanics evolved into general relativity. Hilbert and Emmy Noether corresponded extensively with Albert Einstein on the formulation of the theory. [4]
In the 1920s, mechanics of microscopic systems evolved into quantum mechanics. Hilbert, with the assistance of John von Neumann, L. Nordheim, and E. P. Wigner, worked on the axiomatic basis of quantum mechanics (see Hilbert space). [5] At the same time, but independently, Dirac formulated quantum mechanics in a way that is close to an axiomatic system, as did Hermann Weyl with the assistance of Erwin Schrödinger.
In the 1930s, probability theory was put on an axiomatic basis by Andrey Kolmogorov, using measure theory.
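Kolmogorov's measure-theoretic axiomatization can be sketched compactly; the following is standard notation, not drawn from the article itself. A probability space is a triple $(\Omega, \mathcal{F}, P)$, where $\Omega$ is a sample space, $\mathcal{F}$ a $\sigma$-algebra of events, and $P$ a measure satisfying:

```latex
\begin{align}
  &\text{(non-negativity)}       && P(E) \ge 0 \quad \text{for all } E \in \mathcal{F},\\
  &\text{(normalization)}        && P(\Omega) = 1,\\
  &\text{(countable additivity)} && P\Big(\bigcup_{i=1}^{\infty} E_i\Big)
     = \sum_{i=1}^{\infty} P(E_i)
     \quad \text{for pairwise disjoint } E_i \in \mathcal{F}.
\end{align}
```

All other rules of probability (complements, conditional probability, the law of total probability) follow as theorems from these three axioms.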
Since the 1960s, following the work of Arthur Wightman and Rudolf Haag, modern quantum field theory can also be considered close to an axiomatic description.
In the 1990s–2000s, the problem of "the limiting processes, there merely indicated, which lead from the atomistic view to the laws of motion of continua" was approached by many groups of mathematicians. The main recent results are summarized by Laure Saint-Raymond, [6] Marshall Slemrod, [7] and Alexander N. Gorban and Ilya Karlin. [8]
Hilbert's sixth problem was a proposal to expand the axiomatic method beyond the existing mathematical disciplines, to physics and beyond. This expansion requires the development of a semantics of physics, including a formal analysis of the notion of physical reality. [9] Two fundamental theories capture the majority of the fundamental phenomena of physics: general relativity and quantum field theory (the Standard Model).
Hilbert considered general relativity as an essential part of the foundation of physics. [11] [12] However, quantum field theory is not logically consistent with general relativity, indicating the need for a still-unknown theory of quantum gravity, where the semantics of physics is expected to play a central role. Hilbert's sixth problem thus remains open. [13] Nevertheless, in recent years it has fostered research on the foundations of physics, with particular emphasis on the role of logic and precision of language. This has led to some interesting results: a direct realization of the uncertainty principle from Cauchy's definition of the derivative, together with the identification of a semantic obstacle in the path of any axiomatic theory of quantum gravity, [14] the unravelling of a logical tautology in the quantum tests of the equivalence principle, [15] and the formal unprovability of the first of Maxwell's equations. [16]
General relativity, also known as the general theory of relativity, and as Einstein's theory of gravity, is the geometric theory of gravitation published by Albert Einstein in 1915 and is the current description of gravitation in modern physics. General relativity generalizes special relativity and refines Newton's law of universal gravitation, providing a unified description of gravity as a geometric property of space and time or four-dimensional spacetime. In particular, the curvature of spacetime is directly related to the energy and momentum of whatever matter and radiation are present. The relation is specified by the Einstein field equations, a system of second-order partial differential equations.
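The Einstein field equations mentioned above can be written compactly in standard notation:

```latex
% Einstein field equations: spacetime curvature (left) sourced by the
% energy-momentum of matter and radiation (right); \Lambda is the
% cosmological constant, G Newton's constant, c the speed of light.
G_{\mu\nu} + \Lambda g_{\mu\nu} = \frac{8\pi G}{c^4}\, T_{\mu\nu},
\qquad
G_{\mu\nu} \equiv R_{\mu\nu} - \tfrac{1}{2} R\, g_{\mu\nu}
```

Written out in coordinates, these are ten coupled, nonlinear, second-order partial differential equations for the metric components $g_{\mu\nu}$, which is the sense in which the relation above is "a system of second-order partial differential equations."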
Quantum gravity (QG) is a field of theoretical physics that seeks to describe gravity according to the principles of quantum mechanics. It deals with environments in which neither gravitational nor quantum effects can be ignored, such as the vicinity of black holes and other compact astrophysical objects such as neutron stars, as well as the early stages of the universe moments after the Big Bang.
In physics, statistical mechanics is a mathematical framework that applies statistical methods and probability theory to large assemblies of microscopic entities. Sometimes called statistical physics or statistical thermodynamics, its applications include many problems in the fields of physics, biology, chemistry, neuroscience, computer science, information theory and sociology. Its main purpose is to clarify the properties of matter in aggregate, in terms of physical laws governing atomic motion.
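A representative formula of this framework, taking the canonical ensemble as an illustrative case (standard notation, not from the article itself): the probability of finding a system in microstate $i$ with energy $E_i$ at temperature $T$ is

```latex
% Canonical (Boltzmann) distribution; k_B is the Boltzmann constant
% and Z, the partition function, normalizes the probabilities.
p_i = \frac{e^{-E_i / k_B T}}{Z},
\qquad
Z = \sum_i e^{-E_i / k_B T}
```

From $Z$ one can derive aggregate thermodynamic quantities such as the average energy and entropy, which is how the framework connects atomic motion to the properties of matter in bulk.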
A theory of everything (TOE), final theory, ultimate theory, unified field theory, or master theory is a hypothetical, singular, all-encompassing, coherent theoretical framework of physics that fully explains and links together all aspects of the universe. Finding a theory of everything is one of the major unsolved problems in physics.
Loop quantum gravity (LQG) is a theory of quantum gravity that incorporates matter of the Standard Model into the framework established for the intrinsic quantum gravity case. It is an attempt to develop a quantum theory of gravity based directly on Albert Einstein's geometric formulation rather than the treatment of gravity as a mysterious mechanism (force). As a theory, LQG postulates that the structure of space and time is composed of finite loops woven into an extremely fine fabric or network. These networks of loops are called spin networks. The evolution of a spin network, or spin foam, has a scale on the order of a Planck length, approximately 10⁻³⁵ meters, and smaller scales are meaningless. Consequently, not just matter, but space itself, has an atomic structure.
Arthur Strong Wightman was an American mathematical physicist. He was one of the founders of the axiomatic approach to quantum field theory, and originated the set of Wightman axioms. With his rigorous treatment of quantum field theories, he promoted research on various aspects of modern mathematical physics.
In mathematical physics, the Wightman axioms, named after Arthur Wightman, are an attempt at a mathematically rigorous formulation of quantum field theory. Arthur Wightman formulated the axioms in the early 1950s, but they were first published only in 1964 after Haag–Ruelle scattering theory affirmed their significance.
The classical limit or correspondence limit is the ability of a physical theory to approximate or "recover" classical mechanics when considered over special values of its parameters. The classical limit is used with physical theories that predict non-classical behavior.
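A standard textbook illustration of such a limit, assuming the WKB ansatz (not discussed in this article): writing the wavefunction as $\psi = A\, e^{iS/\hbar}$, substituting into the Schrödinger equation and letting $\hbar \to 0$ recovers the classical Hamilton–Jacobi equation for the action $S$:

```latex
% Quantum dynamics (left) reduces to classical dynamics (right)
% in the formal limit \hbar \to 0.
i\hbar\,\partial_t \psi
  = -\frac{\hbar^2}{2m}\nabla^2\psi + V\psi
\;\;\xrightarrow[\;\hbar \to 0\;]{}\;\;
\partial_t S + \frac{(\nabla S)^2}{2m} + V = 0
```

The surviving equation contains no $\hbar$, so the quantum theory "recovers" classical mechanics over this special value of its parameter.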
In theoretical physics, the Einstein–Cartan theory, also known as the Einstein–Cartan–Sciama–Kibble theory, is a classical theory of gravitation, one of several alternatives to general relativity. The theory was first proposed by Élie Cartan in 1922.
The Yang–Mills existence and mass gap problem is an unsolved problem in mathematical physics and mathematics, and one of the seven Millennium Prize Problems defined by the Clay Mathematics Institute, which has offered a prize of US$1,000,000 for its solution.
In physics, canonical quantum gravity is an attempt to quantize the canonical formulation of general relativity. It is a Hamiltonian formulation of Einstein's general theory of relativity. The basic theory was outlined by Bryce DeWitt in a seminal 1967 paper, and based on earlier work by Peter G. Bergmann using the so-called canonical quantization techniques for constrained Hamiltonian systems invented by Paul Dirac. Dirac's approach allows the quantization of systems that include gauge symmetries using Hamiltonian techniques in a fixed gauge choice. Newer approaches based in part on the work of DeWitt and Dirac include the Hartle–Hawking state, Regge calculus, the Wheeler–DeWitt equation and loop quantum gravity.
In mathematical physics, Gleason's theorem shows that the rule one uses to calculate probabilities in quantum physics, the Born rule, can be derived from the usual mathematical representation of measurements in quantum physics together with the assumption of non-contextuality. Andrew M. Gleason first proved the theorem in 1957, answering a question posed by George W. Mackey, an accomplishment that was historically significant for the role it played in showing that wide classes of hidden-variable theories are inconsistent with quantum physics. Multiple variations have been proven in the years since. Gleason's theorem is of particular importance for the field of quantum logic and its attempt to find a minimal set of mathematical axioms for quantum theory.
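The Born rule that the theorem derives can be stated compactly, assuming the standard density-operator formalism (and a Hilbert space of dimension at least 3, as Gleason's theorem requires):

```latex
% Gleason's theorem: any non-contextual probability assignment on
% projections \Pi must take the Born-rule form, for some density
% operator \rho (positive semidefinite, unit trace).
P(\Pi) = \operatorname{Tr}(\rho\,\Pi),
\qquad
\rho \succeq 0,\quad \operatorname{Tr}\rho = 1
```

In other words, once measurements are represented by projections and probabilities are required to be additive over orthogonal outcomes, the trace formula is the only possibility.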
Objective-collapse theories, also known as spontaneous collapse models or dynamical reduction models, are proposed solutions to the measurement problem in quantum mechanics. As with other interpretations of quantum mechanics, they are possible explanations of why and how quantum measurements always give definite outcomes, not a superposition of them as predicted by the Schrödinger equation, and more generally how the classical world emerges from quantum theory. The fundamental idea is that the unitary evolution of the wave function describing the state of a quantum system is approximate. It works well for microscopic systems, but progressively loses its validity when the mass and complexity of the system increase.
Categorical quantum mechanics is the study of quantum foundations and quantum information using paradigms from mathematics and computer science, notably monoidal category theory. The primitive objects of study are physical processes, and the different ways that these can be composed. It was pioneered in 2004 by Samson Abramsky and Bob Coecke. Categorical quantum mechanics is entry 18M40 in MSC2020.
The Koopman–von Neumann (KvN) theory is a description of classical mechanics as an operatorial theory similar to quantum mechanics, based on a Hilbert space of complex, square-integrable wavefunctions. As its name suggests, the KvN theory is loosely related to work by Bernard Koopman and John von Neumann in 1931 and 1932, respectively. As explained in this entry, however, the historical origins of the theory and its name are complicated.
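The operatorial form of classical mechanics described above can be sketched as follows, assuming a classical Hamiltonian $H(q,p)$ (standard KvN notation, not taken from this article):

```latex
% KvN sketch: classical wavefunctions \psi(q,p,t) on phase space evolve
% under the self-adjoint Liouvillian \hat{L}, in direct analogy with the
% Schr\"odinger equation; |\psi|^2 then obeys the classical Liouville
% equation for the phase-space density.
i\,\partial_t \psi = \hat{L}\,\psi,
\qquad
\hat{L} = -i\left(
  \frac{\partial H}{\partial p}\frac{\partial}{\partial q}
  - \frac{\partial H}{\partial q}\frac{\partial}{\partial p}
\right)
```

Because $\hat{L}$ is first order in the derivatives (unlike the Schrödinger Hamiltonian), the phases of $\psi$ decouple from the dynamics of $|\psi|^2$, which is one precise sense in which the theory remains classical despite its Hilbert-space formulation.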
In theoretical physics, the problem of time is a conceptual conflict between general relativity and quantum mechanics in that quantum mechanics regards the flow of time as universal and absolute, whereas general relativity regards the flow of time as malleable and relative. This problem raises the question of what time really is in a physical sense and whether it is truly a real, distinct phenomenon. It also involves the related question of why time seems to flow in a single direction, despite the fact that no known physical laws at the microscopic level seem to require a single direction.
Alexander Nikolaevich Gorban is a scientist of Russian origin, working in the United Kingdom. He is a professor at the University of Leicester, and director of its Mathematical Modeling Centre. Gorban has contributed to many areas of fundamental and applied science, including statistical physics, non-equilibrium thermodynamics, machine learning and mathematical biology.
Complex spacetime is a mathematical framework that combines the concepts of complex numbers and spacetime in physics. In this framework, the usual real-valued coordinates of spacetime are replaced with complex-valued coordinates. This allows for the inclusion of imaginary components in the description of spacetime, which can have interesting implications in certain areas of physics, such as quantum field theory and string theory.
A generalized probabilistic theory (GPT) is a general framework to describe the operational features of arbitrary physical theories. A GPT must specify what kind of physical systems one can find in the lab, as well as rules to compute the outcome statistics of any experiment involving labeled preparations, transformations and measurements. The framework of GPTs has been used to define hypothetical non-quantum physical theories which nonetheless possess quantum theory's most remarkable features, such as entanglement or teleportation. Notably, a small set of physically motivated axioms is enough to single out the GPT representation of quantum theory.
Robert Schrader was a German theoretical and mathematical physicist. He is known for the Osterwalder–Schrader axioms.