Hilbert's sixth problem

Hilbert's sixth problem is to axiomatize those branches of physics in which mathematics is prevalent. It is the sixth of the celebrated list of problems that David Hilbert presented in 1900. [1] In its common English translation, the explicit statement reads:

Stairs of model reduction from microscopic dynamics (the atomistic view) to macroscopic continuum dynamics (the laws of motion of continua)
6. Mathematical Treatment of the Axioms of Physics. The investigations on the foundations of geometry suggest the problem: To treat in the same manner, by means of axioms, those physical sciences in which already today mathematics plays an important part; in the first rank are the theory of probabilities and mechanics.

Hilbert gave further explanation of this problem and its possible specific forms:

"As to the axioms of the theory of probabilities, it seems to me desirable that their logical investigation should be accompanied by a rigorous and satisfactory development of the method of mean values in mathematical physics, and in particular in the kinetic theory of gases. ... Boltzmann's work on the principles of mechanics suggests the problem of developing mathematically the limiting processes, there merely indicated, which lead from the atomistic view to the laws of motion of continua."

History

David Hilbert himself devoted much of his research to the sixth problem; [3] in particular, he worked in those fields of physics that arose after he stated the problem.

In the 1910s, celestial mechanics evolved into general relativity. Hilbert and Emmy Noether corresponded extensively with Albert Einstein on the formulation of the theory. [4]

In the 1920s, mechanics of microscopic systems evolved into quantum mechanics. Hilbert, with the assistance of John von Neumann, L. Nordheim, and E. P. Wigner, worked on the axiomatic basis of quantum mechanics (see Hilbert space). [5] At the same time, but independently, Dirac formulated quantum mechanics in a way that is close to an axiomatic system, as did Hermann Weyl with the assistance of Erwin Schrödinger.

In the 1930s, probability theory was put on an axiomatic basis by Andrey Kolmogorov, using measure theory.
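Kolmogorov's axiomatization is short enough to state in full; the following sketch uses standard measure-theoretic notation:

```latex
% A probability space is a triple (\Omega, \mathcal{F}, P), where \Omega is a set
% of outcomes, \mathcal{F} is a \sigma-algebra of subsets of \Omega (the events),
% and P : \mathcal{F} \to [0,1] satisfies the three Kolmogorov axioms:
\begin{align}
  &\text{(K1)}\quad P(A) \ge 0 \quad \text{for all } A \in \mathcal{F}, \\
  &\text{(K2)}\quad P(\Omega) = 1, \\
  &\text{(K3)}\quad P\Bigl(\bigcup_{i=1}^{\infty} A_i\Bigr)
    = \sum_{i=1}^{\infty} P(A_i)
    \quad \text{for pairwise disjoint } A_1, A_2, \ldots \in \mathcal{F}.
\end{align}
```

All of classical probability theory, including the "method of mean values" Hilbert asked about, can be developed from these axioms together with the machinery of measure and integration.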

Since the 1960s, following the work of Arthur Wightman and Rudolf Haag, modern quantum field theory can also be considered close to an axiomatic description.

In the 1990s and 2000s, the problem of "the limiting processes, there merely indicated, which lead from the atomistic view to the laws of motion of continua" was approached by many groups of mathematicians. The main recent results are summarized by Laure Saint-Raymond, [6] Marshall Slemrod, [7] and Alexander N. Gorban and Ilya Karlin. [8]
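The limiting process Hilbert referred to can be sketched schematically in modern notation (this is the standard hydrodynamic-limit setting, not the full statement of the results cited above):

```latex
% Boltzmann equation for the one-particle distribution f(t, x, v), scaled by
% the Knudsen number \varepsilon (ratio of mean free path to macroscopic length):
\partial_t f + v \cdot \nabla_x f = \frac{1}{\varepsilon}\, Q(f, f),
% where Q is the collision operator. The hydrodynamic fields are moments of f:
\rho = \int f \, dv, \qquad
\rho u = \int v f \, dv, \qquad
\rho \Bigl( \tfrac{|u|^2}{2} + \tfrac{3}{2}\theta \Bigr) = \int \tfrac{|v|^2}{2} f \, dv.
% In the formal limit \varepsilon \to 0, f approaches a local Maxwellian and the
% moments (\rho, u, \theta) satisfy the compressible Euler equations; the rigorous
% justification of such limits is the "atomistic view to continua" problem.
```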

Status

Hilbert's sixth problem was a proposal to expand the axiomatic method beyond the existing mathematical disciplines, to physics and further. This expansion requires the development of a semantics of physics, including a formal analysis of the notion of physical reality. [9] Two fundamental theories capture the majority of the fundamental phenomena of physics: general relativity, which describes gravitation at macroscopic scales, and quantum field theory, which describes the remaining fundamental interactions.

Hilbert considered general relativity an essential part of the foundation of physics. [11] [12] However, quantum field theory is not logically consistent with general relativity, indicating the need for a still-unknown theory of quantum gravity, in which the semantics of physics is expected to play a central role. Hilbert's sixth problem thus remains open. [13] Nevertheless, in recent years it has fostered research on the foundations of physics, with particular emphasis on the role of logic and the precision of language. This has led to some interesting results: a direct realization of the uncertainty principle from Cauchy's definition of the derivative, and the identification of a semantic obstacle in the path of any axiomatic theory of quantum gravity; [14] the unravelling of a logical tautology in the quantum tests of the equivalence principle; [15] and the formal unprovability of the first Maxwell equation. [16]

Notes

  1. Hilbert, David (1902). "Mathematical Problems". Bulletin of the American Mathematical Society. 8 (10): 437–479. doi:10.1090/S0002-9904-1902-00923-3. MR 1557926. Earlier publications (in the original German) appeared in Göttinger Nachrichten, 1900, pp. 253–297, and Archiv der Mathematik und Physik, 3rd series, vol. 1 (1901), pp. 44–63, 213–237.
  2. Gorban, Alexander N.; Karlin, Ilya V. (2005). Invariant Manifolds for Physical and Chemical Kinetics. Lecture Notes in Physics, vol. 660. Berlin, Heidelberg: Springer. doi:10.1007/b98103. ISBN 978-3-540-22684-0.
  3. Corry, L. (1997). "David Hilbert and the axiomatization of physics (1894–1905)". Archive for History of Exact Sciences. 51 (2): 83–198. doi:10.1007/BF00375141.
  4. Sauer 1999, p. 6.
  5. van Hove, Léon (1958). "Von Neumann's contributions to quantum theory". Bull. Amer. Math. Soc. 64 (3): 95–99. doi:10.1090/s0002-9904-1958-10206-2. MR 0092587. Zbl 0080.00416.
  6. Saint-Raymond, L. (2009). Hydrodynamic Limits of the Boltzmann Equation. Lecture Notes in Mathematics, vol. 1971. Springer-Verlag. doi:10.1007/978-3-540-92847-8. ISBN 978-3-540-92847-8.
  7. Slemrod, M. (2013). "From Boltzmann to Euler: Hilbert's 6th problem revisited". Comput. Math. Appl. 65 (10): 1497–1501. doi:10.1016/j.camwa.2012.08.016. MR 3061719.
  8. Gorban, A.N.; Karlin, I. (2014). "Hilbert's 6th Problem: exact and approximate hydrodynamic manifolds for kinetic equations". Bull. Amer. Math. Soc. 51 (2): 186–246. arXiv:1310.0406. doi:10.1090/S0273-0979-2013-01439-3.
  9. Gorban, A.N. (2018). "Hilbert's sixth problem: the endless road to rigour". Phil. Trans. R. Soc. A. 376 (2118): 20170238. arXiv:1803.03599. Bibcode:2018RSPTA.37670238G. doi:10.1098/rsta.2017.0238. PMID 29555808.
  10. Wightman, A.S. (1976). "Hilbert's sixth problem: Mathematical treatment of the axioms of physics". In Felix E. Browder (ed.). Mathematical Developments Arising from Hilbert Problems. Proceedings of Symposia in Pure Mathematics, vol. XXVIII. American Mathematical Society. pp. 147–240. ISBN 0-8218-1428-1.
  11. Hilbert, David (1915). "Die Grundlagen der Physik. (Erste Mitteilung)". Nachrichten von der Gesellschaft der Wissenschaften zu Göttingen, Mathematisch-physikalische Klasse. 1915: 395–407.
  12. Sauer 1999.
  13. Theme issue "Hilbert's sixth problem". Phil. Trans. R. Soc. A. 376 (2118). 2018. doi:10.1098/rsta/376/2118.
  14. Majhi, A. (2022). "Cauchy's Logico-Linguistic Slip, the Heisenberg Uncertainty Principle and a Semantic Dilemma Concerning 'Quantum Gravity'". International Journal of Theoretical Physics. 61 (3). arXiv:2204.00418. doi:10.1007/s10773-022-05051-8.
  15. Majhi, A.; Sardar, G. (2023). "Scientific value of the quantum tests of equivalence principle in light of Hilbert's sixth problem". Pramana - J Phys. 97 (1). arXiv:2301.06327. doi:10.1007/s12043-022-02504-x.
  16. Majhi, A. (2023). "Unprovability of first Maxwell's equation in light of EPR's completeness condition: a computational approach from logico-linguistic perspective". Pramana - J Phys. 61 (4). arXiv:2310.14930. doi:10.1007/s12043-023-02594-1.
