Past hypothesis

In cosmology, the past hypothesis is a fundamental law of physics that postulates that the universe started in a low-entropy state, [1] in accordance with the second law of thermodynamics. The second law states that the entropy of an isolated system never decreases, which gives thermodynamic processes an arrow of time. Applying this idea to the entire universe, the hypothesis argues that the universe must have started from a special state with much less entropy than is currently observed, in order to preserve the arrow of time globally.

This idea has been discussed since the development of statistical mechanics, [Note 1] but the term "past hypothesis" was coined by philosopher David Albert in 2000. [2] [3] Philosophical and theoretical efforts focus on explaining the origin and consistency of the postulate. [4]

The past hypothesis is an exception to the principle of indifference, according to which every possible microstate within a given macrostate has an equal probability. The past hypothesis allows only those microstates that are compatible with a much-lower-entropy past, although within that restricted set the microstates are still assigned equal probabilities. If the principle of indifference were applied without the past hypothesis, a low- or medium-entropy state would most likely have evolved both from and toward higher-entropy macrostates, since there are statistically far more ways to be high-entropy than low-entropy. The low- or medium-entropy state would then appear as a "statistical fluctuation" amid a higher-entropy past and a higher-entropy future. [5]
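The counting argument can be made concrete with a toy model: for a system of N two-state particles, a macrostate with k "excited" particles contains C(N, k) microstates, so near-even macrostates vastly outnumber low-entropy extremes. The sketch below is purely illustrative; the model and numbers are not from the cited sources:

```python
from math import comb, log

# Toy model: N two-state particles; macrostate k = "k particles excited".
# The multiplicity of macrostate k is C(N, k), so a microstate chosen
# uniformly at random (principle of indifference) almost certainly lies
# in a near-maximal-entropy macrostate with k close to N/2.
N = 100
for k in (0, 10, 50):
    omega = comb(N, k)      # number of microstates in this macrostate
    s = log(omega)          # Boltzmann entropy in units of k_B
    print(f"k={k:3d}  Omega={omega:.3e}  S/k_B={s:.1f}")
```

For N = 100, the k = 50 macrostate has on the order of 10^29 microstates while k = 0 has exactly one, which illustrates why, absent the past hypothesis, a randomly selected past would overwhelmingly be a high-entropy one.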

Theoretical frameworks based on inflationary models or the anthropic principle have been developed to explain the origin of the past hypothesis. [1] [2] The Weyl curvature hypothesis, an alternative model by Roger Penrose, posits a link between entropy, the arrow of time, and the curvature of spacetime (encoded in the Weyl tensor). [2] [6]

Notes

  1. See Ludwig Boltzmann, Vorlesungen über Gastheorie ("Lectures on Gas Theory", 1896).

Related Research Articles

The anthropic principle, also known as the "observation selection effect", is the hypothesis, first proposed in 1957 by Robert Dicke, that the range of possible observations that could be made about the universe is limited by the fact that observations could happen only in a universe capable of developing intelligent life. Proponents of the anthropic principle argue that it explains why the universe has the age and the fundamental physical constants necessary to accommodate conscious life, since if either had been different, no one would have been around to make observations. Anthropic reasoning is often used to deal with the idea that the universe seems to be finely tuned for the existence of life.

Entropy: property of a thermodynamic system

Entropy is a scientific concept that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change, and information systems including the transmission of information in telecommunication.

In physics, statistical mechanics is a mathematical framework that applies statistical methods and probability theory to large assemblies of microscopic entities. It does not assume or postulate any natural laws, but explains the macroscopic behavior of nature from the behavior of such ensembles.

Second law of thermodynamics: physical law for entropy and heat

The second law of thermodynamics is a physical law based on universal empirical observation concerning heat and energy interconversions. A simple statement of the law is that heat always flows spontaneously from hotter to colder regions of matter. Another statement is: "Not all heat can be converted into work in a cyclic process."

Phase space: space of all possible states that a system can take

In dynamical systems theory and control theory, a phase space or state space is a space in which all possible "states" of a dynamical system or a control system are represented, with each possible state corresponding to one unique point in the phase space. For mechanical systems, the phase space usually consists of all possible values of the position and momentum variables; it is the direct product of configuration space and momentum space. The concept of phase space was developed in the late 19th century by Ludwig Boltzmann, Henri Poincaré, and Josiah Willard Gibbs.

Arrow of time: concept in physics of one-way time

The arrow of time, also called time's arrow, is the concept positing the "one-way direction" or "asymmetry" of time. It was developed in 1927 by the British astrophysicist Arthur Eddington and remains an unsolved question in physics. This direction, according to Eddington, could be determined by studying the organization of atoms, molecules, and bodies, and might be drawn upon a four-dimensional relativistic map of the world.

Ludwig Boltzmann: Austrian physicist and philosopher (1844–1906)

Ludwig Eduard Boltzmann was an Austrian physicist and philosopher. His greatest achievements were the development of statistical mechanics and the statistical explanation of the second law of thermodynamics. In 1877 he provided the current definition of entropy, S = k_B ln Ω, where Ω is the number of microstates whose energy equals the system's energy, interpreted as a measure of the statistical disorder of a system. Max Planck named the constant k_B the Boltzmann constant.

Irreversible process: process that cannot be undone

In science, a process that is not reversible is called irreversible. This concept arises frequently in thermodynamics. All complex natural processes are irreversible, although a phase transition at the coexistence temperature is well approximated as reversible.

In physics, Loschmidt's paradox, also known as the reversibility paradox, irreversibility paradox, or Umkehreinwand, is the objection that it should not be possible to deduce an irreversible process from time-symmetric dynamics. This puts the time reversal symmetry of (almost) all known low-level fundamental physical processes at odds with any attempt to infer from them the second law of thermodynamics which describes the behaviour of macroscopic systems. Both of these are well-accepted principles in physics, with sound observational and theoretical support, yet they seem to be in conflict, hence the paradox.

Laws of thermodynamics: observational basis of thermodynamics

The laws of thermodynamics are a set of scientific laws which define a group of physical quantities, such as temperature, energy, and entropy, that characterize thermodynamic systems in thermodynamic equilibrium. The laws also use various parameters for thermodynamic processes, such as thermodynamic work and heat, and establish relationships between them. They state empirical facts that form a basis of precluding the possibility of certain phenomena, such as perpetual motion. In addition to their use in thermodynamics, they are important fundamental laws of physics in general and are applicable in other natural sciences.

In physics, maximum entropy thermodynamics views equilibrium thermodynamics and statistical mechanics as inference processes. More specifically, MaxEnt applies inference techniques rooted in Shannon information theory, Bayesian probability, and the principle of maximum entropy. These techniques are relevant to any situation requiring prediction from incomplete or insufficient data. MaxEnt thermodynamics began with two papers by Edwin T. Jaynes published in the 1957 Physical Review.

Microstate (statistical mechanics): specific microscopic configuration of a thermodynamic system

In statistical mechanics, a microstate is a specific configuration of a system that describes the precise positions and momenta of all the individual particles or components that make up the system. Each microstate has a certain probability of occurring during the course of the system's thermal fluctuations.

The Weyl curvature hypothesis, which arises in the application of Albert Einstein's general theory of relativity to physical cosmology, was introduced by the British mathematician and theoretical physicist Roger Penrose in a 1979 article in an attempt to provide explanations for two of the most fundamental issues in physics. On the one hand, one would like to account for a universe which on its largest observational scales appears remarkably spatially homogeneous and isotropic in its physical properties; on the other hand, there is the deep question of the origin of the second law of thermodynamics.

The mathematical expressions for thermodynamic entropy in the statistical thermodynamics formulation established by Ludwig Boltzmann and J. Willard Gibbs in the 1870s are similar to the information entropy of Claude Shannon and Ralph Hartley, developed in the 1940s.

The concept of entropy was first developed by German physicist Rudolf Clausius in the mid-nineteenth century as a thermodynamic property that predicts that certain spontaneous processes are irreversible or impossible. In statistical mechanics, entropy is formulated as a statistical property using probability theory. The statistical entropy perspective was introduced in 1870 by Austrian physicist Ludwig Boltzmann, who established a new field of physics that provided the descriptive linkage between the macroscopic observation of nature and the microscopic view based on the rigorous treatment of large ensembles of microscopic states that constitute thermodynamic systems.

Entropy is one of the few quantities in the physical sciences that require a particular direction for time, sometimes called an arrow of time. As one goes "forward" in time, the second law of thermodynamics says, the entropy of an isolated system can increase, but not decrease. Thus, entropy measurement is a way of distinguishing the past from the future. In thermodynamic systems that are not isolated, local entropy can decrease over time, accompanied by a compensating entropy increase in the surroundings; examples include objects undergoing cooling, living systems, and the formation of typical crystals.

Introduction to entropy: non-technical introduction to entropy

In thermodynamics, entropy is a numerical quantity that shows that many physical processes can go in only one direction in time. For example, cream and coffee can be mixed together, but cannot be "unmixed"; a piece of wood can be burned, but cannot be "unburned". The word 'entropy' has entered popular usage to refer to a lack of order or predictability, or to a gradual decline into disorder. A more physical interpretation of thermodynamic entropy refers to the spread of energy or matter, or to the extent and diversity of microscopic motion.

Boltzmann's entropy formula: equation in statistical mechanics

In statistical mechanics, Boltzmann's equation is a probability equation relating the entropy S of an ideal gas to the multiplicity W, the number of real microstates corresponding to the gas's macrostate: S = k_B ln W.

Branches of physics

Physics is a scientific discipline that seeks to construct and experimentally test theories of the physical universe. These theories vary in their scope and can be organized into several distinct branches, which are outlined in this article.

Conformal cyclic cosmology (CCC) is a cosmological model in the framework of general relativity and proposed by theoretical physicist Roger Penrose. In CCC, the universe iterates through infinite cycles, with the future timelike infinity of each previous iteration being identified with the Big Bang singularity of the next. Penrose popularized this theory in his 2010 book Cycles of Time: An Extraordinary New View of the Universe.

References

  1. Callender, Craig (2011-04-07). The Oxford Handbook of Philosophy of Time. Oxford University Press. ISBN 978-0-19-161724-9.
  2. Ainsworth, Peter Mark (2008). "Cosmic inflation and the past hypothesis". Synthese. 162 (2): 157–165. doi:10.1007/s11229-007-9179-4. ISSN 0039-7857. S2CID 33523028.
  3. Albert, David Z. (2009-06-30). Time and Chance. Harvard University Press. ISBN 978-0-674-26138-9.
  4. Falk, Dan (2016-07-19). "A Debate Over the Physics of Time". Quanta Magazine. Retrieved 2022-02-27.
  5. Carroll, Sean (2010). From Eternity to Here. New York: Dutton. pp. 176–178.
  6. Penrose, R. (1979). "Singularities and time-asymmetry". In S. W. Hawking; W. Israel (eds.). General Relativity: An Einstein Centenary Survey.