Heat death paradox

The heat death paradox, also known as the thermodynamic paradox, Clausius' paradox, and Kelvin's paradox, [1] is a reductio ad absurdum argument that uses thermodynamics to show the impossibility of an infinitely old universe. It was formulated in February 1862 by Lord Kelvin and expanded upon by Hermann von Helmholtz and William John Macquorn Rankine. [2] [3]

The paradox

Assuming that the universe is eternal, a question arises: How is it that thermodynamic equilibrium has not already been achieved? [4]

This theoretical paradox is directed at the then-mainstream classical view of a sempiternal universe, in which matter is postulated to be everlasting and the universe to have always existed in recognisably its present form. The heat death paradox is born of a paradigm built on these fundamental ideas about the cosmos; resolving the paradox requires changing the paradigm.

The paradox was based upon the rigid mechanical point of view of the second law of thermodynamics postulated by Rudolf Clausius and Lord Kelvin, according to which heat can only be transferred spontaneously from a warmer to a colder object. It notes: if the universe were eternal, as claimed classically, it should already be cold and isotropic (its objects should have the same temperature, and the distribution of matter or radiation should be even). [4] Kelvin compared the universe to a clock that runs slower and slower, constantly dissipating energy as impalpable heat, although he was unsure whether it would eventually stop forever (reach thermodynamic equilibrium). According to this model, the existence of usable energy, which can be used to perform work and produce entropy, means that the clock has not stopped, since a conversion of heat into mechanical energy (what Kelvin called a rejuvenating universe scenario) is not contemplated. [5] [2]

According to the laws of thermodynamics, any hot object transfers heat to its cooler surroundings, until everything is at the same temperature. For two objects at the same temperature, as much heat flows from one body as flows from the other, and the net effect is no change. If the universe were infinitely old, there would have been enough time for the stars to cool and warm their surroundings. Everywhere should therefore be at the same temperature, and there should either be no stars, or everything should be as hot as stars. The universe should thus achieve, or asymptotically tend to, thermodynamic equilibrium, which corresponds to a state where no thermodynamic free energy is left, and therefore no further work is possible: this is the heat death of the universe, as predicted by Lord Kelvin in 1852. The average temperature of the cosmos should also asymptotically approach absolute zero, and it is possible that a maximum-entropy state will be reached. [6]
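
The equilibration argument above can be made quantitative with a standard textbook relation, restated here for illustration: when a small quantity of heat passes from a hotter body to a colder one, the total entropy change is positive, and it vanishes only when the two temperatures are equal, i.e. at equilibrium.

```latex
\Delta S \;=\; \frac{\delta Q}{T_c} - \frac{\delta Q}{T_h}
\;=\; \delta Q \left( \frac{1}{T_c} - \frac{1}{T_h} \right) \;>\; 0
\qquad \text{for } T_h > T_c ,
```

where $\delta Q$ is the heat transferred, $T_h$ the temperature of the hotter body, and $T_c$ that of the colder one. Once $T_h = T_c$, no further spontaneous heat flow (and hence no entropy production) is possible.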

Kelvin's solution

In February 1862, Lord Kelvin used the existence of the Sun and the stars as empirical proof that the universe has not achieved thermodynamic equilibrium, as entropy production and the extraction of work are still possible, and there are temperature differences between objects. Helmholtz and Rankine expanded Kelvin's work soon after. [2] Since there are stars and colder objects, the universe is not in thermodynamic equilibrium, so it cannot be infinitely old.

Modern cosmology

The paradox does not arise in the Big Bang model or its successful Lambda-CDM refinement, which posit that the universe began roughly 13.8 billion years ago, not long enough ago for the universe to have approached thermodynamic equilibrium. Some proposed further refinements, termed eternal inflation, restore Kelvin's idea of unending time in the more complicated form of an eternal, exponentially expanding multiverse in which mutually inaccessible baby universes, some of which resemble the universe we inhabit, are continually being born.

Olbers' paradox is another argument against an infinitely old universe, but it applies only to a static universe scenario and, unlike Kelvin's paradox, relies on cosmology rather than thermodynamics. The Boltzmann brain thought experiment can also be related to Kelvin's paradox: it concerns the spontaneous generation of a brain (complete with false memories) from entropy fluctuations, in a universe that has been lying in a heat death state for an indefinite amount of time. [7]

Related Research Articles

Absolute zero

Absolute zero is the lowest limit of the thermodynamic temperature scale; a state at which the enthalpy and entropy of a cooled ideal gas reach their minimum value, taken as zero kelvin. The fundamental particles of nature have minimum vibrational motion, retaining only quantum mechanical, zero-point energy-induced particle motion. The theoretical temperature is determined by extrapolating the ideal gas law; by international agreement, absolute zero is taken as −273.15 degrees on the Celsius scale, which equals −459.67 degrees on the Fahrenheit scale. The corresponding Kelvin and Rankine temperature scales set their zero points at absolute zero by definition.
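
The scale conversions quoted above can be checked with a few lines of code; this is a minimal sketch, and the function names are illustrative rather than from any particular library. The constants (273.15 and 459.67) are the standard offsets mentioned in the text.

```python
# Convert absolute zero between common temperature scales.
# Offsets are the standard, internationally agreed values.

def celsius_to_kelvin(t_c):
    # The kelvin scale is offset from Celsius by exactly 273.15.
    return t_c + 273.15

def celsius_to_fahrenheit(t_c):
    # Exact Celsius-to-Fahrenheit conversion.
    return t_c * 9.0 / 5.0 + 32.0

def fahrenheit_to_rankine(t_f):
    # The Rankine scale is offset from Fahrenheit by exactly 459.67.
    return t_f + 459.67

absolute_zero_c = -273.15  # degrees Celsius, by international agreement
print(celsius_to_kelvin(absolute_zero_c))       # zero kelvin
print(celsius_to_fahrenheit(absolute_zero_c))   # about -459.67 degrees Fahrenheit
print(fahrenheit_to_rankine(
    celsius_to_fahrenheit(absolute_zero_c)))    # about zero degrees Rankine
```

Both the Kelvin and Rankine results come out at (essentially) zero, confirming that the two absolute scales share their zero point.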

Entropy

Entropy is a scientific concept that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change, and information systems including the transmission of information in telecommunication.

Thermodynamics

Thermodynamics is a branch of physics that deals with heat, work, and temperature, and their relation to energy, entropy, and the physical properties of matter and radiation. The behavior of these quantities is governed by the four laws of thermodynamics which convey a quantitative description using measurable macroscopic physical quantities, but may be explained in terms of microscopic constituents by statistical mechanics. Thermodynamics applies to a wide variety of topics in science and engineering, especially physical chemistry, biochemistry, chemical engineering and mechanical engineering, but also in other complex fields such as meteorology.

Thermodynamic free energy

In thermodynamics, the thermodynamic free energy is one of the state functions of a thermodynamic system. The change in the free energy is the maximum amount of work that the system can perform in a process at constant temperature, and its sign indicates whether the process is thermodynamically favorable or forbidden. Since free energy usually contains potential energy, it is not absolute but depends on the choice of a zero point. Therefore, only relative free energy values, or changes in free energy, are physically meaningful.
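
The "maximum work" statement above corresponds, for a process at constant temperature, to the standard relation in Helmholtz form, sketched here for illustration (an analogous Gibbs form holds at constant temperature and pressure):

```latex
F = U - TS, \qquad W_{\max} = -\Delta F \quad (\text{constant } T),
```

where $U$ is the internal energy, $T$ the absolute temperature, and $S$ the entropy. A negative $\Delta F$ marks a thermodynamically favorable process; a positive one marks a process that requires work input.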

Thermodynamic temperature

Thermodynamic temperature is a quantity defined in thermodynamics as distinct from kinetic theory or statistical mechanics.

Timeline of thermodynamics

A timeline of events in the history of thermodynamics.

Second law of thermodynamics

The second law of thermodynamics is a physical law based on universal empirical observation concerning heat and energy interconversions. A simple statement of the law is that heat always flows spontaneously from hotter to colder regions of matter. Another statement is: "Not all heat can be converted into work in a cyclic process."

Heat death of the universe

The heat death of the universe is a hypothesis on the ultimate fate of the universe, which suggests the universe will evolve to a state of no thermodynamic free energy, and will therefore be unable to sustain processes that increase entropy. Heat death does not imply any particular absolute temperature; it only requires that temperature differences or other processes may no longer be exploited to perform work. In the language of physics, this is when the universe reaches thermodynamic equilibrium.

Irreversible process

In science, a process that is not reversible is called irreversible. This concept arises frequently in thermodynamics. All complex natural processes are irreversible, although a phase transition at the coexistence temperature is well approximated as reversible.

Loschmidt's paradox

In physics, Loschmidt's paradox, also known as the reversibility paradox, irreversibility paradox, or Umkehreinwand, is the objection that it should not be possible to deduce an irreversible process from time-symmetric dynamics. This puts the time reversal symmetry of (almost) all known low-level fundamental physical processes at odds with any attempt to infer from them the second law of thermodynamics, which describes the behaviour of macroscopic systems. Both of these are well-accepted principles in physics, with sound observational and theoretical support, yet they seem to be in conflict, hence the paradox.

Laws of thermodynamics

The laws of thermodynamics are a set of scientific laws which define a group of physical quantities, such as temperature, energy, and entropy, that characterize thermodynamic systems in thermodynamic equilibrium. The laws also use various parameters for thermodynamic processes, such as thermodynamic work and heat, and establish relationships between them. They state empirical facts that form a basis of precluding the possibility of certain phenomena, such as perpetual motion. In addition to their use in thermodynamics, they are important fundamental laws of physics in general and are applicable in other natural sciences.

History of thermodynamics

The history of thermodynamics is a fundamental strand in the history of physics, the history of chemistry, and the history of science in general. Owing to the relevance of thermodynamics in much of science and technology, its history is finely woven with the developments of classical mechanics, quantum mechanics, magnetism, and chemical kinetics, with more distant applied fields such as meteorology, information theory, and biology (physiology), and with technological developments such as the steam engine, internal combustion engine, cryogenics and electricity generation. The development of thermodynamics both drove and was driven by atomic theory. It also, albeit in a subtle manner, motivated new directions in probability and statistics; see, for example, the timeline of thermodynamics.

Thermodynamic state

In thermodynamics, a thermodynamic state of a system is its condition at a specific time; that is, fully identified by values of a suitable set of parameters known as state variables, state parameters or thermodynamic variables. Once such a set of values of thermodynamic variables has been specified for a system, the values of all thermodynamic properties of the system are uniquely determined. Usually, by default, a thermodynamic state is taken to be one of thermodynamic equilibrium. This means that the state is not merely the condition of the system at a specific time, but that the condition is the same, unchanging, over an indefinitely long duration of time.

The concept of entropy developed in response to the observation that a certain amount of functional energy released from combustion reactions is always lost to dissipation or friction and is thus not transformed into useful work. Early heat-powered engines such as Thomas Savery's (1698), the Newcomen engine (1712) and the Cugnot steam tricycle (1769) were inefficient, converting less than two percent of the input energy into useful work output; a great deal of useful energy was dissipated or lost. Over the next two centuries, physicists investigated this puzzle of lost energy; the result was the concept of entropy.

The concept of entropy was first developed by German physicist Rudolf Clausius in the mid-nineteenth century as a thermodynamic property that predicts that certain spontaneous processes are irreversible or impossible. In statistical mechanics, entropy is formulated as a statistical property using probability theory. The statistical entropy perspective was introduced in 1870 by Austrian physicist Ludwig Boltzmann, who established a new field of physics that provided the descriptive linkage between the macroscopic observation of nature and the microscopic view based on the rigorous treatment of large ensembles of microscopic states that constitute thermodynamic systems.

Entropy is one of the few quantities in the physical sciences that require a particular direction for time, sometimes called an arrow of time. As one goes "forward" in time, the second law of thermodynamics says, the entropy of an isolated system can increase, but not decrease. Thus, entropy measurement is a way of distinguishing the past from the future. In thermodynamic systems that are not isolated, local entropy can decrease over time, accompanied by a compensating entropy increase in the surroundings; examples include objects undergoing cooling, living systems, and the formation of typical crystals.
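
The directional statement above is compactly written as the standard second-law inequality for isolated systems, with the non-isolated case expressed by including the surroundings (restated here for illustration):

```latex
\Delta S_{\text{isolated}} \;\ge\; 0,
\qquad
\Delta S_{\text{system}} + \Delta S_{\text{surroundings}} \;\ge\; 0 .
```

A cooling object or a growing crystal lowers its own entropy, but only by exporting at least as much entropy to its surroundings, so the total never decreases.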

Introduction to entropy

In thermodynamics, entropy is a numerical quantity that shows that many physical processes can go in only one direction in time. For example, cream and coffee can be mixed together, but cannot be "unmixed"; a piece of wood can be burned, but cannot be "unburned". The word 'entropy' has entered popular usage to refer to a lack of order or predictability, or to a gradual decline into disorder. A more physical interpretation of thermodynamic entropy refers to the spread of energy or matter, or to the extent and diversity of microscopic motion.

Entropy (order and disorder)

In thermodynamics, entropy is often associated with the amount of order or disorder in a thermodynamic system. This stems from Rudolf Clausius' 1862 assertion that any thermodynamic process always "admits to being reduced [reduction] to the alteration in some way or another of the arrangement of the constituent parts of the working body" and that internal work associated with these alterations is quantified energetically by a measure of "entropy" change, according to the following differential expression:
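
The differential expression referred to above did not survive in this text. Clausius's definition of entropy change, in its conventional modern notation, reads:

```latex
dS = \frac{\delta Q_{\mathrm{rev}}}{T},
```

where $\delta Q_{\mathrm{rev}}$ is the heat reversibly exchanged by the working body and $T$ its absolute temperature.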

Temperature

Temperature is a physical quantity that quantitatively expresses the attribute of hotness or coldness. Temperature is measured with a thermometer. It reflects the average kinetic energy of the vibrating and colliding atoms making up a substance.
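
For a monatomic ideal gas, the relation between temperature and average kinetic energy mentioned above takes the standard form (stated here as an illustration, not part of the original text):

```latex
\langle E_{\text{kin}} \rangle = \tfrac{3}{2} \, k_B T ,
```

where $k_B$ is the Boltzmann constant and $\langle E_{\text{kin}} \rangle$ is the mean translational kinetic energy per particle.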

Extremal principles of non-equilibrium thermodynamics

Energy dissipation and entropy production extremal principles are ideas developed within non-equilibrium thermodynamics that attempt to predict the likely steady states and dynamical structures that a physical system might show. The search for extremum principles for non-equilibrium thermodynamics follows their successful use in other branches of physics. According to Kondepudi (2008), and to Grandy (2008), there is no general rule that provides an extremum principle that governs the evolution of a far-from-equilibrium system to a steady state. According to Glansdorff and Prigogine, irreversible processes usually are not governed by global extremal principles because description of their evolution requires differential equations which are not self-adjoint, but local extremal principles can be used for local solutions. Lebon, Jou and Casas-Vázquez (2008) state that "In non-equilibrium ... it is generally not possible to construct thermodynamic potentials depending on the whole set of variables". Šilhavý (1997) offers the opinion that "... the extremum principles of thermodynamics ... do not have any counterpart for [non-equilibrium] steady states." It follows that any general extremal principle for a non-equilibrium problem will need to refer in some detail to the constraints that are specific for the structure of the system considered in the problem.

References

  1. Cucić, Dragoljub; Angelopoulos (2010). "Paradoxes of Thermodynamics". AIP Conference Proceedings. 1203 (1): 1267–1270. arXiv:0912.1756. Bibcode:2010AIPC.1203.1267C. doi:10.1063/1.3322352.
  2. Thomson, William (1862). "On the Age of the Sun's Heat". Macmillan's Magazine. Vol. 5. pp. 388–393.
  3. Smith, Crosbie; Wise, M. Norton (1989). Energy and Empire: A Biographical Study of Lord Kelvin. Cambridge University Press. p. 500. ISBN 978-0-521-26173-9.
  4. Cucic, Dragoljub A. (2008). "Astrophysical Paradoxes, long version". arXiv:0812.1679 [physics.hist-ph].
  5. Otis, Laura (2002). Literature and Science in the Nineteenth Century: An Anthology. OUP Oxford. Vol. 1. pp. 60–67.
  6. Laws of Thermodynamics, Thompson and Clausius, Oxford University Press, 2015.
  7. Carroll, Sean (29 December 2008). "Richard Feynman on Boltzmann Brains". Retrieved 24 June 2019.