In science, a process that is not reversible is called irreversible. This concept arises frequently in thermodynamics. All complex natural processes are irreversible, [1] [2] [3] [4] although a phase transition at the coexistence temperature (e.g. melting of ice cubes in water) is well approximated as reversible.
In thermodynamics, a process is irreversible if, once it has occurred, the thermodynamic state of the system and all of its surroundings cannot be precisely restored to the initial state by infinitesimal changes in some property of the system without expenditure of energy. A system that undergoes an irreversible process may still be capable of returning to its initial state; because entropy is a state function, the change in entropy of the system is the same whether the process is reversible or irreversible. The impossibility lies in restoring the environment to its own initial conditions. An irreversible process increases the total entropy of the system and its surroundings. The second law of thermodynamics can be used to determine whether a hypothetical process is reversible or not.
Intuitively, a process is reversible if there is no dissipation. For example, Joule expansion is irreversible because initially the system is not uniform: part of the system contains gas and part contains none. For dissipation to occur, there must be such a non-uniformity. The situation is the same if one section of the gas is hot and another cold: dissipation occurs as the temperature distribution becomes uniform with no work being done, and the process is irreversible because no amount of adding or removing heat or changing the volume can return the system to its initial state. If the system is always uniform, the process is reversible, meaning the system can be returned to its original state by adding or removing heat, doing work on the system, or letting the system do work. To approximate the expansion in an internal combustion engine as reversible, one would have to assume that the temperature and pressure change uniformly throughout the volume after the spark. In reality there is a flame front and sometimes even engine knocking. One reason Diesel engines can attain higher efficiency is that combustion is much more uniform, so less energy is lost to dissipation and the process is closer to reversible.[ citation needed ]
The phenomenon of irreversibility results from the fact that if a thermodynamic system of interacting molecules (which is any system of sufficient complexity) is brought from one thermodynamic state to another, the configuration or arrangement of the atoms and molecules in the system will change in a way that is not easily predictable. [5] [6] Some "transformation energy" will be used as the molecules of the "working body" do work on each other when they change from one state to another. During this transformation, there will be some heat energy loss or dissipation due to intermolecular friction and collisions. This energy will not be recoverable if the process is reversed.
Many biological processes that were once thought to be reversible have been found to actually be a pairing of two irreversible processes. Whereas a single enzyme was once believed to catalyze both the forward and reverse chemical changes, research has found that two separate enzymes of similar structure are typically needed to perform what results in a pair of thermodynamically irreversible processes. [7]
Thermodynamics defines the statistical behaviour of large numbers of entities whose exact behaviour is given by more specific laws. While the fundamental theoretical laws of physics are all time-reversible, [8] experimentally the probability of real reversibility is low and the former state of the system and surroundings can be recovered only to a certain extent (see: uncertainty principle). The reversibility of thermodynamics must be statistical in nature; that is, it must be merely highly unlikely, but not impossible, that a system will decrease in entropy. In other words, time reversibility is fulfilled if a process happens the same way when time flows in reverse, i.e. when the order of states in the process is reversed (the last state becomes the first and vice versa).
The German physicist Rudolf Clausius, in the 1850s, was the first to mathematically quantify the discovery of irreversibility in nature through his introduction of the concept of entropy. In his 1854 memoir "On a Modified Form of the Second Fundamental Theorem in the Mechanical Theory of Heat," Clausius states:
It may, moreover, happen that instead of a descending transmission of heat accompanying, in the one and the same process, the ascending transmission, another permanent change may occur which has the peculiarity of not being reversible without either becoming replaced by a new permanent change of a similar kind, or producing a descending transmission of heat.
Simply put, Clausius states that heat cannot flow spontaneously from a cooler body to a hotter body. For example, a cup of hot coffee placed in a room at about 72 °F will transfer heat to its surroundings and thereby cool down, with the temperature of the room increasing slightly (to about 72.3 °F). That same cup of coffee, however, will never spontaneously absorb heat from its surroundings and grow even hotter while the temperature of the room decreases (to about 71.7 °F). The process of the coffee cooling down is therefore irreversible unless extra energy is added to the system.
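The asymmetry can be made quantitative with a small entropy bookkeeping sketch. The heat quantity and temperatures below are illustrative assumptions, not values from Clausius:

```python
# Entropy change when heat q flows between two bodies, treating each body as
# large enough that its temperature stays roughly constant during the transfer.

def total_entropy_change(q, t_source, t_sink):
    """Entropy change of source (-q/t_source) plus sink (+q/t_sink), in J/K."""
    return -q / t_source + q / t_sink

# Coffee at ~80 °C (353 K) losing 1000 J to a room at ~72 °F (295 K):
ds_cooling = total_entropy_change(1000.0, 353.0, 295.0)
print(ds_cooling > 0)   # True: the spontaneous direction raises total entropy

# The hypothetical reverse (room heating the coffee) would lower total entropy,
# which is exactly why it is never observed:
ds_reverse = total_entropy_change(1000.0, 295.0, 353.0)
print(ds_reverse < 0)   # True
```

The sign of the total entropy change, not the amount of heat, is what singles out the direction in which the process can actually run.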
However, a paradox arose when attempting to reconcile microanalysis of a system with observations of its macrostate. Many processes are mathematically reversible in their microstate when analyzed using classical Newtonian mechanics. This paradox undermines microscopic explanations of the macroscopic tendency towards equilibrium, such as James Clerk Maxwell's 1860 argument that molecular collisions entail an equalization of temperatures of mixed gases. [9] From 1872 to 1875, Ludwig Boltzmann reinforced the statistical explanation of this paradox in the form of Boltzmann's entropy formula, which states that an increase in the number of possible microstates a system might occupy increases the entropy of the system, making it less likely that the system will return to an earlier state. His formulas quantified the analysis done by William Thomson, 1st Baron Kelvin, who had argued that: [10] [11]
The equations of motion in abstract dynamics are perfectly reversible; any solution of these equations remains valid when the time variable t is replaced by –t. On the other hand, physical processes are irreversible: for example, the friction of solids, conduction of heat, and diffusion. Nevertheless, the principle of dissipation of energy is compatible with a molecular theory in which each particle is subject to the laws of abstract dynamics.
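Boltzmann's entropy formula, S = k_B ln W, can be sketched directly. The microstate counts below are arbitrary illustrative numbers, chosen only to show the direction of the effect:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def boltzmann_entropy(w):
    """S = k_B * ln(W) for a macrostate with W equally likely microstates."""
    return K_B * math.log(w)

# Doubling the number of accessible microstates raises S by exactly k_B * ln 2,
# so macrostates with more microstates carry more entropy and are the ones a
# system overwhelmingly drifts toward.
s_small = boltzmann_entropy(1e20)
s_large = boltzmann_entropy(2e20)
print(s_large > s_small)  # True
```

Because the dependence is logarithmic, even astronomically many microstates yield modest entropy values in joules per kelvin; the irreversibility comes from the overwhelming probability ratio between macrostates, not from large energies.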
Another explanation of irreversible systems was presented by the French mathematician Henri Poincaré. In 1890, he published his first explanation of nonlinear dynamics, also called chaos theory. Applying chaos theory to the second law of thermodynamics, the paradox of irreversibility can be explained through the errors associated with scaling from microstates to macrostates and the degrees of freedom used when making experimental observations. Sensitivity to initial conditions of the system and its environment at the microstate level compounds into irreversible behaviour within the observable, physical realm. [12]
In the physical realm, many irreversible processes are present to which the inability to achieve 100% efficiency in energy transfer can be attributed. The following is a list of spontaneous events which contribute to the irreversibility of processes. [13]
A Joule expansion is a classic example from thermodynamics, as it is easy to work out the resulting increase in entropy. A volume of gas is kept in one side of a thermally isolated container by a partition, with the other side of the container evacuated; the partition is then opened, and the gas fills the whole container. The internal energy of the gas remains the same, while the volume increases. The original state cannot be recovered by simply compressing the gas to its original volume, since the internal energy would be increased by that compression. The original state can only be recovered by then cooling the re-compressed system, thereby irreversibly heating the environment. The diagram to the right applies only if the first expansion is "free" (a Joule expansion), i.e. there is no atmospheric pressure outside the cylinder and no weight is lifted.
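The entropy increase mentioned above is easy to compute for an ideal gas. A minimal sketch (the mole number and volumes are illustrative):

```python
import math

R = 8.314462618  # molar gas constant, J/(mol*K)

def joule_expansion_entropy(n_moles, v_initial, v_final):
    """Entropy increase of an ideal gas in a free (Joule) expansion.

    Internal energy and temperature are unchanged, so the change is
    dS = n * R * ln(V_final / V_initial).
    """
    return n_moles * R * math.log(v_final / v_initial)

# One mole doubling its volume into the evacuated half of the container:
ds = joule_expansion_entropy(1.0, 1.0, 2.0)
print(round(ds, 3))  # R * ln 2 ≈ 5.763 J/K
```

Note that no heat flows and no work is done, yet the entropy still rises: the expansion is computed along a hypothetical reversible isothermal path between the same two states, which is legitimate precisely because entropy is a state function.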
The difference between reversible and irreversible events has particular explanatory value in complex systems (such as living organisms, or ecosystems). According to the biologists Humberto Maturana and Francisco Varela, living organisms are characterized by autopoiesis, which enables their continued existence. More primitive forms of self-organizing systems have been described by the physicist and chemist Ilya Prigogine. In the context of complex systems, events which lead to the end of certain self-organising processes, like death, extinction of a species or the collapse of a meteorological system can be considered as irreversible. Even if a clone with the same organizational principle (e.g. identical DNA-structure) could be developed, this would not mean that the former distinct system comes back into being. Events to which the self-organizing capacities of organisms, species or other complex systems can adapt, like minor injuries or changes in the physical environment are reversible. However, adaptation depends on import of negentropy into the organism, thereby increasing irreversible processes in its environment. [17] Ecological principles, like those of sustainability and the precautionary principle can be defined with reference to the concept of reversibility. [18] [19] [20] [21] [22] [23] [5] [24] [25]
Entropy is a scientific concept that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change, and information systems including the transmission of information in telecommunication.
In physics, statistical mechanics is a mathematical framework that applies statistical methods and probability theory to large assemblies of microscopic entities. It does not assume or postulate any natural laws, but explains the macroscopic behavior of nature from the behavior of such ensembles.
A timeline of events in the history of thermodynamics.
Maxwell's demon is a thought experiment that would hypothetically violate the second law of thermodynamics. It was proposed by the physicist James Clerk Maxwell in 1867. In his first letter, Maxwell referred to the entity as a "finite being" or a "being who can play a game of skill with the molecules". Lord Kelvin would later call it a "demon".
The second law of thermodynamics is a physical law based on universal empirical observation concerning heat and energy interconversions. A simple statement of the law is that heat always flows spontaneously from hotter to colder regions of matter. Another statement is: "Not all heat can be converted into work in a cyclic process."
The fluctuation theorem (FT), which originated from statistical mechanics, deals with the relative probability that the entropy of a system which is currently away from thermodynamic equilibrium will increase or decrease over a given amount of time. While the second law of thermodynamics predicts that the entropy of an isolated system should tend to increase until it reaches equilibrium, it became apparent after the discovery of statistical mechanics that the second law is only a statistical one, suggesting that there should always be some nonzero probability that the entropy of an isolated system might spontaneously decrease; the fluctuation theorem precisely quantifies this probability.
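A minimal sketch of the basic fluctuation-theorem ratio, under the simplifying assumption that the forward and reverse trajectories differ only by the sign of the entropy production:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy_decrease_odds(delta_s):
    """Fluctuation-theorem ratio P(-delta_s) / P(+delta_s) = exp(-delta_s / k_B).

    delta_s is the entropy produced (J/K) over the observed trajectory.
    """
    return math.exp(-delta_s / K_B)

# A tiny entropy production of 10 k_B, typical of a nanoscale system observed
# briefly: a decrease is rare but experimentally observable.
print(entropy_decrease_odds(10 * K_B))  # e^-10, roughly 5e-5

# For a macroscopic entropy production of 1 J/K, exp(-1/k_B) underflows to 0.0:
# spontaneous decrease is "only" statistically forbidden, but utterly so.
print(entropy_decrease_odds(1.0))
```

This makes the article's point concrete: the second law is statistical, and the probability of violating it decays exponentially in the entropy produced, which is why violations are visible only at molecular scales.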
A thermodynamic system is a body of matter and/or radiation separate from its surroundings that can be studied using the laws of thermodynamics. A thermodynamic system may be an isolated system, a closed system, or an open system. An isolated system does not exchange matter or energy with its surroundings. A closed system may exchange heat, experience forces, and exert forces, but does not exchange matter. An open system can interact with its surroundings by exchanging both matter and energy.
Non-equilibrium thermodynamics is a branch of thermodynamics that deals with physical systems that are not in thermodynamic equilibrium but can be described in terms of macroscopic quantities that represent an extrapolation of the variables used to specify the system in thermodynamic equilibrium. Non-equilibrium thermodynamics is concerned with transport processes and with the rates of chemical reactions.
The laws of thermodynamics are a set of scientific laws which define a group of physical quantities, such as temperature, energy, and entropy, that characterize thermodynamic systems in thermodynamic equilibrium. The laws also use various parameters for thermodynamic processes, such as thermodynamic work and heat, and establish relationships between them. They state empirical facts that form a basis of precluding the possibility of certain phenomena, such as perpetual motion. In addition to their use in thermodynamics, they are important fundamental laws of physics in general and are applicable in other natural sciences.
In statistical mechanics, a canonical ensemble is the statistical ensemble that represents the possible states of a mechanical system in thermal equilibrium with a heat bath at a fixed temperature. The system can exchange energy with the heat bath, so that the states of the system will differ in total energy.
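The canonical ensemble assigns each state a Boltzmann weight. A short sketch, with an assumed two-level toy system as the example:

```python
import math

def canonical_probabilities(energies, kt):
    """Boltzmann weights p_i = exp(-E_i / kT) / Z for a system in thermal
    equilibrium with a heat bath; kt = k_B * T, in the same units as energies."""
    weights = [math.exp(-e / kt) for e in energies]
    z = sum(weights)          # canonical partition function
    return [w / z for w in weights]

# Two-level system with an energy gap equal to kT: the lower level is favored,
# but the upper level retains substantial probability.
p = canonical_probabilities([0.0, 1.0], kt=1.0)
print(round(p[0], 3), round(p[1], 3))  # ≈ 0.731 0.269
```

Raising `kt` flattens the distribution toward equal occupation, while lowering it concentrates probability in the ground state, which matches the qualitative behaviour of a system exchanging energy with its bath.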
The Jarzynski equality (JE) is an equation in statistical mechanics that relates free energy differences between two states and the irreversible work along an ensemble of trajectories joining the same states. It is named after the physicist Christopher Jarzynski who derived it in 1996. Fundamentally, the Jarzynski equality points to the fact that the fluctuations in the work satisfy certain constraints separately from the average value of the work that occurs in some process.
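The equality can be demonstrated numerically with a toy model. The Gaussian work distribution below is an assumption for illustration (it satisfies the equality exactly when its mean equals ΔF + βσ²/2), not part of Jarzynski's derivation:

```python
import math
import random

def jarzynski_free_energy(works, beta):
    """Estimate the free energy difference from irreversible work samples via
    exp(-beta * dF) = <exp(-beta * W)>  (the Jarzynski equality)."""
    avg = sum(math.exp(-beta * w) for w in works) / len(works)
    return -math.log(avg) / beta

random.seed(0)
beta, d_f, sigma = 1.0, 2.0, 0.5
mean_w = d_f + beta * sigma**2 / 2            # dissipation shifts the mean work
works = [random.gauss(mean_w, sigma) for _ in range(200_000)]

# The exponential average recovers dF even though the average work exceeds it:
print(round(jarzynski_free_energy(works, beta), 2))  # ≈ 2.0
print(sum(works) / len(works) > d_f)                 # True: <W> > dF
```

The exponential average is dominated by rare low-work trajectories, which is why, in practice, many samples are needed for the estimator to converge.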
Landauer's principle is a physical principle pertaining to the lower theoretical limit of energy consumption of computation. It holds that an irreversible change in information stored in a computer, such as merging two computational paths, dissipates a minimum amount of heat to its surroundings.
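That minimum is k_B T ln 2 per bit erased, which is straightforward to evaluate; the 300 K figure below is just the conventional "room temperature" example:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit(temperature_k):
    """Minimum heat (J) dissipated to the surroundings when one bit of
    information is irreversibly erased at the given temperature."""
    return K_B * temperature_k * math.log(2)

print(landauer_limit(300.0))  # ≈ 2.87e-21 J per bit at room temperature
```

The value is roughly seven orders of magnitude below the switching energy of present-day logic gates, which is why Landauer's bound is a theoretical floor rather than a practical engineering constraint today.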
The mathematical expressions for thermodynamic entropy in the statistical thermodynamics formulation established by Ludwig Boltzmann and J. Willard Gibbs in the 1870s are similar to the information entropy by Claude Shannon and Ralph Hartley, developed in the 1940s.
The concept of entropy developed in response to the observation that a certain amount of functional energy released from combustion reactions is always lost to dissipation or friction and is thus not transformed into useful work. Early heat-powered engines such as Thomas Savery's (1698), the Newcomen engine (1712) and the Cugnot steam tricycle (1769) were inefficient, converting less than two percent of the input energy into useful work output; a great deal of useful energy was dissipated or lost. Over the next two centuries, physicists investigated this puzzle of lost energy; the result was the concept of entropy.
The concept entropy was first developed by German physicist Rudolf Clausius in the mid-nineteenth century as a thermodynamic property that predicts that certain spontaneous processes are irreversible or impossible. In statistical mechanics, entropy is formulated as a statistical property using probability theory. The statistical entropy perspective was introduced in 1870 by Austrian physicist Ludwig Boltzmann, who established a new field of physics that provided the descriptive linkage between the macroscopic observation of nature and the microscopic view based on the rigorous treatment of large ensembles of microstates that constitute thermodynamic systems.
In thermodynamics, entropy is a numerical quantity that shows that many physical processes can go in only one direction in time. For example, cream and coffee can be mixed together, but cannot be "unmixed"; a piece of wood can be burned, but cannot be "unburned". The word 'entropy' has entered popular usage to refer to a lack of order or predictability, or a gradual decline into disorder. A more physical interpretation of thermodynamic entropy refers to the spread of energy or matter, or to the extent and diversity of microscopic motion.
Lloyd A. Demetrius is an American mathematician and theoretical biologist at the Department of Organismic and Evolutionary biology, Harvard University. He is best known for the discovery of the concept evolutionary entropy, a statistical parameter that characterizes Darwinian fitness in models of evolutionary processes at various levels of biological organization – molecular, organismic and social. Evolutionary entropy, a generalization of the Gibbs-Boltzmann entropy in statistical thermodynamics, is the cornerstone of directionality theory, an analytical study of evolution by variation and selection. The theory has applications to: a) the development of aging and the evolution of longevity; b) the origin and progression of age related diseases such as cancer, and neurodegenerative disorders such as Alzheimer's disease and Parkinson's disease; c) the evolution of cooperation and the spread of inequality.
Temperature is a physical quantity that expresses quantitatively the attribute of hotness or coldness. Temperature is measured with a thermometer. It reflects the kinetic energy of the vibrating and colliding atoms making up a substance.
Energy dissipation and entropy production extremal principles are ideas developed within non-equilibrium thermodynamics that attempt to predict the likely steady states and dynamical structures that a physical system might show. The search for extremum principles for non-equilibrium thermodynamics follows their successful use in other branches of physics. According to Kondepudi (2008), and to Grandy (2008), there is no general rule that provides an extremum principle governing the evolution of a far-from-equilibrium system to a steady state. According to Glansdorff and Prigogine, irreversible processes usually are not governed by global extremal principles, because the description of their evolution requires differential equations which are not self-adjoint; local extremal principles can, however, be used for local solutions. Lebon, Jou and Casas-Vázquez (2008) state that "In non-equilibrium ... it is generally not possible to construct thermodynamic potentials depending on the whole set of variables". Šilhavý (1997) offers the opinion that "... the extremum principles of thermodynamics ... do not have any counterpart for [non-equilibrium] steady states." It follows that any general extremal principle for a non-equilibrium problem will need to refer in some detail to the constraints that are specific for the structure of the system considered in the problem.
In thermodynamics and thermal physics, the Gouy-Stodola theorem is an important theorem for the quantification of irreversibilities in an open system, and aids in the exergy analysis of thermodynamic processes. It asserts that the rate at which work is lost during a process, or at which exergy is destroyed, is proportional to the rate at which entropy is generated, and that the proportionality coefficient is the temperature of the ambient heat reservoir. In the literature, the theorem often appears in a slightly modified form, changing the proportionality coefficient.
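In its simplest form the theorem is a one-line relation; the numbers below are illustrative assumptions, not data from the literature:

```python
def lost_work_rate(t_ambient, entropy_generation_rate):
    """Gouy-Stodola theorem: the rate of lost work (exergy destruction, in W)
    equals the ambient reservoir temperature (K) times the entropy
    generation rate (W/K)."""
    return t_ambient * entropy_generation_rate

# A process generating 0.5 W/K of entropy near a 298 K environment destroys
# 149 W of potential to do useful work:
print(lost_work_rate(298.0, 0.5))  # 149.0
```

This is why exergy analyses report entropy generation: multiplying by the ambient temperature converts an abstract irreversibility measure directly into wasted work.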