A timeline of events in the history of thermodynamics.
In physics, energy is the quantitative property that is transferred to a body or to a physical system, recognizable in the performance of work and in the form of heat and light. Energy is a conserved quantity—the law of conservation of energy states that energy can be converted in form, but not created or destroyed. The unit of measurement for energy in the International System of Units (SI) is the joule (J).
Entropy is a scientific concept that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change, and information systems including the transmission of information in telecommunication.
In physics, statistical mechanics is a mathematical framework that applies statistical methods and probability theory to large assemblies of microscopic entities. It does not assume or postulate any natural laws, but explains the macroscopic behavior of nature from the behavior of such ensembles.
The Boltzmann constant is the proportionality factor that relates the average relative thermal energy of particles in a gas with the thermodynamic temperature of the gas. It occurs in the definitions of the kelvin and the gas constant, and in Planck's law of black-body radiation and Boltzmann's entropy formula, and is used in calculating thermal noise in resistors. The Boltzmann constant has dimensions of energy divided by temperature, the same as entropy. It is named after the Austrian scientist Ludwig Boltzmann.
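As a concrete illustration (a standard result of kinetic theory rather than a statement made above), the average translational kinetic energy of a gas particle at thermodynamic temperature $T$ is

$$\langle E_{\mathrm{k}}\rangle = \tfrac{3}{2}\,k_{\mathrm{B}} T, \qquad k_{\mathrm{B}} = 1.380\,649\times10^{-23}\ \mathrm{J\,K^{-1}}\ \text{(exact in the 2019 SI)},$$

so at room temperature ($T \approx 300\ \mathrm{K}$) each particle carries roughly $6\times10^{-21}\ \mathrm{J}$ of translational kinetic energy.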
The kinetic theory of gases is a simple classical model of the thermodynamic behavior of gases. It treats a gas as composed of numerous particles, too small to see with a microscope, which are constantly in random motion. Their collisions with each other and with the walls of their container are used to explain physical properties of the gas—for example the relationship between its temperature, pressure and volume. The particles are now known to be the atoms or molecules of the gas.
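For example (a standard result of elementary kinetic theory, included here for illustration), relating the pressure on the container walls to molecular motion gives

$$p V = \tfrac{1}{3}\, N m \langle v^{2} \rangle = N k_{\mathrm{B}} T,$$

where $N$ is the number of particles, $m$ their mass and $\langle v^{2}\rangle$ their mean-square speed; this recovers the ideal gas law from purely mechanical assumptions.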
The second law of thermodynamics is a physical law based on universal empirical observation concerning heat and energy interconversions. A simple statement of the law is that heat always flows spontaneously from hotter to colder regions of matter. Another statement is: "Not all heat can be converted into work in a cyclic process."
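A quantitative illustration (the Carnot bound, a standard consequence of the law rather than a statement quoted above): a cyclic engine absorbing heat from a reservoir at temperature $T_{\mathrm{h}}$ and rejecting heat to one at $T_{\mathrm{c}}$ can convert at most the fraction

$$\eta_{\max} = 1 - \frac{T_{\mathrm{c}}}{T_{\mathrm{h}}}$$

of the absorbed heat into work; complete conversion would require a cold reservoir at absolute zero.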
Certain systems can achieve negative thermodynamic temperature; that is, their temperature can be expressed as a negative quantity on the Kelvin or Rankine scales. This should be distinguished from temperatures expressed as negative numbers on non-thermodynamic Celsius or Fahrenheit scales, which are nevertheless higher than absolute zero. A system with a truly negative temperature on the Kelvin scale is hotter than any system with a positive temperature. If a negative-temperature system and a positive-temperature system come in contact, heat will flow from the negative- to the positive-temperature system. A standard example of such a system is population inversion in laser physics.
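A minimal sketch of how a negative temperature arises (an illustrative two-level model, not taken from the text above): using the thermodynamic definition $1/T = \partial S/\partial E$ for $N$ two-level units with level spacing $\epsilon$, of which $n$ are excited,

$$\frac{1}{T} = \frac{\partial S}{\partial E} = \frac{k_{\mathrm{B}}}{\epsilon}\,\ln\frac{N-n}{n},$$

which is negative precisely when $n > N - n$, that is, when the population is inverted.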
In classical statistical mechanics, the H-theorem, introduced by Ludwig Boltzmann in 1872, describes the tendency of the quantity H to decrease in a nearly-ideal gas of molecules. As this quantity H was meant to represent the entropy of thermodynamics, the H-theorem was an early demonstration of the power of statistical mechanics, as it claimed to derive the second law of thermodynamics (a statement about fundamentally irreversible processes) from reversible microscopic mechanics. It is thought to prove the second law of thermodynamics, albeit under the assumption of low-entropy initial conditions.
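For reference (Boltzmann's definition, supplied here since the excerpt does not display it), the quantity is

$$H(t) = \int f(\mathbf{v},t)\,\ln f(\mathbf{v},t)\;\mathrm{d}^{3}v,$$

where $f(\mathbf{v},t)$ is the distribution of molecular velocities; the theorem states $\mathrm{d}H/\mathrm{d}t \le 0$, and $H$ is related to the entropy roughly by $S \approx -N k_{\mathrm{B}} H$ up to constants, so a decreasing $H$ corresponds to a non-decreasing entropy.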
Non-equilibrium thermodynamics is a branch of thermodynamics that deals with physical systems that are not in thermodynamic equilibrium but can be described in terms of macroscopic quantities that represent an extrapolation of the variables used to specify the system in thermodynamic equilibrium. Non-equilibrium thermodynamics is concerned with transport processes and with the rates of chemical reactions.
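A simple example of the transport processes mentioned here (standard heat conduction, supplied for illustration): Fourier's law $\mathbf{q} = -\kappa\,\nabla T$ together with the local entropy balance gives the local entropy production rate

$$\sigma = \mathbf{q}\cdot\nabla\!\left(\tfrac{1}{T}\right) = \frac{\kappa\,|\nabla T|^{2}}{T^{2}} \ \ge\ 0,$$

so conduction of heat down a temperature gradient always produces entropy.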
Ludwig Eduard Boltzmann was an Austrian physicist and philosopher. His greatest achievements were the development of statistical mechanics and the statistical explanation of the second law of thermodynamics. In 1877 he provided the current definition of entropy, $S = k_{\mathrm{B}} \ln \Omega$, where $\Omega$ is the number of microstates whose energy equals the system's energy, interpreted as a measure of the statistical disorder of a system. Max Planck named the constant $k_{\mathrm{B}}$ the Boltzmann constant.
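As a small worked example of this definition (the specific system is illustrative, not drawn from the text): for a system whose energy is compatible with $\Omega = 2^{N}$ microstates, for instance $N$ non-interacting spins with degenerate levels,

$$S = k_{\mathrm{B}} \ln 2^{N} = N k_{\mathrm{B}} \ln 2 \approx N \times 9.57\times10^{-24}\ \mathrm{J\,K^{-1}}.$$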
In science, a process that is not reversible is called irreversible. This concept arises frequently in thermodynamics. All complex natural processes are irreversible, although a phase transition at the coexistence temperature is well approximated as reversible.
The laws of thermodynamics are a set of scientific laws which define a group of physical quantities, such as temperature, energy, and entropy, that characterize thermodynamic systems in thermodynamic equilibrium. The laws also use various parameters for thermodynamic processes, such as thermodynamic work and heat, and establish relationships between them. They state empirical facts that form a basis of precluding the possibility of certain phenomena, such as perpetual motion. In addition to their use in thermodynamics, they are important fundamental laws of physics in general and are applicable in other natural sciences.
The history of thermodynamics is a fundamental strand in the history of physics, the history of chemistry, and the history of science in general. Owing to the relevance of thermodynamics in much of science and technology, its history is finely woven with the developments of classical mechanics, quantum mechanics, magnetism, and chemical kinetics, with more distant applied fields such as meteorology, information theory, and biology (physiology), and with technological developments such as the steam engine, internal combustion engine, cryogenics and electricity generation. The development of thermodynamics both drove and was driven by atomic theory. It also, albeit in a subtle manner, motivated new directions in probability and statistics; see, for example, the timeline of thermodynamics.
The mathematical expressions for thermodynamic entropy in the statistical thermodynamics formulation established by Ludwig Boltzmann and J. Willard Gibbs in the 1870s are similar to the information entropy of Claude Shannon and Ralph Hartley, developed in the 1940s.
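The similarity can be made explicit by placing the standard forms side by side (shown here for comparison; neither expression appears in the excerpt above):

$$S = -k_{\mathrm{B}} \sum_{i} p_{i} \ln p_{i} \quad\text{(Gibbs)}, \qquad H = -\sum_{i} p_{i} \log_{2} p_{i} \quad\text{(Shannon)},$$

which differ only by the constant factor $k_{\mathrm{B}}$ and the base of the logarithm (nats versus bits).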
The principle of microscopic reversibility in physics and chemistry is twofold: first, the microscopic detailed dynamics of particles and fields are time-reversible, because the microscopic equations of motion are symmetric with respect to inversion in time (T-symmetry); second, in the statistical description of the kinetics of macroscopic or mesoscopic systems, every elementary process is accompanied by its reverse process, and in a state of equilibrium the average rate of each process equals the average rate of its reverse, as illustrated below.
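As an illustration of the second aspect (a textbook example, not part of the original statement): for an elementary reaction $A \rightleftharpoons B$ with forward and reverse rate constants $k_{+}$ and $k_{-}$, detailed balance at equilibrium requires

$$k_{+}\, c_{A}^{\mathrm{eq}} = k_{-}\, c_{B}^{\mathrm{eq}},$$

so the forward and reverse elementary processes proceed at equal average rates.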
The concept of entropy developed in response to the observation that a certain amount of functional energy released from combustion reactions is always lost to dissipation or friction and is thus not transformed into useful work. Early heat-powered engines such as Thomas Savery's (1698), the Newcomen engine (1712) and the Cugnot steam tricycle (1769) were inefficient, converting less than two percent of the input energy into useful work output; a great deal of useful energy was dissipated or lost. Over the next two centuries, physicists investigated this puzzle of lost energy; the result was the concept of entropy.
Common thermodynamic equations and quantities are conventionally written in mathematical notation.
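A representative selection (an illustrative sample, not the complete set of equations the source tabulates):

$$\mathrm{d}U = \delta Q - \delta W, \qquad \mathrm{d}U = T\,\mathrm{d}S - p\,\mathrm{d}V, \qquad H = U + pV, \qquad F = U - TS, \qquad G = H - TS, \qquad pV = nRT,$$

where $U$ is internal energy, $S$ entropy, $H$ enthalpy, $F$ the Helmholtz free energy, $G$ the Gibbs free energy, and the final relation is the ideal gas law.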
Temperature is a physical quantity that quantitatively expresses the attribute of hotness or coldness. Temperature is measured with a thermometer. It reflects the average kinetic energy of the vibrating and colliding atoms making up a substance.
Energy dissipation and entropy production extremal principles are ideas developed within non-equilibrium thermodynamics that attempt to predict the likely steady states and dynamical structures that a physical system might show. The search for extremum principles for non-equilibrium thermodynamics follows their successful use in other branches of physics. According to Kondepudi (2008), and to Grandy (2008), there is no general rule that provides an extremum principle governing the evolution of a far-from-equilibrium system to a steady state. According to Glansdorff and Prigogine, irreversible processes usually are not governed by global extremal principles because description of their evolution requires differential equations which are not self-adjoint, but local extremal principles can be used for local solutions. Lebon, Jou and Casas-Vázquez (2008) state that "In non-equilibrium ... it is generally not possible to construct thermodynamic potentials depending on the whole set of variables". Šilhavý (1997) offers the opinion that "... the extremum principles of thermodynamics ... do not have any counterpart for [non-equilibrium] steady states." It follows that any general extremal principle for a non-equilibrium problem will need to refer in some detail to the constraints that are specific to the structure of the system considered in the problem.
Quantum thermodynamics is the study of the relations between two independent physical theories: thermodynamics and quantum mechanics. The two independent theories address the physical phenomena of light and matter. In 1905, Albert Einstein argued that the requirement of consistency between thermodynamics and electromagnetism leads to the conclusion that light is quantized, obtaining the relation $E = h\nu$. This paper is the dawn of quantum theory. In a few decades quantum theory became established with an independent set of rules. Currently quantum thermodynamics addresses the emergence of thermodynamic laws from quantum mechanics. It differs from quantum statistical mechanics in its emphasis on dynamical processes out of equilibrium. In addition, there is a quest for the theory to be relevant for a single individual quantum system.
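A quick numerical illustration of this relation (the chosen frequency is illustrative): visible light of frequency $\nu \approx 6\times10^{14}\ \mathrm{Hz}$ (wavelength near 500 nm) comes in quanta of energy

$$E = h\nu \approx (6.626\times10^{-34}\ \mathrm{J\,s})(6\times10^{14}\ \mathrm{Hz}) \approx 4\times10^{-19}\ \mathrm{J} \approx 2.5\ \mathrm{eV}.$$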