Ultraviolet divergence

In physics, an ultraviolet divergence or UV divergence is a situation in which an integral, for example one arising from a Feynman diagram, diverges because of contributions from objects with unbounded energy, or, equivalently, because of physical phenomena at infinitesimally short distances.
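
A minimal illustration of how such a divergence arises, assuming for concreteness a Euclidean one-loop integral for a scalar particle of mass m with an upper momentum cutoff Λ:

\[
\int^{\Lambda} \frac{d^4 k}{(2\pi)^4}\, \frac{1}{\left(k^2 + m^2\right)^2} \;\approx\; \frac{1}{16\pi^2}\, \ln\frac{\Lambda^2}{m^2} .
\]

The result grows without bound as Λ → ∞, that is, as arbitrarily high momenta (arbitrarily short distances) are allowed to contribute.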

Overview

Since an infinite result is unphysical, ultraviolet divergences often require special treatment to remove the unphysical effects inherent in the perturbative formalism. In particular, UV divergences can often be removed by regularization and renormalization. Successful resolution of an ultraviolet divergence is known as ultraviolet completion. If the divergences cannot be removed, they imply that the theory is not perturbatively well-defined at very short distances.
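
As a schematic, theory-independent sketch of the procedure: suppose a regularized one-loop correction gives an effective coupling of the form

\[
g_{\text{eff}} \;=\; g_0 + c\, g_0^2 \ln\frac{\Lambda}{\mu} + \dots ,
\]

where \(g_0\) is the bare coupling, Λ the cutoff, μ a reference scale, and c a theory-dependent constant (all purely illustrative here). Renormalization trades the cutoff-dependent bare coupling \(g_0\) for a finite measured coupling \(g(\mu)\); observables expressed in terms of \(g(\mu)\) then remain finite as Λ → ∞, with the divergence absorbed into the unobservable bare parameter.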

The name comes from the earliest example of such a divergence, the "ultraviolet catastrophe" first encountered in understanding blackbody radiation. According to classical physics at the end of the nineteenth century, the quantity of radiation released at any specific wavelength should increase without limit as the wavelength decreases; in particular, a blackbody radiator should release considerably more ultraviolet light than infrared light. Measurements showed the opposite, with maximal energy released at intermediate wavelengths, indicating a failure of classical physics. This problem eventually led to the development of quantum mechanics.
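
The classical expectation and its failure can be stated quantitatively. The Rayleigh–Jeans law for the spectral radiance of a blackbody at temperature T,

\[
B_\nu(T) \;=\; \frac{2 \nu^2 k_{\mathrm B} T}{c^2},
\]

grows without bound at high frequency (short wavelength), so the total radiated energy \(\int_0^\infty B_\nu(T)\, d\nu\) diverges. Planck's law,

\[
B_\nu(T) \;=\; \frac{2 h \nu^3}{c^2}\,\frac{1}{e^{h\nu / k_{\mathrm B} T} - 1},
\]

suppresses the high-frequency modes exponentially, giving a finite total and a maximum at intermediate wavelengths, as observed.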

The successful resolution of the original ultraviolet catastrophe has prompted the pursuit of solutions to other problems of ultraviolet divergence. A similar divergence problem in electrodynamics was resolved by renormalization, developed in the late 1940s by Richard Feynman, Julian Schwinger, Sin-Itiro Tomonaga and Freeman Dyson, leading to the successful formulation of quantum electrodynamics (QED). Similar techniques led to the Standard Model of particle physics. Ultraviolet divergences remain a key feature in the exploration of new physical theories, such as supersymmetry.

Proliferation in perturbative theory

Commenting on the fact that contemporary theories about quantum scattering of fundamental particles grew out of applying the quantization procedure to classical fields that satisfy wave equations, J. D. Bjorken and Sidney Drell [1] made the following observations about such a procedure, which remain as relevant today as they were in 1965:

The first is that we are led to a theory with differential wave propagation. The field functions are continuous functions of continuous parameters x and t, and the changes in the fields at a point x are determined by properties of the fields infinitesimally close to the point x. For most wave fields (for example, sound waves and the vibrations of strings and membranes) such a description is an idealization which is valid for distances larger than the characteristic length which measures the granularity of the medium. For smaller distances these theories are modified in a profound way. The electromagnetic field is a notable exception. Indeed, until the special theory of relativity obviated the necessity of a mechanistic interpretation, physicists made great efforts to discover evidence for such a mechanical description of the radiation field. After the requirement of an “ether” which propagates light waves had been abandoned, there was considerably less difficulty in accepting this same idea when the observed wave properties of the electron suggested the introduction of a new field. Indeed there is no evidence of an ether which underlies the electron wave. However, it is a gross and profound extrapolation of present experimental knowledge to assume that a wave description successful at “large” distances (that is, atomic lengths ≈10−8 cm) may be extended to distances an indefinite number of orders of magnitude smaller (for example, to less than nuclear lengths ≈10−13 cm). In the relativistic theory, we have seen that the assumption that the field description is correct in arbitrarily small space-time intervals has led—in perturbation theory—to divergent expressions for the electron self-energy and the bare charge. Renormalization theory has sidestepped these divergence difficulties, which may be indicative of the failure of the perturbation expansion. However, it is widely felt that the divergences are symptomatic of a chronic disorder in the small-distance behaviour of the theory. We might then ask why local field theories, that is, theories of fields which can be described by differential laws of wave propagation, have been so extensively used and accepted. There are several reasons, including the important one that with their aid a significant region of agreement with observations has been found. But the foremost reason is brutally simple: there exists no convincing form of a theory which avoids differential field equations.

Related Research Articles

Electromagnetic radiation: waves of the electromagnetic field

In physics, electromagnetic radiation (EMR) consists of waves of the electromagnetic (EM) field, which propagate through space and carry momentum and electromagnetic radiant energy. It includes radio waves, microwaves, infrared, (visible) light, ultraviolet, X-rays, and gamma rays. All of these waves form part of the electromagnetic spectrum.

Quantum field theory: theoretical framework combining classical field theory, special relativity, and quantum mechanics

In theoretical physics, quantum field theory (QFT) is a theoretical framework that combines classical field theory, special relativity, and quantum mechanics. QFT is used in particle physics to construct physical models of subatomic particles and in condensed matter physics to construct models of quasiparticles.

Quantum electrodynamics: quantum field theory of electromagnetism

In particle physics, quantum electrodynamics (QED) is the relativistic quantum field theory of electrodynamics. In essence, it describes how light and matter interact, and it is the first theory in which full agreement between quantum mechanics and special relativity is achieved. QED mathematically describes all phenomena involving electrically charged particles interacting by means of the exchange of photons and represents the quantum counterpart of classical electromagnetism, giving a complete account of the interaction of matter and light.

Julian Schwinger: American theoretical physicist (1918–1994)

Julian Seymour Schwinger was a Nobel Prize-winning American theoretical physicist. He is best known for his work on quantum electrodynamics (QED), in particular for developing a relativistically invariant perturbation theory and for renormalizing QED to one-loop order. Schwinger was a physics professor at several universities.

Renormalization: method in physics used to deal with infinities

Renormalization is a collection of techniques in quantum field theory, the statistical mechanics of fields, and the theory of self-similar geometric structures that are used to treat infinities arising in calculated quantities by altering the values of these quantities to compensate for the effects of their self-interactions. Even if no infinities arose in the loop diagrams of quantum field theory, it could be shown that it would still be necessary to renormalize the mass and fields appearing in the original Lagrangian.

In theoretical physics, the term renormalization group (RG) refers to a formal apparatus that allows systematic investigation of the changes of a physical system as viewed at different scales. In particle physics, it reflects the changes in the underlying force laws as the energy scale at which physical processes occur varies, energy/momentum and resolution distance scales being effectively conjugate under the uncertainty principle.
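
Schematically, this scale dependence is encoded in a beta function; for a coupling g and renormalization scale μ,

\[
\mu\,\frac{dg}{d\mu} \;=\; \beta(g),
\]

and the sign and zeros of \(\beta(g)\) determine whether the coupling grows or shrinks as shorter distances (higher energies) are probed.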

Anomaly (physics): asymmetry of classical and quantum action

In quantum physics, an anomaly or quantum anomaly is the failure of a symmetry of a theory's classical action to be a symmetry of any regularization of the full quantum theory. In classical physics, a classical anomaly is the failure of a symmetry to be restored in the limit in which the symmetry-breaking parameter goes to zero. Perhaps the first known anomaly was the dissipative anomaly in turbulence: time-reversibility remains broken in the limit of vanishing viscosity.
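
A standard example is the chiral (Adler–Bell–Jackiw) anomaly of quantum electrodynamics with a single massless Dirac fermion: the axial current, conserved classically, acquires a quantum divergence of the form

\[
\partial_\mu j_5^{\mu} \;=\; \frac{e^2}{8\pi^2}\, F_{\mu\nu}\tilde F^{\mu\nu}
\]

(up to convention-dependent normalization), a result that is independent of the choice of regularization.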

Coupling constant: parameter describing the strength of a force

In physics, a coupling constant or gauge coupling parameter is a number that determines the strength of the force exerted in an interaction. Originally, the coupling constant related the force acting between two static bodies to the "charges" of the bodies divided by the square of the distance, \(r^2\), between the bodies; thus \(G\) in \(F = G m_1 m_2 / r^2\) for Newtonian gravity and \(k_{\mathrm e}\) in \(F = k_{\mathrm e} q_1 q_2 / r^2\) for electrostatics. This description remains valid in modern physics for linear theories with static bodies and massless force carriers.

In physics, an infrared divergence is a situation in which an integral, for example one arising from a Feynman diagram, diverges because of contributions from objects with very small energy approaching zero, or, equivalently, because of physical phenomena at very long distances.
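
A minimal sketch of the phenomenon: the probability of emitting soft photons involves integrals over the photon energy ω of the schematic form

\[
\int_0^{E} \frac{d\omega}{\omega},
\]

which diverge logarithmically as ω → 0; this is the long-distance counterpart of the high-momentum (short-distance) divergences discussed above.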

In theoretical physics, Pauli–Villars regularization (P–V) is a procedure that isolates divergent terms from finite parts in loop calculations in field theory in order to renormalize the theory. Wolfgang Pauli and Felix Villars published the method in 1949, based on earlier work by Richard Feynman, Ernst Stueckelberg and Dominique Rivier.
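
Schematically, the method subtracts from each propagator a copy with a large fictitious regulator mass M, for instance for a scalar propagator

\[
\frac{1}{k^2 - m^2} \;\longrightarrow\; \frac{1}{k^2 - m^2} - \frac{1}{k^2 - M^2} \;=\; \frac{m^2 - M^2}{(k^2 - m^2)(k^2 - M^2)},
\]

which falls off as \(1/k^4\) rather than \(1/k^2\) at large momenta, rendering otherwise divergent loop integrals finite; M is taken to infinity after renormalization.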

Gauge fixing: procedure for coping with redundant degrees of freedom in physical field theories

In the physics of gauge theories, gauge fixing denotes a mathematical procedure for coping with redundant degrees of freedom in field variables. By definition, a gauge theory represents each physically distinct configuration of the system as an equivalence class of detailed local field configurations. Any two detailed configurations in the same equivalence class are related by a gauge transformation, equivalent to a shear along unphysical axes in configuration space. Most of the quantitative physical predictions of a gauge theory can only be obtained under a coherent prescription for suppressing or ignoring these unphysical degrees of freedom.

In physics, especially quantum field theory, regularization is a method of modifying observables which have singularities in order to make them finite by the introduction of a suitable parameter called the regulator. The regulator, also known as a "cutoff", models our lack of knowledge about physics at unobserved scales. It compensates for the possibility that "new physics" may be discovered at those scales which the present theory is unable to model, while enabling the current theory to give accurate predictions as an "effective theory" within its intended scale of use.

History of quantum field theory

In particle physics, the history of quantum field theory starts with its creation by Paul Dirac, when he attempted to quantize the electromagnetic field in the late 1920s. Heisenberg was awarded the 1932 Nobel Prize in Physics "for the creation of quantum mechanics". Major advances in the theory were made in the 1940s and 1950s, leading to the introduction of renormalized quantum electrodynamics (QED). QED was so successful and accurately predictive that efforts were made to apply the same basic concepts to the other forces of nature. By the late 1970s, these efforts had successfully applied gauge theory to the strong and weak nuclear forces, producing the modern Standard Model of particle physics.

In a quantum field theory, one may calculate an effective or running coupling constant that defines the coupling of the theory measured at a given momentum scale. One example of such a coupling constant is the electric charge.
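
For the electric charge, a standard leading-order (one-loop) illustration, assuming a single charged lepton of mass \(m_e\) and momenta \(Q \gg m_e\), is

\[
\alpha(Q^2) \;=\; \frac{\alpha(m_e^2)}{1 - \dfrac{\alpha(m_e^2)}{3\pi}\,\ln\dfrac{Q^2}{m_e^2}},
\]

so the effective fine-structure constant increases slowly with the probing scale, from about 1/137 at low energies to roughly 1/128 near the Z-boson mass once all charged particle species are included.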

In the physics of electromagnetism, the Abraham–Lorentz force is the recoil force on an accelerating charged particle caused by the particle emitting electromagnetic radiation by self-interaction. It is also called the radiation reaction force, radiation damping force or the self-force. It is named after the physicists Max Abraham and Hendrik Lorentz.
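
In SI units the force on a particle of charge q is

\[
\mathbf{F}_{\mathrm{rad}} \;=\; \frac{\mu_0 q^2}{6\pi c}\,\dot{\mathbf{a}} \;=\; \frac{q^2}{6\pi \varepsilon_0 c^3}\,\dot{\mathbf{a}},
\]

proportional to the time derivative of the acceleration (the jerk).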

Causal perturbation theory is a mathematically rigorous approach to renormalization theory, which makes it possible to put the theoretical setup of perturbative quantum field theory on a sound mathematical basis. It goes back to a seminal work by Henri Epstein and Vladimir Jurko Glaser.

In theoretical physics, the BRST formalism, or BRST quantization denotes a relatively rigorous mathematical approach to quantizing a field theory with a gauge symmetry. Quantization rules in earlier quantum field theory (QFT) frameworks resembled "prescriptions" or "heuristics" more than proofs, especially in non-abelian QFT, where the use of "ghost fields" with superficially bizarre properties is almost unavoidable for technical reasons related to renormalization and anomaly cancellation.

Light-front quantization: technique in computational quantum field theory

The light-front quantization of quantum field theories provides a useful alternative to ordinary equal-time quantization. In particular, it can lead to a relativistic description of bound systems in terms of quantum-mechanical wave functions. The quantization is based on the choice of light-front coordinates, where \(x^{+} \equiv ct + z\) plays the role of time and the corresponding spatial coordinate is \(x^{-} \equiv ct - z\). Here, \(t\) is the ordinary time, \(z\) is one Cartesian coordinate, and \(c\) is the speed of light. The other two Cartesian coordinates, \(x\) and \(y\), are untouched and often called transverse or perpendicular, denoted by symbols of the type \(x_{\perp} = (x, y)\). The choice of the frame of reference where the time and \(z\)-axis are defined can be left unspecified in an exactly soluble relativistic theory, but in practical calculations some choices may be more suitable than others.

The Kinoshita–Lee–Nauenberg theorem or KLN theorem states that perturbatively the standard model as a whole is infrared (IR) finite. That is, the infrared divergences coming from loop integrals are canceled by IR divergences coming from phase space integrals. It was introduced independently by Kinoshita (1962) and Tsung-Dao Lee and Michael Nauenberg (1964).

Asymptotic safety in quantum gravity: attempt to find a consistent theory of quantum gravity

Asymptotic safety is a concept in quantum field theory which aims at finding a consistent and predictive quantum theory of the gravitational field. Its key ingredient is a nontrivial fixed point of the theory's renormalization group flow which controls the behavior of the coupling constants in the ultraviolet (UV) regime and renders physical quantities safe from divergences. Although originally proposed by Steven Weinberg to find a theory of quantum gravity, the idea of a nontrivial fixed point providing a possible UV completion can be applied also to other field theories, in particular to perturbatively nonrenormalizable ones. In this respect, it is similar to quantum triviality.
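
A purely illustrative toy version of the mechanism: suppose a dimensionless coupling g flows according to

\[
\mu\,\frac{dg}{d\mu} \;=\; \beta(g) \;=\; a\,g - b\,g^2, \qquad a, b > 0,
\]

with a and b illustrative constants. Besides the free fixed point g = 0, the flow has a nontrivial fixed point at \(g_* = a/b\) that attracts the coupling as μ → ∞, keeping it finite at arbitrarily high energies.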

References

  1. J. D. Bjorken and S. Drell (1965). Relativistic Quantum Fields, Preface. McGraw-Hill. ISBN 0-07-005494-0.