One-loop Feynman diagram

In physics, a one-loop Feynman diagram is a connected Feynman diagram with only one cycle (unicyclic). Such a diagram can be obtained from a connected tree diagram by taking two external lines of the same type and joining them together into an edge.
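This construction can be checked against the standard graph-theoretic cycle count (Euler's formula for connected graphs); a short sketch:

```latex
% Loop (cycle) count of a connected diagram with I internal lines
% and V vertices:
L = I - V + 1
% A tree has L = 0, i.e. I = V - 1. Joining two external legs into
% one new internal line gives I -> I + 1 with V unchanged, hence
% L -> L + 1: a one-loop diagram, the case L = 1, i.e. I = V.
```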

Diagrams with loops (in graph theory these are called cycles, whereas a graph-theoretic loop is an edge connecting a vertex to itself) correspond to the quantum corrections to the classical field theory. Because one-loop diagrams contain only one cycle, they express the next-to-classical contributions, called the semiclassical contributions.

One-loop diagrams are usually computed as an integral over the one independent momentum that can "run in the cycle". The Casimir effect, Hawking radiation and the Lamb shift are examples of phenomena whose existence can be inferred using one-loop Feynman diagrams, especially the well-known "triangle diagram":

[Figure: the triangle diagram]
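The triangle is one example; a simpler illustration of reducing a one-loop diagram to a single momentum integral is the scalar "bubble" with external momentum p (a representative sketch, not the triangle itself):

```latex
% One-loop scalar bubble: one independent momentum k runs in the cycle.
I(p^2) = \int \frac{d^4 k}{(2\pi)^4}\,
  \frac{1}{\left(k^2 - m^2 + i\epsilon\right)
           \left((k+p)^2 - m^2 + i\epsilon\right)}
% The integrand falls off like 1/k^4 at large k, so the integral
% diverges logarithmically and must be regularized.
```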

The evaluation of one-loop Feynman diagrams usually leads to divergent expressions, which are due to either infrared (IR) divergences or ultraviolet (UV) divergences.

Infrared divergences are usually dealt with by assigning the zero-mass particles a small mass λ, evaluating the corresponding expression, and then taking the limit λ → 0. Ultraviolet divergences are dealt with by renormalization.
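For example, in QED (in Feynman gauge) the photon can be given the auxiliary mass λ at the propagator level; a hedged sketch of the procedure:

```latex
% Massless (IR-divergent) and regulated photon propagators:
\frac{-i g_{\mu\nu}}{k^2 + i\epsilon}
\;\longrightarrow\;
\frac{-i g_{\mu\nu}}{k^2 - \lambda^2 + i\epsilon}
% Quantities are evaluated at finite \lambda; the limit
% \lambda \to 0 is taken only in IR-safe combinations, where the
% \ln\lambda terms cancel.
```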

Related Research Articles

Feynman diagram Pictorial representation of the behavior of subatomic particles

In theoretical physics, a Feynman diagram is a pictorial representation of the mathematical expressions describing the behavior and interaction of subatomic particles. The scheme is named after American physicist Richard Feynman, who introduced the diagrams in 1948. The interaction of subatomic particles can be complex and difficult to understand; Feynman diagrams give a simple visualization of what would otherwise be an arcane and abstract formula. According to David Kaiser, "Since the middle of the 20th century, theoretical physicists have increasingly turned to this tool to help them undertake critical calculations. Feynman diagrams have revolutionized nearly every aspect of theoretical physics." While the diagrams are applied primarily to quantum field theory, they can also be used in other fields, such as solid-state theory. Frank Wilczek wrote that the calculations which won him the 2004 Nobel Prize in Physics "would have been literally unthinkable without Feynman diagrams, as would [Wilczek's] calculations that established a route to production and observation of the Higgs particle."

In theories of quantum gravity, the graviton is the hypothetical quantum of gravity, an elementary particle that mediates the force of gravitational interaction. There is no complete quantum field theory of gravitons due to an outstanding mathematical problem with renormalization in general relativity. In string theory, believed to be a consistent theory of quantum gravity, the graviton is a massless state of a fundamental string.

Quantum field theory Theoretical framework combining classical field theory, special relativity, and quantum mechanics

In theoretical physics, quantum field theory (QFT) is a theoretical framework that combines classical field theory, special relativity and quantum mechanics. QFT is used in particle physics to construct physical models of subatomic particles and in condensed matter physics to construct models of quasiparticles.

Quantum electrodynamics Relativistic quantum field theory of electromagnetism

In particle physics, quantum electrodynamics (QED) is the relativistic quantum field theory of electrodynamics. In essence, it describes how light and matter interact, and it is the first theory in which full agreement between quantum mechanics and special relativity is achieved. QED mathematically describes all phenomena involving electrically charged particles interacting by means of exchange of photons, and it represents the quantum counterpart of classical electromagnetism, giving a complete account of the interaction of matter and light.

Renormalization Method used in mathematical physics

Renormalization is a collection of techniques in quantum field theory, the statistical mechanics of fields, and the theory of self-similar geometric structures, that are used to treat infinities arising in calculated quantities by altering values of these quantities to compensate for effects of their self-interactions. But even if no infinities arose in loop diagrams in quantum field theory, it could be shown that it would be necessary to renormalize the mass and fields appearing in the original Lagrangian.

In theoretical physics, a chiral anomaly is the anomalous nonconservation of a chiral current. In everyday terms, it is equivalent to a sealed box that contained equal numbers of left- and right-handed bolts, but when opened was found to have more of one handedness than the other.

Faddeev–Popov ghost Type of unphysical field in quantum field theory which provides mathematical consistency

In physics, Faddeev–Popov ghosts are extraneous fields which are introduced into gauge quantum field theories to maintain the consistency of the path integral formulation. They are named after Ludvig Faddeev and Victor Popov.

In physics, an infrared divergence is a situation in which an integral, for example a Feynman diagram, diverges because of contributions of objects with very small energy approaching zero, or, equivalently, because of physical phenomena at very long distances.

In physics, an ultraviolet divergence or UV divergence is a situation in which an integral, for example a Feynman diagram, diverges because of contributions of objects with unbounded energy, or, equivalently, because of physical phenomena at infinitesimal distances.
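Simple power counting at large loop momentum shows how UV divergences arise; with a momentum cutoff Λ, the standard one-loop estimates are:

```latex
% Quadratically divergent (one propagator):
\int^{\Lambda} \frac{d^4 k}{(2\pi)^4}\,\frac{1}{k^2 - m^2}
  \;\sim\; \Lambda^2
% Logarithmically divergent (two propagators):
\int^{\Lambda} \frac{d^4 k}{(2\pi)^4}\,
  \frac{1}{\left(k^2 - m^2\right)^2}
  \;\sim\; \ln\Lambda
```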

1/N expansion Perturbative analysis of quantum field theories

In quantum field theory and statistical mechanics, the 1/N expansion is a particular perturbative analysis of quantum field theories with an internal symmetry group such as SO(N) or SU(N). It consists in deriving an expansion for the properties of the theory in powers of 1/N, which is treated as a small parameter.
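Schematically, an observable is organized as a series at fixed 't Hooft coupling (a sketch; the precise structure depends on the theory):

```latex
% Schematic 1/N expansion of an observable F at fixed
% 't Hooft coupling \lambda = g^2 N:
F(N, \lambda) = F_0(\lambda) + \frac{1}{N}\, F_1(\lambda)
  + \frac{1}{N^2}\, F_2(\lambda) + \cdots
% N \to \infty with \lambda fixed keeps only the leading term F_0.
```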

In theoretical physics, Pauli–Villars regularization (P–V) is a procedure that isolates divergent terms from finite parts in loop calculations in field theory in order to renormalize the theory. Wolfgang Pauli and Felix Villars published the method in 1949, based on earlier work by Richard Feynman, Ernst Stueckelberg and Dominique Rivier.
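The core of the method is subtracting the propagator of a fictitious heavy regulator field of mass M; a sketch for a scalar propagator:

```latex
\frac{1}{k^2 - m^2 + i\epsilon}
\;\longrightarrow\;
\frac{1}{k^2 - m^2 + i\epsilon} - \frac{1}{k^2 - M^2 + i\epsilon}
 = \frac{m^2 - M^2}{\left(k^2 - m^2 + i\epsilon\right)
                    \left(k^2 - M^2 + i\epsilon\right)}
% The large-k falloff improves from 1/k^2 to 1/k^4, taming the loop
% integral; the divergences reappear as powers and logarithms of M,
% which are removed by renormalization before taking M \to \infty.
```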

In physics, especially quantum field theory, regularization is a method of modifying observables which have singularities in order to make them finite by the introduction of a suitable parameter called regulator. The regulator, also known as a "cutoff", models our lack of knowledge about physics at unobserved scales. It compensates for the possibility that "new physics" may be discovered at those scales which the present theory is unable to model, while enabling the current theory to give accurate predictions as an "effective theory" within its intended scale of use.

In particle physics, dimensional transmutation is a physical mechanism providing a linkage between a dimensionless parameter and a dimensionful parameter.
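The canonical example is the emergence of a dimensionful scale Λ from a dimensionless coupling via one-loop running (conventions assumed here: β(g) = −b₀g³):

```latex
% One-loop running implies a renormalization-group-invariant scale:
\frac{1}{g^2(\mu)} = 2\, b_0 \ln\frac{\mu}{\Lambda}
\quad\Longrightarrow\quad
\Lambda = \mu\, \exp\!\left(-\frac{1}{2\, b_0\, g^2(\mu)}\right)
% The dimensionless coupling g(\mu) has been traded for the
% dimensionful scale \Lambda.
```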

In theoretical physics, scalar field theory can refer to a relativistically invariant classical or quantum theory of scalar fields. A scalar field is invariant under any Lorentz transformation.

In physics, a renormalon is a particular source of divergence seen in perturbative approximations to quantum field theories (QFT). When a formally divergent series in a QFT is summed using Borel summation, the associated Borel transform of the series can have singularities as a function of the complex transform parameter. The renormalon is a possible type of singularity arising in this complex Borel plane, and is a counterpart of an instanton singularity. Renormalon contributions are discussed in the context of quantum chromodynamics (QCD) and usually have a power-like dependence on the momentum, in contrast with the usual logarithmic effects.

The automatic calculation of particle interaction or decay is part of the computational particle physics branch. It refers to computing tools that help calculate the complex particle interactions studied in high-energy physics, astroparticle physics and cosmology. The goal of the automation is to handle the full sequence of calculations in an automatic (programmed) way: from the Lagrangian expression describing the physics model to the cross-section values and to the event-generator software.

In theoretical physics, the functional renormalization group (FRG) is an implementation of the renormalization group (RG) concept which is used in quantum and statistical field theory, especially when dealing with strongly interacting systems. The method combines functional methods of quantum field theory with the intuitive renormalization group idea of Kenneth G. Wilson. This technique allows one to interpolate smoothly between the known microscopic laws and the complicated macroscopic phenomena in physical systems. In this sense, it bridges the transition from the simplicity of microphysics to the complexity of macrophysics. Figuratively speaking, FRG acts as a microscope with a variable resolution. One starts with a high-resolution picture of the known microphysical laws and subsequently decreases the resolution to obtain a coarse-grained picture of macroscopic collective phenomena. The method is nonperturbative, meaning that it does not rely on an expansion in a small coupling constant. Mathematically, FRG is based on an exact functional differential equation for a scale-dependent effective action.

The Kinoshita–Lee–Nauenberg theorem or KLN theorem states that perturbatively the standard model as a whole is infrared (IR) finite. That is, the infrared divergences coming from loop integrals are canceled by IR divergences coming from phase space integrals. It was introduced independently by Kinoshita (1962) and Tsung-Dao Lee and Michael Nauenberg (1964).

Zvi Bern is an American theoretical particle physicist. He is a professor at University of California, Los Angeles (UCLA).