Naturalness (physics)

In physics, naturalness is the aesthetic property that the dimensionless ratios between free parameters or physical constants appearing in a physical theory should take values "of order 1" and that free parameters are not fine-tuned. That is, a natural theory would have parameter ratios with values like 2.34 rather than 234000 or 0.000234.

The requirement that satisfactory theories should be "natural" in this sense is a current of thought that began around the 1960s in particle physics. It arises from the apparent non-naturalness of the Standard Model and from the broader topics of the hierarchy problem, fine-tuning, and the anthropic principle. Naturalness points to a possible area of weakness, or of future development, in current theories such as the Standard Model, where some parameters vary by many orders of magnitude and require extensive "fine-tuning" of their values. The concern is that it is not yet clear whether these seemingly exact values arose by chance (perhaps explicable via the anthropic principle or similar) or whether they follow from a more fundamental theory, not yet developed, in which they turn out to be expected and well explained by factors not yet part of particle physics models.

The concept of naturalness is not always compatible with Occam's razor, since many instances of "natural" theories have more parameters than "fine-tuned" theories such as the Standard Model. Naturalness in physics is closely related to the issue of fine-tuning, and over the past decade many scientists [1] [2] [3] [4] [5] have argued that the principle of naturalness is a specific application of Bayesian statistics.

In the history of particle physics, the naturalness principle has given correct predictions three times: in the case of the electron self-energy, the pion mass difference, and the kaon mass difference. [6]

Overview

A simple example:

Suppose a physics model requires four parameters, which allow it to produce high-quality calculations and predictions of some aspect of our physical universe. Suppose we find through experiments that the parameters have values:

We might wonder how such figures arise. But in particular we might be especially curious about a theory in which three values are close to one while the fourth is so different; in other words, about the huge disproportion we seem to find between the first three parameters and the fourth. We might also wonder: if these values represent the strengths of forces, and one force is so much larger than the others that it needs a factor of 4 × 10^29 to relate it to them in terms of effects, how did our universe come to be so exactly balanced when its forces emerged? In current particle physics the differences between some parameters are much larger than this, so the question is even more noteworthy.

One answer given by some physicists is the anthropic principle. If the universe came to exist by chance, and perhaps vast numbers of other universes exist or have existed, then life capable of physics experiments only arose in universes that by chance had very balanced forces. All the universes where the forces were not balanced could not develop life capable of the question. So if a lifeform like human beings asks such a question, it must have arisen in a universe having balanced forces, however rare that might be. So when we look, that is what we would expect to find, and what we do find.

A second answer is that perhaps there is a deeper understanding of physics, which, if discovered and understood, would make clear these are not really fundamental parameters and there is a good reason why they have the exact values we have found, because they all derive from other more fundamental parameters that are not so unbalanced.

Introduction

In particle physics, the assumption of naturalness means that, unless a more detailed explanation exists, all conceivable terms in the effective action that preserve the required symmetries should appear in this effective action with natural coefficients. [7]

In an effective field theory, Λ is the cutoff scale, an energy or length scale at which the theory breaks down. By dimensional analysis, natural coefficients have the form

c Λ^(4−d),

where d is the mass dimension of the field operator and c is a dimensionless number which should be "random" and smaller than 1 at the scale where the effective theory breaks down. Further renormalization group running can reduce the value of c at an energy scale E, but only by a small factor proportional to ln(E/Λ).
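As a small illustration of this dimensional-analysis rule, the following Python sketch (a hypothetical function, not from any physics library) evaluates the natural size c · Λ^(4−d) of a coefficient:

```python
def natural_coefficient_size(cutoff_gev: float, d: int, c: float = 1.0) -> float:
    """Natural size, in GeV**(4-d), of the coefficient of a dimension-d
    operator in an effective theory with cutoff Lambda = cutoff_gev.
    Naturalness demands the dimensionless c be "random" and at most O(1)."""
    return c * cutoff_gev ** (4 - d)

# A dimension-6 operator with a 1 TeV cutoff is suppressed by 1/Lambda**2:
print(natural_coefficient_size(1000.0, d=6))   # 1e-06 (GeV^-2)
# A dimension-4 (marginal) operator has an O(1) dimensionless coefficient:
print(natural_coefficient_size(1000.0, d=4))   # 1.0
```

This is why higher-dimension operators are expected to be strongly suppressed at energies far below the cutoff.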

Some parameters in the effective action of the Standard Model seem to have far smaller coefficients than required by consistency with the assumption of naturalness, leading to some of the fundamental open questions in physics. In particular:

- the smallness of the Higgs mass term relative to the cutoff scale (the hierarchy problem);
- the smallness of the cosmological constant;
- the smallness of the strong CP-violating angle θ of QCD.

In addition, the coupling of the electron to the Higgs, i.e. the mass of the electron, is abnormally small, and to a lesser extent so are the masses of the light quarks. [7]

In models with large extra dimensions, the assumption of naturalness is violated for operators which multiply field operators that create objects which are localized at different positions in the extra dimensions. [8]

Naturalness and the gauge hierarchy problem

A more practical definition of naturalness is that, for any observable O which consists of n independent contributions,

O = o_1 + o_2 + ... + o_n,

all independent contributions to O should be comparable to or less than O. Otherwise, if one contribution were much larger, say o_1 >> O, then some other independent contribution would have to be fine-tuned to a large opposite-sign value so as to maintain O at its measured value. Such fine-tuning is regarded as unnatural and indicative of some missing ingredient in the theory.
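This practical definition can be made concrete with a short Python sketch (an illustration with made-up numbers, not a standard measure from the literature): the largest single contribution divided by the total is close to 1 for a natural observable, and large for a fine-tuned one.

```python
def fine_tuning(contributions):
    """Ratio of the largest |o_i| to |O|, where O = o_1 + ... + o_n.
    Values near 1 are natural; values >> 1 signal a fine-tuned,
    opposite-sign cancellation among the contributions."""
    total = sum(contributions)
    return max(abs(o) for o in contributions) / abs(total)

print(fine_tuning([1.1, -0.4, 0.3]))    # ~1.1: every term comparable to O -> natural
print(fine_tuning([1000.0, -999.0]))    # 1000.0: huge cancellation -> unnatural
```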

For instance, in the Standard Model with the Higgs potential given by

V = −μ² φ†φ + λ (φ†φ)²,

the physical Higgs boson mass is calculated to be

m_h² = 2μ² + δm_h²,

where the quadratically divergent radiative correction is dominated by the top-quark loop,

δm_h² ≃ −(3 λ_t² / 8π²) Λ² + subleading contributions proportional to g² and λ,

where λ_t is the top-quark Yukawa coupling, g is the SU(2) gauge coupling and Λ is the energy cut-off to the divergent loop integrals. As Λ increases (depending on the chosen cut-off), μ² can be freely dialed so as to maintain m_h at its measured value (now known to be about 125 GeV). By insisting on naturalness, one requires |δm_h²| ≲ m_h². Solving for Λ, one finds Λ ≲ 1 TeV. This then implies that the Standard Model, as a natural effective field theory, is only valid up to the 1 TeV energy scale.
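The numerical claim Λ ≲ 1 TeV can be checked with back-of-the-envelope arithmetic. The Python sketch below keeps only the dominant top-quark loop, δm_h² ≈ (3λ_t²/8π²) Λ², an assumption of this illustration (the exact coefficient depends on convention):

```python
import math

mh = 125.0      # GeV, measured Higgs boson mass
lam_t = 0.95    # top-quark Yukawa coupling, roughly O(1)

# Naturalness: |delta_mh2| <= mh**2 with delta_mh2 ~ 3*lam_t**2/(8*pi**2) * Lambda**2,
# so Lambda <= mh * sqrt(8*pi**2 / (3*lam_t**2)).
lambda_max = mh * math.sqrt(8 * math.pi**2 / (3 * lam_t**2))
print(f"naturalness bound: Lambda <~ {lambda_max:.0f} GeV")  # a few hundred GeV, i.e. O(1 TeV)
```

The result lands in the several-hundred-GeV range, consistent with the text's order-of-magnitude statement that the natural cutoff is near 1 TeV.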

It is sometimes objected that this argument depends on the regularization scheme introducing the cut-off, and that the problem might disappear under dimensional regularization. However, if new heavy particles which couple to the Higgs are introduced, one once again regains the quadratic sensitivity, now in terms of the new particles' squared masses. For instance, if see-saw neutrinos are included in the Standard Model, then δm_h² would blow up to near the see-saw scale, typically expected to lie far above the weak scale.

MSSM and the little hierarchy

Overview

By supersymmetrizing the Standard Model, one arrives at a solution to the gauge hierarchy, or big hierarchy, problem, in that supersymmetry guarantees cancellation of quadratic divergences to all orders in perturbation theory. The simplest supersymmetrization of the SM leads to the Minimal Supersymmetric Standard Model, or MSSM. In the MSSM, each SM particle has a partner particle known as a super-partner or sparticle. For instance, the left- and right-electron helicity components have scalar partner selectrons ẽ_L and ẽ_R respectively, whilst the eight colored gluons have eight colored spin-1/2 gluino superpartners. The MSSM Higgs sector must necessarily be expanded to include two doublets rather than one, leading to five physical Higgs particles h, H, A and H±, whilst three of the eight Higgs component fields are absorbed by the W± and Z bosons to make them massive. The MSSM is actually supported by three different sets of measurements which test for the presence of virtual superpartners: 1. the celebrated weak scale measurements of the three gauge coupling strengths are just what is needed for gauge coupling unification at a scale of about 2 × 10^16 GeV; 2. the value of the top-quark mass, about 173 GeV, falls squarely in the range needed to trigger a radiatively-driven breakdown of electroweak symmetry; and 3. the measured value of m_h ≈ 125 GeV falls within the narrow window of allowed values for the MSSM.

Nonetheless, verification of weak scale SUSY (WSS, SUSY with superpartner masses at or around the weak scale of roughly 100 GeV, as characterized by the W, Z and Higgs boson masses) requires the direct observation of at least some of the superpartners at sufficiently energetic colliding-beam experiments. As of 2017, the CERN Large Hadron Collider, a collider operating at a center-of-mass energy of 13 TeV, had not found any evidence for superpartners. This has led to mass limits of about 2 TeV on the gluino and about 1 TeV on the lighter top squark (within the context of certain simplified models which are assumed to make the experimental analysis more tractable). Along with these limits, the rather large measured value of m_h ≈ 125 GeV seems to require TeV-scale, highly mixed top squarks. These combined measurements have raised concern about an emerging Little Hierarchy problem, characterized by a growing gap between the sparticle mass scale and the weak scale. Under the Little Hierarchy, one might expect the now log-divergent light Higgs mass to blow up to the sparticle mass scale unless one fine-tunes. The Little Hierarchy problem has led to concern that WSS is perhaps not realized in nature, or at least not in the manner typically expected by theorists in years past.

Status

In the MSSM, the light Higgs mass m_h is calculated to be

m_h² ≃ μ² + m_{H_u}²(weak) + (mixing and loop contributions),

where the mixing and loop contributions are comparable to or less than m_h², but where in most models the soft SUSY-breaking up-Higgs mass m_{H_u}² is driven to large, TeV-scale negative values (in order to break electroweak symmetry). Then, to maintain the measured value of m_h ≈ 125 GeV, one must tune the superpotential mass term μ² to some large positive value. Alternatively, for natural SUSY, one may expect that m_{H_u}² runs only to small negative values, in which case both μ and |m_{H_u}(weak)| are of order 100-200 GeV. This already leads to a prediction: since μ is supersymmetric and feeds mass to both SM particles (W, Z, h) and superpartners (higgsinos), it is expected from the natural MSSM that light higgsinos exist nearby to the 100-200 GeV scale. This simple realization has profound implications for WSS collider and dark matter searches.

Naturalness in the MSSM has historically been expressed in terms of the Z boson mass, and indeed this approach leads to more stringent upper bounds on sparticle masses. By minimizing the (Coleman-Weinberg) scalar potential of the MSSM, one may relate the measured value of m_Z = 91.2 GeV to the SUSY Lagrangian parameters:

m_Z²/2 = ( m_{H_d}² + Σ_d^d − (m_{H_u}² + Σ_u^u) tan²β ) / (tan²β − 1) − μ².

Here, tan β = v_u/v_d is the ratio of the two Higgs field vacuum expectation values and m_{H_d}² is the down-Higgs soft-breaking mass term. The Σ_u^u and Σ_d^d contain a variety of loop corrections, labelled by indices i and j, the most important of which typically come from the top squarks.
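As a numerical illustration of this minimization condition, the Python sketch below evaluates only its tree-level part (the loop terms Σ_u^u and Σ_d^d are set to zero, and all parameter values are invented for illustration, not fits to data):

```python
import math

def mz_squared(mHu2, mHd2, mu, tan_beta):
    """Tree-level MSSM minimization condition:
    mZ**2/2 = (mHd2 - mHu2*tan_beta**2)/(tan_beta**2 - 1) - mu**2
    (soft masses-squared in GeV**2, mu in GeV; loop terms Sigma omitted)."""
    tb2 = tan_beta ** 2
    return 2.0 * ((mHd2 - mHu2 * tb2) / (tb2 - 1.0) - mu ** 2)

# A "natural SUSY"-style point: -mHu2 and mu**2 both near the weak scale.
mz = math.sqrt(mz_squared(mHu2=-159.6**2, mHd2=300.0**2, mu=150.0, tan_beta=10.0))
print(f"mZ ~ {mz:.1f} GeV")   # close to the measured 91.2 GeV for this choice
```

The point of the exercise is the one made in the text: reproducing m_Z without large cancellations requires μ² and −m_{H_u}² to both sit near the weak scale.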

In the renowned review by H. P. Nilles, "Supersymmetry, Supergravity and Particle Physics", published in Physics Reports 110 (1984) 1-162, one finds the sentence: "Experiments within the next five to ten years will enable us to decide whether supersymmetry as a solution of the naturalness problem of the weak interaction scale is a myth or a reality".

See also

Electroweak interaction
Grand Unified Theory
Quantum field theory
Standard Model
Supersymmetry
Technicolor (physics)
Minimal Supersymmetric Standard Model
Hierarchy problem
Seesaw mechanism
Doublet–triplet splitting problem
Compton wavelength
Mathematical formulation of the Standard Model
Supersymmetric gauge theory
Custodial symmetry
Little hierarchy problem
Next-to-Minimal Supersymmetric Standard Model
Peskin–Takeuchi parameters
Mu problem
Quantum triviality
Composite Higgs models

References

  1. Fowlie, Andrew; Balazs, Csaba; White, Graham; Marzola, Luca; Raidal, Martti (17 August 2016). "Naturalness of the relaxion mechanism". Journal of High Energy Physics. 2016 (8): 100. arXiv: 1602.03889 . Bibcode:2016JHEP...08..100F. doi:10.1007/JHEP08(2016)100. S2CID   119102534.
  2. Fowlie, Andrew (10 July 2014). "CMSSM, naturalness and the 'fine-tuning price' of the Very Large Hadron Collider". Physical Review D. 90 (1): 015010. arXiv: 1403.3407. Bibcode:2014PhRvD..90a5010F. doi:10.1103/PhysRevD.90.015010. S2CID 118362634.
  3. Fowlie, Andrew (15 October 2014). "Is the CNMSSM more credible than the CMSSM?". The European Physical Journal C. 74 (10). arXiv: 1407.7534 . doi:10.1140/epjc/s10052-014-3105-y. S2CID   119304794.
  4. Cabrera, Maria Eugenia; Casas, Alberto; Austri, Roberto Ruiz de (2009). "Bayesian approach and naturalness in MSSM analyses for the LHC". Journal of High Energy Physics. 2009 (3): 075. arXiv: 0812.0536 . Bibcode:2009JHEP...03..075C. doi:10.1088/1126-6708/2009/03/075. S2CID   18276270.
  5. Fichet, S. (18 December 2012). "Quantified naturalness from Bayesian statistics". Physical Review D. 86 (12): 125029. arXiv: 1204.4940 . Bibcode:2012PhRvD..86l5029F. doi:10.1103/PhysRevD.86.125029. S2CID   119282331.
  6. Dijkstra, Casper Daniel (2019-04-19). "Naturalness as a reasonable scientific principle in fundamental physics". arXiv: 1906.03036 [physics.hist-ph].
  7. N. Seiberg (1993). "Naturalness versus supersymmetric non-renormalization theorems". Physics Letters B. 318 (3): 469–475. arXiv: hep-ph/9309335. Bibcode:1993PhLB..318..469S. doi:10.1016/0370-2693(93)91541-T. S2CID 14683964.
  8. N. Arkani-Hamed, M. Schmaltz (2000). "Hierarchies without Symmetries from Extra Dimensions". Physical Review D . 61 (3): 033005. arXiv: hep-ph/9903417 . Bibcode:2000PhRvD..61c3005A. doi:10.1103/PhysRevD.61.033005. S2CID   18030407.

Further reading