Fine-tuning (physics)


In theoretical physics, fine-tuning is the process in which the parameters of a model must be adjusted very precisely in order to fit with certain observations. This has led to the discovery that the fundamental constants and quantities fall into such an extraordinarily precise range that, if they did not, the origin and evolution of conscious agents in the universe would not be permitted.[1]


Theories requiring fine-tuning are regarded as problematic in the absence of a known mechanism to explain why the parameters happen to take precisely the observed values. The heuristic rule that parameters in a fundamental physical theory should not be too fine-tuned is called naturalness.[2][3]

Background

The idea that naturalness can explain fine-tuning was called into question by the theoretical physicist Nima Arkani-Hamed in his 2013 talk "Why is there a Macroscopic Universe?", a lecture in the "Multiverse & Fine Tuning" mini-series of the "Philosophy of Cosmology" project, a collaboration between the University of Oxford and the University of Cambridge. In it he describes how naturalness has usually provided solutions to problems in physics, and has usually done so earlier than expected. In the case of the cosmological constant, however, naturalness has failed to provide an explanation, even though one would have been expected long ago.

The necessity of fine-tuning leads to various problems that do not show that the theories are incorrect, in the sense of being falsified by observations, but nevertheless suggest that a piece of the story is missing. Examples include the cosmological constant problem (why is the cosmological constant so small?), the hierarchy problem, and the strong CP problem.
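The scale of the cosmological constant problem can be illustrated with a rough back-of-the-envelope comparison between a naive quantum-field-theory estimate of the vacuum energy density (taking a Planck-scale cutoff) and the observed dark-energy density quoted later in this article. The short Python sketch below is only an order-of-magnitude illustration, not a calculation from any specific model discussed in the cited sources.

```python
import math

# Physical constants (SI units)
hbar = 1.054571817e-34   # reduced Planck constant, J*s
G    = 6.67430e-11       # Newton's gravitational constant, m^3 kg^-1 s^-2
c    = 2.99792458e8      # speed of light, m/s

# Naive quantum-field-theory estimate: cutting vacuum-mode contributions off
# at the Planck scale gives a vacuum energy density of order the Planck
# energy density, c^7 / (hbar * G^2).
rho_qft = c**7 / (hbar * G**2)   # roughly 5e113 J/m^3

# Observed dark-energy density (order of magnitude, as quoted in this article)
rho_obs = 6e-10                  # J/m^3

print(f"naive QFT estimate : {rho_qft:.1e} J/m^3")
print(f"observed value     : {rho_obs:.1e} J/m^3")
print(f"mismatch           : about 10^{math.log10(rho_qft / rho_obs):.0f}")
```

The mismatch of roughly 120 orders of magnitude is what makes the observed value look so severely fine-tuned.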

Dongshan He and collaborators have suggested a possible solution to the fine-tuned cosmological constant in a model of spontaneous creation of the universe from nothing.[4]

Example

An example of a fine-tuning problem considered by the scientific community to have a plausible "natural" solution is the cosmological flatness problem, which is solved if inflationary theory is correct: inflation forces the universe to become very flat, answering the question of why the universe is observed today to be flat to such a high degree.[citation needed]
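A standard way to state the flatness problem and inflation's resolution of it uses the Friedmann equation. The brief derivation sketched below is a schematic textbook summary, using the usual notation (density parameter Ω, scale factor a, Hubble rate H, curvature constant k), rather than material drawn from the sources cited in this article.

```latex
\[
  \Omega - 1 \;=\; \frac{k}{a^{2} H^{2}}
\]
% In a decelerating (radiation- or matter-dominated) universe, $a^{2}H^{2}$
% decreases, so any small deviation from $\Omega = 1$ grows with time;
% observing $\Omega \approx 1$ today then requires extremely fine-tuned
% initial conditions. During inflation, $a(t) \propto e^{Ht}$ with $H$
% nearly constant, so
\[
  |\Omega - 1| \;\propto\; e^{-2Ht},
\]
% and the deviation is driven exponentially toward zero, removing the need
% for a fine-tuned initial value.
```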

Measurement

Although fine-tuning was traditionally quantified with ad hoc fine-tuning measures, such as the Barbieri-Giudice-Ellis measure, over the past decade many scientists have come to recognize fine-tuning arguments as a specific application of Bayesian statistics.[5][6][7][8][9][10][excessive citations]
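As a concrete illustration of how a traditional fine-tuning measure works, the sketch below numerically estimates Barbieri-Giudice-style logarithmic sensitivities, Δ_i = |∂ ln O / ∂ ln p_i|, for a toy observable. The function `toy` and its parameters are hypothetical stand-ins for illustration only, not quantities from any of the cited analyses.

```python
import math

def bg_sensitivities(observable, params, eps=1e-6):
    """Barbieri-Giudice-style sensitivities: for each parameter p_i,
    Delta_i = |d ln O / d ln p_i|, estimated with a central finite
    difference in log-space. The overall measure is max_i Delta_i."""
    deltas = {}
    for name, value in params.items():
        up = dict(params, **{name: value * (1 + eps)})
        down = dict(params, **{name: value * (1 - eps)})
        d_ln_obs = math.log(observable(up)) - math.log(observable(down))
        d_ln_par = math.log(1 + eps) - math.log(1 - eps)
        deltas[name] = abs(d_ln_obs / d_ln_par)
    return deltas, max(deltas.values())

# Hypothetical toy observable standing in for, e.g., the Z-boson mass squared
# as a function of model parameters; the near-cancellation between the two
# terms is what produces large sensitivities.
toy = lambda p: p["a"] - 0.999 * p["b"]

deltas, fine_tuning = bg_sensitivities(toy, {"a": 1.0, "b": 1.0})
print(deltas)        # both sensitivities come out near 1000
print(fine_tuning)   # i.e. the toy observable depends on a per-mille cancellation
```

In the Bayesian reformulation referenced above, roughly the same information enters through the prior: parameter regions in which the observable is reproduced only by delicate cancellations occupy little prior volume and are correspondingly disfavoured.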


Related Research Articles

In physical cosmology, cosmic inflation, cosmological inflation, or just inflation, is a theory of exponential expansion of space in the early universe. The inflationary epoch is believed to have lasted from 10⁻³⁶ seconds to between 10⁻³³ and 10⁻³² seconds after the Big Bang. Following the inflationary period, the universe continued to expand, but at a slower rate. The re-acceleration of this slowing expansion due to dark energy began after the universe was already over 7.7 billion years old.

In physics, quintessence is a hypothetical form of dark energy, more precisely a scalar field, postulated as an explanation of the observation of an accelerating rate of expansion of the universe. The first example of this scenario was proposed by Ratra and Peebles (1988) and Wetterich (1988). The concept was expanded to more general types of time-varying dark energy, and the term "quintessence" was first introduced in a 1998 paper by Robert R. Caldwell, Rahul Dave and Paul Steinhardt. It has been proposed by some physicists to be a fifth fundamental force. Quintessence differs from the cosmological constant explanation of dark energy in that it is dynamic; that is, it changes over time, unlike the cosmological constant which, by definition, does not change. Quintessence can be either attractive or repulsive depending on the ratio of its kinetic and potential energy. Those working with this postulate believe that quintessence became repulsive about ten billion years ago, about 3.5 billion years after the Big Bang.

Supersymmetry is a theoretical framework in physics that suggests the existence of a symmetry between particles with integer spin (bosons) and particles with half-integer spin (fermions). It proposes that for every known particle, there exists a partner particle with different spin properties. There have been multiple experiments on supersymmetry that have failed to provide evidence that it exists in nature. If evidence is found, supersymmetry could help explain certain phenomena, such as the nature of dark matter and the hierarchy problem in particle physics.

Technicolor (physics): Hypothetical model through which W and Z bosons acquire mass

Technicolor theories are models of physics beyond the Standard Model that address electroweak gauge symmetry breaking, the mechanism through which W and Z bosons acquire masses. Early technicolor theories were modelled on quantum chromodynamics (QCD), the "color" theory of the strong nuclear force, which inspired their name.

An axion is a hypothetical elementary particle originally postulated by the Peccei–Quinn theory in 1977 to resolve the strong CP problem in quantum chromodynamics (QCD). If axions exist and have low mass within a specific range, they are of interest as a possible component of cold dark matter.

The Big Bounce hypothesis is a cosmological model for the origin of the known universe. It was originally suggested as a phase of the cyclic model or oscillatory universe interpretation of the Big Bang, where the first cosmological event was the result of the collapse of a previous universe. It receded from serious consideration in the early 1980s after inflation theory emerged as a solution to the horizon problem, which had arisen from advances in observations revealing the large-scale structure of the universe.

Brane cosmology refers to several theories in particle physics and cosmology related to string theory, superstring theory and M-theory.

Hierarchy problem: Unsolved problem in physics

In theoretical physics, the hierarchy problem is the problem concerning the large discrepancy between aspects of the weak force and gravity. There is no scientific consensus on why, for example, the weak force is 10²⁴ times stronger than gravity.

Flatness problem: Cosmological fine-tuning problem

The flatness problem is a cosmological fine-tuning problem within the Big Bang model of the universe. Such problems arise from the observation that some of the initial conditions of the universe appear to be fine-tuned to very 'special' values, and that small deviations from these values would have extreme effects on the appearance of the universe at the current time.

False vacuum: Hypothetical vacuum, less stable than true vacuum

In quantum field theory, a false vacuum is a hypothetical vacuum that is relatively stable, but not in the most stable state possible. In this condition it is called metastable. It may last for a very long time in this state, but could eventually decay to the more stable one, an event known as false vacuum decay. The most common suggestion of how such a decay might happen in our universe is called bubble nucleation – if a small region of the universe by chance reached a more stable vacuum, this "bubble" would spread.

In string theory, the string theory landscape is the collection of possible false vacua, together comprising a collective "landscape" of choices of parameters governing compactifications.

In physics, naturalness is the aesthetic property that the dimensionless ratios between free parameters or physical constants appearing in a physical theory should take values "of order 1" and that free parameters are not fine-tuned. That is, a natural theory would have parameter ratios with values like 2.34 rather than 234000 or 0.000234.

Christopher T. Hill: American theoretical physicist

Christopher T. Hill is an American theoretical physicist at the Fermi National Accelerator Laboratory who did undergraduate work in physics at M.I.T., and graduate work at Caltech. Hill's Ph.D. thesis, "Higgs Scalars and the Nonleptonic Weak Interactions" (1977) contains one of the first detailed discussions of the two-Higgs-doublet model and its impact upon weak interactions. His work mainly focuses on new physics that can be probed in laboratory experiments or cosmology.

In physical cosmology and astronomy, dark energy is an unknown form of energy that affects the universe on the largest scales. Its primary effect is to drive the accelerating expansion of the universe. Assuming that the lambda-CDM model of cosmology is correct, dark energy is the dominant component of the universe, contributing 68% of the total energy in the present-day observable universe while dark matter and ordinary (baryonic) matter contribute 26% and 5%, respectively, and other components such as neutrinos and photons are nearly negligible. Dark energy's density is very low: 6×10⁻¹⁰ J/m³, much less than the density of ordinary matter or dark matter within galaxies. However, it dominates the universe's mass–energy content because it is uniform across space.

Cosmological constant problem: Concept in cosmology

In cosmology, the cosmological constant problem or vacuum catastrophe is the substantial disagreement between the observed values of vacuum energy density and the much larger theoretical value of zero-point energy suggested by quantum field theory.

Gordon L. Kane

Gordon Leon Kane is Victor Weisskopf Distinguished University Professor at the University of Michigan and director emeritus at the Leinweber Center for Theoretical Physics (LCTP), a leading center for the advancement of theoretical physics. He was director of the LCTP from 2005 to 2011 and Victor Weisskopf Collegiate Professor of Physics from 2002 to 2011. He received the Lilienfeld Prize from the American Physical Society in 2012, and the J. J. Sakurai Prize for Theoretical Particle Physics in 2017.

Gian Francesco Giudice: Italian theoretical physicist

Gian Francesco Giudice is an Italian theoretical physicist working at CERN in particle physics and cosmology.

The asymptotic safety approach to quantum gravity provides a nonperturbative notion of renormalization in order to find a consistent and predictive quantum field theory of the gravitational interaction and spacetime geometry. It is based upon a nontrivial fixed point of the corresponding renormalization group (RG) flow such that the running coupling constants approach this fixed point in the ultraviolet (UV) limit. This suffices to avoid divergences in physical observables. Moreover, it has predictive power: Generically an arbitrary starting configuration of coupling constants given at some RG scale does not run into the fixed point for increasing scale, but a subset of configurations might have the desired UV properties. For this reason it is possible that — assuming a particular set of couplings has been measured in an experiment — the requirement of asymptotic safety fixes all remaining couplings in such a way that the UV fixed point is approached.

750 GeV diphoton excess: 2015 anomaly in the Large Hadron Collider

The 750 GeV diphoton excess in particle physics was an anomaly in data collected at the Large Hadron Collider (LHC) in 2015, which could have been an indication of a new particle or resonance. The anomaly was absent in data collected in 2016, suggesting that the diphoton excess was a statistical fluctuation. In the interval between the December 2015 and August 2016 results, the anomaly generated considerable interest in the scientific community, including about 500 theoretical studies. The hypothetical particle was denoted by the Greek letter Ϝ in the scientific literature, owing to the decay channel in which the anomaly occurred. The data, however, were always less than five standard deviations (sigma) different from that expected if there was no new particle, and, as such, the anomaly never reached the accepted level of statistical significance required to announce a discovery in particle physics. After the August 2016 results, interest in the anomaly sank as it was considered a statistical fluctuation. Indeed, a Bayesian analysis of the anomaly found that whilst data collected in 2015 constituted "substantial" evidence for the digamma on the Jeffreys scale, data collected in 2016 combined with that collected in 2015 was evidence against the digamma.

Jean-Philippe Uzan is a French cosmologist and directeur de recherche employed by the Centre national de la recherche scientifique (CNRS).

References

  1. Leslie, John A. (1998). Modern Cosmology & Philosophy. Prometheus Books. ISBN 1-57392-250-1.
  2. Grinbaum, Alexei (1 February 2012). "Which Fine-Tuning Arguments Are Fine?". Foundations of Physics. 42 (5): 615–631. arXiv:0903.4055. Bibcode:2012FoPh...42..615G. doi:10.1007/s10701-012-9629-9. S2CID 15590514.
  3. Giudice, Gian (2008). "Naturally Speaking: The Naturalness Criterion and Physics at the LHC". Perspectives on LHC Physics. pp. 155–178. arXiv:0801.2562. Bibcode:2008plnc.book..155G. doi:10.1142/9789812779762_0010. ISBN 978-981-277-975-5. S2CID 15078813.
  4. He, Dongshan; Gao, Dongfeng; Cai, Qing-yu (April 2014). "Spontaneous creation of the universe from nothing". Physical Review D. 89 (8): 083510. arXiv:1404.1207. Bibcode:2014PhRvD..89h3510H. doi:10.1103/PhysRevD.89.083510. S2CID 118371273.
  5. Barbieri, Riccardo; Giudice, Gian Francesco (August 1988). "Upper bounds on supersymmetric particle masses". Nuclear Physics B. 306 (1): 63–76. Bibcode:1988NuPhB.306...63B. doi:10.1016/0550-3213(88)90171-X.
  6. Fowlie, Andrew; Balazs, Csaba; White, Graham; Marzola, Luca; Raidal, Martti (17 August 2016). "Naturalness of the relaxion mechanism". Journal of High Energy Physics. 2016 (8): 100. arXiv:1602.03889. Bibcode:2016JHEP...08..100F. doi:10.1007/JHEP08(2016)100. S2CID 119102534.
  7. Fowlie, Andrew (10 July 2014). "CMSSM, naturalness and the 'fine-tuning price' of the Very Large Hadron Collider". Physical Review D. 90 (1): 015010. arXiv:1403.3407. Bibcode:2014PhRvD..90a5010F. doi:10.1103/PhysRevD.90.015010. S2CID 118362634.
  8. Fowlie, Andrew (15 October 2014). "Is the CNMSSM more credible than the CMSSM?". The European Physical Journal C. 74 (10). arXiv:1407.7534. doi:10.1140/epjc/s10052-014-3105-y. S2CID 119304794.
  9. Cabrera, Maria Eugenia; Casas, Alberto; Austri, Roberto Ruiz de (2009). "Bayesian approach and naturalness in MSSM analyses for the LHC". Journal of High Energy Physics. 2009 (3): 075. arXiv:0812.0536. Bibcode:2009JHEP...03..075C. doi:10.1088/1126-6708/2009/03/075. S2CID 18276270.
  10. Fichet, Sylvain (18 December 2012). "Quantified naturalness from Bayesian statistics". Physical Review D. 86 (12): 125029. arXiv:1204.4940. Bibcode:2012PhRvD..86l5029F. doi:10.1103/PhysRevD.86.125029. S2CID 119282331.