Non-extensive self-consistent thermodynamical theory

In experimental physics, the non-extensive self-consistent thermodynamical theory has been proposed to describe phenomena observed at the Large Hadron Collider (LHC). The theory treats the fireball formed in high-energy particle collisions within Tsallis non-extensive thermodynamics. [1] Fireballs lead to the bootstrap idea, or self-consistency principle, just as in the Boltzmann statistics used by Rolf Hagedorn. [2] Assuming that the distribution function varies because of possible symmetry changes, Abdel Nasser Tawfik applied the non-extensive concepts to high-energy particle production. [3] [4]

The motivation to use Tsallis' non-extensive statistics [5] comes from the results obtained by Bediaga et al. [6] They showed that replacing the Boltzmann factor in Hagedorn's theory with the q-exponential function restores good agreement between calculation and experiment, even at energies as high as those reached at the LHC, with q > 1.
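For orientation, a minimal sketch of this replacement, written with the common q > 1 convention for the Tsallis q-exponential (the exact parameterization used in [6] may differ):

```latex
% Sketch: the q-exponential that replaces the Boltzmann factor e^{-E/T}.
% Convention assumed here: q > 1, exponent 1/(q-1); other parameterizations exist.
\[
  e_q(-E/T) = \left[ 1 + (q-1)\,\frac{E}{T} \right]^{-\frac{1}{q-1}} .
\]
% It reduces to the Boltzmann factor as q -> 1 and develops a power-law tail:
\[
  \lim_{q\to 1} e_q(-E/T) = e^{-E/T},
  \qquad
  e_q(-E/T) \sim \left(\frac{E}{T}\right)^{-\frac{1}{q-1}} \quad (E \gg T).
\]
```

The slowly decaying tail is what allows fits of this form to follow the measured spectra far beyond the exponential regime.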

Non-extensive entropy for ideal quantum gas

The starting point of the theory is the entropy for a non-extensive quantum gas of bosons and fermions, as proposed by Conroy, Miller and Plastino, [1] which is given by S_q = S_q^FD + S_q^BE, where S_q^FD is the non-extensive version of the Fermi–Dirac entropy and S_q^BE is the non-extensive version of the Bose–Einstein entropy.
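The specific quantum-gas expressions of [1] are not reproduced here; for reference, the underlying Tsallis entropy [5] from which such generalizations are built is

```latex
% The basic Tsallis entropy (Tsallis 1988), not the full quantum-gas
% expressions S_q^FD and S_q^BE of reference [1].
\[
  S_q = k\,\frac{1 - \sum_i p_i^{\,q}}{q - 1},
  \qquad
  \lim_{q \to 1} S_q = -k \sum_i p_i \ln p_i ,
\]
% so the Boltzmann-Gibbs entropy is recovered in the limit q -> 1.
```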

As shown by that group [2] and also by Cleymans and Worku, [3] the entropy just defined leads to occupation number formulas that reduce to Bediaga's. C. Beck [4] discusses the power-like tails present in the distributions found in high-energy physics experiments.
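A commonly used form of the Tsallis-generalized occupation numbers is sketched below; conventions for the exponent and the sign term vary across the literature, so this should be read as illustrative rather than as the exact expressions of [1] or [3]:

```latex
% Sketch of q-generalized occupation numbers: upper sign (-) for bosons,
% lower sign (+) for fermions; exponent conventions differ between papers.
\[
  n_q(E) = \frac{1}{\bigl[\,1 + (q-1)\,\beta\,(E-\mu)\,\bigr]^{\frac{1}{q-1}} \mp 1} .
\]
% For q -> 1 the bracket tends to e^{beta(E - mu)}, recovering the usual
% Bose-Einstein and Fermi-Dirac distributions; for q > 1 the large-E tail
% falls off as a power law instead of exponentially.
```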

Non-extensive partition function for ideal quantum gas

Using the entropy defined above, the partition function results are

Since experiments have shown that q > 1, this restriction is adopted.

Another way to write the non-extensive partition function for a fireball is

where σ(E) is the density of states of the fireballs.
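Schematically, this second form parallels Hagedorn's representation of the partition function as a transform of the density of states, with the Boltzmann factor replaced by its q-exponential analogue (a sketch under that assumption; the full expression in [1] contains additional factors):

```latex
% Hagedorn-style form (q -> 1 limit) and its schematic q-generalization.
\[
  Z(\beta) = \int_0^\infty dE\, \sigma(E)\, e^{-\beta E}
  \qquad\longrightarrow\qquad
  Z_q(\beta) = \int_0^\infty dE\, \sigma(E)\,
     \bigl[\,1 + (q-1)\,\beta E\,\bigr]^{-\frac{1}{q-1}} .
\]
% sigma(E): density of states of the fireball; beta = 1/(kT).
```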

Self-consistency principle

Self-consistency implies that both forms of the partition function must be asymptotically equivalent and that the mass spectrum and the density of states must be related to each other by

log ρ(m) → log σ(E),

in the limit of m sufficiently large.

The self-consistency can be asymptotically achieved by choosing [1]

and

where β₀ is a constant (related to the limiting temperature discussed below) and the remaining parameters are arbitrary constants. For q → 1 the two expressions above approach the corresponding expressions in Hagedorn's theory.
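In the q → 1 limit these choices reduce to exponentially growing functions of the Hagedorn type; a sketch of that limiting behaviour (the prefactor powers and constants below are illustrative, not taken from the source):

```latex
% Hagedorn-type asymptotic forms recovered for q -> 1 (schematic; the
% exponent a and the constants c, c' are illustrative only).
\[
  \rho(m) \simeq c\, m^{-a}\, e^{\beta_0 m},
  \qquad
  \sigma(E) \simeq c'\, E^{-a}\, e^{\beta_0 E},
  \qquad
  \beta_0 = \frac{1}{k T_0} .
\]
% Both grow exponentially with the same slope beta_0, which is what makes
% log rho(m) and log sigma(E) asymptotically equal (the bootstrap condition).
```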

Main results

With the mass spectrum and density of states given above, the asymptotic form of the partition function is

where

with

One immediate consequence of the expression for the partition function is the existence of a limiting temperature T₀. This result is equivalent to Hagedorn's result. [2] With these results, it is expected that at sufficiently high energy the fireball presents a constant temperature and a constant entropic factor.
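The origin of the limiting temperature can already be seen in the Boltzmann (q → 1) limit; a minimal sketch of the standard Hagedorn argument, using the schematic forms above rather than the exact expressions of [1]:

```latex
% Why an exponentially growing mass spectrum implies a maximum temperature
% (q -> 1 sketch): the integral defining the partition function,
\[
  Z(\beta) \sim \int^{\infty} dm\; m^{-a}\, e^{\beta_0 m}\, e^{-\beta m}
            = \int^{\infty} dm\; m^{-a}\, e^{-(\beta - \beta_0)\, m},
\]
% converges only for beta > beta_0, i.e. for T < T_0 = 1/(k beta_0).
% As T approaches T_0, added energy goes into producing heavier fireballs
% rather than raising the temperature, so T_0 acts as a limiting temperature.
```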

The connection between Hagedorn's theory and Tsallis statistics has been established through the concept of thermofractals, where it is shown that non-extensivity can emerge from a fractal structure. This result is interesting because Hagedorn's definition of a fireball characterizes it as a fractal.

Experimental evidence

Experimental evidence for the existence of a limiting temperature and of a limiting entropic index can be found in the works of J. Cleymans and collaborators [3] [4] and of I. Sena and A. Deppman. [7] [8]


References

  1. A. Deppman, Physica A 391 (2012) 6380.
  2. R. Hagedorn, Suppl. Al Nuovo Cimento 3 (1965) 147.
  3. J. Cleymans and D. Worku, J. Phys. G: Nucl. Part. Phys. 39 (2012) 025006. http://iopscience.iop.org/0954-3899/39/2/025006/pdf/0954-3899_39_2_025006.pdf
  4. J. Cleymans, G.I. Lykasov, A.S. Parvan, A.S. Sorin, O.V. Teryaev and D. Worku, arXiv:1302.1970 (2013).
  5. C. Tsallis, J. Stat. Phys. 52 (1988) 479.
  6. I. Bediaga, E.M.F. Curado and J.M. de Miranda, Physica A 286 (2000) 156.
  7. I. Sena and A. Deppman, Eur. Phys. J. A 49 (2013) 17.
  8. I. Sena and A. Deppman, AIP Conf. Proc. 1520 (2013) 172, arXiv:1208.2952.