Fragmentation function

In a sufficiently hard interaction between particles, the cross section can be factorized into parton distribution functions (PDFs), the hard scattering part, and fragmentation functions. The fragmentation functions, like the PDFs, are non-perturbative functions describing the production of a given observed final state. In a leading-order picture, a fragmentation function can be interpreted as the probability that the observed final state originates from a given quark or gluon. [1]
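
Schematically, for the production of a hadron h in a collision of hadrons A and B, this factorization is commonly written as a convolution (generic leading-twist notation, added here for illustration rather than taken from the reference):

    d\sigma^{AB \to h X} \simeq \sum_{a,b,c} f_{a/A}(x_a, \mu) \otimes f_{b/B}(x_b, \mu) \otimes d\hat{\sigma}_{ab \to c}(\mu) \otimes D_c^{h}(z, \mu)

Here f_{a/A} and f_{b/B} are the PDFs of the colliding hadrons, d\hat{\sigma}_{ab \to c} is the perturbative hard-scattering cross section, D_c^{h}(z, \mu) is the fragmentation function for parton c to produce the hadron h carrying momentum fraction z, and \mu denotes the factorization scale.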

When two particles interact, their mutual cross section is the area transverse to their relative motion within which they must meet in order to scatter from each other. If the particles are hard inelastic spheres that interact only upon contact, their scattering cross section is related to their geometric size. If the particles interact through some action-at-a-distance force, such as electromagnetism or gravity, their scattering cross section is generally larger than their geometric size. When a cross section is specified as a function of some final-state variable, such as particle angle or energy, it is called a differential cross section. When a cross section is integrated over all scattering angles, it is called a total cross section. Cross sections are typically denoted σ (sigma) and measured in units of area.
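
As a worked relation in standard notation (added for illustration), the total cross section is obtained by integrating the differential cross section over all solid angle, and the hard-sphere case mentioned above has a purely geometric value:

    \sigma_{\text{tot}} = \int \frac{d\sigma}{d\Omega}\, d\Omega,
    \qquad
    \sigma_{\text{hard spheres}} = \pi (r_1 + r_2)^2

where r_1 and r_2 are the radii of the two spheres.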

Factorization decomposition of an object into a product of other objects

In mathematics, factorization or factoring consists of writing a number or another mathematical object as a product of several factors, usually smaller or simpler objects of the same kind. For example, 3 × 5 is a factorization of the integer 15, and (x – 2)(x + 2) is a factorization of the polynomial x² – 4.
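
As a small programmatic illustration (a hypothetical snippet assuming the sympy library is available; it is not part of the article), both factorizations above can be reproduced:

    # Hypothetical illustration using sympy; not part of the article's sources.
    from sympy import symbols, factor, factorint

    print(factorint(15))      # {3: 1, 5: 1}, i.e. 15 = 3 * 5
    x = symbols('x')
    print(factor(x**2 - 4))   # (x - 2)*(x + 2)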

In particle physics, the parton model is a model of hadrons, such as protons and neutrons, proposed by Richard Feynman. It is useful for interpreting the cascades of radiation produced from QCD processes and interactions in high-energy particle collisions.

See also

Related Research Articles

Boltzmann distribution Probability distribution of energy states of a system

In statistical mechanics and mathematics, a Boltzmann distribution is a probability distribution or probability measure that gives the probability that a system will be in a certain state as a function of that state's energy and the temperature of the system. It is also a frequency distribution of particles in a system. The distribution is expressed in the form p_i ∝ exp(−ε_i / kT), where p_i is the probability of the system being in state i, ε_i is the energy of that state, k is the Boltzmann constant, and T is the temperature of the system.
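
For completeness, the normalized form of the distribution (a standard result, stated here rather than quoted from the excerpt) is:

    p_i = \frac{e^{-\varepsilon_i / (k T)}}{Z},
    \qquad
    Z = \sum_j e^{-\varepsilon_j / (k T)}

where Z is the partition function, which ensures that the probabilities sum to one.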

Beta decay decay where electrons (β-, beta minus) or positrons (β+, positron emission) are emitted

In nuclear physics, beta decay (β-decay) is a type of radioactive decay in which a beta ray is emitted from an atomic nucleus. For example, beta decay of a neutron transforms it into a proton by the emission of an electron accompanied by an antineutrino, or conversely, a proton is converted into a neutron by the emission of a positron with a neutrino, thus changing the nuclide type. Neither the beta particle nor its associated (anti-)neutrino exists within the nucleus prior to beta decay; both are created in the decay process. By this process, unstable atoms obtain a more stable ratio of protons to neutrons. The probability of a nuclide decaying due to beta and other forms of decay is determined by its nuclear binding energy. The binding energies of all existing nuclides form what is called the nuclear band or valley of stability. For either electron or positron emission to be energetically possible, the energy release or Q value must be positive.
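
As a worked example of the Q-value condition (standard numbers, added for illustration), the beta decay of a free neutron reads:

    n \to p + e^- + \bar{\nu}_e,
    \qquad
    Q_{\beta^-} = (m_n - m_p - m_e)\, c^2 \approx 0.78\ \text{MeV} > 0

so the decay is energetically allowed, with the released energy shared between the electron and the antineutrino.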

Double-slit experiment Physics experiment, showing light can be modelled by both waves and particles

In modern physics, the double-slit experiment is a demonstration that light and matter can display characteristics of both classically defined waves and particles; moreover, it displays the fundamentally probabilistic nature of quantum mechanical phenomena. The experiment was first performed with light by Thomas Young in 1801. In 1927, Davisson and Germer demonstrated that electrons show the same behavior, which was later extended to atoms and molecules.

Quantum mechanics branch of physics dealing with phenomena at scales of the order of the Planck constant

Quantum mechanics, including quantum field theory, is a fundamental theory in physics which describes nature at the smallest scales of energy levels of atoms and subatomic particles.

The van der Waals equation is an equation of state that generalizes the ideal gas law to account for the fact that real gases do not behave ideally. The ideal gas law treats gas molecules as point particles that interact with their containers but not with each other, meaning they neither take up space nor change kinetic energy during collisions. The ideal gas law states that the volume (V) occupied by n moles of any gas has a pressure (P) at temperature (T) in kelvins given by the relationship P V = n R T, where R is the gas constant.
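
For reference, the van der Waals equation itself (a standard textbook form, not quoted in the excerpt above) modifies this relationship with two substance-specific parameters, a for intermolecular attraction and b for finite molecular volume:

    \left( P + \frac{a\, n^2}{V^2} \right) (V - n b) = n R T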

Asymmetry state; the absence of, or a violation of, symmetry

Asymmetry is the absence of, or a violation of, symmetry. Symmetry is an important property of both physical and abstract systems and it may be displayed in precise terms or in more aesthetic terms. The absence or violation of a symmetry that is either expected or desired can have important consequences for a system.

In particle physics, hadronization is the process of the formation of hadrons out of quarks and gluons. This occurs after high-energy collisions in a particle collider in which quarks or gluons are created. Due to colour confinement, these cannot exist individually. In the Standard Model they combine with quarks and antiquarks spontaneously created from the vacuum to form hadrons. The QCD dynamics of the hadronization process are not yet fully understood, but are modeled and parameterized in a number of phenomenological studies, including the Lund string model and various long-range QCD approximation schemes.

Perturbative quantum chromodynamics is a subfield of particle physics in which the theory of strong interactions, Quantum Chromodynamics (QCD), is studied by using the fact that the strong coupling constant is small in high energy or short distance interactions, thus allowing perturbation theory techniques to be applied. In most circumstances, making testable predictions with QCD is extremely difficult, due to the infinite number of possible topologically-inequivalent interactions. Over short distances, the coupling is small enough that this infinite number of terms can be approximated accurately by a finite number of terms. Although limited in scope, this approach has resulted in the most precise tests of QCD to date.
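
Concretely, a perturbative QCD prediction for an observable O is organized as a truncated power series in the strong coupling (schematic form, added for illustration):

    O(\mu) \simeq \sum_{n=0}^{N} c_n(\mu)\, \alpha_s^{\,n}(\mu)

Because the coupling \alpha_s(\mu) becomes small at large momentum scales (asymptotic freedom), keeping only the first few terms already yields an accurate approximation in the short-distance regime described above.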

Jet (particle physics) narrow cone of hadrons and other particles produced by the hadronization of a quark or gluon in a particle physics or heavy ion experiment

A jet is a narrow cone of hadrons and other particles produced by the hadronization of a quark or gluon in a particle physics or heavy ion experiment. Particles carrying a color charge, such as quarks, cannot exist in free form because of QCD confinement which only allows for colorless states. When an object containing color charge fragments, each fragment carries away some of the color charge. In order to obey confinement, these fragments create other colored objects around them to form colorless objects. The ensemble of these objects is called a jet, since the fragments all tend to travel in the same direction, forming a narrow "jet" of particles. Jets are measured in particle detectors and studied in order to determine the properties of the original quarks.

In particle physics, the Lund string model is a phenomenological model of hadronization. It treats all but the highest-energy gluons as field lines, which are attracted to each other due to the gluon self-interaction and so form a narrow tube of strong color field.
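
The narrow color-field tube implies an approximately linear confining potential between the string endpoints (a standard feature of string models, stated here for illustration):

    V(r) \approx \kappa\, r,
    \qquad
    \kappa \approx 1\ \text{GeV/fm}

where \kappa is the string tension; when the stored energy suffices to create a quark-antiquark pair, the string breaks, and repeated breaks produce the hadrons of the final state.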

ALICE experiment one of the detector experiments at the Large Hadron Collider

ALICE is one of seven detector experiments at the Large Hadron Collider at CERN. The other six are: ATLAS, CMS, TOTEM, LHCb, LHCf and MoEDAL.

Event generators are software libraries that generate simulated high-energy particle physics events. They randomly generate events like those produced in particle accelerators, collider experiments or the early universe. Events come in different types, called processes, as discussed in the Automatic calculation of particle interaction or decay article.
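
A minimal sketch of the underlying idea, acceptance-rejection ("hit-or-miss") sampling of events from a differential distribution, is given below; the toy spectrum and all names are illustrative placeholders, not the interface of any real event generator:

    # Illustrative sketch of acceptance-rejection event sampling.
    # The toy spectrum below is a placeholder, not a real cross section.
    import math
    import random

    def toy_spectrum(pt):
        """Un-normalized toy differential distribution dN/dpT (placeholder)."""
        return 1.0 / (1.0 + pt) ** 4

    def generate_events(n_events, pt_min=0.0, pt_max=50.0):
        """Draw events by acceptance-rejection sampling from the toy spectrum."""
        events = []
        f_max = toy_spectrum(pt_min)  # the toy spectrum is monotonically decreasing
        while len(events) < n_events:
            pt = random.uniform(pt_min, pt_max)
            if random.random() * f_max < toy_spectrum(pt):
                phi = random.uniform(0.0, 2.0 * math.pi)  # azimuthal angle, flat
                events.append({"pt": pt, "phi": phi})
        return events

    if __name__ == "__main__":
        for event in generate_events(3):
            print(event)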

Particle decay is the spontaneous process of one unstable subatomic particle transforming into multiple other particles. The particles created in this process must each be less massive than the original, although the total invariant mass of the system must be conserved. A particle is unstable if there is at least one allowed final state that it can decay into. Unstable particles will often have multiple ways of decaying, each with its own associated probability. Decays are mediated by one or several fundamental forces. The particles in the final state may themselves be unstable and subject to further decay.
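
In standard notation (added for illustration), the competing decay channels combine into a total decay width, and each channel's probability is its branching ratio:

    \Gamma_{\text{tot}} = \sum_i \Gamma_i,
    \qquad
    \mathrm{BR}_i = \frac{\Gamma_i}{\Gamma_{\text{tot}}},
    \qquad
    \tau = \frac{\hbar}{\Gamma_{\text{tot}}}

where \Gamma_i is the partial width for channel i and \tau is the particle's mean lifetime.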

PYTHIA is a computer simulation program for particle collisions at very high energies in particle accelerators.

In high-energy physics, jet quenching is a phenomenon that can occur in the collision of ultra-high-energy particles. In general, the collision of high-energy particles can produce jets of elementary particles that emerge from these collisions. Collisions of ultra-relativistic heavy-ion particle beams create a hot and dense medium comparable to the conditions in the early universe, and then these jets interact strongly with the medium, leading to a marked reduction of their energy. This energy reduction is called "jet quenching".

The Ghirardi–Rimini–Weber theory is a collapse theory in quantum mechanics. GRW differs from other collapse theories by proposing that wave function collapse happens spontaneously. GRW is an attempt to avoid the measurement problem in quantum mechanics. It was first reported in 1985.

The automatic calculation of particle interaction or decay is part of the computational particle physics branch. It refers to computing tools that help calculate the complex particle interactions studied in high-energy physics, astroparticle physics and cosmology. The goal of the automation is to handle the full sequence of calculations in an automatic (programmed) way: from the Lagrangian expression describing the physics model up to the cross-section values and the event-generator software.

Spectrometer instrument used to measure properties of light

A spectrometer is a scientific instrument used to separate and measure spectral components of a physical phenomenon. Spectrometer is a broad term often used to describe instruments that measure a continuous variable of a phenomenon where the spectral components are somehow mixed. In visible light a spectrometer can for instance separate white light and measure individual narrow bands of color, called a spectrum, while a mass spectrometer measures the spectrum of the masses of the atoms or molecules present in a gas. The first spectrometers were used to split light into an array of separate colors. Spectrometers were developed in early studies of physics, astronomy, and chemistry. The capability of spectroscopy to determine chemical composition drove its advancement and continues to be one of its primary uses. Spectrometers are used in astronomy to analyze the chemical composition of stars and planets, and spectrometers gather data on the origin of the universe.

In physics, the underlying event (UE) is everything observed in a hadron collider event that does not come from the primary hard scattering process.

References

  1. Metz, A.; Vossen, A. (November 2016). "Parton fragmentation functions". Progress in Particle and Nuclear Physics. 91: 136–202. doi:10.1016/j.ppnp.2016.08.003. ISSN 0146-6410.