CompHEP is a software package for automatic computations in high energy physics from Lagrangians to collision events or particle decays.
CompHEP is based on the quantum theory of gauge fields; namely, it uses the technique of squared Feynman diagrams at the tree-level approximation. By default, CompHEP includes the Standard Model Lagrangian in the unitarity and 't Hooft-Feynman gauges, as well as several MSSM models. Users can also create new physical models based on different Lagrangians; a dedicated tool, LanHEP, exists for this purpose. CompHEP computes leading-order (LO) cross sections and distributions for processes with several particles in the final state (up to six or seven). If necessary, it can take into account all QCD and EW diagrams, the masses of fermions and bosons, and the widths of unstable particles. Processes computed with CompHEP can be interfaced to the Monte Carlo generators PYTHIA and HERWIG as new external processes.
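As a rough illustration of the tree-level technique (a minimal sketch, not CompHEP's actual code), the following computes a LO cross section by Monte Carlo integration of a known squared matrix element, using the textbook QED process e⁺e⁻ → μ⁺μ⁻, for which dσ/dΩ = α²(1 + cos²θ)/(4s) in the massless limit and the analytic total is σ = 4πα²/(3s). All function names and the chosen energy are illustrative.

```python
import math
import random

ALPHA = 1.0 / 137.035999  # fine-structure constant

def dsigma_domega(s, cos_theta):
    """Textbook tree-level dσ/dΩ for e+e- -> mu+mu- via a photon,
    in the massless limit; s in GeV^2, result in GeV^-2."""
    return ALPHA**2 * (1.0 + cos_theta**2) / (4.0 * s)

def sigma_lo(s, n_samples=200_000):
    """Monte Carlo estimate of the total LO cross section.
    By azimuthal symmetry, dΩ -> 2π d(cosθ); cosθ is sampled
    uniformly on [-1, 1] (a flat phase-space measure of length 2)."""
    acc = sum(dsigma_domega(s, random.uniform(-1.0, 1.0))
              for _ in range(n_samples))
    return 2.0 * math.pi * 2.0 * acc / n_samples

if __name__ == "__main__":
    s = 200.0**2  # collision energy squared, GeV^2 (illustrative value)
    mc = sigma_lo(s)
    exact = 4.0 * math.pi * ALPHA**2 / (3.0 * s)  # analytic LO total
    print(f"Monte Carlo : {mc:.6e} GeV^-2")
    print(f"Analytic LO : {exact:.6e} GeV^-2")
```

Real packages integrate the full multi-particle phase space rather than a single angle, but the structure (squared matrix element times phase-space measure, estimated by sampling) is the same.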
The CompHEP project started in 1989 at the Skobeltsyn Institute of Nuclear Physics (SINP) of Moscow State University. The package was developed throughout the 1990s, and it is now a powerful tool for the automatic computation of collision processes. CompHEP has been used for many studies by a variety of experimental groups.
Due to its intuitive graphical interface, CompHEP is also a very useful tool for education in particle and nuclear physics.
In theoretical physics, a Feynman diagram is a pictorial representation of the mathematical expressions describing the behavior and interaction of subatomic particles. The scheme is named after American physicist Richard Feynman, who introduced the diagrams in 1948. The interaction of subatomic particles can be complex and difficult to understand; Feynman diagrams give a simple visualization of what would otherwise be an arcane and abstract formula. According to David Kaiser, "Since the middle of the 20th century, theoretical physicists have increasingly turned to this tool to help them undertake critical calculations. Feynman diagrams have revolutionized nearly every aspect of theoretical physics." While the diagrams are applied primarily to quantum field theory, they can also be used in other areas of physics, such as solid-state theory. Frank Wilczek wrote that the calculations that won him the 2004 Nobel Prize in Physics "would have been literally unthinkable without Feynman diagrams, as would [Wilczek's] calculations that established a route to production and observation of the Higgs particle."
In theoretical physics, quantum chromodynamics (QCD) is the study of the strong interaction between quarks mediated by gluons. Quarks are fundamental particles that make up composite hadrons such as the proton, neutron and pion. QCD is a type of quantum field theory called a non-abelian gauge theory, with symmetry group SU(3). The QCD analog of electric charge is a property called color. Gluons are the force carriers of the theory, just as photons are for the electromagnetic force in quantum electrodynamics. The theory is an important part of the Standard Model of particle physics. A large body of experimental evidence for QCD has been gathered over the years.
In theoretical physics, quantum field theory (QFT) is a theoretical framework that combines classical field theory, special relativity, and quantum mechanics. QFT is used in particle physics to construct physical models of subatomic particles and in condensed matter physics to construct models of quasiparticles. The current standard model of particle physics is based on quantum field theory.
In particle physics, quantum electrodynamics (QED) is the relativistic quantum field theory of electrodynamics. In essence, it describes how light and matter interact and is the first theory where full agreement between quantum mechanics and special relativity is achieved. QED mathematically describes all phenomena involving electrically charged particles interacting by means of exchange of photons and represents the quantum counterpart of classical electromagnetism giving a complete account of matter and light interaction.
Renormalization is a collection of techniques in quantum field theory, statistical field theory, and the theory of self-similar geometric structures that are used to treat infinities arising in calculated quantities by altering the values of these quantities to compensate for effects of their self-interactions. Even if no infinities arose in loop diagrams in quantum field theory, it could be shown that it would still be necessary to renormalize the mass and fields appearing in the original Lagrangian.
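A concrete, standard consequence (quoted here for orientation, not derived): after renormalization the measured QED coupling depends on the momentum scale at which it is probed. At one loop, for momentum transfers Q² ≫ m_e² with only the electron loop included, the effective fine-structure constant runs as

$$ \alpha(Q^2) \;=\; \frac{\alpha(m_e^2)}{1 - \dfrac{\alpha(m_e^2)}{3\pi}\,\ln\dfrac{Q^2}{m_e^2}}. $$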
In theoretical physics, a chiral anomaly is the anomalous nonconservation of a chiral current. In everyday terms, it is equivalent to a sealed box that contained equal numbers of left- and right-handed bolts, but when opened was found to have more left than right, or vice versa.
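The textbook case is the Adler-Bell-Jackiw anomaly in QED: the axial current of a massless Dirac fermion, conserved classically, acquires a nonzero divergence at one loop (the sign and normalization below depend on conventions):

$$ \partial_\mu j_5^\mu \;=\; \frac{e^2}{16\pi^2}\,\epsilon^{\mu\nu\rho\sigma} F_{\mu\nu} F_{\rho\sigma}. $$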
In theoretical physics, the anti-de Sitter/conformal field theory correspondence is a conjectured relationship between two kinds of physical theories. On one side are anti-de Sitter spaces (AdS) that are used in theories of quantum gravity, formulated in terms of string theory or M-theory. On the other side of the correspondence are conformal field theories (CFT) that are quantum field theories, including theories similar to the Yang–Mills theories that describe elementary particles.
In physics, Faddeev–Popov ghosts are extraneous fields which are introduced into gauge quantum field theories to maintain the consistency of the path integral formulation. They are named after Ludvig Faddeev and Victor Popov.
Yang–Mills theory is a quantum field theory for nuclear binding devised by Chen Ning Yang and Robert Mills in 1953, as well as a generic term for the class of similar theories. The Yang–Mills theory is a gauge theory based on a special unitary group SU(n), or more generally any compact Lie group. A Yang–Mills theory seeks to describe the behavior of elementary particles using these non-abelian Lie groups and is at the core of the unification of the electromagnetic and weak forces (i.e. U(1) × SU(2)) as well as quantum chromodynamics, the theory of the strong force (based on SU(3)). It thus forms the basis of our understanding of the Standard Model of particle physics.
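For reference, the dynamics of a pure Yang–Mills theory follow from the standard Lagrangian density

$$ \mathcal{L}_{\mathrm{YM}} = -\tfrac{1}{4}\, F^a_{\mu\nu} F^{a\,\mu\nu}, \qquad F^a_{\mu\nu} = \partial_\mu A^a_\nu - \partial_\nu A^a_\mu + g\, f^{abc} A^b_\mu A^c_\nu, $$

where f^{abc} are the structure constants of the gauge group. The non-abelian term g f^{abc} A^b_μ A^c_ν produces cubic and quartic self-interactions of the gauge field that have no analogue in electromagnetism.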
In physics, especially quantum field theory, regularization is a method of modifying observables that have singularities in order to make them finite by introducing a suitable parameter called the regulator. The regulator, also known as a "cutoff", models our lack of knowledge about physics at unobserved scales. It allows for the possibility that "new physics" may be discovered at scales which the present theory is unable to model, while enabling the current theory to give accurate predictions as an "effective theory" within its intended scale of use.
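A minimal illustration, assuming a simple momentum cutoff: the logarithmically divergent Euclidean loop integral typical of one-loop self-energies becomes finite once momenta above the regulator Λ are discarded,

$$ \int^{\Lambda} \frac{d^4 k_E}{(2\pi)^4}\, \frac{1}{\left(k_E^2+\Delta\right)^2} \;=\; \frac{1}{16\pi^2}\left[\ln\frac{\Lambda^2}{\Delta} \;-\; 1 \;+\; O\!\left(\frac{\Delta}{\Lambda^2}\right)\right], $$

so the would-be infinity is traded for an explicit dependence on Λ, to be removed later by renormalization.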
In particle physics, flavor-changing neutral currents (FCNCs) are hypothetical interactions that change the flavor of a fermion without altering its electric charge.
Event generators are software libraries that generate simulated high-energy particle physics events. They randomly generate events such as those produced in particle accelerators, collider experiments or the early universe. Events come in different types called processes, as discussed in the Automatic calculation of particle interaction or decay article.
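A minimal sketch of the sampling step at the heart of any event generator (the function names are illustrative, not tied to PYTHIA or HERWIG): events are drawn from a differential distribution by acceptance-rejection, here for the 1 + cos²θ angular shape of e⁺e⁻ → μ⁺μ⁻.

```python
import math
import random

def angular_shape(cos_theta):
    """Unnormalized angular distribution dN/d(cosθ) ∝ 1 + cos²θ."""
    return 1.0 + cos_theta**2

def generate_events(n_events):
    """Acceptance-rejection sampling: propose cosθ uniformly and accept
    with probability shape/shape_max, so accepted events follow the shape."""
    shape_max = 2.0  # maximum of 1 + cos²θ on [-1, 1]
    events = []
    while len(events) < n_events:
        cos_theta = random.uniform(-1.0, 1.0)
        if random.uniform(0.0, shape_max) < angular_shape(cos_theta):
            phi = random.uniform(0.0, 2.0 * math.pi)  # azimuth is flat
            events.append((cos_theta, phi))
    return events

events = generate_events(10_000)
mean_c2 = sum(c * c for c, _ in events) / len(events)
print(f"<cos²θ> = {mean_c2:.3f}  (analytic value: 0.4)")
```

Production generators apply the same accept/reject idea over the full multi-dimensional phase space, then dress the hard event with parton showers and hadronization.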
In particle physics, the parton model is a model of hadrons, such as protons and neutrons, proposed by Richard Feynman. It is useful for interpreting the cascades of radiation produced from quantum chromodynamics (QCD) processes and interactions in high-energy particle collisions.
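In practice the parton model enters hadron-collider predictions through the standard factorization formula

$$ \sigma \;=\; \sum_{i,j} \int_0^1 dx_1\, dx_2\; f_i(x_1,\mu^2)\, f_j(x_2,\mu^2)\; \hat{\sigma}_{ij}(x_1 p_1,\, x_2 p_2,\, \mu^2), $$

where f_i(x, μ²) is the probability density for finding parton i carrying momentum fraction x of its parent hadron, and σ̂_ij is the short-distance partonic cross section.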
In quantum electrodynamics, Bhabha scattering is the electron-positron scattering process e⁺e⁻ → e⁺e⁻, to which two leading-order Feynman diagrams contribute: an annihilation process and a scattering process.
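For reference (a standard textbook result, valid in the high-energy limit where the electron mass is neglected), the spin-averaged tree-level differential cross section combining the s-channel annihilation and t-channel exchange diagrams is

$$ \frac{d\sigma}{d\Omega} \;=\; \frac{\alpha^2}{2s}\left[\frac{s^2+u^2}{t^2} \;+\; \frac{u^2+t^2}{s^2} \;+\; \frac{2u^2}{st}\right], $$

with s, t, u the Mandelstam variables; the 2u²/(st) term is the interference of the two diagrams.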
Computational particle physics refers to the methods and computing tools developed in and used by particle physics research. Like computational chemistry or computational biology, it is, for particle physics, both a specific branch and an interdisciplinary field relying on computer science, theoretical and experimental particle physics, and mathematics. The main fields of computational particle physics are lattice field theory, automatic calculation of particle interaction or decay, and event generators.
The automatic calculation of particle interaction or decay is part of the computational particle physics branch. It refers to computing tools that help calculate the complex particle interactions studied in high-energy physics, astroparticle physics and cosmology. The goal of the automation is to handle the full sequence of calculations in an automatic (programmed) way: from the Lagrangian expression describing the physics model, to the cross-section values, and on to the event generator software.
In theoretical particle physics, maximally helicity violating (MHV) amplitudes are amplitudes with n massless external gauge bosons, where n − 2 gauge bosons have a particular helicity and the other two have the opposite helicity. These amplitudes are called MHV amplitudes because at tree level they violate helicity conservation to the maximum extent possible. The tree amplitudes in which all gauge bosons have the same helicity, or all but one have the same helicity, vanish.
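These amplitudes have a remarkably compact closed form, the Parke-Taylor formula: in spinor-helicity notation (with the coupling and the momentum-conserving delta function stripped off), the color-ordered tree amplitude with gluons i and j of negative helicity and all others positive is

$$ A_n\!\left(1^+,\dots,i^-,\dots,j^-,\dots,n^+\right) \;=\; \frac{\langle i\, j\rangle^4}{\langle 1\,2\rangle \langle 2\,3\rangle \cdots \langle n\,1\rangle}. $$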
In physics, a gauge theory is a type of field theory in which the Lagrangian, and hence the dynamics of the system itself, do not change under local transformations according to certain smooth families of operations. Formally, the Lagrangian is invariant under these transformations.
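The simplest example is the U(1) gauge symmetry of electrodynamics. The Lagrangian of a charged fermion coupled to the photon field,

$$ \mathcal{L} \;=\; \bar{\psi}\left(i\gamma^\mu D_\mu - m\right)\psi \;-\; \tfrac{1}{4} F_{\mu\nu} F^{\mu\nu}, \qquad D_\mu = \partial_\mu + i e A_\mu, $$

is unchanged under the local transformation ψ(x) → e^{−iα(x)} ψ(x), A_μ(x) → A_μ(x) + (1/e) ∂_μ α(x) for an arbitrary smooth function α(x).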
In quantum field theory, initial and final state radiation refers to certain kinds of radiative emissions that are not due to particle annihilation. It is important in experimental and theoretical studies of interactions at particle colliders.
Zvi Bern is an American theoretical particle physicist. He is a professor at University of California, Los Angeles (UCLA).