Auxiliary-field Monte Carlo


Auxiliary-field Monte Carlo is a method that uses Monte Carlo techniques to calculate averages of operators in many-body quantum mechanical (Blankenbecler 1981, Ceperley 1977) or classical (Baeurle 2004, Baeurle 2003, Baeurle 2002a) problems.


Reweighting procedure and numerical sign problem

The distinctive ingredient of auxiliary-field Monte Carlo is that the interactions are decoupled by means of the Hubbard–Stratonovich transformation, which permits the reformulation of many-body theory in terms of a scalar auxiliary-field representation. This reduces the many-body problem to the calculation of a sum or integral over all possible auxiliary-field configurations. In this sense, there is a trade-off: instead of dealing with one very complicated many-body problem, one faces the calculation of an infinite number of simple external-field problems.
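To make the decoupling concrete, here is a minimal numerical sketch (in Python, with illustrative parameter values that do not come from any particular model) of the scalar version of the Hubbard–Stratonovich identity: a quadratic exponent in x is traded for a Gaussian integral over an auxiliary variable y that couples to x only linearly.

```python
# Numerical check of the scalar Hubbard-Stratonovich identity
#   exp(a*x**2/2) = 1/sqrt(2*pi*a) * Integral dy exp(-y**2/(2*a) + x*y),  a > 0,
# which trades a quadratic ("interacting") exponent for a linear coupling
# between x and the auxiliary variable y.  The values of a and x below are
# arbitrary illustrative choices.
import numpy as np

a, x = 0.7, 1.3
y = np.linspace(-40.0, 40.0, 200_001)            # wide grid, negligible tails
integrand = np.exp(-y**2 / (2.0 * a) + x * y)
rhs = np.sum(integrand) * (y[1] - y[0]) / np.sqrt(2.0 * np.pi * a)
lhs = np.exp(0.5 * a * x**2)
print(lhs, rhs)                                   # the two values agree closely
```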

It is here, as in other related methods, that Monte Carlo enters the game in the guise of importance sampling: the large sum over auxiliary-field configurations is performed by sampling over the most important ones, with a certain probability. In classical statistical physics, this probability is usually given by the (positive semi-definite) Boltzmann factor. Similar factors arise also in quantum field theories; however, these can have indefinite sign (especially in the case of fermions) or even be complex-valued, which precludes their direct interpretation as probabilities. In these cases, one has to resort to a reweighting procedure (i.e., interpret the absolute value as the probability and multiply the observable by the sign or phase) to obtain a strictly positive reference distribution suitable for Monte Carlo sampling. However, it is well known that, in specific parameter ranges of the model under consideration, the oscillatory nature of the weight function can lead to poor statistical convergence of the numerical integration procedure. This is known as the numerical sign problem and can be alleviated with analytical and numerical convergence acceleration procedures (Baeurle 2002, Baeurle 2003a).
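A minimal sketch of the reweighting step, for an invented one-dimensional toy weight rather than a real auxiliary-field model: the Markov chain is run with Metropolis on the absolute value of the weight, and the sign is multiplied into both the observable and the normalization.

```python
# Sketch of the reweighting procedure for a non-positive weight (toy model):
# sample from |w(x)| with Metropolis and carry the sign along, so that
#   <O> = <sign(w) * O>_{|w|} / <sign(w)>_{|w|}.
# The weight w(x) = exp(-x^2/2)*cos(k*x), the observable O(x) = x^2, and all
# parameters are invented for illustration; for this weight <x^2> = 1 - k^2 exactly.
import numpy as np

rng = np.random.default_rng(0)
k = 0.5

def weight(x):
    return np.exp(-0.5 * x * x) * np.cos(k * x)

x = 0.0
num = den = 0.0
n_burn, n_steps = 10_000, 200_000
for step in range(n_burn + n_steps):
    x_new = x + rng.normal()                                 # symmetric proposal
    if rng.random() < abs(weight(x_new)) / abs(weight(x)):   # Metropolis on |w|
        x = x_new
    if step >= n_burn:
        s = np.sign(weight(x))                               # carried-along sign
        num += s * x * x
        den += s

print("reweighted <x^2> ~", num / den, " (exact 1 - k^2 =", 1.0 - k * k, ")")
print("average sign     ~", den / n_steps)
```

When the average sign in the denominator approaches zero, the statistical error of the ratio blows up, which is precisely the sign problem discussed above.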

See also

Related Research Articles

Fermi liquid theory

Fermi liquid theory is a theoretical model of interacting fermions that describes the normal state of most metals at sufficiently low temperatures. The interactions among the particles of the many-body system do not need to be small. The phenomenological theory of Fermi liquids was introduced by the Soviet physicist Lev Davidovich Landau in 1956, and later developed by Alexei Abrikosov and Isaak Khalatnikov using diagrammatic perturbation theory. The theory explains why some of the properties of an interacting fermion system are very similar to those of the ideal Fermi gas, and why other properties differ.

Lattice gauge theory

In physics, lattice gauge theory is the study of gauge theories on a spacetime that has been discretized into a lattice.

Lattice QCD

Lattice QCD is a well-established non-perturbative approach to solving the quantum chromodynamics (QCD) theory of quarks and gluons. It is a lattice gauge theory formulated on a grid or lattice of points in space and time. When the size of the lattice is taken infinitely large and its sites infinitesimally close to each other, continuum QCD is recovered.

Quantum Monte Carlo encompasses a large family of computational methods whose common aim is the study of complex quantum systems. One of the major goals of these approaches is to provide a reliable solution of the quantum many-body problem. The diverse flavors of quantum Monte Carlo approaches all share the common use of the Monte Carlo method to handle the multi-dimensional integrals that arise in the different formulations of the many-body problem.

The classical-map hypernetted-chain method is a method used in many-body theoretical physics for interacting uniform electron liquids in two and three dimensions, and for non-ideal plasmas. The method extends the famous hypernetted-chain method (HNC), introduced by J. M. J. van Leeuwen et al., to quantum fluids. The classical HNC, together with the Percus–Yevick approximation, are the two pillars which bear the brunt of most calculations in the theory of interacting classical fluids. Also, HNC and PY have become important in providing basic reference schemes in the theory of fluids, and hence they are of great importance to the physics of many-particle systems.

In computational physics, variational Monte Carlo (VMC) is a quantum Monte Carlo method that applies the variational method to approximate the ground state of a quantum system.
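As a hedged illustration of the idea (not code from any particular VMC package), the following sketch applies the method to the one-dimensional harmonic oscillator with an assumed Gaussian trial wavefunction; the variational energy estimate is lowest at the exact value of the parameter.

```python
# Minimal variational Monte Carlo sketch for the 1D harmonic oscillator
# (hbar = m = omega = 1), assuming the trial wavefunction psi_a(x) = exp(-a*x^2).
# The local energy is E_L(x) = a + (1/2 - 2*a**2) * x**2; the variational
# estimate is minimal (and exact, E = 1/2) at a = 1/2.  All parameters below
# are illustrative choices.
import numpy as np

rng = np.random.default_rng(1)

def vmc_energy(a, n_steps=100_000, n_burn=5_000, step=1.0):
    """Metropolis sampling of |psi_a(x)|^2; returns the mean local energy."""
    x, e_sum = 0.0, 0.0
    for i in range(n_burn + n_steps):
        x_new = x + rng.uniform(-step, step)
        # acceptance ratio |psi_a(x_new)|^2 / |psi_a(x)|^2
        if rng.random() < np.exp(-2.0 * a * (x_new**2 - x**2)):
            x = x_new
        if i >= n_burn:
            e_sum += a + (0.5 - 2.0 * a * a) * x * x
    return e_sum / n_steps

for a in (0.3, 0.5, 0.7):    # energies stay >= 1/2, with the minimum at a = 1/2
    print(a, vmc_energy(a))
```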

Gaussian Quantum Monte Carlo is a quantum Monte Carlo method that offers a potential solution to the fermion sign problem without the deficiencies of alternative approaches. Instead of the Hilbert space, this method works in the space of density matrices that can be spanned by an over-complete basis of Gaussian operators using only positive coefficients. Because the basis contains only quadratic forms of the fermionic operators, no anti-commuting variables occur, and any quantum state can be expressed as a real probability distribution.

Path integral Monte Carlo (PIMC) is a quantum Monte Carlo method used to solve quantum statistical mechanics problems numerically within the path integral formulation. The application of Monte Carlo methods to path integral simulations of condensed matter systems was first pursued in a key paper by John A. Barker.

The percolation threshold is a mathematical concept in percolation theory that describes the formation of long-range connectivity in random systems. Below the threshold a giant connected component does not exist; while above it, there exists a giant component of the order of system size. In engineering and coffee making, percolation represents the flow of fluids through porous media, but in the mathematics and physics worlds it generally refers to simplified lattice models of random systems or networks (graphs), and the nature of the connectivity in them. The percolation threshold is the critical value of the occupation probability p, or more generally a critical surface for a group of parameters p1, p2, ..., such that infinite connectivity (percolation) first occurs.

In applied mathematics, the numerical sign problem is the problem of numerically evaluating the integral of a highly oscillatory function of a large number of variables. Numerical methods fail because of the near-cancellation of the positive and negative contributions to the integral. Each has to be integrated to very high precision in order for their difference to be obtained with useful accuracy.
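The effect can be seen in a small numerical experiment, sketched below with arbitrarily chosen parameters: a one-dimensional oscillatory Gaussian integral is estimated by plain Monte Carlo, and the estimate becomes meaningless once the exact value falls below the statistical error of the nearly cancelling positive and negative samples.

```python
# Toy illustration of the numerical sign problem: estimate the oscillatory
# integral  I(k) = Integral exp(-x^2/2) cos(k*x) dx = sqrt(2*pi) * exp(-k^2/2)
# by sampling x from a standard Gaussian and averaging cos(k*x).  Sample size
# and the values of k are arbitrary choices.  As k grows, positive and negative
# contributions nearly cancel: the exact value shrinks exponentially while the
# statistical error does not, so the estimate loses all significance.
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=1_000_000)

for k in (1.0, 3.0, 5.0, 7.0):
    samples = np.cos(k * x)
    exact = np.sqrt(2.0 * np.pi) * np.exp(-0.5 * k * k)
    estimate = np.sqrt(2.0 * np.pi) * samples.mean()
    error = np.sqrt(2.0 * np.pi) * samples.std() / np.sqrt(samples.size)
    print(f"k={k}: exact={exact:.2e}  estimate={estimate:+.2e}  stat. error={error:.1e}")
```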

The slave boson method is a technique for dealing with models of strongly correlated systems, providing a method to second-quantize valence fluctuations within a restrictive manifold of states. In the 1960s the physicist John Hubbard introduced an operator, now named the "Hubbard operator", to describe the creation of an electron within a restrictive manifold of valence configurations. Consider, for example, a rare-earth or actinide ion in which strong Coulomb interactions restrict the charge fluctuations to two valence states, such as the Ce⁴⁺ (4f⁰) and Ce³⁺ (4f¹) configurations of a mixed-valence cerium compound. The corresponding quantum states are the singlet state $|4f^0\rangle$ and the magnetic state $|4f^1\!:\sigma\rangle$, where $\sigma$ is the spin. The fermionic Hubbard operators that link these states are then $X_{\sigma 0} = |4f^1\!:\sigma\rangle\langle 4f^0|$ and $X_{0\sigma} = |4f^0\rangle\langle 4f^1\!:\sigma|$.

A polymer field theory is a statistical field theory describing the statistical behavior of a neutral or charged polymer system. It can be derived by transforming the partition function from its standard many-dimensional integral representation over the particle degrees of freedom into a functional integral representation over an auxiliary field function, using either the Hubbard–Stratonovich transformation or the delta-functional transformation. Computer simulations based on polymer field theories have been shown to deliver useful results, for example to calculate the structures and properties of polymer solutions, polymer melts and thermoplastics.

A field-theoretic simulation is a numerical strategy to calculate structure and physical properties of a many-particle system within the framework of a statistical field theory, such as a polymer field theory. A convenient possibility is to use Monte Carlo (MC) algorithms to sample the full partition function integral expressed in field-theoretic representation. The procedure is then called the auxiliary-field Monte Carlo method. However, it is well known that MC sampling in conjunction with the basic field-theoretic representation of the partition function integral, directly obtained via the Hubbard–Stratonovich transformation, is impracticable, due to the so-called numerical sign problem. The difficulty is related to the complex and oscillatory nature of the resulting distribution function, which causes poor statistical convergence of the ensemble averages of the desired structural and thermodynamic quantities. In such cases, special analytical and numerical techniques are required to accelerate the statistical convergence of the field-theoretic simulation.

David Ceperley

David Matthew Ceperley is a theoretical physicist in the physics department at the University of Illinois Urbana-Champaign (UIUC). He is a world expert in the area of quantum Monte Carlo computations, a method of calculation that is generally recognised to provide accurate quantitative results for many-body problems described by quantum mechanics.


The quantum jump method, also known as the Monte Carlo wave function (MCWF) method, is a technique in computational physics used for simulating open quantum systems and quantum dissipation. The quantum jump method was developed by Dalibard, Castin and Mølmer, around the same time as the similar method known as Quantum Trajectory Theory developed by Carmichael. Other contemporaneous works on wave-function-based Monte Carlo approaches to open quantum systems include those of Dum, Zoller and Ritsch, and of Hegerfeldt and Wilser.

Ali Alavi

Ali Alavi FRS is a professor of theoretical chemistry in the Department of Chemistry at the University of Cambridge and a Director of the Max Planck Institute for Solid State Research in Stuttgart.

In computational solid state physics, continuous-time quantum Monte Carlo (CT-QMC) is a family of stochastic algorithms for solving the Anderson impurity model at finite temperature. These methods first expand the full partition function as a series of Feynman diagrams, employ Wick's theorem to group diagrams into determinants, and finally use Markov chain Monte Carlo to stochastically sum up the resulting series.

In many-body physics, the problem of analytic continuation is that of numerically extracting the spectral density of a Green function given its values on the imaginary axis. It is a necessary post-processing step for calculating dynamical properties of physical systems from quantum Monte Carlo simulations, which often compute Green function values only at imaginary times or Matsubara frequencies.

Shang-keng Ma was a Chinese theoretical physicist, known for his work on the theory of critical phenomena and random systems. He is known as the co-author with Bertrand Halperin and Pierre Hohenberg of a 1972 paper that "generalized the renormalization group theory to dynamical critical phenomena." Ma is also known as the co-author with Yoseph Imry of a 1975 paper and with Amnon Aharony and Imry of a 1976 paper that established the foundation of the random field Ising model (RFIM).

References

Implementations