A field-theoretic simulation is a numerical strategy for calculating the structure and physical properties of a many-particle system within the framework of a statistical field theory, such as a polymer field theory. A convenient possibility is to use Monte Carlo (MC) algorithms to sample the full partition function integral expressed in field-theoretic representation; the procedure is then called the auxiliary field Monte Carlo method. However, it is well known that MC sampling in conjunction with the basic field-theoretic representation of the partition function integral, obtained directly via the Hubbard–Stratonovich transformation, is impracticable because of the so-called numerical sign problem (Baeurle 2002, Fredrickson 2002). The difficulty stems from the complex and oscillatory nature of the resulting distribution function, which causes poor statistical convergence of the ensemble averages of the desired structural and thermodynamic quantities. In such cases special analytical and numerical techniques are required to accelerate the statistical convergence of the field-theoretic simulation (Baeurle 2003, Baeurle 2003a, Baeurle 2004).
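A one-dimensional toy calculation illustrates the difficulty (a generic Python sketch; the integrand and parameters are illustrative and not taken from the cited works): when such an integral is sampled naively, each configuration carries a complex, oscillatory weight, and the near-cancellation of these weights makes the relative statistical error grow rapidly with the oscillation strength.

```python
import numpy as np

# Toy illustration of the sign problem (illustrative only, not the cited models):
# estimate Z(j) = <exp(i*j*x)> over x ~ N(0,1), whose exact value is exp(-j**2/2).
# The complex phases nearly cancel for large j, so the relative error explodes.
rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)

for j in (1.0, 3.0, 5.0):
    weights = np.exp(1j * j * x)                 # oscillatory complex weights
    estimate = weights.mean()
    exact = np.exp(-j**2 / 2)
    stderr = weights.std() / np.sqrt(weights.size)
    print(f"j={j}: estimate={estimate.real:.5f}  exact={exact:.6f}  "
          f"relative error ~ {stderr / exact:.1e}")
```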
To make the field-theoretic methodology amenable to computation, Baeurle proposed shifting the contour of integration of the partition function integral through the homogeneous mean field (MF) solution using Cauchy's integral theorem, which provides its so-called mean-field representation. This strategy had previously been employed successfully in field-theoretic electronic structure calculations (Rom 1997, Baer 1998). Baeurle demonstrated that this technique provides a significant acceleration of the statistical convergence of the ensemble averages in the MC sampling procedure (Baeurle 2002).
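The effect of such a contour shift can be seen in a minimal one-dimensional analogue (a sketch only; the Gaussian toy integral below stands in for the full field theory): for Z(j) given by the integral of exp(-x^2/2 + i j x), moving the integration path through the saddle point at x = i j removes the oscillating phase entirely, so the same Monte Carlo estimator becomes free of the sign problem.

```python
import numpy as np

# Toy illustration of the mean-field/saddle-point contour shift (sketch only).
# Z(j) = integral dx exp(-x**2/2 + 1j*j*x) has its saddle point at x = 1j*j.
# Naive sampling leaves the oscillatory factor exp(1j*j*x); after shifting the
# contour, x -> y + 1j*j, the integrand becomes the real exp(-y**2/2 - j**2/2).
rng = np.random.default_rng(1)
j = 4.0
y = rng.standard_normal(100_000)

naive = np.exp(1j * j * y).mean()                         # oscillatory weights, huge variance
shifted = (np.exp(-j**2 / 2) * np.ones_like(y)).mean()    # identical real weight per sample
print("exact  :", np.exp(-j**2 / 2))
print("naive  :", naive.real)
print("shifted:", shifted)
```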
In subsequent works, Baeurle et al. (Baeurle 2002, Baeurle 2002a) applied the concept of tadpole renormalization, which originates from quantum field theory and leads to the Gaussian equivalent representation of the partition function integral, in conjunction with advanced MC techniques in the grand canonical ensemble. They demonstrated convincingly that this strategy provides a further boost in the statistical convergence of the desired ensemble averages (Baeurle 2002).
Other promising field-theoretic simulation techniques have been developed more recently, but they either still lack a proof of correct statistical convergence, as in the case of the complex Langevin method (Ganesan 2001), or have yet to demonstrate their effectiveness on systems in which multiple saddle points are important (Moreira 2003).
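For orientation, the complex Langevin idea can be sketched on a Gaussian toy action (a one-variable analogue with action S(x) = x^2/2 - i j x, chosen for illustration and not taken from the cited polymer models): the variable is complexified and evolved with the drift -dS/dz plus real noise, and its long-time average reproduces the analytically continued expectation, here i j.

```python
import numpy as np

# Minimal complex Langevin sketch for the toy action S(x) = x**2/2 - 1j*j*x
# (a one-variable analogue chosen for illustration, not the cited models).
# The variable is complexified to z and evolved with drift -dS/dz plus real
# noise; the long-time average of z should approach the continuation <x> = 1j*j.
rng = np.random.default_rng(2)
j, dt = 2.0, 0.01
n_burn, n_sample = 10_000, 500_000

z = 0.0 + 0.0j
acc = 0.0 + 0.0j
for step in range(n_burn + n_sample):
    drift = -(z - 1j * j)                        # -dS/dz
    z += drift * dt + np.sqrt(2.0 * dt) * rng.standard_normal()
    if step >= n_burn:
        acc += z

print("complex Langevin <z>:", acc / n_sample)   # should be close to 0 + 2j
print("analytic value      :", 1j * j)
```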
Statistical mechanics, one of the pillars of modern physics, describes how macroscopic observations are related to microscopic parameters that fluctuate around an average. It connects thermodynamic quantities to microscopic behavior, whereas, in classical thermodynamics, the only available option would be to measure and tabulate such quantities for various materials.
Monte Carlo methods, or Monte Carlo experiments, are a broad class of computational algorithms that rely on repeated random sampling to obtain numerical results. The underlying concept is to use randomness to solve problems that might be deterministic in principle. They are often used in physical and mathematical problems and are most useful when it is difficult or impossible to use other approaches. Monte Carlo methods are mainly used in three problem classes: optimization, numerical integration, and generating draws from a probability distribution.
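A minimal example of Monte Carlo numerical integration (a generic textbook sketch, unrelated to any specific system discussed above) estimates pi from the fraction of uniformly random points falling inside the quarter disk:

```python
import numpy as np

# Generic Monte Carlo integration: estimate pi from the fraction of uniform
# random points in the unit square that land inside the quarter disk.
rng = np.random.default_rng(0)
n = 1_000_000
x, y = rng.random(n), rng.random(n)
pi_estimate = 4.0 * np.mean(x**2 + y**2 <= 1.0)
print(pi_estimate)      # approaches pi as n grows, with error ~ 1/sqrt(n)
```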
Computational physics is the study and implementation of numerical analysis to solve problems in physics for which a quantitative theory already exists. Historically, computational physics was the first application of modern computers in science, and is now a subset of computational science.
In statistics, Markov chain Monte Carlo (MCMC) methods comprise a class of algorithms for sampling from a probability distribution. By constructing a Markov chain that has the desired distribution as its equilibrium distribution, one can obtain a sample of the desired distribution by recording states from the chain. The more steps that are included, the more closely the distribution of the sample matches the actual desired distribution. Various algorithms exist for constructing chains, including the Metropolis–Hastings algorithm.
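A textbook random-walk Metropolis–Hastings sampler illustrates the construction (the target density, a standard normal, and the step size are chosen purely for illustration):

```python
import numpy as np

# Random-walk Metropolis-Hastings for a 1D target density (here an
# unnormalized standard normal, chosen only for illustration).
def log_target(x):
    return -0.5 * x**2

rng = np.random.default_rng(0)
x, step, n_samples = 0.0, 1.0, 50_000
chain = np.empty(n_samples)
for i in range(n_samples):
    proposal = x + step * rng.standard_normal()            # symmetric proposal
    if np.log(rng.random()) < log_target(proposal) - log_target(x):
        x = proposal                                       # accept; otherwise keep x
    chain[i] = x

print("sample mean:", chain.mean(), " sample variance:", chain.var())
```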
The particle-in-cell (PIC) method refers to a technique used to solve a certain class of partial differential equations. In this method, individual particles in a Lagrangian frame are tracked in continuous phase space, whereas moments of the distribution such as densities and currents are computed simultaneously on Eulerian (stationary) mesh points.
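The particle-to-grid step can be sketched in one dimension (a cloud-in-cell deposition with assumed periodic boundaries; the field solve and particle push that a full PIC code would also need are omitted):

```python
import numpy as np

# 1D cloud-in-cell deposition: Lagrangian particles -> Eulerian grid density.
# Periodic boundaries assumed; the field solve and particle push are omitted.
rng = np.random.default_rng(0)
n_cells, length, n_particles = 64, 1.0, 10_000
dx = length / n_cells
positions = rng.random(n_particles) * length

density = np.zeros(n_cells)
cell = np.floor(positions / dx).astype(int)
frac = positions / dx - cell                      # fractional offset within the cell
np.add.at(density, cell % n_cells, 1.0 - frac)    # weight to the left grid point
np.add.at(density, (cell + 1) % n_cells, frac)    # weight to the right grid point
density /= dx                                     # counts per cell -> number density
print(density.mean(), n_particles / length)       # the two should agree
```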
In physics, lattice gauge theory is the study of gauge theories on a spacetime that has been discretized into a lattice.
Lattice QCD is a well-established non-perturbative approach to solving the quantum chromodynamics (QCD) theory of quarks and gluons. It is a lattice gauge theory formulated on a grid or lattice of points in space and time. When the size of the lattice is taken infinitely large and its sites infinitesimally close to each other, continuum QCD is recovered.
Parallel tempering, also known as replica exchange MCMC sampling, is a simulation method aimed at improving the dynamic properties of Monte Carlo method simulations of physical systems, and of Markov chain Monte Carlo (MCMC) sampling methods more generally. The replica exchange method was originally devised by Swendsen and Wang, then extended by Geyer, and later developed further by, among others, Hukushima and Nemoto and Giorgio Parisi. Sugita and Okamoto formulated a molecular dynamics version of parallel tempering, usually known as replica-exchange molecular dynamics (REMD).
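A minimal sketch of the replica-exchange step follows (assuming each replica runs its own random-walk Metropolis sampler; the double-well energy and the temperature ladder are illustrative): after local updates, neighbouring temperatures attempt to swap configurations with the standard exchange acceptance probability.

```python
import numpy as np

# Minimal parallel tempering sketch for a 1D double-well energy E(x) = (x**2 - 1)**2.
# Each replica runs random-walk Metropolis at its own inverse temperature beta,
# and neighbouring replicas periodically attempt a configuration swap.
def energy(x):
    return (x**2 - 1.0)**2

rng = np.random.default_rng(0)
betas = np.array([0.2, 0.5, 1.0, 2.0, 5.0])      # replica inverse temperatures
xs = np.zeros(len(betas))                        # one configuration per replica

for sweep in range(20_000):
    # local Metropolis update in every replica
    for k, beta in enumerate(betas):
        prop = xs[k] + 0.5 * rng.standard_normal()
        if np.log(rng.random()) < -beta * (energy(prop) - energy(xs[k])):
            xs[k] = prop
    # attempt a swap between a random pair of neighbouring replicas
    k = rng.integers(len(betas) - 1)
    delta = (betas[k + 1] - betas[k]) * (energy(xs[k + 1]) - energy(xs[k]))
    if np.log(rng.random()) < delta:
        xs[k], xs[k + 1] = xs[k + 1], xs[k]

print("final replica configurations:", xs)
```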
Quantum Monte Carlo encompasses a large family of computational methods whose common aim is the study of complex quantum systems. One of the major goals of these approaches is to provide a reliable solution of the quantum many-body problem. The diverse flavors of quantum Monte Carlo approaches all share the common use of the Monte Carlo method to handle the multi-dimensional integrals that arise in the different formulations of the many-body problem. The quantum Monte Carlo methods allow for a direct treatment and description of complex many-body effects encoded in the wave function, going beyond mean-field theory and offering an exact solution of the many-body problem in some circumstances. In particular, there exist numerically exact and polynomially-scaling algorithms to exactly study static properties of boson systems without geometrical frustration. For fermions, there exist very good approximate methods for their static properties, as well as numerically exact but exponentially scaling quantum Monte Carlo algorithms, but no methods that are both numerically exact and polynomially scaling.
Auxiliary-field Monte Carlo is a method that allows the calculation, by use of Monte Carlo techniques, of averages of operators in many-body quantum mechanical or classical problems.
In computational physics, variational Monte Carlo (VMC) is a quantum Monte Carlo method that applies the variational method to approximate the ground state of a quantum system.
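The standard textbook illustration is the one-dimensional harmonic oscillator with a Gaussian trial wavefunction (a sketch under these assumptions, with hbar = m = omega = 1): Metropolis sampling of the squared trial wavefunction and averaging the local energy yields an upper bound on the ground-state energy, which is minimized at alpha = 0.5.

```python
import numpy as np

# Variational Monte Carlo for the 1D harmonic oscillator (hbar = m = omega = 1),
# with trial wavefunction psi(x) = exp(-alpha * x**2).  The local energy is
# E_L(x) = alpha + x**2 * (1/2 - 2*alpha**2); its average is minimal at alpha = 0.5.
def local_energy(x, alpha):
    return alpha + x**2 * (0.5 - 2.0 * alpha**2)

def vmc_energy(alpha, n_steps=200_000, step=1.0, seed=0):
    rng = np.random.default_rng(seed)
    x, energies = 0.0, []
    for _ in range(n_steps):
        prop = x + step * rng.standard_normal()
        # Metropolis acceptance for the density |psi(x)|**2 = exp(-2*alpha*x**2)
        if np.log(rng.random()) < -2.0 * alpha * (prop**2 - x**2):
            x = prop
        energies.append(local_energy(x, alpha))
    return np.mean(energies)

for alpha in (0.3, 0.5, 0.8):
    print(f"alpha={alpha}: <E> = {vmc_energy(alpha):.4f}")   # exact ground state: 0.5
```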
Path integral Monte Carlo (PIMC) is a quantum Monte Carlo method in the path integral formulation of quantum statistical mechanics.
The Wang and Landau algorithm, proposed by Fugao Wang and David P. Landau, is a Monte Carlo method designed to estimate the density of states of a system. The method performs a non-Markovian random walk that builds the density of states by quickly visiting the entire available energy spectrum. The Wang and Landau algorithm is an important method for obtaining the density of states required to perform a multicanonical simulation.
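A minimal Wang–Landau sketch follows for a toy system of N independent two-level units, whose exact density of states is a binomial coefficient (all parameters, including the flatness criterion, are illustrative): the log-density of states is raised at every visited energy, and the modification factor is reduced whenever the energy histogram is roughly flat.

```python
import numpy as np
from math import comb, log

# Wang-Landau sketch for N independent two-level units, where the energy E is
# the number of excited units and the exact density of states is C(N, E).
rng = np.random.default_rng(0)
N = 12
state = np.zeros(N, dtype=int)
E = 0
log_g = np.zeros(N + 1)          # running estimate of ln g(E)
hist = np.zeros(N + 1)
f = 1.0                          # ln of the modification factor

while f > 1e-5:
    i = rng.integers(N)
    E_new = E + (1 - 2 * state[i])              # flipping unit i changes E by +/-1
    if np.log(rng.random()) < log_g[E] - log_g[E_new]:
        state[i] ^= 1
        E = E_new
    log_g[E] += f
    hist[E] += 1
    if hist.min() > 0.8 * hist.mean():          # crude flatness check
        hist[:] = 0
        f /= 2.0

log_g -= log_g[0]                               # fix the arbitrary constant: g(0) = 1
print("estimated ln g(E):", np.round(log_g[:5], 2))
print("exact     ln g(E):", [round(log(comb(N, E)), 2) for E in range(5)])
```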
In applied mathematics, the numerical sign problem is the problem of numerically evaluating the integral of a highly oscillatory function of a large number of variables. Numerical methods fail because of the near-cancellation of the positive and negative contributions to the integral. Each has to be integrated to very high precision in order for their difference to be obtained with useful accuracy.
Equation of State Calculations by Fast Computing Machines is an article published by Nicholas Metropolis, Arianna W. Rosenbluth, Marshall N. Rosenbluth, Augusta H. Teller, and Edward Teller in the Journal of Chemical Physics in 1953. This paper proposed what became known as the Metropolis Monte Carlo algorithm, which forms the basis for Monte Carlo statistical mechanics simulations of atomic and molecular systems.
A polymer field theory is a statistical field theory describing the statistical behavior of a neutral or charged polymer system. It can be derived by transforming the partition function from its standard many-dimensional integral representation over the particle degrees of freedom into a functional integral representation over an auxiliary field function, using either the Hubbard–Stratonovich transformation or the delta-functional transformation. Computer simulations based on polymer field theories have been shown to deliver useful results, for example to calculate the structures and properties of polymer solutions, polymer melts and thermoplastics.
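Schematically, for a repulsive pair potential v, the Hubbard–Stratonovich step rests on the Gaussian functional identity below (generic notation, with beta the inverse temperature and \hat\rho the microscopic density; normalization constants are absorbed into the proportionality): the quadratic density-density term is traded for a linear, imaginary coupling of the density to a fluctuating auxiliary field w, which is also the origin of the oscillatory weights behind the sign problem discussed above.

\[
\exp\!\left(-\frac{\beta}{2}\int \mathrm{d}\mathbf{r}\,\mathrm{d}\mathbf{r}'\,\hat\rho(\mathbf{r})\,v(\mathbf{r}-\mathbf{r}')\,\hat\rho(\mathbf{r}')\right)
\;\propto\;
\int \mathcal{D}w\;
\exp\!\left(-\frac{1}{2\beta}\int \mathrm{d}\mathbf{r}\,\mathrm{d}\mathbf{r}'\,w(\mathbf{r})\,v^{-1}(\mathbf{r}-\mathbf{r}')\,w(\mathbf{r}')
+ i\int \mathrm{d}\mathbf{r}\,\hat\rho(\mathbf{r})\,w(\mathbf{r})\right)
\]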
Transition path sampling (TPS) is a rare-event sampling method used in computer simulations of rare events: physical or chemical transitions of a system from one stable state to another that occur too rarely to be observed on a computer timescale. Examples include protein folding, chemical reactions and nucleation. Standard simulation tools such as molecular dynamics can generate the dynamical trajectories of all the atoms in the system. However, because of the gap in accessible time-scales between simulation and reality, even present supercomputers might require years of simulations to show an event that occurs once per microsecond without some kind of acceleration.
David Matthew Ceperley is a theoretical physicist in the physics department at the University of Illinois Urbana-Champaign (UIUC). He is a world expert in the area of quantum Monte Carlo computations, a method of calculation that is generally recognised to provide accurate quantitative results for many-body problems described by quantum mechanics.
Path integral molecular dynamics (PIMD) is a method of incorporating quantum mechanics into molecular dynamics simulations using Feynman path integrals. In PIMD, one uses the Born–Oppenheimer approximation to separate the wavefunction into a nuclear part and an electronic part. The nuclei are treated quantum mechanically by mapping each quantum nucleus onto a classical system of several fictitious particles connected by springs governed by an effective Hamiltonian, which is derived from Feynman's path integral. The resulting classical system, although complex, can be solved relatively quickly. There are now a number of commonly used condensed matter computer simulation techniques that make use of the path integral formulation, including centroid molecular dynamics (CMD), ring polymer molecular dynamics (RPMD), and the Feynman–Kleinert quasi-classical Wigner (FK-QCW) method. The same techniques are also used in path integral Monte Carlo (PIMC).
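For a single distinguishable particle of mass m in a potential V, the primitive (bead-spring) discretization with P beads reads, schematically (conventions for distributing factors of P differ between implementations):

\[
Z \;\approx\; \left(\frac{mP}{2\pi\beta\hbar^{2}}\right)^{P/2}
\int \mathrm{d}q_{1}\cdots \mathrm{d}q_{P}\;
\exp\!\left\{-\beta\sum_{k=1}^{P}\left[\frac{mP}{2\beta^{2}\hbar^{2}}\,(q_{k}-q_{k+1})^{2}+\frac{V(q_{k})}{P}\right]\right\},
\qquad q_{P+1}\equiv q_{1},
\]

so the quantum particle is represented by a classical ring of P beads joined by harmonic springs, which is what the molecular dynamics then propagates.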
In statistics and physics, multicanonical ensemble is a Markov chain Monte Carlo sampling technique that uses the Metropolis–Hastings algorithm to compute integrals where the integrand has a rough landscape with multiple local minima. It samples states according to the inverse of the density of states, which has to be known a priori or be computed using other techniques like the Wang and Landau algorithm. Multicanonical sampling is an important technique for spin systems like the Ising model or spin glasses.
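In symbols (generic notation; g(E) denotes the density of states and x_i the sampled configurations): states are visited with the multicanonical weight, and canonical averages at inverse temperature beta are then recovered by reweighting.

\[
w_{\mathrm{muca}}(E)\;\propto\;\frac{1}{g(E)},
\qquad
\langle O\rangle_{\beta}\;=\;\frac{\sum_{i} O(x_{i})\,g(E_{i})\,e^{-\beta E_{i}}}{\sum_{i} g(E_{i})\,e^{-\beta E_{i}}} .
\]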