Stochastic tunneling

In numerical analysis, stochastic tunneling (STUN) is an approach to global optimization based on Monte Carlo sampling of the objective function to be minimized, in which the function is nonlinearly transformed to allow for easier tunneling among regions containing function minima. Easier tunneling allows for faster exploration of the sample space and faster convergence to a good solution.

Idea

Figure: schematic one-dimensional test function (black) and STUN effective potential (red and blue), where the minimum indicated by the arrows is the best minimum found so far. All wells that lie above the best minimum found are suppressed, so if the dynamical process can escape the well around the current minimum estimate, it will not be trapped by higher-lying local minima. Wells with deeper minima are enhanced, which accelerates the dynamics.

Monte Carlo method-based optimization techniques sample the objective function by randomly "hopping" from the current solution vector to another with a difference in the function value of $\Delta f$. The acceptance probability of such a trial jump is in most cases chosen to be $\min\left(1, e^{-\beta \Delta f}\right)$ (Metropolis criterion) with an appropriate parameter $\beta$.
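As a concrete illustration, the Metropolis criterion can be written in a few lines of Python; the function name and the use of the standard random module are illustrative choices, not part of any particular STUN implementation:

```python
import math
import random

def metropolis_accept(delta_f: float, beta: float) -> bool:
    """Accept a trial hop with probability min(1, exp(-beta * delta_f))."""
    # Downhill moves (delta_f <= 0) are always accepted; uphill moves
    # are accepted with Boltzmann-like probability exp(-beta * delta_f).
    return delta_f <= 0 or random.random() < math.exp(-beta * delta_f)
```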

The general idea of STUN is to circumvent the slow dynamics of ill-shaped energy functions, encountered for example in spin glasses, by tunneling through the barriers that separate their minima.

This goal is achieved by Monte Carlo sampling of a transformed function that lacks this slow dynamics. In the "standard form" the transformation reads $f_{STUN}(x) := 1 - e^{-\gamma \left(f(x) - f_0\right)}$, where $f_0$ is the lowest function value found so far and $\gamma > 0$ is the tunneling parameter. This transformation preserves the loci of the minima.
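A minimal sketch of this transformation in Python (the function and argument names are our own, and gamma is assumed positive):

```python
import math

def f_stun(f_x: float, f_0: float, gamma: float) -> float:
    """Standard-form STUN transform: 1 - exp(-gamma * (f(x) - f_0)).

    f_0 is the lowest objective value found so far. The transform is
    monotone in f(x), so every minimum keeps its location, but all
    wells above f_0 are compressed into the band (0, 1], flattening
    the barriers between them.
    """
    return 1.0 - math.exp(-gamma * (f_x - f_0))
```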

$f_{STUN}$ is then used in place of $f$ in the original algorithm, giving a new acceptance probability of $\min\left(1, e^{-\beta \Delta f_{STUN}}\right)$.

The effect of such a transformation is shown in the graph.
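Putting the pieces together, the following is a minimal, hypothetical STUN sampler for a one-dimensional objective; the Gaussian proposal, the parameter defaults, and the function names are illustrative assumptions rather than a canonical implementation:

```python
import math
import random

def stun_minimize(f, x0, beta=1.0, gamma=1.0, step=0.1, n_steps=100_000):
    """Minimal one-dimensional STUN sampler (illustrative, untuned)."""
    x, fx = x0, f(x0)
    x_best, f_best = x, fx  # lowest value found so far defines f_0

    def stun(v):
        # Transformed objective; reads the current best value as f_0.
        return 1.0 - math.exp(-gamma * (v - f_best))

    for _ in range(n_steps):
        x_new = x + random.gauss(0.0, step)  # random trial hop
        f_new = f(x_new)
        if f_new < f_best:                   # update f_0 on improvement
            x_best, f_best = x_new, f_new
        # Metropolis test on the transformed function instead of f itself.
        delta = stun(f_new) - stun(fx)
        if delta <= 0 or random.random() < math.exp(-beta * delta):
            x, fx = x_new, f_new
    return x_best, f_best
```

For example, stun_minimize(lambda x: (x**2 - 1)**2 + 0.2 * x, x0=5.0) should typically return a point near the deeper of the two wells at x ≈ -1, since hops across the barrier at x = 0 are cheap under the transformed function.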

Dynamically adaptive stochastic tunneling

A variation on always tunneling is to do so only when trapped at a local minimum. The tunneling parameter $\gamma$ is then adjusted to tunnel out of the minimum and pursue a more globally optimal solution. Detrended fluctuation analysis is the recommended way of determining whether the search is trapped at a local minimum.
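The adjustment step might look like the following sketch; the no-improvement counter is a crude stand-in for the recommended detrended fluctuation analysis, included only to show where the $\gamma$ update would hook into the sampling loop:

```python
def adapt_gamma(gamma, steps_since_improvement, patience=1_000,
                factor=2.0, gamma_max=1e3):
    """Hypothetical helper: enlarge gamma when the search seems stuck.

    If the best value has not improved for `patience` steps, assume the
    walker is trapped in a local minimum and increase gamma so the
    surrounding well is flattened more aggressively, making escape
    (tunneling) easier. A proper trap test would use detrended
    fluctuation analysis; this counter is only a placeholder.
    """
    if steps_since_improvement >= patience:
        return min(gamma * factor, gamma_max)
    return gamma
```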

Other approaches

Related global optimization techniques include simulated annealing, parallel tempering, genetic algorithms, and differential evolution.

