Interatomic potentials are mathematical functions to calculate the potential energy of a system of atoms with given positions in space. [1] [2] [3] [4] Interatomic potentials are widely used as the physical basis of molecular mechanics and molecular dynamics simulations in computational chemistry, computational physics and computational materials science to explain and predict materials properties. Examples of quantitative properties and qualitative phenomena that are explored with interatomic potentials include lattice parameters, surface energies, interfacial energies, adsorption, cohesion, thermal expansion, and elastic and plastic material behavior, as well as chemical reactions. [5] [6] [7] [8] [9] [10] [11]
Interatomic potentials can be written as a series expansion of functional terms that depend on the position of one, two, three, etc. atoms at a time. Then the total potential of the system can be written as [3]

$$V_\mathrm{TOT} = \sum_{i=1}^{N} V_1(\vec{r}_i) + \sum_{i,j=1}^{N} V_2(\vec{r}_i, \vec{r}_j) + \sum_{i,j,k=1}^{N} V_3(\vec{r}_i, \vec{r}_j, \vec{r}_k) + \cdots$$

Here $V_1$ is the one-body term, $V_2$ the two-body term, $V_3$ the three-body term, $N$ the number of atoms in the system, $\vec{r}_i$ the position of atom $i$, etc. $i$, $j$ and $k$ are indices that loop over atom positions.
Note that in case the pair potential is given per atom pair, in the two-body term the potential should be multiplied by 1/2 as otherwise each bond is counted twice, and similarly the three-body term by 1/6. [3] Alternatively, the summation of the pair term can be restricted to the cases $i < j$ and similarly for the three-body term to $i < j < k$, if the potential form is such that it is symmetric with respect to exchange of the $i$ and $j$ indices (this may not be the case for potentials for multielemental systems).
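As a quick numerical illustration (not part of the original formulation), the following Python sketch uses an arbitrary placeholder pair function to confirm that the two conventions, the 1/2-weighted sum over all ordered pairs and the restricted sum over $i < j$, give the same total energy:

```python
import numpy as np

# Hypothetical pair term, for demonstration only.
def V2(r):
    return 1.0 / r**2

positions = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.5, 0.0]])
N = len(positions)

# Convention 1: sum over all ordered pairs i != j, multiplied by 1/2.
E_full = 0.5 * sum(V2(np.linalg.norm(positions[i] - positions[j]))
                   for i in range(N) for j in range(N) if i != j)

# Convention 2: restrict the sum to i < j, so each bond is counted once.
E_half = sum(V2(np.linalg.norm(positions[i] - positions[j]))
             for i in range(N) for j in range(i + 1, N))

assert np.isclose(E_full, E_half)
```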
The one-body term $V_1$ is only meaningful if the atoms are in an external field (e.g. an electric field). In the absence of external fields, the potential $V$ should not depend on the absolute position of atoms, but only on the relative positions. This means that the functional form can be rewritten as a function of interatomic distances $r_{ij} = |\vec{r}_i - \vec{r}_j|$ and angles between the bonds (vectors to neighbours) $\theta_{ijk}$. Then, in the absence of external forces, the general form becomes

$$V_\mathrm{TOT} = \sum_{i,j=1}^{N} V_2(r_{ij}) + \sum_{i,j,k=1}^{N} V_3(r_{ij}, r_{ik}, \theta_{ijk}) + \cdots$$
In the three-body term $V_3$ the interatomic distance $r_{jk}$ is not needed since the three terms $r_{ij}$, $r_{ik}$, $\theta_{ijk}$ are sufficient to give the relative positions of three atoms $i$, $j$, $k$ in three-dimensional space. Any terms of order higher than 2 are also called many-body potentials. In some interatomic potentials the many-body interactions are embedded into the terms of a pair potential (see discussion on EAM-like and bond order potentials below).
In principle the sums in the expressions run over all atoms. However, if the range of the interatomic potential is finite, i.e. the potentials $V(r) \to 0$ above some cutoff distance $r_\mathrm{cut}$, the summing can be restricted to atoms within the cutoff distance of each other. By also using a cellular method for finding the neighbours, [1] the MD algorithm can be an O(N) algorithm. Potentials with an infinite range can be summed up efficiently by Ewald summation and its further developments.
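A minimal sketch of such a cellular (linked-cell) neighbour search is shown below, assuming an orthorhombic periodic box; the function name and toy data are illustrative only:

```python
import numpy as np
from collections import defaultdict
from itertools import product

def neighbour_pairs(positions, box, r_cut):
    """Return pairs (i, j), i < j, closer than r_cut in a periodic orthorhombic
    box, using a linked-cell search so the cost scales as O(N), not O(N^2)."""
    n_cells = np.maximum((box / r_cut).astype(int), 1)  # cells at least r_cut wide
    cell_size = box / n_cells
    cells = defaultdict(list)
    for i, r in enumerate(positions):
        idx = tuple(np.floor(r / cell_size).astype(int) % n_cells)
        cells[idx].append(i)
    pairs = set()
    for idx, members in cells.items():
        # Only this cell and its 26 neighbouring cells can hold atoms within r_cut.
        for offset in product((-1, 0, 1), repeat=3):
            neigh = tuple((np.array(idx) + offset) % n_cells)
            for i in members:
                for j in cells.get(neigh, ()):
                    if i < j:
                        d = positions[i] - positions[j]
                        d -= box * np.round(d / box)  # minimum-image convention
                        if d @ d < r_cut**2:
                            pairs.add((i, j))
    return sorted(pairs)

rng = np.random.default_rng(0)
box = np.array([10.0, 10.0, 10.0])
pos = rng.uniform(0, box, size=(500, 3))
print(len(neighbour_pairs(pos, box, r_cut=2.5)))
```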
The forces acting between atoms can be obtained by differentiation of the total energy with respect to atom positions. That is, to get the force on atom $i$ one should take the three-dimensional derivative (gradient) of the potential $V_\mathrm{TOT}$ with respect to the position of atom $i$:

$$\vec{F}_i = -\nabla_{\vec{r}_i} V_\mathrm{TOT}$$
For two-body potentials this gradient reduces, thanks to the symmetry with respect to $ij$ in the potential form, to straightforward differentiation with respect to the interatomic distances $r_{ij}$. However, for many-body potentials (three-body, four-body, etc.) the differentiation becomes considerably more complex [12] [13] since the potential may no longer be symmetric with respect to $ij$ exchange. In other words, also the energy of atoms that are not direct neighbours of $i$ can depend on the position $\vec{r}_i$ because of angular and other many-body terms, and hence contribute to the gradient $\nabla_{\vec{r}_i} V_\mathrm{TOT}$.
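The following sketch illustrates this for the two-body case, using an arbitrary smooth pair function (not a fitted potential) and checking the analytic force against a numerical gradient of the energy:

```python
import numpy as np

# Arbitrary smooth pair function, for demonstration only.
def V(r):
    return np.exp(-r) / r

def dVdr(r):
    # Analytic derivative dV/dr of the toy pair function above.
    return -np.exp(-r) * (1.0 / r + 1.0 / r**2)

ri, rj = np.array([0.0, 0.0, 0.0]), np.array([1.3, 0.4, -0.2])
d = ri - rj
r = np.linalg.norm(d)
F_analytic = -dVdr(r) * d / r  # F_i = -dV/dr * (r_i - r_j) / r_ij

# Central-difference gradient of the energy with respect to r_i.
h, F_numeric = 1e-6, np.zeros(3)
for k in range(3):
    e = np.zeros(3); e[k] = h
    F_numeric[k] = -(V(np.linalg.norm(ri + e - rj)) -
                     V(np.linalg.norm(ri - e - rj))) / (2 * h)

assert np.allclose(F_analytic, F_numeric, atol=1e-8)
```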
Interatomic potentials come in many different varieties, with different physical motivations. Even for single well-known elements such as silicon, a wide variety of potentials quite different in functional form and motivation have been developed. [14] The true interatomic interactions are quantum mechanical in nature, and there is no known way in which the true interactions described by the Schrödinger equation or Dirac equation for all electrons and nuclei could be cast into an analytical functional form. Hence all analytical interatomic potentials are by necessity approximations.
Over time interatomic potentials have largely grown more complex and more accurate, although this trend has not been strictly monotonic. [15] This has included both improved descriptions of the underlying physics and added parameters. Until recently, all interatomic potentials could be described as "parametric", having been developed and optimized with a fixed number of (physical) terms and parameters. New research focuses instead on non-parametric potentials, which can be systematically improved by using complex local atomic neighbour descriptors and separate mappings to predict system properties, such that the total number of terms and parameters is flexible. [16] These non-parametric models can be significantly more accurate, but since they are not tied to physical forms and parameters, there are many potential issues surrounding extrapolation and uncertainties.
The arguably simplest widely used interatomic interaction model is the Lennard-Jones potential [17] [18] [19]

$$V_\mathrm{LJ}(r) = 4\varepsilon \left[ \left(\frac{\sigma}{r}\right)^{12} - \left(\frac{\sigma}{r}\right)^{6} \right]$$

where $\varepsilon$ is the depth of the potential well and $\sigma$ is the distance at which the potential crosses zero. The attractive term proportional to $1/r^6$ in the potential comes from the scaling of van der Waals forces, while the repulsive $1/r^{12}$ term is much more approximate (conveniently the square of the attractive term). [6] On its own, this potential is quantitatively accurate only for noble gases and has been extensively studied in the past decades, [20] but is also widely used for qualitative studies and in systems where dipole interactions are significant, particularly in chemistry force fields to describe intermolecular interactions, especially in fluids. [21]
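A short sketch of the Lennard-Jones form, using rough argon-like parameter values (assumed here for illustration; verify against tabulated data before use), confirms the stated meaning of $\varepsilon$ and $\sigma$:

```python
import numpy as np

def lj(r, eps, sigma):
    """Lennard-Jones pair potential."""
    sr6 = (sigma / r)**6
    return 4 * eps * (sr6**2 - sr6)

eps, sigma = 0.0103, 3.40  # rough argon-like values (eV, Angstrom); assumed
r = np.linspace(0.9 * sigma, 3 * sigma, 10001)
v = lj(r, eps, sigma)

r_min = r[np.argmin(v)]
assert np.isclose(r_min, 2**(1 / 6) * sigma, rtol=1e-3)  # minimum at 2^(1/6) sigma
assert np.isclose(v.min(), -eps, rtol=1e-3)              # well depth is eps
assert abs(lj(sigma, eps, sigma)) < 1e-12                # crosses zero at r = sigma
```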
Another simple and widely used pair potential is the Morse potential, which consists simply of a sum of two exponentials:

$$V_M(r) = D_e \left[ e^{-2a(r - r_e)} - 2 e^{-a(r - r_e)} \right]$$

Here $D_e$ is the equilibrium bond energy and $r_e$ the bond distance. The Morse potential has been applied to studies of molecular vibrations and solids, [22] and also inspired the functional form of more accurate potentials such as the bond-order potentials.
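The following minimal sketch, with hypothetical parameter values, verifies that the Morse form reaches its minimum value of $-D_e$ at $r = r_e$:

```python
import numpy as np

def morse(r, D_e, a, r_e):
    """Morse pair potential: a sum of two exponentials."""
    return D_e * (np.exp(-2 * a * (r - r_e)) - 2 * np.exp(-a * (r - r_e)))

# Hypothetical parameters: well depth D_e, width a, bond length r_e.
D_e, a, r_e = 1.0, 1.5, 2.0
assert np.isclose(morse(r_e, D_e, a, r_e), -D_e)  # minimum value -D_e at r = r_e
```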
Ionic materials are often described by a sum of a short-range repulsive term, such as the Buckingham pair potential, and a long-range Coulomb potential describing the interactions between the ions forming the material. The short-range term for ionic materials can also be of many-body character. [23]
Pair potentials have some inherent limitations, such as the inability to describe all three elastic constants of cubic metals or to correctly describe both the cohesive energy and the vacancy formation energy. [7] Therefore, quantitative molecular dynamics simulations are carried out with a variety of many-body potentials.
For very short interatomic separations, important in radiation materials science, the interactions can be described quite accurately with screened Coulomb potentials, which have the general form

$$V(r_{ij}) = \frac{1}{4\pi\varepsilon_0} \frac{Z_1 Z_2 e^2}{r_{ij}} \varphi(r_{ij}/a)$$

Here, $\varphi(r) \to 1$ when $r \to 0$. $Z_1$ and $Z_2$ are the charges of the interacting nuclei, and $a$ is the so-called screening parameter. A widely used screening function is the "Universal ZBL" one, [24] and more accurate ones can be obtained from all-electron quantum chemistry calculations. [25] In binary collision approximation simulations this kind of potential can be used to describe the nuclear stopping power.
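A sketch of the "Universal ZBL" screening function follows; the coefficients below are the standard published ZBL values quoted from memory, so they should be checked against the original reference [24] before serious use:

```python
import numpy as np

A0 = 0.529177   # Bohr radius (Angstrom)
E2 = 14.399645  # e^2 / (4 pi eps0) in eV*Angstrom

def zbl(r, z1, z2):
    """Screened Coulomb repulsion with the ZBL universal screening function.
    Energy in eV for r in Angstrom; coefficients quoted from memory."""
    a_u = 0.8854 * A0 / (z1**0.23 + z2**0.23)  # universal screening length
    x = r / a_u
    phi = (0.18175 * np.exp(-3.19980 * x) + 0.50986 * np.exp(-0.94229 * x)
           + 0.28022 * np.exp(-0.40290 * x) + 0.02817 * np.exp(-0.20162 * x))
    return z1 * z2 * E2 / r * phi

# phi -> 1 at small r, recovering the bare Coulomb repulsion between nuclei.
assert abs(zbl(1e-6, 14, 14) * 1e-6 / (14 * 14 * E2) - 1.0) < 1e-3
```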
The Stillinger-Weber potential [26] is a potential that has two-body and three-body terms of the standard form

$$V_\mathrm{TOT} = \sum_{i,j} V_2(r_{ij}) + \sum_{i,j,k} V_3(r_{ij}, r_{ik}, \theta_{ijk})$$

where the three-body term describes how the potential energy changes with bond bending. It was originally developed for pure Si, but has been extended to many other elements and compounds [27] [28] and also formed the basis for other Si potentials. [29] [30]
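As an illustration of the bond-bending idea, the sketch below implements only the angular factor of a Stillinger-Weber-style three-body term; the strength parameter is the value commonly quoted for Si but should be treated as illustrative:

```python
import numpy as np

def h_angle(theta, lam=21.0):
    """Angular part of a Stillinger-Weber-style three-body term. The
    (cos(theta) + 1/3)^2 factor vanishes at the tetrahedral angle
    (~109.47 degrees), which favours diamond-like bonding in Si."""
    return lam * (np.cos(theta) + 1.0 / 3.0)**2

theta_tet = np.arccos(-1.0 / 3.0)           # ~109.47 degrees
assert np.isclose(h_angle(theta_tet), 0.0)  # no penalty at the ideal angle
assert h_angle(np.pi / 2) > 0               # bent bonds are penalized
```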
Metals are very commonly described with what can be called "EAM-like" potentials, i.e. potentials that share the same functional form as the embedded atom model. In these potentials, the total potential energy is written as

$$V_\mathrm{TOT} = \sum_i F_i\left( \sum_{j \ne i} \rho(r_{ij}) \right) + \frac{1}{2} \sum_{i,j} V_2(r_{ij})$$

where $F_i$ is a so-called embedding function (not to be confused with the force $\vec{F}_i$) that is a function of the sum of the so-called electron density $\rho(r_{ij})$. $V_2$ is a pair potential that usually is purely repulsive. In the original formulation [31] [32] the electron density function $\rho$ was obtained from true atomic electron densities, and the embedding function was motivated from density-functional theory as the energy needed to 'embed' an atom into the electron density. [33] However, many other potentials used for metals share the same functional form but motivate the terms differently, e.g. based on tight-binding theory [34] [35] [36] or other motivations. [37] [38] [39]
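A schematic implementation of this EAM-like energy expression, with toy placeholder functions rather than a fitted metal model, might look as follows (the square-root embedding echoes Finnis-Sinclair-type models):

```python
import numpy as np

def eam_energy(positions, F, rho, pair, r_cut):
    """Total energy in an EAM-like form: an embedding term F of the summed
    'electron density' rho at each atom, plus a 1/2-weighted pair sum.
    F, rho and pair are user-supplied functions (placeholders here)."""
    N = len(positions)
    E = 0.0
    for i in range(N):
        density = 0.0
        for j in range(N):
            if i == j:
                continue
            r = np.linalg.norm(positions[i] - positions[j])
            if r < r_cut:
                density += rho(r)
                E += 0.5 * pair(r)  # each bond counted twice, hence 1/2
        E += F(density)
    return E

# Hypothetical toy functions, for illustration only (not a fitted metal model):
F = lambda d: -np.sqrt(d)            # square-root embedding, Finnis-Sinclair style
rho = lambda r: np.exp(-r)           # decaying 'electron density' contribution
pair = lambda r: np.exp(-2 * r) / r  # purely repulsive pair term

pos = np.array([[0, 0, 0], [2.5, 0, 0], [0, 2.5, 0]], dtype=float)
print(eam_energy(pos, F, rho, pair, r_cut=6.0))
```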
EAM-like potentials are usually implemented as numerical tables. A collection of tables is available at the interatomic potential repository at NIST.
Covalently bonded materials are often described by bond order potentials, sometimes also called Tersoff-like or Brenner-like potentials. [10] [40] [41]
These have in general a form that resembles a pair potential:

$$V_{ij}(r_{ij}) = V_\mathrm{repulsive}(r_{ij}) + b_{ijk} V_\mathrm{attractive}(r_{ij})$$

where the repulsive and attractive parts are simple exponential functions similar to those in the Morse potential. However, their strength is modified by the environment of the atom $i$ via the $b_{ijk}$ term. If implemented without an explicit angular dependence, these potentials can be shown to be mathematically equivalent to some varieties of EAM-like potentials. [42] [43] Thanks to this equivalence, the bond-order potential formalism has been implemented also for many metal-covalent mixed materials. [43] [44] [45] [46]
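The following toy sketch illustrates the bond-order idea with hypothetical numbers: the same pair distance yields a weaker bond when the atom is more highly coordinated:

```python
import numpy as np

def bond_energy(r, n_neighbors, A=1500.0, B=300.0, lam1=3.5, lam2=2.0):
    """Schematic Tersoff/Brenner-style pair form: Morse-like exponentials whose
    attractive part is scaled by a bond-order factor b_ij that weakens the bond
    as the coordination grows. All numbers are hypothetical."""
    b_ij = (1.0 + n_neighbors)**-0.5  # toy bond order: more neighbours, weaker bond
    return A * np.exp(-lam1 * r) - b_ij * B * np.exp(-lam2 * r)

# The same pair distance gives a weaker (less negative) bond in a
# high-coordination environment:
assert bond_energy(2.0, n_neighbors=12) > bond_energy(2.0, n_neighbors=3)
```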
EAM potentials have also been extended to describe covalent bonding by adding angular-dependent terms to the electron density function $\rho$, in what is called the modified embedded atom method (MEAM). [47] [48] [49]
A force field is the collection of parameters to describe the physical interactions between atoms or physical units (up to ~10⁸) using a given energy expression. The term force field characterizes the collection of parameters for a given interatomic potential (energy function) and is often used within the computational chemistry community. [50] The force field parameters make the difference between good and poor models. Force fields are used for the simulation of metals, ceramics, molecules, chemistry, and biological systems, covering the entire periodic table and multiphase materials. Today's performance is among the best for solid-state materials, [51] [52] molecular fluids, [21] and for biomacromolecules, [53] whereby biomacromolecules were the primary focus of force fields from the 1970s to the early 2000s. Force fields range from relatively simple and interpretable fixed-bond models (e.g. Interface force field, [50] CHARMM, [54] and COMPASS) to explicitly reactive models with many adjustable fit parameters (e.g. ReaxFF) and machine learning models.
It should first be noted that non-parametric potentials are often referred to as "machine learning" potentials. While the descriptor and mapping forms of non-parametric models are closely related to machine learning in general, and their complex nature makes machine-learning fitting optimizations almost necessary, the distinction is important in that parametric models can also be optimized using machine learning.
Current research in interatomic potentials involves using systematically improvable, non-parametric mathematical forms and increasingly complex machine learning methods. The total energy is then written as

$$V_\mathrm{TOT} = \sum_{i=1}^{N} E(\mathbf{q}_i)$$

where $\mathbf{q}_i$ is a mathematical representation of the atomic environment surrounding the atom $i$, known as the descriptor. [55] $E$ is a machine-learning model that provides a prediction for the energy of atom $i$ based on the descriptor output. An accurate machine-learning potential requires both a robust descriptor and a suitable machine learning framework. The simplest descriptor is the set of interatomic distances from atom $i$ to its neighbours, yielding a machine-learned pair potential. However, more complex many-body descriptors are needed to produce highly accurate potentials. [55] It is also possible to use a linear combination of multiple descriptors with associated machine-learning models. [56] Potentials have been constructed using a variety of machine-learning methods, descriptors, and mappings, including neural networks, [57] Gaussian process regression, [58] [59] and linear regression. [60] [16]
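The sketch below shows this pipeline in its simplest form, a machine-learned pair potential: a histogram-of-distances descriptor and a linear model fit to synthetic total energies (all data and bin choices are placeholders):

```python
import numpy as np

rng = np.random.default_rng(0)
edges = np.linspace(0.8, 3.0, 12)  # radial bins of the descriptor (arbitrary)

def config_descriptor(positions):
    # Sum of per-atom descriptors; for a linear model E_tot = w . sum_i q_i.
    q = np.zeros(len(edges) - 1)
    for i in range(len(positions)):
        r = np.linalg.norm(positions - positions[i], axis=1)
        q += np.histogram(r[r > 0], bins=edges)[0]
    return q

def ref_energy(positions):  # stand-in for quantum-level training data
    E = 0.0
    for i in range(len(positions)):
        for j in range(i + 1, len(positions)):
            r = np.linalg.norm(positions[i] - positions[j])
            E += 4 * (r**-12 - r**-6)  # hidden 'true' pair law
    return E

# Synthetic training set: a perturbed cube of 8 atoms.
base = 1.3 * np.array([[x, y, z] for x in (0, 1) for y in (0, 1) for z in (0, 1)],
                      dtype=float)
configs = [base + rng.normal(scale=0.08, size=base.shape) for _ in range(200)]
X = np.array([config_descriptor(p) for p in configs])
y = np.array([ref_energy(p) for p in configs])

w = np.linalg.lstsq(X, y, rcond=None)[0]  # fit the linear model
print("train RMSE:", np.sqrt(np.mean((X @ w - y)**2)))
```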
A non-parametric potential is most often trained to total energies, forces, and/or stresses obtained from quantum-level calculations, such as density functional theory, as with most modern potentials. However, the accuracy of a machine-learning potential can be converged to be comparable with the underlying quantum calculations, unlike analytical models. Hence, they are in general more accurate than traditional analytical potentials, but they are correspondingly less able to extrapolate. Further, owing to the complexity of the machine-learning model and the descriptors, they are computationally far more expensive than their analytical counterparts.
Non-parametric, machine learned potentials may also be combined with parametric, analytical potentials, for example to include known physics such as the screened Coulomb repulsion, [61] or to impose physical constraints on the predictions. [62]
Since the interatomic potentials are approximations, they by necessity all involve parameters that need to be adjusted to some reference values. In simple potentials such as the Lennard-Jones and Morse ones, the parameters are interpretable and can be set to match e.g. the equilibrium bond length and bond strength of a dimer molecule or the surface energy of a solid. [63] [64] The Lennard-Jones potential can typically describe the lattice parameters, surface energies, and approximate mechanical properties. [65] Many-body potentials often contain tens or even hundreds of adjustable parameters with limited interpretability and no compatibility with common interatomic potentials for bonded molecules. Such parameter sets can be fit to a larger set of experimental data, or to materials properties derived from less reliable data such as density-functional theory. [66] [67] For solids, a many-body potential can often describe the lattice constant of the equilibrium crystal structure, the cohesive energy, and linear elastic constants, as well as basic point defect properties of all the elements and stable compounds well, although deviations in surface energies often exceed 50%. [30] [43] [45] [46] [65] [50] [68] [69] [70] Non-parametric potentials in turn contain hundreds or even thousands of independent parameters to fit. For any but the simplest model forms, sophisticated optimization and machine learning methods are necessary for useful potentials.
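As a minimal fitting example, the sketch below adjusts Morse parameters to noisy synthetic dimer energies by nonlinear least squares (in practice the reference data would come from experiment or density-functional theory):

```python
import numpy as np
from scipy.optimize import curve_fit

def morse(r, D_e, a, r_e):
    return D_e * (np.exp(-2 * a * (r - r_e)) - 2 * np.exp(-a * (r - r_e)))

# Synthetic 'reference' dimer energy curve with small noise; the true
# parameters (0.8, 1.7, 2.2) are hypothetical.
r_ref = np.linspace(1.5, 4.0, 25)
E_ref = morse(r_ref, 0.8, 1.7, 2.2) \
        + np.random.default_rng(1).normal(0, 0.005, r_ref.size)

popt, _ = curve_fit(morse, r_ref, E_ref, p0=(1.0, 1.0, 2.0))
print("fitted D_e, a, r_e:", popt)  # should recover roughly (0.8, 1.7, 2.2)
```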
The aim of most potential functions and fitting is to make the potential transferable, i.e. that it can describe materials properties that are clearly different from those it was fitted to (for examples of potentials explicitly aiming for this, see e.g. [71] [72] [73] [74] [75] ). Key aspects here are the correct representation of chemical bonding, validation of structures and energies, as well as interpretability of all parameters. [51] Full transferability and interpretability is reached with the Interface force field (IFF). [50] As an example of partial transferability, a review of interatomic potentials of Si describes that Stillinger-Weber and Tersoff III potentials for Si can describe several (but not all) materials properties they were not fitted to. [14]
The NIST interatomic potential repository provides a collection of fitted interatomic potentials, either as fitted parameter values or numerical tables of the potential functions. [76] The OpenKIM [77] project also provides a repository of fitted potentials, along with collections of validation tests and a software framework for promoting reproducibility in molecular simulations using interatomic potentials.
Since the 1990s, machine learning programs have been employed to construct interatomic potentials, mapping atomic structures to their potential energies. These are generally referred to as 'machine learning potentials' (MLPs) [78] or as 'machine-learned interatomic potentials' (MLIPs). [79] Such machine learning potentials help fill the gap between highly accurate but computationally intensive simulations like density functional theory and computationally lighter, but much less precise, empirical potentials. Early neural networks showed promise, but their inability to systematically account for interatomic energy interactions limited their applications to smaller, low-dimensional systems, keeping them largely within the confines of academia. However, with continuous advancements in artificial intelligence technology, machine learning methods have become significantly more accurate, positioning machine learning as a significant player in potential fitting. [80] [81] [82]
Modern neural networks have revolutionized the construction of highly accurate and computationally light potentials by integrating theoretical understanding of materials science into their architectures and preprocessing. Almost all are local, accounting for all interactions between an atom and its neighbors up to some cutoff radius. These neural networks usually take atomic coordinates as input and output potential energies. Atomic coordinates are sometimes transformed with atom-centered symmetry functions or pair symmetry functions before being fed into neural networks. Encoding symmetry has been pivotal in enhancing machine learning potentials by drastically constraining the neural networks' search space. [80] [83]
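As an example, the sketch below implements one Behler-Parrinello-style radial symmetry function; values of this kind, computed for several parameter choices, would form the input vector of an atomic neural network (all numbers here are hypothetical):

```python
import numpy as np

def cutoff(r, r_c):
    """Smooth cosine cutoff: 1 at r = 0, 0 at and beyond r_c."""
    return np.where(r < r_c, 0.5 * (np.cos(np.pi * r / r_c) + 1.0), 0.0)

def g2(distances, eta, r_s, r_c):
    """Radial symmetry function G2: a smoothly cut-off, Gaussian-weighted
    count of neighbours at distance ~r_s from the central atom."""
    d = np.asarray(distances)
    return np.sum(np.exp(-eta * (d - r_s)**2) * cutoff(d, r_c))

neigh = [1.0, 1.1, 2.0, 2.6, 3.4]  # hypothetical neighbour distances
print(g2(neigh, eta=4.0, r_s=1.0, r_c=3.0))
```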
Conversely, message-passing neural networks (MPNNs), a form of graph neural networks, learn their own descriptors and symmetry encodings. They treat molecules as three-dimensional graphs and iteratively update each atom's feature vectors as information about neighboring atoms is processed through message functions and convolutions. These feature vectors are then used to directly predict the final potentials. In 2017, the first-ever MPNN model, a deep tensor neural network, was used to calculate the properties of small organic molecules. Advancements in this technology led to the development of Matlantis in 2022, which commercially applies machine learning potentials for new materials discovery. [84] Matlantis, which can simulate 72 elements, handle up to 20,000 atoms at a time, and execute calculations up to 20 million times faster than density functional theory with almost indistinguishable accuracy, showcases the power of machine learning potentials in the age of artificial intelligence. [80] [85] [86]
Another class of machine-learned interatomic potential is the Gaussian approximation potential (GAP), [87] [88] [89] which combines compact descriptors of local atomic environments [90] with Gaussian process regression [91] to machine-learn the potential energy surface of a given system. To date, the GAP framework has been used to successfully develop a number of MLIPs for various systems, including for elemental systems such as carbon, [92] silicon, [93] and tungsten, [94] as well as for multicomponent systems such as Ge2Sb2Te5 [95] and austenitic stainless steel, Fe7Cr2Ni. [96]
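A bare-bones sketch of the Gaussian process regression step, with a squared-exponential kernel on synthetic descriptor vectors (not the actual GAP descriptors or data), is shown below:

```python
import numpy as np

rng = np.random.default_rng(2)

def kernel(X1, X2, length=0.5):
    """Squared-exponential kernel between two sets of descriptor vectors."""
    d2 = ((X1[:, None, :] - X2[None, :, :])**2).sum(-1)
    return np.exp(-0.5 * d2 / length**2)

X_train = rng.uniform(0, 1, size=(50, 3))  # descriptors of training environments
y_train = np.sin(X_train.sum(axis=1))      # stand-in 'atomic energies'

K = kernel(X_train, X_train) + 1e-6 * np.eye(50)  # jitter for numerical stability
alpha = np.linalg.solve(K, y_train)               # GP weights

X_test = rng.uniform(0, 1, size=(5, 3))
y_pred = kernel(X_test, X_train) @ alpha          # GP mean prediction
print(np.c_[y_pred, np.sin(X_test.sum(axis=1))])  # predictions vs reference
```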
Classical interatomic potentials often exceed the accuracy of simplified quantum mechanical methods such as density functional theory at a million times lower computational cost. [51] The use of interatomic potentials is recommended for the simulation of nanomaterials, biomacromolecules, and electrolytes from atoms up to millions of atoms at the 100 nm scale and beyond. As a limitation, electron densities and quantum processes at the local scale of hundreds of atoms are not included. When of interest, higher-level quantum chemistry methods can be used locally. [97]
The robustness of a model at different conditions other than those used in the fitting process is often measured in terms of transferability of the potential.