Machine-learned interatomic potential

Machine-learned interatomic potentials (MLIPs), also called machine learning potentials (MLPs), are interatomic potentials constructed by machine learning programs. Since the 1990s, researchers have employed such programs to construct interatomic potentials by mapping atomic structures to their potential energies.

Such machine learning potentials promised to fill the gap between density functional theory, a highly accurate but computationally intensive modelling method, and empirically derived or intuitively approximated potentials, which are far lighter computationally but substantially less accurate. Improvements in artificial intelligence have raised the accuracy of MLPs while lowering their computational cost, increasing the role of machine learning in fitting potentials. [1] [2]

Early machine learning potentials used neural networks to tackle low-dimensional systems. While promising, these models could not systematically account for interatomic energy interactions; they could be applied to small molecules in a vacuum, or to molecules interacting with frozen surfaces, but not much else – and even in these applications, the models often relied on force fields or potentials derived empirically or from simulations. [1] These models thus remained confined to academia.

Modern neural networks construct highly accurate and computationally light potentials because theoretical understanding of materials science has increasingly been built into their architectures and preprocessing. Almost all are local, accounting for all interactions between an atom and its neighbors up to some cutoff radius. Some nonlocal models exist, but they have remained largely experimental for almost a decade. For most systems, a reasonable cutoff radius yields highly accurate results. [1] [3]
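The locality assumption above amounts to building a neighbor list: each atom's energy contribution depends only on atoms inside its cutoff sphere. The following is a minimal sketch in plain NumPy (no periodic boundary conditions; the function name is illustrative, not from any particular package):

```python
import numpy as np

def neighbors_within_cutoff(positions, r_cut):
    """Return, for each atom, the indices of neighbors within r_cut.

    Illustrates the locality assumption used by most MLPs: only atoms
    inside the cutoff sphere contribute to an atom's local energy.
    positions is an (N, 3) array; no periodic boundary conditions.
    """
    positions = np.asarray(positions, dtype=float)
    # Pairwise distance matrix, shape (N, N), via broadcasting.
    diff = positions[:, None, :] - positions[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    n = len(positions)
    return [
        [j for j in range(n) if j != i and dist[i, j] <= r_cut]
        for i in range(n)
    ]

# Three atoms on a line: with a cutoff of 1.5, atoms 0 and 2
# (2.0 apart) do not see each other directly.
pos = [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [2.0, 0.0, 0.0]]
print(neighbors_within_cutoff(pos, 1.5))  # [[1], [0, 2], [1]]
```

Production codes use cell lists or k-d trees to avoid the O(N²) distance matrix, but the cutoff logic is the same.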

Almost all neural network potentials take atomic coordinates as input and output potential energies. In some, the atomic coordinates are first converted into atom-centered symmetry functions. From this data, a separate atomic neural network is trained for each element; each atomic network is evaluated whenever that element occurs in the given structure, and the results are then pooled at the end. This process – in particular, the atom-centered symmetry functions, which convey translational, rotational, and permutational invariances – has greatly improved machine learning potentials by significantly constraining the neural network search space. Other models use a similar process but emphasize bonds over atoms, using pair symmetry functions and training one network per atom pair. [1] [4]
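A radial atom-centered symmetry function of this type can be sketched as follows, assuming the common Gaussian form with a cosine cutoff; the function names and parameter values are illustrative rather than taken from any particular package. Because the descriptor depends only on interatomic distances and sums over neighbors, it is invariant to translations, rotations, and permutations of identical atoms:

```python
import numpy as np

def cutoff_fn(r, r_cut):
    """Cosine cutoff: decays smoothly to zero at r_cut, zero beyond it."""
    return np.where(r < r_cut, 0.5 * (np.cos(np.pi * r / r_cut) + 1.0), 0.0)

def radial_symmetry_fn(distances, eta, r_s, r_cut):
    """Radial atom-centered symmetry function for one central atom.

    distances: distances from the central atom to its neighbors.
    A Gaussian centered at r_s (width set by eta) is evaluated at each
    neighbor distance and damped by the cutoff; summing over neighbors
    gives permutational invariance, and using distances alone gives
    translational and rotational invariance.
    """
    r = np.asarray(distances, dtype=float)
    return float(np.sum(np.exp(-eta * (r - r_s) ** 2) * cutoff_fn(r, r_cut)))
```

A vector of such values for several (eta, r_s) choices, plus angular functions, forms the input to the atomic neural network.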

Other models learn their own descriptors rather than using predetermined symmetry-dictating functions. These models, called message-passing neural networks (MPNNs), are graph neural networks. Treating molecules as three-dimensional graphs (where atoms are nodes and bonds are edges), the model takes feature vectors describing the atoms as input and iteratively updates these vectors as information about neighboring atoms is processed through message functions and convolutions. The feature vectors are then used to predict the final potentials. The flexibility of this method often results in stronger, more generalizable models. In 2017, the first MPNN model, a deep tensor neural network, was used to calculate the properties of small organic molecules. Such technology was commercialized, leading to the development of Matlantis in 2022, which extracts properties through both the forward and backward passes.[citation needed]
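The message-passing idea can be sketched in a few lines. The weight matrices below stand in for learned parameters, and sum-aggregation with a summed readout is one common design choice among many:

```python
import numpy as np

def message_passing_step(features, adjacency, w_msg, w_upd):
    """One message-passing update on a molecular graph.

    features:  (N, F) feature vector per atom (node).
    adjacency: (N, N) 0/1 matrix, 1 where two atoms are bonded (edge).
    Messages from bonded neighbors are aggregated by summation, then
    each node's features are updated. In a real model w_msg and w_upd
    (here hypothetical (F, F) matrices) would be learned by training.
    """
    messages = adjacency @ (features @ w_msg)    # sum messages over neighbors
    return np.tanh(features @ w_upd + messages)  # node update

def readout(features):
    """Pool node features into one graph-level prediction,
    e.g. a total energy as a sum of atomic contributions."""
    return float(features.sum())
```

Stacking several such steps lets information propagate beyond nearest neighbors before the readout produces the energy.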

Gaussian Approximation Potential (GAP)

One popular class of machine-learned interatomic potential is the Gaussian Approximation Potential (GAP), [5] [6] [7] which combines compact descriptors of local atomic environments [8] with Gaussian process regression [9] to learn the potential energy surface of a given system. To date, the GAP framework has been used to develop MLIPs for a number of systems, including elemental systems such as carbon, [10] silicon, [11] phosphorus, [12] and tungsten, [13] as well as multicomponent systems such as Ge2Sb2Te5 [14] and austenitic stainless steel, Fe7Cr2Ni. [15]
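The regression step of a GAP-type model can be illustrated with plain Gaussian process regression. Real GAP implementations use SOAP descriptors and sparse-GP machinery, so the following is only a toy sketch under those simplifications, with an illustrative function name and a squared-exponential kernel:

```python
import numpy as np

def gap_style_predict(X_train, y_train, X_test, length_scale=1.0, noise=1e-8):
    """Gaussian process regression with a squared-exponential kernel.

    X_train: descriptors of known atomic environments, shape (n, d).
    y_train: their reference energies, shape (n,).
    X_test:  descriptors of the environments to predict, shape (m, d).
    Returns the GP posterior-mean energies for X_test.
    """
    def kernel(a, b):
        # Squared pairwise distances via the expansion |a-b|^2 = a^2 + b^2 - 2ab.
        sq = np.sum(a**2, 1)[:, None] + np.sum(b**2, 1)[None, :] - 2 * a @ b.T
        return np.exp(-0.5 * sq / length_scale**2)

    X_train, X_test = np.atleast_2d(X_train), np.atleast_2d(X_test)
    # Regularized kernel matrix; solve for the regression weights.
    K = kernel(X_train, X_train) + noise * np.eye(len(X_train))
    alpha = np.linalg.solve(K, np.asarray(y_train, dtype=float))
    return kernel(X_test, X_train) @ alpha
```

With negligible noise the predictor interpolates the training energies exactly, which is why GP regression pairs naturally with descriptors that uniquely characterize local environments.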

References

  1. Kocer, Emir; Ko, Tsz Wai; Behler, Jorg (2022). "Neural Network Potentials: A Concise Overview of Methods". Annual Review of Physical Chemistry. 73: 163–86. arXiv:2107.03727. Bibcode:2022ARPC...73..163K. doi:10.1146/annurev-physchem-082720-034254. PMID 34982580.
  2. Blank, TB; Brown, SD; Calhoun, AW; Doren, DJ (1995). "Neural network models of potential energy surfaces". Journal of Chemical Physics. 103 (10): 4129–37. Bibcode:1995JChPh.103.4129B. doi:10.1063/1.469597.
  3. Ghasemi, SA; Hofstetter, A; Saha, S; Goedecker, S (2015). "Interatomic potentials for ionic systems with density functional accuracy based on charge densities obtained by a neural network". Physical Review B. 92 (4): 045131. arXiv:1501.07344. Bibcode:2015PhRvB..92d5131G. doi:10.1103/PhysRevB.92.045131.
  4. Behler, J; Parrinello, M (2007). "Generalized neural-network representation of high-dimensional potential-energy surfaces". Physical Review Letters. 98 (14): 146401. Bibcode:2007PhRvL..98n6401B. doi:10.1103/PhysRevLett.98.146401. PMID 17501293.
  5. Bartók, Albert P.; Payne, Mike C.; Kondor, Risi; Csányi, Gábor (2010-04-01). "Gaussian Approximation Potentials: The Accuracy of Quantum Mechanics, without the Electrons". Physical Review Letters. 104 (13): 136403. arXiv:0910.1019. Bibcode:2010PhRvL.104m6403B. doi:10.1103/PhysRevLett.104.136403. PMID 20481899.
  6. Bartók, Albert P.; De, Sandip; Poelking, Carl; Bernstein, Noam; Kermode, James R.; Csányi, Gábor; Ceriotti, Michele (December 2017). "Machine learning unifies the modeling of materials and molecules". Science Advances. 3 (12): e1701816. arXiv:1706.00179. Bibcode:2017SciA....3E1816B. doi:10.1126/sciadv.1701816. ISSN 2375-2548. PMC 5729016. PMID 29242828.
  7. "Gaussian approximation potential – Machine learning atomistic simulation of materials and molecules". Retrieved 2024-04-04.
  8. Bartók, Albert P.; Kondor, Risi; Csányi, Gábor (2013-05-28). "On representing chemical environments". Physical Review B. 87 (18): 184115. arXiv:1209.3140. Bibcode:2013PhRvB..87r4115B. doi:10.1103/PhysRevB.87.184115.
  9. Rasmussen, Carl Edward; Williams, Christopher K. I. (2008). Gaussian Processes for Machine Learning. Adaptive Computation and Machine Learning (3rd printing ed.). Cambridge, Mass.: MIT Press. ISBN 978-0-262-18253-9.
  10. Deringer, Volker L.; Csányi, Gábor (2017-03-03). "Machine learning based interatomic potential for amorphous carbon". Physical Review B. 95 (9): 094203. arXiv:1611.03277. Bibcode:2017PhRvB..95i4203D. doi:10.1103/PhysRevB.95.094203.
  11. Bartók, Albert P.; Kermode, James; Bernstein, Noam; Csányi, Gábor (2018-12-14). "Machine Learning a General-Purpose Interatomic Potential for Silicon". Physical Review X. 8 (4): 041048. arXiv:1805.01568. Bibcode:2018PhRvX...8d1048B. doi:10.1103/PhysRevX.8.041048.
  12. Deringer, Volker L.; Caro, Miguel A.; Csányi, Gábor (2020-10-29). "A general-purpose machine-learning force field for bulk and nanostructured phosphorus". Nature Communications. 11 (1): 5461. Bibcode:2020NatCo..11.5461D. doi:10.1038/s41467-020-19168-z. ISSN 2041-1723. PMC 7596484. PMID 33122630.
  13. Szlachta, Wojciech J.; Bartók, Albert P.; Csányi, Gábor (2014-09-24). "Accuracy and transferability of Gaussian approximation potential models for tungsten". Physical Review B. 90 (10): 104108. Bibcode:2014PhRvB..90j4108S. doi:10.1103/PhysRevB.90.104108.
  14. Mocanu, Felix C.; Konstantinou, Konstantinos; Lee, Tae Hoon; Bernstein, Noam; Deringer, Volker L.; Csányi, Gábor; Elliott, Stephen R. (2018-09-27). "Modeling the Phase-Change Memory Material, Ge2Sb2Te5, with a Machine-Learned Interatomic Potential". The Journal of Physical Chemistry B. 122 (38): 8998–9006. doi:10.1021/acs.jpcb.8b06476. ISSN 1520-6106. PMID 30173522.
  15. Shenoy, Lakshmi; Woodgate, Christopher D.; Staunton, Julie B.; Bartók, Albert P.; Becquart, Charlotte S.; Domain, Christophe; Kermode, James R. (2024-03-22). "Collinear-spin machine learned interatomic potential for Fe7Cr2Ni alloy". Physical Review Materials. 8 (3): 033804. arXiv:2309.08689. doi:10.1103/PhysRevMaterials.8.033804.