Machine-learned interatomic potential

Beginning in the 1990s, researchers have employed machine learning to construct interatomic potentials, which map atomic structures to their potential energies. These models are generally referred to as 'machine-learned interatomic potentials' (MLIPs) or simply 'machine learning potentials' (MLPs). Machine learning potentials promised to fill the gap between density functional theory, a highly accurate but computationally intensive simulation method, and empirically derived or intuitively approximated potentials, which were far lighter computationally but substantially less accurate. Improvements in artificial intelligence technology have heightened the accuracy of MLPs while lowering their computational cost, increasing machine learning's role in fitting potentials. [1] [2]

Machine learning potentials began by using neural networks to tackle low-dimensional systems. While promising, these models could not systematically account for interatomic energy interactions; they could be applied to small molecules in a vacuum and to molecules interacting with frozen surfaces, but not much else, and even in these applications they often relied on force fields or potentials derived empirically or from simulations. [1] These models thus remained confined to academia.

Modern neural networks construct highly accurate, computationally light potentials because theoretical understanding of materials science has increasingly been built into their architectures and preprocessing. Almost all are local, accounting for all interactions between an atom and its neighbors within some cutoff radius. Some nonlocal models exist, but these have remained largely experimental for almost a decade. For most systems, a reasonable cutoff radius yields highly accurate results. [1] [3]
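
The locality assumption can be sketched in a few lines. The following is a minimal illustration (in Python with NumPy; the function name and the open-boundary setup are illustrative, not from any particular MLIP package) of how each atom's environment is reduced to its neighbors within the cutoff radius:

```python
import numpy as np

def neighbors_within_cutoff(positions, cutoff):
    """Return, for each atom, the indices of neighbors within the cutoff radius.

    A local MLIP evaluates each atom's energy contribution from exactly this
    neighborhood; atoms beyond the cutoff are ignored entirely.
    """
    positions = np.asarray(positions, dtype=float)
    # Pairwise displacement and distance matrices (open boundaries;
    # periodic images are omitted in this sketch).
    diffs = positions[:, None, :] - positions[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    n = len(positions)
    return [
        [j for j in range(n) if j != i and dists[i, j] <= cutoff]
        for i in range(n)
    ]

# Three collinear atoms spaced 2.0 apart: with a 2.5 cutoff the end atoms
# each see only the middle atom, while the middle atom sees both.
coords = [[0.0, 0.0, 0.0], [2.0, 0.0, 0.0], [4.0, 0.0, 0.0]]
print(neighbors_within_cutoff(coords, 2.5))  # [[1], [0, 2], [1]]
```

Because the neighborhood search is the only place the full structure enters, the cost of evaluating a local potential scales linearly with the number of atoms rather than with all pairwise interactions.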

Almost all neural-network potentials take atomic coordinates as input and output potential energies. In some, the atomic coordinates are first converted into atom-centered symmetry functions. From this data, a separate atomic neural network is trained for each element; each atomic neural network is evaluated whenever its element occurs in a given structure, and the results are then pooled together at the end. This process, in particular the atom-centered symmetry functions, which enforce translational, rotational, and permutational invariance, has greatly improved machine learning potentials by significantly constraining the networks' search space. Other models follow a similar process but emphasize bonds over atoms, using pair symmetry functions and training one neural network per atom pair. [1] [4]
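
A minimal sketch of one such descriptor, a Behler-style radial symmetry function, is shown below (Python with NumPy; the parameter values are illustrative). Because it depends only on interatomic distances, its value is unchanged by translating, rotating, or permuting the neighboring atoms:

```python
import numpy as np

def cutoff_fn(r, rc):
    # Smooth cutoff f_c(r) that decays from 1 to 0 at the cutoff radius rc.
    return np.where(r <= rc, 0.5 * (np.cos(np.pi * r / rc) + 1.0), 0.0)

def radial_symmetry_function(positions, i, eta, r_s, rc):
    """Behler-style radial symmetry function for atom i: a sum of Gaussians
    over neighbor distances, damped by the cutoff function. Depending only
    on distances makes the descriptor invariant to translations, rotations,
    and permutations of the neighboring atoms."""
    positions = np.asarray(positions, dtype=float)
    r = np.linalg.norm(positions - positions[i], axis=1)
    r = np.delete(r, i)  # exclude the central atom itself
    return float(np.sum(np.exp(-eta * (r - r_s) ** 2) * cutoff_fn(r, rc)))
```

In a full potential, a vector of such functions with varying `eta`, `r_s`, and angular analogues is fed into the per-element atomic network, and the per-atom energies are summed.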

Still other models, rather than using predetermined symmetry-dictating functions, learn their own descriptors. These models, called message-passing neural networks (MPNNs), are graph neural networks. Treating a molecule as a three-dimensional graph in which atoms are nodes and bonds are edges, the model takes in feature vectors describing the atoms and iteratively updates them as information about neighboring atoms is processed through message functions and convolutions. The final feature vectors are then used to predict the potential. This approach gives the model more flexibility and often results in stronger, more generalizable potentials. In 2017, the first MPNN model, a deep tensor neural network, was used to calculate the properties of small organic molecules. The technology was later commercialized, leading in 2022 to Matlantis, which extracts properties through both the forward and backward passes. Matlantis can simulate 72 elements, handle up to 20,000 atoms at a time, and execute calculations up to 20,000,000 times faster than density functional theory with nearly indistinguishable accuracy, showcasing the power of machine learning potentials in the age of artificial intelligence. [5] [1] [6] [7]
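
The message-passing idea can be illustrated with a toy model (Python with NumPy; the function names, weight shapes, and `tanh` nonlinearity are illustrative assumptions, not the architecture of any published MPNN). Each iteration aggregates transformed neighbor features, and the final per-atom contributions are summed into a total energy:

```python
import numpy as np

def message_passing_step(features, adjacency, w_msg, w_upd):
    """One message-passing iteration: each atom sums its neighbors'
    transformed feature vectors, then updates its own features."""
    messages = adjacency @ np.tanh(features @ w_msg)  # aggregate neighbors
    return np.tanh(features @ w_upd + messages)

def predict_energy(features, adjacency, w_msg, w_upd, w_out, steps=3):
    """Toy MPNN readout: iterate message passing, then sum per-atom
    scalar contributions into a total potential energy. Summing makes
    the prediction invariant to relabeling (permuting) the atoms."""
    h = features
    for _ in range(steps):
        h = message_passing_step(h, adjacency, w_msg, w_upd)
    return float(np.sum(h @ w_out))  # permutation-invariant pooling
```

Relabeling the atoms (permuting the feature rows and the corresponding rows and columns of the adjacency matrix) leaves the predicted energy unchanged, which is the graph-level analogue of the invariances that symmetry functions build in by hand.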

Gaussian Approximation Potential (GAP)

One popular class of machine-learned interatomic potential is the Gaussian Approximation Potential (GAP) [8] [9] [10] , which combines compact descriptors of local atomic environments [11] with Gaussian process regression [12] to learn the potential energy surface of a given system. To date, the GAP framework has been used to successfully develop a number of MLIPs for various systems, including elemental systems such as carbon [13] , silicon [14] , phosphorus [15] , and tungsten [16] , as well as multicomponent systems such as Ge2Sb2Te5 [17] and austenitic stainless steel, Fe7Cr2Ni [18] .
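
The regression step at the heart of this framework can be sketched as follows (a minimal Gaussian process mean prediction in Python with NumPy; the squared-exponential kernel and the parameter names are illustrative, whereas GAP itself uses specialized kernels over atomic-environment descriptors such as SOAP):

```python
import numpy as np

def gp_predict(X_train, y_train, X_test, length_scale=1.0, noise=1e-8):
    """Gaussian process regression mean: interpolate target values
    (e.g. energies) between descriptor vectors of known configurations."""
    def kernel(a, b):
        # Squared-exponential kernel on descriptor vectors.
        sq = (np.sum(a ** 2, axis=1)[:, None]
              + np.sum(b ** 2, axis=1)[None, :] - 2.0 * a @ b.T)
        return np.exp(-0.5 * sq / length_scale ** 2)

    # Solve (K + noise*I) alpha = y, then predict as a kernel-weighted sum.
    K = kernel(X_train, X_train) + noise * np.eye(len(X_train))
    alpha = np.linalg.solve(K, np.asarray(y_train, dtype=float))
    return kernel(X_test, X_train) @ alpha
```

With small regularizing noise, the prediction passes (nearly) through the training data and falls back smoothly between points, which is what makes the approach well suited to interpolating quantum-mechanical reference energies.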


References

  1. Kocer, Emir; Ko, Tsz Wai; Behler, Jörg (2022). "Neural Network Potentials: A Concise Overview of Methods". Annual Review of Physical Chemistry. 73: 163–186. arXiv: 2107.03727 . Bibcode:2022ARPC...73..163K. doi: 10.1146/annurev-physchem-082720-034254 . PMID   34982580.
  2. Blank, TB; Brown, SD; Calhoun, AW; Doren, DJ (1995). "Neural network models of potential energy surfaces". The Journal of Chemical Physics. 103 (10): 4129–37. Bibcode:1995JChPh.103.4129B. doi:10.1063/1.469597.
  3. Ghasemi, SA; Hofstetter, A; Saha, S; Goedecker, S (2015). "Interatomic potentials for ionic systems with density functional accuracy based on charge densities obtained by a neural network". Physical Review B. 92 (4): 045131. arXiv: 1501.07344 . Bibcode:2015PhRvB..92d5131G. doi:10.1103/PhysRevB.92.045131.
  4. Behler, J; Parrinello, M (2007). "Generalized neural-network representation of high-dimensional potential-energy surfaces". Physical Review Letters. 98 (14): 146401. Bibcode:2007PhRvL..98n6401B. doi:10.1103/PhysRevLett.98.146401. PMID   17501293.
  5. Schütt, KT; Arbabzadah, F; Chmiela, S; Müller, KR; Tkatchenko, A (2017). "Quantum-chemical insights from deep tensor neural networks". Nature Communications. 8: 13890. arXiv: 1609.08259 . Bibcode:2017NatCo...813890S. doi:10.1038/ncomms13890. PMC   5228054 . PMID   28067221.
  6. Takamoto, So; Shinagawa, Chikashi; Motoki, Daisuke; Nakago, Kosuke (May 30, 2022). "Towards universal neural network potential for material discovery applicable to arbitrary combinations of 45 elements". Nature Communications. 13 (1): 2991. arXiv: 2106.14583 . Bibcode:2022NatCo..13.2991T. doi:10.1038/s41467-022-30687-9. PMC   9151783 . PMID   35637178.
  7. "Matlantis".
  8. Bartók, Albert P.; Payne, Mike C.; Kondor, Risi; Csányi, Gábor (2010-04-01). "Gaussian Approximation Potentials: The Accuracy of Quantum Mechanics, without the Electrons". Physical Review Letters. 104 (13): 136403. arXiv: 0910.1019 . Bibcode:2010PhRvL.104m6403B. doi:10.1103/PhysRevLett.104.136403.
  9. Bartók, Albert P.; De, Sandip; Poelking, Carl; Bernstein, Noam; Kermode, James R.; Csányi, Gábor; Ceriotti, Michele (December 2017). "Machine learning unifies the modeling of materials and molecules". Science Advances. 3 (12): e1701816. arXiv: 1706.00179 . Bibcode:2017SciA....3E1816B. doi:10.1126/sciadv.1701816. ISSN   2375-2548. PMC   5729016 . PMID   29242828.
  10. "Gaussian approximation potential – Machine learning atomistic simulation of materials and molecules" . Retrieved 2024-04-04.
  11. Bartók, Albert P.; Kondor, Risi; Csányi, Gábor (2013-05-28). "On representing chemical environments". Physical Review B. 87 (18): 184115. arXiv: 1209.3140 . Bibcode:2013PhRvB..87r4115B. doi:10.1103/PhysRevB.87.184115.
  12. Rasmussen, Carl Edward; Williams, Christopher K. I. (2008). Gaussian processes for machine learning. Adaptive computation and machine learning (3. print ed.). Cambridge, Mass.: MIT Press. ISBN   978-0-262-18253-9.
  13. Deringer, Volker L.; Csányi, Gábor (2017-03-03). "Machine learning based interatomic potential for amorphous carbon". Physical Review B. 95 (9): 094203. arXiv: 1611.03277 . Bibcode:2017PhRvB..95i4203D. doi:10.1103/PhysRevB.95.094203.
  14. Bartók, Albert P.; Kermode, James; Bernstein, Noam; Csányi, Gábor (2018-12-14). "Machine Learning a General-Purpose Interatomic Potential for Silicon". Physical Review X. 8 (4): 041048. arXiv: 1805.01568 . Bibcode:2018PhRvX...8d1048B. doi:10.1103/PhysRevX.8.041048.
  15. Deringer, Volker L.; Caro, Miguel A.; Csányi, Gábor (2020-10-29). "A general-purpose machine-learning force field for bulk and nanostructured phosphorus". Nature Communications. 11 (1): 5461. Bibcode:2020NatCo..11.5461D. doi:10.1038/s41467-020-19168-z. ISSN   2041-1723. PMC   7596484 . PMID   33122630.
  16. Szlachta, Wojciech J.; Bartók, Albert P.; Csányi, Gábor (2014-09-24). "Accuracy and transferability of Gaussian approximation potential models for tungsten". Physical Review B. 90 (10): 104108. Bibcode:2014PhRvB..90j4108S. doi:10.1103/PhysRevB.90.104108.
  17. Mocanu, Felix C.; Konstantinou, Konstantinos; Lee, Tae Hoon; Bernstein, Noam; Deringer, Volker L.; Csányi, Gábor; Elliott, Stephen R. (2018-09-27). "Modeling the Phase-Change Memory Material, Ge2Sb2Te5, with a Machine-Learned Interatomic Potential". The Journal of Physical Chemistry B. 122 (38): 8998–9006. doi:10.1021/acs.jpcb.8b06476. ISSN   1520-6106. PMID   30173522.
  18. Shenoy, Lakshmi; Woodgate, Christopher D.; Staunton, Julie B.; Bartók, Albert P.; Becquart, Charlotte S.; Domain, Christophe; Kermode, James R. (2024-03-22). "Collinear-spin machine learned interatomic potential for Fe7Cr2Ni alloy". Physical Review Materials. 8 (3): 033804. doi:10.1103/PhysRevMaterials.8.033804.